Thursday, December 31, 2009


Ok, everybody has speculated about the meaning of the F in F-theory. Possibly the most accepted explanation was that it comes from Cumrun Va(F)a. But an article appearing now on arXiv has revealed its real origin.

The authors of the article are Adil Belhaj and Leila Medari. It is titled "Superstrings, Phenomenology and F-theory". The abstract reads:

We give brief ideas on building gauge models in superstring theory, especially the four-dimensional models obtained from the compactification of F-theory. According to Vafa, we discuss the construction of F-theory to approach non-perturbative aspects of type IIB superstring. Then, we present local models of F-theory, which can generate new four-dimensional gauge models with applications to phenomenology.

It is based on invited talks given by A. Belhaj in Oviedo, Rabat, Zaragoza.

Up to this point nothing seems to support my claim about the origin of the name. But if you go and look at the paper, one finds that it is written in French. That explains it all ;-).

Fortunately I have a relatively good knowledge of French and I could make a quick reading of the article. It is a good introduction to the topic, explaining from the very beginning the basics of string theory, D-branes and all that. Later it explains the basics of F-theory, of local models and of local F-theory GUT models. All of it in a short article of 15 pages.

Despite the name it doesn't dive too deeply into phenomenology. But it still gives a good introduction to many aspects of the subject for non-initiated people. In that sense it is far better than the blog entry by Jacques Distler about the first big paper of Vafa. And, definitely, it looks like a good chance for Spanish people interested in the subject who do not speak English but maybe speak French.

By the way, for those who didn't read the Spanish entry about the CDMS announcement, let me just say that F-theory GUTs predict that the LSP (lightest supersymmetric particle) is the gravitino, which is not a viable candidate for a WIMP. The CDMS two-event finding (irrespective of how statistically significant it may be) is a kind of hint that the LSP is a WIMP (maybe a neutralino), so if confirmed the current Vafa models of F-theory GUTs would be invalidated. Possibly the experts on the subject could recook some of the more phenomenological aspects of the theory (mainly the supersymmetry breaking mechanism) to fit the new data. But certainly the best aspect of the whole construction, reproducing the standard model and making concrete predictions, would go away.

But, as Vafa said at the Strings 2009 conference: that's the bad point of making predictions, that they can be invalidated.

If someone is interested in knowing it, I must say that since the CDMS announcement I have decided to study in more detail what heterotic phenomenology can offer. It doesn't mean that F-theory is no longer interesting, but irrespective of CDMS I needed to pay more attention to heterotic theories. CDMS is just a good excuse.

Also I am reading (and in some cases rereading) a lot of articles on black holes (stringy and non-stringy ones). You can read about it in my other blog (if you speak Spanish). Still, I guess I will also talk about the subject on this blog in the near future, when I have finished carefully reading a bunch of articles. For example, today there is an article on the subject of black hole creation in particle collisions:

Other interesting articles today on arXiv: Unification of Residues and Grassmannian Dualities by Nima Arkani-Hamed, Jacob Bourjaily, Freddy Cachazo and Jaroslav Trnka. The article continues the MHV program to give a twistorial technique for finding scattering amplitudes. I must admit that although I recognize its interest I am not following those developments too closely. Still, I think some readers may find it more attractive than I do.

Also I would note two papers on dark energy:

Inverse problem - reconstruction of dark energy models


We review how we can construct the gravity models which reproduces the arbitrary development of the universe. We consider the reconstruction in the Einstein gravity coupled with generalized perfect fluid, scalar-Einstein gravity, scalar-Einstein-Gauss-Bonnet gravity, Einstein-$F(G)$-gravity, and $F(R)$-gravity. Very explicit formulas are given to reconstruct the models, which could be used when we find the detailed data of the development of the universe by future observations. Especially we find the formulas using e-foldings, which has a direct relation with observed redshift. As long as we observe the time development of the Hubble rate $H$, there exists a variety of models describing the arbitrary development of universe.

The F(R) theories mentioned there refer to approaches where one considers gravity theories whose Lagrangian contains higher-order terms in the curvature, of the kind that appear as counterterms in the renormalization program of conventional quantum gravity (the theory is actually non-renormalizable because infinitely many different counterterms are needed). There was recently a good review article about the subject, and if I find time to read it I will post about that kind of theories.
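The "direct relation with observed redshift" that the abstract mentions is simply that the number of e-foldings of expansion between a source at redshift z and today is N = ln(a₀/a) = ln(1+z). A trivial illustration of my own (not code from the paper):

```python
import math

def efolds_since(z):
    """E-foldings of cosmic expansion between redshift z and today,
    N = ln(a0/a) = ln(1+z), using the definition 1+z = a0/a."""
    return math.log(1.0 + z)

print(efolds_since(1089))  # since the CMB was emitted: about 7 e-folds
print(efolds_since(0.5))   # a typical supernova redshift: well under 1 e-fold
```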

Also on dark energy there is a paper by A. M. Polyakov: Decay of Vacuum Energy.


This paper studies interacting massive particles on the de Sitter background. It is found that in some cases (depending on even/odd dimensionality of space, spins, masses and couplings of the involved particles etc) the vacuum acts as an inversely populated medium which is able to generate the stimulated radiation. This "cosmic laser" mechanism depletes the curvature and perhaps may help to solve the cosmological constant problem. The effect is more robust in the odd dimensional space-time, while in the even case additional assumptions are needed.

Polyakov is a very original thinker, and even though his ideas sometimes seem a bit unconventional, it is always worth reading him.

Possibly there are more interesting papers on arXiv today, but I'll stop here.

Happy new year to all readers.

Friday, December 25, 2009

Relation between the Sokolov–Ternov effect and the Unruh effect

I have been discussing in my other blog (and in the miguis forum) the proposal of Crane to use a black hole as a starship drive, based on his arXiv article: ARE BLACK HOLE STARSHIPS POSSIBLE?.

You can read (if you understand Spanish) the three posts about the subject: 1, 2 and 3.

While discussing those papers I have been reading on Wikipedia about its little brother, the Unruh effect.

As explained there in detail, the effect consists in an accelerated observer seeing thermal radiation where a stationary observer sees vacuum. The temperature of the radiation is proportional to the acceleration: $$T=\frac{ha}{4\pi^{2}ck}$$ (k is the Boltzmann constant; the other quantities have their obvious meaning).
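To get a feeling for why the effect is so hard to observe directly, one can plug numbers into the formula. A quick sketch of my own (SI units, writing the formula with ħ = h/2π so T = ħa/2πck):

```python
import math

# Physical constants in SI units
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K_B = 1.380649e-23      # Boltzmann constant, J/K

def unruh_temperature(a):
    """Unruh temperature T = hbar*a / (2*pi*c*k_B) for a proper acceleration a (m/s^2)."""
    return HBAR * a / (2 * math.pi * C * K_B)

print(unruh_temperature(9.81))  # Earth's gravity: ~4e-20 K, hopelessly small
print(unruh_temperature(1e20))  # one needs a ~ 1e20 m/s^2 just to reach ~0.4 K
```

This is why claims of an indirect observation via electrons circulating in a storage ring (where accelerations are enormous) are so interesting.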

To my surprise, the entry mentions a claim that the radiation has been observed. In particular it has been claimed to be observed in the Sokolov–Ternov effect: the self-polarization of relativistic electrons or positrons moving at high energy in a magnetic field, which occurs through the emission of spin-flip synchrotron radiation. And, in particular:

it was shown that if one takes an accelerated observer to be an electron circularly orbiting in a constant external magnetic field, then the experimentally verified Sokolov-Ternov effect coincides with the Unruh effect.

These results date back to 2005, so they are not new at all. And I am almost sure they are controversial, or someone would have a Nobel prize by now ;-). The point is that, even though I try to stay informed, I had no idea about them. Maybe other readers of the blog were also unaware and could be curious to know.

Thursday, December 17, 2009

Dark matter live webcast

Ok, a little bit late, but something is still going on:

Fermilab webcast on the dark matter CDMS results

Or, if you prefer you can watch the other simultaneous conference:

As I am posting late, I'll just say that the main announcement has already been made: two events. Because of statistical considerations that is not a definitive discovery, but it is certainly something. Right now they are discussing exactly how significant it is.

Update: If you want to see a summary of the results by the CDMS team, get it here (it is a two-page pdf, without formulae, readable for most people).

Quick summary, as said in CF: if these events are interpreted as signal, the lower bound on the WIMP mass for these recoil energies is roughly 0.5 GeV.

I would add that a good guess (it gives the best possible cross-section) is a 70 GeV WIMP. The DAMA claim of a dark matter discovery via inelastic dark matter (that is, a WIMP with an excited energy state) is compatible with the CDMS results in a reasonable parameter range.

I invite you to read the entries on the topic in many of the blogs in my link list (and possibly many others). Although not a discovery, there will be a lot of discussion about these results in the near future. And new results are announced for the future, once the new SuperCDMS is running.

Update: You can see the recorded video of one of the conferences from this website:

The arXiv paper, still not submitted as I post this, is available here

There is some discussion in the blogs about the actual significance of the signal. The most accepted figure is a 1.5 sigma result. The discrepancies come from how one treats the background. The 1.3 figure goes with the blinded background (an optimized background obtained without knowledge of the existence of the signal events). Using other backgrounds one can get as much as (almost) 3 sigma, or as little as 0. By the way, the very use of "sigma" is strictly appropriate only for Gaussian distributions, but it is commonly used for non-Gaussian ones with the appropriate corrections.
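As a rough illustration of how much the quoted significance depends on the assumed background, one can do the naive Poisson counting for a two-event excess. This is a sketch of my own; the 0.5-event background below is an illustrative number, not the official CDMS estimate:

```python
import math
from statistics import NormalDist

def poisson_p_at_least(n_obs, b):
    """One-sided p-value: probability of seeing n_obs or more events
    from a Poisson background with mean b."""
    p_less = sum(math.exp(-b) * b**k / math.factorial(k) for k in range(n_obs))
    return 1.0 - p_less

def p_to_sigma(p):
    """Convert a one-sided p-value to the equivalent Gaussian 'sigma'."""
    return NormalDist().inv_cdf(1.0 - p)

# Illustrative expected background of 0.5 events (NOT the official number):
p = poisson_p_at_least(2, 0.5)
print(p)              # ~0.09: 2+ events from background alone is not that rare
print(p_to_sigma(p))  # ~1.3 sigma, in the ballpark being discussed in the blogs
```

Raising or lowering the assumed background moves the result between roughly 0 and 3 sigma, which is exactly the spread seen in the blog discussions.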

Looking ahead, I have read that before SuperCDMS we can expect data from another experiment, XENON100. They talk about "early in 2010". It remains to be seen what "early" exactly means and, more importantly, what the results are.

If one wants an easy introduction to the details of how CDMS works, one can read this entry in the old Tommaso Dorigo blog. Be aware that Dorigo doesn't like supersymmetry too much, and he argues that the (previous) CDMS results convinced him a little bit more of that. Curiously, he doesn't have any entry about this new CDMS dataset.

I didn't have time to answer a question from Matti in the previous post. As compensation, I leave here a link to his own view of these results:

Tuesday, December 08, 2009

It is rumored that dark matter has been discovered

Well yes, well yes. The famous dark matter that makes up ninety-odd percent of the mass of the universe, whose presence is inferred from the behavior of visible matter but for which there was no direct evidence, seems to have finally been discovered in one of the numerous laboratory experiments currently devoted to its search.

Actually, there is an Italian experimental group going by the acronym DAMA that has been claiming for some time to have found it. But on the one hand their evidence is somewhat circumstantial, having found seasonal variations in a certain type of events possibly related to some possible dark matter candidates. On the other hand, experiments with sensitivity equal to or better than DAMA's have found nothing. In reality there are subtle differences among the various types of detectors, and it is possible, though very unlikely, that a certain type of dark matter is detectable by DAMA and not by the rest of the detectors.

But it is not DAMA that is now in the spotlight (following this post in Jester's blog, Resonaances) but CDMS, which stands for Cryogenic Dark Matter Search. This group has placed detectors in a mine buried deep underground somewhere in Minnesota. In 2007 the group delivered a negative report setting experimental limits on the possible characteristics of WIMP (weakly interacting massive particles) dark matter. The article with the new batch of data, larger and taken with instruments of improved sensitivity, was expected to be out by now. But they were delayed and sent the article to Nature, and the journal has accepted it, which suggests it may be important. Nature is one of the few journals left with a confidentiality (disclosure) agreement, and the article will not be available until the 18th of this month. Possibly that same day there will also be a parallel article on arXiv (and therefore free to download for everybody).

This would really be great news for everybody, except perhaps the string physicist Cumrun Vafa and collaborators, who over the last two years had developed an excellent and elaborate model based on string theory that reproduced the standard model of particles without the exotic additions common in other phenomenological models and, besides, made some predictions. Among them: that dark matter is mainly made of the gravitino (the supersymmetric partner of the graviton), which is not a WIMP-type particle. If the finding is confirmed, it remains to be seen whether they can rearrange their model to incorporate it without destroying the rest of the good features of their theory.

From what I understand of F-theory, most of the restrictions it uses to make predictions are based on its model of supersymmetry breaking. There they use a gauge mediation model (a variant of something known as the Giudice-Masiero model used in gravity mediation models), where the messenger is a boson associated with a Peccei-Quinn-type gauge symmetry, tied to the QCD axion. It is a rather minimalist model where there is almost no supersymmetric "dark sector", and in that sense it seems a very good idea. But of course, if they now must accommodate a WIMP as the lightest supersymmetric particle, they would have to revise things, if that is possible, and that supersymmetry breaking mechanism is probably what lends itself best to revision. Another possibility, which seems very remote to me, is that since the theory has a WIMP, the lightest neutralino (a neutralino is a combination of the zino, photino and higgsino), as a possible NLSP (although the best option is a stau), maybe there is some strange decay mechanism that could leave WIMPs around while the gravitino remains the LSP (and therefore the main component of dark matter). Given its characteristics, CDMS could not detect the gravitino. Anyway, these are quick speculations, and with my still very poor understanding of those parts of F-theory they may be too risky. Just in case, I have asked Motl (who has also posted the news on his blog), and in his answer he seems to agree with what I say.

Be that as it may, and whoever it may displease, if dark matter has really been discovered we are facing a historic event. Moreover, it could have consequences for the LHC experiment, since the LHC should presumably be able to produce this newly observed particle, and we would thus have a double confirmation (besides a very precise guide on how to tune the LHC detectors, which will make detection easier).

Tuesday, November 24, 2009

Introduction to supersymmetry II: The Wess-Zumino model

Some time ago I wrote an entry about supersymmetry, this one. I continue the topic by introducing a realization of that supersymmetry in terms of a simple Lagrangian, what is known as the Wess-Zumino model. Those whose knowledge of quantum field theory, and in particular of the possible types of spinors, is not fresh can read about it in this entry of my other blog.

The model will consist of two fields: a complex scalar field \[\phi\] built from two real fields A and B, \[\phi=(A+iB)/\sqrt{2}\], and a Majorana spinor field \[\psi\]. Both fields will be massless. The reason is that supersymmetry has not been observed in nature, which indicates that if it exists it must be broken. The supersymmetric partners of the known particles are supposed to have acquired mass through a breaking of that supersymmetry. With these ingredients the kinetic term of our Lagrangian will be:

1. \[ L= \partial^{\mu} \phi^*\partial_{\mu}\phi ~ + ~ \frac{i}{2}\bar\psi\displaystyle{\not} \partial \psi \]

This Lagrangian is invariant under a global SUSY transformation:

2. \[\delta A=\bar\epsilon\psi \qquad \delta B=i\bar\epsilon\gamma_5 \psi\]

\[\delta \psi=-i\gamma^\mu[\partial_\mu (A + i \gamma_5 B)]\epsilon\]

where \[\epsilon\], the infinitesimal generator of the supersymmetry, is an infinitesimal Majorana spinor (I assume the reader is familiar with how infinitesimal generators of symmetries arise in quantum mechanics and their relation to global symmetries through exponentiation).

One can see that, as expected from a supersymmetry, this transformation turns bosonic fields into fermionic ones. For the theory to be supersymmetric the Lagrangian must be invariant under this transformation. It can be verified that under this change the variation of the Lagrangian is:

3. \[\delta L=\partial_\mu[1/2\,\bar\epsilon\gamma^\mu(\displaystyle{\not}\partial(A + i\gamma_5 B))\psi] \]

This \[\delta L\] is a total derivative and therefore does not contribute to the variation of the action which, as announced, makes 1 a supersymmetric Lagrangian. In general supersymmetric Lagrangians cannot be strictly invariant under supersymmetry, unless they are constant, and invariance must always be understood in the sense that the variation is a total derivative.

This Lagrangian is adequate for free particles. If we add interactions, one finds that the commutator of two transformations does not close off-shell, and it is therefore not adequate. To remedy that, two extra bosonic fields must be added, usually called F and G, whose Lagrangian is:

4. \[ L= 1/2F^2 + 1/2 G^2 \]

The solution of the Euler-Lagrange equations associated with Lagrangian 4 is F=G=0, so these fields have no on-shell states; they enter the theory only as intermediate virtual particles.

So far we have described the Lagrangian for massless particles. Nothing prevents building the Lagrangian for massive particles. The mass term would have the form:

5. \[L_m= m(FA + GB -1/2\bar\psi \psi) \]

The most general renormalizable interaction term would be:

6.\[L_i= g/\sqrt{2}[FA^2 - FB^2 + 2GAB - \bar\psi(A - i\gamma_5B)\psi] \]

This is the elementary Wess-Zumino model. If one intends to build realistic supersymmetric field theories, one should work with left-handed chiral fermions. It is not especially complicated to do so, and repeating the steps one arrives at an expression of the previous Lagrangians in terms of those chiral fermions. The most interesting aspect of that development is that one ends up with a Lagrangian that can be written in the form:

7.\[L = L_K - |\partial W/\partial \phi|^2 ~ - ~ 1/2(\partial^2 W/\partial \phi^2\psi^T_L C \psi_L + herm.conj) \]

Here $L_K$ is the kinetic term for the corresponding fields and W is what is known as the superpotential. It plays an important role in many discussions of supersymmetry and will be treated in more detail in later entries. For now, let us just say that for the simple model considered here its most general expression would be:

8.\[W= 1/2m\phi^2 ~ + ~ 1/3 g\phi^3\]
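As a quick sanity check of how the superpotential 8 feeds into Lagrangian 7, one can compute its derivatives with sympy. This is a sketch of my own, treating \[\phi\] as a single complex variable and ignoring the spinor structure:

```python
import sympy as sp

m, g = sp.symbols('m g', positive=True)
phi = sp.symbols('phi')

# Superpotential of eq. 8: W = (1/2) m phi^2 + (1/3) g phi^3
W = sp.Rational(1, 2) * m * phi**2 + sp.Rational(1, 3) * g * phi**3

dW = sp.diff(W, phi)       # enters the scalar potential |dW/dphi|^2 in eq. 7
d2W = sp.diff(W, phi, 2)   # enters the fermion bilinear term in eq. 7

print(sp.expand(dW))   # dW/dphi  = m*phi + g*phi**2
print(sp.expand(d2W))  # d2W/dphi2 = m + 2*g*phi
# |dW/dphi|^2 contains m^2 |phi|^2, while d2W/dphi2 gives the fermion mass m
# plus a Yukawa coupling: scalar and fermion share the same mass, as
# supersymmetry requires.
```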

This entry has presented what is possibly the simplest treatment of supersymmetry. Nowadays it is very common to use the superfield formalism, based on the notion of superspace. Superspace is the result of adding to the normal geometric coordinates some "fermionic" components represented by Grassmann variables. A superfield depends on both types of variables. Given the peculiar properties of Grassmann variables, it is very easy to see that a series expansion in them is finite and that, therefore, a general expression for a superfield can be given. When one does this for fields of spin 1/2 and 0 only, one can see that the superfield model obtained is equivalent to the Wess-Zumino model presented here. If in addition one imposes that the fermionic fields be chiral, one obtains the chiral version of the Wess-Zumino model. The superfield with those characteristics is known as a "chiral superfield". Of course, one can make supersymmetric constructions for gauge fields and, in that way, supersymmetric gauge theories and supersymmetric analogues of the standard model. The simplest supersymmetric extension of the standard model is known as the MSSM (minimal supersymmetric standard model).

Here we have dealt with global supersymmetry. When it is made local, gravitation appears naturally and one has supergravity theories. Since supersymmetry is not realized in the standard model, it is assumed that if the universe is supersymmetric it must be so in a version with broken supersymmetry. Supersymmetry breaking is a complex topic, and it plays a fundamental role in most of the phenomenological models postulated to extend the standard model of particles. Indirectly that means it also plays a role in string theories, in their various variants. For example F-theory, the most developed at the phenomenological level, uses a variant of the supersymmetry breaking mechanism known as the Giudice-Masiero model.

These topics will be treated in later entries.

I finish by saying that these posts mainly follow the textbook by P.D.B. Collins, A.D. Martin and E.J. Squires, "Particle Physics and Cosmology". To that I have added extra information from M. Dine's "Supersymmetry and Superstrings" and volume III of Steven Weinberg's quantum field theory book.

Saturday, October 31, 2009

The dark side of the landscape

In the previous post I had presented the multiverse in a way that made it look almost innocuous. As I have said a few times on this blog, I had heard about how the landscape (the existence of a large number of vacua) in string theory made it unable to make predictions.

Despite that, the actual articles I had read didn't give me that impression, so I suspected I was missing something; the problem was that I didn't know what. Reading a recent entry in Lubos' blog titled A small Hodge three-generation Calabi-Yau, I faced that problem of missing information again. So I reread the KKLT paper and searched the bibliography for something that would give me a clue.

At last I was led to the right paper, The statistics of string/M theory vacua by Michael R. Douglas (not the actor, of course). The abstract of the paper says it all:

We discuss systematic approaches to the classification of string/M theory vacua, and physical questions this might help us resolve. To this end, we initiate the study of ensembles of effective Lagrangians, which can be used to precisely study the predictive power of string theory, and in simple examples can lead to universality results. Using these ideas, we outline an approach to estimating the number of vacua of string/M theory which can realize the Standard Model.

I still haven't finished reading the paper, but the picture is clear. Yes, one can have a chaotic/eternal inflation scenario that creates an infinity of universes, or one can go from one to another through some kind of CDL (Coleman-De Luccia) or Hawking instantons among de Sitter vacua, or whatever mechanism creates a universe for any of the vacua available in string theory. And yes, every new universe would have a smaller cosmological constant than the previous one. In that way one ends up with a universe with the small cosmological constant (cc) observed in ours. The anthropic principle (or ideology, as Lubos prefers to call it) says that in universes with a large cc there are no observers, so it is not that bizarre that we observe such a small cc, despite the fact that theories with broken supersymmetry would naturally have a big one to begin with.

The real problem is that the paper argues that even with the restrictions of a small cc and the observed gauge content (the standard model one), one still has a large number of solutions with the values of the coupling constants, particle masses, etc. within the observed margins of the standard model. I made a quick search in the paper for the famous $10^{500}$ but couldn't find it (the search feature of Acrobat seems not to work with math expressions); however, figures of $10^{100}$ to $10^{400}$ do appear in the text, so it is in the right order of magnitude. I intend to read this paper soon, as well as another by Kallosh and Linde, Landscape, the scale of SUSY breaking, and inflation.

It is not that I like the idea of the landscape, I don't, and that's why I hadn't found these papers sooner and had searched other lines of investigation, such as the ones mentioned on this blog. But since cosmology is such a hot topic nowadays, mainly because of the large amount of data available, I think it is a good idea to know this kind of thing in some detail.

As I had said previously, in other blog entries, I was aware that there were some concrete approaches that tried to disprove the landscape, understood in the sense presented here (that is, too many vacua compatible with the standard model, not just too many vacua compatible with a small cosmological constant). Some of those papers are The String Landscape and the Swampland, discussed by Lubos here and also discussed by Distler in his entry YOU CAN'T ALWAYS GET WHAT YOU WANT. More entries in Lubos' blog discussing papers against the (SM) landscape are Ooguri and Vafa's swampland conjectures. He also has a paper with C. Vafa and Nima Arkani-Hamed titled The String Landscape, Black Holes and Gravity as the Weakest Force. I think I have seen a blog entry of his about that paper, but it doesn't appear in the trackbacks for some reason.

Well, I leave this entry as a loosely discussed bibliography of the real problems of the landscape ideology. As I said, there are possibly good reasons to expect a good "vacuum selection method", as M. Douglas calls it, in which case one wouldn't care too much about it. Possibly the LHC could give a clue about it. It is good to know that beams are beginning to circulate in it again, at least partially, and that very soon, if everything goes ok, it will be giving data.

Saturday, October 17, 2009

Universe or multiverse?

Universe or multiverse?

Recently there has been a peak of comments in the blogosphere about the multiverse, partially because of a new article by Linde and Vanchurin titled How many universes are in the multiverse?.

But the battle against the multiverse and its buddies, the anthropic principle and the string landscape, is not new at all. Peter Woit is a champion of that cause. I have always considered P.W. innocuous, and a source of information about string theory, even if he doesn't like it. He being a mathematician, or at most a mathematical physicist, it is no lack of respect for his position at a university not to take seriously his objections to a branch of physics that he mostly doesn't understand.

But recently I am beginning to think that he may in fact be causing some damage. The problem I see is that he is so intent on criticising string theory that he only looks for the parts that serve his purposes, without worrying about understanding the whole picture. Worse still, he can make people believe that his biased view is the whole view. And that's very bad, because it gives a very wrong perspective of what is being done in string theory, and in cosmology and high energy physics in general.

In particular, what he says about the multiverse, the string landscape and similar topics is totally misleading. I am not saying that these are not contentious areas, only that what Woit says about them is not representative. To begin with, one should realize that the existence of multiverses is, in some cases, inflation mostly, a consequence of already established physics, with the only assumption being some special features of the inflaton potential. Also, if one trusts string theory, multiverses would arise as a consequence of a saltatory cosmological constant. In fact the two scenarios are similar in spirit, although very different in the details.

If a reader of this blog wants a much better idea, I would recommend reading the book presented at the beginning of this entry. It is edited by Bernard Carr, who also contributes a presentation chapter and a thematic one, "the anthropic principle revisited". The first chapter gives an overview of the rest of the book and explains the many meanings of the term "multiverse" that are treated.

The book itself is based on a series of conferences partially supported by the Templeton Foundation. The list of participants includes very prominent physicists such as S. Weinberg, S. Hawking, L. Susskind, A. Linde, P. Davies, R. Kallosh and a long etc. The list of topics is also very broad, covering many of the variants of the multiverse idea and why it arises in present-day physics research.

I have only read a few (six) of the articles, and I am alternating them with other articles about inflation in string theory, supersymmetry breaking and general literature about the cosmological constant. My view is that there are probably better alternatives to explain the apparent existence of an accelerated universe (and in general, possibly, some fine-tuning problems), but it would be stupid not to read (at least part of) what appears in that book if one is concerned about it.

Monday, September 28, 2009

Collisions of universes

The arXiv blog took notice of an article about the possible existence of experimental evidence for collisions between universes. You can read it here.

For those who have only read about the conventional cosmological scenarios, that is, an FRW-type cosmology slightly corrected by a very tiny cosmological constant and an early period of inflation, the idea of many universes could sound very speculative, and they would probably immediately think it is based on those "evil string theorists" inventions.

In these last days my readings in theoretical physics have been mainly related to cosmology. I have learned that the most widespread idea of multiple universes is rooted in "eternal inflation" and also in the alternative idea of "chaotic inflation". Those are inventions of cosmologists and not of string theorists, although the latter have taken the idea, merged it with ideas of Weinberg and the Brown-Teitelboim mechanism, and implemented it in string theory, leading to the Bousso-Polchinski proposal, the KKLT realization of it and, ultimately, the whole idea of the landscape.

But, basically, the idea of eternal inflation is independent of string theory. If one takes seriously the problem of the vacuum energy in QFT, one is faced with the fact that it is of the order of the supersymmetry breaking scale (if one doesn't believe in supersymmetry, it would be the Planck energy). That energy would lead to a very fast expansion of the universe when part of it is released by a transition to a level with less energy. The point is that that transition would not be simultaneous at all points. Then one would have expanding universes coexisting with a stationary one. The expanding universes could, again, suffer another phase transition to a less energetic vacuum, resulting in new expanding universes, and so on, in a nested process. This is, roughly speaking, the idea of "bubble nucleation" that I have mentioned sometimes on this blog without much explanation.

Ideally those universes would be disconnected and couldn't collide. That paper on arxiv investigates the opposite possibility.

One consequence of eternal inflation is the possibility that there would be an infinite number of universes. Taking into account that there are a limited number of configurations, this results in every event, and its variants, happening an infinite number of times. It is, for practical purposes, a variant of Everett's "many worlds" interpretation of quantum mechanics.

Well, if in some way one could use those infinite world variants, amazing things could happen. In fact some people seriously propose that quantum computers do basically that. They would be many copies of the same computer, but in different universes, each one doing an independent calculation. Some people even try to relate the quantum multiverse and eternal inflation as a single mechanism.

All this is very SF-like. I remember the film "The One", with the martial arts star Jet Li. There the protagonist kills his copies in the other universes and receives their strength and knowledge.

If in some way I could use those many-worlds doubles, I could read in detail this paper about a new proposal of unified theories, somewhat in the line of everybody's favourite surfer physicist. A superficial reading seems to suggest that the proposal is quite satisfactory, but I am skeptical about it.

Also I could read some other interesting papers on arxiv today, and advance faster with my own ideas. And I could write the article, in Spanish, about F-theory. In fact I am reading some further papers about F-theory and I'll wait a little bit before writing that entry.

Besides reading physics/maths I could dedicate more time to practicing piano, to doing some amateur composition of electronic music, to practicing martial arts more hours a week, and to other sports activities. And I would still have time to read more SF books, go to the cinema, etc, etc. But lacking that technology, a more likely possibility would be to win some lottery (Euromillions or similar) so that I wouldn't have to waste time in "economic activities" (although fortunately these sometimes overlap with my research/academic activities).

But even in the worst of cases I guess that I am going to have more time to dedicate to the blogs, and I think that there is a good bunch of interesting themes to write about.

Thursday, August 13, 2009

Recently on arxiv

Summertime, holidays, nothing important going on, right?

Maybe, but in my opinion some recent papers on arxiv are interesting.

I am going to begin with CYBERsusy. The name may suggest some cyberpunk heroine from an SF novel. But no. It is an acronym for 'CohomologicallY Broken Effective Retro SUperSYmmetry'. The article is titled A new mechanism for supersymmetry breaking in the Supersymmetric Standard Model.

Breaking supersymmetry is a good thing and a new mechanism to do so sounds important. If, moreover, the abstract ends with "The theory also leads to a zero cosmological constant after SUSY breaking." one could begin to feel seriously interested. Being so, it was a bit amazing to see how it passed unnoticed in the "big guys" blogs (you know who they are ;-) ). So I investigated a bit and I found this old entry by Lubos: Dixon law firm: CyberSUSY.

I also found that there is a blog maintained by John Dixon on the topic: John Dixon: Cohomology, supersymmetry and cybersusy

Well, the Lubos article refers to an older version of the theory; maybe the new one has corrected the flaws, but I guess not by much.

Another paper that I have found intriguing is related to the cosmological constant and the expanding universe. It is this one, on arxiv: "Does Unruh radiation accelerate the universe? A novel approach to the cosmic acceleration". This is the abstract:

We present a novel mechanism for the present acceleration of the universe. We find that the temperature of the Unruh radiation perceived by the brane is not equal to the inherent temperature (Hawking temperature at the apparent horizon) of the brane universe in the frame of Dvali-Gabadadze-Porrati (DGP) braneworld model. The Unruh radiation perceived by a dust dominated brane is always warmer than the brane measured by the geometric temperature, which naturally induces an energy flow between bulk and brane based on the most sound thermodynamics principles. Through a thorough investigation to the microscopic mechanism of interaction between bulk Unruh radiation and brane matter, we put forward that an energy influx from bulk Unruh radiation to the dust matter on the brane accelerates the universe

Well, these two last papers were authored, in one case, by a physicist discredited by Lubos and, in the other, by a not well known Chinese team, so one can find it reasonable to see no mention of them.

More surprising are the two next papers.

The first one is by Mr. Brian "elegant universe" Greene, coauthored by Daniel Kabat and Stefanos Marnerides. The title sounds interesting: "Dynamical Decompactification and Three Large Dimensions". The actual paper is this and this is the abstract:

We study string gas dynamics in the early universe and seek to realize the Brandenberger-Vafa mechanism (a goal that has eluded earlier works) that singles out three or fewer spatial dimensions as the number which grow large cosmologically. To this end, we consider a dilute gas of strings on a large torus, so that strings typically interact at significant impact parameters. A strong exponential suppression in the interaction rates for d > 3 spatial dimensions reflects the classical argument that string worldsheets generically only intersect in four or fewer spacetime dimensions. As a consequence of this suppression, a scan over initial conditions establishes that in the dilute regime decompactification of d = 3 spatial dimensions is favored over d > 3.

Nowadays the trend seems to be bottom-up approaches, where one considers the standard model as an input and looks for stringy constructions to implement it in a coherent way (this goes for intersecting brane worlds and for F-theory GUTs). But I think it is still very interesting to see how "natural" it is to have a world with three macroscopic dimensions, and this paper seems a good approach to answering that question.

There is another paper authored by two very well known string theorists, Joseph Polchinski and Eva Silverstein. It is this: Dual Purpose Landscaping Tools: Small Extra Dimensions in AdS/CFT.

Because my knowledge of AdS/CFT is still limited to its basics (the chapters in the Becker-Becker-Schwarz and Clifford Johnson books plus a not too successful reading of a few reviews) I can't say too much about the relevance of the paper. But still it sounds important and I am surprised not to have read about it in the you-know-who blogs. Maybe it is just a question of time, who knows.

The last paper I'll link is this: The Search for a Realistic String Model at LHC. The title is self-explanatory. The authors are well known people: James A. Maxin, Van E. Mayes, D.V. Nanopoulos (well, actually the last one is well known for sure).

It is about the construction of a realistic intersecting D-brane model. At first sight one could think that with the rise of the F-theory GUT models the interest in this kind of phenomenological approach would have become somewhat outdated, but seemingly that is not the case. Possibly one of the reasons is that there is a lot of literature on the subject of seeking signatures of this kind of models at the LHC. You can find many of the articles by doing a search for "string hunters" on arxiv hep-th. As you can see, there was recently a paper on that subject, "The LHC string hunters companion II". I am actually reading the paper that corresponds to the first part. It is very illustrative of how the Randall-Sundrum scenarios are actually achieved in full string compactifications through branes wrapped in "Swiss cheese" Calabi-Yau manifolds. And it contains a lot of formulae for cross sections. I think it can be useful to me if, after all, my idea about a tired light mechanism to explain the observations interpreted as accelerated universe expansion makes any sense. Independently of that, they are possibly the "state of the art" in string phenomenology outside of the F-theory GUTs and (M-)heterotic models. Possibly I'll make a dedicated post about these intersecting brane models some day. But for now I think that the reader already has a lot of papers to search for.

Tuesday, August 04, 2009

TGD in vixra

At last the pdf articles of Matti Pitkanen about topological geometrodynamics have been uploaded to vixra.

If someone wants to begin, he should read Topological Geometrodynamics: Overview. Be advised: despite the word "overview" in the title, that pdf has 1000+ pages.

There are many other papers that the reader can find by himself in vixra.

I am aware that TGD is considered crackpot. For example Lubos said in a recent post (not specifically about TGD) that TGD only existed in Matti Pitkanen's imagination. Maybe, but I think that Matti is a nice, smart guy, with correct behaviour, and he deserves that someone with an accredited academic position who claims that TGD is crackpot do a proper debunking of it (of a reasonable part of it; it is not necessary to debunk the whole 1000+ pages).

But surely I am not the right person to do such a debunking. Instead I'll debunk, in a separate post, another paper on vixra: The Graviton Background Vs. Dark Energy. The abstract says:

In the model of low-energy quantum gravity by the author, cosmological redshifts are caused by interactions of photons with gravitons. Non-forehead collisions with gravitons will lead to an additional relaxation of any photonic flux. It gives a possibility of another interpretation of supernovae 1a data. Every massive body would be decelerated due to collisions with gravitons that may be connected with the Pioneer 10 anomaly. This mechanism needs graviton pairing and "an atomic structure" of matter for working it. Also an existence of black holes contradicts to the equivalence principle: any black hole should have a gravitational mass to be much bigger - about three orders - than an inertial one.

As the readers of this blog can deduce, the author of this paper is presenting a TLT (tired light theory) as a replacement for the big bang. And they will remember that I had been considering the possibility that a TLT mechanism could be used to give an explanation of the supernovae data that is being interpreted as evidence of an accelerated expansion of the universe. I have developed my ideas a little bit more and I am not sure if they work, but as far as I can see, what that article says does not make much sense. I'll try to explain why in the next post.

Tuesday, July 28, 2009

Easy latex on blogspot

I had said that I wanted an easy way to use latex on blogspot. Until now I was using an external mimetex server, writing an html img tag with the url of the mimetex CGI and the latex code in it. The idea was to create some kind of customized tag that would write the html code. That has two good points. On one hand it makes editing the post easier. On the other, if the mimetex CGI is changed you only need to change the link to the CGI in the script code that is associated with the tag.
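The img-tag approach described above can be sketched in a few lines of javascript. This is only a sketch of the idea, not the exact code I use: the server URL below is a placeholder (point it at whatever mimetex.cgi installation you actually rely on) and the function name is my own invention.

```javascript
// Build an <img> tag whose src asks a mimetex-style CGI to render
// a piece of LaTeX code. The URL is a placeholder, not a real server.
var MIMETEX_URL = "http://example.com/cgi-bin/mimetex.cgi";

function latexImg(latexCode) {
  // The LaTeX source travels in the query string, so it must be
  // URL-encoded (spaces, '=', '^', etc. are not URL-safe).
  return '<img src="' + MIMETEX_URL + '?' +
         encodeURIComponent(latexCode) + '" alt="' + latexCode + '" />';
}
```

The point of keeping MIMETEX_URL in one variable is exactly the second advantage mentioned above: if the CGI moves, only that one line changes.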

In order to do so I have tried to study the blogspot template structure, but I have not found very useful information. Most of what I have found concerns the look of the blog and I am not too interested in that kind of thing. I have tried to see if I could create tags, but I have not found much info. I have made an easy javascript function, but it was not of much help.

Fortunately I have found someone who did the work for me. You can read how to here. As you can read there, once you add a personalized javascript gadget to your blog you can insert your latex code between two dollar symbols and it is replaced by the right latex image when you publish the entry.

The code behind the scenes is this. I have not had time to study it. If you follow the steps indicated in the webpage I linked before, that javascript is called from a remote .js file and you can't customize it. Obviously one could write the whole javascript code in the customized gadget, or create your own copy of the js file wherever you want and customize it.
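For anyone wanting to roll their own version instead of customizing the linked script, the core of such a gadget is just a text substitution: scan the post body for text between pairs of dollar delimiters and replace each group with the corresponding image tag. A minimal sketch, assuming a placeholder CGI URL and my own function names (this is not the code of the linked script):

```javascript
// Replace every $$...$$ group in an HTML string with an <img> tag
// that asks a mimetex-style CGI to render the enclosed LaTeX.
// The CGI URL is a placeholder; substitute your own installation.
var MIMETEX_URL = "http://example.com/cgi-bin/mimetex.cgi";

function renderLatex(html) {
  // Non-greedy match (.+?) so that several formulas in one
  // paragraph are treated as separate groups.
  return html.replace(/\$\$(.+?)\$\$/g, function (match, code) {
    return '<img src="' + MIMETEX_URL + '?' +
           encodeURIComponent(code) + '" alt="' + code + '" />';
  });
}
```

In the real gadget something like this would run once the page loads, over every element holding a post body. Changing the regular expression is also how one would support the wordpress-style `$latex ... $` delimiter mentioned later in this post.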

One bad aspect of the latex, as I use it in this blog, is that the images have some annoying borders, as you can see:


I think that this problem can be avoided with some CSS creating a style for latex images, but that is something that I'll do sometime later. Also I would like to study that javascript to use the wordpress tag for writing latex, $latex the code goes here $, instead of the two dollar symbols, in order to improve the cross-compatibility between blogspot and wordpress. But, anyway, those are minor questions; the hard problem is solved. Now it only remains for me to post about physics and write the equations, of course ;-).

Monday, July 13, 2009

Strings 2009: the slides

This year the annual conference in string theory, held in Rome, has not had a live internet TV broadcast as happened last year.

For that reason I didn't do a post about the topic. I have waited until the slides were out and I could read some of them. Conference slides, if they are detailed enough, are a good thing because they are addressed to non-specialists in that particular field, so they can be easily read, and they condense a great amount of information from various papers.

You can get access to the list of talks, with the corresponding slides, here.

I have read a few of them already. The first was the one given by Horava. I was greatly interested in reading how he defended his theory against the recent papers which showed the problems of renormalizability it actually seems to have, despite being power-counting renormalizable. Well, I didn't see any mention of it. The slides talk about the "foundational" papers on the subject and explain its relation to the M2 brane of M-theory, to the CDT (causal dynamical triangulations) result that at short lengths the effective dimension of spacetime is near 2, and that his theory resembles that, and a few other topics. I find it specially curious that one of the motivations for his theory is that string theory violates Lorentz symmetry. Well, I am not sure why he says that, but certainly, stated without further explanation, it looks weird. It is a pity that there was no live streaming, nor non-live videos, of the talks, so one can't see what questions people asked him.

About the F-theory GUTs there were three talks. One from Vafa. Its ppt (rather than pdf) is very schematic and without some previous knowledge of the subject I am not sure how much information one can get from it. Anyway, if one reads the papers I cited in my post about F-theory for non-experts maybe one could get a much better understanding. Vafa does a decent job explaining the two foundational papers, the paper on cosmology, and the paper on LHC footprints, which I have read. He also talks about some papers I haven't read, as for example the ones on gauge mediation (although I had read some summaries of the results). The conclusions seem to be that there are two clear predictions from their models. One, in cosmology, is that the dark matter candidate is the gravitino. That rules out WIMP models and implies that the ATIC, PAMELA and similar results, which seem to indicate an anomalous ratio of positrons over electrons in certain ranges of energies, would have astrophysical origins. Or not exist at all. Recent results from FERMI/GLAST that seem to contradict ATIC and PAMELA (see, for example, this post by Jester, in the Resonaances blog) would agree with this prediction.

The other prediction mentioned in the slides is that there will be some charged track in the LHC leaving the detector. It would be due to the NLSP, whose lifetime, 10^(1-4) secs, is long enough to allow it to escape from the detector.

There are two more talks about F-theory. One by Sakura Schafer-Nameki. I have read it, but from all the part related to spectral covers I couldn't get any useful information; I simply don't know enough about that mathematical topic. The other talk on F-theory is the one by Jonathan Heckman. It is centred on flavor hierarchies for quarks and leptons. Well, an interesting topic for sure, but not my favourite one. Anyway, the slides are good enough to get some general idea of the topic from them.

Another talk I read is the one by Strominger about the Kerr/CFT correspondence. On that topic I had only read a paper dated from last summer. Well, I am not sure if too much progress has been achieved so far, nor is it clear to me whether the whole field is terribly significant, but possibly that is my fault.

Possibly the most awaited talk was the one from Nima Arkani-Hamed about twistors and the S-matrix. There is rumourology out there saying that it's not a paper in string theory but an attempt to create some kind of supersymmetric GUT different from string theory. I still haven't read the slides and I can't say anything about it. But for sure it is a theory that many people will discuss sooner or later, possibly when the actual paper on the subject comes out.

I'll possibly read more slides later, but I am not sure if I will post about them. But everybody can try to read the linked slides by themselves. There are good chances that anyone with a decent grounding in high energy physics could get some amount of info from them.

UPDATE: In a thread in Physics Forums someone, seemingly well informed, said that Horava actually recognized, in his talk at Strings 2009, the problems recently found in his theory. Also, the same Physics Forums poster explained that the actual problem was that one couldn't decouple the ghosts from the theory. Curiously, that has led to a possible reinterpretation of those ghosts as dark matter. I have not read the relevant papers, but at first sight that looks very bizarre. Ghosts are negative-norm states that usually appear in the quantization of gauge theories as intermediate states that can be shown not to appear in external legs, i.e., they are not observable. To claim that the usually unwanted negative-norm states can go in external lines and actually represent viable particles (in the form of dark matter) seems like something one could try to do for any theory, and then one wouldn't need gauge theories. I suppose there will be something special about those ghosts that makes them different from the usual ones and permits people to make such conjectures but, as I said, it looks like an a priori contrived claim.

P.S. I am looking for an easier way to use LaTeX in this blog than the one I am using (writing the latex code in the url of an image generated by an external LaTeX server). If I don't find a good solution I will seriously consider the option of migrating this blog to wordpress, where writing LaTeX is "natively" supported (that's the reason I make extensive use of it in my other blog).

Thursday, July 09, 2009

Vixra, the arxiv mirror symmetric

In the blog of Kea (Marni Dee Sheppeard) there have recently been a few entries about the freedom to publish scientific results.

As a result Tommaso Dorigo suggested a bizarre idea to her. Through an exchange of comments it turned into another idea: the birth of a new archive for scientific publication. In a really fast move a new domain was registered and the site is already available. The name of the new site is arxiv written in the opposite direction, that is, vixra, which, with some minor licence, can be considered as a mirror symmetric of arxiv. The actual link for the website is vixra.org. Note that at the date of writing this it is in a very beta status.

I leave here the manifesto that justifies its creation and its purpose, as declared by the creator:

Why viXra?
In 1991 the electronic e-print archive, now known as arXiv.org, was founded at Los Alamos National Laboratories. In the early days of the World Wide Web it was open to submissions from all scientific researchers, but gradually a policy of moderation was employed to block articles that the administrators considered unsuitable. In 2004 this was replaced by a system of endorsements to reduce the workload and place responsibility of moderation on the endorsers. The stated intention was to permit anybody from the scientific community to continue contributing. However many of us who had successfully submitted e-prints before then found that we were no longer able to. Even those with doctorates in physics and long histories of publication in scientific journals can no longer contribute to the arXiv unless they can find an endorser in a suitable research institution.

The policies of Cornell University who now control the arXiv are so strict that even when someone succeeds in finding an endorser their e-print may still be rejected or moved to the "physics" category of the arXiv where it is likely to get less attention. Those who endorse articles that Cornell find unsuitable are under threat of losing their right to endorse or even their own ability to submit e-prints. Given the harm this might cause to their careers it is no surprise that endorsers are very conservative when considering articles from people they do not know. These policies are defended on the arXiv's endorsement help page

A few of the cases where people have been blocked from submitting to the arXiv have been detailed on the Archive Freedom website, but as time has gone by it has become clear that Cornell have no plans to bow to pressure and change their policies. Some of us now feel that the time has come to start an alternative archive which will be open to the whole scientific community. That is why viXra has been created. viXra will be open to anybody for both reading and submitting articles. We will not prevent anybody from submitting and will only reject articles in extreme cases of abuse, e.g. where the work may be vulgar, libellous, plagiarious or dangerously misleading.

It is inevitable that viXra will therefore contain e-prints that many scientists will consider clearly wrong and unscientific. However, it will also be a repository for new ideas that the scientific establishment is not currently willing to consider. Other perfectly conventional e-prints will be found here simply because the authors were not able to find a suitable endorser for the arXiv or because they prefer a more open system. It is our belief that anybody who considers themselves to have done scientific work should have the right to place it in an archive in order to communicate the idea to a wide public. They should also be allowed to stake their claim of priority in case the idea is recognised as important in the future.

Many scientists argue that if arXiv.org had such an open policy then it would be filled with unscientific papers that waste people's time. There are problems with that argument. Firstly there are already a high number of submissions that do get into the archive which many people consider to be rubbish, but they don't agree on which ones they are. If you removed them all, the arXiv would be left with only safe papers of very limited interest. Instead of complaining about the papers they don't like, researchers need to find other ways of selecting the papers of interest to them. arXiv.org could help by providing technology to help people filter the article lists they browse.

It is also often said that the exclusion policies don't matter because if an amateur scientist were to make a great discovery, it would certainly be noticed and recognised. There are two reasons why this argument is wrong and unhelpful. Firstly, many amateur scientists are just trying to do ordinary science. They do not have to make the next great paradigm shift in science before their work can be useful. Secondly, the best new ideas do not follow from conventional research and it may take several years before their importance can be appreciated. If such a discovery cannot be put in a permanent archive it will be overlooked to the detriment of both the author and the scientific community.

Another argument is that anybody can submit their work to a journal where it will get an impartial review. The truth is that most journals are now more concerned with the commercial value of their impact factor than with the advance of science. Papers submitted by anyone without a good affiliation to a research institution find it very difficult to get published. Their work is often returned with an unhelpful note saying that it will not be passed on for review because it does not meet the criteria of the journal.

In part viXra.org is a parody of arXiv.org to highlight Cornell University's unacceptable censorship policy. It is also an experiment to see what kind of scientific work is being excluded by the arXiv. But most of all it is a serious and permanent e-print archive for scientific work. Unlike arXiv.org it is truly open to scientists from all walks of life. You can support this project by submitting your articles now.

What do I think of this? Well, there is a famous phrase by Richard Feynman about physics (valid for science in general) and its role as a practical discipline:

"Physics is like sex. It can have practical consequences sometimes but that is not the reason we do it".

Well, that's the idea. And publishing would be part of the fun. But seemingly publishing (as well as other parts of a scientific career) has become a game where many factors outside the pure scientific content play a role at least as important as the quality of the papers. Still worse, it is not very clear what the rules of that game are. That turns publishing into a very risky business, and an error can ban one from arxiv (the papers that people actually read; peer-reviewed journals have become invisible). In fact I personally think that I could find some endorser for foreseeable future papers. But in the actual state of the subject it is too much pressure for both me and the endorser.

For that reason an alternative to arxiv is a good option. One can publish ideas and exchange them with other people. The concept of exchange is important. There are some kinds of papers where one can have, or almost have, the security that they are right. But there are others that are subject to many uncertainties. And, possibly, one can solve only a limited number of the difficulties one faces. Possibly, if one has around him people working on that field, he could discuss those ideas privately, but that is not always possible (even if you are in an academic position). In that sense one can publish ideas in a preliminary state of development, ideas you are not sure you can pursue further, in the hope that they could be useful. That's the idea of scientific exchange as far as I see it. And if one is wrong, well, that's always a possibility. Of course one should do the usual homework and search as much as possible for similar ideas before publishing rubbish results. Definitively, it is good to publish that kind of papers in a site where, if someone is wrong, he (and his family, friends and cat) doesn't get banned for the rest of his life, I would say.

Still better, as far as I see, both archives wouldn't be mutually exclusive. One could publish "serious" papers in arxiv and riskier ones in vixra. Well, at least in theory. Surely someone will find good reasons to find incompatibilities between them ;-).

Wednesday, July 08, 2009

F-theory GUT for non experts

I have found a few papers that do a good job explaining the basics of F-theory in a relatively easy way. I could have posted them as an update of the previous post on the subject, but I think they deserve a small separate post.

One paper is: F-theory, GUTs and Chiral Matter.

Another one, written by Heckman and Vafa, is: From F-theory GUTs to the LHC

Also, I think that the interested reader should try to understand more basic settings, previous to the F-theory revolution. I am talking about the intersecting brane scenarios. A short and good review is: Progress in D-brane model building.

The reason to investigate the last paper is that I find it interesting to understand how one calculates family numbers, how chiral fermions arise, and so on, in more conventional D-brane models. In fact the first paper I cite does a good job explaining some of those aspects, but still.

Also I recommend, once again, the original paper by Ibáñez, Quevedo et al. on local models, D-Branes at Singularities: A Bottom-Up Approach to the String Embedding of the Standard Model. I have finished reading it and I find it very clear. As a plus, it also has a brief chapter about F-theory.

Certainly the last papers about D-brane model building are not required to understand the F-theory ones, but it is good to understand what existed previously to better appreciate the goodness of the new. In that sense the papers recommended in my entry about the prehistory of F-theory GUTs are also valuable and focus on different aspects than the ones cited here.

Anyway, if someone only wants a quick, but accurate, idea of the subject, the two papers cited at the start of the post do a wonderful job.

Friday, June 26, 2009

Trantorian physics

Trantor is a fictional planet presented in Isaac Asimov's Foundation series of books. It is the centre of a galactic empire.

In that universe the king of sciences is psychohistory. That name refers to a mathematical model of human societies with detailed predictive power. Physics has become an obsolete discipline that died of success a long time ago. Supposedly it had answered all the basic questions and no new important discovery had been made for hundreds of years.

But still there were some physicists. The problem with them was that the lack of new experimental results had produced a vicious system where the quality of a particular physicist depended on his knowledge of past achievements and, maybe, on his ability to reinterpret them in new, basically irrelevant, ways that didn't lead to new discoveries.

Well, that is fiction. But sometimes actual physics somewhat resembles those Trantorian physicists. Lots of people like to blame string theory for that, but I don't agree at all; it is a problem common to all the alternatives.

I mean, what actual observable predictions do the alternative theories make?

LQG's great achievement was the frequency-dependent speed of light. In fact Liouville strings also predicted that. Well, FERMI/GLAST has almost ruled out that possibility (although there is some discrepancy in the interpretation of the results depending on who writes about them; for example Lubos Motl and Sabine Hossenfelder disagree, as always).

Horava's gravity, being as a classical theory slightly different from Einstein's gravity, makes predictions not too hard to measure. But after the initial explosion of papers it has somewhat stopped now, due to some papers that posed serious doubts about its goodness as a quantum theory, despite being power-counting renormalizable. It would have been nice to see how it was received at the currently ongoing Strings 2009 conference, but this year there is no live broadcast nor, at least until now, anybody blogging about it.

Noncommutative theories are also almost dead, although they had their time of glory (though today, after many months, there is a paper on arxiv on the subject). There are two types of NC theories: field-theoretic ones, and geometric ones. The first are inspired by string theory. The latter are mainly geometric and were promoted by the mathematician Alain Connes. They made a firm prediction, a value for the Higgs mass, that was ruled out (at least in its original form; I am not sure whether some modifications have been suggested) last year by measurements at the Tevatron.

So, basically, we have that despite many theoretical efforts in many different approaches to basic physics (i.e., particle physics) we have nothing new experimentally confirmed since the formulation of the standard model, in the late sixties and early seventies of the past century. The only new result was the confirmation that neutrinos have a small mass. The other experimental news comes from cosmology and, as I said in previous posts, is not as firm as laboratory experiments.

Is this a problem of theoretical physicists? I hardly think so. String theory is a very rich framework. Different aspects of it are actually promising candidates for phenomenology. For example, the mesoscopic extra dimensions suggested by Arkani-Hamed et al. in the late nineties were a very original idea that has led to cheap experiments which have put new bounds on the size of those dimensions. LQG, as said, made a good prediction (shared by most Lorentz-violating theories), and LQC is trying to make observable predictions about cosmology; maybe not rigorous ones, but if they were observed nobody would care too much about that ;).

The big problem I see is not related to theory but to experiments, and especially to collider experiments. The USA cancelled funding for a new linear accelerator in the nineties. The LHC schedule has seen almost five years of delay (that is, if it finally begins to operate in September, as expected). The Tevatron has done its best, going beyond expectations. It has shown that QCD at high temperatures behaves not as a quantum gas (as expected) but as a quantum liquid. That doesn't mean new fundamental physics, but at least it gives clues about properties of QCD that are very hard to study mathematically and computationally. And, hey, it has ruled out NCG ;-). There is even some possibility that a careful analysis of the collected data will find the Higgs boson. Not bad for a recycled collider.

If there is no serious money invested in experiments, researchers are going to spend their time on increasingly contorted theories. Internal coherence is a good guide, but it is not clear that that alleged coherence is as free of doubts as some people try to present it. That goes for LQG and for string theory (and the alternatives). Again, that is not a reason to go against string theory (or the alternatives; well, some of the alternatives are theoretically unlikely, but still). The ultimate justification of the theoretical developments is that they are made in search of compatibility with known physics while also guessing at new phenomenology. What is seriously needed is for experiments to be made. The LHC is, hopefully, about to operate, but there is no serious project for the post-LHC era.

Maybe some people think there is no good reason to spend a lot of money on such expensive experiments, especially not in the current economic crisis. In my opinion that is a narrow-minded vision. Certainly other areas of physics are giving interesting results (solid state/condensed matter and the wide area known as nanotechnology), but they are based on very old basic physics. It is necessary to pursue the development of new physics. For example, one very important problem that society needs to face is the energy supply. There are discrepancies about how much fossil fuel (especially at cheap prices) remains. In fact that depends heavily on the growth of demand. But sooner or later (and most likely sooner) it will run out. The "ecological" alternatives (solar energy, wind, etc.) are mostly propagandistic solutions. Nuclear energy has better chances, but it depends on a limited resource, uranium. Certainly there are proposals for fast breeder reactors that could create fissile elements, but they are somewhat experimental; it is an open question whether they will operate as expected. The other alternative, nuclear fusion, is fine. But again, governments are not spending enough money on it (as the fate of ITER clearly shows).

The thing is that when we are looking for energy sources, the best thing we can do is understand how the universe behaves at high energies. If one looks at the way "energy sources" work, one sees a common pattern. One has a two-state system separated by a barrier, where the difference in energy between the two states is greater than the energy of the barrier. If one supplies the system with enough energy to go over the barrier, then when the system falls to the lower-energy state it returns more energy than the energy employed. That is the way chemical fuels work, and also the way nuclear fission and fusion work. Nuclear processes involve higher energies and so they also return more energy (well, in fact it could be otherwise, but that would be very unnatural).
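The barrier pattern above can be sketched as a trivial bit of bookkeeping. This is just a minimal illustration; all numbers are made up for the example, not real reaction data:

```python
# Minimal sketch of the "energy source" pattern described above.
# Illustrative numbers only, not real chemical/nuclear data.

def net_energy(e_high, e_low, e_barrier):
    """Energy returned per event after paying the activation cost.

    e_high:    energy of the initial (metastable) state
    e_low:     energy of the final state
    e_barrier: height of the barrier above the initial state
    """
    released = e_high - e_low   # energy freed by the transition
    invested = e_barrier        # energy spent to climb the barrier
    return released - invested  # positive => usable energy source

# Chemical-like scale (eV per event): small gap, small barrier.
print(net_energy(e_high=2.0, e_low=0.0, e_barrier=0.5))   # 1.5

# Nuclear-like scale (eV per event): gaps of millions of eV,
# so the same pattern returns vastly more energy per event.
print(net_energy(e_high=5e6, e_low=0.0, e_barrier=1e5))   # 4900000.0
```

The point of the toy numbers is only that the pattern is scale-free: whatever metastable systems exist at higher energies, the same accounting would apply to them.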

Well, if we go to higher energies one expects that, somewhere, there will be systems sharing that property (a good name for it would be metastability). For example, in some supersymmetric models there is, if R-symmetry is present, a lightest supersymmetric partner, the LSP, which is stable and a candidate for dark matter. And there is also the possibility of an NLSP (next-to-lightest supersymmetric partner) that would be metastable. Well, that is the kind of thing we like. One would expect a big energy difference between them. If they are found, and a way is discovered to force the decay of the NLSP into the LSP, we would have an energy source. Moreover, dark matter represents what, 75%, 90% of the mass of the universe? That could mean there is a lot of it out there. One could argue that if we are not able to do nuclear fusion using known elements, we could hardly develop a technology to extract energy from something that is still hypothetical. But the truth is that we don't know. Maybe it is a lot easier to extract energy from dark matter (be it (N)LSP, WIMPs or whatever) than from known sources.

Still, there are other possibilities. There is a small possibility that if the LHC creates black holes it could also create wormholes. Wormholes (Lorentzian ones) have received a lot of attention in SF as a tool for interstellar travel or even as time machines. But there are other interesting uses for them, if they actually exist. If one mouth of the wormhole is placed in a very energetic environment, it could channel that energy directly onto the other mouth. For example, one could put one mouth deep inside the Earth and the other on the surface. That would be a good way to extract geothermal energy. Of course one could think it far more likely that conventional means would be used to get that energy, but it might not be so. Another very energetic environment would be the Sun. It is not totally clear how much energy is required to create a wormhole, but one would expect that if the external distance between the mouths grows, so does the required energy. But again, it might not be so. Still, there is a problem in using the Sun: the gravitational interaction. The gravitational field of the Sun would be transferred together with the light, and it could alter the Earth's orbit.

There is a more interesting possibility for wormholes (or maybe we should call them warmholes, or not, depending on how much one worries about double meanings of words xD). If they are created at the LHC, that would probably mean that the reason behind it is that mesoscopic extra dimensions exist. In string theory there are various ways to realize those scenarios. A common feature of many of them is that they imply we live on a three-dimensional (or effectively three-dimensional) brane. But additional branes could exist. It could be that some of them have a high background energy. And it also could be that they are not too far away in those additional dimensions. Actually they could be so near that it wouldn't be improbable for a wormhole to be created with one mouth inside that hot brane and the other in ours. Better still, the scenarios with mesoscopic extra dimensions offer good possibilities for wormholes becoming stable. That would raise the possibility of using those wormholes to extract energy from those hot branes. Depending on the details, they could be a means to solve all the energy requirements of humankind at a level that exceeds all current expectations.

All the hypothetical energy sources that I have presented are related to situations likely in string theory. Alternative theories may also offer options. For example, black holes in alternative theories might not evaporate completely, and one could use the remnants to extract energy from them in a Penrose-like process. A serious problem with this is that without mesoscopic dimensions there is no way to create black holes at the LHC, so we wouldn't have remnants either.

By the way, black hole physics is a very good example of Trantorian physics, especially the black hole interior. The gravity/LQG community has a widely accepted viewpoint on it in which the radius behaves as a time coordinate. Well, in string theory there are very different proposals, none of them too friendly with that LQG viewpoint. Also, string theory strongly supports the complementarity principle. Well, some people in LQG don't even know of its existence (or at least didn't until they published a paper that was incompatible with that principle). My problem with this is that we don't have a nearby black hole on which to do experimental tests. In fact, even if we created them at the LHC, it is not clear that we could make experimental tests of black hole interiors. Neither is it too clear how those black hole interiors have any consequence for the behaviour of the event horizon. Well, if naked singularities are allowed the situation would improve, but then they wouldn't be black holes ;-).

Well, certainly in this post, apart from some sociological considerations, I have presented very speculative ideas with too few details about them. Maybe that is what top-notch physicists do on Trantor. Not being there, I hope to present more Earth-based physics in coming entries ;-).

By the way, if one is absolutely serious about it, many proposals for alternative "ecological" energy sources are actually less likely to be good alternatives to oil than the ones I have proposed here. They look otherwise because they are based on things that laymen think they understand, but if one goes into the details of the implied physics, one really hopes that wormholes actually exist xD.

Thursday, June 04, 2009

String theory is good for...phenomenology of particle physics

Yesterday the number of visits to this blog saw a major increase. Most of the traffic came from this post on Migui's web/blog.

The post was a Spanish translation of an article in New Scientist about the good points of string theory. I had seen a discussion of that article on Lubos's blog, concretely here.

Well, that article comes to say that string theory is nowadays a good theory because its mathematical structure, through the AdS/CFT correspondence, is useful in QCD and condensed matter physics. Well, I don't know too much about those applications, but if the experts in those subjects say so, that is a good sign.

But, actually, I don't think that image is quite right nowadays. Readers of this blog know that I have paid attention to many alternative theories. Some of the proponents of those theories make claims against string theory. Others who don't actually offer any theory, that is, Peter Woit, claim that string theory "makes no predictions". In his blog he usually brings attention mostly to the most speculative articles written by string theorists.

Well, I am following, as much as I can, the current F-theory mini-revolution. Doing so I have become very surprised, and impressed, by how close string theory has come to actual physics. Before going into it I must say that I somewhat understand the sceptics about string theory. If one reads the books on the subject one certainly gets the impression that actual predictions are far away. For example, the 1999 (that is, not too old) book of Michio Kaku, Introduction to Superstrings and M-Theory, in its chapters about phenomenology shows the results of heterotic compactifications. In those results the best one could get was the standard model plus some additional U(1) factors. It was also stated that achieving the right number of generations, given by n = χ(CY)/2, that is, one half of the Euler characteristic of the Calabi-Yau manifold, was difficult (if not almost impossible).
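The generation counting just mentioned can be made concrete with a couple of lines. For a Calabi-Yau threefold the Euler characteristic follows from the Hodge numbers as χ = 2(h¹¹ − h²¹), and in the classic heterotic embedding the net number of generations is |χ|/2; the quintic numbers below are the standard textbook values:

```python
# Generation counting in heterotic compactifications.
# chi = 2 * (h11 - h21) for a Calabi-Yau threefold; net generations = |chi| / 2.

def euler_characteristic(h11, h21):
    return 2 * (h11 - h21)

def net_generations(h11, h21):
    return abs(euler_characteristic(h11, h21)) // 2

# The quintic in P^4: h11 = 1, h21 = 101 -> chi = -200 -> 100 generations,
# far from the 3 observed; this is the difficulty Kaku's book refers to.
print(net_generations(1, 101))  # 100

# A manifold with chi = +-6 would give the desired 3 generations.
print(net_generations(4, 7))    # 3
```

This is why finding manifolds with |χ| = 6 was such a celebrated (and hard) goal of the early compactification programme.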

Other books, such as Polchinski's two-volume book and Clifford Johnson's "D-branes", don't say much about realistic compactifications. There are good reasons for that: those books are mostly concerned with the D-brane revolutions and their consequences, the black hole entropy calculation and the AdS/CFT conjecture. The more recent book of Becker-Becker-Schwarz covers compactifications in more depth. But, with good judgement, it cares more about technical issues such as the moduli space of the compactification, mirror symmetry between type IIA and type IIB, and flux compactifications, which are relevant for the very important issue of moduli stabilization and the KKLT-like models (related to the landscape). And, of course, they all give introductions to dualities, M-theory and, to a lesser extent, F-theory.

In fact all of those are important technical aspects, and it takes time to learn them (one must read some articles if one really wants to properly understand certain aspects). But one gets the impression that everything is still too far from LHC physics and testable cosmological predictions. In fact there is a very recent book by Michael Dine which goes into phenomenology, titled "Supersymmetry and String Theory". I must say that I find that book somewhat of a failure. It is too brief in covering subjects that, even with some previous knowledge, are hard to appreciate properly.

Well, in short: a lot of textbooks and no clear signal of actual testable physics. Certainly discouraging. Popular-science books are not too different. That, certainly, can explain why some people have the impression that string theory is far from its objectives. Blogs by string theorists try to say, to whoever listens to them, that string theory is "the only game in town". In fact there are not many string theory blogs with a decent publication rate. However, I too had the idea that string theory was far from phenomenology, and I had not pursued that topic too much.

The F-theory mini-revolution has changed that. I have at last read the two big pioneering papers of Vafa (arXiv:0802.3391v1 and arXiv:0806.0102v1), and almost completed the reading of the F-theory GUTs cosmology paper (arXiv:0812.3155v1). I have also made partial readings of some subsequent papers, and of a few previous papers needed to understand the formalism developed.

They are certainly hard papers to understand. But once one gets familiar with them one sees what kind of physics is discussed. The first thing to say is that one needs to know the details of GUTs and symmetry (and supersymmetry) breaking. F-theory local models, with the right decoupling from gravity, can give an SU(5) model without any exotics. They offer their own way to break SU(5) into the MSSM, through a U(1) hypercharge flux. That mechanism avoids some of the problems present in purely field-theoretic models. In particular they can avoid problems with the observed lifetime of the proton. Later papers get values in the CKM matrix that are good for obtaining the observed baryon asymmetry of the universe. They offer ways to avoid the doublet-triplet splitting problem of GUTs (that is, requiring the existence of Higgs doublets (1, 2)±1/2 necessarily also leads to colour triplets; however, there exist rather uncomfortable lower bounds on the mass of those triplets). They offer a natural way to get small neutrino masses. In cosmology, through a late decay of the saxion (whose lifetime is predicted, that is, properly bounded, by the theory), they can avoid some of the problems that symmetry breaking brings to cosmology (the gravitino problem) and give the right way to obtain reheating after an inflationary phase, plus some extra things that I haven't finished reading about.

As you can see, these models are quite near cutting-edge phenomenology. They offer solutions to problems not available in other approaches. And F-theory is not alone: seemingly M-theory is also going into the local models + gravity decoupling business; see for example the paper "Hitchin's Equations and M-Theory Phenomenology" by Tony Pantev and Martijn Wijnholt.

As I said, I hadn't previously followed phenomenology with much attention. But, in fact, more traditional approaches have also made advances. For example, this 2008 short review article on heterotic compactifications, "From strings to the MSSM", also addresses some of the previously mentioned aspects.

Another very recent paper, "Towards Realistic String Vacua From Branes At Singularities", by Joseph P. Conlon, Anshuman Maharana and Fernando Quevedo, uses the D-brane approach to phenomenology, not related to the gravity-decoupling approach. It offers the bonus of moduli stabilization (something more common in cosmological models). In the abstract they conclude by saying: "We propose that such a gauge boson could be responsible for the ghost muon anomaly recently found at the Tevatron's CDF detector". Well, there are some serious doubts about the real existence of those anomalies (see Tommaso Dorigo's blog, linked on this web site, and search for discussions of that topic).

Well, certainly there is quite a bunch of models inspired by string theory, and not all of them (if any) can be true at once. Also, not all models make firm predictions. But the point is that they are actually reproducing the MSSM, supersymmetric GUT models, and mechanisms that enhance the purely particle-physics models. Also, in cosmology there are many different points where string theory is enhancing purely field-theoretic models.

But, as I see it, string theory is actually dictating the construction of (at least some of) the models that are going to be checked in the near future. Also, one must not forget the RS models, inspired by string theory, in which one could get black holes at the LHC (those models are possibly not compatible with F-theory GUTs).

With all of this I think that string theory is doing exactly what one would expect from a fundamental theory of physics, in the way such theories have traditionally been made. Certainly I am talking about very, very recent developments, most of them from this year and the previous one. But, anyway, it looks as if string theory is definitively "landing" in experimental physics, which is what was expected of it. And it is still making progress in clarifying its theoretical aspects and the description of black holes (a topic not too easy to study in the laboratory, except if the LHC produces black holes, that is).

I am not at all a radical, and I understand if some people want to keep pursuing alternative approaches. The point of this post is to say that, as far as I can see, the "not even wrong" criticism of string theory doesn't make much sense nowadays.

And please remember that I am not in a faculty position getting money from doing research in string theory. I have no economic, doctrinal or political reason to favour one theory or another. It is just that, according to what I know right now, string theory seems a perfectly good theory for doing high-energy physics, and I have tried to explain why.

Monday, June 01, 2009

Quick ideas to become a cosmology atheist

As I said in the other post, and not for the first time, I don't take cosmology too seriously. I find that there are many uncertainties in the observed data and also in the interpretations. Because of that I hadn't bothered to think too much about those questions.

In the last post Kea suggested that I read Louise Riofrio's theory, which turned out to be a version of the VSL (variable speed of light) cosmologies. I have partially read some of her statements, and also done the usual googling about the topic. The first thing one finds is a mention of "Von Riemann space". Well, I have no idea what that is supposed to be. Of course that could be because I am not a specialist in the field, so I googled it and reached a Physics Forums thread where other people also agreed that they didn't know it. Well, there are some other points in her papers whose motivation I don't see clearly. That being so, I can't say much else about the general theory.

Another aspect where she seems to see a point, independent of the general model, favouring her VSL theory is the following argument. In some epoch the Sun, according to the standard model of solar evolution, radiated 75% of the energy that it radiates now. OK; according to that, she claims, the Earth should have been an ice ball, contradicting the fact that there was life on it. VSL solves the problem because, somehow, it implies that the Sun's luminosity should be corrected by the right factor.

Without going into the details, I must say that I find it very unlikely that conventional astronomy wouldn't have considered that possibility before. Also there is another consideration: the Earth is hot by itself. The friction energy that led to its formation is accumulated inside it. In the XIX century there was a controversy between the geologists and a prominent physicist (I don't remember for sure, but I think it was Kelvin). The geological observations dated the antiquity of the Earth to a number of years that was incompatible with its temperature. Using the heat equation and the conventional data for Earth materials, one could see that the Earth would have frozen long before the age estimated by the geologists. Later Sommerfeld said that the reconciliation of the two viewpoints was the presence of radioactive materials inside the Earth. Sommerfeld being such a well-qualified physicist, the argument was accepted as valid without criticism.

Well, in fact if one does the actual calculation it can be shown that the radioactive materials are not enough to account for the heating of the Earth. The reason the Earth is still hot (on the outside) is that the heat equation used by Kelvin was not right: one needs to consider transport phenomena as well, that is, convection. Doing so, it can be shown that the Earth is hot because of the inner heat acquired during its formation.
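Kelvin's conduction-only estimate is easy to reproduce. In the half-space cooling solution of the heat equation, a body starting at temperature T0 develops a surface gradient dT/dz = T0 / sqrt(π·κ·t), which inverts to t = T0² / (π·κ·(dT/dz)²). The input numbers below are illustrative values of the order Kelvin used, not his exact figures:

```python
import math

# Kelvin-style conductive cooling age, t = T0^2 / (pi * kappa * gradient^2),
# from the half-space cooling solution of the heat equation.
# Input numbers are illustrative, of the order Kelvin used.

def kelvin_age_years(t0, kappa, gradient):
    """Cooling age (years) from initial temperature t0 (K), thermal
    diffusivity kappa (m^2/s) and measured surface gradient (K/m)."""
    seconds = t0**2 / (math.pi * kappa * gradient**2)
    return seconds / (365.25 * 24 * 3600)

# T0 ~ 2000 K, kappa ~ 1e-6 m^2/s, surface gradient ~ 25 K per km.
age = kelvin_age_years(2000.0, 1e-6, 0.025)
print(f"{age / 1e6:.0f} million years")  # a few tens of millions of years
```

With these numbers one gets roughly 65 million years, far below the geological estimates, which is exactly the conflict described above; convection (and the heat budget of the interior) is what the pure conduction model leaves out.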

That being so, I am not sure how much sense Riofrio's argument makes. Also, I find that history interesting because it shows explicitly how cautious one must be with arguments not based on observations made under laboratory-controlled conditions. There are simply too many uncertainties.

Well, it so happens that this month the Spanish edition of Scientific American has an article in which the cosmological arguments leading to the cosmological constant are revisited. Being a popular article, that is, easy to read, I read it (it didn't take too much time). The idea is that the observational reason why we believe the expansion of the universe is accelerating is that the light of distant supernovae arrives to us with less intensity than would be expected from its redshift if the universe were undergoing a decelerating FRW (Friedmann-Robertson-Walker) expansion. In the article they offer an alternative explanation. They say that if we were in a particularly empty region of space-time, the local deceleration of the universe would be slower here than at distant points (for example the points near the observed supernovae). That contradicts the Copernican principle, which says that we are not at a special place in space-time. But that can be circumvented in a natural way. If in the early universe there were a random distribution of density inhomogeneities respecting that principle, the evolution would make the less dense parts increase their size by a greater factor than the denser ones. In that way it would be more probable for us to be in a relatively empty region of the universe. The last part of the argument is very similar to the nucleation mechanism that Susskind used to explain the cosmological constant (but there are also differences, of course).
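The supernova argument can be quantified with the standard low-redshift expansion of the luminosity distance, d_L ≈ (c/H0)·(z + (1 − q0)·z²/2), where q0 is the deceleration parameter. A minimal sketch (the q0 values are illustrative: +0.5 for a matter-dominated decelerating universe, a negative value of roughly the observed size for the accelerating one):

```python
# Low-z luminosity distance, d_L ~ (c/H0) * (z + (1 - q0)/2 * z^2).
# Larger d_L at fixed redshift => fainter supernovae => acceleration (q0 < 0).

C_KM_S = 2.998e5  # speed of light, km/s
H0 = 70.0         # Hubble constant, km/s/Mpc (illustrative value)

def lum_distance_mpc(z, q0):
    return (C_KM_S / H0) * (z + 0.5 * (1.0 - q0) * z**2)

z = 0.5
decel = lum_distance_mpc(z, q0=0.5)    # matter-dominated, decelerating
accel = lum_distance_mpc(z, q0=-0.55)  # roughly the observed universe
print(accel > decel)  # True: the accelerating case puts supernovae farther
```

The void explanation in the article amounts to mimicking the effect of a negative q0 locally, without any actual cosmological constant.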

Well, after reading all that, I wondered whether I myself could devise a mechanism to go against the conventional big bang + inflation scenario. Well, indeed I could.

The first thing I did was to think about how reliable the redshift factor is. Certainly the usual idea, that the expansion of the universe generates redshift, is reasonable, and things like BBN (big bang nucleosynthesis) support that observation. But the thing is that maybe there are additional contributions to the redshift. I have not had time to think of detailed mechanisms. One of the first things I thought of is that photons have mass. To be more explicit, they have energy, and energy is a source of gravitation. One traditional way to put maths on that idea is to assign the photon a mass m = hν/c², where ν is its frequency. The metric corresponding to that mass is the Aichelburg-Sexl solution (basically a Lorentz boost to v = c of the Schwarzschild solution). Once one realizes that photons can radiate (and that they indeed should radiate), one can think that they must lose energy. That loss would mean an additional redshift. OK, one could do the calculation in four dimensions, but that's not the whole story. If one thinks of a Randall-Sundrum-like scenario, one could expect additional radiation of energy beyond that corresponding to four dimensions; that is, some energy would be radiated into the bulk. The RS scenario means that there is a range of possible values for the radiated energy that could be used to fit observations (that is, the additional redshifts would mean that the universe could have a smaller size than the expected one, and so avoid the problem of the thermalization of causally disconnected zones, usually solved by inflation or, in other non-standard scenarios, by VSL). It also has another point: the warp factor could be varying in time. That could mean that the radiated energy could have been greater in the past, and that would explain the cosmological constant. This is a particular mechanism for an additional distance-dependent redshift, but surely one could guess many others, I think.
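The effective mass in question is tiny but nonzero; a two-line check with the standard values of h and c:

```python
# Effective gravitating mass of a photon, m = h * nu / c**2, as used in
# the argument above. Constants to four significant figures.

H_PLANCK = 6.626e-34  # Planck constant, J*s
C_LIGHT = 2.998e8     # speed of light, m/s

def photon_mass(frequency_hz):
    """Mass equivalent (kg) of a photon of energy E = h * nu."""
    return H_PLANCK * frequency_hz / C_LIGHT**2

# Green visible light, nu ~ 5.5e14 Hz:
m = photon_mass(5.5e14)
print(f"{m:.2e} kg")  # ~4e-36 kg: tiny, but formally a source of gravity
```

That ~10⁻³⁶ kg scale is what makes any radiation loss, and hence any extra redshift from this mechanism, so extraordinarily small per photon, which is one obvious quantitative hurdle for the idea.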

Well, surely there are drawbacks in the argument. But I spent about an hour thinking about the question, and I think that for the time invested I got a reasonable-sounding argument (certainly there are many imprecisions in the exposed arguments; don't look at them as definitive serious proposals). I have also imagined another possibility, but it sounds less convincing. Well, don't take the idea too seriously (although I don't totally dismiss it as a priori total crap). But the point is that if in such a short time I devised an alternative to the standard scenario, even if it is false, possibly there are many more options out there.

1) Not too surprisingly, my idea about additional redshift for photons was not new. It dates back as far as 1929, made by Fritz Zwicky. See the Wikipedia entry "Tired Light" for additional details. To be honest, my idea has some differences from those proposals. To begin with, I didn't intend to create an alternative to the BBT but to modify it by a small amount. On the other hand, the kinds of mechanisms I had thought of for the tired-light phenomenon were very different from what is exposed in Wikipedia. The main objection I have read there is that there is no observed redshift for photons within our galaxy, and any tired-light mechanism would operate there too, not only on the light from distant galaxies. Maybe I'll do some additional thinking on this topic, to see if the differences in my proposal save anything, but the proposal will very probably fade without success.

Anyway, it was only a quick idea. It is not bad to see that it had been considered before by important people. It also means that the BBT is solid enough to resist elementary attacks. Still, the arguments of the Scientific American article keep making some sense. The same applies to the uncertainties in the nature of the CMB anisotropy explained in the previous entry.

2) For VSL theories you can see this recent post (not the first one he has written) about the topic on Lubos's blog.