Wednesday, December 24, 2008

A Christmas present for string theorists from Matthew Headrick

Today the following paper appeared on the arXiv: A solution manual for Polchinski's "String Theory"

The name says it all. The abstract dispels any possible doubt:

We present detailed solutions to 81 of the 202 problems in J. Polchinski's two-volume textbook "String Theory".

I also find interesting Tommaso Dorigo's announcement in his blog about an upper limit on the mass of the graviton (or at least the mass of the graviton in the context of certain Randall-Sundrum models). The blog entry is this. Of course it will be better to wait for the corresponding arXiv paper, but it certainly looks very interesting.

I have also noticed that Giovanni Amelino-Camelia has recently been publishing a few papers about noncommutative geometry in string theory. Amelino-Camelia is famous for the paper where he presented the idea of doubly special relativity. Ultimately the idea of DSR has turned out to be plagued with problems, but I still think it was a brilliant proposal, so I am interested in reading these new papers. But before that I am reading other things I had pending about more well-established topics in string theory. Anyway, if Amelino-Camelia's papers are interesting I'll try to blog about them once I have read them.

Monday, December 15, 2008

Some recommended papers in string theory and cosmology

Status of Superstring and M-theory

If someone is interested in getting an idea of the status of string theory today, in a form readable for non-specialists, I seriously recommend this paper by John Schwarz: Status of Superstring and M-theory.

It begins with an introduction to the very subject of string theory, explaining its basics. It explains in more detail than usual the connection between the former "hadronic" string and the current unified-field-theory string (and gives a link to another paper by the author, devoted to the beginnings of string theory, which is also worth reading). Later the paper explains the second string revolution and the dualities among the five string theories. It talks about the D-brane revolution, flux compactifications, warped compactification scenarios and the possible role of string theory in cosmology. Obviously, compressing so much material into 13 pages means that it doesn't go very deep into any of the subjects. It can be read as an almost popular-science article (a popular-science article suited for theoretical physicists who are not specialists in string theory, if such a thing still exists at all xD).

The second part of the article is devoted to the actual subject, the review of the status of string theory. Or, to be more concrete, the status of string theory phenomenology. It discusses the most studied scenarios of "compactification" (understood in a general sense). It briefly mentions the early attempts at compactification in perturbative string theory, especially in the heterotic string.

Later he talks about heterotic M-theory, the Horava-Witten model (you can read additional info about it in this entry of my blog). Then he briefly discusses compactifications of (non-heterotic, i.e. type-IIA strong-coupling limit) M-theory on G2 manifolds. Up to here most of the material of the article is written in a form that closely resembles the introductions of the corresponding chapters in his recent string theory book (Becker-Becker-Schwarz). The remaining part is devoted to subjects not covered in the book.

For example, it follows with an intro to intersecting D-branes in type II string theory. He explains the relevant points very clearly and then gives bibliography with appropriate reviews (as, in fact, he does for all the subjects). The next subject is type II on Calabi-Yaus with fluxes. In this kind of scenario string theory admits an enormous number of vacua. This enormous number of vacua has been used to implement an old idea of Weinberg to explain the existence of a small positive cosmological constant, and it has given rise to the string landscape and, even worse, the resurgence of the anthropic principle. Schwarz, like many others, is not very supportive of the landscape idea.

Perhaps the most interesting chapter is the last one. It is devoted to a review of the recent papers by Vafa et al. about F-theory-based phenomenology. This is a very recent subject, and except for the corresponding entries in the blogs of Motl and Distler (well, Woit made his attempt, but, as he admits in his post, he lacks enough knowledge of string theory to extract useful info from the papers) there has not been, as far as I know, much discussion of these papers (why on earth is the String Coffee Table closed?). I have not yet read the original papers by Vafa (I am waiting to finish a doctoral course on algebraic geometry that I am attending before doing that), so this chapter is the first non-blog info I have about it. I must say that Schwarz's exposition of the subject is very, very clear. He introduces very well what F-theory is. He explains that these models are based on a novel idea, the decoupling between the GUT scale and the Planck scale. Models with this feature are named local models, to distinguish them from the usual models, named global models. I'll not try to describe here the details of how local F-theory models are constructed, and I refer the reader to the actual paper.

Quintessence from string theory

Another interesting paper, at least for me, is arXiv:0810.5346v1, entitled "Where in the String Landscape is Quintessence". When the accelerated expansion was first observed there were many proposals to explain it. Nowadays the most commonly accepted one is that the universe is de Sitter, that is, there is a cosmological constant. Apparently one of the reasons for this broad acceptance stems from string theory and the works of Bousso, Polchinski and others, who explain this cosmological constant in the scenario of the string theory landscape. I guess that the logical chain is something like: string theory must be true -> string theory can explain a small positive constant -> we have a small positive constant. Well, this paper uses ideas similar to those of Bousso and Polchinski to support quintessence.

Some readers of the blog may not know what quintessence is. Let's recall first that a cosmological constant may be seen as due to the vacuum expectation value of a quantum field. That field, usually a scalar one, must have a constant value in space and time. Quintessence should also be due to a scalar field, but one that varies in space and time. It must also satisfy some additional features related to its equation of state (the equation of state that represents the field in cosmological models based on general relativity). In particular it must have p_q = w ρ_q, where p is the pressure, ρ the density and w the quotient of both (w = p/ρ). The value of w must equal 1/3 during radiation domination and 0 during matter domination, until w undergoes a transition to less than -1/3, which initiates the accelerated expansion of the universe. For more details I suggest reading the relevant Wikipedia entries, or, for those who have time and are very interested in cosmology, the recent book by Steven Weinberg on the subject.
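As a minimal illustration of that last condition (my own toy check, not from the paper): in an FRW universe the acceleration equation gives ä/a = -(4πG/3)(ρ + 3p) with c = 1, so a component with equation of state w = p/ρ drives accelerated expansion exactly when w < -1/3.

```python
def accelerates(w, rho=1.0):
    """True if a component with equation of state p = w*rho drives accelerated
    expansion, i.e. makes rho + 3*p negative in the FRW acceleration equation."""
    p = w * rho
    return rho + 3 * p < 0

# Radiation (w = 1/3) and matter (w = 0) decelerate the expansion;
# only w below -1/3 gives acceleration, as stated above.
```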

The reason I find this paper particularly interesting is that many of its solutions allow the existence of wormholes, a topic that I find particularly interesting. People not so fond of wormholes may not find the paper so interesting.

Astrophysical Probes of Unification

The last paper that I want to mention is this one. It is very recent, it appeared just today on the arXiv, and I guess that soon Sean Carroll and/or Lubos Motl (or Distler, if he decides to reappear from his retirement) and possibly others will make extensive comments on it, so I will only copy the abstract here:

Traditional ideas for testing unification involve searching for the decay of the proton and its branching
modes. We point out that several astrophysical experiments are now reaching sensitivities that allow
them to explore supersymmetric unified theories. In these theories the electroweak-mass DM particle can
decay, just like the proton, through dimension six operators with lifetime ∼ 10^26 sec. Interestingly, this
timescale is now being investigated in several experiments including ATIC, PAMELA, HESS, and Fermi.
Positive evidence for such decays may be opening our first direct window to physics at the supersymmetric
unification scale of M_GUT ∼ 10^16 GeV, as well as the TeV scale. Moreover, in the same supersymmetric
unified theories, dimension five operators can lead a weak-scale superparticle to decay with a lifetime of
∼ 100 sec. Such decays are recorded by a change in the primordial light element abundances and may well
explain the present discord between the measured Li abundances and standard big bang nucleosynthesis,
opening another window to unification. These theories make concrete predictions for the spectrum and
signatures at the LHC as well as Fermi.
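As a sanity check (my own back-of-the-envelope, not from the paper), the quoted ~10^26 sec lifetime follows from simple dimensional analysis of a dimension-six decay, τ ~ M_GUT^4 / m^5 in natural units:

```python
HBAR_GEV_S = 6.58e-25  # hbar in GeV*s; converts an inverse-GeV rate to seconds

def dim6_lifetime_sec(m_gev, m_gut_gev=1e16):
    """Rough lifetime for a decay via a dimension-six operator:
    rate ~ m^5 / M_GUT^4 in natural units, then converted to seconds."""
    rate_gev = m_gev**5 / m_gut_gev**4
    return HBAR_GEV_S / rate_gev

tau = dim6_lifetime_sec(1e3)  # a TeV-scale dark matter particle
```

For a TeV-scale particle this comes out around 10^25 s, matching the abstract's ~10^26 s up to order-one couplings.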

Tuesday, November 18, 2008

The FQXi time essay contest

At least some readers of this blog will read some of the links, and also other blogs (and forums) not linked here.

If so, it is very probable that they already know about the FQXi foundation and its essay contest on the nature of time. The actual webpage of the contest is this.

The contest, or at least many of the essays that had appeared when he wrote the post, was judged by John Baez, in a post in the n-Category Café, as "crackpot". I find that crackpot obsession somewhat, well, crackpot. For example John Baez himself, author of the crackpot index, belongs (whether he wants to or not) to the LQG community, which is itself considered crackpot in some circles. I think this whole crackpottery issue is becoming totally nonsensical and very boring.

That said, the truth is that some of the submitted essays are very philosophical (to classify them in some way) and very unphysical. And, certainly, up to now no heavyweight of string theory has submitted an essay (and none is expected to). Of course there are good reasons for that, which can easily be guessed by anyone who thinks about it a bit.

That is not necessarily a bad thing, and it invites physicists not belonging to the top-notch spheres to participate, simply because they have some reasonable possibility of winning (or at least of getting one of the secondary prizes). In fact I am aware that some bloggers such as Matti Pitkänen, Christine Dantas (I must add her Theorema Egregium to the links section one of these days), Carl Brannen and, maybe, Kea have submitted their essays. Among the best-known physicists participating are Carlo Rovelli, Rodolfo Gambini and Claus Kiefer.

In a different category I would point to Hrvoje Nikolic, who has some papers on the arXiv about string theory (certainly a bit outside its mainstream). And, judging from the biographical note of one of the latest essays, I think its author could be considered a cosmologist in the orbit of the Randall-Sundrum brane worlds, that is, in a position related to the mainstream of stringy cosmologists.

I will also make a separate mention of a Spanish competitor, Venerando Solis Barrado. I must say that, for good or bad, I am somewhat responsible for his participation, because it was through me that he learned of the contest. All I can say is (almost) good luck ;-).

I have read some of the essays, and I intend to read most of them, at least the ones written by known people and those that have received votes and/or responses.

And now the delicate question. Will I participate?

Well, I had been aware for some years of the problem of canonical gravity (Wheeler-DeWitt or LQG) with the absence of evolution in time. I had read about their proposed solution by means of relational time. I had read (as is reflected by some posts on the topic) about wormholes. Wormholes can work as time machines, and people working on wormholes usually discuss the topic, including non-wormhole time machines. I had read about tachyons outside string theory (I guess I have read most of the relevant papers) and something about tachyon condensation. I had read various books by Ilya Prigogine about the arrow of time. And, even worse, a friend asked me to invent a tentative explanation for the American film "The Butterfly Effect", which covers the topic of time travel. That resulted in a toy theory which I like to call "chronoquantum mechanics".

I think that means I have a reasonable background in the topic. Reading some of the already published essays, I see that I was not aware of some things, but not of too many. Well, the physics of time, and especially of time travel, is a fun topic for fans of science fiction, and I am one of them. That is mainly why I have learned about these topics. For a while I tried to develop the "quantum chronomechanics" a little further, but I ended up bored with the topic (and I must add that trying to think too much about it is a good way to get a headache).

Even so, now there is this contest, and they pay good money to the winner, and there are also some secondary prizes which are interesting. Certainly I can live without them, but, hey, this blog is Freelance Quantum Gravity, and my other blog is Freelance Science. The names of the blogs are inspired by the freelance workers in the IT market. The idea is easy to understand. In the IT business there are people who don't like to work in a fixed position in a company (or in a government position) and prefer to work as freelancers from their homes. I must say that there is a wide market for that sector, and that many people in it are former top-notch programmers from well-known companies who decided to become independent.

The physics market, quite on the contrary, is based on an academic/civil-servant paradigm. People want to get positions at universities and research institutes, and they devote a lot of effort to that purpose from the very beginning of their university studies. Of course there are good reasons for that, to begin with, tradition.

But if one analyzes the situation, the fact is that the model has some serious drawbacks. Academic positions are obtained by people who have a reasonable expectation of holding them for around 40 years. That means each will be blocking that position for around 8 cohorts of students (assuming a cohort takes an average of 5 years). It is important to note that university positions are permanent, and that new universities (especially ones which need theoretical physicists working in these topics) are not created too often.

That means there are very few chances to get a permanent position. Once this is realized, there are two possibilities. One is to begin a fierce competition from the very beginning, trying to get the best possible academic qualifications and trying to convince your teachers that you are the smartest guy since Newton. It is important to note that getting the best qualifications relies a lot on learning primarily what your teacher decides is important about a topic. If you decide that you are interested in other aspects of the topic and study them, it often results in a worse qualification.

The other possible way is to organize your life so that you have reasonable amounts of time to work, when possible, on topics related to physics. You can use that time to learn what you prefer. I have certainly always opted for this way. Of course that doesn't mean I reject the possibility of doing a PhD and all that. But it would be in maths; I believe that I know too much physics to do a PhD in physics (yeah, it is contradictory).

Well, all these digressions are meant to congratulate the FQXi foundation for their initiative. Publishing a paper requires doing some very specific work, not always fun. And I like to be paid for my work (at least if it is good work). I mean, I am sure that by now I could publish papers on string theory or LQG. Certainly they would, very probably, not be too good (at least not now). But even those papers would require good amounts of time, and I wouldn't get any reward for them. So I see no point in trying to publish anything unless I am almost sure it would be a top-notch paper (a definitive proof of the AdS/CFT conjecture, the Millennium Prize on the Yang-Mills mass gap, or things like that). Certainly I think it is not a good idea to depend on those possibilities.

On the other side, these more modest FQXi prizes are very interesting because they are a more realistic objective, which can serve to encourage people to write good papers that they wouldn't write otherwise.

OK, a lot of sociology. Am I going to submit a paper to that contest? Well, today I came up with a very nice title for the paper; it would be a shame not to use it. Most importantly, I am really tired of the topic of time physics, and I think that the best way to never have to worry about it again is to publish the paper. But I still haven't answered (I know that people who know me personally have realized this perfectly): shall I try to send a paper to the contest? Well, maybe, if I have time... ;-).

Update: The deadline for the submission of the paper is over and I still don't have it ready. An unfortunate minor illness has had me down (for purposes of doing serious work) for around a week, and that is bad for someone like me who belongs to the "wait until the last minute" category. If there is some flexibility from the FQXi committee, maybe there is still some chance. If not, I'll try to put the paper somewhere (once it is finished) so that people can read it (if anyone is interested).

Friday, October 31, 2008

New physics for Halloween

Two papers came out yesterday giving clear signals of new physics. On one side, analyzing data from the PAMELA satellite, an excess of positrons in cosmic rays has been confirmed with respect to what is expected from conventional models.

To explain it, models based on positron production from certain types of dark matter have been suggested, as can be read in this paper:

You can read a blog entry where it is explained in more detail here:

The other big event is the one described in this paper:

CDF is one of the two detectors of the American particle accelerator, the Tevatron (the other detector is D0). The group in charge of analyzing the data from that detector has found many events producing a muon more than 1.5 cm away from the collision center where a dimuon event is produced. For this to be possible, it is argued that a particle must have been created in the collision that travels those 1.5 cm before decaying into the muon.

The interesting thing is that no known component of the Standard Model seems to be a possible candidate for that particle, and therefore everything points to it being a new particle.

What kind of particle? A paper from about three weeks ago by Nima Arkani-Hamed and Neal Weiner presented a model that, based on ideas similar to what PAMELA has revealed about dark matter, predicted that the LHC could find a certain type of new physics, which fits very well with what the Tevatron, getting there first, seems to have found. The Nima-Weiner model is one in which supersymmetry is applied to the hidden sector of dark matter. You can read details about it in several blogs:


Woit (who doesn't quite follow what the Nima-Weiner paper says):

Dorigo (who is in fact in the CDF working group):


Wednesday, October 29, 2008

Non quantum gravity and dark matter

I keep reading PhysicsForums, especially the Beyond the Standard Model forum. Recently there was a discussion about a new proposal that appeared on the arXiv, arguing that maybe gravity wouldn't need to be quantized after all.

The paper in question, authored by Stephen Boughn, is this one.

It is a very clear paper where the usual assumptions are reviewed. As is commonly known, we actually have a quantum theory, the standard model, which describes all known interactions but gravity. The best available, experimentally supported description of gravity is Einstein's theory, which is a non-quantum theory.

In order to bring both theories together, one can begin by quantizing the standard model in the curved backgrounds of general relativity, instead of doing it in plain Minkowski space (see my previous post for an easy introduction; in Spanish, sorry for non-Spanish speakers).

The next step one could try is simply to consider the gravitational field created by the averaged value of the energy-momentum tensor and forget the idea of quantizing gravity altogether. That is, to replace the classical Einstein equation

G_μν = 8πG T_μν    (1)

by its semiclassical counterpart, sourced by the expectation value of the stress-energy operator,

G_μν = 8πG ⟨ψ| T̂_μν |ψ⟩    (2)
This proposal has many well-known problems, both theoretical and practical. The author discusses them in chapter 6 of his paper. Consider a state of matter with probability 1/2 of being in a region O1 of spacetime and probability 1/2 of being in a disjoint region O2. If you use equation 2, you get a gravitational field appropriate for matter distributed in both regions. If later a measurement is made and the state is resolved to one of the Oi, then the gravitational field would change in a discontinuous and acausal manner.
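A toy Newtonian version of this argument can be sketched as follows (my own illustration; the masses and positions are made up). The potential sourced by the averaged distribution differs from the potential after the state is resolved to one region, so a measurement would make the field jump:

```python
G = 6.674e-11  # m^3 kg^-1 s^-2

def potential(point, sources):
    """Newtonian potential at `point` from (mass, position) pairs (1D, meters)."""
    return sum(-G * m / abs(point - x) for m, x in sources)

M = 1.0              # kg, toy mass
O1, O2 = -1.0, 1.0   # the two disjoint regions (toy positions)
probe = 0.5          # where we evaluate the field

# Semiclassical sourcing (eq. 2): gravity sees the averaged distribution,
# i.e. M/2 sitting in each region.
phi_avg = potential(probe, [(M / 2, O1), (M / 2, O2)])

# After a measurement resolves the state to O1, gravity sees all of M there.
phi_O1 = potential(probe, [(M, O1)])

# phi_avg != phi_O1: the field would change discontinuously upon measurement.
```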

The key point of the paper is to keep eq. 1 as valid and forget about 2. Of course one can't do that without further assumptions. The author establishes that the energy-momentum tensor must satisfy the following prerequisite: in the language of decoherence theory, the system must be in a decoherent, mixed quantum state, for it is only then that the probability predictions of quantum theory agree with those of classical physics. (You can read about decoherence in, for example, this paper by Lubos Motl.)

This assumption immediately implies another one: that non-localized, coherent quantum systems are not sources of gravity. That sounds like a very strong assumption, but the author argues that "it will turn out that for microscopic systems, in which quantum coherence is most commonly observed, the effects of gravity are, in principle, unobservable. For larger macroscopic systems, decoherence is the norm and classical stress-energy is well defined. This leaves open the question of gravitational interactions of mesoscopic, coherent systems."

After that he goes through some chapters reviewing the detectability of possible quantum gravity phenomena. He begins, in chapter two, by considering the detectability of gravitons. Remember that a graviton would be the quantum mediating gravitational interactions if one insists on doing quantum gravity the particle-physicist way. This chapter is very well written, and it relates gravitons to gravity waves. Note that one of the author's research activities is precisely in the field of experimental detection of gravity waves, so he can be considered an authority on that particular point.

In chapter 3 he deals with gravity and quantum interference, that is, double-slit-like thought experiments. He concludes that there are conditions that must be satisfied for a gravitational measurement to sufficiently localize the incident particle so as to destroy the quantum interference. They are stated in terms of the separation of the two slits r, the acceleration of the test mass a_t, and the velocity of the incoming particle v_i. The actual conditions are:

r > ħ^2 / (G m^3)

t > ħ^3 / (G^2 m^5)

a_t < G^3 m^7 / ħ^4

v_i < G m^2 / ħ

(here ħ is h-bar, i.e. h/2π)

If the conditions are not satisfied, the gravitational interaction is insufficient to detect the incident particle and quantum interference remains intact. He concludes that for quantum-coherent systems with masses less than ∼ 10^7 mp (mp = Planck mass), there is no measurable gravitational effect that would compromise their coherence. He does further analysis and gets further restrictions. The conclusion of the arguments is that the question of whether or not coherent quantum systems are sources of gravity is unanswerable for systems with masses < 10^10 mp. That leaves unanswered the question of mesoscopic systems, which he analyzes later.
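The four thresholds above are easy to evaluate numerically. A quick sketch (my own check with rounded SI constants, not code from the paper):

```python
G = 6.674e-11         # m^3 kg^-1 s^-2
HBAR = 1.055e-34      # J s
M_PLANCK = 2.176e-8   # Planck mass, kg

def thresholds(m):
    """The four conditions above for a particle of mass m (kg):
    (min slit separation, min time, max test-mass acceleration, max velocity)."""
    r_min = HBAR**2 / (G * m**3)
    t_min = HBAR**3 / (G**2 * m**5)
    a_max = G**3 * m**7 / HBAR**4
    v_max = G * m**2 / HBAR
    return r_min, t_min, a_max, v_max

# The heavier the system, the easier gravitational detection becomes:
# r_min and t_min shrink while a_max and v_max grow with m.
r, t, a, v = thresholds(1e7 * M_PLANCK)  # the ~10^7 mp scale quoted above
```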

Chapter 4 is, in a certain sense, a continuation of the previous one. The most interesting is chapter 5, where he fully analyzes the central issue of the paper. The key point, if I understand correctly, is the following statement:

“Because macroscopic systems invariably undergo decoherence on very short time scales, they behave as they would in a classical world, i.e., no quantum interference effects.”

Or, stated together with other claims of the chapter in a more generic way: “the experimental data available to date only take account of interactions between matter systems in a decoherent state.”
That raises the question of what the behaviour would be of macroscopic, or at least mesoscopic, systems which are in coherent states. He talks about Cooper pairs in superconductivity, Bose-Einstein condensates and systems like that. Here I would add a few things. A few years ago a condensed-matter physicist, Podkletnov, claimed in a press conference some kind of gravity shielding that appeared unexpectedly in experiments involving certain high-temperature superconducting devices. He didn’t provide all the details of the experimental device, and later attempts to replicate the experiment, based on the available data, have so far been unsuccessful. Later Podkletnov improved the experiment and even tried to conjecture an explanation. His argument was related to the suppression of Fourier modes of gravity due to a coupling of the Landau-Ginzburg Lagrangian describing the superconductor to the energy of the cosmological constant. Certainly the “non-quantum gravity” proposal could be considered an alternative explanation, if one were to insist on explaining an effect that is not firmly established experimentally, of course.

To conclude my review of this proposal, I’ll mention a few problems that Stephen Boughn himself recognizes. The main one, in my opinion, is that if a coherent system exchanges momentum with another coherent one and later decoheres, his proposal could lead to a violation of momentum conservation. Another, in his words: “a legitimate criticism of the conjecture put forward in this paper is its lack of predictive power. Except possibly in the case of the coherent to decoherent transitions in mesoscopic systems, and even in these cases the conjecture makes no specific prediction, the nonquantum conjecture makes no additional predictions that can not already be made by quantum theory and general relativity.” There are some more concerns, which the author acknowledges in the final chapter and which I will not discuss here.

Let’s go now to the next topic of this post, dark matter. A few weeks after this paper appeared, Sean Carroll made this post in his blog, Cosmic Variance. Soon there was a reply by Lubos Motl here.

They are very interesting posts in their own right. But I bring them up here because it is stated there that dark matter, if it interacts only through gravity with itself and the rest of the universe, would decohere very slowly. In fact, if the non-quantum gravity proposal were taken to its full consequences, it could be expected not to decohere at all. But if so, it wouldn’t interact gravitationally at all. That is a very bad thing, because dark matter is postulated to explain unobserved mass in the universe, which accounts for the observed rate of cosmological expansion.

In fact, a very recent paper discusses the possibility that dark matter might not exist, or at least might not be the main cause of some experimental data. The paper is this one. It is discussed in a blog entry by Lubos Motl here. Quickly, the idea is that a field associated to string theory could take a nonzero vacuum expectation value, and that if particles are actually strings they would couple to it, resulting in a Lorentz-type force which would explain the problem with the way galaxies rotate, as an alternative to the usual explanations, dark matter or MOND (modified Newtonian dynamics). If this non-quantum gravity proposal were taken seriously, the stringy paper would gain additional value, because dark matter, even if it exists, might not interact gravitationally, or at least not too much. Of course, if we accept the non-quantum gravity proposal, string theory would lose one of its most important reasons to exist, its status as a quantum theory of gravity, and it would have to be questioned whether its explanation of galaxy rotation could still be accepted.

In fact, I admit that I didn’t actually calculate exactly how much dark matter would interact gravitationally if the non-quantum gravity proposal were true. I find it surprising that the author, Stephen Boughn, doesn’t consider this in his paper, given that he claims to be actually working on cosmology; but, of course, he could easily not have noticed this lack of decoherence in dark matter, which is only obvious once one is told about it, but not before.

Anyway, the paper is interesting in its own right, even if it’s wrong, because of its review of many aspects related to gravity, and it has helped me take note of some things that have happened in the quantum gravity world recently. I hope the reader finds them interesting.

Saturday, September 20, 2008

The LHC is your friend, trust the LHC

After listening to almost all the talks of Strings 2008 that I mentioned in the earlier post, I had to take a break from blogging for a few different reasons (a computer virus, preparing people for September exams and so on). In the meantime I have had time to read a few papers and a few books, the books not mainly about quantum-gravity-related things.

In all this time the most interesting source of news has undoubtedly been the LHC. On one side because of all those people worrying about the end-of-the-world black hole. On the other, the bets about what will be discovered when the machine at last collides protons.

About the first subject, I have read a few arXiv articles evaluating the possible danger of black holes. The subject depends (once Hawking radiation is discarded as a way to destroy the black hole, a very improbable thing) on classical general relativity, accretion rates and things like that. I must say that I had no previous knowledge of the subject, and I have found it interesting, although certainly a bit far from the usual targets of cutting-edge theoretical physics. As a side effect, I have had to reconsider the precise meaning of the growth of a black hole. It may seem obvious, but in fact it isn’t. The classical scenario is to calculate the rate of accretion (using Bondi theory or whatever) and then rely on the classical laws of black holes stated by Hawking, especially the one relating the increment of area to the increment of mass. But the reality is more complex, and to get a precise mathematical treatment one must go to the theory of dynamical and isolated horizons. I’ll write a post about the details in the other blog as soon as possible.
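For reference, the Bondi rate mentioned above is Ṁ = 4πλ(GM)²ρ∞/c_s³, with λ an order-one factor depending on the gas equation of state. A minimal sketch (my own, with purely illustrative parameter values):

```python
import math

G = 6.674e-11  # m^3 kg^-1 s^-2

def bondi_rate(M, rho_inf, c_s, lam=1.0):
    """Bondi spherical accretion rate in kg/s:
    Mdot = 4*pi*lam*(G*M)^2 * rho_inf / c_s^3."""
    return 4 * math.pi * lam * (G * M)**2 * rho_inf / c_s**3

# Toy numbers: a 1 kg black hole in a medium of density 1e-21 kg/m^3
# with sound speed 1e4 m/s. Note the rate scales as the mass squared,
# which is why microscopic holes accrete so slowly.
mdot = bondi_rate(1.0, 1e-21, 1e4)
```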

But the LHC is important not because of the black hole and similar catastrofic secenaries (all of them very unlikely, to say the less, people worried about real problems have a lot of better places to wath for). The real interst is if it will find the higgs bososn, supersymmetry or whatever. In fact it seems that it has benn an increasing amount of papers diving into the data of the tevatron with the aim of profiling the best chances for the LHC finding new physyc. In one of hat scenaries there was a good chance tht in the first five days, or so, of activity (that is, colliding activity) of the LCH the supersymmetry could be found. That was supposed to be as soon as the next week (althought probably the actual annalisis of data would require most time). I was, partially, waiting that notice. It would be certainly a relly good notice for a post (afther all it would be the best new in particle physics in around 30 years). But unfortunatelly it seems that ther has been a somewhat serious problem in the LHC, rupture in a part of the collider has resulted in the lost of liquid hellium. It is still ot known how seriously the problem is but some people say it could mean that the LHC propgram would be dealyed untill the winter shutdown so we would need to wait a litle bit more to get relevant experimental data. If this is confirmed stil there are possibilties of getting new physic from astrophysic/cosmology. In fact this week has been seen what looks like a bridge of dark matter aaround which galaxied penetrate into regios that, untill now, where considered as giant vacua in the universe. And the GLAST satelite is working propoerly for more than a mounth so it is possible that it could find signals of WIMPS (weakly interacting massive particles) a favourite candidate of same people for dark matter constituent, or, maybe a final answer (possitive or negative) for the LQG prediction about dispersion of light speed in vacuum. 
In fact I had read in a newspaper that a great gamma-ray burst had been detected a week ago, and I have been waiting since then to read that GLAST had looked at it, so that the question could really, really, be about to be answered. But, unfortunately, it seems that GLAST missed it, and we must wait a little bit more.

Anyway, as unfortunately it seems that the great news is being delayed, I decided to blog again about more conventional things.

Monday, August 18, 2008

Strings 2008: live TV

Have you ever considered attending a string theory conference live?

Now you can, at least virtually:

P.S. You can see the schedule, with the various talks, here

Lubos is updating his blog with comments on the various talks (at the time of writing this postscript, three so far). You can follow it in this entry. For my part I will try to comment on something, but it would be absurd to try to compete with Lubos xD.

Saturday, August 09, 2008

Basic ideas about quantization in curved spaces

A "light" entry, so that this blog doesn't stay abandoned for too long.

Ordinary quantum field theory is formulated in flat spaces, in particular in Minkowski space. One begins by explaining free field theory, which is useful for introducing the concepts of vacuum and Fock space. In ordinary (non-relativistic) quantum mechanics one usually does single-particle quantum theory. Most importantly, the number of particles can be held fixed. This is not possible in special relativity because of E=mc2, which allows particles to be created "from the vacuum" for a very short time thanks to the time-energy uncertainty relation. So we must describe the physics not in a Hilbert space (speaking loosely: strictly, the free particle doesn't even live in a Hilbert space as mathematicians define it, since the wave function of a free particle does not belong to L2, the set of square-integrable functions) but in a Fock space. And what is that? An infinite series of tensor products of the one-particle Hilbert space, suitably symmetrized or antisymmetrized depending on whether we have bosons or fermions. That is, something like this (I use a symbolic notation to clarify the ideas):

1. F = 0 + H + H1xH2 + H1xH2xH3 + ...

Here H is the one-particle Hilbert space. When there are two particles we have H1 and H2, the Hilbert space of each of the two particles. The x denotes the symmetrized or antisymmetrized tensor product of the two particles, and the + sign indicates a direct sum. In short, we must consider the possibility of having a single particle, two particles, three particles or an infinite number of particles, and our formalism must accommodate that possibility. That is the Fock space, and it is part of the starting point of relativistic quantum mechanics. Developing the relativistic theory requires considering interacting particles, which leads to the S-matrix, etc., etc.
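As a toy illustration of the structure (my own sketch, not from any reference): for bosons the n-particle sector of the Fock space is the symmetrized n-fold tensor product, so a basis state is just a sorted tuple of occupied mode indices, and the sector dimension is the stars-and-bars count C(n + d - 1, d - 1) for d single-particle modes.

```python
# Enumerate the bosonic n-particle sectors for d single-particle modes.
from itertools import combinations_with_replacement
from math import comb

def boson_sector_basis(d, n):
    """Symmetrized n-particle basis: sorted tuples of occupied mode indices."""
    return list(combinations_with_replacement(range(d), n))

d = 3
for n in range(4):
    basis = boson_sector_basis(d, n)
    # dimension of the symmetric tensor power matches the stars-and-bars formula
    assert len(basis) == comb(n + d - 1, d - 1)
    print(n, len(basis))  # sector dimensions: 1, 3, 6, 10
```

The full (truncated) Fock space is then the direct sum of these sectors, with the n = 0 sector being the one-dimensional vacuum.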

But for the topic of quantization in curved spaces I only need free fields. Notice that in equation 1 I wrote, somewhat loosely, a 0. That is the vacuum, where there are no particles. On the other hand, relativistic quantum mechanics is a quantum theory of fields. This subtle semantic difference can be interpreted as saying that the fields create particles. Without going into details: we have a classical Lagrangian in terms of the fields (for example the electromagnetic field, expressed in terms of its potentials, or that of a Klein-Gordon particle). The Euler-Lagrange equation for those fields is a partial differential equation that can be solved by separation of variables, yielding an expansion of the classical field in Fourier modes. Quantizing means imposing commutation relations on the fields, replacing the classical fields by operators acting on the Fock space. Imposing them directly is rather difficult. But once we have the Fourier expansion, each Fourier mode becomes an operator, which can be interpreted as a creation or an annihilation operator. A creation operator creates a particle and an annihilation operator destroys one.
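The mode-by-mode quantization can be mimicked with finite matrices (again a sketch of mine, truncating the Fock space of a single mode): a|n> = sqrt(n)|n-1> and a†|n> = sqrt(n+1)|n+1>, with the canonical commutator holding up to the truncation artifact.

```python
# Truncated single-mode creation/annihilation operators.
import numpy as np

N = 6                                          # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1.0, N)), k=1)   # annihilation operator
adag = a.T                                     # creation operator (real matrix)

vac = np.zeros(N)
vac[0] = 1.0                                   # the vacuum |0>
one = adag @ vac                               # a†|0> = |1>

# [a, a†] = 1 on every state except the highest one; the last diagonal
# entry is an artifact of cutting the infinite ladder at N states
comm = a @ adag - adag @ a
print(np.diag(comm))
```

One operator pair like this per Fourier mode is exactly the structure the field expansion produces.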

This is for free fields in flat Minkowski space. In curved spaces things get more complicated. It turns out that what one observer sees as the vacuum, another observer may see as non-empty. The simplest case is what is known as Rindler space. This is simply Minkowski space as seen by a uniformly accelerated observer. The accelerated observer sees the Minkowski vacuum filled with particles in a thermal distribution. The temperature of that distribution depends on the acceleration: the greater the acceleration, the higher the temperature. The technical way to express this is through what are known as Bogoliubov transformations, which map the vacuum of one observer into that of another. Another, much more famous, example of quantization in a curved space is the quantization of a free field in a Schwarzschild geometry, the one describing a black hole. There one finds that a freely falling observer very close to the event horizon sees the region near it as empty. However, another observer hovering at a fixed distance from the event horizon sees what the first observer finds empty as filled with particles. This can be interpreted as the black hole emitting particles, and it is what is known as Hawking radiation (by the way, don't confuse this with an apparently very similar concept, the entropy of a black hole). The intensity of the radiation depends on one quantity, the surface gravity of the black hole, which can be shown to be tied to an inverse power of the black hole's area. Thus, the smaller the area, the greater the Hawking emission.

The concrete formulas for the result are:

$$ T_H=\alpha/2\pi $$ where $$\alpha$$ is the surface gravity of the black hole, with value $$ \alpha= 1/4M$$

If we don't work in natural units the formula becomes:

$$T_H=hc^3/16\pi^2GMk $$ where all the factors have an obvious meaning, except perhaps k, which is Boltzmann's constant.
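As a quick numerical check (my own sketch, with standard SI constants), the formula gives about 6×10^-8 K for a solar-mass black hole, far colder than the cosmic microwave background, and doubles when the mass is halved, illustrating the inverse dependence on mass and hence on area:

```python
# T_H = hbar c^3 / (8 pi G M k_B), equivalent to the h c^3 / (16 pi^2 G M k)
# form above since h = 2 pi hbar.
import math

hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m / s
G = 6.67430e-11          # m^3 kg^-1 s^-2
kB = 1.380649e-23        # J / K

def hawking_temperature(M):
    """Hawking temperature (kelvin) of a Schwarzschild black hole of mass M (kg)."""
    return hbar * c**3 / (8 * math.pi * G * M * kB)

M_sun = 1.989e30
print(hawking_temperature(M_sun))  # roughly 6e-8 K
```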

If the black hole is charged and/or rotating, the corresponding formulas can be obtained using the Reissner-Nordstrom or Kerr-Newman metrics. I won't give the results here; they would be better suited to a dedicated entry.

P.S. A good review article is the following: arXiv:gr-qc/0010055

Monday, July 14, 2008

A watch at the string landscape

Like many physicists I am a reader of science fiction. String theory is not a topic that is covered very broadly in SF, and, anyway, it is not covered too properly. For example, an author may limit himself to citing the words "Calabi-Yau" as some kind of mantra. Even so, there is one particular novel, written in the eighties, with a fine usage of string theory. In it an alien spacecraft arrives at Earth and its crew begins a discussion with relevant human figures in art, politics and science. In particular, in the science area, they talk with string theorists and discuss with them many mathematical aspects and conceptual developments that they find terribly exciting, even though no concrete experimental evidence is provided. While doing so, the aliens have thrown a black hole inside the Earth, which grows slowly, but fast enough to eat the whole Earth a few months later, toward the end of the novel. Fortunately another spacecraft appears, crewed by a different alien species, and saves some selected humans. I guess that any informed reader will be able to see the possible funny analogies with the actual situation :-).

The purpose of this introduction was to point out that string theory has grown a lot in many directions since the eighties, and it is somewhat discouraging to try to get a precise idea of the many lines of development (some of them almost dead) followed in the meantime. But if I were one of the "bad aliens" trying to give some guidance to an eighties string theorist, maybe I could use this post as a beginning, or at least that is my intention.

The great challenge in string theory is to find a proper way to go from 10 to 4 dimensions. In the eighties the most promising route was to look for compactifications of heterotic string theory on Calabi-Yau manifolds, or maybe on orbifolds. Soon it was realized that it was interesting to study not one, but families of Calabi-Yaus. One went from one to another by varying some moduli. Another easy way to compactify was orbifolds, tori acted on by some discrete group. The fixed points of that action were singular, and the study of those singularities turned out to be very interesting. It took a revolution, the discovery of the importance of branes, to give more fuel to the compactifications. One could use branes to resolve the singularities at the orbifold fixed points. And it was found that those points could mediate transitions among Calabi-Yaus with different topologies. Also, the Calabi-Yau moduli space turned out to have singular points, called conifold points. Curiously, the moduli space of a C-Y can itself, in some sense, be characterized as a Calabi-Yau of a special type, one with conifold points (i.e., points similar to the tip of a cone: continuous but not differentiable). If one places a D-brane at such a point one can "blow up" the singularity. But, anyway, the thing is that conifolds can also give transitions between vacua of different topology. In fact the picture is worse: there can be transitions to phases where the vacua don't admit an obvious geometric description and one must use CFT/non-linear sigma models to describe the theory. In fact Witten argued that in M-theory, an additional development of string theory corresponding to strongly coupled type IIA strings, only geometric phases are allowed.

In addition to compactification, "brane worlds" were considered. The idea was that the observable world would be some kind of brane. Precise realizations of that idea were pursued from many viewpoints (I guess that the most recent attempts use the idea of intersecting D6-branes).

In the meantime it was discovered that the universe is accelerating, and there is some kind of consensus that it does so at a constant rate. That means that "phantom energy" scenarios seem to be ruled out and we must look for a de Sitter universe emerging from string theory. The first realization of this was the KKLT construction. In that scenario one requires vacua where supersymmetry is broken in a way that gives some cosmological constant. It was argued that the universe could be populated by many different vacua. Each vacuum with a different value of the cosmological constant would expand at a different speed, so we would live in some bubble of a particular vacuum. That led to counting vacua that share some properties, and to analyses of the statistical distribution of other properties. For example, among some kinds of vacua compatible with a certain value of the cosmological constant there were more solutions with large extra dimensions, but other kinds of such vacua went in the opposite direction. By the way, vacua with cosmological constants are not true vacua: they are metastable states whose decay time is greater than the current age of the universe, oh yeah ;-).

Some interesting remarks about these models: they give a potential for the scalar fields that describe the moduli of the vacua. That is, they are, in a certain sense, properly defined theories with all the measurable values fixed. This had proved to be a very difficult task. The Dine-Seiberg conjecture stated that a proper determination of the values of the moduli required going to the non-perturbative regime of string theory. But the hope was that once one had a theory with all those values fixed one would have a unique, or almost unique, theory. In fact one has, in some scenarios, around 10^500 theories (i.e. vacua) whose average cosmological constant is the observed one (the counting was first done by Bousso and Polchinski for some particular kind of models). Another point is that there is no natural way to do statistical mechanics with those different vacua. I.e., one can't make a proper statistical ensemble out of them, because those vacua should be separated into different sectors by superselection rules that do not allow going from one to another. I recommend looking at the blog of Dmitry (nonequilibrium net) to get a much better discussion of this topic.
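The flavour of the Bousso-Polchinski counting can be conveyed with a toy scan (entirely my own sketch, with invented numbers, not taken from any real compactification): quantized fluxes n_i with incommensurate charges q_i shift a bare negative cosmological constant, and the resulting "discretuum" of values contains lattice points very close to zero.

```python
# Toy Bousso-Polchinski-style flux scan. Lambda0, q and eps are made-up
# illustrative numbers in arbitrary units.
from itertools import product

Lambda0 = 10.0                 # bare (negative) cosmological constant
q = [0.9, 1.1, 1.3, 1.7]       # hypothetical flux charges
eps = 0.5                      # tolerance around Lambda = 0

def cosmological_constant(n):
    return -Lambda0 + 0.5 * sum(ni**2 * qi**2 for ni, qi in zip(n, q))

# count flux vectors whose total Lambda lands within eps of zero
near_zero = [n for n in product(range(-3, 4), repeat=len(q))
             if abs(cosmological_constant(n)) < eps]
print(len(near_zero))
```

With more fluxes and a finer charge spectrum the near-zero count explodes, which is the mechanism behind the 10^500 estimates.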

By the way, most of these studies were made for type II theories. What about the heterotic string? Well, in fact there is a heterotic landscape also. It is curious: another development of string theory was to provide a counting of the Bekenstein entropy of black holes (or at least a particular kind of them). For that purpose type II theories, and their D-branes, were used. But later it was seen that one could also use heterotic strings to describe black holes. It seems as if heterotic string theory always lags in achieving the results. But, on the positive side, heterotic strings still seem promising. For example, the heterotic landscape contains many fewer vacua.

In the eighties there was a hope that string field theory could provide some kind of dynamics that would indicate how these compactifications could be achieved. Unfortunately string field theory has not succeeded and has proved to be very difficult anyway. In fact one would need a string field theory for each of the different string theories.

With all that I have exposed it looks as if there are too many things going on. In fact there are. What I would like to see is a way in which topological transitions could be used to connect different vacua. In fact the vacua of the landscape are, as I said, not all of them supersymmetric. That would mean considering a more generic kind of compactification, and studying the possible topological transitions between them. One way to approach that could be the use of instantons/Euclidean wormholes, and also to see how to describe this in some kind of SFT. Also, if one considers that the different string theories are related by dualities, meaning that in some sense they are a single one, one could study wormholes, or whatever, connecting them. A way to begin this program could be to try to describe some kind of wormhole-like solution connecting different compactifications (or a non-compactified space to a compactified one).

I must warn that since this post contains many, many topics, I have not pretended to be very exact in the descriptions. My idea was just to give a broad perspective. I hope to write more detailed posts on more concrete topics in the near future, but I guess it had been too long since the last posts and it was a good idea not to be too lazy and write something ;-).

Sunday, June 15, 2008

The case for dynamics in canonical LQG

It could be easy to suppose that by following some selected blogs, and reading discussions in the appropriate forums, one is aware of "what's going on" in quantum gravity related issues. Well, Ian Malcolm's law, you know, "98% of what you believe is false", strikes again. If you follow the links sections of some of the most well known blogs (Motl, Distler, Woit, etc.) you find that many of the linked blogs are not about quantum gravity, and many on the QG side are sparsely updated, or contain a lot of non-physics-related entries. The result is that it is easy for a lot of papers, or even full lines of research, to pass unperceived. Perhaps the most complete list of papers is this thread in Physics Forums. And of course you could simply follow arXiv (if you didn't notice, I added an RSS feed with the last five entries on hep-th to the blog). But although Marcus, the author of the cited thread, makes some brief comments about the papers he links, it is hard to get a proper idea of the relevance of the papers.

In this blog I have tried to give an overview of various (surely not all) lines of research, even if they are not too likely to be viable in the long run. I have covered things such as conformal quantum gravity, Garrett Lisi's E8, topological geometrodynamics, and, recently, the new idea of 't Hooft (I am still waiting for the paper covering the quantum part). Most of these approaches have an almost null follow-up. For example, the last paper of 't Hooft has only one citation so far, despite the fact that he is a top-notch physicist.

This means that we are left with only three approaches with wide coverage: string theory, of course, LQG and, far behind these two, the noncommutative geometry of Alain Connes. Well, anyone who has read the book "Three Roads to Quantum Gravity" by Lee Smolin (I didn't) will be somewhat surprised that I don't mention the twistor theory of Roger Penrose. After reading the last chapter of Penrose's book "The Road to Reality", describing that theory and its state of development, I think it is too far from being a theory of quantum gravity. If I had to choose I would bet instead on the Euclidean quantum gravity approach advocated by Hawking, but I am not too sure whether anyone keeps publishing on that topic nowadays.

Some people find this discouraging. For example Matti Pitkanen, the creator of TGD, asked some months ago why string theory was so dominant. I guess the reason is easy to understand. String theory strongly relies on ideas from particle physics and quantum field theory. As most theoretical physicists have this background, it is very natural that from the standard model they would go to string theory. That also explains why LQG has a reasonably good number of followers: it is the natural approach to quantum gravity for people deeply involved in general relativity ideas. In fact, possibly a lot of people who a few years ago held their hopes in twistors or Euclidean quantum gravity have gone into LQG nowadays. And it also explains the relative success of NCG: mathematicians simply follow the work of a Fields medalist on a construction based on the standard model, that is, a theory expressible in fiber-bundle language, so they can learn it relatively easily, and it is near to the things they know. People simply follow natural developments of the theories they know well. Or at least that is my opinion.

OK, this has been a long prelude to somewhat justify why I am writing about the actual topic of the post, pertaining to LQG research, after I had said in previous posts that I was becoming progressively more centered on string theory and progressively more skeptical about LQG. Well, the reason is that I hadn't seen any previous discussion of this particular topic and I think it deserves some attention. Surely there are people out there a lot more qualified to cover it in a blog/forum, but as far as I know nobody did.

Let's set the arena to understand the problem. In canonical LQG one tries to solve the constraints that appear when one goes from the Einstein-Hilbert Lagrangian to a Hamiltonian. Taking a slice of spacetime and changing to the Ashtekar variables, one ends up with three constraints: a Gauss constraint, a diffeomorphism (or kinematic) constraint, and a dynamical constraint. The first two can be solved in what is known as a spin network (which is now going to be very popular in the string community also, because spin networks have recently been used, in a very different framework, to almost prove the Maldacena conjecture). Basically a spin network is a graph. The edges are labelled by representations of SU(2) and the nodes by intertwiners, which are distinct ways of extracting the identity representation from the products of representations of the incident edges. Loosely speaking, a given spin network represents the gravitational field at a given instant of time. If the canonical LQG program were complete, one would have a dynamical constraint giving the time evolution of a given spin network. But that has proved to be a very hard task. I thought that because of that, LQG people had moved on to spin foams or causal triangulations. What I didn't know is that there was a relationship between canonical LQG and causal triangulations. I learned of this by a somewhat unusual route, beginning with a recent paper of Fotini Markopoulou that led me to a paper by herself and Lee Smolin, arXiv:gr-qc/9702025. There they argue that, according to an idea of Penrose, the very idea of fluctuations of the metric could be invalid in a theory of quantum gravity. The reason is that causality depends on the metric, and if this fluctuates the window is open for violations of causality. In view of this, Penrose believed it would be of greater interest to keep track of causality, with the quantum fluctuations fuzzing the meaning of points and events.
Smolin and Fotini propose a somewhat different approach.

The key point is that LQG is a theory in which the area operator has a discrete spectrum on the spin networks, and that suggests that there is a discrete length and time. In this framework there are discrete quantum analogues of both null rays and spacetime events. The latter are sharply defined because they are defined in terms of the coincidence of causal processes. Quantum amplitudes are then defined in terms of sums over histories of discrete causal structures, each of which is constructed by a set of rules that respect its own causal relations. This is, I guess, the general justification of the causal triangulations approach (I have never read a dedicated paper on that topic). The interesting thing is how they relate it to canonical LQG. It is quite well explained in the paper how they proceed:

Each such structure, which we take as the discrete analogue of a spacetime,
is foliated by a set of discrete spatial slices, each of which is a combinatorial
spin-network. These discrete “spatial slices” are then connected by “null”
edges, which are discrete analogues of null geodesics. The rules for the
amplitudes are set up so that information about the structure of the spin
networks, and hence the quantum state, propagates according to the causal
structure given by the null edges.

The dynamics is specified by a set of simple rules that both construct
the spacetime networks, given initial spin networks, and assign to each one a
probability amplitude. Each spacetime net is then something like a discrete

But as I said before there is no dynamical constraint in LQG, so where do these "simple rules" come from? From consistency with microcausality. I guess that the whole point is to apply the rules of causal triangulations to spin networks instead of simplices (if I am not wrong, causal triangulations are rooted in Regge calculus, which is a discrete approach to the path integral using simplices to represent spacetime). Supposedly they give rules that could lead to relating the theory to some topological field theories on one side, and to some percolation models, for which the renormalization group flow is well studied, on the other. I find it curious, to say the least, that in a theory whose initial purpose was a canonical formulation they stop in the middle of the program and resort to a path integral, which is more naturally associated with Lagrangians. But, OK, let's see what follows; here is the idea of the first rule for the 2d case:

Rule 1
Consider an initial spin network Γ0, which consists of a set of edges eij and
nodes ni (where eij connects the two nodes ni and nj). To obtain the 2 + 1
dimensional version of the theory we will restrict Γ0 to be trivalent, which
means it can be embedded in a two dimensional surface.
The first evolution rule constructs a successor network Γ1 together with a
set of “null” edges which each join a node of Γ0 to Γ1. The rule is motivated
by the idea that the null edges should correspond to a discrete analogue of
null geodesics joining spacetime events.


The result of this rule is a spacetime spin network Γ01 bounded by the
two ordinary spin networks Γ0 and Γ1 whose nodes are connected by a set
of null edges. In general a spacetime spin network (or spacetime net, for
short) will consist of a set of N ordinary spin networks, Γi, i = 0, 1, ...,N,
together with a set of null edges that join nodes of Γi to nodes of Γi+1.


After that comes a more concrete justification of how causality leads to that rule. It continues with a clarification of how to do calculations with that rule. Later the second rule is introduced and, first, justified. This is the justification:

Rule 2
We might just apply Rule 1 over and over again, but the result would be that
each successor spin network has nodes of higher and higher valency. (This
is easy to see, if each node of Γn has valence P, each node of Γn+1 will have
valence 2(P − 1).) To prevent this from happening we need a second rule
that lowers rather than raises the valence of the nodes.
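The valence growth that motivates Rule 2 is easy to check numerically (a trivial sketch of the arithmetic in the quote): iterating P → 2(P − 1) from a trivalent network gives unbounded growth.

```python
# Iterate the valence map P -> 2(P - 1) from the quoted Rule 1, starting
# from a trivalent (P = 3) network.
def next_valence(P):
    return 2 * (P - 1)

P, history = 3, [3]
for _ in range(4):
    P = next_valence(P)
    history.append(P)
print(history)  # [3, 4, 6, 10, 18]
```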

Later the two rules are combined into a transition law. I refer the reader to the actual paper for the details. Now I stop to make some general remarks. First of all, remember that canonical LQG is centered on pure gravity, i.e., without matter. Naively it was expected that matter could be introduced by attaching to the spin networks, their nodes and edges, additional information that would represent matter. But that idea turned out to be plagued with problems, especially with chiral fermions, anomalies, etc. Aside from more or less conventional developments, such as arXiv:gr-qc/9704013, this approach evolved into something that has become famous, the "octopi". You can read Lubos' entries, obviously not favourable, about that particular topic here or here.

To be honest, I didn't read the papers discussed by Motl, only his opinions and some related discussions in Physics Forums where Bilson-Thompson, coauthor of those papers, explained some aspects. The reason for not reading them was basically that to me the whole idea seemed like a very bizarre way to describe matter combinatorially. The "key idea" was expressed as "matter as a topological defect". Well, that's fine in some sense, but still it was canonical LQG and it had no time evolution. So who is interested in a very bizarre and incomplete description of non-evolving matter? Not me, for sure.

Well, as I mentioned before, I added an RSS feed to the blog, and recently I saw there a paper by Fotini. Fotini Markopoulou has been promoted, at least in a subliminal way, as a "new Einstein", which looks like some kind of obsession of Lee Smolin's. I was doubting, and still am, about writing an entry about this "new Einstein" affair, so I decided to read the paper, actually this one.

Around the same time I had finished reading a review paper about the status of M-brane interactions in M-theory just before the Bagger-Lambert revolution (it actually includes an intro to the Bagger-Lambert theory). M-theory doesn't contain a dilaton, and since in string theory the dilaton is associated with the string coupling constant, M-theory has no such constant and consequently no possibility of a perturbative series. This has some resemblance to one of the worst problems in LQG, the lack of a classical limit. Being "non-perturbative", LQG can't prove that there is a low energy limit of it that, ideally, would be Einstein's theory. In the paper Fotini states that she has an approach that could lead to that classical limit. The idea is to identify quantities in canonical LQG that are conserved under time evolution and to try to identify them with classical observables. After reading this entry the reader may suspect that reading "canonical LQG" and "time evolution" in the same phrase is somewhat shocking. That is what led me to do a fast reading of the paper, to locate the bibliography where that time evolution was explained (the one discussed in this entry, of course) in order to get an idea of how to evaluate the meaning of "conserved".

I am not going to explain the whole paper. Suffice it to say that the evolution rules of the previous papers have been related to something called "Pachner moves". She states that these moves can be shown not to change the topology of the simplicial complex that can be associated to a graph. This means that LQG doesn't allow topological transitions. Fotini claims that this is good because classical general relativity doesn't allow them either. I don't know exactly what she means, because in fact general relativity does allow them, at least if we restrict ourselves to the Einstein equations of motion; topology changes are forbidden only if additional restrictions are imposed. In fact the restrictions one needs to impose are precisely causality-related ones. It is not surprising that an evolution of spin networks based on causality restrictions doesn't allow topological changes either. String theory, on the other side, allows such transitions, by means, for example, of conifolds. I hope to post some day in detail about this very interesting topic.

Anyway, Fotini finds some invariants in the theory by means of the Pachner moves. It doesn't look like a terribly difficult task, at least superficially: if the Pachner moves conserve topology, every topological quantity will be conserved, won't it?

For some reason she makes a brief review of the theory of classification of two-dimensional surfaces, based on relatively simple tools such as cross-caps and, in general, basic homological algebra. This is fine for people who don't know it, but it is curious to read when string people use the most sophisticated topological machinery available without worrying about trivial things like the fact that most PhDs in math don't know such things. Anyway, she states that the conserved quantities can probably be associated to particles. In this way particles would emerge as topologically conserved quantities from pure quantum gravity. She also predicts the existence of an infinite number of particle families. Not too bad, at least if one forgets that there are some cosmological restrictions that heavily indicate that there are only three such families (the ones actually observed). If some additional family were ever observed, maybe LQG would gain a point. It is also notable that a few months ago, doing a search for "fermionic wormhole" in arXiv, I found an old paper by Smolin on that subject. There he tried to use the wormhole as a device to get matter from pure gravity. In fact I got the idea that the "octopi" idea was related to this purpose. This paper by Fotini, on the contrary, gets matter as a conserved quantity in an evolution that doesn't allow topology changes. As far as wormholes are related to topology changes (at least if one doesn't make some ad hoc considerations to avoid them), it looks as if the idea has gone through a warped twist.

I remind the readers once again of the usual disclaimer: I don't intend to present these posts as some kind of definitive review, nor to impose some kind of authority argument. My only intention is to present the flow of ideas, and I leave it to the reader to draw his own conclusions.

Sunday, May 25, 2008

Wormholes at the LHC?

The LHC is about to open. Hopefully supersymmetry and the Higgs boson will be found. But there are more things we might find there. The recent Planck 2008 congress was devoted to that topic. You can see blog entries covering it in Dmitry's (non equilibrium phenomena) blog, concretely this entry and previous ones (OK, the entry with the interview with Polyakov is not about Planck 2008, but it is worth reading anyway). Another congress, PPC 2008, was devoted to the LHC and string phenomenology in general. You can read blog entries about it in Tommaso Dorigo's blog, particularly in this entry and subsequent ones (be advised, there are a lot of them). Another coverage of the congress is here. If some reader has a lot of time and is interested in reading still more about scientific congresses and phenomenology, he could read the coverage that Marni Dee has made of Neutrinos 2008. At the moment of writing this entry her last post is this.

Well, I am going to add some fuel to the phenomenology of possible LHC predictions in this post. I am going to talk about wormhole production at the LHC. That possibility relies heavily on the large extra dimensions scenario. You can read a post in Tommaso Dorigo's series on PPC 2008 about the subject, this one. Also you could read my own post about large extra dimensions and warped geometries.

As I said in that entry, one thing that could possibly be found at the LHC, if the LED scenarios are true, is micro black holes. But that is not the whole story. One can find other things, such as p-branes or wormholes. You can read about the former in this paper. Be advised, there they talk about a somewhat peculiar type of p-branes, cosmic branes, introduced in this other paper. I am not going to describe those papers, just quote the abstract, which is self-explanatory:

We compute the cross section for p-brane creation by super-Planckian scattering processes
in a (n +4)-dimensional space-time with n−m flat large extra dimensions and m
flat small dimensions of size of the fundamental gravitational scale. We find that the
cross section for the formation of a brane with dimension 1 ≤ p ≤ m, completely wrapped
on the small dimensions, is larger than the cross section for the creation of a spherically
symmetric black hole. Therefore, in space times with asymmetric compactifications we
predict that branes are more likely to be created in super-Planckian scattering processes
than black holes. The higher rate of p-brane production significantly enhances possible
detection of non-perturbative gravitational events by future hadron colliders and cosmic
rays detectors.

Now I am going, at last, to the actual topic of this post. I will explain the results of two papers, Time Machine at the LHC and If LHC is a Mini-Time-Machines Factory, Can We Notice?.

The first paper faces the question of the actual calculation of the probability of the wormhole being formed; the second, the signs it could leave in the LHC if it actually forms.

The first paper begins with a brief review of how the calculation of black hole formation can be carried out. That subject is generically known under the name of "Planckian scattering". It begins with a "quantum gravity" approach, based on the Wheeler-DeWitt formalism. There the possibility of black hole formation is calculated by considering the kernel of the following transition amplitude:

The key in that approach is to study the transition between geometries describing two particles and the geometry describing black holes (or wormholes). For the details the authors refer to the paper arXiv gr-qc/9404036, which, unfortunately, is unavailable, seemingly due to a corruption in the LaTeX that arXiv uses to render the PDF.

The other approach discussed in that brief review of black hole formation is to suppose that the ultra-relativistic particles are represented by plane gravitational waves, which collide and produce a black hole. For ordinary 3+1-dimensional gravity the energies required are not available at the LHC, but in the LED scenarios things change. The Schwarzschild radius of a (4+n)-dimensional black hole of mass M = √s is, up to numerical factors of order one, approximately r_s ~ (1/M_{4+n}) (√s / M_{4+n})^{1/(n+1)}.

(s is the square of the center-of-mass energy of the colliding particles, M_{4+n} is the (4+n)-dimensional Planck mass, and the 4-dimensional Planck mass is given, schematically, by M_4^2 ~ V_n M_{4+n}^{n+2},

where V_n is the volume of the extra dimensions.) This problem can be tackled using the Aichelburg-Sexl metric, which describes a particle at ultra-relativistic speed and is obtained by applying a Lorentz boost to the Schwarzschild metric, together with some convenient changes of variables. In the article the details of the calculations are not presented and the author refers to the literature. Maybe some reader is surprised that string theory is not mentioned in this article's analysis of black hole formation. I guess it is a good idea to say a few things about it. Of course, the ultimate reason to even bother about the practical formation of black holes depends on warped dimensions. Although warped dimensions can be studied by classical general relativity in 4+1 dimensions (for one warped dimension), the practical motivation for doing so depends on string theory. OK, that was clear, I guess, so I'll say a few quick things about string theory and black holes. String theory can describe black holes of a special type, Reissner-Nordström black holes, using Dp-branes, and can calculate their entropy. But the approach used to describe those black holes doesn't say anything about their actual formation. As early as the eighties that question was studied by Veneziano, who calculated the probability of black hole formation by graviton exchange in a string-theoretic description. I don't really know why exactly those approaches are not followed, but at least I leave a record of their existence.
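To get a feel for the numbers, here is a minimal Python sketch of the parametric estimate above. The order-one numerical factors (which depend on conventions and on n) are deliberately dropped, so treat this as an order-of-magnitude illustration, not the paper's exact formula.

```python
# Order-of-magnitude sketch (my own, not the paper's code): Schwarzschild
# radius of a (4+n)-dimensional black hole in natural units (hbar = c = 1),
# keeping only the parametric dependence r_s ~ (1/M_D) * (M/M_D)^(1/(n+1)).

def schwarzschild_radius(mass_tev, m_d_tev, n):
    """Radius in units of 1/TeV; O(1) convention-dependent factors dropped."""
    return (1.0 / m_d_tev) * (mass_tev / m_d_tev) ** (1.0 / (n + 1))

# A hypothetical 5 TeV black hole, fundamental scale M_D = 1 TeV, n = 2:
r = schwarzschild_radius(5.0, 1.0, 2)      # in 1/TeV
r_meters = r * 0.1973e-18                  # hbar*c ≈ 0.1973e-18 TeV·m
```

The point of the exercise is that for M_D near a TeV the radius is of order 10^-19 m, i.e. within reach of LHC partonic impact parameters, which is the whole premise of the LED black hole (and brane/wormhole) production scenarios.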

The article continues with a very brief introduction to wormholes. That is a topic widely discussed on the net, and the reader can find references to it in many places. To avoid making this entry too long I have written a separate entry on my other blog. I must apologize to English readers because that entry is written in Spanish; anyway, the actual URL for it is this.

Anyway, in that entry I only discuss the basics of wormholes. For the article discussed here it is necessary to consider wormholes in a braneworld scenario. There, where the Universe is considered as a 3-brane embedded in a D-dimensional bulk, the four-dimensional Einstein field equations contain an effective four-dimensional stress-energy tensor:

The effective energy-momentum tensor is the sum of the stress-energy tensor of matter confined on the brane, T_μν, and correction terms that arise from the projection of the D-dimensional Einstein equations onto the four-dimensional spacetime. For some particular examples it is possible to show that the four-dimensional effective stress-energy tensor violates the NEC (null energy condition) while the total five-dimensional stress-energy tensor respects it.

After discussing how wormholes are described in braneworld scenarios, the actual question of their formation is addressed. The approach followed mimics the one used for black holes. It is not very detailed, and it is based on the following cross section (schematically, the standard parton-model convolution): σ(s) = Σ_{ij} ∫_{τ_m}^{1} dτ ∫_{τ}^{1} (dx/x) f_i(x) f_j(τ/x) σ_{ij→wh}(τs).

Here √s is the center-of-mass energy, x and τ/x are the parton momentum fractions, and the f_i are the parton distribution functions. The parameter τ_m = M²_min/s, where M_min corresponds to the minimum mass for a valid wormhole description.

σ_{ij→wh}(s) is the geometrical cross section for wormhole production, and it depends on a form factor F. The form factor F(√s/M_D) incorporates the theoretical uncertainties in the description of the process, such as the fraction of the initial center-of-mass energy that goes into the wormhole and the distribution of wormhole masses as a function of energy. These corrections are similar to the corrections in the formula for black hole production.
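As an illustration of how such a parton-convolution cross section is evaluated in practice, here is a toy numerical sketch. The parton density f(x) and the geometric sub-process cross section below are made up for the example (they are not the paper's inputs), and the form factor F is set to 1.

```python
# Toy numerical sketch (my own illustration) of the convolution
#   sigma(s) = sum_ij ∫_{tau_m}^1 dtau ∫_tau^1 dx/x f_i(x) f_j(tau/x) sigma_hat(tau*s)
# using a made-up parton density and a geometric sub-cross-section pi*r_s^2.
import math

def f(x):
    # toy valence-like parton distribution, NOT a real PDF fit
    return 6.0 * x * (1.0 - x)

def sigma_hat(s_hat, m_d=1.0, n=2):
    # geometric cross section pi*r_s^2, r_s ~ (1/M_D)(sqrt(s_hat)/M_D)^(1/(n+1))
    rs = (1.0 / m_d) * (math.sqrt(s_hat) / m_d) ** (1.0 / (n + 1))
    return math.pi * rs * rs

def sigma(s, m_min, steps=200):
    """Midpoint-rule evaluation of the double convolution integral."""
    tau_m = m_min ** 2 / s
    total = 0.0
    dtau = (1.0 - tau_m) / steps
    for i in range(steps):
        tau = tau_m + (i + 0.5) * dtau
        inner, dx = 0.0, (1.0 - tau) / steps
        for j in range(steps):
            x = tau + (j + 0.5) * dx
            inner += f(x) * f(tau / x) / x * dx
        total += inner * sigma_hat(tau * s) * dtau
    return total
```

Lowering the threshold mass M_min opens up more of the τ integration range, so the toy cross section grows, mirroring the qualitative behaviour of the black hole production estimates.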

The second paper centers on the description of the observable traces of the existence of a wormhole. Much emphasis is placed on the possibility of the wormhole behaving as a time machine. For this, its two mouths must move apart at a considerable speed for some lapse of time. In any case, the authors prefer to speak of an MTM (mini time machine) instead of a wormhole. The actual possibilities of detection discussed in the paper can be summarized as follows:

(i) change of the energy spectrum due to the frequency-filtration property of MTM,
(ii) possible production of anomalously energetic particles, accelerated by passing many times
through gravitational field inside the MTM,
(iii) acceleration of particle decays, since the proper time of a particle moving inside MTM can
strongly exceed the laboratory time,
(iv) CPT and naive unitarity violation (thermalization) due to effective non-local interactions
caused by MTM and to possible ambiguity in the population of closed world-lines inside MTM,
(v) collective effects due to conversion of a single particle into a bunch of its co-existing copies
within the MTM
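Point (iii) is easy to quantify with the usual exponential decay law: if the proper time elapsed by a particle traversing the MTM exceeds the lab-frame expectation, a larger fraction of the particles decays. A minimal sketch (the factor of 10 below is a made-up illustration, not a number from the paper):

```python
# Illustration of signature (iii): decay acceleration when the proper time
# accumulated inside the MTM exceeds the naive lab-frame value.
import math

def survival_probability(proper_time, lifetime):
    # fraction of unstable particles surviving after the given proper time
    return math.exp(-proper_time / lifetime)

p_normal = survival_probability(1.0, 5.0)   # proper time = lab-frame expectation
p_mtm = survival_probability(10.0, 5.0)     # hypothetical 10x amplification inside the MTM
```

An anomalous deficit of long-lived particles (relative to standard kinematics) would then be the observable trace.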

Sunday, May 18, 2008

Black hole information distortion paradox

A few days ago a friend of mine, a theoretical physics graduate, though not an active physicist nowadays, and an occasional reader of this blog, let me know of a news item in the media about a resolution of the "black hole information paradox". The piece was published on many websites, for example here. Around the same time a thread was opened on Physics Forums about the topic, namely Physicists Demonstrate How Information Can Escape From Black Holes, based on LQG.

OK, I suppose I should somehow give an opinion about the news. I have waited a bit to see whether, apart from the Physics Forums people, some of the big (or even not so big) names of the blogosphere said something about it. After all, Ashtekar, the main author of the paper behind the news, is one of the greatest personalities in LQG (the whole field begins with a work of his on new variables in canonical gravity) and the theme is very catchy, to say the least. For some reason there has been no such entry in the reference blogs, so I'll try to give my humble opinion on the matter.

First of all, let me say that I find that the claims in the news item somewhat distort the actual nature of the achievement. Supposedly the paper solves the question in the framework of LQG. After all, in the news release you can read:

"Once we realized that the notion of space-time as a continuum is only an approximation of reality, it became clear to us that singularities are merely artifacts of our insistence that space-time should be described as a continuum."

The idea of discrete spacetime strongly suggests that they are talking about LQG. Well, in the Physics Forums post someone pointed to the arXiv paper behind the news release, namely this one, Information is Not Lost in the Evaporation of 2-dimensional Black Holes. The first bad thing comes in the title: "2-dimensional black holes". That is, they solve the problem in a simplified model, which always leaves open the possibility that the problem cannot be solved in the full setting; after all, lower-dimensional quantum gravity is very different from the 4-dimensional case.

Anyway, let's see what is going on. In the last post I talked about LQG and gave a brief description of how LQG treats black holes (or at least one of the ways it does when facing the singularity problem). As not every reader of this blog can be assumed to speak Spanish, I'll re-explain it. They don't work in the full LQG framework but in a model with reduced symmetry. They take the Schwarzschild solution (a solution of the vacuum Einstein equations, static and with radial symmetry) and write the Hamiltonian constraint equation for it. They treat the radius as a discrete time coordinate. That results in a difference equation that can be solved, and they show that they can evolve the solution to negative values of the radius, so they seemingly avoid the central singularity. Before reading the paper of Ashtekar et al. I tried to figure out how they could have proceeded. To begin with, the information paradox is related to matter in the vicinity of the horizon, so they would need to introduce matter into the description in some way or another. The original work of Hawking that raised the whole problem used a fixed Schwarzschild background and a scalar field propagating on it. From the properties of field quantization in curved backgrounds it was known that a vacuum state containing no particles for one observer is transformed, by a Bogoliubov transformation, into a state containing particles for another, non-inertial observer. Playing with that, and with the conformal diagrams of black holes, Hawking derived that black holes actually emit radiation in thermal equilibrium. That raises the problem that the black hole can evaporate through that process. But the black hole was formed from matter in a pure state, and the thermally described radiation is in a mixed state, so the evolution would not be unitary (that is a brief description of the problem we are treating, for the sake of anyone who doesn't know it).
Canonical LQG, the flavor in which Ashtekar usually works, normally treats pure gravity, although it can also describe non-fermionic matter. Knowing that, I thought they would use some variant of the singularity-removal approach including a Klein-Gordon field. Well, I was too naive.
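For reference, the endpoint of Hawking's argument is the famous temperature T_H = ħc³/(8πGMk_B); the thermal character of the emitted spectrum is exactly what makes the evaporation look non-unitary. A quick numerical check of this standard formula (nothing to do with the Ashtekar et al. paper):

```python
# Standard Hawking temperature T = hbar*c^3 / (8*pi*G*M*k_B), in SI units.
import math

HBAR = 1.0546e-34   # J*s
C = 2.998e8         # m/s
G = 6.674e-11       # m^3 kg^-1 s^-2
K_B = 1.381e-23     # J/K
M_SUN = 1.989e30    # kg

def hawking_temperature(mass_kg):
    return HBAR * C ** 3 / (8.0 * math.pi * G * mass_kg * K_B)

t_sun = hawking_temperature(M_SUN)  # ~6e-8 K: astrophysical holes barely radiate
```

The temperature scales as 1/M, so evaporation is negligible for stellar black holes and only becomes dramatic for the microscopic ones discussed in the previous post.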

They work with something called "Callan-Giddings-Harvey-Strominger (CGHS) black holes". I had no previous knowledge of that model, but the names behind it sounded "stringy" to me; in particular, Strominger is mainly a string theorist. Well, I was not wrong this time. The paper begins with a brief description of the Hawking problem and some previous approaches to its solution (one by Hawking himself, an approach based on the Maldacena AdS/CFT correspondence), and right after that it talks about some work from the early nineties trying to solve it in a toy two-dimensional model, the CGHS.

Just before writing the actual equations of the model, the authors make a rather curious remark:

"Although our
considerations are motivated by loop quantum gravity, in
this Letter we will use the more familiar Fock quantization
since the main argument is rather general"

So they are telling us that we are not going to see a formalism related to LQG, although LQG is behind the scenes. Well, that means we must take LQG on faith, but we are not going to see it. OK, let's believe, at least for a while. Let's look at (part of) the Lagrangian describing the model (for reference, the CGHS action is, schematically, S = (1/2π) ∫ d²x √(−g) [e^(−2φ)(R + 4(∇φ)² + 4λ²) − ½(∇f)²]):


Now is when one can begin to be really surprised. We see that φ is said to be a dilaton. But a dilaton is a field related to string theory; all the string theories have a dilaton. So we are in a model inspired by string theory (an aspect that is not mentioned anywhere in the paper). R is the curvature and f is a scalar field. Well, OK, no problem, one would expect their appearance.

After that they introduce the equations associated with the Lagrangian and begin the task of finding solutions resembling a black hole suited to their purposes. First they tackle the classical case. They do it in a perturbative, recursive way. That is, they choose a candidate metric, calculate the stress tensor of the fields, and reinsert it into the Einstein equation. By doing that they find that the metric can develop a singularity that they can identify as a proper black hole.

After that they consider a quantized version (they add hats xD). No, seriously, they use a Fock-space treatment (in the spirit of the Wald approach to quantization in curved backgrounds, but this time quantizing the metric as well). They face the question of quantization (solving the commutator equations, so to speak) by a bootstrapping procedure, a recursive scheme similar to the classical one. They do the usual thing of identifying expectation values with the classical solutions, but they hit a problem when the metric becomes singular, and they cannot continue the bootstrapping. After that they use another procedure, a mean-field approximation (MFA). They argue that the part relevant for solving the information paradox depends on the behaviour of the MFA near future null infinity, and with three extra assumptions (they explain that two of them are commonly accepted and that the other is very natural) they can calculate the S-matrix and see that it is unitary.
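The recursive scheme they describe is, structurally, a fixed-point iteration. A toy Python sketch of the idea (the scalar "field equation" below is an arbitrary stand-in of mine, not the CGHS equations): start from a trial solution, compute the source it generates, feed it back, and stop when the answer reproduces itself.

```python
# Toy illustration of the bootstrapping scheme: iterate
#   solution <- equation(source(solution))
# until self-consistent. Here the "equation" is just x = 1 + 0.2*x**2,
# standing in for: metric <- Einstein equation sourced by <T[metric]>.

def source(x):
    return 0.2 * x * x  # placeholder for the stress tensor of the fields

def bootstrap(x0=0.0, tol=1e-12, max_iter=1000):
    x = x0
    for _ in range(max_iter):
        x_new = 1.0 + source(x)
        if abs(x_new - x) < tol:
            return x_new          # self-consistent solution found
        x = x_new
    raise RuntimeError("no self-consistent solution found")
```

The failure mode the paper runs into has an analogue here too: for a sufficiently strong coupling the iteration has no fixed point and diverges, just as their bootstrapping breaks down once the metric becomes singular.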

Well, the details of how valid the assumptions are (2-dimensional spacetime, the MFA, the asymptotic regions, etc.) are something that, fortunately, I don't need to judge. The key point I want to raise is that what we see in the paper is very, very far from any formalism related to LQG. So to claim that this can be seen as a triumph of LQG, unless they produce a future paper (or someone points out that I am missing something important) where they adapt the calculations to something more LQG-like, is to somewhat distort the truth. Or, at least, a too propagandistic deformation of the facts ;-).

P.S. Seriously, I would like to leave these tasks to the famous physicist bloggers. For example, I am still waiting for Sean Carroll to post about the 't Hooft paper I wrote about two posts above.

Tuesday, May 13, 2008

The trouble with LQG

Through the excellent blog física en la ciencia ficción I arrived at another blog, la bella teoria. I suppose, though I'm not certain, that "the beautiful theory" refers to string theory, which at one point some people considered mathematically beautiful and elegant. In any case I saw that the blog dealt fairly often with string theory topics (though not exclusively). I also noticed that some of those entries contained elementary errors. In fact I pointed one out to the author in one of the entries, and he replied thanking me for the notice. Since then I have seen the occasional further error, but I haven't bothered to point them out one by one. I don't consider myself an authority on string theory (does anyone besides Lubos Motl dare to assume that role? xD), and I am aware that this blog probably contains errors here and there as well. Besides, the tone of that blog is clearly popular-science, aimed at a general audience, and the entries are written in a pleasant, careful way (with photos and such). Given that there are not many blogs in Spanish (if there is any other at all) serving a similar function (I, for example, aim this blog at readers with a considerably higher level of background), it makes no sense to go around pointing out small slips, which I suppose could become irritating.

The thing is that the author has now gone and read Lee Smolin's polemical book "The Trouble with Physics". After presenting it in one entry he published a second one titled Gravedad cuántica, continuando la revolución de Einstein. Since it is very unlikely that the people reading his blog (and apparently the author himself) are aware of the details of the "string wars" on the blogs of Motl, Distler, String Coffee and Not Even Wrong, and of the criticisms that string theorists have made of the current state of LQG, I left a reply, which has received no further response, summarizing some of them. Because of its interest for the readers of this blog I reproduce that reply here:

"Salvador, lo que estas comentando es lo que se conoce cómo LQG. loop quantum gravity, o gravedad cuántica de lazos en español.

Since in your previous post you talked about the book "The Trouble with Physics", I assume you took the information for this post from there. I'll take the chance to comment on both topics in this reply.

I have been following LQG, at a technical level, not a popular one, since 2003, when it was made popular through a skillful publicity maneuver by some of its leading representatives. It was a very well-sold product, above all through a review paper by Thomas Thiemann. They present the general idea, explaining how the canonical quantization program leads to the area operator being quantized. Then there are arguments that, via something known as "doubly special relativity", which extends the formalism of special relativity to the case of a privileged length (the Planck length), one arrives at a vacuum speed of light that depends on frequency, and then they point out that the GLAST satellite could test this within a short time (its launch was scheduled for 2005).

From there they go on to argue that they derive the entropy of a black hole, reproducing the formula known from the semiclassical approximation, which says it depends on a power of the area. They don't go into the details, which rely on a little-known general relativity construction called "isolated horizons" (and also on the related concept of "dynamical horizons"). Since almost nobody is familiar with those concepts, on a first reading one fails to notice that the calculation assumes, as its starting point, that the entropy depends on the area. Moreover, LQG has a free parameter, known as the Immirzi parameter. By tuning its value the formula obtained can be made to fit. In the end, the only halfway serious point of the calculation is that the area is quantized and that one has to divide the total area by the minimal unit of area to find the number of possible microstates.

Then they move on to cosmology. They announce that they manage to demonstrate the "bounce", i.e. that there can be no big crunch, because using quantum corrections one finds that the collapse stops at a minimal radius and then there is a new big bang. Of course, this is not done within the full theory but by restricting to "minisuperspaces", that is, subsets of the possible metrics. Fine, it is admitted that this is an approximation; the problem is that they give no criterion allowing one to get an idea of how good or bad that approximation is.

By the way, canonical LQG, based directly on Einstein's classical gravity and then quantized in a strange, supposedly non-perturbative way, is static. To have a dynamical spacetime one has to go to a Lagrangian formalism (the spin foams). The thing is that spin foams do not start from Einstein's relativity but from classical theories which, with certain constraints, are more or less classically equivalent to it. Moreover, there are several such classical theories and each one leads to an essentially different quantum theory.

And well, those are the two main lines, but besides them there are others that assume from the outset that spacetime is discrete, which is assuming a lot. And the relation between all of them is not very clear.

More important is that these quantum theories should have a classical limit reproducing Einstein's classical gravity. Well, they have not achieved that in any halfway clear manner (the best they have is to obtain, through a series of approximations of dubious justification, within the framework of some spin foams, something that should be similar to the Newtonian potential).

So, I am not saying that LQG is nonsense, but it certainly has many rather serious problems.

Regarding its best point, the possible confirmation of the dispersion of light in vacuum: in the end the GLAST satellite has taken a very long time to launch (it should be about to go up around now). A ground-based experiment gave a preliminary result along those lines a few months ago, but it is not conclusive (and there are certain string theories, rather strange ones it must be said, Liouville strings, that predict something similar; in fact it was a string physicist who made that experiment popular).

And, in case it is not clear, string theory is also a quantum gravity, so one cannot present this as if it were the state of the art in quantum gravity while leaving string theory aside.

Anyway, I don't think it's bad that you comment on LQG, but be careful: all that glitters is not gold with these people."

Well, after that entry Salva has published another one, Más allá de los agujeros negros. I have not read "The Trouble with Physics", only the commentaries on it, some favorable (within the LQG community), some adverse, so I am not certain, but I strongly suspect, that the material in that entry comes from the book. In any case, what I do know is the specific topic treated (black holes in LQG), since at the time I read a few of the original articles on the subject. The entry itself contains no errors, but I think it is worth completing the reply I reproduced above by explaining something about the black hole singularity in LQG.

The treatment is inspired by the formalism of canonical LQG, that is, it starts from the Hamiltonian formalism. Just as was done in cosmology with the Friedmann-Robertson-Walker models, one chooses a subset of metrics, to begin with the Schwarzschild metric, which describes the static, spherically symmetric gravitational field assumed to be produced by a spherical mass distribution, ideally concentrated at a point, and one builds the Hamiltonian corresponding to that metric. Recall that in canonical gravitation the Hamiltonian consists exclusively of what are known as constraints (a constraint is a relation that arises, in so-called singular systems, because it is not possible to invert the transformation that would give the "momenta" in terms of the "velocities"). We thus have the same problem I mentioned before for LQC (loop quantum cosmology): we are eliminating degrees of freedom by restricting to a subset of metrics, hence making an approximation. The bad part is that there is no estimate of the error committed in that approximation. String theory people argue that too many degrees of freedom are being eliminated and that the approximation is totally unjustified.

There is, however, one respect in which the treatment of black holes differs from LQC. In FRW we have a time variable; in a Schwarzschild-type black hole solution we do not. The way to proceed relies on the fact that the kinematical constraint of LQG can be solved in terms of spin networks. There is an infinite, but countable, number of spin networks. Well, the "time evolution" is, in a sense, the action of the Hamiltonian on the basis of those spin networks. The constraint takes the form of a difference equation where time is the index in the recurrence relation. For this particular case the spin-network index becomes the radius, that is, the radius plays the role of time. I am not 100% convinced that I fully believe that interpretation, but I admit it has a certain ingenuity. Moreover, in classical gravity it is well known that the radius does play the role of time inside the black hole's event horizon (because of the sign change of the metric components there). In any case, what one does is solve the difference equation giving the evolution. Unlike the geodesic of classical general relativity, which ends at the center of the black hole in finite proper time, where the metric becomes infinite (a genuine singularity, not the product of a bad choice of coordinates, like the one at the event horizon), this evolution can be extended to negative values of the radius. The interpretation of that result is something on which the articles made no clear pronouncement, but it was argued that it could be similar to the "baby universes" popularized by Hawking. Those "baby universes" arose in the Euclidean approach to quantum gravity (Hawking's favorite formalism). There one works with Euclidean metrics (the result of changing the Einstein metric from real time to imaginary time).
In those theories the central singularity can also be evaded by going to imaginary time, which is interpreted as the creation of a "baby universe". I don't know Euclidean quantum gravity well, but I believe those results were obtained there by making a WKB-type approximation, so they were not entirely firm. In fact I had always viewed LQG as a version of Euclidean quantum gravity in a more mathematically precise formalism, rather close to its spirit (something string theorists do not agree with). In any case, that is the sequence of reasoning that leads to evading the singularity at the center of a black hole in LQG. The publication date of Smolin's book more or less coincides with the date of the articles I read at the time, so it does not cover the later developments of that work. The truth is that those works on black holes were almost the last thing I read in detail about LQG; since then I have followed it only superficially, through the Physics Forums posts (with the exception of Reuter's work on the renormalization group, which I still have pending to present here one of these days). I understand that what was done was to generalize the results to broader sets of metrics (especially in LQC, though something was also done for black holes) to see whether in those broader sets the results held up and were not merely due to restricting to metrics with excessive symmetry. Apparently the essential part of the results survived, but I can no longer give details (if there is any LQG expert among the readers, corrections would be appreciated).
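The role of the difference equation can be illustrated with a toy example (mine, not the actual LQG Hamiltonian constraint): a recurrence in a discrete label n that remains perfectly well defined at n = 0, so the solution can be continued to negative n, which is the discrete analogue of evolving through the point where the continuum equation would be singular.

```python
# Toy difference equation in a discrete "radius" label n. The recurrence
#   psi_{n+1} = 2*cos(theta)*psi_n - psi_{n-1}
# is solved exactly by psi_n = cos(n*theta), so the evolution passes through
# n = 0 without any divergence and continues to the other sign of n.
import math

def evolve(psi_prev, psi_curr, n_start, n_steps, theta=0.1):
    c = 2.0 * math.cos(theta)
    values = {n_start - 1: psi_prev, n_start: psi_curr}
    for n in range(n_start, n_start + n_steps):
        values[n + 1] = c * values[n] - values[n - 1]
    return values

# Start at negative "radius" and evolve through n = 0 to positive values:
vals = evolve(math.cos(-1.1), math.cos(-1.0), -10, 20)
```

Nothing special happens at n = 0 in the discrete evolution, which is the (much simplified) flavor of the LQG claim that the discrete constraint equation continues through what the continuum theory sees as the central singularity.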

Anyway, may this post serve as a summary of some common objections to LQG, presented in a way that, though cursory, has a minimum of detail.