It would be easy to suppose that by following a few selected blogs and reading discussions in the appropriate forums one is aware of "what's going on" in quantum-gravity-related issues. Well, Ian Malcolm's law, you know, "98% of what you believe is false", strikes again. If you follow the links sections of some of the best-known blogs (Motl, Distler, Woit, etc.) you find that many of the linked blogs are not about quantum gravity, and many of those on the QG side are sparsely updated or contain a lot of non-physics entries. The result is that a lot of papers, or even entire lines of research, can easily go unnoticed. Perhaps the most complete list of papers is this thread in Physics Forums. And of course you could simply follow arXiv (if you didn't notice, I added an RSS feed with the last five entries on hep-th to the blog). But although Marcus, the author of the cited thread, makes some brief comments about the papers he links, it is hard to get a proper idea of their relevance.
In this blog I have tried to give an overview of various (surely not all) lines of research, even those not too likely to be viable in the long run. I have covered things such as conformal quantum gravity, Garrett Lisi's E8, topological geometrodynamics and, recently, the new idea of 't Hooft (I am still waiting for the paper covering the quantum part). Most of these approaches have almost no follow-up. For example, the last paper of 't Hooft has only one citation so far, despite his being a top-notch physicist.
This means that we are left with only three approaches with wide coverage: string theory, of course, LQG and, far behind these two, the noncommutative geometry of Alain Connes. Well, anyone who has read the book "Three Roads to Quantum Gravity" by Lee Smolin (I didn't) will be somewhat surprised that I don't mention the twistor theory of Roger Penrose. After reading the last chapter of Penrose's book "The Road to Reality", which describes that theory and its state of development, I think it is too far from being a theory of quantum gravity. If I had to choose I would bet on the Euclidean quantum gravity approach advocated by Hawking instead, but I am not sure whether anyone keeps publishing on that topic nowadays.
Some people find this discouraging. For example, Matti Pitkänen, the creator of TGD, asked some months ago why string theory was so dominant. I guess the reason is easy to understand. String theory relies strongly on ideas from particle physics and quantum field theory. Since most theoretical physicists have that background, it is very natural that from the standard model they would go on to string theory. That also explains why LQG has a reasonably good number of followers: it is the natural approach to quantum gravity for people deeply involved with the ideas of general relativity. In fact, many of the people who a few years ago placed their hopes in twistors or Euclidean quantum gravity have possibly gone into LQG nowadays. And it also explains the relative success of NCG: mathematicians simply follow the work of a Fields Medalist on a framework based on the standard model, which is a theory expressible in fiber-bundle language, so they can learn it relatively easily, and it is close to the things they already know. People simply follow natural developments of the theories they know well. Or at least that is my opinion.
OK, this has been a long prelude to somewhat justify why I am writing about the actual topic of this post, which pertains to LQG research, after having said in previous posts that I was becoming progressively more centered on string theory and more skeptical of LQG. Well, the reason is that I hadn't previously seen any discussion of this particular topic, and I think it deserves some attention. Surely there are people out there far more qualified to cover it in a blog or forum, but as far as I know nobody did.
Let's set the stage to understand the problem. In canonical LQG one tries to solve the constraints that appear when going from the Einstein-Hilbert Lagrangian to a Hamiltonian. Taking a slice of spacetime and changing to the Ashtekar variables, one ends up with three constraints: a Gauss constraint, a diffeomorphism (or kinematic) constraint and a dynamical constraint. The first two can be solved in terms of what is known as a spin network (which is now going to be very popular in the string community as well, because spin networks have recently been used, in a very different framework, to almost prove the Maldacena conjecture). Basically, a spin network is a graph. The edges are labelled by representations of SU(2) and the nodes by intertwiners, which are distinct ways of extracting the identity representation from the product of the representations of the incident edges. Loosely speaking, a given spin network represents the gravitational field at a given instant of time. If the canonical LQG program were complete, one would have a dynamical constraint giving the time evolution of a given spin network. But that has proved to be a very hard task. I thought that because of this LQG people had gone over to spin foams or to causal triangulations. What I didn't know is that there is a relationship between canonical LQG and causal triangulations. I learned of it by a somewhat unusual route, beginning with a recent paper by Fotini Markopoulou that led me to a paper by her and Lee Smolin, arXiv:gr-qc/9702025. There they argue that, according to an idea of Penrose, the very notion of fluctuations of the metric could be invalid in a theory of quantum gravity. The reason is that causality depends on the metric, so if the metric fluctuates the window is open for violations of causality. In view of this, Penrose believes it would be of greater interest to keep track of causality, letting the quantum fluctuations blur the meaning of points and events instead.
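To make the combinatorics concrete, here is a minimal toy sketch of a trivalent spin network (the graph and all names are purely illustrative, not taken from any LQG code), together with the SU(2) admissibility condition that an intertwiner at a trivalent node requires: the three incident spins must satisfy the triangle inequality and have an integer sum, so that the identity representation appears in their tensor product.

```python
def admissible(j1, j2, j3):
    """True if an SU(2) intertwiner exists for spins j1, j2, j3,
    i.e. the trivial rep appears in j1 x j2 x j3:
    triangle inequality plus integer total spin."""
    return abs(j1 - j2) <= j3 <= j1 + j2 and (j1 + j2 + j3) % 1 == 0

# Toy "theta" spin network: two trivalent nodes joined by three edges,
# each edge carrying a spin (an SU(2) irrep label).
edges = [("a", "b", 0.5), ("a", "b", 0.5), ("a", "b", 1.0)]

def spins_at(node):
    """Spins on the edges incident to a given node."""
    return [j for (n1, n2, j) in edges if node in (n1, n2)]

# Each trivalent node must carry an intertwiner, so its three
# incident spins have to be admissible.
for node in ("a", "b"):
    j1, j2, j3 = spins_at(node)
    print(node, admissible(j1, j2, j3))  # both print True
```

Note that, e.g., three spin-1/2 edges at a node would not be admissible (their sum is half-integer), which is the kind of constraint the intertwiner labels encode.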
Smolin and Markopoulou propose a somewhat different approach.
The key point is that LQG is a theory where the area operator has a discrete spectrum on the spin networks, and that suggests that there are a discrete length and time. In this framework there are discrete quantum analogues of both null rays and spacetime events. The latter are sharply defined because they are defined in terms of the coincidence of causal processes. Quantum amplitudes are then defined in terms of sums over histories of discrete causal structures, each of which is constructed by a set of rules that respect its own causal relations. This is, I guess, the general justification of the causal-triangulations approach (I have never read a dedicated paper on that topic). The interesting thing is how they relate it to canonical LQG. The paper explains quite well how they proceed:
Each such structure, which we take as the discrete analogue of a spacetime,
is foliated by a set of discrete spatial slices, each of which is a combinatorial
spin-network. These discrete “spatial slices” are then connected by “null”
edges, which are discrete analogues of null geodesics. The rules for the
amplitudes are set up so that information about the structure of the spin
networks, and hence the quantum state, propagates according to the causal
structure given by the null edges.
The dynamics is specified by a set of simple rules that both construct
the spacetime networks, given initial spin networks, and assign to each one a
probability amplitude. Each spacetime net is then something like a discrete
spacetime.
But as I said before there is no dynamic constraint in LQG, where do these "simple rules" come from? By consistency with microcausality. I guess that the whole point is to use the rules of causal triangulations to the spin networks instead of simplices (If I am not wrong the causal triangulations are rooted in Regge calculus which is based in a discrete approach to path integral using simplices to represent space time). Supposedly he gives a rules that could led to relate the theory to some topological field theories in one side, and to some percolation models for which the renormalization group flow could be well studied. I find curious, to sy the least, that in a theory whose initial purpose was to do a canonical formulation the stop in the middle of the program and regret to a path integral, which is more naturally associated to Lagrangians, but, ok, let´s see what follows, here is idea of how goes the first rule for the 2d case:
Rule 1
Consider an initial spin network Γ0, which consists of a set of edges eij and
nodes ni (where eij connects the two nodes ni and nj). To obtain the 2 + 1
dimensional version of the theory we will restrict Γ0 to be trivalent, which
means it can be embedded in a two dimensional surface.
The first evolution rule constructs a successor network Γ1 together with a
set of “null” edges which each join a node of Γ0 to Γ1. The rule is motivated
by the idea that the null edges should correspond to a discrete analogue of
null geodesics joining spacetime events.
...............
The result of this rule is a spacetime spinnetwork G01 bounded by the
two ordinary spin networks Γ0 and Γ1 whose nodes are connected by a set
of null edges. In general a spacetime spin network (or spacetime net, for
short) will consist of a set of N ordinary spin network, Γi, i = 0, 1, ...,N,
together with a set of null edges that join nodes of Γi to nodes of Γi+1.
After that comes a more concrete justification of how causality leads to that rule, followed by a clarification of how to do calculations with it. Later the second rule is introduced and, first, justified; this is the justification:
Rule 2
We might just apply Rule 1 over and over again, but the result would be that
each successor spin network has nodes of higher and higher valency. (This
is easy to see, if each node of Γn has valence P, each node of Γn+1 will have
valence 2(P − 1).) To prevent this from happening we need a second rule
that lowers rather than raises the valence of the nodes.
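The valence growth the authors mention is easy to check by iterating the map P → 2(P − 1); here is a quick sketch of mine (not code from the paper): P = 2 is the only fixed point, and any trivalent starting network blows up, which is exactly why Rule 2 is needed.

```python
def next_valence(p):
    """Node valence in the successor network after one application of Rule 1."""
    return 2 * (p - 1)

p = 3  # start from a trivalent spin network
history = [p]
for _ in range(5):
    p = next_valence(p)
    history.append(p)

# Valences grow without bound under Rule 1 alone.
print(history)  # [3, 4, 6, 10, 18, 34]
```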
Later the two rules are combined into a transition law. I refer the reader to the actual paper for the details. Now I stop to make some general remarks. First of all, remember that canonical LQG is centered on pure gravity, i.e., without matter. Naively it was expected that matter could be introduced by assigning additional information to the spin networks, their nodes and edges, that would represent matter. But that idea turned out to be plagued with problems, especially with chiral fermions, anomalies, etc. Aside from more or less conventional developments, such as arXiv:gr-qc/9704013, this approach led to something that has become famous, the "octopi". You can read Lubos's entry about that particular topic, obviously not favourable, here or here.
To be honest, I didn't read the papers discussed by Motl, only his opinions and some related discussions in Physics Forums, where Bilson-Thompson, coauthor of those papers, explained some aspects. The reason for not reading them was basically that to me the whole idea seemed like a very bizarre way to describe matter combinatorially. The "key idea" was expressed as "matter as a topological defect". Well, that's fine in some sense, but it was still canonical LQG and it had no time evolution. So who is interested in a very bizarre and incomplete description of non-evolving matter? Not me, for sure.
Well, as I mentioned before, I added an RSS feed to the blog, and recently I saw there a paper by Fotini. Fotini Markopoulou has been promoted, at least in a subliminal way, as a "new Einstein", which looks like kind of an obsession for Lee Smolin. I was hesitating, and still am, about writing an entry on this "new Einstein" affair, so I decided to read the paper, actually this one.
Around the same time I had finished reading a review paper about the status of M-brane interactions in M-theory just before the Bagger-Lambert revolution (it actually includes an intro to the Bagger-Lambert theory). M-theory doesn't contain a dilaton, and since in string theory the dilaton is associated with the string coupling constant, M-theory has no such constant and consequently no possibility of a perturbative series. This bears some resemblance to one of the worst problems in LQG, the lack of a classical limit. Being "non-perturbative", LQG can't prove that there is a low-energy limit of it that, ideally, would be the Einstein theory. In the paper Fotini states that she has an approach that could lead to that classical limit. The idea is to identify quantities in canonical LQG that are conserved under time evolution and to try to identify them with classical observables. After reading this entry the reader may suspect that seeing "canonical LQG" and "time evolution" in the same phrase is somewhat shocking. That is what led me to do a fast reading of the paper, to locate the bibliography where that time evolution was explained (the one discussed in this entry, of course) in order to get an idea of how to evaluate the meaning of "conserved".
I am not going to explain the whole paper. Suffice it to say that the evolution rules of the previous papers have been related to something called "Pachner moves". She states that these moves can be shown not to change the topology of the simplicial complex that can be associated to a graph. This means that LQG doesn't allow topological transitions. Fotini claims that this is good because classical general relativity doesn't allow them either. I don't know what she exactly means, because in fact general relativity does allow them, at least if we restrict ourselves to the Einstein equations of motion; topology changes are forbidden only if additional restrictions are imposed, and in fact the restrictions one needs to impose are precisely causality-related ones. It is not surprising, then, that an evolution of spin networks based on causality restrictions doesn't allow topological changes either. String theory, on the other side, does allow such transitions, by means, for example, of conifold transitions; I hope to post some day in detail about this very interesting topic.
Anyway, Fotini finds some invariants in the theory by means of the Pachner moves. It doesn't look like a terribly difficult task, at least superficially: if the Pachner moves conserve topology, every topological quantity would be conserved, wouldn't it?
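At the level of the Euler characteristic this is easy to verify by hand; here is a toy count of mine, assuming the standard two-dimensional Pachner moves (the 1-3 subdivision and the 2-2 edge flip):

```python
def euler_char(v, e, f):
    """Euler characteristic of a 2d simplicial complex."""
    return v - e + f

# Start from the boundary of a tetrahedron (a triangulated sphere).
v, e, f = 4, 6, 4
chi_before = euler_char(v, e, f)  # 2, as befits a sphere

# 1-3 Pachner move: place a vertex inside a triangle and connect it
# to the three corners: net +1 vertex, +3 edges, +2 faces.
v, e, f = v + 1, e + 3, f + 2

# 2-2 Pachner move (edge flip): the counts don't change at all.

print(chi_before, euler_char(v, e, f))  # 2 2
```

Since every closed surface is determined up to homeomorphism by its Euler characteristic and orientability, and neither move touches either, invariance of the topology follows for any sequence of moves.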
For some reason she makes a brief review of the theory of classification of two-dimensional surfaces, based on relatively simple tools such as cross-caps and, in general, basic homological algebra. That is fine for people who don't know it, but it is curious to read when string people use the most sophisticated topological machinery available without worrying about trivial things such as the fact that most math PhDs don't know those things. Anyway, she states that the conserved quantities can probably be associated to particles. In this way particles would emerge as topological conserved quantities from pure quantum gravity. She also predicts the existence of an infinite number of particle families. Not too bad, at least if one forgets that there are cosmological restrictions which heavily indicate that there are only three such families (the actually observed ones). If some additional family were ever observed, maybe LQG would gain a point. It is also noteworthy that a few months ago, doing a search for "fermionic wormhole" in arXiv, I found an old paper by Smolin on that subject. There he tried to use wormholes as a device to get matter from pure gravity; in fact I got the impression that the "octopi" idea was related to this purpose. This paper by Fotini, on the contrary, gets matter as a conserved quantity in an evolution which doesn't allow topology changes. As far as wormholes are related to topology changes (at least if one doesn't make some ad hoc considerations to avoid them), it looks as if the idea has undergone a strange twist.
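For reference, the classification she reviews boils down to two pieces of data, orientability and a single count (handles or cross-caps), which fix the Euler characteristic. A one-line summary of my own, not taken from the paper:

```python
def euler_characteristic(orientable, n):
    """chi of a closed surface: n handles if orientable (chi = 2 - 2n),
    n cross-caps if non-orientable (chi = 2 - n)."""
    return 2 - 2 * n if orientable else 2 - n

print(euler_characteristic(True, 0))   # sphere: 2
print(euler_characteristic(True, 1))   # torus: 0
print(euler_characteristic(False, 1))  # projective plane: 1
print(euler_characteristic(False, 2))  # Klein bottle: 0
```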
I remind the readers once again of the usual disclaimer: I don't intend to present these posts as some kind of definitive review or to impose some kind of authority argument. My only intention is to give an exposition of the flow of ideas, and I leave it to the readers to draw their own conclusions.
Sunday, June 15, 2008