## Thursday, December 31, 2009

### F(rench)-theory

Ok, everybody has speculated about the meaning of the F in F-theory. Possibly the most accepted explanation was that it came from Cumrun Va(F)a. But an article appearing now in arxiv reveals its real origin.

The authors of the article are Adil Belhaj and Leila Medari. It is titled "Superstrings, Phenomenology and F-theory". The abstract reads:

We give brief ideas on building gauge models in superstring theory, especially the four-dimensional models obtained from the compactification of F-theory. According to Vafa, we discuss the construction of F-theory to approach non-perturbative aspects of type IIB superstring. Then, we present local models of F-theory, which can generate new four-dimensional gauge models with applications to phenomenology.

It is based on invited talks given by A. Belhaj in Oviedo, Rabat and Zaragoza.

Up to here nothing seems to support my claim about the origin of the name. But if you go and look at the paper, available at http://arxiv.org/pdf/0912.5295, you find that it is written in French. That explains it all ;-).

Fortunately I have a relatively good knowledge of French and I could do a quick reading of the article. It is a good introduction to the topic, explaining from the very beginning the basics of string theory, D-branes and all that. Later it explains the basics of F-theory, of local models and of local F-theory GUT models. All of it in a short article of 15 pages.

Despite the name, it doesn't dive too deep into phenomenology. But still, it gives a good introduction to many aspects of the subject for non-initiated people. In that sense it is far better than the blog entry of Jacques Distler about the first big paper of Vafa. And, definitively, it looks like a good chance for Spanish people interested in the subject who don't speak English but maybe speak French.

By the way, for those who didn't read the Spanish entry about the CDMS announcement, I will just say that F-theory GUTs predict that the LSP (lightest supersymmetric particle) is the gravitino, which is not a viable candidate for a WIMP. The CDMS two-event finding (irrespective of how statistically significant it may be) is a hint that the LSP is a WIMP (maybe a neutralino), so, if confirmed, the current Vafa models of F-theory GUTs would be invalidated. Possibly the experts on the subject could recook some of the more phenomenological aspects of the theory (mainly the supersymmetry-breaking mechanism) to fit the new data. But certainly the best aspect of the whole construction, reproducing the standard model and making concrete predictions, would go away.

But, as Vafa said at the Strings 2009 conference, that's the bad point of making predictions: they can be invalidated.

In case someone is interested, I must say that since the CDMS announcement I have decided to study in more detail what heterotic phenomenology can offer. That doesn't mean F-theory is not interesting any more; irrespective of CDMS, I needed to pay more attention to heterotic theories. CDMS is just a good excuse.

Also, I am reading (and in some cases rereading) a lot of articles on black holes (stringy and non-stringy ones). You can read about it in my other blog (if you speak Spanish). Still, I guess I will also talk about the subject in this blog in the near future, when I have finished reading a bunch of articles carefully. For example, today there is an article about black hole creation in particle collisions: http://arxiv.org/abs/0912.5481.

Other interesting articles today in arxiv: Unification of Residues and Grassmannian Dualities by Nima Arkani-Hamed, Jacob Bourjaily, Freddy Cachazo and Jaroslav Trnka. The article continues the MHV program to give a twistorial technique for finding scattering amplitudes. I must admit that although I recognize its interest I am not following those developments too closely. Still, I think some readers may find it more attractive than I do.

Also, I would note two papers on dark energy:

Inverse problem - reconstruction of dark energy models

Abstract:

We review how we can construct the gravity models which reproduces the arbitrary development of the universe. We consider the reconstruction in the Einstein gravity coupled with generalized perfect fluid, scalar-Einstein gravity, scalar-Einstein-Gauss-Bonnet gravity, Einstein-$F(G)$-gravity, and $F(R)$-gravity. Very explicit formulas are given to reconstruct the models, which could be used when we find the detailed data of the development of the universe by future observations. Especially we find the formulas using e-foldings, which has a direct relation with observed redshift. As long as we observe the time development of the Hubble rate $H$, there exists a variety of models describing the arbitrary development of universe.

The F(R) theories in the title refer to approaches where one considers gravity theories whose Lagrangian contains higher-order terms in the curvature, terms of the kind that appear as counterterms in the renormalization program of conventional quantum gravity (that theory is actually nonrenormalizable because of the need for infinitely many different counterterms). There was recently a good review article about the subject, and if I have time to read it I will post about that kind of theories.
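For concreteness, the schematic action of such theories replaces the Einstein-Hilbert curvature term by a general function of it. This is the standard textbook form, not a formula taken from the cited paper:

```latex
% Schematic F(R) action: the Einstein-Hilbert Lagrangian R is replaced
% by a general function F(R), e.g. F(R) = R + a R^2 + ...
S = \frac{1}{2\kappa^{2}} \int d^{4}x \, \sqrt{-g}\; F(R)
```

Setting F(R) = R recovers ordinary general relativity, so these models are deformations of the Einstein-Hilbert action.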

Also on dark energy is a paper by A. M. Polyakov: Decay of Vacuum Energy.

Abstract:

This paper studies interacting massive particles on the de Sitter background. It is found that in some cases (depending on even/odd dimensionality of space, spins, masses and couplings of the involved particles etc) the vacuum acts as an inversely populated medium which is able to generate the stimulated radiation. This "cosmic laser" mechanism depletes the curvature and perhaps may help to solve the cosmological constant problem. The effect is more robust in the odd dimensional space-time, while in the even case additional assumptions are needed.

Polyakov is a very original thinker, and although his ideas sometimes seem a bit unconventional, it is always worth reading him.

Possibly there are more interesting papers in arxiv today, but I'll stop here.

Happy new year to all readers.

## Friday, December 25, 2009

### Relation between the Sokolov–Ternov effect and the Unruh effect

I have been discussing in my other blog (and in the miguis forum) the proposal of Crane to use a black hole as a starship drive, based on his arxiv article: ARE BLACK HOLE STARSHIPS POSSIBLE?.

You can read (if you understand Spanish) the three posts about the subject: 1, 2 and 3.

While discussing those papers I have been reading on Wikipedia about its little brother, the Unruh effect.

As explained there in detail, the effect consists in an accelerated observer seeing thermal radiation where a stationary observer sees vacuum. The temperature of the radiation is proportional to the acceleration: $$T=\frac{ha}{4\pi^{2}ck}$$ (k is the Boltzmann constant; the other quantities have their obvious meaning).
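As a quick numerical illustration of how tiny this temperature is even for enormous accelerations, here is a minimal sketch of the formula above (the constants are the standard CODATA values; the sample acceleration is an arbitrary choice for illustration):

```python
# Numerical check of the Unruh temperature T = h*a / (4*pi^2*c*k),
# equivalently T = hbar*a / (2*pi*c*k).
import math

H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K = 1.380649e-23        # Boltzmann constant, J/K

def unruh_temperature(a):
    """Unruh temperature in kelvin for a proper acceleration a (m/s^2)."""
    return H * a / (4 * math.pi**2 * C * K)

# Even for a huge acceleration of 10^20 m/s^2 the temperature is below 1 K:
print(unruh_temperature(1e20))  # ~0.405 K
```

This makes it clear why a direct laboratory detection of the effect is so hard, and why indirect routes like the Sokolov–Ternov connection discussed below are interesting.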

To my surprise, the entry mentions a claim that the radiation has been observed. Specifically, it has been claimed to be observed in the Sokolov–Ternov effect: the self-polarization of relativistic electrons or positrons moving at high energy in a magnetic field, which occurs through the emission of spin-flip synchrotron radiation. In particular:

it was shown that if one takes an accelerated observer to be an electron circularly orbiting in a constant external magnetic field, then the experimentally verified Sokolov-Ternov effect coincides with the Unruh effect.

These results date back to 2005, so they are not new at all. And I am almost sure that they are controversial, or someone would have a Nobel prize for it ;-). The whole thing is that, although I try to stay informed, I had no idea about it. Maybe other readers of the blog were also unaware of it and might be curious to know.

## Thursday, December 17, 2009

### Dark matter live webcast

Ok, a little bit late, but still something is going on:

Fermilab webcast on the dark matter CDMS results

Or, if you prefer you can watch the other simultaneous conference:

http://www-group.slac.stanford.edu/kipac/cdms_live.html

As I am posting late, I will just say that the main announcement has already been made: two events. That means not a definitive discovery, because of statistical considerations, but certainly something. Right now they are discussing precisely how significant this is.

Update: If you want to see a summary of the results by the CDMS team, get it here (it is a two-page pdf, without formulae, readable by most people).

Quick summary, as said in CF: if these events are interpreted as signal, the lower bound on the WIMP mass for these recoil energies is roughly 0.5 GeV.

I would add that a good guess (it gives the best possible cross-section) is a 70 GeV WIMP. DAMA's claim of a dark matter discovery via inelastic dark matter (that is, a WIMP with an excited energy state) is compatible with the CDMS results in a reasonable parameter range.

I invite you to read the entries on the topic in many of the blogs in my link list (and possibly many others). Although not a discovery, there will be a lot of discussion about these results in the near future. And new results are announced for the future, once the new SuperCDMS is running.

Update: You can see the recorded video of one of the conferences from this website: http://online.kitp.ucsb.edu/online/dmatter_m09/cooley/

The arxiv paper, still not submitted as I post this, is available here.

There is some discussion in the blogs about the actual relevance of the signal. The most accepted figure is a 1.5 sigma result. The discrepancies come from how to actually treat the background. The 1.3 figure goes with the blinded background (an optimized background obtained without knowledge of the existence of the signals). If one uses other backgrounds one can get as much as (almost) 3 sigmas, or as few as 0. By the way, the very use of "sigma" is more appropriate for Gaussian distributions, but it is commonly used for non-Gaussian ones with the appropriate corrections.
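For readers who want to see where numbers of this kind come from, here is a minimal Poisson-counting sketch. The expected background of 0.8 events used below is an assumption for illustration, not the official CDMS value:

```python
# Illustrative sketch: p-value for observing >= n_obs events when the
# expected background is mu (simple Poisson counting, no systematics).
# The mu = 0.8 used here is an assumed value, not the official CDMS number.
import math

def poisson_pvalue(n_obs, mu):
    """P(N >= n_obs) for N ~ Poisson(mu)."""
    # 1 - P(N < n_obs) = 1 - sum_{k=0}^{n_obs-1} e^-mu mu^k / k!
    return 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k)
                     for k in range(n_obs))

p = poisson_pvalue(2, 0.8)
print(p)  # ~0.19: roughly a 1-in-5 chance of a pure-background fluctuation
```

A p-value around 0.19 is far from the roughly 3-in-a-million chance that a "5 sigma" discovery would require, which is why nobody is calling this a detection.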

For the future, I have read that before SuperCDMS there are expected to be data from another experiment, XENON100. They talk about "early in 2010". It remains to be seen what "early" exactly means and, more important, what the results are.

If one wants an easy introduction to the details of how CDMS works, one can read this entry in the old Tommaso Dorigo blog. Be aware that Dorigo doesn't like supersymmetry too much, and he argues that the (previous) CDMS result convinced him a little bit more of that view. Curiously, he doesn't have any entry about this new CDMS dataset.

I have not had time to answer a question Matti asked in the previous post. I leave here a link to his own view of these results as compensation: http://matpitka.blogspot.com/2009/12/dark-matter-particle-was-not-detected.html#comments

## Tuesday, December 08, 2009

### Rumor has it that dark matter has been discovered

Well yes, well yes. The famous dark matter, which makes up ninety-odd percent of the mass of the universe and whose presence is inferred from the behavior of visible matter, but for which there was no direct evidence, seems to have finally been discovered in one of the many laboratory experiments currently devoted to searching for it.

Actually, there is an Italian experimental group, going by the acronym DAMA, that has been claiming for some time to have found it. But on the one hand their evidence is somewhat circumstantial: they found seasonal variations in a certain type of event possibly related to some possible dark matter candidates. On the other hand, experiments with a sensitivity equal to or greater than DAMA's have found nothing. In reality there are subtle differences between the various types of detectors, and it is possible, though very improbable, that a certain type of dark matter is detectable by DAMA and not by the rest of the detectors.

But it is not DAMA that is in the spotlight now (following this post in Jester's blog, Resonaances) but CDMS, short for Cryogenic Dark Matter Search. This group has placed detectors deep underground in the Soudan mine in Minnesota. In 2007 the group delivered a negative report in which they put experimental limits on the possible characteristics of WIMP-type dark matter (weakly interacting massive particles). The article with the new batch of data, more extensive and taken with instruments of improved sensitivity, was expected to be out already. But they have been delayed and have sent the article to Nature, and that journal has accepted it, which suggests it may be important. Nature is one of the few journals left that has a confidentiality (or however "disclosure" should be translated) agreement, and the article will not be available until the 18th of this month. Possibly that same day there will also be a parallel article in arxiv (therefore free for everyone to download).

This would really be wonderful news for everybody, except perhaps for the string theorist Cumrun Vafa and collaborators, who over the last two years had developed an excellent and elaborate model based on string theory that reproduced the standard model of particle physics without the exotic additions common in other phenomenological models and, besides, made some predictions. Among them, that dark matter is formed mainly by the gravitino (the supersymmetric partner of the graviton), which is not a WIMP-type particle. If the finding is confirmed, it remains to be seen whether they can rearrange their model to incorporate it without destroying the rest of the good features of their theory.

From what I understand of F-theory, most of the restrictions it uses to make predictions are based on its model of supersymmetry breaking. There they use a gauge mediation model (a variant of something known as the Giudice-Masiero model used in gravity mediation models), where the messenger is a boson associated with a Peccei-Quinn-type gauge symmetry, related to the QCD axion. It is a fairly minimalist model in which there is almost no supersymmetric "dark sector", and in that sense it seems a very good idea. But of course, if they now must accommodate a WIMP as the lightest supersymmetric particle, they would have to revise things (if that is possible), and that supersymmetry-breaking mechanism is probably what lends itself best to revision. Another possibility, which seems very remote to me, is that, since they already have a WIMP (the lightest neutralino, a combination of the zino, photino and higgsino, as a possible NLSP, although the best option is a stau), maybe there is some strange decay mechanism that could leave loose WIMPs out there while the gravitino remains the LSP (and therefore the main component of dark matter). Given its characteristics, CDMS could not detect the gravitino. Anyway, these are quick speculations, and with my still very poor understanding of those parts of F-theory they may be too risky. Just in case, I asked Motl (who has also posted the news on his blog http://motls.blogspot.com/2009/12/cdms-dark-matter-directly-detected.html), and in his reply he seems to agree with what I say.

Be that as it may, and whomever it may bother, if dark matter has really been discovered we are facing a historic event. Moreover, it could have consequences for the LHC experiment, since the LHC should presumably be able to produce this newly observed particle, and we would thus have a double confirmation (besides a very precise guide on how to tune the LHC detectors, which will make the detection easier).

## Tuesday, November 24, 2009

### Introduction to supersymmetry II: the Wess-Zumino model

Some time ago I wrote an entry about supersymmetry, this one. I continue the topic by introducing a realization of that supersymmetry in terms of a simple Lagrangian, what is known as the Wess-Zumino model. Anyone whose knowledge of quantum field theory, and in particular of the possible types of spinors, is not fresh can read about it in this entry of my other blog.

The model will consist of two fields: a complex scalar field $\phi$ formed by two real fields A and B, $\phi=(A+iB)/\sqrt{2}$, and a Majorana spinor field $\psi$. Both fields will be massless. The reason is that supersymmetry has not been observed in nature, which indicates that, if it exists, it must be broken. The supersymmetric partners of the known particles are supposed to have acquired mass through a process of breaking of this supersymmetry. With these ingredients the kinetic term of our Lagrangian will be:

$$L= \partial^{\mu} \phi^{*}\partial_{\mu}\phi \; + \; \frac{i}{2}\,\bar\Psi \not\partial \Psi \qquad (1)$$

This Lagrangian is invariant under a global SUSY transformation:
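The entry stops here. As a hedged sketch of what such a transformation looks like (conventions, signs and normalizations vary between references, so take this as one common form, not necessarily the one the post had in mind):

```latex
% One common form of the global SUSY transformations of the massless
% Wess-Zumino model, with \epsilon a constant (anticommuting) Majorana
% spinor parameter; factors and signs depend on conventions.
\delta A = \bar{\epsilon}\,\Psi , \qquad
\delta B = i\,\bar{\epsilon}\,\gamma_5\,\Psi , \qquad
\delta \Psi = -\,i\,\gamma^{\mu}\partial_{\mu}\left(A + i\,\gamma_5 B\right)\epsilon
```

The key feature is that the variation of each boson is proportional to the fermion and vice versa, which is what lets the bosonic and fermionic variations of the kinetic Lagrangian cancel against each other up to a total derivative.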

## Monday, July 13, 2009

### Strings 2009: the slides

This year the annual string theory conference, held in Rome, has not had a live internet TV broadcast as happened last year.

Because of that I didn't do a post about it. I have waited until the slides were out and I could read some of them. Conference slides, if they are detailed enough, are a good thing because they are addressed to non-specialists in that particular field, so they can be easily read, and they condense a great amount of information from various papers.

You can get access to the list of talks, with the corresponding slides, here.

I have read a few of them already. The first was the one given by Horava. I was greatly interested in reading how he defended his theory against the recent papers which showed the renormalizability problems it actually seems to have, despite being power-counting renormalizable. Well, I didn't see any mention of it. The slides talk about the "foundational" papers on the subject and explain its relation to the M2-brane of M-theory, to the CDT (causal dynamical triangulations) result that at short lengths the effective dimension of spacetime is near 2, which his theory resembles, and a few other topics. I find it specially curious that one of the motivations for his theory is that string theory violates Lorentz symmetry. Well, I am not sure why he says that, but certainly, stated without further explanation, it looks weird. It is a pity that there was no live streaming, nor recorded videos, of the talks, so one can't see what questions people asked him.

About the F-theory GUTs there were three talks. One from Vafa. His slides are a ppt (rather than a pdf) and are very schematic; without some previous knowledge of the subject I am not sure how much information one can get from them. Anyway, if one reads the papers I cited in my post about F-theory for non-experts one could get a much better understanding. Vafa does a decent job explaining the two foundational papers, the paper on cosmology, and the paper on LHC footprints, which I have read. He also talks about some papers I haven't read, for example the ones on gauge mediation (although I had read some summaries of the results). The conclusions seem to be that there are two clear predictions from their models. One, in cosmology, is that the dark matter candidate is the gravitino. That rules out WIMP-based models and implies that the ATIC, PAMELA and similar results, which seem to indicate an anomalous ratio of positrons over electrons in certain energy ranges, would have an astrophysical origin, or not exist at all. Recent results from FERMI/GLAST that seem to contradict ATIC and PAMELA (see, for example, this post by Jester in the Resonaances blog) would agree with this prediction.

The other prediction mentioned in the slides is that there will be some charged track at the LHC leaving the detector. It would be due to the NLSP, whose lifetime, 10^1-4 secs, is long enough to allow it to escape from the detector.
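As a rough order-of-magnitude check of why such a lifetime means the particle simply leaves the detector, here is a small sketch (the boost values are illustrative assumptions, not taken from the talk):

```python
# Rough check: lab-frame decay length L = gamma * beta * c * tau for a
# long-lived NLSP. gamma = 2 (so beta ~ 0.87) is an assumed, modest boost.
C = 2.99792458e8  # speed of light, m/s

def decay_length(tau_s, gamma=2.0, beta=0.87):
    """Mean lab-frame decay length in meters for proper lifetime tau_s."""
    return gamma * beta * C * tau_s

print(decay_length(10.0))  # ~5.2e9 m for tau = 10 s: far beyond any detector
```

Even the lower end of the quoted lifetime range gives a decay length of millions of kilometers, so inside the detector the NLSP just looks like a stable charged particle.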

There are two more talks about F-theory. One is by Sakura Schafer-Nameki. I have read it, but from all the part related to spectral covers I couldn't get any useful information; I simply don't know enough of that mathematical topic. The other talk on F-theory is the one by Jonathan Heckman. It is centered on flavor hierarchies for quarks and leptons. Well, an interesting topic for sure, but not my favourite one. Anyway, the slides are good enough to get some general idea of the topic from them.

Another set of slides I read is Strominger's, about the Kerr/CFT correspondence. On that topic I had only read a paper from last summer. Well, I am not sure whether too much progress has been achieved so far, nor is it clear to me whether the whole field is terribly significant, but possibly that is my fault.

Possibly the most awaited talk was the one from Nima Arkani-Hamed about twistors and the S-matrix. There is rumorology out there saying that it's not work in string theory but an attempt to create some kind of supersymmetric GUT different from string theory. I haven't read the slides yet and I can't say anything about them. But for sure it is a theory that many people will discuss sooner or later, possibly when the actual paper on the subject is out.

I'll possibly read more slides later, but I am not sure if I will post about them. But everybody can try to read the linked slides by themselves. There are good chances that anyone with a decent grounding in high energy physics can get some amount of info from them.

UPDATE: In a thread in Physics Forums someone, seemingly well informed, said that Horava actually acknowledged the problems recently found in his theory in his talk at Strings 2009. The same poster also explained that the actual problem was that one couldn't decouple the ghosts from the theory. Curiously, that has led to a possible reinterpretation of those ghosts as dark matter. I have not read the relevant papers, but at first sight that looks very bizarre. Ghosts are negative-norm states that usually appear in the quantization of gauge theories as intermediate states that can be shown not to appear in external legs, i.e., they are not observable. To claim that the usually unwanted negative-norm states can go in external lines and actually represent viable particles (in the form of dark matter) seems like something one could try for any theory, and then one wouldn't need gauge theories. I suppose there is something special in those ghosts that makes them different from the usual ones and permits people to make such conjectures, but, as I said, it looks like an a priori contrived claim.

P.S. I am looking for an easier way to use LaTeX in this blog than the one I am using (writing the LaTeX code in the URL of an image generated by an external LaTeX server). If I don't find a good solution I will seriously consider the option of migrating this blog to Wordpress, where writing LaTeX is "natively" supported (that's the reason I make extensive use of it in my other blog).

## Thursday, July 09, 2009

### Vixra, the arxiv mirror symmetric

In the blog of Kea (Marni Dee Sheppeard) there have recently been a few entries about the freedom to publish scientific results.

As a result, Tommaso Dorigo suggested a bizarre idea to her. After an exchange of comments it turned into another idea: the birth of a new archive for scientific publications. In a really fast move a new domain was registered and the site is already available. The name for the new site is arxiv written backwards, that is, vixra, which, with some minor licence, can be considered a mirror image of arxiv. The actual link for the website is: vixra.org/. Note that at the time of writing it is in a very beta state.

I leave here the manifesto that justifies its creation and its purpose, as declared by its creator:

Why viXra?
In 1991 the electronic e-print archive, now known as arXiv.org, was founded at Los Alamos National Laboratories. In the early days of the World Wide Web it was open to submissions from all scientific researchers, but gradually a policy of moderation was employed to block articles that the administrators considered unsuitable. In 2004 this was replaced by a system of endorsements to reduce the workload and place responsibility of moderation on the endorsers. The stated intention was to permit anybody from the scientific community to continue contributing. However many of us who had successfully submitted e-prints before then found that we were no longer able to. Even those with doctorates in physics and long histories of publication in scientific journals can no longer contribute to the arXiv unless they can find an endorser in a suitable research institution.

The policies of Cornell University, who now control the arXiv, are so strict that even when someone succeeds in finding an endorser their e-print may still be rejected or moved to the "physics" category of the arXiv where it is likely to get less attention. Those who endorse articles that Cornell find unsuitable are under threat of losing their right to endorse or even their own ability to submit e-prints. Given the harm this might cause to their careers it is no surprise that endorsers are very conservative when considering articles from people they do not know. These policies are defended on the arXiv's endorsement help page.

A few of the cases where people have been blocked from submitting to the arXiv have been detailed on the Archive Freedom website, but as time has gone by it has become clear that Cornell have no plans to bow to pressure and change their policies. Some of us now feel that the time has come to start an alternative archive which will be open to the whole scientific community. That is why viXra has been created. viXra will be open to anybody for both reading and submitting articles. We will not prevent anybody from submitting and will only reject articles in extreme cases of abuse, e.g. where the work may be vulgar, libellous, plagiaristic or dangerously misleading.

It is inevitable that viXra will therefore contain e-prints that many scientists will consider clearly wrong and unscientific. However, it will also be a repository for new ideas that the scientific establishment is not currently willing to consider. Other perfectly conventional e-prints will be found here simply because the authors were not able to find a suitable endorser for the arXiv or because they prefer a more open system. It is our belief that anybody who considers themselves to have done scientific work should have the right to place it in an archive in order to communicate the idea to a wide public. They should also be allowed to stake their claim of priority in case the idea is recognised as important in the future.

Many scientists argue that if arXiv.org had such an open policy then it would be filled with unscientific papers that waste people's time. There are problems with that argument. Firstly there are already a high number of submissions that do get into the archive which many people consider to be rubbish, but they don't agree on which ones they are. If you removed them all, the arXiv would be left with only safe papers of very limited interest. Instead of complaining about the papers they don't like, researchers need to find other ways of selecting the papers of interest to them. arXiv.org could help by providing technology to help people filter the article lists they browse.

It is also often said that the arXiv.org exclusion policies don't matter because if an amateur scientist were to make a great discovery, it would certainly be noticed and recognised. There are two reasons why this argument is wrong and unhelpful. Firstly, many amateur scientists are just trying to do ordinary science. They do not have to make the next great paradigm shift in science before their work can be useful. Secondly, the best new ideas do not follow from conventional research and it may take several years before their importance can be appreciated. If such a discovery cannot be put in a permanent archive it will be overlooked to the detriment of both the author and the scientific community.

Another argument is that anybody can submit their work to a journal where it will get an impartial review. The truth is that most journals are now more concerned with the commercial value of their impact factor than with the advance of science. Papers submitted by anyone without a good affiliation to a research institution find it very difficult to get published. Their work is often returned with an unhelpful note saying that it will not be passed on for review because it does not meet the criteria of the journal.

In part viXra.org is a parody of arXiv.org to highlight Cornell University's unacceptable censorship policy. It is also an experiment to see what kind of scientific work is being excluded by the arXiv. But most of all it is a serious and permanent e-print archive for scientific work. Unlike arXiv.org it is truly open to scientists from all walks of life. You can support this project by submitting your articles now.

What do I think of this? Well, there is a famous phrase of Richard Feynman about physics (valid for science in general) and its role as a practical discipline:

"Physics is like sex. It can have practical consequences sometimes but that is not the reason we do it".

Well, that's the idea. And publishing should be part of the fun. But seemingly publishing (as well as other parts of a scientific career) has become a game where many factors outside the pure scientific content play a role at least as important as the quality of the papers. Still worse, it is not very clear what the rules of that game are. That turns publishing into a very risky business, and an error can get one banned from arxiv (the papers that people actually read; peer-reviewed journals have become invisible). In fact I personally think that I could find an endorser for foreseeable future papers. But in the current state of the subject it is too much pressure for both me and the endorser.

For that reason an alternative to arxiv is a good option. One can publish ideas and exchange them with other people. The concept of exchange is important. There are some kinds of papers where one can be, or almost be, sure they are right. But there are others that are subject to many uncertainties. And, possibly, one can solve only a limited number of the difficulties one faces. If one has people around working in that field one could discuss those ideas privately, but that is not always possible (even if you are in an academic position). In that sense it makes sense to publish ideas in a preliminary state of development that you are not sure you can pursue further, in case they could be useful. That's the idea of scientific exchange as far as I see it. And if one is wrong, well, that's always a possibility. Of course one should do the usual homework and search as much as possible for similar ideas before publishing rubbish results. It is definitively good to publish that kind of paper in a place where, if someone is wrong, he (and his family, friends and cat) doesn't get banned for the rest of his life, I would say.

Still better, as far as I see it, both archives wouldn't be mutually exclusive. One could publish "serious" papers in arxiv and riskier ones in vixra. Well, at least in theory. Surely someone will find good reasons to find incompatibilities between them ;-).

## Wednesday, July 08, 2009

### F-theory GUT for non experts

I have found a few papers that do a good job explaining the basics of F-theory in a relatively easy way. I could have posted them as an update of the previous post on the subject, but I think they deserve a small separate post.

One paper is: F-theory, GUTs and Chiral Matter.

Another one, written by Heckman and Vafa, is: From F-theory GUTs to the LHC.

Also I think that the interested reader should try to understand more basic settings, previous to the F-theory revolution. I am talking about the intersecting-brane scenarios. A short and good review is: Progress in D-brane model building.

The reason to recommend the last paper is that I find it interesting to understand how one calculates family numbers, how chiral fermions arise and so on in more conventional D-brane models. In fact the first paper I cite does a good job explaining some of those aspects, but still.

Also I recommend, once again, the original paper on local models of Ibáñez, Quevedo et al., D-Branes at Singularities: A Bottom-Up Approach to the String Embedding of the Standard Model. I have finished reading it and I find it very clear. As a plus it also has a brief chapter about F-theory.

Certainly the last papers about D-brane model building are not required to understand the F-theory ones, but it is good to understand what existed previously in order to better appreciate the goodness of the new. In that sense the papers recommended in my entry about the prehistory of F-theory GUTs are also valuable, and they focus on different aspects than the ones cited here.

Anyway, if someone only wants a quick, but accurate, idea of the subject, the two papers cited at the start of the post do a wonderful job.

## Friday, June 26, 2009

### Trantorian physics

Trantor is a fictional planet presented in Isaac Asimov's Foundation series of books. It is the center of a galactic empire.

In that universe the king of sciences is psychohistory. That name refers to a mathematical model of human societies with detailed quantitative predictive power. Physics has become an obsolete discipline that died of success a long time ago. Supposedly it had answered all the basic questions, and no new important discovery had been made for hundreds of years.

But still there were some physicists. The problem with them was that the lack of new experimental results had produced a vicious system where the quality of a particular physicist depended on his knowledge of past achievements and, maybe, his ability to reinterpret them in new, basically irrelevant, ways that didn't lead to new discoveries.

Well, that is fiction. But sometimes actual physics somewhat resembles those Trantorian physicists. Lots of people like to blame string theory for that, but I don't agree at all; it is a problem common to all the alternatives.

I mean, what actual observable predictions do the alternative theories make?

LQG's great achievement was the frequency-dependent speed of light. In fact Liouville strings also predicted that. Well, FERMI/GLAST has almost ruled out that possibility (although there is some discrepancy in the interpretation of the results depending on who writes about it; for example Lubos Motl and Sabine Hossenfelder disagree, as always).

Hořava's gravity, being as a classical theory slightly different from Einstein's gravity, makes predictions that are not too hard to measure. But after the initial explosion of papers it has somewhat stalled now, due to some papers that raised serious doubts about its goodness as a quantum theory despite it being power-counting renormalizable. It would have been nice to see how it was received at the currently ongoing Strings 2009 conference, but this year there is no live broadcast nor, at least until now, anyone blogging about it.

Noncommutative theories are also almost dead, despite having had their time of glory (although today, after many months, there is a paper on the subject in arXiv: http://arxiv.org/abs/0906.4727). There are two types of NC theories: field-theoretic ones and geometric ones. The first are inspired by string theory. The latter are mainly geometric and were promoted by the mathematician Alain Connes. They made a firm prediction, a value for the Higgs mass, that was ruled out (at least in its original form; I am not sure whether some modifications have been suggested) last year by measurements at the Tevatron.

So, basically, we have that despite many theoretical efforts in many different approaches to fundamental physics (i.e., particle physics) we have no new experimentally confirmed result since the formulation of the standard model, in the late sixties and early seventies of the past century. The only new result was the confirmation that neutrinos have a small mass. The other experimental news comes from cosmology and, as I said in previous posts, it is not as firm as laboratory experiments.

Is this a problem of the theoretical physicists? I hardly think so. String theory is a very rich framework. Different aspects of it are actually promising candidates for phenomenology. For example the mesoscopic extra dimensions suggested by Arkani-Hamed et al. in the late nineties were a very original idea, one that has led to cheap experiments that have put new bounds on the size of those dimensions. LQG, as said, made a good prediction (shared by most Lorentz-violating theories), and LQC is trying to make observable predictions about cosmology, maybe not rigorous ones, but if they were observed nobody would care too much about that ;).

The big problem I see is not related to theory but to experiments. And, especially, to collider experiments. The USA cancelled the funds for a new linear accelerator in the nineties. The LHC schedule has seen almost five years of delay (that is, if it finally begins to operate in September, as expected). The Tevatron has done its best, going beyond expectations. It has been shown that QCD at high temperatures behaves not as a quantum gas (as expected) but as a quantum liquid. That doesn't mean new fundamental physics, but at least it gives clues about properties of QCD that are very hard to study mathematically and computationally. And, hey, the Tevatron has ruled out NCG ;-). There is even some possibility that a careful analysis of the collected data would find the Higgs boson. Not that bad for a recycled collider.

If there is no serious money invested in experiments, researchers are going to spend their time on increasingly twisted theories. Internal coherence is a good guide, but it is not clear that that alleged coherence is as free of doubts as some people try to present it. That goes for LQG and for string theory (and the alternatives). Again, that is not a reason to go against string theory (or the alternatives; well, some of the alternatives are theoretically unlikely, but still). The ultimate justification of the theoretical developments is that they are made searching for compatibility with known physics and also guessing new phenomenology. What is seriously needed is that experiments be made. The LHC is, hopefully, next to operate, but there is no serious project for the post-LHC era.

Maybe some people could think that there is no good reason to spend a lot of money on those expensive experiments. Especially not in the current economic crisis. In my opinion that is a narrow-minded vision. Certainly other areas of physics are giving interesting results (solid state/condensed matter and the wide area known as nanotechnology), but they are based on very old basic physics. It is necessary to pursue the development of new physics. For example, one very important problem that society needs to face is the energy supply. There are discrepancies about how much fossil fuel (especially at cheap prices) remains. In fact that depends heavily on the growth of demand. But sooner or later (and most likely sooner) it will run out. The "ecological" alternatives (solar energy, wind, etc.) are mostly propagandistic solutions. Nuclear energy has better chances, but it depends on a limited resource, uranium. Certainly there are proposals for fast breeder reactors that could create fissile elements. But they are somewhat experimental. It is an open question whether they will operate as expected. The other alternative, nuclear fusion, is fine. But again governments are not spending enough money on it (as the fate of ITER clearly shows).

The thing is that when we are looking for energy sources, the best thing we can do is understand how the universe behaves at high energies. If one looks at the way "energy sources" work one sees a common pattern. One has a two-state system separated by a barrier, where the difference in energy between the two states is greater than the energy of the barrier. If one supplies the system with enough energy to go over the barrier, then when the system falls to the lower energy state it returns more energy than the energy employed. That is the way chemical fuels work. And also the way nuclear fission and fusion work. Nuclear processes involve higher energies and so they return more energy as well (well, in fact it could be otherwise, but it would be very unnatural).
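The pattern described above can be sketched numerically. This is only a toy illustration of the bookkeeping, not a physical model: the barrier and gap values are rough, illustrative figures I have chosen (the effective fusion "barrier" in particular is a stand-in for a tunnelling-assisted thermal threshold, not a literal Coulomb barrier height).

```python
# Toy model of the "metastable energy source" pattern: a system in a
# higher-energy state, separated from a lower state by a barrier. If the
# energy gap exceeds the activation cost, each event yields net energy.

def net_energy_yield(gap_eV, barrier_eV):
    """Energy returned per event after paying the activation cost (in eV)."""
    return gap_eV - barrier_eV

# Chemical combustion: gaps and barriers of a few eV per molecule (illustrative).
chemical = net_energy_yield(gap_eV=3.0, barrier_eV=1.0)

# D-T fusion: gap ~17.6 MeV; effective activation threshold taken as ~10 keV
# (illustrative stand-in for the tunnelling-assisted thermal threshold).
nuclear = net_energy_yield(gap_eV=17.6e6, barrier_eV=1.0e4)

print(f"chemical net yield: {chemical:.1f} eV per reaction")
print(f"nuclear  net yield: {nuclear:.3e} eV per reaction")
print(f"ratio nuclear/chemical: {nuclear / chemical:.2e}")
```

The point of the sketch is just the hierarchy: because nuclear gaps are millions of times larger than chemical ones while the activation cost stays comparatively small, the net return per event grows by roughly the same factor.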

Well, if we go to higher energies one expects that, somewhere, there will be some systems that share that property (a good name for it would be metastability). For example in some supersymmetric models there is, if R-parity is present, a lightest supersymmetric partner, the LSP, which is stable and a candidate for dark matter. And there is also the possibility of an NLSP (next-to-lightest supersymmetric partner) that would be metastable. Well, that is the kind of thing we like. One would expect a big energy difference between them. If they are found, and a way is discovered to force the decay of the NLSP into the LSP, we would have an energy source. Moreover, dark matter represents, what, 75%, 90% of the mass of the universe? That could mean that there is a lot of it out there. One could argue that if we are not able to do nuclear fusion using known elements, we could hardly develop a technology to extract energy from something that is still hypothetical. But the truth is that we don't know. Maybe it is a lot easier to extract energy from dark matter (be it (N)LSP, WIMPs or whatever) than from known sources.

Still there are other possibilities. There is a small possibility that if the LHC creates black holes it could also create wormholes. Wormholes (Lorentzian ones) have received a lot of attention in SF as a tool for interstellar travel or even as time machines. But there are other interesting uses for them if they actually existed. If one mouth of the wormhole were placed in a very energetic environment, it could channel that energy directly to the other mouth. For example one could put one mouth deep inside the Earth and the other on the surface. That would be a good way to extract geothermal energy. Of course one could think it is a lot more likely that conventional ways would be used to get that energy, but it could be not so. Another very energetic environment would be the Sun. It is not totally clear how much energy is required to create a wormhole, but one would expect that if the external distance between the mouths grows, the same applies to the required energy. But, again, it could be not so. Still, there is a problem in using the Sun: the gravitational interaction. The gravitational field of the Sun would be transferred together with the light, and it could alter the Earth's orbit.

There is a more interesting possibility for wormholes (or maybe we should call them warmholes, or not, depending on how much one worries about double meanings of words xD). If they were created at the LHC, that would probably mean that the reason behind it is that mesoscopic extra dimensions exist. In string theory there are various ways to realize those scenarios. A common feature of many of them is that they would mean that we live on a three-dimensional (or effectively three-dimensional) brane. But the existence of additional branes is possible. It could be that some of them would have a high background energy. And it also could be that they would be not too far away along those additional dimensions. Actually they could be so near that it wouldn't be improbable for a wormhole to be created with one mouth inside that hot brane and the other in ours. Still better, the scenarios with mesoscopic extra dimensions offer good possibilities for wormholes becoming stable. That would raise the possibility of using those wormholes to extract energy from those hot branes. Depending on the details, they could be a means to solve all the energy requirements of humankind at a level that exceeds all current expectations.

All the hypothetical energy sources that I have presented are related to situations likely in string theory. Alternative theories might also offer options. For example black holes in alternative theories might not evaporate completely, and one could use the remnants to extract energy from them in Penrose-like processes. A serious problem with that is that without mesoscopic dimensions there is no way to create black holes at the LHC, so we wouldn't have remnants either.

By the way, black hole physics is a very good example of Trantorian physics. Especially the black hole interiors. The gravity/LQG community has a widely accepted viewpoint of them where the radius behaves as a time coordinate. Well, in string theory there are very different proposals, none of them too friendly with that LQG viewpoint. Also, string theory strongly supports the complementarity principle. Well, some people in LQG don't even know of its existence (or at least didn't until they published a paper that was incompatible with that principle). My problem with this is that we don't have a nearby black hole on which to do experimental tests. In fact even if we created them at the LHC, it is not clear that we could make experimental tests about black hole interiors. Neither is it too clear how those black hole interiors have any consequence for the behaviour of the event horizon. Well, if naked singularities are allowed things would improve, but then they wouldn't be black holes ;-).

Well, certainly in this post, apart from some sociological considerations, I have presented very speculative ideas with too few details about them. Maybe that is what top-notch physicists do on Trantor. Not being there, I hope to present more earth-based physics in the next entries ;-).

By the way, if one is absolutely serious about it, many proposals for alternative "ecological" energy sources are actually hardly more likely to be good alternatives to oil than the ones I have proposed here. They look otherwise because they are based on things that laymen think they understand, but if one goes into the details of the implied physics, one really hopes that wormholes actually exist xD.

## Thursday, June 04, 2009

### String theory is good for...phenomenology of particle physics

Yesterday the number of visits to this blog had a major increase. Most of the traffic came from this post in Migui's web/blog.

The post was a translation into Spanish of an article in New Scientist about the good points of string theory. I had seen a discussion of that article in Lubos' blog, concretely here.

Well, that article comes to say that string theory is nowadays a good theory because its mathematical structure, through the AdS/CFT correspondence, is useful in QCD and condensed matter physics. Well, I don't know too much about those applications, but if the experts in those subjects say so, it is a good sign.

But, actually, I don't think that image is quite right nowadays. Readers of this blog know that I have paid attention to many alternative theories. Some of the proponents of those theories make claims against string theory. Others who don't actually offer any theory, that is, Peter Woit, claim that string theory "makes no predictions". In his blog he usually brings attention mostly to the most speculative articles written by string theorists.

Well, I am following, as much as I can, the current F-theory minirevolution. Doing so I have become very surprised, and impressed, by how close string theory has come to actual physics. Before going into it I must say that I somewhat understand the string theory sceptics. If one reads the books on the subject, one certainly gets the impression that actual predictions are far away. For example the 1999 (that is, not too old) book of Michio Kaku, Introduction to Superstrings and M-Theory, in its chapters about phenomenology shows the results of heterotic compactifications. In those results the best one could get was the standard model plus some additional U(1) factors. Also it was stated that achieving the right number of generations, given by n = χ(CY)/2, that is, one half of the Euler characteristic of the Calabi-Yau manifold, was difficult (if not almost impossible).
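The generation-counting rule quoted above is easy to play with numerically. In the classic heterotic compactifications the net number of chiral families is |χ|/2, and for a Calabi-Yau threefold χ = 2(h¹¹ − h²¹) in terms of the Hodge numbers; the sample values below are standard textbook examples (the quintic, and the Hodge numbers usually quoted for Yau's three-generation manifold).

```python
# Net chiral generations in heterotic compactifications: n = |chi|/2,
# with chi = 2*(h11 - h21) for a Calabi-Yau threefold.

def euler_characteristic(h11, h21):
    """Euler characteristic of a Calabi-Yau threefold from its Hodge numbers."""
    return 2 * (h11 - h21)

def generations(h11, h21):
    """Net number of chiral families, |chi| / 2."""
    return abs(euler_characteristic(h11, h21)) // 2

# The quintic hypersurface in P^4: h11 = 1, h21 = 101, chi = -200.
print(generations(1, 101))   # 100 generations: hopelessly many

# Three generations need |chi| = 6, e.g. h11 = 6, h21 = 9.
print(generations(6, 9))     # 3
```

This makes the difficulty mentioned in the text concrete: generic constructions give |χ| in the hundreds, while the standard model needs exactly |χ| = 6.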

Other books, such as Polchinski's two-volume book and Clifford Johnson's "D-Branes", don't say too much about realistic compactifications. There are good reasons for that. Those books are mostly concerned with the D-brane revolutions and their consequences, the black hole entropy calculation and the AdS/CFT conjecture. The most recent book of Becker-Becker-Schwarz covers compactifications more in depth. But, with good criteria, it cares more about technical issues such as the moduli space of the compactification, mirror symmetry between type IIA and type IIB, and flux compactifications, which are relevant for the very important issue of moduli stabilization and the KKLT-like models (related to the landscape). And, of course, they all make introductions to dualities, M-theory and, to a lesser extent, F-theory.

In fact all of those are important technical aspects, and it takes time to learn them (one must read some articles if one really wants to properly understand certain aspects). But one gets the impression that everything is still too far from LHC physics and testable cosmological predictions. In fact there is a very recent book by Michael Dine which goes into phenomenology, titled "Supersymmetry and String Theory". I must say that I find that book somewhat of a failure. It is too brief, covering subjects that even with some previous knowledge are hard to appreciate properly.

Well, in short, a lot of textbooks and no clear signal of actual testable physics. Certainly discouraging. Popular-science books are not too different. That, certainly, can explain why some people have the impression that string theory is far from its objectives. Blogs from string theorists try to say, to whoever listens to them, that string theory is "the only game in town". In fact there are not many blogs on string theory with a decent publication rate. Anyway, I also had the idea that string theory was far from phenomenology, and I had not pursued that topic too much.

The F-theory minirevolution has changed that. I have read at last the two big pioneering papers of Vafa (arXiv:0802.3391v1 and arXiv:0806.0102v1), and almost completed the reading of the F-theory GUTs cosmology one (arXiv:0812.3155v1). Also I have made partial readings of some subsequent papers, and of a few previous papers needed to understand the formalism developed.

Certainly they are hard-to-understand papers. But once one gets familiar with them, one sees what kind of physics is discussed. The first thing to say is that one needs to know the details of GUTs and of symmetry (and supersymmetry) breaking. F-theory local models, with the right decoupling from gravity, can give an SU(5) model without any exotics. They offer their own way to break SU(5) into the MSSM, through a U(1) flux of hypercharge. That mechanism avoids some of the problems present in purely field-theoretic models. In particular they can avoid problems with the observed lifetime of the proton. Later papers get values in the CKM matrix that are good for obtaining the observed baryon asymmetry of the universe. They offer ways to avoid the doublet-triplet splitting problem of GUTs (that is, requiring the existence of Higgs doublets (1, 2)±1/2 necessarily leads also to color triplets; however, there exist rather uncomfortable lower bounds on the mass of these triplets). They offer a natural way to get small neutrino masses. In cosmology, through a late decay of the saxion (whose lifetime is predicted, that is, properly bounded, by the theory), they can avoid some of the problems that supersymmetry breaking brings to cosmology (the gravitino problem), and they give a right way to obtain reheating after an inflationary phase, plus some extra things that I haven't finished reading.

As you can see, these models are quite near cutting-edge phenomenology. They offer solutions to problems not available in other approaches. And F-theory is not alone. Seemingly M-theory is also going into the local models + gravity decoupling business; see for example the paper Hitchin's Equations and M-Theory Phenomenology by Tony Pantev and Martijn Wijnholt.

As I said, I hadn't previously followed phenomenology with too much attention. But, in fact, more traditional approaches have also made some advances. For example this 2008 short review article on heterotic compactifications, From strings to the MSSM, also cares about some of the previously mentioned aspects.

Another very recent paper, Towards Realistic String Vacua From Branes At Singularities, by Joseph P. Conlon, Anshuman Maharana and Fernando Quevedo, uses the D-brane approach to phenomenology, not related to the gravity decoupling approach. It offers the bonus of moduli stabilization (something more habitual in cosmological models). In the abstract they conclude saying: "We propose that such a gauge boson could be responsible for the ghost muon anomaly recently found at the Tevatron's CDF detector". Well, there are some serious doubts about the real existence of those anomalies (see Tommaso Dorigo's blog, linked on this website, and search for discussions of that topic).

Well, certainly there are quite a few models inspired by string theory, and not all of them (if any) can be true at once. Also, not all models make firm predictions. But the point is that they are actually reproducing the MSSM and supersymmetric GUT models, and providing mechanisms that enhance the purely particle-physics models. Also, in cosmology there are many different points where string theory is enhancing purely field-theoretic models.

But, as I see it, string theory is actually dictating the construction of (at least some of) the models that are going to be checked in the near future. Also one must not forget about the RS models, inspired by string theory, where one could get black holes at the LHC (those models are possibly not compatible with F-theory GUTs).

With all of this I think that string theory is doing exactly what one would traditionally expect from a fundamental theory of physics. Certainly I am talking about very, very recent developments, most of them from this year and the previous one. But, anyway, it looks as if string theory is definitively "landing" into experimental physics, which is what was expected from it. And, still, it is making progress in clarifying its theoretical aspects and the description of black holes (a topic not too easy to study in the laboratory, except if the LHC produces black holes, that is).

I am not at all a radical and I understand if some people want to keep pursuing alternative approaches. The point of this post is to say that, as far as I see, the "not even wrong" criticism of string theory doesn't make too much sense nowadays.

And please, remember that I am not in a faculty position getting money for doing research in string theory. I have no economic, doctrinal or political reason to favour one theory or another. It is just that, according to what I know right now, string theory seems a perfectly good theory for doing high energy physics, and I have tried to explain why.

## Monday, June 01, 2009

### Quick ideas to become a cosmology atheist

As I said in the other post, and not for the first time, I don't take cosmology too seriously. I find that there are many uncertainties in the observed data and also in the interpretations. Because of that I hadn't bothered to think too much about those questions.

In the last post Kea suggested that I read Louise Riofrio's theory, which turned out to be a version of the VSL (variable speed of light) cosmologies. I have partially read some of her statements, and also done the usual googling about the topic. The first thing one finds is a mention of a "Von Riemann space". Well, I have no idea what that is supposed to be. Of course that could be because I am not a specialist in the field, so I googled for it and reached a Physics Forums thread where other people also agreed that they didn't know it. Well, there are some other points in her papers whose motivation I don't see clearly. That being so, I can't say too much else about the general theory.

Another aspect where she seems to see a point favouring her VSL theory, independent of the general model, is the following argument. In some epoch the Sun, according to the standard model of solar evolution, radiated 75% of the energy that it radiates now. OK, according to that she claims that the Earth should have been an ice ball, contradicting the fact that there was life on it. VSL solves the problem because, someway, VSL implies that the Sun's luminosity should be corrected by the right factor.
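One can do the standard back-of-the-envelope check behind this "faint young Sun" argument with a simple radiative-balance model. The sketch below ignores the greenhouse effect entirely and assumes an illustrative present-day albedo, so it is not a climate model; its point is only that the equilibrium temperature scales as L^(1/4), so a 25% drop in luminosity cools the Earth by only about 7%.

```python
import math

# Radiative-balance equilibrium temperature of a planet:
# T = (L * (1 - A) / (16 * pi * sigma * d^2))^(1/4)
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26   # present solar luminosity, W
AU = 1.496e11      # Earth-Sun distance, m
ALBEDO = 0.3       # illustrative present-day Bond albedo (assumed constant)

def equilibrium_temp(luminosity, albedo=ALBEDO, distance=AU):
    flux_balance = luminosity * (1.0 - albedo) / (16.0 * math.pi * SIGMA * distance**2)
    return flux_balance ** 0.25

t_now = equilibrium_temp(L_SUN)         # ~255 K (no greenhouse warming included)
t_faint = equilibrium_temp(0.75 * L_SUN)
print(f"T_eq today:        {t_now:.0f} K")
print(f"T_eq at 75% L_sun: {t_faint:.0f} K")   # only ~18 K colder
```

Since greenhouse warming (which this sketch omits) contributes some 30 K today, whether the early Earth froze hinges on atmospheric composition at least as much as on luminosity, which is one reason conventional astronomy and geophysics have long debated this paradox without invoking a variable speed of light.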

Without going into the details, I must say that I find it very unlikely that conventional astronomy wouldn't have considered that possibility before. Also there is another consideration. The Earth is hot by itself. The friction energy that led to its formation is accumulated inside it. In the 19th century there was a controversy between the geologists and a prominent physicist (I don't remember for sure, but I think it was Kelvin). The geological observations dated the antiquity of the Earth at a number of years that was incompatible with its temperature. Using the heat equation and the conventional data for Earth materials, one could see that the Earth would have frozen long before the age estimated by the geologists. Later Sommerfeld said that the reconciliation of the two viewpoints was the presence of radioactive materials inside the Earth. Sommerfeld being such a well-qualified physicist, the argument was accepted as valid without criticism.

Well, in fact if one does the actual calculations it can be shown that the radioactive materials are not enough to keep the Earth hot. The reason the Earth is still hot (in its interior) is that the heat equation used by Kelvin was not right. One needs to also consider transport phenomena, that is, convection. Doing so, it can be shown that the Earth is hot because of the inner heat acquired during its formation.
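Kelvin's original purely conductive estimate is short enough to reproduce. For a half-space cooling from an initial temperature T0, the surface thermal gradient is G = T0/sqrt(pi·kappa·t), which can be inverted to give an age t = T0²/(pi·kappa·G²). The input numbers below are illustrative, of the order Kelvin used; convection and radioactive heating are ignored by construction, which is exactly why the answer comes out far too small.

```python
import math

# Kelvin-style conductive-cooling age of the Earth:
# t = T0^2 / (pi * kappa * G^2), from G = T0 / sqrt(pi * kappa * t).
KAPPA = 1.0e-6            # thermal diffusivity of rock, m^2/s
T0 = 2000.0               # initial interior temperature, K (illustrative)
GRADIENT = 25.0 / 1000.0  # observed near-surface gradient, ~25 K per km, in K/m

SECONDS_PER_YEAR = 3.156e7

age_s = T0**2 / (math.pi * KAPPA * GRADIENT**2)
age_myr = age_s / SECONDS_PER_YEAR / 1e6
# Comes out at tens of Myr, far short of the ~4500 Myr geological age.
print(f"Kelvin-style conductive age: ~{age_myr:.0f} Myr")
```

The mismatch between this tens-of-millions-of-years figure and the geological record is the controversy described above, and it illustrates the post's point: the error was not in the data but in an idealized model applied outside laboratory-controlled conditions.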

That being so, I am not sure how much sense the Riofrio argument makes. Also, I find that history interesting because it shows explicitly how cautious one must be with arguments not based on observations made in laboratory-controlled conditions. Simply, there are too many uncertainties.

Well, it happens that this month the Spanish edition of Scientific American has an article where the cosmological arguments leading to the cosmological constant are revisited. Being a popular article, that is, easy to read, I read it (it didn't take too much time). The idea is that the observational reason why we believe the universe is expanding in an accelerated way is that we see that the light of distant supernovae arrives to us with less intensity than what would be expected from their redshift if the universe were under a decelerating FRW (Friedmann-Robertson-Walker) expansion. In the article they offer an alternative explanation. They say that if we were in a particularly empty region of space-time, the local deceleration of the universe would be slower here than at distant points (for example the points near the observed supernovae). That contradicts the Copernican principle, which says that we are not in a special place in space-time. But that can be circumvented in a natural way. If in the early universe there were a random distribution of density inhomogeneities that respected that principle, the evolution would make the less dense parts increase their size by a factor greater than the more dense ones. In that way it would be more probable that we would be in a relatively empty region of the universe. The last part of the argument is very similar to the nucleation mechanism that Susskind used to explain the cosmological constant (but there are also differences, of course).

Well, after reading all that, I wondered if I myself could devise a mechanism to go against the conventional big bang + inflation scenario. Well, indeed I could.