The BMO Financial Group Isaac Newton Chair in Theoretical Physics

I learned this morning from Matin Durrani’s blog that the Perimeter Institute has announced today the first of what they expect to be five very well-funded Perimeter Research Chairs in theoretical physics. The next four will be named after Maxwell, Bohr, Einstein and Dirac (as well as whatever other wealthy individual or organization comes up with funding).

The BMO Financial Group is putting up $4 million and $4 million is coming out of the Perimeter endowment (which is mostly from BlackBerry’s Mike Lazaridis). An endowment of $8 million for a chair is quite high. It seems that typical numbers for endowment payouts these days are around 5%, so this would make available $400,000 or so a year to pay some prominent theorist. For comparison, the Simons Foundation has recently announced that it will fund endowed Math+X chairs aimed at mathematicians working at the interface with some other subject. Simons may be the wealthiest hedge-fund manager in the world, but he’s a piker compared to the Canadian financiers, with only $1.5 million going to each chair (to be matched by $1.5 million from the institution that gets the chair, for a total of $3 million). Then again, it just may be that prominent mathematicians are dirt-cheap compared to prominent theoretical physicists.
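To make the comparison concrete, using the same 5% payout figure for both endowments (applying that figure to the Simons numbers is my extrapolation, not something from either announcement):

    5% × $8 million ≈ $400,000/year (BMO Isaac Newton Chair at Perimeter)
    5% × $3 million ≈ $150,000/year (Simons Math+X chair)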

The Perimeter Institute in recent years has moved away from supporting non-mainstream topics in theoretical physics, while expanding dramatically. The only two conferences announced there for the next year or so are on LHC physics and AdS/CFT, about as mainstream as one can possibly imagine. If they manage to fund what might be the five highest-paid theoretical physics positions in the world and hire the people they want into them, they will be well on their way to a dominant position in the subject. While the tactic here, as in most industries these days, is to shower the top few people in the field with cash, Perimeter is also expanding its hiring at more junior levels. According to the rumor mill, of a total of fourteen people hired last year to tenure-track positions in theoretical particle physics in North America, three went to Perimeter.

For more on this, see here, and an interview with Lazaridis and the BMO CEO here.

Posted in Uncategorized | 15 Comments

Assorted News

  • HEPAP is meeting in Washington today, presentations available here. The idea of this regular meeting is for the US HEP community and the funding agencies to meet and plan for the future, something that’s not easily done in an environment where these agencies have no budget at all for the current year, just an authorization to spend money at last year’s rate that expires a couple of weeks from now. No one seems to be sure what funding prospects are for the next few months, much less the next few years. Fermilab is dealing with this situation by offering 600 of its staff incentives to quit or retire next month (see here). There’s a new DOE Committee of Visitors report out here; it contains the bizarrely familiar recommendation of all such reports: the US needs to fund more HEP theory students (they don’t explain why, or where the money should come from).
  • In dark matter news, Princeton this week hosted a workshop on the subject, talks available here. Still no results from the latest XENON100 run. This week’s Nature has a nice review of the various searches for WIMP dark matter, with the conclusion:

    With the advent of the Large Hadron Collider at CERN, and a new generation of astroparticle experiments, the moment of truth has come for WIMPs: either we will discover them in the next five to ten years, or we will witness their inevitable decline.

    (Update: a commenter points out that this article is also available on the arXiv here.)

    One new astroparticle experiment that is supposed to look for evidence of dark matter is Sam Ting’s Alpha Magnetic Spectrometer, set to be launched in February, and described in a front-page New York Times article yesterday.

  • Ten days after first collisions, ALICE already has two papers out (here and here) with experimental results on lead-lead collisions at an energy more than an order of magnitude higher than ever before. String theorists are very enthusiastic about this (see here and here), claiming that what is being observed exhibits “properties of a type that can be nicely captured using string theory models”. I’d be quite curious to see any AdS/CFT-based predictions that could be compared to these new results (or to forthcoming ones).
  • For the latest from the LHC, see here. The current plan is to have a proton-proton beam back around February 21, followed by at least two weeks of beam recommissioning. The proton run would end in November, followed by another ion run. First estimates for 2011 are that the run will be at 4 TeV/beam, and a “reasonable” estimate of total luminosity would be 2.2 inverse femtobarns, double the initial goal. Even more optimistically, the possible “ultimate reach” for next year would be a luminosity that would give a total of 7.2 inverse femtobarns if sustained over the hoped-for 200 days of running. This kind of higher luminosity would allow the LHC to see evidence of a Higgs over the entire expected range, and to finally overtake the Tevatron in the Higgs race. The experiments so far are reporting results that exactly match the Standard Model, with more announcements to come at the winter conferences early next year.
  • There’s an interesting trend of our LA-based theorist-blogger-media-stars starting to resist making dubious media appearances. A few months ago Sean Carroll described storming off the set of a TV pilot here. Now Clifford Johnson (whose media mishaps include appearing as a scientific expert on the question of how big women’s breasts need to be to crush beer cans, see here) tells us that Sometimes I Say No.
Posted in Experimental HEP News, Uncategorized | 28 Comments

    A Geometric Theory of Everything

    The December issue of Scientific American is out, and it has an article by Garrett Lisi and Jim Weatherall about geometry and unification entitled A Geometric Theory of Everything. Much of the article is about the geometry of Lie groups, fiber bundles and connections that underpins the Standard Model as well as general relativity, and it promotes the idea of searching for a unified theory that would involve embedding the SU(3)xSU(2)xU(1) of the Standard Model and the Spin(3,1) Lorentz group in a larger Lie group.

    The similarities between (pseudo)-Riemannian geometry in the “vierbein” formalism, where there is a local Spin(3,1) symmetry, and the Standard Model with its local symmetries make the idea of trying to somehow unify these into a single mathematical structure quite appealing. There’s a long history of such attempts and an extensive literature, sometimes under the name of “graviGUT”s. For a recent example, see here for some lectures by Roberto Percacci. The Scientific American article discusses two related unification schemes of this sort, one by Nesti and Percacci that uses SO(3,11), another by Garrett that uses E8. Garrett’s first article about this is here; the latest version is here.
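    As a rough illustration of what such an embedding involves (just a dimension count on my part, not something taken from the article or the papers), the generators that need to fit inside the larger group number

        dim SU(3) + dim SU(2) + dim U(1) + dim Spin(3,1) = 8 + 3 + 1 + 6 = 18,

    while the candidate groups are much larger:

        dim SO(3,11) = 91,    dim E8 = 248.

    So there is plenty of room at the level of Lie algebras; the real difficulties lie elsewhere, for instance in how the fermions fit into representations of the larger group, which is where the sort of obstructions discussed below come in.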

    While I’m very sympathetic to the idea of trying to put these known local symmetry groups together, in a set-up close to our known formalism for quantizing theories with gauge symmetry, it still seems to me that major obstructions to this have always been and are still there, and I’m skeptical that the ideas about unification mentioned in the Scientific American article are close to success. I find it more likely that some major new ideas about the relationship between internal and space-time symmetry are still needed. But we’ll see; maybe the LHC will find new particles, new dimensions, or explain electroweak symmetry breaking, leading to a clear path forward.

    For a really skeptical and hostile take on why these “graviGUT” ideas can’t work, see blog postings here and here by Jacques Distler, and an article he wrote with Skip Garibaldi here. For a recent workshop featuring Lisi, as well as many of the most active mathematicians working on representations of exceptional groups, see here. Some of the talks feature my favorite new mathematical construction, Dirac cohomology.

    One somewhat unusual aspect of Garrett’s work on all this, and of the Scientific American article, is that his discussion of Lie groups puts their maximal tori front and center, along with the fascinating diagrams you get by labeling the weights of various representations under the action of these maximal tori. He has a wonderfully fun toy to play with that displays these things, which he calls the Elementary Particle Explorer. I hear that t-shirts will soon be available…
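    For anyone who wants to play with such diagrams outside the Explorer, here is a minimal sketch (my own illustration in Python, not Lisi’s code, and assuming matplotlib is available) that plots the weights of the fundamental representation of SU(3) in the plane of its maximal torus, using the usual isospin/hypercharge coordinates:

        import matplotlib.pyplot as plt

        # Weights of the fundamental (3) of SU(3) in (T3, Y) coordinates on the
        # maximal torus -- the familiar quark triangle (labels are illustrative).
        weights = {"u": (0.5, 1.0 / 3), "d": (-0.5, 1.0 / 3), "s": (0.0, -2.0 / 3)}

        fig, ax = plt.subplots()
        for name, (t3, y) in weights.items():
            ax.plot(t3, y, "o", color="black")
            ax.annotate(name, (t3, y), textcoords="offset points", xytext=(6, 6))
        ax.axhline(0, color="gray", linewidth=0.5)
        ax.axvline(0, color="gray", linewidth=0.5)
        ax.set_xlabel("T3 (isospin)")
        ax.set_ylabel("Y (hypercharge)")
        ax.set_title("Weights of the SU(3) fundamental representation")
        plt.show()

    Diagrams for larger groups such as E8 are made the same way, by projecting the weights of a representation onto a two-dimensional plane in the maximal torus, which is roughly what the Explorer does interactively.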

    Update: T-shirts are available here.

    Posted in Uncategorized | 110 Comments

    The Anderson-Higgs Mechanism

    One reason for this posting is that exchanges in the comment section of the previous one led me to look into some history, and I found some odd and possibly interesting facts I hadn’t previously known. So, part of this will just be lifting some links from comments on the last posting.

    Another reason is that, while the history may seem obscure, what’s at issue is the central unsolved problem of particle physics: the nature of electroweak symmetry breaking, and no excuse for thinking more about this topic should be allowed to pass by. The work of Yang and Mills on non-abelian gauge theory, published in 1954, had one huge problem: in perturbation theory it has massless particles which don’t correspond to anything we see. One way of getting rid of this problem is now fairly well understood: the phenomenon of confinement realized in QCD, where the strong interactions get rid of the massless “gluon” states at long distances (they are relevant at short distances, visible in terms of the jets seen at colliders).

    By the very early sixties, people had begun to understand another source of massless particles: spontaneous breaking of a continuous symmetry. If the vacuum state is non-invariant under a continuous symmetry, you expect to find one massless state in the theory for each broken symmetry generator. These are called “Nambu-Goldstone” particles, and pions provide an example (only approximately massless, since the symmetry is approximate).

    What Philip Anderson realized and worked out in the summer of 1962 was that, when you have both gauge symmetry and spontaneous symmetry breaking, the Nambu-Goldstone massless mode can combine with the massless gauge field modes to produce a physical massive vector field. This is what happens in superconductivity, a subject about which Anderson was (and is) one of the leading experts. His paper on the subject was submitted to Physical Review that November, and appeared in the April 1963 issue of the journal, in the particle physics section. It explains what is commonly called the “Higgs mechanism” in very much the same terms in which the subject appears in modern particle physics textbooks and notes:

    It is likely, then, considering the superconducting analog, that the way is now open for a degenerate-vacuum theory of the Nambu type without any difficulties involving either zero-mass Yang-Mills gauge bosons or zero-mass Goldstone bosons. These two types of bosons seem capable of “canceling each other out” and leaving finite mass bosons only.

    All that is missing here is an explicit relativistic example to supplement the non-relativistic one from superconductivity. This was provided by several authors in 1964, with Higgs giving the first explicit relativistic model. Higgs also seems to have been the first to explicitly discuss the existence, in models like his, of a massive mode of the sort that we now call a “Higgs particle”, the target of active searches at the Tevatron and LHC.
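    For reference, here is the simplest such relativistic example in its standard textbook form, the Abelian Higgs model (the conventions here are the usual modern ones, not taken from any of the 1964 papers):

        L = -(1/4) F_{\mu\nu} F^{\mu\nu} + |D_\mu \phi|^2 - \lambda (|\phi|^2 - v^2/2)^2,    D_\mu \phi = (\partial_\mu - i e A_\mu) \phi

    Writing \phi = (v + h) e^{i\theta/v} / \sqrt{2} and absorbing the would-be Goldstone mode \theta into the gauge field (unitary gauge), the |D_\mu \phi|^2 term contains (1/2) e^2 v^2 A_\mu A^\mu, so the vector field acquires mass m_A = e v, while h survives as the massive scalar mode, the analog of the “Higgs particle” referred to above.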

    Anderson tells his story here:

    So it was probably completed summer ’62. Very little attention was paid to it except that in fact— well, Higgs reinvented it. In some ways the particle physicists tell me he had less understanding; in some ways he had more. He certainly made a real model out of it where I had only a mechanism…

    about the Anderson-Higgs phenomenon, if I may use the word. In the paper that I wrote I definitely said people have been worried about the Goldstone boson in broken symmetry phenomena. The Goldstone boson is not necessary. Here is the possibility of removing the Goldstone boson, mixing it with a gauge boson, and ending up with zero mass. [should be “non-zero” maybe a transcription error]…So I think I really understood the nature of the mechanism…

    It was not published as a paper in the Condensed Matter Physics. It was published as a paper in Particle Physics. Brout paid attention to it. And he and Englert two years later produced a model of symmetry breaking, which if you’ll read carefully the summary of their work that t’Hooft and Veltman give (Nobel Prize winner this year), they say that they took off very much from the Brout-Englert paper, and there’s no way Brout was not perfectly aware of my work and I would be surprised if the Brout Englert paper doesn’t reference it rather than Higgs or along with Higgs. So in fact it didn’t fall completely on deaf ears.

    Note added 5/15/2013: I’ve heard from Martin Veltman that at the time they were working on the renormalizability of Yang-Mills, he and ’t Hooft were not aware of the Brout/Englert work, or of the general issues about the Goldstone theorem and the Higgs mechanism that Brout/Englert and others were addressing. Veltman’s Nobel lecture describes the history in detail, and has nothing like what Anderson describes (neither does ’t Hooft’s).

    Given the background Brout had in condensed matter physics and Anderson’s claim that “there’s no way Brout was not perfectly aware of my work”, it is quite surprising that no reference to Anderson occurs in the paper he and Englert published in Physical Review Letters. It arrived at the journal June 26, 1964 and came out in an issue dated August 31, 1964. In historical talks about this given back in 1997 (available here), Brout and Englert write:

    We knew from our study of ferromagnetism that long range forces give mass to the spin waves and we were aware, from Anderson’s analysis of superconductivity [5], of the fact that the massless mode of neutral superconductors, which is also a Nambu-Goldstone mode, disappears in charged superconductors in favor of the usual massive plasma oscillations resulting from the long range coulomb interactions in metals. Comforted by these facts, we decided to confront, in relativistic field theory, the long range forces of Yang-Mills gauge fields with the Nambu-Goldstone bosons of a broken symmetry.

    The latter arose from the breaking of a global symmetry and Yang-Mills theory extends the symmetry to a local one [6]. Although the problem in this case is more subtle because of gauge invariance, the emergence of the Nambu-Goldstone massless boson is very similar. We indeed found that there were well defined gauges in which the broken symmetry induces such modes. But, as we expected, the long range forces of the Yang-Mills fields were conflicting with those of the massless Nambu Goldstone fields. The conflict is resolved by the generation of a mass reducing long range forces to short range ones. In addition, gauge invariance requires the Nambu-Goldstone mode to combine with the Yang Mills excitations. In this way, the gauge fields acquire a gauge invariant mass!

    This work was finalized in 1964.

    Very oddly, the only reference to Anderson’s work that they give (their [5]) is to a 1958 paper of his, not to the 1963 paper that had reached the same conclusions as theirs a year earlier.

    Brout and Englert don’t give a full model; they just assume the existence of a scalar field with a spontaneously broken symmetry and specified couplings to the gauge fields. Working independently, Peter Higgs in July 1964 sent a paper to Physics Letters arguing that, even relativistically, Anderson’s argument worked and there is no need for massless particles in the case of spontaneous symmetry breaking with a local symmetry. This paper was published, but a paper he sent a week later in which he wrote down an explicit model (the Abelian Higgs model) was rejected. That second paper was then submitted to Physical Review Letters (received August 31, 1964) and accepted there (published in the October 19, 1964 issue), with the referee (Nambu) making Higgs aware of the Brout-Englert paper, which Higgs refers to in a footnote. The Higgs paper does refer to Anderson’s 1963 paper, writing in the introduction:

    This phenomenon is just the relativistic analog of the plasmon phenomenon to which Anderson [3] has drawn attention.

    Higgs gives his version of the history here, and refers to the “Anderson mechanism”, writing:

    During October 1964, Higgs had discussions with Gerald Guralnik, Carl Hagen and Tom Kibble, who had discovered how the mass of non-interacting vector bosons can be generated by the Anderson mechanism.

    Guralnik, Hagen and Kibble had been working on what Higgs calls the “Anderson mechanism” and Anderson the “Anderson-Higgs mechanism”, writing a paper about it for submission to PRL. Guralnik gives his version of the history here (writing about the “Brout, Englert, Guralnik, Hagen, Kibble, Higgs phenomenon”, Higgs last, no Anderson); Kibble’s is here. In Guralnik’s version:

    as we were literally placing the manuscript in the envelope to be sent to PRL, Kibble came into the office bearing two papers by Higgs and the one by Englert and Brout. These had just arrived in the then very slow and unreliable (because of strikes and the peculiarities of Imperial College) mail. We were very surprised and even amazed. We had no idea that there was any competing interest in the problem, particularly outside of the United States. Hagen and I quickly glanced at these papers and thought that, while they aimed at the same point, they did not form a serious challenge to our work.

    His explanation for why they did not refer to Anderson is:

    At the same time, Kibble brought our attention to a paper by P.W. Anderson [26]. This paper points out that the theory of plasma oscillations is related to Schwinger’s analysis of the possibility of having relativistic gauge invariant theories without massless vector particles. It suggests the possibility that the Goldstone theorem could be negated through this mechanism and goes on to discuss “degenerate vacuum types of theories” as a way to give gauge fields mass and the necessity of demonstrating that the “necessary conservation laws can be maintained.” In general these comments are correct. However, as they stand, they are entirely without the analysis and verification needed to give them any credibility. These statements certainly did not show the calculational path to realize our theory and hence the unified electroweak theory. It certainly did not even suggest the existence of the boson now being searched for at Fermi lab and LHC. The actual verification that the same mechanism actually worked in non-relativistic condensed-matter theories as in relativistic QFT had to wait for the work of Lange [28], which was based on GHK. We did not change our paper to reference the Anderson work.

    See Guralnik’s paper for a detailed discussion of the points he feels Anderson, Brout, Englert and Higgs had missed about all this. It remains true that a full non-perturbative understanding of how this works is rather tricky, especially in the chiral context relevant to the Standard Model. It may very well be that there is some important piece of the puzzle that has been missing and will someday lead to a final understanding of the origin of electroweak symmetry breaking.

    Update: For two other recent expository articles about this subject and its history, see here and here.

    Posted in Favorite Old Posts, Uncategorized | 31 Comments

    Massive

    There’s a wonderful new book about particle physics that has just come out, Massive: The Missing Particle that Sparked the Greatest Hunt in Science, by Ian Sample, who is a science correspondent for the Guardian. The topic is the huge open question currently at the center of particle physics: is the Higgs mechanism the source of electroweak symmetry breaking (and, at the same time, the source of the mass terms in the Standard Model)? The Tevatron and the LHC are now in a race to either detect the Higgs particle or rule out its existence, with one outcome or the other very likely to come through within the next few years.

    Truly explaining what the Higgs mechanism is can only be done with mathematics and physics background far beyond that expected in a popular book, but Massive makes a good try at it. Sample does a wonderful job of telling the history behind this subject. He’s the first writer I know of who has gotten Peter Higgs to tell his story in detail. The original paper on the subject by Higgs was rejected by Physics Letters, but ultimately published by Physical Review Letters. There’s a complicated priority issue one can argue over, and that someday soon a Nobel committee may need to resolve, involving Higgs, Englert, Brout, Guralnik, Hagen and Kibble. My personal opinion is that it was condensed matter theorist Philip Anderson who first understood and described the Higgs mechanism, quite a while before anyone else.

    Sample’s book is full of wonderful stories about particle physics, and alludes to some that he can’t give the details of:

    On June 8, 1978, Adams marked the achievement in extraordinary fashion. He jotted down a poem about Rubbia and Van der Meer’s efforts and sent it out as a memo. The poem — too offensive to reprint here — suggested that Rubbia had exploited van der Meer’s brilliance to further his own career.

    The footnote to this says that the memo is in the CERN archive, dated June 8, 1978 and entitled “Approval of ppbar facility”.

    One of the later parts of the story involves the discussion of Higgs rumors on particle physics blogs, and debates among the experimental collaborations about this. With a little bit of luck, we may hope to see more of this soon.

    All in all, the book is a great read, by far this year’s best popular book to recommend to lay people who want some idea of what’s going on in particle physics now and why it is exciting.

    Posted in Book Reviews | 29 Comments

    This and That

  • There’s a new preprint here explaining the scientific case for running the Tevatron past 2011. A couple of weeks ago the P5 subpanel came out with its report on the subject, generating news stories “Momentum builds for Tevatron extension” and “Panel Wants U.S. to Chase ‘God Particle’—If There’s Money”. The panel recommended extending the Tevatron run, but only if an additional $35 million/year in federal funding was provided to do it. Prospects for this are very unclear, with the next stage in the process being a decision about whether to include this in the President’s DOE budget request for FY2012, due next February.

    The US for a long time now has operated under a bizarre budgeting system, where government agencies typically spend much of the time operating without a budget. FY 2011 began October 1, with the Congress still far from having come up with a FY 2011 budget, instead operating the government on a series of “continuing resolutions”, which allow spending at the FY 2010 level. The latest of these expires December 3, after which there will undoubtedly be another one. Sooner or later an “omnibus bill” setting the actual budget will presumably get passed, and one might think the result would correspond to the levels set by the appropriations committees of the House and Senate (which contain an increase over FY 2010 levels). Every other year, though, is an election year, so the Congress that had left town for weeks to campaign comes back after a post-election rest as a lame-duck organization, with lots of its members pushing to delay everything until next year if their side did well in the election. This year the Republicans did very well at the polls arguing that the government spends too much money on “discretionary” things. Unfortunately for HEP, it’s in the relatively small “discretionary” part of the budget. There will undoubtedly be a strong push from the Republicans not to wait for FY 2012, but to start cutting this year’s budget, in the middle of the fiscal year. Given the dysfunctional nature of the US political system, it’s anybody’s guess how this will turn out. For comments from the Fermilab director about this situation, see here.

  • One organization that doesn’t have to worry about federal funding is FQXI, which was initially funded by the Templeton Foundation with a grant that was supposed to take them through the end of last year. I don’t know where their money is coming from these days, but they have recently announced a new essay contest carrying $40,000 or so in prize money (the Gruber Foundation is one sponsor), on the topic “Is Reality Digital or Analog?”.

    Among other FQXI activities, next August its members will go on a cruise during which they will discuss foundational questions related to the nature of time.

  • The LHC proton-proton run has ended for 2010, with an integrated luminosity of about 45 inverse picobarns. The plan is to restart around February 4 and collide protons for 9 months in 2011. It’s possible that the energy of the beams will be raised from 3.5 TeV to 4 TeV. Detailed plans will be made at workshops at Evian and Chamonix (January 24-28). For recent news and some idea of long-term plans, see this talk at the US LHC users organization meeting. It has the LHC shutting down to replace splices from December 2011 to March 2013, and another long shutdown for a luminosity upgrade in 2016. In the very long term, there are discussions of upgrading the machine with new, higher-field magnets that would allow operation at 16.5 TeV/beam, but this is probably about 20 years off…
  • The theorists at CERN have been retreating, see here.
  • There’s an interview with Steven Weinberg in Scientific American, see here.
  • Nature Physics has a review by Eva Silverstein of the Yau/Nadis book (that I discussed here).
  • See Dennis Gaitsgory’s website here for some notes in progress on geometric Langlands.
  • Update: One more: an interesting diavlog between Lee Smolin and Robert Wright at the Big Questions Online site.

    Update: Tommaso Dorigo has a critical discussion of the paper making the case for an extended Tevatron run.

    Posted in Experimental HEP News, Langlands, Uncategorized | 10 Comments

    Once Before Time

    There’s a new popular book out this week entitled Once Before Time: A Whole Story of the Universe, by Martin Bojowald, promoting his ideas about “Loop Quantum Cosmology”. It’s a translation of the original German edition, Zurück vor den Urknall, published last year.

    The topic of the book is work by Bojowald on toy models using loop quantum gravity that avoid the Big Bang initial singularity of classical general relativity. For a much shorter version of all this, see his 2008 Scientific American cover story Big Bang or Big Bounce?

    There’s a very deep human desire to understand origins, and thus to trace the history of the universe back before the earliest periods for which cosmological theory and observations have provided some degree of scientific understanding. Unfortunately this has led in recent years to a flood of over-hyped claims from physicists who say they have a scientifically viable theory of what happened “Before the Big Bang”. To qualify as legitimate science, such claims need to be backed up by some conventional sort of evidence. This might take the form of experimental predictions, testable either now or in principle in the future. It might also take the form of a highly constrained and beautiful theory whose success in other realms makes a compelling case that it could also explain experimentally inaccessible phenomena. I don’t know of any example of such pre-Big Bang scenarios now being sold to the public that comes even close to having such backing.

    The cover of Bojowald’s book tells us about Loop Quantum Cosmology:

    Now the theory is poised to formulate hypotheses we can actually test.

    I’m not sure exactly what that is supposed to mean, but it appears to be misleading hype, not corresponding to anything actually in the book. The text of the book itself wavers back and forth, sometimes explaining the overwhelming problems one faces in trying to extract some kind of prediction out of the LQC framework and emphasizing how speculative it all is, at other times expressing ungrounded optimism that somehow these problems will be overcome. It ends on an upbeat, hopeful note

    Will we ever, with a precision that meets scientific standards, see the shape of the universe before the big bang? The answer to such questions remains open. We have a multitude of indications and mathematical models for what might have happened. A diverse set of results within quantum gravity has revealed different phenomena important for revealing what happened at the big bang. But for a reliable extrapolation, parameters would be required with a precision far out of reach of current measurement accuracies. This does not, however, mean that it is impossible to answer questions about the complete prehistory of the universe. Cosmology as well as theoretical investigations are currently moving forward and will result in unforeseen insights. Among them might well be experimentally confirmed knowledge of the universe before the big bang.

    but I found nothing in the book to justify this optimism. The few allusions to specific attempts to find some relation to something observable are vague and suffer from the book’s nearly complete lack of any references to more technical sources of information.

    About fifteen years ago, in The End of Science, John Horgan described the field of fundamental physics as degenerating into what he called “ironic science”, something more like literature, art or philosophy than traditional science, pursued in a “speculative post-empirical mode”. At the time I thought he was going too far, but Bojowald’s book provides an unfortunate confirmation of the phenomenon Horgan was describing. It’s written in a rather dense and sometimes impenetrable style, featuring quotations from Nietzsche, some science fiction set off in italics, and a few pictures of contemporary artworks supposedly relevant to the argument. Attempts are made to claim a role for pre-Socratic philosophy, with LQC finally providing a means of going beyond the pre-Socratics:

    Otherwise, one can find among the pre-Socratics most of the elements of modern cosmology. Only with quantum gravity did truly new elements enter the game.

    Aficionados of the loop quantum gravity – string wars will find various accurate comments about string theory and the sociology of science, and Bojowald also describes an interesting insider’s point of view on the story of the development of loop quantum gravity and the scientific figures behind it. He’s quite right that it’s a fascinating possible approach to quantum gravity well worth pursuing, but the applications to cosmology seem to me not even close to being ready for prime time or for this kind of treatment in a popular book.

    Update: There’s a review by Brian Clegg at the Wall Street Journal here.

    Posted in Book Reviews | 25 Comments

    Simons Center Inaugural Conference

    I was out in Stony Brook for the past couple of days, to attend festivities surrounding the inauguration of the new building there which will house the Simons Center for Geometry and Physics. The Center is funded by Jim Simons, whose Renaissance Technologies is one of the world’s most successful hedge funds. The plan is to ultimately have six permanent members (half in physics, half in math) and a director, as well as quite a few postdocs and visitors. The founding director of the center is John Morgan, who until recently was my colleague here in the math department at Columbia. Mike Douglas is a permanent member in physics; current and recent senior visitors include Nikita Nekrasov and Graeme Segal.

    The building is quite attractive, with the two lower floors forming a public area that includes two auditoriums, an atrium and a dining area which will serve lunch and bring together the local math and physics communities. The three upper floors have offices and seminar rooms. Construction is nearly finished, but not quite, with part of the ground floor still walled off for some last-minute work. The atrium features a large mural containing a selection of historically important equations, the choice of which evidently was a major undertaking. Guests noted one typo, but luckily the current mural is a temporary printed one, with the plan to cut the equations in stone not yet implemented.

    Tuesday night featured a gala opening event, with music provided by the Emerson quartet, and short speeches from various dignitaries. Simons was presented with an original edition of one of the works of Isaac Newton, with the comment that he had shown much greater success than Newton at turning things into gold. He gave a wonderful talk describing the early stages of interaction between math and physics that he was involved in as chair of the math department at Stony Brook in the early seventies. Evidently soon after his arrival he was invited to meet with Frank Yang, who had just started up an institute for theoretical physics there. Yang described his current research on gauge theory, of which Simons claims not to have understood a word. The same thing happened again a year later. However, the third time this happened, Simons all of a sudden realized that what Yang was talking about was something that geometers knew very well: a connection in a principal bundle. This led to a series of lunchtime lectures by Simons to the physicists, to visitor Is Singer getting interested and taking what he learned back to MIT and to Atiyah at Oxford, and finally to the modern period of interaction between math and physics that began in the mid-seventies, centered around questions related to instantons.

    On Wednesday, the Simons Center hosted a day-long inaugural conference (videos and slides of talks should appear on their web-site at some point). Appropriately, the first talk was an inspirational one from Michael Atiyah, going over a wide variety of mathematical ideas. One theme was the quaternions, with Atiyah pointing out that Hamilton had written down a square root of the Laplacian many decades before this trick was rediscovered by Dirac in writing down the Dirac equation. After recalling the relation between the division algebras (real and complex numbers, quaternions, octonions) and the Hopf invariant one problem, Atiyah suggested that the Freudenthal magic square has a similar relationship to the Kervaire invariant one problem, with the recent complicated proof by Hopkins et al. playing the role of the original Adams proof in the Hopf case, and the analog of Atiyah’s “postcard proof” still to be discovered. He ended with some comments about ideas of Alain Connes about non-commutative geometry and the Riemann hypothesis, and suggested that the conjectured self-adjoint operator that could explain the Riemann hypothesis might be the Hamiltonian of quantum gravity. I noticed that Atiyah was supposed to be giving a talk at the IAS today with the impressive title of “Quantum Gravity and the Riemann Hypothesis”, but it appears to have been canceled.
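    Hamilton’s observation can be written in one line (this is standard material, my summary rather than anything from the talk): since i^2 = j^2 = k^2 = -1 and the imaginary quaternion units anticommute,

        (i \partial_x + j \partial_y + k \partial_z)^2 = -(\partial_x^2 + \partial_y^2 + \partial_z^2) = -\nabla^2,

    so this first-order quaternionic operator is a square root of (minus) the three-dimensional Laplacian, with Dirac’s gamma matrices playing the analogous role in 3+1 dimensions.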

    The second morning talk was a rather rambling and elementary one by Polyakov about topics related to Wick rotation, ending with the claim that in the gravitational case the usual ideas about Wick rotation fail, and that this has something to do with explaining the cosmological constant. The afternoon began with a talk by my colleague Andrei Okounkov using symplectic resolutions to study quantum cohomology, followed by Witten, whose topic was “A New Look at Khovanov Homology”. This talk covered similar material to the ones described here, involving a really beautiful story about various 3- and 4-dimensional topological quantum field theories. As in the earlier talks though, at the end there’s a transition to 5- and 6-dimensional theories, where I got just as lost as before. I had been hoping that the Simons Center talk would explain the ideas I was missing, but I fear that there’s no way around digging into the details of Witten’s recent paper about this. The last talk was by Cumrun Vafa, who gave a very nice and elementary discussion of the “wall-crossing” phenomena in certain 2d QFTs, as motivation for recent work on wall-crossing in 4d gauge theories.

    The talks were uniformly of very high quality; it’s wonderful to see that the Simons Center is off to a great start.

    Update: I just did take another look at Witten’s recent paper, and realized that the part involving 5 and 6 dimensional theories is not there, but in a paper in progress on “Five-branes and Knots”. So, that story will just have to wait for now…

    Posted in Uncategorized | 10 Comments

    Recent Hot Topics in Hep-th

    SLAC’s SPIRES database has a link you can use to search for articles heavily cited during 2009 and 2010. Just looking at the hep-th papers, three are review articles of older work on applying AdS/CFT to condensed matter physics, three are about Erik Verlinde’s claims that gravity is an “entropic force”, and the rest are about Petr Horava’s non-relativistic theory of quantum gravity.

    The story of hep-th in 2009/10 seems to be that the only new ideas getting attention are ones coming from prominent string theorists who have become apostates advocating non-string theory approaches to quantum gravity. The idea of getting gravity out of simple thermodynamics didn’t get much attention back in 1995 when Ted Jacobson was discussing it, partly because it didn’t seem to go anywhere, partly because the conventional wisdom was that the spin-two massless mode of a string was the reason for gravity. Now that, fifteen years later, a prominent string theorist is promoting the idea (see his recent Harvard colloquium on the topic here), it is getting a lot of attention.

    Those abandoning string theory as an explanation for quantum gravity do need to be careful in how they describe what they are doing; see for example the first part of the most heavily cited hep-th paper of 2009 (Horava’s), which begins as follows:

    In recent decades, string theory has become the dominant paradigm for addressing questions of quantum gravity. There are many indications suggesting that string theory is sufficiently rich to contain the answers to many puzzles, such as the information paradox or the statistical interpretation of black hole entropy. Yet, string theory is also a rather large theory, possibly with a huge landscape of vacua, each of which leads to a scenario for the history of the universe which may or may not resemble ours. Given this richness of string theory, it might even be logical to adopt the perspective in which string theory is not a candidate for a unique theory of the universe, but represents instead a natural extension and logical completion of quantum field theory. In this picture, string theory would be viewed—just as quantum field theory—as a powerful technological framework, and not as a single theory.

    If string theory is such an apparently vast structure, it seems natural to ask whether quantum gravitational phenomena in 3 + 1 spacetime dimensions can be studied in a self-contained manner in a “smaller” framework. A useful example of such a phenomenon is given by Yang-Mills gauge theories in 3 + 1 dimensions. While string theory is clearly a powerful technique for studying properties of Yang-Mills theories, their embedding into string theory is not required for their completeness: In 3 + 1 dimensions, they are UV complete in the framework of quantum field theory.

    In analogy with Yang-Mills, we are motivated to look for a “small” theory of quantum gravity in 3 + 1 dimensions, decoupled from strings.

    Update: A commenter points out that Verlinde has just received a 2 million euro grant to support this kind of research; more info is available here.

    Posted in Uncategorized | 18 Comments

    This Week’s Hype

    Way back in 1997, string theorists were already getting rather touchy about people pointing out string theory’s testability problems. At that time, Gordon Kane published an article in Physics Today with the title String Theory is Testable, Even Supertestable in which he wrote:

    A decade ago in PHYSICS TODAY (May 1986, page 7), Paul Ginsparg and Sheldon Glashow raised this question dramatically, and effectively began a widely repeated myth that string theories, candidates for a primary theory, are not testable. Here I want to dispel this myth, and describe some of the many ways in which string theories are testable. If nature is supersymmetric on the electroweak scale, for which there is exciting but not yet compelling evidence, then string theories are even testable in essentially the same ways as traditional ones. All the tests I describe are doable now or in the foreseeable future with existing or proposed facilities or projects.

    Kane went on to give a long list of testable things that string theory was going to predict, including as an example a detailed spectrum of superpartner masses, all in the range 50-300 GeV (he assures us that supersymmetry is a prediction of string theory, quoting something Gross and Witten wrote for the Wall Street Journal).

    Now that LHC data has finally started to arrive, in amounts that will soon be large enough to start seeing all of the superpartners advertised back in 1997, Physics Today has decided to put out a rather spectacular piece of string theory hype from Kane as their cover story, under the title String Theory and the Real World. In the story, the main theme is the same as 13 years ago: it’s a myth that string theory doesn’t make testable predictions. Now though, the many 1997 predictions are forgotten, and much of the article is devoted to a tendentious discussion of what it means to test a scientific theory. In the 2010 version, there’s no longer a detailed list of things that string theory should be able to predict; instead, Kane describes two specific predictions of string theory:

  • The first one is about neutrino masses and is rather bizarre, describing work on one specific string theory compactification:

    We showed that in no case could the theory generate light but not massless neutrinos. That work represents a clear example of a test of string theory.

    So, one specific string theory compactification is known to not look like the real world. That’s a test of string theory????

  • Kane advertises a recent paper of his from this past June (with an updated version from a couple of weeks ago) about “non-thermal cosmological history”. I’ll leave it to readers to decide for themselves how compelling a test of string theory the paper provides.
    Whatever one thinks of these latest “tests”, the difference between what string theory tests looked like back in 1997 and what they look like in 2010 is rather remarkable.

    Posted in This Week's Hype | 33 Comments