String Vacuum Project 2010

I’ve written before about the String Vacuum Project (back in 2006 and 2008), and there was a story about it in Nature. This week they are having an SVP 2010 Spring Meeting at the KITP, talks available here.

A proposal to the NSF for funding of the String Vacuum Project was first made five years or so ago, but I had heard that this and later versions hadn’t been successful. In recent years perhaps the main proponent of the project has been Keith Dienes of the University of Arizona, who organized its last meeting in Tucson two years ago. Dienes started work as a program manager at NSF last fall. Maybe it’s just a coincidence, but the SVP now has funding through an NSF grant for $150K this year, with the grant paying for bi-annual meetings (of which I guess the KITP one is the first). While “the PIs are proposing a String Vacuum Project (SVP) network with eight geographic nodes”, the grant sponsor is the University of Arizona, where the co-PI for the grant is Shufang Su, a phenomenologist who doesn’t seem to have any history of working on string vacua.

Well, at least stimulus funding is helping get the SVP off the ground…

Update: After looking through some of the workshop talks, it’s very unclear to me what the “String Vacuum Project” actually is. At this point it appears to be just a mechanism for getting the NSF to fund three graduate students working on string phenomenology. From the talk by Michael Douglas you learn that it’s very unclear what a string vacuum even is. It appears to involve an intractably large, unknown space (including e.g. “all six-manifolds”), with an unknown effective potential on it, and with disagreements among practitioners about whether the effective potential is a sensible thing to look at.

Not surprisingly, the discussion session about what the project should be doing was a sad thing to watch. One of the main topics was the SVP Wiki, which people hope to improve. Maybe it’s been moved somewhere else, but the only address I know for it (here) has been down for quite a while.

Update: More discussion showing the current level of understanding (nil) of string vacua here.

Update: There is now a new String Vacuum Project web-site.

Posted in Multiverse Mania | 5 Comments

Oy vey

Last Friday City College held a symposium here in Manhattan celebrating physics at City College. I was able to attend just the morning session, which began with a quick rescheduling of Anton Zeilinger for David Gross, who had overslept. Gross finally did make it and gave a talk on “The Frontiers of Particle Physics”. He says he’s taking bets in favor of supersymmetry being seen at the LHC, with 50/50 odds, and expects first evidence for supersymmetry within a year or two. By the time he got to the part of his slides about string theory he was over time, so he bypassed them, flipping ahead several slides at once.

Unfortunately I seem to have missed the real fireworks, which were at a panel discussion that afternoon. There’s a report at Scientific American, entitled Star physicists trade barbs over cosmological model. Alan Guth was there, promoting the multiverse and the anthropic explanation of the CC. Gross was having none of it:

“In reaction to that last talk—oy vey,”…

Gross called Guth’s concept of eternal inflation somewhat speculative, noting that if other universes do exist, they are causally disconnected from ours—”every goddamn one of them.” As such, Gross added, talk of other universes “does bear some resemblance to talking about angels.”

Posted in Multiverse Mania | 20 Comments

Bohmian Spat

Here’s a story from the boundaries of conventional physics of the sort I normally try to resist paying any attention to, but couldn’t quite help myself this time:

Last week I noticed amongst the e-mail from Jack Sarfatti that clutters my (and many other people’s) mailbox some forwarded messages about a kerfuffle involving the withdrawal of a conference invitation to Brian Josephson. Josephson is a Nobel Prize winner but, on the other hand, he seems to think that this sort of thing makes sense. In one of the messages, I noticed that Josephson defends himself by pointing out that his talks often don’t involve paranormal phenomena, giving as example a recent Hermann Staudinger lecture in Freiburg (Staudinger was a chemistry Nobelist, also my great-uncle).

This mini-scandal has now made it to a Times Higher Education story today, which starts off:

An extraordinary spat has broken out after a Nobel prizewinning physicist was “uninvited” from a forthcoming conference because of his interest in the paranormal.

Details of the conference in August for experts in quantum mechanics sounded idyllic. Participants were due to discuss “de Broglie-Bohm theory and beyond” in the Towler Institute, which is housed in a 16th-century monastery in the Tuscan Alps owned by Mike Towler, Royal Society research fellow at Cambridge University’s Cavendish Laboratory.

Last week, any veneer of serenity was shattered. Conference organiser Antony Valentini, research associate in the Theoretical Physics Group at Imperial College London, wrote to three participants to say their invitations had been withdrawn.

The current situation seems to be that Josephson, David Peat and Jack Sarfatti were un-invited, but now Josephson and Peat have been re-invited.

I had never heard of the Towler Institute before, but it sounds like a beautiful place, which physicist Mike Towler has admirably made available as a site for hosting small meetings and conferences. From the information on its web-site, this looks like the kind of place I’d find it very difficult to turn down an invitation to, no matter what the conference topic.

The conference at issue will be held at the end of the summer, and deals with what is known as “de Broglie-Bohm theory”. One can read about this in many places, including this site of Mike Towler’s. The conference summary itself refers to the de Broglie-Bohm theory’s “fringe nature in modern physics”, and for more about why it is controversial see Towler’s lecture Not even wrong: Why does nobody like pilot-wave theory?.

After spending a little time learning about it many years ago, I quickly decided that I personally didn’t like pilot-wave theory, partly because it seems to me that it throws out all the deep, amazing and experimentally verified links between modern physics and mathematics that motivate what I love about the subjects, getting nothing much in return. I don’t see a good reason to believe that research in this area is going to lead to something interesting, but those who do have every right to keep trying. As they do so, they face serious problems in distinguishing crackpot from non-crackpot efforts, as this story makes very clear. Note that I have no intention of putting any time into this problem myself, so in this case I’m adopting a uniform policy of just deleting all comments arguing for or against de Broglie-Bohm. If that’s a topic you like to argue about, do it elsewhere.

There’s more here from Chad Orzel.

Posted in Uncategorized | 16 Comments

Dark Matters

Initial 2010 data from the Theoretical Particle Physics Jobs Rumor Mill indicates that the particle theory job market remains as trend-driven as ever. This year, it seems that if you want a tenure-track job in the US, you must be working on phenomenology. And not just any sort of phenomenology: your work has to be about dark matter. Of the seven theorists offered tenure-track jobs so far, no fewer than six are phenomenologists working on dark matter. The seventh is Davide Gaiotto, who has been working with Witten and others at the IAS on mathematically quite interesting topics that use N=2 and N=4 supersymmetric gauge theory. His offer is from Stony Brook, where much of the funding comes from Jim Simons of Renaissance Technologies. Simons is putting profits from the world’s most successful hedge fund to work keeping alive the idea that the intersection of mathematics and physics is still worth pursuing, so not everyone has to become a dark matter phenomenologist.

(By the way, the rumor mill seems to indicate that Kachru and Silverstein are leaving the KITP, heading back to Stanford. Is that right?)

If you’re a young theorist who wants to remain in the field, you’d better get to work on dark matter phenomenology. I’m afraid that this blog won’t be of much help; you should carefully follow Resonaances, which has the latest news and rumors.

Update: It seems that my point about the dominance of dark matter hiring has even more backing than I thought, since Sergei Dubovsky evidently has an offer from Stony Brook (and other places). So, that makes it seven out of eight for dark matter so far this year.

Posted in Uncategorized | 28 Comments

Podcasts

In case you’re tired of reading me going on about the same topics and instead would like to listen to me going on about such topics, there are now two new options:

  • A couple weeks ago I did a podcast with the folks at the Rationally Speaking web-site, talking to Massimo Pigliucci and Julia Galef (their podcast site is here, direct link to my segment here). In the near future they’ll be doing a different podcast on the topic of the Anthropic Principle. Pigliucci is a philosopher of science and comments on this here.
  • Last month I visited Collin College in Texas, and they have a podcast up from an interview I did there. The site is here, link to interview here.
Posted in Uncategorized | 4 Comments

Tevatron vs. LHC

The news from Fermilab is that the Tevatron has set a new luminosity record, with a store last Friday that had an initial luminosity of 4.04 x 10^32 cm^-2s^-1, or, equivalently, 404 inverse microbarns/sec. For more about this, see a new posting from Tommaso Dorigo, where someone from Fermilab writes in to comment that they’re not quite sure why this store went unusually well.

Over in Geneva, commissioning of the LHC continues. There, the highest initial luminosity reported by ATLAS is 2.3 x 10^27 cm^-2s^-1, about 200,000 times less than the Tevatron number. I haven’t seen a recent number for the total luminosity delivered by the LHC to the experiments so far, but I believe it’s a few hundred inverse microbarns. The Tevatron is producing more collisions in one second than the LHC has managed in the three weeks since first collisions.

The current plan for the LHC is to devote most of the rest of 2010 to increasing the luminosity of the machine, with the goal of reaching something somewhat lower than the Tevatron luminosity (1-2 x 10^32 cm^-2s^-1) by the end of the year. Then the plan is to run flat out at this luminosity throughout 2011, accumulating data at a rate of about 100 pb^-1/month and ending up with a total of 1 fb^-1. The hope is that this will allow them to be competitive with Fermilab for some measurements, the lower luminosity compensated by the factor-of-3.5 advantage in beam energy that they now enjoy.

The Tevatron has already produced over 8 fb^-1 of data, and the current plan is to run the machine through the end of FY 2011, reaching at least 10 fb^-1, and then shut it down for good. The LHC is supposed to go into a long shutdown throughout 2012, not coming back into operation until 2013. Even if all goes well, it likely will not have accumulated enough data to decisively compete with the Tevatron until late 2013 or 2014. Under the circumstances, it’s hard to believe that there aren’t plans being proposed at Fermilab to keep the Tevatron running for several more years, until 2014. The machine should then be able to end up with a total of 15-20 fb^-1 of data, which could be enough to allow them to see evidence of the Higgs at the 3 sigma level over the entire possible mass range.
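The unit conversions and ratios quoted above are easy to check. Here is a quick sketch (the luminosity numbers are taken from this post; the 10-month figure for a year of running is my own rough assumption):

```python
# Back-of-the-envelope check of the luminosity numbers in this post.
# Unit definitions: 1 barn = 1e-24 cm^2, so 1 inverse microbarn = 1e30 cm^-2.
INV_MICROBARN = 1e30  # cm^-2

tevatron_lumi = 4.04e32  # cm^-2 s^-1, record Tevatron store
lhc_lumi = 2.3e27        # cm^-2 s^-1, best initial ATLAS value so far

# 4.04e32 cm^-2 s^-1 is 404 inverse microbarns per second.
tevatron_ub_per_s = tevatron_lumi / INV_MICROBARN
print(f"{tevatron_ub_per_s:.0f} ub^-1/s")  # 404 ub^-1/s

# The LHC is currently running at a luminosity lower by roughly 200,000x.
ratio = tevatron_lumi / lhc_lumi
print(f"ratio: {ratio:.2g}")  # ratio: 1.8e+05

# Integrated luminosity for 2011: ~100 pb^-1/month over an assumed
# ~10 months of running, with 1 fb^-1 = 1000 pb^-1.
total_fb = 100 * 10 / 1000
print(f"2011 LHC total: {total_fb} fb^-1")  # 2011 LHC total: 1.0 fb^-1
```

The exact ratio comes out near 176,000, consistent with the rounded “200,000 times less” above.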

Shutting down the Tevatron as planned would free up funds for other experiments, and free parts of the accelerator complex for use by neutrino experiments. It will be interesting to see whether, instead, the decision gets made to bid to outrace the LHC to the Higgs and other high-energy-frontier physics over the next few years.

Update: The integrated luminosity seen by the ATLAS detector for the current run so far is about 400 inverse microbarns; CMS has seen about 350 (of which about 300 inverse microbarns have been recorded).

Update: This morning the LHC started delivering collisions with stable squeezed beams to the experiments, with an initial luminosity of 1.1-1.2 x 10^28 cm^-2s^-1.

Update: Integrated luminosity delivered to each experiment at the LHC is now up to around 1000 inverse microbarns.

Posted in Experimental HEP News | 27 Comments

Three Mysteries

There’s a well-known list of high-profile problems in fundamental theoretical physics that have gotten most of the attention of the field during the past few decades (examples would be the problems of quantizing gravity, solving QCD, explaining dark energy, finding a model of dark matter, breaking supersymmetry and connecting it to experiment, etc.). Progress on these problems has been rather minimal, and in reaction one recent trend has been for organizations such as FQXI to promote research into questions that are much more “philosophical” (for instance, they are now asking for grant proposals to study “The Nature of Time”). In this posting I’d like to discuss a different class of problems, ones which I believe haven’t gotten anywhere near the attention they deserve, for an interesting reason.

The three problems share the characteristic of being apparently of a purely technical nature. The argument against paying much attention to them is that, in each case, even if one were to find a satisfactory solution, it might not be very interesting. It’s possible that all one would discover is that the conventional wisdom about these problems (that they’re just “technical” and thus not of much significance) is correct. The argument for paying more attention is that a technical problem may be an indication that we’re doing something wrong, that there is something of significance about the Standard Model that we haven’t yet understood. Achieving this understanding may lead us to the insight needed to successfully get beyond the Standard Model. At the moment all eyes are on the LHC, with the hope that experiment will lead to new insight. Whether this will work out remains to be seen, but in any case it looks like it’s going to take a few years. Perhaps theorists with nothing better to do than wait will want to consider thinking about these problems.

Non-Perturbative BRST

The BRST method used to deal with the gauge symmetry of perturbative Yang-Mills theory does not appear to generalize to the full non-perturbative theory, for a rather fundamental reason. This was first pointed out by Neuberger back in 1986 (Phys. Lett. B 183 (1987) 337-40), who argued that, non-perturbatively, the phenomenon of Gribov copies implies that expectation values of gauge-invariant observables will vanish. I’ve written elsewhere about a different approach to BRST that I’m working on (see here), which is still at a stage where I only fully understand what is going on in some toy quantum-mechanical models. My own point of view is that there are still a lot of very non-trivial things to be understood about gauge symmetry in QFT, and that homological techniques of the BRST sort for dealing with it are of deep significance. Others will disagree, arguing that gauge symmetry is just an unphysical redundancy in our description of nature, and that how one treats it is a technical problem with no physical significance.
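Schematically (this is my compressed paraphrase of the standard argument, not Neuberger’s own notation), the problem is that a BRST-invariant lattice gauge-fixing partition function localizes onto a signed sum over Gribov copies, and the signs cancel:

$$ Z_{\rm gf} \;=\; \sum_{i\,\in\,\text{Gribov copies}} \mathrm{sign}\,\det M_{\rm FP}(U_i) \;=\; 0, $$

where $M_{\rm FP}$ is the Faddeev-Popov operator evaluated at each copy $U_i$. Gauge-fixed expectation values then take the ill-defined form $\langle \mathcal{O} \rangle = 0/0$.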

One reaction to this question is to just give up on BRST outside of perturbation theory as something unnecessary. In lattice gauge theory computations, one doesn’t fix a gauge or need to invoke BRST. However, one can only get away with this in vector-like theories, not in chiral gauge theories like the Standard Model. Non-perturbative chiral gauge theories have their own problems…

Non-perturbative Chiral Gauge Theory

From the early days of lattice gauge theory, it was apparent that chiral symmetry is problematic on the lattice. One way of seeing this is that naively there should be no chiral anomaly on the lattice. The problem was made more precise by a well-known argument of Nielsen and Ninomiya. More recently, it has become clear that one can consistently introduce chiral symmetry on the lattice, at the cost of using fermion fields that take values in an infinite-dimensional space. One such construction is known as “overlap fermions”, which have the crucial property of satisfying relations first written down by Ginsparg and Wilson. This kind of construction solves the problem of dealing with the global chiral symmetry of theories like QCD, but it still leaves unsolved the problem of how to deal with a gauged chiral symmetry, such as the gauge symmetry of the Standard Model.
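For reference, the Ginsparg-Wilson relation that overlap fermions satisfy is (with $a$ the lattice spacing and $D$ the lattice Dirac operator)

$$ \gamma_5 D + D \gamma_5 \;=\; a\, D \gamma_5 D, $$

which reduces to the usual continuum anticommutation relation $\{\gamma_5, D\} = 0$ as $a \to 0$, while allowing an exact lattice chiral symmetry (Lüscher) generated by the modified $\hat\gamma_5 = \gamma_5(1 - aD)$.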

Poppitz and Shang have recently written a nice review of the problem, entitled Chiral Lattice Gauge Theories Via Mirror-Fermion Decoupling: A Mission (im)Possible? They comment on the significance of the problem as follows:

Apart from interest in physics of the Standard Model — which, at low energies, is a weakly-coupled spontaneously broken chiral gauge theory that does not obviously call for a lattice study — interest in strong chiral gauge dynamics has both intensified and abated during the past few decades. From the overview in the next Section, it should be clear that while there exist potential applications of strong chiral gauge dynamics to particle physics, at the moment it appears difficult to identify “the” chiral theory most relevant to particle physics model-building (apart from the weakly-coupled Standard Model, of course). Thus, the problem of a lattice formulation of chiral gauge theories is currently largely of theoretical interest. This may or may not change after the LHC data is understood. Regardless, we find the problem sufficiently intriguing to devote some effort to its study.

In a footnote they compare two points of view on this: that of Creutz, who argues that the question is important since otherwise we don’t know whether the Standard Model makes sense, and that of Kaplan, who points out that if there is some complicated and unenlightening solution to the problem, it won’t be worth the effort to implement.

You can read more about the problem in the references given in the Poppitz-Shang article.

Euclideanized Fermions

Another peculiarity of chiral theories arises when one tries to understand how they behave under Wick rotation. Non-perturbative QFT calculations are well-defined not in Minkowski space but in Euclidean space, with physical observables recovered by analytic continuation. But the behavior of spinors in Minkowski and Euclidean space is quite different, leading to a very confusing situation. Despite several attempts over the years to sort this out for myself, I remain confused, and can’t help suspecting that there is more to this than a purely technical problem. One natural mathematical setting for trying to think about this is the twistor formalism, where complexified, compactified Minkowski space is the Grassmannian of complex 2-planes in complex 4-space. The problem, though, is that thinking this way requires taking holomorphic quantities as basic variables, and how this fits into the standard QFT formalism is unclear. Perhaps the current vogue for twistor methods in the study of gauge-theory amplitudes will shed some light on this.
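One way to state the source of the mismatch concisely (standard facts, recorded here just to pin down what differs between the two signatures): the spin groups are not isomorphic,

$$ \mathrm{Spin}(3,1) \cong SL(2,\mathbb{C}), \qquad \mathrm{Spin}(4) \cong SU(2) \times SU(2), $$

so in Minkowski signature the two Weyl representations are related by complex conjugation, while in Euclidean signature they are completely independent, and the four-dimensional Majorana condition has no Euclidean counterpart. Chiral theories are thus precisely the awkward case for continuation.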

On the general problem of Wick rotation, about the deepest thinking that I’ve seen has been that of Graeme Segal, who deals with the issue in the 2d context in his famous manuscript “The Definition of Conformal Field Theory”. I saw recently that he has given some talks in Europe on “Wick Rotation in Quantum Field Theory”, which makes me quite curious about what he had to say on the topic.

For some indication of why this confusion over Minkowski versus Euclidean spinors remains and doesn’t get cleared up, you can take a look at what happened recently when Jacques Distler raised it in related form on his blog here (he was asking about it in the context of the pure spinor formulation of the superstring). I’m not convinced by his claim that the thing to do is to go to Euclidean space-time variables while keeping Minkowski spinors. Neither is Lubos, and he and Jacques manage to have an argument about this that sheds more heat than light. It ends up with Lubos accusing Jacques of behaving like Peter Woit, which led to his being banned from commenting on the blog. While this all is, as Jacques describes it, “teh funny”, it would be interesting to see a serious discussion of the issue. Since in some sense it is all about how one treats time, perhaps one could get FQXI funding to study the subject.

Update: Lubos Motl has immediately come up with a long posting explaining why these are all non-problems, of concern only to those like myself who are “hopeless students”, “confused by many rudimentary technicalities that prevented him from thinking about serious, genuinely physical topics.” If I would just understand AdS/CFT and Matrix theory, I would realize that gauge symmetry is an irrelevance. Few in the theoretical physics community are as far gone as Lubos, but unfortunately he’s not the only one who thinks that concern with these “technicalities” is evidence that someone just doesn’t understand the basics of the subject.

Posted in Favorite Old Posts, Uncategorized | 76 Comments

A Tear at the Edge of Creation

There’s a new book out this week by Marcelo Gleiser, entitled A Tear at the Edge of Creation. Gleiser blogs at the NPR site 13.7, and that site also has a review of the book from his fellow blogger Adam Frank.

Gleiser started out his professional life as a string theorist, enchanted by the prospect of finding a unified theory, and for many years that motivated his research:

Fifteen years ago, I would never have guessed that one day I would be writing this book. A true believer in unification, I spent my Ph.D. years, and many more, searching for a theory of Nature that reflected the belief that all is one.

Over the years he became disillusioned with this quest, not only with string theory but also with other closely associated ideas (e.g. GUTs and supersymmetry) about how unification is supposed to happen. Hopes that GUTs would give predictive theories of inflation or proton decay have fallen by the wayside, and about supersymmetry he is “very skeptical”:

The fact that the [lightest, stable superpartner] particle has so far eluded detection doesn’t bode well. To make things worse, results from the giant Super-Kamiokande detector in Japan and the Soudan 2 detector in the United States have ruled out supersymmetric GUT models, at least the simpler ones, based again on the proton lifetime. If SUSY is a symmetry of Nature, it is very well hidden.

About string theory itself, Gleiser refers to my book and Lee Smolin’s, and comments that:

Responses from notable string theorists were of course highly critical of the books and their writers. Some were even offensive. I find this sort of dueling pointless. People should be free to research whatever they want, although they should also reflect responsibly on whether their goals are realistic.

Ultimately, Gleiser came to the view that all hopes for a unified theory are a misguided fantasy, and explaining this point of view is the main goal of the book. In an interview on his publisher’s web-site, he says:

After years searching for a “final” answer, as a scientist and as a person, I realized that none exists. The force of this revelation was so intense and life transforming that I felt compelled to share it with others. It completely changed the way I think about Nature and our place in it.

In his book, he argues repeatedly against the fundamental nature of symmetries in our understanding of physics, seeing the failures of GUTs and supersymmetry as failures of the idea of getting unification out of larger, more powerful symmetry laws. For him, symmetries are always just approximations, never exactly true principles. He claims to be more interested in asymmetries, in failures of symmetry laws, seeing in asymmetry a fundamental explanatory principle about the universe and humanity’s role in it.

Personally, I find myself in strong disagreement with him about this, and don’t see much evidence in the book that the abandonment of the search for symmetries that he advocates leads to any positive route to greater understanding of the universe. I agree with him about the failure of the most popular ideas about how to use symmetry to get beyond the Standard Model, but disagree with him about the implications of this.

The problem with both GUTs and supersymmetry is that one posits a new symmetry only to be faced immediately with the question of how to break it, with no good answer. To be successful, any new symmetry principle needs to come with a compelling explanation of how it is to be realized in fundamental physics. A repeated lesson of the development of the Standard Model was that major advances came not only from new symmetry groups, but from unexpected ways of realizing them (e.g. spontaneous symmetry breaking and confinement in gauge theory). I don’t believe that the gauge symmetries of the Standard Model are approximations; rather, they are among our most powerful and fundamental physical principles, and much work remains to be done to understand their full implications. The failures that have discouraged Gleiser have also discouraged many others, and a resulting abandonment of symmetry-based attempts to find a better, more unified fundamental theory would be a shame.

Posted in Book Reviews | 36 Comments

$2 Million For The Nature of Time

FQXI has just announced that it will be awarding another series of large grants, of size $50K-$100K, totaling about $2 million. These grants will be targeted at research into “The Nature of Time.” Initial proposals are due June 14, and grants will start in January 2011. For more details, see here and here.

At this point, I’m a bit curious about where the FQXI money is coming from. They started up in 2006 with a seed grant from the Templeton Foundation of about $8.8 million, and that grant was supposed to finish at the end of last year. Going forward, I haven’t seen any indication of whether they’re still operating on the seed money, or have new money from Templeton or other sources.

Posted in Uncategorized | 1 Comment

Freaky Physics Proves Parallel Universes Exist

Fox News has decided that some recent experimental atomic-physics work showing that quantum mechanics works as expected (for a sane discussion of the science, see here) proves that parallel universes exist and that time travel may be feasible. In an article entitled Freaky Physics Proves Parallel Universes Exist, Fox News writer John Brandon develops this idea with help from Sean Carroll and Fred Alan Wolf (aka Dr. Quantum):

The multi-verse theory says the entire universe “freezes” during observation, and we see only one reality. You see a soccer ball flying through the air, but maybe in a second universe the ball has dropped already. Or you were looking the other way. Or they don’t even play soccer over there.

Sean Carroll, a physicist at the California Institute of Technology and a popular author, accepts the scientific basis for the multi-verse — even if it cannot be proven.

“Unless you can imagine some super-advanced alien civilization that has figured this out, we aren’t affected by the possible existence of other universes,” Carroll said. But he does think “someone could devise a machine that lets one universe communicate with another.”

It all comes down to how we understand time.

Carroll suggests that we don’t exactly feel time — we perceive its passing. For example, time moves fast on a rollercoaster and very slowly during a dull college lecture. It races when you’re late for work . . . but the last few minutes before quitting time seem like hours.

Back to the Future

“Time seems to be a one-way street that runs from the past to the present,” says Fred Alan Wolf, a.k.a. Dr. Quantum, a physicist and author. “But take into consideration theories that look at the level of quantum fields … particles that travel both forward and backward in time. If we leave out the forward-and-backwards-in-time part, we miss out on some of the physics.”

Wolf says that time — at least in quantum mechanics — doesn’t move straight like an arrow. It zig-zags, and he thinks it may be possible to build a machine that lets you bend time.

Update: Matt Springer has a more detailed analysis of the Fox News article, entitled The Worst Physics Article Ever. For some reason his critique skips over the multiverse part, implying that that’s the one part of the article that makes sense…

Posted in Multiverse Mania | 5 Comments