The Little Book of String Theory

Back from New Orleans, and there are now three books I’ve read recently that I’ll try to write reviews of. The first is The Little Book of String Theory, by Princeton’s Steve Gubser. The author has a web-site for the book here, and the introduction is available.

While trying to cover a huge amount of complicated material, the book is quite short, with 162 pages of text, in a small format. Gubser has chosen to deal in a radical manner with the problem of deciding whose work to reference, and whose name to mention in connection with various discoveries. There are no footnotes or end-notes, no bibliography of any kind, and no mention of the names of any string theorists, or any living physicists at all for that matter. The history of the subject pretty much only appears in a few of Gubser’s comments about early parts of his own career.

To somehow counterbalance its main focus on a highly sketchy treatment of an intricate and very abstract subject, the book periodically introduces some very concrete and explicit numerical computations, starting with a first chapter devoted to explaining in detail the equation E=mc2. Unfortunately, none of these calculations have anything at all to do with the topic of the book, string theory. The central section of the book, about branes and duality, contains no such concrete calculations, but instead largely consists of page-long paragraphs recounting in words the intricate structures that occur in this subject. I find it hard to believe that anyone not already familiar with this topic will get much out of this kind of discussion.

Gubser makes intensive use of analogy to try to convey some understanding of the material, and has a fondness for analogies based on his mountain-climbing experience. Here’s an example, based on a climb to the Aiguille du Midi:

The ridge we climbed is famously narrow, heavily trafficked, and snow-covered. For some reason everyone seems to climb it roped up. I’ve never quite approved of the practice of climbing roped when no one is tied to a solid anchor. If one person falls, it’s hard for the others to avoid being pulled off their feet. Usually I think it’s better to trust yourself and climb unroped, or else anchor and belay. But I’ll admit that I climbed the ridge roped up to my climbing partner like everyone else. My partner was a very solid climber, and the ridge isn’t really that tough.

In retrospect, I think that roped teams climbing a narrow ridge provide a good analogy to the Higgs boson, which is one of the things LHC experimentalists hope to discover.

The point here is that the top of the ridge is supposed to be like the unstable maximum at zero of the Higgs potential, but it seems to me that few readers are likely to get much real understanding out of this kind of analogy. Similarly, the chapter on GR and black holes opens with a chilling story about a fall while climbing near Aspen, but it’s hard to see how it adds much to the reader’s grasp of the subtleties of the modern theory of gravitation.

The book is advertised as “a non-technical account of string theory and its applications to collider physics.” The last chapter is about recent attempts to use AdS/CFT as an approximate calculational technique in heavy-ion physics. This is Gubser’s specialty, and he does a good job of giving a hype-free explanation of the state of the subject, for instance:

The second reason why it is tricky to compare a prediction of the gauge/string duality with data is that the string theory computations apply to a theory that is only similar to QCD, not to QCD itself. The theorist has to make some translation between one and the other before he or she has a definite prediction to give an experimentalist. In other words, there’s some fudge. The best attempts to handle this translation honestly lead to predictions for the charm quark’s stopping distance that are either in approximate agreement with data, or perhaps as much as a factor of 2 smaller. A similar comparison can be made for viscosity, and the upshot is that the gauge/string duality produces a result that is either in approximate agreement with data, or perhaps a factor of 2 away from agreement.

While Gubser gives a reasonable account of the heavy-ion collision story, his description of the relation of string theory to the much more interesting question of what happens in proton-proton collisions at the Tevatron or LHC energy frontier is actively misleading hype. What he is really describing is supersymmetry, and while he begins with the arguable:

Supersymmetry predicts many other particles, and if they are discovered, it would be clear evidence that string theory is on the right track.

he then goes on to claim that:

What is exciting is that string theorists are placing their bets, along with theorists of other stripes, and holding their breaths for experimental discoveries that may vindicate or shatter their hopes…

If it [evidence for supersymmetry] is found, many of us would take it as confirmation of superstring theory

There’s no discussion of the issue of the supersymmetry breaking scale, or acknowledgement of the fact that string theory does not at all require this scale to be low enough for superpartners to be observable at LHC energies. The fact of the matter is that string theory makes no predictions at all about what the LHC will see, and Gubser’s claim that string theorists have some sort of LHC prediction they are betting on is just not true. There is no bet here that string theorists can possibly lose: if superpartners are found, they are likely to trumpet this as “confirmation of string theory”, but if not, they’ll fall back on the accurate statement that string theory predicted nothing about this.

Throughout the book, Gubser is on the defensive about the issue of string theory’s lack of predictivity, invoking highly strained and dubious analogies as excuses. One chapter begins with a discussion of Roman history and its effects on our present-day culture. He then argues that our many centuries’ remove from this history is somehow like the way string theory makes predictions at high energies, not low energies. I don’t see the analogy (we have lots of evidence for Romans, none for strings), and in any case the problem with string theory is not that it can’t predict what happens at low energies, but that it can’t predict anything at any energy. In another chapter he compares current string theory unification models to the BCS theory of superconductivity, noting that the BCS theory doesn’t work for high-temperature superconductivity. I’m not sure what to make of this analogy, since BCS is a successful theory while string unification models aren’t. The only point of it seems to be the hope that something new will be discovered experimentally (an analog of high-temperature superconductivity), and that some unknown version of string theory will describe it.

Like pretty much all of his colleagues at Princeton, Gubser wants nothing to do with the multiverse and the anthropic string theory landscape. While he explains the moduli-stabilization problem, the landscape and the multiverse are not discussed, and anthropic argumentation is dismissed with:

Altogether, I find myself unconvinced that this line of argument is useful in string theory.

In the next posting, I’ll write about another new popular physics book, one that I think is much better and much more readable, although it takes the West Coast multiverse interpretation of string theory as gospel, ignoring the views of Gubser and his Princeton colleagues.

Posted in Book Reviews | 7 Comments

Completely Off-Topic

I’m heading off to New Orleans tomorrow morning, will be there for Mardi Gras, back next Wednesday. Light to no blogging for the duration.

Since I’m already off-topic, I can’t resist promoting my friend Alexei Karamazov (aka Mark Ettinger)’s show, which opened last night here in New York at the Minetta Lane theater. The Flying Karamazov Brothers travel the world with a wonderful show that could best be described as demented vaudeville featuring some of the best juggling around. They’re here in New York City for the first time in many years, through March 7. For one of the first reviews, see here. Some of their previous shows featured references to string theory, this one doesn’t. Go to their web-site to learn more, then go buy tickets and help me recoup some of the money I put up as an investor to help make this happen…

Posted in Uncategorized | Comments Off on Completely Off-Topic

String Theory for Undergraduates at Brown

A few years ago various US universities decided it was a good idea to offer a course on string theory for undergraduates (see here), but in recent years most of these seem to have been dropped from the curriculum. Brown University is going in the other direction, offering Physics 1970C, String Theory for Undergraduates, this semester. A report from a Brown undergraduate on Lubos’s blog gives me some encouragement to continue blogging:

Life is carrying on naturally. In fact, if I hadn’t been reading eg woit’s blog, I would’ve suspected we’re still in the middle of a stringy revolution! We even just started a new string theory course for undergrads, and I and quite a few other undergrads held a string theory seminar. Interest in stuff like LQG is completely zero. So in the press you have woit, smolin blahblahblahing, ok. But in the meantime, you have ads/cft, and the whole twistor reformulation of yang mills in terms of contour integrals over grassmannians (inspired by twistor string theory). Even condensed matter physicists accept string theory as one of the greatest things that happened to physics.

The undergraduates at Brown have a String Theory Study Group, Facebook group here.

Update: The undergraduate string theory courses are now facing some competition. LSU is offering an undergraduate Introduction to Loop Quantum Gravity.

Posted in Uncategorized | 48 Comments

LHC Update

A new schedule for operation of the LHC is out. It has sector tests of injection into the LHC starting the evening of Feb. 17, circulating beams again around Feb. 22, about 6 weeks for beam commissioning, then physics starting April 5. On October 18 the LHC would be stopped for two weeks to set up ion beams, which would then run for four weeks, with an end-of-year stop starting Nov. 29.

Last year there was an estimate of about 10 days to establish collisions at 3.5 TeV/beam, but the latest estimate is more conservative, about 25 days. So, at the earliest, first collisions probably come around March 4, more likely around March 19. Complicating the matter is CERN’s plan to have first collisions broadcast worldwide on LHC First Physics Day, which is supposed to be a mid-week day announced a week in advance. So, the beam commissioning team is going to have to first get to the point where collisions are possible, then spend the next week of work being very careful to avoid stray collisions that one of the experiments might pick up and someone might blog about…
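For what it’s worth, the date arithmetic here is easy to check, taking circulating beams to restart on Feb. 22 (which is of course only an approximate date):

```python
from datetime import date, timedelta

restart = date(2010, 2, 22)  # circulating beams restart (approximate)

# Last year's estimate: ~10 days of commissioning to first 3.5 TeV collisions
optimistic = restart + timedelta(days=10)
# The latest, more conservative estimate: ~25 days
conservative = restart + timedelta(days=25)

print(optimistic)    # 2010-03-04
print(conservative)  # 2010-03-19
```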

CMS collaboration members were asked for their best estimates of when first collisions would occur, with results plotted here. Lots of optimists voted for around March 1, but the date that got the most votes was April 1.

Posted in Experimental HEP News | 3 Comments

Expanding Crackpottery

Lubos Motl is getting rather concerned (yes, I know about what pops up when I link to his blog…) about recent trends in theoretical physics, especially the implications of recent work (discussed here) of well-known string theorist Erik Verlinde. He claims that many other string theorists share his concern:

The people whose knowledge and opinions about physics are close to mine are finally beginning to realize the worrisome trends affecting the quality and character of the research in theoretical physics. I have a significant number of e-mail exchanges with these folks – and let me assure everyone that you’re not alone.

In many ways I also share Lubos’s concern. Like lots of people, over the years I’ve been deluged with examples of what I’ll call “unconventional physics”, in a spectrum ranging from utter idiocy to serious but flawed work. Much of it shares the all-too-common feature of making grandiose claims for new understanding of fundamental physics, based on vague ideas that often use not much more than a few pieces of high-school level physics and mathematics. The beautiful and deep physical and mathematical ideas that go into the Standard Model are ignored or thrown out the window. In many cases, it’s hard not to suspect that the authors have decided that they can replace modern physics without bothering to learn what it is. This kind of thing is pretty easy to quickly identify and decide to ignore, and it ends up having no impact on the scientific research community.

In recent years though, some theorists who definitely understand and have made contributions to modern physics have started promoting research which looks depressingly like the typical sad examples of “unconventional physics”. Many of the products of the ongoing multiverse mania fit into this category. Lubos is getting quite worried to see that a very talented and well-known leader of the string theory community, Erik Verlinde, seems to be engaging in this sort of research, and getting positive attention for it. Within a month of its appearance, Verlinde’s “Entropic Force” paper has already generated a dozen or so preprints from other physicists on the same topic. It could easily end up being the most influential (in the sense of heavily referenced) paper of 2010. Seeing this coming from a string theorist he admires is worrying Lubos and his correspondents.

While I agree with Lubos that this is something worth worrying about, his interpretation of the problem is characteristically irrational. In his posting, he argues that this is all due to the influence of the “notorious crackpots” Lee Smolin and Peter Woit. I don’t see how I’m supposed to be responsible for prominent string theorists taking up dubious lines of research I strongly disagree with, other than perhaps having some responsibility for driving them over the edge. In any case, Lubos concentrates his attack on Lee Smolin, arguing that he’s the one mainly responsible for this, an idea which is completely absurd. While Smolin is surely more sympathetic than I am to research like that of Verlinde, he’s a serious scientist and not one with a lot of influence over Verlinde and the string theory community. Lubos’s argument that this is all a left-wing plot organized by the far-left radical hippie Smolin is just laughable. One merciful thing about the string wars always was that positions people took were uncorrelated with their political ideology, keeping politics out of it.

Unlike Lubos though, I’m not convinced that I understand what the source of the problem is. My diagnosis of the current state of the field remains what it was when I wrote my book quite a few years ago: the lack of relevant experimental data coupled with the faddish pursuit of a failed idea about unification has led to a disturbing situation. In a very deep sense though, I just don’t understand why talented physicists react to this by engaging in things like anthropic string landscape research, or vague arguments about “entropic forces”. Lubos is right to notice that this situation has recently become more disturbing. A debate about the causes of this involving people more sober than Lubos would be a good idea. Twenty-some years of string theory hype in the scientific literature and popular press did a lot of damage, and if this gets replaced by hype of ideas even more dubious than string theory unification, things will go from bad to worse. Maybe the LHC will save us, but if this is what it takes, it looks like we’re stuck for a few more years.

Posted in Uncategorized | 29 Comments

Various and Sundry

  • Now that the plan for running the LHC over the next few years is in place, one can start to get an idea of what new physics might emerge from it between now and 2013. For the question of the Higgs, Tommaso Dorigo does some analysis here, going back to 1999 Tevatron projections to see how reliable they were. He concludes that the 1999 projections were accurate for the mass range above 135 GeV. Below that, they depended on assuming a silicon detector upgrade that was never funded. His bottom line is that he sees the Tevatron as ultimately able to rule out the Higgs at 95% confidence level over the entire relevant mass range, but unable to come up with convincing evidence of its existence if it is in the lower part of this mass range. For this, the LHC will be required, but this will have to be after the move to higher energy in 2013:

    The LHC experiments will be unable, in my opinion, to make up in two years of data taking, and with the 3.5 times larger energy, for the 8-year advantage in running time of the Tevatron. The Higgs boson will be unlikely to be discovered before 2013, and it will probably be a sole LHC business; however, until then the Tevatron will retain the better results as far as the mass exclusion range is concerned.

  • Operating on a different reality plane is Michio Kaku:

    “We’re beginning to test string theory with the large Hadron collider outside Geneva, Switzerland, costing ten billion euros, the most expensive machine that science ever created. That’s what I do for a living,” said Kaku in a recent conference call interview from New York.

    This is from a story mainly about Kaku’s new TV show on the Discovery Channel, accurately entitled Fact or Fiction? Physicist Dr. Michio Kaku blurs the line between science and science fiction.

  • NPR has recently started up a project called 13:7 Cosmos and Culture. It’s a blog “set at the intersection of science and culture.” Unfortunately, NPR’s conception of the intersection of physics and culture is occupied by Stuart Kauffman, who has a series of posts arguing that the physical universe cannot be described by physical laws (see here and here). In the most recent one, Kauffman takes up the complicated subject of decoherence and the emergence of classical behavior in quantum systems, and claims to have (inspired by Karl Popper) an argument based on special relativity showing that decoherence cannot be described by any fundamental law of physics. This is supposedly experimentally testable:

    As it happens these ideas may have testable consequences, for they should be more marked as the relative velocities of the event A and one or two receding detectors increase toward the speed of light. And, since quantum decoherence is easier if the quantum processes in the “environment” are locally abundant, they should be more visible in that case. These are testable consequences of Popper’s original idea and my use of it with credit.

    I hope the experiments are done.

    For more about all this, Kauffman refers to his article here from the Edge web-site, where he argues that the brain is “quantum coherent”, and:

    Reversibility of the coherent to decoherent-classical to recoherent quantum states are essential to my hypothesis for I wish the brain to be undergoing such reversible transformations all the time.

    He gets around problems with time-scales by noting that:

    The time scale of neural activities is a million times slower, in the millisecond range. But it takes light on the order of a millisecond to cross the brain, so if there were a dispersed quantum decohering-recohering mind-brain, reaching the millisecond range is probably within grasp of a quantum theory of the mind-brain system.

    I suppose it is true that it might take light a millisecond to cross one’s brain, if one’s brain were about 200 miles across…

  • Normally I don’t think I can ethically post gossip about mathematicians’ love lives here, but once it has already appeared in the media…
  • Some ex-colleagues from here at Columbia are among those launching the Journal of Unpublishable Mathematics. From what I hear, they haven’t yet published anything, but have had nominations.
  • Last week the algebraic geometer Eckart Viehweg passed away at the age of 61. His wife Helene Esnault is also an algebraic geometer, and recently posted an article on the arXiv based on joint work, with a heart-breaking abstract.
Posted in Uncategorized | 14 Comments

Particle Theory Job Market

Erich Poppitz has updated his statistics on the high energy theory job market to include data from 2009. He counts hirings to tenure-track faculty jobs, using data from the Theoretical Particle Physics Jobs Rumor Mill. For 2009, out of 12 hires listed on the Rumor Mill, he counts 9 as in high-energy theory, 3 as cosmologists. Of the 9 high energy theorists, he counts 7 as in phenomenology, 2 as in string theory. I’m not sure exactly who he is counting as a string theorist, probably Easson (string cosmology) and either Elvang (now working on QFT amplitudes) or Shih (supersymmetry breaking). It appears that it is now essentially impossible to get a permanent job in a physics department if you’re working on the more formal end of string theory (or string phenomenology, for that matter). You pretty much have to work in cosmology or phenomenology to have some sort of job prospects.

The academic job market in general in the US is in a terrible state, and this is reflected in the change from an average of around 20 hires per year in recent years to 9 in 2009. It looks like the situation won’t be any better for 2010. The imbalance between the large number of new PhDs and postdocs and the very few permanent jobs is quite remarkable. According to the postdoc rumor mill, this year already 8 people have accepted postdocs in Princeton, at the university and the IAS, making this small segment of the community large enough to fill almost all the available permanent jobs.

The US economy remains on its knees due to the economic crisis triggered by the blow-up of debt instruments, especially those designed by quants who often come from a physics background. Luckily for physics PhDs who now have no hope for a job in academia, what I hear from my financial industry friends is that, unlike the rest of the economy, their companies are doing quite well, embarking on new rounds of hiring.

Posted in Uncategorized | 43 Comments

LHC Update, More

According to John Conway, the decision coming out of Chamonix is to go with the first of the two scenarios described here: stay at 3.5 TeV/beam, then a long shutdown to fix all the splices. The idea is to run at 3.5 TeV during 2010 and 2011, stopping for shutdown either when 1 fb-1 has been accumulated, or at the end of 2011, whichever comes first. The LHC will thus be off throughout 2012, coming back in 2013 for a run at or near the design energy of 7 TeV/beam.

With the Tevatron counting on having around 12 fb-1 of data at 1 TeV/beam by October 2011, it should remain competitive with the LHC for many sorts of searches, including the search for the Higgs, for much longer than expected. This should be true for more than three years, until after the LHC has accumulated a significant amount of data at full energy in 2013. The current planning is for Tevatron operation only through FY2011; I wonder whether this will change…

Update: Science has a story from Adrian Cho here. The D0 co-spokesperson says the decision on running the Tevatron in 2012 “won’t have to be made for several months.” CERN experimenters are quoted as saying that they will still be searching for supersymmetry and extra dimensions. I haven’t seen any studies of exactly what 1 fb-1 of 7 TeV collisions will make possible in terms of doing better than Tevatron limits on such processes and on the Higgs.

Posted in Experimental HEP News | 5 Comments

Are There Cosmic Microwave Anomalies?

No.

The WMAP team has just released a new set of papers based upon seven years of data from their experiment. For a summary of how this new data has sharpened some of their previous results, see the Cosmological Interpretation paper. They have also gone over claims by many groups to have found deviations from the standard cosmological model in their earlier data sets (for example, claims to have found “the unmistakable imprint of another universe” which “points to string theory being on the right track”). In a paper entitled Are There Cosmic Microwave Background Anomalies?, the WMAP team reports:

In most cases we find that claimed anomalies depend on posterior selection of some aspect or subset of the data… We examine several potential or previously claimed anomalies in the sky maps and power spectra, including cold spots, low quadrupole power, quadrupole-octupole alignment, hemispherical or dipole power asymmetry, and quadrupole power asymmetry. We conclude that there is no compelling evidence for deviations from the LCDM model

They give a humorous example of the problem that plagues typical claims to have found such anomalies, showing that the CMB sky map clearly contains the initials of Stephen Hawking, “aligned neatly along a line of fixed Galactic latitude.”

Update: For the WMAP team’s summary of its new results for the public, see here.

Posted in Uncategorized | 13 Comments

LHC Update

Those responsible for the LHC machine are having their yearly meeting this week in Chamonix to discuss the state of the project and plans for the future. Last week a subgroup met to discuss plans for beam commissioning to 3.5 TeV/beam, starting next month. The current schedule envisages beam commissioning restarting around February 19, and the best estimate is that it will take about a month to establish safe, stable 3.5 TeV beams and begin extended runs for physics purposes. There’s a plan for a big media event when first collisions are achieved at 3.5 TeV/beam, something that may require discouraging experiments from announcing observation of high-energy collisions that happen before the planned moment (evidently this is what occurred last year, when ATLAS saw 1.18 TeV/beam collisions before it was supposed to…).

This year’s schedule includes a possible one-month stop mid-year to increase the beam energy from 3.5 to 5 TeV, but based on the discussions at Chamonix, this looks very unlikely. The most serious problem with the LHC remains the bad splices known to exist in the machine, as well as sectors where definitive measurements of all the splices have not been possible (they would require warming up the sector, causing delays of months). The current knowledge of the splices leaves no room for error, even at 3.5 TeV, and going to 5 TeV would require warming up parts of the machine, something which cannot be done during a one-month stop.

Discussions are beginning about how long a stop for repairs should be planned for after this year’s run ends in November. Running at 5 TeV/beam will probably require keeping the machine off until May 2011 to fix splices. Going to the design energy of 7 TeV/beam may require even more extensive work on the splices, work that could keep the machine off for all of 2011, with startup again in 2012. To get above 5 TeV/beam, work also needs to be done on retraining the magnets through repeated quenches. Not much of this would be needed to get to 6.5 TeV/beam, but to go all the way to 7 TeV, problems that are still not understood with magnets from one manufacturer will have to be addressed.

Update: From the Chamonix summary talk, there are two main scenarios now being considered. In the first, the energy of the machine would stay at 3.5 TeV/beam this year and next, with 0.1-0.5 fb-1 of integrated luminosity in 2010 and 1 fb-1 in 2011, then a year-long shutdown in 2012 to fix all splices before moving to 6.5-7 TeV/beam. In the second, splices would be fixed in stages, with running for only 5 months in 2011, at 5 TeV/beam, with 1 fb-1 of integrated luminosity.

There will be a summary session at CERN next Friday.

Posted in Experimental HEP News | 13 Comments