Updates

Some new items mostly updating older ones:

  • Natalie Wolchover has a very good article at Quanta, entitled A Fight for the Soul of Science, reporting on the recent Munich conference discussed here. David Gross sounds a little bit like John Horgan, emphasizing the problem of HEP physics getting too difficult, with an “End of Science” danger. I think he has the problem right:

    “The issue in confronting the next step,” said Gross, “is not one of ideology but strategy: What is the most useful way of doing science?”

    I hadn’t realized quite how radical Dawid is. He seems to have moved on from discussing theory “assessment” to theory “confirmation”. Even the most devoted string theorists, like Gross, may be unwilling to sign on to this: they are comfortable with the idea that string theory deserves a positive assessment as a promising idea still worth working on, much less so with Dawid’s claim that one can sensibly discuss string theory as a “confirmed” theory, one that belongs in our school textbooks.

    There was evidently much discussion of Bayesian confirmation theory, and Gross was enthusiastic about it:

    Gross concurred, saying that, upon learning about Bayesian confirmation theory from Dawid’s book, he felt “somewhat like the Molière character who said, ‘Oh my God, I’ve been talking prose all my life!’”

    He may have become a bit less enthusiastic later, when faced with Joe Polchinski’s Bayesian calculation giving a 94% probability for the multiverse.

    Sabine Hossenfelder and Carlo Rovelli both explained well the danger of such claims of non-empirical Bayesian confirmation of one’s ideas:

    The German physicist Sabine Hossenfelder, in her talk, argued that progress in fundamental physics very often comes from abandoning cherished prejudices (such as, perhaps, the assumption that the forces of nature must be unified). Echoing this point, Rovelli said “Dawid’s idea of non-empirical confirmation [forms] an obstacle to this possibility of progress, because it bases our credence on our own previous credences.” It “takes away one of the tools — maybe the soul itself — of scientific thinking,” he continued, “which is ‘do not trust your own thinking.’”

  • For more on the “non-empirical science” front, see Alan Lightman’s long piece in Harper’s on “Quantum Cosmologists” Sean Carroll, Alexander Vilenkin, James Hartle and Don Page. Lightman waxes poetic on the importance of “The need to ask the really big questions”, but unless I missed it, there’s nothing there about the need to provide any evidence for the answers that one comes up with. At least one of the four quantum cosmologists, Don Page, seems to see no particular distinction between theology and science.
  • On the Mochizuki front, there’s this report in Nature about the Oxford conference. On the question of what went wrong with the later talks:

    But Conrad and many other participants say they found the later lectures indigestible. Kim counters that part of the difficulty lay in cultural differences: Japanese mathematicians have a more formal style of lecturing than do those in the West and they are not as used to being questioned by a testy audience, he says.

    I don’t think the “Japanese culture” explanation of the problem holds water. Of the three problematic speakers, one (Mok) is not Japanese at all (he’s from Hong Kong), and the other two certainly understand that explaining mathematics to someone is about more than reading a lecture to an audience. It’s not plausible that the reason they didn’t have satisfactory answers to questions from the audience is their Japanese cultural background. As for Mochizuki himself, he grew up here in New York, and went to prep school, college and graduate school in the US. The question of why following him is so difficult is a fascinating one, but I don’t think the answer has anything to do with his choice to move to Japan.

    A must-read detailed report on the situation is Brian Conrad’s, available here.

Update: For some other commentary on the Munich workshop and relevant issues, see Sabine Hossenfelder and my Columbia colleague Andrew Gelman (who I think, unlike anyone in the theoretical physics community, actually knows something about Bayesian methods).

Update: Fesenko has a comment at the Nature article, where he makes the claim: “There are no questions about the theory which are left unanswered.” I agree with my Columbia colleague David Hansen’s response: this seems to be an absurd statement.

Update: There’s a very good report on the abc conjecture workshop from Kevin Hartnett at Quanta. His take on what happened agrees with others:

Kedlaya’s exposition of Frobenioids had provided the assembled mathematicians with their first real sense of how Mochizuki’s techniques might circle back to the original formulation of Szpiro’s conjecture. The next step was the essential one — to show how the reformulation in terms of Frobenioids made it possible to bring genuinely new and powerful techniques to bear on a potential proof. These techniques appear in Mochizuki’s four IUT theory papers, which were the subject of the last two days of the conference. The job of explaining those papers fell to Chung Pang Mok of Purdue University and Yuichiro Hoshi and Go Yamashita, both colleagues of Mochizuki’s at the Research Institute for Mathematical Sciences at Kyoto University. The three are among a small handful of people who have devoted intense effort to understanding Mochizuki’s IUT theory. By all accounts, their talks were impossible to follow.

There’s also a report now from Fesenko, available here. His take on this is that the problem wasn’t the talks; it was the audience:

Labor omnia vincit [work conquers all]. Progress in understanding the talks correlated with preparation efforts for the workshop. There were participants who came unprepared but were active in asking questions, many of which were already answered in recommended surveys and some of which were rather puerile.

It’s unclear what the point of such remarks is, unless the goal is to make sure that many of the experts who attended the workshop won’t come to any more of them.

Update: The paragraph from Fesenko’s report on the workshop quoted above has been removed, replaced by:

Без труда не выловишь и рыбку из пруда [without effort, you won’t even pull a fish out of the pond]. Progress in understanding the talks correlated with preparation efforts for the workshop. Lack of reading of non-classical papers of the theory often correlated with the depth of asked questions.


Update: Nature covers the Munich conference as Feuding physicists turn to philosophy for help.

Update: The Fesenko report has been further edited, with the paragraph mentioned above now reading:

Без труда не выловишь и рыбку из пруда [without effort, you won’t even pull a fish out of the pond]. According to the feedback, progress in understanding the talks and quality of questions often correlated with preparation efforts for the workshop and reading of non-classical papers of the theory.

Update: Michael Harris’s take on the press coverage of the Oxford conference is here.

Posted in Multiverse Mania | 70 Comments

Run 2 and SUSY

What surprised me most about today’s Run 2 results (see here) was that CMS and ATLAS were able to already significantly push up limits on superpartner masses, especially the gluino mass. Limits on the gluino mass went from 1.3-1.4 TeV in Run 1 to something like 1.6-1.8 TeV in the new Run 2 data (this depends on exactly what channels one is looking at). This not only kills off Gordon Kane’s string theory prediction of a 1.5 TeV gluino, but it also removes a large chunk of the remaining possible mass region that the LHC will be able to access. And it wasn’t just the gluino: ATLAS quoted limits on sbottom masses moving up from 650 GeV in Run 1 to 850 GeV today. Whatever you thought the remaining probability was for SUSY after the negative Run 1 results, it’s significantly smaller today.

Almost all the news has been about the possible diphoton excess, ignoring the quite significant story about SUSY. Davide Castelvecchi at Nature, though, today talked to Michael Peskin, who has been one of the more consistent proponents of SUSY over the years, and this was part of his story:

Meanwhile, searches for particles predicted by supersymmetry, physicists’ favourite extension of the standard model, continue to come up empty-handed. To theoretical physicist Michael Peskin of the SLAC National Accelerator Laboratory in Menlo Park, California, the most relevant part of the talks concerned the failure to find a supersymmetric particle called the gluino in the range of possible masses up to 1,600 GeV (much farther than the 1,300-GeV limit of Run 1). This pushes supersymmetry closer to the point where many physicists might give up on it, Peskin says.

I had thought that the “physicists give up on SUSY” story wouldn’t get going until next year, but maybe it’s already started.

Update: Just a few hours after the announcement, there are already 10 papers on hep-ph devoted to explaining the diphoton resonance. SUSY explanations are not among the popular ones.

Update: Another eight or so papers explaining the diphotons have appeared. And the press has the obvious explanation: string theory:

The idea seems to be that since people were looking for Randall-Sundrum gravitons (which somehow counts as string theory) then if they find something in the diphoton spectrum it could be a graviton. I’m no expert, but none of the dozens of hep-ph papers seems to discuss this possibility, and the papers about searches for Randall-Sundrum gravitons (like this one) set limits way above a TeV. On the other hand, I don’t doubt that some “string vacuum” can be found that will explain the diphotons, and that we’ll hear more about it in the press.

Posted in This Week's Hype, Uncategorized | 16 Comments

LHC Run 2 First Results

First results using the full data from Run 2 at 13 TeV will be presented tomorrow at CERN at 15:00 Geneva time, with a live webcast available here. For some relevant commentary, see Tommaso Dorigo and Matt Strassler.

Among relatively reliable rumor sources, Jester is tweeting about a supposed excess at 750 GeV in the diphoton spectrum. We’ll see tomorrow, but the problem with this is that it would be hard to understand why such a thing didn’t show up in Run 1 at 8 TeV. Tommaso explains why it is only at higher masses that one expects the Run 2 data to be competitive with that from Run 1, and suggests that what to look for is a 2 TeV excess in the dijet spectrum, since there were already hints of such a thing in the Run 1 data.

Matt describes 13 TeV results recently published by the ATLAS and CMS groups looking for exotic behavior at very high mass (predicted for instance by various theories of extra dimensions). Nothing there.

One other thing to look for is whether Gordon Kane will get his Nobel Prize. He’s predicting a 1.5 TeV gluino, with current limits around 1.4 TeV, and this year’s Run 2 data perhaps enough to push those up a bit. It may be, though, that such analyses will be among those that take longer, not appearing until the Moriond conference in March.

Update: CMS went first, results now publicly available here. Tommaso was pulling our leg; the 2 TeV Run 1 excesses have gone away. There is a diphoton excess at 766 GeV, but an unimpressive one (2.6 sigma locally, 1.2 sigma with the look-elsewhere effect).
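For those unfamiliar with how a 2.6 sigma local excess deflates to 1.2 sigma global, here’s a rough sketch of the usual trials-factor logic. The trials factor below is back-solved from the quoted numbers purely for illustration; it is not taken from the actual CMS analysis:

```python
from scipy.stats import norm

# Rough look-elsewhere logic: the local p-value gets inflated by an
# effective number of independent mass bins where a bump could appear.
z_local = 2.6                        # quoted local significance
p_local = norm.sf(z_local)           # one-sided local p-value, ~0.0047

n_trials = 25                        # hypothetical trials factor (back-solved)
p_global = 1 - (1 - p_local) ** n_trials
z_global = norm.isf(p_global)        # ~1.2 sigma, matching the quoted value

print(f"local: {z_local} sigma, global: {z_global:.1f} sigma")
```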

Gluino mass limits have moved up, some as high as 1.7 TeV. Presumably Kane is now at work on new string theory predictions.

Bottom line: nothing beyond the SM so far.

Update: ATLAS next. No gluinos up to 1.8 TeV. 2.2 sigma for the 2 TeV excess that CMS doesn’t see.

They also see an excess in diphotons around 750 GeV: 3.6 sigma local significance, 1.9 sigma with look-elsewhere. So this starts to look interesting if combined with CMS; the rumor was right. They also reanalyzed the Run 1 data: nothing there at 750 GeV, and no combination of Run 1 and Run 2.

Results available here.

Bottom line: the only interesting thing is the possible 750 GeV diphoton excess. One can expect a flood of theory papers with models predicting such a thing. We’ll have to wait until at least next summer, though, to see if this gets confirmed or goes away.

Commentary from Matt Strassler here.

Update: As expected, best explanation and discussion of the implications of the diphoton excess is from Jester, see here.

Reasons to be excited: naively combining CMS and ATLAS gives something of 4 sigma significance, people are making the analogy with the early Higgs signal. Reasons to be less excited: in the case of the early Higgs signal, the tentative signal was what was expected from the Higgs, and we had very good reasons to believe there was a Higgs roughly in that mass range. Here I know of no well-motivated models that predict this: extraordinary claims require extraordinary evidence, and this is not that.
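As for what “naively combining” means here, a minimal sketch, under the assumption that one simply adds the two independent local significances in quadrature (a real combination would use the full likelihoods of both experiments):

```python
import math

# Naive combination of two independent local significances in quadrature.
# This is only an assumption about what "naively combining" means here.
cms_sigma = 2.6     # CMS local significance at ~766 GeV
atlas_sigma = 3.6   # ATLAS local significance at ~750 GeV

combined = math.hypot(cms_sigma, atlas_sigma)   # sqrt(2.6^2 + 3.6^2)
print(f"naive combined significance: {combined:.1f} sigma")  # ~4.4 sigma
```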

Posted in Experimental HEP News | 13 Comments

White Smoke Over Oxford?

I’ve stolen the title of this posting from Michael Harris, see his posting for a discussion of the same topic.

A big topic of discussion among mathematicians this week is the ongoing workshop at Oxford devoted to Mochizuki’s claimed proof of the abc conjecture. For some background, see here. I first wrote about this when news arrived more than three years ago, with a comment that has turned out to be more accurate than I expected: “it may take a very long time to see if this is really a proof.”

While waiting for news from Oxford, I thought it might be a good idea to explain a bit how this looks to mathematicians, since I think few people outside the field really understand what goes on when a new breakthrough happens in mathematics. It should be made clear from the beginning that I am extremely far from expert in any of this mathematics. These are very general comments, informed a bit by some conversations with those much more expert.

What I’m very sure is not going to happen this week is “white smoke” in the sense of the gathered experts there announcing that Mochizuki’s proof is correct. Before this can happen a laborious process of experts going through the proof looking for subtle problems in the details needs to take place, and that won’t be quick.

The problem so far has been that experts in this area haven’t been able to get off the ground, taking the first step needed. Given a paper claiming a proof of some well-known conjecture that no one has been able to prove, an expert is not going to carefully read from the beginning, checking each step, but instead will skim the paper looking for something new. If no new idea is visible, the tentative conclusion is likely to be that the proof is unlikely to work (in which case, depending on circumstances, spending more time on the paper may or may not be worthwhile). If there is a new idea, the next step is to try and understand its implications, how it fits in with everything else known about the subject, and how it may change our best understanding of the subject. After going through this process it generally becomes clear whether a proof will likely be possible or not, and how to approach the laborious process of checking a proof (i.e. which parts will be routine, which parts much harder).

Mochizuki’s papers have presented a very unusual challenge. They take up a large number of pages, and develop an argument using very different techniques than people are used to. Experts who try and skim them end up quickly unable to see their way through a huge forest of unrecognizable features. There definitely are new ideas there, but the problem is connecting them to known mathematics to see if they say something new about that. The worry is that what Mochizuki has done is create a new formalism with all sorts of new internal features, but no connection to the rest of mathematics deep enough and powerful enough to tell us something new about that.

Part of the problem has been Mochizuki’s own choices about how to explain his work to the outside world. He feels that he has created a new and different way of looking at the subject, and that those who want to understand it need to start from the beginning and work their way through the details. But experts who try this have generally given up, frustrated at not being able to identify a new idea whose implications for the mathematics they know are powerful enough to make the effort worthwhile. Mochizuki hasn’t made things easier with his decision not to travel to talk to other experts; most of the activity of others talking to him and trying to understand his work has taken place locally in Japan, in Japanese, with little coming out of it in a form accessible to others.

It’s hard to overstate how incredibly complex, abstract and difficult this subject is. The number of experts is very small and most mathematicians have no hope of doing anything useful here. What’s happening in Oxford now is that a significant number of experts are devoting the week to their best effort to jointly see if they can understand Mochizuki’s work well enough to identify a new idea, and together start to explore its implications. The thing to look for when this is over is not a consensus that there’s a proof, but a consensus that there’s a new idea that people have now understood, one potentially powerful enough to solve the problem.

About this, I’m hearing mixed reports, but I can say that some of what I’m hearing is unexpectedly positive. It now seems quite possible that what will emerge will be some significant understanding among experts of a new idea. And that will be the moment of a real breakthrough in the subject.

Update: Turns out the “unexpectedly positive” was a reaction to day 3, which covered pre-IUT material. Today, when things turned to the IUT stuff, it did not go well at all. See the link in the comments from lieven le bruyn to a report from Felipe Voloch. Unfortunately it now looks quite possible that the end result of this workshop will be a consensus that the IUT part of this story is just hopelessly impenetrable.

Update: Brian Conrad has posted here a long and extremely valuable discussion of the Oxford workshop and the state of attempts to understand Mochizuki’s work. He makes clear where the fundamental problem has been with communication to other mathematicians, and why this problem still remains even after the workshop. The challenge going forward is to find a way to address it.

Posted in abc Conjecture | 37 Comments

Today’s Hype

Joe Polchinski’s contribution to the ongoing Munich meeting has now appeared on the arXiv, with the title String Theory to the Rescue. Evidently he won’t actually be at the meeting; I’m not sure how his paper will be presented.

It’s pretty much the usual hype about string theory and the multiverse, with untestable ideas about quantum gravity the only topic. The one innovation is that it contains a calculation: the probability of a multiverse is 94%:

To conclude this section, I will make a quasi-Bayesian estimate of the likelihood that there is a multiverse. To establish a prior, I note that a multiverse is easy to make: it requires quantum mechanics and general relativity, and it requires that the building blocks of spacetime can exist in many metastable states. We do not know if this last is true. It is true for the building blocks of ordinary matter, and it seems to be a natural corollary to getting physics from geometry. So I will start with a prior of 50%. I will first update this with the fact that the observed cosmological constant is small. Now, if I consider only known theories, this pushes the odds of a multiverse close to 100%. But I have to allow for the possibility that the correct theory is still undiscovered, so I will be conservative and reduce the no-multiverse probability by a factor of two, to 25%. The second update is that the vacuum energy is nonzero. By the same (conservative) logic, I reduce the no-multiverse probability to 12%. The final update is the fact that our outstanding candidate for a theory of quantum gravity, string theory, most likely predicts a multiverse. But again I will be conservative and take only a factor of two. So this is my estimate for the likelihood that the multiverse exists: 94%.
This is not to say that the multiverse is on the same footing as the Higgs, or the Big Bang. Probability 94% is two sigma; two sigma effects do go away (though I factored in the look-elsewhere effect, else I would get a number much closer to 1). The standard for the Higgs discovery was five sigma, 99.9999%.
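Polchinski’s arithmetic is simple enough to check; here’s a minimal sketch reproducing his chain of factor-of-two updates (the numbers and the reasoning are his, the code is just bookkeeping):

```python
# Reproduce Polchinski's "quasi-Bayesian" multiverse estimate: start from
# a 50% prior, then halve the no-multiverse probability at each update.
p_no_multiverse = 0.5  # his prior

updates = [
    "observed cosmological constant is small",
    "vacuum energy is nonzero",
    "string theory most likely predicts a multiverse",
]
for reason in updates:
    p_no_multiverse /= 2  # his "conservative" factor of two per update
    print(f"after '{reason}': P(no multiverse) = {p_no_multiverse:.2%}")

print(f"P(multiverse) = {1 - p_no_multiverse:.2%}")  # 93.75%, i.e. his 94%
```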

My problem with a lot of the West Coast theorists is that I don’t seem to have the same sense of humor, so often I have trouble telling when they’re joking (does anyone know when Andrei Linde is joking?). Here though it seems quite clear that Polchinski is pulling the leg of the philosophers gathered to hear him speak. My calculations show that the chance the above text could be a serious contribution to a philosophy of science conference cannot be above .1%.

The other evidence that something comical is going on here comes from some of the over-the-top claims about the virtues of string theory. In particular, we’re told

A remarkable feature of string theory is that the dynamics, the equation of motion, is completely fixed by general principle. This is consistent with the overall direction of fundamental theory, describing the vast range of phenomena that we see in terms of fewer and fewer underlying principles. Uniqueness would seem to be the natural endpoint to this process, but such theories are truly rare…
Indeed, when I assert that the equations of string theory are fully determined by general principle, I must admit that we do not yet know the full form of the equations, or the ultimate principle.

Polchinski is quite right that it is “remarkable” to know that you have a unique theory, with unique equations, fixed by a general, ultimate principle, but you don’t know what the theory is, what the equations are, or what the principle is. I’m curious to hear from people at the conference what gets the bigger laughs: this or Kane’s argument that the unknown theory predicts a 1.5 TeV gluino (unless it’s not found, in which case it doesn’t).

Update: See here for the latest coverage of the meeting from Massimo Pigliucci, who comments at one point:

[Yet another string theorist. I must say, there does seem to be a stacking of them at this conference, and no experimentalists have been invited either]


Update: For more on Polchinski’s paper there’s Sean Carroll on Twitter, here and here, who tells us

So strange how the public perception of string theory has been warped by a few contrarian voices. Good topic for some future PhD thesis.

and that Polchinski’s new paper

lays out the case for string theory, and how unexpectedly successful it’s been.

I’m assuming by “contrarian voices” he’s referring to Polchinski. That string theory has been “unexpectedly successful” and the multiverse is a 94% sure thing is a highly contrarian point of view among physicists.

Update: More coverage of the workshop is available from Sabine Hossenfelder and Massimo Pigliucci.

Posted in Multiverse Mania, This Week's Hype | 39 Comments

Next Week’s Hype

Next week there will be a workshop in Munich with the title Why Trust a Theory? Reconsidering Scientific Methodology in Light of Modern Physics. It’s organized by Richard Dawid, to discuss his ideas about “non-empirical theory confirmation”, developed to defend string theory research against accusations that its failure to make any testable predictions about anything makes it a failure as a research program.

I guess the idea of such a workshop is to bring together string theory proponents and critics to sort things out, but looking at the program and talk abstracts, this doesn’t look promising: the central issues seem likely to be evaded, and the speakers to just talk past each other.

While the workshop title refers to “Modern Physics” in general, the talks are mostly focused on one very narrow part of the subject: quantum gravity. String theory is supposed to be something much more than a quantum gravity theory, explaining the Standard Model and low energy physics. This has been a complete failure, and the plan at the workshop seems to be to deal with this elephant in the room by ignoring it, or worse, claiming it isn’t there.

From the talk abstracts, about the only person discussing particle physics will be Gordon Kane (Quevedo may mention it, although his main interest is cosmology). Kane will be claiming that string theory makes testable predictions about particle physics, ones about to be tested. The problem is that he has been making the same claims for thirty years, arguing back in the 90s in the pages of Physics Today and a book that string theory would be tested at LEP and the Tevatron (by finding superpartners). As each of his predictions has been conclusively falsified, he has just refused to acknowledge this and started advertising new ones. Perhaps the most outrageous case of this is his latest book (discussed here), which is a reissue of the old one, with falsified predictions simply deleted and replaced, without any acknowledgement of what happened. I don’t think this behavior raises any philosophical issues about theory confirmation, just the sociological issue of why the physics community tolerates this, or why he’s the one person invited to this workshop to address the largest problem of the subject.

Another central problem here is the hype problem. If you give up on testability, and allow theory confirmation based on claims that “my theory is just better” by some ill-defined metric, you open up the obvious problem of how to deal with people’s natural human inclination to praise the wonderful characteristics of their intellectual children. At the workshop, Joe Polchinski’s talk is entitled “String Theory to the Rescue”. I see nothing in the program about any planned examination of the significant string theory hype problem, or even any acknowledgement that it exists.

I’m actually in a way more sympathetic than most people to the idea that “non-empirical” evaluation of a theory is an important and worthwhile topic. Fundamental physics theory is facing a huge problem due to the overwhelming success of the Standard Model and the increasing difficulty of exploring higher energy scales. If it is to continue to make progress there is a real need to do a better job of evaluating theoretical ideas without help from experiment. There is a group of scientists who have a lot of experience with this problem, and have a well-developed culture designed to deal with it. They’re called “mathematicians”. Despite the fact that this workshop is hosted by the Munich Center for Mathematical Philosophy, the organizers don’t seem to have thought it worthwhile to invite any mathematicians or mathematical physicists to participate, missing out on a perspective that would be quite valuable.

Update: It’s now “this week”, not “next week”. Some tweeting from the conference going on, you can try the hashtag #WhyTrustATheory. Massimo Pigliucci comments from the Q and A session about a problem with this kind of thing: some people “very very much like the sound of their own damn voice”. I hear that David Gross claimed to have 20 possible observations that would invalidate string theory, but didn’t say what they were.

Update: Massimo Pigliucci is blogging a detailed account of the conference, see here (he’s also a speaker, slides here).

Update: I don’t think I’d noticed before that Lee Smolin has a very much to the point review of the Dawid book here.

Posted in Favorite Old Posts, This Week's Hype, Uncategorized | 54 Comments

Super and Great Colliders

I’ve recently finished reading two new books on huge collider projects, which make an interesting contrast.

The first is From the Great Wall to the Great Collider, by Steve Nadis and Shing-Tung Yau. It’s a very well-informed and topical book, a bit of a political document, designed to make the case for a Chinese “Great Collider”. This is a proposed machine of up to 100 km in circumference that would operate first as an electron-positron collider, designed to be a “Higgs factory” allowing precision study of the Higgs. In a second stage the same tunnel would be used for a proton-proton machine with collision energies up to 100 TeV. This would be designed to explore the energy range above a TeV, in much the same way as the LHC, but with seven times the energy, and thus a much higher energy reach. This energy would also allow study of Higgs self-interactions.

The Nadis-Yau book is an unusual document in many ways. Yau is a great geometer, but a main concern of the book is something completely different, the question of how one might construct such a huge physics and engineering project. There is a great deal of information in the book about the history and current state of experimental HEP, but from an unusual angle, that of the many Chinese contributions to the subject. I’ve read many histories of HEP, but learned a lot of new things from this one, with its very different emphasis.

This is a short, rather than encyclopedic, book, with about 130 pages of text. It functions well in explaining the case for a large new collider to anyone interested, but has a distinct focus on arguments for the proposal to do this in China. The Chinese government and people in coming years will be deciding whether to go ahead with this, and this book is the perfect place for them to read a serious account of what this proposal is and why it deserves to be taken seriously.

The current state of affairs is that an initial conceptual design has been completed recently, which was reported here. This gave rise to some mistaken reports like this one that the Chinese government had given its approval to the project. There’s still quite a ways to go before that happens, with a final conceptual design not due until next year, and even if there is a go-ahead, construction only starting in 2020-25.

For a detailed look at the physics to be done by such a collider, see this new review article. I was interested to see (page 32) that one author’s previous description of the current situation as leading to only two possibilities (“natural” SUSY or some such, or the multiverse and the end of hope for explaining things) has been expanded to include a more interesting third possibility: “correlation between the physics of the deep UV and IR”.

Just after finishing the Nadis-Yau book, I got a copy of a new history of the SSC project, Tunnel Visions, by Riordan, Hoddeson and Kolb. This book has been in the works for a long time, with the authors starting to gather material back in the 80s, before the project was cancelled in 1993. I’ve been hearing about the book for quite a while, glad to see that it has finally appeared.

The cancellation of the SSC had a disastrous effect on the US experimental HEP program, moving the center of research conclusively to CERN and its LHC project. A central concern of any book of this kind has to be the “what went wrong?” question. The conclusions drawn are similar to ones I remember often hearing back then in the wake of the disaster: the SSC was a juicy target for a Congress intent on budget-cutting, easily portrayed as out of control (its budget kept increasing from $3 billion early on, to maybe $12 billion at the end), with little support from non-Texas representatives. In some sense the surprising part of the story is that the project got as far as it did before being terminated by an overwhelming Congressional vote.

One part of the story I had never understood was that as the SSC budget expanded it was coming into direct conflict with the plan to keep funding the other HEP labs (Fermilab, SLAC, Cornell, Brookhaven), and that was part of the story of the politics of this within the scientific community. I also hadn’t appreciated the way the challenges of a project of this scale required bringing in companies and other parts of the US military-industrial complex, making it take on some of the aspects of a large defense spending project. A major topic of the book is that of the problematic interaction between this and the standard ways that physicists were used to doing business.

Unlike Nadis-Yau, Tunnel Visions is more of an academic book, with notes and references to a huge number of extensive interviews making up a large part of the text. It’s not at all an inspirational story, nor is there all that much physics discussed. At the same time, it’s the definitive work on a crucial part of the history of high energy physics in the US. One group of people who should definitely be reading it are those planning the Chinese project. Some of the difficulties they will face if they go ahead will be similar: the SSC was an 87km ring, of similar scale to the new proposal. That this almost got off the ground 20 years ago here in the US is a good argument that it is something that could be pulled off in China over the next 20 years if they want to do it.

Based on the fact that the SSC might have worked with more international support, the authors end with the conclusion:

Despite the added difficulty of organizing and managing them, pure-science projects at the multibillion-dollar scale should henceforth be attempted only as international enterprises involving interested nations from the outset as essentially equal partners. Nations that attempt to go it alone on such immense projects are probably doomed to failure like the Superconducting Super Collider.

The Chinese proposal is still in its infancy, but there’s reason to expect it might be a “go it alone” project. Given the way the US budget operates, at this point no country is likely to look to the US as a reliable source of sizable funding. CERN has its own proposal for a 100 TeV collider, but it seems hard to believe that both projects will go ahead, although it is also hard to see the Europeans agreeing to give up energy frontier physics to China. Many of the lessons of the SSC funding debacle are rather specific to the US and the way budgets are done here. I have no idea what the considerations in China are for projects like this; I guess we’ll start to find out in coming years.

Posted in Book Reviews, Experimental HEP News | 13 Comments

Quick Items

A few quick items before the holiday:

  • I hear that Luis Alvarez-Gaumé will be the next Director of the Simons Center, starting next Fall, taking over from John Morgan, the founding Director. My understanding is that the hope was to have the directorship alternate between mathematicians and physicists, and with the hire of Alvarez-Gaumé, they’ve managed to achieve this. His early work on supersymmetric path integrals and the index theorem (see here) was characteristically lucid and this remains one of the great points of intersection between modern mathematics and the quantum theory. One of the best relatively short introductions to QFT is this one (with a shorter arXiv version here). I think he’s an excellent choice.
  • In physics blogger news, Tommaso Dorigo reports that he has found a publisher for the book he has been writing: Anomaly! – Scientific Discoveries and the Quest for the Unknown, and it should appear next year. I’m very much looking forward to seeing a copy. His insightful but irreverent take on experimental HEP I’m sure will make this a fascinating read for anyone interested in the subject.
  • Matt Strassler’s blog has been dormant for a while, but he has now been heard from. After a couple of years in a visiting position at Harvard, he says he’s now “employed outside of science”, but working on a book about particle physics for non-experts.
  • Jim Holt has a review in the latest New York Review of Books of my colleague Michael Harris’s Mathematics Without Apologies.
  • By some accounting, today is the 100th anniversary of Einstein’s GR field equations, which he presented November 25, 1915 at a lecture in Berlin. This anniversary has been celebrated in many places, in many ways this year, so there’s not any need for me to chime in. Among many excellent treatments of the topic, there’s also an unfortunate tendency of some to use Einstein to grind their particular axes. Sean Carroll I suspect has Einstein spinning in his grave, using the PBS NewsHour to enlist Einstein as a multiverse fan:

    The ability for seemingly constant things to evolve and change is an important aspect of Einstein’s legacy. If space and time can change, little else is sacred. Modern cosmologists like to contemplate an extreme version of this idea: a multiverse in which the very laws of physics themselves can change from place to place and time to time. Such changes, if they do in fact exist, wouldn’t be arbitrary; like spacetime in general relativity, they would obey very specific equations.

    Perhaps Carroll could enlighten the public by writing down these “very specific equations” he’s advertising, for comparison to the Einstein field equations.

    If I were to grind my own ax here, it would be to note that Einstein’s great breakthrough came about through close collaboration with some of the best pure mathematicians around, adopting difficult but deep ideas about geometry. Without the mathematicians, I’d guess that the theory of general relativity would have taken many more decades to come to fruition. Maybe there’s a lesson there…

Posted in Uncategorized | 45 Comments

This Week’s Non-Hype

Since I often post here complaints about articles produced by the press offices of various institutions that hype physicists’ theoretical work in a misleading way, I thought it a good idea to make up for this by noting a positive example of how it should be done. The SLAC press office this week has a Q and A with Lance Dixon, with the title SLAC Theorist Lance Dixon Explains Quantum Gravity, which is quite good.

Dixon gives an informative explanation at a basic level of what the quantum gravity problem is. He includes an even-handed description of the string theory approach to the problem, and explains a little bit about the alternative that he and collaborators have been pursuing, one that has gotten much less attention than it deserves. This is a very technical subject, so there’s a limit to how much he can explain, but he gives the general idea, and includes a link to his most recent work in this area.

Many promotional efforts for string theory begin by making claims that quantum field theory cannot be used to understand quantum gravity, due to the divergences in the perturbation series. This has been repeated so often, for so many years, that it is an argument most people believe. The situation however is quite a bit more complicated than this, with one interesting aspect of the story the discovery in relatively recent times that long-held assumptions about divergences in perturbative quantum gravity calculations were just wrong. Such calculations turn out to have extra unexpected structure, and thus unexpected cancellations, making naive arguments about divergences incorrect. Continuing progress has come about as Dixon and others have developed new techniques for actually computing amplitudes, uncovering unexpected new symmetries and cancellations.

For a good summary of the current situation, see this talk by Zvi Bern, especially page 7, where Bern details how, going back to 1982, “So far, every prediction of divergences in pure supergravity has either been wrong or missed crucial details”. For N=8 supergravity, current arguments say that a divergence should show up if you could calculate 7-loop amplitudes, but Bern warns against betting on this. In that talk he also explains the recent work with Dixon and others that gets mentioned in the SLAC piece, about the surprising nature of the divergence in pure gravity at two loops, which makes its physical significance, and whether it really ruins the theory, not so clear.

I was interested to read Dixon’s account of his thinking back in the mid-80s:

I began to be concerned that there may be actually too many options for string theory to ever be predictive, when I studied the subject as a graduate student at Princeton in the mid-1980s. About 10 years ago, the number of possible solutions was already on the order of 10^500. For comparison, there are less than 10^10 people on Earth and less than 10^12 stars in the Milky Way. So how will we ever find the theory that accurately describes our universe?

Although this never made it into media stories, I think that by a couple years after the initial enthusiasm for string unification in 1984, many theorists had already started to notice that the idea likely had fundamental problems, with a serious danger that it would turn out to be an empty idea. This now has become clear, but the idea lives on, with “QFT must have divergences” the main argument for continuing to take it seriously. Now that argument isn’t looking so solid…

Update: A good explanation of the situation from 4 gravitons who, thankfully, is not overly worried that he might be giving succor to the Woits of the world…

Posted in Uncategorized | 37 Comments

Langlands Items

There’s an interesting development in the math-physics overlap, with a significant number of physicists getting interested in the theory of automorphic forms, often motivated by the problem of computing string scattering amplitudes. This has led to a group of them writing up a very long and quite good expository treatment of Eisenstein series and automorphic representations, which recently appeared on the arXiv. The emphasis is not on the physics applications (which an introduction explains come about when one is dealing with systems with discrete symmetries like the modular group or higher-dimensional generalizations), but on the calculational details of the mathematics. There are quite a few expositions of this material in the mathematics literature, but many (mathematicians included) may find the detailed treatment here very helpful.

Another aspect of this area is some overlap with the interests of mathematicians studying Eisenstein series in the context of Kac-Moody groups. There’s a conference this week bringing together mathematicians and physicists around this topic.

Turning to recent developments in the Langlands correspondence itself, which relates automorphic forms to Galois representations, when I discussed David Nadler’s talk at the Breakthrough Prize symposium (the video is available here), I forgot to mention one thing he talked about that was new to me, the Fargues-Fontaine curve. Nadler explained that Fargues has recently conjectured that the local Langlands correspondence can be understood in terms of ideas from the geometric Langlands correspondence, using the Fargues-Fontaine curve. For more about this from Fargues himself, see materials at his website, which include lecture notes, links to videos of talks at the IAS and MSRI, and this recent survey article. Also informative is some explanation from David Ben-Zvi at MathOverflow.

In April there will be a workshop in Oberwolfach on geometric Langlands that will include this topic, for details of the planned discussions, see here.

Fargues was here today at Columbia, and gave a talk on “p-adic twistors”. There was nothing much about Langlands; the talk was about the question of what the analog of the Fargues-Fontaine curve is when you take the real numbers as your field (the use of “twistors” here is Simpson’s, see here, not the common use in physics, which is quite different).

I won’t display my extremely limited understanding of this subject by trying to provide my own explanations here. A big problem is that this is mainly about the p-adic Langlands correspondence, something I’ve never been able to understand much about. After making a renewed attempt the last few days, I at least started to get some idea of where the biggest problematic holes in my knowledge of the math background are. Interestingly, it seems many if not most of them have Tate’s name attached (Hodge-Tate, Lubin-Tate, etc, etc…). One pleasant discovery I made is that there are now some excellent expository pieces on this material available, often courtesy of some talented graduate students. One wonderful source I ran into is Alex Youcis’s blog Hard Arithmetic, which has given me some hope that with his help I might soon make a little progress on learning more about this kind of mathematics. I don’t know what’s in the water at Berkeley, but something there keeps producing high-quality blogging by mathematics students; another example is here.

Posted in Langlands | 8 Comments