This Week’s Hype

About every three years, KEK issues a hype-filled press release announcing that Jun Nishimura and collaborators have used a supercomputer to get evidence for string theory. Back in 2008, the announcement was that a supercomputer simulation of a supersymmetric QM system supposedly showed that superstring theory explains the properties of black holes (press release here, preprint here, blogging here). In 2011, the claim was that a supercomputer simulation had used superstring theory to understand the birth of our universe (press release here, preprint here, blogging here). Both of these papers were published in PRL.

The 2014 press release is now out (see here), based on this preprint from last December. The latest claim is that the authors have solved the black hole information paradox, shown that we live in a hologram, and shown that string theory provides a self-consistent quantization of gravity, all by doing a numerical simulation of a QM system. Even better, they have made the quantum gravity problem just as well-understood and tractable as QCD:

In short, we feel that problems involving quantum gravity have become as tractable as problems involving the strong interaction. The latter can be studied by simulating a gauge theory on a four-dimensional (4D) lattice, and such a method has recently been used to reproduce the mass spectrum of hadrons (28) and the nuclear force (29). We can now apply essentially the same method to study quantum gravity, which has been thought to be far more difficult.
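
For readers wondering what a “numerical simulation of a QM system” amounts to here: these papers evaluate a discretized Euclidean path integral by Monte Carlo. Below is a toy sketch of that method for the ordinary harmonic oscillator (emphatically not the supersymmetric matrix quantum mechanics the KEK group actually simulates, which replaces the single variable x(τ) by large matrices, adds fermions, and needs vastly more computer power):

```python
# Toy sketch: Metropolis sampling of the discretized Euclidean path
# integral for a harmonic oscillator (m = omega = 1, hbar = 1).
import numpy as np

rng = np.random.default_rng(0)
N, a = 64, 0.5                    # lattice sites, lattice spacing
x = np.zeros(N)                   # periodic Euclidean path x(tau)

def dS(i, new):
    """Change in the action S = sum[ (x_{i+1}-x_i)^2/(2a) + a*x_i^2/2 ]."""
    ip, im = (i + 1) % N, (i - 1) % N
    S = lambda xi: xi * (xi - x[ip] - x[im]) / a + 0.5 * a * xi**2
    return S(new) - S(x[i])

samples = []
for sweep in range(10000):
    for i in range(N):            # one Metropolis update per site
        new = x[i] + rng.uniform(-1.0, 1.0)
        delta = dS(i, new)
        if delta <= 0 or rng.random() < np.exp(-delta):
            x[i] = new
    if sweep >= 1000:             # discard thermalization sweeps
        samples.append(np.mean(x**2))

# Should approach the exact ground-state value <x^2> = 0.5,
# up to lattice-spacing artifacts.
print("estimated <x^2> =", np.mean(samples))
```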

This latest version of the KEK hype has gotten a lot more attention than the previous two. Late last year, for some reason, Nature covered the preprint with a story about how Simulations back up theory that Universe is a hologram, and this got a lot of media attention (see here for example).

The paper has now been published, and this time it’s not in PRL but in Science magazine (submission there came a month after the preprint appeared; could it be that PRL wouldn’t have it?). Science is giving it a high profile, publishing together with it a piece by Juan Maldacena which cites the paper as “further evidence of the internal consistency of string theory”. Science provides the following one-line summary of the Maldacena piece:

A numerical test shows that string theory can provide a self-consistent quantization of gravity.

One obvious problem with this is that even if you take the most optimistic view of it all, what is being described is quantum gravity in 10d space-time. The Japanese authors deal with this problem with a footnote:

Theoretical consistency requires that superstring theory should be defined in ten-dimensional space-time. In order to realize our four-dimensional space-time, the size of the extra six dimensions can be chosen to be very small without spoiling the consistency.

Remarkably, Maldacena has another answer: the multiverse, which now he seems to take as accepted fact.

Of course, the 10-dimensional space under consideration here is not the same as the four-dimensional region of the multiverse where we live. However, one could expect that such holographic descriptions might also be possible for a region like ours.

Absurd hype about string theory is a continuing problem, and it’s not one that can be blamed on journalists, with this latest example getting help from HEP lab press releases, a highly reputable journal, and an IAS faculty member.

Posted in This Week's Hype | 18 Comments

Quick Links

Just returned from a few days in Boston, will try and catch up here on various topics:

  • This past week I was able to attend some of the talks at the conference in honor of David Vogan’s 60th birthday. I’m still trying to make sense of Bernstein’s talk on Stacks in Representation theory, where he argued that the category of equivariant sheaves on a certain stack is a better-behaved construction than the category of representations. I’ve always wondered whether this would be helpful in the case of representations of a gauge group, where the stack has something to do with equivalence classes of connections. It was his first use of Beamer, and some slides went by too fast. I noticed that a few people were documenting talks for themselves by taking pictures of the slides and board with phones or tablets.

    Among the other interesting talks, Jim Arthur discussed the conjectural automorphic Langlands group, along the lines of this older paper. He indulged in some speculation I’d never heard before: that Langlands functoriality might imply the Riemann hypothesis (somehow by analogy with something from Langlands about the Ramanujan conjecture appearing in Deligne’s proof of the RH in the function field case). Unfortunately the laptop being used to show his slides decided to start installing Windows Updates two-thirds of the way through his talk. For whatever reason, I didn’t manage to follow his comments at the end of the talk about something new having to do with Weil’s explicit formula in number theory. Consulting with some experts later, though, I couldn’t find anyone optimistic about the “Langlands implies RH” speculation.

  • Also last week, the draft P5 report on a strategic plan for US HEP over the next 20 years was released, with discussion at an HEPAP meeting. Besides planned LHC upgrades, high priority goes to neutrino physics based at Fermilab, with a plan to attract more international participation. Other directions getting high priority are ongoing dark matter experiments and CMB research. A continued move of funding from research grants to construction projects will likely keep pressure on grants to university theory groups. Research into muon colliders is downplayed, with a recommendation to “consult with international partners on the early termination of MICE.”
  • Skepticism about the BICEP2 primordial gravitational wave claims continues, with for instance this story at Science, and this preprint. In retrospect, it’s curious that the possible problems with foregrounds did not get more attention at the time of the original high-profile announcement.

    See here for a Caltech workshop on the BICEP2 results. Andrei Linde’s talk started with his complaining about the popular and textbook coverage of inflation. He said that when journalists call and ask him what BICEP2 will tell us about Grand Unification, he responds “nothing”. At the end of the talk, Sean Carroll asked him about the multiverse, with Linde’s response emphasizing what a great thing it is to have a theory that can’t be disproved:

    If you cannot disprove it, then you have this powerful weapon of thinking about and explaining things around you in an anthropic way.

  • This coming week here in New York there will be lots of events associated with the World Science Festival. One not aimed so much at a popular audience, which I’ll likely attend, will be a day-long Symposium on Evidence in the Natural Sciences, to be held at the Simons Foundation. It will end with a discussion between Jim Baggott (author of the recent Farewell to Reality) and Brian Greene (sold out now, I fear).

Update: The Princeton crowd now has a preprint out, with the detailed argument that BICEP2 can’t distinguish “gravitational waves” from “galactic schmutz”; see here.

Posted in Langlands, Multiverse Mania, Uncategorized | 9 Comments

Walter Burke Institute for Theoretical Physics

Caltech has just announced the establishment of the Walter Burke Institute for Theoretical Physics, with Hirosi Ooguri as director. It will have a permanent endowment of around $74 million, with $30 million of that new funds from the Sherman Fairchild Foundation and the Gordon and Betty Moore Foundation.

To get some idea of the scale of this, the recent worries about HEP theory funding in the US have been due to a drop in DOE funding of university research from around $27.5 million/year to $24 million/year. So a few million/year from this endowment should help make up for that, while continuing the trend of moving theoretical physics funding from government support to philanthropy by the .01%.
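
The arithmetic behind “a few million/year” (the 4-5 percent payout rate is my assumption, a conventional figure for endowments, not anything Caltech has announced):

```python
# Rough endowment arithmetic; the payout rate is an assumed
# conventional figure, not an announced one.
endowment = 74e6
for rate in (0.04, 0.05):
    print(f"{rate:.0%} payout: ${endowment * rate / 1e6:.1f} million/year")
# Compare with the DOE drop discussed above: ~$3.5 million/year.
```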

Update: In other developments from the .01%, Physics World has the news that nominations are now open for the $3 million Milner prize in physics. You can submit nominations here. Nominations close June 30, with the winners announced November 9.

There’s also now a website for the Milner/Zuckerberg $3 million mathematics prize. Not much info there except that it will reward “significant discoveries across the many branches of the subject.” I’m guessing that, like the other prizes, initial picks will be from Milner/Zuckerberg themselves, with those people going on to form the committee to pick future winners.

Posted in Uncategorized | 11 Comments

BICEP2 News

Recall that this past March results from BICEP2 were announced, claiming a greater than 5 sigma detection of a primordial B-mode signal in the CMB polarization. This result received a huge amount of world-wide attention (and see some commentary here). Yesterday saw a very curious situation, with Adam Falkowski at the Resonaances blog claiming that the BICEP2 foreground analysis was flawed, and that “Various rumors place the significance of the corrected signal between 0 and 2 sigma.” Adrian Cho at Science magazine has a story about this, quoting Clement Pryke, co-PI of BICEP, as saying “We stand by our paper”, while acknowledging that, with respect to a plot of Planck data they used to estimate the foreground, “It is unclear what that plot shows”.

The controversy surrounds slide 6 of this presentation, with the BICEP foreground analysis evidently relying on scraping data from this slide. The claim at Resonaances is that they didn’t take into account the “Not CIB subtracted” notation on the slide:

However, it seems they misinterpreted the Planck results: that map shows the polarization fraction for all foregrounds, not for the galactic dust only (see the “not CIB subtracted” caveat in the slide). Once you correct for that and rescale the Planck results appropriately, some experts claim that the polarized galactic dust emission can account for most of the BICEP signal.

This is backed up by David Hogg’s report of a talk by Raphael Flauger at NYU yesterday:

At lunch, Raphael Flauger (NYU) gave a beautiful talk on foreground uncertainties related to the BICEP2 results. He built his foreground models as did the BICEP2 team by scraping data out of Keynote ™ presentations posted on the web! I have to say that again: The Planck team showed some maps of foregrounds in some Keynote presentations and posted them on the web. Flauger (and also the BICEP2 team before him) grabbed those presentations, scraped them for the all-sky maps, calibrated them using the scale bars, and worked from there. The coolest thing is that Flauger also simulated this whole process to account in his analysis for the digitization (scraping?) noise. Awesome! He concludes that the significance of the BICEP2 results is much lower than stated in the paper, which makes him (and many others) sad: He has been working on inflation models that produce large signals.
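
For those curious what “scraping data out of Keynote presentations” means concretely, here is a minimal sketch of the digitization step, with made-up file name, crop boxes, and scale range (the actual analyses are more careful, for instance by simulating the digitization noise, as described above):

```python
# Minimal sketch: recover approximate values from a map image in a slide
# by calibrating pixel colors against the map's colorbar. All file names,
# crop boxes (left, top, right, bottom), and scale values are hypothetical.
import numpy as np
from PIL import Image
from scipy.spatial import cKDTree

def digitize(slide, map_box, cbar_box, vmin, vmax):
    img = np.asarray(Image.open(slide).convert("RGB"), dtype=float)

    # One mean RGB color per colorbar row, mapped linearly onto the scale
    # (assuming the top of the bar corresponds to vmax).
    l, t, r, b = cbar_box
    colors = img[t:b, l:r].mean(axis=1)
    values = np.linspace(vmax, vmin, len(colors))

    # Assign each map pixel the value of its nearest colorbar color.
    l, t, r, b = map_box
    pixels = img[t:b, l:r].reshape(-1, 3)
    _, idx = cKDTree(colors).query(pixels)
    return values[idx].reshape(b - t, r - l)

# e.g. a polarization-fraction map with a 0-20% color scale:
pol_frac = digitize("planck_slide6.png", map_box=(80, 60, 880, 460),
                    cbar_box=(900, 60, 920, 460), vmin=0.0, vmax=20.0)
```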

It sounds like this issue is not going to get resolved until there is something more substantial from Planck about this than a slide suitable for data scraping. In the meantime, blogs are your best source of information. Or maybe Twitter, where Erik Verlinde tweets from the Princeton PCTS workshop Searching for Simplicity that:

News from Princeton: BICEP2 polarization data are due to dust foreground and not caused by primordial gravity waves.

Update: There’s also a New Scientist story here. It should be emphasized that the BICEP team are denying that there is any need to revise what is in their paper, with New Scientist quoting John Kovac of BICEP as follows:

Kovac says no one has admitted anything. “We tried to do a careful job in the paper of addressing what public information there was, and also being upfront about the uncertainties. We are quite comfortable with the approach we have taken.”

See the comments here from Sesh Nadathur and Shaun Hotchkiss explaining why this issue may not be very significant.

Update: Sesh Nadathur has a detailed post up now about this, New BICEP rumours, nothing to see here. Bottom line is:

The BICEP result is exciting, but because it is only at one frequency, it cannot rule out foreground contamination. Other observations at other frequencies are required to confirm whether the signal is indeed cosmological. One scenario is that Planck, operating on the whole sky at many frequencies but with a lower sensitivity than BICEP, confirms a gravitational wave signal, in which case pop the champagne corks and prepare for Stockholm. The other scenario is that Planck can’t confirm a detection, but also can’t definitively say that BICEP’s detection was due to foregrounds (this is still reasonably likely!), in which case we wait for other very sensitive ground-based telescopes pointed at that same region of sky but operating at different frequencies to confirm whether or not dust foregrounds are actually important in that region, and if so, how much they change the inferred value of r.

Until then I would say ignore the rumours.
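
The frequency point is quantitative. In the thermodynamic (CMB) temperature units used for these maps, a genuine CMB signal is by construction the same at every frequency, while galactic dust emission rises steeply toward 353 GHz, and that difference is what multi-frequency comparisons exploit. A rough illustration, using typical published greybody parameters for dust (T ≈ 19.6 K, β ≈ 1.6; treat these numbers as assumptions for illustration only):

```python
# Rough illustration of why frequency coverage separates dust from CMB:
# compare a greybody dust signal at 150 GHz (BICEP2) and 353 GHz (Planck's
# dust channel), converted to CMB temperature units. Dust parameters are
# typical published values, used here only for illustration.
import numpy as np

h_over_k = 4.799e-11                  # h/k_B in K*s
T_cmb, T_dust, beta = 2.725, 19.6, 1.6

def planck_law(nu, T):                # Planck law, up to constant factors
    return nu**3 / np.expm1(h_over_k * nu / T)

def dB_dT(nu):                        # dB/dT at T_cmb: converts intensity
    x = h_over_k * nu / T_cmb         # to thermodynamic temperature units
    return nu**4 * np.exp(x) / np.expm1(x)**2

def dust_in_cmb_units(nu):            # greybody dust, in delta-T_cmb units
    return nu**beta * planck_law(nu, T_dust) / dB_dT(nu)

ratio = dust_in_cmb_units(353e9) / dust_in_cmb_units(150e9)
print(f"dust amplitude grows ~{ratio:.0f}x from 150 to 353 GHz")  # ~20x
# A CMB signal is flat (1x) in these units, so comparing maps at the two
# frequencies immediately separates the two components.
```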

Peter Coles also has a blog post here, with the bottom line:

I repeat what I’ve said before in response to the BICEP2 analysis, namely that the discussion of foregrounds in their paper is disappointing. I’d also say that I think the foreground emission at these frequencies is so complicated that none of the simple approaches that were available to the BICEP2 team are reliable enough to be convincing. My opinion on the analysis hasn’t therefore changed at all as a result of this rumour. I think BICEP2 has definitely detected something at 150 GHz but we simply have no firm evidence at the moment that it is primordial. That will change shortly, with the possibility of other experiments (specifically Planck, but also possibly SPTPol) supplying the missing evidence.

I’m not particularly keen on the rumour-mongering that has gone on, but then I’m not very keen either on the way the BICEP2 result has been presented in some quarters as being beyond reasonable doubt when it clearly doesn’t have that status. Yet.

Update: There will be a talk about this issue in Princeton tomorrow morning, see here.

Update: Slides from the Flauger talk at Princeton are here. I’ll leave discussion of the results presented to the better-informed, but will comment that this work appears to definitely involve new heights in the technology of data-scraping from Keynote presentations.

Update: Video of the Flauger talk is here. Quite interesting are the introductory remarks of Paul Steinhardt, and the concluding remarks of Lyman Page. See also new blog posts from Jester and Sesh Nadathur. Sesh (via Eiichiro Komatsu at Facebook) includes a transcription of part of Page’s comments on the situation:

This is, this is a really, peculiar situation. In that, the best evidence for this not being a foreground, and the best evidence for foregrounds being a possible contaminant, both come from digitizing maps from power point presentations that were not intended to be used this way by teams just sharing the data. So this is not – we all know, this is not sound methodology. You can’t bank on this, you shouldn’t. And I may be whining, but if I were an editor I wouldn’t allow anything based on this in a journal. Just this particular thing, you know. You just can’t, you can’t do science by digitizing other people’s images.

From looking at all this, and seeing what the people in Princeton are saying, my non-expert opinion is that the BICEP2 result should be interpreted as an observation of B-mode polarization, but there’s no convincing data yet about the crucial question of whether this is foreground or cosmological. The BICEP2 data could not address this, and the relevant Planck data is not yet available (other experiments will soon also provide the data needed to resolve this question). The BICEP2 press release claiming “the first direct evidence for cosmic inflation” now looks like it may have been a highly premature claim.

Update: Talks ongoing at Caltech today about this at a workshop, videos later here. On Twitter, you can follow the BICEP/Planck fight via Sean Carroll:

At Caltech CMB workshop. #BICEP2 folks seem completely unconcerned about recent worries about galactic foregrounds. Wait for Planck paper…

Zaldarriaga on CMB grav waves vs. dust: sane answer is “let’s just wait.” On the other hand… we just can’t. No scientist is that patient…

MZ: Planck hasn’t measured dust in #BICEP2 region. But extrapolating from where they did measure, apparently can fit B-mode signal.

MZ: “I’m not happy this is on Facebook and Twitter.”

Seems to me we’re now stuck with Planck saying they think this is dust, BICEP saying they think it’s not. Planck is the side that has data about dust, BICEP is the side that has something they scraped off a slide of a Keynote presentation…

Update: Excellent article about this in the Washington Post from Joel Achenbach: Big Bang backlash.

Update: Zaldarriaga: I believe the case in favor of a detection of primordial B modes is not convincing (hopefully just temporarily). See more here and here.

Posted in Uncategorized | 26 Comments

Quillen Notebooks

Daniel Quillen, one of the greatest mathematicians of the latter part of the twentieth century, passed away in 2011 after suffering from Alzheimer’s. For an appreciation of his work and an explanation of its significance, a good place to start is Graeme Segal’s obituary notice, and there’s also quite a bit of material in the AMS Notices.

It’s very exciting to see that the Clay Math Institute now has a project to make available Quillen’s research notebooks. Segal and Glenys Luke have been working on cataloging the set, producing lists of contents of the notebooks, so far covering the earliest ones, from 1970 up to 1977. Quillen’s work ranged very widely, and for much of the 1980s he was very much involved in what was going on at the boundary of mathematics and quantum field theory. His work on the Mathai-Quillen form provided a beautiful expression for the Thom class of a vector bundle, using exactly the ingredients that formally generalize to the infinite-dimensional case, where this provides a wonderful way of understanding certain topological quantum field theories. The Mathai-Quillen paper is here; see here for a long expository account of its uses in TQFT.
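
Schematically, in one common convention (I’m quoting this from memory, so treat the normalizations with suspicion), the Mathai-Quillen representative of the Thom class of an oriented rank-2m bundle with fiber variable $x$, connection $\nabla$ and curvature $\Omega$ is the Gaussian-shaped form

$$U = (2\pi)^{-m}\, e^{-|x|^2/2} \int d\chi \,\exp\Big(\tfrac{1}{2}\langle \chi, \Omega\, \chi\rangle + i\,\langle \nabla x, \chi\rangle\Big),$$

where the integral over the odd variable $\chi$ is a Berezin (fermionic) integral. The point alluded to above is that every ingredient here (the Gaussian in $x$, the Berezin integral, the coupling to the curvature) still makes formal sense for an infinite-dimensional bundle, which is what the TQFT applications exploit.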

I’ve just started to take a look through the notebooks, and this is pretty mind-blowing. The Mathai-Quillen paper is not the most readable thing in the world; it’s dense with ideas, with motivation and details often getting little attention. Reading the Quillen notebooks is quite the opposite, with details and motivation at the forefront. I just started with Quillen’s notes from Oct. 15 – Nov. 13, 1984, in which he is working out some parts of what appeared later in Mathai-Quillen. This is just wonderful material.

Besides his own ideas, there are notes taken based on other people’s talks. See for instance these notes from a private talk by Witten in Raoul Bott’s office on Dec. 15, 1983.

I was already having trouble getting through the long list of things I am supposed to be doing. Having these notebooks available is going to make this a lot worse….

Posted in Uncategorized | 10 Comments

Quick Links

  • The Defense Department has awarded a $7.5 million grant to Steve Awodey of CMU, Vladimir Voevodsky of the IAS and others to support research in Homotopy Type Theory and the foundations of mathematics. I had thought that getting DARPA 10 years ago to spend a few million on Geometric Langlands research was an impressive feat of redirection of military spending to abstract math, but this is even more so.
  • On some kind of opposite end of the spectrum of government spending on mathematics, there’s the story of the NSA, the largest employer of mathematicians in the US. Tom Leinster has an article in New Scientist about the ethical issues involved. More at the n-category cafe.

    Seven years after Microsoft researchers discovered the backdoor in the NSA-designed NIST standard, and seven months after Snowden documents confirmed it (see here), the NIST has now removed the backdoored algorithm from its random number generator standards. As far as I know, the NIST has never explained how the backdoored algorithm was made a standard, or why anyone should trust any of the rest of their cryptographic standards at this point. Earlier in the year they issued a Draft report on their standards development process which explained nothing about what had happened. The language about the NSA in the report is:

    NIST works closely with the NSA in the development of cryptographic standards. This is done because of the NSA’s vast expertise in cryptography and because NIST, under the Federal Information Security Management Act of 2002, is statutorily required to consult with the NSA on standards.

    which seems to indicate they have no intention of doing anything about the problem of NSA backdoors. (For the curious, a toy sketch of how this sort of backdoor operates appears after this list.)

  • On the Langlands front, for those who don’t read French, Vincent Lafforgue has produced an English translation of the summary version of his recent work on global Langlands for function fields (a result already proved by his brother, but he has a way of doing things without using the trace formula).

    Langlands continues to add material to his web-site at the IAS. See for instance his long commentary on some history at the end of this section and his recent letter to Sarnak with commentary at the end of this section, where he gives his point of view on the state of the understanding of functoriality and reciprocity.

  • Sabine Hossenfelder has some interesting commentary on her experiences in the academic theoretical physics environment here.

    Mark Hannam has some related commentary on academia at his new blog here.

  • I’m still trying to finish a first draft of notes about quantum mechanics and representation theory (available here). I recently came across some quite good similar notes by Bernard, Laszlo and Renard.

    David Renard also has here some valuable notes on Dirac operators and representation theory.

  • Last Friday and Saturday at the University of South Carolina there was a Philosophy of the LHC Workshop, with talks here. Many of the talks were about the nature of the evidence for the Higgs and its statistical significance. James Wells talked about the supposed Higgs naturalness problem. He argues (see paper here) that you can’t base the problem on the Planck scale and quantum gravity since you don’t know what quantum gravity is (I strongly agree…). Where he loses me is with an argument that there must be lots more scalars out there than the Higgs (because string theory says so, or it just doesn’t seem right for there to only be one), and these cause a naturalness problem. Of course, once you have the naturalness problem, SUSY is invoked as the only known good way to solve it.
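
About the NSA/NIST item above: the backdoored standard was the Dual_EC_DRBG random number generator, and the trapdoor mechanism is easy to sketch. The generator walks a state around an elliptic curve using two public points P and Q; whoever chose those constants and knows a secret d with P = d·Q can recover the internal state from a single output. A toy version on a tiny textbook curve (toy parameters throughout; the real standard uses NIST P-256 and truncates output bits, which an attacker must brute-force):

```python
# Toy sketch of the Dual_EC_DRBG trapdoor on the tiny curve
# y^2 = x^3 + x + 1 over F_23. Toy parameters only: this shows the
# mechanism, not the actual standard.
p, a, b = 23, 1, 1

def ec_add(P, Q):                     # affine point addition
    if P is None: return Q            # None plays the point at infinity
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P):                     # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P, k = ec_add(P, P), k >> 1
    return R

def lift_x(x):                        # find a curve point with this x
    rhs = (x**3 + a * x + b) % p
    return next((x, y) for y in range(p) if y * y % p == rhs)

Q = (0, 1)                            # public constant
d = 5                                 # the trapdoor only the designer knows...
P = ec_mul(d, Q)                      # ...relating the two public constants

def drbg_step(s):                     # one simplified Dual_EC round
    return ec_mul(s, P)[0], ec_mul(s, Q)[0]   # next state, published output

s1, out0 = drbg_step(3)               # victim's hidden state and one output
s2, out1 = drbg_step(s1)

# Attacker sees only out0 but knows d: lift out0 to a point A = +/- s*Q;
# then x(d*A) = x(s*(d*Q)) = x(s*P), which is exactly the next hidden state.
recovered = ec_mul(d, lift_x(out0))[0]
assert recovered == s1                # state recovered from one output
assert ec_mul(recovered, Q)[0] == out1   # all future outputs now predictable
```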
Posted in Uncategorized | 17 Comments

Raising the Bar

If you’re looking for something to do next Tuesday evening here in New York, an event called Raising the Bar has recruited 50 people to give talks at bars around the city. There are some quite interesting talks on the list, but I’ll have to miss them, since I’m scheduled to talk about What We Don’t Know About Fundamental Physics at the Blind Tiger on Bleecker Street at 8:30pm. Not sure yet exactly what I’ll talk about, but the general idea is to start by explaining that the current situation is that we have a fundamental theory (SM + GR) that is frustratingly good in terms of agreement with experiment, but also frustratingly incomplete. I’ll see what I can do to explain the ways in which the SM and GR are incomplete, and what current prospects are for doing better.

Posted in Uncategorized | 27 Comments

Supersymmetry and the Crisis in Physics

The May issue of Scientific American has a very good cover story by Joe Lykken and Maria Spiropulu, entitled Supersymmetry and the Crisis in Physics (the article is now behind their subscriber paywall, but for those with access to Nature, it will soon be here).

Here are some excerpts:

It is not an exaggeration to say that most of the world’s particle physicists believe that supersymmetry must be true—the theory is that compelling. These physicists’ long-term hope has been that the LHC would finally discover these superpartners, providing hard evidence that supersymmetry is a real description of the universe…

Indeed, results from the first run of the LHC have ruled out almost all the best-studied versions of supersymmetry. The negative results are beginning to produce if not a full-blown crisis in particle physics, then at least a widespread panic. The LHC will be starting its next run in early 2015, at the highest energies it was designed for, allowing researchers at the ATLAS and CMS experiments to uncover (or rule out) even more massive superpartners. If at the end of that run nothing new shows up, fundamental physics will face a crossroads: either abandon the work of a generation for want of evidence that nature plays by our rules, or press on and hope that an even larger collider will someday, somewhere, find evidence that we were right all along…

During a talk at the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara, Nima Arkani-Hamed, a physicist at the Institute for Advanced Study in Princeton, N.J., paced to and fro in front of the blackboard, addressing a packed room about the future of supersymmetry. What if supersymmetry is not found at the LHC, he asked, before answering his own question: then we will make new supersymmetry models that put the superpartners just beyond the reach of the experiments. But wouldn’t that mean that we would be changing our story? That’s okay; theorists don’t need to be consistent—only their theories do.

This unshakable fidelity to supersymmetry is widely shared. Particle theorists do admit, however, that the idea of natural supersymmetry is already in trouble and is headed for the dustbin of history unless superpartners are discovered soon…

The authors go on to describe possible responses to this crisis. One is the multiverse, which they contrast to supersymmetry as not providing an answer to why the SM parameters are what they are, although this isn’t something that supersymmetry ever was able to do. Another is large extra dimensions as in Randall-Sundrum, but that’s also something the LHC is not finding, with few ever thinking it would. Finally there’s the “dimensional transmutation” idea about the Higgs, which I wrote about here last year. About this, the authors write:

If this approach is to keep the useful virtual particle effects while avoiding the disastrous ones—a role otherwise played by supersymmetry—we will have to abandon popular speculations about how the laws of physics may become unified at superhigh energies. It also makes the long-sought connection between quantum mechanics and general relativity even more mysterious. Yet the approach has other advantages. Such models can generate mass for dark matter particles. They also predict that dark matter interacts with ordinary matter via a force mediated by the Higgs boson. This dramatic prediction will be tested over the next few years both at the LHC and in underground dark matter detection experiments.

It’s great to see such a high-profile public discussion of the implications of the collapse of the paradigm long-dominant in some circles which sees SUSY extensions of the Standard Model as the way forward for the field. One place where I disagree with Lykken and Spiropulu is their claim that “It is not an exaggeration to say that most of the world’s particle physicists believe that supersymmetry must be true.” Actually I think that is an exaggeration, with a large group of theorists always skeptical about SUSY models. For some evidence of this, take a look at this document from 2000, which shows a majority skeptical about SUSY at the LHC. By the way, I hear those on the right side of that bet haven’t yet gotten their cognac, with the bet renegotiated to wait for results from the next LHC run.

Update: I hear that the 2000 bet was revised in 2011, with a copy displayed publicly at the Niels Bohr Institute. The new bet is about whether a superpartner will be found by June 16, 2016, and the losers must come up with a bottle of good cognac. There are 22 on the yes side (including Arkani-Hamed and Quigg), and 22 on the no side (including ’t Hooft, Komargodski, Bern). Also, 3 abstentions. It explicitly is an addendum to the 2000 wager, with those who lost the last one given the option of signing again, forfeiting two bottles of cognac, or accepting that “they have suffered ignominious defeat.”

Update: This report from the APS spring meeting includes the following about Spiropulu’s talk there:

Supersymmetry and dark matter have become so important to particle physicists that “we have cornered ourselves experimentally,” said Spiropulu. If neither is detected in the next few years, radical new ideas will be required. Spiropulu compared the situation to the era before 1905, when the concept of ether as the medium for all electromagnetic waves could not be verified.

You can watch the talk and see for yourself here.

Posted in Uncategorized | 51 Comments

Ten Years of Not Even Wrong

This blog was started a little bit over ten years ago, and I’ve been intending for a while to write something marking the occasion and commenting on what has changed over the past ten years. I’ve found this mostly a rather discouraging topic to think about, and whatever I have to say about it is going to be pretty repetitive for anyone who regularly reads this blog, so I’ll keep this fairly short.

Re-reading some of the early postings I’m struck mainly by how little has changed in ten years. Back in March 2004 I was writing about a David Gross talk promoting string theory, about whether CMB measurements would give information about GUT-scale physics, about how string cosmology seemed to be an empty subject, and about new twistor-based methods for computing gauge theory amplitudes. There’s been a lot of progress on the last topic since then, but little change on the others.

One big change over the past ten years is that the argument that string-theory-based unification is a failed project is no longer a particularly controversial one, with most physicists now leaning to this conclusion. Last night even Sheldon of The Big Bang Theory acknowledged that this isn’t working out and he needs to find something else to work on (see here). Maybe even Sheldon’s real-life model will soon reach this conclusion. Ten years ago the argument one often heard was that string theory was the winner in the marketplace of ideas, with skeptics just sore losers. These days, it’s string theorists who are more often complaining about the unfairness of this marketplace.

One development that is just starting to have a major impact is the failure of the LHC to find any evidence of SUSY, leading to increased skepticism about SUSY extensions of the Standard Model. This is a developing story, with results over the next couple of years from the LHC likely to make this a textbook example of what scientists do in the face of experimental disconfirmation of their most cherished ideas.

The discovery of the Higgs has been a wonderful vindication of the ideas and techniques of high energy physics, both experimental and theoretical. As we learn more about the Higgs, the lesson seems to be that this sector of the Standard Model behaves in the simplest way possible. This is a significant new piece of information about nature, although a frustrating one, since it doesn’t provide a hint of how to improve the Standard Model.

On the whole though, I fear that thinking about changes over the last ten years mostly puts me in a not very good mood. Some of the depressing developments and trends of the last ten years are:

  • One reaction to string theory’s failures in the marketplace of ideas has been a Russian billionaire’s decision to try and manipulate that marketplace by injecting tens of millions of dollars into it on one side. The largest financial prize in science is now devoted each year to rewarding people for work on a failed project. This is corrupting the marketplace in a significant way.
  • Some of my earliest postings back in 2004 were about KKLT, the string landscape and the multiverse. At the time I was sure that if the landscape proposal being pushed by the Stanford group became widely accepted as an implication of string theory unification, that would be the end of it. Surely no sensible person would try and argue for an extremely complicated, inherently unpredictive theoretical framework. Boy, was I wrong. As I’ve gone on about far too often here, the current multiverse mania is a disastrous and shameful episode for fundamental theoretical physics, threatening its essential nature as a science.
  • Most physics departments have reacted to the failure of string theory by at least partly blaming this failure on an over-emphasis on mathematics, instead of on the fact that this was just a wrong idea about physics. An interesting document I recently ran across is this one about the connections of particle physics with other disciplines, written by my advisor Curtis Callan and Shamit Kachru. Mathematics is mentioned in a section discussing past successes in cross-fertilization with other fields, but it appears not at all in the rest of the document discussing opportunities for the future.

I’m quite surprised that I’ve continued to find topics worth blogging about ten years down the road; this is something I never expected when this started. Right now I’m hoping for something unexpected in coming years, so that I’ll be writing about something different and much more interesting ten years from now!

Posted in Favorite Old Posts, Uncategorized | 62 Comments

Quick Links

Just time at the moment for some quick links. I’ll start with some math news, since there hasn’t been much of that here recently:

  • Matt Baker has the sad news here of the recent death of Berkeley mathematician Robert Coleman, at the age of 59. Coleman was a leader in the field of p-adic geometry, and managed to continuously do important research work despite a long struggle with MS. He was both highly influential and well-loved; be sure to read the comments, which contain appreciations from many different mathematicians.
  • Also at Matt Baker’s blog is a summary of recent work by Manjul Bhargava and collaborators on the average ranks of elliptic curves. This work shows

    at least 20.6% of elliptic curves over Q have rank 0, at least 83.75% have rank at most 1, and the average rank is at most 0.885…

    and

    at least 66.48% of elliptic curves over Q satisfy the (rank part of the) Birch and Swinnerton-Dyer (BSD) Conjecture (and have finite Shafarevich-Tate group)

    Conjecturally

    50% of elliptic curves have rank 0, 50% have rank 1, and 0% have rank bigger than 1, and thus the average rank should be 0.5. (And conjecturally, 100% of elliptic curves satisfy the BSD conjecture. :))

    Until recently

    the best known unconditional results in this direction were that at least 0% of elliptic curves have rank 0, at least 0% have rank 1, the average rank is at most infinity, and at least 0% of curves satisfy the BSD conjecture.

    so this is dramatic progress.

  • I’ve yet to hear any solid rumors about who will win the 2014 Fields Medals, to be announced at the ICM in August. To my mind, Bhargava is a leading candidate. Others one hears discussed are Jacob Lurie (a question is whether he has a big enough theorem; the one he’s talking about here may not be finished). Often-mentioned names whose work I know nothing about are Artur Avila and Maryam Mirzakhani.
  • Another area of huge progress in mathematics over the past year or so has been the work of Peter Scholze, who is another excellent Fields Medal candidate, but one young enough that this might get put off until 2018. I’ve been hoping to understand his results on torsion classes in Langlands theory well enough to say something sensible here, but I’m definitely not there yet; maybe some day in the future. In the meantime, watch his extremely clear lectures at the IAS (here, here and here) as well as the talks at this recent MSRI workshop.
  • The math community award structure is for some reason prejudiced against the middle-aged, with the high-profile prizes going to the young (Fields Medals) and the old (Abel Prize). This year’s Abel Prize went to Yakov Sinai, and again, I’m in no position to explain his work. However, Jordan Ellenberg was, and there’s video here of the prize announcement, including Ellenberg’s talk about Sinai’s work. In the past Timothy Gowers gave such talks, with not everyone happy about this. No news yet on whether Sowa will change his blog name to Stop Jordan Ellenberg! !!!.
  • Leila Schneps is trying to raise funds for an English translation of the 3rd volume of Winfried Scharlau’s German-language biography of Grothendieck. Go here to contribute, I just did.

Turning to physics news:

  • The recent BICEP2 data is still attracting a lot of attention. Initial news stories were often dominated by nonsense about the multiverse; more recent ones are more sensible, including Adrian Cho at Science Magazine, Clara Moskowitz at Scientific American, and a George Musser interview with Gabriele Veneziano. Yesterday Perimeter hosted a workshop on implications of BICEP2. The theory talks I looked at didn’t seem to me to have much convincing in them, except that Neil Turok acknowledges that this kills off the Ekpyrotic (bouncing brane) Universe and he’s paying off a $200 bet. For some reason, Nima Arkani-Hamed now seems to speak at every single fundamental physics meeting, so he was also at this one. More interesting were the experimental talks, with new data soon on its way, including Planck results planned for October, possibly measuring r to ±0.01 (BICEP2 says r = 0.20).
  • For some perspective on inflationary theory, CU Phil in a recent comment section points to a new volume on the Philosophy of Cosmology. It includes some great articles putting multiverse mania in context by George Ellis and Helge Kragh, as well as an enlightening discussion of the issues surrounding inflationary theory, especially “Eternal Inflation”, from Chris Smeenk.
  • For the latest news about LHC results coming in from the Run 1 dataset, see this report from the Moriond conference.
  • Finally, physics continues to inspire frightening movie projects, see here.

Posted in Uncategorized | 32 Comments