IAS director Robbert Dijkgraaf will be giving the CERN colloquium tomorrow, with the title The Future of Fundamental Physics. Here’s the abstract:
The reports of the death of physics are greatly exaggerated. Instead, I would argue, we are living in a golden era and the best is yet to come. Not only did the past decades see some amazing breakthrough discoveries and show us the many unknowns in our current understanding, but more importantly, science in general is moving from studying "what is" to "what could be." There will be many more fundamental laws of nature hidden within the endless number of physical systems we could fabricate out of the currently known building blocks. This demands an open mind about the concepts of unity and progress in physics.
I don’t know of any “reports of the death of physics”, but there are a lot of reports of the death of string theory (Dijkgraaf’s specialty) and of the larger subject of attempts to go beyond the Standard Model, experimentally or theoretically. CERN yesterday announced new results from LHCb testing lepton universality (a prediction of the Standard Model). LHCb sees a ratio of decays to muons vs. electrons in a certain process that is off from the Standard Model prediction by 3.1 sigma.
If this result is confirmed with better data and careful examination of the theory calculation, that will be a dramatic development, indicating a significant previously unknown flaw in the Standard Model. BSM theory and experiment would be undeniably alive (though this has no known relevance to the troubles of string theory). Unfortunately, the experience of the past few decades is that 3 sigma violations of the Standard Model always go away after more careful investigation (see for instance the 750 GeV diphoton excess). It’s exactly this pattern that has people worried about the health of the field of high energy physics.
Dijkgraaf’s claim that "we are living in a golden era" is an odd one to be making at CERN, which has seen some true golden eras and is now facing very real challenges. Even odder is arguing at CERN that the bright future of science is due to it "moving from studying 'what is' to 'what could be.'" CERN is at its core a place devoted to investigating "what is" at the most fundamental level. I’m curious to hear what those at CERN make of his talk.
Dijkgraaf’s abstract to me summarizes the attitude that the best way to deal with the current problems of HEP theory is to change the definition of the goals of the field, thereby defining failure away. The failure of heavily promoted ideas about string theory and supersymmetric extensions of the Standard Model is rebranded as a success, a discovery that there’s no longer any point in pursuing the traditional goals of the subject. Instead, the way forward to a brighter future is to give up on unification and on trying to do better than the Standard Model. One is then free to redefine "fundamental physics" as whatever theorists manage to come up with of some relevance to still-healthy fields like condensed matter and hot new topics like machine learning and quantum computing. I can see why Dijkgraaf feels this is the way forward for the IAS, but whether and how it provides a way forward for CERN is another question.
Update: I just finished watching the Dijkgraaf talk, together with the question session afterwards. Dijkgraaf basically just completely ignored HEP physics and the issues it is facing. He advertised the future of science as leaving the river of "what is" and entering a new ocean of "what can be", with the promising "what can be" fields being biotech, designer materials and AI/machine learning. He hopes that theorists can contribute to these new fields by trying to find new laws governing emergence from complexity, perhaps via new ideas using quantum field theory tools.
With nothing at all to point to as a reason to be optimistic about HEP, a couple of questioners asked whether his river of "what is" might now be hitting not an ocean but a desert, and he didn’t have much of an answer. All in all, I’m afraid that the vision of the future he was trying to sell is not one in which high energy physics has any real place. It fits well with the depressing, increasingly popular view of the field as one which had a great run during the twentieth century, but has now reached an end.
Update: For more discussion of the reliability of the LHCb result, see comments here and here, as well as Tommaso Dorigo’s blog post.
Update: Tommaso puts his money where his mouth is.
Dijkgraaf seems to jump around a bit, beginning with "fundamental physics", leaping into "science in general", and ending with what seems to be a philosophical reference to "unity in physics." Buried within these wild leaps is what appears to be the primary aim of his comments: "science…is moving from studying 'what is' to 'what could be.'" His point seems to be to justify the trend of highly speculative model-building ("what could be") rather than methodologies that are testable and rely on evidence ("what is").
Hi Peter,
There’s not really a theory calculation to be examined for R_K: the Standard Model just gives unity. Of course theory goes into the simulations used to perform parts of the analysis, but this is generally reliable for these use cases.
What you say about 3 sigma excesses is of course true, but what’s interesting here is that in almost doubling the data sample from 5 to 9 fb^-1, the significance increased. If it were a fluctuation one might expect it to gradually reduce as more data is added.
https://lhcbproject.web.cern.ch/Publications/p/Directory_LHCb-PAPER-2021-004/FigS5.pdf
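A minimal toy sketch of the statistics behind this point (the numbers below are invented for illustration, not taken from the LHCb analysis): modeling each fb^-1 as an independent Gaussian measurement, a genuine effect should see its significance grow roughly like the square root of the integrated luminosity, while a pure fluctuation tends to regress, though it can still grow by chance.

```python
# Toy sketch (invented numbers, not the LHCb analysis) of how the significance
# of an R_K-like deviation might evolve as the data sample grows from 5 to 9 fb^-1,
# depending on whether the effect is genuine or a fluctuation.
import math
import random

random.seed(0)
L_old, L_new = 5.0, 9.0     # integrated luminosities in fb^-1, as quoted above
z_old = 2.5                 # hypothetical significance seen in the 5 fb^-1 sample

# If the deviation is genuine and statistics-dominated, the expected
# significance scales roughly like sqrt(luminosity):
print(f"real effect: expect roughly {z_old * math.sqrt(L_new / L_old):.1f} sigma at 9/fb")

# If the old excess was a pure statistical fluctuation, the extra 4 fb^-1
# is just independent noise; see how often the significance still grows.
trials, grew = 200_000, 0
for _ in range(trials):
    new_noise = random.gauss(0.0, 1.0) * math.sqrt(L_new - L_old)  # summed new data
    z_new = (z_old * math.sqrt(L_old) + new_noise) / math.sqrt(L_new)
    grew += z_new > z_old
print(f"pure fluctuation: chance the significance still grows ~ {grew / trials:.2f}")
```

In this toy model a fluctuation still grows stronger with more data roughly one time in six, which is why the growth alone is suggestive but not decisive.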
Marcus C Thomas,
I think the "what is" versus "what could be" distinction Dijkgraaf is making is not one of testable vs. speculative. His next sentence, "There will be many more fundamental laws of nature hidden within the endless number of physical systems we could fabricate out of the currently known building blocks", indicates that what he has in mind is more like:
“what is”: the elementary particles and interactions (e.g. the SM).
“what could be”: complicated systems with new emergent behavior that we can make out of elementary particles.
Dijkgraaf and the theorists at the IAS are basically quantum field theorists, take QFT as “fundamental” and are interested in finding new ideas about QFT and new things to do with it. To me he’s basically saying the time for trying to find something new about elementary particles using QFT is over, the future is in new QFT ideas that could describe emergent behavior in complex systems, with practical implications.
Anon,
Thanks!
Is there another experiment able to compete on this measurement (Belle II)?
Two competitors seeing the same thing tends to be more convincing than one experiment’s error bars.
From a theorist’s point of view, other evidence would be a compelling model that explains this as well as other anomalies. I gather the ambulance chasing has already started; I’ll wait and see what that leads to.
Hi Peter,
For this particular measurement, yes, it’s Belle II. There are other similar ratios which also have tensions with the Standard Model and can likely be measured by the other LHC experiments. For example:
https://hflav-eos.web.cern.ch/hflav-eos/semi/spring19/r_dtaunu/rdrds_rds_spring2019.pdf
Plenty of ambulances have been chased already (see the references in the paper). Compelling is subjective, but they’re reasonably conventional BSM theories like leptoquarks or Z’.
I was also under the impression that the LHCb excess has a more encouraging pedigree than the diphoton excess, which reportedly seemed from the outset about as random as it turned out to be.
But, as with other BSM ideas and their myriad possibilities, there are diverse ways to conjure up leptoquarks. Maybe too many. A sense of plausibility, or of what about the SM is crying out for that particular addition, is almost entirely lacking from the popularized accounts that I’ve been able to find. Are mass differences between the generations enough of a reason to expect anything like it? And so on. Seems unwise to get too excited, or to crank up the siren on the ambulance just yet.
Dijkgraaf and I belong to the same generation and were, I believe, shaped by the same event: the success of CFT in 2D statistical physics (he was of course also shaped by the first superstring generation, something I never was because I didn’t understand it). Dijkgraaf made real contributions and comes from the Netherlands, the Mecca of statphys, whereas the train had already left the station by the time I figured out what was going on.
In the context of 2D statphys it does make sense to talk about what can be and what cannot. Here, we know that some things are (the Ising and somewhat more complicated models), but we also know what can be (the discrete unitary series) and what cannot be (values of c in between). This is a fundamental success which I think is underappreciated, even if 2D statphys admittedly has a limited domain of applicability.
Alas, applying a similar attitude to SUSY, which does not have the underpinning of a theory which is mathematically deep and empirically successful, makes no sense to me.
Thomas Larsson,
I think you’re right that Dijkgraaf’s point of view has been shaped by starting out in 2d CFT/string theory back in the mid 1980s. He started on his Ph.D. around the time of the first superstring revolution of 1984. At the time the excitement about string theory was driven by hopes to get a unified theory out of it, something that never worked out. From the beginning Dijkgraaf worked not on this failed project of string theory unification, but on doing other things (e.g. CFT, TQFT), often motivated by string theory, that had applications elsewhere (2d stat mech, topology). So, to the extent that his current point of view is that one should give up on unification and find QFT applications elsewhere, it’s consistent with his work from the beginning.
Dijkgraaf says: "There will be many more fundamental laws of nature hidden within the endless number of physical systems we could fabricate out of the currently known building blocks."
This sentence seems to be talking about condensed matter physics. I’d be really happy if Dijkgraaf openly admitted high-energy physics has been stagnant for decades, while condensed matter physics is full of exciting discoveries. I’m trying to convince young theoretical physicists of this myself. But he seems reluctant to come out and say this openly.
I’m also not sure what his definition of “fundamental laws” is here, or “fundamental physics”. If he wants to redefine them to include the approximately true but mathematically deep laws governing some condensed matter systems… well, there’s a case to be made for that. But I think to do it he should come out and say the words: “condensed matter”. Otherwise it’s all rather mysterious.
Since I am presently reading Chiara Marletto’s new book about counterfactuals, maybe that’s what he is referring to with the “could be”? The argument is in a nutshell that what you can do with a system is an important property of the system in and by itself. Not sure what Dijkgraaf has to do with that though, so maybe it’s not what he meant.
As to the 3.1 sigma. They need some semi-discovery that the present collider just about can’t confirm to argue we need a bigger collider. Might be this or another one.
Oh, and just in case any young people are reading this, let me add I agree with John about condensed matter physics. It’s interesting and cool, and it’s unfortunate that it has a reputation of being somewhat dull. There’s much to be done there and it’s almost certainly a better research direction at the moment than high energy physics.
On the 3.1 sigma excess in B+ → K+e+e-… the detector is not e/mu symmetric at all… very, very different acceptances/efficiencies. And electrons emit bremsstrahlung. So the observed number of K+e+e- events is a factor of 2 to 3 smaller than K+mu+mu-, and K+e+e- has a broader signal shape, and extends over more background.
Then simulation is used to correct back for that… and LHCb claims (systematically) 1% error in that correction.
In, say, 1970, this claim would not be accepted. But simulation tools are far more powerful today.
K+e+e- ends up being quite a bit nearer to the floor of background, due to lower acceptance*efficiency and bremsstrahlung smearing.
The kaon system has an awful lot of higher resonances that were part of the particle explosion of the 1960s, prior to the acceptance of quarks… the phenomenology of those resonances is hardly airtight. Predicting the coupling of those resonances to the bag of partons that appears after the quark-level decay b → s l+ l- seems to me to be a big challenge.
Then, if some of the decay products of the higher resonances are missed by LHCb, they’d appear as background under the bremsstrahlung tail of the K+e+e-. Getting that background wrong by a few percent would put them down to 2 sigma, and 10-20% would explain the entire effect.
So, either lepton non-universality, or difficult-to-model Regge-pole-type resonances from the 1960s… which would Carl Sagan say needs the more extraordinary evidence? But it’s good that LHCb is getting out near the envelope of what we understand.
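To make the last point concrete, here is a back-of-the-envelope sketch (with invented round numbers, not LHCb’s actual yields or analysis; the uncertainty is held fixed for simplicity) of how an unmodelled background hiding in the fitted K+e+e- signal would pull the measured R_K back toward the Standard Model value of 1:

```python
# Back-of-the-envelope sketch (invented numbers, not LHCb's analysis) of how
# an unmodelled background making up a fraction f of the fitted K+ e+ e-
# "signal" would shift R_K = N(K mu mu) / N(K e e) back toward the SM value 1.
rk_measured = 0.85   # roughly the reported central value
sigma_rk = 0.048     # roughly the reported uncertainty (0.15 / 0.048 ~ 3.1 sigma)

for f in (0.00, 0.03, 0.05, 0.10, 0.15):
    # Removing a fraction f of the electron "signal" raises the ratio.
    rk_corrected = rk_measured / (1.0 - f)
    deviation = (1.0 - rk_corrected) / sigma_rk
    print(f"unmodelled e+e- background {f:4.0%}: "
          f"R_K -> {rk_corrected:.3f}, ~ {deviation:.1f} sigma from the SM")
```

With these made-up numbers, a few percent of unaccounted background brings the deviation down to roughly 2 sigma, and about 15% removes it entirely, consistent with the estimate above.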
Wandering mudhen,
Thanks for the cautionary comments. I think it’s correct to summarize the situation as "this is an extraordinary claim, without, for now, extraordinary evidence".
Peter,
you write “It fits well with the depressing increasingly popular view of the field, as one which had a great run during the twentieth-century, but now has reached an end.” Sorry, HEP will reach an end only if there is no physics beyond the standard model (which indeed seems more and more likely) AND once the origins of the parameters of the standard model are understood.
The origins of the parameters are still in the dark, completely. In your blog I have never read about any attempt to explain their origin – neither in your posts nor in the comments. This needs to change. There is still fun ahead, even if there is a desert and nothing beyond the standard model.
André,
I don’t disagree. What Dijkgraaf is doing is refusing to admit that string theory has failed, trying to act like it was a success, providing us with a “new way of doing science”. This unfortunately discourages anyone from continuing to try and solve these problems for real.
"We are living in a golden era and the best is yet to come": all Standard Model particles have been seen, the LHC discovered no new physics, no next collider is underway because costs got so high, and no smarter muon collider has been planned. These multiple issues suggest that maybe we are living in the historical era where hep-ex reached its end.
I wonder if this possibility can be openly discussed, given that members of the last center that explores high energy physics "shall conduct themselves with due regard to the interest and proper functioning of the Organization", must "refrain from any act or activity … which would be morally or materially prejudicial to the Organization", and "to express their personal opinions on matters connected with the functioning of the Organization or its activities shall first obtain the written authorisation of the Director General".
I think there is little to no possibility of having a truly open community discussion on the future of HEP, as it must involve both thinking and voicing "the unthinkable" (which means different things to different people).
CERN would be well advised to invite younger physicists who did not grow up believing that the LHC would be the answer to the future of the field. For example, there are a number of active efforts initiated by theorists that have led to new experimental initiatives to probe a variety of dark matter candidates. Almost all of these initiatives are driven by folks under the age of 45.
At the end of the day, none of the grand challenges of particle physics have actually gone away. It is abundantly clear that specific directions that have been religiously pursued such as string theory and low energy supersymmetry have gone nowhere. The junior folks, especially in phenomenology, pretty clearly understand this. If CERN wants to hear about the future of the field, they would be better off inviting these folks to give such seminars rather than people who have been cheerleaders for failed efforts.
Wandering mudhen, you describe the challenges of the measurement well. That’s why it takes so much time to update this measurement. Once we understand the simulation, we test the measurement at the J/psi resonance and get 1, as expected from electromagnetism. As we have 100 times more data there, we can plot that ratio against any variable of interest. See Figs. 3, 9 and 10 of https://lhcbproject.web.cern.ch/lhcbproject/Publications/LHCbProjectPublic/LHCb-PAPER-2021-004.html : it’s flat. In particular, the opening angle of the leptons is crucial. The bottom line is that the B is so much boosted at the LHC that differences in the B frame between the J/psi mass range and the signal 1-6 GeV range are washed out. Once the J/psi is understood, we test the Psi(2S) and get 1 again. Only then did we look at the dilepton mass range of interest.
As for the K resonances, they affect the signal region and the Psi regions similarly, so we can test our understanding on the control samples. But indeed we improved our modelling of these backgrounds compared to the previous analysis. The net effect on R_K is negligible.
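For readers who haven’t followed the analyses, here is a rough sketch of how such a control mode can enter (a generic double-ratio construction with placeholder names and made-up yields, not LHCb’s exact procedure or numbers): the efficiency-corrected muon/electron ratio in the rare mode is divided by the same ratio measured at the J/psi, so that e/mu efficiency mis-modelling common to both regions largely cancels.

```python
# Generic double-ratio sketch (placeholder names/values, not LHCb's exact
# procedure): normalizing the rare-mode mu/e ratio to the J/psi control mode
# cancels e/mu efficiency mis-modelling that is common to both regions.

def double_ratio(n_rare_mumu, n_rare_ee, n_jpsi_mumu, n_jpsi_ee,
                 eff_rare_mumu, eff_rare_ee, eff_jpsi_mumu, eff_jpsi_ee):
    rare = (n_rare_mumu / eff_rare_mumu) / (n_rare_ee / eff_rare_ee)
    jpsi = (n_jpsi_mumu / eff_jpsi_mumu) / (n_jpsi_ee / eff_jpsi_ee)
    # The J/psi ratio should be 1 (lepton universality holds there), so any
    # deviation from 1 measures residual efficiency bias and divides out.
    return rare / jpsi

# Illustrative call with made-up yields and efficiencies; prints 0.85,
# i.e. a muon deficit in the rare mode with the control mode at unity.
print(double_ratio(850, 500, 100000, 50000,
                   0.02, 0.01, 0.02, 0.01))
```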
Hi,
an Anon above seems impressed that the significance of the R_K excess grew over time. This is a very common circumstance for null results when experimentalists underestimate their systematic uncertainties. The latter may come from a number of sources, from underestimates of the confidence intervals of internal parameters used in the inference-extraction procedures, to subtle detector effects. We have seen so many of these over the past 40 years that it is just bad propaganda to claim that the growth of a significance is additional evidence for the effect being genuine.
Cheers,
T.
On slide 13, R. Dijkgraaf claims that the graviton was proposed in 1915 and discovered in 2015.
Well… this whole presentation looks like some slick corporate showcase.
Krzysztof,
I noticed that during the talk, thought it was quite strange for that mistake to be there.
Hi again Peter,
so the comment by Anon stimulated me to write about systematic uncertainties and other things, and how one should still be very careful with anomalies that “grow with the data” – https://www.science20.com/tommaso_dorigo/another_3_sigma_fluke_from_lhcb-253707
Cheers,
T.
And thanks for linking it from the post!
Cheers,
T.
Before the talk: you are disappointed because you predict he is going to oversell the future of HEP. After the talk: your prediction turned out to be wrong, but somehow you are still disappointed.
Lorenzo Di Pietro,
I wasn’t predicting Dijkgraaf would oversell the future of HEP; I was wondering how HEP fit into his argument that science is moving away from "what is" questions. On the LHCb announcement, there is some danger that it’s being oversold, but that’s a separate question, and I also had no idea whether Dijkgraaf would mention it.
I was surprised that Dijkgraaf’s talk to the CERN audience dealt with the future of HEP by not mentioning it at all, essentially implying that it has no future worth talking about.
Speaking of bets and leptons, are there any bets, to be decided very soon, on what Fermilab is about to say about muon g-2? Bets on the number itself, or its standard deviation? Or even guesses?
Doug McDonald,
I’ve seen no bets. I’m a little surprised expectations are so high about this. Maybe I’m wrong, but my impression was that this upcoming announcement is going to be about their Run 1 data, which was of similar size to the earlier Brookhaven experiment. So, no expectation of significantly better precision or more conclusive (5 sigma level) ruling out of the SM prediction. We’ll see soon…
Dear Dr. Woit: https://en.wikipedia.org/wiki/Lepton#lepton_universality_anchor says that the BaBar and Belle experiments also found deviations from lepton universality similar to those LHCb did, but the statistical significance did not meet the 5 sigma threshold. When two other experiments possibly found deviations from the Standard Model prediction, does the latest LHCb announcement sound so improbable?
Sundar,
The problem is that there are a large number of possible decays you can measure and check for lepton universality, and if you measure a lot of them, statistically you expect to see deviations in some of them, even if lepton universality is correct. You need something more than a random-looking pattern of some decays satisfying lepton universality, some not, at levels not big enough to be convincingly significant.
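A quick toy illustration of this "look-elsewhere" point (the channel count and threshold below are invented for illustration, not a real global fit): even if lepton universality holds exactly in every channel, measuring enough independent ratios makes a ~3 sigma outlier somewhere fairly likely.

```python
# Toy look-elsewhere illustration (invented numbers): the chance that at
# least one of many independent, SM-true measurements fluctuates by 3 sigma.
import random

random.seed(1)
n_channels = 20        # hypothetical number of independent ratios measured
n_trials = 100_000
threshold = 3.0        # "3 sigma" in units of each measurement's uncertainty

hits = 0
for _ in range(n_trials):
    # Each channel fluctuates around the SM value with a unit Gaussian error.
    if any(abs(random.gauss(0.0, 1.0)) >= threshold for _ in range(n_channels)):
        hits += 1

print(f"P(at least one >= 3 sigma excess among {n_channels} channels) "
      f"~ {hits / n_trials:.3f}")
```

With 20 hypothetical channels the chance of at least one 3 sigma excess comes out around 5%, which is why a single such deviation, without a coherent pattern, is not convincing.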
Hi,
the talk is typical Dijkgraaf for public ears – saying a lot without saying much. I like the more honest ones where Robbert is a cheerleader for stringy maths. And don’t get me wrong, I like him very much. He’s always been for experiment and testable consequences of a theory, next to other more stringy things.
However, I am not sure what the point was of the ocean of what could be. I first imagined RD would talk about the (endless) possibilities coming out of strings and the like – and perhaps discovering exactly the five centimeters next to the map. But then he explained how our knowledge leads to the engineering of what there could be. Which plays well with stringy engineering, but needs the very important addition that it’s purely theoretical knowledge. But RD was talking about "unexplored physical phenomena" without any qualification of their kind. And so the HEP community would hear that they may continue on their track of smashing what there is to discover what there could be. And the theoreticians would hear that they might continue playing around with their models, deriving what they could lead to. That is but a description of the current (and much criticized) state of these particular areas of physics today.
It was announced as a talk about the “future” but it was rather a motivational speech to those whose hope is fading into the shadows of other exciting topics in science that are not about fundamental physics.
I remember seeing a Strings’XX talk by Strominger about the "fun" and adventures on the sea: string theorists were sailing like Magellan toward unknown shores. And certainly the discoveries would be similarly significant. And perhaps they already are, but not in the way of the actual ‘golden era’ of HEP. They have not discovered the promised new continents so far, but saw many Magical things and discovered new Monsters.
The golden era ships were sailing along the river, the shores always in sight. Out into the ocean, let’s not forget, many sailors got lost and eaten by the monsters.
It might be helpful to step off the sailing ship for a moment onto the ground. To look for the five centimeters, or something else.
And btw – does anyone know who’s the inquirer at 1:18:18 ? 🙂
Cheers, M