Sabine Hossenfelder has a new piece out, making many of the same arguments she has been making for a while about the state of fundamental theory in physics. These have a lot in common with arguments that Lee Smolin and I were making in our books published back in 2006. The underlying problem is that the way theorists successfully worked up until the seventies is no longer viable, with the Standard Model working too well, up to the highest energies probed:
The major cause of this stagnation is that physics has changed, but physicists have not changed their methods. As physics has progressed, the foundations have become increasingly harder to probe by experiment. Technological advances have not kept size and expenses manageable. This is why, in physics today, we have collaborations of thousands of people operating machines that cost billions of dollars.
With fewer experiments, serendipitous discoveries become increasingly unlikely. And lacking those discoveries, the technological progress that would be needed to keep experiments economically viable never materializes. It’s a vicious cycle: Costly experiments result in lack of progress. Lack of progress increases the costs of further experiment. This cycle must eventually lead into a dead end when experiments become simply too expensive to remain affordable. A $40 billion particle collider is such a dead end.
I have a somewhat different view about a potential next collider (see here), but agree that the basic question is whether it will be “too expensive to remain affordable.”
What has happened over the last forty years is that the way HEP theory is done has become dysfunctional, in a way that Hossenfelder characterizes as follows:
Instead of examining the way that they propose hypotheses and revising their methods, theoretical physicists have developed a habit of putting forward entirely baseless speculations. Over and over again I have heard them justifying their mindless production of mathematical fiction as “healthy speculation” – entirely ignoring that this type of speculation has demonstrably not worked for decades and continues to not work. There is nothing healthy about this. It’s sick science. And, embarrassingly enough, that’s plain to see for everyone who does not work in the field.
This behavior is based on the hopelessly naïve, not to mention ill-informed, belief that science always progresses somehow, and that sooner or later certainly someone will stumble over something interesting. But even if that happened – even if someone found a piece of the puzzle – at this point we wouldn’t notice, because today any drop of genuine theoretical progress would drown in an ocean of “healthy speculation”…
Why don’t physicists have a hard look at their history and learn from their failure? Because the existing scientific system does not encourage learning. Physicists today can happily make career by writing papers about things no one has ever observed, and never will observe. This continues to go on because there is nothing and no one that can stop it.
This story brings up a lot of complex issues in the philosophy and sociology of science, but to me there’s one aspect of the problem that is relatively simple and deserves a lot more attention than it gets: how do you get theorists to abandon failed ideas and move on to try something else?
The negative LHC results about SUSY have had some effect, but even in this case it’s remarkable how many theorists won’t abandon the failed idea of a SUSY extension of the Standard Model. This was always a highly dubious idea, explaining nothing about the Standard Model and adding a huge number of new degrees of freedom and more than a hundred new undetermined parameters. Not seeing anything at the LHC should have put the final nail in the coffin of that idea. Instead, I see that this past fall MIT was still training its graduate students with a course on Supersymmetric Quantum Field Theories. You can try and argue that SUSY and supergravity theories are worth studying even if they have nothing to do with physics at observable energies, but it is a fact that these are extremely complicated QFTs to work with and have explained nothing. Why encourage grad students to devote the many, many hours it takes to understand the details of this subject, instead of encouraging them to learn about something that hasn’t been a huge failure?
The techniques one gets trained in as a graduate student tend to form the basis of one’s understanding of a subject and have a huge influence on one’s future career and the questions one has the expertise to work on. Besides SUSY, string theory has been the other major course topic at many institutions, with the best US grad students often spending large amounts of time trying to absorb the material in Polchinski’s two-volume textbook, even though the motivation for this has also turned out to be a huge failure, arguably the largest one in the history of theoretical physics.
To get some idea of what is going on, I took a look at the current and recent course offerings (on BSM theory, not including cosmology) at the five leading (if you believe US News) US HEP theory departments. I may very well be missing some offered courses, but the following gives some insight into what leading US departments are teaching their theory students. Comparing to past years might be interesting; possibly there’s a trend towards abandoning the whole area in favor of other topics (e.g. cosmology, quantum information, condensed matter).
- Harvard:
Fall 2019
PHYSICS 283B: Spacetime and Quantum Mechanics, Total Positivity and Motives
PHYSICS 287A: Introduction to String Theory
Spring 2020
PHYSICS 211BR – Black Holes from A to Z
PHYSICS 287BR – The String Landscape and the String Swampland
- Stanford:
No courses beyond QFT in 2019/20
- Caltech:
No courses beyond QFT in 2019/20
[Actually, see comments, there’s Physics 230a last fall, which may or may not have been a course on SUSY models]
- Princeton:
Fall 2019
Phy 540 Strings, Black Holes and Gauge Theories (Klebanov)
Spring 2020
Phy 540 Strings, Black Holes and Gauge Theories (Polyakov)
- MIT:
Fall 2019
8.831 Supersymmetric Quantum Field Theory
Spring 2020
8.851 Effective Field Theory
The places not offering string theory courses this year seem to have had them last year.
Update: Something relevant and worth reading that I think I missed when it came out: Jeremy Butterfield’s detailed review of Lost in Math, which has a lot about the question of why theorists are “stuck”.
Update: There’s some serious discussion of this on Twitter. For those who can stand that format, try looking here and here.
Update: Mark Goodsell has a blog posting about all this here, including a defense of teaching the usual SUSY story to graduate students.
Update: A correspondent pointed me to this recent CERN Courier interview with John Ellis. Ellis maintains his increasingly implausible defense of SUSY, but he’s well aware that times have now changed:
People are certainly exploring new theoretical avenues, which is very healthy and, in a way, there is much more freedom for young theorists today than there might have been in the past. Personally, I would be rather reluctant at this time to propose to a PhD student a thesis that was based solely on SUSY – the people who are hiring are quite likely to want them to be not just working on SUSY and maybe even not working on SUSY at all. I would regard that as a bit unfair, but there are always fashions in theoretical physics.
If there were anything else, people would move on.
It’s exactly because high-energy physics as we have known it no longer produces new results that increasingly imaginative ideas are being tested.
Defeatists have not found viable alternatives.
Warren Siegel,
I agree that if there were a readily identifiable good alternative, people would move to it. In the past, experiment would point in the right direction. That’s gone, and it’s a very hard problem to replace it and find other routes to new ideas.
What isn’t so hard though is to recognize when ideas don’t work, and doing so isn’t “defeatist”. The problem isn’t “increasingly imaginative ideas”, it’s going along with increasingly bad justifications for continuing to pursue failed research programs instead of acknowledging the obvious.
The courses for advanced graduate students tend to be more fluid than the core courses. Sometimes an advanced topic is offered every other year (for instance General Relativity). Some of the more esoteric topics can be even less frequent. Also the course might not be called “String Theory” but “Special Topics in XYZ…” And occasionally students will just arrange a weekly meeting with a professor to learn a subject in depth. I wouldn’t put much weight on courses appearing or disappearing from the course catalog.
The problem is driven by physics, not by sociology. Simple explanation in natural units:
(collider radius) ≈ (4π/alpha)^3 (collider energy)/(electron mass)^2.
Collider technology remains based on old physics (electrons and electromagnetism), because the heavier particles discovered at colliders have found almost no practical use, not even for building better colliders. This is the key point. The rest follows: colliders become big, slow, expensive. Theory detaches from experiment.
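A rough numerical check of this scaling (a sketch, not part of Strumia’s comment: it assumes the LHC beam energy of 7 TeV, uses ħc to convert the natural-units answer to metres, and reads the left-hand side loosely as the machine’s overall size):

```python
# Order-of-magnitude evaluation of (4*pi/alpha)^3 * E / m_e^2.
from math import pi

alpha = 1 / 137.036    # fine-structure constant
m_e   = 0.511          # electron mass in MeV
E     = 7e6            # beam energy in MeV (7 TeV, the LHC's)
hbarc = 1.97327e-13    # MeV*m, converts 1/MeV to metres

size = (4 * pi / alpha) ** 3 * E / m_e ** 2 * hbarc
print(f"collider size ~ {size / 1e3:.0f} km")  # ~27 km, the LHC ring scale
```

The estimate lands at about 27 km for 7 TeV, the scale of the LHC ring, and since it grows linearly with energy, each further step up the energy frontier means a proportionally bigger and more expensive machine.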
I regard string theory as etudes for physicists, like the etudes that exist for piano players. It’s good for improving technique (mathematics, in our case) but it is not “the real thing”, because string theory is in principle a quantum theory of extended objects, which is not conceptually a new idea or even revolutionary: technically difficult but conceptually almost trivial. We have at the moment no one capable of producing a masterpiece the way Bach or Chopin did in music, for example. We are just practicing and preparing for it.
But that’s better than nothing.
I would not condemn string theory in total: learning its technical aspects (etudes) may one day enable us to find the true masterpiece.
Alessandro Strumia,
Yes, it’s clear that the underlying problem is the physics that makes probing higher energy scales more and more expensive. This creates a different problem, with sociological aspects: once they’ve detached from experiment, what happens to HEP theorists and the field of HEP theory?
Some possible reactions to this problem are:
1. Ignore it, refuse to acknowledge it publicly, and keep on pursuing previously popular research programs that have failed. Keep training new generations of students in the complexities of SUSY or string theory.
2. Give up, abandon HEP theory for another healthier field (e.g. cosmology, condensed matter, quantum information theory, machine learning).
3. Find some route to new, more promising ideas.
I’d argue that 3 is still viable. A big problem is the historical sociology of the field has become dysfunctional. It emphasized concentration on a small number of questions, driven by experiment providing the right question. This falls apart when experiment stops providing the right promising question for everyone to focus on.
Mark,
I think that for quite a while post-1984 you could sensibly make the argument that while the string unification conjecture was a failure, there was a lot to be learned from the deeper study of string theory and it was worthwhile for people to pursue that. Unfortunately, over the past 20 years or so, progress in learning new things from the deeper study of string theory has pretty much come to a halt. Keeping on doing the same things over and over, while waiting for a genius to save you from yourself, is not a good plan. For one thing, up-and-coming geniuses who take a look at a field and see that going on will flee and look for another field in which to exercise their genius.
Hi Peter,
Thanks for mentioning. The piece actually isn’t new, it’s a reprint of a blogpost I wrote last year.
Butterfield has misstated my position on some issues. I have a brief response to this here.
Peter,
PHYSICS 283B: Spacetime and Quantum Mechanics, Total Positivity and Motives looks like a very interesting course for new students to take. Does it involve any promising ideas, or is it just the old failed ones?
I’m also curious about your take on pursuing deep mathematics as a promising way to progress. Can you elaborate on that?
Where do you disagree with Sabine about pursuing deep mathematics, given her “Lost in Math” critique?
Cut the funding. I mean that as an idea, not the solution: the solution is whatever makes it possible to put a financial limit on the string theory community and the like.
I suggest this because, I believe, the true problem is the(!) money that keeps pouring into that kind of bogus research.
And scarce resources can be a spur to creativity.
Theoretical work had better be focused on technologies oriented toward experiments and observations, guiding experimentalists on what they could do (what observations and experiments they could think of), instead of making unnecessary abstract assumptions about how the universe should be (parallel worlds, holography, strings, etc.).
Akhil,
That’s an unusual course. Arkani-Hamed was visiting Harvard, it was an opportunity for him to explain his current research program in detail. I’ve written about this “amplitudes” research many times here on the blog.
There is a great deal in my “Not Even Wrong” book about the relation of deep ideas in mathematics to the Standard Model.
For my disagreements with “Lost in Math”, see my review of the book.
SparkTech,
The problem isn’t a lack of focus on experiment, it’s the lack of experimental anomalies that could point to a way forward. Experimentalists and theorists have worked hard to change this, but nature is not cooperating. The problem is how to make progress given this situation.
Theorists need to work with abstract assumptions about how the universe should be, but they need to do a better job of rejecting ones that fail to lead anywhere and coming up with new ones to try.
Cutting off funding to failed research programs would be helpful, but what really matters is how the leading figures in the field deal with the problem of no progress. Funding decisions should come from the people in the field, not from people who don’t understand the problems theorists are grappling with. Hossenfelder is doing a good job of trying to get theorists to face up to the problem; in some sense what we need is a more serious response to her challenge.
Just saw this
https://twitter.com/JimBaggott/status/1217011515385139201
which makes the same point.
One of the problems I see is the concentration of power in certain fields within theory departments around the world, but particularly in the US.
“Just cut off the funding for field X” doesn’t work if the people deciding over funding in theory departments are all from field X.
I think physics would need to revise its publication culture and get closer to that of mathematics: way fewer papers, but each idea worked out in full detail.
There are interesting alternative approaches out there which manage to resolve some of the conceptual issues. Non-commutative geometry, for example, seems interesting, although it lacks a Lorentzian formulation as far as I can tell. E8 theory is another one. Causal Fermion Systems, which can explain the 3 generations of fermions, belongs to that set as well.
Then there is also interesting new work being done on the foundations of quantum mechanics. Ellis’ approach claims the macro aspects of the world are as real as the micro aspects, resolving the measurement problem by top-down causation.
Fröhlich’s Events, Trees, Histories (ETH) approach to quantum mechanics resolves the measurement problem by providing a sharp definition of events, in the process abandoning unitary evolution (except as a non-interacting limiting case).
What all of these have in common is that they require deep mathematics which unfortunately scares away many physicists, even theoretical ones, from truly engaging with them.
@Peter Woit
Thanks for your answer and for the time it took. You certainly have better hard data than I do to judge this.
Your statement “Funding decisions should come from the people in the field […]” is perfectly reasonable, and I have accepted it myself for some time now.
However, the way I see it (OK, call it naive if you like): if we keep financing the string theory community (for example) the same way we have so far, then string theorists will do what they have already done over the past decades. Is there any evidence for a paradigm shift? Maybe occasionally some scientists switch fields (but as I said, you have better data).
As for nature not cooperating (again, I believe you have better knowledge of this), I also believe it is a lot easier to cook up complex mathematical theories about quantum foams than to think of better technology to extract new data from some quasar that is very far away. Limiting the funding for that theoretical juggernaut will force them to squeeze their “theoretical” brains toward a more empirical approach.
I also agree that theorists need to work with abstract assumptions about how the universe should be, but there must be a limit: the pathology now is an inflation of theories that explain nothing, since there was little to no observation or experiment to start from in the first place.
Peter and others:
How come universities (with theory departments) are not offering dedicated courses in neutrino physics? That is the only place where there is “supposedly” evidence for physics BSM. Or is it that the “breakthroughs” in neutrino physics have taught us nothing?
Claudio Paganini,
Yes, if funding, hiring, training of graduate students, and other such decisions are all in the hands of people working in field X, any significant change requires changing the thinking of those in field X. The terminology “field X” for the dominant field in formal theory in the US is a very good one since, while people in this field often call themselves “string theorists”, most of the best ones are not doing string theory. While they have often moved on to other things, the problem is that they are still advertising string theory to the public, training graduate students in string theory, and evaluating new research directions based on whether they somehow follow from the string theory research program of the past.
SparkTech,
The problem is that string theorists have already been under huge pressure to show that their research connects to experiment, and all this has led to is even more dubious research directions like the string theory landscape or the currently popular “swampland” program. What’s needed is acknowledgement that this is a failed idea that can never connect to experiment, not more effort to find a way to do so.
Shantanu,
Pretty much every institution teaches courses on QFT, leading up to the Standard Model QFT, and in many of these I would expect that there is some discussion of neutrino masses and possible extensions of the Standard Model involving them.
But I don’t think there has been much in the way of either promising new theoretical ideas or unexpected experimental results on this front. If forthcoming neutrino experiments turn up something unexpected, I think courses on the topic would quickly become much more popular.
You could be right. My bet: that acknowledgment will never happen.
As you said, there is already evidence for what I believe (more dubious research in spite of huge pressure, etc.; I mean, the right trend is not there). And, human nature being what it is, who could acknowledge that he or she spent the last 20 years on bogus physics? This is rather consistent with my small sample of string theorists claiming that string theory did make significant progress in recent decades. And, as you mentioned already, it is not only them: we also have many-worlds/multiverses, etc.
I mean: you, Sabine, and Lee Smolin are doing a great job, and it could bring hundreds of others to be just as vocal. I’m still waiting to see that happen. I only say it may not be enough.
Again, thanks for your patience and time,
and I wish you a nice day!
Sabine says:
“Instead of examining the way that they propose hypotheses and revising their methods, theoretical physicists have developed a habit of putting forward entirely baseless speculations.”
But didn’t this method work wonderfully well between 1900 and 1975?
Planck put forth the entirely baseless speculation that light came in chunks of discrete energy (and won a Nobel Prize).
Dirac put forth the entirely baseless speculation that the entire universe was a sea of electrons that had both extra electrons and holes (and won a Nobel Prize).
Geoffrey Chew put forth the entirely baseless speculation that the “bootstrap method” could explain the particle zoo (and did not win a Nobel Prize).
The difference today is that we don’t have experiments that rule out the worthless entirely baseless speculations and confirm the entirely baseless speculations that happen to be correct. Experiments have been replaced by some complicated social process that selects some subset of the entirely baseless speculations that physicists come to a consensus on believing, but which seems to do a terrible job of separating the correct baseless speculations from the incorrect ones.
“Experiments have been replaced by some complicated social process that selects some subset of the entirely baseless speculations that physicists come to a consensus on believing, but which seems to do a terrible job of separating the correct baseless speculations from the incorrect ones.”
An excellent summary of the problem…
Peter, Alessandro,
The idea that one needs high energies to probe new physics is simply wrong. To begin with, I don’t have to tell you that if there’s new physics at high energies, it would also change low-energy predictions, so higher precision can replace higher energies.
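A standard effective-field-theory estimate makes this precision-for-energy trade quantitative (a sketch of the usual counting, with an assumed O(1) coefficient c, not anything more specific than that):

```latex
% New physics at mass scale \Lambda, once integrated out, typically
% shifts a low-energy observable O measured at energy E by a
% dimension-six correction:
\[
\frac{\delta O}{O} \;\sim\; c\,\frac{E^2}{\Lambda^2}, \qquad c = \mathcal{O}(1),
\]
% so a measurement with fractional precision \delta is sensitive to
% scales up to
\[
\Lambda \;\sim\; E\,\sqrt{c/\delta}.
\]
```

On this counting, improving the precision of a fixed-energy measurement by a factor of 100 buys the same reach in Λ as raising the energy by a factor of 10.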
But maybe more importantly, it is generally an unjustified assumption that the next breakthrough in the foundations will come from going to higher energies or shorter distances, as opposed to probing other regimes that have been untested so far. My favorite example is quantum effects in many-particle systems. This is imo presently the obvious frontier to push. You may not agree, but that’s not the point. The point is that higher energies are not the only route.
Of course particle physicists don’t want to hear it, but building bigger colliders has clearly run its course — for now. It’s expensive and the scientific benefit is negligible. Colliders are something we will certainly come back to. Maybe in a hundred years or 200 years. (Unless by then hunting with wooden sticks has become ground-breaking technology.) But this is not the right time to throw more money at particle physicists.
Let me also repeat that the problem of lacking experimental input is not decoupled from the lack of progress in theory development. The days in which we could bank on serendipitous discoveries in the foundations of physics are over. “Just look” does not work any more. The string of failed experiments in the past 40 years is evidence for that. This is simply not up for debate, it’s a fact.
It is likewise a fact that theorists’ methods of theory development are not working. Claiming that “it’s physics” does not explain why they are not willing to revise their methods.
Re the supposed lack of alternatives.
First, it’s wrong. There are various alternatives. It’s just that so few people work on them that it’s not even clear how promising they are. They are easy to dismiss because they have open questions simply due to lack of manpower.
But besides that, I am sick and tired of the claim that theorists in the foundations should be allowed to do a crappy job because no one has any better idea what to do. If they don’t know what to do, then we shouldn’t pay them.
Seriously, what kind of attitude is this? In what other profession would you get away with not being able to get anything right for 40 years and then saying you will continue that way because you can’t think of anything better to do?
Sabine,
I agree with much of what you have to say here, though I disagree about the issue of a higher-energy collider. If the only argument for it were “let’s go look, maybe gluinos or extra dimensions will turn up”, I’d agree with you. You’re right that the generic “let’s investigate the 1-10 TeV scale even though we don’t think there’s anything there” argument faces a serious counterargument about the cost. The serious HEP argument, to me, is that the most mysterious and least understood part of the Standard Model is the Higgs sector (some might want to argue for the neutrino sector, but that area of HEP is going ahead with little controversy over its funding, since it is quite a bit cheaper). I don’t believe there’s any way to get better information about the Higgs sector than what we’ll get from the HL-LHC without a new collider.
The problem of course is the cost. If a new collider cost less than \$1 billion there wouldn’t be a lot of opposition to going forward. At \$10 billion and up, the cost is a big problem. For the latest on the leading proposal, see this talk from yesterday
https://indico.cern.ch/event/838435/contributions/3635822/attachments/1967874/3272551/200113_FCC-Status.pdf
The timeline given there (page 15) shows, I think assuming the project gets the go-ahead in the next year or so, a three year “funding strategy” period, and eight years before one starts even digging the tunnel. My understanding is that next week there’s a drafting session scheduled, and May 2020 is the time frame for a specific proposal to be submitted to the CERN Council. I’d argue that it’s best if everyone wait a little while to have a serious argument over the new collider question, until there’s a specific proposal for both what to build and how to fund it on the table. There’s going to be plenty of time for discussion about this before anything much actually happens.
The post had asserted:
“Caltech: No courses beyond QFT in 2019/20”
At Caltech, Physics 230 is seen as the follow-up to the QFT courses.
Alex,
Thanks, I updated the posting. I’d stopped reading the course description after “Advanced methods in QFT” and seeing mention of confinement and non-perturbative techniques in gauge theory, typical Standard Model QFT issues. The course description does say however that a first term version of the course like the fall one would cover
“introduction to supersymmetry, including the minimal supersymmetric extension of the standard model, supersymmetric grand unified theories, extended supersymmetry, supergravity, and supersymmetric theories in higher dimensions.”
and if that’s what the instructor chose to do last fall, it would have been just as bad an idea as the MIT SUSY course.
I’m well aware (Amitabh Lath kind of points this out), that what gets covered in advanced classes like this depends on the instructor and evolves over the years. I’d be interested to know if instructors of courses like this are realizing that the topics covered should be changed.
A bit of research shows that in Fall 2017 when Hirosi Ooguri was teaching Ph 230 at Caltech, the syllabus
http://ooguri.caltech.edu/education/230
covered mostly gauge theory, no SUSY.
In 2015-16, when the course was taught by Kapustin, the fall and winter quarters covered Yang-Mills and non-perturbative QFT, while the spring quarter was devoted to SUSY.
http://www.theory.caltech.edu/~kapustin/Ph230/Ph230.html
I’ll put forward the suggestion here that the techniques one does _not_ get trained in, or worse, get trained to avoid also play a significant role. In fact, I think this is the big structural issue which has lead to the current stall (albiet taking perhaps 50-70 years to do so).
Adam Becker makes a strong case in his recent book that the profession and education of theoretical physicists underwent a sharp transition immediately after the second world war. The volume of physicists (and also other scientists and engineers) spiked, the centre of the profession shifted to the previously placid United States, and educational norms began to emphaise practicalities over philosophical introspection. You could argue that foundational issues were forgotten in the heady days of Big Science.
But, I find it hard to avoid the conclusion that there is more to it. A reading of history suggests that (quantum) foundational issues were buried, becoming not simply unfashionable, but actually taboo. Becker presents several unflattering cases for the record of post-war physics, enough to convince me at least that lack of progress on foundations (and I’d argue any new physics) can be blamed in significant part on purely sociological issues, all physics aside. This is the case in all fields no doubt, but this really does include theoretical physics.
The mid 70’s alway seem to be a sharp cutoff for most assessments of the progress in physics. I’d point to a delayed fallout from the general shift in the profession, as an older differently trained cohort passed the torch to a structurally different generation (though you could point to the SALT treaty too!). A generation which had been trained, institutionally, socially or otherwise to _not_ go looking in certain directions. Maybe it worked for a while, and probably did work for several other fields, but it hasn’t worked for theoretical physics.
The solution I’d advocate is to change the “training”. Make graduate students (indeed all physics students) aware of the problems, and in cases the gaping holes. Make them keenly aware of the history and the controversies and the unexplained, that there is still work to be done, and then let them do it. Tacitly of course this resigns the current generation(s) to “standing and waiting”. Perhaps more controversially, this involves public funding bodies be made aware that the profession is more fallible than generally accepted. In any case, continuing as is only means further ‘Breznevisation’.
Alan Roxdale,
I wrote about Becker’s book here
https://www.math.columbia.edu/~woit/wordpress/?p=10147
where you can see that I very much disagree with the story he pushes about the evil, unthinking Copenhagen orthodoxy and the good Bohm/Everett “quantum rebels”. The last thing theory graduate students need is to be drawn into the tar pit of “interpretations” and sold a moralistic story about how the old fogeys just didn’t realize that many-worlds are the solution to fundamental problems.
Peter, you say “At \$10 billion and up, the cost is a big problem.” Why? Surely the cost of the FCC (or a comparable collider in China) is smaller as a fraction of world GDP today than Fermilab’s cost was relative to the US economy of the late 1960s. By the way, a lot of that \$10 billion estimate is salaries for engineers, which will probably be “in kind” contributions from countries like India and China.
Also, the world is a much more peaceful place today. There are no incendiary-laden B52s heading to Asia. Countries today are content to lob shipping containers at each other and a major world crisis happens when the number of containers decreases by 4%. This is the ideal time to build something like the FCC.
The Standard Model is proving to be a tough nut to crack, I’ll admit. But I do not know of any (experimental high energy) physicist that does not think we will eventually break it. It might take many decades, our grandchildren might be the ones to see the first hints of new physics but it will happen. The alternative is to believe that 2012 was somehow a privileged point in human history after which no fundamental discoveries could be made.
Sabine and Peter,
I find that Sabine’s line of argumentation against a next high-energy collider to be completely unsatisfying. It seems that she is arguing that because the LHC has not turned up new physics (besides the Higgs, which is a major triumph that she seems to downplay), that means that experiment is a failure and that future colliders should not be built. I am happy to see that Peter disagrees, but I would like to add some of my feelings.
For one, we should remember that the LHC would never have happened had it not been for the cancellation of the Superconducting Supercollider (SSC). The LHC currently operates at a center-of-mass energy of 13 TeV with an expected upgrade to 14 TeV (though 14 TeV was the initial “design” com energy). On the other hand, the SSC was a 40 TeV collider, and a reasonable upgrade could have sent it to 50 TeV. This is not insubstantial.
New physics shows up only in very, very subtle hints before one reaches the mass threshold for producing on-shell particles (e.g. a new 10-15 TeV particle would be nigh impossible to detect at the LHC but might show up at the SSC). In the past, theorists of SUSY and other BSM theories have predicted what the thresholds for new phenomena should be, and have been basically wrong. So, justifiably, Sabine argues that the theorists have been on the wrong track in predicting where the thresholds lie. But by that same logic, even the LHC upgrade from 13 to 14 TeV could turn out to open new worlds of possibilities, and the possibilities at 40 TeV would be (and would have been, with the SSC) that much greater. So when Sabine says that “we need new physics,” it is completely unjustified to argue against a new, more powerful collider. Had US politics not steered collider physics as it did, we might now be in a bountiful situation much like that of the 1970s. And writing off this sort of possibility on theoretical grounds is hypocritical if those grounds are the same kinds of theoretical considerations you were just arguing against.
Just to put it in simplified form: we could have had a much more powerful collider, 40-50 TeV instead of 13-14 TeV, years ago, and the costs for such a machine have gone down since then (adjusting for inflation). So the people who told us, based on theory, to expect new physics at some new energy have been shown to be unreliable; but that reasoning for why we should not trust theorists applies _just as well_ to those who are telling us _not_ to expect new physics at the next collider.
Sabine’s ideas on “alternative” kinds of experiments are complete “Hail Mary” ideas — she needs to cite better literature if anyone is to take her seriously when she is going up against a historically proven experimental program basically single-handedly. Even if such ideas had any merit, it doesn’t mean delaying the next collider makes any sense — big science needs to push on multiple fronts, just like ITER (a much more expensive experiment than the LHC) has been trudging along despite developments in fusion that will have made it somewhat out-of-date already by the time it gets finished. But in the field of fusion energy, they have the right mentality in that the big mainstream projects should proceed in parallel with the smaller more speculative ones, and not to put off progress on the big mainstream projects in hopes that a miracle happens in smaller sideline work.
John,
I don’t disagree and made many of the same points myself in the posting and discussion here
https://www.math.columbia.edu/~woit/wordpress/?p=10768
“One piece of advice though is that experience of the past few decades shows you probably shouldn’t listen to theorists.”
Amitabh Lath (and John),
My comment that a \$10 billion cost is a “big problem” was not meant to indicate I have a problem with government spending on that order to fund a collider, or that there is some problem of principle, but just that raising that kind of sum is going to be hard to do.
This isn’t going to get built in the US and the US is not going to provide a large fraction of the cost of a machine built elsewhere. The US budget process is currently a joke, with no clarity as to what will happen next year, much less over a 10-20 year period. Any document promising US future funding would not be worth the paper it was written on. I also doubt the Japanese or Chinese will fund much of a machine built in Europe. Maybe the Chinese will do this in China, but that would require overcoming objections from those like C.N. Yang who feel that China has more pressing needs for spending \$10 billion.
In Europe, CERN has a fixed budget (in fraction of GDP) and while one can reasonably expect it to hold onto that, it’s very unclear to me whether a plan for \$10-\$15 billion in new spending can fit into that budget. I don’t know enough about CERN funding to know whether there’s a realistic prospect of getting commitments for a big budget increase to fund a new machine. I’m guessing we’ll be hearing from CERN leadership in coming months about these issues.
The above is what I meant by “big problem”, I hope it can be solved.
All,
As discussed above, I don’t think further debate over a new collider here now is a fruitful thing to do (and, you shouldn’t listen to theorists anyway…).
Amitabh Lath: If a €1.5 billion telescope was cancelled because it was too expensive, it’s difficult to imagine how physicists will be able to get funding for a $20 billion collider.
And it’s also difficult to imagine how a new collider will give us information about quantum gravity, at the Planck scale. So even if it discover interesting physics beyond the Standard Model, it doesn’t really solve the problem that Sabine’s article brings up.
Peter Shor, either new physics is there within human reach or it is not. If it is, then the biggest hadron collider we can build is obviously the best bet to find it. Nothing beats bigger \sqrt{s}. If you look beyond the headlines (LHC fails to find SUSY) you will see the breathtaking precision with which the searches have been carried out. Yes, we are still smashing Swiss watches together, but now we can identify just about every screw, spring, and sprocket that flies out. This should give you confidence about our ability to find new physics.
As to the comment that “even if it discovers interesting physics beyond the Standard Model, it doesn’t really solve the problem…”, I do not understand this statement. How do we know what problems the new physics can or cannot solve until we find it?
As for $$, yes, but we won’t get it if we don’t ask. The astronomers might have struck out with the Overwhelmingly Large Telescope, but they are getting the Extremely Large Telescope, not to mention the James Webb, the Vera Rubin, etc.
Amitabh Lath: “the biggest hadron collider we can build is obviously the best bet to find it”
To me, this seems not obvious at all. I do not think that “new physics” necessarily requires “new particles”. Sure, if you want to find new physics, you need to somehow explore new parameter space, and larger energies are an obvious candidate, but so are violations of the equivalence principle, massive superpositions, large distance Bell tests, …
There are currently very few reasons to believe that any of those will show new physics, but we need to keep in mind that reasons to expect something new from a collider are equally unconvincing. As theoreticians, our task is of course to work on finding such convincing reasons why something new should happen in one experiment or another, but in the absence of these convincing theoretical reasons all these options must be seen as equally (un)likely to produce anything truly exciting.
The way I understand Sabine, her point is mainly that, because of this fact, we should maybe focus on the easier (i.e. cheaper) areas where new physics could show up first, and I very much agree with that sentiment.
The only potential benefit I see for a new collider is the one that Peter mentions: better understanding of the Higgs sector. Whether or not this is worth the price is in fact not an easy question. The “just look” argument, on the other hand, has not been aging well in my opinion.
André,
One problem with the “let’s do cheap violations of the equivalence principle, massive superpositions, large distance Bell tests, … instead of building a collider” argument is exactly that such things are (on the HEP funding scale) cheap. The issue of funding and doing such experiments really has nothing at all to do with the collider issue since the scales are completely different. The US and Europe each have billion-dollar/year budgets right now being spent on high energy physics, the Simons Foundation is handing out a quarter-billion/year in grants. If proposals for small-scale experiments can convince people they’re worthwhile, they should in principle be able to get funded (if this isn’t possible, the field has a different sort of funding problem that needs to get fixed).
Peter Shor wrote:
“Planck put forth the entirely baseless speculation that light came in chunks of discrete energy (and won a Nobel Prize).”
I can’t let this go by. Planck was trying to explain some experimental data, and he didn’t originally even pay much attention to the fact that his calculation required that light came in discrete chunks.
The simplified story often told to students is that Planck was struggling to deal with the fact that classical electromagnetism combined with statistical mechanics leads to an “ultraviolet catastrophe”: due to the equipartition theorem, a box of classical radiation in thermal equilibrium would have more and more energy density at higher and higher frequencies, as described by the Rayleigh-Jeans law, leading to infinite total energy. Planck showed that if radiation came in discrete chunks this problem would go away.
But in fact Planck wrote his groundbreaking paper in 1900, he didn’t take the equipartition theorem seriously at the time, and the ultraviolet catastrophe was only recognized as a serious problem later. (The term was coined by Ehrenfest in 1911.)
What Planck actually started out trying to do was justify a different formula for the energy density as a function of frequency of radiation in thermal equilibrium, one that seemed empirically correct at high frequencies: the Wien law. He wrote a paper doing this in 1899, but then experiments showed the Wien law was inaccurate at low frequencies, so Planck went back to the drawing board.
In “an act of desperation”, in 1900 he introduced discrete energy levels to get a law that matched the Wien law at high frequencies but differed from it at low frequencies. But he didn’t think much about the meaning of these energy levels. He later wrote that it was “a purely formal assumption and I really did not give it much thought except that no matter what the cost, I must bring about a positive result”.
He got a formula that fit the data, the Planck law, and he was happy with that. Only in 1908, thanks to Lorentz, did he accept the physical significance of the energy quanta he’d almost unwittingly introduced. This was after Einstein wrote his Nobel-winning paper on photons in 1905.
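For concreteness, here are the three spectral laws under discussion, written in their modern textbook form for the energy density per unit frequency (the constants, of course, only took this form after Planck):

```latex
% Rayleigh-Jeans law (classical equipartition; diverges at high frequency):
\[
u_{\mathrm{RJ}}(\nu, T) = \frac{8\pi \nu^2}{c^3}\, k_B T
\]
% Wien law (empirically good at high frequency, fails at low frequency):
\[
u_{\mathrm{W}}(\nu, T) = \frac{8\pi h \nu^3}{c^3}\, e^{-h\nu/k_B T}
\]
% Planck law (interpolates between the two regimes):
\[
u_{\mathrm{P}}(\nu, T) = \frac{8\pi h \nu^3}{c^3}\, \frac{1}{e^{h\nu/k_B T} - 1}
\]
```

The Planck law reduces to the Wien law for hν ≫ k_BT and to the Rayleigh-Jeans law for hν ≪ k_BT, which is exactly the interpolation Planck was after.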
In short, Planck was heavily data-driven, and only later did the conceptual meaning and revolutionary nature of his calculation become clear. For more, try Max Planck: the reluctant revolutionary by Helge Kragh.
Concerning Peter Shor and John Baez: or read the formidable book by Thomas Kuhn, Black-Body Theory and the Quantum Discontinuity, 1894-1912.
Peter Shor/John Baez,
An interesting discussion of Planck, but this has gotten far off topic. Hossenfelder is getting a lot of flak for the “baseless speculation” terminology, but I think she was trying to point to something very specific, where she has a serious case.
Specifically, the problem is that there is currently no well-motivated theoretical reason to expect new physics:
1. At a new e+/e- collider. The examples people give of BSM models containing new states that would not be seen at the LHC but would be visible at a lower-energy lepton collider appear to be contrived; “baseless speculation” is not a bad description. The non-“baseless speculation” case for a lepton collider is as a tool for detailed study of the Higgs.
2. At a proton-proton collider with beam energy 2 (HE-LHC) to 7 (FCC-pp) times that of the LHC. Such a machine would open up the study of a new energy range, but it is true that we have no well-motivated reason to expect something new in that range. Against her, there’s a good argument that one should investigate the new, higher energy range anyway, because doing so is what science is about (not just trusting that your theory extends to cover some unexamined region, but looking to check; something unexpected is quite possible). The problem is that doing this would be very expensive.
There is a nice reply from CLIC (https://arxiv.org/abs/2001.05373) to allegations made by FCC proponents (https://arxiv.org/abs/1912.13466).
There is some seriously naive zero-sum thinking in this post and comments. Peter Shor, André and others all point to cost as the determining factor. Perhaps if science (colliders, telescopes, fusion…) manages to squeeze itself into a small enough box it won’t get its head chopped off.
Projects like the SSC, the LHC and the next one do not live or die by cost alone. Above a certain amount, cost isn’t even the determining factor. The SSC was cancelled not because of minor cost overruns but because Speaker Jim Wright (D-Texas) was forced to resign, George H.W. Bush (a Yankee pretending to be Texan) lost re-election, and Lloyd Bentsen left the Senate. Suddenly Texas had zero support in DC. It’s counterintuitive, but if the SSC had cost more, and that extra spending had been spread around widely to certain strategic states and districts, it could have survived.
If the FCC gets built it will be because of buy-in from the large CERN member states who will be promised major contracts. Italian, French, German, British, and Dutch electrical, mechanical, civil engineering firms will make sure to get their cut. Major code-writing will be done in Eastern Europe. There will be tacit agreement on new facilities in depressed areas, number of jobs, etc.
I understand the need highly numerate people have to believe everything is controlled by numbers, budgets, timelines. But the go/no-go decision on a project like the next hadron collider will rest on something a lot more political and social.
The political background is correct, but the motivation is forgotten: The SSC was supposed to discover the Higgs boson(s) and sparticles, or rule out natural SUSY.
The LHC already did that.
If one is being honest, “because it’s there” is the only well-founded motivation to probe the energy frontier, and one I happen to support (for what it’s worth, which isn’t much). Theoretical considerations have proven to be so pliable and inaccurate, and this has been going on for so long, that it’s very difficult for “outsiders” to take them seriously anymore.
It’s really about time those making these arguments stop assuming interested outsiders are stupid and face the toll history has inflicted on the current state of the field.
I’m going to delete any further arguing about a new collider here, nothing new is coming out of it. For the latest news about this, see this presentation from today:
https://indico.cern.ch/event/838435/contributions/3647164/attachments/1971463/3279599/05-D99-AB-next-steps.pdf
There’s a strategy drafting session next week, results to be made public maybe in March. Even if there’s a positive decision to pursue the idea of a new collider, it looks like this will just be the start of a multi-year process of trying to see if Amit is right and CERN member states will be willing to come up with new funding. An actual decision whether or not to go ahead is not envisioned until 2025/6. So, there will be plenty of time to debate this more…
@John Baez: That’s an interesting piece of history that I didn’t know. But I don’t think that invalidates the example. How much difference is there between “an act of desperation” and an “entirely baseless speculation”?
My point is that the paradigm that worked so well for physics before 1975 or so — throw lots and lots of not-so-crazy and crazy theories at the wall and see which ones are validated by experiments — doesn’t work in the absence of experiments.
Dear Peter,
If it’s not too late, I’d like to propose a viewpoint that points to a way forward for HEP. One reason scientists get stuck is that we persist in asking the wrong questions. Progress resumes when someone asks the right question, i.e. a question that leads to the correct explanation of the physics at issue.
In the case of unification and beyond-the-standard-model physics, it’s pretty clear by now that what seemed like the right questions in the 1970s and 1980s (naturalness, what is the right symmetry group, etc.) are not leading to progress.
So what is the right question to ask now? Here is one suggestion:
Accept that the standard model parameters are fine tuned and ask: “What was the dynamical mechanism, acting in the early universe, or perhaps before the big bang, that tuned them?”
That is, think like Darwin rather than Plato. Faced with a diversity of species, Plato asked for absolute, timeless principles that explained why these, and only these, species exist. Darwin’s insight was that there are no such absolute principles. What we see in the biosphere is the result of a dynamical process: natural selection, acting over billions of years, that could have turned out differently. But the dynamics is simple, profound and leads to many insights.
My proposal, which I have been making since 1992, is that theoretical HEP will begin to flourish again when we HEP theorists drop the search for the ultimate symmetry group, and begin learning to think like Darwin.
Thanks,
Lee
Thanks Lee,
I disagree with you that the current problem is too many theorists searching for a deeper symmetry-based explanation. These days the popular direction seems to be not that, but instead pursuit of the idea that space-time and the symmetries we see are just emergent epiphenomena of some seemingly unknowable or random quantum system.
The large number of random-looking parameters needed as input for the SM certainly doesn’t look like the output of any known symmetry-based argument. Maybe they are environmental, determined either by anthropic selection or evolutionary path from whatever is going on pre-Big Bang. The problem remains though of coming up with a viable theory of whatever physics it is that is going on there, and that seems to me completely open.
Peter Shor wrote:
“How much difference is there between ‘an act of desperation’ and an ‘entirely baseless speculation’?”
To me they seem diametrically opposed. Planck had experimental data that contradicted the best theory so far, and he was desperately trying to fit it with a simple new model. An “entirely baseless speculation” doesn’t give a better fit to existing data: at best it satisfies some theoretical predilections and avoids contradicting existing data. An example of the latter is making up a theory with new particles that just happen to have masses slightly larger than current colliders can see, but which doesn’t make any more accurate predictions about things we actually do see. Some particle physicists have been doing this for decades now.
Is it worth pointing out that “the foundations of physics” are not limited to high-energy and particle physics? There are many theorists working on foundational problems in condensed matter physics, for example, where much progress continues to be made. It is a thriving and fruitful discipline with a close coupling to experiment and lots of exciting new results. The HEP community’s insistence that they’re the only ones who work on “foundations” (see, e.g., the headline to this very blog post) may be one of the problems here.