This Week’s Hype

In recent years string theorists have been having trouble getting taken seriously by the media, a problem they’ve been trying to deal with by enlisting the PR departments of their universities to help. Following Princeton and Stanford, today it’s the turn of the string theorists at Northeastern, whose press office has put out a press release announcing “Northeastern team uses string theory to explain the fundamental nature of the universe.”

As usual, this is just pure, unadulterated hype. It’s based on a PRL publication, also available as this preprint. I usually try to avoid this sort of editorializing, but I’m actually shocked to see that PRL is now publishing this sort of thing, which is infinitely far from having any connection to conventional science.

Posted in This Week's Hype | 9 Comments

A Few Items

A few things that may be of interest:

  • The Perimeter Institute has a new director, Rob Myers, succeeding Neil Turok. Myers is very much a mainstream theorist, and Perimeter over the years has been converging with the mainstream, from a very non-mainstream initial state. While Turok has taken the view in recent years that theoretical physics is in “a deep crisis”, Physics World has:

    Myers says there are many opportunities in theoretical physics, mostly thanks to the vast amounts of data that are being collected by various experiments such as CHIME, EHT and the LIGO gravitational-wave detectors in the US. Yet Myers doesn’t believe that theoretical physics is in “a deep crisis” as Turok once admitted. “Particle physics is somewhat at a crossroads,” he says. “Describing it as a crisis is slightly dramatic, but I would agree that people have been relying on the status quo for too long and relying on certain models from decades ago.”

    Indeed, Myers now challenges researchers to think in new ways. “Young people are the future and we want to instill in them to question the status quo,” he adds. “After all, it is the people here that make the PI such a special place.”

  • Speaking of challenges to the status quo, it seems that Sabine Hossenfelder now has a contract for her second book, topic not yet revealed.
  • For the latest news from the Swampland, see this Twitter thread from Will Kinney. He explains how the “Swampland conjecture” was meant to kill off the string theory multiverse, but this conjecture got in trouble:

    So by getting rid of the multiverse, we have also gotten rid of known physics like the Higgs boson. Merde!

    It was replaced by a fix, the “refined Swampland conjecture”, but Kinney has a new paper in PRL (arXiv link here) showing this fix doesn’t solve the multiverse problem for string theory (a schematic statement of both conjectures appears just after this list):

    This means that, as soon as we fix up the Swampland Conjecture so it doesn’t trivially rule out known physics like the Higgs, we inevitably get an unwelcome passenger: the string multiverse!

    This is important because it looked like the Swampland Conjecture was likely to free us from the multiverse and associated awful stuff like the Anthropic Principle. Not so, we’re still stuck with it. Sorry.

  • John Baez has a popular article at Nautilus about his new-found love for algebraic geometry, as an explanation of the relation between classical and quantum physics. The more technical version is a series of posts here.
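
Since Twitter threads have a way of disappearing, here is a schematic statement of what is at issue in the Kinney item above, as I understand it. The original de Sitter Swampland conjecture (due to Obied, Ooguri, Spodyneiko and Vafa) requires any scalar potential V arising from a consistent string compactification to satisfy, for some order-one constant c,

$$ |\nabla V| \;\geq\; \frac{c}{M_{\rm Pl}}\,V. $$

The Standard Model Higgs potential violates this at its local maximum, where V > 0 but ∇V = 0; that is the trouble (“Merde!”) Kinney’s thread describes. The refined version (Ooguri, Palti, Shiu and Vafa) adds an escape clause, allowing instead a sufficiently tachyonic direction:

$$ |\nabla V| \;\geq\; \frac{c}{M_{\rm Pl}}\,V \qquad \text{or} \qquad \min\big(\nabla_i\nabla_j V\big) \;\leq\; -\frac{c'}{M_{\rm Pl}^{2}}\,V. $$

This saves the Higgs, and Kinney’s point is that the same escape clause lets the string multiverse back in.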

Update: Arnold Neumaier has posted at the arXiv a series of three papers discussing his “thermal interpretation” of quantum mechanics (see here, here and here). While I find many of the points he seems to be making compelling, I haven’t had time to think seriously about what the problems of his approach might be (and there’s a long history of online discussions between him and others which would be a good place to start). Neumaier in the papers explicitly asks for discussion of them at physicsoverflow.org, and there are now posts there for this purpose (here, here and here). I look forward to following any discussion with him over there. He also has a website devoted to this topic here, which has some links to earlier discussions.

Update: There’s a new issue of Inference out. As usual, some interesting pieces from people not usually heard from in a non-technical venue. No sign of the pro-intelligent design/climate denialism agenda that they’ve been accused of having (see here). Pieces specifically relevant to some of the obsessions of this blog are a review by Glashow of Lost in Math, and a piece by David Roberts on the Mochizuki/Scholze/Stix story.

Posted in Swampland, Uncategorized | 3 Comments

The Mathematical Question From Which All Answers Flow

I’m beginning to suspect that there are actually (at least) two different theoretical HEP physicists named Nima Arkani-Hamed out there. One of them (who I’ll call Nima1) believes the way to understand the fundamental nature of physical reality involves extremely complicated extensions of the Standard Model, with large numbers of parameters tuned to avoid conflict with observation, and possibly hundreds or thousands of extra fields thrown in for good measure. He also seems to like the multiverse and anthropic explanations. I have a lot of disagreements with Nima1, most recently discussed here.

The second Arkani-Hamed (Nima2) has a completely different point of view, one quite close to my own, although he may be even more of a mathematical mystic than I am. Natalie Wolchover has recently talked to Nima2 and written about it for the New Yorker. Nima2 is in love with the deep mathematical structure of physics and the way it appears in different aspects:

Nima Arkani-Hamed, a physicist at the Institute for Advanced Study, is one of today’s leading theoreticians. “The miraculous shape-shifting property of the laws is the single most amazing thing I know about them,” he told me, this past fall. It “must be a huge clue to the nature of the ultimate truth.”

Wolchover expands on this idea of multiple ways of expressing the same underlying mathematical structure:

The existence of this branching, interconnected web of mathematical languages, each with its own associated picture of the world, is what needs to be understood.

This web of laws creates traps for physicists. Suppose you’re a researcher seeking to understand the universe more deeply. You may get stuck using a dead-end description—clinging to a principle that seems correct but is merely one of nature’s disguises. It’s for this reason that Paul Dirac, a British pioneer of quantum theory, stressed the importance of reformulating existing theories: it’s by finding new ways of describing known phenomena that you can escape the trap of provisional or limited belief. This was the trick that led Dirac to predict antimatter, in 1928. “It is not always so that theories which are equivalent are equally good,” he said, five decades later, “because one of them may be more suitable than the other for future developments.”

Today, various puzzles and paradoxes point to the need to reformulate the theories of modern physics in a new mathematical language. Many physicists feel trapped. They have a hunch that they need to transcend the notion that objects move and interact in space and time. Einstein’s general theory of relativity beautifully weaves space and time together into a four-dimensional fabric, known as space-time, and equates gravity with warps in that fabric. But Einstein’s theory and the space-time concept break down inside black holes and at the moment of the big bang. Space-time, in other words, may be a translation of some other description of reality that, though more abstract or unfamiliar, can have greater explanatory power.

Nima2 is obsessed with exactly the same mystical mathematical issue that I am: what’s the right mathematical question that has the Standard Model and GR as its answer?

To Arkani-Hamed, the multifariousness of the laws suggests a different conception of what physics is all about. We’re not building a machine that calculates answers, he says; instead, we’re discovering questions. Nature’s shape-shifting laws seem to be the answer to an unknown mathematical question…

Arkani-Hamed now sees the ultimate goal of physics as figuring out the mathematical question from which all the answers flow. “The ascension to the tenth level of intellectual heaven,” he told me, “would be if we find the question to which the universe is the answer, and the nature of that question in and of itself explains why it was possible to describe it in so many different ways.” It’s as though physics has been turned inside out. It now appears that the answers already surround us. It’s the question we don’t know.

I’m not sure the Amplituhedron is the right path to the “tenth level of intellectual heaven” and finding the “mathematical question from which all the answers flow”, but I’m completely sympathetic with Nima2’s motivation and quest.

Posted in Uncategorized | 16 Comments

Various and Sundry

First a couple of items from Paris:

  • Fields medalist Cédric Villani is campaigning for the position of Mayor of Paris. This Sunday there will be a campaign event/book launch for his new book, Immersion: De la science au Parlement.
  • The long-awaited public unveiling of the results of director Ilya Khrzhanovsky’s attempt to make a film inspired by the story of Lev Landau has finally happened, with Dau now on view spread over three locations in Paris. This project was filmed during 2009-11, and I wrote a bit about it here in 2015. For more about the project, see for instance here and here. Among those appearing in the film are David Gross, Sergio Cecotti, Alexander Vilenkin, Carlo Rovelli, Costas Bachas, Erik Verlinde, Igor Klebanov, Samson Shatashvili, Shing-Tung Yau, Dmitry Kaledin, Nikita Nekrasov and Andrey Losev.

Some number-theorist related items:

  • Videos from last year’s Barry Mazur birthday conference are now available here. Also available is the write-up from Mazur of a talk last fall on The Unity and Breadth of Mathematics.
  • See here for an interview with Akshay Venkatesh.

Finally, some comments from Scott Aaronson on the current “like beer at a frat party” state of funding of quantum information theory:

I wanted to call attention to a hilarious irony. For years, I’ve made the case that trying to build scalable quantum computers, in order to probe the universe for the first time in “the regime beyond the classical Extended Church-Turing Thesis,” is just as scientifically interesting as finding the Higgs boson—even if we set aside any of the possible applications of QC.

I don’t think I imagined to what extent the tables would someday turn—with funding now flowing into quantum information like beer at a frat party (for a combination of good and bad reasons…), with the future of experimental particle physics now in serious doubt, and with me put in the position of arguing that the high-energy frontier is worth exploring too! 😀

Update: More about the opening of Dau in Paris here.

Posted in Uncategorized | 5 Comments

Where in the World are SUSY and WIMPs?

Back in 2017, after it had already become clear that negative LHC results about SUSY and WIMPs had falsified theorists’ most popular scenarios for extending the Standard Model, Nima Arkani-Hamed gave a summer school talk to students with the title Where in the World are SUSY & WIMPS?, which I discussed here. At the time I was encouraged that while he was still promoting SUSY and the landscape (in the split SUSY variant), at least he seemed to be arguing that the lesson to be drawn might be that the whole SUSY-GUT business was a mistake:

The disadvantage to the trajectory of going with what works and then changing a little and changing a little is that you might just be in the basin of attraction of the wrong idea from the start and then you’ll just stay there for ever.

A few weeks ago in Princeton, at a PCTS workshop on Dark Matter, he gave an updated version of the same talk. Much of it was the same material about how split SUSY is the best idea still standing. Unfortunately, at the end (1:09) he now seems to have changed his mind, arguing that the best thing for theorists to do is to keep tweaking the models that failed at the LHC:

You could very justifiably say “look, you’re just continuing to make excuses for a paradigm that failed”, OK, and I would say that’s true, and even the paradigm most of your advisors love [e.g. usual SUSY] was already an excuse for the failure of non-supersymmetric GUTs before that.

That is a perfectly decent attitude to take, but I would like to at least tell you that you should study some of the history of physics. This very, very, very rarely happens, that some idea that seems basically right is just crap and wrong. It’s probably mostly right with a tweak or some reinterpretation. You’d have to go back over…, I don’t know how far you’d have to go back, even Ptolemy wasn’t so far from wrong…

These are two different attitudes towards connecting theory and experiment. If you like, more the theory egocentered attitude, or the just more explore from the bottom up attitude, they’re both perfectly good attitudes, we’ll see which is more fruitful in the end. If you take the more top-down attitude, just keep fixing things a little bit.

If you had to pick the single most influential theorist out there on these issues, it would probably be Arkani-Hamed. This kind of refusal to face reality is I think a significant factor in what has caused Sabine Hossenfelder to go on her anti-new-collider campaign. While I disagree with her and would like to see a new collider project, the prospect of having to spend the decades of my golden years listening to the argument “we were always right about SUSY, it just needs a tweak, and we’ll see it at the FCC” is almost enough to make me change my mind…

Update: Today Ethan Siegel at Forbes has Why Supersymmetry May Be The Greatest Failed Prediction in Particle Physics History.

Posted in Uncategorized | 48 Comments

On Inference

The first issue of the magazine Inference appeared online back in 2014. At the time, it was surrounded by a significant amount of mystery: who were the editors, what were they trying to do, and who was funding it? I asked around and no one I talked to was sure what the answers were to these questions. Best guesses seemed to be that it was run out of Paris, with David Berlinski playing some role, and the funding source might be Peter Thiel.

Looking at the early issues that came out, on the topics I was competent to judge, the contributions about mathematics and physics were generally interesting and of high quality. On some other topics where I lack competence, there seemed to be a skeptical attitude towards materialism and evolutionary theory that I’m not sympathetic with.

Late in 2015 I was contacted by someone from Inference (Hortense Marcelin) to write an essay for them, something about the multiverse and string theory. After thinking about it a bit, I turned down the offer. The main reason was that I was sick and tired of the subject and didn’t want to spend time writing at length about it. A contributing factor in the back of my mind was that I knew nothing about the identity of the editors or their agenda, another reason not to get involved.

A couple years later I got another invitation to write for them, a request to write a short response to an excellent piece by George Ellis, Physics on Edge. Deciding to do this wasn’t hard. The piece would be short and I already knew exactly what I wanted to say, so it would take little time. In addition, I think by this time the identity of the editors was known, and, most importantly, Inference had a pretty good track record of publishing high quality articles in the areas I know about. What I wrote was published as Theorists Without a Theory.

Adam Becker a few days ago published at Undark a long article about Inference. It’s a bit of an exposé, taking issue with some of the writing as “intelligent-design propaganda”, and revealing that yes, Peter Thiel is a funder. An odd part of the story is that Becker suspects that a negative review of his book by Glashow in Inference was motivated by the fact that, not long before, he had contacted Glashow to ask pointed questions about the publication and its funding.

Today I got in my inbox A Statement from Sheldon Glashow and Inference, which is available here. You can read it for yourself. Noteworthy in the Undark article is Becker’s report that Glashow had told him that “questioning evolution” is “no longer a policy of the journal”. Referring to two early 2014 articles that could be described as questioning climate change and evolution, the statement says:

Becker believes that two of our essays are deserving of censure. They are William Kininmonth’s “Physical Theories and Computer Simulations in Climate Science,” and Michael Denton’s “Evolution: A Theory in Crisis Revisited.”

Both were published in 2014.

Now a secret must be imparted. Sheldon Glashow and Rich Roberts agree with Becker. Richard Lindzen and David Gelernter do not.

It ends with this response to the accusation about the motivation for the negative review of Becker’s book:

Inference commissioned Sheldon Glashow to review Becker’s book in the spring of 2018, well before Becker was known to Inference. The idea that we would require the services of a Nobel Laureate in order to make a fool of Becker is absurd. Becker is capable of doing that quite by himself.

Posted in Uncategorized | 50 Comments

Three Short Book Reviews

Unfortunately I don’t have time now to write about the following three books at the length they deserve, but here are some quick comments on each; all are worth your attention:

  • A carefully produced detailed write-up of Sidney Coleman’s Harvard Physics 253 quantum field theory course has now been published by World Scientific. This course was taught by Coleman off and on from the mid-seventies until 2002, and the book is based on various sets of video recordings and lecture notes (including a copy of my lecture notes from when I attended the class). A huge amount of work by various people has gone into producing a very high quality book. David Derbes has some comments here, and he is perhaps the main person to thank for seeing this project through to completion.

    David Kaiser has contributed an introduction to the book (available here, or, if this doesn’t work, try here) which does an excellent job of putting the material in historical and intellectual context, as well as describing what Coleman was like and why he had a huge influence on several generations of Harvard students. If you’ve already spent a lot of time learning QFT from various modern textbooks, your reaction to much of this one may be “that’s the standard way of explaining that point, nothing unusual here.” Keep in mind that often the reason that’s now the standard way of explaining things is that many authors of modern textbooks learned the subject from Coleman (or from someone who learned it from Coleman…).

  • Jim Baggott has a very good new book out, entitled Quantum Space, which could roughly be described as a popular account of loop quantum gravity at the level of the account of string theory in books like Brian Greene’s The Elegant Universe. Baggott has spent a lot of time talking to Carlo Rovelli and Lee Smolin, and one of the best aspects of the book is the way it conveys their personal stories, intellectual journey, and current outlook on the subject.

    One unavoidable topic that Baggott covers is the relation of string theory and LQG as competing (or perhaps someday collaborating?) approaches to the problem of quantum gravity. Due to long-ago experience (to get an idea, watch this), I lost patience years ago with arguments about which approach is “better”. Baggott’s take on the issue seems fair to me, but if you really want to engage in that argument it will have to be elsewhere than the comment section here.

  • Finally, the new book you really should buy a copy of is my brother Steve’s Fly Fishing Treasures. He has been working on it for years, and it includes the most amazingly beautiful pictures of antique fly-fishing equipment in existence, as well as a wealth of information about those who collect these things. OK, if you, like me, aren’t especially excited about the topic of fly fishing, then buy a copy as a present for someone who is.

Update: I should have included links to postings about Coleman here, here and here.

Update: If you’ll be at the March APS meeting in Boston, I hear there’s a book launch for the Coleman book, 1:30pm-2:15pm Weds. March 6, at the World Scientific booth in the exhibition hall.

Posted in Book Reviews | 11 Comments

Should the Europeans Give Up?

The European HEP community is now engaged in a “Strategy Update” process, the next step of which will be an open symposium this May in Granada. Submissions to the process were due last month, and I assume that what was received will be made publicly available at some point. This is supposed to ultimately lead to the drafting of a new European HEP strategy next January, for approval by the CERN Council in May 2020.

The context of these discussions is that European HEP is approaching a very significant crossroads, and decisions about the future will soon need to be made. The LHC will be upgraded in coming years to a higher luminosity, ultimately rebranded as the HL-LHC, to start operating in 2026. After 10-15 years of operation in this higher-luminosity mode, the LHC will reach the end of its useful life: the marginal extra data accumulated each year will stop being worth the cost of running the machine.

Planning for the LHC project began back in the 1980s, and construction was approved in 1994. The first physics run was 16 years later, in 2010. Keep in mind that the LHC project started with a tunnel and a lot of infrastructure already built, since the LEP tunnel was being reused. If CERN decides it wants to build a next generation collider, this could easily take 20 years to build, so if one wants it to be ready when the LHC shuts down, one should have started already.

Some of the strategy discussion will be about experiments that don’t require the highest possible collision energies (the “energy frontier”), for instance those that study neutrinos. Among possibilities for a new energy frontier collider, the main ones that I’m aware of are the following, together with some of their advantages and drawbacks:

  • FCC-ee: This would be an electron-positron machine built in a new 100 km tunnel, operating at CM energies from 90 to 365 GeV. It would provide extremely high numbers of events when operated at the Z-peak, and could also be operated as a “Higgs factory”, providing a very large number of Higgs events to study, in a much cleaner environment than that provided by a proton-proton collider like the LHC.

    In terms of drawbacks, it is estimated to cost $10 billion or so. The CM energy is quite a bit less than that of the LHC, so it seems unlikely that there are new unknown states that it could study, since these would have been expected to show up by now at the LHC (or at LEP, which operated at 209 GeV at the end).

    Another point in favor of the FCC-ee proposal is that it would allow for reuse of the tunnel (just as the LHC followed on LEP) for a very high energy proton-proton collider, called the FCC-hh, which would operate at a CM energy of 100 TeV. This would be a very expensive project, estimated to cost $17 billion (on top of the previous $10 billion cost of the FCC-ee).

  • HE-LHC: This would essentially be a higher energy version of the LHC, in the same tunnel, built using higher field (16 T vs. 8.33 T) magnets. It would operate at a CM energy of 27 TeV (see the scaling worked out just after this list). The drawbacks are that construction would be challenging (appropriate 16 T magnets do not yet exist) and only a modest (27 vs. 14 TeV) increase in CM energy would be achieved. The big advantage over the FCC-hh is cost: much of the LHC infrastructure could be reused and the machine is smaller, so the total cost estimate is about $7 billion.
  • CLIC: This would be a linear electron-positron collider, with the first stage of the project an 11 km-long machine that would operate at 380 GeV CM energy and cost about $6 billion. The advantage of this machine over the circular FCC-ee is that it could ultimately be extended to a longer 50 km machine operating at 3 TeV CM energy (at a much higher cost). The disadvantage with respect to the FCC-ee is that it is not capable of operating at very high luminosity at lower energies (at the Z-peak or as a Higgs factory).
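
A quick back-of-envelope check on where these energy figures come from (my own sketch, not an official projection): the momentum of a proton circulating in a ring of bending radius ρ filled with dipoles of field B is

$$ p\,[{\rm GeV}/c] \;\approx\; 0.3\; B\,[{\rm T}]\;\rho\,[{\rm m}], $$

so at fixed tunnel size the CM energy scales linearly with the field. Roughly doubling the LHC’s 8.33 T magnets to 16 T gives the HE-LHC’s quoted energy, 14 TeV × (16/8.33) ≈ 27 TeV, and combining 16 T magnets with a tunnel roughly 3.7 times longer gives the FCC-hh its 100 TeV.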

For some context for the very high construction costs of these machines, the CERN budget is currently around $1.2 billion/year. It seems likely that member states will be willing to keep funding CERN at this level in the future, but I have no idea what prospects if any there are for significantly increased contributions to pay for a new collider. A $10 billion FCC-ee construction cost spread out over 20 years would be $500 million/year. Can this somehow be accommodated within CERN’s current budget profile? This seems difficult, but maybe not impossible. Where the additional $17 billion for the FCC-hh might come from is hard to see.
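
To make the affordability arithmetic concrete, here is a minimal sketch in Python. The cost figures are the rough estimates quoted above, and the uniform 20-year construction spread is my illustrative assumption, not anything from CERN planning documents:

```python
# Back-of-envelope annualization of rough collider cost estimates.
# All cost figures in billions of USD, taken from the estimates above;
# the 20-year uniform spread is an assumption for illustration only.
CERN_BUDGET = 1.2  # approximate current CERN budget, $ billion/year
YEARS = 20         # assumed construction period

costs = {
    "FCC-ee": 10.0,
    "FCC-hh (on top of FCC-ee)": 17.0,
    "HE-LHC": 7.0,
    "CLIC (380 GeV stage)": 6.0,
}

for machine, total in costs.items():
    per_year = total / YEARS
    print(f"{machine}: ${per_year:.2f}B/year, "
          f"{per_year / CERN_BUDGET:.0%} of the current CERN budget")
```

For the FCC-ee this reproduces the $500 million/year figure above, which would be about 40 percent of CERN’s current annual budget if no new money were found.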

If none of these three alternatives is affordable or deemed worth the cost, it looks like the only alternative for energy frontier physics is to do what the US has done: give up. The machines being considered here, and their costs, are similar in scale to the SSC project, which would have been a 40 TeV CM energy 87 km proton-proton collider but was cancelled in 1993. Note that the capabilities of the SSC would have been roughly comparable to the HE-LHC (it had higher energy, lower luminosity). Since it would have started physics around 2000, and an HE-LHC might be possible in 2040, one could say that the SSC cancellation set back the field at least 40 years. The worst part of the SSC cancellation was that the project was underway and there was no fallback plan. It’s hard to overemphasize how disastrous this was for US HEP. Whatever the Europeans do, they need to be sure that they don’t end up with this kind of failure.

Faced with a difficult choice like this, there’s a temptation to want to avoid it, to believe that surely new technology will provide some more attractive alternative. In this case though, one is running up against basic physical limits. For circular electron-positron machines, synchrotron radiation losses go as the fourth power of the energy, whereas for linear machines one has to put a lot of power in, since one is accelerating and then dumping the beam, not storing it. For proton-proton machines, CM energy is limited by the strength of the dipole magnets one can build at a reasonable cost and operate reliably in a challenging environment. Sure, someday we may have appropriate cheap 60 T magnets, and a 100 TeV pp collider could be built at reasonable cost in the LHC tunnel. We might also have plasma wakefield technology that could accelerate beams of electrons and positrons to multi-TeV energies over a reasonable distance, with a reasonable luminosity. At this point though, I’m willing to bet that in both cases we’re talking about 22nd century technology, unlikely to arrive in the 21st. Similar comments apply to prospects for a muon collider.
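
For reference, the standard formula behind that fourth-power statement: an ultrarelativistic particle of energy E and mass m on a circular orbit of radius ρ radiates, per turn,

$$ \Delta E_{\rm turn} \;=\; \frac{4\pi}{3}\,\frac{e^{2}}{\rho}\,\beta^{3}\gamma^{4} \;\propto\; \frac{1}{\rho}\left(\frac{E}{mc^{2}}\right)^{4}, \qquad\text{or, for electrons,}\qquad \Delta E_{\rm turn}\,[{\rm keV}] \;\approx\; 88.5\,\frac{E^{4}\,[{\rm GeV}]}{\rho\,[{\rm m}]}. $$

At LEP’s final beam energy of about 105 GeV this was already roughly 3 GeV lost per turn. Protons of the same energy in the same ring, being some 1836 times heavier, radiate about 10^13 times less, which is why for pp machines the limit is magnet strength rather than radiated power.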

Another way to avoid the implications of this difficult choice is to convince oneself that cheaper experiments at low energy, or maybe astrophysical observations, can replace energy frontier colliders. Maybe one can get the same information about what is happening at the 1-10 TeV scale by looking at indirect effects at low energy. Unfortunately, I don’t think that’s very likely. There are things we don’t understand about particle physics that can be studied using lower energies (especially the neutrino sector) and such experiments should be pursued aggressively. It may be true that what we can learn this way can replace what we could learn with an energy-frontier collider, but that may very well just be wishful thinking.

So, what to do? Give up, or start trying to find the money for a very long-term, very challenging project, one with an uncertain outcome? Unlike the case of the LHC, we have no good theoretical reason to believe that we will discover a new piece of fundamental physics using one of these machines. You can read competing arguments from Sabine Hossenfelder (here and here) and Tommaso Dorigo (here, here and here).

Personally, I’m on the side of not giving up on energy frontier colliders at this point, but I don’t think the question is an easy one (unlike the question of building the LHC, which was an easy choice). One piece of advice though is that experience of the past few decades shows you probably shouldn’t listen to theorists. A consensus is now developing that HEP theory is in “crisis”, see for instance this recent article, where Neil Turok says “I’m busy trying to persuade my colleagues here to disregard the last 30 years. We have to retrace our steps and figure out where we went wrong.” If the Europeans do decide to build a next generation machine, selling the idea to the public is not going to be made easier by some of the nonsense from theorists used to sell the LHC. People are going to be asking “what about those black holes the LHC was supposed to produce?” and we’re going to have to tell them that that was a load of BS, but that this time we’re serious. This is not going to be easy…

Update: Some HEP experimentalists are justifiably outraged at some of the negative media stories coming out that extensively quote theorists mainly interested in quantum gravity. There are eloquent Twitter threads by James Beacham and Salvatore Rappoccio, responding to this Vox story. The Vox story quotes no experimentalists, instead quoting extensively three theorists working on quantum gravity (Jared Kaplan, Sabine Hossenfelder and Sean Carroll). Not to pick specifically on Kaplan, but he’s a good example of the point I was making above about listening to theorists. Ten years ago his work was being advertised with:

As an example question, which the LHC will almost certainly answer—we know that the sun contains roughly 10^60 atoms, and that this gigantic number is a result of the extreme weakness of gravity relative to the other forces—so why is gravity so weak?

Enthusiasm for the LHC then based on the idea that it was going to tell us about gravity was always absurd, and a corresponding lack of enthusiasm for a new collider based on negative LHC results on that front is just as absurd.
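
For what it’s worth, here is my reconstruction of the standard back-of-envelope argument behind statements like Kaplan’s (a sketch, not a quote from his work): the number of atoms a star can contain is set by the cube of the Planck-to-proton mass ratio, and the hugeness of that ratio is the same fact as the weakness of gravity:

$$ N_\odot \;\sim\; \frac{M_\odot}{m_p} \;\approx\; 10^{57} \;\sim\; \left(\frac{M_{\rm Pl}}{m_p}\right)^{3}, \qquad \alpha_G \;\equiv\; \frac{G\,m_p^{2}}{\hbar c} \;=\; \left(\frac{m_p}{M_{\rm Pl}}\right)^{2} \;\sim\; 10^{-38}. $$

(Whether one quotes 10^57 or 10^60 depends on details that don’t matter here; the point is the enormous unexplained hierarchy.)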

Update: Commenter abby yorker points to this new opinion piece at the New York Times, from Sabine Hossenfelder. The subtitle of the piece is “Ten years in, the Large Hadron Collider has failed to deliver the exciting discoveries that scientists promised.” This is true enough, but by not specifying the nature of the failure and which scientists were responsible, it comes off as blaming the wrong people, the experimentalists. Worse, it uses this failure to argue against further funding not of failed theory, but of successful experiment.

The LHC machine and the large-scale experiments conducted there have not in any sense been a failure, quite the opposite. The machine has worked very well, at much higher than design luminosity, close to design energy (which should be achieved after the current shutdown). The experiments have been a huge success on two fronts. In one direction, they’ve discovered the Higgs and started detailed measurements of its properties, in another they’ve done an amazing job of providing strong limits on a wide range of attempted extensions of the standard model.

These hard-won null results are not a failure of the experimental program, but a great success of it. The only failure here is that of the theorists who came up with bad theory and ran a hugely successful hype campaign for it. I don’t see how the lesson from seeing an experimental program successfully shoot down bad theory is that we should stop funding further such experiments. I also don’t see how finding out that theorists were wrong in their predictions of new phenomena at the few hundred GeV scale means that new predictions by (often the same) theorists of no new phenomena at the multiple TeV scale should be used as a reason not to fund experimentalists who want to see if this is true.

Where I think Hossenfelder is right is that too many particle physicists of all kinds went along with the hype campaign for bad theory in order to get people excited about the LHC. Going on about extra dimensions and black holes at the LHC was damaging to the understanding of what this science is really about, and completely unnecessary since there was plenty of real science to generate excitement. The discussion of post-LHC experimental projects should avoid the temptation to enter again into hype-driven nonsense. On the other hand, the discussion of what to defund because of the LHC results should stick to defunding bad theory, not the experiments that refute it.

Update: Some more commentary about this, from Chris Quigg, and the CERN Courier. In particular, the CERN Courier has this from Gerard ‘t Hooft:

Most theoreticians were hoping that the LHC might open up a new domain of our science, and this does not seem to be happening. I am just not sure whether things will be any different for a 100 km machine. It would be a shame to give up, but the question of whether spectacular new physical phenomena will be opened up and whether this outweighs the costs, I cannot answer. On the other hand, for us theoretical physicists the new machines will be important even if we can’t impress the public with their results.

and, from Joseph Incandela:

While such machines are not guaranteed to yield definitive evidence for new physics, they would nevertheless allow us to largely complete our exploration of the weak scale… This is important because it is the scale where our observable universe resides, where we live, and it should be fully charted before the energy frontier is shut down. Completing our study of the weak scale would cap a short but extraordinary 150 year-long period of profound experimental and theoretical discoveries that would stand for millennia among mankind’s greatest achievements.

Update: Also, commentary at Forbes from Chad Orzel here.

Update: I normally try not to engage with Facebook, and encourage others to follow the same policy, but there’s an extensive discussion of this topic at this public Facebook posting by Daniel Harlow.

Posted in Experimental HEP News, Favorite Old Posts | 78 Comments

Michael Atiyah 1929-2019

While away on vacation, I heard last week the sad news of the death of Michael Atiyah, at the age of 89. Atiyah was both a truly great mathematician and a wonderful human being. In his mathematical work he simultaneously covered a wide range of different fields, often making deep connections between them and providing continual new evidence of the unity of mathematics. This unifying vision also encompassed physics, and the entire field of topological quantum field theory was one result.

I had the great luck to be at MSRI during the 1988-89 academic year, and Atiyah spent that January there. Getting a chance to talk to him then was a remarkable experience. He had one of the quickest minds I’ve ever seen, often grasping what you were trying to explain before the words were out of your mouth. At one point that month I ran into Raoul Bott walking away from an ongoing discussion with Atiyah and Witten at a blackboard. Bott shook his head, saying something like “it’s just too scary listening to the two of them”.

Any question, smart or stupid, would lead to not just an answer, but a fascinating explanation of all sorts of related issues and conjectures. For Atiyah, his love of discussing mathematics was something to be shared at all times, with whoever happened to be around.

The last time I met him was in September 2016 in Heidelberg. He was his usual cheerful and engaging self, still in love with mathematics and with discussing it with anyone who would listen. I did notice though that age had taken its toll, in the sense that he no longer would engage with anything that got into the sort of complexities that in the past he had been quick to see his way through. It’s unfortunate that near the end of his life far too much attention was drawn to implausible claims he started making that he could see how to solve some of the most difficult and intractable open problems of the subject.

There’s a lot more I could write here about Atiyah and his remarkable career, but I’ve realized that most of it I’ve already gotten to in one post here or another. So, for more, see some of the following older posts, which discussed:

Interviews and profiles here, here and here.

Atiyah and his work with Raoul Bott.

Atiyah and topological quantum field theory.

Update: In recent years Andrew Ranicki had been maintaining a page with Atiyah-related links.

Posted in Obituaries | 7 Comments

Roy Glauber 1925-2018: Notes on QFT

I saw today that Roy Glauber has passed away, at the age of 93. John Preskill speculates that Glauber was the last living member of the wartime T division at Los Alamos.

My only interaction with him was that he was the instructor for the first quantum field theory course I took, at Harvard during the 1976-77 academic year. The course was my first exposure to quantum field theory, and was taught from what seemed at the time (the era of the advent of gauge theories and the coming into wide use of path integral methods) a rather stodgy point of view. It’s one, however, that I have come to appreciate more later in life.

I just located the binder of notes I kept from the class and plan to look over them. It occurred to me that if I want to look at these on vacation, the thing to do is to scan them. So, I just did this, and am making the scans available here in case others are interested:

Roy Glauber: Quantum Field Theory notes 1976-77

Roy Glauber: Quantum Field Theory problems and solutions 1976-77

Update: The New York Times has an obituary here.

Posted in Obituaries | 13 Comments