Various and Sundry

First a couple of items from Paris:

  • Fields medalist Cédric Villani is campaigning for the position of Mayor of Paris. This Sunday there will be a campaign event/book launch for his new book, Immersion: De la science au Parlement.
  • The long-awaited public unveiling of the results of director Ilya Khrzhanovsky’s attempt to make a film inspired by the story of Lev Landau has finally happened, with Dau now on view spread over three locations in Paris. This project was filmed during 2009-11, and I wrote a bit about it here in 2015. For more about the project, see for instance here and here. Among those appearing in the film are David Gross, Sergio Cecotti, Alexander Vilenkin, Carlo Rovelli, Costas Bachas, Erik Verlinde, Igor Klebanov, Samson Shatashvili, Shing-Tung Yau, Dmitry Kaledin, Nikita Nekrasov and Andrey Losev.

Some number-theory-related items:

  • Videos from last year’s Barry Mazur birthday conference are now available here. Also available is the write-up from Mazur of a talk last fall on The Unity and Breadth of Mathematics.
  • See here for an interview with Akshay Venkatesh.

Finally, some comments from Scott Aaronson on the current “like beer at a frat party” state of funding of quantum information theory:

I wanted to call attention to a hilarious irony. For years, I’ve made the case that trying to build scalable quantum computers, in order to probe the universe for the first time in “the regime beyond the classical Extended Church-Turing Thesis,” is just as scientifically interesting as finding the Higgs boson—even if we set aside any of the possible applications of QC.

I don’t think I imagined to what extent the tables would someday turn—with funding now flowing into quantum information like beer at a frat party (for a combination of good and bad reasons…), with the future of experimental particle physics now in serious doubt, and with me put in the position of arguing that the high-energy frontier is worth exploring too! 😀

Update: More about the opening of Dau in Paris here.

Posted in Uncategorized | 5 Comments

Where in the World are SUSY and WIMPs?

Back in 2017, after it had already become clear that negative LHC results about SUSY and WIMPs had falsified theorists’ most popular scenarios for how to extend the Standard Model, Nima Arkani-Hamed gave a summer school talk to students with the title Where in the World are SUSY & WIMPS?, which I discussed here. At the time I was encouraged that, while he was still promoting SUSY and the landscape (in the split-SUSY variant), at least he seemed to be arguing that the lesson to be drawn might be that the whole SUSY-GUT business was a mistake:

The disadvantage to the trajectory of going with what works and then changing a little and changing a little is that you might just be in the basin of attraction of the wrong idea from the start and then you’ll just stay there for ever.

A few weeks ago in Princeton, at a PCTS workshop on Dark Matter, he gave an updated version of the same talk. Much of it was the same material about how split SUSY is the best idea still standing. Unfortunately, at the end (1:09) he now seems to have changed his mind, arguing that the best thing for theorists to do is to keep tweaking the models that failed at the LHC:

You could very justifiably say “look, you’re just continuing to make excuses for a paradigm that failed”, OK, and I would say that’s true, and even the paradigm most of your advisors love [e.g. usual SUSY] was already an excuse for the failure of non-supersymmetric GUTs before that.

That is a perfectly decent attitude to take, but I would like to at least tell you that you should study some of the history of physics. This very, very, very rarely happens, that some idea that seems basically right is just crap and wrong. It’s probably mostly right with a tweak or some reinterpretation. You’d have to go back over…, I don’t know how far you’d have to go back, even Ptolemy wasn’t so far from wrong…

These are two different attitudes towards connecting theory and experiment. If you like, more the theory-egocentered attitude, or the more bottom-up, exploratory attitude; they’re both perfectly good attitudes, and we’ll see which is more fruitful in the end. If you take the more top-down attitude, just keep fixing things a little bit.

If you had to pick the single most influential theorist on these issues, it would probably be Arkani-Hamed. This kind of refusal to face reality is, I think, a significant factor in what has driven Sabine Hossenfelder to her anti-new-collider campaign. While I disagree with her and would like to see a new collider project, the prospect of having to spend the decades of my golden years listening to the argument “we were always right about SUSY, it just needs a tweak, and we’ll see it at the FCC” is almost enough to make me change my mind…

Update: Today Ethan Siegel at Forbes has Why Supersymmetry May Be The Greatest Failed Prediction in Particle Physics History.

Posted in Uncategorized | 48 Comments

On Inference

The first issue of the magazine Inference appeared online back in 2014. At the time, it was surrounded by a significant amount of mystery: who were the editors, what were they trying to do, and who was funding it? I asked around and no one I talked to was sure what the answers were to these questions. Best guesses seemed to be that it was run out of Paris, with David Berlinski playing some role, and the funding source might be Peter Thiel.

Looking at the early issues that came out, on the topics I was competent to judge, the contributions about mathematics and physics were generally interesting and of high quality. On some other topics where I lack competence, there seemed to be a skeptical attitude towards materialism and evolutionary theory that I’m not sympathetic with.

Late in 2015 I was contacted by someone from Inference (Hortense Marcelin) about writing an essay for them, something about the multiverse and string theory. After thinking about it a bit, I turned down the offer. The main reason was that I was sick and tired of the subject and didn’t want to spend time writing about it at length. A contributing factor in the back of my mind was that I knew nothing about the identity of the editors or their agenda, another reason not to get involved.

A couple years later I got another invitation to write for them, a request to write a short response to an excellent piece by George Ellis, Physics on Edge. Deciding to do this wasn’t hard. The piece would be short and I already knew exactly what I wanted to say, so it would take little time. In addition, I think by this time the identity of the editors was known, and, most importantly, Inference had a pretty good track record of publishing high quality articles in the areas I know about. What I wrote was published as Theorists Without a Theory.

A few days ago Adam Becker published at Undark a long article about Inference. It’s a bit of an exposé, taking issue with some of the writing as “intelligent-design propaganda”, and revealing that yes, Peter Thiel is a funder. An odd part of the story is that Becker suspects Glashow’s negative review of his book in Inference was motivated by the fact that, not long before, Becker had contacted Glashow to ask pointed questions about the publication and its funding.

Today I got in my inbox A Statement from Sheldon Glashow and Inference, which is available here. You can read it for yourself. Noteworthy in the Undark article is Becker’s report that Glashow had told him that “questioning evolution” is “no longer a policy of the journal”. Referring to two early 2014 articles that could be described as questioning climate change and evolution, the statement says:

Becker believes that two of our essays are deserving of censure. They are William Kininmonth’s “Physical Theories and Computer Simulations in Climate Science,” and Michael Denton’s “Evolution: A Theory in Crisis Revisited.”

Both were published in 2014.

Now a secret must be imparted. Sheldon Glashow and Rich Roberts agree with Becker. Richard Lindzen and David Gelernter do not.

It ends with this response to the accusation about the motivation for the negative review of Becker’s book:

Inference commissioned Sheldon Glashow to review Becker’s book in the spring of 2018, well before Becker was known to Inference. The idea that we would require the services of a Nobel Laureate in order to make a fool of Becker is absurd. Becker is capable of doing that quite by himself.

Posted in Uncategorized | 50 Comments

Three Short Book Reviews

Unfortunately I don’t have time right now to write about the following three books at the length they deserve, but here are some quick comments on each; all three are worth your attention:

  • A carefully produced, detailed write-up of Sidney Coleman’s Harvard Physics 253 quantum field theory course has now been published by World Scientific. This course was taught by Coleman off and on from the mid-seventies until 2002, and the book is based on various sets of video recordings and lecture notes (including a copy of my lecture notes from when I attended the class). A huge amount of work by various people has gone into producing a very high quality book. David Derbes has some comments here, and he is perhaps the main person to thank for seeing this project through to completion.

    David Kaiser has contributed an introduction to the book (available here, or, if this doesn’t work, try here) which does an excellent job of putting the material in historical and intellectual context, as well as describing what Coleman was like and why he had a huge influence on several generations of Harvard students. If you’ve already spent a lot of time learning QFT from various modern textbooks, your reaction to much of this one may be “that’s the standard way of explaining that point, nothing unusual here.” Keep in mind that often the reason that’s now the standard way of explaining things is that many authors of modern textbooks learned the subject from Coleman (or from someone who learned it from Coleman…).

  • Jim Baggott has a very good new book out, entitled Quantum Space, which could roughly be described as a popular account of loop quantum gravity at the level of the account of string theory in books like Brian Greene’s The Elegant Universe. Baggott has spent a lot of time talking to Carlo Rovelli and Lee Smolin, and one of the best aspects of the book is the way it conveys their personal stories, intellectual journey, and current outlook on the subject.

    One unavoidable topic that Baggott covers is the relation of string theory and LQG as competing (or perhaps someday collaborating?) approaches to the problem of quantum gravity. Due to long-ago experience (to get an idea, watch this), I’ve long since lost patience with arguments about which approach is “better”. Baggott’s take on the issue seems fair to me, but if you really want to engage in that argument it will have to be elsewhere than the comment section here.

  • Finally, the new book you really should buy a copy of is my brother Steve’s Fly Fishing Treasures. He has been working on it for years, and it includes the most amazingly beautiful pictures of antique fly fishing equipment in existence, as well as a wealth of information about those who collect these things. OK, if you, like me, aren’t especially excited about the topic of fly fishing, then buy a copy as a present for someone who is.

Update: I should have included links to postings about Coleman here, here and here.

Update: If you’ll be at the March APS meeting in Boston, I hear there’s a book launch for the Coleman book, 1:30pm-2:15pm Weds. March 6, at the World Scientific booth in the exhibition hall.

Posted in Book Reviews | 11 Comments

Should the Europeans Give Up?

The European HEP community is now engaged in a “Strategy Update” process, the next step of which will be an open symposium this May in Granada. Submissions to the process were due last month, and I assume that what was received will be made publicly available at some point. This is supposed to ultimately lead to the drafting of a new European HEP strategy next January, for approval by the CERN Council in May 2020.

The context of these discussions is that European HEP is approaching a very significant crossroads, and decisions about the future will soon need to be made. The LHC will be upgraded in coming years to a higher luminosity, ultimately rebranded as the HL-LHC, to start operating in 2026. After 10-15 years of operation in this higher-luminosity mode, the LHC will reach the end of its useful life: the marginal extra data accumulated each year will stop being worth the cost of running the machine.

Planning for the LHC project began back in the 1980s, and construction was approved in 1994. The first physics run was 16 years later, in 2010. Keep in mind that the LHC project started with a tunnel and a lot of infrastructure already built, since the LEP tunnel was being reused. If CERN decides it wants to build a next generation collider, this could easily take 20 years to build, so if one wants it to be ready when the LHC shuts down, one should have started already.

Some of the strategy discussion will be about experiments that don’t require the highest possible collision energies (the “energy frontier”), for instance those that study neutrinos. Among possibilities for a new energy frontier collider, the main ones that I’m aware of are the following, together with some of their advantages and drawbacks:

  • FCC-ee: This would be an electron-positron machine built in a new 100 km tunnel, operating at CM energies from 90 to 365 GeV. It would provide extremely high numbers of events when operated at the Z-peak, and could also be operated as a “Higgs factory”, providing a very large number of Higgs events to study, in a much cleaner environment than that provided by a proton-proton collider like the LHC.

    In terms of drawbacks, it is estimated to cost \$10 billion or so. The CM energy is quite a bit less than that of the LHC, so it seems unlikely that there are new unknown states that it could study, since these would have been expected to show up by now at the LHC (or at LEP, which operated at 209 GeV at the end).

    Another point in favor of the FCC-ee proposal is that it would allow for reuse of the tunnel (just as the LHC followed on LEP) for a very high energy proton-proton collider, called the FCC-hh, which would operate at a CM energy of 100 TeV. This would be a very expensive project, estimated to cost \$17 billion (on top of the previous \$10 billion cost of the FCC-ee).

  • HE-LHC: This would essentially be a higher-energy version of the LHC, in the same tunnel, built using higher field (16 T vs. 8.33 T) magnets. It would operate at a CM energy of 27 TeV. The drawbacks are that construction would be challenging (appropriate 16 T magnets do not yet exist) and that only a modest (27 vs. 14 TeV) increase in CM energy would be achieved. The big advantage over the FCC-hh is cost: much of the LHC infrastructure could be reused and the machine is smaller, so the total cost estimate is about \$7 billion.
  • CLIC: This would be a linear electron-positron collider, with the first stage of the project an 11 km-long machine operating at 380 GeV CM energy and costing about \$6 billion. The advantage of this machine over the circular FCC-ee is that it could ultimately be extended to a longer 50 km machine operating at 3 TeV CM energy (at a much higher cost). The disadvantage with respect to the FCC-ee is that it is not capable of operating at very high luminosity at lower energies (at the Z-peak or as a Higgs factory).

For some context on the very high construction costs of these machines: the CERN budget is currently around \$1.2 billion/year. It seems likely that member states will be willing to keep funding CERN at this level in the future, but I have no idea what prospects, if any, there are for significantly increased contributions to pay for a new collider. A \$10 billion FCC-ee construction cost spread out over 20 years would be \$500 million/year. Can this somehow be accommodated within CERN’s current budget profile? This seems difficult, but maybe not impossible. Where the additional \$17 billion for the FCC-hh might come from is hard to see.
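To make the scale concrete, here is a minimal back-of-the-envelope sketch in Python, using the cost estimates quoted above; the uniform 20-year construction spread is my simplifying assumption, not anything from the strategy documents:

```python
# Annualized construction cost vs. the current CERN budget (~$1.2B/year),
# assuming (as a simplification) costs spread evenly over 20 years.
CERN_BUDGET = 1.2e9  # dollars per year
YEARS = 20           # assumed construction period

projects = {
    "FCC-ee": 10e9,
    "FCC-hh (on top of FCC-ee)": 17e9,
    "HE-LHC": 7e9,
    "CLIC (380 GeV stage)": 6e9,
}

for name, cost in projects.items():
    per_year = cost / YEARS
    print(f"{name:28s} ${per_year / 1e9:.2f}B/year "
          f"({per_year / CERN_BUDGET:.0%} of the CERN budget)")
```

Run as-is, this shows the FCC-ee alone eating roughly 40% of the current budget every year for two decades, which is the arithmetic behind “difficult, but maybe not impossible” above.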

If none of these three alternatives is affordable or deemed worth the cost, it looks like the only alternative for energy frontier physics is to do what the US has done: give up. The machines and costs being considered here are similar in scale to the SSC project, which would have been a 40 TeV CM energy, 87 km proton-proton collider, but was cancelled in 1993. Note that the capabilities of the SSC would have been roughly comparable to those of the HE-LHC (higher energy, lower luminosity). Since the SSC would have started physics around 2000, and an HE-LHC might be possible in 2040, one could say that the cancellation set back the field at least 40 years. The worst part of the SSC cancellation was that the project was underway and there was no fallback plan. It’s hard to overstate how disastrous this was for US HEP. Whatever the Europeans do, they need to make sure they don’t end up with this kind of failure.

Faced with a difficult choice like this, there’s a temptation to want to avoid it, to believe that surely new technology will provide some more attractive alternative. In this case though, one is running up against basic physical limits. For circular electron-positron machines, synchrotron radiation losses go as the fourth power of the energy, whereas for linear machines one has to put a lot of power in, since one is accelerating and then dumping the beam, not storing it. For proton-proton machines, CM energy is limited by the strength of the dipole magnets one can build at a reasonable cost and operate reliably in a challenging environment. Sure, someday we may have appropriately cheap 60 T magnets so that a 100 TeV pp collider could be built at reasonable cost in the LHC tunnel. We might also have plasma wakefield technology that could accelerate beams of electrons and positrons to multi-TeV energies over a reasonable distance, with a reasonable luminosity. At this point though, I’m willing to bet that in both cases we’re talking about 22nd-century technology, unlikely to arrive in the 21st. Similar comments apply to prospects for a muon collider.
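For reference, here is the standard formula behind that fourth-power claim (the LEP2 numbers plugged in below are my illustration). The energy a relativistic electron radiates per turn in a ring of bending radius $\rho$ is

$$
U_0 = \frac{e^2}{3\varepsilon_0}\,\frac{\gamma^4}{\rho} \propto \frac{E^4}{m^4\rho},
\qquad
U_0[\mathrm{MeV}] \approx 0.0885\,\frac{E^4[\mathrm{GeV}]}{\rho[\mathrm{m}]}.
$$

At LEP2 ($E \approx 104$ GeV per beam, $\rho \approx 3.1$ km) this already comes to roughly 3 GeV lost per turn, about 3% of the beam energy, and the $E^4$ scaling means that doubling the energy in the same tunnel multiplies the loss per turn by sixteen. The $1/m^4$ factor is why protons, and prospectively muons, largely evade the problem.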

Another way to avoid the implications of this difficult choice is to convince oneself that cheaper experiments at low energy, or maybe astrophysical observations, can replace energy frontier colliders. Maybe one can get the same information about what is happening at the 1-10 TeV scale by looking at indirect effects at low energy. Unfortunately, I don’t think that’s very likely. There are things we don’t understand about particle physics that can be studied using lower energies (especially the neutrino sector) and such experiments should be pursued aggressively. It may be true that what we can learn this way can replace what we could learn with an energy-frontier collider, but that may very well just be wishful thinking.

So, what to do? Give up, or start trying to find the money for a very long-term, very challenging project, one with an uncertain outcome? Unlike the case of the LHC, we have no good theoretical reason to believe that we will discover a new piece of fundamental physics using one of these machines. You can read competing arguments from Sabine Hossenfelder (here and here) and Tommaso Dorigo (here, here and here).

Personally, I’m on the side of not giving up on energy frontier colliders at this point, but I don’t think the question is an easy one (unlike the question of building the LHC, which was an easy choice). One piece of advice though is that experience of the past few decades shows you probably shouldn’t listen to theorists. A consensus is now developing that HEP theory is in “crisis”, see for instance this recent article, where Neil Turok says “I’m busy trying to persuade my colleagues here to disregard the last 30 years. We have to retrace our steps and figure out where we went wrong.” If the Europeans do decide to build a next generation machine, selling the idea to the public is not going to be made easier by some of the nonsense from theorists used to sell the LHC. People are going to be asking “what about those black holes the LHC was supposed to produce?” and we’re going to have to tell them that that was a load of BS, but that this time we’re serious. This is not going to be easy…

Update: Some HEP experimentalists are justifiably outraged at some of the negative media stories coming out that extensively quote theorists mainly interested in quantum gravity. There are eloquent Twitter threads by James Beacham and Salvatore Rappoccio, responding to this Vox story. The Vox story quotes no experimentalists, instead extensively quoting three theorists working on quantum gravity (Jared Kaplan, Sabine Hossenfelder and Sean Carroll). Not to pick specifically on Kaplan, but he’s a good example of the point I was making above about listening to theorists. Ten years ago his work was being advertised with:

As an example question, which the LHC will almost certainly answer—we know that the sun contains roughly 10^60 atoms, and that this gigantic number is a result of the extreme weakness of gravity relative to the other forces—so why is gravity so weak?

Enthusiasm for the LHC back then, based on the idea that it was going to tell us about gravity, was always absurd, and a corresponding lack of enthusiasm for a new collider based on negative LHC results on that front is just as absurd.

Update: Commenter abby yorker points to this new opinion piece at the New York Times, from Sabine Hossenfelder. The subtitle of the piece is “Ten years in, the Large Hadron Collider has failed to deliver the exciting discoveries that scientists promised.” This is true enough, but by not specifying the nature of the failure and which scientists were responsible, it comes off as blaming the wrong people, the experimentalists. Worse, it uses this failure to argue against further funding not of failed theory, but of successful experiment.

The LHC machine and the large-scale experiments conducted there have not in any sense been a failure, quite the opposite. The machine has worked very well, at much higher than design luminosity, close to design energy (which should be achieved after the current shutdown). The experiments have been a huge success on two fronts. In one direction, they’ve discovered the Higgs and started detailed measurements of its properties, in another they’ve done an amazing job of providing strong limits on a wide range of attempted extensions of the standard model.

These hard-won null results are not a failure of the experimental program, but a great success of it. The only failure here is that of the theorists who came up with bad theory and ran a hugely successful hype campaign for it. I don’t see how the lesson from seeing an experimental program successfully shoot down bad theory is that we should stop funding further such experiments. Nor do I see why, having found out that theorists were wrong in predicting new phenomena at the few-hundred-GeV scale, we should treat new predictions by (often the same) theorists of no new phenomena at the multi-TeV scale as a reason not to fund the experimentalists who want to check.

Where I think Hossenfelder is right is that too many particle physicists of all kinds went along with the hype campaign for bad theory in order to get people excited about the LHC. Going on about extra dimensions and black holes at the LHC was damaging to the understanding of what this science is really about, and completely unnecessary since there was plenty of real science to generate excitement. The discussion of post-LHC experimental projects should avoid the temptation to enter again into hype-driven nonsense. On the other hand, the discussion of what to defund because of the LHC results should stick to defunding bad theory, not the experiments that refute it.

Update: Some more commentary about this, from Chris Quigg, and the CERN Courier. In particular, the CERN Courier has this from Gerard ‘t Hooft:

Most theoreticians were hoping that the LHC might open up a new domain of our science, and this does not seem to be happening. I am just not sure whether things will be any different for a 100 km machine. It would be a shame to give up, but the question of whether spectacular new physical phenomena will be opened up and whether this outweighs the costs, I cannot answer. On the other hand, for us theoretical physicists the new machines will be important even if we can’t impress the public with their results.

and, from Joseph Incandela:

While such machines are not guaranteed to yield definitive evidence for new physics, they would nevertheless allow us to largely complete our exploration of the weak scale… This is important because it is the scale where our observable universe resides, where we live, and it should be fully charted before the energy frontier is shut down. Completing our study of the weak scale would cap a short but extraordinary 150 year-long period of profound experimental and theoretical discoveries that would stand for millennia among mankind’s greatest achievements.

Update: Also, commentary at Forbes from Chad Orzel here.

Update: I normally try not to engage with Facebook, and encourage others to follow the same policy, but there’s an extensive discussion of this topic at this public Facebook posting by Daniel Harlow.

Posted in Experimental HEP News, Favorite Old Posts | 78 Comments

Michael Atiyah 1929-2019

While away on vacation, I heard the sad news of the death last week of Michael Atiyah, at the age of 89. Atiyah was both a truly great mathematician and a wonderful human being. In his mathematical work he simultaneously covered a wide range of different fields, often making deep connections between them and providing continual new evidence of the unity of mathematics. This unifying vision also encompassed physics, and the entire field of topological quantum field theory was one result.

I had the great luck to be at MSRI during the 1988-89 academic year, and Atiyah spent that January there. Getting a chance to talk to him then was a remarkable experience. He had one of the quickest minds I’ve ever encountered, often grasping what you were trying to explain before the words were out of your mouth. At one point that month I ran into Raoul Bott walking away from an ongoing discussion with Atiyah and Witten at a blackboard. Bott shook his head, saying something like “it’s just too scary listening to the two of them”.

Any question, smart or stupid, would lead to not just an answer, but a fascinating explanation of all sorts of related issues and conjectures. For Atiyah, his love of discussing mathematics was something to be shared at all times, with whoever happened to be around.

The last time I met him was in September 2016 in Heidelberg. He was his usual cheerful and engaging self, still in love with mathematics and with discussing it with anyone who would listen. I did notice though that age had taken its toll: he would no longer engage with anything that got into the sort of complexities that in the past he had been quick to see his way through. It’s unfortunate that near the end of his life far too much attention was drawn to implausible claims he started making that he could see how to solve some of the most difficult and intractable open problems of the subject.

There’s a lot more I could write here about Atiyah and his remarkable career, but I’ve realized that most of it I’ve already gotten to in one post here or another. So, for more, see some of the following older posts, which discussed:

Interviews and profiles here, here and here.

Atiyah and his work with Raoul Bott.

Atiyah and topological quantum field theory.

Update: In recent years Andrew Ranicki had been maintaining a page with Atiyah-related links.

Posted in Obituaries | 7 Comments

Roy Glauber 1925-2018: Notes on QFT

I saw today that Roy Glauber has passed away, at the age of 93. John Preskill speculates that Glauber was the last living member of the wartime T division at Los Alamos.

My only interaction with him was that he was the instructor for the first quantum field theory course I took, at Harvard during the 1976-77 academic year. The course was taught from what seemed then (at the time of the advent of gauge theories and wide use of the path integral method) a rather stodgy point of view, one that I’ve come to appreciate more later in life.

I just located the binder of notes I kept from the class and plan to look them over. It occurred to me that if I wanted to look at these on vacation, the thing to do was to scan them. So I just did that, and am making the scans available here in case others are interested:

Roy Glauber: Quantum Field Theory notes 1976-77

Roy Glauber: Quantum Field Theory problems and solutions 1976-77

Update: The New York Times has an obituary here.

Posted in Obituaries | 13 Comments

This Week’s Hype (and a couple other things)

For today’s university press release designed to mislead the public with hype about string theory, Uppsala University has Our Universe: An expanding bubble in an extra dimension. It’s the Swampland variant of string theory hype, based on this preprint, which is now this PRL. Marketing to other press outlets starts, as usual, here and here.

In the current Swampland hype, string theorists have “discovered” that string theory doesn’t necessarily have the landscape of vacua that makes it untestable, so now we’re finally on our way to testing string theory. In this press release version:

For 15 years, there have been models in string theory that have been thought to give rise to dark energy. However, these have come in for increasingly harsh criticism, and several researchers are now asserting that none of the models proposed to date are workable….

The Uppsala scientists’ model provides a new, different picture of the creation and future fate of the Universe, while it may also pave the way for methods of testing string theory.

For some other things that may be of more interest:

Finally, I’m leaving tomorrow for a two-week or so vacation in France, so blogging will likely be slim to non-existent.

Posted in Swampland, This Week's Hype, Uncategorized | 2 Comments

Tim May 1951-2018

I was sorry to learn yesterday of the death of Tim May, who had been a frequent commenter here on the blog. For more about his life, see here and here.

One can find his comments here for instance by this search. In some of these he told a bit of the story of his life. I’ll include here part of one such comment:

In 1970 I was accepted at MIT, Stanford, and Berkeley for college. I transferred my acceptance and Regents Scholarship from Berkeley to UC Santa Barbara. A lesser school compared to Berkeley, on overall grounds, but a more interesting fit to my interests. (College of Creative Studies, with many advantages.)

By around 1972 it was clear the Big Drought was unfolding. Tales of Ph.D.s driving taxi cabs, professors advising that the odds of the then-current Ph.D. candidates getting a real position were dwindling. (Besides the overall downsizing of HEP and other physics funding, there was a glut of physics professors who had been hired in the post-Sputnik boom era….and they were still 30 years or more from retirement.)

Fast forwarding, I decided to not apply to grad school and instead join a small semiconductor company. There, I worked on a bunch of “engineering physics” problems. Because we were the leaders in dynamic RAM memory, I had exposure to some interesting problems. One of them was the mysterious issue of bits sometimes being flipped, but not permanently. In fact, the bit flips were apparently random and occurred only once (or at least close to only once…).

My physics background served me well, as I knew about the physics of how the devices worked (more so than a lot of the EE folks, who thought in terms of circuits), and I knew some geology. I had a brainstorm that maybe low levels of uranium or thorium or the like in our ceramic and glass packages were causing the problem. Some experiments confirmed this. And all of the physics calculations about charged particle tracks in silicon matched. A lot of stuff I don’t have the space here to describe.

So my career was launched. Lots of papers on this “soft error” phenomenon. (Oh, and the cosmic ray corollary was indeed obvious: but in 1978 when the first paper was presented, it was insignificant as a source as compared to alpha particles.)

Instead of spending until 1980-82 doing a Ph.D. and then 4-8 years or more as a post-doc, I had some fun and retired from Intel in 1986.

I’ve been pleasantly able to pursue whatever interested me ever since.

Posted in Obituaries | 2 Comments

The Chronic Incompleteness of String Theory

Ex-string theorist turned philosopher Richard Dawid has become known over the years for his arguments that string theory is a theory to be evaluated not by the conventional scientific method, in which experiment plays a role, but by “post-empirical theory assessment” methods. He has a book about this, and I’ve written about his arguments here, here and here.

Today he has a new paper out, entitled Chronic Incompleteness, Final Theory Claims, and the Lack of Free Parameters in String theory, which tries to address the “chronic incompleteness” problem of string theory’s claim to be a complete unified theory. This problem is starting to look very serious:

Rather than bringing the time horizon for the completion of fundamental physics from virtual infinity to somewhere within our lifetime, string theory’s final theory claim seems to be associated with an extension of the time horizon for the completion of this particular theory that may, once again, virtually reach towards infinity.

In other words, there’s a chronic problem of string theorists not being able to tell us what the theory actually is, and it’s now looking like they’ll never be able to do so. Dawid is right that this is the root of the problem, not the usual excuses like “it predicts stuff, but you’d need an accelerator as big as the galaxy to test these predictions” or “the equations are just too hard to solve”.

One question here is that of defining what “string theory” even means anymore. Dawid does the best he can with this in a footnote:

Here as throughout the entire paper, the term ‘string theory’, if not specified otherwise, denotes the overall theory that aims at describing the observed world and is identified by the present knowledge on perturbative superstring theory, duality relations, etc.

This isn’t exactly a precise definition; it’s basically just “string theory is a conjectural theory with a certain list of properties which I’m not going to even try and describe, since that would get really complicated and different people likely have different lists”.

In the main text, Dawid explains that “string theory has no fundamental dimensionless free parameters”, a claim often made that I’ve always found kind of baffling. If you don’t know what the theory is, how do you know that it doesn’t have free parameters??? He makes a great deal of this assumption, adding the argument that this lack of parameters means no classical limit, and thus, I guess, no formulation of the theory as a “quantization” of something describable in classical terms.

I don’t really see what the big deal is about having a quantum system that is not defined as the quantization of some classical system. A simple example of such a system is the qubit that we often start teaching quantum mechanics with. Somehow Dawid wants to get from “not quantization of a classical system” to “we can’t ever hope to write down the theory”, but I don’t see how this follows.
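To make the qubit example concrete (my formulation, not Dawid’s): the system is specified directly by a Hilbert space and a Hamiltonian,

$$
\mathcal{H} = \mathbb{C}^2, \qquad
H = \frac{\hbar\omega}{2}\,\sigma_z, \qquad
\sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},
$$

and nothing in this definition refers to a classical system being quantized. The theory is perfectly well defined anyway, which is why the step from “no classical limit to generalize away from” to “the full theory can never be written down” needs an argument that Dawid doesn’t supply.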

He examines various possibilities for how the problem of no fundamental theory can be resolved. His alternative C is the obvious one: we just haven’t found it yet. He would like to argue that this might not be right, that string theory is a new and different kind of science:

…string theory and the conceptual context within which it is developed is in a number of ways substantially different from anything physicists have witnessed up to this point. Therefore, it is far from clear whether prevalent physical intuitions as to which kinds of questions can be expected to have a fully calculable theoretical answer are applicable in this case. It seems difficult to rule out that what seems to be a question that finds a fully calculable theoretical answer in fact rather resembles the case of the leaf carried by autumn winds and just defies calculation.

Dawid seems to argue that string theory may be an example of what he calls alternative A:

Even in principle, there exists no mathematical scheme that is empirically equivalent to string theory and generates quantitative results that specify the fundamental dynamics of the theory. In that case, the fundamental theory is conceptually incomplete by its very nature. It has no fundamental dynamics and no set of solutions that can be deduced from its first principles. The fundamental theory merely serves as a conceptual shell that embeds low energy descriptions (ground states of the theory) consistent with the principles encoded in the fundamental theory. Those low energy descriptions contain specified parameter values and do generate quantitative results. But there is no way to establish from first principles how probable specific ground states of the system are.

His summary of his vision of string theory is as follows:

Full access to a theory without free parameters thus might be expected to require representations that don’t have their own classical limit. The fact that they cannot be developed by generalizing away from a classical limit seems to impede the full formulation of a final theory even once one has found it. The resulting idea of a fundamental theory whose full formulation is hidden from the physicists’ grasp because its most adequate representation lacks intuitive roots has even more radical rivals, which amount to questioning the possibility of calculating the dynamics of the fundamental theory either within the bounds of human calculational power or as a matter of principle.

At one point Dawid acknowledges that some people have drawn the obvious conclusion about the current situation, the one consistent with our usual understanding of science:

It has been suggested by various exponents and observers of contemporary fundamental physics (see e.g. Smolin 2003, Woit 2003, Hossenfelder 2018) that the chronic incompleteness of string theory represents a substantial failure of the research program that is indicative of a strategical problem that has afflicted fundamental physics in recent decades.

He doesn’t like this conclusion, so argues that this time it’s different:

Considering the range and character of the very substantial differences that set the current state of fundamental physics apart from any previous stage in the history of physics, there is little reason to expect that theory building at the present stage can be judged according to criteria that seemed adequate in the past.

While he doesn’t say so, this argument takes him back to the problem of how one is to judge “string theory”, but from a position even more radical than his earlier one. The argument now seems to be that we’re supposed to accept as the final, fundamental theory of physics a “theory” that is not just untestable, but is a “chronically incomplete” framework based on something we can never hope to define or understand. I’m having trouble understanding why this is supposed to be science rather than the other human endeavor it looks a lot more like: theology.

Posted in Uncategorized | 12 Comments