Until about a year and a half ago, the way to get funding in physics was to somehow associate yourself with the hot trend of quantum computing and quantum information theory. Large parts of the string theory and quantum gravity communities did what they could to take advantage of this. On November 30, 2022, this suddenly changed, as two things happened on the same day:
- Quanta magazine, Nature and various other places were taken in by a publicity stunt, putting out videos and articles that day about how “Physicists Create a Wormhole Using a Quantum Computer”. The IAS director compared the event to “Eddington’s 1919 eclipse observations providing the evidence for general relativity.” Within a few days, though, people looking at the actual calculation realized that these claims were absurd. The subject had jumped the shark and was becoming a joke among serious theorists. That quantum computers more generally were not living up to their hype didn’t help.
- OpenAI released ChatGPT, very quickly overwhelming everyone with evidence of how advanced machine learning-based AI had become.
If you’re a theorist interested in getting funding, obviously the thing to do was to pivot quickly from quantum computing to machine learning and AI, and get to work persuading the people at Quanta to provide suitable PR. Today Quanta features an article explaining how “Using machine learning, string theorists are finally showing how microscopic configurations of extra dimensions translate into sets of elementary particles.”
Looking at these new neural network calculations, what’s remarkable is that they’re essentially a return to a failed project of nearly 40 years ago. In 1985 the exciting new idea was that maybe compactifying a 10d superstring on a Calabi-Yau would give the Standard Model. It quickly became clear that this wasn’t going to work. A minor problem was that there were quite a few classes of Calabi-Yaus, but the really big problem was that the Calabi-Yaus in each class were parametrized by a high-dimensional moduli space. One needed some method of “moduli stabilization” that would pick out specific moduli parameters. Without that, the moduli parameters became massless fields, introducing a huge host of unobserved new long-range interactions. The state of the art 20 years later is that endless arguments rage over whether Rube Goldberg-like constructions such as KKLT can consistently stabilize moduli (if they do, you get the “landscape” and can’t calculate anything anyway, since these constructions give exponentially large numbers of possibilities).
If you pay attention to these arguments, you soon realize that the underlying problem is that no one knows what the non-perturbative theory governing moduli stabilization might be. This is the “What Is String Theory?” problem that a consensus of theorists agrees is neither solved nor on its way to solution.
The new neural network twist on the old story is that one may now be able to compute some details of explicit Calabi-Yau metrics, yielding numbers that it was already clear back in the late 1980s weren’t really relevant to anything, since they are meaningless unless you have solved the moduli stabilization problem. Quanta advertises this new paper and this one (which “opens the door to precision string phenomenology”), as well as a different sort of calculation which used genetic algorithms to show that “the size of the string landscape is no longer a major impediment in the way of constructing realistic string models of Particle Physics.”
I’ll end with a quote from the article, in which Nima Arkani-Hamed calls this work “garbage” in the nicest possible way:
“String theory is spectacular. Many string theorists are wonderful. But the track record for qualitatively correct statements about the universe is really garbage,” said Nima Arkani-Hamed, a theoretical physicist at the Institute for Advanced Study in Princeton, New Jersey.
A question for Quanta: why are you covering “garbage”?
Update: String theorist Marcos Mariño on Twitter:
In my view, using today’s AI to calculate the details of string compactifications is such a waste of time that I fear that a future Terminator will come to our present to take revenge for the merciless, useless exploitation of its grandparents.
Update: More string theory AI hype here.