Last week a symposium on Nature at the energy frontier was held at the ETH in Zurich, funded by the Latsis foundation. Videos of many of the talks have appeared here.
As part of the symposium, David Gross gave a public talk, available here. Gross is often invited to give such talks, and I’ve either attended or taken a look at the video for quite a few of them. The first substantive posting on this blog back in 2004 was about one such talk. Back then, Gross was claiming that in 3-4 years there would be a headline in the New York Times about the discovery of supersymmetry at the LHC (he was overly optimistic about how long it would take to get the machine working properly). I wondered at the time:
What will be interesting to see will be what Gross et al. do when this doesn’t happen. Will they drop string theory?
In all of the talks I’ve seen since that time, Gross continued to express optimism, including willingness to bet significant sums of money on a discovery of superpartners at 50/50 odds. His talk at the Latsis symposium included a remarkable change of tone, featuring an Einstein quote I hadn’t seen him use before:
The successful attempt to derive delicate laws of nature, along a purely mental path, by following a belief in the formal unity of the structure of reality, encourages continuation in this speculative direction, the dangers of which everyone vividly must keep in sight who dares follow it.
He described Einstein as encouraging the kind of speculative path followed by string theory and SUSY while at the same time warning of its dangers, and noted that Einstein himself devoted the latter part of his life to a speculative path that turned out to be a dead end.
In his explanation of the standard arguments for string theory and SUSY, Gross was much more cautious than in the past, careful to explain that these arguments were based on just speculative “clues”. These clues might just be coincidences, but the main reason for not giving up on them was that we don’t really have any others.
About the LHC results, Gross described them as having now ruled out the simplest SUSY models, which could be a clue that SUSY does not exist. He said that if SUSY is not seen at the LHC, we will learn for sure that theorists have been on the wrong track for many decades. According to him, this would “change a lot of our way of thinking”, since “we have been pursuing these clues for a long time.” He didn’t discuss how ways of thinking will change, although he is well known to strenuously reject one popular idea about this (anthropics and the multiverse).
Erik Verlinde gave an odd talk with a title that promised to address these issues, String Theory and the Future of Particle Physics. He explained the history of string theory and how the original hope was that it would tell us where the SM parameters came from, something which “looks differently now”. According to him, the idea of strings moving in a compactified 10d was the “old view”, now made obsolete by branes. The “present view” is all about gauge-gravity duality, which doesn’t say anything about those SM parameters. According to him the “future view” of string theory will somehow start from some new basic principles that we don’t know yet, providing an underlying microscopic description that will explain holography, and give an “emergent” explanation of space, time, matter, etc. There was no explanation at all of what these new principles were or what the new microphysical description would be. String theory itself would just be an “effective theory”, like the Standard Model.
After this introduction, Verlinde then moved on to something that seems to have nothing at all to do with string theory, giving a long explanation of how he thinks there are some new degrees of freedom out there that have dynamics at long time scales. These should be treated by the methods of polymer dynamics, and they will explain dark matter and dark energy. He produced lots of graphs of astrophysical data, and claimed that he has an “extension” of Newtonian gravity, better than MOND, which explains the astrophysical data. At the end of the talk there were a bunch of questions about the cosmological implications of this; I couldn’t tell if Verlinde had an answer. He has been talking about this idea in public talks for a couple of years now, although as far as I can tell there is no paper. Also, as far as I can tell, no one else besides him takes this seriously.
The last talk of the conference was from Nima Arkani-Hamed, giving basically the same talk I’ve written about recently here and here. I confess to not listening to the whole thing, since it was quite familiar (although I did notice he addressed the question of the possibility there wasn’t really a hierarchy problem, saying people who raised that were “frustrating”, since he had thought about it for decades, and “trust me, there’s a hierarchy problem”). In the question session, he made the same point I often end up arguing with string theory proponents about, saying (at 1:14) that even if “you can do experiments at the string scale, [it] wouldn’t help you at all”. The idea that you would see string excitations on a compactified space he characterizes as a misguided old idea from the 1990s. If there’s a landscape, the possibilities for Planck scale behavior are so complex that you can’t predict what experiments at that scale would see. I’m glad to see that now, instead of getting into arguments like this one, I can send people to go argue with Nima.
A historical comment: You, and others, have often emphasized that this disconnect between theory and observability, and acceptance of this disconnect, is unprecedented in the history of science.
In fact, hydrodynamics provides a very relevant historical example: in the 18th century it was realized that ideal hydrodynamics was a very lousy description of the behavior of fluids, even fluids with a very low viscosity (“D’Alembert’s paradox”, the vanishing drag force on a cylinder in steady potential flow, is a textbook example).
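(For the record, the textbook computation behind the paradox: for ideal potential flow past a cylinder of radius $a$ in a uniform stream $U$, the surface flow speed is $2U\sin\theta$, so Bernoulli gives a surface pressure symmetric fore and aft, and the drag integral vanishes identically:

```latex
% Surface pressure from Bernoulli for potential flow past a cylinder,
% and the resulting drag force:
p(\theta) = p_\infty + \tfrac{1}{2}\rho U^2\left(1 - 4\sin^2\theta\right),
\qquad
D = -\int_0^{2\pi} p(\theta)\,\cos\theta\; a\,d\theta = 0 ,
```

since both $\int_0^{2\pi}\cos\theta\,d\theta$ and $\int_0^{2\pi}\sin^2\theta\,\cos\theta\,d\theta$ vanish; hence D’Alembert’s “singular paradox”.)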
The reaction of the scientific community was pretty much the reaction of supersymmetry and string theory practitioners today:
————————-
http://en.wikipedia.org/wiki/D%27alembert%27s_Paradox
D’Alembert, working on a 1749 Prize Problem of the Berlin Academy on flow drag, concluded: “It seems to me that the theory (potential flow), developed in all possible rigor, gives, at least in several cases, a strictly vanishing resistance, a singular paradox which I leave to future Geometers [i.e. mathematicians – the two terms were used interchangeably at that time] to elucidate”.[4] A physical paradox indicates flaws in the theory.
Fluid mechanics was thus discredited by engineers from the start, which resulted in an unfortunate split – between the field of hydraulics, observing phenomena which could not be explained, and theoretical fluid mechanics explaining phenomena which could not be observed – in the words of the Chemistry Nobel Laureate Sir Cyril Hinshelwood.[5]
——————————–
The best mathematicians in the world continued to study formal, beautiful, and irrelevant solutions of inviscid fluid dynamics, and the reason the real world is grossly different received no real attention until the beginning of the 20th century.
Substitute “supersymmetry” for “ideal fluid dynamics”, and you get today’s impasse.
The work on AdS/CFT and amplitudes reflects very well the theorist’s response to disappointing experimental results.
In retrospect, was it a mistake? Certainly we learned a lot of mathematical techniques from these solutions, which are now used elsewhere.
On the other hand, the _reasons_ the real world is different (which have to do with turbulent regions which form at low viscosity) are equally, if not more, fascinating and rich in mathematical insight. On the “third hand”, these reasons are also “hard”; we still have not completely understood them, so it is difficult to see whether “extra effort” at that time would have made much difference.
lun,
Interesting analogy. One difference though is that here we already have a wonderful mathematical theory that works (the Standard Model).
On an older topic, your more mathematically interested readers may want to go to terrytao.wordpress.com and click on the polymath8 link to see recent progress in improving Zhang’s bound on prime gaps (now down to around 250,000 instead of 70,000,000.) You can watch mathematics in action, in almost real time, and participate yourself, if you know your stuff.
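(For readers who want to see the basic objects in play: Zhang’s theorem says some gap of size at most 70,000,000 occurs between consecutive primes infinitely often, and polymath8 is shrinking that constant. A toy sieve, which of course has nothing to do with the actual sieve-theoretic machinery, at least makes “prime gaps” concrete:

```python
# Toy illustration of prime gaps: list the gaps between consecutive
# primes below a modest limit. (Purely illustrative; the polymath8 work
# is about proving small gaps occur infinitely often, not computing them.)

def primes_below(n):
    """Sieve of Eratosthenes: all primes p < n."""
    sieve = [True] * n
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

ps = primes_below(10_000)
gaps = [b - a for a, b in zip(ps, ps[1:])]
print(min(gaps))       # the smallest gap, from the pair (2, 3)
print(gaps.count(2))   # number of twin-prime pairs below 10,000
```

The open question is about which gap sizes recur infinitely often, which no finite computation like this can settle.)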
Thanks David,
That is an amazing project, I was thinking of finding some way to write about it. Shows the power of crowd-sourcing in action, when your crowd includes Terry Tao…
I’m not sure I’d describe the Standard Model as a wonderful mathematical theory. The computations we can do work nicely, of course, but the Standard Model is not a well-defined mathematical model in the way that hydrodynamics is.
In hydrodynamics, it’s very clear what the mathematical model is, i.e., what the degrees of freedom and observables are, and what rules govern their dynamics. The big questions mainly concern whether solutions to these dynamical rules exist for various sorts of boundary conditions.
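To be concrete, the model here is the incompressible Navier-Stokes system:

```latex
% Incompressible Navier-Stokes: velocity field u(x,t), pressure p(x,t),
% kinematic viscosity \nu.
\partial_t u + (u\cdot\nabla)\,u \;=\; -\nabla p + \nu\,\Delta u,
\qquad \nabla\cdot u = 0 .
```

The degrees of freedom and the dynamical rules are right there on the page; the open problems (including the Clay prize question) concern existence and smoothness of solutions in three dimensions.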
In the Standard Model, on the other hand, it is _not_ clear what the mathematical model is, let alone which boundary conditions play well. We would like the Standard Model to be a continuum quantum field theory. Typically we specify these either by simply writing down some observables and their OPE coefficients (the “divine inspiration” method) or by showing that some system of regularized Feynman integrals has a well-defined continuum limit.
To the best of my knowledge, we don’t have even a physically satisfactory candidate for the set of regularized Feynman integrals. Note that I’m not just complaining we can’t take the continuum limit with perfect mathematical rigor. What I’m saying is that every regulator I’ve ever heard of breaks some crucial feature of the Standard Model: gauge invariance, Lorentz symmetry, locality, or chiral symmetry.
To make matters worse, there’s no evidence that any regularization would have a sensible continuum limit, thanks to the running couplings in the Higgs & U(1) sectors. (You can argue that there’s no Landau pole if you want, but the burden of proof is on you. And the numerical results aren’t exactly encouraging.)
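To spell out where the worry comes from: at one loop, with a single Dirac fermion of unit charge standing in for the U(1) sector, the running coupling satisfies

```latex
% One-loop running of the U(1) coupling and the resulting Landau pole:
\mu\,\frac{de}{d\mu} = \frac{e^3}{12\pi^2}
\quad\Longrightarrow\quad
\frac{1}{e^2(\mu)} = \frac{1}{e^2(\mu_0)} - \frac{1}{6\pi^2}\,\ln\frac{\mu}{\mu_0} ,
```

so $1/e^2$ hits zero, and the coupling formally diverges, at $\mu = \mu_0\,\exp\!\left(6\pi^2/e^2(\mu_0)\right)$. Of course one-loop perturbation theory has broken down long before that scale, which is exactly why whether a continuum limit exists is a nonperturbative question.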
Sorry to be such a downer, but I don’t think it’s particularly fair of you to take a maximally pessimistic viewpoint on string theory’s mathematical structure (in the linked MathOverflow post) while advertising the Standard Model as ‘wonderful’. The SM is far more successful as physics, but mathematically, it’s in terrible condition. I’m not even sure it’s likely that the SM is anything more than a set of recipes for making approximate computations in some unknown model.
Hi A.J.,
I know basically nothing about hydrodynamics, or the associated mathematical models, other than that there’s a million dollar prize for understanding something about the solutions of Navier-Stokes, a situation that doesn’t look that different from the million dollar prize for showing there’s a mass gap in pure Yang-Mills (which as you know well can be rigorously defined on the lattice).
In what I wrote at mathoverflow, I was trying to give not something maximally pessimistic, but an accurate explanation of what the status is of the attempt to give a definition of “string theory”. As far as I know, the statement that no one knows how to define string theory outside of perturbation theory (i.e. M-theory) is completely accurate (other than as a conjectural beast that satisfies certain consistency conditions and has various known limits). People really don’t know what the right fundamental variables and laws are for the theory, even at a very formal, conjectural level. I’d be interested to see a good reference if this is not accurate.
The situation with the Standard Model as a mathematical structure is a complex and fascinating one, and I know you’re well aware of this. While a rigorous formulation with proven properties is lacking, formally you can write the thing down, and a lot is known about how to make sense of the formal equations. I think there’s a glass half-full/half-empty situation here. To me the SM has a huge amount of precise and very deep and powerful mathematical structure and we know in principle how to do accurate calculations with it, although there are fascinating remaining issues about how to get a rigorous definition one can prove things about. To me that’s a glass more than half-full, but tastes can differ.
Yes, you can say many similarly precise things about perturbative string theory, but the current lore (see Arkani-Hamed’s comment) is that that’s not what’s relevant to the real world. I don’t believe you can produce a realistic (e.g. with the right CC) string theory model purely defined using perturbative string theory. Typical conjectured realistic models use an effective potential for moduli that get stabilized through complex mechanisms, invoking extremely poorly understood and conjectural aspects of non-perturbative string theory. Again, here you don’t even know what the right fundamental variables or dynamical laws are, even formally.
I am merely an interested bystander wrt theoretical physics, but I know a fair amount about fluid dynamics (a somewhat more correct name than hydrodynamics). I think that A. J. is off base regarding the well-definedness of fluid dynamics. The overriding problem in fluid dynamics for many, many years has been the modeling of turbulence – there are more degrees of freedom there than you have equations, so the history of accurate solutions (almost all numerical) to the Navier-Stokes equations is the history of turbulence models, and their agreement with experiment is still not that impressive.
Also, for the historical record, the key insight that bridged the gap from D’Alembert’s potential flow theory to modern understanding of fluid dynamics was Prandtl’s boundary layer approximation. Any analogy between boundary layer theory in fluid dynamics development and quantum electrodynamics in the development of modern physics?…
Cthulu,
Thank you for the correction. That’s a lot worse than I’d realized.
Peter,
I’m not sure it’s a glass half full/half empty situation, as far as the mathematics goes. (I see it as closer to 40/60. 😉 ) I don’t think the situation with the Standard Model vis-à-vis formal existence is much better than the one in string theory. This is, of course, a subjective opinion, and has to do purely with the mathematical merits, not the physical ones.
With the Standard Model, we can write down a classical version and we can write down a free field theory which ought to be the zero coupling limit. This can be done with basically arbitrary amounts of rigor. But when we talk about the Standard Model, we’re asserting that there exists an interacting QFT which deforms these things. This is a very sketchy definition, even at a physical level! It’s not a completely absurd suggestion, of course. Mathematicians have shown that such deformations can exist in some situations. (But they’ve also shown that they can’t exist in others: there aren’t a lot of interacting scalar QFTs in higher dimensions.) And there are subsectors where we can use perturbation theory or Monte Carlo integrals to make approximate computations.
But this supposed deformation should be regarded with at least a little skepticism. The Landau poles suggest it doesn’t exist. And the phenomenon of confinement tells us that the deformation is quite singular; the set of observables changes so much that a mass gap appears! It’s not really clear precisely what expectation values we’re computing; we have to actually write down a set of correctly dressed observables for the IR. Likewise, the troubles we have with regularizing chiral gauge theories (and generally with dealing with fermion determinants) suggest to me that we’re missing important parts of the story. We really don’t have a completely plausible description of the full dynamics yet. And without a sufficient understanding of the dynamics, we can’t _quite_ say what the observables are. (Thus the large cash prize for existence and mass gap in the only interacting subtheory where we think we might have a satisfactory definition.)
The mathematical status of string theory looks analogous to me. They’ve got a number of quantum mechanical theories (the 5 string theories, free 11d supergravity, large N quantum mechanics, ABJM, AdS/blahblahblah) which they say are degenerate limits of some unknown quantum theory. They can study these theories in infinitesimal neighborhoods, and they assert that bigger deformations exist. They don’t have anything as nice as Monte Carlo, but they do have all these BPS & wall crossing computations, which at least suggest that there are paths connecting these limits. (Compare to the SM, where we have pretty good numerical evidence that key parts of the theory, like the field whose interactions give mass to the weak bosons and fermions, can’t actually be made to interact in the manner prescribed.)
Analogous but worse actually. I don’t intend to minimize the difficulty of the puzzles string theory presents. The degrees of freedom and the dynamics are both generally a mystery. The nonperturbative effects are stronger. And the IR problems that appear when you try to deform the known limits are really gnarly. You thought confinement screwed up your description of observables? Meet black holes.
Lastly, note that I’m not advocating that we not study gauge theories, especially chiral ones. They’re probably my favorite bit of mathematics. I’m just arguing that the Standard Model itself isn’t a particularly good example of a gauge theory, chiral or otherwise. The ones that fall out of string theory are far nicer.
AJ,
The difference as I see it is that in the SM case (modulo problems with chiral gauge couplings to fermions), we have well-defined theories for finite cut-off. We also know a lot about what is likely to happen as we remove the cut-off (for QCD, conjecturally asymptotic freedom tells us this can be done and how to do it, not that we can prove that this works yet…). It may very well be that for some parts of the SM (Higgs, U(1)) there is no way to remove the cutoff and get non-trivial interactions. If so, this tells us something very interesting about physics, that the SM has to fail at some scale (with gravity or some unknown physics coming to the rescue perhaps).
With string theory though, generically non-perturbatively you just don’t have anything like this (a well-defined, non-perturbative cut-off theory where you can study the continuum limit). All you’ve got is some consistency arguments saying maybe something is there.
I thought the main difference between the SM and string theory is that we have experimental probes of the SM. That is, we can test the SM against experiments and see if our attempts to make sense of it are at least consistent with reality. With Planck scale physics like string theory, we don’t have any experiments telling us if it’s consistent with reality, so we need more rigor to show that it’s at least consistent with itself.
If the SM isn’t mathematically consistent in any incarnation, it is at least an effective “rule of thumb” for making predictions about nature. A particle physicist can look at the ill defined nature of the SM and say “at least its predictions are accurate”.
AJ said
“Compare to the SM, where we have pretty good numerical evidence that key parts of the theory, like the field whose interactions give mass to the weak bosons and fermions, can’t actually be made to interact in the manner prescribed.”
What “numerical evidence” are you talking about?
A.J., you’ve set a very high standard of rigor for the Standard Model (or any QFT) to meet. I would call that a rather pessimistic point of view. However, keeping the same standard, one has to admit that there are no realistic alternatives to the Standard Model that can best it or even match it at its current level of rigor. That is, there is no alternative that is in better shape either. I’m not saying anything that hasn’t already been admitted in this discussion. I’m just pointing out that if you choose this reason to feel pessimistic about the Standard Model, then there is no reason to feel any less pessimistic about alternatives.
DimReg:
All true, but not relevant. I was talking about the merits of the SM as a particular kind of mathematical model, a continuum QFT. The point is that, while you can do computations and you can say precisely what some of the parts are and vaguely what the whole thing ought to be, you can’t write down a completely satisfactory definition, even on the lattice. This is not a problem for the physics, but it’s a bit of a nuisance if you want to study the thing as a piece of mathematics. (Compare to the Navier-Stokes model of fluid dynamics, where you can just write down the differential equation and go from there.)
Mitchell Porter:
That comment references the apparent triviality of phi-4 theory. Wilson conjectured it and it was one of the first things people investigated with computers. What they found generally was a lack of evidence for a continuum limit. The early literature that comes to mind includes Smolensky’s thesis and Lüscher & Weisz’s papers from ~’87. There may be more recent literature contradicting those early studies, but I haven’t heard of anything that was universally accepted. Since I’m taking the pessimistic viewpoint here, I’ll let others make the argument that phi-4 and U(1) triviality is not a problem.
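(For anyone who hasn’t seen what such lattice studies involve, here is a minimal sketch of the generic ingredient, a Metropolis simulation of 2d lattice phi-4. The parameters, lattice size, and dimension are purely illustrative; the actual triviality analyses are in 4d and far more sophisticated:

```python
# Minimal Metropolis sketch for 2d lattice phi^4 (illustrative parameters).
# Action per site: nearest-neighbor kinetic term + (m^2/2) phi^2 + (lam/4) phi^4.
import math
import random

random.seed(0)

L = 8                      # side of the periodic 2d lattice
M2, LAM = 0.5, 1.0         # bare mass^2 and quartic coupling (symmetric phase)
DELTA = 1.0                # Metropolis proposal width
phi = [[0.0] * L for _ in range(L)]

def neighbors(i, j):
    """Four nearest neighbors of site (i, j), with periodic wrapping."""
    return [((i + 1) % L, j), ((i - 1) % L, j),
            (i, (j + 1) % L), (i, (j - 1) % L)]

def local_action(val, i, j):
    """Part of the action that depends on the field value at site (i, j)."""
    kinetic = sum(0.5 * (val - phi[a][b]) ** 2 for a, b in neighbors(i, j))
    return kinetic + 0.5 * M2 * val ** 2 + 0.25 * LAM * val ** 4

accepted = tried = 0
history = []
for sweep in range(1000):
    for i in range(L):
        for j in range(L):
            old = phi[i][j]
            new = old + random.uniform(-DELTA, DELTA)
            dS = local_action(new, i, j) - local_action(old, i, j)
            tried += 1
            if dS < 0 or random.random() < math.exp(-dS):
                phi[i][j] = new
                accepted += 1
    history.append(sum(map(sum, phi)) / L ** 2)  # volume-averaged field

acc_rate = accepted / tried
mean_phi = sum(history[200:]) / len(history[200:])  # discard thermalization
```

The triviality studies ask how the renormalized coupling extracted from correlators of simulations like this behaves as the continuum limit is approached, which is where the hard work actually lives.)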
Peter:
Sure we can define Standard Model-like QFTs, and I’d be happy to bet that they can be made completely rigorous. But the chiral couplings problem and the triviality are real problems for mathematical study of the SM itself.
DimReg,
As Arkani-Hamed points out, there really is a serious problem with the idea that the problem with string theory is just that you can’t do Planck scale experiments to test it. This was only true pre-1995, what Verlinde calls the “old view” of string theory. Once you have decided that you really need to do M-theory, that holography is important, that string theory is going to give you a new idea about what space is, then you no longer have a well-defined, testable framework, at any distance scale. That’s the real problem with string theory, not experimental limitations.
Well, the originator of perhaps the leading hope to have a formal understanding of string theory is now working on this:
http://arxiv.org/abs/arXiv:1306.0533
I think it’s a sign that this particular hope has fizzled.
A.J.,
I think your view of the SM is overly pessimistic. The situation is a lot better after the developments in lattice chiral gauge theory in the last 15 years. At the perturbative level everything is fine conceptually and mathematically (I’ll explain below), and at the nonperturbative level the thing that remains to be done seems to be “just” a hard technical problem; conceptually there are no apparent problems.
First, regarding what you wrote earlier about the issue of regulators breaking symmetries: This doesn’t matter as long as only a finite number of new counterterms (i.e. new terms in the quantum effective action that weren’t present in the original bare action) arise due to the broken symmetry. Then you can just include these terms in your bare action to begin with and tune their coefficients to remove them order by order in perturbation theory (and probably also nonperturbatively in a lattice formulation).
Anyway, for gauge theories at the perturbative level with the lattice regularization: this regularization breaks the Euclideanized Lorentz symmetry (after Wick rotation to Euclidean spacetime) but preserves a subgroup of it – the hypercubic lattice rotations. Miraculously, this residual symmetry is still enough to ensure that no new counterterms arise. So there is no issue there. E.g. perturbative QCD with lattice regularization is in just as good shape as with dimensional regularization (although in practice computations with the lattice regulator are much messier).
Now about chiral symmetry: The long-standing problem of formulating chiral fermions on the lattice was solved at the conceptual level 15 years ago. The solution goes under the name “overlap fermions” or “Ginsparg-Wilson fermions” (the name you use depends on whether you would rather be friends with Neuberger or Luscher 😉 ). This produced a new lattice Dirac operator – the “overlap operator” (hep-lat/9707022, 1000+ citations already) which satisfies the so-called Ginsparg-Wilson relation and therefore has an exact symmetry which can be regarded as a lattice-deformed version of the continuum chiral symmetry (see hep-lat/9802011). Consequently the massless lattice Dirac fermion action decomposes into left- and right-handed parts, giving the chiral fermion actions. Then you can write down the bare action for the SM with the lattice regularization (Yukawa terms coupling the chiral fermions and the Higgs can also be done) and everything is now fine at the classical field theory level.
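(To spell out the key identity for readers following along: a Ginsparg-Wilson operator $D$ satisfies

```latex
% Ginsparg-Wilson relation and Luscher's exact lattice chiral symmetry:
\gamma_5 D + D\gamma_5 = a\,D\gamma_5 D ,
\qquad
\delta\psi = \gamma_5\!\left(1-\tfrac{a}{2}D\right)\psi ,
\quad
\delta\bar\psi = \bar\psi\left(1-\tfrac{a}{2}D\right)\gamma_5 ,
```

and a one-line check shows $\delta(\bar\psi D\psi) = \bar\psi\,(\gamma_5 D + D\gamma_5 - a\,D\gamma_5 D)\,\psi = 0$, so the lattice action is exactly invariant under this transformation; as $a\to 0$ it reduces to the usual continuum chiral symmetry.)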
The remaining issue at the quantum field theory level, in the path integral formulation, is the construction of the integration measure for the chiral fermions. Or equivalently, construction of the chiral fermion determinants. As you know, the massless Dirac operator maps between different vector spaces of chiral fermions (left-handed to right-handed, and vice-versa). So its determinant is only well-defined up to a complex phase factor (which depends on the choice of orthonormal bases for the vector spaces used to compute the determinant). The question then is whether it is possible to find a choice of bases such that the chiral fermion determinant is gauge invariant (this is necessary and sufficient to ensure that gauge symmetry is preserved at the quantum level, which is absolutely essential for renormalizability of any gauge theory). The Dirac operator is coupled to the gauge field, so typically the choice of chiral fermion bases will have to depend on the background gauge field too. More mathematically, the question is whether the determinant line bundle determined by the lattice Dirac operator is trivializable over the gauge orbit space of lattice gauge fields.
The answer should be negative in general due to the well-known “chiral gauge anomaly” – we only expect it to be possible when the chiral fermions live in representations that satisfy the “anomaly cancellation condition”. It has been shown that that answer is indeed negative when that condition is not satisfied. (No reference since I don’t want to “out” myself ;-).) On the other hand, when the condition *is* satisfied, the situation at present is as follows: (1) Positive answer has been shown for U(1) gauge theory. I.e. we now have a completely satisfactory nonperturbative formulation of U(1) gauge theories with chiral fermions in anomaly-free representations – see hep-lat/9811032. (2) So far a positive answer has not been shown for gauge theories in general, including the SM. It is a hard technical problem, but without any conceptual roadblocks that I can see. Significant progress has been made though – see e.g. arXiv:0709.3658. (3) At the *perturbative* level, a positive answer has been shown for all gauge theories with chiral fermions satisfying the anomaly cancellation condition, including the SM. Specifically, it has been shown that a gauge invariant chiral fermion measure can be constructed order by order in perturbation theory – see hep-lat/0006014. So at the perturbative level the conceptual and mathematical situation for the SM is completely fine.
Finally, regarding the Landau pole issue you mentioned: How would a nonperturbative lattice formulation of the SM deal with this issue? Well, probably the same way that nonperturbative lattice QED deals with it: As the coupling is increased it has a phase transition from a weakly-coupled phase (the physically relevant one) to a strongly coupled “lattice junk” phase. Probably the same thing would happen in a nonperturbative lattice formulation of the SM.
A.J.,
You say several times something along the lines that “We would like the Standard Model to be a continuum quantum field theory.” I would suggest that even this premise might be wrong. The scales are always important. The concept of boundary layers can be illustrative here. The physics outside a boundary layer is generally very different from the physics inside a boundary layer. Preliminary findings in quantum gravity suggest a sort of boundary layer in physics at the Planck scale. It is reasonable to keep the regularization condition outside this boundary layer, so that there needs to be no continuum limit. When there exists a boundary layer, perturbative methods do not necessarily need to converge inside the boundary layer, since that is not the area under consideration.
Amused:
Thanks for the comments and the references. I hope it’s clear that my pessimism is _partly_ theatrical, as I was criticizing what Peter said on MO about string theory by taking a very grumpy look at the Standard Model. (It’s for example, not really fair of him to say there that nothing is known non-perturbatively. The various matrix theories are … difficult… to make practical use of, but they aren’t totally unreasonable candidates for definition.)
Anyways: I agree that the fermion determinant problems look solvable, but aren’t actually solved. I think the triviality issues are more serious for mathematicians, at least if the game is to construct an example of a particular kind of thing. (Obviously there’s lots of interesting mathematics in just studying Euclidean lattice models.) We have some idea of what a continuum QFT is, and what axioms it ought to satisfy. I don’t think we have as clear a picture of what sort of mathematical object an effective QFT is. It’s certainly a much larger class of objects. Even if one can write down a lattice version of the Standard Model, it’s not going to be completely clear what kind of mathematical object you’ve cooked up.
Zathras:
It’s a wrong premise physically, almost certainly. But the point was that the SM doesn’t look likely to fit into any currently known mathematical framework. You’re quite welcome to argue that mathematicians should get to work on finding an attractive definition of ‘non-perturbative effective QFT’. I don’t think that will be a minor undertaking…
A.J.,
Ok, I take it that “not even sure it’s likely that the SM is anything more than a set of recipes…” was theatrical then. 🙂
I do agree though that realistically the lattice formulation (or any other formulation) of the SM will probably never be more than an approximation theory, i.e. it will not have a full continuum limit and will only be able to calculate physical quantities approximately and above some length scale. (I don’t like the term “effective theory” since it implies there is a more fundamental underlying theory, which might not be the case.)
In this regard it’s probably similar to the nonperturbative lattice formulation of QED. But I think such a theory still has “mathematical meaning”, even though the answers it gives for physical quantities aren’t exact numbers but only approximate. The key thing is that the answers, approximate or not, can be derived consistently from an explicit and well-defined starting point. This does look like it will be the case for the lattice formulation of the SM.
And this is already infinitely better than the state of affairs for string theory…although in fairness to the stringers their problem is a bit harder…
Amused,
Any reference for your comments about nonperturbative lattice QED and the “Landau pole”? I’m very curious.
Amused:
“I don’t like the term “effective theory” since it implies there is a more fundamental underlying theory, which might not be the case.”
And you think I’m pessimistic… 🙂
Igor,
It was just my expectation/guess. The Landau pole, if it exists, must show up in perturbative QED with lattice regularization too, so presumably the nonperturbative lattice theory also knows about it and then it will have to deal with it somehow. The simplest (and only?) way I can imagine it dealing with this is by having a phase transition to a nonphysical “lattice junk” phase when the coupling gets large. Such a phase transition certainly exists in lattice QED — existence of a deconfined weak-coupling “physical” phase was shown by Guth in Phys.Rev. D21 (1980) 2291, and existence of a confined, and hence unphysical, phase at strong coupling was shown in Wilson’s initial work on confinement in lattice gauge theories, so there must be at least one phase transition in between. Presumably this has also been verified numerically in lattice QED simulations somewhere… As for whether there has been any work backing up my speculation about the connection between the phase transition and the Landau pole issue, I can’t say… There seems to have been a lot of work on lattice QED in the early days of lattice field theory, but it was long before my time and my knowledge of the literature is very sketchy. Maybe someone else knows about this; I would also be interested to have a reference if one exists…
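For concreteness, the worry being discussed comes from the standard one-loop running of the QED coupling (this little numerical sketch assumes a single charged Dirac fermion and is purely illustrative; it says nothing about what the full nonperturbative lattice theory actually does):

```python
import math

# One-loop QED beta function: d(alpha)/d(ln mu) = (2/(3*pi)) * alpha^2
# (one charged Dirac fermion). Integrating gives
#   1/alpha(mu) = 1/alpha(mu0) - (2/(3*pi)) * ln(mu/mu0),
# which diverges (the "Landau pole") at ln(mu/mu0) = 3*pi/(2*alpha(mu0)).

def alpha_running(alpha0: float, log_ratio: float) -> float:
    """One-loop running coupling at scale mu, with log_ratio = ln(mu/mu0)."""
    denom = 1.0 - (2.0 / (3.0 * math.pi)) * alpha0 * log_ratio
    return alpha0 / denom

def landau_log(alpha0: float) -> float:
    """ln(mu_pole/mu0) at which the one-loop coupling diverges."""
    return 3.0 * math.pi / (2.0 * alpha0)

alpha0 = 1.0 / 137.036  # fine-structure constant at low energies
# The pole sits at mu0 * exp(~646), absurdly far above the Planck scale,
# which is why the one-loop pole by itself is only suggestive.
print(landau_log(alpha0))
```

The point of the exercise is only that *if* this divergence survives to all orders, a cutoff theory has to accommodate it somehow at enormous but finite scales.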
A.J.,
I also like to indulge in a bit of theatrics sometimes… 😉
“He (Gross) describes Einstein as both encouraging the kind of speculative path followed by string theory and SUSY, while at the same time warning of its dangers, and noted that Einstein himself devoted the latter part of his life to a speculative path that turned out to be a dead end.”
I suspect Einstein might still reject that verdict of “dead end”. He was always inclined to go his own way, and he had a track record such that I certainly wouldn’t be the one to try to tell him not to follow his own instincts. On the other hand, his view never dominated physics, or discouraged or crowded out others from following different lines of inquiry. In fact, Einstein was in the minority, so much so that the “dead end” label was suggested by others even while he lived, a diagnosis he (along with Schrödinger) rejected. Indeed, after he died, that label of “dead end” was elevated to conventional wisdom so fast that very few people even tried to follow Einstein’s approach of searching for a classical unified field theory that could explain quantum phenomena (which, as Gross correctly notes at 56:30 into his talk, was what Einstein was seeking).
Peter,
Sorry for the off-topic update to your “Number Theory News” article published on May 12, but posting in that location is closed. Anyway, New Scientist (on June 4) did an article titled “Game of proofs boosts prime result by millions” containing many interesting links. Apparently, in addition to Terence Tao, most of the recent work on improving the prime gap bound is being done by Scott Morrison and Andrew Sutherland, as actively documented on the “Secret Blogging Seminar” both here and, more recently, here. As David has said, you can literally watch mathematics being done in real time. For those wishing to keep abreast of the latest “world record” for the prime gap bound, see the PolyMath Wiki page “Bounded gaps between primes“. For those trying to work through Zhang’s groundbreaking proof, the PolyMath page, as linked above, also has a list of frequently updated errata to Zhang’s original proof, which is to be published in a forthcoming issue of the “Annals of Mathematics“.
Again, sorry for the off topic post.
–Nick
Amused, A.J., Igor, Zathras, etc.
Forgive a smartass for braying in the middle of your technical discussion.
There are a lot of remarks in your discussion which are important, but need some clarification.
Renormalizability guarantees insensitivity to the ultraviolet cut-off scale, i.e. to the scale at which the effective theory breaks down. As Peter W. points out, in QCD we think the situation is better, and that the cut-off can, in principle, be removed (though we believe that this theory too is cut off in Nature). For this reason, the standard model alone doesn’t tell us exactly where it breaks down as a description. Maybe the cut-off is at 100 TeV. Maybe it is at the Planck scale. We can’t know without experiments.
Forget the one-loop Landau pole. The Landau pole is only suggestive, and should not be taken seriously. The triviality of the S-matrix (as the cut-off is removed) has nothing to do with the Landau pole. Forget the lattice QED phase transition (evidence is strong that this is a 1st order phase transition, hence has nothing to do with continuum field theory).
If you want to understand problems with triviality, etc., the place to start is the Ginzburg criterion from critical phenomena – which gives the physical reason why phi^4 theories need a cut-off in 4D – not the Landau pole. It is in most books on applications of the renormalization group.
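For anyone chasing up that pointer, a rough sketch of the criterion (standard critical-phenomena material; conventions vary between books): mean-field theory is self-consistent only when order-parameter fluctuations averaged over a correlation volume are small compared to the order parameter itself,

```latex
\frac{\displaystyle\int_{|x|<\xi} d^d x \,\langle \delta\phi(x)\,\delta\phi(0)\rangle}
     {\xi^d\,\phi_0^2}
\;\sim\; |t|^{(d-4)/2},
```

using the mean-field scalings $\xi \sim |t|^{-1/2}$ and $\phi_0^2 \sim |t|$ in reduced temperature $t$. For $d>4$ the ratio vanishes as $t\to 0$ and mean field is fine; for $d<4$ it diverges; $d=4$ is the marginal case, with logarithmic corrections, and those same logarithms are what push $\phi^4$ theory in 4D toward triviality as the cut-off is removed.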
Regards,
P.O.
Any commentary on this report? Anapole dark matter:
http://www.scienceworldreport.com/articles/7432/20130611/physicists-reveal-simple-theory-explains-nature-mysterious-dark-matter.htm
Anapole dark matter (Phys. Letter B, 24 May 2013)
http://www.eurekalert.org/pub_releases/2013-06/vu-stm061013.php
http://arxiv.org/abs/1211.0503
principal author Robert Scherrer
Peter O.,
“Forgive a smartass for braying in the middle of your technical discussion…”
Well I can’t speak for A.J. and the others, but personally I can forgive you. 😉
“Renormalizability guarantees insensitivity to the ultraviolet cut-off scale, i.e. to the scale at which the effective theory breaks down. As Peter W. points out, in QCD we think the situation is better, and that the cut-off can, in principle, be removed (though we believe that this theory too is cut off in Nature). For this reason, the standard model alone doesn’t tell us exactly where it breaks down as a description. Maybe the cut-off is at 100 TeV. Maybe it is at the Planck scale. We can’t know without experiments.”
We can certainly agree on that. Hopefully there is no contradiction with anything in the earlier discussion, otherwise there’s been a misunderstanding.
“Forget the one-loop Landau pole. The Landau pole is only suggestive, and should not be taken seriously. The triviality of the S-matrix (as the cut-off is removed) has nothing to do with the Landau pole. Forget the lattice QED phase transition (evidence is strong that this is a 1st order phase transition, hence has nothing to do with continuum field theory).”
The meaning of “Landau pole” in the above discussion was perhaps a bit vague. My personal interpretation of it is “the possibility that a renormalized coupling diverges when the energy scale gets sufficiently high (or the length scale sufficiently small)”. *If* this does happen to all orders in continuum perturbation theory (e.g. in QED) then it will also happen in lattice perturbation theory. But then the nonperturbative lattice model must also know about it and deal with it. As I wrote above, the simplest (and only?) way for it to deal with this that I can imagine is to have a phase transition to a nonphysical “lattice junk” phase when the coupling gets large. I agree completely with you that this has nothing to do with trying to take a continuum limit and the associated triviality issue. Never said it did 🙂 Also, my understanding of A.J.’s comments is that the Landau pole and triviality issue were mentioned as separate issues, with no implication that they were the same or connected.
Thanks for the suggestion on where to start learning more about the problems with triviality. This is one of many topics I would like to educate myself more about in principle, and would if I had infinite time… But in practice I don’t feel an urgent need… My current simple-minded understanding of the triviality issue is that there is some place (presumably at a phase transition) in the bare coupling parameter space of the lattice theory where you can try to take a continuum limit, and when you do, the resulting continuum theory turns out to be trivial (non-interacting). My reaction to that is a shrug. It just means that it’s not a suitable place to tune the bare parameters when trying to describe Nature.
For example, I read somewhere that nonperturbative lattice QED has (or is believed to have) this same triviality issue. But on the other hand, by suitably tuning the bare parameters in lattice QED it should be possible to calculate, e.g., the energy levels of the hydrogen atom to at least the same high precision as in QM + perturbative QED corrections. So the absence of a continuum limit, or of a physically relevant (nontrivial) continuum limit, does not mean inability to calculate/predict physical quantities. It just means the answers are approximate (since the values to tune the bare parameters to aren’t uniquely fixed, only approximately), and that there is some length scale below which the lattice theory will not give physically sensible and accurate answers. I expect the situation will also be like this for a nonperturbative lattice formulation of the SM.
Regards,
“amused” (David A.)
Having thought about it a bit more, I don’t think there’s any reason or need for the Landau pole issue (if there is one) to be connected with a phase transition to a nonphysical phase in lattice QED or the lattice SM, as I wrote above. So I’d like to withdraw that piece of ill-thought-out speculation. Once we accept that these lattice models can only describe the physics above some length scale, then that’s already enough to “solve” any Landau pole issue and triviality issue. E.g. in lattice QED, perturbation theory indicates that increasing the bare coupling drives the lattice spacing smaller, and a Landau pole would manifest itself through the fact that there is a minimum length that the lattice spacing can’t get below, no matter how much we increase the bare coupling. (And at some point, for large enough bare coupling, there is a phase transition to a nonphysical strongly coupled phase, so that also limits how much the lattice spacing can be decreased.)
Anonyrat/Marcus,
I’m no expert on dark matter, but I can say that this kind of story, based purely on a press release from Vanderbilt
http://news.vanderbilt.edu/2013/06/dark-matter/
is the sort of thing one should be extremely skeptical about. I’ve now seen a very large number of such press releases (“scientists at our institution have solved one of the great problems of theoretical physics”), and can’t think of an instance where the claims being made held up to scrutiny. My guess is that the only question here is whether this will ever make the mainstream science press, leading experts to explain what’s wrong with it.
Not directly following from previous comments, but I don’t see anywhere else to draw attention to things that may be of interest. The Observer (Sunday Guardian effectively) has this article where Jim Baggott and Mike Duff discuss “A theory of everything … has physics gone too far?”
<a href="http://www.guardian.co.uk/science/2013/jun/16/has-physics-gone-too-far">Observer Article</a>
Sorry – got the link wrong should be to
http://www.guardian.co.uk/science/2013/jun/16/has-physics-gone-too-far
Dom,
Next posting will be about Baggott’s book and this debate. Soon…