This week’s string theory hype is embedded in a story by Michael Schirber about possible variation of the fundamental constants, which has appeared on msnbc.com, foxnews.com, and Slashdot. According to Schirber:
A popular alternative to relativity, which assumes that sub-atomic particles are vibrating strings and that the universe has 10 or more spatial dimensions, actually predicts inconstant constants.
According to this string theory, the extra dimensions are hidden from us, but the “true” constants of nature are defined on all dimensions. Therefore, if the hidden dimensions expand or contract, we will notice this as a variation in our “local” 3D constants.
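The mechanism being invoked here is the standard Kaluza-Klein dilution of gauge couplings. As a schematic illustration (my own sketch, not anything specific to the models in the story): compactifying $n$ extra dimensions of volume $V_n$ gives

$$\frac{1}{g_4^2} = \frac{V_n}{g_{4+n}^2}, \qquad \alpha \equiv \frac{g_4^2}{4\pi} \propto \frac{1}{V_n}, \qquad \frac{\delta\alpha}{\alpha} = -\frac{\delta V_n}{V_n},$$

so a slow expansion or contraction of the hidden volume would register as a drift in the “local” fine structure constant.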
It’s kind of funny to hear that string theory “predicts” that constants like the fine structure constant will vary in time. When Michael Douglas was here in New York giving a talk last year and was asked about predictions of the string theory landscape, he said that the best one was that the fine structure constant would NOT vary. His argument was that it couldn’t vary, since effective field theory reasoning would imply a corresponding variation in the vacuum energy, something inconsistent with observation. So string theory both predicts that the fine structure constant will vary, and predicts that the fine structure constant will not vary.
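For the record, the effective field theory objection runs roughly as follows (a back-of-the-envelope sketch of my own, not a quote from the talk). A shift in $\alpha$ feeds into the vacuum energy through loops,

$$\delta\rho_{\rm vac} \sim \frac{\delta\alpha}{\alpha}\,\Lambda^4,$$

where $\Lambda$ is any Standard Model scale running in the loops. Even the most conservative choice $\Lambda \sim m_e$ turns a claimed $\delta\alpha/\alpha \sim 10^{-5}$ into a shift some thirty orders of magnitude larger than the observed dark energy density $\rho_\Lambda \sim (10^{-3}\,\mathrm{eV})^4$.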
For more string theory hype, Michio Kaku now has a MySpace site, including a blog. He also has his own website, mkaku.org, which has recently been redesigned and now prominently features an offer of signed copies of his (softcover) books for $50.
Update: There’s an informed take on what the data about varying fundamental constants actually says from Rob Knop.
Here is a recent assessment of twistor techniques by Keith Ellis, from a closing talk at QCD Moriond 2006 (hep-ph/0607038, page 9):
“So far the impact on real phenomenology is rather limited…”
He praises, however, the “great intellectual excitement and an injection of personnel from formal areas”.
What a nice compliment for the new QCD personnel!
By the way, twistor techniques should not be confused with the string-inspired/helicity/SUSY techniques used very successfully by Dixon et al. over the last 15 years.
DEAR PROFESSOR BAEZ
I AM NOT A CRACKPOT!! I AM A BIG ADMIRER OF THIS WEEK’S FINDS, IT IS VERY USEFUL. I AM VERY SAD THAT MY HERO HAS CALLED ME A CRACKPOT.
I MUST ADMIT THAT I AM MORE INTERESTED IN MATHEMATICS THAN IN PHYSICS, BUT I FIND THE MATHEMATICS-PHYSICS BORDER MOST INTERESTING. I WILL WRITE ABOUT WHAT I WANT TO LEARN SO YOU CAN SEE I AM NOT A CRACKPOT AND I WILL BE PROVED INNOCENT. SORRY PETER FOR NOT WRITING ON THE TOPIC, I WILL NOT DO IT AGAIN.
PLEASE WRITE IN THIS WEEK’S FINDS ABOUT TWISTORS. THEY SEEM QUITE INTERESTING TO EVERYONE, CLASSICAL PEOPLE AND QUANTUM PEOPLE AND EVEN THE STRING THEORY HERETICS. I UNDERSTAND THE IDEA OF WHAT A TWISTOR IS FOR MINKOWSKI SPACE, BUT I DO NOT UNDERSTAND HOW THEY CAN BE DEFORMED IN GENERAL WHEN THE WEYL TENSOR IS SELF-DUAL. I CANNOT UNDERSTAND THE KODAIRA DEFORMATION THEOREM AT ALL, VERY DIFFICULT! FROM A PHYSICS PERSPECTIVE, WHY ARE THE LIGHT RAY AND THE LIGHT CONE MORE IMPORTANT THAN THE POINT? THIS SEEMS A LITTLE VERY SILLY!! ALSO, HOW DO MONOPOLES COME ABOUT? THERE ARE MANY PAPERS ABOUT MONOPOLES AND TWISTORS. I DO NOT UNDERSTAND ANY LINK, BUT THIS DOES NOT MEAN I AM A CRACKPOT, ONLY THAT I AM STUPID.
Does the statement that “lattice QCD IS QCD” hold water, considering the many things it does not take into account,
Lattice QCD with 5 flavours of domain wall fermions (for example) *is* QCD, in the continuum limit. Alas, we lack a computer to run such a beast.
Likewise, lattice QCD with 3 light flavours of staggered fermions and 2 quenched heavy flavours is as close to QCD as we need to get.
In both of these cases there is nothing that isn’t taken into account, though with the staggered fermions there is an ongoing theoretical issue.
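For readers wondering what “in the continuum limit” means in practice: one runs the simulation at several lattice spacings a and extrapolates to a = 0, conventionally assuming the leading discretization error is O(a^2). A minimal toy sketch, with made-up numbers standing in for real lattice data:

```python
import numpy as np

# Lattice spacings (in fm) and a toy observable measured at each one.
# These numbers are invented purely for illustration; they are not
# real lattice data.
a = np.array([0.12, 0.09, 0.06])
obs = np.array([0.412, 0.399, 0.392])

# For an O(a)-improved action the leading cutoff effects are O(a^2),
# so fit obs(a) = c0 + c1 * a**2 and read off the a -> 0 intercept.
c1, c0 = np.polyfit(a**2, obs, 1)
print(f"continuum extrapolation: {c0:.4f}")
```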
and that the approximations in lattice QCD hold only in certain regimes?
Huh? The only “regime” that I know about where lattice QCD simply cannot be done is at non-zero chemical potential (the notorious sign problem). And even that is basically a computational issue.
The problems with lattice QCD these days are computational. That is, we simply do not have the resources to run a realistic simulation (i.e. the first scenario I mentioned above). However, the staggered fermion approach, with various approximations for the heavy quarks, has yielded real results of genuine phenomenological interest. For a few recent results see hep-lat/0607011.
For instance, what defines the utility or goodness of an approximation, given that, for example, N=4 SUSY YM is more useful than the lattice for QCD calculations relevant to the LHC?
Lattice people are busy computing things to compare to real experiments going on right now (CLEO-c, Belle, BaBar, D0, CDF, RHIC, etc.). The LHC is *well* into the perturbative QCD regime in any case. There are lattice calculations that could be done, but they’re very complicated (hadronization) and we probably lack the computational resources.
MathPhys said:
“How many string theorists does it take to change a light bulb?”
The whole community (10^3)? First they have to find the right light bulb, and the Landscape is large enough.
Juan R.
Center for CANONICAL |SCIENCE)
Chris O,
In your humorous honor I christen my HID apparatus AKHIEZER and BERESTETSKII.
-drl
Matthew,
Would it be practical to do something like “SETI@home” with these calculations? I’m amazed that there are still computing problems in physics that can’t be solved with a 1000-node Beowulf.
-drl
Danny – fine. The keyboard I am using at the moment (on a library computer in Oxford) is a bit old and disagreeable, so I am going to call it “PAULI”.
Q: How many (less obnoxious) String Theorists does it take to change a light bulb?
A: Change the light bulb? Why? Playing games in the dark is so much better! You would be amazed at how brilliant some of the games we have devised are!
drl,
Would it be practical to do something like “SETI@home” with these calculations?
Probably not. The current computational bottleneck is the speed of communication between the nodes in a cluster, not the raw horsepower of the machine.
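To see why, note how tightly coupled the problem is: each node holds a small sub-lattice and must exchange its boundary (“halo”) with its neighbours on every solver iteration. A back-of-the-envelope sketch (the local-volume setup and the numbers are mine, purely illustrative):

```python
# Surface-to-volume estimate for a 4D lattice distributed over a cluster.
# Each node holds an L_loc^4 sub-lattice; on every solver iteration it
# must exchange the 8 boundary 3-volumes (2 faces per dimension) with
# its neighbours before it can proceed.
def halo_fraction(L_loc: int) -> float:
    volume = L_loc ** 4       # sites updated locally per iteration
    surface = 8 * L_loc ** 3  # sites communicated per iteration
    return surface / volume

for L_loc in (32, 16, 8, 4):
    print(f"L_loc = {L_loc:2d}: communication/computation ~ {halo_fraction(L_loc):.2f}")
```

Spreading the lattice over more nodes shrinks L_loc, so communication grows relative to useful work; an “@home”-style project, whose work units never talk to each other, is the opposite regime.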
I’m amazed that there are still computing problems in physics that can’t be solved with a 1000-node Beowulf.
There are many (many many many) problems of interest in computational physics for which a 1000-node Beowulf is nothing. Realistic QCD (i.e. with physically accurate light quark masses) is not feasible on any current computer, full stop. To solve complicated multiparticle states (say, to compute the nucleon-nucleon force from a lattice calculation) would require petaflop (or beyond) scale computers.
One trouble is that the fermion update algorithm slows down critically as you approach light quark masses, so lighter masses mean massively more computer power is needed. Of course, we’ve got chiral perturbation theory, so you don’t really need to go all the way to the physical point. But even getting to within the range of validity of chiral PT is a challenge for modern supercomputers.
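To get a feel for how brutal that scaling is, here is a toy cost model in the spirit of the oft-quoted “Berlin Wall” estimates for dynamical Wilson-type fermions; the exponents and prefactor are illustrative assumptions (and algorithms have improved since those estimates were made):

```python
# Toy cost model: simulation cost vs. pion-to-rho mass ratio.
# Exponents and prefactor are illustrative assumptions, not measured
# figures; real costs depend on the action and the algorithm.
def relative_cost(m_ratio, L_fm=3.0, a_inv_gev=2.0,
                  C=1.0, z_m=6.0, z_L=5.0, z_a=7.0):
    """Cost in arbitrary units; m_ratio = m_pi/m_rho (~0.18 physically)."""
    return C * m_ratio**(-z_m) * (L_fm / 3.0)**z_L * (a_inv_gev / 2.0)**z_a

for m_ratio in (0.7, 0.5, 0.3, 0.18):
    print(f"m_pi/m_rho = {m_ratio:.2f}: cost ~ {relative_cost(m_ratio):.0f}")
```

With an exponent of 6, halving the quark-mass ratio multiplies the cost by about 2^6 = 64, which is why simulations stop short of the physical point and lean on chiral perturbation theory to extrapolate the rest of the way.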
The article on Connes is a two page “Insight” piece on page 36 of the August issue, written by Alexander Hellemans. I learned from the article that renormalization was introduced by (among others) ‘t Hooft and Veltman, and that Connes has linked its mathematical justification to one of Hilbert’s (solved) problems, and that the solution involved non-commutative geometry, which “serves as a starting point to unify relativity and quantum mechanics.” Hope that helps you as much as it did me!
I also learned that Connes is the kind of guy who can walk through a riot without losing the thread of his mathematical conversation.
I am guessing that this refers to the Connes-Kreimer work.
D R Lunsford and Matthew,
Chemistry and biology are two more fields with entire classes of problems that cannot yet be solved computationally. Processor flops are not the only problem; memory can be even more of a handicap. Routine chemical computations (TIE) that need 2 GB of temporary files are hard for a supercomputer to manage efficiently. More advanced methods (TDL) applied to basic chemical systems of interest need 7 GB just to store the equation, and many more to solve it.
Juan R.
Center for CANONICAL |SCIENCE)
Hello, all. Did this start as an entry on changing constants? What does it mean if there is evidence from two independent methods that c is changing by exactly the amounts predicted?