The NSA, NIST and the AMS, Part II

Last summer I wrote here about an article in the AMS Notices which appeared to make misleading claims about the NSA’s involvement in putting a backdoor in a NIST cryptography standard known as DUAL_EC_DRBG. The article, by Richard George, a mathematician who worked at the NSA, addressed the issue of the NSA doing this kind of thing by discussing a historical example in which the agency was accused of weakening a standard but had actually strengthened it. He then went on to claim that:

I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

This appears to be a denial of an NSA backdoor in the standard, while not saying so explicitly. If there is a backdoor, as most experts believe and the Snowden documents indicate, this was a fairly outrageous use of the AMS to mislead the math community and the public. At the time I argued with some at the AMS that they should insist that George address explicitly the question of the existence of the backdoor, but didn’t get anywhere with that. One of their arguments was that George was speaking for himself, not the NSA.

The question of fact here is a simple and straightforward mathematical one: how were the points P and Q on an elliptic curve used in the standard chosen? There is a known way of choosing them that provides a backdoor. Did the NSA use this method, or some other one for which no backdoor is known? The NSA refused to cooperate with the NIST investigation into this question. The only record of what happened when NIST staff asked, early in the development of the standard, how P and Q were chosen is this, which indicates that the NSA told people they were not allowed to discuss the question publicly.
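
For readers who want to see what is mathematically at stake, here is a rough sketch of the well-known observation made public by Shumow and Ferguson in 2007; it describes how a backdoor could work, not what the NSA actually did. Writing x(·) for the x-coordinate of a point on the curve, the generator updates its internal state s and produces output (in simplified form) via

\[
s_{i+1} = x(s_i P), \qquad r_i = x(s_{i+1} Q),
\]

with the top 16 bits of r_i discarded before output. If whoever generated the points knows a scalar d with P = dQ, then from a single output block they can brute-force the few missing bits, lift r_i to a candidate point R = s_{i+1}Q on the curve, and compute

\[
dR = d\,(s_{i+1} Q) = s_{i+1} P,
\]

whose x-coordinate is the next internal state. Every subsequent output is then predictable. If P and Q are instead generated verifiably at random, no such d is known to anyone, which is why “how were P and Q chosen?” is the entire question.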

Remarkably, the latest AMS Notices has a new article with an extensive discussion of the DUAL_EC_DRBG issue, written by mathematician Michael Wertheimer, the NSA Director of Research. At first glance, Wertheimer appears to claim that the NSA was unaware of the possibility of a backdoor:

With hindsight, NSA should have ceased supporting the dual EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor. In truth, I can think of no better way to describe our failure to drop support for the Dual_EC_DRBG algorithm as anything other than regrettable.

On close reading though, one realizes that Wertheimer does not address at all the basic question: how were P and Q chosen? His language does not contain any actual denial that P and Q have a backdoor.

For a careful examination of the Wertheimer piece by an expert, see this from Matthew Green. Green concludes that

… it troubles me to see such confusing statements in a publication of the AMS. As a record of history, Dr. Wertheimer’s letter leaves much to be desired, and could easily lead people to the wrong understanding.

In a recent podcast on the subject Green states

I think it’s still going on… I think that the NSA has really adopted a policy of tampering with cryptographic products and they’re not going to give that up. I don’t think that this is a time that they want to go out admitting what they did in this particular case as a result of that.

Given that this is now the only official NSA statement about the DUAL_EC_DRBG issue, the Notices article has drawn a lot of attention; see for instance here. The Register summarizes the story with the headline NSA: So sorry we backed that borked crypto even after you spotted the backdoor.

The publication of the George and Wertheimer pieces by the AMS has created a situation where there are just two possibilities:

  • Despite what experts believe and Snowden documents indicate, the NSA chose P and Q by a method that did not introduce a backdoor. For some reason though they are unwilling to state publicly that this is the case.
  • P and Q were chosen with a backdoor, and the AMS has now repeatedly been used to try and mislead the mathematics community about this issue.

I’ve contacted someone at the AMS to try and find out whether the question of a backdoor in P and Q was addressed in the refereeing process of the article, but was told that they won’t discuss this. I think this is an issue that now needs to be addressed by the AMS leadership, specifically by demanding assurances from Wertheimer that the NSA did not choose a backdoored P and Q. If this is the case I can see no reason why such assurances cannot be provided. If the NSA and Wertheimer won’t provide them, I think the AMS needs to immediately cut off its cooperative programs with the agency. There may be different opinions about the advisability of such programs, but I don’t think there can be any argument about the significance of the AMS being used by the NSA to mislead the mathematics community.


Update: There’s an Ars Technica story here, with a peculiar update of its own:

An NSA spokesperson emailed Ars on Friday to say Wertheimer retired in the fall of 2014 and submitted the article after he left his position. The Notices article made no mention of his retirement.

Another odd thing about the Wertheimer piece is that in a different part of it he seems to reveal what I would have thought the NSA considered a closely held piece of information about Taliban communication methods (see here). If he can discuss that publicly, why can’t he say whether P and Q were backdoored?

Update: This is getting international attention, with Le Monde reporting the AMS Notices piece as an admission by the NSA that they backdoored DUAL_EC_DRBG.

Update: The NIST has put out a revised draft on its cryptographic standards process and asked for comments. On the NSA problem, it says that no changes have been made to the NSA-NIST Memorandum of Understanding, and that

cooperation with NIST is governed by an MOU between the two agencies and technical staff meet monthly to discuss ongoing collaborative work and future priorities.

It seems (see the NIST VCAT report) that, despite its obligations under the MOU, the NSA has refused to explain what it did with regard to compromising the DUAL_EC_DRBG standard, and experts believe (see above) that the NSA is committed to continuing to tamper with cryptographic products. Under these circumstances I don’t see how the NIST can expect anyone not to be suspicious of its standards.

A promise is made to identify NSA contributions to standards, but a footnote says that names of some NSA staff cannot be revealed and that documents involving NIST-NSA collaboration provided in response to FOIA requests may be redacted. I don’t see anything here that would keep the NSA from misleading or corrupting NIST staff to produce a backdoored standard, while keeping their input out of any record available to the public.

Posted in Favorite Old Posts, Uncategorized | 23 Comments

Short Items

  • The latest issue of the New York Review of Books has an article about the new Turing film, explaining in detail how it gets pretty much everything completely wrong about Turing and his story (see my review here). In related news, this week it was announced that the film is one of the final Oscar nominees for Best Adapted Screenplay.
  • The DESY research magazine femto has a sequence of articles about the LHC, SUSY and BSM physics.
  • The Swedish Research Council has just announced a ten-year grant of 60 million SEK (about $7 million) to bring Frank Wilczek to Stockholm University.
  • Mike Duff has some complaints about Dean Rickles’ “A Brief History of String Theory” (for mine, see here).
  • Jim Stewart, a mathematician who became wealthy from his popular Calculus book (which we use here at Columbia), passed away last month at the age of 73. For more about him, see here and here. I had the pleasure of meeting him a couple of times, with one occasion including a tour of his remarkable home in Toronto, Integral House.
  • For a new book about a certain mathematical point of view on QFT, see Factorization algebras in quantum field theory, by Kevin Costello and Owen Gwilliam.
  • Quanta magazine has a nice article by Kevin Hartnett on Ciprian Manolescu’s work on the triangulation conjecture.


Update: One more. The Yale Art Gallery now has an exhibition of prints based on equations chosen and drawn by well-known mathematicians and physicists. It’s called The Art of the Equation, and the impresario of the project, Dan Rockmore, will be discussing it there at 5:30 on Thursday, January 22.

Posted in Uncategorized | 23 Comments

Back

Now back from vacation, and as far as I can tell, not much happened while I was away. Here are a few things I’ve seen that may be of interest:

  • Mochizuki has posted a long progress report on “activities devoted to the verification of IUTeich.” New Scientist has an article about this here, which quotes Minhyong Kim making comments I think most experts would agree with:

    Some mathematicians say Mochizuki must do more to explain his work, like simplifying his notes or lecturing abroad. “I sympathise with his sense of frustration but I also sympathise with other people who don’t understand why he’s not doing things in a more standard way,” says Kim. It isn’t really sustainable for Mochizuki to teach people one-on-one, he adds, and any journal would probably require independent reviewers who have not studied under Mochizuki to verify the proof.

    Lieven Le Bruyn has a less charitable take (see here and here):

    If you are a professional mathematician, you know all too well that the verification of a proof is a shared responsability of the author and the mathematical community. We all received a referee report once complaining that a certain proof was ‘unclear’ or even ‘opaque’?

    The usual response to this is to rewrite the proof, make it crystal-clear, and resubmit it.

    Few people would suggest the referee to spend a couple of years reading up on all their previous papers, and at the same time, complain to the editor that the referee is unqualified to deliver a verdict before (s)he has done so.

    Mochizuki is one of these people.

    His latest Progress Report reads more like a sectarian newsletter.

    There’s no shortage of extremely clever people working in arithmetic geometry. Mochizuki should reach out to them and provide explanations in a language they are used to.

    Mochizuki’s progress report strikes me as quite an odd document, especially in its insistence that experts need:

    to deactivate the thought patterns that they have installed in their brains and taken for granted for so many years and then to start afresh, that is to say, to revert to a mindset that relies only on primitive logical reasoning, in the style of a student or a novice to a subject.

    He at times seems to be arguing that his ideas are nearly disconnected from the rest of known mathematics, and that they are the only way to understand why the abc conjecture is true. This is highly implausible, since the great beauty and strength of mathematics is the way in which deep ideas are interconnected, with many paths from one place to another. If he wants to convince people that he really has what he claims, the best way to do it would be to follow the conventional route: write a document himself giving an exposition of a proof of abc, in as clear and simple terms as possible.

    Unfortunately, that doesn’t seem to be what he has planned, with his efforts devoted to getting others to start from the beginning and master his long series of papers. If this works, at some point there will be others able to write up a proof of abc using his ideas, and when that happens, experts may have something they can work with. This looks now like a story that is going to go on for a long time…

  • Over the last couple of weeks there was a Winter School on General Relativity in Jerusalem. It included a final session (video here) largely devoted to defending string theory as the one true path to quantum gravity. This included a panel discussion, a battle of the LQG/string wars, in which Carlo Rovelli held his own while Gross and Arkani-Hamed ganged up on him. Mostly I don’t think there were any new arguments, just a rehash of the tediously familiar. Gross did give an enthusiastic call for all students to read the Dawid book discussed here.
    For yet another promotional effort about strings, one that seems like it could have been written exactly the same way twenty years ago, see here.
  • One new argument from the Rovelli side was to point out that “Nature talks”, and what it has said at the LHC so far is that SUSY is not there, blowing a big hole in the expectations of the superstring theory community. The Economist has a piece about how the upcoming LHC run at 13 TeV will be:

    the last throw of the dice for the theory, at least in its conventional form.

    As is often the case though, the article misrepresents the strength of arguments for SUSY:

    But, though the Standard Model works, it depends on many arbitrary mathematical assumptions. The conundrum is why these assumptions have the values they do. But the need for a lot of those assumptions would disappear if the known particles had heavier partner particles: their supersymmetric twins.

    This is pretty much complete nonsense, since the problem with SUSY has always been that it doesn’t actually explain why the SM parameters take the values that they do, and this has always been the best reason to be skeptical about it.

    On the other hand, the Economist and Rovelli do get the basic story right: Nature talks, and if what it says in LHC Run 2 is that the theoretical physics community has been barking up the wrong tree for the last forty years, it will be interesting to see if theorists are still willing to listen.

Posted in abc Conjecture, Uncategorized | 32 Comments

Winter Break

Blogging here should be light to non-existent for a while, with family holiday celebrations tomorrow and departure for a trip to Europe the day after. Travel plans still in flux, but the general idea is to head south after arriving in Paris, spend a couple weeks on the road and mostly in Italy, end up back in Paris January 6, back to New York on the 11th.

In the last posting (see the comment section there) I somehow seem to have caused an eruption of an even odder version of the kinds of attacks from string theorists that were common in 2006, a period known to aficionados as the “String Wars”. The new version is more like the “Multiverse Wars”. From past experience I know that involvement in such things is not a good way to spend your vacation, so I think when I head to the airport I’ll likely shut off comments. In the meantime, I hope the holiday spirit will reign…

Update: Off on vacation; comments will be off. Some last-minute links that may be of interest:

Posted in Uncategorized | 6 Comments

Dualities

There’s a very interesting new paper on the arXiv by Joe Polchinski, a survey article for Studies in History and Philosophy of Modern Physics, entitled just Dualities. It’s an unusually lucid summary of the story of dualities in quantum field theory and string theory. This is a very complex subject which has been a central one in theoretical physics for the last few decades, but most expository writing on the subject has tended to be either superficial promotional material or mired in technical detail obscuring fundamental issues.

One reason for this is that, as Polchinski does an admirable job of making clear, in a very real sense we still do not understand at all the fundamental issues raised by these dualities. He notes that “we are still missing some big idea”, and points to the same comments from Nati Seiberg last month that I blogged about here. For most of the dualities at issue, our current standard technology for dealing with QFTs (the Lagrangian and the path integral over classical fields) is capable of capturing the two QFTs that are in some sense “dual”, but we lack a viable larger framework that would give the two QFTs in two different limits and explain the duality relationship.

For an example of the problem, probably the oldest and best-studied case where we are missing something is Montonen-Olive duality, a non-abelian duality between electric and magnetic charges and fields. A currently popular idea is to find the explanation of this in “Theory X”, a 6d superconformal QFT, with duality coming from compactifying the theory on a torus (for more about this, see talks last week in Berkeley). The problem with this is that we don’t have a definition of “Theory X”.
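
To state the duality a little more concretely (this is the standard textbook formulation for N=4 super Yang-Mills, not anything specific to Polchinski’s survey), one packages the coupling constant and theta-angle into

\[
\tau = \frac{\theta}{2\pi} + \frac{4\pi i}{g^2}.
\]

The conjecture is that the theory with gauge group G and coupling τ is equivalent to the theory with the Langlands (GNO) dual group and coupling −1/τ, with electrically charged states and magnetic monopoles exchanged (more generally, τ transforms under SL(2,Z)). At θ = 0 this sends g to 4π/g, so weak coupling maps to strong coupling, which is exactly why the duality cannot be checked directly in perturbation theory and why one wants a larger framework, such as “Theory X” compactified on a torus, in which both descriptions arise as limits.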

Polchinski places this problem in the context of a conjectural “M-theory” with various string theory limits. This has been the dominant idea in the subject for nearly 20 years now, but we seem no closer to finding an actual realization of this conjectural picture than we were back in the mid-90s. Twenty years and thousands of papers have just given a better understanding of why various possible ideas about this don’t work.

One place where I think Polchinski’s survey is weak is in the treatment of this conjecture, where at times he takes as a solid result something highly conjectural. For instance he starts off at one point with:

String-string dualities imply that there is a unique string/M-theory.

and moves on to the conjecture that

In this sense it may be that every QFT can be understood as a vacuum state of string/M-theory.

The problem here is that he’s built a speculative view of the unification of physics, constructed on an assumption about a “unique” theory, when we don’t know at all that such a thing exists. One basic lesson of mathematical research is that you need to keep very clear the distinction between what you really understand and what is speculation, because your speculation is often wrong and if so will lead you in the wrong direction. I think particle theory of recent decades likely suffers from people forgetting that some ideas are speculative, not firmly grounded, and may be pointing in the wrong direction.

One wrong direction this takes Polchinski in is toward the non-predictive, pseudo-scientific landscape of supposed string theory solutions and the multiverse, which he blithely invokes as our best fundamental explanation of physics. Tellingly, unlike the clear explanations of other topics, here he makes no attempt to describe these ideas other than to note that

they rest on multiple approximations and no exact theory.

In a final section, Polchinski addresses the question of what all this tells us about what is “fundamental” and what the role of symmetries is. This is the crucial question, and I’d argue that our lack of understanding of where these dualities come from is likely due to our missing some understanding of how symmetries are realized in QFT or string theory. This has been the lesson of history, with the Standard Model only coming into being when people better understood how symmetries, especially gauge symmetries, could act in QFT. Polchinski largely takes the opposite point of view, arguing that the fundamental theory maybe has no symmetries, local or global. He quotes Susskind as suggesting that symmetries have nothing to do with fundamental equations, and are just calculational tools for finding solutions. I think this is completely misguided, and that a strong case can be made (and I do it here) that “symmetry” (in the sense of the mathematics of groups and their representations) lies at the very foundation of quantum mechanics, and thus of any quantum mechanical theory, even string/M-theory, whatever it might be.

Wondering whether there will be an arXiv trackback to this, and whether Polchinski has something to say about it…

Update: The arXiv Monday evening has a large collection of excellent review articles entitled “Exact results on N=2 supersymmetric gauge theories”, edited by J. Teschner (the first is arXiv:1412.7118, the last arXiv:1412.7145). Some of the results reviewed are based on deriving implications of the existence of the 6d (2,0) models discussed here and in the comment section.

Update: I’ve put this blog posting in the Multiverse Mania category, not because of the posting content, but because of comments in the comment section from Polchinski and Bousso.

Posted in Multiverse Mania | 61 Comments

Quick Links

  • The Planck data release has been delayed yet again. December 22 is now off the table, and the latest plan is “before the end of January 15”, see here. Some peeks at their results are in slides from the Ferrara conference, available here. The fact that the slides for the “Planck low-ell CMB power spectra” talk are unavailable correlates with the rumor I’ve heard that they have recently found serious problems with that part of their data analysis, which would explain why the data release keeps getting pushed back.

    This week there’s a conference in Paris, with no slides yet. Streaming video has been available, which I took a look at for a while. I just managed to catch the tail end of questions about the state of their analysis of the crucial B-mode business, not enough to get the bottom line on where things stand. Perhaps someone who was there or who watched the whole thing can report. About the best source of information on cosmology these days seems to be Twitter, hashtag #planck2014. Something else of interest at the Paris conference was a debate about inflation featuring Steinhardt, Mukhanov, Linde and Brandenberger. Maybe video will be available someday, along with the slides.

  • Scott Aaronson has more here about the problems with the recent movie about Turing that I mentioned here. Despite (or maybe because of…) having little relation to reality, the screenplay of the film has been nominated for a Golden Globe award.
  • David Mumford and John Tate wrote a biographical sketch of Grothendieck for Nature. Unfortunately it seems that it won’t be published there because it was judged too technical. It is however available at Mumford’s blog.
  • There’s an interesting interview with Nikita Nekrasov at the artist Marina Abramovic’s MAI site.

Update: Shantanu points out that the Paris talk videos are available here. Looking a bit, I didn’t see anything from the Planck people about when they will release direct B-mode polarization results (next month? later?). Steinhardt gave a powerful talk arguing in detail that inflation does not predict anything, and that the usual claims for it are untenable. For the Steinhardt, Mukhanov, Linde, Brandenberger debate, see here.

Update: For yet another explanation of the problems with the Turing movie, see this at the New York Review of Books.

Posted in Uncategorized | 5 Comments

Defend the Integrity of Physics

This week’s Nature features a call to arms from George Ellis and Joe Silk, entitled Scientific method: Defend the integrity of physics. I’m very glad to see well-known physicists highlighting the serious problem for the credibility of science raised by the string theory multiverse and the associated ongoing campaign to justify the failures of string theory by attacking the scientific method. Acknowledging evidence that an idea you cherished doesn’t work is at the core of what science is, and physics now has a major problem with prominent theorists refusing to abide by this principle. Ellis and Silk do a great job of identifying and characterizing an important challenge the scientific community is facing.

The issue is however complicated, and while the Nature piece carefully and clearly addresses some of the complexities, there are places where things get over-simplified. In particular, the introduction frames the issue as whether a theory being “sufficiently elegant and explanatory” allows it to not need experimental testing. The problem with the string theory multiverse though is not this, since such a theory is the antithesis of “elegant and explanatory”. There’s just about nothing in science as inelegant as the various attempts (e.g. the KKLT mechanism) to make string theory fit with known physics, and “the multiverse did it” is no more an actual explanation of anything than “a big omnipotent turtle did it”.

Trying to cut through the complexities, Ellis and Silk write:

In our view, the issue boils down to clarifying one question: what potential observational or experimental evidence is there that would persuade you that the theory is wrong and lead you to abandoning it? If there is none, it is not a scientific theory.

This is at the heart of the matter, but there are subtleties. A common recent move among some prominent string theorists has been to argue that string theory is falsifiable: it is based on quantum mechanics, so if experiments falsify quantum mechanics, they falsify string theory. This just makes clear that the question of falsifiability can be slippery. Philosophers of science are experts at the intricacies of such questions and Ellis and Silk are right to call for help from them.

They also make the interesting call for the convening of a conference to address these issues. How such a thing would work and how it might be helpful seem well worth thinking about. As for one of their other recommendations though:

In the meantime, journal editors and publishers could assign speculative work to other research categories — such as mathematical rather than physical cosmology — according to its potential testability.

I’m leery of the impulse among physicists to solve their problem of how to deal with bad physics by calling it mathematics. Yes, there is good mathematics that has come out of untestable ideas about string theory, but no, this doesn’t include the string landscape/multiverse cop-out, which physicists need to face up to themselves.

As for the specific arguments from Sean Carroll and Richard Dawid that Ellis and Silk address, I’ve written about them elsewhere; see for instance here, where I discussed Dawid’s arguments in some detail.

Update: Sabine Hossenfelder has commentary on this here.

Update: Taking the opposite side of the argument in January’s Smithsonian magazine is my colleague Brian Greene, with an article entitled Is String Theory About to Unravel?. As you might expect, Brian’s answer is “No”, and he gives a good account of the point of view Ellis and Silk are warning against. He mentions the possibility of encouraging news for string theory from the next LHC run, but says that “I now hold only modest hope that the theory will confront data during my lifetime.”

Update: Sean Carroll responds to the criticism from Ellis and Silk with a tweet characterizing them as belonging to the “falsifiability police”:

My real problem with the falsifiability police is: we don’t get to demand ahead of time what kind of theory correctly describes the world.

Update: Gordon Kane joins the fight in a comment at Nature, claiming that, before the LHC, string theory predicted a gluino mass of 1.5 TeV.

The literature contains clear and easily understood predictions published before LHC from compactified string theories that gluinos, for example, should have been too heavy to find in Run 1 but will be found in Run 2 (gluino mass of about 1.5 TeV).

As far as I can tell, this is utter nonsense, with Kane publicly claiming string theory predictions of a gluino mass of around 600 GeV (see page 22 of this) back in 2011, then moving the “prediction” up as Run 1 data falsified his earlier predictions. Kane at least makes falsifiable predictions; the problem with him only comes when they get falsified…

Update: Chad Orzel has his take here.

Update: Adam Frank has an essay on this here.

Posted in Multiverse Mania | 34 Comments

Weinberg on the Desert, Seiberg on QFT

Last week Steven Weinberg gave a Lee Historical Lecture at Harvard, entitled Glimpses of a World Within. There’s a report on the talk at the Harvard Gazette.

In essence, Weinberg argues in the talk for an idea that first started to dominate thinking among HEP theorists nearly forty years ago, one that is sometimes called the “Desert Hypothesis”. The idea is that by looking at what we know of the SM and gravity, you can find indications that the next level of unification takes place around the Planck scale, with no new physics over the many orders of magnitude between the scales we can observe and that scale, at least none that would affect the running of coupling constants, for instance. The evidence Weinberg gives for this is three-fold (and very old by now):

  • He describes listening to Politzer’s first talk on asymptotic freedom in 1973, and quickly realizing that if the strong coupling decreases at short distances, at some scale it would become comparable to the couplings of the other fundamental forces. In a 1974 paper with Georgi and Quinn this was made explicit, and he argues it is evidence for a GUT scale a bit below or around the Planck scale (a rough version of the calculation is sketched just after this list).
  • He explains about the Planck scale, where gravity should be of similar strength to the other interactions. This idea is even older, well-known in the fifties I would guess.
  • He refers to arguments (which he attributes to himself, Wilczek and Zee in 1977) for a Majorana neutrino mass that invoke a non-renormalizable term in the Lagrangian that would come from the GUT scale.
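
For what it’s worth, here is a back-of-the-envelope version of the first hint, using the textbook one-loop Standard Model running (the numbers here are standard values, not anything taken from Weinberg’s talk). Each inverse coupling runs logarithmically,

\[
\frac{1}{\alpha_i(\mu)} = \frac{1}{\alpha_i(M_Z)} - \frac{b_i}{2\pi}\,\ln\frac{\mu}{M_Z},
\]

with one-loop coefficients (b_1, b_2, b_3) = (41/10, −19/6, −7) in the usual GUT normalization of hypercharge. Starting from the measured values α_1^{-1} ≈ 59, α_2^{-1} ≈ 30, α_3^{-1} ≈ 8.5 at the Z mass and extrapolating, the three couplings approach one another, without quite meeting in the Standard Model alone, in the range of roughly 10^13 to 10^17 GeV, a few orders of magnitude below the Planck scale of about 10^19 GeV. This is the calculation behind the Georgi-Quinn-Weinberg observation.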

Weinberg sees these three hints as “strongly suggesting” that there is a fundamental GUT/Planck scale, and that’s what will explain unification. Personally though, I don’t see how three weak arguments add up to anything other than a weak argument. GUTs are now a forty-year-old idea that never explained very much to start with, their best feature being that they were testable, since they generally predicted observable proton decay (which we haven’t seen). We know nothing at all about the source of particle masses and mixing angles, or the reason for their very different scales, and there seems to be zero evidence for the mechanism Weinberg likes for getting small neutrino masses (including zero evidence that the masses are even Majorana). As for quantum gravity and the Planck scale, again, we really have no evidence at all. I just don’t think he has any significant evidence for a desert up to a Planck unification scale, and this is now a very old idea, one that has been unfruitful in the extreme.

Weinberg ended his talk with another very old idea, that cosmology will somehow give us evidence about unification and GUT-scale physics. That also hasn’t worked out, but Weinberg quotes the BICEP2 value of r as providing yet more evidence for the GUT scale (he gives it a 50/50 chance of being correct). Again though, one more weak piece of evidence, even if it holds up (which I’d give less than 50/50 odds for at this point…), is still weak evidence.
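
The connection being drawn here between r and the GUT scale is the standard one relating the tensor-to-scalar ratio to the energy scale of inflation. As a rough guide (the usual slow-roll relation with the measured scalar amplitude, quoted here just for orientation, not taken from the talk),

\[
V^{1/4} \approx 1.06 \times 10^{16}\ \mathrm{GeV}\,\left(\frac{r}{0.01}\right)^{1/4},
\]

so an r of order 0.1–0.2, as BICEP2 claimed, would put the inflaton potential at around 2 × 10^16 GeV, right at a typical GUT scale. That is what would make a primordial gravitational wave detection “evidence for the GUT scale”, conditional of course on the measurement holding up.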

For a much more encouraging vision talk, I recommend listening to Nati Seiberg at the recent Breakthrough Prize symposium. Seiberg’s talk was entitled What is QFT?, and to the claim that QFT is something understood, he responds “I really, really disagree”. His point of view is that we are missing some fundamental insights into the subject, that QFT likely needs to be reformulated, that there exists some better and more insightful way of thinking about it than our current conventional wisdom. In particular, there seems to be more to QFT than just picking a Lagrangian and applying standard techniques (for one thing, there are QFTs with no known Lagrangian). Seiberg takes the fact that mathematicians (who he describes as “much smarter than most quantum field theorists”…) have not been able to come up with a satisfactory rigorous version of QFT to indicate not that this is a boring technical problem, but that we don’t have the right definition to work with.

To make things more specific, he describes recent joint work (for another version of this see here) on “Generalized Global Symmetries”, which deals with global symmetries associated to spaces of higher codimension than the usual codimension-one case of Noether symmetries and Lagrangian field theory. Evidently there’s a forthcoming paper with more details. I’m in complete agreement with him that there must be better ways of thinking about QFT, and I think these will involve some deeper insights into the role of symmetries in the subject.
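
For a minimal illustration of what a “generalized” global symmetry is (the standard free Maxwell example, given here just for orientation), four-dimensional U(1) gauge theory without charged matter has two conserved 2-form currents,

\[
J_e \sim \frac{1}{g^2}\,F, \qquad J_m = \frac{1}{2\pi}\,\star F,
\]

the first conserved by the equations of motion, the second by the Bianchi identity. The corresponding symmetry operators are supported on two-dimensional (codimension-two) surfaces, and the charged objects are line operators, Wilson and ’t Hooft lines, rather than local fields. Ordinary Noether symmetries are the codimension-one case, with charges given by integrals over a spatial slice.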

Update: The paper Seiberg mentions is now available here.

Posted in Uncategorized | 68 Comments

Very Short Items

  • Long-awaited results from the Planck experiment were unveiled last week, with a new model for how to do this: hold a conference with no videos, no slides released, no wifi in the lecture hall, and put out a press release in French. They did release an amazing image that looks like a Van Gogh, but if you want numbers, you have to search Twitter. Stories at Nature and the New York Times indicate not much new, data relevant to primordial gravitational waves still to come (Dec. 22?, next year?).
  • News from MIT is that, after an investigation of charges of sexual harassment of one or more students online, the university has revoked physics professor Walter Lewin’s emeritus status and is removing his lecture videos and course material from their online course sites.
  • Today’s Wall Street Journal has a sensible piece by Ira Rothstein on The Perils of Romanticizing Physics.
  • I took a look at Kip Thorne’s The Science of Interstellar in a local bookstore. It gives a detailed explanation of the “science” behind the film, explaining what a lot of the highly confusing later plot of the film was supposedly about. It seems it’s all based on the “large extra dimension” business of 15 years ago, the dimensions that were supposed to show up at the LHC. If you want to see all the equations, go here and look for the pictures of the blackboards.
  • The LHC magnets are now getting trained for 6.5 TeV operation, as well as inspiring fashion designers.
  • I mentioned last year’s Gelfand Centennial conference at MIT here, thinking that there were no videos. Luckily I was wrong; quite a few talks well worth watching are now available here.
  • Videos from the Breakthrough Prize symposia recently held at Stanford are now available. For the physics talks, see here. For the math talks, see here, here, here, here, here and here.
  • If you need a change of pace, and can’t get enough of the string theory/LQG debate, this is for you.
  • For talks about the implications of not seeing new physics at the LHC, there’s Naturalness 2014. Nima Arkani-Hamed kicked it off with “Hopefully My Last Ever Talk On This!” (seems unlikely…). He argues for the idea that it’s the Multiverse that did it and that we should keep looking for Split SUSY, and disses “conformality” approaches. Matt Strassler, on the other hand, points out that Arkani-Hamed’s use of the multiverse explanation doesn’t make sense (there’s no anthropic reason for the highly non-generic SM).
  • There’s a workshop this week at Caltech on scattering amplitudes and the Grassmannian.
  • Finally, a HEPAP meeting this week. Budget news for HEP theory doesn’t look good, but on the other hand the US now seems to be functioning without a budget, so it’s kind of hard to be sure…

Update: Slides from the Planck conference are now available.

It seems the US does have a budget now; some info here. DOE HEP and Cosmic Frontier get $766 million, down very slightly from last year’s $775 million and higher than the White House request of $744 million.

Update: Scott Aaronson on Walter Lewin here.

Posted in Uncategorized | 15 Comments

The Multiverse in a Nutshell

The Guardian has a podcast up today featuring Robert Trotta and David Wallace called The Multiverse in a Nutshell. It’s largely more of the usual uncritical multiverse hype that has been flooding the public expositions of fundamental physics for years now. Trotta gives the usual promotion of the cosmological multiverse, with no indication there is any problem with it. He assures us that this is being tested (by looking in the CMB for “bruises” from collisions with other universes). As far as I can tell, the Planck results released today, like all CMB data, show no evidence for anything like this. It appears that the Planck people don’t even think this is worth mentioning. The public channels used for this hype will never report the fact that there’s nothing there; instead they will just endlessly talk about this as something “scientists are looking for.”

Wallace talks about something completely different, many-worlds, with nobody telling listeners that this has nothing at all to do with the cosmological material. Instead we’re told that it’s all related, because “most theories” “tell us there must be a multiverse”. When challenged about the splitting-universe business in QM, Wallace admits that at the fundamental level there is no splitting, that there’s just one theory and one universe, and that “many worlds” is just a way of talking about the emergent behavior of the classical approximation. His book about this, The Emergent Multiverse, is quite good and makes clear what the “Multiverse” there really is. It’s a real shame that he chooses to involve himself in this kind of attempt to muddy the waters and promote pseudo-science to the public.

Thankfully at least the physics community has one physicist trying to do something about this nonsense: Paul Steinhardt. In an interview with John Horgan, here’s his “Multiverse in a Nutshell”:

Unfortunately, what has happened since is that all attempts to resolve the multiverse problem have failed and, in the process, it has become clear that the problem is much stickier than originally imagined. In fact, at this point, some proponents of inflation have suggested that there can be no solution. We should cease bothering to look for one. Instead, we should simply take inflation and the multiverse as fact and accept the notion that the features of the observable universe are accidental: consequences of living in this particular region of the multiverse rather than another.

To me, the accidental universe idea is scientifically meaningless because it explains nothing and predicts nothing. Also, it misses the most salient fact we have learned about large-scale structure of the universe: its extraordinary simplicity when averaged over large scales. In order to explain the one simple universe we can see, the inflationary multiverse and accidental universe hypotheses posit an infinite variety of universes with arbitrary amounts of complexity that we cannot see. Variations on the accidental universe, such as those employing the anthropic principle, do nothing to help the situation.

Scientific ideas should be simple, explanatory, predictive. The inflationary multiverse as currently understood appears to have none of those properties.

Posted in Multiverse Mania | 29 Comments