First LHC Winter Conference Results

This week the Aspen Center for Physics is hosting one of the first of this year’s “Winter Conferences”, where results from last year’s LHC run are being reported. Appropriately, the title of the conference is New Data from the Energy Frontier. The most dramatic result has to do with what is not being seen: any evidence of supersymmetry, with new limits reported today by ATLAS. The new ATLAS results rule out gluino masses up to 700-800 GeV, improving on the first limits of this kind from CMS, which were about 600 GeV.

For more detailed discussion courtesy of the blogosphere, see Resonaances and Cosmic Variance. For some indication of what this means for string theory, Michael Dine’s lecture notes for his talks on “What LHC might tell us about String Theory” at last summer’s TASI summer school are now out, with the title Supersymmetry from the top down. These lecture notes start off with a section very unusual for this kind of thing, entitled “Reasons for Skepticism”, in which he notes:

Our enthusiasm for supersymmetry, however, should be tempered by the realization that from existing data – including early LHC data – there are, as we will discuss, reasons for skepticism.

For some historical perspective on what pre-LHC expectations were, today I happened to run across a copy of Witten’s lecture notes from a string theory conference in Hangzhou in 2002, where he gives the muon magnetic moment discrepancy as one piece of evidence for supersymmetry, and says:

Assuming this discrepancy holds up, we would expect to interpret it in terms of new particles, but these are highly constrained; one explanation that does work is supersymmetry, with masses of new particles of order 200 – 300 GeV.

Of course, even the minimal supersymmetric extension of the Standard Model is ferociously complicated, with over a hundred unknown parameters, so all quoted limits make various simplifying assumptions. Relating LHC data to limits on supersymmetry will be a subject keeping many physicists busy for the next few years; for more about this, see this talk at Aspen by Jay Wacker. He doesn’t expect this year’s run to increase limits on gluinos as dramatically as last year’s run did, describing early results as “full coverage up to 300 GeV, reach up to 600 GeV”, increasing to “full coverage up to 375 GeV, reach up to 800 GeV” after an inverse femtobarn of data is analyzed (that’s the official LHC goal for 2011, although it’s hoped they can double or triple that).

The last sentence of his last slide refers to something that I’ve always worried about, but am not expert enough to know whether such a worry is serious. He describes the web-site http://LHCNewPhysics.org where simplified models based on supersymmetry and other BSM ideas are given, and notes:

ATLAS studying 10 Simplified Models from 0 in August. Changing their triggers.

The worry I’m not so sure about is to what extent the LHC detector triggers are being optimized to look for supersymmetry, potentially missing unexpected non-Standard Model physics. Since there were always reasons to be skeptical of LHC-scale supersymmetry, and these have now become so compelling that even Michael Dine is writing about them, one hopes that the trigger designers will keep that in mind.

Meanwhile, back at the LHC, powering tests are finished, and the ring is closed and will be put through full tests of its operational cycle over the next couple of days. Official start of beams for this year is planned for Monday.

Update: More details about the latest on this at Resonaances.

Update: More from Tommaso Dorigo (LHC Excludes SUSY Theories, Theorists Clinch Hands), and a Physics World article by Kate McAlpine here. Tommaso links to a 2008 posting by Ben Allanach that discusses predictions for SUSY masses made (using various assumptions one can read about there) around that time. One of these, by a large group including John Ellis, predicted that 50 inverse picobarns at 10 TeV would be enough to explore most of the region they expected SUSY masses to be in, at 68% confidence level. The latest data, which is about that luminosity but at 7 TeV, does rule out much of that region, with the most likely SUSY mass right around the boundary of the region ruled out by ATLAS (although the tan(beta) values are different). According to the Physics World article:

John Ellis of CERN and King’s College London disagrees that the LHC results cause any new problems for supersymmetry. Because the LHC collides strongly interacting quarks and gluons inside the protons, it can most easily produce their strongly interacting counterparts, the squarks and gluinos. However, in many models the supersymmetric partners of the electrons, muons and photons are lighter, and their masses could still be near the electroweak scale, he says.

Posted in Experimental HEP News | 23 Comments

Budget News

Almost five months into FY 2011, the US still has no budget for the year, operating on a continuing resolution that funds the government at FY 2010 levels until March 4. The House Republicans have come up with a proposal for huge budget cuts, which would arrive late in the fiscal year, probably requiring national labs like Fermilab to essentially shut down for the remainder of the fiscal year. John Conway has more about this here. No one seems to seriously believe this proposal will pass into law.

This morning, the White House announced its budget proposal for FY 2012. The DOE Office of Science would get a healthy increase, to $5.416 billion from $4.964 billion in 2010. Similarly, the NSF would go from $6.873 billion to $7.768 billion. However, the Administration’s FY 2011 request ended up being pretty much irrelevant, and it’s not clear that this one will fare any better. No information yet on how HEP and mathematics fare specifically in these requests.

So, the bottom line is that no one really knows what this year’s or next year’s budget numbers will be; the House proposal is presumably a lower bound for this year, and I suspect the president’s proposal will be an upper bound for next year.

Midday tomorrow, Fermilab director Oddone will give an all-hands talk at Fermilab to discuss the implications of all this for the lab.

Update: At the DOE, the FY 2012 request for High Energy Physics is $797 million, versus $791 million in FY 2010 (last year at this time, the FY 2011 request was for $829 million). More details here.

Details of the NSF budget request are here. Mathematics research goes from $241 million in FY 2010 to $260 million in FY 2012. Physics from $290 million to $300 million. The NSF has pulled the plug on the DUSEL lab, freeing up $36 million/year, which is repurposed towards what they describe as their three priority areas: physics of the universe, quantum information science and the physics-biology interface.

Posted in Uncategorized | 6 Comments

Celebrity News

A selection of celebrity math/physics news:

  • Jane Fonda’s blog has a report on My Meeting With Stephen Hawking. Hawking told her “You were my heart throb”, admitting that Barbarella was what he had in mind.
  • MIT has put online a video of a lecture given in December by Jim Simons. Simons tells some of the story of his remarkable career and gives some advice about doing math. The financial news this morning reports that Simons’s Renaissance Technologies is now down in Brazil doing high-frequency trading. The story is illustrated with a photo of Gisele Bundchen superimposed on what appears to be Simons lecturing on differential K-theory.
  • Someone has come up with an Einstein Index to rank theoretical physicists using a new citation analysis. According to this index, Juan Maldacena outranks Steve Weinberg and Ed Witten by a sizable amount. Further down the list, Sean Carroll and Murray Gell-Mann get the same ranking.
  • Brian Greene’s new book on the Multiverse is getting a lot of attention, with mostly laudatory reviews. For an alternate take, see the review at Bookforum by Charles Seife.
Posted in Uncategorized | 22 Comments

    IAS 80th Anniversary Talks

    This may be old news, but I just recently noticed that talks given at the IAS in Princeton last fall to celebrate its 80th anniversary are now available on-line here. They include talks by Voevodsky on the foundations of mathematics, Zaldarriaga on cosmology, Wilczek on supersymmetry and quantum computing, and Arkani-Hamed on Fundamental Physics in the 21st Century.

    According to Arkani-Hamed, the 21st century will be all AdS/CFT, all the time, and he uses this to justify the following:

    the slogan is that string theory/quantum gravity is particle physics…

    There is a less interesting but amusing sociological fact associated with this, that since this realization there aren’t really different camps in theoretical physics any more. There aren’t string theorists and particle theorists, there’s one big structure “Good Ideas in Theoretical Physics”, OK. That structure has many different facets and you can work on different parts of it, but it’s all connected. You can really see it in the way the field developed since the late 90s, we’re much, much more one big happy family than was the case in the 1980s. I should say of course there are still people that do bad theoretical physics, but they’re not at the Institute. So all good ideas in theoretical physics are combined in one very big structure that no longer is there such a big difference between strings, QFT…

    This puts into practice Nati Seiberg’s 2005 prediction:

    “Most string theorists are very arrogant,” says Seiberg with a smile. “If there is something [beyond string theory], we will call it string theory.”

    Both Wilczek and Arkani-Hamed advertise supersymmetry, with Arkani-Hamed making the peculiar claim that physicists need to throw space-time out the window, but for some reason, before doing so it is important that they add supersymmetry to it. Wilczek states definitively that if superpartners don’t show up at the LHC, as far as he is concerned the idea will be gone.

    The minimal supersymmetric standard model is an important pillar of the “Good Ideas in Theoretical Physics” that those at the Institute cling to, despite the fact that efforts to use these ideas to unify physics have failed miserably over the last 25 years. If the LHC doesn’t see supersymmetry and Wilczek and others give up on the idea, it will be interesting to see if the IAS faculty revise their point of view and start to develop more of an interest in what they now claim is “bad theoretical physics”. I suppose though that when they do, they’ll call whatever they change over to “string theory”.

    Update: By the way, for the latest on what initial results from the LHC are saying about extra dimensions and supersymmetry, see this talk given today at CERN by Alessandro Strumia.

    Posted in Uncategorized | 34 Comments

    The 4% Universe

    I’ve written a review of Richard Panek’s quite good new book The 4% Universe, which has appeared in the Wall Street Journal. The main topic of the book is the supernova searches that led to what seems to be a non-zero value of the cosmological constant. It also discusses the astronomical evidence for dark matter, as well as on-going searches for a dark matter particle.

    One of the most interesting themes of the book is that of the encounter between the two different cultures of particle physics and astronomy. Astronomers have begun to worry not only about a new culture of large collaborations, but about the danger of an over-emphasis on certain specific measurements of fundamental significance. For more about this, see the article by Simon White from a few years ago, Fundamentalist Physics: why Dark Energy is bad for Astronomy. Now that cosmologists have their own highly successful Standard Model, they’re starting to take a look at what happened after the arrival of the Standard Model in particle physics, and worry that they too may someday become victims of their own success.

    Posted in Book Reviews | 12 Comments

    Is the Multiverse Immoral?

    [Warning, somewhat of a rant follows, and it’s not very original. You might want to skip this one…]

    In the last week or so, I’ve run into two critiques of the currently fashionable multiverse mania that take an unusual angle, raising the question of the “morality” of the subject. The first of these was from Lee Smolin, who was here in New York last week talking at the Rubin Museum. I probably won’t get this quite right, but from what I remember he said that discussions of a multiverse containing infinite numbers of copies of ourselves behaving slightly differently made him uneasy for moral reasons. The worry is that one might be led to stop caring that much about the implications of one’s actions. After all, whatever mistake you make, there is an infinite number of other universes in which you didn’t make it.

    Over at Scientific American, yesterday they had John Horgan’s Is speculation in multiverses as immoral as speculation in subprime mortgages?. There’s more about this in a Bloggingheads conversation today with George Johnson, where Horgan describes his current reaction to multiverse mania as “I can’t stand this shit.”

    I’m in agreement with Horgan there, but my own moral concerns about the issue are different from the ones he and Smolin describe. The morality of how people choose to live their everyday lives doesn’t seem to me to have much to do with whatever the global structure of the universe might be. The world we are rapidly approaching, in which a multiverse is held up as an integral part of the modern scientific world view, isn’t one in which many people are likely to behave differently than before, so I don’t share Smolin’s concerns. Horgan’s exasperation with seeing the multiverse heavily promoted by famous physicists appears to have more to do with the idea that this is a retreat by physicists from engagement with the real world, something morally obtuse in an era of growing problems that scientists could help address. For what he would like to see instead, I guess a good model would be John Baez’s recent decision to turn his talents towards real-world problems facing humanity; see his blog Azimuth for more about this. Personally, I’m not uncomfortable with the fact that many mathematicians and physicists feel they are unlikely to be of much help if they go to work on the technology and science surrounding social problems. One can instead reasonably decide that one has some hope of making progress on fundamental issues in mathematics or physics and choose to work on that. One can try to justify this by hoping that new breakthroughs will somehow, someday help humanity, although this may be wishful thinking. Or one can argue that working towards a better understanding of the universe is inherently worthwhile, so pursuing this while taking some care to avoid worsening one’s local corner of the world is a morally reasonable stance.

    My own moral concerns about the multiverse have more to do with worry that pseudo-science is being heavily promoted to the public, leading to the danger that it will ultimately take over from science, first in the field of fundamental physics, then perhaps spreading to others. This concern is somewhat like the one that induced Alan Sokal to engage in his famous hoax. He felt that abandonment by prominent academics of the Enlightenment ideals exemplified by the scientific method threatens a move into a new Dark Ages, where power dominates over truth. Unfortunately, I don’t think that revelation of a hoax paper would have much effect in multiverse studies, where some of the literature has already moved beyond the point where parody is possible.

    For a while I was trying to keep track of multiverse-promoting books, and writing denunciatory reviews here. They’ve been appearing regularly for quite a few years now, with increasing frequency. Some typical examples that come to mind are Kaku’s Parallel Worlds (2004), Susskind’s The Cosmic Landscape (2005), and Vilenkin’s Many Worlds in One (2006). Just the past year has seen Sean Carroll’s From Eternity to Here, John Gribbin’s In Search of the Multiverse, Hawking and Mlodinow’s The Grand Design, and Brian Greene’s new The Hidden Reality. In a couple weeks there will be Steven Manly’s Visions of the Multiverse. Accompanying the flood of books is a much larger number of magazine articles and TV programs.

    Several months ago a masochistic publisher sent me a copy of Gribbin’s book hoping that I might give it some attention on the blog, but I didn’t have the heart to write anything. There’s nothing original in such books and thus nothing new to be said about why they are pseudo-science. The increasing number of them is just depressing and discouraging. More depressing still are the often laudatory reviews that these things are getting, often from prominent scientists who should know better. For a recent example, see Weinberg’s new review of Hawking/Mlodinow in the New York Review of Books.

    While most of the physicists and mathematicians I talk to tend towards the Horgan “I can’t stand this shit” point of view on the multiverse, David Gross is about the only prominent theorist I can think of known to publicly take a similar stand. One of the lessons of superstring theory unification is that if a wrong idea is promoted for enough years, it gets into the textbooks and becomes part of the conventional wisdom about how the world works. This process is now well underway with multiverse pseudo-science, as some theorists who should know better choose to heavily promote it, and others abdicate their responsibility to fight pseudo-science as it gains traction in their field.

    Posted in Multiverse Mania | 107 Comments

    News From Chamonix

    The people responsible for the LHC are meeting in Chamonix this week to make plans for the upcoming run; slides of many talks are available here. The results of the discussions there are:

  • The recommendation will be to run at 3.5 TeV/beam, not increasing to 4 TeV/beam as widely expected.
  • The long shutdown to fix splices and allow going to 7 TeV/beam will be delayed until 2013, with 2012 devoted to a physics run. The 2013 shutdown will likely last more than a year.
  • Officially, the integrated luminosity goal for 2011 remains at 1 inverse femtobarn. However, unofficially, they expect to be able to do at least twice this, ending up with 2-3 inverse femtobarns by the end of the year.
  • For a discussion of the possible physics that can be done with these parameters, see here. By the end of 2011 the LHC should be able to do better than the Tevatron on the Higgs search over most of the possible mass range, except for the low end of the range, where higher energy isn’t much help, and the Tevatron’s more than 10 inverse femtobarns and longer experience with the data analysis may give them an edge.

    Posted in Experimental HEP News | 6 Comments

    Number 999 or 1000

    According to the WordPress software, this is either post 999 or 1000 on this blog, depending on whether you count one I haven’t gotten around to finishing. I’m not sure that number is reliable anyway, since there are various anomalies due to a long-ago transition from Movable Type to WordPress. Some other statistics: 27,089 approved comments since the beginning (March 2004), 84,259 spam comments since the latest spam filter was turned on a couple years ago, 510,530 page hits last month (mostly spam, robots), and 8,842 subscribers at Google Reader.

    The blog has turned out to be far more of a success than I ever expected when it was first started. There were few similar blogs back then, with Jacques Distler’s Musings having been around for a while, and Sean Carroll’s Preposterous Universe just starting up. Lubos Motl’s Reference Frame quickly followed, I gather somewhat in response to mine. These days, Musings seems to have gone dormant, but Sean and Lubos are still at it. I haven’t kept track of physics blogs in general, but there are now quite a few that deal with particle physics in one way or another. For particle phenomenology, Resonaances is a great source of information, for experimental HEP Tommaso Dorigo has a wonderful blog, and Philip Gibbs at viXra log does a great job of keeping track of the state of the LHC (for the latest, see here).

    There’s also now a huge variety of research-level math blogs, including a very active blog by Fields Medalist Terry Tao. The new site Mathblogging.org is an exhaustive source of information and links. The big recent change in the math blogging world is the amazing phenomenon of MathOverflow, which features many of the best young mathematicians around carrying on the sort of conversations about research-level mathematics that traditionally go on in math common rooms. In an odd way, mathematics has often been somewhat of an oral tradition, since the impenetrability of much of the literature often meant that the only way to learn about something was to find an expert and get them to explain it to you. Now you can do this on-line, and this may significantly change how mathematics is done. Just as listening in to common room conversations was a great way to learn things, poking around the links on MathOverflow can provide quite an education. The moderation system somehow maintains a high level of discussion, although sometimes it lets its hair down to allow discussions like the ongoing one about Mathematical “Urban Legends”. There one can learn that one’s suspicions about string theorists educated at Princeton are correct, with Jeff Harvey contributing the following:

    Since the OP gave a physics example, here is another one, also at Princeton. Why are they always at Princeton? Student finishes his presentation on very mathematical aspects of string theory. An experimentalist on the committee asks him what he knows about the Higgs boson. He hems and haws and finally says “well, it was discovered a few years ago at Fermilab”. Experimentalist: “Can you tell me the mass?” Student: “I think around 40 GeV.”

    This was more than 20 years ago and actually happened. I was there. The student passed, but the next year all Ph.D students working on string theory were required to take a course on the phenomenology of particle physics.

    There’s now a physics version of MathOverflow starting up, but so far it seems to me much less successful, with far too much in the way of the high-school-level topics and uninformed discussion that plague most internet physics discussion forums. There are some examples of serious questions and well-informed people writing in, so maybe things will improve and it will turn into something very worthwhile, replacing much of what is now going on at blogs.

    I’m surprised to still be doing this nearly seven years after starting, but now doesn’t seem to be the time to stop. Particle theory has long been a rather intellectually dead topic, but whatever the news is from the LHC, it promises to shake the field up in one way or another, a process that should be interesting to follow. In coming weeks I may try to find time to learn more about WordPress, adding some features to the blog, or at least refreshing its rather tired look. Don’t be surprised if its appearance starts to change, or at least become unstable…

    Posted in Uncategorized | 23 Comments

    News from Templeton and FQXi

    The Templeton Foundation has just released their “2010 Capabilities Report“, a sort of bi-annual report. It shows that in 2009 they had assets of $1.5 billion, and spent $31.8 million on “Science and the Big Questions”. For 2010 two of their funding priorities were Quantum Physics and the Nature of Reality and Foundational Questions in the Mathematical Sciences, but they have yet to report what grants they made in those categories.

    The foundation is now being run by Jack Templeton, a surgeon who is devoting his efforts to spending his father’s money according to his father’s instructions. For a couple of recent articles explaining what is going on at Templeton these days from two very different perspectives, see God, Science and Philanthropy at the Nation, and Honoring his Father at World magazine. The Nation article reports that the Foundation should soon have $2.5 billion or more to spend as the father’s estate is settled, and discusses how Jack Templeton’s right-wing politics and the Foundation’s goal of bringing science and religion together make many scientists uneasy. At World, on the other hand, they seem concerned that the Foundation is supporting the theory of evolution, for reasons the article spells out:

    Every five years, three independent analysts are to conduct a review to see if Jack Templeton (or his successor) is making grants consistent with Sir John’s intent. If they find that Jack is giving 9 percent or more of the grants to causes inconsistent with paternal intent, he has one year to get back into line. If not, Jack and his top two officers will be fired.

    Nor can Foundation trustees make changes by themselves or choose new board members. Templeton family members, plus winners of the annual Templeton Prize, plus heads of several organizations Sir John respected (such as the Acton Institute) are honorary members: There are about 75 in all, and 95 percent of them must be in agreement for any substantive change in foundation goals and purposes to be made. Even to change the location of the board’s annual meeting requires a 75 percent vote of the honorary members.

    The Foundation maintains Sir John’s “core funding areas.” The lead one, “Science & the Big Questions,” includes questions about evolution. Other Templeton core areas are Character Development (“We can determine how to be the masters of our habits”), Exceptional Cognitive Talent & Genius (humans can be “helpers in the acceleration of divine creativity”), and Genetics (the Foundation is not yet accepting unsolicited proposals in that area). Jack Templeton would not discuss any differences from Sir John in those areas: His calling is to do the will of his father.

    The son clearly sees things the same way as his father in one other Core Area, Freedom & Free Enterprise. Jack recalls how Sir John “often spoke, year after year about ‘people’s capitalism’ and what it would mean if the overwhelming majority of people in any country were shareholders themselves with the result that they would be much less likely to be envious and instead would focus much more persistently on ‘the good of the whole.’”

    Besides science, Templeton has traditionally funded lots of activities related to religion, as well as ones promoting “Character Development” and “Freedom and Free Enterprise”. Another core funding area is “Exceptional Cognitive Talent and Genius”, where they try to identify and nurture “young people who demonstrate exceptional talent in mathematics and science.” Their newest interest is in genetics, where they’ve just started to make grants, including one in support of the “Genetics of High Cognitive Abilities Consortium.”

    One of the Templeton Foundation’s biggest grants, featured on the front page of their web-site, was $8.8 million given to set up FQXi. They list this grant as having an end-date of December 2009, and the plan was for FQXi to get later funding elsewhere. FQXi is still in operation, either with leftover Templeton money or new funds from other sources. They announced today the award of $1.8 million in grants for research into “The Nature of Time”, based on this request for proposals, which asked for research “unlikely to be supported by conventional funding sources”. The list of grants announced includes quite a few that satisfy that criterion, but winners also include some prominent theorists working on not exactly unconventional topics such as Andy Strominger on AdS space-time, Joe Polchinski on holography and AdS/CFT duality, Hiranya Peiris on analyzing WMAP data, and Berkeley’s Raphael Bousso on the Multiverse (along these lines). Maybe the last one does qualify as “unlikely to be supported by conventional funding sources”.

    In further support of the cause of investigating the Nature of Time, FQXi will pay for an event entitled Setting Time Aright which will take attendees on a chartered cruise from Bergen to Copenhagen late this summer.

    Posted in Uncategorized | 9 Comments

    More Short Items

  • There’s an excellent article by Michael Bérubé about the Sokal hoax, fifteen years later, entitled The Science Wars Redux.
  • The latest Notices of the AMS has a review of the recent Yau-Nadis book by Nigel Hitchin (for my take, see here).
  • My colleague Brian Greene has a new book coming out soon, The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos. I haven’t seen a copy, but from what I can gather it is probably the best of the many books about “multiverse” ideas, though still not exactly my cup of tea. If you’re interested in the “multiverse” and want to read a popular-level exposition, you should try this one. But you should also pay attention and see if there’s any experimental evidence (or reasonable hope of getting some) for the ideas being discussed. The book has very extensive, more technical notes, and the Amazon site gives access to these. Brian also has an Op-Ed piece in today’s New York Times drawn from the book, about the fact that in an accelerating universe, in the distant future less and less will be visible.
  • It seems that there recently was a Physics of the Universe Summit, along the lines of last year’s (see here). There’s a web-site here, but about all you can tell without a password is that the participants were staying at a very trendy hotel in West Hollywood.
  • A film has been made about the geometer Shiing-shen Chern. The title is “Taking the Long View” and there’s a web-site here.
  • Update: Two more.

  • XKCD on extra dimensions.
  • Matthew Chalmers has an interesting new article at Physics World entitled Reality check at the LHC.
  • Update: There’s a review of the new Brian Greene book by George Ellis at Nature this week. It emphasizes the problem of lack of testability.

    Posted in Uncategorized | 14 Comments