Never Give Up

Alok Jha had a piece in the Guardian yesterday about the failure to find SUSY. His conclusion, I think, gets the current situation right:

Or, as many physicists are now beginning to think, it could be that the venerable theory is wrong, and we do not, after all, live in a supersymmetric universe.

An interesting aspect of the article is that Jha asks some SUSY enthusiasts about when they will give up if no evidence for SUSY appears:

[Ben] Allanach says he will wait until the LHC has spent a year or so collecting data from its high-energy runs from 2015. And if no particles turn up during that time? “Then what you can say is there’s unlikely to be a discovery of supersymmetry at Cern in the foreseeable future,” he says.

Allanach has been at this for about 20 years, and here’s what he has to say about the prospect of failure:

If the worst happens, and supersymmetry does not show itself at the LHC, Allanach says it will be a wrench to have to go and work on something else. “I’ll feel a sense of loss over the excitement of the discovery. I still feel that excitement and I can imagine it, six months into the running at 14TeV and then some bumps appearing in the data and getting very excited and getting stuck in. It’s the loss of that that would affect me, emotionally.”

John Ellis has been in the SUSY business even longer, for 30 years or so, and he’s not giving up:

Ellis, though confident that he will be vindicated, is philosophical about the potential failure of a theory that he, and thousands of other physicists, have worked on for their entire careers.

“It’s better to have loved and lost than not to have loved at all,” he says. “Obviously we theorists working on supersymmetry are playing for big stakes. We’re talking about dark matter, the origins of mass scales in physics, unifying the fundamental forces. You have to be realistic: if you are playing for big stakes, very possibly you’re not going to win.”

But just because you’re not going to win, that doesn’t mean you ever have to admit that you lost:

John Ellis, a particle theorist at Cern and King’s College London, has been working on supersymmetry for more than 30 years, and is optimistic that the collider will find the evidence he has been waiting for. But when would he give up? “After you’ve run the LHC for another 10 years or more and explored lots of parameter space and you still haven’t found supersymmetry at that stage, I’ll probably be retired. It’s often said that it’s not theories that die, it’s theorists that die.”

There may be a generational dividing line somewhere in the age distribution of theorists, with those above a certain age likely to make the calculation that, no matter how bad things get for SUSY and string theory unification, it’s better to go to the grave without admitting defeat. The LHC will be in operation until 2030 or so, and you can always start arguing that 100 TeV will be needed to see SUSY (see here), ensuring that giving up won’t ever be necessary except for those now still wet behind the ears.

For another journalist’s take on the state of SUSY, this one Columbia-centric and featuring me as skeptic, see here.

Posted in Uncategorized | 60 Comments

The Next Machine

For the last week or so US HEP physicists have been meeting in Minneapolis to discuss plans for the future of US HEP. Some of the discussions can be seen by looking through the various slides available here. A few days earlier Fermilab hosted TLEP13, a workshop to discuss plans for a new very large electron-positron machine. There is a plan in place (the HL-LHC) for upgrading the LHC to higher luminosity, with operations planned until about 2030. Other than this though, there are no current definite plans for what the next machine at the energy frontier might be. Some of the considerations in play are as follows:

  • The US is pretty much out of the running, with budgets for this kind of research much more likely to get cut than to get the kinds of increases a new energy frontier machine would require. Projects with costs up to around $1 billion could conceivably be financed in coming years, but for the energy frontier, one is likely talking about $10 billion and up.
  • Pre-LHC, attention was focused on prospects for electron-positron linear colliders, specifically the ILC and CLIC projects. The general assumption was that LEP, which reached 209 GeV in 2000, was the last circular electron-positron collider. The problem is that, at fixed radius, synchrotron radiation losses grow as the fourth power of the energy, and LEP was already drawing a sizable fraction of the total power available at Geneva. Linear accelerators don’t have this problem, but they do have problems achieving high luminosity, since one is not repeatedly colliding the same stored bunches.

    The hope was that the LHC would discover not just the Higgs, but all sorts of new particles. Once the mass of such new particles was known, ILC or CLIC technology would give a design of an appropriate machine to study such new particles in ways not possible at a proton-proton machine. These hopes have not worked out so far, making it now appear quite unlikely that there are such new particles at ILC/CLIC-accessible energies. It remains possible that the Japanese will decide to fund an ILC project, even without the appealing target of a new particle besides the Higgs to study.

  • The LHC has told us the Higgs mass, making it now possible to consider what sort of electron-positron collider would be optimal for studying the physics of the Higgs, something one might call a “Higgs factory”. It turns out that a center of mass energy of about 240 GeV is optimal for Higgs production. This is easily achievable with the ILC, but since it is not that much higher than LEP, there is now interest in the possibility of a circular collider as a Higgs factory. There is a proposal called LEP3 (discussed on this blog here) for putting such a collider in the LHC tunnel, but it is unclear whether such a machine could coexist with the LHC, and no one wants to shut down the LHC before a 2030 timescale.
  • Protons are much heavier than electrons, so synchrotron radiation losses are not the problem, but the strength of the dipole magnets needed to keep them in a circular orbit is. To get to higher proton-proton collision energies in the same tunnel, one needs higher-strength magnets, with energy scaling linearly with field strength. The LHC magnets are about 8 Tesla; the current technology limit for appropriate magnets is about 11 Tesla. The possibility of an HE-LHC, operating at 33 TeV with 20 Tesla magnets, is under study, but this technology is still quite a ways off. Again, the time-scale for such a machine would be post-2030.
  • The other way to get to higher proton-proton energies is to build a larger ring, with energy scaling linearly with the size of the ring (for fixed magnet strength). Long-term thinking at CERN now seems to be focusing on the construction of a much larger ring, of size 80-100 km. One could reach 100 TeV energies with either 20 Tesla magnets and an 80 km ring, or 16 Tesla magnets and a 100 km ring (such a machine is being called a VHE-LHC). If such a tunnel were to be built, one could imagine first populating it with an electron-positron collider, and this proposal is being called TLEP. It would operate at energies up to 350 GeV and would be an ideal machine for precision studies of the Higgs. It could also be operated at very high luminosity at lower energies, significantly improving on electroweak measurements made at LEP (the claim is that LEP-size data sets could be reproduced in every 15 minutes of running). Optimistic time-lines would have TLEP operating around 2030, replaced by the VHE-LHC in the 2040s.
  • For more about TLEP, see the talks here. The final talk of the TLEP workshop wasn’t about TLEP, but Arkani-Hamed on the VHE-LHC (it sounds like maybe he’s not very interested in the Higgs factory idea). He ends with:

    EVERY student/post-doc/person with a pulse (esp. under 35) I know is ridiculously excited by even a glimmer of hope for a 100 TeV pp collider. These people don’t suffer from SSC PTSD.

    Looking at the possibilities, I do think TLEP/VHE-LHC looks like the currently most promising route for the future of CERN and HEP physics (new technology might change this, e.g. a muon collider). Maybe I don’t have a pulse though, since I can’t say that I’m ridiculously excited by just a glimmer of VHE-LHC hope for a time-frame past my life-expectancy.

    A 100 km tunnel would be even larger than the planned SSC tunnel (89 km), and one doesn’t have to suffer from SSC post-traumatic stress disorder to worry about whether a project this large can be successfully funded and built (in very rough numbers, I’d guess one is talking about costs on the scale of $20 billion). My knowledge of EU science funding issues is insufficient to have any idea if the money for something on this scale is a possibility. On the other hand, with increasing concentration of all wealth in the hands of an increasingly large number of multi-billionaires, perhaps this just needs the right rich guy for it to happen.

    Someone is going to have to do a better job than Arkani-Hamed of finding an argument that will sell this to the rest of the scientific community. His main argument is that such a machine would allow us to improve the ultimate LHC bound on “fine-tuning” from at least 10^-2 to a number like 10^-4, or maybe finally see some SUSY particles. I don’t think this argument is going to get $20 billion: “we thought we’d see all this stuff at the LHC because we were guessing some number we don’t understand was around one. We saw nothing and it turns out the number is small, no bigger than one in a hundred. Now we’d like to spend $20 billion to see if it’s smaller than one in a hundred, but bigger than one in ten thousand.”
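
The scaling rules quoted in the bullets above can be checked numerically. Here is a minimal sketch; the LHC baseline figures (8.3 T dipoles, 26.7 km circumference, 14 TeV) are standard public numbers, and the function names are my own:

```python
# Back-of-the-envelope scaling for the collider options discussed above.
# Synchrotron power per particle scales as E^4 / (m^4 * r); the collision
# energy of a hadron ring scales as B * r (dipole field times radius).

def synchrotron_loss_ratio(e1_gev, e2_gev, r1_km, r2_km):
    """Ratio of per-turn synchrotron losses: (E1/E2)^4 * (r2/r1).

    At fixed radius, losses grow as the fourth power of the beam energy;
    a bigger ring reduces them linearly.
    """
    return (e1_gev / e2_gev) ** 4 * (r2_km / r1_km)

def pp_energy_tev(b_tesla, ring_km, lhc_b=8.3, lhc_km=26.7, lhc_tev=14.0):
    """Proton-proton collision energy, scaled linearly from the LHC
    baseline (8.3 T dipoles, 26.7 km ring, 14 TeV)."""
    return lhc_tev * (b_tesla / lhc_b) * (ring_km / lhc_km)

# TLEP at 350 GeV vs LEP at 209 GeV: the 100 km tunnel buys back much of
# the (350/209)^4 ~ 7.9x growth in per-turn losses.
print(synchrotron_loss_ratio(350, 209, 100, 26.7))   # ~2.1x LEP's per-turn loss

# The two VHE-LHC options quoted above both land near 100 TeV:
print(pp_energy_tev(20, 80))    # ~100 TeV (20 T magnets, 80 km ring)
print(pp_energy_tev(16, 100))   # ~100 TeV (16 T magnets, 100 km ring)
```

This is of course only the leading scaling, ignoring fill factors, power limits and magnet engineering, but it reproduces the numbers quoted in the proposals.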

Posted in Experimental HEP News | 21 Comments

Latest from the Stacks Project

My colleague Johan de Jong for the last few years has been working on an amazing mathematical endeavor he calls the “Stacks Project”. As boring 20th century technology, this is a work-in-progress document (now nearly 4000 pages), available here. But from the beginning Johan (known to his friends as “the Linus Torvalds of algebraic geometry”) has conceptualized this as an open-source project using 21st century technology, including a blog and a github repository.

As of last night, the Stacks Project has many new features, courtesy of impressive work by Johan’s collaborator on this, Pieter Belmans. I was going to write something here describing the new features and how cool they are, but a much better job of this has been done by Cathy O’Neil, aka Mathbabe (by the way, if you’re not reading Cathy’s blog, you should be…). With her permission, I’m cross-posting her new blog entry about this, so, what follows is from Cathy:

The Stacks Project gets ever awesomer with new viz

Here’s a completely biased interview I did with my husband A. Johan de Jong, who has been working with Pieter Belmans on a very cool online math project using d3js. I even made up some of his answers (with his approval).

Q: What is the Stacks Project?

A: It’s an open source textbook and reference for my field, which is algebraic geometry. It builds foundations starting from elementary college algebra and going up to algebraic stacks. It’s a self-contained exposition of all the material there, which makes it different from a research textbook or the experience you’d have reading a bunch of papers.

We were quite neurotic setting it up – everything has a proof, other results are referenced explicitly, and it’s strictly linear, which is to say there’s a strict ordering of the text so that all references are always to earlier results.

Of course the field itself has different directions, some of which are represented in the stacks project, but we had to choose a way of presenting it which allowed for this idea of linearity (of course, any mathematician thinks we can do that for all of mathematics).

Q: How has the Stacks Project website changed?

A: It started out as just a place you could download the pdf and tex files, but then Pieter Belmans came on board and he added features such as full text search, tag look-up, and a commenting system. In this latest version, we’ve added a whole bunch of features, but the most interesting one is the dynamic generation of dependency graphs.

We’ve had some crude visualizations for a while, and we made t-shirts from those pictures. I even had this deal where, if people found mathematical mistakes in the Stacks Project, they’d get a free t-shirt, and I’m happy to report that I just last week gave away my last t-shirt. Here’s an old picture of me with my adorable son (who’s now huge).

[Image: Stacks Project t-shirt]

Q: Talk a little bit about the new viz.

A: First a word about the tags, which we need to understand the viz.

Every mathematical result in the Stacks Project has a “tag”, which is a four letter code, and which is a permanent reference for that result, even as other results are added before or after that one (by the way, Cathy O’Neil figured this system out).

The graphs show the logical dependencies between these tags, represented by arrows between nodes. You can see this structure in the above picture already.

So for example, if tag ABCD refers to Zariski’s Main Theorem, and tag ADFG refers to Nakayama’s Lemma, then since Zariski depends on Nakayama, there’s a logical dependency, which means the node labeled ABCD points to the node labeled ADFG in the entire graph.

Of course, we don’t really look at the entire graph, we look at the subgraph of results which a given result depends on. And we don’t draw all the arrows either, we only draw the arrows corresponding to direct references in the proofs. Which is to say, in the subgraph for Zariski, there will be a path from node ABCD to node ADFG, but not necessarily a direct link.
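
The subgraph construction just described can be sketched in a few lines of Python. The tags and citation edges below are invented for illustration; only the "ABCD depends on ADFG, but not directly" structure mirrors the Zariski/Nakayama example above:

```python
# Toy logical-dependency data: each tag maps to the tags its proof cites
# directly. Invented example: ABCD ("Zariski") cites EFGH, whose proof
# cites ADFG ("Nakayama") -- so ABCD depends on ADFG only transitively.
direct_refs = {
    "ABCD": ["EFGH"],
    "EFGH": ["ADFG"],
    "ADFG": [],
}

def dependency_subgraph(root, refs):
    """All tags the root depends on (transitively), keeping only the
    direct-reference edges among them -- what the viz actually draws."""
    seen, stack = set(), [root]
    while stack:
        tag = stack.pop()
        if tag not in seen:
            seen.add(tag)
            stack.extend(refs[tag])
    return {t: [d for d in refs[t] if d in seen] for t in seen}

sub = dependency_subgraph("ABCD", direct_refs)
# ADFG appears in the subgraph (a path exists from ABCD), but there is
# no direct ABCD -> ADFG edge, matching the description above.
print("ADFG" in sub)           # True
print("ADFG" in sub["ABCD"])   # False
```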

Q: Can we see an example?

A: Let’s move to an example: result 01WC, which refers to the proof that “a locally projective morphism is proper”.

First, there are two kinds of heat maps. Here’s one that defines distance as the maximum (directed) distance from the root node. In other words, how far down in the proof is this result needed? In this case the main result 01WC is bright red with a black dotted border, and any result that 01WC depends on is represented as a node. The edges are directed, although the arrows aren’t drawn, but you can figure out the direction by how the color changes. The dark blue colors are the leaf nodes that are farthest away from the root.

Another way of saying this is that the redder results are those closer to the root result in meaning and sophistication level.

Note if we had defined the distance as the minimum distance from the root node (to come soon hopefully), then we’d have a slightly different and also meaningful way of thinking about “redness” as “relevance” to the root node.

This is a screenshot but feel free to play with it directly here. For all of the graphs, hovering over a result will cause the statement of the result to appear, which is awesome.

Stacks Project — Force (depth) 01WC

Next, let’s look at another kind of heat map, where the color is defined as the maximum distance from some leaf node in the overall graph. So dark blue nodes are basic results in algebra, sheaves, sites, cohomology, simplicial methods, and other chapters. The link is the same; you can just toggle between the different metrics.

Stacks Project — Force (height) 01WC
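
The two heat-map colorings ("depth" below the root, "height" above the leaves) amount to longest-path computations on the dependency DAG. A toy sketch, with an invented four-node graph standing in for the real tag data:

```python
from functools import lru_cache

# Invented toy DAG of direct references: root -> {a, b} -> c.
refs = {
    "root": ["a", "b"],
    "a": ["c"],
    "b": ["c"],
    "c": [],
}

@lru_cache(maxsize=None)
def height(tag):
    """Max distance down to a leaf -- the 'height' heat map: height 0
    (dark blue) marks the basic foundational results."""
    deps = refs[tag]
    return 0 if not deps else 1 + max(height(d) for d in deps)

def depths(root):
    """Max distance from the root to each node -- the 'depth' heat map:
    how far down in the proof each result is needed."""
    d = {root: 0}
    # Descending height is a valid topological order for a DAG.
    for tag in sorted(refs, key=height, reverse=True):
        if tag in d:
            for dep in refs[tag]:
                d[dep] = max(d.get(dep, 0), d[tag] + 1)
    return d

print(height("root"))   # 2
print(depths("root"))   # {'root': 0, 'a': 1, 'b': 1, 'c': 2}
```

Swapping `max` for `min` in `depths` would give the "minimum distance from the root" variant mentioned above.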

Next we delved further into how results depend on those different topics. Here, again for the same result, we can see the extent to which it depends on results from the various chapters. If you scroll over the nodes you can see more details. This is just a screenshot, but you can play with it yourself here, and you can collapse it in various ways corresponding to the internal hierarchy of the project.

Stacks Project — Collapsible 01WC

Finally, we have a way of looking at the logical dependency graph directly, where each result node is labeled with a tag and colored by “type”: whether it’s a lemma, proposition, theorem, or something else. It also annotates the results which have separate names. Again a screenshot, but play with it here: it rotates!

Stacks Project — Cluster 01WC


Check out the whole project here, and feel free to leave comments using the comment feature!

Posted in Uncategorized | 12 Comments

Bankrupting Physics

I just spent a depressing and tedious few hours reading through Bankrupting Physics, an English translation of Alexander Unzicker’s 2010 book Vom Urknall zum Durchknall, originally written in German.

When I started reading the thing I wasn’t expecting much, but figured it would be some sort of public service to take the time to identify what Unzicker had to say that made sense and what didn’t, and then write something distinguishing the two here. After a while though, it became clear that Unzicker is just a garden-variety crank, of a really tedious sort. Best advice about the book would be the usual in this situation, just ignore it, since no good can possibly come from wasting time engaging with this nonsense. I have no idea why any publisher, in Germany or here, thought publishing this was a good idea.

If you must know though, here’s a short summary of what’s in the book. The first half is about gravitation, cosmology and astrophysical observations. Unzicker’s obsessive idea, shared with innumerable other cranks, is that any scientific theory beyond one intuitively clear to them must be nonsense. Similarly, any experimental result beyond one where they can easily understand and analyze the data themselves is also nonsense. He’s a fan of Einstein, although he thinks general relativity somehow needs to be fixed, something to do with it getting phenomena involving small accelerations wrong. There are endless complaints about how cosmology involves too many parameters, and how dark matter/energy show that physicists really understand nothing.

When he gets to particle physics, we learn that things went wrong back when physicists started invoking a symmetry that wasn’t intuitively obvious, isospin symmetry. According to Unzicker, symmetries in particle theory are all a big mistake, “the standard model barely predicts anything”, “the standard model can actually accommodate every result”, and endless other similar nonsense. As for the experimental side of things, he takes a comment from Feynman about renormalization in QED, claims it means that there is no understanding of the production of photons at high energy, then uses this to dismiss data analysis at HEP experiments as “just ridiculous”. High energy physics experiments are all just a big scam, with the physicists involved unwilling to admit this, since they’ve wasted so much money on them.

The last part of the book contains lots of criticism of string theory, etc., much of it parroting my book and blog. According to Unzicker:

Woit does a great job in debunking the string and SUSY crap. Unfortunately, he has pretty mainstream opinions with respect to the Standard Model.

Well, maybe he does get something right… I have to admit that one of the things that every so often makes me wonder if I’m completely misguided, and maybe there is a lot more value to strings/SUSY/branes/extra dimensions etc. than I think, is reading rants like Unzicker’s.

So, my strong advice would be to do your best to ignore this. Luckily, there’s an infinitely better book coming out here in the US at the same time: Jim Baggott’s Farewell to Reality, which I highly recommend. It seems likely that the two books will get reviewed together, giving Unzicker far more attention than he deserves. If so, at least this will provide a real-life experiment indicating whether book reviewers can tell sense from nonsense.

Posted in Book Reviews | 34 Comments

Where are we heading?

Every summer the IAS in Princeton runs a program for graduate students and postdocs called “Prospects in Theoretical Physics”. It’s going on now, with this year’s topic LHC Physics. Much of the program is devoted to the important but complex technical issues of extracting physics from LHC data. Things began though with a talk on Where are we heading? from Nati Seiberg designed to explain to students how they should think about the significance of the LHC results and where they were taking the field.

Most of the talk was about the hierarchy problem and “naturalness”, with the forward-looking conclusion the same one that Seiberg’s colleague Arkani-Hamed has been aggressively pushing: the main significance of LHC results will be telling us that the world is either “natural” (likely by discovering SUSY) or “unnatural” (in which case there’s a multiverse and it’s hopeless to even try to predict SM parameters). Given the negative results about SUSY so far, this conclusion pretty much means that the students at the IAS are being told that the LHC results mean it’s the multiverse, and they shouldn’t even think about trying to figure out where the SM comes from since that’s a lost cause. The talk ends with the upbeat claim that this is a “win-win situation”: reaching the conclusion that the LHC has shown we can’t learn more about where the SM came from will be a great scientific advance and “The future will be very exciting!”. Seiberg does at one point make an interesting comment that indicates that he’s not completely on-board with this conclusion. He notes that there’s a “strange coincidence” that theorists are making this theoretical argument about the necessity of giving up at just exactly the same time in our history that we have run out of technological ability to explore shorter distances. A “strange coincidence” indeed…

For more conventional wisdom along these lines, see Naturally Unnatural from Philip Gibbs, which also argues that what we are learning from the LHC is that we must give up and embrace the multiverse.

Frank Wilczek has just made available on his web-site a new paper on Multiversality. It has the usual arguments for the multiverse, although unlike Seiberg/Arkani-Hamed he doesn’t try to claim that this is an exciting positive development, closing with a “lamentation”:

I don’t see any realistic prospect that anthropic or statistical selection arguments – applied to a single sample! – will ever lead to anything comparable in intellectual depth or numerical precision to the greatest and most characteristic achievements of theoretical physics and astrophysics…

there will be fewer accessible features of the physical world for fundamental theory to target. One sees these trends, for example, in the almost total disconnect between the subject matter of hep-th and hep-ex.

and a “warning”:

There is a danger that selection effects will be invoked prematurely or inappropriately, and choke off the search for deeper more consequential explanations of observed phenomena. To put it crudely, theorists can be tempted to think along the lines “If people as clever as us haven’t explained it, that’s because it can’t be explained – it’s just an accident.”

He does see possibilities for understanding more about the SM in two places: the SUSY GUT unification of couplings, and axions as an explanation of the smallness of the QCD theta parameter. The last part of the paper is about axion cosmology and anthropics. Wilczek has written about the stories of the 1981 origin of the SUSY GUT unification argument and the 1975 birth of the axion. It’s striking that we’re 32 and 38 years later without any idea whether these ideas explain anything. A depressing possible answer to “Where are we heading?” would be an endless future of multiverse mania, with a short canonical list of ancient but accepted ideas about fundamental theory (SUSY GUTs, string theory, axions) that can never be tested.

Posted in Multiverse Mania | 52 Comments

News From All Over

  • I confess to mostly finding “philosophy of physics” arguments not very helpful for understanding anything, but for those who feel differently, some new things to look at are a Scientific American article Physicists Debate Whether the World is Made of Particles or Fields or Something Else Entirely, an interview with Jonathan Bain, an interview with Tim Maudlin, a debate between John Ellis, Lawrence Krauss and theologian Don Cupitt about Why is there something rather than nothing?, and the talks at a UCSC Philosophy of Cosmology Summer School. Since the last of these was funded by the Templeton Foundation, it ended with several talks on “Implications of cosmology for the philosophy of religion”. These included a detailed argument that the explanation for the laws of nature is “there is a perfect being”, contrasting this to another argument favored at the Summer School “the multiverse did it”.
  • This week the Perimeter Institute will host Loops 13, devoted to loop quantum gravity and other quantum gravity approaches. While it’s also funded by Templeton, the organizers seem to have managed to keep God out of this one.
  • At CERN, Amplitudes, Strings and Branes is on-going. Philip Gibbs has an amusing argument that this and Loops 13 are The Same Bloody Thing.
  • One thing the LQG and Amplitudes people do share is that some of their most important ideas come from the same person: Roger Penrose (who, by the way, would be a good candidate for the Fundamental Physics Prize, although his distaste for string theory might be a disqualifier). There’s a long interview with him at The Ideas Roadshow, mainly about his “Cyclic Universe” ideas.
  • The Simons Foundation has been publishing some excellent science reporting, and now has an online publication they’re calling Quanta Magazine. The latest story there is a very good piece on the search for dark matter from Jennifer Ouellette. The Simons Center at Stony Brook now has a newsletter about their activities.
  • Another on-going conference is one of the big yearly HEP conferences, EPS HEP 2013 in Stockholm. CMS and LHCb have impressive new results about rare B decays, timed for this conference. For the details, see Tommaso Dorigo. There are also CMS and CERN press releases.

    Last year similar but less accurate results were advertised as putting SUSY “in the hospital”, which some people objected to, on the grounds that it was already in trouble and this kind of result doesn’t make things much worse. Resonaances had the details, summarizing this as “another handful of earth upon the coffin”. The CERN press office tries to put the best SUSY spin on this that it can:

    One popular theory is known as supersymmetry, SUSY for short. It postulates the existence of a new type of particle for every Standard Model particle we know, and some of these particles would have just the right properties to make up a large part of the dark universe. There are many SUSY models in circulation, and SUSY is just one of many theoretical routes to physics beyond the Standard Model. Today’s measurements allow physicists to sort between them. Many are incompatible with the new measurements, and so must be discarded, allowing the theory community to work on those that are still in the running.

  • Finally, for those with mathematical interests who have waded through the above, Terry Tao has a remarkable long expository piece about the Riemann hypothesis, ranging from analytic number theory aspects through the function field case and l-adic cohomology.

Update: For more from Penrose, see this recent talk in Warsaw.

Update: Studies in History and Philosophy of Modern Physics is planning a special issue on the significance of the Higgs discovery, the call for papers is here.

Posted in Uncategorized | 21 Comments

No Joking Matter

Back now from vacation, and while I was away several people sent me links to point out that string theory promoters definitely aren’t taking a vacation. Links here with a few quick comments, followed by something about the issue of making fun of string theorists.

  • Lenny Susskind has a quite good new book out about classical mechanics (see here), but the Economist doesn’t want to talk to him about this, instead it’s the usual string theory promotional effort:

    These extra dimensions can be arranged and put together in many different patterns, in a variety of different ways. Not billions, trillions or quintillions of ways, but many more than that. The ways these dimensions are put together into these tiny little spaces determine how particles will behave, what particles will exist, what the constants of nature are—quantities like dark energy or the electric charge of an electron. In string theory all those things are features of the ways that these tiny dimensions are put together. The tiny dimensions are like the DNA of the universe.

  • Last month Cumrun Vafa was in Bangalore, explaining (see here, slides here) among other things about the significance of the web of string dualities that makes up M-theory:

    String dualities [are] in my opinion perhaps the most fundamental discovery that physicists have made in a century.

    Note that the past century includes General Relativity (barely), quantum mechanics, gauge theory, the Standard Model, as well as quite a few discoveries in other areas of physics.

  • The Strings 2013 conference included the usual public talks promoting string theory. Witten’s was String Theory and the Universe, which was pretty much unchanged (minus optimism about SUSY at the LHC) from similar talks he has been giving for nearly 20 years (since 1995 and the M-theory proposal). Linde’s was Universe or Multiverse?, about the “new scientific paradigm” of Multiverse Mania. He argues that the virtue of this is that it is “impossible to disprove”.
  • David Gross’s public talk on The Frontiers of Particle Physics had nothing to do with string theory, focusing on explaining the standard model and some of the questions it leaves unanswered. Much more interesting was his outlook talk at the conference which included the usual exhortations about string theory being alive and healthy, flourishing with many new and brilliant string theorists, but also included some material unusual in such a venue and much more challenging for his audience. His reference to connections between string theory and condensed matter physics described this as having been “overhyped by our community”. About AdS/CFT, he noted that it “does not provide a satisfactory non-perturbative quantum gravity”.

    He commented on the lack of connection between the talks and HEP physics, saying that it was “important that string theorists not retreat into quantum gravity”. About SUSY, he characterized it as “still alive, but not kicking”, and he argued that the LHC results of the past year have made more likely the “Extremely pessimistic scenario” of an SM Higgs, no SUSY, no dark matter, no indication of the next energy threshold. Since “HEP is where string theory connects to reality”, he made the point to the audience that “if this scenario materializes we are all going to suffer.”

    I’m not sure why he picked this date, but he encouraged those with post-1999 Ph.D.s to realize that it was now quite possible that those who came before them had “somehow got it wrong”. This was the first time I’ve seen an influential member of the string theory community raise this possibility and call for people to consider it seriously.

  • Sean Carroll deals here with arguments from the “Popperazi” that the string theory anthropic multiverse is pseudo-science, by ignoring the serious arguments being made. He has his own definition of what science is, one which looks to me to open up far more questions than it answers. About string theory unification itself, the question has never been whether it’s science, but whether it’s an idea worth pursuing given the ways in which it has so far failed. The best argument for continued pursuit of the idea is of course that there aren’t obviously better ones around, but this raises its own issues.

Sabine Hossenfelder has a posting about an introduction to a Lawrence Krauss talk in which the joke was made that “String theorists have to sit in the back”. The context for this was a controversy about the place of women at a discussion involving Krauss hosted by an Islamic group. Like Sabine, I don’t want to discuss that controversy here, just agree there’s a good case to be made that it’s no joking matter. I think she makes a mistake by interpreting the joke as an attack on string theorists (it’s a joke, after all, open to many interpretations), but I was struck by her perception of string theorists as an embattled minority under unfair attack, as well as by her claim that it’s not all right to make fun of them in any way.

The situation these days is clearly very different from what it was back in 2004 when I started this blog, partly because of the past decade of failure of string theory unification to get anywhere, partly because of the negative LHC results, partly because of the multiverse, and partly because of the public behavior of some in the string theory community in reaction to criticism. Given the high-profile ongoing promotional campaign exemplified above, I don’t think we’re yet at the point where criticism of string theorists is “kicking them when they’re down”, and humor is sometimes the best way to make a point concisely. Probably the most incisive criticism of string theory ever made came in a cartoon, and personally I’ve never understood how it is even possible to take arguments like Linde’s seriously (I’m not even sure he does…). So I see a role for humor in charting the continuing story of the collapse of the heavily influential string theory unification paradigm.

Update: If you noticed more than the usual sloppiness, incompleteness and incoherence, maybe it was because this got published early, while I was in the middle of writing it.

Update: The latest on the philosophy of science from Linde (see here), who is at a workshop in Bad Honnef this week.

The multiverse is the only known explanation so in a sense it has already been tested

Is it all right to make fun of this, or should one seriously discuss the scientific merits here?

Posted in Uncategorized | 46 Comments

Hiking the Appalachian Trail

I’ll be heading North tomorrow, ultimately ending up in backwoods Maine, hiking the Appalachian Trail, then back home after a week or so. The comment section here will likely be closed for the duration.

Progress is being made on my notes on quantum mechanics and representation theory, based on the course I taught this past year. I’m about 3/4 done with the writing at this point; the latest version is available here.

Posted in Uncategorized | 10 Comments

Strings 2013 etc.

These days one can just about attend a wide variety of summer conferences from the comfort of one’s home or office, with talks appearing online more or less immediately after they are given. This week some possibilities to consider are:

  • The big yearly string theory conference, Strings 2013, is in Seoul this week with 279 physicists in attendance; talks are available here. As usual for recent string theory conferences, there aren’t a lot of strings to be seen. Perhaps these yearly conferences should be renamed something like “Conference on the latest topics popular among people who used to do string theory”. No sign at all of the landscape or string theory unification. For some understanding of why, see Life at the Interface of Particle Physics and String Theory, to be published in Reviews of Modern Physics, which makes pretty clear why the organizers of Strings 20XX now want to avoid this topic.
  • The 2013 Lepton Photon conference is in San Francisco this week, talks here. To mark the occasion, Gordon Watts and Jacques Distler have thrown in the towel, paying up on their bet with Tommaso Dorigo that the LHC would find SUSY or other “new physics”. Tommaso has the full story here.
  • For the more philosophically minded, the Templeton Foundation is funding a summer institute on the Philosophy of Cosmology, and you can follow the talks here. If thinking about time is your thing and you’ve missed out on the free summer amongst the redwoods, there’s still time to get in line for Templeton funds, with a few days left to apply for grants here.

Update: To round out the list, I should have included something for the mathematical physicists, this week’s Symmetries in Mathematics and Physics conference in Rio. Videos of the talks are appearing here.

Update: Jacques Distler has a posting here (or guest post here) conceding the loss of his bet with Tommaso Dorigo. Unlike some other theorists (e.g. John Ellis) who are arguing that everything’s fine, one just has to wait until 2015-6 for the higher energy LHC run, Distler is more of a realist:

…would I be willing to bet on the 2015 LHC run uncovering new BSM physics?

The answer, I think, is: not unless you were willing to give me some substantial odds (at least 5–1; if I think about it, maybe even higher).

Knowing the mass of the Higgs (∼125 GeV) rules out huge swaths of BSM ideas. Seeing absolutely nothing in the 7 and 8 TeV data (not even the sort of 2-3σ deviations that, while not sufficient to claim a “discovery,” might at least serve as tantalizing hints of things to come) disfavours even more.

The probability (in my Bayesian estimation) that the LHC will discover BSM physics has gone from fairly likely (as witnessed by my previous willingness to take even-odds) to rather unlikely. N.B.: that’s not quite the same thing as saying that there’s no BSM physics at these energies; rather that, if it’s there, the LHC won’t be able to see it (at least, not without accumulating many years worth of data).
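Distler’s shift from even odds to demanding at least 5–1 can be put in numbers. A minimal sketch (my illustration, not anything from his post) converting betting odds to the implied Bayesian credence:

```python
# Illustrative only: odds of N-to-1 against an event imply a
# credence of 1/(N+1) that the event occurs.
def implied_probability(odds_against: float) -> float:
    """Return the probability implied by N-to-1 odds against."""
    return 1.0 / (odds_against + 1.0)

# Even odds (1-1), Distler's earlier position: 50% credence in BSM physics.
print(implied_probability(1))  # 0.5
# Demanding at least 5-1 odds now: a credence of at most ~17%.
print(round(implied_probability(5), 3))  # 0.167
```

So “fairly likely” to “rather unlikely” here means roughly a drop from 50% to under 17%.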

Posted in Strings 2XXX | 20 Comments

Kenneth Wilson 1936-2013

Kenneth Wilson died this past weekend in Maine, at the age of 77. Some obituaries can be found here, here, here, and here.

Wilson won the Nobel prize in 1982 for his work on critical phenomena and phase transitions, but his influence on particle theory was arguably even greater than on condensed matter physics. Unfortunately I never got a chance to meet him, but a large part of what I was learning about quantum field theory back in my days as a graduate student came either directly or indirectly from him.

Soon after the discovery of asymptotic freedom in 1973, he started work on developing lattice methods for studying gauge theories non-perturbatively with a fixed cut-off. This founded the whole field of lattice gauge theory, which remains a major and active part of HEP theory. Not many people have a whole section of the arXiv they’re responsible for. For his story of how this came about, see his 2004 The Origins of Lattice Gauge Theory.

The reason Wilson was well-placed to quickly get lattice gauge theory off the ground in 1973-4 was that he was one of very few theorists who had been thinking hard and fruitfully about the meaning of non-perturbative quantum field theory. After getting an undergraduate degree in math from Harvard in 1956, he did his thesis work under Gell-Mann at Caltech, finishing in 1961 and developing an interest in the renormalization group. From 1963 on he focused his research on strong interactions and the high energy behavior of quantum field theory. This was a time when QFT had fallen out of favor, with S-matrix theory considered the cutting edge. One reason others weren’t thinking about this was that the problem was very hard. It was also perhaps the deepest problem around: how do you make sense of quantum field theory? What is QFT, really, outside of the approximation method of perturbation theory?

By the early 1970s, Wilson had developed the ideas about the renormalization group and QFT that now form the foundation of how we think about non-perturbative QFTs. The first applications of this actually were to problems about critical phenomena, and it was for this work that he won the Nobel prize. With the arrival of QCD, these ideas became central to the whole field of particle theory, with much of the 1970s and early 80s devoted to investigations that relied heavily on them. If you were a graduate student then, you certainly were reading his papers.

For more from Wilson himself about his life and work, see his 1983 Nobel Prize lecture and a long interview from 2002 here, here and here.

John Preskill has a wonderful posting up about Wilson, with the title We are all Wilsonians now. He ends it by explaining Wilson’s early role in the debate about “naturalness”. Wilson was well aware of the quadratic sensitivity of elementary scalars to the cut-off and had argued that this meant that you didn’t expect to see elementary scalars at low masses. This argument was developed here by Susskind as a motivation for technicolor. Preskill doesn’t mention though that Wilson later referred to this as a “blunder”. In 2004 he had this to say:

The final blunder was a claim that scalar elementary particles were unlikely to occur in elementary particle physics at currently measurable energies unless they were associated with some kind of broken symmetry [23]. The claim was that, otherwise, their masses were likely to be far higher than could be detected. The claim was that it would be unnatural for such particles to have masses small enough to be detectable soon. But this claim makes no sense when one becomes familiar with the history of physics. There have been a number of cases where numbers arose that were unexpectedly small or large. An early example was the very large distance to the nearest star as compared to the distance to the Sun, as needed by Copernicus, because otherwise the nearest stars would have exhibited measurable parallax as the Earth moved around the Sun. Within elementary particle physics, one has unexpectedly large ratios of masses, such as the large ratio of the muon mass to the electron mass. There is also the very small value of the weak coupling constant. In the time since my paper was written, another set of unexpectedly small masses was discovered: the neutrino masses. There is also the riddle of dark energy in cosmology, with its implication of possibly an extremely small value for the cosmological constant in Einstein’s theory of general relativity.

This blunder was potentially more serious, if it caused any subsequent researchers to dismiss possibilities for very large or very small values for parameters that now must be taken seriously…

He then goes on to argue at length that the lesson of the history of science is that often what seemed like unlikely possibilities turned out to be the right ones, with the argument for unlikeliness just a reflection of the fact that people had been making assumptions that weren’t true and/or they didn’t understand the possibilities as well as they thought they did.
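The “quadratic sensitivity” at issue can be sketched in the standard textbook form (my notation, not Wilson’s): the one-loop correction to an elementary scalar’s squared mass grows with the square of the cutoff,

```latex
m_{\text{phys}}^2 \;=\; m_0^2 \;+\; \frac{c\,\lambda}{16\pi^2}\,\Lambda^2 ,
```

where $\Lambda$ is the cutoff, $\lambda$ a coupling, and $c$ an order-one constant. Keeping $m_{\text{phys}} \ll \Lambda$ then requires a delicate cancellation between $m_0^2$ and the cutoff-scale correction, and it was the presumed implausibility of such a cancellation that Wilson later argued the history of physics gives us no good reason to trust.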

Wilson may no longer be with us, but his ideas certainly are, and they’re very relevant to the biggest controversies of the day.

Posted in Obituaries | 17 Comments