Tevatron vs. LHC

The news from Fermilab is that the Tevatron has set a new luminosity record, with a store last Friday that had an initial luminosity of 4.04 x 10^32 cm^-2 s^-1, or, equivalently, 404 inverse microbarns/sec. For more about this, see a new posting from Tommaso Dorigo, where someone from Fermilab writes in to comment that they’re not quite sure why this store went unusually well.

Over in Geneva, commissioning of the LHC continues. There, the highest initial luminosity reported by ATLAS is 2.3 x 10^27 cm^-2 s^-1, or roughly 200,000 times less than the Tevatron number. I haven’t seen a recent number for the total luminosity delivered by the LHC to the experiments so far, but I believe it’s a few hundred inverse microbarns. The Tevatron is producing more collisions in one second than the LHC has managed in the three weeks since first collisions.

The current plan for the LHC is to devote most of the rest of 2010 to increasing the luminosity of the machine, with a goal of reaching something somewhat lower than the Tevatron luminosity (1-2 x 10^32 cm^-2 s^-1) by the end of the year. Then the plan is to run flat out at this luminosity throughout 2011, accumulating data at the rate of about 100 pb^-1/month, and ending up with a total of 1 fb^-1. The hope is that this will allow them to be competitive for some measurements with Fermilab, the lower luminosity compensated by the advantage of a factor of 3.5 in beam energy that they now enjoy.
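As a back-of-the-envelope check on the unit juggling above (my own arithmetic, not official Fermilab or CERN figures; the 30% duty factor for stable running is my own assumption):

```python
# Cross-checking the luminosity numbers quoted above.
# 1 barn = 1e-24 cm^2, so 1 inverse microbarn corresponds to 1e30 cm^-2
# and 1 inverse picobarn to 1e36 cm^-2.
INV_MICROBARN = 1e30     # cm^-2
INV_PICOBARN = 1e36      # cm^-2

tevatron_peak = 4.04e32  # cm^-2 s^-1, the record store
lhc_peak_now = 2.3e27    # cm^-2 s^-1, best ATLAS number so far
lhc_2011_goal = 1.5e32   # cm^-2 s^-1, mid-range of the 1-2 x 10^32 target

print(tevatron_peak / INV_MICROBARN)  # ~404 inverse microbarns/sec
print(tevatron_peak / lhc_peak_now)   # ~1.8e5, i.e. roughly 200,000 times

# Integrated luminosity per month at the 2011 target, assuming stable
# physics running fills about 30% of calendar time (my guess):
seconds_per_month = 30 * 24 * 3600
print(lhc_2011_goal * seconds_per_month * 0.3 / INV_PICOBARN)
# ~120 pb^-1/month, consistent with "about 100 pb^-1/month" and ~1 fb^-1/year
```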

The Tevatron has already produced over 8 fb^-1 of data, and the current plan is to run the machine through the end of FY 2011, reaching at least 10 fb^-1, and then shut it down for good. The LHC is supposed to go into a long shutdown throughout 2012, not coming back into operation until 2013. Even if all goes well, it likely will not have accumulated enough data to decisively compete with the Tevatron until late 2013 or 2014. Under the circumstances, it’s hard to believe that there aren’t plans being proposed at Fermilab to keep the Tevatron running for several more years, until 2014. The machine should then be able to end up with a total of 15-20 fb^-1 worth of data, which could be enough to allow them to see evidence of the Higgs at the 3 sigma level over the entire possible mass range.

Shutting down the Tevatron as planned would free up funds for other experiments, and free parts of the accelerator complex for use by neutrino experiments. It will be interesting to see whether instead of this, the decision gets made to go for a bid to outrace the LHC to the Higgs and other high energy frontier physics over the next few years.

Update: The integrated luminosity seen by the ATLAS detector for the current run so far is about 400 inverse microbarns, with about 350 at CMS (where roughly 300 inverse microbarns of that is recorded data).

Update: This morning the LHC has started delivering collisions with stable squeezed beams to the experiments, initial luminosity 1.1-1.2 x 10^28 cm^-2 s^-1.

Update: Integrated luminosity delivered to each experiment at the LHC is now up to around 1000 inverse microbarns.

Posted in Experimental HEP News | 27 Comments

Three Mysteries

There’s a well-known list of high-profile problems in fundamental theoretical physics that have gotten most of the attention of the field during the past few decades (examples would be the problems of quantizing gravity, solving QCD, explaining dark energy, finding a model of dark matter, breaking supersymmetry and connecting it to experiment, etc.). Progress on these problems has been rather minimal, and in reaction one recent trend has been for organizations such as FQXI to promote research into questions that are much more “philosophical” (for instance, they are now asking for grant proposals to study “The Nature of Time”). In this posting I’d like to discuss a different class of problems, ones which I believe haven’t gotten anywhere near the attention they deserve, for an interesting reason.

The three problems share the characteristic of being apparently of a purely technical nature. The argument against paying much attention to them is that, in each case, even if one were to find a satisfactory solution, it might not be very interesting. It’s possible that all one would discover is that the conventional wisdom about these problems, that they’re just “technical” and thus not of much significance, is correct. The argument for paying more attention is that the technical problem may be an indication that we’re doing something wrong, that there is something of significance about the Standard Model that we haven’t yet understood. Achieving this understanding may lead us to the insight needed to successfully get beyond the Standard Model. At the moment all eyes are on the LHC, with the hope that experiment will lead to new insight. Whether this will work out remains to be seen, but in any case it looks like it’s going to take a few years. Perhaps theorists with nothing better to do than wait will want to consider thinking about these problems.

Non-Perturbative BRST

The BRST method used to deal with the gauge symmetry of perturbative Yang-Mills theory does not appear to generalize to the full non-perturbative theory, for a rather fundamental reason. This was first pointed out by Neuberger back in 1986 (Phys. Lett. B 183 (1987) 337-340), who argued that, non-perturbatively, the phenomenon of Gribov copies implies that expectation values of gauge-invariant observables will vanish. I’ve written elsewhere about a different approach to BRST that I’m working on (see here), which is still at a stage where I only fully understand what is going on in some toy quantum-mechanical models. My own point of view is that there are still a lot of very non-trivial things to be understood about gauge symmetry in QFT, and that homological techniques of the BRST sort for dealing with it are of deep significance. Others will disagree, arguing that gauge symmetry is just an un-physical redundancy in our description of nature, and that how one treats it is a technical problem of no real physical significance.
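For readers who haven’t seen it, here is Neuberger’s argument in schematic form, as I understand it (a sketch only, with all conventions and factors suppressed):

```latex
% Neuberger's 0/0 problem, schematically (my paraphrase).
Integrating out the ghosts and the gauge-fixing multiplier leaves, for each
gauge field $A$, a signed sum over the Gribov copies of the gauge condition:
\[
  Z_{\mathrm{GF}}[A] \;=\; \sum_{\text{copies } i}
    \operatorname{sign}\det M_{\mathrm{FP}}\!\left(A^{(i)}\right).
\]
Perturbatively only the copy near $A = 0$ contributes and $Z_{\mathrm{GF}} = 1$,
as the Faddeev-Popov construction assumes.  Non-perturbatively, Neuberger argued
that this signed count is a topological invariant that vanishes identically, so
\[
  \langle \mathcal{O} \rangle_{\mathrm{BRST}}
    \;=\; \frac{\int dA\, e^{-S[A]}\, Z_{\mathrm{GF}}[A]\, \mathcal{O}[A]}
               {\int dA\, e^{-S[A]}\, Z_{\mathrm{GF}}[A]}
    \;=\; \frac{0}{0}\,.
\]
```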

One reaction to this question is to just give up on BRST outside of perturbation theory as something unnecessary. In lattice gauge theory computations, one doesn’t fix a gauge or need to invoke BRST. However, one can only get away with this in vector-like theories, not chiral gauge theories like the Standard Model. Non-perturbative chiral gauge theories have their own problems…

Non-perturbative Chiral Gauge Theory

From the early days of lattice gauge theory, it was apparent that chiral symmetry is problematic on the lattice. One way of seeing this is that naively there should be no chiral anomaly on the lattice. The problem was made more precise by a well-known argument of Nielsen and Ninomiya. More recently, it has become clear that one can consistently introduce chiral symmetry on the lattice, at the cost of using fermion fields that take values in an infinite dimensional space. One such construction is known as “overlap fermions”, which have the crucial property of satisfying relations first written down by Ginsparg and Wilson. This kind of construction solves the problem of dealing with the global chiral symmetry in theories like QCD, but it still leaves unsolved the problem of how to deal with a gauged chiral symmetry, such as the gauge symmetry of the Standard Model.
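For concreteness, here is the relation in question, together with the schematic form of the overlap solution (conventions for factors of the lattice spacing and the Wilson mass parameter vary between references, so treat this as a sketch):

```latex
% Ginsparg-Wilson relation and the overlap operator, schematically.
The Ginsparg-Wilson relation replaces the naive anticommutation of the lattice
Dirac operator $D$ with $\gamma_5$ by
\[
  \gamma_5 D + D \gamma_5 \;=\; a\, D \gamma_5 D ,
\]
which turns out to be enough to give an exact chiral symmetry at non-zero
lattice spacing $a$.  Neuberger's overlap operator is one solution,
\[
  D \;=\; \frac{1}{a}\left( 1 + A \left(A^\dagger A\right)^{-1/2} \right),
  \qquad A = a D_W - 1 ,
\]
with $D_W$ the Wilson-Dirac operator; roughly speaking, the nonlocal inverse
square root is the lattice remnant of the infinite number of fermion fields
in the original overlap construction.
```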

Poppitz and Shang have recently written a nice review of the problem, entitled Chiral Lattice Gauge Theories Via Mirror-Fermion Decoupling: A Mission (im)Possible? They comment about the significance of the problem as follows:

Apart from interest in physics of the Standard Model — which, at low energies, is a weakly-coupled spontaneously broken chiral gauge theory that does not obviously call for a lattice study — interest in strong chiral gauge dynamics has both intensified and abated during the past few decades. From the overview in the next Section, it should be clear that while there exist potential applications of strong chiral gauge dynamics to particle physics, at the moment it appears difficult to identify “the” chiral theory most relevant to particle physics model-building (apart from the weakly-coupled Standard Model, of course). Thus, the problem of a lattice formulation of chiral gauge theories is currently largely of theoretical interest. This may or may not change after the LHC data is understood. Regardless, we find the problem sufficiently intriguing to devote some effort to its study.

In a footnote they compare two points of view on this: Creutz who argues that the question is important since otherwise we don’t know if the Standard Model makes sense, and Kaplan who points out that if there is some complicated and un-enlightening solution to the problem, it won’t be worth the effort to implement.

You can read more about the problem in the references given in the Poppitz-Shang article.

Euclideanized Fermions

Another peculiarity of chiral theories arises when one tries to understand how they behave under Wick rotation. Non-perturbative QFT calculations are well-defined not in Minkowski space but in Euclidean space, with physical observables recovered by analytic continuation. But the behavior of spinors in Minkowski and Euclidean space is quite different, leading to a very confusing situation. Despite several attempts over the years to sort this out for myself, I remain confused, and can’t help suspecting that there is more to this than a purely technical problem. One natural mathematical setting for trying to think about this is the twistor formalism, where complexified, compactified Minkowski space is the Grassmannian of complex 2-planes in complex 4-space. The problem though is that thinking this way requires taking holomorphic quantities as the basic variables, and how this fits into the standard QFT formalism is unclear. Perhaps the current vogue for twistor methods in the study of gauge-theory amplitudes will shed some light on this.
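To state the technical point a bit more explicitly (this is my summary of how the puzzle is usually posed, not of any resolution):

```latex
% Wick rotation: straightforward for scalars, subtle for spinors.
For a scalar field, setting $t = -i\tau$ turns the oscillatory weight into a
damped one,
\[
  e^{iS_M[\phi]} \;\longrightarrow\; e^{-S_E[\phi]}, \qquad
  S_E[\phi] = \int d\tau\, d^3x \left[ \tfrac{1}{2}(\partial_\tau \phi)^2
     + \tfrac{1}{2}(\nabla \phi)^2 + V(\phi) \right],
\]
and nothing conceptually new happens.  For spinors the relevant groups change:
\[
  \mathrm{Spin}(3,1)^{0} \cong SL(2,\mathbb{C}), \qquad
  \mathrm{Spin}(4) \cong SU(2) \times SU(2).
\]
In Minkowski signature the two Weyl representations are related by complex
conjugation, so $\bar\psi$ can be built from $\psi$; in Euclidean signature the
two $SU(2)$ factors are independent, and $\psi$ and $\bar\psi$ have to be
treated as independent variables, which is what makes the continuation of
chiral theories (and the reality and positivity properties of the fermionic
action) so confusing.
```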

On the general problem of Wick rotation, about the deepest thinking that I’ve seen has been that of Graeme Segal, who deals with the issue in the 2d context in his famous manuscript “The Definition of Conformal Field Theory”. I saw recently that he’s given some talks in Europe on “Wick Rotation in Quantum Field Theory”, which makes me quite curious about what he had to say on the topic.

For some indication of why this confusion over Minkowski versus Euclidean spinors persists and doesn’t get cleared up, you can take a look at what happened recently when Jacques Distler raised it in related form on his blog here (he was asking about it in the context of the pure spinor formulation of the superstring). I’m not convinced by his claim that the thing to do is to go to Euclidean space-time variables while keeping Minkowski spinors. Neither is Lubos, and he and Jacques manage to have an argument about this that sheds more heat than light. It ends up with Lubos accusing Jacques of behaving like Peter Woit, which led to Lubos being banned from commenting on the blog. While this all is, as Jacques describes it, “teh funny”, it would be interesting to see a serious discussion of the issue. Since in some sense it is all about how one treats time, perhaps one could get FQXI funding to study this subject.

Update: Lubos Motl has immediately come up with a long posting explaining why these are all non-problems, of concern only to those like myself who are “hopeless students”, “confused by many rudimentary technicalities that prevented him from thinking about serious, genuinely physical topics.” If I would just understand AdS/CFT and Matrix theory, I would realize that gauge symmetry is an irrelevance. Few in the theoretical physics community are as far gone as Lubos, but unfortunately he’s not the only one who thinks that concern with these “technicalities” is evidence that someone just doesn’t understand the basics of the subject.

Posted in Favorite Old Posts, Uncategorized | 76 Comments

A Tear at the Edge of Creation

There’s a new book out this week by Marcelo Gleiser, entitled A Tear at the Edge of Creation. Gleiser blogs at the NPR site 13.7, and that site also has a review of the book from his fellow blogger Adam Frank.

Gleiser started out his professional life as a string theorist, enchanted by the prospect of finding a unified theory, and for many years that motivated his research:

Fifteen years ago, I would never have guessed that one day I would be writing this book. A true believer in unification, I spent my Ph.D. years, and many more, searching for a theory of Nature that reflected the belief that all is one.

Over the years he became increasingly disillusioned with this quest, not only with string theory, but also with other closely associated ideas (e.g. GUTs and supersymmetry) about how unification is supposed to happen. Hopes that GUTs would give predictive theories of inflation or proton decay have fallen by the wayside, and about supersymmetry he is “very skeptical”:

The fact that the [lightest, stable superpartner] particle has so far eluded detection doesn’t bode well. To make things worse, results from the giant Super-Kamiokande detector in Japan and the Soudan 2 detector in the United States have ruled out supersymmetric GUT models, at least the simpler ones, based again on the proton lifetime. If SUSY is a symmetry of Nature, it is very well hidden.

About string theory itself, Gleiser refers to my book and Lee Smolin’s, and comments that:

Responses from notable string theorists were of course highly critical of the books and their writers. Some were even offensive. I find this sort of dueling pointless. People should be free to research whatever they want, although they should also reflect responsibly on whether their goals are realistic.

Ultimately, Gleiser came to the point of view that all hopes for a unified theory are a misguided fantasy, and explaining this point of view is the main goal of the book. In an interview on his publisher’s web-site, he says:

After years searching for a “final” answer, as a scientist and as a person, I realized that none exists. The force of this revelation was so intense and life transforming that I felt compelled to share it with others. It completely changed the way I think about Nature and our place in it.

In his book, he argues repeatedly against the fundamental nature of symmetries in our understanding of physics, seeing the failures of GUTs and supersymmetry as a failure of the idea of getting unification out of larger, more powerful symmetry laws. For him, symmetries are always just approximations, never exactly true principles. He claims to be more interested in asymmetries, in failures of symmetry laws, seeing in asymmetry a fundamental explanatory principle about the universe and humanity’s role in it.

Personally, I find myself in strong disagreement with him about this, and don’t see much evidence in the book that the abandonment of the search for symmetries that he advocates leads to any positive route to greater understanding of the universe. I agree with him about the failure of the most popular ideas about how to use symmetry to get beyond the Standard Model, but disagree with him about the implications of this.

The problem with both GUTs and supersymmetry is that one posits a new symmetry only to be faced immediately with the question of how to break it, with no good answer. To be successful, any new symmetry principle needs to come with a compelling explanation of how it is to be realized in fundamental physics. A repeated lesson of the development of the Standard Model was that major advances came not only from identifying new symmetry groups, but from finding unexpected ways of realizing them (e.g. spontaneous symmetry breaking and confinement in gauge theory). I don’t believe that the gauge symmetries of the Standard Model are approximations, but rather that they are among our most powerful and fundamental physical principles, and that much work remains to be done to understand their full implications. The failures that have discouraged Gleiser have also discouraged many others, and a resulting abandonment of symmetry-based attempts to find a better, more unified, fundamental theory would be a shame.

Posted in Book Reviews | 36 Comments

$2 Million For The Nature of Time

FQXI has just announced that it will be awarding another series of large grants, of size $50K-$100K, totaling about $2 million. These grants will be targeted at research into “The Nature of Time.” Initial proposals are due June 14, and grants will start January 2011. For more details, see here and here.

At this point, I’m a bit curious about where the FQXI money is coming from. They started up in 2006 with a seed grant from the Templeton Foundation of about $8.8 million, and that grant was supposed to finish at the end of last year. Going forward, I haven’t seen any indication of whether they’re still operating on the seed money, or have new money from Templeton or other sources.

Posted in Uncategorized | 1 Comment

Freaky Physics Proves Parallel Universes Exist

Fox News has decided that some recent experimental atomic physics work showing that quantum mechanics works as expected (for a sane discussion of the science, see here) proves that parallel universes exist and that time travel may be feasible. In an article entitled Freaky Physics Proves Parallel Universes Exist, Fox News writer John Brandon develops this idea with help from Sean Carroll and Fred Alan Wolf (aka Dr. Quantum):

The multi-verse theory says the entire universe “freezes” during observation, and we see only one reality. You see a soccer ball flying through the air, but maybe in a second universe the ball has dropped already. Or you were looking the other way. Or they don’t even play soccer over there.

Sean Carroll, a physicist at the California Institute of Technology and a popular author, accepts the scientific basis for the multi-verse — even if it cannot be proven.

“Unless you can imagine some super-advanced alien civilization that has figured this out, we aren’t affected by the possible existence of other universes,” Carroll said. But he does think “someone could devise a machine that lets one universe communicate with another.”

It all comes down to how we understand time.

Carroll suggests that we don’t exactly feel time — we perceive its passing. For example, time moves fast on a rollercoaster and very slowly during a dull college lecture. It races when you’re late for work . . . but the last few minutes before quitting time seem like hours.

Back to the Future

“Time seems to be a one-way street that runs from the past to the present,” says Fred Alan Wolf, a.k.a. Dr. Quantum, a physicist and author. “But take into consideration theories that look at the level of quantum fields … particles that travel both forward and backward in time. If we leave out the forward-and-backwards-in-time part, we miss out on some of the physics.”

Wolf says that time — at least in quantum mechanics — doesn’t move straight like an arrow. It zig-zags, and he thinks it may be possible to build a machine that lets you bend time.

Update: Matt Springer has a more detailed analysis of the Fox News article entitled The Worst Physics Article Ever. For some reason his critique skips over the multiverse part, implying that that’s the one part of the article that makes sense…

Posted in Multiverse Mania | 5 Comments

Solar

Ian McEwan’s new novel Solar is now out, with a plot featuring Michael Beard, an aging theoretical physicist. Beard won a Nobel Prize early in his career for the “Beard-Einstein Conflation”, which supposedly involves some unexpected coherent behavior in QED, based on the discovery of a structure in Feynman diagrams that involves E8. An appendix of the novel reproduces the Nobel presentation speech; evidently it’s the work of physicist Graeme Mitchison (see here).

As the novel opens in 2000, Beard is in his fifties, having spent quite a while coasting on his Nobel Prize. He’s director of an alternative energy research center, and one of the activities there employs postdoc theorists to evaluate unconventional ideas that are sent in, largely by cranks:

Some of these men were truly clever but were required by their extravagant ambitions to reinvent the wheel, and then, one hundred and twenty years after Nikola Tesla, the induction motor, and then read inexpertly and far too hopefully into quantum field theory to find their esoteric fuel right under their noses, in the voids of the empty air of their sheds or spare bedrooms – zero-point energy.

Quantum Mechanics. What a repository, a dump, of human aspiration it was, the borderland where mathematical rigor defeated common sense, and reason and fantasy irrationally merged. Here the mystically inclined could find whatever they required and claim science as their proof.

The postdocs have little interest in the old history of Beard’s discovery about QED, but instead baffle him by making “elliptical references to BLG or some overwrought arcana in M-theory or Nambu-Lie 3 algebra.” McEwan seems to have gotten this from Mike Duff, who is thanked in the acknowledgments. There’s the obvious problem though that Bagger-Lambert-Gustavsson, a hot topic while the book was being written, dates from 2007, so is anachronistic as a topic of discussion in 2000. Beard’s reaction to what the postdocs work on is:

Some of the physics that they took for granted was unfamiliar to him. When he looked it up at home, he was irritated by the length and complexity of the calculations. He liked to think that he was an old hand and knew his way around string theory and its major variants. But these days there were simply too many add-ons and modifications. When Beard was a twelve-year-old schoolboy, his math teacher told the class that whenever they found an exam question coming out at eleven nineteenths or thirteen twenty-sevenths, they should know that they had the wrong answer. Too messy to be true. Frowning for two hours at a stretch, so that the following morning parallel pink lines were still visible across his forehead, he read up on the latest, on Bagger, Lambert and Gustavsson – of course! BLG was not a sandwich – and their Lagrangian description of coincident M2-branes. God may or may not have played dice, but surely He was nowhere near this clever, or such a show-off. The material world simply could not be so complicated.

To some extent, Solar is an entertaining comic novel of physicists and the alternative energy research business. The dominant theme though is a topic not all will find interesting: Beard’s personal life, which involves five failed marriages over the years. At the end of the book, Beard is in his sixties, a fat, unpleasant slob, with two younger women fighting over the possibility of being number six on the list.

Posted in Book Reviews | 9 Comments

Short Items

Planning on getting back to writing some longer postings, but for today, here’s a collection of quick news and links:

  • I hear from number theorists that Princeton’s Manjul Bhargava has some breakthrough results on the ranks of elliptic curves. I was out of town and missed his colloquium talk here; reports were that it was quite impressive. Here’s the main result, from the talk abstract:

    There is a standard conjecture, originating in work of Goldfeld, that states that the average rank of all elliptic curves should be 1/2; however, it has not previously been known that the average rank is even finite!  In this lecture, we describe recent work that shows that the average rank is finite (in fact, we show that the average rank is bounded by 1.5).

  • There’s an intriguing new paper out from Frenkel, Langlands and Ngo, describing some tentative new ideas about how to prove functoriality using the trace formula. I gather that this combines ideas from Ngo’s proof of the fundamental lemma, ideas of Langlands about “Beyond Endoscopy”, and ideas originating in the geometric Langlands program. The paper is clearly largely written by Langlands (one hint is that it’s in French, be grateful it’s not in Turkish…).
  • Besides this new work, Ed Frenkel also has a new film coming out, entitled Rites d’Amour et de Maths. Here’s the plot summary from IMDB:

    Is there a mathematical formula for love without death? The film ‘Rites of Love and Math’ is a sprawling allegory about Truth and Beauty, Love and Death, Mathematics and Tattoo, set on the stage of Japanese Noh theater. About the directors: Edward Frenkel is Professor of Mathematics at University of California at Berkeley and one of the leading mathematical physicists in the world. Reine Graves is a talented French filmmaker who has directed a number of original and controversial films that have won prestigious awards. Having met in Paris, Frenkel and Graves decided to create a film showing the beauty of mathematics. But how to do this without getting bogged down in technical details of the subject that could scare away non-specialists? Looking for the right metaphor, they came across the idea of making the tattoo of a mathematical formula. What better way to show the beauty of the formula than by letting it merge – literally – with beautiful female body! They found the aesthetic language for expressing this allegory in the enigmatic film ‘Rites of Love and Death’ (a.k.a. ‘Patriotism’) by the great Japanese writer Yukio Mishima, which had a very unusual and mysterious history of its own (banned for over 40 years, it came out on DVD in the Criterion Collection in 2008). The exquisite imagery of Mishima’s film and the original idea of Frenkel and Graves have led to the creation of ‘Rites of Love and Math.’

  • Harvard finally has a female tenured math professor: Sophie Morel.
  • This week’s Science Magazine has an article about Sabine Hossenfelder’s work (also see her blog posting here) purporting to show that you can’t get linear terms in deformed Special Relativity, making deviations from standard Special Relativity unobservably small. Personally, this is the sort of thing I don’t know enough about to offer an informed judgment on, but I’m curious to hear what experts think.
  • Nature has an article about social scientists studying the LHC project.
  • The KITP is now running a program on Strings at the LHC and in the Early Universe, which is a bit odd, since string theory predicts nothing at all about either topic. They’ve had promotional Blackboard Lunch talks by Cvetic and Brandenberger claiming otherwise (Brandenberger’s title was “Testing String theory with Cosmological Observations”). Taking a look at them, I don’t see anything at all that corresponds to a “test of string theory”.
  • Update: One more. See here for Jester’s summary of what particle theory came up with during the noughties, which has to have been the most depressing decade for the subject in a very, very long time.

Update: There are two new papers (see here and here) on the arXiv this evening that address Sabine Hossenfelder’s arguments about DSR (she also has a new paper summarizing her argument, here). In one of these, Lee Smolin argues that, at least in some cases, the paradoxes pointed out by Hossenfelder can be eliminated if one studies wave-packet propagation instead of classical propagation.

Update: There’s another unusual paper on the arXiv this evening, by Longo and Witten, entitled An Algebraic Construction of Boundary Quantum Field Theory. It’s an algebraic QFT paper, written in a rigorous mathematical style, quite out of character with typical papers from Witten.

Posted in Uncategorized | 25 Comments

First High Energy Collisions at the LHC

The current schedule is for first 7 TeV center-of-mass collisions tomorrow (Tuesday) at 9:17 am Geneva time. Injection of the beams will take place after 2 am, with the ramp up to 3.5 TeV/beam from 3-4 am. For more details of what has been going on recently at the LHC, see here. The schedule for tomorrow is here, and a link to the planned webcast is here.

CMS e-commentary is here, ATLAS control room blog here.

Update: I just woke up, a couple of minutes after first collisions were observed at 12:57 Geneva time. Collisions are going on now.

Posted in Experimental HEP News | 14 Comments

More Prizes

As far as I can tell, it’s still unclear whether Perelman will accept the $1 million Millennium Prize awarded to him last week. This week brings news of two more million-dollar prizes:

  • John Tate is this year’s winner of the Abel Prize, worth 6 million Norwegian Kroner, which is a bit more than $1 million. Tate is now 85, recently retired from UT Austin (he spent much of his career at Harvard). He is a major figure in the development of algebraic geometry and number theory during the second half of the last century. His Princeton Ph. D. thesis, which pioneered the use of Fourier analysis on the adele group in the study of number theory, could easily be the most widely read and used doctoral thesis in mathematics.
  • The 2010 Templeton Prize, worth 1 million British pounds, or about $1.5 million, was awarded to biologist Francisco Ayala. Remarkably, the Templeton Foundation describes the prize-winner as someone who has “vigorously opposed the entanglement of science and religion”. I had thought that the main goal of the Templeton Foundation WAS “the entanglement of science and religion”, so this is a bit surprising. Ayala has done admirable work over the years refuting creationism and intelligent design.

There was a bit of a kerfuffle over the fact that the announcement was made at the National Academy of Sciences (Ayala is a member), with Sean Carroll quoted as saying:

Templeton has a fairly overt agenda that some scientists are comfortable with, but very many are not. In my opinion, for a prestigious scientific organization to work with them sends the wrong message.

Science magazine has an article here about the award and about what some scientists think of Templeton’s activities, including the following:

Even those who are put off by Templeton’s mission agree that the foundation does not attempt to influence the outcomes of the research and discussions it sponsors. “I am not enthusiastic about the message they seem to be selling to the public—that science and religion are not incompatible; I think there is real tension between the two,” says Steven Weinberg, a Nobel Prize–winning physicist at the University of Texas, Austin, who has been an outspoken critic of religion. “But for an organization with a message, they are pretty good at not being intrusive in the activities they fund. I don’t wish them well, but I don’t think they are particularly insidious or dangerous.”

Posted in Uncategorized | 10 Comments

High Energy Beams at the LHC

At 5:23 am in Geneva this morning, for the first time the two LHC beams were ramped up to high energy, the 3.5 TeV/beam that they plan to run at for the next two years. These are the highest energy (per particle) beams ever created by human beings, significantly surpassing the value at which the Tevatron operates (0.98 TeV/beam) as well as the record achieved last fall (1.18 TeV/beam) during the early stages of beam commissioning.

From now on, work will continue on preparing the machine to operate at higher intensity (for now they are using low-intensity pilot beams). For the next week or two, one of the challenges will be to carefully avoid any interesting collisions between particles in the two beams, since a major media event is being organized around the first collisions, tentatively scheduled for March 30.

Update: The CERN press release is here.

Update: CERN has confirmed in a press release that first collisions will be attempted on March 30.

Posted in Experimental HEP News | 10 Comments