Taking stock at the LHCP conference

I felt like I was returning home as I walked through the gates of Columbia University at 116th Street and Broadway, the day before the LHCP conference began. The scaffolding from the recently completed graduation ceremonies reminded me of my own PhD graduation thirteen years ago. The ubiquitous Columbia-blue signs of “Welcome back Alumni” seemed to be talking just to me. There was some nostalgia for what has changed, most notably the replacement of the tennis courts next to the large brick physics building with an even larger, modern glass building.

Prof. Mike Tuts of Columbia University welcoming participants to LHCP 2014.

Jet lag from the flight from England, my current home, had me awake at 4 am on the first morning of the conference. Anticipation — because it was going to be my first conference since the boson-discovery conference in Melbourne in 2012 — would not let me return to sleep. LHCP’s kick-off, given by the conference chair and my own former PhD supervisor Professor Mike Tuts, reminded me of a few of the things to look forward to: a public showing of the new Particle Fever movie on Wednesday night and a panel discussion moderated by New York Times’ science writer Dennis Overbye on Friday afternoon.

The conference comes during an early transition for the LHC experiments. While the experimentalists are finalising many measurements from the first set of collisions completed in 2013, they are making significant preparations for the next set of collisions scheduled to begin early next year. The scientific discussions of the sometimes mundane details of the first measurements are sprinkled with giddy soothsaying for what we might discover in the coming years, and how. Continuing into the coffee breaks, the tangible excitement in the ensuing conversations is a highlight of the conference.

My own presentation on measurements of multiple weak-boson production from the ATLAS experiment came on the second day. Jet-lag had not been kind, allowing me a mere three-and-a-half hours’ sleep the night before, and my only hope for a coherent presentation was to keep a steady stream of coffee pulsing through my veins. This worked only too well — I sped through the results at twice my planned speed, leaving the session chair to comment, “Well, we have plenty of time for questions…”

The morning sessions of the third day focused on what the newly-discovered Higgs boson could be telling us about what lies beyond. The afternoon was open, a break at the midpoint of the conference. For me this meant several hours of catching up on meetings and email. But there was a reward, an early dinner consisting of three things that are hard to find in the UK: fried chicken, a Caesar salad, and a Brooklyn beer I’d never heard of. Then it was off to the showing of Particle Fever, where I had volunteered to answer any and all questions the public moviegoers had about physics.

Volunteers answering questions of the public before attending the Particle Fever screening at LHCP 2014.

I found a place next to a poster of the Standard Model and described the particles and interactions as best I could to the small crowd that formed around me. At the end, one of the listeners told me she was involved in the development of the poster — from the presentational side — and she now had a better understanding of what it represented. It’s always nice when someone lets you know you have done a good job.

More than 1000 attendees were packed into the big conference hall on the southwest corner of the Columbia campus to watch ‘Particle Fever’. The movie tells the story of the Higgs boson discovery, focusing on a few individuals who convey the excitement and activity within and outside the big experiments that took the data leading to the discovery. Afterwards the movie’s director and three of its stars — David Kaplan, Nima Arkani-Hamed, and Fabiola Gianotti — answered many questions from the audience. I was impressed by the depth of the public’s questions, cutting to many of the difficult issues that physicists continue to try to answer through their research. It is reassuring that the public finds many of the same questions interesting as we physicists do. This research is truly a universal human endeavour, and this is one of the core themes of the movie.

Another day of scientific results passed and the panel discussion came up after lunch. This focused on the major accelerators the field will need for the next big discoveries over the years and decades to come. While I will have retired before the next big accelerator produces data, I have a responsibility to ensure that the next generation of physicists have the tools to answer the questions that my generation has yet to even ask. The next discovery will lead to even more profound questions than the last — this is the excitement of research, and it will continue well beyond the discovery of the Higgs boson, the latest important milestone on the path to understanding the workings of the universe.


Chris Hays is a Research Lecturer at Oxford University focusing on Higgs boson measurements at ATLAS. He also works on the precise measurement of the W boson mass, which provided an expected mass range for the Higgs boson prior to its discovery. Chris is currently serving as ATLAS UK physics coordinator.

LHCPlanning for the future

As someone who comes from a small mountain town, for many years I’ve linked the word ‘summer’ to ‘seaside’ and ‘sun’. During my experience as a physicist working in ATLAS, I found myself associating the word ‘summer’ with the word ‘conferences’ more often than with the two above. Physicists work hard to meet review deadlines so that their result is made public before the start of the conference, often postponing seaside and sun. The reward is being able to present the work to an international audience: summer conferences are the showcase of ATLAS results obtained throughout the year.

Even though in my case the results are still in the works, I was invited to chair the Physics Beyond the Standard Model sessions at the LHCP (Large Hadron Collider Physics) conference at Columbia University in New York. LHCP is a new addition to the summer conference calendar and is already a well-established appointment, even though it is only in its second edition. LHCP combines two of the previous summer conferences, HCP and PLHC, allowing physicists to economise on acronyms and travel.

During our week at LHCP, the sea was a few kilometres away so it didn’t feature, but we had plenty of sun and plenty of results. Beyond the new ATLAS results on display (as detailed in Kate Shaw’s post), the conference featured an interesting debate on the outlook of LHC physics for the coming years. The discussion between the six panelists (Natalie Roe, Steve Ritz, Hitoshi Murayama, Jerry Blazey, Sergio Bertolucci, Nima Arkani-Hamed) was moderated by NYT journalist Dennis Overbye, and was centred on the perspectives for physics at the LHC and beyond in the next decades.

Panel chaired by Dennis Overbye discussing the report of the Particle Physics Project Prioritization Panel (P5).

The immediate, practical question that comes to mind is: why should we start thinking about the future so much in advance? We already have enough to do in ATLAS between completing 8 TeV searches and analyses and preparing for the upcoming 13 TeV run!

Past experience teaches us that planning for accelerators and experiments that require global collaborative efforts needs to start well in advance of the start of operations. The first steps for the Large Hadron Collider were taken more than 20 years before the start of LHC data taking. Even though the concrete plan will certainly be driven by the results obtained at the 13 TeV LHC, now is the time to start thinking about a global strategy that calls into action many countries around the world. Something that we scientists often take for granted is how well science works in terms of collaboration between countries that normally aren’t that fond of each other. Everything seems so effortless when discussing physics problems! Policymakers aren’t as bright-eyed though, and if we want worldwide collaboration we need a robust framework in terms of international relations.

There are still many open questions in terms of targeted physics planning. As Fabiola Gianotti said, one of the most important questions is: ‘At which energy scale will we find the answers to the shortcomings of the Standard Model?’

John Ellis’s concluding theory talk at LHCP

Many of us still strongly suspect that the energy scale of the LHC gives us a very good starting point to look for answers (see the post by Zach Marshall about the supersymmetric particles that could be found at the LHC).

However, many of us also know that nature has not yet been so kind as to show us the signatures of new phenomena that easily, and might not do so even in the upcoming LHC run. In the BSM sessions that I chaired, there was no claim of a new physics discovery yet. So we should plan our current and upcoming searches to welcome unexpected and rare processes, as they might help fill the gaps in our understanding of nature.

Theorist Markus Luty’s conclusions at the LHCP conference

Fabiola’s talk also highlighted that our recent successes might lead the way to new physics. The first LHC run pointed us to a new particle that is different from any other particle we know: the Higgs boson. The Higgs boson could be the particle connecting the known Standard Model world to discoveries beyond our current understanding. We must study the properties of the Higgs boson in detail; the choice of whether to do so at a linear accelerator or a large circular one will fuel the debates of the next few years.

Fabiola Gianotti’s slide at the LHCP conference: Enrico Fermi’s extrapolations on future technologies, 1954->1994

Another message from the discussion was that even though resources are limited, we don’t want to limit our ambitions. Enrico Fermi was quoted as an example: he mentioned TeV-scale colliders at a time when the technology was still science fiction.

If our vision of particle physics is one of a world-wide coherent research field, collaboration will help us think of the new technologies needed to make future particle physics research facilities happen. Our efforts should also be targeted towards making those technologies as affordable as possible, since budgets are and always will be limited (and the LHC expenditure needs to be put into perspective). But, as Nima Arkani-Hamed also pointed out, we should keep in mind that science is an investment and the pay-off (for everyone, not just for us scientists) is years from now. The questions we’re pursuing now shape our culture and our world, now and in the years to come, so let’s keep planning in order to answer those questions.


Caterina Doglioni is a post-doctoral researcher in the ATLAS group of the University of Geneva. She got her taste for calorimeters with the Rome Sapienza group in the commissioning of the ECAL at the CMS experiment during her Master’s thesis. She continued her PhD work with the University of Oxford and moved to hadronic calorimeters: she worked on calibrating and measuring hadronic jets with the first ATLAS data. She is still using jets to search for new physics phenomena, while thinking about calorimeters at a new future hadron collider.

Notes from Underground: IBL vs Brazil Championship

More from our Notes from Underground blog series by ATLAS members preparing to explore new worlds that higher energy collisions will reveal in the LHC’s next run

Previously in Notes from Underground, Dave Robinson wrote in some detail about the work going on inside the ATLAS detector, and Clara Nellist wrote about the inner detector of ATLAS, discussing the different types of detection units, or sensors (planar and 3D). I will continue to delve into the exciting world of the inner detector with its brand new Insertable B-Layer (IBL) and its related parts.

Next year the LHC will start running again at 13 TeV, almost double the previous energy, and the protons will be collided every 25 nanoseconds, twice as often as in 2012. ATLAS therefore needed a new detector layer nearer to the collision point to help reconstruct the debris of each collision. The ATLAS detector is big (46m long, 25m in diameter), and at first it was difficult to believe there would be extra space available for a new detector, but in fact a reduction of the diameter of the beam pipe itself was proposed. The IBL was inserted into ATLAS just last month, and it was an important and unique goal in that game. Next week, during the Football World Cup in Brazil, we will see how 11 players can easily insert two or three balls into the goals within a few tens of minutes, but the insertion of the IBL was indeed a more difficult task.

Ahmed inside the pit, working on the pixel services in January 2014

The IBL championship began several years ago, when the decision was made to insert a fourth pixel layer around a new, reduced-diameter beam pipe, to improve the tracking system and to compensate for irreparable failures in the other layers. After a lot of work and technical support from many captains playing in the ATLAS club, the IBL has been installed between the existing pixel system and a new smaller-radius beam pipe, at a radius of just 3.3 cm. To cope with the high radiation and pixel occupancy due to the proximity to the interaction point, a new read-out chip, a newer version of the planar pixel sensors, and a completely new design called 3D silicon were invited from all over the world to help in this championship. Moreover, a lot of work has been done to improve the physics performance of the detector and make it more efficient.

The IBL in the clean room in SR1, ready to be lowered into the pit of the ATLAS cavern.

The IBL is made of 14 staves. A stave is simply the structure that holds the pixel modules, which are the main players in the detection process. There are two types of modules on each stave: 3D modules on the ends, which Clara discussed last week in her blog, and planar modules covering the central part of the stave.

Just as detailed health tests are needed to make sure only the fittest players are accepted into the main team, detailed tests and selection rules are used to choose the best and highest-quality modules, electronics, services, cables and staves. I worked on this analysis to identify the highest quality modules and staves to be used to build the IBL.

The IBL being lowered into the pit of the ATLAS cavern in May 2014.

The IBL was finally assembled, tested and, last month, successfully installed inside the inner detector. Soon cosmic-ray testing will begin for detector commissioning and calibration.

To finish this comparison with the World Cup in Brazil: the football teams train in 30-degree heat, wearing their football kit on muddy fields outside, while the IBL team prefers to do their work inside chilly clean rooms, wearing lab coats.


Ahmed Bassalat is a Palestinian PhD student at LAL, France, working on the IBL and planar pixel sensors for ATLAS, and on the VBF H->invisible analysis channel. He joined Paris-Sud 11 University in France after getting his Bachelor’s degree from An-Najah National University in Nablus, Palestine. Ahmed is working to get Palestinian students and universities more involved in High Energy Physics.

Notes from Underground: Pixel Prototypes

More from our Notes from Underground blog series by ATLAS members preparing to explore new worlds that higher energy collisions will reveal in the LHC’s next run

In last week’s post for this Notes from Underground series, David talked about the work that goes on in the ATLAS pit. I’m going to take a step back and talk about what happens before a detector is installed. Although the work I want to tell you about didn’t technically take place underground, much of it was performed in what is essentially a large airport hangar without natural light, so it certainly feels like you’re 100m down!

Setting up the equipment for an experiment. Photo credit J. Hasi.

My research is focused on the Pixel detector, which lies at the very heart of ATLAS, closest to the point where protons are smashed together (the Interaction Point).

The purpose of the pixel detector is to track charged particles as they travel outwards from the interaction point, allowing us to make measurements of their electrical charge and mass. One method is to see which way they bend in the magnetic field that surrounds this part of the detector, which helps us to identify the particles. By following these tracks back towards the interaction point, we can work out when one of the particles was a beauty quark (or b-quark). We can tell this because, once the b-quark has been created, it travels a few millimetres before turning into a different particle. Our detector is accurate to tens of micrometres (hundredths of a millimetre), and so we can see when this has happened. Finding out when a b-quark has been made is a very useful piece of information for many physics analyses.
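
To get a feel for the numbers, here is an illustrative back-of-the-envelope estimate (my own sketch, not taken from the post): the b-quark gets bound into a B hadron whose decay length at rest is roughly half a millimetre, and the Lorentz boost it gets in a collision stretches that to a few millimetres.

```python
# Rough estimate of how far a b-hadron flies before decaying.
# Illustrative numbers only: decay length at rest ~0.5 mm, momenta of a few tens of GeV.
c_tau_mm = 0.5          # typical B-hadron decay length at rest, in mm
mass_gev = 5.3          # approximate B-hadron mass in GeV

for momentum_gev in (10.0, 25.0, 50.0):
    boost = momentum_gev / mass_gev        # relativistic gamma*beta factor
    flight_mm = boost * c_tau_mm
    print(f"p = {momentum_gev:4.0f} GeV  ->  flies about {flight_mm:.1f} mm before decaying")
# A few millimetres: comfortably resolvable with a tracking precision of ~0.01 mm.
```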

One problem is that every time the LHC collides bunches of protons together (40 million times a second as David said last week), it sprays the ATLAS detector with new particles and our pixel detector gets a bit damaged from all the radiation! Imagine you have a row of ducks at a fairground stall and someone’s throwing balls at them; if you hit one, the duck gets moved a bit, or even knocked over. This is what happens when particles (the balls) are travelling through our detector, which is made of a three-dimensional grid of silicon atoms (the ducks). When the atoms are displaced, electrons moving around inside can get trapped (and later released) and this means our measurements are not as good (or can’t happen at all). To be honest, the duck analogy was probably a bit strained here. The main idea is that the detectors get damaged by the radiation, and after a while we have to replace them.

For this current shut-down of the LHC, there wasn’t enough time for us to completely replace the whole pixel detector, so we decided to add an extra layer and insert it between a new beam-pipe and the current inner-most layer. We called it the Insertable B-Layer. This layer had to be faster, last longer and take a sharper ‘image’ of the particles passing through it. Consequently, new pixel sensors had to be designed for this layer, since it was going to be even closer to the interaction point and the energy of the protons in the LHC was going to be increased.

When a prototype of a new design has been made, we take it to CERN (or another particle accelerator) and place the prototype into the particle beam there. This is where the large airport hangar comes in. At CERN, the particle beams from some of the pre-accelerators for the LHC can be diverted (when they’re not busy feeding the LHC) to special experimental halls where we can do these sorts of experiments. Even when the accelerators at CERN are off, we can travel to another particle accelerator for these tests.

The crew for an experiment testing prototype pixel detectors at CERN. Photo credit J. Hasi.

During the experiment we have to make the most of the time available, so data is taken 24 hours a day! Fortunately shifts are split into three sets of eight hours, so it’s not so bad. Thankfully, we always work in pairs, so there’s always someone to talk to at 4 am when the data taking is stable (meaning there’s nothing for us to do until we have to change the way our experiment is set up). I’ve watched some terrible films at this time in the shift, because I don’t trust myself to be awake enough to make useful additions to my analysis code. From these experiments, two types of pixel detectors were chosen to go into the IBL: a newer version of the design already in ATLAS, called planar pixel sensors, and a completely new design called 3D silicon.

The next stage was to make enough of these sensors (with some left over in case any get broken in the process), to build this new detector layer and install it. But I’ll leave that to Ahmed for his post next week!


Clara Nellist is a British post-doc at LAL, France, working on planar pixel sensors for future upgrades of the ATLAS detector and on the H->tautau analysis channel. She did her PhD at Manchester University studying 3D silicon pixel detectors for the IBL upgrade, and her master’s degree in top physics at the D0 experiment at Fermilab. Clara is also active in science communication, with an aim to encourage more young women to study physics.

Notes from Underground: Servicing Silicon

Launching our Notes from Underground blog series by ATLAS members preparing to explore new worlds that higher energy collisions will reveal in the LHC’s next run

Engineers deep inside the ATLAS detector. Their location is a few metres directly below the usual point of collisions, and several metres above the cavern floor.

We physicists refer to the vast underground cavern that houses the ATLAS experiment as ‘the pit’. That may be a strange term to use for a marvel of civil, mechanical and electrical engineering, but nonetheless there are parallels to what you might imagine a ‘pit’ to be. Working inside the ATLAS detector in the pit can be dark, sometimes hot and not suited to those with claustrophobia. It often involves climbing several sets of makeshift steps and gantries and crawling flat on your stomach through narrow gaps to get to the part of the detector where you need to be. You will be wearing a safety helmet with mounted lamp, steel toe-cap shoes, one or more dosimeters to monitor radiation exposure and even a harness, if working at heights. Not to mention tools, laptop and any equipment you need to do your job. You tend to recognize the experimental physicists, engineers and technicians who have just come up from the pit – they stand blinking in the sunlight with a tired and rather sweaty appearance.

Getting authorization to work in the pit is no easy ride either. First you need a medical. Then there are safety courses to follow (with tests to pass). You must request access to the various ‘zones’ within the pit. You make a work request to detail the work, its duration, the location, and the number and names of people working with you. And then you fill out a risk assessment. All three of those formalities require approval by safety officers, site managers and project leaders. When that’s done, then finally you can put on your helmet, dosimeter, boots, and use your personal CERN badge (with a chip to identify you) to enter the different access zones, backed up by an iris scan to make sure it’s really you (the access control systems are electronically linked to the approval processes mentioned above). It sounds like a lot of hassle but after the initial shock you tend to take it in stride.

I’ve already mentioned that ‘the pit’ is an engineering marvel. The ATLAS detector is also a marvel of experimental physics. The sheer scale of the technology down there never fails to impress, even if you work there often. You can read the mind-boggling facts about ATLAS in this fact sheet. But the scale is only part of it – the really impressive stuff is the appreciation of what the numerous ‘sub-detectors’ that comprise ATLAS are made of and how they function.

I am the Project Leader of one such sub-detector – the SemiConductor Tracker (SCT) – which is centred around the proton-proton collision point right in the heart of the experiment. The SCT is about 6m long and 1.5m in diameter. Its detecting ‘element’ is a ~6×6 cm silicon sensor with several hundred micro-strips implanted on its surface. A charged particle passing through the silicon generates electron-hole pairs in its bulk through ionization, and the holes drift towards the micro-strips where they form ‘blips’ of excess charge. We measure that charge and, because the micro-strips are microscopic (the clue is in the name), we can tell with very high precision exactly where the particle passed through the silicon. And here’s the thing. There are more than 16000 such silicon sensors in the SCT, together comprising about 6 million micro-strips, and we measure the charge on every single one. Our 60 square metres of silicon allow us to measure the trajectories (or ‘tracks’) of the thousands of particles that are generated by each proton-proton collision, and to measure each track with a precision of microns (millionths of a metre). And it does this small task 40 million times every second, which happens to be the rate at which protons collide head-on in the centre of ATLAS.
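
Just for fun, a quick back-of-the-envelope check (a sketch using only the approximate figures quoted above) shows how those numbers hang together:

```python
# Back-of-the-envelope check of the SCT numbers quoted above
# (approximate values taken from the text, so treat the outputs as rough).
n_sensors = 16_000                 # "more than 16000 such silicon sensors"
sensor_area_cm2 = 6 * 6            # each sensor is roughly 6 cm x 6 cm
total_area_m2 = n_sensors * sensor_area_cm2 / 1e4
print(f"total silicon area: ~{total_area_m2:.0f} m^2")     # ~58 m^2, i.e. the "60 square metres"

n_strips = 6_000_000               # "about 6 million micro-strips"
readout_rate_hz = 40e6             # charge measured 40 million times every second
print(f"charge measurements per second: ~{n_strips * readout_rate_hz:.1e}")   # ~2.4e+14
```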

The LHC beam operations are on pause for two years, which is why we can work directly on the ATLAS detector in the pit (radiation levels would prevent us from entering the cavern otherwise). Even though we have stopped taking data, there is still plenty to do to safeguard this remarkable detector and prepare it for more data-taking from 2015, which is why I am often in the pit.

The SCT itself remains inaccessible, concealed within other sub-detectors that surround the collision point, like the layers of an onion (as Shrek once said, it’s complex). But the tens of thousands of cables, optical fibres, and cooling circuits that service the SCT are partially exposed. And the SCT is just one of many such sub-detectors in ATLAS, each with their own services. A tiny mistake – something as small as a washer or misplaced screw – could provoke an electrical short, and the electronic noise arising from that short could prevent us from measuring the tiny amount of charge on the micro-strips. Elaborate detection systems are in place to spot such mistakes instantly. We also have to be vigilant about the environment around the silicon sensors. During collisions, the SCT is operated cold (-7°C) to minimize the rate of radiation damage to the silicon, so the sensors must be kept very dry (in a nitrogen atmosphere) to prevent condensation or frost, which could destroy the millions of delicate connections to the silicon.

We also have to prepare for data taking again from 2015. When I mentioned earlier that protons collide head-on at the rate of 40 million per second, I neglected to mention that it’s not one proton, but billions. When these billions collide, and are sufficiently focused at the point of collision, chances are that there are many simultaneous collisions from the quarks and gluons from multiple protons. So the tracks we measure from what looks like a massive collision are in fact the superposition of multiple (massive) collisions. The LHC has become much better at this than we originally foresaw – we need to be able to extract even more massive amounts of data, in some cases beyond the existing capabilities of the detector readout systems. This has required significant upgrades to those systems this year.

I’ve touched on just a few of the activities currently underway in the pit but, believe me, this is just the tip of the iceberg. There will be further notes from underground in the coming weeks that will describe more of the work going on right now. Working on ATLAS in the pit raises enormous challenges – technical, scientific and even physical. But the rewards are enormous too, meeting these challenges together with skilled and motivated teams of truly international engineers and physicists. Right now, I wouldn’t want to work anywhere else.


Dr Dave Robinson is a Senior Research Physicist at the Cavendish Laboratory, Cambridge University, and at CERN. Since March 2013 he has been the Project Leader of the ATLAS Semiconductor Tracker, and Project Leader of the ATLAS Inner Detector. Among other things, he has worked on triggering, data acquisition and silicon detector design and development for the UA1, OPAL and ATLAS experiments at CERN.

Unread Section Opened in the Standard Model Book

While others are worrying that new physics might be running out of corners (see Eve Le Ménédeu’s blog) we should not forget that even within the book of the Standard Model there are completely unread chapters. The Standard Model draws its success from the fascinating fact that its basic energy density formula, called the Lagrangian, is uniquely defined by just specifying three fundamental symmetries. It not only fits on John Ellis’s t-shirt (see blog by Jessica Levêque) but even on a mug, as in Figure 1. Introducing a spin-zero Brout-Englert-Higgs field by adding the last two lines on the mug allows for a symmetry-breaking ground state, gives particles their mass, and gives us the chance to live on earth and investigate all this. Each term on the mug corresponds to a chapter in the Book of the Standard Model, describing a certain class of processes via vertices in Feynman diagrams.

Figure 1. The Chapters of the Standard Model Book, one per line. Image courtesy of Quantum Diaries
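
For orientation, the compact form printed on the mug and the t-shirt reads schematically as follows (a simplified rendering; conventions and normalisation factors differ between versions):

$$
\mathcal{L} \;=\; -\tfrac{1}{4}F_{\mu\nu}F^{\mu\nu}
\;+\; i\,\bar{\psi}\gamma^{\mu}D_{\mu}\psi
\;+\; \bar{\psi}_{i}\,y_{ij}\,\psi_{j}\,\phi + \mathrm{h.c.}
\;+\; \left|D_{\mu}\phi\right|^{2} \;-\; V(\phi)
$$

Reading it line by line: the first line contains the gauge fields and their self-interactions (Chapter 1), the second the interactions of gauge and matter particles (Chapter 2), the third the Higgs couplings to fermions (Chapter 3), and the fourth the Higgs couplings to the gauge bosons together with the Higgs potential and self-coupling (Chapter 4).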

For thousands of years mankind has been reading Chapter 2 (second line), describing the interaction of ‘Gauge particles’ like photons, which mediate forces, with ‘matter particles’ like electrons, laying the basis for forming atoms, molecules and matter.

Chapter 3 (third line) and Chapter 4a (4th line left), with the Higgs couplings to fermions, like top and tau, and to bosons, like W and Z, were opened only in 2012 at CERN’s LHC, and currently thousands of scientists are reading them with increasing passion and excitement. For the right-hand part of line 4, the Higgs self-coupling chapter, we will have to wait for the next generation of accelerators.

But what about Chapter 1 in the first line? Its lowest-order content, the free propagation of photons, is not even depicted in the figure. Classically, this free propagation was described as electromagnetic waves, predicted and found by Maxwell and Hertz in the second half of the 19th century. Depicted near the mug are self-interactions, which only exist for those gauge particles that themselves carry the charge of the interaction, i.e. for the strong gluons and the weak W and Z bosons. The existence and predicted strengths of all so-called ‘Triple Gauge Couplings’ (TGC) of three gauge particles have been proven at LEP (1992 for gluons and 1997 for W and Z bosons). However, the so-called ‘Quartic Gauge Coupling’ (QGC) of four gauge particles – for gluons at least indirectly seen – had never been part of any measured process involving W and Z bosons. It thus remained a completely unread section in Chapter 1 of the Book of the Standard Model — until this week!

Last Thursday, the ATLAS collaboration at CERN announced the first observation of a process which involves the quartic gauge coupling: the scattering WW → WW of two W bosons with the same electric charge. In two simultaneous conferences — Marc-Andre Pleier’s talk in Moriond’s morning session, and Anja Vest’s and Ulrike Schnoor’s talks in the afternoon sessions of the German Physical Society Spring Conference — the community had the chance to witness the opening of this thus-far-unread section of the Standard Model. On top of 16 expected background events, ATLAS observes an excess of 18 candidate events for WW → WW scattering, in perfect agreement with the Standard Model, which predicts 14 such signal events in the 8 TeV LHC data. The probability that the background could have fluctuated up that far is only 1:3000.
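
To translate that probability into the “sigma” language physicists usually use, here is a small sketch (illustrative only: the real analysis includes systematic uncertainties, and the event counts below are simply the rounded numbers quoted above):

```python
# Convert the quoted 1-in-3000 background-fluctuation probability into a
# Gaussian significance, plus a naive Poisson cross-check (illustrative only).
from scipy.stats import norm, poisson

p_quoted = 1.0 / 3000.0
print(f"quoted p-value {p_quoted:.1e}  ->  about {norm.isf(p_quoted):.1f} sigma")  # ~3.4 sigma

# Naive cross-check ignoring systematic uncertainties: with a background mean of
# 16 events, how often would a fluctuation give 16 + 18 = 34 or more events?
p_naive = poisson.sf(33, mu=16)       # P(N >= 34)
print(f"naive Poisson p-value {p_naive:.1e}")
# Somewhat smaller than the quoted value, presumably because the real analysis
# also accounts for the uncertainty on the background estimate itself.
```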

Candidate event for WW → WW scattering. Source: http://cds.cern.ch/record/1690282

But how can two W bosons scatter at the LHC, when in fact it is protons that are collided? Quite frequently, in such a collision, a quark inside a proton radiates a W boson. Less frequently, this happens in both of the colliding protons, so that the decay products of two Ws will be visible in the detector. In very rare cases, before their decay, these Ws can come close enough to scatter via the electroweak force. A candidate for such a scattering event is shown in the figure: it has the characteristic features of two ‘tagging’ jets close to the beam axis, produced by the two radiating quarks, large missing transverse momentum from the neutrinos (blue arrow), and two like-sign electrically charged leptons (red towers) from the W decays in the central part of the detector. Observing this process first for like-sign Ws was not accidental: W pairs with the same electric charge have the huge advantage of negligible background from top-antitop decays or W radiation from gluon-induced quark-antiquark pairs, which can only produce W pairs with opposite electric charge.

Actually, this scattering process of gauge particles is at the heart of  electroweak symmetry breaking, mentioned in the beginning, and was one of the key reasons to build the LHC. In the Standard Model, the contribution of Higgs Bosons is needed to make sure that the rate of this scattering for large WW centre of mass energies in the TeV range obeys the basic ‘unitarity’ law, that a probability cannot be larger than 100%. These critical energies will indeed be reached in the forthcoming 13 TeV run. The scattering of gauge particles will then tell us more about the properties of the Higgs Boson and the symmetry-breaking Brout-Englert-Higgs field. Maybe we don’t have to look too far to find remote corners; perhaps new physics is written in this quartic gauge coupling section of the book of the Standard Model, a chapter we have just started to read.


Michael Kobel is a full professor at TU Dresden and is currently head of the Institute for Particle Physics and Dean of Studies in the Dept. of Physics. He is a member of the ATLAS Collaboration at CERN, having worked before in six high energy physics experiments at four laboratories. Since 2005 he has been project leader of IPPOG’s “International Masterclasses” and project leader of the German “Netzwerk Teilchenwelt” since 2010, two initiatives bringing basic science to the public.

Is New Physics Running Out of Corners?

Friday was the last occasion for Moriond participants to see new results on specific physics topics since Saturday is reserved for summary talks.  The topic was ‘Beyond the Standard Model’ — a very large subject, which covers an incredible number of theoretical models, from Supersymmetry to Two-Higgs-Doublet Models, two of the most discussed topics of the day.

Schema of top decays, ranging from unmerged to fully merged (boosted) (talk presented by Patrizia Azzi at Moriond 2014).

Each talk addressed more than one theoretical model, as the experiments prefer to focus on model-independent results. With each talk, however, the space left for new physics by latest measurements appeared smaller and smaller. In fact, Jean Iliopoulos highlighted in his summary talk that it is becoming harder to say “new physics must be around the corner,” as we are “running out of corners!”  So, I’ll focus on a topic that appears less theoretical even if it is treated in close collaboration with theorists: searches for new physics with boosted topologies. This topic was presented by Patrizia Azzi on behalf of the ATLAS and CMS collaborations.

What is a boosted topology? The term, which derives from “Lorentz boost”, is applied when a particle has energy equal to or above twice its mass. Due to their light masses, this is pretty much always the case for electrons and muons; they are considered “ultra-relativistic” and are not classified in this category. Rather, the term is reserved for much heavier particles, like W, Z or H bosons or top quarks – that is, particles that need much more energy to be boosted. These particles are unstable and are only observed by their decay products and, as a consequence of the boost, their decay products end up collimated in a single jet.

As an example of a boosted topology, consider a top quark decaying into a W boson and a b quark.  If the W boson decays hadronically, it produces two light jets. So, in the non-boosted case, we could expect to reconstruct the top quark from three jets: two light jets and one b-jet. In the boosted case, however, we only observe one collimated jet containing, in its substructure, the two light jets and the b jet. The challenge is to identify such a jet and recognize its components.
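
A handy rule of thumb (an illustration of my own, not something from the talk) is that the decay products of a particle with mass m and transverse momentum pT spread over an angular distance of roughly ΔR ≈ 2m/pT, which shows why a sufficiently energetic top quark collapses into a single fat jet:

```python
# Rule-of-thumb angular spread of the decay products of a boosted heavy particle:
#   Delta R  ~  2 * m / pT     (illustrative numbers, not from the talk)
def delta_r(mass_gev: float, pt_gev: float) -> float:
    return 2.0 * mass_gev / pt_gev

m_top = 173.0  # top-quark mass in GeV
for pt in (200.0, 400.0, 800.0):
    print(f"top pT = {pt:4.0f} GeV  ->  decay products within Delta R ~ {delta_r(m_top, pt):.2f}")
# Above a few hundred GeV the two light jets and the b-jet all fall inside a single
# large-radius jet, so its substructure must be used to recognise the top quark.
```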

Boosted topologies are also studied in searches for a Z’ boson (a heavy Z boson predicted in some new theories) decaying into a top – anti-top pair. The top quarks are boosted for a Z’ with mass above 1 TeV and, at the moment, Z’ are excluded below 1.65 TeV (at 99% Confidence Level) depending on the model. Such searches represent possible new “corners” for finding new physics, especially as the LHC centre of mass energy increases (from 8 TeV to 13-14 TeV) in Run 2.

Reconstructed mass of tau leptons coming from Higgs decays (from talk presented by Riccardo Manzoni at Moriond 2014).

At these energies, boosted topologies will also be important for Higgs boson decays to b quarks or τ leptons.

Another very important topic — concerning the Standard Model and the Higgs boson — was brought up during the Young Scientist Forum. This session features PhD students, who are each given five minutes to present a topic and to answer one question, an excellent opportunity to showcase their work. And this topic, evidence of Higgs boson decays into τ leptons, was treated by two students: Nils Ruthmann for ATLAS and Riccardo Manzoni for CMS.

This is a major result. Until the end of 2013, the Higgs boson had only been observed decaying into bosons (γγ, ZZ and WW), although the Standard Model predicts that it should also decay into fermions (ττ, bb, …). Decays in these channels are difficult to identify due to high background rates and final states that are harder to extract (jets rather than leptons or photons). Both analyses used multivariate techniques to achieve this goal.

One of the more difficult challenges is to identify the tau leptons: a tau-lepton pair decays fully leptonically in 12% of the cases, semi-leptonically (one leptonic and one hadronic decay) in 46% of the cases, and fully hadronically in the rest (42%). The plot at the right illustrates the mass of the tau leptons, as reconstructed in the hadronic decay mode. The final results present evidence of H → ττ at a significance of 4.1 σ for ATLAS and 3.2 σ for CMS. No new “corner” here, but more key support for the Standard Model, and a very important measurement.
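
Those percentages follow from simple counting with the branching fraction of a single tau lepton, which decays to an electron or a muon roughly 35% of the time (a back-of-the-envelope sketch with rounded numbers, not taken from the talks):

```python
# Where the 12% / 46% / 42% split of tau-pair decays comes from.
# A single tau decays leptonically (to an electron or muon plus neutrinos)
# about 35% of the time; the rest of its decays are hadronic.
br_lep = 0.352
br_had = 1.0 - br_lep

fully_leptonic = br_lep ** 2              # both taus decay to leptons
semi_leptonic  = 2 * br_lep * br_had      # one leptonic, one hadronic (two orderings)
fully_hadronic = br_had ** 2              # both decay to hadrons

print(f"fully leptonic: {fully_leptonic:.0%}, "
      f"semi-leptonic: {semi_leptonic:.0%}, "
      f"fully hadronic: {fully_hadronic:.0%}")
# -> fully leptonic: 12%, semi-leptonic: 46%, fully hadronic: 42%
```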


Eve Le Ménédeu is currently a postdoctoral physicist at IFAE (Barcelona), working on the ttH, H → bb analysis and some b-tagging studies. She wrote her thesis at CEA-Saclay on muon spectrometer performance and studies of WZ dibosons. In her spare time, Eve plays the flute and guides underground visits of the ATLAS detector.

Dark Matters

The winter conference season is well under way, and what better way to fill my first blog post than with a report from one of the premier conferences in particle and astroparticle physics: the Rencontres de Moriond.

One of the things I like about attending a conference is that it lets me step away from my day-to-day work and think again about the wider context of what we do as physicists. At this conference, it was the progress being made in our understanding of dark matter that best seemed to bring together work from many different areas of investigation. (Note that some of the results I will mention were already included in Jessica Levêque’s post No Matter How Hard You Try… Standard Is Standard).

Artist’s impression of dark matter (in blue) surrounding the Milky Way. Credit: ESO/L. Calçada

Dark matter is the material that holds galaxies and clusters of galaxies together – the evidence for its existence from astronomical measurements is overwhelming. The problem: no one knows what dark matter actually is. None of the particles we know will do the job, not even the elusive neutrinos. All we do know is that it must be electrically neutral, very weakly interacting, and stable over billions of years. But that’s pretty much it.

What to do? Well, we could try to detect collisions between dark matter particles and ordinary atoms. At the conference, the LUX and CDMS collaborations reported on their searches for this mysterious substance. Neither group saw any evidence of a signal, more or less ruling out potential hints seen by other groups over the last few years. In addition, several searches for dark matter production by the ATLAS and CMS experiments were reported, also with null results.

But a lack of positive signals is not the end of the story. Far from it – three talks in particular showed how our models for dark matter are evolving, with constraints from many directions.

The first of these addressed so-called supersymmetric dark matter. Supersymmetry is a group of models that predict new particles, as yet undiscovered, to explain several mysteries in particle physics (see So where is all the SUSY?, by Zach Marshall). In many supersymmetric models, one of the new particles is the dark matter particle, but there is no solid prediction of its mass to guide searches. Lorenzo Calibbi, however, combined astrophysical observations, searches for supersymmetry at the LHC and a few educated assumptions to argue that the dark matter particle should be at least 24 times as massive as the proton – if it is supersymmetric.

New results from LUX and CDMS (solid lines) contradict previous hints of dark matter signals (shaded blobs).

Another possibility for dark matter is a particle called an axion. This would be very difficult to detect; in fact, the searches I mentioned would have no chance of observing it. A surprise constraint arrived during the week, with the announcement by BICEP-2 of the detection of particular patterns in the cosmic microwave background (CMB) radiation. This radiation was emitted early in the history of the universe, when it was about 380,000 years old – for comparison, the current age of the universe is 13.8 billion years. The results are very fresh, and many groups will be seeking to replicate the observation, but if confirmed it is essentially the last major piece of evidence for the cosmic inflation that occurred shortly after the Big Bang.

What does this have to do with axion dark matter? Quite a lot, actually. There were big uncertainties in the possible properties of axions, depending on whether they were created before or after inflation. The observation by BICEP-2 rules out the creation of axion dark matter before inflation, giving a more precise target for future axion searches.

Finally, there was a proposal by Marco Drewes that perhaps we don’t need exotic new theories at all. The very fact that neutrinos have mass implies that they should have partner particles – one each for the electron, muon and tau neutrinos. He showed that, if these partners – or right-handed neutrinos – are arranged in just the right way, one of them could be the dark matter particle. Even better, the other two could explain why we live in a universe of matter rather than antimatter. By conventional standards, this proposal is artificial, without any solid theoretical motivation, but it’s testable, and that is music to an experimentalist’s ears.


Mike Flowerdew is a post-doctoral researcher at the Max Planck Institute of Physics in Munich, Germany. He is currently searching for evidence of supersymmetric particles in the ATLAS data, and was also responsible for the calibration of the muon systems during data-taking.

The Neutrino Puzzle

Having explored the latest results on what we call ‘heavy flavour’ physics, the physics of particles containing a b-quark (see The Penguin Domination by Jessica Levêque), we embarked on a much lighter subject: neutrinos.

It was as if a fresh breeze swept through the audience. Partly because we are surrounded by snow-capped mountains but mostly because of the topic — neutrino physics has been bubbling with activity these past few years. Many new measurements were shown, adding several pieces to the neutrino puzzle. But we are still far from having a clear idea of the picture we are trying to build, piece by piece.

Neutrinos are special particles. They are at the heart of some of the most exciting fundamental problems that particle physicists are trying to solve. But neutrinos are elusive, a characteristic that makes them difficult to study. Physicists must use their ingenuity to develop new kinds of detectors capable of measuring neutrinos coming from different sources.

Neutrino sources studied by experiments

There are a few things we know about neutrinos. In the Standard Model, neutrinos are neutral leptons that were thought to be massless. There are three neutrino species — electron, muon and tau neutrinos, each associated with one of the three charged leptons in the Model — the electron, muon and tau. They are the second most common particle in the universe after the photon, but are not well known to the public. They interact with matter through the weak interaction, which makes them difficult to catch. But physicists like challenges and build experiments to detect and measure the flux of neutrinos coming from sources outside our solar system, from the sun, from the atmosphere, or produced by terrestrial nuclear power plants and particle accelerators.

Most of these experiments were only sensitive to one neutrino species, and at first all these measurements appeared to be inconsistent. The picture got clearer when the Super-Kamiokande experiment in Japan established in 1998 that neutrinos can oscillate from one species to another, which means that an electron-neutrino can transform itself into a muon-neutrino and vice-versa. This explains, for instance, why the measured solar electron-neutrino flux is well below the one predicted by the solar model: a fraction of them oscillate into muon-neutrinos that were not detected. The important consequence of the oscillation is that it can only occur if neutrinos have mass!
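
The standard two-flavour oscillation formula makes this link to mass explicit (a textbook sketch with illustrative parameter values, not numbers from the conference): the oscillation probability depends on the difference of the squared masses, so it vanishes if that difference is zero.

```python
import math

# Two-flavour neutrino oscillation probability (textbook formula):
#   P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, the travel distance L in km and the neutrino energy E in GeV.
def oscillation_probability(theta, dm2_ev2, length_km, energy_gev):
    return math.sin(2 * theta) ** 2 * math.sin(1.27 * dm2_ev2 * length_km / energy_gev) ** 2

# Illustrative, roughly atmospheric-scale parameters:
print(oscillation_probability(theta=math.pi / 4, dm2_ev2=2.4e-3, length_km=500, energy_gev=1.0))
# With dm2_ev2 = 0 (massless or degenerate neutrinos) the probability is exactly zero:
print(oscillation_probability(theta=math.pi / 4, dm2_ev2=0.0, length_km=500, energy_gev=1.0))
```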

Neutrino masses with respect to the other Standard Model particles (fermions)

Since then, new experiments have been built to measure the probability of oscillation between the different neutrino species and to infer information about their masses. At the conference, several measurements of these parameters were shown, and we now know with fair precision the different oscillation probabilities as well as the mass differences between neutrino species. However, we still don’t know the masses themselves, although cosmological experiments allow us to set an upper limit on the sum of the masses of the three neutrino species, which is below an eV (electronvolt). Moreover, new experimental inconsistencies have appeared: some experiments do not observe the expected number of neutrinos, even with the oscillations taken into account.

So now, new questions have arisen: Where does the neutrino mass come from? Why is it so far from the other lepton masses? As it is massive and weakly interacting, could the neutrino be part of the dark matter of the universe? Is the neutrino its own anti-particle? Are there more than three neutrinos? Where do the most energetic neutrinos come from?

Some experiments like IceCube are now able to map neutrinos coming from the universe and this is like doing astronomy with neutrinos!

Neutrino skymap as measured by the IceCube experiment

During the session, several theoreticians proposed models that try to reconcile the different observations and answer the above questions: Couldn’t there be a new species of neutrino into which the others could oscillate? Is the neutrino description in the Standard Model complete: couldn’t they have right-handed partners, as the other leptons do? This last option is interesting since it could explain why the standard neutrino mass is so small, and perhaps also account for part of the universe’s dark matter, as the right-handed neutrinos could be very massive.

Theoretical talks alternated with experimental ones describing future experiments that are currently being developed to help solve the puzzle. These experiments are being built by smaller collaborations in comparison to the LHC teams. The experiments can be located at the South Pole, to take advantage of the ice as an interacting medium for the detector, or in the depths of a disused mine, to fight efficiently against the cosmic-ray background. The proposed technologies are also very different depending on the aim of the measurement, but all experiments need a very low and well-controlled background, as the number of observed neutrinos is always small.

Stay tuned! There is no doubt that new results on neutrinos will come soon but in the meantime, my colleagues and I will catch some fresh air during a long lunch break up on the snowy mountains. After all, it is important to rest our brains in order to prepare for presentations of the top quark, the Higgs boson and other new results from the LHC in the next sessions.

So, what does a particle physicist, with her brain at rest, see in the surrounding mountains?

The snowy mountains surrounding La Thuile

well…

The Higgs boson decaying into two photons: a bump over the background, as seen by the ATLAS experiment

the Higgs boson of course!!!


Sabine Crépé-Renaudin is a French researcher at CNRS. She is involved in Grid Computing (the grid is like a worldwide distributed computing centre used to reconstruct and analyse LHC experiment data). Her main activity in analysis is the search for new phenomena beyond the Standard Model in top-antitop quark final states. She also devotes part of her time to outreach activities.

No Matter How Hard You Try… Standard is Standard.

The past two days of the Rencontres de Moriond 2014 Electroweak conference have been very intense, with many new experimental results, many insightful theoretical talks and many lively discussions. Since the topics cover neutrino experiments, astrophysical observations and Standard Model precision measurements, giving a summary is not an easy task. But I will try.

Fig. 1 – Is the Higgs boson the last missing piece of the Standard Model or part of a much bigger puzzle? (image courtesy of minutephysics)

The discovery of the long-sought Higgs boson, the last missing piece of the Standard Model of particle physics, was announced in July 2012 by both the ATLAS and CMS collaborations at CERN, and the Nobel prize was awarded in October 2013 to Peter Higgs and François Englert, for proposing the mechanism responsible for breaking the electroweak symmetry and giving mass to the Z and W bosons.

In less than two years, the experimental paradigm shifted from the search for a new particle to precise measurements of its properties. The newly discovered boson has to be perfectly characterized to make sure it is exactly the one predicted by the Standard Model and not an impostor. The first results published in 2012 by both ATLAS and CMS established the bosonic nature of the new particle as well as its couplings to the W and Z bosons and to photons. But this was not enough. Because the Higgs boson is not only responsible for the W and Z masses but also for all the Standard Model particle masses, we have to establish that it directly couples to fermions.

Fig. 2 – From Moriond talk of Eilam Gross presenting measurements of Higgs boson coupling to fermions.

One of the important results of this conference is that both ATLAS and CMS showed very strong evidence of the Higgs boson decaying into b quarks and tau leptons (see Fig. 2). The official combination is expected to be published later this year. We can already see from the individual points that the final combination will provide the statistical significance required to claim observation of the Higgs coupling to fermions, with a strength compatible with Standard Model predictions. Standard Model 1 : New Physics 0.

Fig. 3 – From Moriond talk of Eilam Gross presenting measurements of Higgs boson production and decay rates.

In addition, both ATLAS and CMS released a large number of measurements of the Higgs boson’s production and decay rates over a significant number of final states during the past year, which were summarized today in clear and comprehensive talks. A difficult channel, the associated ttH production, is just at the edge of our sensitivity, but the ATLAS and CMS data clearly show a hint of this production mode. The combination of all these measurements allows one to probe the deviation of the entire set of Higgs coupling constants from the Standard Model. And no matter how hard you try, everything beautifully aligns to 1 (as shown in Fig. 3). Standard Model 2 : New Physics 0.

Fig. 4 – From Roberto Covarelli’s Moriond talk presenting measurement of Higgs boson decay width.

One of the last highlights of the conference was the measurement of the Higgs boson width. The “width” of a particle depends on its lifetime, or in other words, its decay probability. For the 126 GeV Higgs boson, the Standard Model predicted width is 4 MeV. The previously established limit, obtained from the width of the reconstructed mass peak, could only constrain the Higgs boson width to values smaller than 3.4 GeV. The analysis presented by CMS today uses a new method. (I won’t enter into technical details here, but the idea is that the production rate of the Higgs boson decaying into ZZ* at high energy depends on the Higgs boson width.) This new measurement shows a remarkable sensitivity and constrains the Higgs boson width to be below 17 MeV, more than two orders of magnitude better than the previous limits! Standard Model 2.5 : New Physics 0. Only half a point here, as the Higgs boson is still allowed to decay into invisible new particles up to about 50% of the time, which still leaves enough room for new physics to sneak in. It may be the only place, actually.
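
Very roughly (a schematic sketch of the idea, not the full analysis): the rate at the Higgs mass peak scales as the production and decay couplings squared divided by the width, while far above the peak the width drops out of the rate,

$$
\sigma_{\text{on-peak}} \;\propto\; \frac{g_{\text{prod}}^{2}\,g_{\text{decay}}^{2}}{\Gamma_{H}},
\qquad
\sigma_{\text{off-shell}} \;\propto\; g_{\text{prod}}^{2}\,g_{\text{decay}}^{2}
\qquad\Rightarrow\qquad
\frac{\sigma_{\text{off-shell}}}{\sigma_{\text{on-peak}}} \;\propto\; \Gamma_{H}
$$

so, under the assumption that the couplings are the same in the two regimes, comparing the measured high-mass ZZ rate with the on-peak rate turns into a constraint on the total width.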

Could this be the end for new physics models? It’s becoming a serious question for theorists, since no hint of a deviation from the Standard Model predictions has been found yet, despite the huge amount of data analyzed so far. From all that I’ve heard from theorists today, here are a few phrases that grabbed my attention:

  • “We are lucky that experiments find anomalies from time to time, as it allows us to publish papers”.
  • “the Standard Model may after all not be an effective theory at low energy of a more fundamental theory, but might very well be the fundamental theory itself.”
  • “Any new physics model must only be a small fluctuation around the Standard Model predictions”

So, do we really need more than the Standard Model? What are the questions that have not been answered so far? There seem to be only a handful of them remaining: neutrino oscillations (or mass), baryon asymmetry, dark matter, dark energy and inflation.

Fig. 5 – A model and his model.

The inflation problem is actually in the spotlight right now. A “guest” talk was added at the last minute to our conference agenda, to present the recent observation published by the BICEP-2 experiment at the South Pole. The measurement of the polarisation of the cosmic microwave background shows a signal that could (under minimal assumptions) very well be compatible with the inflation model (which is needed to expand the universe at a very high rate in its early stages). This polarisation signal could also be the experimental proof of gravitational waves, the last of Einstein’s predictions that remains to be validated. It also looks like one of the simplest inflation models is sufficient to explain the BICEP-2 observations.

The same “minimalistic” trend is being considered in our field. For example, one of the new models presented by Marco Drewes proposes the addition of only three right-handed neutrinos to the Standard Model to solve all the remaining issues, in particular  dark matter, baryon asymmetry and neutrino masses.

An interesting suggestion was made today towards a minimalistic model choice. Besides the obvious need to accurately describe all the experimental results gathered so far in particle physics, a model’s equations must fit on a medium-sized T-shirt that physicists can wear. And today, it seems that only the Standard Model can successfully fulfill both criteria.


Jessica Levêque is a French researcher at CNRS. Her main activities are the measurement of the Higgs boson properties in the di-photon channel, improvement of the photon and electron performance, data quality monitoring and the tracker upgrade for LHC Phase II in 2022. She also devotes part of her time to outreach activities. And when she’s not at work, she climbs mountains.