Category: ‘particle physics’
Looking for Dark Matter: Dark(ness) at the end of a tunnel
In the last post, we introduced the “Shake It, Make It, Break It” approach for dark matter detection and talked about shaking dark matter to deduce its properties (what is usually called Direct Detection). Of course, the ideal way to study dark matter would be to create it in our laboratories, which brings us to the second approach: Make It.
Attempts to make dark matter are carried out in particle colliders. To make dark matter, we destroy light (aka visible matter). But before we go down that road, let’s have a quick look at the Standard Model of Particle Physics. This model describes the physics of everything visible around us. It tells us the structure of atoms, the workings of three of the four forces governing all interactions in nature, the mechanism by which particles gain mass, and the mechanism by which they decay. Essentially, it is a summary of our knowledge of particle physics. And although the Standard Model has been an incredible success, it appears to be somewhat incomplete. For example, we don’t yet have a particle that mediates gravity, the fourth force. We don’t know conclusively why neutrinos have mass, or why the Higgs mass appears fine-tuned. The incompleteness of the Standard Model provides a strong motivation for additional particles. If dark matter is composed of particles, it could be one of the missing puzzle pieces in the Standard Model. And much like in an actual puzzle, we can deduce the properties of the missing piece from the space its absence has created. We can figure out which Standard Model particles are likely to talk to dark matter and build robust models around these interactions. With this basis for dark-visible interactions, we can look for them at colliders.

A particle collider accelerates particles to high energies, smashes them together, and lets us study the resulting debris to understand the physics of nature at small (length) scales. One can look for dark matter at colliders by figuring out whether this debris matches our expectation from the Standard Model (which doesn’t include dark matter). A simple way to do this (in principle) is by using the law of conservation of energy: in any physical process, the total energy of the system remains conserved. The initial energy of the particle beams is something we know from an experiment’s design. The total energy after a collision can be reconstructed from the energy of the particles we detect. If these two numbers don’t match, we know that some of the energy has been carried away by “invisible” particles, which could be potential dark matter candidates. (In practice, this is much harder to do, which is one of the reasons we have an entire subgroup of physicists, theorists and experimentalists alike, devoted to working out the intricacies of collider physics.)
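In practice, experiments balance momentum in the plane transverse to the beams rather than total energy, since the colliding constituents’ longitudinal momenta are not known event by event. Here is a minimal sketch of that bookkeeping, with entirely made-up particle momenta:

```python
import math

# Hypothetical detected particles from one collision, each given by its
# transverse momentum components (px, py) in GeV. Values are invented
# purely for illustration.
detected = [(45.0, 10.0), (-20.0, 5.0), (-5.0, -25.0)]

# Before the collision, the total transverse momentum is zero, so the
# visible particles should balance. Any net imbalance points to
# "invisible" particles escaping the detector.
px_sum = sum(p[0] for p in detected)
py_sum = sum(p[1] for p in detected)

met = math.hypot(px_sum, py_sum)  # magnitude of the missing transverse momentum
print(f"Missing transverse momentum: {met:.1f} GeV")
```

A large missing transverse momentum, recurring more often than Standard Model neutrinos can explain, is exactly the kind of mismatch described above.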
Another way to look for dark matter at colliders is by studying how the Standard Model particles produced in a collision decay. Consider the Z boson. In the Standard Model, it can decay into quarks, charged leptons or neutrinos. We know the total decay width of the Z boson, which characterizes the probability that a Z boson decays at all. We can also measure the individual probabilities of a Z boson decaying into quarks and into charged leptons. A mismatch between the total width and the sum of these visible pieces is a hint that the Z boson decays into something else which is invisible to our detectors. Once again, we can deduce the presence of dark matter by its absence.
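As a concrete sketch of this bookkeeping, the toy calculation below uses approximate measured Z widths (PDG values, in GeV); the arithmetic, not the precision, is the point:

```python
# Approximate measured Z-boson decay widths, in GeV.
gamma_total = 2.4952          # total width
gamma_hadrons = 1.7444        # Z -> quarks (hadronic final states)
gamma_leptons = 3 * 0.0840    # Z -> e+e-, mu+mu-, tau+tau-

# Whatever is left over must be decays into invisible particles.
gamma_invisible = gamma_total - gamma_hadrons - gamma_leptons
print(f"Invisible width: {gamma_invisible:.3f} GeV")

# Dividing by the Standard Model width for a single neutrino species
# gives the effective number of invisible channels -- famously close
# to 3, leaving little room for extra invisible decays such as decays
# into light dark matter.
gamma_nu_sm = 0.1676
print(f"Effective number of neutrino species: {gamma_invisible / gamma_nu_sm:.2f}")
```

That the answer comes out so close to the three known neutrino species is itself a powerful constraint on invisible new physics.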
We’ve known for quite some time that there is more to the Universe than meets the eye. To understand it, we must exhaust all avenues available to us. Crashing particles being one of them.
Looking for Dark Matter: Trembles beneath the surface

The ever-elusive nature of dark matter is of interest not only from a theoretical point of view (we are, after all, missing 85% of the Universe) but also from an experimental one. How exactly does one study something invisible? We can point our telescopes at the sky and study how light bends around clusters of apparent emptiness, hinting that there must be something there. We can do this with ancient light (commonly known as the CMB) and figure out exactly how much of this invisible substance is present in our Universe. We can study the structure of galaxies and galaxy clusters, and the motion of stars bound to them. Everywhere we look, it seems, there is evidence of something massive lurking in the shadows. But none of these things tell us anything about what a dark matter particle* actually looks like. We cannot “trap” a dark matter particle, for it can, quite literally, slip through walls. We don’t have access to it in the way we have access to, say, electrons. But as humans, we are tenacious. And as scientists, we are creative. The minor problem of dark matter’s invisibility won’t stop us in our quest for answers.
In this series of (hopefully) short posts, we’ll cover the three conventional avenues of dark matter detection. The elevator pitch for these techniques? Shake It, Make It or Break It. ‘It’ being dark matter.
Shaking dark matter is as fun as it sounds. In the simplest of terms, it involves waiting for a dark matter particle to strike a target atom placed inside a detector. As a result of the collision, the state of the target atom changes. It could either get ionized, resulting in the production of free electrons which we can ‘see’ in the detector. It could absorb the energy of the dark particle and then release it in the form of a photon. Or it could release this energy as heat. In all of these cases, the collision results in a visible/detectable signature.

There are two important questions to ask now:
1. Where does the initial dark matter particle come from?
2. How can we be sure that the electrons/photons/heat we detect are actually caused by this collision?
The answer to the first question is fairly straightforward. From various other measurements, we know that there is a constant(ish) dark matter flux through Earth. At any given instant, we are being bombarded with dark matter particles. To get an idea of how strong this bombardment is, consider these numbers. The local dark matter density (the total dark matter mass contained in one cubic centimeter) is approximately 0.3 GeV per cubic centimeter. (GeV is the standard unit of mass in particle physics; 1 GeV is approximately 1.8 × 10⁻²⁷ kg.) So if we assume a dark matter mass of 1 GeV, this amounts to roughly 0.3 dark matter particles per cubic centimeter. The average volume of a human body is 66,400 cubic centimeters. Which means that at any given moment, about 20,000 dark matter particles are zipping through every person on Earth. Given that dark matter is weakly interacting, it is no surprise that we don’t notice this constant flux. But given that the flux is substantial, we can hope to detect these particles by building detectors that are large enough. A larger detector volume means more dark particles passing through and hence a higher probability of a collision.
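The numbers quoted above amount to a one-line estimate, reproduced here for convenience:

```python
# Back-of-the-envelope count, using the numbers quoted in the text:
# a local dark matter density of about 0.3 GeV per cubic centimeter
# and an assumed dark matter particle mass of 1 GeV.
density = 0.3          # GeV per cm^3, local dark matter mass density
mass = 1.0             # GeV, assumed particle mass
number_density = density / mass  # particles per cm^3

body_volume = 66400.0  # cm^3, average volume of a human body
n_in_body = number_density * body_volume
print(f"Dark matter particles inside a human body: about {n_in_body:.0f}")
```

Heavier dark matter candidates would mean proportionally fewer, but correspondingly more energetic, particles per cubic centimeter.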
Which automatically brings us to question two: how can we be sure that the cause of a signal in our detector is a dark matter-target atom collision? The short answer is that we’re extremely meticulous. For every experiment, there is a ‘background’ that we need to be aware of. A background is basically “noise” which makes it hard to see the actual signal. In this case, any other particle colliding with the target would give rise to a background signal. A major part of these experiments is to account for all possible backgrounds and reduce this noise as much as possible. The first step is to build the detector underground. The Earth’s crust provides natural shielding against most of the stray particles hanging around in our corner of the universe. Added to that are layers of concrete to further stop any particles from getting inside the detector. Note that since dark matter particles are incredibly weakly interacting, they have no problem shuttling through the Earth’s crust or any of our other protective layers. Another important factor is the choice of the target atom itself. Since we want to avoid spurious signals, an atom that decays radioactively or is otherwise reactive would be a poor choice for the target. Some of the best target atoms are inert gases such as xenon or argon.

This is (very briefly) how a conventional direct detection experiment works**. We hide out in the depths of Earth, waiting. The obvious follow-up question is what happens if we don’t see a signal. Does it mean the dark matter paradigm is dead in the water (or liquid xenon)? The answer is Not Quite. The lack of a signal is valuable information as well. For example, it can be used to infer that the dark matter-target interaction is weaker than we thought (meaning the probability of a collision is even smaller, resulting in the absence of a signal). In this way, we can use the “null results” of an experiment to set limits on the strength of the dark matter-Standard Model interaction. So even though we don’t “detect” dark matter, we end up with more knowledge than we started with. And that, in the end, is the true spirit of science.
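To illustrate how a null result becomes a limit, here is a toy version of the simplest case (zero observed events, negligible background); real analyses are far more sophisticated, but the Poisson logic is the same:

```python
import math

# With zero observed events and negligible background, an expected
# signal of s events would yield zero events with Poisson probability
# exp(-s). Requiring this probability to be at least 10% gives the 90%
# confidence-level upper limit on s.
s_limit = math.log(10)  # about 2.3 expected signal events
print(f"90% CL upper limit on expected signal events: {s_limit:.2f}")

# The expected event count grows linearly with the dark matter-nucleus
# cross section and with the detector exposure. So a bigger detector
# that still sees nothing pushes the allowed cross section further down.
```

This is exactly the sense in which “seeing nothing” still teaches us something: each null result carves away more of the allowed parameter space.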
Up next: Can we make dark matter at colliders?
*The assumption that dark matter is composed of particles is well-motivated but it might very well be that dark matter is something more exotic such as primordial black holes.
**Detectors operate on the same principle (some kind of dark matter — particle collision) but can be experimentally realised in different ways.
TTK Outreach: A Universe of Possibilities Probabilities
The universe may not be full of possibilities (most of it is dark and fatal) but what it does have in abundance are probabilities. Most of us know about Newton’s three laws of motion. Especially the third, which, taken out of context, apparently makes for a good argument justifying revenge. For centuries, Newton’s laws made perfect sense: an object’s position and velocity specified at a certain time give us complete knowledge of its future position and velocity, aka its trajectory. Everything was neat and simple and well-defined. So imagine our surprise when we found out that Newton’s laws, valid as they are on large scales, completely break down on smaller ones. We cannot predict with 100% certainty the motion of an atom in the same way that we can predict the motion of a car or a rocket or a planet. At the heart of this disagreement is quantum mechanics. So today, let’s talk about two of the main principles of quantum mechanics: duality and uncertainty.
Duality:
We begin with light. For a long time, no one seemed to be quite sure what light is. More specifically, we didn’t know whether light was a bunch of particles or a wave. Experiments verified both notions. We could see light interfering and diffracting much like two water waves would. At the same time, we had phenomena such as the photoelectric effect which could only be explained if light was assumed to be made of particles. It is important to dwell on this dichotomy for a bit. Waves and particles lie on opposite ends of a spectrum. At any given instant of time, a wave is spread out. It has a momentum, proportional to the speed with which it is traveling, but by its very definition it makes no sense to talk of a definite, single position of a wave. A particle, on the other hand, is localized. So the statement ‘Light behaves as a wave and a particle’ is inherently non-trivial. It is equivalent to saying, ‘I love and hate pineapple on my pizza’, or ‘I love science fiction and hate Doctor Who.’
But nature is weird. And light is both a particle and a wave, no matter how counter-intuitive this idea is to our tiny human brains. This is duality. And it doesn’t stop at light. In 1924, de Broglie proposed that everything exhibits wave-like behavior. Only, as things grow bigger and bigger, their wavelengths get smaller and smaller and hence unobservable. For instance, the wavelength of a cricket ball traveling at a speed of 50 km/h is approximately 10⁻³⁴ m.
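That last number is a one-line application of the de Broglie relation λ = h/(mv); the ball’s mass and speed below are rough figures:

```python
# de Broglie wavelength of a macroscopic object: lambda = h / (m * v).
h = 6.626e-34       # Planck's constant, J*s
mass = 0.16         # kg, roughly the mass of a cricket ball
speed = 50 / 3.6    # 50 km/h converted to m/s

wavelength = h / (mass * speed)
print(f"de Broglie wavelength: {wavelength:.1e} m")  # of order 10^-34 m
```

For comparison, an electron at everyday lab energies has a wavelength around the size of an atom, which is why wave behavior is glaring for electrons and utterly invisible for cricket balls.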
And it is duality which leads us directly to the second principle of quantum mechanics.
Uncertainty:
The idea of uncertainty, or Heisenberg’s uncertainty principle, is simple: you can’t know the exact position and momentum of an object simultaneously. In popular science, this is often confused with something called the observer effect: the idea that you can’t make a measurement without disturbing the system in some unknowable way. But uncertainty is not a product of measurement, nor a limitation imposed by experimental inadequacy. It is a property of nature, derived directly from duality.
From our very brief discussion of waves and particles above, we know that a wave has a definite momentum and a particle has a definite position. Let’s try to create a ‘particle’ out of a wave, or in other words, let’s try to localize a wave. It’s not that difficult, actually. We take two waves of differing wavelengths (and hence differing momenta) and superimpose them. At certain places, the amplitudes of the waves add up, and at others, they cancel out. If we keep adding more and more waves with slightly differing momenta, we end up with a ‘wave-packet’, which is the closest we can get to a localized particle.

Image taken from these lecture notes.
Even now, there is a small, non-zero ‘spread’ in the amplitude of the wave-packet. We can say that the ‘particle’ exists somewhere in this ‘spread’, but we can’t say exactly where. Secondly, we’ve already lost information on the exact momenta of the waves, so there is an uncertainty there as well. If we want to minimize the position uncertainty, we have to add more waves, implying a larger momentum uncertainty. If we want a smaller momentum uncertainty, we need a larger wave-packet and hence automatically increase the position uncertainty. This is what Heisenberg quantified in his famous equation:
Δx Δp ≥ h/4π
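The wave-packet construction is easy to play with numerically. This toy script (all parameters arbitrary, chosen only for illustration) superposes many plane waves with wavenumbers spread around a central value and measures how wide the resulting packet is; widening the momentum spread shrinks the packet, exactly the trade-off the inequality expresses:

```python
import cmath

def packet_magnitude(x, k0=20.0, dk=2.0, n_waves=41):
    """Magnitude of a superposition of n_waves plane waves exp(i*k*x),
    with wavenumbers spread uniformly over [k0 - dk, k0 + dk]."""
    total = sum(cmath.exp(1j * (k0 - dk + 2 * dk * i / (n_waves - 1)) * x)
                for i in range(n_waves))
    return abs(total) / n_waves  # normalized so the peak at x = 0 equals 1

def half_width(dk):
    """Distance from the packet's center at which its magnitude drops to half."""
    x = 0.0
    while packet_magnitude(x, dk=dk) > 0.5:
        x += 0.001
    return x

# A larger spread in momenta (dk) localizes the packet more tightly:
# position uncertainty and momentum uncertainty trade off against each other.
for dk in (1.0, 4.0):
    print(f"momentum spread {dk}: packet half-width ~ {half_width(dk):.2f}")
```

Quadrupling the momentum spread shrinks the packet’s half-width by the same factor of four, the numerical shadow of Δx Δp ≥ h/4π.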
And so we come to probabilities. At micro-scales, statements such as ‘the particle is in the box’ are meaningless. What we can say is, ‘the particle has a 99% probability of being in the box’. From Newton’s deterministic universe (which is still valid at large scales) we transition to quantum mechanics’ probabilistic one, where impossible-sounding ideas become reality.
The Doctor once said, “The universe is big, it’s vast and complicated, and ridiculous. And sometimes, very rarely, impossible things just happen and we call them miracles.” Or you know, at small enough scales, a manifestation of quantum mechanics. And that is fantastic.
The Higgs Boson in 2015
The Higgs boson was discovered in 2012 at CERN’s Large Hadron Collider (LHC), or more precisely by the LHC experiments ATLAS and CMS. The discovery of the Higgs resonance is definitely a milestone in particle physics, and two of the fathers of the Higgs mechanism, Peter Higgs and Francois Englert, were awarded the Nobel Prize in Physics in 2013.
After the discovery three years ago, there was immediately a decisive question to be answered: does the discovered particle have all the properties predicted by the Standard Model (SM) of particle physics? Or, turning the question around in more scientific terms: are there any statistically significant deviations from the SM predictions which can be identified in the recorded proton-proton collisions? Any such deviation would of course call for physics beyond the SM. Until the end of LHC run 1 at the beginning of 2013, there were great efforts to collect as many Higgs events as possible. And similar efforts have been invested in recent years to extract as much information as possible from the recorded data.

Signal strength measurements for different Higgs-production channels, where a signal strength of 1 is the SM expectation (taken from ATLAS-CONF-2015-044)
Only recently has the Higgs legacy of run 1 been finalized by performing the combination of the ATLAS and CMS data. The so-called signal strength for different production channels, as well as the global signal strength, is shown in the diagram on the right relative to the SM prediction. So far, the discovered particle does not give any hints of new physics beyond the SM. It simply looks more or less as predicted decades ago.
For these and similar measurements, the interplay between theory predictions and the experimental analysis is crucial. In mid-October, experimentalists and theorists working on Higgs physics met at the conference “Higgs Couplings 2015”, which was hosted by the IPPP in Durham and took place in the beautiful medieval Lumley Castle close to Durham. The latest run 1 measurements were presented and discussed. But run 1 is already part of the past. Everybody is looking forward to seeing the measurements from run 2 and gearing up for the upcoming analyses.
Run 2 has already started this year with the record-breaking proton-proton energy of 13 TeV (run 1 provided 7 and 8 TeV collisions). In 2015, a year to learn how to operate at this record-breaking energy and with collisions every 25 nanoseconds, not enough data will be recorded to make a major step forward in the precision of Higgs measurements (this is very different for other new-physics searches, e.g. for multi-TeV resonances). However, the coming years of run 2 will certainly be exciting for Higgs physics.
So far, measurements are still statistically limited, i.e. by the relatively small number of recorded Higgs bosons. However, residual uncertainties within the theoretical predictions will soon become a major player in the quest for ultimate measurements of Higgs properties, and therefore also in the quest for physics beyond the SM in the Higgs sector. Hence, improving theory predictions and making them available for the analysis of the data is more important than ever in the field of Higgs physics, and it is one of the research topics at our institute. Conferences like “Higgs Couplings 2015” provide an important forum for discussions on these topics between experimentalists and theorists.
So, let’s see what nature will teach us about the Higgs in the coming years.
Exploring dark matter with IceCube and the LHC
Various astrophysical and cosmological observations point towards the existence of dark matter, possibly a novel kind of fundamental particle, which does not emit or reflect light, and which only interacts weakly with ordinary matter.
If such a dark matter particle exists, it can be searched for in different ways: direct detection looks for the elastic scattering of dark matter with nuclei in highly sensitive underground experiments, as Earth passes through our galaxy’s dark matter halo. Indirect detection experiments on Earth or in space look for cosmic rays (neutrinos, photons, or antiparticles) from the annihilation of dark matter particles in the centre of the Galaxy or in the Sun. And last but not least, if dark matter interacts with ordinary matter, it may be produced in high-energy proton collisions at the LHC.
To explore the nature of dark matter, and to be able to combine results from direct, indirect and collider searches, one can follow a more model-independent approach and describe dark matter and its experimental signatures with a minimal amount of new particles, interactions and model parameters. Such simplified or minimal models allow us to explore the landscape of dark matter theories, and serve as mediators between the experimental searches and more complete theories of dark matter, like supersymmetry.
About 10²⁷ dark matter particles per second may pass through the Sun. They can lose some energy through scattering off protons and eventually be captured in the core of the Sun by the Sun’s gravitational pull. Dark matter particles in the Sun would annihilate with each other and produce ordinary particles, some of which decay into neutrinos. Neutrinos interact weakly with matter, can thus escape the Sun, and could be observed by the neutrino telescope IceCube near the South Pole. Neutrinos therefore provide a way to search for dark matter in the core of the Sun.
At the LHC, dark matter may be produced in high-energy proton collisions. As dark matter particles interact at most weakly with ordinary matter, they would leave no trace in the LHC detectors. However, dark matter (and other novel weakly interacting particles) can be detected by looking at exotic signatures, where a single spray of ordinary particles is seen, without the momentum and energy balance characteristic for standard particle collisions (so-called mono-jet events, see right figure).
We have recently joined forces with members of the RWTH IceCube team to explore dark matter searches through neutrinos from the Sun and through dark matter production at the LHC, see http://arxiv.org/abs/1411.5917 and http://arxiv.org/abs/1509.07867. We have considered a minimal dark matter model where we add only two new particles to the ordinary matter: a new dark matter fermion, and a new force particle which mediates the interaction between the dark matter fermion and ordinary matter. As no signal for dark matter has been observed, we can place limits on the masses of the dark matter particle and the new force particle, see figure to the left. We find a striking complementarity of the different experimental approaches, which probe particular and often distinct regions of the model parameter space.
Thus only the combination of future collider, indirect and direct searches for dark matter will allow a comprehensive test of minimal dark matter models.
Why the Large Hadron Collider is good for Philosophy
Guest post in Jon Butterworth’s Guardian blog: The philosophy of the Large Hadron Collider
There have been many tedious and futile discussions about the value of philosophy for modern science. I find it much more interesting and fruitful to ask if and in what way modern science can advance philosophy. The complexity, the new challenges and the new methods that arise in modern science in general – and at the LHC in particular – raise a number of questions that concern core issues of philosophy of science: what are the methods of acquiring knowledge, what is the role of models, and how does the intricate relationship between theory, computer simulations and experimental data work? The LHC has been built for fundamental physics, but it will also challenge and advance the philosophy, sociology and history of science!
See the full text at: The philosophy of the Large Hadron Collider
Species
The origin and size of the neutrino masses is one of the unsolved mysteries in high-energy physics. Their masses must be very small (< 1 eV) in comparison to those of the other elementary particles: if the neutrino had the mass of a house mite, the top quark would have the mass of a sperm whale! For theorists, this hierarchy in the masses is deeply unsatisfying, and the question is whether there is a mechanism that automatically makes the neutrino mass that small.
Susy edges
Two weeks ago, the SUSY conference took place in Manchester, and we already had a nice report from Mathieu and Jory here on our blog. However, I also want to draw your attention to this talk giving an overview of CMS searches for supersymmetric particles at the LHC. Generic searches for supersymmetric particles depend mainly on two possible observations: in most supersymmetric scenarios, there is a lightest stable supersymmetric particle (which can play the role of the dark matter candidate). This particle, if produced at a collider like the LHC, does not decay (stable!) and does not leave any trace in the detectors. No trace? No!
Back from the SUSY conference
Are there leptoquarks?
Leptoquarks are interesting objects: these (so far) hypothetical particles are able to connect a single lepton and a single quark in one interaction. To trigger such an interaction, leptoquarks have to carry both lepton number (like leptons) and baryon number (like quarks). Hence the name: lept-o-quark.