RWTH Aachen Particle Physics Theory

TTK Outreach: Special Relativity in a Nutshell

February 27th, 2018 | by

Einstein’s theory of relativity has seeped into popular culture like no other. But what is relativity? And why is it important to our day-to-day life? Today, we look at Special Relativity: the imagine-the-cow-to-be-a-sphere case of the complete or general theory of relativity.

The beauty of SR and probably one of the reasons for its ubiquity in popular science is its elegance and simplicity. An added benefit is that it’s possible to go quite a long way with an intuitive understanding of SR and no complicated mathematics. At the heart of it, special relativity has two basic principles. Once we understand these two ideas, we basically understand all of special relativity and the ‘paradoxes’ that come with it. These two ideas are as follows:


1. The laws of physics are invariant (identical) in all inertial reference frames.

There is just one jargon-y term here which is ‘inertial reference frames’. A reference frame is a system of coordinates that you use when you perform an experiment. This system fixes the location and orientation of your experiment. An inertial reference frame is one that is not accelerating, i.e., it is either stationary or moving with a constant velocity. So, a car going in a straight line at 50 km/h is an inertial frame of reference. So is a physicist sitting at her desk. The Falcon Heavy during its trip to outer space is not: it accelerates. Neither is the Earth.

The first principle of SR states that physics should look the same in all inertial frames. In essence, if you perform your experiment on your way to work (provided you drive at a constant speed) you’ll get the same results as when you repeat it in your lab.

This also means that there is no ‘absolute’ frame of reference. Say you perform your experiment in a bleak, windowless container. Unbeknownst to you, the container is actually on a moving belt. This moving belt is on a ship on its way to the New World. Do you consider the ship to be your reference frame? Or the belt? Or just the container? It’s kind of an inverse Russian doll situation. But we don’t care. As long as the reference frames are inertial, the physics remains the same and we get the same results either way.

2. The speed of light in vacuum is the same for all inertial frames

This one is slightly tricky because it’s counterintuitive. For any object, the speed you measure depends on the reference frame you’re in. For example, say you’re in a car going at 50 km/h. On the seat next to you are six boxes of pizzas. For you, the pizzas are stationary. For a hungry person at a bus stop, the pizzas are going away from them at 50 km/h. For another car which is coming towards you at 25 km/h, the pizzas are coming towards it at a speed of 75 km/h. So, in general, speed is relative. But for light, we always measure the same speed irrespective of our motion with respect to the light source.
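Incidentally, there is a neat formula behind this: the relativistic velocity-addition law, which reduces to the familiar add-and-subtract arithmetic at everyday speeds but always returns c when one of the speeds is c. Here is a small numerical sketch (a toy illustration in Python, with made-up speeds):

```python
# Relativistic addition of two parallel velocities (in m/s):
# w = (u + v) / (1 + u*v/c^2)
C = 299_792_458.0  # speed of light in vacuum, m/s

def add_velocities(u, v):
    return (u + v) / (1.0 + u * v / C**2)

kmh = 1 / 3.6  # conversion factor: km/h -> m/s

# Everyday speeds: indistinguishable from plain addition.
print(add_velocities(50 * kmh, 25 * kmh) / kmh)  # ≈ 75.0 km/h

# One of the two speeds is c: the result is c again, whatever u is.
print(add_velocities(50 * kmh, C) / C)           # 1.0, i.e. exactly c
```

The correction term uv/c² is so absurdly small at everyday speeds that we never notice it, which is why the pizza arithmetic above works perfectly well for cars.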

The constancy of the speed of light gives rise to a host of interesting results. The one most used in science-fiction is time dilation. And as it turns out, it’s pretty easy to understand time dilation if you understand these two principles of SR. So, let’s give this a shot!

Time Dilation:

Quick side-note before we begin: In special relativity, we assume that gravity plays no role (hence equating a cow to a sphere). Here, time dilation is a result of the velocity difference between two observers. If we consider the full picture, i.e., General Relativity, time dilation can also be caused by gravity. If you’ve seen Interstellar, this is why time runs slower closer to the black hole. It is also the reason that clocks in outer space tick faster than clocks on Earth (if their relative motion with respect to Earth is zero): they sit higher up in Earth’s gravitational well.

                  Fig. 1

A simple thought experiment to understand time dilation is as follows. Consider two scientists Alice and Bob inside their respective spaceships. Both have light-clocks. A light-clock consists of two mirrors at opposite ends of a cylinder. One end also has a light source. The way this clock measures time is by shooting a photon from one end of the cylinder and ‘ticking’ when the photon returns to the same end.

Now back to Alice and Bob. Alice gets tired of trying to convince Bob of the superiority of Firefly and flies away in her spaceship. For Alice, one tick on her light-clock corresponds to the process depicted in Fig. 1, and with some middle-school math, we can calculate the time between ticks.

For Bob, who stares dejectedly at Alice’s ship realizing that he was wrong, the path that the photon takes is given in Fig. 2. Again, employing some simple middle-school math, we can calculate the time between ticks from Bob’s perspective.

                                        Fig. 2

After a bit of algebra, we find that from Bob’s perspective/frame of reference, time appears to be running slower for Alice.

Δt_B = γ·Δt_A

γ = (1 − u²/c²)^(−1/2), so that γ > 1
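For the curious, here is that middle-school math spelled out (a sketch, assuming the light-clock cylinder has length L and Alice moves at speed u relative to Bob). In Alice’s frame, the photon simply travels up and down the cylinder:

Δt_A = 2L/c

In Bob’s frame, the clock moves sideways while the photon is in flight, so in each half-tick the photon traverses the hypotenuse of a right triangle with legs L and u·Δt_B/2. Crucially, by the second principle, the photon still moves at c for Bob:

(c·Δt_B/2)² = L² + (u·Δt_B/2)²  ⟹  Δt_B = 2L/√(c² − u²) = γ·Δt_A

The second principle does all the work here: if Bob could simply add velocities, the photon would cover the longer diagonal path in the same time and nothing interesting would happen.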

So, when Alice sends a passive-aggressive email to Bob with the one season of Firefly (such injustice), her clock would be a little behind Bob’s. By extension, she would’ve aged slightly less than Bob (in Bob’s frame of reference)**.

And that, in principle, is how time dilation works. Keep in mind that this is not just an abstract thought experiment. We actually sent high-precision atomic clocks on plane rides around the Earth and compared their time to the ones on the ground. The measured lag matched the relativistic prediction.
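To get a feel for the numbers, here is a back-of-the-envelope estimate in Python. (A sketch using special relativity only; in the real airborne-clock experiments, gravitational time dilation matters just as much, so this toy calculation won’t reproduce the published numbers.)

```python
import math

C = 299_792_458.0  # speed of light, m/s

def clock_lag(speed_ms, duration_s):
    """Seconds by which a clock moving at speed_ms falls behind a
    stationary one over duration_s (stationary-frame duration)."""
    gamma = 1.0 / math.sqrt(1.0 - (speed_ms / C) ** 2)
    return duration_s * (1.0 - 1.0 / gamma)

# A jet airliner at ~900 km/h, flying for 48 hours:
lag = clock_lag(900 / 3.6, 48 * 3600)
print(f"{lag * 1e9:.0f} ns")  # a few tens of nanoseconds
```

Tens of nanoseconds is tiny, but comfortably within what atomic clocks can resolve.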

Of course, you can’t mention time dilation without talking about the Twin Paradox. But this post has already exceeded its word limit. So, I’ll leave that for the next one.

**For now, we’ve chosen to completely ignore Alice’s frame of reference. If we delve deeper, we’d find that for Alice, Bob would be the one aging more slowly. This is what eventually leads to the twin paradox. More on this in the next post!

TTK Outreach: A Universe of Possibilities Probabilities

January 30th, 2018 | by

The universe may not be full of possibilities (most of it is dark and fatal), but what it does have in abundance are probabilities. Most of us know about Newton’s three laws of motion, especially the third which, taken out of context, apparently makes for a good argument justifying revenge. For centuries, Newton’s laws made perfect sense: an object’s position and velocity specified at a certain time gives us complete knowledge of its future position and velocity, aka its trajectory. Everything was neat and simple and well-defined. So imagine our surprise when we found out that Newton’s laws, valid as they are on large scales, completely break down on smaller ones. We cannot predict with 100% certainty the motion of an atom in the same way that we can predict the motion of a car or a rocket or a planet. And at the heart of this disagreement is quantum mechanics. So today let’s talk about two of the main principles of quantum mechanics: duality and uncertainty.


We begin with light. For a long time, no one seemed to be quite sure what light is. More specifically, we didn’t know if Light was a bunch of particles or a wave. Experiments verified both notions. We could see light interfering and diffracting much like two water waves would. At the same time, we had phenomena such as the photoelectric effect which could only be explained if Light was assumed to be made of particles. It is important to dwell on this dichotomy for a bit. Waves and particles lie on the opposite ends of a spectrum. At any given instant of time, a wave is spread out. It has a momentum, proportional to the speed with which it is traveling, but it makes no sense to talk of a definite, single position of a wave by its very definition. A particle, on the other hand, is localized. So the statement, ‘Light behaves as a wave and a particle’, is inherently non-trivial. It is equivalent to saying, ‘I love and hate pineapple on my pizza’, or ‘I love science fiction and hate Doctor Who.’

But nature is weird. And Light is both a particle and a wave, no matter how counter-intuitive this idea is to our tiny human brains. This is duality. And it doesn’t stop just at Light. In 1924, de Broglie proposed that everything exhibits wave-like behavior. Only, as things grow bigger and bigger, their wavelengths get smaller and smaller and hence unobservable. For instance, the wavelength of a cricket ball traveling at a speed of 50 km/h is approximately 10⁻³⁴ m.
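You can check that number yourself with one line of arithmetic, λ = h/(mv). A quick sketch (the ball’s mass, roughly 160 g, is an assumption):

```python
# De Broglie wavelength λ = h / (m v) for a cricket ball.
h = 6.626e-34        # Planck's constant, J·s
mass = 0.16          # kg (assumed cricket-ball mass)
speed = 50 / 3.6     # 50 km/h in m/s

print(f"{h / (mass * speed):.1e} m")  # ~3e-34 m, i.e. of order 10⁻³⁴ m
```

For comparison, an atom is about 10⁻¹⁰ m across, so this wavelength is hopelessly beyond any conceivable measurement.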

And it is duality which leads us directly to the second principle of quantum mechanics.


The idea of uncertainty, or Heisenberg’s Uncertainty Principle, is simple: you can’t know the exact position and momentum of an object simultaneously. In popular science, this is often confused with something called the observer effect: the idea that you can’t make a measurement without disturbing the system in some unknowable way. But uncertainty is neither a product of measurement nor a limitation imposed by experimental inadequacy. It is a property of nature, derived directly from duality.

From our very small discussion about waves and particles above, we know that a wave has a definite momentum and a particle has a definite position. Let’s try to create a ‘particle’ out of a wave, or in other words, let’s try to localize a wave. It’s not that difficult, actually. We take two waves of differing wavelengths (and hence differing momenta) and superimpose them. At certain places, the amplitudes of the waves add up, and at others, they cancel out. If we keep on adding more and more waves with slightly differing momenta, we end up with a ‘wave-packet’, which is the closest we can get to a localized particle.

Image taken from these lecture notes.
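If you would like to watch the localization happen, here is a small numerical sketch (illustrative numbers only):

```python
import numpy as np

# Superpose 20 plane waves with wavenumbers (momenta) spread around k0.
# A small momentum spread gives a broad packet; a large momentum spread
# gives a narrow, well-localized one.
x = np.linspace(-50, 50, 2001)
k0, n_waves = 2.0, 20

for dk in (0.1, 1.0):
    ks = np.linspace(k0 - dk, k0 + dk, n_waves)
    packet = sum(np.cos(k * x) for k in ks) / n_waves
    # Rough size of the region where the packet has significant amplitude:
    width = np.ptp(x[np.abs(packet) > 0.5])
    print(f"momentum spread {dk}: packet width ~ {width:.0f}")
```

The trade-off is already visible: widening the spread of momenta by a factor of ten shrinks the packet by roughly the same factor.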


Even now, there is a small, non-zero ‘spread’ in the amplitude of the wave-packet. We can say that the ‘particle’ exists somewhere in this ‘spread’, but we can’t say exactly where. Secondly, we’ve already lost information about the exact momenta of the waves we added, so there is an uncertainty in momentum as well. If we want to minimize the position uncertainty, we have to add more waves, implying a larger momentum uncertainty. If we want a smaller momentum uncertainty, we need a larger wave-packet and hence automatically increase the position uncertainty. This is what Heisenberg quantified in his famous equation:

Δx Δp ≥ h/4π
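For the sceptical, the relation can be checked numerically. A sketch for a Gaussian wave-packet, the special shape for which the product Δx Δp is as small as nature allows (units with ħ = h/2π = 1):

```python
import numpy as np

sigma = 1.5                      # target position spread
x = np.linspace(-40, 40, 4096)
dx = x[1] - x[0]

# Gaussian wave-packet, normalized so the probabilities sum to 1.
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

prob_x = np.abs(psi)**2
delta_x = np.sqrt(np.sum(x**2 * prob_x) * dx)      # <x> = 0 by symmetry

# Momentum-space amplitudes from the Fourier transform (p = ħk, ħ = 1).
phi = np.fft.fftshift(np.fft.fft(psi))
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
dk = k[1] - k[0]
prob_k = np.abs(phi)**2
prob_k /= np.sum(prob_k) * dk
delta_p = np.sqrt(np.sum(k**2 * prob_k) * dk)      # <p> = 0 by symmetry

print(delta_x * delta_p)   # ≈ 0.5, i.e. ħ/2 = h/4π: the minimum allowed
```

Any other packet shape, and any attempt to squeeze Δx further, pushes the product above this floor.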

And so we come to probabilities. At micro-scales statements such as, ‘the particle is in the box’, are meaningless. What we can say is, ‘the particle has a 99% probability of being in the box’. From Newton’s deterministic universe (which is still valid at large scales) we transition to quantum mechanics’ probabilistic one where impossible sounding ideas become reality.

The Doctor once said, “The universe is big, it’s vast and complicated, and ridiculous. And sometimes, very rarely, impossible things just happen and we call them miracles.” Or you know, at small enough scales, a manifestation of quantum mechanics. And that is fantastic.

TTK Outreach: A Beginner’s Guide to Dark Matter

January 17th, 2018 | by

In the post-truth society that we live in, it is easy to fall down the rabbit hole of doubting every scientifically held belief. To wonder if NASA is hiding proof of intelligent extra-terrestrial life (they’re not), or if people at CERN are rubbing their hands plotting something nefarious (nope) or whether the Big Bang theory is a Big Bad Lie (it really…isn’t). But don’t worry, we at TTK have got you covered. Every Wednesday we answer your questions live on Twitter and every whenever-this-author-stops-procrastinating-day we give you a more elaborate explanation of some of the most frequently asked questions.

Today on the agenda: Dark Matter — what it is and why you should be reasonably sure of its existence.

Simply stated, dark matter is a kind of matter that doesn’t interact with light. This means we can’t “see” it in the conventional sense. As you would expect, this makes studying dark matter a bit difficult. But if there is one redeeming quality in humankind, it is that we don’t shy away from the seemingly impossible. Of course, the question remains, if we can’t see dark matter and if it doesn’t interact all that much with other things, how do we know that it exists in the first place? The answer comes to you in four parts.

1. Galaxy Rotation Curves

Some of the earliest indirect evidence of dark matter comes from galaxy rotation curves. A rotation curve is a plot of the orbital speed of stars or visible gas present in a galaxy as a function of their distance from the galactic center. If we assume that the total mass of a galaxy is only composed of normal or ‘visible’ matter, the farther we move away from the center (where most of this mass is concentrated), the lower the orbital speeds should get. This is what happens in the Solar System. Since the Sun accounts for most of the mass, the planets farthest from it revolve more slowly than the ones close by.

However, measurements of galactic rotation curves don’t agree with this prediction at all. Instead of decreasing with distance, the orbital speeds of outlying stars appear to either flatten out to a constant value or even increase. This points towards the possibility of an additional contribution to the mass of a galaxy from something we can’t see. Maybe something dark?
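A toy version of the argument, with cartoon numbers (not a fit to any real galaxy):

```python
import numpy as np

# Circular-orbit speed: v(r) = sqrt(G * M(<r) / r)
G = 4.3e-6        # gravitational constant, kpc * (km/s)^2 / M_sun
M_visible = 1e11  # visible mass in solar masses, concentrated at the center

radii = np.array([5.0, 10.0, 20.0, 40.0])  # kpc

# Visible matter only: Keplerian fall-off, v ~ 1/sqrt(r)
v_visible = np.sqrt(G * M_visible / radii)

# Add a dark halo whose enclosed mass grows linearly with radius
# (an assumed toy profile): the curve flattens out.
M_halo = 1e10 * radii
v_total = np.sqrt(G * (M_visible + M_halo) / radii)

for r, v1, v2 in zip(radii, v_visible, v_total):
    print(f"r = {r:4.0f} kpc: visible only {v1:3.0f} km/s, with halo {v2:3.0f} km/s")
```

With visible matter alone the speeds drop by almost a factor of three between 5 and 40 kpc; with the halo they level off, which is qualitatively what the measurements show.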

2. The Bullet Cluster

Bullet Cluster

Another smoking gun for dark matter is the Bullet Cluster. It is composed of two colliding galaxy clusters, the smaller of which looks like a bullet. Galaxy clusters are busy places, and when they collide, chaos ensues. The stars, far apart as they are, mostly survive the collision without a story to tell (aka pass through). The particles present in the galactic plasma, however, smash and ricochet and radiate a lot of energy.

Galactic plasma makes up most of the baryonic (visible) mass of a cluster, so we can derive a mass-profile for the cluster from this radiated energy. We can also model the mass-profile by studying the lensing effects of clusters. Because massive objects bend light, we can figure out their mass distribution by studying how they distort light from objects behind them. If the entire mass of a cluster were just the baryonic mass, these two mass-profiles should coincide. What we find instead is that the two don’t coincide at all. In the image above, the pink regions are where the baryonic mass is present. The blue regions show where the total mass of each cluster is concentrated. The lack of overlap between the two implies the presence of a non-baryonic, invisible source of mass. Moreover, it implies that most of the mass of a cluster is non-baryonic or dark. (In fact, roughly 80% of the universe’s matter content is dark!)

Quick Side Note: Keep in mind that the colors are for purely representative purposes! The radiation emitted by the galactic plasma doesn’t fall within the visible spectrum. Similarly, the blue is where the experiments tell us dark matter is concentrated.

3. Large Scale Structure Formation

Sloan Digital Sky Survey

An interesting question to ask cosmologists is why the universe has structure at all. How do we go from a more or less homogeneous particle soup to well-defined clusters of galaxies, and then even to clusters of clusters of galaxies? The simple answer to this question is fluctuations. Tiny fluctuations right after the Big Bang led to overdensities and underdensities of matter. As the universe expanded, these fluctuations grew on account of gravity, and we ended up with clumps of matter which would eventually form stars, galaxies, galaxy clusters, etc. There is one small problem with this line of reasoning though. We know that the early universe was dominated by radiation (or light). And light exerts pressure. So even as the fluctuations would cause matter to clump, radiation would cause it to homogenize. In the end, the fluctuations would be nearly wiped out and we wouldn’t have the kind of structure that we see today.

Dark matter solves this problem. It is massive and it doesn’t interact with light. Formation of dark matter lumps would aid the ‘clumping’ of normal, baryonic matter and give rise to structure despite the homogenizing effect of radiation.

4. Cosmic Microwave Background


The CMB can be regarded as a picture of the baby universe. And though at first glance it might look like random splotches of paint, it provides deep insights into what the universe looked like billions of years ago. Any cosmological model that we create has to be in agreement with this map. By specifying initial conditions (for instance, the percentages of ordinary matter, dark matter and radiation), we should end up with the density fluctuations observed here. The best model we currently have is ΛCDM. As you might have guessed, the DM here stands for dark matter (the C is for ‘cold’). It is only when we include dark matter in the model that our predictions line up with the data.


These are just a few of the reasons we believe that dark matter exists. And even though we haven’t detected anything like a dark matter particle (yet), everywhere we look the universe seems to suggest that it must be there. If you still don’t understand why you should believe in it (and as a reward for reading these 1000+ words), here’s a (dark) analogy:

The Higgs Boson in 2015

October 23rd, 2015 | by

The Higgs boson was discovered in 2012 at CERN’s Large Hadron Collider (LHC), or more precisely at the LHC experiments ATLAS and CMS. The discovery of the Higgs resonance is definitely a milestone in particle physics, and two of the fathers of the Higgs mechanism, Peter Higgs and François Englert, were awarded the Nobel Prize in Physics in 2013.

After the discovery three years ago, there was immediately a decisive question to be answered: Does the discovered particle have all the properties predicted by the Standard Model (SM) of particle physics? Or, turning the question around in more scientific terms: Are there any statistically significant deviations from the SM predictions that can be identified in the recorded proton-proton collisions? Any such deviation would of course call for physics beyond the SM. Until the end of LHC run 1 at the beginning of 2013, there were great efforts to collect as many Higgs events as possible. And similar efforts have been invested in recent years to extract as much information as possible from the recorded data.

Higgs coupling strength measurements

Signal strength measurements for different Higgs-production channels, where a signal strength of 1 is the SM expectation (taken from ATLAS CONF-2015-044)

Only recently has the Higgs legacy of run 1 been finalized, with the combination of the ATLAS and CMS data. The so-called signal strength for different production channels, together with the global signal strength, is shown in the diagram on the right relative to the SM prediction. So far, the discovered particle does not give any hints for new physics beyond the SM. It simply looks more or less as predicted decades ago.

For these and similar measurements, the interplay between theory predictions and the experimental analysis is most crucial. In mid-October, experimentalists and theorists working on Higgs physics met at the conference “Higgs Couplings 2015”, which was hosted by the IPPP in Durham and took place in the beautiful medieval Lumley Castle close to Durham. The latest run 1 measurements were presented and discussed. But run 1 is already part of the past. Everybody is looking forward to seeing the measurements from run 2 and gearing up for the upcoming analyses.

Run 2 has already started this year with the record-breaking proton-proton energy of 13 TeV (run 1 provided collisions at 7 and 8 TeV). In 2015, a year to learn how to operate at the record-breaking energy and with collisions every 25 nanoseconds, not enough data will be recorded to make a major step forward in the precision of Higgs measurements (this is very different for other new-physics searches, e.g. for multi-TeV resonances). However, the coming years of run 2 will certainly be exciting for Higgs physics.

So far, measurements are still statistically limited, i.e. by the relatively small number of recorded Higgs bosons. However, residual uncertainties within the theoretical predictions will soon enter as a major player in the quest for ultimate measurements of Higgs properties, and therefore also in the quest for physics beyond the SM in the Higgs sector. Hence, improving theory predictions and making them available for the analysis of the data is more important than ever in the field of Higgs physics, and is one of the research topics at our institute. Conferences like “Higgs Couplings 2015” provide an important forum for discussions on these topics between experimentalists and theorists.

So, let’s see what nature will teach us about the Higgs in the coming years.

Exploring dark matter with IceCube and the LHC

October 2nd, 2015 | by

Various astrophysical and cosmological observations point towards the existence of dark matter, possibly a novel kind of fundamental particle, which does not emit or reflect light, and which only interacts weakly with ordinary matter.


If such a dark matter particle exists, it can be searched for in different ways: direct detection looks for the elastic scattering of dark matter with nuclei in highly sensitive underground experiments, as Earth passes through our galaxy’s dark matter halo. Indirect detection experiments on Earth or in space look for cosmic rays (neutrinos, photons, or antiparticles) from the annihilation of dark matter particles in the centre of the Galaxy or in the Sun. And last but not least, if dark matter interacts with ordinary matter, it may be produced in high-energy proton collisions at the LHC.

To explore the nature of dark matter, and to be able to combine results from direct, indirect and collider searches, one can follow a more model-independent approach and describe dark matter and its experimental signatures with a minimal amount of new particles, interactions and model parameters. Such simplified or minimal models allow us to explore the landscape of dark matter theories, and serve as mediators between the experimental searches and more complete theories of dark matter, like supersymmetry.



About 10²⁷ dark matter particles per second may pass through the Sun. They can lose some energy through scattering off protons and eventually be captured in the core of the Sun by the Sun’s gravitational pull. Dark matter particles in the Sun would annihilate with each other and produce ordinary particles, some of which decay into neutrinos. Neutrinos interact weakly with matter, can thus escape the Sun, and could be observed by the neutrino telescope IceCube near the South Pole. Neutrinos therefore provide a way to search for dark matter in the core of the Sun.
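The balance between these two processes is often summarized in a simple rate equation (a standard textbook sketch, not specific to any one analysis). If N is the number of dark matter particles accumulated in the Sun, C the capture rate and A the annihilation strength, then

dN/dt = C − A·N²

with the solution N(t) = √(C/A)·tanh(√(C·A)·t). After enough time, N settles at the value where annihilations exactly balance captures, so the annihilation rate, and with it the neutrino signal IceCube looks for, is set directly by how efficiently the Sun captures dark matter.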

At the LHC, dark matter may be produced in high-energy proton collisions. As dark matter particles interact at most weakly with ordinary matter, they would leave no trace in the LHC detectors. However, dark matter (and other novel weakly interacting particles) can be detected by looking at exotic signatures, where a single spray of ordinary particles is seen without the momentum and energy balance characteristic of standard particle collisions (so-called mono-jet events, see right figure).


We have recently joined forces with members of the RWTH IceCube team to explore dark matter searches through neutrinos from the Sun and through dark matter production at the LHC. We have considered a minimal dark matter model where we only add two new particles to the ordinary matter: a new dark matter fermion, and a new force particle which mediates the interaction between the dark matter fermion and ordinary matter. As no signal for dark matter has been observed, we can place limits on the masses of the dark matter particle and the new force particle, see figure to the left. We find a striking complementarity of the different experimental approaches, which probe particular and often distinct regions of the model parameter space.

Thus only the combination of future collider, indirect and direct searches for dark matter will allow a comprehensive test of minimal dark matter models.

Anticipating Discoveries: LHC14 and Beyond

July 17th, 2015 | by

PhD students Leila Ali Cavasonza and Mathieu Pellen report from the workshop “Anticipating Discoveries: LHC14 and Beyond”

A few months ago, the Large Hadron Collider (LHC) in Geneva woke up from a long shut-down phase. It is now operating at a centre-of-mass energy of 13 TeV (it might reach 14 TeV in the upcoming phases). This is the first time in the history of humanity that particles have been collided at such high energy in a machine built by humans.
Run II of the LHC is thus just starting, once again raising the excitement in the particle physics community. It is the right time, then, to discuss which particles or theories could be discovered by the ATLAS and CMS detectors. In this spirit, a topical workshop organised by the Munich Institute for Astro- and Particle Physics (MIAPP) was held in Munich: “Anticipating Discoveries: LHC14 and Beyond”, from the 13th to the 15th of July.


In the last few days, the LHCb collaboration has claimed the discovery of the so-called pentaquark. This is an extraordinary discovery, but particle physics theorists are after another kind of particle. Indeed, this pentaquark (a composite object made of 5 quarks, see picture to the right) was predicted many years ago by quantum chromodynamics (QCD) but had never been observed until now. What theorists are looking for are theories beyond the standard model. These are introduced to explain experimental and theoretical problems. In general, they predict new resonances or effects that can be traced by experimental collaborations.



During this workshop many theories, or extensions of previous ones, have been proposed. In particular, since the discovery of the Higgs boson, extensions of the Higgs sector are under high scrutiny. The beautiful theory of supersymmetry, which predicts a special relation between bosons and fermions, is still much discussed.
In particular, extensions of its minimal version have been proposed. Finally, as we know there is a huge amount of unexplained, invisible matter in our Universe, the so-called dark matter, it is justified to propose myriads of models that could explain various anomalies. During these three days in particular, several theories involving a non-abelian structure of the dark sector were presented. These have a distinctive phenomenology at very different scales and are currently being tested against observations.
Many theories have thus been discussed, and all theorists are craving to find signs of their favourite theory at the next LHC run. The kind of signs they are looking for is similar to the one reported by the ATLAS and CMS collaborations: the experimental collaborations have made public an excess in the Z/W channels (see picture on the left), especially in the channel where the gauge bosons decay into two jets. The future will tell us whether this is a sign of hope and the beginning of a new exciting era.


Leila and Mathieu

Axions, WIMPs or WISPs? Searching for dark matter

July 3rd, 2015 | by

PhD student Mathieu Pellen reports from a dark matter workshop in Zaragoza.

The quest for the understanding of dark matter is certainly one of the greatest challenges of the 21st century. It is thus an extremely hot topic in the particle physics community. 


The 11th Patras Workshop on Axions, WIMPs and WISPs was held in the beautiful and hot city of Zaragoza, Spain (21-26 June 2015). As the title indicates, the focus was on dark matter, and more particularly on axions.

Axions were originally proposed to solve the strong CP problem. They are light particles (with masses of the order of an electron-volt or even lighter). These can be detected in light-shining-through-a-wall experiments or in low-background underground laboratories like the one at Canfranc (which was one of the highlights of the conference). During the conference, several innovative experiments looking for axions, axion-like particles or dark photons were presented. New mechanisms predicting the existence of light particles were also proposed.

In addition to light particles, Weakly Interacting Massive Particles (WIMPs) are the best motivated solution to account for the dark matter observed in our Universe. WIMPs are studied in three different ways: the first is their production at collider experiments such as the Large Hadron Collider (LHC, Geneva). The second is the detection of nuclear recoils produced by dark matter particles scattering on heavy nuclei in underground facilities such as the Gran Sasso laboratory in Italy. Finally, when two dark matter particles annihilate in the galaxy, they produce cosmic rays of standard model particles. These can be detected in satellite-based experiments such as the Alpha Magnetic Spectrometer (AMS-02, partly built at RWTH Aachen University) on the International Space Station (ISS).

My contribution to the conference focused on the last possibility. I reported exciting results on a project carried out with Leila Ali Cavasonza and Michael Krämer. Indeed, AMS-02 has reported an excess in the measurement of the positron flux (red data points, left figure) compared to standard expectations from astrophysical sources (green curve, left figure). This has triggered a lot of interest recently. The reason is that anti-particles are an extremely interesting observable when searching for dark matter: they are rarely produced by standard astrophysical sources. Thus the discovery of excesses in anti-particle fluxes could already be a smoking gun for the existence of dark matter. Nowadays, the dark matter contribution is believed to be sub-dominant in the AMS-02 observations. However, the absence of a “bump” – as expected from a dark matter signal – in the very smooth AMS-02 spectrum is a great opportunity to set limits on dark matter annihilation cross sections.

We have derived new upper limits on the annihilation cross section using a new method that allows us to study dark matter with masses ranging from several TeV down to 1 GeV. In particular, we have focused on the impact of massive electroweak gauge bosons on these limits. Even if their contributions are limited, they are of prime importance as they produce all standard model particles when decaying. I have thus shown that there is a promising complementarity between different fluxes of anti-particles. This opens up new ways to exclude or find dark matter in the next few years using indirect detection.

Why the Large Hadron Collider is good for Philosophy

June 14th, 2015 | by

Guest post in Jon Butterworth’s Guardian blog: The philosophy of the Large Hadron Collider

There have been many tedious and futile discussions about the value of philosophy for modern science. I find it much more interesting and fruitful to ask if and in what way modern science can advance philosophy. The complexity, the new challenges and the new methods that arise in modern science in general – and at the LHC in particular – raise a number of questions that concern core issues of philosophy of science: what are the methods of acquiring knowledge, what is the role of models, and how does the intricate relationship between theory, computer simulations and experimental data work? The LHC has been built for fundamental physics, but it will also challenge and advance the philosophy, sociology and history of science!

See the full text at: The philosophy of the Large Hadron Collider

Exploring dark matter through electroweak radiation

May 1st, 2015 | by

PhD student Leila Ali Cavasonza reports on her recent work on indirect dark matter searches.

Investigating the nature of Dark Matter is certainly one of the most compelling and exciting goals of particle and astroparticle physics nowadays, both on the experimental and on the theoretical side.

It is now almost universally assumed that dark matter consists of one or more new particles. According to the observations, such a new particle is neutral, non-relativistic, massive, weakly interacting and has a small self-interaction cross section. Among the most prominent candidates are the so-called Weakly Interacting Massive Particles (WIMPs).

These particles could interact with ordinary matter and be detected in the so-called direct detection experiments. Or they could be produced via annihilation of standard particles and discovered at colliders. Or they could annihilate into standard particles like photons, electrons and positrons or neutrinos and produce an excess in the fluxes of standard model particles observed in cosmic rays.


Indirect detection experiments, like the Alpha Magnetic Spectrometer (AMS, see left figure), measure the composition and the fluxes of cosmic rays with very high precision in order to detect such possible excesses.

The AMS experiment has actually found a significant excess in the positron flux, which is at the moment not explained by standard astrophysics (figure below). On the other hand, no excess has been observed, for example, in the antiproton flux.


The positron fraction measured by AMS (red circles) compared with the expectation from ordinary cosmic-ray collisions.

To explain this situation, the so-called leptophilic models for dark matter have been introduced: according to these models, the dark matter particles can annihilate only into leptons, like electrons and positrons, but not into hadrons, like antiprotons.

However, the situation is not so simple. A very energetic electron produced via WIMP annihilation can actually radiate a Z or W boson, which in the end produces all stable standard particles, including antiprotons. To have a consistent picture and accurate predictions for the AMS experiment within leptophilic models, it is then necessary to take into account the electroweak radiation off the standard model final state.

The so-called fragmentation-function approximation has been developed to include these contributions in a simple and model-independent way. In our paper arXiv:1409.8226 we analyse the quality of this approximation. In particular, we produce predictions for the AMS experiment within a simple leptophilic model, including the electroweak radiation in the complete theory and with the approximation, respectively. We then compare the predicted fluxes to understand how reliable the approximation is. It turns out that for some models the approximation is not valid. On the other hand, when it is valid, the approximation is very reliable and it is possible to obtain accurate predictions in a faster and simpler way.

Simplified models for new physics

February 13th, 2015 | by

Jan Heisig and Jory Sonneveld report on their recent work on simplified models

With about a petabyte of data processed in Switzerland every day, the Large Hadron Collider (LHC) provides an enormous amount of information on high energy physics processes. This information is used to test theories beyond the Standard Model of Particle Physics — theories that are motivated either by outstanding theoretical problems or by experimental evidence, as in the case of dark matter. While experimentalists work their way through the data, theorists line up to convince them to search for their favorite model in the currently collected 20 fb⁻¹ of data.

It is impossible for experimentalists to search for each possible model theorists come up with. This is why they often try to search for simple characteristics that represent a larger class of possible new models of physics. One new model of physics, supersymmetry, for example, predicts new spin-0 (scalar) quarks, or squarks (supersymmetric quarks), among many other particles. These new squarks decay to a quark and a so-called neutralino (see Feynman diagram on the left), which in many models of supersymmetry is assumed to be the lightest supersymmetric particle.



What would be seen at the LHC if supersymmetry were realized in nature? As the neutralino is a neutral, stable particle, it is invisible to the detector. But as it carries away energy and momentum, it can be reconstructed from the missing energy in an event. However, in order to recognize that energy is missing, we have to measure the visible particles the neutralino recoils against. If at the LHC a pair of squarks is produced in the collision of two protons and both squarks decay into a quark and a neutralino, we would see events with two quarks (reconstructed as “jets” in the detector) and missing energy from the invisible neutralinos. This is an important signature that is looked for at the LHC.
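The reconstruction logic can be made concrete in a few lines: the protons collide head-on along the beam axis, so the total momentum transverse to the beam is zero before the collision and must still be zero afterwards. Whatever is missing recoils against the visible jets. A minimal sketch with a hypothetical event (made-up numbers):

```python
import math

# Visible jets in a hypothetical event: (transverse momentum pT in GeV,
# azimuthal angle phi in radians). Made-up numbers, purely illustrative.
jets = [(450.0, 0.3), (380.0, 2.9)]

# Transverse momentum balance: the invisible system carries the negative
# of the visible vector sum.
px = -sum(pt * math.cos(phi) for pt, phi in jets)
py = -sum(pt * math.sin(phi) for pt, phi in jets)

print(f"missing transverse momentum ~ {math.hypot(px, py):.0f} GeV")
```

A large value of this quantity, in an event with energetic jets, is exactly the jets-plus-missing-energy signature described above.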

How could we interpret the presence or absence of a signal in the search for jets and missing energy in a specific model? This is not trivial, since the significance of the search in general depends on details of the model. For instance, as supersymmetry has a huge number of free parameters, the significance of the search can in principle depend on the masses of supersymmetric particles other than the squark and neutralino. In this article we investigated this question, studying to what extent the “simplified squark model” (left figure) introduced by the experimental collaborations can be used to draw conclusions about more general supersymmetric models, where the production is also mediated by a gluino, the supersymmetric partner of the gluon (as shown in the Feynman diagram on the right).

In addition to supersymmetry, there are many other possible models of physics beyond the Standard Model. One such model postulates extra spatial dimensions (see also notes by various speakers at the TASI Lectures). It also predicts new quarks (see Feynman diagram on the right), but this time particles with spin 1/2: this means they have the same spin as the Standard Model quarks. We can call this model a same-spin model. Could we also use the results for a supersymmetric simplified squark model to say something about excluded masses of quarks and the lightest particles of a same-spin model? It turns out that we can.

We theorists then continue to use results from searches for simplified models and apply these to our favorite models of physics beyond the Standard Model. Many tools exist for exactly this purpose; one example is SModelS. We look forward to the fresh start of the LHC this year!