Archives For subatomic particles

[image: particle decay at the LHC]

Once upon a time, we looked at an explanation for dark matter involving a theory about how all matter around us could decay over 6.6 × 10^33 years and noted that there’s a controversy as to whether protons actually decay. To help settle this, astronomers took advantage of the fact that telescopes are relativistic time machines, and peered through them at a galaxy known as PKS 1830-211 — a name only a scientist could love — that just so happens to be a gravitational lens allowing us to see some 7 billion years back. To be a bit more precise, it lets us look at clouds of alcohol molecules formed eons ago in deep space and compare their spectrum to that of booze analyzed in a lab right here on Earth. Don’t worry, no hard liquor was harmed in the process as the alcohol in question is methanol, the kind used in fuel and manufacturing, and which causes blindness if ingested, not the ethanol in which we can indulge. But even if no buzz was killed for the sake of science, what exactly does looking at the light spectra of alcohol tell us about how our universe formed and its possible fate many quadrillions of years from now?
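For the curious, the “7 billion years back” figure follows from the redshift of the methanol-bearing galaxy in front of PKS 1830-211, which sits at roughly z ≈ 0.89. As a minimal sketch, assuming astropy and its built-in Planck18 cosmology, you can turn that redshift into a lookback time:

```python
# Convert the redshift of the absorbing galaxy in front of PKS 1830-211
# (z ~ 0.89) into a lookback time. Assumes astropy is installed.
from astropy.cosmology import Planck18

z_absorber = 0.89  # approximate redshift of the methanol-bearing lensing galaxy
t_lookback = Planck18.lookback_time(z_absorber)
print(f"Lookback time at z = {z_absorber}: {t_lookback:.2f}")
# -> roughly 7.3 Gyr: the light left those methanol clouds some 7 billion years ago
```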

Well, the spectrum of a molecule depends on μ, the ratio of the proton’s mass to the electron’s. That’s an extremely important metric because it lets us probe the strong force, one of the fundamental interactions of matter and the one responsible for building atomic nuclei. Because the proton’s mass comes almost entirely from the strong force interactions binding its constituent quarks, if μ drifts below or above its measured value of 1,836.15267245(75) and the difference is reproducibly recorded, we can say that something changed the effect of this fundamental force on matter. Hence, if the 7 billion year old methanol emits an appreciably different spectrum from methanol we create today, this would mean that one of the fundamental forces has changed as the universe grew and matter is decaying on cosmic time scales. Lucky for us, it turns out that atoms are very much stable, since the spectrum of methanol was for all intents and purposes identical over 7 billion years, which is just over half of the way back to the Big Bang itself.
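For a rough picture of how the comparison works: each methanol transition shifts under a drifting μ by an amount set by its own sensitivity coefficient, so lines with different coefficients would disagree about the absorber’s redshift. Here’s a toy sketch of that logic with entirely made-up numbers; the real analysis in the paper is far more careful:

```python
import numpy as np

# Toy model: each line's apparent redshift picks up a term proportional to its
# sensitivity coefficient K if mu drifts: z_app ~ z - (1 + z) * K * dmu/mu.
K = np.array([-1.0, -7.4, -0.1, 0.0])   # hypothetical sensitivity coefficients
z_true = 0.89                            # common redshift of the absorber
dmu = 0.0                                # fractional drift in mu; try 1e-7

z_app = z_true - (1 + z_true) * K * dmu

# A drift shows up as a linear trend of apparent redshift against K.
slope, intercept = np.polyfit(K, z_app, 1)
print(f"inferred dmu/mu = {-slope / (1 + intercept):.2e}")  # ~0 for identical spectra
```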

This tells us a couple of things about the fate of the universe. The first is that the Standard Model of physics is still accurate and can make viable predictions about atomic structure and decay. The second is that matter will continue to be matter at the end of the universe, or decays so slowly that it would only matter on time scales far exceeding the lifetimes of supermassive black holes. Finally, it allows us to rule out overly exotic explanations for the origins of dark matter involving the decay of particular subatomic particles or quirky behavior of the strong force, since these results match a number of previous experiments designed to find out the same thing. In a universe flying apart, churning with explosions, collisions, and radiation, it’s nice to know that the matter that makes up you and the planet on which you live isn’t also slowly decaying on you like a ticking cosmic time bomb. And while space may be out to get you through GRBs, asteroids, and huge galactic train wrecks, it will at least spare the very fabric of your existence.

See: Bagdonaite, J., et al. (2012). A stringent limit on a drifting proton-to-electron mass ratio from alcohol in the early universe. Science. DOI: 10.1126/science.1224898


Few things in particle physics seem to be as elusive as the Higgs boson. After many years of smashing a lot of particles, we’ve been able to narrow its probable mass down to about 125 GeV, about five times less than predicted by a number of scientists at the beginning of the search. With CERN staying vague about whether it got even a smidgeon closer to finally finding the particle, and its potential habitat getting smaller and smaller, so much so that at this point one starts to wonder if it’s actually too light to impart mass all by itself, maybe we should start considering a future without the boson as the linchpin of mass. What if it’s really not so much the definitive result of the standard model, but actually a placeholder for something far more complex? What if we have to look at the data collected by the LHC again, going over it with a fine-tooth comb for anything anomalous, or run new experiments to make sure that we detected every bit of subatomic shrapnel blasting out of the high energy collisions? Far from marking the LHC as a failure, it would actually greatly boost its importance…

Here’s an important thing to keep in mind. An unfortunate truth with which scientists are often faced is that the public at large has been conditioned to believe that every project has two possible outcomes. You set out to find or build something and you either succeed or fail. Projects are evaluated for how well they achieve a goal, not for how the knowledge gained through attempting them can be applied elsewhere. Yet, in the research world, failure isn’t only an option, but sometimes, a very desirable one, because the post-mortem of your attempt to prove a new hypothesis or test an old one can yield new ideas and new approaches. Patent offices are filled with all sorts of innovations that came from accidents, failures, or discarded efforts. Who cares if you didn’t prove that some plant in a distant rainforest can be used as an ingredient in a moisturizing skin cream if it turns out to have an aggressive effect against melanoma? The AI field has so far failed to create an artificial mind, yet the effort has produced complex signal processing algorithms, and its questions have spilled over into fascinating new research into how the human brain actually works. We didn’t find the Higgs boson? Too bad, so sad. But now we have one very bizarre and complex mystery on the origins of mass and a few trillion data points to crunch.

All right, so if there’s no Higgs boson, what else could there be? Well, there’s a myriad of ideas out there, and they range from the incredibly exotic to hinting at some unification between quantum mechanics and our run-of-the-mill standard model particles. In one scenario from a set of theories known as Technicolor Models, an interaction between W and Z bosons breaks symmetry and generates mass. This approach relies on what’s known as confinement, the idea that quarks are tied to each other by charges that prompt them to appear as jets of particles, clumping together into baryons and mesons, or three-quark particles and quark-antiquark pairs in plain English. Since this symmetry breaks at around 250 GeV, roughly twice the new upper bound set for a wild Higgs to appear out of the particle showers, it would mean that we’ve already gone below the point where the appearance of said wild Higgs boson would falsify this hypothesis. It also gives us a new target area for further study should the Higgs fail to materialize completely, though how to evaluate confinement at such a small scale is something best left for a professional physicist to consider, since the equations involved aren’t for the faint of heart, to put it mildly, and the mechanics of the experiment are nothing to take lightly either.
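As an aside, the ~250 GeV figure isn’t arbitrary: it’s essentially the electroweak symmetry-breaking scale, which follows directly from the measured Fermi constant. A one-line check using the standard relation:

```latex
v = \left(\sqrt{2}\,G_F\right)^{-1/2}
  = \left(\sqrt{2}\times 1.166\times 10^{-5}\ \mathrm{GeV}^{-2}\right)^{-1/2}
  \approx 246\ \mathrm{GeV}
```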

But despite the fact that physics could jettison the Higgs if it fails to appear, one prominent theologian across the pond adopted the search for the boson as his justification for belief in a deity, declaring that physics is filled with esoteric and exotic proposals like dark matter and dark energy, posits ideas that cannot be proven, and believes that the Higgs exists solely because it’s necessary to make the equations work. Of course our theologian in question, Alister McGrath, is woefully mistaken on all counts. If physicists really believed that the universe contains the Higgs boson in the same way theists believe in a deity, they wouldn’t have built the LHC in a very complex and expensive attempt to prove whether it exists, and as pointed out above, there are other ways of making the standard model work on paper. Rather than simply looking to confirm their beliefs, scientists at the CERN labs are putting their theories to the test, and they’ll move on should they fail to summon the Higgs, rather than doing what McGrath does: shrugging, declaring that some things just can’t be proven, and concluding that since the math works, the Higgs boson must exist because it makes the math work, a textbook example of circular logic. Though explaining why they’re moving on would be an uphill fight against those stuck in a very black and white, I-want-to-believe kind of mindset…


As Professor Farnsworth would say, shocking news everyone! A new experiment says that we’ve consistently been overestimating the size of protons by about 3 × 10^-14 millimeters, and the physicists who measured this discrepancy by tracking the motions of electrons’ much heavier siblings are clutching their chests in fear that they might’ve broken a law of physics. To us, who tend to measure the world in meters and kilometers, a tiny fraction of a millimeter might not sound like much, until we remember that protons are really small. With a radius of just around a quadrillionth of a meter and a mass so small it’s measured in electron volts, even the slightest correction to a proton’s size makes a notable difference, especially as far as particle physics goes. Smaller protons mean that our calculations of electromagnetic forces on a quantum scale could be off and in need of some serious updating, possibly with some adjustments to the existing equations of the standard model.
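To sanity-check those figures, take the radii in play around this result, roughly 0.877 femtometers from earlier determinations versus roughly 0.842 femtometers from the new measurement, and run a quick back-of-the-envelope:

```python
# Back-of-the-envelope check on the numbers above: ~0.877 fm from prior
# determinations vs ~0.842 fm from the muonic hydrogen measurement.
r_old_fm = 0.877
r_new_fm = 0.842

delta_mm = (r_old_fm - r_new_fm) * 1e-12   # fm -> mm (1 fm = 1e-12 mm)
shrinkage = (r_old_fm - r_new_fm) / r_old_fm

print(f"difference: {delta_mm:.1e} mm")    # ~3.5e-14 mm
print(f"shrinkage:  {shrinkage:.1%}")      # ~4%
```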

But wait a second. Haven’t we been smashing atoms in particle colliders for the past few decades to test the standard model and found it to be quite accurate? Well, yes. But that’s the thing about scientists. They always want to keep testing. And in this case, a team of physicists wanted to home in on the exact size of protons with an interesting experiment in which muons orbit protons in an approximation of a hydrogen atom while their energy levels are measured. We’ve encountered energy levels of electrons in atoms before when we talked about the tweak to the Lyman α surveys which failed to find some 90% of a vast spider web of galaxies astronomers expected to see in the night sky. Looking just one level up at the Hα line seems to have solved the whole problem. Likewise, when trying to measure the size of tiny particles, the energy levels of electron clouds, which depend on the size of the particles they orbit, can tell us just how big a subatomic particle might be. That’s why this experiment used muons. Muons are identical to electrons except for their mass. Being some 200 times heavier, they orbit far closer to the proton and are much more sensitive to its size than their lighter cousins. And according to the muons’ behavior, protons are 4% smaller than expected.
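The scaling at work here is textbook physics rather than anything specific to this experiment: the Bohr radius of a hydrogen-like atom shrinks in proportion to the orbiting particle’s reduced mass, so a muon sits nearly 200 times closer to the proton, where the proton’s finite size matters far more. A minimal sketch with standard values:

```python
# Bohr radius scales as 1/(reduced mass), so a muon orbits ~186x closer
# to the proton than an electron does. Standard textbook values below.
m_e, m_mu, m_p = 0.511, 105.66, 938.27   # particle masses in MeV/c^2

def reduced_mass(m_orbiter, m_center):
    return m_orbiter * m_center / (m_orbiter + m_center)

a0_hydrogen = 5.29e-11   # ordinary hydrogen's Bohr radius in meters
a0_muonic = a0_hydrogen * reduced_mass(m_e, m_p) / reduced_mass(m_mu, m_p)

print(f"muonic hydrogen radius: {a0_muonic:.2e} m")  # ~2.8e-13 m, ~186x smaller
```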

How do we reconcile a 4% error with hugely successful particle collider experiments showing that the standard model should have a pretty good handle on certain basics, like the near exact sizes and masses of fermions? The researchers aren’t rushing to claim that they’ve shown a problem in particle physics just yet. Instead, they believe they might be wrong in their calculations and measurements, or that there’s something else at play when protons are orbited by muons rather than electrons. Of course the ultimate thrill would be to find that the standard model of particle physics is incomplete; rather than once again confirming what the model could predict and slowing particle physicists’ research projects to a crawl, colliders might then be used for new and intriguing insights into the world of elementary and subatomic particles. But that will only happen if physicists can show that subatomic particles really don’t match the equations or our previous measurements. It may be that the proton experiment is an odd fluke, or a result of errors in measurement on a scale at which the slightest imprecision can invalidate your work, but it would be far more interesting if there’s nothing wrong with the data and physicists have a chance to further refine their ideas of how the quantum world works with new experiments and better, more precise tools.

See: Pohl, R., et al. (2010). The size of the proton. Nature, 466, 213-216. DOI: 10.1038/nature09250


Last week, I wrote about the newest addition to the periodic table of elements, a synthetic atom created with a heavy ion accelerator for the purpose of seeing how many protons and neutrons you could squeeze together and still have a nucleus that can hold itself together, if only for an instant. And in the case of Element 112, we really do mean an instant, since its half-life is just a few hundred microseconds. But why do atoms generally tend to have shorter and shorter half-lives as their atomic numbers get higher and higher? And why do just a few upward notches along the way produce radical differences in how the resulting elements behave?

[image: atomic nucleus]

When we mention the atomic number of an element, what we’re really talking about is the number of protons in its atoms’ nuclei. Of course, you can’t just put protons together and expect to create a stable structure. The particles all carry the same positive charge, which means that the electrostatic repulsion between them would be too much for the strong nuclear force to overcome on its own. You need neutrons to keep the nucleus intact, and as the number of protons increases, you’ll need more and more neutrons to balance out their mutual repulsion. Eventually, as you move down the periodic table, you’ll find bismuth, which has 83 protons and 126 neutrons, a configuration long considered the heaviest stable nuclear composition, though bismuth-209 has since been found to decay with a half-life of roughly 2 × 10^19 years. After that, all the elements we’re aware of become radioactive, starting with the 84 proton and 125 neutron polonium.
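To make that trend concrete, here’s a quick look at the neutron-to-proton ratio for a few familiar nuclides, with values from the standard isotope tables:

```python
# Neutron-to-proton ratios for some well-known nuclides (Z = protons, N = neutrons).
# Light stable nuclei sit near N/Z = 1; the heaviest need ~1.5 neutrons per proton.
nuclides = {
    "helium-4":    (2, 2),
    "carbon-12":   (6, 6),
    "iron-56":     (26, 30),
    "lead-208":    (82, 126),
    "bismuth-209": (83, 126),
}

for name, (Z, N) in nuclides.items():
    print(f"{name:>12}: N/Z = {N / Z:.2f}")
```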

Ordinarily, in a stable atom, the strong nuclear force keeps the nucleus intact. But in a radioactive atom, there’s not enough binding energy to stop it from decaying. The heavier the atom and the more protons it has, the harder it is to stabilize, and as the atomic number creeps upward, we cross certain thresholds in just how stable the isotopes we discover can be. This is why just one more proton creates a radioactive substance, another ten create elements with drastically shorter half-lives than the ones preceding them, and eleven more cut those half-lives down further still.
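One way to see those thresholds is the textbook semi-empirical mass formula, which pits the strong force’s bulk attraction against the Coulomb repulsion that grows with the square of the proton count. This is a rough model with standard coefficients, not anything from the element 112 work itself:

```python
# Rough binding energy per nucleon from the semi-empirical mass formula.
# The Coulomb term (~Z^2) eventually overwhelms the volume term (~A), which is
# why ever-heavier nuclei are harder to hold together. Pairing term omitted.
def binding_per_nucleon(Z, A):
    aV, aS, aC, aA = 15.75, 17.8, 0.711, 23.7   # textbook coefficients in MeV
    B = (aV * A
         - aS * A ** (2 / 3)
         - aC * Z * (Z - 1) / A ** (1 / 3)
         - aA * (A - 2 * Z) ** 2 / A)
    return B / A

for name, Z, A in [("iron-56", 26, 56), ("bismuth-209", 83, 209), ("element 112", 112, 277)]:
    print(f"{name:>12}: {binding_per_nucleon(Z, A):.2f} MeV per nucleon")
# binding peaks near iron (~8.8 MeV) and slides downhill for the superheavies
```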


Sometimes, when you read articles like this, you feel like you might as well be reading The Onion instead. A universe filled with subatomic particles spanning the same swaths of space as galactic archipelagos? Isn’t that an oxymoron of cosmic proportions? Maybe somebody should check on what those physicists are doing in their labs and make sure they’re all right? But believe it or not, there’s actually some serious science in the idea that the first neutrinos created after the Big Bang might now be truly immense objects. It just got lost in a writer’s search for a catchy headline and a reporting strategy designed to elicit a lot of raised brows.

[image: neutrinos]

The question that some physicists have been asking is whether the very first subatomic particles, created just a few fractions of a second after the Big Bang, could have been caught up in the rapid expansion of space and time and stretched out into objects billions of light years across. At first glance, the answer would seem to be no, because as space expands, objects stay the same size and shape while the distance between them increases. As our universe expands, galaxies don’t get larger. They just end up farther and farther apart. Why would your garden-variety subatomic particle do something different?

And that’s where we step into the bizarre world of quantum mechanics. Neutrinos can behave both as waves and as particles. In the presence of gravity, they collapse into a form we can identify and try to measure. But even then, their exact size is somewhat fuzzy and depends on a variety of factors. Depending on how you reconcile those two behaviors and what variables you put into your equations, you can get all sorts of strange and surprising results. One of these results would be neutrinos being caught up in the post-Bang inflation and stretching out into vast blobs that have a tiny mass, no electric charge, and would be extremely difficult to find even when they interact with other matter or collapse under the gravitational pull of a galaxy.
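To get a feel for how a quantum object could end up so large, consider ordinary wavepacket spreading: a free particle’s position uncertainty grows roughly as ħt/(mΔx₀) over time t, and with a neutrino-scale mass and the age of the universe plugged in, the numbers get enormous. Every value below is hypothetical and purely illustrative; the actual paper’s treatment is far more involved:

```python
# Toy estimate of free wavepacket spreading: dx(t) ~ hbar * t / (m * dx0).
# All numbers are hypothetical, chosen only to show the scales involved.
hbar = 1.055e-34            # Planck's constant over 2*pi, in J*s
eV_to_kg = 1.78e-36         # mass equivalent of 1 eV/c^2 in kilograms
m_nu = 0.1 * eV_to_kg       # assumed neutrino mass of 0.1 eV
t = 4.35e17                 # ~13.8 billion years in seconds
dx0 = 1e-6                  # assumed initial packet size: one micron

dx_now = hbar * t / (m_nu * dx0)
light_year = 9.46e15        # meters per light year
print(f"spread today: {dx_now / light_year:.1e} light-years")  # tens of billions
```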

Now, if you remember the science news cycle, you know that a story listing a number of possibilities with an enormous list of complex conditions and variables isn’t going to be nearly as interesting to the public as one with a contrarian headline touting some bizarre scientific curiosity the same way P.T. Barnum announced his shows. And so we end up with a story that talks about the universe being filled with subatomic particles bigger than clusters of galaxies without putting the hypothesis in the proper perspective.

While there is a note that the concept remains to be backed up by observational evidence, the article makes no mention of whether we could expect similar behavior from other primordial fermions and find electrons or protons as big as the Virgo Supercluster. If it can happen to some ancient neutrinos, couldn’t other primordial subatomic particles behave the same way?
