explaining the origins of dark matter. or not.

December 27, 2010 — 5 Comments

Ordinary baryonic matter, the stuff from which living things, planets, stars, and galaxies are composed, is just a bit player in the grand scale of things. Accounting for about 4.6% of the universe’s mass and energy content, it’s easily overshadowed by dark matter, the invisible clumps of something which create gravitational lenses and let spiral galaxies keep their shape instead of flying apart or blending into mush. But this phenomenon presents an interesting conundrum. It makes up nearly a quarter of the universe and yet we can’t image it or take a sample of it. All we can do is make maps of where it seems to be acting on stellar bodies or stretching the fabric of space and time under its immense mass. Plenty of physicists and astronomers have argued that it’s just noise in the observational data, or that it doesn’t even exist because direct evidence for it is rather slim, and few things get popular science blog readers as worked up as exotic theories about it. But the fact of the matter is that it, or something a lot like it, is there, and we need to explain what it is and how it first appeared.

To that end comes a recent proposal by four physicists which tries to pin down a common ancestor for dark matter and baryonic matter. Since the standard model of physics likes to pair particles with antiparticles, the idea is that in the primordial universe, the initial pairs of nascent particles and their counterparts could decay into quarks that would combine into baryons, and into anti-baryons which we would call dark matter. And just like any particle/antiparticle pair, dark matter and baryonic matter would annihilate each other, potentially messing up the search for evidence of the decay of nucleons on our end, evidence which could either lead to new grand unification theories, basically theories of everything, or confirm an existing theory by giving us some idea about the stability of protons in nuclei. Currently, we’re pretty sure that protons are extremely stable and the first batch of them produced during the Big Bang will be around for eons, at least 6.6 × 10^33 years if you really want to get technical about this sort of thing. But if their lifetime turns out to be much shorter than that and the proton decays into a positron (an anti-electron) and two photons, we’d need to start re-writing how we measure interactions between subatomic particles and ultimately, the story of baryogenesis, the birth of matter as we know it. So if dark matter is really some sort of exotic antimatter, it may explain a lot about the universe.
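To put that lifetime bound in perspective, here’s a quick back-of-the-envelope comparison; the 13.7-billion-year age of the universe is just the commonly quoted value, and the figures are illustrative:

```python
# How does the experimental lower bound on the proton lifetime
# compare to the current age of the universe?

proton_lifetime_yr = 6.6e33   # lower bound quoted above, in years
universe_age_yr = 1.37e10     # ~13.7 billion years

ratio = proton_lifetime_yr / universe_age_yr
print(f"protons outlive the current universe by a factor of ~{ratio:.1e}")
```

In other words, even in the pessimistic case, a typical proton will outlast the universe’s current age by more than twenty orders of magnitude, which is why catching one in the act of decaying is so hard.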

However, there are a number of problems with this proposal. Every time you have two opposite types of matter, baryonic or not, you should be dealing with equal amounts of both. That’s what the standard model needs to keep the energies involved in particle interactions in balance; otherwise you could violate the conservation of mass and energy. To get unequal amounts of, say, matter and antimatter, you’d need to break symmetry and create an imbalance in the way the particles are being created. When ordinary baryonic matter won out over antimatter in the early universe, the imbalance was tiny, on the order of one extra particle of matter for every billion matter/antimatter pairs, something that’s already very difficult to accurately explain and sent some physicists on a search for signs of proton decay to explain some complex nuances in particle interactions. If we’re talking about dark matter emerging from an imbalance with ordinary matter, we may be talking about an even greater asymmetry, which is even more difficult to back up in light of how hard it is to break symmetry in the first place, even mathematically. And making things trickier here is the fact that the physicists cast dark matter as “anti-baryonic matter,” and from what I understand, the antithesis of baryonic matter is plain old antimatter. In effect, we’re being told that dark matter is really better known as antimatter and that instead of winning out over antimatter during baryogenesis, matter is outnumbered five to one in the known universe. The first instants of the Big Bang have been flipped on their head.
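That five-to-one figure follows directly from the percentages quoted earlier; a trivial sketch, using the WMAP-era budget numbers from this post:

```python
# Rough cosmological budget from the post (illustrative WMAP-era values):
omega_baryon = 0.046   # ordinary baryonic matter, ~4.6% of the total
omega_dark = 0.23      # dark matter, nearly a quarter of the total

ratio = omega_dark / omega_baryon
print(f"dark matter outweighs baryonic matter roughly {ratio:.0f} to 1")
```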

So if we’re faced with dark matter being just big clumps of antimatter, a whole set of questions emerges. Dark matter is supposed to interact with matter only indirectly and have a density far higher than you’d expect from antimatter or regular baryonic matter. Antimatter, on the other hand, is basically a mirror image of matter with an opposite charge and can directly interact with it. If dark matter were really just antimatter, we’d expect to see flashes of gamma rays coming from the parts of the sky where clumps of matter and antimatter collide. We don’t. Obviously antimatter can’t fill dark matter’s shoes, no matter how many “hidden sectors” or other rather buzzword-laden obfuscations were created to justify this decay concept. Rather than explaining the origins of the mysterious stuff seemingly filling a quarter of the universe, the physicists in question simply threw out an idea that flips the matter/antimatter asymmetry at the dawn of the cosmos and presents that as a potential solution to the questions of what dark matter really is and where it came from.

See: Davoudiasl, H., Morrissey, D., Sigurdson, K., & Tulin, S. (2010). Unified Origin for Baryonic Visible Matter and Antibaryonic Dark Matter. Physical Review Letters, 105(21). DOI: 10.1103/PhysRevLett.105.211304

  • Pierce R. Butler

    We’re also told that just plain vacuum isn’t really empty, but a seething soup of “particles” and “anti-particles” which appear out of nothing, collide, and annihilate each other with 100 % efficiency (not even stray zaps of radiation).

    Do these particles (do they have a specific name?) have anything to do with the primordial pre-quark particles? Could they still, on special occasions, be creating (or destroying) quarks to affect baryonic matter? (Since they’re reportedly detected by their effect on photons, presumably they’re not directly part of the dark matter phenomenon – but could there be a link to dark energy?)

    Or am I just getting carried away by the use of the prefix “anti-” in different contexts?

  • Greg Fish

    Technically, those particle/antiparticle pairs should exist for far too short of a time to have any major, macroscopic effects on the universe, so my gut feeling is to say that no, they should have nothing to do with dark matter or dark energy. Now, as for their relationship to primordial elements, we’re going into very exotic physics that I don’t understand well enough to give you a meaningful answer.

  • Pierce R. Butler

    One million p/ap (no Greek name ending in -on yet?!?) pairs, each existing for a millionth of a second, provide one second’s worth of gravitational/EM/whatever influence. If they’re evenly distributed throughout all space, probably that influence cancels out (at least gravitationally) – but if their occurrence fluctuates, randomly or according to some unknown factor(s), in space or time, then we have yet another variable that really can’t be excluded from cosmological equations.

    The fact that all this is, so far, so abstruse that the number of people with a decent grasp of it can probably be tallied with about five digits has no bearing on whatever is Going On Out There. My own gut feeling is that eventually somebody will connect all these dots in a way that revises our understanding of how-things-got-this-way, but right now, with >90% of the currently known universe having just been discovered in our lifetimes, such Grand Theories seem wildly premature.

  • http://www.chriswarbo.tk Warbo

    @Pierce The vacuum energy you mentioned isn’t actually the really weird part. It’s actually quite a logical thing to happen, if we remember that in Quantum Physics a “particle” isn’t a hard billiard ball, but is actually a (severely constrained) wave which, when squared, gives a probability. The first thing to notice is that for any value to do with probabilities, the sum of the possibilities should always add up to 1, which in this case means that all of the waves must have a constant area. If you squash the wave width-wise (reducing the range of possibilities) then it’ll get taller (the probability of the remaining possibilities goes up); if you push it down (reducing the probability of some possibilities) it will get fatter (more outcomes will become possible), since the area must remain constant. This, quite easily, gives us the Uncertainty Principle, since there’s no way to define a ‘position’ for a wave unless we make it a single peak of incredibly small width. Such a small width would mean an equally small wavelength, and since the momentum of a particle is the Planck constant divided by the wavelength, this would mean dividing by zero to get the momentum when you have an “exact position”. Likewise, having a definite momentum requires a finite, non-zero wavelength, which means the “position” is spread out (an ‘exact momentum’ would require an infinite, periodic wave, which has no ‘position’).
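The squash-the-wave picture above can be checked numerically. A minimal sketch, using generic Gaussian probability curves (not any particular physical system): once each curve is normalized to unit area, narrowing it forces its peak up.

```python
import numpy as np

# Three Gaussian probability curves of decreasing width, each normalized
# so the total probability (the area under the curve) is exactly 1.
x = np.linspace(-50.0, 50.0, 100001)
dx = x[1] - x[0]

peaks = []
for sigma in (2.0, 1.0, 0.5):
    prob = np.exp(-x**2 / (2 * sigma**2))
    prob /= prob.sum() * dx   # normalize the area to 1
    peaks.append(prob.max())
    print(f"width {sigma}: peak height {prob.max():.3f}")

# Squeezing the width forces the peak up, since the area stays fixed.
assert peaks[0] < peaks[1] < peaks[2]
```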

    One of the less famous Uncertainty Principles is that of energy and time, which can be explained by a similar argument involving the amplitude (energy) of the wave and its frequency or time-period: restricting the possible range of one makes the other’s range go up, since the area of the wave must be constant.

    This means that the amount of energy in any particular region can be wildly uncertain, if you severely restrict the range of time that the energy is around for. Relativity tells us that energy is mass, and thus if you’ve got an uncertain amount of energy, you’ve got an uncertain number of particles. Thus “empty space”, i.e. a region of space with “no energy”, can’t exist, as it introduces infinities again. If there’s no such thing as “empty space”, then there are particles everywhere, which necessarily annihilate after an incredibly short time (since they’re made out of “borrowed” energy, i.e. the more energy they have, the higher the peak, and the shorter the time it is around for).
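To put a rough number on “incredibly short”: the energy-time relation says a borrowed electron-positron pair can only stick around for about ħ divided by the energy it borrows. An order-of-magnitude sketch with standard constants (the exact numerical factor in the uncertainty relation is glossed over here):

```python
# Order-of-magnitude lifetime of a virtual electron-positron pair,
# from the energy-time uncertainty relation dE * dt ~ hbar.
hbar = 1.0545718e-34               # reduced Planck constant, J*s
electron_rest_energy = 8.187e-14   # m_e * c^2 in joules (~0.511 MeV)

borrowed_energy = 2 * electron_rest_energy   # one electron + one positron
dt = hbar / borrowed_energy

print(f"a virtual e-/e+ pair lives for roughly {dt:.1e} seconds")
```

That works out to well under a trillionth of a trillionth of a second, which is why, as Greg notes below, these pairs shouldn’t have any macroscopic effect on the universe.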

    Now, this is pretty well established (look up “sea quarks” for where this gets really weird), but the real craziness is that, if you’re summing up every possibility to find the area of your curve, then you must integrate over all space. However, space is full of energy, so in fact you end up getting an infinite amount of energy for everything, due to the infinite amount of vacuum energy. This is quietly brushed aside via “renormalisation”, which treats the vacuum energy as the zero-mark, but there’s actually no justification for doing this other than that it makes the equations work… :)

  • Pierce R. Butler

    Warbo – thanks for the mini-lesson!

    I can’t say that I understand the physics of uncertainty from this, but I can say that now I understand it a little better.

    But now I can’t use my previous cosmological work-around of “There is only one way of being nothing, but an infinite number of ways of being something!” any more!