
how fixing particle colliders’ discarded data problem could bring us a better internet

Particle colliders have a huge, seldom discussed problem when it comes to recording experimental data. Solving it could help us find groundbreaking new physics and give us a more reliable internet.

One of the biggest stories in particle physics over the last few years has been the fear that we don’t know how to go past the Standard Model because the Large Hadron Collider simply confirmed our existing theories without discovering anything new. But physicist Ethan Siegel raised an even more worrisome concern. It’s entirely possible that the LHC discovered something new, but we either never recorded the data for the relevant collisions or discarded it for a number of practical reasons. In short, the LHC generates more data than our computers can handle, and we either can’t catch it fast enough to record events in full detail, or lack the storage capacity to hold and process it.
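To get a feel for why so much data never makes it to disk, here’s a rough sketch in Python of the kind of filtering involved. The threshold, event format, and energy distribution below are invented purely for illustration, and the real LHC trigger is a multi-stage hardware and software pipeline, but the basic trade-off is the same: anything that doesn’t pass a pre-defined cut is thrown away before anyone ever gets to look at it.

```python
import random

# Toy illustration (not the actual LHC trigger): collisions arrive far faster
# than they can be stored, so a "trigger" keeps only events that look
# interesting by some pre-defined criterion and discards everything else.
# The threshold and the energy distribution are made up for this example.

KEEP_THRESHOLD_GEV = 100.0  # hypothetical energy cut

def trigger(event_energy_gev: float) -> bool:
    """Decide whether to record an event, using only a crude energy cut."""
    return event_energy_gev >= KEEP_THRESHOLD_GEV

def run(n_events: int = 1_000_000) -> None:
    kept = 0
    for _ in range(n_events):
        # Most collisions are low-energy and routine; a rare few are not.
        energy = random.expovariate(1 / 20.0)  # mean ~20 GeV, made-up distribution
        if trigger(energy):
            kept += 1
    print(f"kept {kept} of {n_events} events "
          f"({100 * kept / n_events:.3f}%); the rest are gone for good")

if __name__ == "__main__":
    run()
```

If a sign of new physics happens to live in the events that fail the cut, no amount of later analysis can recover it.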

As we discussed previously on the World of Weird Things podcast, this is a big deal because the lessons learned from particle colliders and the digital infrastructure that supports their work have a lot of practical applications, from the internet to cancer treatment. It also means we owe our only possible glimpse at new physics to a few anomalous detections of odd particles emanating from the Antarctic ice rather than to a massive device we built entirely for this purpose, and while we can still use existing colliders for interesting experiments, their impact on our basic understanding of matter is likely to be rather minimal.

Our only ways forward in the quest to find new physics seem to be serendipity and a massive upgrade to computers as we know them. Unfortunately, we can’t simply cram more data into existing hard drives because the technology to encode data on an atomic level is too cumbersome to use in high speed computing environments, and quantum computers won’t help either because they’re only suited to a certain class of problems, one that doesn’t include simply recording collision data. What we need is a revolution in throughput: the ability to move data quickly enough to record every collision in real time, or at least to hold it in a buffer long enough for it to be written to solid state drives.
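To see why throughput, rather than raw storage, is the bottleneck, here’s a back-of-the-envelope sketch with made-up production rates, write rates, and buffer size. Once events arrive faster than they can be committed to storage, the buffer only absorbs a short burst before everything else starts being lost.

```python
from collections import deque

# Toy throughput model (the numbers are illustrative, not real LHC figures):
# a detector produces events faster than the storage back end can write them,
# and a fixed-size buffer in between can only soak up short bursts before
# events start getting dropped.

PRODUCE_PER_TICK = 40       # events generated each time step (assumed)
WRITE_PER_TICK = 25         # events the storage layer can commit each step (assumed)
BUFFER_CAPACITY = 10_000    # how many events the buffer can hold (assumed)

def simulate(ticks: int = 10_000) -> None:
    buffer = deque()
    produced = written = dropped = 0

    for tick in range(ticks):
        # Producer side: new events arrive whether or not there is room for them.
        for _ in range(PRODUCE_PER_TICK):
            produced += 1
            if len(buffer) < BUFFER_CAPACITY:
                buffer.append(tick)
            else:
                dropped += 1  # buffer full: this event is lost for good

        # Consumer side: drain as many events as the write throughput allows.
        for _ in range(min(WRITE_PER_TICK, len(buffer))):
            buffer.popleft()
            written += 1

    print(f"produced={produced} written={written} dropped={dropped} "
          f"({100 * dropped / produced:.1f}% lost)")

if __name__ == "__main__":
    simulate()
```

A bigger buffer only delays the moment the losses begin; the only real fix is raising the rate at which data can be moved and written, which is exactly the kind of throughput revolution the colliders need.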

Hold on though, you might say, why would we want to invest so much time and effort into figuring out whether this sort of technology is even possible, much less implement it? Well, as is often the case with particle colliders, the work they do tends to make its way back to us. In this case, extremely high throughput devices would be invaluable in the data centers that form the backbone of cloud services and in heavy duty routers, making for a faster, more fault tolerant, and therefore more reliable, internet. Scientists would get more data to figure out how the universe came to be the way it is now, and we’d get better e-commerce and more robust cloud storage, a win-win for everyone involved.

# tech // computer science / particle colliders / particle physics
