
putting the processing pedal to the metal

Supercomputers may be able to perform calculations at the speed of light, and maybe slightly faster if we're feeling adventurous, according to a pair of studies.

Since the dawn of computers, the processing power of just about every computing device out there has been pushed ever upwards. Today’s mediocre laptops are faster than ten-year-old supercomputers, and a modern supercomputer can carry out more than a quadrillion floating point operations per second, enough to take on complex problems in astrophysics, climatology, artificial intelligence, and weapons development. And with some very encouraging steps towards fully fledged quantum computing coming from physicists, the future looks bright for the computing equivalents of a Formula 1 race car.

The supercomputers of the next decade might be capable of building entire virtual worlds and running algorithms that are out of reach of today’s machines, thanks to gains in both efficiency and speed. We may even see extremely detailed and immersive virtual worlds like those of The Matrix and Tron coming out of computer science labs…

However, even if we manage to overcome every design challenge along the way and build supercomputers that shatter the speed records of the previous generation, there is a limit to what they could do. It’s not a limit imposed by the materials from which the machines will be made; rather, the rules of physics themselves act as the governor on the supercomputers’ speedometers. The most efficient quantum devices should theoretically max out at 100 exaflops, or 100 quintillion floating point operations per second, making them roughly 50,000 times faster than modern supercomputers.
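To put that ceiling in perspective, here’s the back-of-the-envelope arithmetic behind the “roughly 50,000 times faster” figure, assuming a machine of about two petaflops as the benchmark for a current top supercomputer (my assumption, not a number taken from the studies):

```python
# Rough comparison of the proposed 100-exaflop quantum ceiling to a
# current top supercomputer. The 2-petaflop benchmark is an assumption.
quantum_ceiling_flops = 100e18      # 100 exaflops = 100 quintillion ops/sec
modern_supercomputer_flops = 2e15   # ~2 petaflops (assumed)

speedup = quantum_ceiling_flops / modern_supercomputer_flops
print(f"Theoretical ceiling: ~{speedup:,.0f}x today's machines")  # ~50,000x
```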

This computational limit was proposed by Professors Lev Levitin and Tommaso Toffoli who estimated the smallest amount of time it takes to process the simplest operations in quantum computing and used it to derive an equation to calculate the maximum processing speed that will be achieved by a relatively “conventional” machine. Any further gains in processing speed would require new technologies based around exchanging information with light, i.e. optical computers which Levitin and Toffoli don’t take into account, declaring quantum machines as the pinnacle of computing achievement.
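For those who want to see the numbers, the bound in question is usually written as a minimum time of h/(4E) per elementary operation for a system with mean energy E, which flips into a maximum rate of 4E/h operations per second. Here’s a minimal sketch of that arithmetic; the one-joule energy budget is purely an illustrative assumption, not a figure from their paper:

```python
# Back-of-the-envelope version of the Levitin-Toffoli style speed limit:
# a quantum system with mean energy E (above its ground state) needs at
# least t = h / (4E) seconds to complete one elementary operation.
PLANCK_H = 6.62607015e-34  # Planck's constant, in joule-seconds

def max_ops_per_second(energy_joules: float) -> float:
    """Upper bound on elementary operations per second for a given energy."""
    return 4 * energy_joules / PLANCK_H

# Illustrative assumption: a processor with one joule of energy at its disposal.
print(f"{max_ops_per_second(1.0):.2e} operations per second")  # ~6.04e+33
```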

But once again there’s a physical limit on how fast optical computers can operate. Information can’t travel any faster than 299,792,458 meters per second, and exceeding this universal constraint would require a new kind of particle called a tachyon. The problem is that tachyons are a purely hypothetical construct and would probably disintegrate the instant they had to deal with the physical constraints of our slower-than-light cosmos. Even if they do exist in some hidden cosmic manifold, they just won’t work in a computing device.
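To see how quickly that speed limit becomes a bottleneck at the scale of real hardware, here’s a quick look at how long a light-speed signal needs just to cross a chip or a circuit board; the dimensions are illustrative assumptions rather than figures from either study:

```python
# Travel time of a light-speed signal across typical hardware dimensions.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in vacuum

def travel_time_ns(distance_meters: float) -> float:
    """Light-speed travel time in nanoseconds for a given distance."""
    return distance_meters / SPEED_OF_LIGHT * 1e9

# Illustrative assumptions: a 3 cm chip and a 30 cm circuit board.
print(f"Across a 3 cm chip:   {travel_time_ns(0.03):.3f} ns")  # ~0.100 ns
print(f"Across a 30 cm board: {travel_time_ns(0.30):.3f} ns")  # ~1.001 ns
```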

And yet, there may be a way out of this dilemma through a little quantum mechanical trick. By making a computer’s innards from exotic materials with a negative refractive index, you could channel each photon in the machine to split into an electron/positron pair which then collapses back into a photon. That would allow the photon to travel at superluminal speeds for a small part of its journey. And as odd as it may seem, because the pair would be entangled (that is, described by the same quantum wave function), it should preserve the integrity of the data being carried, allowing the hypercomputers of the far future to crack the light speed barrier by just a little bit.

Still, there are a few problems with this idea. The pair of Austrian researchers who presented it don’t seem to take into account the overhead required to make optical computers work, and their proposal calls for a hybrid photonic/quantum machine that works with photons and handles quantum entanglement at the same time. That hybrid is needed to prevent data loss if the electron/positron pairs simply annihilate each other, potentially breaking the computer, or if the photon into which they recombine arrives at the sensor described by a different wave function than the one it left with.

Given the increase in energy consumption, the risk of data corruption or outright loss, and the added complexity of an already complicated system, would the slight gain in speed even be worth it? Maybe we should just go along with the laws of physics on this one, unless a new breakthrough gives us a pain-free way to exceed the limitations of photons while keeping the integrity of our data.

See: Levitin, L., & Toffoli, T. (2009). Fundamental Limit on the Rate of Quantum Dynamics: The Unified Bound Is Tight. Physical Review Letters, 103 (16). DOI: 10.1103/PhysRevLett.103.160502

Putz, V., & Svozil, K. (2010). Can a computer be “pushed” to perform faster-than-light? arXiv: 1003.1238v2

# tech // computer science / quantum computing / quantum physics

