
why the next big leap in a.i. isn’t new math, but energy savings

We keep seeing artificial intelligence do more and more impressive things. But the most impressive thing it could do next is fit in our hands.

While pundits and tech writers often talk about how we carry AI assistants and the knowledge of the entire world in our pockets, that's not really true. We carry devices that connect to vast server farms where the AI models and libraries we interact with actually live. And unless you're willing to spend tens of thousands of dollars and are ready to lug a server rack with a generator everywhere you go, that's where they'll remain for the near future. But a decade or two from now, this may change, and we'll see artificial intelligence do amazing things right in our hands.

But wait a minute, how is that possible? Don't some of the most popular neural networks use billions of parameters and churn through gigabytes of data? Wouldn't that consume massive amounts of energy? Isn't that why NASA uses AI primarily by beaming down images from a destination, running the models on Earth, then sending back the outputs if necessary? Yes, all of the above is true, which is why engineers have been working on NeuRRAM, a compute-in-memory chip built on resistive random-access memory and optimized for exactly the kinds of operations neural networks rely on.
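
To make that a bit more concrete, the workhorse operation in a neural network is a matrix-vector multiply, and a resistive memory array can perform it in place: store the weights as conductances, apply the inputs as voltages, and by Ohm's and Kirchhoff's laws the currents flowing out of the array sum to the answer. Here's a minimal sketch of that principle in Python, with made-up sizes and values rather than anything from the actual chip:

```python
import numpy as np

# Illustrative sketch of analog compute-in-memory: the weights live
# in the RRAM array as conductances, so the matrix-vector product
# happens where the data is stored instead of shuttling weights to a CPU.

rng = np.random.default_rng(0)

# Weight matrix stored as conductances G (siemens), one memory cell per weight.
G = rng.uniform(1e-6, 1e-4, size=(128, 256))

# Input activations applied as voltages V on the array's input lines.
V = rng.uniform(0.0, 0.2, size=256)

# Ohm's law gives each cell's current (I = G * V), and Kirchhoff's
# current law sums those currents along each output line -- a full
# matrix-vector multiply computed in a single analog step.
I = G @ V  # output currents, one per output line (amperes)

print(I[:5])
```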

To be perfectly fair, NeuRRAM is not the first technology to try to bring AI to a specialized computer chip, and there are other architectures, known as FPGAs, trying to achieve the same results. But where the memory cells in FPGAs are highly limited in how complex and how numerous the AI models they run can be, NeuRRAM is more efficient, achieving results in line with Intel's flagship neuromorphic Loihi chips while using half the energy in the process, which may not sound like a big deal until you consider the full implications of this technology.
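
Why halving the energy matters is easier to see with some back-of-the-envelope arithmetic. The numbers below are purely illustrative, not measurements from the paper, but they show how energy per operation compounds across a real workload:

```python
# Illustrative only: hypothetical energy figures, not measured values.
macs_per_inference = 1e9       # a billion multiply-accumulates per run
baseline_pj_per_mac = 2.0      # assumed energy per operation, picojoules
half_energy_pj_per_mac = 1.0   # the same operation at half the energy

inferences_per_day = 100_000   # e.g. an always-on sensor or assistant

for name, pj in [("baseline", baseline_pj_per_mac),
                 ("half-energy", half_energy_pj_per_mac)]:
    joules_per_day = macs_per_inference * pj * 1e-12 * inferences_per_day
    print(f"{name}: {joules_per_day:.0f} J/day")

# For scale: a typical phone battery holds roughly 10 Wh, or about 36,000 J,
# so every picojoule shaved off each operation buys real battery life.
```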

Humming alongside your typical cell phone CPU, it could run calculations right on the device that would otherwise require an internet connection. It may drain your battery a bit more at first, but it would have significant benefits when it comes to privacy and performance. Network lag wouldn't be an issue, and none of your data would need to be exposed to the world at large to run the computations. Reduce the energy requirements enough and robots on Earth and in space can take advantage of more powerful and sensitive models than ever before.
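
To see where those privacy and performance benefits come from, compare the two paths your data could take. In the cloud model, raw data leaves the device and waits on a network round trip; on-device, it never does. A rough sketch, where run_on_chip and send_to_server are hypothetical stand-ins for whatever a real stack would call:

```python
import time

def send_to_server(data):
    """Hypothetical cloud path: raw data is shipped over the network,
    and the caller waits out the round trip plus server-side inference."""
    time.sleep(0.15)  # stand-in for ~150 ms of network latency
    return f"label-for-{len(data)}-bytes"

def run_on_chip(data):
    """Hypothetical on-device path: the model runs locally on a
    NeuRRAM-style accelerator, so the raw data never leaves the phone."""
    return f"label-for-{len(data)}-bytes"

photo = b"\x00" * 1024  # stand-in for a user's private photo

start = time.perf_counter()
send_to_server(photo)   # data exposed, latency paid
cloud_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
run_on_chip(photo)      # data stays local, no round trip
local_ms = (time.perf_counter() - start) * 1000

print(f"cloud: {cloud_ms:.0f} ms, on-device: {local_ms:.2f} ms")
```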

At this moment, there's no date on which you can expect NeuRRAM chips to be added to your phone or computer. Its designers are still trying to figure out how well it can scale for mass production and want to drive its energy consumption down even further. But with demand for something very much like them growing every day from researchers, scientists, and consumer electronics companies, it seems like it's just a matter of time before NeuRRAM chips make their way into everyday devices, factory robots, and space probes.

See: Wan, W., et al. (2022). A compute-in-memory chip based on resistive random-access memory. Nature 608, 504–512. DOI: 10.1038/s41586-022-04992-8

# tech // artificial intelligence / computer chips / computer science
