measuring our brains with the wrong ruler
There are many things at which North Korea is really, really incompetent. It’s bad at launching rockets. It’s bad at even trying to appease its populace. Its rulers are incapable of staying in power without resorting to terror and transparent, bombastic propaganda. It’s unable to feed its people. And it’s also terrible at insults aimed at foreign heads of state, calling South Korea’s leader scum with a 2 megabyte brain.
What kind of an insult is that, and what does it even mean? It’s kind of like a petulant child calling someone a “big stupid head,” yet somehow even more immature and nonsensical. But of course someone just had to start wondering exactly how much memory a human brain can accommodate, coming up with estimates that range between 1 TB and 100 TB based on what are explicitly acknowledged to be overly simplistic calculations. These are the same calculations a thousand readers of futurist blogs use every day to predict the coming of the Singularity, and that fuel papers set in something much like the Matrix universe from a certain philosopher with a futurist streak. So really, what is the proper answer to the question of how many terabytes a human brain can store? Well, it’s irrelevant.
Asking about the capacity of the brain in terms of bits and bytes is like asking how many light years it takes to reach Mars, or how many gallons are in a kilometer. Neurons don’t work or store data in binary, and when you see a neuron’s firing compared to a one and its relative dormancy to a zero, as if its activity were byte code from which you could measure its capacity, you’re seeing an extremely gross oversimplification that leads to a dead end. I’ve lost count of how many times I’ve found myself in debates with futurists trying to derive some sort of bizarre math from an average neuron’s firing patterns, then trying to jam the resulting mathematical monstrosity into a supercomputer’s CPU. This misconception is surprisingly persistent, and the question keeps being asked despite its irrelevance to how the human brain actually works. Here’s a sample of the logic in question…
The math behind these estimates is fairly simple. The human brain contains roughly 100 billion neurons. Each of these neurons seems capable of making 1,000 connections, which represents about 1,000 potential synapses, which largely do the work of data storage. Multiply each of these 100 billion neurons by the approximately 1,000 connections it can make, and you get 100 trillion data points, or about 100 terabytes of information.
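The arithmetic above fits in a few lines once its hidden assumption, one byte stored per synapse, is made explicit. All the numbers here are the estimate’s own rough figures, not measured facts:

```python
# Naive back-of-the-envelope estimate from the quoted logic above.
# Every number here is the estimate's assumption, not a measurement.
NEURONS = 100e9              # ~100 billion neurons
SYNAPSES_PER_NEURON = 1000   # ~1,000 connections per neuron
BYTES_PER_SYNAPSE = 1        # the hidden assumption: one byte per synapse

total_bytes = NEURONS * SYNAPSES_PER_NEURON * BYTES_PER_SYNAPSE
print(f"{total_bytes / 1e12:.0f} TB")  # prints "100 TB"
```

Note that the whole result hangs on that last constant, which nobody has actually measured.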
Where to start with this game of whack-an-assumption? First of all, we’re assuming that each neuron sends only a single byte through each synapse. Secondly, synapses don’t necessarily do the work of data storage, since neurons themselves seem to light up when the mind recognizes something; synapses are the pathways by which all those neurons talk to each other. Thirdly, not all neurons are involved in data storage, because quite a few are there to process data from the environment and trigger responses to it, and our brain is compartmentalized to spread its workload to neurons in the right configurations to handle certain types of tasks. Information we tend to store for a long time seems to be controlled by the hippocampus rather than the entirety of the brain, while a part of the brain called the V4 cortex is dedicated primarily to feature extraction for the process of recognizing objects. Fourthly, we don’t know the data density of the brain and don’t really have a way to measure it with any kind of reliability, so any straightforward multiplication is not a valid approach by any means. Thankfully, in this presentation of it, the author acknowledges that the calculation is completely off base. Many others just roll with it.
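That fourth point, the unknown data density, is enough on its own to sink the calculation. Vary just the bits supposedly stored per synapse (the values below are purely illustrative placeholders, not measurements) and the same multiplication sweeps across the entire 1 TB to 100 TB spread mentioned earlier:

```python
# Same naive multiplication, but varying the one quantity nobody can
# measure: how many bits a single synapse supposedly stores. The spread
# of results alone shows why the calculation can't be trusted.
NEURONS = 100e9   # ~100 billion neurons
SYNAPSES = 1000   # ~1,000 connections per neuron

for bits_per_synapse in (0.08, 1.0, 8.0):  # hypothetical densities
    terabytes = NEURONS * SYNAPSES * bits_per_synapse / 8 / 1e12
    print(f"{bits_per_synapse:g} bits/synapse -> {terabytes:g} TB")
# prints 1 TB, 12.5 TB, and 100 TB respectively
```

A model whose answer moves by two orders of magnitude when you nudge one unmeasurable input isn’t an estimate; it’s a guess wearing a lab coat.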
Now, we could say that experiments involving an artificial hippocampus give us a rough idea of how many bits a memory in the brain takes, but the hippocampus doesn’t store memories themselves; it stores the keys other parts of the brain use to reconstruct them. I’m sure that by now you get the point: comparing a neuron to a used bit on a hard drive is absolutely irrelevant and doesn’t even need to be entertained as a question. It’s one of those popular pseudoscientific notions that’s well past its prime, like the factoid that we supposedly use a measly 10% of our brain when conscious while the other 90% supposedly houses the potential for revolutionary mental prowess or extrasensory perception. It’s been a sci-fi movie trope for so long that it still persists, but most of the population now knows that fMRI tests show 99% of the brain abuzz pretty much constantly. It’s time for the same to happen with this awkward comparison, so whenever someone tries to equate neurons with a typical hard drive, please don’t hold back and let that person know that he’s doing the equivalent of measuring time in feet or weight in megahertz. Neurobiologists and computer scientists everywhere will thank you.