why you shouldn’t bet on digital immortality
Kenneth Hayworth has a vision. One day, right before his body gives out, he will be injected with a highly toxic soup of chemicals that will preserve him down to the cell. His brain will then be sliced into wafers as thin as tracing paper, and a computer will map every one of his 100 billion or so neurons, as well as how they have wired together over the decades he's been alive. Finally, this vast map, his connectome, will be uploaded to a big enough supercomputer and switched on, creating a digital replica of him just as he was the instant before all of the preceding events took place.
It's ripped right out of Kurzweil's wildest dreams and advocated as the next step in human evolution by a Russian billionaire (though his implementation would be rather different), and it bets the house on the idea that we've found the seat of what we might poetically call the human soul: the aforementioned connectome. Unfortunately for Hayworth, Kurzweil, and Itskov, their bet is more than likely wrong. While it is true that the connectome, the complete layout of all the neurons and their connections, is critical to what we define as a mind and to forging our personalities, mapping it from a preserved brain or transferring it to digital media misses several of the most crucial parts of how our brains work and ignores the limits of science.
One of the reasons Kurzweilian Singularitarians are so interested in AI is that they feel that once we know what it takes to support a conscious mind in a computer, we'll be one step closer to transferring human minds into the virtual realm. This is where the connectome comes into play. It would tell us how to wire up all the digital neurons meant to simulate your mind, so you become a giant artificial neural network, running all the time and constantly learning new things. Except you won't. The problem is that a connectome taken from you after death won't capture the exact weights of the connections between each virtual neuron and all the neurons around it.
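To make the idea concrete, here's a minimal sketch of what "loading a connectome into a neural network" would mean, assuming a drastically simplified binary (McCulloch-Pitts-style) neuron model; the weights, the threshold, and the five-neuron size are all made up for illustration, and real neurons are vastly more complicated than this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "connectome": a weight matrix for 5 neurons, where weights[i, j]
# is the signed strength of the synapse from neuron j to neuron i.
# A real human connectome would have roughly 100 billion rows and columns.
weights = rng.normal(0.0, 1.0, size=(5, 5))
np.fill_diagonal(weights, 0.0)  # no self-synapses in this toy model

threshold = 0.5  # a single firing threshold shared by all neurons

def step(state, w, theta):
    """Advance the binary threshold network one tick: a neuron fires
    (outputs 1.0) iff its summed weighted input exceeds the threshold."""
    return (w @ state > theta).astype(float)

# "Switch on" the uploaded network from an arbitrary initial state.
state = np.array([1.0, 0.0, 1.0, 0.0, 0.0])
for _ in range(3):
    state = step(state, weights, threshold)
```

The point of the sketch is that everything hinges on `weights`: the anatomy of a preserved brain tells you *which* entries are nonzero, but not their exact values.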
Get enough little mistakes here and there in a system far larger than any neural network we've ever built, and the cascade of errors can end very, very badly. If we can't figure out every single synapse's strength, a virtual neuron will either refuse to fire (because its weighted input always falls short of the activation threshold when its connections are estimated as too weak) or fire nonstop whether it needs to or not. A mistake in one or even a dozen neurons won't matter much, but we're likely to get most of them wrong, since estimating connection strength from how the synapses look is unreliable.
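The failure mode above can be shown with a single toy neuron. In this sketch the "true" weights, the inputs, and the threshold are all hypothetical numbers chosen for illustration; the point is only that a systematic over- or under-estimate of synaptic strength flips the neuron into one of the two pathological regimes:

```python
import numpy as np

theta = 1.0  # hypothetical firing threshold

# One virtual neuron with three input synapses. These "true" weights are
# made up; in reality we can't read them off preserved tissue at all.
true_w = np.array([0.6, 0.5, 0.4])
inputs = np.array([1.0, 1.0, 1.0])  # all upstream neurons currently firing

def fires(w, x, theta=theta):
    # Threshold rule: fire iff the weighted input sum exceeds theta.
    return float(w @ x > theta)

# True weights: 0.6 + 0.5 + 0.4 = 1.5 > 1.0, so the neuron fires.
# Now suppose the synapses' anatomy led us to underestimate each weight
# by 40%: the sum drops to 0.9 < 1.0 and the neuron goes silent.
weak_w = true_w * 0.6
# Overestimate each weight by 40% instead: the sum rises to 2.1 and the
# neuron now clears the threshold far too easily.
strong_w = true_w * 1.4
```

Scale that ambiguity up to billions of neurons, each feeding its mistakes into the thresholds of thousands of others, and the cascade the paragraph above describes follows.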
And there are more problems. Many basic functions of the brain that we take for granted are still poorly understood, because the brain is just so difficult to study. We could simulate how a human brain should learn based on what we know, but we wouldn't know whether that's right, or how the brain would compartmentalize a flood of new information in a virtual state. Our memories are generally reconstructed through the hippocampus, and we understand it well enough to have made a synthetic one, but that synthetic hippocampus relies on much of the rest of the brain being wired correctly, and it deals with motor skills rather than high-level cognition.
If you have a good idea of how mice learn to press buttons in sequence, don't presume that you now know how humans remember how to speak another language or how to draw. So if we were to load a person's connectome into a supercomputer and try to animate it, we could either fail spectacularly or create a virtual mental patient whose mind comes unhinged as we try to keep it running. No IRB with an understanding of the legal rights of human experimental subjects would ever allow such an attempt to be made, preferring instead to call machine-brain interfaces and artificial organs the limit of this research, so it'll probably be best if you try to become immortal via advanced robotics rather than try to ditch biology…