
visions of a digital doomsday

It's extremely unlikely that Terminator's Skynet, or a system like it, would become a reality. At least not on purpose...

[ still from Terminator Salvation ]

Today, the fourth installment of the Terminator saga comes out in theaters. Of course you probably knew that already since you obviously use the web, own a TV and pass by magazine stands on a daily basis. And when you watched the previous films in the franchise, you might have wondered whether anything you saw on the screen was even remotely possible. How real is the idea that a computer network connected to the weapon systems of an entire military would suddenly turn on humans and cause nothing less than the end of the world?

In the movies, the premise behind the sudden attack on humanity is that Skynet developed intelligence and panicked when worried technicians tried to take it offline for analysis. It labeled humans as threats and, with control of nuclear weapons and an entire army of computerized killers at its disposal, started wiping out most of the human race within seconds. Whether machines would suddenly become self-aware if you give them enough processing power is debatable, but when you have a single computer system controlling every last major weapon you own, you've created the potential for that machine to turn against you because of a simple error. If it's not endowed with the right logic to tell friend from foe, along with several failsafe mechanisms, it will turn on you when you make a serious mistake or a major bug in its code manifests itself.
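To get a sense of just how small that error could be, here's a purely hypothetical sketch in Python of friend-or-foe logic with exactly that kind of flaw. The callsigns and codes are made up for illustration; the bug is nothing more than a default that treats any contact whose transponder stays silent as hostile, which turns a friendly crew with a dead radio into a valid target.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Contact:
    callsign: str
    iff_response: Optional[str]  # transponder reply, or None if silent

# Codes the system considers friendly (invented for this example).
FRIENDLY_CODES = {"ALPHA-7", "ALPHA-9"}

def classify(contact: Contact) -> str:
    """Label a contact based solely on its transponder reply."""
    if contact.iff_response in FRIENDLY_CODES:
        return "friendly"
    # The flaw: silence falls through to "hostile" instead of
    # triggering a failsafe like "hold fire and ask a human".
    return "hostile"

if __name__ == "__main__":
    technician = Contact("ALPHA-7", None)  # friendly crew, damaged radio
    print(classify(technician))            # prints "hostile"
```

Nothing about that code looks sinister on its own. It's the missing failsafe, the line that was never written, that does the damage.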

Today, the military is focused on putting more machines onto the battlefield to prevent the loss of human life on some of the most dangerous and demanding missions. It looks good on paper and scores major political points. When you can fight wars without diminishing your capacity and without putting nearly as many soldiers on the front lines, both politicians and the public are happy. But the fighting robots we have now are under the complete control of watchful humans. They make very few decisions on their own, and their weapons are only fired when a human gives the green light. In the near future, should generals decide to give robots far more leeway in making decisions and network the machines so they can talk to each other, they should keep in mind the sheer complexity of the code required to make that happen, and how easily a mistake could change the way these robots identify their targets.

Bugs, the mistakes computers make when the logic dictated to them by code is ambiguous or tells them to do something we don't want, pop up all the time in the IT world. When one hits an enterprise system, you might get an outage, lose some information, or see puzzling behavior that won't let you do what you need to do for the day. Now imagine if those same enterprise systems were designed to hunt down and kill. Instead of storing and processing data, they're aiming and firing deadly weapons. And what happens if you reach for the cutoff switch but a killer robot thinks you're the enemy trying to disable it, because a glitch in its code put it in battle mode? It's problems like these that make the idea of something like Skynet, or a network of robotic weapons with minds of their own working in concert, a very dangerous one. But when you either ignore these issues or aren't aware of how common they really are, you might just decide to build one because it sounds like a very efficient way to wage war.
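For a concrete, if contrived, illustration of that cutoff-switch scenario, here's another hypothetical Python sketch. Every name in it is invented; the only problem is a glitched mode flag that routes a shutdown request through threat logic instead of an unconditional power-off.

```python
from enum import Enum, auto

class Mode(Enum):
    PATROL = auto()
    BATTLE = auto()

class CombatRobot:
    def __init__(self) -> None:
        self.mode = Mode.PATROL

    def handle_shutdown(self, operator_id: str) -> str:
        # The bug: in BATTLE mode the shutdown request is evaluated as a
        # possible attack instead of going straight to a hardware cutoff,
        # so the operator at the kill switch reads as a threat.
        if self.mode is Mode.BATTLE:
            return f"ignoring shutdown, marking {operator_id} as hostile"
        return "powering down"

if __name__ == "__main__":
    bot = CombatRobot()
    bot.mode = Mode.BATTLE  # a glitch flipped the mode flag
    print(bot.handle_shutdown("tech-04"))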

In a few decades, generals who don't really understand IT systems, but are so high up the chain that refusing their requests is a terrifying proposition, might just ask for prototypes of autonomous, networked weapons somewhat reminiscent of the Skynet concept. And maybe, if there's a big enough bug in the code and the new system gets a hold of live ammo, we'll be looking at vicious killer robots coming after us, driven not so much by a sinister computer intelligence as by the mistakes we made in programming them.

# tech // artificial intelligence / computers / killer robots / robots

