are new computers really after your desk job?

February 21, 2012

Several months ago, Slate’s tech writer Farhad Manjoo wrote a six-part series detailing how you will lose your job to a machine in the near future and how absolutely no one is safe from being automated away. He uses a fairly simple formula to make his point. First, he describes a system doing a tedious pattern recognition task or a meta-analysis well enough to be used in the real world. Second, he describes how a profession using these systems is glad that the tedium has been transferred to a machine and expects that entry-level job opportunities will shrink in favor of this system and its successors. Third, and finally, he argues that this expectation is wishful thinking and that pretty much everyone but a few experts can be automated away as computers do it all with the absolute minimum of human supervision. He even targets scientists, arguing that machines are becoming oracles of science, spitting out complex equations underlying the laws of physics and biology that humans will not understand. Forget a nuclear holocaust by Skynet; machines will just take your job without the decency to exterminate you afterwards. In the Animatrix, this is how the Great Human/Machine war began…

Sadly, there is truth in Manjoo’s conclusion that many jobs will go away for good thanks to automation. As many liberal political activists and the OWS movement point out, productivity and corporate profits are booming, but while they often use this as a starting point to blame outsourcing and bonus-saving layoffs for the lack of jobs, they forget the role of automation. It’s not something we think about often, and it’s not easy to make slogans to shame robots into quitting. You can fault an executive for laying off a thousand people to meet quarterly goals or for deciding that hiring an American worker is too expensive and going overseas, but the uncomfortable reality is that a lot of companies are about as lean as they’re going to get after years of layoffs and belt-tightening, and a number of smaller companies that used to outsource have been slowly weaning themselves off a reliance on overseas factories, citing increased labor costs, blatant theft of intellectual property aided and abetted by local bureaucrats, quality issues, and customs troubles. So how is productivity still up? Automation. How could you fault a company for increasing productivity not by simply getting rid of a job for questionable reasons, but by handing it to an automated tool? Of course, the takeaway here is that some jobs will become completely unnecessary.

But just how many jobs will go the way of the dodo? Extending Manjoo’s formula, we could even argue that one day not even programmers will be needed, only architects who run code generation tools; in an ironic twist, those who automated away tens of thousands of jobs would now automate themselves away. But the funny thing is that this approach has been tried before in IT, and it did not end well. Model Driven Architecture, or MDA, attempted to create a kind of factory line for software in which many steps could be fully automated, including the generation of code. But a lack of standards, incompatibilities with existing tools, and the many big and little issues involved in trying to turn an abstract model into a complete piece of software made the end products unmanageable. Why? While computers are great at repetitive tasks and crunching immense amounts of data, which is what they’re made to do, they’re not good at design or nuance. In programming, how does a machine know that object X needs to be encapsulated? Or that it could use less code to get the same behavior, meaning less code to test? You need humans who know how to write code and define the rules to step in, roll up their sleeves, and work on a creative problem like this. The MDA scholars tried to counter this issue by creating ever more abstract ways of designing logical models, but abstraction doesn’t always yield lean, mean, performant applications.
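To make the “less code to get the same behavior” point concrete, here is a hypothetical Python sketch; the class and its fields are invented for this illustration, and the generated-looking code is an assumption about what a naive model-to-code tool might emit, not output from any actual MDA product. The generator mechanically produces one getter and one setter per field in the model, while a human who recognizes the pattern gets equivalent behavior in a handful of lines:

```python
from dataclasses import dataclass

# What a naive model-to-code generator might emit for a model with
# three fields: one boilerplate getter and setter per field.
class GeneratedCustomer:
    def __init__(self):
        self._name = None
        self._email = None
        self._balance = None

    def get_name(self):
        return self._name

    def set_name(self, value):
        self._name = value

    def get_email(self):
        return self._email

    def set_email(self, value):
        self._email = value

    def get_balance(self):
        return self._balance

    def set_balance(self, value):
        self._balance = value

# A human who spots the repetition collapses it into a dataclass:
# the same data-holding behavior, a fraction of the code, and far
# less surface area to test.
@dataclass
class Customer:
    name: str
    email: str
    balance: float
```

The generator can’t make that judgment call on its own, because deciding that the boilerplate *is* boilerplate, rather than a place where field-specific logic belongs, is exactly the kind of design decision the paragraph above describes.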

So here’s what automation is good at doing: the mundane stuff. If you change the rules or come up with new ideas for how to do something, those new rules and ideas have to be implemented by humans, and systems need to be upgraded to deal with new processes. Try to automate away an entire field and you’ll end up with a dearth of innovation and software unable to cope with new challenges. One wonders why Manjoo decided, in his ruthless musings on whose job can be eliminated and replaced by a machine, that help from robots and machines means anything with concrete deliverables can be done by computers. In the last part of his series, we see just why he made up his mind about the economic cyber-takeover, in his quote about a hypothesis-generating prototype which uses a genetic algorithm to come up with descriptive equations for scientific data…

Lipson and Schmidt recently worked with Gurol Suel, a molecular biophysicist at the University of Texas Southwestern Medical Center, to look at the dynamics of a bacterium cell. Given data about several different biological functions within a cell, the computer did something mind-blowing. “We found this really beautiful, elegant equation that described how the cell worked, and that tended to hold up over all of our new experiments,” Schmidt says. There was only one problem: the humans had no idea why the equation worked, or what underlying scientific principle it suggested. It was, Schmidt says, as if they’d consulted an oracle.

Actually, it’s more like they fed a computer reams of data, had it try to find relationships, and hit on a lucky few guesses that worked out. Happy that it worked, they’re now trying to have the software produce more data as it comes up with its guesses until it hits something that looks right. Far from being an oracle, the software in question is just a scientific correlation finder. What Manjoo is doing here is taking a few successful attempts and presenting them as the norm, with barely a mention of the several thousand erroneous guesses made in the process, much like proponents of psychics and astrologers focus only on the "correct" predictions without so much as acknowledging the overwhelming error rate. And this is not to mention the slew of other big issues with a computer doing science. Yes, technology will spread even farther, and yes, there will be many jobs lost to ongoing automation. But presenting this fact while portraying the technology as transcending the humans who built it when it does no such thing, and shedding tear after tear for the soon-to-be laid off or the never-to-be-hired without taking the opportunity to explain that this is exactly why we need to invest a lot more into research, development, and STEM disciplines, makes a potentially interesting look at the future of a post-industrial economy, one which asks profound questions, fall far, far short of its potential.
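To see what a “scientific correlation finder” amounts to under the hood, here is a toy sketch. It is emphatically not Lipson and Schmidt’s actual system, which evolves symbolic expression trees with a genetic algorithm; this is a bare random search over a tiny invented hypothesis space, with a made-up hidden law of y = 3x² + 1, to show the guess-score-discard loop and the pile of wrong guesses behind each hit:

```python
import random

random.seed(42)  # deterministic run for this sketch

# Synthetic "experimental" data produced by a hidden law: y = 3*x**2 + 1
data = [(x, 3 * x**2 + 1) for x in range(1, 8)]

def make_candidate():
    """Guess a formula of the form y = a * x**b + c."""
    return (random.randint(0, 5), random.randint(0, 3), random.randint(0, 5))

def squared_error(cand):
    """Score a candidate formula against every data point."""
    a, b, c = cand
    return sum((a * x**b + c - y) ** 2 for x, y in data)

best = None
wrong_guesses = 0
for _ in range(100_000):
    cand = make_candidate()
    if squared_error(cand) == 0:  # candidate fits the data exactly
        best = cand
        break
    wrong_guesses += 1

a, b, c = best
print(f"found y = {a}*x^{b} + {c} after {wrong_guesses} wrong guesses")
```

The search does eventually recover the hidden equation, but only by churning through wrong guesses it silently throws away, and nothing in the loop understands *why* the winning formula works: it is just the one whose error happened to reach zero, which is the point made above about oracles versus correlation finders.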

[ illustration by Paul Hostetker ]

  • http://www.whitegroupmaths.com whitecorp

    But automation can and will fail, and human hands ultimately will be required to troubleshoot matters. Still, it brings shudders thinking of the likelihood that the majority of us will be rendered obsolete eventually. Peace.

  • Bruce Coulson

    The problem is that for millennia, that’s what the vast majority of people did; the mundane, repetitious tasks. Once agriculture was developed, most of humanity spent their time doing numbing, mindless work so that a privileged few could administrate and develop new ideas.

    So, although there will most likely always be places for talented, ingenious people (and this is more true for areas which require creativity) what will the vast majority of humanity who lack these traits do when they become superfluous?

  • Jordan

    People have been worrying about this since the Industrial Revolution. So far, technology has created about as many jobs as it has destroyed. Economists have a term for thinking that technology will destroy all jobs: the Luddite fallacy.

    Even Ray Kurzweil, the ultimate technology booster, admits that computers can never completely automate an entire field or process.