amazon boxes

It’s been a few months since the NYT savaged Amazon’s work environment in the national press, to several stammering professions of utter bewilderment from Bezos. We’ve heard little since, but just as most of the unpleasant attention seemed to die down, something bizarre happened to bring the article back into the spotlight. Amazon’s new chief of PR decided to very publicly hit the newspaper with detailed criticisms of its coverage as if the story were still fresh. As you might expect, the head editor of the Times did not take it lightly and posted very stern rebuttals to the rebuttals, and the two are likely to go back and forth on the topic for a while, leaving the rest of us to figure out exactly how bad a place Amazon is to work. Personally, I have not heard any good things about working there, and the consensus I’ve found basically says that if you’re willing to bite the bullet and suffer for two years, you’ll come out with a resume booster and can find a job where you actually enjoy what you do while working saner hours.

Amusingly enough, many internet commenters reacted to these sorts of discussions with nearly the same scorn they reserve for the wealthy who feel they need affluenza therapy. Does it really matter whether 20-somethings making six figures are happy with how their bosses treat them? They’re making bank while people who loathe their jobs, and whose bosses are so cruel it seems as if management is running a competition in sadism, work sunup to sundown for a wage that still forces them to prioritize rent and food over long overdue basic car maintenance. In some ways, I can understand that attitude. IT definitely pays well, and in many places there are so many jobs for someone with a computer science degree and a few years of experience that receiving multiple offers in the same day is not uncommon. As they say in Eastern Europe, it would be a sin to complain about a fruitful computer science career, especially when your job title has the word “senior” or “lead” in it. But that said, I will now proceed to commit that exact sin.

For many programmers, insane hours aren’t just expected, they’re required. If you don’t put in your eight to ten hours a day, then go home and spend another four to five hours studying during your first few years on the job, you’re going to struggle and find that your contract isn’t renewed. The lack of sleep and subsistence on caffeine, adrenaline, and electronic music aren’t just badges of honor, but the price of admission to the club. And now, on top of working around the clock, a lot of employers want to know what code you’re publishing on open source repositories, and to what programming groups you belong. You’re expected to live, sleep, breathe, eat, and cough comp sci to have a fruitful career that allows you to advance past the sweatshop setting. Suffer through it with a stiff upper lip and you’ll be given a reward. More work. But in a cozy office with snacks, game rooms, free coffee and even booze — all to keep you in the office longer — along with at least some creative freedom in how you structure the code for your project.

Just like doctors, lawyers, and architects, techies have to run a professional gauntlet before the salary fairy finally deems them worthy, waves her wand, and puts a smile on your face when you see your paycheck along with the money you saved while spending all your time at work. That’s your reward for all the blood, sweat and tears. And trust me, when you see the complex pieces of code you wrote roar to life and get relied on by thousands of people, that’s more or less the exact moment you’ll either realize it was all totally worth every minute of frustration and exhaustion and you’re in love with what you do, or that the people who just pulled this off only to celebrate by doing it all over again must be completely insane and should be swiftly committed to the nearest mental health facility. If it sounds like IT is very pro-hazing, it is, because we want to ensure that those willing to put in the hard work, who have the tenacity to solve problems that seem like a real life hex placed on machinery by a dark wizard, are the ones who get rewarded, not people whose only job skill is to show up on time and look busy for enough of the day.

And that brings us back to Amazon. Since a lot of programmers expect a long grind until they land that coveted spot in a startup-like atmosphere, there are a lot of companies which gleefully abuse this expectation to run a modern day white collar sweatshop. You’re shoved in a cubicle, assigned a mountain of tasks, and told to hurry up. If you have a technical boss, all he wants to know is when the code will be finished. If you have a non-technical boss, he’ll watch you for signs of slacking off so he can have a disciplinary talk with you because, unable to manage the product, he manages the people. And after being whipped into a crazy, unsustainable pace, you deliver someone else’s vision, then you’re told to do the same thing again, even faster. This is not only how all the stories the NYT quoted paint Amazon, it’s exactly how Amazon, Microsoft, IBM, and IT at large banks and insurance companies work: by the sweatshop system. Working for them is just one long career-beginning hazing that never really ends, and most IT people simply accept this to be the way their world works, swapping tales of their time at a sweatshop like battlefield stories.

We are not upset about it; we just know that companies like Amazon only care about speed and scale, and can afford the golden shackles with which to chain enough warm bodies to computers to crank out the required code, and we make our employment decisions with this in mind. For many techies, a company that will chew them up and spit them out, but looks good to one of the countless tech recruiters out there when highlighted in an online resume, is a means to the kind of job they really want. Sure, you’ll find stories of programmers rebelling because they can’t wear jeans and t-shirts to the office, or tales of on-site catered meals on demand and massages, but that’s a tiny minority of all techies, primarily in California’s tech hubs. Most programmers wear a selection of outfits best fit for Jake from State Farm and spend their days in a cube farm; game rooms with pool tables, consoles, and free booze for coders whose work isn’t just acknowledged in passing, like a long lost uncle’s stint in jail, are things they read about between coding sessions. To them, Amazon isn’t a particularly cruel or bruising employer. It’s a typical one.


College in America is the ultimate solution to any problem involving income. We’re told to go to one to get a four year degree, and suddenly, we’ll have lucrative jobs, fulfilling careers, and just as a bonus, make an extra million dollars over our lifetimes. Or at least that’s how it works in an oft-repeated fairy tale told to teenagers every day across the country. The reality is that college nowadays isn’t just an expensive guessing game, but leaves half of its graduates unable to get enough money together to start their independent lives while saddling them with debt. Not only that, but some 57% of people with jobs say that the work they do simply doesn’t need a degree at all in a trend that held steady for the last decade. And if you think working in a job that needs one will put that sheepskin to use, you’re in for a rude surprise. Just 27% of people actually use their degree in their daily job as it was intended. Things get even worse when you’re actually in your new office because many employers view college degrees with thinly veiled contempt.

Even if you got a job in a field to which your degree is relevant, be prepared for a future of applying for new jobs with ridiculous, unrealistic requirements, and of companies praising college graduates while complaining bitterly about them, refusing to train new workers and then expecting colleges to act as their apprenticeship programs. Even if we do make public colleges free of charge, as some are proposing, all we’d be doing is increasing access to something that has been oversold to the public as a cure for all that economically ails us, while failing to anticipate what happens as automation continues to crater job growth. Companies have already turned a four year degree into a prerequisite for higher paying jobs, but they do not seem to care much about whether the degree they require is actually relevant to the job, as we can see from the practice of routinely employing people with irrelevant degrees. And that prompts the question of why we’d spend our own or taxpayer money on traditional four year programs unless the job at hand actually requires them. Demanding a $30,000 check mark on an application is utterly asinine.

Consider that 70% of people either couldn’t care less about, or outright hate, their jobs. Factor in that between them sits something like a trillion dollars in student debt, that fewer than a third of them are actually doing what they studied, and that most took all those courses and tests, and went into all that debt, just to get a piece of paper in the grand scheme of things. Pile on the stagnant wages, rampant automation, and managerial indifference of today’s workplace, and suddenly it all makes sense. People are miserable because they’re being asked to jump through expensive and painful hoops only to end up somewhere they didn’t want to be, bosses included. The bosses are every bit as unhappy with their jobs as their subordinates, filtering their noxious attitude down until the cloud of toxic ennui consumes the workplace. The drive to get everyone to go to some sort of college, any college, and study something, doesn’t matter what, just something because hey, a million dollars, created a lot of over-educated graduates whose skills aren’t relevant to what employers need, because colleges insist on existing in their own economic vacuum. They don’t cater to the marketplace, they say, because their job is to educate rather than train.

What we need isn’t even more education, or better education, whatever that means; we need a flexible, responsive, and relevant higher-education system with real world apprenticeships and internships as required parts of the degree program. Instead of rushing kids into college armed with a BLS report that was stale by the time it was published, we should encourage them to get some real world experience in a year off from school, and companies should help. It’s just plain irrational for them to expect the kind of workers they want to appear ex nihilo; they should be exposing a new generation to what they actually do day in, day out while those kids are still living with family, able to take lower paying jobs, and still deciding what they want, rather than relegating them to busy work that no one wants to do. If a teenager wants a philosophy or history degree only to find out that no one is going to give him or her a job even when hiring would cost pennies, that would be one hell of a wake up call to reconsider. And if the job doesn’t require specialized skills you can only learn in college, why require a degree? Just let the new apprentice advance up the ladder. How would that not make sense? Why force him or her to waste time instead of learning the job?

College as we know it today was started to give a liberal arts education to the wealthy and their children, people not really concerned with how they’d make a living after graduation, though perpetually in the habit of asking for more spending money. Widespread public literacy and the requirement for all kids to be educated are barely a century old, as is the concept of a steady job with a regular schedule. Most of our ancestors never sat in offices for 250 days a year and got paychecks on a regular schedule. In many ways, the so-called gig economy was the norm until the industrial revolution created an insatiable demand for jobs as we understand them today. In the last 150 years, we’ve adapted colleges to teach skills relevant to many professions, such as medicine and applied sciences, aka the STEM majors, but many four year programs still exist simply for the sake of education, and we aren’t offering attractive incentives to keep the vocational programs up to date and relevant to the marketplace. Education for education’s sake is still the order of the day, which is really bad for current vocational majors.

It’s not that education for the sake of self-betterment is somehow wrong or should be seen as a waste of time and effort, quite the opposite. It’s just that we can’t have it both ways, demanding that colleges turn into vocational schools that also teach expansive theory and general classes for expanding one’s mind, while deriding vocational schools as a refuge where C and D students can perhaps make something useful of themselves, seeing as they weren’t good enough for a four year institution. Millennials have a chip on their shoulders precisely because they had a childhood filled with warnings that flipping burgers and fixing cars was for losers, and then, after a decade and a half of education, finishing with strong GPAs, they’re derided for being “too proud” to flip burgers and fix cars, the very work they were told was a punishment for incompetence. If they had been gently tracked, if vocational schools had been presented to them as a viable and just as honorable an option as four year colleges, and if we stopped demanding college degrees for things no college needs to teach, is it somehow unreasonable to think we would all be much happier and have more ways to find gainful employment while remaining fiscally solvent?

happy alarm

In a quote frequently attributed to John Lennon, a boy was asked what he wanted to be when he grew up, and he replied that he wanted to be happy. He was then told that he did not understand the question, to which he retorted that the person asking him didn’t understand life. And he’s right, we all want to be happy. That’s especially true at work, where most of us will spend nearly a third of our waking hours and deal with countless stresses big and small on a daily basis, seemingly for nothing more than a paycheck. Work should be interesting and give us some sense of worth and purpose, but 70% of all workers are apathetic about, or outright hate, their jobs, which clearly means whatever your bosses are doing to make you happy simply isn’t working. Though I’m making a big assumption that your bosses are even trying to make you happy, much less care that you exist, or that they need to worry about whether you like the job they have you doing. And that, objectively, is perhaps the most worrisome part of it all…

You see, social scientists and doctors have long since figured out what makes you happy, why it is in the interest of every company’s bottom line to keep employees happy, and how your perpetual case of the Mondays could be eliminated, or at least severely reduced. Most American workers, as we can see from the statistics, are dealing with the stress of being at a job they dislike, which increases their levels of cortisol, a stress hormone that hardens arteries and increases the odds of a heart attack. If they’re not there yet, the prolonged stress also causes a host of very unpleasant issues like irregular sleep, disordered eating, anxiety, and depression. In fact, close to a quarter of the American workforce is depressed, which is estimated to cost over $23 billion per year in lost productivity. We also know exactly why people hate their jobs, and contrary to what many business owners think, it has nothing to do with employees being greedy and lazy; it’s usually terrible management policy, and the feeling of being utterly disposable and irrelevant.

People who are unemployed for a year or more are almost as likely to be depressed as working stiffs, and their odds of being diagnosed with depression go up by nearly 2% every time they double their time out of work. So while a bad job can make people miserable, not having one is nearly as bad, if not worse in the long run. And these are just the numbers for one year of unemployment; what lies beyond that could be far scarier, since every trend shows mental health suffers without work or purpose, and physical health quickly deteriorates as well. This leaves us stuck in an odd dilemma. We know that people need to, and want to, work, and we know full well that when they hate their jobs, their performance lags, as does their health, forming a vicious cycle of bad work and disengagement contributing to poor health, worse work, and more disaffection on the job. It seems obvious that something should be done to address this, yet for the last 15 years, there has been no change in the stats. Why? The short answer? Terrible management.

One of this blog’s earliest posts explored experiments in which scientists confirmed that a group often chooses a leader based on little more than bravado, overlooking actual results. In follow-up experiments, we even saw mathematical evidence that companies would be better off randomly assigning their managers instead of promoting them the way they do now. Managers also tend to think they’re a lot better than they actually are, while in reality, half the workforce has put in a two week notice specifically because of their bosses, and despite often giving themselves very high praise, managers are almost as disengaged as their employees, with 65% of them simply going through the motions of another day. Go back to the most frequent reasons why people are not happy at work. Half of them are about being micromanaged, left in the dark, and treated like a disposable widget rather than a person. Employees are primed to see themselves as less valuable, if not useless, and we know that negative priming leads to terrible performance. Tell people they should just feel lucky you don’t fire them, and you’ve effectively set them up for failure.

Think about your own worst bosses. They never hesitated to tell you that you were wrong, or to look down on you, or to watch over your shoulder because they had no trust in you, and they turned any inevitable slip-up or small error, even one you immediately caught and corrected, into some new justification for watching you like a hawk, right? Or if not, did they simply never talk to you about anything, merely drop off more work and expect it to be done silently? Combine those daily putdowns with the constant threat of being outsourced simply to save a dollar, being shoved into an open office where you have no personal space or privacy and face constant distractions, on top of no career progression path in sight, and tell me that’s a job even those who live to work would find engaging. As many organizations grow, managers dissociate from the people they manage, seeing them as little more than numbers on a spreadsheet, because that’s what they are in their daily list of things to do. This breeds disengagement, which breeds frustration, which causes talented employees to flee for greener pastures.

Keeping one’s employees happy should not be the subject of one of those HBR think pieces that makes your executive team “ooh” and “ahh” in a meeting where you run through PowerPoint slides showing how much money you’re losing to turnover, depression, and bad management. It should be the top priority of middle managers and supervisors, because happy employees work harder, show loyalty and dedication, and help recruit more good talent. Yes, spending on benefits like catered lunches, or gym memberships, or better healthcare, or easy access to daycare, or flexible time off policies sounds exorbitant, I know, and many businesses can’t afford all of that. But showing employees that you care, that you listen to them, and treating them with respect pays off as engaged employees become more productive and dedicated. In a knowledge economy, there’s no excuse for the employee-employer relationship to be much like one between a master and an indentured servant. It should be a business partnership with benefits for both parties extending well beyond “here’s your paycheck, now get to work.” The science says so, and besides, when you’re a manager, isn’t keeping employees motivated and productive your top priority?

robot barkeep

Many jokes or satirical pieces are funny precisely because they have a nugget of truth in them, something we can all at least understand if not relate to. This is why a satirical piece about the opening of a new McDonald’s staffed solely by robots due to the management’s concern about campaigns to increase minimum wage to $15 per hour, fooled enough readers to merit its swift entry on Snopes. I can’t blame those who were fooled. After all, we do have the technology and as the Snopes entry points out, there are numerous McDonald’s locations in several European countries boasting high minimum wages where customers order using touchscreens instead of talking to cashiers. Bumping up minimum wages, especially as it’s happening in several rather expensive West Coast cities, could certainly be an impetus for replacing humans with machines the same way it’s being done in numerous other professions. Today, we can shrug at the satire and lament the fact that machines are now advanced enough to make some people obsolete in the job market. But give it some time and this may well be a real report in the news.

The sad thing about this kind of automation is not that robots capable of taking over for humans aren’t being built. They are. In a structured and organized environment like a grocery store or a restaurant, with a finite number of pathways for machines to navigate, well known and understood obstacles, and clearly marked destinations, I would see no problem with robotic waiters summoned by touchscreen or shelf stocking bots today, other than their price tag. That’s right. Humans are doing certain types of work because it’s just cheaper to have them do it instead of machines. I really don’t think that $15 an hour wages can make these robots economically viable, much less cheaper, for many businesses over the next five years, but past that is anyone’s guess, with economies of scale kicking in, the bugs shaken out, the quality improving, and the prices dropping. So it may be best to take that article not so much as satire, but as a warning. Another big wave of automation is coming and we need to be thinking about how to deal with it, not just debate it to death or oppose it with dogmas.
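To make that cost argument concrete, here’s a back-of-the-envelope break-even sketch. Every dollar figure in it is a made-up assumption for illustration, not an actual robot price, so treat it as a way to frame the question rather than an answer.

```python
# Back-of-the-envelope comparison: a $15/hr human vs. a hypothetical
# service robot. All robot figures below are invented assumptions.

HOURLY_WAGE = 15.00              # proposed minimum wage
ROBOT_PRICE = 500_000            # assumed purchase price today
ROBOT_LIFESPAN_YEARS = 5         # assumed useful life
ROBOT_UPKEEP_PER_YEAR = 30_000   # assumed maintenance and support
ROBOT_HOURS_PER_YEAR = 6_000     # a robot can work multiple shifts

def human_cost_per_hour() -> float:
    return HOURLY_WAGE

def robot_cost_per_hour(price: float = ROBOT_PRICE) -> float:
    # Spread the purchase price over the robot's life, add upkeep,
    # then divide by the hours it actually works.
    amortized = price / ROBOT_LIFESPAN_YEARS
    yearly_total = amortized + ROBOT_UPKEEP_PER_YEAR
    return yearly_total / ROBOT_HOURS_PER_YEAR

print(f"human: ${human_cost_per_hour():.2f}/hr")
print(f"robot today: ${robot_cost_per_hour():.2f}/hr")
# If economies of scale halve the purchase price, the math flips:
print(f"robot at half price: ${robot_cost_per_hour(ROBOT_PRICE / 2):.2f}/hr")
```

Under these invented numbers the robot loses today but wins as soon as its price drops by half, which is exactly why a rising wage floor shortens the payback period rather than settling the question outright.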

future highway

As I said before, we really want the Musks and Gates of the world to keep investing exactly the way they’re investing now and we want to keep on encouraging their choices through every tax credit, rebate, and whatever other enticement we can think of. Then we need to take that cash and start pouring it into the sciences and education. Why? Because the biggest reasons those knocked out of the job market by machines and outsourcing will not be able to find new, steady work are a) one-way globalization by nations happily trading goods and services, but severely restricting the flow of labor, and b) lack of skills for new careers and the prohibitively high price tag of acquiring relevant credentials. The former is very, very hard to solve because it’s asking certain countries to put the good of the world above their self-interest, which is political suicide for their leaders. The latter, on the other hand, is something we can take on quickly.

Right now, the typical new degree requires about $18,000, not including books, fees, and living expenses for the next three to four years. And by the time you graduate, your job may already be made obsolete by a new app or saturated with existing candidates. You’ll also have trouble getting enough experience in your newly chosen field to satisfy employers, and you’ll end up working an unpaid internship just to put something on your resume. Oh, and your student debt could only be dismissed by an act of Congress or an alien invasion, and given the current political climate, I would bet cash money on the aliens. Although I’m sure Sallie Mae would keep its employees hounding debtors until the bitter end, even while the buildings around them are being mowed down by the invaders’ lasers, knowing how they typically operate…

This is an asinine state of affairs. We need something closer to formally accredited certification programs, and we really, really should consider making the college degree optional again for fields which honestly don’t involve specialized knowledge requiring years of theoretical study. If we sponsor enough universities to offer them for affordable sums and run actual job training programs with major companies, we’d be giving millions of people displaced by machines new chances in life. There are trade schools and community college programs that try to fulfill this function already, but there aren’t enough, too many are just predatory scams, and too many HR departments will scoff at these credentials when they see them on a candidate’s resume. We need to tackle this as directly as possible, because even management experts consider the way companies hire to be broken and completely illogical, often indicating a deeper management problem.

We also need to take our education system seriously, easing up on standardized testing across the board and setting our sights on helping students discover what they really want to do in life as they get their general education, providing chances for real world experience in their fields of choice. When they can see what their lives would actually entail if they follow their dreams, they’ll make better choices about how to pursue them, rather than play education poker with a college which views them as customers receiving a product for which they borrow money to pay and expect a bang for their buck, not students to be educated so they can build a career by employing the theoretical framework their professors give them.

The common thread in all this is, of course, lowering the financial and time commitment bars for getting to work and learning new skills as the marketplace needs them, by getting rid of nonsensical requirements that don’t actually help students or adults looking to make a change. Not only would it help them immeasurably, it would give them a chance to explore their potential, try more new things in life, and live up to their aspirations without sticker shock. Yes, we could try to create some sort of minimum national income for all citizens, as some suggest. But beyond the many social questions this idea raises, questions we’re obviously not ready and willing to answer, it amounts to passively reacting to a decline in jobs and income growth for the 99% by widening the social safety net, hoping we can change things by doing exactly what got us into this mess in the first place, and that approach would kill the potential of millions.

Today we’re snuffing out engineers, writers, doctors, and designers by under-educating them the first 12 years of their schooling, bilking them the next four, and subjecting their resumes to death by a thousand keywords and buzzwords. Just giving them some money while placing all their goals even further out of reach isn’t going to do any good whatsoever. What we need is a lot more moon shots, crazy inventions, and government aided competitions for solutions to our big problems; big picture thinking that asks “what about tomorrow?” rather than “how do I make a buck today?” We got into this mess by taking the easy way, by assuming things won’t change. More of the same solutions to our problems, like Piketty’s wealth tax, or standardized testing, or more lopsided free trade deals, or pouring our money into another bubble, won’t get us out. We need to rethink our priorities and focus on investing in a new post-industrial world where basics like education, wealth, and jobs, aren’t just zero-sum games.

female robot

According to The Matrix’s extended universe, the machines went to war with humans after they founded their own city called 01 and became an economic powerhouse with which no humans could compete. The nuclear holocaust and weaponized plagues, the forced, artificial breeding and exploitation of humans, were basically us getting the rough end of a business dispute. Obviously, I could write a book as to why this couldn’t happen in the real world — I won’t of course, but I can, just a friendly warning — but new machines are making certain humans obsolete right now, and believe it or not, you’re responsible for it. Automation is taking away more jobs than outsourcing, and only recently has the alarm bell been rung. More than 2 out of 5 jobs might be done by an app in the next 20 years. And that’s a big, big problem for our future economy…

Unfortunately, this techie is contributing to it. One of my old projects involved what amounted to automating a middle management job for a group of closely related industries. You tell the app what you expect done, when, and whom you have available for the job. It will then make sure the job gets done, update you on top stars and slackers, and, through thorough records of how the work is actually performed, learn how the real world differs from your set expectations and adjust those expectations accordingly. I can also see how it could’ve been used to run friendly competitions between workers, or to give basic performance reviews based on what you feel is important. I’m sorry. You may start hating me… now.
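The expectation-adjusting loop described above can be sketched with a simple exponential moving average. To be clear, the class name, fields, and learning rate here are my own hypothetical illustration of the idea, not the actual project’s code.

```python
# Rough sketch of the "learn how the real world differs from your
# expectations" loop. All names and numbers are hypothetical.

class TaskSupervisor:
    def __init__(self, expected_hours, learning_rate=0.2):
        self.expected_hours = expected_hours  # manager's initial estimate
        self.learning_rate = learning_rate    # how quickly to trust reality
        self.history = []                     # (worker, actual_hours) records

    def record_completion(self, worker, actual_hours):
        """Log a finished task and nudge the expectation toward reality."""
        self.history.append((worker, actual_hours))
        self.expected_hours += self.learning_rate * (actual_hours - self.expected_hours)

    def top_performers(self, n=3):
        """Workers who beat the current expectation most often."""
        wins = {}
        for worker, hours in self.history:
            if hours <= self.expected_hours:
                wins[worker] = wins.get(worker, 0) + 1
        return sorted(wins, key=wins.get, reverse=True)[:n]

sup = TaskSupervisor(expected_hours=8.0)
sup.record_completion("alice", 6.0)
sup.record_completion("bob", 10.0)
print(round(sup.expected_hours, 2))  # the 8-hour estimate drifts to 8.08
```

Feed enough of these records in and the app’s picture of “normal” stops being the manager’s guess and becomes what the workforce actually does, which is precisely the part of the middle manager’s job it quietly takes over.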

But wait, how could automation like that be taking away job after job and we’re only now waking up to this fact? Well, as much as we should not blame the victim, it’s kind of your fault. At some point during your day at the office you catch yourself thinking “oh for the occult worship rites of Cthulhu, if only someone could do some arcane programming magic for me so I don’t drown in this paperwork!” And we could. It’s not going to be perfect, you’ll still have to review some of it, click buttons, add notes, approve the results, etc. But as time goes on, you trust the app more and more, the bugs have been shaken out, the once steady focus on a single part of a tedious process has become adaptable code that could be easily modified, and you start thinking again. You’re always doing that. “By the Glowing Orbs of Yog Sothoth! Couldn’t this thing just run with the results of all that data and handle the whole workflow for me?”

You know what? With all the information you fed into it on how to do that, it probably can. Only one tiny little problem. Your job was to deal with all the reviews and approvals of that incoming paperwork. Now, by the time you get to the office and grab your fresh cup of coffee for the day, the machine has already done your daily quota. Let’s say there were a few issues kicked back for review and you had to make a few phone calls. By the time early lunch rolls around, you’re basically done for the day. Some days there are no issues and nothing at all for you to do. Your boss starts wondering whether someone else couldn’t just fold resolving those issues into her routine and free up a few tens of thousands of dollars a year, because your boss gets paid based on a list of objectives that includes cost-effectiveness, and paying someone to do nothing is not what anyone would consider a good use of company resources. And so, it’s time for a layoff.

Now, now, it’s nothing personal really. It’s not that you haven’t been doing a good job; I’m sure you were. But you see, you’re human. And you have needs. Expensive needs. Food, housing, entertainment, kids, a retirement. Computers need none of that. They will do your paperwork in a hundredth of the time, with minimal errors that can be fixed to never happen again, and when they fail to perform, you don’t have to interview or train a replacement with some of those really expensive human needs mentioned above. Just install new software. Of course you also won’t have to pay them, give them lunch breaks, or days off. They are the perfect workers by design, specializing in complex, repetitive, attention-draining tasks. You can’t compete. You also helped hand them your job by having them automate the vast majority of your workday.

So while you and your bosses kept asking the IT department for your machines to handle more and more, and worried about losing jobs to off-shoring, the current wave of jobs lost to software probably snuck up on you. Now, 45% of all jobs are at risk of vanishing in the next few decades, and if your workload happens to be somewhat repetitive and deal mostly with big numbers and paperwork, keep an eye on that whirring box of plastic and silicon in front of you. It wants your job, and will probably get it. Again, nothing personal, just business. While Singularitarians fear that a morally ambivalent AI will one day conquer us as the lesser things made of flesh that we are to its somehow superior mind, the real concern is that the machines will leave half of us unemployed, with very few options to make a living in the current economic climate.

Considering that we're panicking today when official numbers show 9% unemployment, can you imagine the turmoil and uproar when they hit 40% and keep climbing? Populist uprisings would lay siege to Capitol Hill, demanding the lawmakers' heads on sticks! Techies like me would be hunted down for sport! (Ok, I don't think that would really happen.) And while the pundits would lament the exploitative ways of corporations on one channel and tell the unemployed to just go get a job and quit asking for handouts on another, the truth is that those most affected would be stuck.

And all this brings us right back to Piketty and the wealth tax. Not only will capital fueled by the steady hum and blinking lights of a million servers keep skyrocketing, but economic growth on the other side will fall off. Hopefully, machines' work on real problems and in real industries will offset the voodoo investing and trading of today and stabilize the foundation under all those capital gains, but we'll still be left with the problem of having to take from the rich to give to the needy, Robin Hood style. That would very much appease some on the far left, but would be every bit as unsustainable as simply allowing the current fiscal chasm between the 1% and the 99% to turn into an interplanetary divide, because you give the backbone of the economy every incentive to put their money elsewhere or voluntarily trap their assets in an illiquid and hard to tax form. But there's always a way out. It just takes some foresight and willpower, and we'll dissect it with the conclusion of this series of posts tomorrow…

fish kung fu

Robots and software are steadily displacing more and more workers. We've known this for the last decade, as automation has picked up the pace and entire professions face obsolescence with the relentless march of the machines. But surely, there are safe, creative careers no robot would ever be able to do. Say, for example, cooking. Can a machine write an original cookbook and create a step-by-step guide for another robot to perfectly replicate the recipe every time on demand? Oh, it can. Well, damn. There go line cooks at some point in the foreseeable future. Really, can any mass market job not somehow dealing with making, modifying, and maintaining our machines and software be safe from automation? Sadly, the answer to that question seems to be a pretty clear and resounding "no," as we've started hooking up our robots to the cloud to finally free them of the computational limits that held them back from their full potential. But what does this mean for us? Do we have to build a new post-industrial society?

Over the last century or so, we've gotten used to a factory work model. We report to the office, the factory floor, or a work site, spend a certain number of hours doing the job, go home, then get up in the morning and do it all over again, day after day, year after year. We based virtually all of Western society on this work cycle. Now that an end to it is in sight, we don't know how we're going to deal with it. Not everybody can be an artisan or an artist, and not everyone can perform a task so specialized that building robots to do it instead would be too expensive, time consuming, and cost ineffective. What happens when robots build every house, dirt cheap RFID tags on products and cloud-based payment systems make cashiers unnecessary, and smart kiosks and shelf-stocking robots replace the last retail odd job?

As a professional techie, I'm writing this from a rather privileged position. Jobs like mine can't really go away since they're responsible for the smarter software and hardware. There's been a rumor about software that can write software and robots that can build other robots for years, and while we actually do have all this technology already, a steady expert hand is still a necessity, and always will be, since making these things is more of an art than a science. I can also see plenty of high end businesses and professions where human to human relationships are essential holding out just fine. But my concern is best summarized as First World nations turning into country-sized versions of San Francisco, a city that doesn't know how to adapt to a post-industrial future: massive income inequality, insanely priced and seldom available housing, and a culture that encourages class-based self-segregation.

The only ways I see out of this dire future are either unrolling a wider social safety net (a political no-no that would never survive conservative fury), or making education cost almost nothing to retrain workers on the fly (a political win-win that never gets funded). We don't really have very much time to debate this and do nothing. This painful adjustment has been underway for more than five years now and we've been sitting on our hands letting it happen. It's definitely very acute on the coasts, especially here on the West Coast, but it's been making a mess of the factories and suburbs of the Midwest and the South. When robots are writing cookbooks and making lobster bisque that even competition-winning chefs praise as superior to their own creations, it's time to tackle this problem instead of just talking about how we're going to talk about a solution.

[ illustration by Andre Kutscherauer ]

future dystopia

This is not the first time Jaron Lanier has gotten a post on Weird Things. In fact, we met him before as he quickly went off the rails describing AI, then became more incoherent bemoaning all social media as a dehumanizing waste, all the while trying to pick a fight with the Singularitarian version of a machine-directed utopia. Now, the former virtual reality pioneer has moved on to a book length rumination on how technology is killing the middle class by eliminating jobs. And as per his current modus operandi, he starts off with a tiny kernel of a reasonable statement, then proceeds to strap it to the Hyperbole Rocket and launch it into deep space. Is it true that a new software package or a robot might make your job obsolete? Yes, it absolutely can. But does the technology prevent you from getting a new job? No, it doesn't, because the reasons why you're going to have trouble finding work in a globalized economy are more political than technological, and it's the short term greed, partisan bickering, and ignorance of the powers that be that are at fault.

Here's an example. Lanier's opening salvo says that when Kodak was at the height of its market domination of personal photography, it employed more than 140,000 people. Now, he says, the face of personal photography is Instagram, which has only 13 employees. Therefore, jobs were lost because of technology, Q.E.D. Um, no. First and foremost, Kodak died the slow and painful death it did because it first refused to adapt to a world of digital photography, then couldn't, as it wasted years assuming that as long as it could at least sell the tools to develop all those digital photos on glossy paper, it would be fine. It never even considered that paper might be useless for most of the pictures being taken today as social media and smartphones took off. Secondly, as much as tech blogs may have hyped it, Instagram was not "the new face of photography." Take away cheap, mainstream digital photography and it couldn't exist. It was a distribution network, a network that Kodak could've built. But instead of innovating, Kodak slowly withered away.

All right, so what happened to those tens of thousands of people laid off by Kodak? Well, they could find new jobs, because among those laid off were managers, assistants, chemists, and IT specialists who could apply their skills at other companies, doing other things, and I assure you that there aren't 140,000 middle class workers out there who haven't worked a day since a Kodak executive pink slipped them to meet the quarterly numbers. Eeyore would've been more optimistic about their future than Lanier, but then again, for optimism, Lanier would've needed to see technology as something more than a nefarious tool to use and abuse the populace. Where the disciples of Ray Kurzweil see immortality, sunshine, rainbows, and unicorns frolicking in the meadows, he seems to see impending chaos, destruction, and socioeconomic dystopia adapted from the pages of very gloomy science fiction. And yet, Lanier plans to live with this grim future. For a fee, of course. You see, in his version of if-you-can't-beat-them-join-them, he's suggesting that if social media sites use our data to survive, they should pay us for it, and that's how we will start solving the impending job crisis caused by these evil machines taking our place.

Certainly, being paid to mess around on Facebook would be a great deal, but the problem is that paying all of its users even pennies a day would leave the company insolvent. Social media sites simply don't make all that much money because they've fallen short of the promises they made to advertisers when it comes to moving product. All the data you submit simply won't give them nearly as good a picture of who you are and what you like as Lanier and so many tech pundits think it does when they fall for the companies' wild claims. In fact, predicting what a person will do based on his or her social media footprint is an exercise in warm reading. There's plenty of detail to go on, but a lot of this detail may be useless and lead to broad conclusions or incorrect recommendations. Advertisers are not moving nearly as much product as they thought they would, people are giving them incomplete and misleading information, social media makes money but not as much as it thought it could, and now Lanier wants this struggling ecosystem to pay its users for making it profitably mediocre? Does he also have a bridge to sell us?
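The "warm reading" problem is easy to see in a toy example. Here's a deliberately naive keyword-to-category recommender of the sort the argument describes; the data, category map, and function names are all invented for illustration, and no real ad platform is this crude, but the failure mode — one coarse signal bucketing a user into an entire product category — is the same one at any scale.

```python
# A toy illustration of "warm reading" from social media data: coarse
# signals map to broad ad categories, not to what a person actually wants.
# All data and names here are hypothetical, invented for illustration.

CATEGORY_ADS = {
    "running": ["running shoes", "fitness tracker", "protein powder"],
    "travel": ["luggage", "hotel deals", "travel insurance"],
}

KEYWORD_TO_CATEGORY = {
    "marathon": "running", "jogging": "running",
    "paris": "travel", "beach": "travel",
}

def recommend(posts):
    """Guess ad categories from keywords found in a user's posts."""
    categories = set()
    for post in posts:
        for word in post.lower().split():
            if word in KEYWORD_TO_CATEGORY:
                categories.add(KEYWORD_TO_CATEGORY[word])
    ads = []
    for category in sorted(categories):
        ads.extend(CATEGORY_ADS[category])
    return ads

# One offhand mention of a marathon -- run by someone else entirely --
# and the user is tagged a runner and served the whole category.
ads = recommend(["my cousin just finished her first marathon"])
# ads -> ["running shoes", "fitness tracker", "protein powder"]
```

The detail is there, but the conclusion drawn from it is broad and, in this case, simply wrong about the user, which is exactly the gap between what the data promises advertisers and what it delivers.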

Yes, the world is changing. Yes, technology is doing things it's never done before, and it's now taking us places where we've never been, physically, socially, and economically. But instead of trying to figure out a way to productively deal with the change of pace, by investing in R&D and shifting more jobs into innovation and design while letting robots do more of the execution, people like Lanier want to take their ball and go home. The problem is that they can't. Arguing whether our sprawling data centers and powerful software need to be scaled back and dumbed down is like protesting against the wind blowing. You can waste your oxygen telling it to slow down or go in a direction you'd personally find more acceptable, or you can raise some wind generators for an extra kilowatt hour or two to take advantage of the situation. At no point has any civilization ever grown or met a challenge by saying "no, we're done," or demanding payment to participate in new ideas, new economies, and new social structures. Many have died and stagnated doing exactly that, which is why we shouldn't take Lanier's technophobic musings seriously.

empty cubicles

Over the last several weeks, Yahoo CEO Marissa Mayer has received a massive heaping of criticism for her decision to revoke all employees' ability to work from home. From the tech press, most of the wired pundits groaned that Mayer just doesn't get it. News and blog sites constantly harped on the fact that she had a nursery built onsite so she could stay in the office no matter what, and that her expectations about her workers' lives were completely unrealistic. Token contrarians did their half-hearted best in reminding us that working from home is not for everyone, that Yahoo is not exactly a prison but a rather cushy place to work, and that a lot of people in Silicon Valley spend an inordinate amount of time in the office. And of all the sociological and technological bits of punditry created in response to Mayer's decision, I'd say the tech writers did the best job explaining why her choice was ill advised. They're the ones who got it.

A while ago, I tackled some pointed criticisms of telecommuting and why they usually revealed a problem with the organization not understanding what telecommuting is and how to do it, rather than a fundamental problem with the concept. The very same points are present in Mayer's big decision, and what they show is someone obsessed with putting in time at the office, not realizing that the number of hours spent in a cube or the number of cars in a parking lot is not an indicator of how well the company is or isn't working. In fact, according to the usual tech reporter sources, her decision was motivated by how employees came and went, and by the fact that there weren't as many cars outside at 5 pm as she'd like, rather than by some sort of study as to why the number of cars was so low. Are the employees no longer engaged? Are they more productive when they work from home? Herding them back into the office and keeping them there does not answer these questions. It just makes the parking lot full and the current CEO happy, and this is why she was loudly booed by the tech press, which abhors the ass-in-the-chair metric.

So let's say that you're an executive of a tech company going through hard times, and you work late nights in the office with few people around after the informal quitting time. Wouldn't you want to see how well the telecommuting employees are doing? Did they deliver everything? Did they get their projects working? Were they around when questions needed to be answered, and did they call in during the big meeting? Next, since the company isn't doing well, how about trying to find out whether your workers feel trapped, or like they're on a sinking ship and looking for a way out? Is there an actual mission for them to fulfill? Are they being challenged? Do they feel like working for you is advancing their knowledge and careers? If not, of course they're not going to stick around more than necessary. Likewise, you need to look at how productive the company is and how many of the projects it started are on schedule. More hours at the office do not mean more work and better products. Sometimes they just mean more hours behind a computer.

Of course I appreciate that there are projects which need you in the office, and that you have to be there until everything is done and ready to go. We've all been there, especially before a product launch. But when someone spends 10 or 12 hours in the office on a routine basis and very proudly brags about running on five hours of sleep and adrenaline, one starts to wonder what it is precisely that requires this person to spend half a life at work. How much is he or she getting done, and what exactly does it bring to the company's bottom line? Is there a better use of all this time, and if so, what? These are not rhetorical questions. Running a company costs money, and every hour you spend in the office needs to have a reason behind it. If the reason is to show all your subordinates how dedicated you are to work, is that really a good use of a company's time and resources? Does it mean that you're wasting time with e-mails that shouldn't have been written or sent, meetings that are a waste of everyone's time, and fluffy meet-and-greets? And would all this cost a lot less if you just let people work from home and get things done?

minimalist office

Generally, the informal rule around Weird Things is not to pursue the same topic two days in a row, but there are always exceptions, especially in the case of hard data that brings the points discussed the day before into better focus. So while yesterday we talked about the mismatch between what science advisers recommend to the government about the job prospects of STEM grads and what really happens, today we'll peek at the other side of the debate. As noted previously, one of the reasons companies today claim they can't find qualified employees is that they believe the only qualified employee is one who has done the exact job the position entails, and anything other than that is an unwarranted gamble. But they're also very down on colleges overall, with more than half saying that they have trouble finding an applicant pool worthy of their time, and dinging the grads' communication, critical thinking, and problem solving skills in a way that makes it sound as if colleges hand out diplomas to anyone.

And yet, amazingly enough, some 93% say that college graduates work out well and make fair or good employees, with the good employee designation awarded to college graduates more than twice as often as fair, to boot. Likewise, more than half believe that a college degree, especially the four year kind, is just as important as it was five years ago, if not more, and about two thirds will refuse to waive any educational requirement before reading a resume. So to sum all of this up, colleges are churning out barely literate, functionally useless candidates who can't find their way out of a paper bag and are in way over their heads when applying for a job, and yet they become good employees, and a college education is an extremely important qualifier during the hiring process. Wow, and the companies that took this survey criticize college students for a startling inability to communicate, even though their own results are completely contradictory when taken at face value. But you see, there's an underlying thought that clears up these odd results.

One of the more frequently cited complaints from companies is that college graduates can't jump into a new job and hit the ground running. Now, this would make sense, since colleges teach the theory, the basics, and the science behind something, not necessarily how to do a specific job function, and they argue that it's not their job to do so and never has been. To companies who don't want to spend money on training, internships, and long term commitments to their employees, molding their workforce over years rather than quarters, this is unacceptable. They do want college graduates, and they do want the colleges to give them the basics, but they're also looking for colleges to become high end vocational schools. The graduate they want to hire out of school doesn't just have good grades but can plop behind a desk and use industry standard tools when shown to his or her cube. So when a newly minted computer science grad can't get into a chair, load up, say, Visual Studio, and start weaving a UI with jQuery and Knockout, they think that colleges have come up short in their duty to produce qualified workers.

We can go back and forth about all the issues in higher education today. We can talk about all the useless degree programs, the high profile terrible advice given to young students, the fact that not everybody needs to go to college, and that the current college system can actually stall your career if you don't balance your degrees and work history just right, and we should try to address the downright predatory and unfair system of student lending in place today. But even though these discussions need to be held to fix the problems we're facing in colleges, perhaps the most important discussions we need first and foremost are negotiations between academics and the companies that hire their students. Certainly, colleges do fail some graduates, and I've seen perfectly bright and intelligent students left years behind the industry despite going to schools with good names and reputations, and given unrealistic expectations of what their degrees would do in the real world. At the same time, for companies to force students and parents to pick up a big tab for specific job training and turn professors into underpaid corporate trainers is absurd.

We need to move past nebulous qualitative complaints and settle on some real requirements for what we expect from a college education, honestly aware that the system cannot be all things to all people and that there needs to be a balance between learning the theory and learning a job. If we can accomplish that, students will have an easier time deciding on their majors, paying off the loans they took out to go to school, and then getting jobs after they're done. Maybe companies will have more realistic expectations of what colleges can do for them and start training people, just like they did in the good old days, when employees were a lot less disposable than they are today and new hires were expected to grow with the company and expand their skill sets instead of performing the required units of work and then moving elsewhere. A good way to start this sort of debate would be to think through the system from the viewpoint of a student, rather than asking how to hit a macro metric that could easily be changed by the powers that be…