Archives For behavior

creepy mannequins

Every psychology class mentions Stanley Milgram’s famous experiment to determine how far people could be pushed to execute horrific orders, and it has since become the template for today’s experiments measuring how to awaken our inner sociopath without interfering with normal brain function. We already know that enough money will make people reconsider the natural human aversion to harming others, especially if they don’t actually have to see the pain they inflict firsthand. But what actually goes on in the brains of those who follow orders or inducements to hurt someone? Are they suffering some internal crisis when they harm others, are they simply pushing the button with no sense of their own agency, or is something more complicated going on? To find out, European researchers repeated Milgram’s experiment with several important modern twists. They added buttons, played a tone whenever a button was pressed, and recorded the electrical activity in the participants’ brains as they did their part.

Now, Milgram’s inspiration for his research was the excuses of Nazis at Nuremberg who defended themselves by saying they were simply following orders, so his tests focused on how orders are delivered and the subsequent reactions, making verbal commands a key part of the setup. In this follow-up, how orders were delivered didn’t matter, only the fact that an order was issued, so the researchers played a tone after participants pressed a button they were told to press. If the subjects were making conscious decisions and sticking to them, previous research said, the tone would seem to come notably faster after they pressed the buttons than if they were simply acting on autopilot. We’re not sure why this happens, but accidental events seem to be processed more slowly than intentional ones, which is why gauging the subjects’ subjective sense of how quickly the tone followed their requested or voluntary actions was a crucial part of the experiment. Some participants were free to choose between applying a small electric “shock” to an anonymous victim, taking £20 from him or her, or pressing a button that did nothing, serving as the control group. Others were simply told which buttons to push by the researchers.
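The timing measure described above can be sketched in a few lines. This is purely illustrative: the millisecond values below are invented for the sake of the example, not data from the study, which only tells us that ordered actions feel like they take longer to produce the tone than freely chosen ones.

```python
# Illustrative sketch of the "intentional binding" measure: participants
# report how long the gap between their button press and the tone felt.
# All numbers here are made up; the real study's data is not public in
# this article, only the direction of the effect.
import statistics

# Hypothetical reported press-to-tone intervals, in milliseconds
voluntary = [180, 200, 190, 210, 175]  # freely chosen presses feel quick
ordered = [260, 280, 250, 270, 265]    # ordered presses feel slower

binding = statistics.mean(ordered) - statistics.mean(voluntary)
print(f"mean voluntary interval: {statistics.mean(voluntary):.0f} ms")
print(f"mean ordered interval:   {statistics.mean(ordered):.0f} ms")
print(f"coercion adds roughly {binding:.0f} ms to the felt interval")
```

A positive difference between the two means is what the researchers took as the signature of a reduced sense of agency in the participants who were just following orders.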

What they found was quite interesting. First and foremost, the group told what to do reported a longer time between pressing the button and hearing the tone, exactly as expected. This meant that taking orders made them feel less in control of their actions, their brains evaluating what just happened as an involuntary action despite requiring their agency to be carried out. Secondly, a thorough analysis of their EEGs, focused on activity known as event-related potentials, or ERPs, used to gauge the cognitive load of an action in response to a stimulus, showed that they processed their decisions significantly less than the control group. In other words, ordering someone to perform a task makes them feel as if they’re not actually the ones doing it, and gives the task and its consequences less thought. Revealingly, topographical maps of the neural activity support this notion: the areas over the prefrontal cortex, the seat of decision-making, showed the most activation in both groups, but were a lot dimmer in the participants taking orders. As scary as it sounds, it seems that our brains might just be wired to follow orders with less thought and care than they give our own choices. Why? We’ll need more studies to find out, but I’d bet it has to do with us evolving as a social species rather than loners.

Jason mask

Here’s a fun fact for you. If you zap someone with a powerful enough magnetic field, you can change that person’s behavior, and not always for the better. In fact, you could even zap someone into a state of cold, callous sociopathy if you know where to aim, at least for a short while. Yes, the effects do wear off, but it seems perfectly plausible that the same effect could be harnessed and prolonged by a chemical cocktail, and we’ve long known that behavior can be altered with the right tools. So of course conspiracy theorists around the world wondered whether sinister military officers or politicians with little concern for their fellow humans would start injecting people with a psychopath-killer-in-a-syringe serum and setting them loose on a battlefield to do unspeakable evil, acting as shock troops before or during an invasion. The answer is twofold. In theory, yes, they could. In practice, the results would vary widely and could easily backfire, and we already have plenty of sociopaths available for building a small army of shock troops. Just ask the Pakistani ISI if you’re curious, and while you’re at it, ask how well it’s worked for them…

Basically, the issue here is that there are limits to how much you can change someone’s behavior, and for how long. In the article above, the subject feels less empathetic and inhibited, but his psychopathy extends only to taking more risks in a video game and pocketing an uncollected tip, which he promptly pays back after returning to normal. His comparison point is a special forces soldier who had extensive training and whose skills were honed in real wars. This doesn’t tell us much, because military training is a major variable that’s overlooked in such stories. How likely is our non-military test subject to injure or kill someone in a real fight? Probably not very, and here’s why. If you ever take a martial arts class, you’ll spend the first few weeks apologizing whenever you manage to land a punch on your sparring partner, while the instructors yell at you for going far too easy on your blows and tackles. You’ll shy away from jabs, and your natural instinct will be to flinch or fall back when attacked, not to calmly stand your ground. Humans are social creatures, and in the vast majority of cases they’re averse to hurting each other.

True, we can be induced into hurting others with money or threats, and we do know how to train someone not to shy away from fights and to overcome the natural aversion to real violence. But the experimental subject in question appears to have had no combat training or martial arts background. He may be less averse to getting into a fight because his impulse control was radically lowered, but chances are he’ll run for it if he picks a fight with someone able to hold his own, or when he realizes he’s about to get hurt. Likewise, he’s unlikely to punch as hard or as accurately as someone with real training. All in all, he may be a major menace to unwatched tips in a bar and in Grand Theft Auto, but he’s most probably not a threat to flesh and blood humans. His former special forces friend? Absolutely, but he seems to have no need to be zapped into an emotionally detached state and keeps his impulses well under control. On top of that, were we to zap or drag a random person into psychopathic malice, there’s simply no telling whether he would turn on his friends and handlers, a chance no evil, self-respecting mastermind of the New World Order would want to take.

And that brings us back to the very real problem of an abundance of psychopaths willing to do a dirty job for someone willing to pay. Just look at what happened in Afghanistan during and soon after the Soviet occupation. The mujahedeen, trained to fight a guerrilla war against the Red Army and to become proxy shock troops for the ISI in a potential war with India, were not given drugs or magnetic bursts to the brain. They were recruited based on their religious convictions, trained to channel their loathing for the occupying infidels into violence, and let loose on Soviet troops. No artificial inducement or neural intervention was even needed. Today, they quite regularly turn on their former handlers, kill people who displease them with near impunity and absolutely zero moral qualms, and have generally proved to be a far bigger threat and liability than an asymmetric military asset. Considering how dangerous real psychopaths are, why create an entire army of them with experimental chemicals or magnetic beams? If indiscriminate murder is your goal, fully automated robots are the easier way to go, not average people or soldiers just out of basic training with their impulse control drugged and zapped out of existence…

Humans have a very strange ability to understand how irrational we are and yet build complex systems which demand that everyone involved be perfectly rational. In his appearance on The Daily Show, writer Justin Fox hit the nail on the head when talking about the ideas underlying the modern stock market. Rather than trying to predict how to deal with greed, excess, or impulsive decisions, these ideas treat every trader as a machine: cold, calculating, and rationally thinking through every obstacle for maximum profit or minimum loss. But in the real world, where the complexities of human behavior tend to intrude on narrow models, it doesn’t work that way. If it did, there would have been no subprime bubble. Everyone would have known when to stop issuing loans.

stock broker

Here’s a good example of how powerful human emotions are in financial decisions. A recent study using an ultimatum game found that more than 70% of people will punitively reject a bad deal even if it means they get absolutely no gain from the exercise. Ordinarily, according to the idea that humans should be rational when it comes to money, any offer in the ultimatum game should be accepted. In the game, one person proposes how to split some amount of cash, and the other can either accept, so both walk away with at least a few dollars, or reject the offer, so both walk away with nothing. Obviously, walking away with even a tiny sum leaves the responder a bit better off than when he or she started. But when the splits stray far from 50/50 or 60/40, rejections soar because the slighted participants no longer care about the money. It’s about bringing the hammer down on a greedy opponent. Their goal is no longer to win money; it’s to deny the other person any of it.
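The gap between the textbook prediction and the observed behavior can be shown with a toy version of the game. The 30% rejection threshold below is an illustrative assumption for the "behavioral" responder, not a figure from the study, which only reports that rejections soar as splits become more lopsided.

```python
# Toy ultimatum game over a $100 pot. A textbook "rational" responder
# accepts any positive offer, since something beats nothing. The
# "behavioral" responder rejects offers below an assumed fairness
# threshold (30% here, an illustrative number) to punish the proposer,
# even though rejection costs both players everything.
POT = 100  # dollars to split

def rational_responder(offer):
    # Classical theory: any positive amount beats walking away with zero.
    return offer > 0

def behavioral_responder(offer, threshold=0.3):
    # Observed behavior: lowball offers get rejected out of spite,
    # even at a personal cost.
    return offer >= threshold * POT

for offer in (50, 40, 20, 10):
    r = "accept" if rational_responder(offer) else "reject"
    b = "accept" if behavioral_responder(offer) else "reject"
    print(f"offer ${offer}: rational {r}, behavioral {b}")
```

The two responders only disagree on the lopsided splits, which is exactly where the study found real people diverging from the rational-agent model.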

A scholar behind the current ideas of how markets should function would be appalled at this behavior. Why in the world would anyone care about punishment more than money? Because, as the research team behind the study clearly notes, humans are social mammals. Our instinct to punish those who wrong us, even if it means spending resources and effort to do it, is an evolutionary asset. It helps us keep an otherwise unruly society orderly. It’s the mentality which created our justice systems, prisons, and capital punishment. To us, an 80/20 split in an ultimatum game isn’t a chance to make a buck. It’s an offense that must be dealt with in no uncertain terms. And it’s good for the market, because terrible terms which give one party an unfair advantage alienate customers, limiting the offender’s opportunities and encouraging a competitor to offer better, more transparent terms. Well, until greed overshadows our ability to pay attention to what those terms actually are…

See: Yamagishi, T., et al. (2009). The private rejection of unfair offers and emotional commitment. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.0900636106