by Davi Barker
Reclaiming Lost Ground
Ethical concerns raised by these experiments led to changes in the APA ethical guidelines for psychological research. When I first pitched the idea of a renegade psychological experiment, I erroneously believed that the APA was a government agency, and that repeating these experiments had been made illegal. I even suspected that the experiment I proposed might be criminal, and that precautions would have to be taken to protect the participants from State reprisal. It turns out I was wrong on all counts.
The American Psychological Association (APA) is the largest professional organization of psychologists in the United States and Canada. At most, its ethical guidelines are a criterion that influences public funding. APA ethical guidelines are not law. In fact, versions of these experiments have been repeated, often in other countries, all with similar results. The most interesting finding, which I rarely hear acknowledged, is that no experiment found a significant difference in the degree of obedience exhibited by men and women. So, the “natural obedience” that some people attribute to women simply does not appear when tested. Instead, it seems that men and women are equally willing to murder an innocent victim if an authority figure tells them to.
Still, these changes to the APA ethical guidelines have essentially neutered any meaningful research on authoritarian sociopathy. There have been a handful of more recent studies that flesh out the findings of these classic experiments, and the implications of their results are no less startling, but they are far less dramatic because of the new limitations. Devoid of shock value, this research doesn’t penetrate mainstream culture, and so it has failed to safeguard society against the dangers of obedience. The scientific study of obedience and authority has largely been relegated to the water-cooler banter of academics.
Nonetheless, it is prudent to examine the other experiments before we design our own. Even those who trumpet the results of the Stanford Prison Experiment and the Milgram Experiment have likely never heard of these less dramatic studies. So, let’s take a look.
Powerful People Are Better Liars
Dana R. Carney is an assistant professor at UC Berkeley’s Haas School of Business. She conducted an experiment intended to discover whether “leaders” and “subordinates” experience the same physiological stress while lying. She found that a sense of power was a buffer against the emotional, cognitive, and physiological stress of lying, and that it increased one’s ability to deceive others, making lying not only easier but even pleasurable.
Fifty volunteers from diverse backgrounds were recruited from Columbia University. Participants completed a “leadership questionnaire” that identified them as “leaders” or “subordinates.” In reality, the selection was random, but the fake survey created an air of legitimacy, which allowed subjects to believe their assignment was somehow deserved. Those randomly designated as “leaders” were given a large office with an executive desk, while those randomly designated as “subordinates” were placed in small, windowless cubicles. Both groups were given an hour of busy work. After that, they engaged in a 10-minute mock negotiation over compensation.
After the mock negotiation, half the “leaders” and half the “subordinates” were given an opportunity, presented by a computer, to steal a $100 bill. They were told that they could keep the money if they convinced the experimenter in their exit interview that they didn’t have it. This is known as a “high-stakes lie of transgression,” which those trained in lie detection can identify from nonverbal cues with 90% accuracy. The exit interviewer asked all subjects the same 10 questions, including 3 control questions about the weather, and didn’t know which subjects had the money and which didn’t.
For most people, lying elicits negative emotions, cognitive impairment, physiological stress, and nonverbal behavioral cues, all of which can be measured. Video of the exit interviews was reviewed to identify behavioral cues, such as fidgeting, involuntary shoulder shrugs, or increased rate of speech. Saliva samples were tested for increases in the stress hormone cortisol. Tests of reaction time were conducted by computer to measure cognitive impairment. And a mood survey assessed participants’ emotional states during the experiment.
By every measure, liars from the “subordinate” class exhibited all of the anticipated indicators of deception. They reported negative emotions, demonstrated cognitive impairment and increased stress levels, and exhibited behavioral cues associated with lying. Liars from the “leader” class exhibited the exact opposite: by every measure, they were indistinguishable from truth-tellers. In fact, they enjoyed reduced levels of stress and increased cognitive function, and reported positive emotions. Only “subordinates” reported feeling bad about lying.
Dana Carney writes,
“Just as kids don’t touch a stove once they learn it burns them, people don’t like to lie because it hurts them emotionally and physiologically. These data suggest that powerful individuals don’t get burned when they touch the figurative stove… What we’ve shown here is that if you give people power, they’re more comfortable lying, and it will be harder to tell they’re doing it.”
Professor Carney speculates that authority could have a similar impact on other unethical behaviors that cause similar physiological responses, such as cheating, stealing, exploitation, reckless behavior, and even political corruption. She concludes, “Power will lead to increases in intensity and frequency of lying.”
In other words, lying comes more easily, and is inherently more pleasurable, to those in a position of authority, even fake authority. Also, positions of authority not only attract dishonest people but actually incentivize dishonesty. Power rewards dishonesty with pleasure.
So, going back to the Stanford Prison Experiment, being a prison guard not only induces cruel and sadistic behavior, but may make it pleasurable. It also makes it easier to lie to cover it up. Returning to the Milgram Experiment, being in a position of subordination induces a willingness to follow unethical orders, and being in a position of authority makes it easier, even pleasurable, to give unethical orders in the first place.
Power and Compassion are Mutually Exclusive
Psychologist Gerben A. van Kleef from the University of Amsterdam collaborated with colleagues from UC Berkeley to conduct an experiment designed to identify how power influences someone’s emotional reactions to the suffering of others.
Unlike the previous experiments we’ve discussed, where participants were randomly assigned to “high-power” and “low-power” roles, in this experiment 118 undergraduates from diverse backgrounds filled out a questionnaire about their sense of power in their actual lives and were identified as “high-power” or “low-power” individuals. Subjects were randomly paired off to take turns sharing stories in which they had experienced great pain or emotional suffering.
During the exchange, both participants were hooked up to electrocardiogram (ECG) machines to measure their autonomic emotional regulation. Individuals faced with psychological stress typically exhibit increased respiratory sinus arrhythmia (RSA) reactivity, resulting in a lower heart rate and a calmer, more relaxed state. In addition, after the exchange all participants filled out a second questionnaire describing both their own emotional experience and what they perceived of their partner’s emotional experience.
The results were unmistakable.
For starters, increased stress readings in the storyteller correlated with increased stress readings in low-power listeners, but not in high-power listeners. In other words, low-power individuals respond to the suffering of others with emotional reciprocity, but high-power individuals experience greater emotional detachment.
According to self-reporting after the experiment, high-power individuals felt less compassion than low-power individuals, as might be expected. In addition, high-power listeners correctly identified the emotions of their partners, but self-reported being unmotivated to empathize with them. In other words, high-power individuals see the suffering of others; they just don’t care. This could explain why high-power listeners don’t experience emotional reciprocity. They are simply indifferent to the suffering of other people.
Also, storytellers with high-power listeners reported higher distress than storytellers with low-power listeners. This could indicate that high-power individuals’ lack of compassion actually exacerbates the suffering of those around them.
After the experiment, researchers asked whether participants would like to stay in touch with their partners. As you might expect, the low-power subjects liked the idea, but the high-power subjects didn’t.
Now... let’s speculate.
In the Stanford Prison Experiment we saw subjects randomly appointed as “high-power” individuals torture subjects randomly appointed as “low-power” individuals. Van Kleef's research may explain why: Once the subjects of the study were given a taste of power, they simply no longer experienced reciprocal emotions with those who were powerless. And worse, the lack of compassion of the “guards” would have exacerbated the suffering experienced by the “prisoners.”
How can we apply these findings to the Milgram Experiment scenario, where an authority orders a subordinate to murder a stranger? We already know that 65% of subordinates in that experiment obeyed orders that they viewed as unethical. In this case the authority is the “high-power” individual, the subordinate is the “low-power” individual, and the stranger is the one who suffers. So, we can speculate that the authority will feel less compassion for the suffering of the victim, which would make giving the lethal order easier for him (maybe even pleasurable). But we can also reason that the authority is less likely to feel compassion for the emotional distress of the subordinate. That means that the person making the life-or-death decision is the one most cut off from its consequences.
But here’s where it gets sick. The subordinate, a “low-power” individual, has no such emotional detachment. As we saw in the Milgram Experiment, some of the subordinates “exhibited signs of extreme stress.” The subordinate is stuck in the impossible position of being conditioned to obey the authority while experiencing emotional reciprocity with the victim. Is it any wonder we see high rates of suicide and post-traumatic stress disorder among soldiers, but not among the ranking officers who give them their orders?
A Silver Bullet Against Power
It has become a cliché that the most outspoken anti-gay politicians are often closet homosexuals themselves, and the champions of “traditional marriage” are frequently engaged in extramarital affairs. Nothing is more common than the “fiscal conservative” who demands ridiculous luxuries at the expense of taxpayers, or the “anti-war” progressive who takes campaign donations from the military industrial complex. Well, now it seems there’s some science behind the hypocrisy of those in power.
Joris Lammers, from Tilburg University, and Adam Galinsky, of the Kellogg School of Management, conducted a battery of experiments designed to test how a sense of power influences a person’s moral standards, specifically whether the powerful are more likely to behave immorally themselves while espousing intolerance for the same behavior in others. In each of five experiments the method of inducing a feeling of power and the method of detecting these double standards were different, but in every one the results were about what you’d expect: powerful people judge others more harshly but cheat more themselves. What’s especially interesting is the last experiment, in which distinguishing between legitimate power and illegitimate power produced the opposite results.
The first experiment was designed to measure the discrepancy between the subjects’ expressed standards and their actual behavior. As in previous experiments, subjects were randomly assigned to “high-power” and “low-power” roles. To induce these feelings, “high-power” subjects were asked to recall an experience in which they had felt a sense of power, while “low-power” subjects were asked to recall an experience in which they had felt powerless.
Each subject was asked to rate how serious a moral infraction they considered cheating to be. Then they were given an opportunity to cheat at dice: they were promised a number of lottery tickets equal to their roll of two dice and then allowed to self-report the result. The “high-power” subjects rated cheating as a more serious moral infraction than “low-power” subjects did, but were also more likely to cheat themselves.
In the second experiment, participants took part in a mock government exercise. Half were randomly assigned to “high-power” roles in which they gave orders, and half were randomly assigned to “low-power” roles in which they took orders. Then each group was asked about their feelings regarding common minor traffic violations, such as speeding or rolling through stop signs. As expected, “high-power” subjects were more likely to give themselves permission to bend the rules if they were running late for an important meeting, but less likely to afford other drivers the same leniency.
In the third experiment, participants were divided as in the first experiment, by recalling a personal experience in which they had felt either powerful or powerless. Then each group was asked to describe their feelings about common forms of tax evasion, such as not declaring freelance income. As expected, “high-power” subjects were more likely to bend the rules for themselves, but less likely to afford other taxpayers the same leniency.
In the fourth experiment, the sense of power was manipulated in an unusual way. All participants were asked to complete a series of word puzzles. Half the participants were randomly given puzzles containing high-power words such as “authority,” and the other half were given puzzles containing low-power words such as “subjugation.” Then all participants were asked about their feelings regarding keeping a stolen bike that had been found abandoned on the side of the road. As in all the previous experiments, even with such an insignificant power disparity, those in the “high-power” group were more likely to say they would keep the bike, but also more likely to say that others had an obligation to seek out the rightful owner or turn the bike over to the police.
The fifth and final experiment yielded by far the most interesting results, and it is my hope that this is the direction this type of research takes in the future. The feeling of power was induced as in the first and third experiments, by asking participants to describe an experience of power from their own lives, with one important distinction: the “high-power” class was divided into two groups. One group was asked to describe an event in which they felt their power was legitimate and deserved, and the other group was asked to describe a situation in which their power was illegitimate and undeserved.
The hypocrisy found in the previous four experiments emerged only when “high-power” subjects viewed their power as legitimate. Those who viewed their power as illegitimate showed the opposite pattern, which the researchers dubbed “hypercrisy”: they were harsher about their own transgressions and more lenient toward others.
This discovery could be the silver bullet that society needs to put down the werewolf of corrupt authority. The researchers speculated that the vicious cycle of power and hypocrisy could be broken by attacking the legitimacy of power, rather than the power itself. As Lammers and Galinsky write in their conclusion:
“A question that lies at the heart of the social sciences is how this status-quo (power inequality) is defended and how the powerless come to accept their disadvantaged position. The typical answer is that the state and its rules, regulations, and monopoly on violence coerce the powerless to do so. But this cannot be the whole answer… Our last experiment found that the spiral of inequality can be broken, if the illegitimacy of the power-distribution is revealed. One way to undermine the legitimacy of authority is open revolt, but a more subtle way in which the powerless might curb self enrichment by the powerful is by tainting their reputation, for example by gossiping. If the powerful sense that their unrestrained self enrichment leads to gossiping, derision, and the undermining of their reputation as conscientious leaders, then they may be inspired to bring their behavior back to their espoused standards. If they fail to do so, they may quickly lose their authority, reputation, and—eventually—their power.”
This final experiment offers some hope that authoritarian sociopathy can not only be stopped, but driven into reverse, not by violence or revolution, but simply by undermining the perceived legitimacy of those in power. Viewed this way, it could be said that comedians such as George Carlin or Penn Jillette have done more for freedom than politicians like Ron Paul or intellectuals like Murray Rothbard.