In the early 1960s, Yale professor Stanley Milgram conducted a series of famous psychological experiments to measure people’s obedience to authority. A volunteer was instructed by an experimenter to help administer a simple test to a subject in another room. Cards were drawn to determine which of two “volunteers” would play each role, but the cards were rigged such that the actual volunteer was always given the same role each time, and the other role was played by an actor. This gave the volunteers the impression that the role they happened to be assigned was arbitrary.
The test subject (i.e., the actor) could be heard but not seen by the volunteer. Whenever a test question was answered incorrectly by the subject-actor, the volunteer was instructed to administer a shock by pressing a button on a control panel. These shocks began at a negligibly low voltage, but with each wrong answer, the shocks were to be increased in 15-volt increments until eventually the final level of 450 volts was reached. The shocks were fake, so no one was physically harmed, but the volunteers didn’t know that.
As these shocks were administered, the subject in the next room (who again could be heard but not seen by the volunteer) would express discomfort in a manner befitting the severity of the shock, including complaining of a heart condition, screaming louder and louder, and banging on the wall. After a certain voltage was passed, the shock-receiver eventually fell completely silent (as if to simulate unconsciousness or death). Even after this point, the volunteer was instructed to continue administering shocks.
Milgram’s experiment was intended to test how far the average person would go. At what point would they refuse to give out any more shocks, despite being told by the experimenter to continue?
If you haven’t already heard of this experiment, what would your prediction be? What percentage of people would go all the way to the end?
Before the first experiment was run, senior psychology students polled by Milgram collectively predicted that only 1.2% of the test volunteers would go all the way to 450 volts. They expected that about 99% of people would stop before that point, figuring that most people are not so sadistic. Similar polling of professional psychiatrists yielded a prediction that about 0.1% would go all the way to 450 volts, meaning that 99.9% would stop before that point.
What was the actual result?
In reality, 65% of volunteers made it all the way to the end of the experiment, which required pushing the 450-volt button not just once but three times in a row.
This experiment has been repeated numerous times with highly consistent results, even when the experiment was updated to conform to today’s stricter experimental ethics guidelines. Compliance rates are generally in the 61-66% range, meaning that most people go all the way to administering the full 450 volts.
Milgram himself reported 19 variations on this experiment that he conducted. By tweaking different factors, such as whether a fellow volunteer participant (played by an actor) voiced strong objections and quit, or obeyed until the end, Milgram found that the compliance rate could be pushed up or down. In one variation he was able to achieve a compliance rate of 92.5%, while in another he was able to get it down to 10%. Peer pressure had a strong influence on the results.
Incidentally, the compliance rate was the same for men and women alike, so the female volunteers were no more or less obedient than the male ones.
Instead of being blindly obedient or downright sadistic, the volunteer would usually object to going further at some point, often around 135 volts. In response to each verbal objection voiced by the volunteer, the experimenter would instruct the volunteer to continue with the following statements:
- Please continue.
- The experiment requires that you continue.
- It is absolutely essential that you continue.
- You have no other choice. You must go on.
If the volunteer objected a fifth time, then the experiment was halted. And of course the experiment would end if the volunteer objected more strongly at any point such as by getting up and walking out of the room. So the experimenter would eventually take no for an answer — but not right away.
There were also a few custom responses that the experimenter would give as replies to specific types of objections. For instance, if the objection was about doing irreparable harm to the subject, the experimenter would assure the volunteer that although the shocks were strong, no permanent tissue damage would occur.
As payment for participating in the experiment, which took about an hour, each volunteer received $4.
How Nazi Are You?
Milgram’s experiments were partly conceived in response to the trials of Nazi war criminals after WWII. Did the Nazis have to recruit unusually sadistic people to implement their plans? Did they have to use fear and force to get people to obey? Or is it actually much easier to get people to obey a perceived authority, even when it runs contrary to the person’s conscience?
I recently returned from a 30-day trip to Europe, during which I visited Germany, the Netherlands, and the United Kingdom. This was an interesting progression as it relates to WWII since I went from the aggressor (Germany) to an occupied country (the Netherlands) to one of the victors (UK). I visited WWII-related museums and sites in each country and talked to locals about their perceptions of this phase of European history.
It was a compelling experience to visit some of the actual WWII-related locations I’d previously only read about in school or had seen in movies. I visited an old WWII bunker. I walked through the Secret Annex where Anne Frank hid from the Nazis. I explored the underground war rooms used by Winston Churchill and his staff. I caught trains at some of the stations that were once used to transport Jewish people to concentration camps.
Other than Pearl Harbor (which I visited when I was a teenager) and various constructed memorials, the USA is largely devoid of significant WWII sites. I can’t just stroll around Las Vegas and point to places where bombings or battles occurred. But when walking around certain European cities, such locations are hard to miss.
In many American WWII films, the Nazis are depicted as a society of evil, inhuman sadists. A great example of this portrayal can be seen in the role of Amon Göth (played by Ralph Fiennes) in the movie Schindler’s List. The real Amon Göth, who was the commandant of a concentration camp, would do things like make the Jews pay for their own executions, taxing them to compensate the Germans for the bullets used to kill them. After the war he was tried as a war criminal, found guilty, and executed by hanging at age 37. Apparently it took three tries to hang him before the execution was successful, due to a miscalculation of the rope length. As an SS Captain in charge of a concentration camp, Göth had plenty of people under his command to carry out his orders. So why did people obey him? More importantly, how many factors (like the threat of punishment) can we strip away and still see people obeying orders from someone like Göth?
Stanley Milgram set out to discover some deeper truths. What would it take for a typical person to override his/her conscience and obey commands to hurt or kill others? As it turns out, for most people it doesn’t take much at all. If someone assumes an air of authority and tells people what to do, there will be plenty of people willing to obey, even if the commands contradict a person’s sense of ethics and morality.
According to Milgram, “Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority.”
A key to the Milgram experiments is that a person is gradually eased into overriding their conscience. They aren’t instructed to give the 450-volt shock right away. Instead they begin with a voltage that isn’t even noticed. They progress from there in small increments.
The Nazis used a similar strategy. They didn’t immediately begin shipping Jews to gas chambers. They changed the climate and the culture slowly, such as by producing lots of propaganda, progressively restricting Jews’ rights, increasing Jews’ taxes, isolating the Jewish community in ghettos, and then moving them into camps. They started small and turned the dial several notches each year. And people went along with each incremental step, which was a little stronger than the previous step.
It’s been interesting to observe some social changes that are happening today, which strike me as part of a gradual progression. For example, Microsoft recently announced the Xbox One device, which comes with a Kinect camera system. This device actually watches you while you use it. It can closely monitor your eye movements, allowing it to determine exactly which part of the screen you’re looking at. It can register small shifts in your body movement. Supposedly it can even detect an increase in your heart rate, which tells it which part of a commercial may be affecting you emotionally.
Years ago a device like this would have seemed unconscionable and incredibly creepy. Some people will undoubtedly perceive it as such today, but as part of a progression towards greater personal surveillance and less privacy, this can also be viewed as just another incremental step. It’s only an entertainment system, right? But it also helps you get used to putting a surveillance device in your home, one that watches you, collects data about you, and rewards you in accordance with behavioral conditioning practices (such as by giving you points for watching commercials). If you object to some aspects of this, you may choose to disable those aspects initially, but of course not everyone will. Society will have time to get used to each progressive step, just like Milgram’s volunteers.
You may object verbally of course, but your verbal objections won’t be an issue if you still tolerate the outcome in the long run. As Milgram discovered, just about everyone objects at some point, but most of them still obey.
Another example is Google Glass, which is slated to be released next year. This device has already been banned by many businesses, including Las Vegas casinos, largely because it can function as an unwelcome surveillance device. Google claims that the privacy concerns regarding Glass are overblown. Cell phone cameras are already ubiquitous, and this is just one incremental step beyond that.
And of course if various authorities tell us these next steps are okay, nothing to worry about, then it shouldn’t be a big deal, right? 😉
I’m not saying that this is a terrible thing per se. But I do think these are interesting examples of how progressive acquiescence can be used to change behavior, one incremental step at a time. When people object, it doesn’t necessarily kill the progression. It just means that people may need more time to get used to the current step before moving on to the next one. Verbal objections may slow the progression, but they aren’t sufficient to stop it.
If Milgram could get people to issue painful/lethal electric shocks by having an authority figure tell them to do so, you might imagine that it’s even easier to get people to take less extreme (but still questionable) actions, such as working long hours for low pay doing meaningless busywork.
Even though many people would naturally object to throwing so much time at empty and unfulfilling work, they’ll still go ahead and do it if someone tells them to. Most people with jobs don’t like the work they do, but they still show up, even if the incentives aren’t very compelling.
What if you want to quit, but your boss, your parents, or some other perceived authority figure objects? Will you surrender and go back to work if they say something like this:
- Please don’t quit.
- We need you to keep working.
- Many people are out of work. You should be glad that you have a job at all.
- You have no choice. You have to go to work.
Getting people to do meaningless work is actually pretty easy. Most of the time, you can just have an authority figure like a boss command them to do it, and they will.
Is this a trap you’ve fallen into?
Another area where people succumb to being overruled by authority is in their relationships.
What if you want to split up, but your partner objects? Now what if your family objects? Or your partner’s family? Or your mutual friends? Or what if you sense that society at large objects to your desire to split up? What if you’re married? Do you have the inner resources to make this decision for yourself without being overruled by someone else?
What’s especially interesting about Milgram’s experiments is that just about every volunteer resisted in some way. They verbally questioned the experiment. They sweated, squirmed, groaned, or dug their nails into their skin. Some said they didn’t want the $4 payment. A few even had seizures. The experiment produced obvious signs of stress and discomfort in the volunteers. Yet the majority of them still obeyed all the way to the end.
We see these results all the time when people stay stuck in unfulfilling jobs or relationships. They show obvious signs of distress. Some complain. Some have nervous breakdowns. Some read self-help material incessantly, looking for a way out. Yet the majority still stay in those situations, lacking the inner strength to leave.
Do you allow anyone in your life to wield authority over your relationship decisions? Do you need anyone’s approval or fear their disapproval?
Students and Authority
Many students get suckered into high-stress situations at exam time. They’re told by authoritative professors and administrators that they must be tested and that exams are necessary. But the apparent necessity of exams is a manufactured illusion of academic life. Outside of such domains, the academic examination process is largely irrelevant. No one outside of school cares what exams you have or why you think you need them. In fact, many people consider the academic testing process ludicrous and dysfunctional.
During my first run at college, I disliked exams, so I declined to show up for many of them. A predictable consequence was that I failed many classes and was soon expelled. But I learned that the decision to take or not take any exam was mine to make. No one ever forced me to take a test — my permission was always required. I could see that behavioral conditioning techniques, such as rewards and punishments, were being used to compel me to behave a certain way.

Once I saw through this silly game, I became free to choose for myself whether to play the role of academic student, knowing that it was entirely my choice and that it was impossible for anyone to force me to be tested if I didn’t want to be tested. This turned out to be a powerful mental shift. When I returned to college later, I found it easy to ace my exams without undue stress and generally without needing to devote extra time to studying. I understood that submitting myself to testing was always my choice and never something I had to do. I could only be tested if I chose to be tested.
As a reward for taking and passing certain exams, you may receive a slip of paper that says you know something, but you’ve probably forgotten most of that material a week after the exams anyway. The purpose of the exam was to temporarily convince someone else that you know what they want you to know. What that slip of paper really says is that you’re obedient to authority and that you’ll do the assignments and take the tests that are given to you, and that in itself is something that many employers value. But if you don’t care to submit to another authority, then that slip of paper is of minimal utility. I have one in a box in my garage from my university days, and no one has ever asked to see it. In retrospect, I regard the effort required to earn it as largely a waste of time, even though I did it faster than most people. (Incidentally, if you still want that slip of paper and you’d like to graduate faster than normal, read 10 Tips for College Students.)
If you’re currently a student, recognize that no one has authority over you. You don’t actually have to show up to class, take exams, and do busywork. Participating is your choice, and no one can force you to play the role of academic student without your permission. The best they can do is apply behavioral conditioning techniques to try to get you to submit to their authority, but if you see through their silly games of rewards and punishments, those techniques lose a lot of their power. You may still choose to play the academic game for your own reasons, which is perfectly fine. Just don’t fall into the trap of thinking that any part of it is being forced upon you. The whole thing is your choice.
Now that you know about this tendency of human beings to obey authority even when strong objections may be present, how shall you deal with this?
The first step is to become aware of any areas in your life where you may already be succumbing to the pressure of authority and allowing it to override your own morals, ethics, values, or desires.
If you value your time, then where are you feeling pressured to waste time or to invest in activities or responsibilities that aren’t actually important to you? For example, how much time did you invest in social media or web surfing this week? Was that a conscious decision on your part, or did you behave that way because someone or something else was conditioning your behavior with the promise of updates, information, or the illusion of pseudo-connection?
If you value freedom, where have you been encouraged to give up some of that freedom in ways that feel uncomfortable to you? What do you feel compelled or obligated to do this week? What are your have-tos? Are those genuine needs you’ve decided to fulfill, or were you progressively lured into a trap by giving your power away unnecessarily? For instance, did you choose to take on as much debt as you have now, or were you subtly enticed to go there, one easy step at a time?
What areas of your life are causing you signs of distress? Where are you sweating, squirming, complaining, or biting your nails? What parts of your life are causing you the equivalent of mild seizures?
Notice where some part of you is objecting to the state of your reality. Is this an area where you’re still obeying some kind of authority, even if you’re not happy with the results?
As you become aware of your tendency to submit to authority, even if it’s hard to stomach all the areas where you’ve been doing so, this will increase your alignment with truth. At first these realizations might sting a little. But please don’t allow yourself to sink back down to a place of denial and ignorance. Do your best to maintain this level of awareness, even if you don’t feel ready to act on it yet.
A run of one of Milgram’s experiments with a single volunteer took about an hour. That didn’t give people much time to think about their decisions — they were caught in a high-pressure situation. In real-life situations, however, you’re more likely to have some time to pause and reflect on your decisions. This is especially true when it comes to career and relationship decisions. Use this reflection time to your best advantage, and learn to trust yourself in those quiet spaces where the influence of a perceived external authority figure is minimal. For instance, pay attention to how you feel about your job when you’re not at work, and notice how you feel about your relationship when your partner is away — in these moments you’ll have access to a more accurate assessment of your feelings.
Peer pressure certainly played a role in some of the Milgram experiments, either increasing or decreasing the compliance rate. The nice thing about peer pressure is that you can consciously create your own peer pressure to align with your desires.
When it’s possible to do so, seek out the support of others. When your inner voice is being squashed by the seemingly louder voice of some perceived authority, reach out to connect with others who’ve been in similar situations and have already moved beyond them. Especially target people who already have the results you desire, such as a fulfilling career, a happy relationship, or a stress-free academic life, and seek their counsel. Ask such people what they would do in your situation and why. See if their answers resonate with you.
You’ll often find when you talk to such people that they’ll have very different attitudes towards the same authorities that tend to overpower you. I experience this all the time from the opposite side when people share their current challenges with me. They constantly fall into the trap of giving away their power to some perceived outside authority. They often don’t even realize that they can choose to disobey, and that once they get past their resistance to doing so, everything will work out just fine. Disobeying may seem very difficult before you do it, but afterwards you’ll look back and kick yourself for making such a big deal out of it. In many cases it’s as simple as saying no and meaning it.
The student can’t change his/her major because Mom and Dad would be disappointed. The unhealthy relationship can’t end because the needy partner would be hurt. The crappy job can’t be quit because the bank wants to keep receiving the monthly loan payments.
You’re the authority in your life. Not your parents. Not your partner. Not your bank.
You can expect that other people will apply behavioral conditioning techniques to get you to comply with their wishes. Parents do it. Partners do it. Bosses do it. Banks do it. But in the end they’re all powerless to force you to do anything. The only way you obey is that you mistakenly believe that you have to obey. They tell you to obey, and you obey. But like the ornery volunteers in Milgram’s experiments who refused to go all the way to 450 volts, you always remain free to stop administering shocks at any time — especially to yourself.
The good news is that you’re not alone. Other people will be delighted to support you on this path, if you choose to invite their support. But they won’t be the same people who’ve benefitted from your obedience in the past, so don’t go looking for support from the authorities who are still giving you orders. If you go complaining to Amon Göth, you’ll get a bullet in the head for your troubles.
Don’t feel you must make a dramatic shift overnight. You may find it more realistic to make gradual, step-by-step progress.
In the Milgram experiments, even the subjects who objected and quit didn’t generally do so immediately. Their resistance increased gradually as the experiment progressed. As the voice of their conscience grew louder, their willingness to blindly obey authority gradually diminished.
During the 5-year Nazi occupation of the Netherlands, the Dutch didn’t immediately jump to maximum resistance. At first they tried to accept the occupation and adapt to it, but as the Nazis grew more oppressive, the Dutch pushed back with greater levels of resistance, including helping people go into hiding, printing underground newspapers, espionage, sabotage, and armed resistance.
Members of the Dutch resistance also sought to collaborate and coordinate their efforts, working together to support each other. Individually they were weaker, but collectively they could support each other in resisting the occupation on the long journey towards Liberation Day.
Demolishing Unauthorized Authority
Ultimately the task before you is to dismantle the external forms of authority in your life that you’re no longer willing to accept.
One memorable act of rebellion from my own life was when I was 17 years old and realized that I didn’t actually believe in the religious gobbledegook that had been fed to me throughout my childhood. For the first few months, I held this awareness only to myself, not having anyone in my life that I could safely confide in.
When I eventually shared my honest beliefs openly, the reaction from others was predictably negative. Initially this was a stressful time for me. What kept me going was the feeling of certainty that I was in the right, which was largely something created from within.
I experienced a powerful shift when I stopped giving my power away to the old perceived authority figures in my life. I stopped believing that they were smarter or wiser than I was. I finally allowed myself to believe that they could be wrong, mistaken, or deluded. By seeing them as fallible, I no longer held them up as worthy authorities over me.
In other words, I de-authorized those previous authorities. I rescinded permission for them to wield authority over me. Once I experienced that shift in my thinking, I then had the power to think and choose for myself, and no amount of behavioral conditioning tactics (i.e. rewards or punishments) would cause me to yield. As people recognized this shift in me and realized that they no longer had my permission to wield such authority over my thinking and behaviors, they soon gave up on trying to control me. Really I gave them no choice.
The power of Milgram’s experiment lies in the volunteers’ belief in the authority of the experimenter. By giving this person permission to wield authority over their decisions, they gave their power away and became capable of denying responsibility for the pain they may have caused. This allowed them to justify their participation as that of a cog in a machine.
One way to opt out of such an experiment before reaching the end is to place anyone who tries to claim authority over you on a lower rung than yourself on your mental ladder of authority. Don’t assume the experimenter is smarter or wiser than you. Realize that they may be mistaken, wrong, or unethical in their dealings and that you may be right. Stop doubting what your own mind is telling you.
Who or what have you authorized to be a greater authority than yourself in your life? If someone in a position of authority tells you that something is okay, but inside you feel creeped out by their actions, do you go along with them, or do you listen to yourself and say no? What if most of your friends and family go along for the ride? Will you succumb to that kind of peer pressure, even if you feel something isn’t right?
Note that the word authority includes the word author. To wield authority over your life is to become the author of your life. You can’t consciously author much of your life if you give someone or something else authority over you.
Objecting to the misapplied use of authority isn’t enough. Just about everyone objects at some point. People object yet still obey. At some point you have to be able to object and disobey, which means to obey your own inner guidance above the demands of any perceived external authority.
Subjectively speaking, there is no external authority. What’s happening internally (within your own mind) is that you’re stressing yourself out. The stress is a result of trying to deny your own power and authority, make yourself weak, and act like a cog in a machine. This is stressful because it contradicts your true nature. The reality is that you’re very powerful and creative, and if you desire to change some aspect of your reality that doesn’t suit you, you can do so. But in order to do so, you must recognize and accept your power. If you don’t like the way the world is right now, you can step up and do something about it. Pretending to be a powerless victim of circumstance doesn’t suit you.
Becoming an Authority
If you de-authorize the phony authorities in your life and become your own authority, you’ll begin to experience the flip side of Milgram’s experiment. Instead of being the hapless follower, you’ll soon find other people following your lead.
This is where the authority game becomes much more interesting. Instead of being a blind follower, you can transform yourself into a conscious leader. By authoring your own life more proactively, you’ll inspire others to follow your example.
I think that’s the secret fear that many people have when it comes to authority. Once you regain your personal authority, it’s an easy progression into the land of greater public responsibility. When you take charge of your life, you’ll attract others who want to follow your lead and do something similar. You won’t even have to try — those people will come to you.
If you know in advance that authoring your own life will result in others wanting to experience a similar story, is this something you can accept? Are you willing to step into the role of leader? Can you welcome that role into your life? Or would you rather keep playing the follower for a while?
You can follow, or you can lead, and there isn’t much of a space in between. If you’re not willing to lead, you’ll end up following by default.
If you’re willing to lead, then how are you going to lead? When people recognize the authority you have over yourself and become attracted to it, how will you deal with that? Will you try to ignore them? Will you accept that kind of responsibility and do your best? Will you abuse it and become a sadist?
One benefit of leadership is that you can learn a great deal more about your own path when you have a chance to see it reflected in those who seek to join you. Just as Milgram’s experimenters could observe when their volunteers were experiencing stress in response to the unethical demands placed upon them, you can also gauge the response to the authorship of your life from public feedback — but without giving your power away to that feedback. Allow the requests of others to serve as input, but make your own decisions from your personal sense of authority, wisdom, and conscience.
Reclaiming Your Power
Incidentally, Stanley Milgram was only 27 years old when he began conducting his famous experiments (he died at age 51), so don’t make the mistake of assuming that he was some wizened old senior professor. In his day he was quite the rabble-rouser, shaking up the status quo by challenging people’s beliefs.
As a result of going against the grain, Milgram had some authority-based pressure used against him as well. He moved from Yale to Harvard, but he was denied tenure at Harvard, probably because of the controversial nature of his experiments. His membership application to the American Psychological Association was also put on ice for a year.
Many of Milgram’s peers challenged the ethics of his experiments because the experiments caused significant stress to the volunteer participants. Yet most of the original participants, when interviewed about it later, were glad to have been part of the study. Some of them even wanted to work with Milgram. They understood the significance of his work, even though helping him with his research was stressful.
If Milgram’s experiments were indeed unethical, then wouldn’t it also be unethical for teachers to use their authority to stress out their students with exams and grades, for companies to control their employees with rewards and punishments, and for parents to demand that their children comply with family traditions and expectations? When is it okay to use stressful psychological tactics to control the behavior of another?
When stress-producing tactics are used on you in order to manipulate you into behaving a certain way, try to recognize these tactics for what they are — an invitation for you to give your power away. Realize that you can always decline this invitation, reclaim authority over your own life, and make your own conscious choices.
Even if most people continue to give their power away, you don’t have to be one of them. You can stop the shocks whenever you want. The shocks were never real to begin with. 😉