
You Will Obey, Won't You?

In part one we talked about Stanley Milgram’s infamous obedience to authority experiments. Milgram claimed to have revealed a “darker side” of the human condition. His participants administered dangerous (and potentially lethal) electric shocks to a fellow participant, simply because they were told to by a man in a lab coat.


Fortunately, the experiment was staged and no one was harmed.


However, the inspiration for these experiments was all too real. In the aftermath of the atrocities committed in Nazi Germany, many defended themselves by claiming they had only been "following orders".


And after all, reality is what we make it. Milgram's participants didn't know the experiment was a trick. They truly believed they were administering those shocks.


Didn’t they?


Puppet (File, c. 1935)


Anything that receives attention and acclaim makes itself a target for criticism. That is no less true in the world of scientific research.


Milgram’s work is one of the best-known experiments in the history of psychology. It has attracted no shortage of critique.


It is worth considering Milgram’s own explanation for his work. He offered two theories: the theory of conformism and agentic state theory.


The first of these is derived from another well-known experiment from the annals of social psychology history.


Asch's conformity lines (1951)


In 1951, Solomon Asch performed an experiment that investigated the effect of social pressure on an individual’s judgement. Asch presented a group of eight male college students with a series of images like the one above. The task was simple: each member of the group had to identify which of the three lines — A, B, or C — matched the single target line.


But there was a catch. This is psychology, after all.


Only one of the group was a real participant — the other seven were stooges.


The stooges were instructed to always give the same answers as each other. In some trials, they would agree on the correct answer. In other trials, they would agree on an incorrect answer.


Answers were given aloud, one member of the group after the other. The real participant always went last.


In the example above the correct answer is B. We all agree on that, don't we?


The results were fascinating. When participants answered alone (in the control condition) they chose the wrong answer in less than 1% of trials.


When they answered after the stooges attempted to mislead them, the error rate jumped to 37%. Social pressure led participants to conform even when they privately disagreed with the group's choice.


In all, 75% of participants gave at least one incorrect answer.

The Asch conformity experiment is notable in another way: psychology textbooks have notoriously underreported the number of participants who defied the pressure to conform. In fact, 95% defied the majority at least once.
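As a quick back-of-the-envelope check, here is a minimal sketch in Python (using only the rates reported above) that flips the same results into their complements:

```python
# Reported Asch figures (taken from the passage above).
conformed_at_least_once = 0.75  # gave at least one incorrect answer
defied_at_least_once = 0.95     # defied the majority at least once

# The complements tell the under-reported half of the story.
never_conformed = 1 - conformed_at_least_once  # never yielded to the majority
always_conformed = 1 - defied_at_least_once    # conformed on every critical trial

print(f"Never conformed:          {never_conformed:.0%}")   # 25%
print(f"Conformed on every trial: {always_conformed:.0%}")  # 5%
```

The same data support two framings: one of conformity (75% yielded at least once) and one of independence (a quarter never yielded, and only 5% yielded every single time).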


Perhaps this is reflected in the Milgram experiment. Can we become so entranced by those who obey that we fail to reflect on why others do not?


Solomon Asch (n.d.)


The second theory Milgram offered was one of his own devising — agentic state theory. Milgram proposed that people obeyed because they no longer viewed themselves as agents responsible for their own actions. Instead, they had become a tool of the authority figure and an extension of that authority’s will.


The essence of obedience is that a person comes to view himself as the instrument for carrying out another person's wishes, and he therefore no longer regards himself as responsible for his actions. Once this critical shift of viewpoint has occurred, all of the essential features of obedience follow. The most far-reaching consequence is that the person feels responsible to the authority directing him but feels no responsibility for the content of the actions that the authority prescribes. Morality does not disappear — it acquires a radically different focus: the subordinate person feels shame or pride depending on how adequately he has performed the actions called for by authority.

Stanley Milgram (1973)


Alexander Haslam and Stephen Reicher, modern Milgram researchers, have offered an alternative explanation for Milgram's findings, developed in part by considering what influenced participants to disobey.


They argue that Milgram's results are better explained by the phenomenon of engaged followership.


In other words, they believe that people are more likely to follow instructions if they believe in the virtue of the authority figure's vision. In this way, they are 'following' rather than 'obeying' — fundamentally at odds with Milgram's own interpretation.


In the context of Milgram's experiment, Haslam and Reicher believe that the participants' faith in experts and science led them to keep flipping switches.


So who is correct?


As it stands, there is no definitive proof one way or the other. It is a testament to the power of Milgram's research that the debate continues more than 50 years after it was first published.


The only thing that researchers are able to agree on is that Milgram revealed something truly remarkable.


Hold that thought.



Eichmann's trial (1961)


Questions were raised about the validity of Milgram's work almost as soon as it was published.


Early critics Martin Orne and Charles Holland argued that participants would have known (even if only deep down) that Yale could not really have allowed an experiment that put people in real danger.


Milgram himself considered these early criticisms to be… Well, decide for yourself how he felt.


The suggestion that the subjects only feigned sweating, trembling, and stuttering to please the experimenter is pathetically detached from reality, equivalent to the statement that hemophiliacs bleed to keep their physicians busy.

Stanley Milgram (1973)


In 2017, Matthew Hollander and Jason Turowetz published an analysis of the 117 post-experiment interviews that Milgram conducted with his participants. Primarily interested in challenging engaged followership theory, Hollander and Turowetz argued that participants generally did not indicate that a belief in science had led them to continue with the experiment.


As if such a thing would be said aloud.


These authors did note, however, that about three-quarters of Milgram's participants doubted that their fellow participant was really being hurt. Their reasoning: if he had truly been in danger, the experiment would have been stopped.


Gina Perry and colleagues took this research a step further. They analysed an unpublished report by Milgram's assistant, Taketo Murata. Overall, they found that rates of obedience were higher when participants were more sceptical that they were causing pain.


In other words, the more a participant doubted the reality of the situation, the more likely they were to keep flipping switches.


Perry further criticises the wider reporting of Milgram's work by noting that averaging across all of his experimental variations drops the obedience rate to 43%. By that measure, a majority of Milgram's participants disobeyed.
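To make that arithmetic concrete, here is a minimal sketch in Python of how pooling across variations pulls the headline figure down. Only the baseline rate is Milgram's real number (26 of 40 participants, 65%); the other condition rates are invented purely for illustration:

```python
# Pooled obedience across experimental variations. The baseline rate is
# Milgram's actual figure; the remaining rates are hypothetical, chosen
# only to show how averaging can land near Perry's 43%.
conditions = [
    ("baseline", 0.65, 40),  # real: 26 of 40 obeyed to the end
    ("hypothetical variation A", 0.48, 40),
    ("hypothetical variation B", 0.30, 40),
    ("hypothetical variation C", 0.28, 40),
]

obedient = sum(rate * n for _, rate, n in conditions)
total_n = sum(n for _, _, n in conditions)
print(f"Pooled obedience rate: {obedient / total_n:.0%}")  # ~43% with these figures
```

Whether the headline should be the 65% baseline or the pooled figure is precisely the framing question Perry raises.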


Might it be possible to conclude that our understanding of Milgram’s research has been flawed all along?



Stanley Milgram (n.d.)


Probably not.


That is my own opinion, but rest assured I am not alone in holding it.


A 2021 analysis of these modern criticisms by Nestar Russell and Robert Gregory aligns very neatly with my own views.


They point out that Milgram might not have successfully duped his participants (something Milgram believed was pivotal to the success of his work).


But participants could only have had doubts; they could not be certain of the reality of their situation. They could only suspect that they might be being tricked (and, equally, that they might not). And in those circumstances, they continued flipping switches.


How much chance would you take with another person’s life?


1 in 5? 1 in 10? 1 in 1,000,000?


The overwhelming evidence from a number of replications is that obedience (to an ostensibly real and horrifying event) occurs at a rate far higher than would be expected.


The variations that produced lower rates of obedience were the most extreme scenarios. In one, the participant was close enough to the learner to reach out and touch them. Even then, 30% continued to the very end.


Certainly these rates are far higher than the 1% predicted before Milgram carried out his research.


Russell and Gregory make another important point about these contemporary criticisms: their authors place a lot of faith in the 'truth' of participants' post-experiment interviews.


Humans, quite famously, lie.


Perhaps they are all the more likely to lie when an uncomfortable truth has been revealed about their nature.


Milgram’s experiments are not as clean and orderly as they have perhaps been popularly portrayed. But to my mind, there is no doubting their importance in the history of psychological research, nor their importance to understanding ourselves.


 

Want to learn more about psychology? Why not start with the Experimental Psychology 101 series?


 

References


Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), Groups, leadership and men; research in human relations (pp. 177–190). Carnegie Press.


Griggs, R. A. (2015). The disappearance of independence in textbook coverage of Asch’s social pressure experiments. Teaching of Psychology, 42(2), 137–142.


Haslam, S. A., Reicher, S. D., & Birney, M. E. (2014). Nothing by mere authority: Evidence that in an experimental analogue of the Milgram paradigm participants are motivated not by orders but by appeals to science. Journal of Social Issues, 70, 473-488.


Hollander, M. M., & Turowetz, J. (2017). Normalizing trust: Participants’ immediately post‐hoc explanations of behaviour in Milgram's ‘obedience’ experiments. British Journal of Social Psychology, 56(4), 655-674.


Milgram, S. (1973). The perils of obedience. Harper's Magazine, 247(1483), 62-78.


Nissani, M. (1990). A cognitive reinterpretation of Stanley Milgram's observations on obedience to authority. American Psychologist, 45(12), 1384–1385.


Orne, M. T., & Holland, C. H. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6(4), 282-293.


Perry, G., Brannigan, A., Wanner, R. A., & Stam, H. (2020). Credibility and incredulity in Milgram’s obedience experiments: A reanalysis of an unpublished test. Social Psychology Quarterly, 83(1), 88-106.


Russell, N., & Gregory, R. (2021). Are Milgram’s obedience studies internally valid? Critique and counter-critique. Open Journal of Social Sciences, 9(2), 65-93.


 

Image references


Asch's conformity lines (1951). Effects of group pressure upon the modification and distortion of judgments [Figure]. https://upload.wikimedia.org/wikipedia/commons/4/47/Asch_experiment.png


Eichmann's trial (1961) [Photograph]. https://upload.wikimedia.org/wikipedia/commons/e/eb/Adolf_Eichmann_at_Trial1961.jpg


File, G. (c.1935) Puppet [Watercolour, graphite, pen and ink, and gouache on paperboard]. National Gallery of Art, Washington, D.C., United States. https://www.nga.gov/collection/art-object-page.27746.html


Solomon Asch (n.d.) [Photograph]. https://upload.wikimedia.org/wikipedia/commons/b/be/Solomon_Asch_.jpg


Stanley Milgram (n.d.) [Photograph]. https://upload.wikimedia.org/wikipedia/en/d/da/Stanley_Milgram_Profile.jpg

Sam Ridgeway