The "rule-breaking" isn't referring to anything the researchers were doing.
It's referring to what the participants were doing: the compliant subjects who delivered the shocks weren't always following the procedure they were given perfectly. That's expected, of course, since people in general don't follow instructions perfectly all the time, especially not the first time they do something.
> Kaposi and Sumeghy interpret these patterns as a complete breakdown of the supposedly legitimate scientific environment. The subjects were not committing violence for the sake of an orderly memory study. With the scientific elements either forgotten or rushed, the laboratory changed into a setting for unauthorized and senseless violence.
This feels like a huge stretch. Forgetting a step at one point or reading something out loud too early isn't a "complete breakdown of the supposedly legitimate scientific environment" -- a "scientific environment" that is completely fictional to begin with.
The article quantifies the amount of rule-breaking, compares it across participants, and notes that those who were better at obeying the experiment's instructions are the ones who refused to continue to the end.
The article doesn't invalidate the Milgram experiments. It claims that the interpretation in the traditional literature is possibly wrong.
This does suggest that subjects who are bought into the purpose behind what they're doing, understand it, and are attentive to how their specific tasks tie into the bigger picture are more likely to be actively engaging their judgement as they go. Subjects who are just trying to follow the tasks as given to them are, in effect, washing their hands of the outcomes as long as they're following the directions (which, ironically, causes them to fail at following the directions too).
If they think the procedure is to read the next question when the previous one has been completed, and they do, even if the other person is screaming, they think they're "following rules". They're not the ones who came up with the procedure.
Which is the whole point: the participants were trying to follow rules, even if they made mistakes in following those rules. The idea that there was a total "breakdown" of the rules doesn't seem supported at all.
Your point is fair, but the real nuance is that the people who 'stopped' were the best at following the rules.
This seems interesting to me: they were conscientious about 'what was happening', not just blithely following orders.
The 'rule followers' maybe were conscientiously applying the 'spirit of the test' and quit when they realized it was not reasonable.
The others were 'pressing buttons'.
Even then, it's subject to interpretation. There's a perfectly rational reason why people might submit to 'following the rules' if that's what they've been asked to do and they have a sense of 'dutiful civic conduct' and 'trust in institutions'.
Instead, most participants rushed through, most likely to end their own negative experience. Which is much more nuanced than "gosh, they told me to do it."
The article doesn’t claim that the experiment was invalidated, but that some conclusions drawn from it are not well founded.
Now the interesting question is _why_ did those people who followed the rules quit at a greater rate? _Why_ did those people follow the rules more closely in the first place? Was there any variation in how the rules were presented? What is the difference in between folks who follow the rules more closely and folks who don't? What can we learn about the human condition from this?
Basically, under the ill guidance of authority, people can become real monsters. That is the conclusion I drew from it, and it now seems even worse.
It's being consistently verified in real time if you track current events.
Smooth shiny white walls, beakers and test tubes filled with brightly colored liquids on shiny metal tables… Science!
...Which is a good metaphor for the "experiment" as a whole.
These were Yale students, so probably smarter than average, and from what I've read the study didn't do a very convincing job of making itself seem believable.
When I took psychology in college I had to submit to random experiments as part of my grade (there were alternatives, but the experiments were easier). Before I'd ever heard of Milgram, if one of those studies had put me in a similar situation I would have smelled a rat immediately.
When I was in middle school the teachers created a fake “government decree” to convince us that there was a new sin tax on products kids use (as a simulation). I immediately knew it was fake as did many other students, but that didn’t stop us from playing along for fun. I talked to a few of my teachers later and they genuinely believed that we fell for it.
And social science/history/economics is about learning the standard lessons of the field (even if those lessons are themselves simplistic compared to the real world, they are a baseline of common knowledge).
The (eventually) disobedient subjects were better at respecting the experimental process they were given than the "obedient" ones who went all the way to the maximum voltage. Why was that?
Could it be a sign that the disobedient subjects were on average more concentrated on the task at hand (smarter? less stressed? better educated? more conscientious?) than the ultimately obedient ones, and therefore were more likely to realise they were "hurting" the alleged learner and stop?
Or could it be that the obedient subjects were more likely to realise there was something fishy going on, suspecting the "learner" wasn't really being shocked, and thus were paying less attention to the learning rules?
Or was it, as the article suggests, that the obedient ones may have shut down emotionally under pressure to follow through, and their mistakes are the result of that?
Or were the obedient ones more likely to be actual sadists, who were enjoying the shocks so much that they didn't even care if the "learner" didn't hear their question, giving them a greater chance of shocking them again?
Unfortunately I think the Milgram experiment has become so entrenched in popular culture that there's absolutely no way it can be properly repeated to explore these questions.
* kids grow to be rich because they accept delayed gratification
* alpha males are the leaders of the pack and all other males are useless
* people accept violence if there is a higher authority which justifies it with a reason
How many people suffered or delivered suffering because of their beliefs in the above?
It can be reframed roughly as discipline too: a willingness to suffer a bit for later rewards. You can see this as a massive success multiplier in many real-world situations.
Almost every person I went to college with had this viewpoint. There's also something comforting in knowing you and your friends are all doing the same thing. We were all dirt poor in college, trying to support ourselves with crappy part-time jobs: delivering pizza, working in fast food joints, cleaning offices at night. The idea was that we all believed we were working towards something better than our current situation. The suffering somehow made you a better person, more resilient, made you understand what it was like to really earn something.
All of the close friends I had in college went on to do successful things: engineers, attorneys, stockbrokers, software engineers, pharmacists. We all eventually got to where we wanted to be, but the suffering is what still binds us together to this day. Talking about some of the houses we lived in that should've been condemned. Having to work 60 hours a week and still do well on that exam on Friday.
The willingness to suffer is eased when you have a shared experience with others around you.
Milgram decided to repeat his gross ethical violation 30 times(!), with dozens of test subjects each time. Overall, the majority of people actually disobeyed the orders to continue with higher voltages.
I think the only reason it's become so popular is because it makes for a shocking story, with grandiose implications. The specific "agentic state theory" Milgram invented is not backed up by his data, and personally, I find it philosophically dubious and psychologically concerning that he gravitated to it.
See:
https://www.bps.org.uk/psychologist/why-almost-everything-yo...
https://journals.sagepub.com/doi/abs/10.1177/095935431560539...
The delayed gratification thing in particular is correlation vs. causation. It was really more about trust. Forcing kids to delay gratification is meaningless or counterproductive.
You instead look at the claim, the data, and the experiment's methodology. It often says something far, far less generalizable or significant than the conclusion section of the paper.
One of the researchers feels guilty from the apparent panic attack his subject appears to be going through, so he excuses himself from the experimental room and approaches the lead investigator who's watching on CCTV from outside:
“Professor, this subject is really suffering from their belief that they are electrocuting the learner. I believe this is unethical, can we stop please?”
The professor replies:
“The experiment requires that you continue.”
Reading the questions while the subject was screaming looks like a performative act of conforming to the pattern, one that frames the failure of the pattern as the answerer's failure to conform. That makes the shocks a punishment for failing to conform. The questioner maintains a facade of doing the right thing by going through the motions, even though they are breaking the rules by doing so; if the other party were compliant, that rule wouldn't have been broken. To those with a strong sense that nonconformity should be punished, painful shocks would feel appropriate. It is less that they were following the rules and more that they were assuming the intent of the rules and permitting abuse because that intent was not their decision. That might make them less willing participants in the abuse and more 'not my problem' active participants.
Once an experiment degenerates into just an event, a situation where the controls have failed, you can come up with many potential conclusions, but you've lost any science-specific conclusion from the observations and you may as well be looking at any series of events.
That said, I think experimental psychology just generally fails to establish enough controls to merit the scientific quality it aspires to.
Many people are cruel. Not all people, maybe; not most people, also maybe; but some people enjoy hurting others. We see this everywhere. Isn't it possible that this kind of profile seized the occasion to inflict pain on people with no fear of repercussions?
In other words, isn't this study just a sorting filter that orders students from most cruel to least cruel?
His study plainly shows that most people, in the right circumstances, will act in unimaginably cruel ways.
If we remove this cycle of abuse, what is the natural rate of humans that will hurt others?
An uncomfortable idea, as victims become perpetrators, it may be best to segregate victims to prevent future abuse and victimization.
If we remove this cycle of decency, what is the natural rate of humans that will hurt others?
The premise is flawed, humans learn from their environment and there's really no way to put a human in a coffin until they're 20 and see what they do then.
Yeah, but you can also find that rate if you remove the trigger (abuse) from the environment (society) and see how the rate changes.
You don't have to lock someone in a coffin, or something ridiculous like that (and that would be counterproductive anyway). You create a society, or a least a sub-society, where there's no abuse, and see how much abuse is invented by the people raised in that environment.
That's presuming the only influence on a child's development is the adults who are raising them, which is not true.
Let’s put it another way, if a catholic priest touches a choirboy, it’s not a good idea to let the choirboy become a priest and victimize the next generation of choirboys.
Gross but perhaps a benefit to society
Extremely individual reactions: what makes one tougher breaks another completely and permanently, and everything in between.
I'd say everybody has experienced some sort and level of abuse, e.g. the typical school bullies (who were usually also bullied somehow, hence the behavior).
Wonderful idea. Let's not forget to segregate the poors, since they commit violent crimes at higher rates too. We can build a perfect utopia if only we just get rid of all the undesirables!
You are trapped in an experiment, you have the impression that things have gone too far, and you think you can't escape? You rush it. You hear horrible noises? You just pretend you don't hear them. These are all classic mental patterns. There are a million ways to explain them.
> The study authors propose that the experimenter played a major, passive role in establishing this dynamic. When the participants broke the rules and skipped steps, the authority figure rarely intervened to correct them or pause the session. By staying silent and letting the memory study fall apart, the experimenter allowed an atmosphere of illegitimate violence to flourish.
This sounds like looting scenarios to me, i.e. when a situation descends into chaos, some people will just surf/leverage that chaos instead of attempting a return to normalcy, for whatever reason.
Turns out those are not valid examples either. So I am genuinely wondering: what remains of the field of psychology, other than a group of people who find it interesting to think about how other people think and behave? Are there examples of actual, useful, and valid conclusions coming from that field?
See also the replication crisis.
In order for someone to answer this, I think you need to come up with some sort of definition of what "actual", "useful", and "valid" actually mean in this context.
Lots of stuff from psychology has been successfully applied to treat people in therapy for various issues, but is that "valid" enough for you? Something tells me you already know some people are being helped in therapy one way or another, yet it seems those might not be "useful" enough, since I don't clearly understand what would be "useful" to you if not those examples.
I don't know what experience of therapy you've had in the past, but this is typically not how it works. People get better when a treatment is applied that is suitable to them as a person and to the context. Not sure where you got the idea that "people get better no matter what treatment is applied"; that hasn't been true in my experience.
> While every obedient participant reliably pressed the shock lever, they regularly neglected or ruined the other steps required to justify the shock.
Procedural violations here include things like asking the question while the person in the other room was still screaming.
And based on everyone I've met, and on Dan Ariely's own actions (1), I've concluded this one is true.
We all cheat a little from time to time.
Ex: for me, driving a few km/h above the speed limit is "cheating a little"
1: https://www.businessinsider.com/dan-ariely-duke-fraud-invest...
His relationship with Jeffrey Epstein isn’t a good look either.
Here is Derren Brown's attempt at repeating the experiment: https://www.youtube.com/watch?v=Xxq4QtK3j0Y
That said, the study has been replicated many times since the original, with researchers adjusting different parameters like participant screening, the gender balance, or the roles (teacher/student, researcher/technician...). Across these variations, the overall result stays quite consistent: under certain conditions, ordinary people can be led to do harmful things.
Other experiments have also looked at which factors make this more likely; diffusing responsibility, for example, seems to be one of the most effective.
The pop culture version of what happened in those experiments is “regular people will administer potentially lethal shocks when told to”, and that claim has been refuted experimentally many times over.
Contrary to most reports, the original experimenters never told participants that the shocks were supposedly lethal or even dangerous. When participants were actually told that there was a health risk and that they should ignore it, the vast majority refused to administer the shocks in a later recreation.[1]
In other words, the Milgram experiment, as commonly understood, is somewhere between sensationalism and an outright lie.
What the experiment actually showed: People follow orders when the orders are justified within a persuasive ideological context, e.g. you value science and the scientific researcher is telling you to proceed for the sake of science.
In the first, people who follow the orders of Nazis are not necessarily ideologically aligned with them; they might just be in a brainless order-following trance. But this isn't real, and in reality the people who were "just following orders" were in fact ideologically committed to the cause and should be judged accordingly.
With a good enough propaganda machine, any percentage of people would end up "ideologically committed to the cause", but I don't think they should necessarily "be judged accordingly" regardless of the larger context.
The version of the Milgram experiment taught to undergrads asks people to believe that you'll follow orders you would ordinarily consider abhorrent simply because you were commanded. But there's basically no evidence for that. People follow orders if those orders are justified in a way that seems persuasive. Nobody ever doubted that Nazis persuaded people to join them. That's not a surprising or even remotely novel finding.
And also this: the most frequent violation in obedient sessions (those who shocked until the end) involved reading the memory test questions over the simulated screams of the learner. Doing this effectively guaranteed that the learner would fail the test and receive another shock.
Basically, being willing to shock other people without stopping was more about violence itself being permitted than about being an obedient person. Rule followers followed the protocol until they concluded "nope, this is too much" and stopped mistreating the victim.
The performance, or signal, or whatever we're calling it. That's the important thing.
The article doesn't say that more people refused than was previously known.
It just concludes that most people weren't following instructions in a way that would have supported the validity of the supposed memory experiment.
* Did the subjects who went full voltage stop caring about the "learning" protocol because they realised it was all fake? Then the conclusions of Milgram's experiment are invalid.
* Did the subjects who went full voltage make more mistakes because they were more anxious and fearful of the experimenter? Then underlying fear might be a mechanism for blind obedience, and further research would be interesting.
* Did the subjects who went full voltage just enjoy electrocuting the dude so much that they stopped caring about asking the questions correctly? Then blind obedience is the least of our worries, widespread sadism is much more concerning.
The act of torturing was not due to the torturer obeying the rules. Instead, torturers broke the rules, creating conditions that allowed them to torture more.