How the Challenger Disaster Became a Case Study of the ‘Normalization of Deviance’

The Challenger explosion
Science History Images / Alamy

At 11:38 a.m. on January 28, 1986, at the Kennedy Space Center near Cape Canaveral, Florida, the space shuttle Challenger lifted off. In the hours before the launch, engineers at Morton Thiokol, the company that built the shuttle’s solid rocket boosters, warned that cold weather posed a structural risk to the “O-rings,” the rubber seals between booster segments that prevent hot gases from escaping. Using data from prior flights, the engineers advised that the launch should not proceed in temperatures below 53°F.

That morning, with the country glued to coverage of a flight that would take the first schoolteacher into space, managers at Morton Thiokol, pressured by NASA, overruled their engineers, and NASA gave the green light. At liftoff, the temperature was 36°F.

Seventy-three seconds into the flight, Challenger disintegrated in a spectacular fireball. All seven crew members were killed. A federal commission was formed to investigate.

Like many Americans, sociologist Diane Vaughan was transfixed by the Challenger tragedy. But her interest was also professional. She had long wanted to explore how the dynamics of group decision-making can lead to deviations from established norms and was looking for a case in which an organization had violated rules. NASA seemed to fit the bill. “This appeared to be a typical case of misconduct,” Vaughan says. “There were production pressures and rules violations, and NASA was continuing to fly despite knowing about the flaws in the system. Based on the commission’s report, it looked like you had amoral, calculating managers who threw caution to the wind.”

But when Vaughan went to the National Archives to examine the documents that formed the basis of the report, “I found something completely different,” she says. “Reading the dialogue between a commission member and a NASA manager who was pushing for a launch, I discovered that the commission did not understand the language that the manager and others at NASA were using — and therefore didn’t understand the decision-making process.”

As Vaughan pored over the records, two things became evident. One was that NASA had a clear decision-making structure that it consistently followed. Another was that engineers could only predict how a flight would go: they had no real-time readings of the O-rings during flight, so they could not assess how the seals had performed until the vehicle returned. “Then they would determine what had gone wrong,” says Vaughan, “and fix it so it didn’t happen again.”

To solve the O-ring problem, engineers used a heat-resistant putty as an additional sealant. Because their repeated fixes seemed to work — the shuttle kept returning — NASA came to view the O-ring issue as an acceptable risk. And though engineers warned of low temperatures leading up to the doomed flight, they did not have enough data to persuade the NASA managers that it wasn’t safe to fly.

Vaughan published The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA in 1996. The book popularized Vaughan’s concept of “the normalization of deviance,” which describes the process by which deviations from norms — in this instance, safety protocols — become ingrained in organizations through a mix of production pressures, poor communication, and workplace culture.

The study was both revelatory and prescient. In 2003, during the launch of the space shuttle Columbia, a piece of foam insulation from the vehicle’s external fuel tank fell off and struck the left wing. When Columbia reentered Earth’s atmosphere, the wing damage caused the shuttle to break apart. This crew of seven was also lost. Vaughan was asked to sit on the Columbia Accident Investigation Board.

After the board released its report, which echoed the Challenger findings, Vaughan attended a NASA luncheon in Washington. “I was at a table, and I was scared — not everyone at NASA loved my book,” she recalls. “But then people came up to me and thanked me or brought books to be signed. One woman broke into tears and said, ‘I can’t believe we did this again.’”

Forty years after the loss of the Challenger, Vaughan’s analysis remains relevant. As she writes in her book: “The Challenger disaster was an accident, the result of a mistake. What is important to remember from this case is not that individuals in organizations make mistakes, but that mistakes themselves are socially organized and systematically produced.”


This article appears in the Winter 2025-26 print edition of Columbia Magazine with the title "This Is Not Normal."
