Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. By Eric Schlosser. Penguin Press, 2013, 632 pp. $36.00.

Between 1950 and 1980, the United States experienced a reported 32 “broken arrows,” the military’s term for accidents involving nuclear weapons. The last of these occurred in September 1980, at a U.S. Air Force missile complex near Damascus, Arkansas. It started when a young technician performing routine maintenance on a Titan II missile housed in an underground silo dropped a socket from his wrench. The socket punctured the missile’s fuel tank. As the highly toxic and flammable fuel leaked from the missile, officers and airmen scrambled to diagnose the problem and fix it. Their efforts ultimately failed, and eight hours after the fuel tank ruptured, it exploded with tremendous force. The detonation of the missile’s liquid fuel was powerful enough to throw the silo’s 740-ton blast door more than 200 yards and send a fireball hundreds of feet into the night sky. The missile’s nine-megaton thermonuclear warhead -- the most powerful ever deployed by the United States -- was found, relatively intact, in a ditch 200 yards away from the silo.

The Damascus accident epitomizes the hidden risk of what the sociologist Charles Perrow has dubbed “normal accidents,” or mishaps that become virtually inevitable once a system grows so complex that seemingly trivial miscues can cause chain reactions with catastrophic results. As the journalist Eric Schlosser explains in his new book, Command and Control, “The Titan II explosion at Damascus was a normal accident, set in motion by a trivial event (the dropped socket) and caused by a tightly coupled, interactive system.” That system, he writes, was so complex that technicians in the control room could not determine what was happening inside the silo. And basic human negligence had only made things worse: “Warnings had been ignored, unnecessary risks taken, sloppy work done.”

Command and Control is really two books in one. The first is a techno-thriller, narrating the Damascus accident in gripping detail and bringing alive the participants and the tough decisions they confronted in dramatic fashion. The second is a more analytic exploration of the challenge at the heart of nuclear command-and-control systems: how to ensure that nuclear weapons are both completely reliable and perfectly safe. Schlosser skillfully fits these two parts together to shine a bright light on the potentially catastrophic combination of human fallibility and complex systems. As in his two previous books, Fast Food Nation and Reefer Madness, Schlosser exposes the hidden costs of practices that are widely accepted by the American public. Others have examined nuclear weapons through the lens of normal-accidents theory, most notably the political scientist Scott Sagan in his influential 1993 book, The Limits of Safety. But Schlosser’s gifts as a storyteller lend his book a visceral quality, such that every successive accident or close call feels more hair-raising than the last.

DOOMSDAY MACHINISTS

Since the dawn of the nuclear age, military planners, scientists, and civilian leaders have struggled with what the political scientist Peter Feaver has termed the “always/never dilemma”: how to ensure that nuclear weapons always launch and detonate when ordered to do so, but never when they are not supposed to. In Schlosser’s telling, throughout the Cold War, the military invariably opted for technologies and doctrines that maximized the readiness and reliability of U.S. nuclear forces to deliver a devastating blow against the Soviet Union.

The trend started under President Dwight Eisenhower, who supported the nuclearization of the military and the militarization of nuclear weapons as a cost-saving way to deter the Soviets. Under Eisenhower, custody of nuclear weapons shifted from the civilian Atomic Energy Commission to the military, and each service branch lobbied for new nuclear weapons to support its traditional missions. Even the authority to launch nuclear weapons under certain conditions was predelegated by the president to military commanders. As the number, type, and power of nuclear weapons increased, they became widely dispersed across the United States, at overseas military bases, and aboard ships and submarines.

Schlosser profiles one of the most powerful organizations driving this trend: the Strategic Air Command, which was created in 1946 as the air force’s nuclear-strike arm. When General Curtis LeMay took over SAC in 1948, he inherited an organization that was grossly unprepared for its mission. To enable SAC to execute its elaborate war plan at a moment’s notice under the most stressful conditions imaginable, he developed a checklist for every task and contingency, instituted a rigorous program of training and exercises, measured performance through routine and surprise inspections, and held officers and airmen accountable if their performance did not meet his standards. The command quickly developed a reputation for its discipline, proficiency, and zero tolerance for mistakes. “To err is human,” newcomers were told, but “to forgive is not SAC policy.”

Under LeMay’s leadership, SAC’s arsenal grew to include nearly 2,000 bombers, 800 tankers, and thousands of nuclear weapons. The hawkish LeMay, famous for orchestrating the firebombing of Japan, was and remains a controversial figure. As air force chief of staff, he advocated a first strike against Soviet missiles in Cuba during the Cuban missile crisis and, later, rapid escalation of U.S. military involvement in Vietnam. But it is undeniable that by the time LeMay left SAC, in 1957, he had transformed the organization from a hollow force into a formidable and impressive nuclear-war-fighting machine.

To maximize the United States’ ability to survive a surprise attack and deliver a massive retaliatory blow, SAC kept a large portion of its bombers on ground alert -- fully fueled, loaded with thermonuclear weapons, and ready for launch within 15 minutes -- and maintained a small number of nuclear-armed bombers on continuous airborne alert. It is no coincidence that the accident rate for U.S. nuclear weapons was highest between 1958 and 1968, when SAC’s alert rate was at its peak. Schlosser recounts in horrifying detail a litany of accidents in which nuclear-armed bombers crashed or caught fire due to misplaced rubber cushions, loose nuts, broken vents, mechanical malfunctions, or human error.

In 1961, a B-52 on airborne alert broke apart in midair, dropping two Mark 39 hydrogen bombs near Goldsboro, North Carolina. As it hurtled toward the ground, one of the weapons rapidly passed through five of the six steps needed to arm it for an explosion. If the sixth safety mechanism had malfunctioned, the four-megaton bomb could have detonated on impact, showering the eastern seaboard with radioactive fallout. Even more disturbing, that sixth switch was later found to have had a history of malfunctions.

Near the end of the Eisenhower administration, the safety of nuclear weapons began to receive more concerted attention from weapons scientists and civilian leaders. As Schlosser recounts, the military’s strong preference for reliability tended to trump other concerns. Even civilians such as Donald Quarles, the secretary of the air force, who took a personal interest in nuclear safety, believed that such considerations “should, of course, cause minimum interference with readiness and reliability.” The uniformed brass were even less tolerant of such interference, and high-level civilian intervention was frequently required to ensure the adoption of measures that would improve security and reduce the risk of unauthorized use.

One of the factors that contributed to this prioritization of reliability over safety was the intense secrecy that shrouded all things nuclear. Military officials withheld details about nuclear accidents not only from Congress and the public but also from other parts of the nuclear weapons enterprise. As part of his research, Schlosser obtained a 245-page declassified Pentagon report on nuclear mishaps, which lists hundreds of minor accidents and technical glitches of which the weapons scientists responsible for ensuring the safety of the U.S. nuclear arsenal were never informed. Schlosser provides several examples of unsafe, insecure, or high-risk practices that the military halted only after they were exposed to outside scrutiny. With 484 footnotes and a 29-page bibliography, Schlosser’s book ably brings the hidden history of U.S. nuclear command and control to a broader audience.


POSTWAR POSSIBILITIES

As alarming as these revelations are, the world Schlosser so vividly describes no longer exists. The Cold War ended more than two decades ago. As Schlosser himself acknowledges, steep reductions in strategic nuclear arms, the retirement of older and more dangerous weapons, and the near-total elimination of tactical nuclear weapons have greatly reduced the risks of a nuclear accident. Yet Schlosser is quick to point out that this change does not mean the risks have been eliminated. In August 2007, for example, a B-52 bomber mistakenly loaded with six cruise missiles, each armed with a 150-kiloton nuclear warhead -- a combined yield equivalent to about 60 Hiroshima-size bombs -- flew from Minot Air Force Base, in North Dakota, to Louisiana. Surprisingly, this incident warrants only a page in Schlosser’s book, even though it was the first time since SAC’s airborne alert was terminated in 1968 that a nuclear-armed bomber had flown over the United States.

A report of the incident in the Military Times triggered an avalanche of investigations into the air force’s nuclear safety and security measures. After then Secretary of Defense Robert Gates fired the air force’s secretary and chief of staff, the service embarked on a major reorganization of its nuclear operations. One of the most important changes was the creation of a high-level command dedicated to the nuclear mission, the Air Force Global Strike Command. Some of the independent inquiries pointed to the dissolution of SAC in 1992 and the erosion of its organizational culture as the root cause of the Minot incident. Yet this view risks romanticizing SAC. Although the discipline and attention to detail that the command instilled in its members were commendable, its emphasis on operational readiness at the expense of safety was not. And despite SAC’s zero-tolerance approach, dangerous accidents still occurred. Schlosser’s book is a powerful reminder that no combination of organizational design and culture can prevent accidents when systems are highly complex, tightly coupled, and sensitive to unexpected deviations from standard operating procedures.

Perhaps the most important application of the lessons from Schlosser’s book is not to the United States but to regional nuclear powers such as India and Pakistan, where Schlosser believes the biggest risk of nuclear confrontation now lies. With India and Pakistan capable of annihilating each other’s capitals with a nuclear-armed missile in a matter of minutes, these countries face even stronger pressures on their command-and-control systems than the Cold War superpowers did. To make matters worse, Pakistan is introducing tactical nuclear weapons into its arsenal to counter India’s conventional superiority, lowering the threshold for when nuclear weapons might be used during a conflict. And Pakistani nuclear planners must also grapple with internal security threats of a kind that neither the United States nor the Soviet Union ever had to face. As a result, Pakistan faces what Sagan calls the “vulnerability/invulnerability paradox”: measures that allow its nuclear forces to withstand a first strike, such as mating warheads to mobile missiles and dispersing them, also make those weapons more vulnerable to theft or takeover by terrorists.

India’s nuclear command-and-control system faces a different challenge. According to Verghese Koithara, a retired Indian vice admiral, India’s program suffers from too little military input. Nuclear doctrine remains in the hands of civilian scientists who are disconnected from the operational realities of handling and deploying nuclear weapons. As a result, the Indian nuclear command-and-control system is characterized by a “preference for networking over institutionalization, tight compartmentalization of activities, a dysfunctional approach to secrecy, highly inadequate external audit, and a marked lack of operational goal setting.” This description does not inspire much confidence in the system responsible for ensuring the safety and security of India’s roughly one hundred nuclear weapons.

DISASTER RELIEF

Schlosser concludes his book on a contradictory note. On the one hand, he pessimistically describes the world’s nuclear weapons as “an accident waiting to happen, a potential act of mass murder. They are out there, waiting, soulless and mechanical, sustained by our denial -- and they work.” On the other hand, his book is filled with examples of when nuclear weapons haven’t worked, even when subjected to abnormal environments and unanticipated stresses. According to Schlosser, “none of the roughly seventy thousand nuclear weapons built by the United States since 1945 has ever detonated inadvertently or without proper authorization.” This safety record is even more impressive when one takes into account the 55,000 nuclear weapons produced by the other eight nuclear weapons states. Schlosser’s account shows that serious accidents have occurred but also that they have never resulted in the ultimate catastrophe -- a nuclear explosion.

As Perrow has pointed out, this apparent contradiction makes logical sense. A nuclear weapon must undergo a highly precise sequence of events before it can detonate, and any accident that disrupts a single step in this process will prevent an explosion. According to Perrow, “the immense complexity of the devices might have protected us from disaster even as it caused lesser accidents.” But as Schlosser notes in his book, complacency about the safety of nuclear weapons risks running afoul of what the sociologist Donald MacKenzie has called “the Titanic effect”: the safer a system is believed to be, the more catastrophic the accident to which it is vulnerable. The challenge, then, is to make the system safer while preserving the belief that it is dangerous.

GREGORY D. KOBLENTZ is an Associate Professor in the Department of Public and International Affairs at George Mason University.