22 November 2004

Human Factor and the risk of Nuclear War, 2004

Human Factor and the risk of Nuclear War
Swedish Section of International Physicians for the Prevention of Nuclear War

Human to Err
Christina Vigre Lundius
“We have created a world in which perfection is required if disaster beyond history is to be permanently avoided. But in the world of human beings, perfection is unachievable. The more weapons we deploy and the greater their geographic dispersion, the more people will be interacting with them. And the greater will be the likelihood of disaster resulting from human error.”
Prof. Lloyd J. Dumas, Bulletin of the Atomic Scientists, Nov. 1980
In the fall of 1999, I once again found myself in Moscow. Leo Feoktistov, one of the foremost
scientists who had worked on the development of the Soviet nuclear bomb, had written a book,
“Nukes Are Not Forever” in which he speaks strongly for the abolition of all nuclear weapons.
I had become friends with Academician Feoktistov during a voyage aboard the M/S Anna Akhmatova to the nuclear weapons testing ground on Novaya Zemlya.
Together with a colleague from Germany, I was invited to the Russian Ministry of Defense for a discussion, as our organization had been on several previous occasions. This time we met with
leaders of the Nuclear Risk Reduction Center. As the millennium change was close at hand, we
discussed the possible risks associated with computers. I asked the general in charge of the
center if nuclear weapons are really needed, and he answered: “Mankind does not need nuclear
weapons, but while they continue to exist we have to increase their safety.”
   After our discussion we were shown the center which monitors all of the radar stations that are
keeping an eye on the Russian borders.
   In my work as a physician I deal with problems of shift work and its consequences and risks. I
inquired about the working hours for the people manning the monitors and telephones, and was
told that they worked for 24 hours without sleep!
   My duties and my experience tell me that night time work increases the risk of accidents and
mistakes. We also know that at the end of a 16 hour shift, the accident rate increases threefold. A
24 hour shift must increase the risk even further.
   I thought about major accidents such as Chernobyl, Three Mile Island and Exxon Valdez that all
have been linked to fatigue during night work.
The night shift workers in the company in which I am employed say: “To fall asleep during the night shift is normal, if only for a few seconds or minutes. Especially if you have a monotonous task, or if you work alone.”
   As a young hospital doctor, I personally experienced very long work hours, up to 48 hours or
more, and I have felt how easy it is to fall asleep, or to make bad judgements.
Many major accidents happen at night. This was the case with the Exxon Valdez oil tanker, which ran aground at Bligh Reef, Alaska, on Good Friday 1989; the December 1984 gas leak at Union Carbide’s chemical plant in Bhopal, India; and the March 1979 and April 1986 nuclear power plant accidents at Three Mile Island in Pennsylvania, U.S., and Chernobyl, Ukraine.
Numerous reports have found that night time work and long work hours increase the risk of accidents and mistakes. At the end of a 16 hour work shift, the accident risk increases threefold.
At the Nuclear Risk Reduction Center, there was a sincere interest in continuing our contacts. I knew that neuroscience studies at the Karolinska Institute in Stockholm have shown links between sleepiness and fatigue, and human mistakes and increased risk of accidents. We agreed to continue an exchange of seminars on these subjects, as well as to encourage further studies on the influence of alcohol, drugs, psychological factors and disease.
   We were in total agreement: To err is human, but we must do all we can to prevent human
mistakes that could lead to an accidental nuclear detonation or nuclear war.
Christina Vigre Lundius, M.D. and board member of SLMK,
the Swedish Section of International Physicians for the Prevention of Nuclear War

A Global Catastrophe
Gunnar Westberg
“During our long periods cruising deep in the ocean, often for several weeks, I rarely got more than a few hours of sleep per night. For several days I stayed on the bridge, keeping awake with coffee and vodka. There were times when I was so tired, I found it difficult to see which lights were green and which were red on the instrument panel. And, yes, during this period I and my crew had the capability to launch our missiles with more than a hundred nuclear charges.”
Retired Soviet Admiral, with many years of service as a submarine commander, as told at an IPPNW meeting in Moscow.
At this moment several thousand intercontinental missiles charged with nuclear warheads are on
hair trigger alert, almost all of them in Russia and the United States. If satellite or border surveil-
lance systems show that a country is being attacked by strategic nuclear weapons, the plan is to
“launch on warning,” so that the land-based strategic missiles will be on their way long before
they can be destroyed in their silos. The result of such a launch would be destruction of all human
civilization, maybe all humankind. At the press of a button. The time available to evaluate the information (real attack or technical snafu) and to make the decision to exterminate mankind would be approximately ten to fifteen minutes. And how is it determined whether a massive nuclear attack is truly in progress, or simply a computer glitch, or perhaps a few rockets fired in misjudgment?
   No man or woman should have to bear this responsibility.
   During the terrible time of the Cold War, for more than thirty years, we all felt threatened by
the possibility of a nuclear war by mistake. In his book “The Limits of Safety,” Scott D. Sagan cites many incidents which might have ignited a nuclear holocaust. These were mishaps only on the U.S. and NATO side of the equation. Many more dangerous incidents remain unknown. Examples of a few frightening situations are described below.
Now that the Cold War is over, the risk of an accidental all-out nuclear war has lessened. However, it is not zero. Mistakes can always be made. And we have no guarantee that the present good and trusting relations between Russia and the United States will remain forever. If the relationship deteriorates, if mistrust and fear return, we would be a lot safer if nuclear weapons had been abolished.

As of July 2004, Russia had approximately 7,800 operational nuclear warheads in its arsenal, including about 4,400 strategic warheads. The U.S. stockpile contains approximately 7,000 operational nuclear warheads, including 5,886 strategic warheads.

If the nuclear weapons states retain their arsenals, proliferation to many other
countries is inevitable. Is it really possible to believe that nuclear war can be avoided if nuclear
weapons are in the hands of politicians and generals in 20 or 30 countries around the globe?
Even a “limited nuclear war” might kill tens of millions of people and make large areas of the
planet uninhabitable. If nuclear weapons are going to spread to many countries, the least the
established nuclear powers can do is to set an example by removing their nukes from high alert
and persuading all “new” nuclear weapons countries to do the same.
   In recent years some nuclear weapons countries have, in defiance of international law, declared
that  they  may  consider  using  their  weapons  even  if  they  have  not  been  attacked  with  such
weapons. This theory of “First Strike” dramatically increases the risk of misunderstandings and mistakes.
   Proliferation also increases the likelihood that terrorist organizations might acquire or other-
wise gain control of a nuclear weapon, perhaps even nuclear-armed missiles. It could well be that
terrorists would use these with the intent of making a hated government believe that it has been
attacked by another nuclear weapons country and thereby start a nuclear war. This might sound like a fictional “James Bond scenario.” But then, so would the September 11 attack on the United States have appeared before it happened.

Source for warhead figures: The Nuclear Notebook, prepared by the Natural Resources Defense Council
Examples of critical situations
The Cuban missile crisis in 1962 may have been the most dangerous situation the world had ever
experienced. A majority of President Kennedy’s advisors favored an invasion of Cuba. They were
not aware that the Soviet Union at that time had already installed 162 operational atomic war-
heads on the island. Four Soviet submarines equipped with nuclear torpedoes were also cruising
in the area. In order for any of them to launch a nuclear torpedo, three officers in the submarine had to act in agreement. According to newspaper reports from a conference in Havana in 2002, one of the officers refused. His name was reported to be “Second Captain Vasili Arkipov.” Perhaps it should be recorded in history books. He may have saved the world.
   On November 9, 1979, duty officers in four separate U.S. command centers observed a radar
pattern indicating a large number of Soviet missiles in a full-scale attack on the United States.
During the ensuing six minutes, full scale preparations for a nuclear retaliation were made. A U.S.
Senator, who happened to be present in the command center, reported afterwards of “absolute
panic” in the center. A check of the U.S. satellites by the NORAD command thankfully showed
that they were functional and no Soviet rockets were in flight. If, by chance, several satellites had been out of contact or malfunctioning at that moment, the situation could have been deadly.
The misinformation was caused by an exercise tape running on the computer system.
On a September night in 1983, the monitors in a Soviet command center suddenly showed a picture of first one, then five, intercontinental missiles approaching Soviet territory from the United States. The observation was reported by the officer on duty, Colonel Stanislav Petrov, to his superior. Petrov, however, also expressed that he believed the observation to be in error, as there was at the time no reason for a U.S. attack. If the political situation had been more tense, preparations for a Soviet missile launch would have been made, risking a similar reaction on the U.S. side. Escalation to a nuclear war could have followed.
We have been lucky so far, but we will not know beforehand when our luck has run out.
Wherever the U.S. President goes he is accompanied by an officer with the “football,” an attaché case with the codes and keys to launch a nuclear war. The President of France has a similar nuclear squire.
Gunnar Westberg, Professor of Medicine at the University of Gothenburg, and also President of SLMK, the Swedish Section of International Physicians for the Prevention of Nuclear War.

Lethal Arrogance
Ordinary mistakes can have extraordinary consequences. Dialing the wrong number on a telephone is easy. It happens every day. But entering the wrong numbers into an airplane’s navigational computer can result in a disaster. Lloyd J. Dumas, professor of political economy at the University of Texas at Dallas and author of Lethal Arrogance: Human Fallibility and Dangerous Technologies, has studied the impact of human error for many years.
On April 26, 1986, something went terribly wrong at the Chernobyl nuclear power plant. It was known to be poorly designed, and could be dangerously unstable. But rather than correcting the problems, stricter rules were implemented.
“The new regulations did not make much difference on that April morning, since six major safety systems had been disconnected on purpose,” Dumas says. “It was the deputy chief engineer, the person in charge of the operation, who was mostly responsible for breaking the rules. This is an example of a poorly designed facility, and of bad management decision making that caused serious trouble.”
The American chemical company Union Carbide’s facility in Bhopal, India, had a well constructed security system. But on December 3, 1984, a giant cloud of poisonous methyl isocyanate gas rose from the plant and spread over the city. The toxic fumes killed at least 2,000 people and injured more than 200,000.
“That day, workers had siphoned off refrigerants from the tank where the methyl isocyanate was stored, disabling the refrigeration system, designed to keep the chemical safe by keeping it cool.”
“At the same time, maintenance workers shut off the scrubbers, designed to prevent toxic gases from being vented. Yet another group shut down the torch, designed to burn any toxic gases that managed to get by the scrubbers, in order to repair corroded pipes.”
“In all, three out of four security systems were turned off. Simultaneously,” says Lloyd Dumas.
Boredom, stress or anticipation
Boredom can be another reason for humans to err. A monotonous job can lead to boredom and
to a dangerous lack of attention. People also sometimes become unreliable because they start
using alcohol or drugs to battle boredom. Others try to cope by sleeping on duty or trying to stay
awake by playing games to take their minds off the boring routine.
   Boredom itself can be stressful and high stress is another important reason why people make
mistakes. Human error is also more likely when someone is expecting or anticipating something
might happen. In a stressful situation, filled with anticipation, it is easy enough to misinterpret
what you see, so that it conforms to what you expected to see, Professor Dumas says.
In 1987 the American warship USS Stark was attacked by an Iraqi warplane in the Persian Gulf, and 37 sailors were killed.
The following year, the USS Vincennes was involved in battles with Iranian ships in the Persian Gulf. As July 4, the American Independence Day, drew closer, the crew aboard the American warship was on heightened alert, having been told to expect an attack.
   Suddenly the Americans saw an airplane taking off from Iran. They saw it starting to dive, as if
to attack. Checking civilian airline schedules, they saw no civilian aircraft that were supposed to
be taking off at that time of the day. So the USS Vincennes fired two missiles, destroying the plane
and killing everyone on board.
“As it turned out, the plane was actually a civilian airliner, flown by Iran Air. When the guide to the flights was checked again, it was clearly listed, right where it was supposed to be. The aircraft had also been transmitting a signal from its transponder that identified it as a civilian airplane.”
   Yet the crew on the Vincennes misinterpreted it as a fighter plane.
   “The crew was obviously under a great deal of stress. They were in combat. Most certainly, they
had    the  USS  Stark  on  their  minds.  And  when  they  saw  that  plane  coming  out  of  Iran,  they
believed that they were being attacked just like the Stark had been a year earlier. And so they
misinterpreted everything to go along with that picture.”
   It is clear that there are major issues of human fallibility within the nuclear military. Between
1975 and 1990, 66,000 U.S. military personnel were permanently removed from duties involv-
ing nuclear weapons, because they were judged to have become unreliable. That is an average of
4,100 people per year for a decade and a half.

In his book, Professor Dumas lists no less than 89 major publicly reported nuclear arms-related accidents between 1950 and 1994. That means, on average, there was one accident involving nuclear weapons every six months over those 45 years. It is not known how many more accidents may have gone unreported.
   “Very often these kinds of accidents occur within the military forces of a country, or even within
an industry, and they are hushed-up. When they can, they usually are. So I consider that list of 89
accidents to certainly be an undercount,” says Lloyd Dumas.
   Murphy’s Law says: “Anything that can go wrong, will go wrong.” Professor Dumas asks: “How
many mistakes have you made today? How often have you seen Murphy’s Law in action? Unfor-
tunately, Murphy’s Law applies to weapons of mass destruction and other dangerous technolo-
gies as much as it applies to anything else.”

Waking up to the Real Threats to Security in the Post-Cold War World
Lloyd J. Dumas
When the Cold War ended more than a decade ago, we all breathed a collective sigh of relief. We
knew the world had not suddenly become a peaceful place, but at least we had managed to bring
the nuclear arms race to a close without the nightmare of a nuclear war. By a combination of
good sense and good luck, we had somehow managed to exorcise the terrifying specter that had
haunted all of us since that mushroom cloud first rose into the morning sky over Hiroshima.
Of course we knew the arsenals of nuclear weapons had not disappeared. But that was just a matter of time, a final detail, a footnote to history’s most dangerous arms race.
   Especially in America, now the only global superpower, with the world’s mightiest military to
protect us, and our former Cold War rivals rapidly receding from center stage, there was a feeling
that our homeland was finally secure.
Then came September 11th, 2001. The devastating terrorist attack that struck the United States on that day shattered New York’s massive World Trade Center, a piece of the Pentagon, thousands of innocent lives and the illusion that sophisticated technology and powerful weapons could keep us safe.
Thousands of ordinary people going about their day-to-day lives became the victims of an enemy who cared nothing about our fleets of warships, bombers and missiles, who was undeterred by a nuclear arsenal capable of turning any country into rubble within hours – an enemy who turned the fruits of our own technology against us. Over 33 years, more than 14,000 international terrorist attacks by sub-national groups around the world had taken a total of 9,000–10,000 lives. In one terrible day, almost 3,000 new victims were added to the list.
   It had been 135 years – more than six generations – since this kind of deliberate slaughter was
seen on the mainland of the United States. It became natural for Americans to assume that this
was something that always happened somewhere else. Now we know that the U.S. is vulnerable,
too.
   While the American illusion of invulnerability has been shattered, there is another illusion we
have yet to relinquish, one we share with much of the world – the illusion that a species as prone
to  error  and  malevolence  as  ours  can  indefinitely  control  all  the  technologies  we  create,  no
matter how powerful, no matter how dangerous, and permanently avoid disaster. More than just
an illusion, it is a lethal form of arrogance.

No form of this illusion is more threatening to human survival than the belief that we can indefi-
nitely  maintain  arsenals  of  devastating  nuclear  weapons  without  eventually  triggering  nuclear
war, by intention or by mistake.
First, we will briefly explore the pervasiveness of human error, then consider the nature and genesis of accidental war. Finally, we will take a brief look at the form of malevolence that links the possibility of accidental nuclear war with what has become a daily reality of present-day life: the threat of terrorism. This link is malevolence in one of its more virulent forms, the terrorism of mass destruction.
Human Error
According to a 1998 study by the U.S. General Accounting Office, human error was a contributing
factor in almost 75% of the most serious class of U.S. military aircraft accidents in 1994 and 1995.
A 1998 study by the Union of Concerned Scientists of nuclear power plants representing a cross section of the American civil nuclear industry concluded that nearly 80% of reported problems resulted from worker mistakes or the use of poorly designed procedures. In November 1999, the
Institute of Medicine of the U.S. National Academy of Sciences issued a report finding that medical
errors cause more deaths each year in the United States than breast cancer or AIDS.
As we briefly survey some of the most important aspects of human error in dangerous technological systems, keep two key points in mind. The first is that failures do not have to be continuous in order to be dangerous. If a drug- or alcohol-impaired nuclear weapons guard is not alert and ready to act the moment terrorist commandos try to break into a storage area, there could be a major disaster. Because there is no way to know when those critical moments occur, every failure of reliability must be taken seriously.
   The second point is that the difference between a trivial error and a catastrophic error lies
not  in  the  error  itself,  but  in  the  surrounding  situation.  Many  of  the  most  trivial  kinds  of
mistakes  that  all  of  us  make  on  a  daily  basis  would  be  disastrous  if  made  in  a  very  different
context. For example, making a telephone call begins by entering a sequence of numbers on a
keypad. The sequence is fed into computers that switch the call. If we inadvertently enter the
wrong number, the wrong telephone rings, and we have to redial the call. The error is trivial. But
the pilots on American Airlines flight 965, headed for Cali, Colombia in 1995, made essentially the same mistake. They inadvertently entered the wrong sequence of numbers into a computer, the plane’s navigational computer, and the plane steered into a mountain, killing nearly everyone on board.
Boredom
For all the potential risk involved, much of the day-to-day work of many of those who deal with
dangerous technologies is really quite boring.
   Guarding nuclear weapons storage areas, going through checklists in missile silos, monitoring
control panels at nuclear power plants is not all that stimulating. Boring work dulls the mind,
leading to a lack of vigilance. Laboratory studies have shown that after a few weeks, people exposed to extremely monotonous living and working environments sometimes experience serious mood swings, diminished judgement and even hallucinations.
   The things people sometimes feel driven to do to cope with grinding boredom can also cause
serious problems. They may try to distract themselves by focusing their attention on more inter-
esting  or  amusing  thoughts,  which  means  they  are  not  paying  close  attention  to  the  task  at
hand.  They may play games.
   For example, in the late 1970s, Tooele Army Depot in Utah contained enough GB and VX nerve
gas to kill the population of the earth one hundred times over. According to newspaper reports,
the guards at Tooele sometimes distracted themselves from the boring routine by drag racing
their vehicles. They played marathon card games. Arsonists once burned down an old building
inside the Army Depot while guards on the night shift played poker. There were also reports of
frequent sleeping on the job. Sometimes people try to make boredom more palatable by drink-
ing or taking drugs. An American sailor who served as helmsman on the nuclear aircraft carrier
Independence during the late 1970s and early 1980s, claimed that he used LSD almost every day
on duty. He said it was the only way to get through eight hours of extremely boring work.

Stress
Working with dangerous technologies can also be very stressful. Sustained high levels of stress can lead to serious physical problems, such as a compromised immune system, and serious emotional problems, such as severe depression and even post traumatic stress disorder (PTSD). PTSD includes difficulty concentrating, extreme suspicion of others, recurrent nightmares and emotional detachment, all of which tend to reduce reliability. At least 500,000 of the 3.5 million American soldiers who served in Vietnam have been diagnosed as suffering from PTSD; as many as 30% of them may never lead a normal life without medication and/or therapy.
Drug and Alcohol Abuse
Boredom and stress can lead to drug and alcohol abuse. Data released by the Pentagon for the
years 1975-1990 show that almost 20,000 American military personnel were permanently
removed from nuclear duty over that period as a result of drug abuse. Alcohol abuse added
about another 7,000 to the total.
The Fallibility of Groups
One common strategy for assuring that an unreliable individual cannot cause a disaster in the
nuclear military is to require that a group act together to, say, launch a missile attack. But sometimes
groups can be less reliable than individuals.
   In bureaucracies, the flow of information from subordinates to superiors is often distorted. One
classic example is the “good news” syndrome; subordinates edit problems out of the information
they send to higher management in order to pass along a more pleasant picture. The result of all
this good news is that top-level decision makers have a very distorted picture of what is really going on. And this problem tends to get worse, not better, when there is more at stake, as in organizations dealing with dangerous technologies.
“Groupthink” occurs when the quality of decisions made by a group deteriorates as a result of the pressure to maintain consensus among its members. Increasingly isolating themselves from other points of view, group members can develop an illusion of invulnerability that sets the stage for very risky decision making. For example, during the Korean War, after
the North Koreans had been successfully driven out of the South by U.S.-led U.N. forces – the
original goal of the war – groupthink was involved in the U.S. decision to press on and invade
North Korea. Even though the Chinese threatened to enter the war if North Korea was invaded,
and every member of the key American decision group believed that a Chinese entry would be a
disaster for the U.S., they somehow managed to convince themselves that the Chinese would
never challenge U.S. forces. They decided to invade. The Chinese made good on their threat and
entered the war. They overwhelmed the American forces and drove them deep into South Korea.
Years of fighting followed to regain the lost ground. That reckless, foolish decision cost millions of
lives.

Group psychosis is a situation in which a crazy but charismatic leader is able to draw the otherwise sane members of a group into his or her own delusional world by isolating them and controlling the conditions in which they live. Twentieth century examples include the Reverend Jim Jones and his followers at Jonestown, Guyana, in the 1970s and David Koresh and the Branch Davidians at Waco, Texas, in the early 1990s.
   Suppose a charismatic military commander, who seemed fully functional, had become deeply
disturbed. With great control over the lives of troops already primed for obedience by the very
nature of military life, such a commander might be able to draw them into his or her delusional
world. The crew of a nuclear missile submarine is isolated for months at a time. The captain has nearly complete control of the conditions in which they live and work. And every nuclear missile submarine carries enough firepower on board to devastate any nation on earth.
In short, relying on groups does not solve the human reliability problem.
Nuclear War by Accident
In January 1987, the Indian Army prepared for a major military exercise near the bordering Pakistani province of Sind. Because Sind was a stronghold of secessionist sentiment, Pakistan concluded that India might be getting ready to attack and moved its own forces to the border. The two nations had already fought three wars with each other since 1947. Both of them were now nuclear-capable; Pakistan was widely suspected of having clandestine nuclear weapons. The build-up of forces continued until nearly one million Indian and Pakistani troops tensely faced each other across the border. The threat of nuclear war hung in the air as they waited for the fighting to begin. Then, after intense diplomatic efforts, the confusion and miscommunication induced by human error began to clear and the crisis was finally defused. India and Pakistan had almost blundered into a catastrophic war by accident.
In 2002, tensions once more flared between the two, as India concluded that a murderous attack on the Indian parliament carried out by Kashmiri separatists had been supported by Pakistan. Now there was little question that both sides possessed arsenals of nuclear weapons and missiles that could be used to deliver them. Threats and counter-threats filled the air. A small misstep, a human error induced misinterpretation, and those missiles might have begun to fly. Pakistan and India share a border with China (some of it in the region of Kashmir), which has a much larger nuclear arsenal. Aside from the catastrophic loss of life that would result from a nuclear war between the two, if one of those nuclear-armed missiles accidentally landed in China the world could have been drawn into a much larger conflagration. And the whole chain of events could easily have been set in motion by human error.

  Is this exaggeration? Do we have any real evidence that a disastrous war can actually be started
by mistake? Think back to 1914. Two alliances of nations were locked in an arms race, faced off
against  each  other  in  Europe.  Both  sides  were  armed  to  the  teeth  and  convinced  that  peace
would be maintained by the balance of power they had achieved, despite the growing tensions.
   Then on June 28, 1914, Archduke Ferdinand of Austria-Hungary and his wife were assassinated
by a Serbian nationalist. The assassination set in motion a chain of events that rapidly ran out of
control of Europe’s politicians and triggered a war that no one wanted. By the time it was over,
nine to eleven million people had lost their lives. Yet the whole thing might have been prevented,
but for a simple failure of communications.
   The Kaiser had sent the order that would have stopped the opening attack of World War I (the
German invasion of Luxembourg on August 3, 1914) before it was to begin. But the message
arrived thirty minutes late. In a classic understatement, the messengers who finally delivered the
belated order said, “a mistake has been made.”
   For an accidental nuclear war to occur, there has to be a triggering event. During the nuclear
age, there have been many serious false warnings of nuclear attacks that could have played a key
role in unleashing nuclear forces by mistake.
   For example, in 1995, Russian warning radars detected a rocket rising from the Norwegian Sea
that appeared to be a U.S. submarine-launched Trident missile targeted at Moscow. The warning
was relayed all the way up to President Yeltsin, who had only a few minutes to decide whether to
launch a nuclear attack in response. Fortunately, the Russian military determined that they had
made an error in projecting the missile’s trajectory. It was headed far out to sea, not targeted on
Moscow.  The  rocket  was  American,  but  it  was  not  a  Trident  missile.  It  was  a  scientific  probe
designed to study the Northern Lights. The Russian government had been told of the launch, but
apparently “a mistake had been made,” and word never reached key military commanders.
   Today, a decade after the Cold War, we still keep much of the U.S. nuclear force on high alert,
increasing the probability of accidental war.
   It is also possible that a sufficiently deadly terrorist attack could trigger an accidental nuclear war.
Consider  the  following:  1)  the  U.S.  has  declared  a  loosely  defined  “war  on  terrorism”  and
demonstrated a propensity toward using massive military force in response to terrorist attack; 2)
President Bush has officially cast the nations of the world into two camps, by publicly declaring
that all countries are either “with us or with the terrorists”; 3) the Pentagon’s (January, 2002)
Nuclear Posture Review has led to speculation that we may be ready to use tactical weapons as
“bunker busters” and the like in future military assaults against nations the U.S. considers to be
“with  the  terrorists.”  Under  these  conditions,  a  mass  destruction  terrorist  attack  on  U.S.  soil
might lead to a military counterattack involving nuclear weapons against a country we supposed
or assumed had aided or encouraged the terrorists, when they actually had not. Could terrorists
actually launch such an attack?
The Terrorism of Mass Destruction
There are two basic ways that terrorists might carry out an act of mass destruction. One is to use
weapons  of  mass  destruction  that  they  have  built,  bought  or  stolen.  The  other  is  to  stage  a
conventional terrorist bombing of a toxic chemical plant, a nuclear power plant or a toxic chemical
or nuclear waste storage area.
   On April 23, 2002, the New York Times reported: “A top leader of Al Qaeda now in custody has
told American interrogators that the terrorist group is close to building a crude nuclear device
and  may  try  to  smuggle  it  into  the  United  States.”  He  was  apparently  talking  about  a  “dirty
bomb,” a conventional explosive wrapped in radioactive material. But terrorists could do better
than  that.  All  the  information  necessary  to  design  a  nuclear  bomb  has  been  available  in  the
public literature for decades. More than twenty years ago, two undergraduate students - one at
Princeton, one at M.I.T. - independently designed workable nuclear weapons using only publicly
available sources. In 1996, Time Magazine reported that seventeen scientists at Los Alamos Nuclear
Weapons Labs had been given the assignment of designing AND building terrorist-type nuclear
weapons using “technology found on the shelves of Radio Shack and the type of nuclear fuel sold
on  the  black  market.”  They  successfully  assembled  more  than  a  dozen  “homemade”  nuclear
bombs.

If the terrorists who bombed New York’s World Trade Center with airliners had used even a crude, inefficient nuclear weapon instead, the death toll would not have been in the thousands, it would have been in the tens of thousands.
   Terrorists might also be able to steal – or buy – a well-designed, ready-made weapon. In 1997,
on American television, Russian General Alexander Lebed claimed that Russia had lost track of
some  100  “suitcase”  nuclear  bombs.  On  April  23,  2002,  the  New  York  Times  reported:  “The
White  House  cut  93%  of  a  recent  [$380  million]  request  by  the  [U.S.]  secretary  of  energy  for
money to improve the security of [American] nuclear weapons and waste.”
   What about conventional attacks against nuclear facilities? In early 2002, the U.S. reported that
it had found diagrams of nuclear power plants in suspected terrorist hideouts in Afghanistan.
   We may have already had a very close call. The fourth jetliner, the Boeing 767 that crashed near Somerset, Pennsylvania, during the barrage of hijackings on September 11, flew from the East Coast headed west and slightly south. After it was hijacked, it looped around and headed east
again, and apparently went down when the passengers and crew fought the hijackers.
   When the plane crashed it was headed toward, and only about 120 miles (about fifteen minutes
flying time) from the Three Mile Island nuclear power plant.
   The Nuclear Regulatory Commission has admitted that the containment structures of American nuclear power plants were not designed to withstand the impact of a 767 flying at 500 miles per hour. If
the plane had reached and crashed into the nuclear reactor at Three Mile Island, we likely would
have had an American Chernobyl on our hands.
Conclusions
We humans are a very powerful and capable species, but we are not perfect, and we never will
be. Our fallibility is part of what makes us human, and like it or not, we must recognize that it will
always be with us. It sets inherent limits on our ability to avoid error, even disastrous error.
   There are also those among us who consider human life, which physicians are trained so carefully to preserve, to be just another commodity, expendable in the quest for whatever goals they seek.
Perhaps someday we will find a way to stop creating such people. But until that day comes we
must remove even the possibility that they can acquire the means by which to do catastrophic
damage to our species.

   There are better ways to fight terrorism than with massive force, and there are better ways to achieve security than through the threat or use of nuclear weapons. We are such an adaptable species that there is little doubt we can learn to use them.
   We can no more avoid the boundaries imposed by our fallibility than we can revoke the laws of nature. If we want to survive, let alone prosper, we must learn to live within those boundaries. There is no other choice.
On March 28, 1979, a sequence of events, including equipment malfunctions, design-related problems and worker errors, led to a partial meltdown of Unit 2 at the Three Mile Island nuclear power plant near Middletown, Pennsylvania. The main feedwater pumps stopped running, which prevented the steam generators from removing heat from the reactor. Signals available to the operators failed to show exactly what had happened, so they took a series of actions that made conditions worse by reducing the flow of coolant through the core.
Dr. Lloyd J. Dumas, Professor of political economy at the University of Texas at Dallas.
His research and teaching interests include: national and international security;
economic development and economic transition; macroeconomic theory
and the economics of military spending. His sixth book, “Lethal Arrogance: Human Fallibility and Dangerous Technologies,” is highly interdisciplinary.

The Right to Human Error:
the Price of the Issue
Sergey Kolesnikov
Alexander Yemelyanenkov
It is generally accepted that a human being has the right to human error. This applies equally to
the housewife standing before a gas stove and the operator of complex manually operated mechanical systems. There is a clear relationship in this: the more powerful and dangerous this
manually operated system is, the higher the price of human failure, and the more far-reaching
the consequences.
   It is difficult even to imagine the extent of the catastrophe, and its consequences, that could result from a human error made by a team or a single operator during the application of nuclear technology and the handling of nuclear weapons, including delivery systems based on that technology. There have been quite a few such incidents and events in recent history, when everything seemed to hang on a razor’s edge and man found himself face-to-face with an irreparable calamity.
   The materials of the joint research project by the Russian Committee of the “International Physicians for the Prevention of Nuclear War” movement, together with their colleagues from Sweden and the United States, entitled “Incidents at nuclear installations with nuclear weapons and their consequences as a result of human error,” bear ample witness to this.
      During  the  1950s  and  60s,  the  Soviet  Union,  as  well  as  the  United  States,  France  and  later
China  conducted  many  tests  of  nuclear  arms  and  their  delivery  vehicles  under  conditions  as
closely resembling those on the battlefield as possible.
   Ballistic missiles with nuclear warheads were launched from the central and eastern regions of the country (for example, from the Kapustin Yar and the Chita area launch pads), with the explosions of the warheads over Novaya Zemlya or in the vicinity of the Barents Sea. The trajectory of the missiles passed over some heavily populated areas and industrial sites of the country.
   Flight tests of nuclear-armed anti-aircraft missiles were being conducted during the same years at the Kapustin Yar test site. The warheads were detonated at great heights (so-called high-altitude nuclear explosions, nuclear arms testing in space). The navy tested torpedoes with nuclear warheads, which were commissioned in the Russian Navy. (One such test took place in 1975 and two others in 1961.) The firings were made from diesel-powered submarines of the Northern Fleet at the southern tip of Novaya Zemlya. Long-range bombers of the strategic air force frequently conducted tests and training with real drops of nuclear bombs in the Novaya Zemlya area.

Fortunately, most of these tests were successfully completed in the “regular” mode, as the military say. There were, however, some unforeseen situations, including some very dangerous ones.
   In the mid-1950s, an accident occurred on board a Soviet military aircraft that had taken off with a nuclear bomb for a test drop; the aircraft had to return to base for an emergency landing with a fully activated nuclear device on board! Instructions for such a contingency did not exist.
   Emergency situations (explosions, accidents, fires, technical malfunctions) occurred several times during test and training missile launches from the Tyura-Tam (Baikonur), Kapustin Yar and Plesetsk test sites.
On January 26, 1983, a missile launched from the Plesetsk test site fell into the middle of the Northern Dvina River near a settlement in the neighboring region. As the missile hit the ice, it exploded and left a hole about 100 meters in diameter. The rocket, together with the remaining unburned fuel, sank. Due to the dangerous contamination of the water in the Northern Dvina (including with heptyl), the water supply for Archangelsk and other inhabited places drawing water downstream from where the rocket fell was cut off.
   One of the latest serious incidents was the explosion and fire during the test launch of a new sea-based ballistic missile at the Nenoksa (Archangelsk region) test site on November 19, 1997.
Submarine accidents
Quite a few dangerous situations occurred during sea patrols and test launches from submarines and surface ships in testing and training areas in the Barents and White Seas (Northern Fleet) and off the Far East coast of Russia, where ships of the Pacific Fleet are based.
   On June 25, 1983, the Soviet nuclear sub-
marine  K-429,  with  cruise  missiles  on
board, sank at a depth of 39 meters in the
training area in Sarannaya Bay (off the coast
of Kamchatka) due to an error on the part of the crew. Sixteen members of the crew died. The
submarine was raised on August 9, 1983.
   On October 6, 1986, the Soviet nuclear submarine K-219, with two reactors and 15 ballistic
missiles on board, sank off the coast of Bermuda as a result of an explosion in a missile silo. Four
of the crew died.
   On April 7, 1989, the Soviet nuclear submarine “Komsomolets,” with two nuclear-tipped torpedoes on board, went down in the Norwegian Sea. Forty-two crew members died.
   In discussing the situation in the nuclear Navy, one has to admit that the hypothetical danger exists in reality, even during the regular refueling operations in the reactor cores of nuclear-powered submarines (the so-called “Operation No. 1”). A typical example of this is the explosion in the reactor of a nuclear-powered submarine in Chazhma Bay (the Pacific Fleet) and the events that preceded it.
   Missile launch from the Plesetsk test site.

On October 6, 1986, the Soviet nuclear submarine K-219, with fifteen ballistic missiles on board sank off the
coast of Bermuda.
On August 10, 1985, an explosion occurred in the reactor, together with radioactive fall-out, during refueling aboard the nuclear-powered submarine K-431, tied up at the dockside of the Chazhma Bay naval base (military township Shkotovo-22, near Vladivostok). Ten people died as a result of the injuries they sustained at the moment of the accident - eight officers and two seamen of the emergency service. Ten people were victims of acute radiation sickness, and reaction to the radiation was noted among 39 people. In all, 290 people were subjected to overdoses of radioactivity during the accident and the clean-up operations. As a result of the explosion in the reactor, the hull of the submarine was damaged, many of the operational systems were knocked out of action, and the newly loaded fuel was dumped overboard. The fire lasted four hours. The radioactive fall-out in the atmosphere was widespread - aerosol fall-out was registered at a distance of 30 kilometers from the explosion. Submarines and special-purpose vessels in the vicinity of the explosion, as well as piers, the site and the industrial structures of the plant, were seriously contaminated by radioactivity. The epicenter of the radioactive contamination settled on the waters of the bay, particularly close to the submarine itself; the compartment where the accident occurred remained in contact with the seawater for a long time due to the hole in the hull. The fate of the sub has not been decided to this day. The nuclear fuel has not yet been unloaded.
   Recently, it became known that the Chazhma Bay incident repeats almost step-by-step an accident which had been kept secret and which happened 20 years(!) earlier, at the repair dockside in Severodvinsk. On February 12, 1965, during the refueling of the reactor core of nuclear submarine K-11, due to carelessness of the crew, an unauthorized start-up of the reactor took place (go-ahead, full power), causing gas-vapor emission and fire. The site of the plant, the wharfs and the waters of the harbor were radioactively contaminated. The reactor section was filled with water during the firefighting operations. As a result, 350 tons of highly radioactive water were formed. Another 150 tons penetrated into the turbine section. In order to avoid sinking the sub, the radioactive water, according to witnesses and participants of the events, was pumped overboard - right into the waters of the base.

   The submarine remained afloat, but the reactor section had to be cut out and replaced. The reactor
section, in which the accident occurred, was eventually sunk in Abrasimov Bay off Novaya Zemlya.
Transports, overloads and social factors
Another  serious  danger  is  human  error  on  the  part  of  the  crew  during  operational  checks  of
nuclear-tipped missiles and torpedoes on board surface ships and submarines.
   On September 8, 1997, during an operational check of the missile silo section of the nuclear-
powered submarine K-417 of the Pacific Fleet, pressure inside one of the missile silos reached a
critical level due to an error by one of the operators. A leakage of fuel components began from the damaged body of an intercontinental ballistic missile, and the nuclear warhead (with up to one megaton of destructive power) was torn off by the mounting pressure and thrown into the waters off the coast of Kamchatka. According to the evasive admissions of the participants in
this event, more than a month was spent on the search and the subsequent neutralization of the
warhead.
   As long as nuclear arms exist, we will inevitably be confronted with other risks directly connected to them. Here is just a short list of such threats:
• High danger levels during transportation of nuclear warheads and their components by rail, sea and air.
• Explosions and detonation of conventional ammunition, as a result of which damage to nuclear arms devices in the immediate vicinity could occur, with ensuing conflagration and scattering of radioactive components.
• Overloading of nuclear cemeteries due to sharp and massive nuclear arms reductions.
• Missed deadlines for operational checks on nuclear warheads ready for action.
• Social factors (insufficient material support, alcohol and drug addiction, psychological and physiological deficiencies among the service personnel, etc.).
• Incidents at guard posts protecting missile installations on alert and at nuclear weapons storage facilities.
• The inevitable lowering of vigilance and responsibility levels during watch on nuclear-powered submarines not at sea but in harbor, within sight of one’s living quarters.
• And, something which should be stressed particularly, collisions at sea between submarines with nuclear weapons on board.
In February of 1992 the multipurpose nuclear-powered submarine K-276, a “Sierra-2” submarine like in the
photo above, collided during surfacing with the nuclear-powered submarine “Baton Rouge” from the U.S. Navy.

   Here are two recent examples:
   February 11, 1992, 20:16 hours, the Barents Sea. The multipurpose nuclear-powered submarine K-276 (“Sierra-2” according to NATO classification) of the Northern Fleet collided, while surfacing at a depth of approximately 20 meters, with the nuclear-powered submarine “Baton Rouge” of the U.S. Navy, of the Los Angeles class. Both submarines were armed with nuclear missiles, torpedoes and mines. The “Baton Rouge” had one nuclear reactor; the Soviet submarine had two. According to the command of the Russian Navy, the collision took place within Russian territorial waters. A representative of the U.S. Navy insists that the incident took place in international waters; however, he does not deny that the “Baton Rouge” was on a reconnaissance mission within the training areas of the Northern Fleet.
   March 20, 1993. At about 09:00 hours, the nuclear-powered submarine “Borisoglebsk” (“Delta-4” according to NATO classification) of the Northern Fleet collided with the U.S. Navy submarine “Grayling.” Both were below the surface. The press service of the Russian Navy, not having precise information about the nationality of the “stranger,” issued the following announcement: “The commander of the foreign submarine, attempting to observe the movements of our sub, lost ASDIC contact with it. Maneuvering in an unprofessional way, he created a dangerous situation which led to the collision...” The announcement noted the “low level of responsibility and training of the commanders and crews of foreign submarines, conducting such reconnaissance...” There were no evaluations of the crew of the Russian sub. The U.S. Navy command acknowledged the sub to be its own. That was all. There were no comments in response.
   The analysis of a large number of incidents, including those causing loss of life and considerable material damage (the loss of warships and aircraft, the destruction of industrial sites, the pollution of the environment, etc.), shows that in a number of cases, a seemingly insignificant technical malfunction turns into a serious accident, and the accident into a catastrophe.
   And here we are up against, once again, the human factor: to what extent is the reaction of the personnel, their rising to the occasion, adequate (that is, professional and timely) during emergency situations (breakdown, fire, explosion, technical damage, etc.)? What influences the outcome? In many cases, the same factors repeat themselves: the quality of professional training, the level of expert knowledge, practical experience, morale, general psychological condition and personal spiritual and physical qualities.
   In the recent past, when Admiral Sergey Gorshkov commanded the Soviet Navy, aboard all ships and submarines and at all shore bases, you could hear his well-worded saying:
   “There is no justifiable and unavoidable accident! Accidents and the reason why they happen are
created by people through their irresponsibility and their lack of professionalism.”
   This harsh and categorical, but not unfounded, statement, made after the loss of the “Komsomolets” in 1989, was ostracized and rejected as inhuman and politically incorrect. That did not mean, however, that there was a reduction in the number of accidents or their causes. Fires, explosions of ammunition on warships and at shore bases, collisions between nuclear submarines, accidents in reactors and missile silos: these are far from a full list of the emergency situations which, in a different combination of circumstances, could have had far more tragic consequences.
   In this connection, it is appropriate to recall the words of Academician Valery Legasov, who died before his time. He took a direct part in the clean-up operations at Chernobyl and did a great deal so that we would be able to know the truth about this catastrophe and draw lessons from it. Shortly before his tragic death, Academician Legasov gave this warning:
   “Mankind, having armed itself with powerful technological resources, has only just begun to think
how it can protect itself from them. Now we have to fight not against that which has already exploded
or is about to suddenly explode tomorrow. We have to realize once and for all: we have to fight for the
creation of protective technology, which is adequate enough to deal with that power which has been
given into the hands of mankind. This is a problem that concerns the whole world. I am for the respect
for  ergonomics  -  for  building  the  right  and  reasonable  relationship  between  man  and  machine.”
   One would like to think that the worst incidents are over. But, unfortunately, reality does not
allow us to build such illusions.

   Some reminders:
   The loss of “Challenger.” The nuclear sub “Kursk” catastrophe. The collision between a Boeing cargo plane and a Tu-154 airliner with children on board over Lake Constance in the summer of 2002. The Siberian Airlines plane shot down in error by an anti-aircraft missile during exercises by Ukrainian forces. The demonstration flight by a fighter over the military airfield in Lvov, which crashed into the crowd. The Il-86 airbus of Pulkovo Airlines, which crashed during takeoff at Sheremetyevo airport near Moscow. The endless series of crashes of Russian helicopters in the Northern Caucasus and other places. Frequent collisions of warships, including submarines, with one another and with merchant ships. And, finally, the tragic loss of the “Columbia” space shuttle.
   To be continued?
Sergey Kolesnikov, IPPNW Vice President, IPPNW-Russia Co-President,
Deputy of the RF State Duma,
Academician of the Russian Academy of Medical Sciences.
   Alexander Yemelyanenkov, journalist, scientific observer of
“Rossiskaya Gazeta” national daily,
IPPNW-Russia Program Coordinator from 1998 to 2001.
A Ropuchka II amphibious assault ship at anchor near Vladivostok, as the ship readies for Exercise Cooperation
From the Sea ‘96, a joint exercise between the Russian Federation Navy and the U.S. Navy in conducting disaster
relief and humanitarian missions.

No Room
for Mistakes
Gunnar Westberg
“Any military commander must admit when
he looks back, if he is honest, that he has
made  mistakes  that  did  cost  lives,  maybe
thousands of lives. I have, we all have. But
we learn from our mistakes. We don’t often
do  the  same  mistakes  twice,  and  certainly
not  time  and  again.  But  with  nuclear
weapons there is no place for mistakes. There
is no learning time with nuclear weapons.”
Robert McNamara,
former U.S. Secretary of Defense,
 from the documentary film “Fog of War.”
At the 1985 Fifth International Conference of
International Physicians for the Prevention of
Nuclear  War,  IPPNW,  in  Budapest,  the  head
of  the  Soviet  Nuclear  Energy  Department
stated that the risk for a meltdown or similar
catastrophe  in  a  Soviet  nuclear  reactor  was
less than one in a million reactor years. I was
present  at  the  lecture  and  remember  the
statement vividly. That was before Chernobyl.

   The Chernobyl incident was caused mainly by an unforeseen combination of human mistakes.
Many wars have also started by an unexpected combination of mistakes and misunderstandings.
   One such mistake must never happen: A large nuclear war. That would be the mistake to end
all mistakes.
   “Launch on Warning” for intercontinental nuclear missiles on high alert means that when the commander-in-chief of a nuclear weapons state has sufficient reason to believe that the country is under attack by many nuclear missiles, the country’s own nuclear missiles should be launched to prevent them from being destroyed before launch. The time available for the decision may be fifteen minutes, in some situations much less. Launch on Warning is also a basis for deterrence through “Mutual Assured Destruction” (MAD).
   At the present time the political relations between Russia and the United States of America are
open and cordial and a nuclear attack from either side is said to be “unthinkable.” If that is the
case, why do both countries retain their doctrines of Launch on Warning and keep their missiles
on  High  Alert?  The  military  and  political  leaders  of  the  two  countries  have  not  answered  that
question. Perhaps they are less trusting than we know?
   Nuclear weapons remain a “cornerstone of the military system in the foreseeable future,” for
both Russia and the U.S. There is no guarantee that the apparent confidence between the two
countries will last for the “foreseeable future.” When the trust is gone, it will be too late to reach
an agreement on nuclear abolition.
   Today, after more than a decade of increasing trust and partnership between Russia and the
U.S., is the best time ever to decrease the risk of nuclear war. If not now, when?
      If a nuclear charge explodes in Moscow and perhaps in one or more other Russian cities and
at the same time there is malfunction in several satellites, and warnings are sounded that missiles
may be approaching the country, what would the reaction be at the Russian command center?
Today, hopefully, this would be interpreted as a large terrorist attack, simultaneously affecting
the power supply and information systems. In the future there may be a different political climate.
Conflicts  between  Russia  and  The  U.S.,  e.g.  regarding  oil-rich  countries  in  Central  Asia,  may
increase  the  tension.  Will  the  Russian  response  at  that  time  really  be  one  of  “wait  and  call
Washington?” Is there enough time? The stage would be set for the final mistake.
   The strategic nuclear missiles of Russia and the U.S. are said to be “detargeted.” If a missile is launched without authorization, for instance by a computer, the missile has no target co-ordinates in its navigational system, or possibly, the target is in the Arctic. Let us hope that this is true. However,
retargeting can be done in seconds or minutes when a launch is initiated by humans. Detargeting
does not protect us from human mistakes.
   Mistakes will always be made. We must try to decrease the risk that the mistake leads to a
catastrophe. More time is needed, so that facts can be checked, alternative responses found
and contact can be made with the “enemy.”
   The first step to decrease the risk of annihilation by mistake is to take all nuclear missiles off
High  Alert.  This  can  be  done  electronically,  by  building  delays  into  the  systems,  giving  some
time to reconsider the situation. Electronic dealerting is probably not verifiable by the other side.
However, it can be argued that even a dealerting without verification would decrease the risk of
a  massive  response  to  a  false  warning.  Especially  for  the  U.S.,  with  its  enormous  arsenal  of
invulnerable  submarines  carrying  nuclear  weapons,  unverifiable  bilateral  electronic  dealerting
should be an attractive step to greater security.
      Unverifiable  dealerting  is,  however,  seen  as  insufficient  by  the  nuclear  powers.  A  verifiable
delay system can be obtained by covering all missile silos with large amounts of dirt,
which would take several hours to remove. “Decoupling” has also been proposed. The nuclear
charges should then at all times be stored at a certain distance from the missile, requiring hours
to recouple. Both preparations for launching can be checked by satellites. Immediate inspection
must be granted when suspicious changes in missile readiness are seen from space.
   Dealerting and delay for submarines is a difficult task. The British government has solved the problem by being able to contact the nation’s only nuclear-armed submarine on patrol only once every twenty-four hours. Methods have also been proposed for U.S. and Russian submarines to implement a delay system. Experts in these countries have not reported serious evaluations of such proposals.

As part of the Nunn-Lugar/Cooperative Threat Reduction Program, a Russian shipyard worker uses a cutting torch to break down a large bulge section of a Russian Oscar Class submarine at the shipyard in Severodvinsk.
   Deep cuts in the arsenals of strategic nuclear weapons down to less than 1,000 for BOTH the
United States and Russia, carried on a small number of submarines, would greatly decrease the
risk of a nuclear war by mistake. Launch on Warning would be unnecessary. But neither the U.S.
nor Russia plans such deep cuts. They intend to keep the capacity to kill all mankind “for the
foreseeable future.”
   A few steps to prevent mistakes and misunderstandings should be taken immediately:
   Working conditions for personnel in strategic command centers should be at least as favorable for alertness and vigilance as those for air traffic controllers. Active observation periods at the monitors should be interspersed with adequate rest. Above all, a security culture must be established in which the reporting of all accidents, all mistakes and all attacks of “micro sleep” is actively encouraged.
      To  decrease  the  risk  of  misunderstandings  and  to  increase  trust,  officers  responsible  for  the
strategic  nuclear  weapons  readiness  from  Russia,  USA  and  China  and  perhaps  other  nuclear
weapons  states,  should  meet  regularly.  They  should  review  scenarios  when  mistakes  could  be
made, develop means to contact each other within seconds in a critical situation and in general
work to prevent disastrous misunderstandings.
   Even if we do whatever possible to decrease the risk of mistakes, we must ask ourselves: Do we
believe that nuclear weapons can be allowed to exist for decades and centuries without being
used? And if they are used, how can we believe that their use will not escalate to a large nuclear
war?
So what should be abolished first, nuclear weapons or mankind?

Human Factor Meetings
Klas Lundius
September 1999: Meeting at the RF Ministry of Defense, Risk Reduction Center. Discussion about
shift work.
May 21-24, 2000: Dialogue Meeting in Moscow. Round table discussion “Changing Nuclear
Policy and Military Doctrine.”
November  9-11,  2000:  Seminar  hosted  by  SLMK  and  the  Swedish  National  Defense  College,
Stockholm. “Risk of Accidental War: The Human Factor.”
May 19-25, 2001: Seminar and Dialogue Meeting in Moscow. Press conference at Novaja Gazeta
“Nuclear Risk Reduction: Human Factor Effect.”
November 19-21, 2001: Seminar in Uppsala, Sweden. A delegation from Minatom, Moscow participated. "The human factor in the high-technology society - fatigue due to respiratory, stress, sleep, and pain disorders - diagnostic and therapeutic approaches."
March 23-27, 2002: Seminar in Moscow: “Nuclear Technologies and Security: Human Factor.”
Meeting with the Minister of Health.
May 2-4, 2002: Workshop at the IPPNW World Congress in Washington, D.C.: "The Human Factor and Risk of Accidental Nuclear War," with lobbying on Capitol Hill.
May 14-18, 2003: Dialogue Meetings in Moscow and meeting with the Speaker of the Duma
and the Atom Minister. Press conference at Rossiska Gazeta.
   A thirty-minute program on Russian television with Sergei Kapitza, Dr. Christina Vigre Lundius and Prof. Lloyd Dumas. The title of the show: "Obvious vs. Incredible!" The topic of the program was "The Human Factor."
November 2003: Start-up meeting for a booklet project on the Human Factor.
Klas Lundius, Executive Director SLMK,
The Swedish Section of International Physicians
for the Prevention of Nuclear War.

Photographs
Cover: Courtesy of U.S. Department of Defense
Page 4: Probability of Error During Shift Work: adapted from "Bodyrhythms: Chronobiology and Peak Performance," copyright 1994 Lynne Lamberg, p. 197, William Morrow and Co., Inc., New York
Page 6: Young girl in front of nuclear missiles
Page 7: Man with suitcase: Courtesy of U.S. Department of Defense
Page 8: Professor Lloyd J. Dumas: Courtesy of University of Texas at Dallas
Page 9: Union Carbide, Bhopal, India: Courtesy of Chris Rainier, U.S. Chemisafety
Page 10: Woman at monitor: Courtesy of U.S. Department of Defense
Page 12: India-Pakistan: Courtesy of U.S. Department of Defense
Page 14: Letter: Courtesy of U.K. Government
Page 16: World Trade Center: Civil Air Patrol, New York Wing Mission
Page 19: Missile launch from Plesetsk Site: Courtesy of U.S. Department of Defense
Page 20: Russian nuclear submarine K-219: Courtesy of U.S. Department of Defense
Page 21: Russian nuclear-powered submarine K-276: Courtesy of www.hazegray.org
Page 23: Russian amphibious assault ship: Courtesy of www.hazegray.org
Page 24: Robert McNamara: Courtesy of The White House
Page 26: Russian shipyard worker disassembles submarine: Courtesy of U.S. Department of Defense

Human Factor - and the Risk of Nuclear War
is published by Svenska Läkare mot Kärnvapen, SLMK
The Swedish Section of
International Physicians for the
Prevention of Nuclear War, IPPNW.
English edition edited by Claes Andreasson
© 2004

According to a 1998 study by the U.S. General Accounting Office, human error was a contributing factor in almost 75% of the most serious class of U.S. military aircraft accidents in 1994 and 1995. A study by the Union of Concerned Scientists of nuclear power plants concluded that nearly 80% of reported problems resulted from worker mistakes or the use of poorly designed procedures.