Lies, Damn Lies, and Visions of Nuclear Catastrophe
Advisor Perspectives welcomes guest contributions. The views presented here do not necessarily represent those of Advisor Perspectives.
What, exactly, is a catastrophe? We hear the word all the time, but not often enough in a relative context. As a result, we are not adept at comparing catastrophes; each appears equal in magnitude, yet some are implicitly deemed more catastrophic than others. Because we have no common coin with which to compare them, whatever catastrophe conventional wisdom considers the worst is the one that dominates.
In the energy field, the specter of nuclear catastrophe dominates. It needs to be put to the test of comparison.
In December 1952, a catastrophe befell the city of London, England. From December 5 to December 9, a yellow smog from coal burning settled into the city, killing between 3,500 and 4,000 people. Over the next three months, more than 8,000 additional deaths were attributed to the smog, bringing the death toll to 12,000.
But the number of deaths from coal and other fossil fuels has been much greater than that. Many millions of people, and their lungs, have been exposed to airborne particulates from the burning of coal, petrol, and diesel, day in and day out. Researchers have estimated that 8.7 million people worldwide died from this cause from 2012 to 2018.
Their deaths amount to a global catastrophe.
Yet 80% of the world’s energy use continues to be from fossil fuels.
In 1975 a catastrophe occurred in Zhumadian, Henan Province, China. The Banqiao Dam, built to provide hydroelectric power and irrigation water, collapsed. More than 170,000 people died.
Yet energy planners expect hydroelectric dam use to increase substantially in the future, to provide energy storage to back up intermittent electricity sources such as wind and solar.
On August 21, 1986, a catastrophe occurred at Lake Nyos, in Cameroon in West Africa. Hundreds of tons of carbon dioxide that had been stored deep in the lake were suddenly released in a rare type of natural disaster. The CO2 blanketed the area, killing at least 1,700 people and 3,000 cattle, as well as virtually all animals and birds for miles around.
Yet some energy planners advocate carbon capture and storage (CCS) as a solution to climate change. CCS is a scheme to suck carbon dioxide out of the emission stream of fossil fuel-burning plants, or out of the atmosphere itself, compress it, and sequester it deep underground.
If only half of global carbon dioxide emissions were captured and stored that way, the amount stored would be at least 17 billion tons of CO2 each year. To serve its climate change avoidance function, this CO2 would have to remain sequestered, without leakage, for hundreds, indeed thousands of years. If it leaks, it will throw its climate change mitigation benefit into reverse. In the unlikely event that it is released suddenly, it could pose a catastrophic danger to human and animal life. Such is the potential for catastrophe with CCS.
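The scale of the storage problem can be roughed out with simple arithmetic. The sketch below assumes global CO2 emissions of roughly 34 billion tonnes per year, a commonly cited figure (the exact number varies by source and year; it is an illustrative assumption, not a figure from this article):

```python
# Rough scale check on CCS storage volumes.
# Assumption: global CO2 emissions of ~34 billion tonnes/year
# (a commonly cited figure; actual estimates vary).
GLOBAL_CO2_TONNES_PER_YEAR = 34e9

captured = 0.5 * GLOBAL_CO2_TONNES_PER_YEAR  # capture half of emissions
print(f"Captured per year: {captured / 1e9:.0f} billion tonnes")

# If that rate were sustained for a century, the sequestered stock
# that must never leak grows to:
stock_100yr = captured * 100
print(f"Stored after 100 years: {stock_100yr / 1e12:.1f} trillion tonnes")
```

At a 50% capture rate this yields the article's figure of about 17 billion tonnes per year, and the accumulating underground stock, which must stay sealed indefinitely, is what makes a sudden release scenario worth contemplating.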
There have been two catastrophic nuclear power plant accidents and one that was perceived as a catastrophe at the time – Fukushima in 2011, Chernobyl in 1986, and Three Mile Island in 1979. Significant amounts of radioactive isotopes were released in the former two cases. Concern was particularly great in Europe in the case of Chernobyl because it was anticipated that a radiation cloud could blow across the continent.
But in the case of Fukushima, at most one person died from the radiation. In the case of Chernobyl, confirmed deaths from radiation numbered fewer than 50.1 Three Mile Island posed no danger at all to human health.
Even wind power has caused more deaths per kilowatt-hour of electricity produced than nuclear power.
The quantity of nuclear waste produced by nuclear power is minuscule compared to that of the carbon dioxide emitted worldwide. If the nuclear waste were to escape, it would pose less risk to the climate or to human life than if the CO2 were to escape, because when radioactive products disperse widely their dose delivery rate decreases and they are no longer a serious risk to human health. The low numbers of radiation deaths from Fukushima and Chernobyl are testimony to that fact.
A study by Gunter Pretzsch and Ralph Maier of the German organizations Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH (The Society for Plant and Reactor Safety) and Bundesamt für Strahlenschutz (Federal Office for Radiation Protection) estimated the potential radiological consequences of a sabotage attack on above-ground nuclear-waste cask storage facilities. It envisioned a scenario in which a terrorist’s projectile pierced a cask storing the nuclear waste and released a cloud of radioactive aerosol particles. It estimated that a person standing 5,000 meters downwind of the cask would have no more than a 20% chance of receiving a radiation dose greater than that of a full-body CAT scan, and only a 30% chance of receiving a dose greater than normal annual background radiation. At least 100 times the CAT-scan dose, received over a short time, is required to cause lasting damage to health.
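To put those dose comparisons in perspective, here is a sketch using typical reference values. The specific numbers are illustrative assumptions drawn from commonly cited ranges, not figures from the GRS study itself: a full-body CT scan delivers on the order of 10 millisieverts (mSv), normal annual background radiation roughly 2–3 mSv, and acute health damage generally requires on the order of 1,000 mSv received over a short time.

```python
# Illustrative radiation-dose comparison (typical reference values,
# assumed for this sketch; not figures from the GRS study).
ct_scan_msv = 10.0       # approx. full-body CT scan dose
background_msv = 2.5     # approx. annual background dose
acute_harm_msv = 1000.0  # order of magnitude for short-term lasting harm

print(f"Acute-harm threshold ~ {acute_harm_msv / ct_scan_msv:.0f}x a CT scan")
print(f"A CT scan ~ {ct_scan_msv / background_msv:.0f}x annual background")
```

Under these assumed values, the threshold for lasting harm sits about two orders of magnitude above the worst dose the sabotage scenario makes at all likely, which is the point of the study's conclusion.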
Yet it is very common to hear statements such as, “Nuclear power will remain a very difficult political topic unless we have a breakthrough in waste management.”2
At the same time, CCS is being seriously considered without similar cautions.
And consider discarded solar panels and wind turbines, which last about 20 years. They will create enormous amounts of waste, much of it non-recyclable and some highly toxic.
The great hope, especially among those who believe complete decarbonization of energy is necessary, is nuclear fusion. It is deemed a safe technology because unlike nuclear fission, it is thought – mistakenly – to produce no radioactive waste. But it may not have completely sunk in that the temperature in a fusion reactor is 100 million degrees Celsius (180 million degrees Fahrenheit). If some of that enormous heat and energy escapes, what catastrophe might it cause?
Despite this accounting, among all energy sources, fear of catastrophe most strongly stands in the way of nuclear fission, and in the way of its economic viability. This is completely irrational given the above information, but the reasons for this fear are understandable. They lie in the historic fear of nuclear weapons and the fallout from their testing, in misperceptions of the dangers of radiation, and in the mistrust of government and corporate nuclear energy advocates that originated in the 1970s.
Daniel Ford’s New Yorker article
That historical mistrust is revisited in a recent article in The New Yorker by Daniel Ford. Ford was, 50 years ago, executive director of the Union of Concerned Scientists, the influential organization of scientists that, at the time, raised questions about nuclear safety.
In the early 1970s, Ford and coauthor Henry Kendall, a professor at the Massachusetts Institute of Technology and later a Nobel Prize winner in physics, wrote several articles arguing that the backup cooling systems for nuclear power plants could not be shown to be reliable enough to prevent an accident that would release radioactive materials. The target of their articles was the US Atomic Energy Commission (AEC) and the corporate nuclear power plant vendors who assured the public that a catastrophic accident was extremely unlikely. Ford and Kendall claimed that the AEC and the power plant providers did not have enough information to provide that assurance.
Almost exactly 50 years later, as Ford’s New Yorker article showed, that claim has been validated by a book, Safe Enough? A History of Nuclear Power and Accident Risk, by Thomas Wellock, the official historian of the Nuclear Regulatory Commission, a successor to the AEC.
Ford said:
Technically astute insiders at the A.E.C. took it for granted that “catastrophic accidents” were possible; the key question was: What were the chances? The long and the short of it, Wellock’s book suggests, is that, while many officials believed the chances were very low, nobody really knew for sure how low they were or could prove it scientifically. Even as plants were being built, the numbers used by officials to describe the likelihood of an accident were based on “expert guesswork or calculations that often produced absurd results,” he writes.
What “catastrophic accidents” could result? Ford and Kendall spelled it out in a September 1972 article in the journal Environment, in terrifying detail:
A large power reactor can contain fission products equivalent to the fall-out from many dozens of nuclear weapons in the megaton range. The uncontrolled release of even 5 or 10 percent of this inventory could bring death to persons from 60 up to 100 miles from the reactor. Persons hundreds of miles distant could suffer radiation sickness, genetic damage, and increased incidence of many diseases including cancer.
For the exact nature of that catastrophe – its worst case – Ford and Kendall relied on the 1957 Brookhaven report known as WASH-740, Theoretical Possibilities and Consequences of Major Accidents in Large Nuclear Power Plants, and its 1965 update (which was not made available to the public). The 1957 report said that the worst-case nuclear accident could cause 3,400 deaths and 43,000 injuries. The 1965 update amended that to 45,000 deaths and 100,000 injuries.
The message of Ford’s and Kendall’s articles at the time was that, for certain kinds of credible accidents, there appears to be no assurance that the emergency cooling systems will prevent a major discharge of radioactive material to the environment. Such a discharge could cause a catastrophe of unparalleled scale.
This belief remains widespread among the general population – though the more often enunciated concern has shifted, for some reason, from a radioactive release from the plant itself to nuclear waste disposal. Nevertheless, the concern is still motivated by those fears of 50 years ago. And, to judge from his article, they still motivate Daniel Ford.
Why is this concern over nuclear catastrophes so dissonant with the actual 60-year nuclear power plant experience?
Hypotheses for the difference between the level of alarm and actual experience
There are two hypotheses for why the level of alarm about nuclear energy and the record of safety are so dissonant:
- The prognostications of the possibility of human catastrophe were mistaken.
- The “big one” – a true nuclear fission catastrophe – is still as possible as ever but simply hasn’t happened yet.
Let us consider these one at a time.
The prognostications of the possibility of human catastrophe were mistaken
The possibility of human catastrophe is predicated on two eventualities: the leakage of radiation, and catastrophically damaging effects of that radiation on the health of people exposed to it. As Jack Devanney has shown in his invaluable book, Why Nuclear Power Has Been a Flop, the probability of the first – leakage of radiation – was vastly understated, while the probability of the second – damage to health from that radiation – was vastly overstated. Some leakage of radiation was always possible, in fact likely. But the probable resulting threat to human health would be small to negligible, or none.
Since the time of Ford’s and Kendall’s articles, much more has been learned about the effects of nuclear radiation. Long-term studies of survivors of the atomic bombs at Hiroshima and Nagasaki in 1945 have shown that those who did not receive the highest radiation doses (most of those who did died from them) subsequently had no higher cancer rates than the general population, and no genetic damage was detected in their offspring. Numerous other studies have shown that only very high radiation dosage rates are damaging to health. None of this was known when the WASH-740 study made its estimates of deaths and injuries.
The “big one” just hasn’t happened yet
This alternative hypothesis requires admitting that, contrary to popular perception, neither Fukushima nor Chernobyl was the “big one” – that, in fact, both were very far from it.
But the WASH-740 study, for its maximum-damage scenario, assumed a release of 50% of the fission products from a damaged reactor. The OECD’s Nuclear Energy Agency (NEA), in Chapter II of its post-Chernobyl 2002 study, “Chernobyl: Assessment of Radiological and Health Impacts”, stated that “From the radiological point of view, 131I [Iodine] and 137Cs [Cesium] are the most important radionuclides to consider, because they are responsible for most of the radiation exposure received by the general population.” For those important radionuclides it said that “the release fraction of 137Cs release was 20 to 40%” and “For 131I, the most accurate estimate was felt to be 50 to 60% of the core inventory”. These release figures were close to those of WASH-740’s maximum damage scenario.
In other words, the “big one” already happened. It was Chernobyl. Fukushima, however catastrophic it was in terms of damage to the nuclear reactor itself, released much less radiation by comparison.
It is difficult to conceive of a release worse than Chernobyl’s. The reactor had no containment vessel. Although the firefighters – who accounted for most of the radiation deaths – may have mitigated the release, the accident still, according to the International Atomic Energy Agency (IAEA), released 400 times more radioactive material into the Earth’s atmosphere than the atomic bomb dropped on Hiroshima. This is in line with the terrifying worst-case scenario Ford and Kendall wrote of 50 years ago.
If the measure of whether an energy technology is acceptable is the consequence of the worst conceivable accident that could occur because of it, then, as the list of catastrophes recounted above shows, no energy technology would be acceptable. Nuclear would be among the two or three closest to acceptable. (For wind and solar the worst conceivable accident might be a disaster occurring in mining the massive quantities of ores and materials required to build the wind turbines and solar panels.)
Where the Daniel Ford article is right, and where it is wrong
The Daniel Ford article showed that the AEC and the early nuclear industry made a mistake by doing exactly what Ford and Kendall accused them of: insisting on the safety of the technology – specifically, that it would not release radiation – while forging ahead to build it on a massive scale in what became known as the “great nuclear bandwagon” of the 1960s and early 1970s, without enough knowledge or evidence at the time to back up that insistence.
This insistence eventually marred nuclear advocates’ credibility and embedded distrust of nuclear power in the hearts and minds of many in the Western world. The nuclear power industry might be in a better position today had it courted trust by being more open about the early uncertainties, and had it proceeded more slowly while learning about possible accidents and the effects of dispersed radiation on human health.
But what Ford’s article did not do, and should have done, was update the state of those concerns now, 50 years later. With the experience and scientific evidence of the intervening decades in hand, we must undo whatever mistrust was spawned by some of nuclear’s early advocates. In the face of a rapidly changing climate – a potential catastrophe likely to dwarf all energy catastrophes – it is imperative that we do so.
The Ford article is wrong in leaving unsaid what needs to be said: that now, unlike 50 years ago, the verdict is in. It can confidently be asserted that nuclear power is less threatening to human health than most other energy alternatives – less threatening than, for example, commercial air travel.
Cost, time, and proliferation
This leaves three valid concerns that are often raised about nuclear energy: proliferation of nuclear weapons, cost, and time to deploy. It is difficult to determine whether, and by how much, the likelihood of proliferation would increase as more nuclear power plants are built. Some countries, notably North Korea, forged ahead with nuclear weapons without having any nuclear power plants; others have nuclear power without nuclear weapons. It is the job of the IAEA to prevent proliferation, a difficult job that may be made harder by an increasing number of nuclear power plants.
As to economics, nuclear power has been severely hampered by the fear of nuclear energy. It should not be as expensive as it has been in some recent cases. The low cost of building nuclear plants in the 1960s, and the low cost in some countries such as South Korea today, show that it can be done economically. And the experience of France, which in the space of about 15 years built enough nuclear power plants to provide 75% of its electricity, shows that it can be done quickly.
The strongest force standing in nuclear power’s way is the antiquated, irrational fear of it.
Economist and mathematician Michael Edesess is adjunct associate professor and visiting faculty at the Hong Kong University of Science and Technology, managing partner and special advisor at M1K LLC. In 2007, he authored a book about the investment services industry titled The Big Investment Lie, published by Berrett-Koehler. His new book, The Three Simple Rules of Investing, co-authored with Kwok L. Tsui, Carol Fabbri and George Peacock, was published by Berrett-Koehler in June 2014.
1 A figure of 4000 deaths is often quoted from a 2005 World Health Organization (WHO) news release, which said that “A total of up to 4000 people could eventually die of radiation exposure from the Chernobyl nuclear power plant (NPP) accident nearly 20 years ago.” For that figure, the WHO references the report “Chernobyl’s Legacy: Health, Environmental and Socio-Economic Impacts” prepared by The Chernobyl Forum. The Chernobyl’s Legacy report says that 28 emergency workers died from acute radiation syndrome and 15 from childhood thyroid cancer. It also makes the following statement: “Apart from the dramatic increase in thyroid cancer incidence among those exposed at a young age, there is no clearly demonstrated increase in the incidence of solid cancers or leukaemia due to radiation in the most affected populations.” It further says, “It is impossible to assess reliably, with any precision, numbers of fatal cancers caused by radiation exposure due to the Chernobyl accident.” It later says, however, “the possible increase in cancer mortality due to this radiation exposure might be up to a few per cent. This might eventually represent up to four thousand fatal cancers in addition to the approximately 100 000 fatal cancers to be expected due to all other causes in this population.” No explanation for how the “few per cent” and 4000 figures were calculated or estimated is given. Absent the insertion of the sentence giving the four thousand number, a reader would infer from the other statements that the best estimate of the number of excess cancer deaths – beyond the deaths cited in the report of 28 emergency workers from acute radiation syndrome and 15 from childhood thyroid cancer – was zero. It is not clear where the committee that wrote the report got the 4000 number.
2 Griffith, Electrify, p 202.