Radiation from Bomb Versus Nuclear Power Plant
Date: April 4, 2011
Can you compare the radiation released in a nuclear bomb air blast with the radiation released by a power plant meltdown?
First off, the term "meltdown" tends to be a sensational term used in
the media. In reality, it simply means that the fuel rods in a reactor
have partially or fully melted. A reactor that has had a meltdown does
not necessarily release any significant radioactivity at all. For
example, the Three Mile Island reactor in the US suffered a partial
meltdown, but no significant radioactivity was released.
Even at its worst, there is no comparison between the radioactivity and
damage caused by a reactor suffering a meltdown and the radioactivity
and extreme damage caused by a nuclear explosion. To illustrate, one
need only compare the extreme
damage and massive casualties resulting from the very small nuclear
bomb dropped on Hiroshima, with the lack of any civilian casualties
whatsoever caused by the Three Mile Island event.
As you might expect, the answer depends on what sort of a bomb and what
sort of a reactor you are comparing, because the amounts of nuclear
radiation released vary over a huge range.
A nuclear bomb releases all of its radioactive material into the
environment. If it is detonated close to the ground, it produces even
more radioactive material by irradiating large amounts of dirt and
spewing the newly radioactive dirt all over the place.
If both contained the same amount of radioactive material, a bomb would
be much, much worse than a reactor, because most of the radioactive
material in a reactor will not be widely dispersed even in a complete meltdown.
However, a nuclear reactor contains a lot of radioactive material,
and typically it also contains several reactor loads of "spent" fuel in a
cooling pool, which might also be dispersed into the environment in a
worst-case accident.
Update: June 2012