Fusion reactors have long been touted as the “perfect” energy source. Proponents claim that when useful commercial fusion reactors are developed, they would produce vast amounts of energy with little radioactive waste, forming little or no plutonium byproduct that could be used for nuclear weapons. These pro-fusion advocates also say that fusion reactors would be incapable of the dangerous runaway chain reactions that lead to a meltdown—all drawbacks of the fission schemes used in current nuclear power plants.
And, like fission, a fusion-powered nuclear reactor would have the enormous benefit of producing energy without emitting any carbon to warm up our planet’s atmosphere.
But there is a hitch: While it is, relatively speaking, rather straightforward to split an atom to produce energy (which is what happens in fission), it is a “grand scientific challenge” to fuse two hydrogen nuclei together to create helium isotopes (as occurs in fusion). Our sun carries out fusion reactions constantly, burning ordinary hydrogen at enormous densities and temperatures. But to replicate that process here on Earth—where we don’t have the intense pressure created by the gravity of the sun’s core—we would need a temperature of at least 100 million degrees Celsius, or about six times hotter than the sun’s core. In experiments to date, the energy input required to produce the temperatures and pressures that enable significant fusion reactions in hydrogen isotopes has far exceeded the fusion energy generated.
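The arithmetic behind those last two sentences can be made concrete. The sketch below, in Python, checks the temperature comparison using the widely cited figure of roughly 15 million degrees Celsius for the sun’s core (an assumption not stated in the text), and expresses the energy-balance problem as the gain factor Q—fusion power out divided by heating power in—where Q below 1 means the experiment consumed more energy than it produced. The power figures passed in are hypothetical placeholders, not results from any real experiment.

```python
# Rough, commonly cited values; illustrative arithmetic only.
SUN_CORE_TEMP_C = 15e6    # assumed temperature of the sun's core, in degrees C
FUSION_TEMP_C = 100e6     # minimum temperature cited in the text for fusion on Earth

# How much hotter a terrestrial fusion plasma must be than the sun's core.
ratio = FUSION_TEMP_C / SUN_CORE_TEMP_C
print(f"Required temperature is roughly {ratio:.1f}x the sun's core")

def q_factor(fusion_power_out_mw: float, heating_power_in_mw: float) -> float:
    """Fusion energy gain: power produced divided by power supplied.

    Q < 1: net energy loss (the situation described in the text).
    Q = 1: breakeven. Q > 1: net gain.
    """
    return fusion_power_out_mw / heating_power_in_mw

# Hypothetical numbers: 10 MW of fusion output for 25 MW of heating input.
print(f"Q = {q_factor(10.0, 25.0):.2f}")  # well below breakeven
```

Framing the problem as Q makes the text’s point crisp: every fusion experiment so far has operated at Q below 1, and a commercial plant would need Q well above 1 to cover its own overheads.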