The History Of Nuclear Power

Not yet a century old, nuclear power is one of the most promising tools for reducing carbon emissions. Discover its rich history here.

Nuclear power is one of the most promising tools in the pursuit of reducing carbon emissions. Although it’s a relatively recent discovery, nuclear power has a rich history.

While the science behind nuclear power dates back to 1895, it wasn't developed as a potential energy source until the late 1930s. Since then, nuclear energy has expanded rapidly and has proven itself a reliable source of low-carbon energy - a vital component of our modern world.

Being the world's second-largest source of low-carbon electricity is no easy feat, but that's what nuclear power has become. Understanding nuclear power helps bring it into the mainstream and makes it more likely that this incredible energy source will shed its bad reputation.

What is Nuclear Power? 

Nuclear power is the use of nuclear reactions to release energy, which is then used to generate electricity. Nuclear energy is the energy stored in the core of an atom - its nucleus, hence "nuclear" - and it is released when the nucleus is split or fused. 

There are three types of nuclear reaction that release nuclear energy: 

  • Nuclear fission - a heavy nucleus is split apart, releasing energy. Nuclear fission is the reaction that takes place inside nuclear power plants (see the example reaction below). 
  • Nuclear decay (a.k.a. radioactive decay) - an unstable nucleus sheds excess energy in the form of radiation. Radioactive elements such as uranium, plutonium, and radium have unstable nuclei and emit radiation. 
  • Nuclear fusion - light nuclei (those with low atomic numbers) fuse into a heavier nucleus, releasing energy. 
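
As a rough illustration (this is just one common fission channel, chosen as a textbook example - real fission produces a whole range of product pairs), a uranium-235 nucleus struck by a slow neutron can split as follows:

\[
{}^{1}_{0}\mathrm{n} \;+\; {}^{235}_{92}\mathrm{U} \;\longrightarrow\; {}^{141}_{56}\mathrm{Ba} \;+\; {}^{92}_{36}\mathrm{Kr} \;+\; 3\,{}^{1}_{0}\mathrm{n} \;+\; \sim\!200\ \mathrm{MeV}
\]

The two or three extra neutrons released by each fission can go on to split further uranium nuclei, which is what makes a self-sustaining chain reaction - and therefore a power plant - possible.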

The Origins Of Nuclear Power

The discovery of nuclear energy was one of the most significant scientific breakthroughs of the modern era - as was the ability to harness it as an energy source.

Until 1801, the word atom was only a general term used to refer to the smallest building blocks of matter. In fact, the word 'atom' was coined in the fifth century BC by the Greek philosophers Democritus and Leucippus and stems from the Greek word atomos - meaning 'indivisible.'

In 1801, chemist John Dalton discovered that elements were made up of atoms and that the makeup of the atoms varied between elements. 

However, it wouldn't be until the 1930s that nuclear power became a real possibility.

The beginning of the 20th century was awash in technological and scientific advances. James Chadwick discovered the neutron in 1932. That same year, other scientists were discovering how to transform atoms by bombarding them with protons.

In 1934, Enrico Fermi discovered that bombarding atoms with neutrons instead of protons opened up more possibilities for transformation. Then, in 1939, his experiments helped confirm that nuclear fission released enormous amounts of energy, and in 1942 he produced the first self-sustaining nuclear chain reaction.

After World War II, in 1946, the Atomic Energy Commission was founded in the U.S., which helped create the first electricity-generating nuclear reactor in Idaho a few years later. In the 1950s, nuclear power went commercial, and by the 1990s, 32 countries worldwide had nuclear reactors.

Discovery of Nuclear Fission

The Italian physicist Enrico Fermi was the first to suggest nuclear power after discovering that atoms could be split using neutrons. In 1934, Fermi began his first experiments using neutrons to split atoms - specifically uranium atoms.

He was surprised to find that bombarding uranium atoms with neutrons produced lighter atoms - suggesting that energy was being released by the impact.  

Fermi and his research team created the very first nuclear chain reaction in 1942 at the University of Chicago. 

A few years earlier, in 1938, the German scientists Fritz Strassmann and Otto Hahn had repeated Fermi's experiment, bombarding uranium with neutrons produced by a radium-beryllium source.

Among the leftover material from the impact was barium - an element with about half the atomic weight of uranium. This was a groundbreaking result, because the leftover materials in Fermi's experiments had been only marginally lighter than uranium. 

Along with their colleague Lise Meitner, who had fled Nazi Germany for Sweden, the scientists came to the conclusion that the lost atomic mass had been converted into energy. It was Meitner who first suggested that this energy resulted from the splitting of the nucleus by an incoming neutron.
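
The scale of that energy follows from Einstein's mass-energy relation, E = mc². As a back-of-the-envelope figure (the exact value depends on which fission products are formed), the roughly 0.2 atomic mass units lost when a single uranium nucleus splits corresponds to

\[
E = \Delta m \, c^{2} \approx 0.2\ \mathrm{u} \times 931.5\ \tfrac{\mathrm{MeV}}{\mathrm{u}} \approx 190\ \mathrm{MeV},
\]

tens of millions of times more energy than a typical chemical reaction, such as burning a single molecule of fuel, releases.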

Meitner developed the idea presented by Hahn and Strassmann alongside two other colleagues - the famous physicists Niels Bohr and Otto R. Frisch - who contributed greatly to the theory of nuclear fission.

Building the First Nuclear Chain Reactor

The Danish physicist Niels Bohr came to the US in 1939, bringing with him the discoveries made by his colleagues Hahn, Strassmann, and Meitner. After meeting Fermi, the two immediately began planning the best way to prove the theories their colleagues had suggested. 

At the University of Chicago, Fermi and his team - which included the notable physicist Leo Szilard - produced the design for what would soon become the world's first self-sustaining nuclear chain reactor. It was a pile of graphite blocks with uranium embedded inside, and it became known as Chicago Pile-1. 

On December 2, 1942, over the course of several hours, the experiment proved to be a success and the world officially entered the nuclear age. 

Nuclear Power Setbacks

There are many good things to say about nuclear power, not the least of which is its very low carbon footprint - it produces roughly one-thirtieth the carbon emissions of coal.

But ask ten people how they feel about nuclear energy and a good chunk of them will respond with fear. Many will cite how dangerous nuclear meltdowns are, and it's worth taking those concerns into account.

Chernobyl

The most famous nuclear accident was also the deadliest: Chernobyl, in 1986. The accident occurred, ironically enough, during a safety test.

The explosion killed two people, and 29 of the first responders later died from acute radiation sickness. It was a tragedy caused by poor infrastructure, a flawed reactor design, and inadequate safety measures.

Understanding how the explosion at Chernobyl occurred helps us design and operate future reactors more safely.

Three Mile Island

In 1979, Three Mile Island’s Unit 2 reactor underwent a partial meltdown. Considered the most severe accident in the U.S.’s nuclear power history, Three Mile Island did not result in any deaths, illnesses, or injuries.

The reactor was built near Middletown, Pennsylvania. On the day of the accident, key components of the reactor's cooling system malfunctioned, the reactor overheated, and a partial meltdown ensued.

A partial meltdown simply means that some, though not all, of the nuclear fuel in the core melted; normally the cooling system keeps it solid. Thankfully, the safety casing around the reactor core contained the accident.

Fukushima

More recently, in 2011, the Fukushima accident reminded the world of the risks of nuclear energy. A perfect storm gathered that day: the Japanese coast was hit by a magnitude-9.1 earthquake, followed by a tsunami, which led to the partial meltdown of the Fukushima plant.

The plant had a safety measure that shut down the reactors as soon as the earthquake was detected, but the nuclear cores were still hot, and the tsunami that followed knocked out the backup power needed to cool them back down to a safe temperature.

Depending on the source, either no one or one person died as a result of the meltdown. The radiation that leaked into the ocean and the air did contaminate local marine life and crops, but the area has almost completely recovered.

Modern Nuclear Power

It's important to keep in mind just how new this technology is, with the first commercial reactors coming online in the late 1950s. The technology boom we've seen in the past few decades has not left nuclear technology behind.

Nuclear reactors are grouped by generation. First-generation reactors are generally considered prototypes. They produced the first commercial electricity but were quickly replaced by newer designs.

Second-generation nuclear reactors were established in the mid-1960s. Most of the reactors in use today are of this generation. They were designed to be safer than the first generation, although their safety protocols still need to be activated by a human operator.

Nuclear reactor technology then received little attention until the 1990s, when third-generation reactors were developed. The primary pursuit was an even safer reactor.

This was achieved by turning the active safety measures into passive safety measures, which meant they would be triggered automatically if certain conditions were met.

Passive safety measures have the great benefit of working both when a human operator is not available and when electricity has been cut off - something reactors with only active safety systems cannot do.

Fourth-generation reactors are currently being developed, with six nuclear reactor designs being considered fourth-generation at the time of writing. The focus of the new generation is safety, economic feasibility, limiting waste, and maximizing output.

The six types of fourth-generation reactors are the Very-High-Temperature Reactor, the Sodium-Cooled Fast Reactor, the Supercritical-Water-Cooled Reactor, the Gas-Cooled Fast Reactor, the Lead-Cooled Fast Reactor, and the Molten Salt Reactor.

Another exciting development in nuclear technology is the Small Modular Reactor. These reactors are considerably smaller than other types of reactors and can vary significantly in their output, making them adaptable.

These Small Modular Reactors are still under development and are expected to be available in the late 2020s or early 2030s. Because of their size and adaptability, they could make nuclear energy much more economically feasible while requiring less land.

While nuclear energy accounts for about 10% of the world's electricity, the very real effects of climate change make increasing nuclear energy production an urgent priority.

Milestones in the History of Nuclear Power

After the success of Chicago Pile-1, the field of nuclear power grew rapidly. Over a short period of time, simple experiments grew into more elaborate projects, including using nuclear energy to power lights and - eventually - entire towns. 

  • 1946 - Congress creates the Atomic Energy Commission (AEC), which would oversee all regulations surrounding the production of nuclear energy.
  • 1951 - the Experimental Breeder Reactor-1 in Idaho becomes the world's first nuclear reactor to generate electricity. Breeder reactors are a special kind of nuclear reactor that produce more fissionable material than they consume.
  • 1955 - Arco, Idaho becomes the first town to be entirely powered by nuclear energy.
  • 1957 - the first full-scale commercial nuclear power plant opens in Shippingport, Pennsylvania. It uses a light-water reactor, which cools the reactor core with ordinary water; because light-water reactors were more economical than previous designs, they became the default for future power plants. 

Throughout the decades that followed, nuclear power grew more popular as a more economical and more reliable source of electricity than many alternatives.

While some events raised concerns about the safety of using nuclear energy, like the infamous Chernobyl and Fukushima disasters, nuclear energy remains a clean and economical energy source.

Conclusion

Nuclear power isn't the newest technology we have available, yet it isn't even a century old, and new developments are steadily coming through.

Because the few accidents that have occurred were so widely publicized, and because popular entertainment loves to depict nuclear power as more dangerous than it is, the biggest hurdle to adoption is public perception.

Fortunately, nuclear power has come a long way from its early days. From smaller reactors to cleaner and more efficient waste disposal, the latest developments in nuclear power have been encouraging.

Scientists agree that nuclear power is in fact among the safest energy sources available, with death rates comparable to those of wind and solar and far lower than those of fossil fuels. Learning the history of this wonderful technology is the first step towards a brighter, more abundant future.