By Teresa Welsh
About 20 percent of the nation's electricity is produced at the country's 65 nuclear power plants. Unlike coal and natural gas, America's top sources of electricity, nuclear plants have near-zero carbon emissions. That small carbon footprint makes nuclear reactors attractive, but the drawbacks of producing electricity with nuclear technology are well known: the threat of meltdowns and the problem of disposing of spent, highly radioactive fuel.
Public support for the once-burgeoning industry was seriously damaged by the catastrophic events at the Fukushima Daiichi nuclear power plant in Japan last March. In that incident, the plant lost electrical power after an earthquake and the tsunami it triggered, and three of its reactors suffered full meltdowns. A series of hydrogen explosions ripped through the facility, releasing radioactive material into the air, ground, and sea. The disaster at Fukushima was eventually classified as a level 7 incident, the highest rating on the International Nuclear Event Scale.
In the United States, the Energy Department has designated $22.5 billion for nuclear industry projects as part of its renewable energy loan guarantee program. Proponents say the investment is overdue and point to nuclear power as an effective, carbon-neutral source of electricity. Opponents argue that those resources would be better spent elsewhere and worry that an incident like the one at Fukushima could be repeated on American soil.
Should nuclear power be expanded? Here’s the Debate Club’s take:
John Shimkus, U.S. Representative for Illinois's 19th District
Anthony R. Pietrangelo, Senior Vice President and Chief Nuclear Officer of the Nuclear Energy Institute
Michael Mariotte, Executive Director and Chief Spokesperson of the Nuclear Information and Resource Service