Can commercial nuclear power be an economically viable and timely option for addressing climate change? This may seem an odd question to ask in the aftermath of the triple meltdowns in March 2011 at the Fukushima Daiichi reactors in Japan, which have led a number of nations to rethink their nuclear plans (Dempsey 2011, Morales 2011, Yang and Mufson 2011). Before the accident, the industry and its supporters in and out of government loudly trumpeted the possibility of a ‘nuclear renaissance’ fueled primarily by rising concerns over climate change, dramatic swings in the price of fossil fuels, and a host of federal subsidies (Snyder 2010, Whitten 2010). Although such talk now seems fanciful, it is worth noting that a nuclear revival was unlikely even before Fukushima. For nuclear power to make a significant contribution to the climate change problem, the United States and others would have had to embark on an unprecedented program of reactor building, and sustain it for decades. For that to happen, however, fossil fuels would have to become much more expensive (perhaps through an increase in the price of carbon) and the nuclear industry would have to overcome some familiar but significant political and economic obstacles, all of which were made worse by the Fukushima accident.