Last month, I penned a simple list of 30 facts that I hoped would quell some fears and misconceptions about nuclear energy. Fellow Fool Travis Hoium offered a rebuttal with five facts that he believed doomed nuclear power. The biggest nail in nuclear’s coffin is cost, according to Travis and many others.
While I always welcome an intellectual discussion and debate, the article made me wonder whether nuclear really was more expensive than other sources of electricity generation. After digging into the numbers and working my way up the logic tree, I found several surprising answers.
First things first
My interest in the differing costs of flipping a light switch was piqued when Travis offered an example comparing First Solar, Inc. (NASDAQ:FSLR)’s new plant in New Mexico with new nuclear facilities being built throughout the country. I wasn’t sold on the simplicity of the argument: solar energy in the middle of the desert — an optimal location — is cheaper than another energy source. Let’s see First Solar build one of those puppies in New England and keep electricity rates under $0.06 per kWh.
This doesn’t mean renewable energy is an inferior option for the national grid, but it does highlight how regional and complex electricity costs are. For instance, 34 states had average retail electricity prices below the national average of $0.0983 per kWh in 2010. That’s possible because the national figure is a sales-weighted average: states that consume more electricity (often populous states with higher prices) count more heavily in the calculation. And as we will see, prices are also affected by the makeup of the regional grid.
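To see how that weighting plays out, here’s a minimal sketch. The prices and sales figures below are purely illustrative (not EIA data); the point is that a sales-weighted average can land above what most states actually pay:

```python
# Toy illustration with hypothetical numbers (not EIA data): a sales-weighted
# average price can exceed the rate paid in most states when a few
# high-consumption states also pay high rates.
prices = [0.08, 0.08, 0.09, 0.09, 0.16]  # $/kWh for five imaginary states
sales = [50, 60, 55, 45, 300]            # TWh sold in each imaginary state

weighted_avg = sum(p * s for p, s in zip(prices, sales)) / sum(sales)
print(f"Sales-weighted average: ${weighted_avg:.4f}/kWh")
# Prints roughly $0.1290/kWh, even though four of the five states pay far less.
```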
Chicken or the egg?
Next, I turned to the Energy Information Administration, or EIA, for state-by-state data. It took a while to mine meaningful relationships from the raw numbers, but there did seem to be correlations between costly electricity, summer capacity, and total nuclear capacity. Here are selected metrics from the top five nuclear-powered states:
State | Average Retail Electricity Price ($/kWh) | Price Rank (cheapest=1) | Net Summer Capacity (GW) | Summer Capacity Rank (largest=1) |
---|---|---|---|---|
Illinois | $0.0913 | 28 | 44.13 | 5 |
Pennsylvania | $0.1031 | 36 | 45.58 | 4 |
South Carolina | $0.0849 | 21 | 8.49 | 17 |
New York | $0.1641 | 48 | 16.41 | 6 |
North Carolina | $0.0867 | 24 | 8.67 | 12 |
National average | $0.0983 | — | 20.77 | — |
Looking at the 19 states with no nuclear capacity adds some color to the other end of the analysis: 12 of those states rank among the 19 cheapest for retail rates. That’s it. End of story. Nuclear must be the cause of higher electricity prices. Right?
There’s another way to view this relationship. Does nuclear cause higher rates, or is the powerful energy source simply relied upon more heavily in regions with larger populations and more demand?
Consider that the correlation coefficient between net summer capacity (the generating capacity available when electricity demand is at its peak) and the amount of nuclear capacity installed in each state is 0.73 (pretty high). What the heck does that mean? States with high-density population centers (cities) are more likely to rely on nuclear power, and a more diverse generation mix in general, to meet customer demand than states with lower peak demand.
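For the curious, that figure is a plain Pearson correlation. Here’s a minimal sketch of the calculation; the two capacity lists are hypothetical stand-ins for the EIA columns, not the actual 50-state data:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    std_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    std_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (std_x * std_y)

# Hypothetical stand-ins for the two EIA columns (GW), not the real data:
net_summer_capacity = [44, 46, 24, 39, 28, 12, 8]
nuclear_capacity = [11, 10, 7, 5, 5, 0, 0]

print(f"r = {pearson(net_summer_capacity, nuclear_capacity):.2f}")
```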
Exelon Corporation (NYSE:EXC), operator of the Braidwood, Ill. nuclear facility, illustrates this trend perfectly. The company has 17 reactors powering the metropolitan areas of Chicago (13), Philadelphia (three), and New Jersey (one), generating almost 140 terawatt-hours (TWh) of electricity each year. So it looks as if we have answered several important questions already, but there’s no getting around the enormous price tags that come with constructing a new nuclear facility.
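That output figure passes a quick back-of-envelope test. The sketch below assumes roughly 1 GW of average net capacity per reactor and a ~90 percent capacity factor, both typical for U.S. nuclear units (neither number comes from the article):

```python
# Rough sanity check on a 17-reactor fleet's annual output.
# Assumptions (illustrative, not from the article): ~1 GW average net
# capacity per reactor and a ~90% fleet-wide capacity factor.
reactors = 17
avg_capacity_gw = 1.0
capacity_factor = 0.90
hours_per_year = 8760

annual_twh = reactors * avg_capacity_gw * capacity_factor * hours_per_year / 1000
print(f"~{annual_twh:.0f} TWh per year")  # ~134 TWh, in line with "almost 140"
```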