Combined, these sources make for $9-10 per watt. Furthermore, they have life spans far shorter than nuclear plants', meaning they'll have to be replaced more frequently. By comparison, your own source found that nuclear was built for $2-3 per watt during the nuclear boom. Again: your own sources contradict you.
You're cherry-picking the data I cherry-picked to help you, again. P919 says the average cost was $589/kW in 1983 dollars, with $120/kW of non-TMI retrofits, and costs rose over time rather than going down. In the hypothetical where this is in some way related to something that could happen now, this is $3600/kW vs a renewable blend in Germany at $3400/kW. If we take the last few plants from each manufacturer that opened before 1979 and don't add retrofit costs, it's about the same. Your argument about positive learning rates doesn't fit the data even slightly.
Your absolute best argument, if I shuffle the goalposts all the way along for you and ignore the guaranteed money, the abandoned plants, the shutdowns that occurred under a decade after opening (all of which were paid for on the public dime), the military and government involvement, and the lack of liability, is that undoing 37 years of safety and efficiency improvements and reproducing reactor designs with a capacity factor similar to wind's and a much higher correlated forced outage rate than a renewable blend sans storage will let you come in at only 7% over the cost and only 4-6 years later?
Then even after all that, operating it for two decades will cost more than the total cost of the renewable system.
All this in a country with mediocre wind and worse solar resource than Alberta, Canada. This is your argument?
Whatever "TMI retrofits" which you keep referring to (yet never actually backing it up with a source) are likely not necessary: 3 mile island's secondary containment worked and prevented any significant amount of radiation release.
The estimates you're giving for renewables either exclude the cost of storage or use fanciful figures of 4 hours' worth, and they also exclude the costs of transmission and load shifting.
> In the hypothetical where this is in some way related to something that could happen now, this is $3600/kW vs a renewable blend in Germany at $3400/kW. If we take the last few plants from each manufacturer that opened before 1979 and don't add retrofit costs, it's about the same.
No, it doesn't. It comes out to $1600/kW. Average capacity factor of nuclear power is over 90%, not the 50% you claimed earlier. And again, your "renewable blend" omits the cost of storage, which will be immense if we're even able to build storage at the scale required at all.
The retrofits are reliability and safety upgrades excluding those that happened as a result of TMI; see P920 in the Phung paper I linked. They add about $120/kW, or $200/kW net, in 1983 dollars.
You don't get to use the price excluding 40 years of reliability and safety upgrades since TMI, in one of the strictest nuclear regulatory regimes, with tens of billions of tax money spent on the public share of enforcement, and then use the performance including those upgrades. A Ford Pinto isn't a 2022 Lamborghini.
The prices are pre-TMI costs. The 58% is the pre-TMI lifetime capacity factor. If you want to use 92%, then find and source the cost of retrofits and interruptions between 1979 and 2022, as well as the cost of replacing all the plants that closed early and the cost of the abandoned plants.
At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages. The renewable blend isn't as reliable as the modern fleet, but the 1979 fleet needs more storage, more backup, and more transmission to distribute the overprovision to where it is needed.
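To make that concrete, here's a toy Monte Carlo sketch. Every number in it is an illustrative assumption of mine (a hypothetical 10-unit fleet, the ~20% forced outage rate from above split between an independent part and a shared common-cause part), not data from any source; it only shows why the same average outage rate hurts much more when outages are correlated.

```python
import numpy as np

# Toy sketch: same ~20% average forced outage rate, with and without a shared
# common-cause component. All parameters are illustrative assumptions, not data.
rng = np.random.default_rng(0)

n_units = 10            # hypothetical fleet size
days = 200_000          # simulated days

p_common = 0.05         # a common-cause event (grid fault, generic defect, heat wave...)
p_down_common = 0.80    # per-unit outage probability during such an event
p_down_normal = 0.168   # per-unit outage probability otherwise
# average outage rate ~= 0.05 * 0.80 + 0.95 * 0.168 ~= 0.20

common = rng.random(days) < p_common
p_down = np.where(common, p_down_common, p_down_normal)
down_corr = rng.random((days, n_units)) < p_down[:, None]

# Same ~20% average rate, but every unit fails independently.
down_indep = rng.random((days, n_units)) < 0.20

for name, down in (("correlated", down_corr), ("independent", down_indep)):
    available = n_units - down.sum(axis=1)
    severe = (available <= 2).mean()   # 8+ of 10 units offline at once
    print(f"{name:11s}: mean units available = {available.mean():.2f}, "
          f"P(8+ units down) = {severe:.4%}")
```

Both runs lose the same ~20% of fleet-hours on average, but the correlated case loses most of the fleet at once a few percent of the time versus essentially never in the independent case, and that tail is what sets the storage, backup, and transmission requirement.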
If you don't want to prevent more TMI-style incidents by adding all the stuff that came after, you're also going to have to throw in a billion dollars every 20 years or so to pay for cleanup and for replacing the lost generation capacity.
Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom. This alone is enough to disprove your assertion that the cost difference is due to lesser construction.
So it adds less than $200 million per GW of capacity. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's much harder and more expensive to do a retrofit than it is to incorporate these changes in the initial construction.
And again, Three Mile Island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.
> At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages.
Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.
> Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom.
And again, you skew the timelines to fit your narrative. The nuclear boom started in 1965, and brought large price decreases relative to plants that started construction before then. Those are the price drops brought about by the nuclear boom. Reactors continued to be cheap until Three Mile Island occurred and reactors that had started construction in the mid-70s had to deal with a whole ton of nuclear obstructionism. That's why prices increased among reactors that started construction in the mid-1970s and later. Restrict the query to reactors completed before Three Mile Island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.
> So it adds less than $200 million per GW of capacity. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's much harder and more expensive to do a retrofit than it is to incorporate these changes in the initial construction.
$200/kW in 1983 dollars is about $600/kW in 2022 dollars. So why are you claiming it's the reason the plants finished in the 80s and 90s were more expensive? A much cheaper version of a quarter of an insignificant amount can't double the price. There must be a different explanation, like a negative learning rate.
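As a quick check on the dollar conversion itself (the CPI-U averages here are approximate figures from memory, so treat the deflator as roughly right rather than exact):

```python
# Rough check of the 1983 -> 2022 dollar conversion used above.
# CPI-U annual averages, approximate: 1983 ~= 99.6, 2022 ~= 292.7.
cpi_1983, cpi_2022 = 99.6, 292.7
deflator = cpi_2022 / cpi_1983              # ~2.9x

net_retrofit_1983 = 200                      # $/kW net, the figure discussed above
net_retrofit_2022 = net_retrofit_1983 * deflator

print(f"deflator ~{deflator:.2f}x")
print(f"$200/kW (1983) ~= ${net_retrofit_2022:.0f}/kW (2022), "
      f"i.e. ~${net_retrofit_2022:.0f}M per GW of plant capacity")
```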
> And again, Three Mile Island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.
It still wiped out the plant and cost billions to clean up. And these are all changes related to the countless minor incidents like fires and pipe bursts *before* TMI. The changes due to TMI before 1985 were not counted and were smaller. 40% of these modifications were initiated by the utilities to improve reliability and weren't even due to regulation. This was the cost of bringing the early, extremely simple reactors up to the same safety and reliability standards as TMI.
> Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.
Not for the machines you're saying we should build. If you want to build a machine with a third of the material, a quarter of the regulation, a sixth of the labour, many fewer redundancies, and less QA? You get one that performs like it. More regulation, more expensive designs, and another 40 years of upgrades are what improved the reliability.
The nuclear industry learnt by doing, and what they learnt is that if you half-ass things your reactor catches fire or starts leaking and needs repairs, like Browns Ferry or Rancho Seco or any of the N4 reactors in France, which were built and operated at similarly slapdash rates.
> Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.
In the primary source from the DOE, the prices increase. Same in the Phung paper. The retrofit costs I included are precisely and only the ones unrelated to TMI. Excluding a small fraction of FOAK plants, plants that opened from 1969 to 1979 went up in price the entire time. Every year they got bigger and more complex and added more redundancies, because that was what was needed to keep them running more than half the time.
Every year the very first commercial plants were running, more problems were found that needed monkey patching and required adding complexity to subsequent and in progress designs. This necessarily can't have started before the first plant of each manufacturer was running.
The price per gigawatt dropped precipitously in plants with construction starting in 1965, and remained flat until Three Mile Island. They got bigger and more complex, but they produced more power and the cost per watt was the same. Only after Three Mile Island did costs start to balloon. Costs were almost entirely under $2 billion per GW until Three Mile Island. Your sources do not contradict this; I have checked.
The source which you keep citing parts of, or citing directly, shows costs increasing 23% year on year after the opening of the first large commercial reactor. The increase then slowed after TMI, according to that paper. The only cost decreases were demo reactors and small turnkey reactors, most of which were shut down not long after. The countries where costs did not escalate all had far worse reliability than the US program after the 80s. You get what you pay for.
In 2022 dollars the overnight cost for reactors coming online just prior to TMI is over $3000/kW for capacity factors that didn't exceed 58% until many more upgrades and repairs had been made over the course of years or decades.
Here is a simple model in the Arctic for a 100 MW renewable mix with higher capacity than those plants, at an all-in cost lower than the overnight cost in your fantasy scenario. It uses a capacity factor about 2% lower than the median for new wind and a solar panel angle that is 20 degrees off optimal for the latitude.
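For anyone following along, the adjustment both sides keep making implicitly is just overnight cost divided by capacity factor, i.e. dollars per kW of average delivered output. A minimal sketch using only figures already quoted in this thread (the renewable blend's effective capacity factor isn't pinned down here, so it's left as a comment rather than a number):

```python
# $ per kW of *average delivered* output = overnight $/kW / capacity factor.
# Inputs below are figures quoted in this thread; nothing new is asserted.

def per_average_kw(overnight_usd_per_kw: float, capacity_factor: float) -> float:
    """Convert an overnight cost per nameplate kW into cost per kW of average output."""
    return overnight_usd_per_kw / capacity_factor

# Pre-TMI fleet as characterized above: ~$3000/kW overnight at a ~58% lifetime CF.
print(per_average_kw(3000, 0.58))   # ~5170 $/kW of average output

# The same overnight cost at the ~92% CF credited to the modern, upgraded fleet.
print(per_average_kw(3000, 0.92))   # ~3260 $/kW of average output

# A renewable blend is compared the same way (e.g. the ~$3400/kW figure quoted
# earlier divided by whatever effective capacity factor the blend achieves);
# note this metric deliberately ignores storage, overprovision, and transmission.
```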
You're mixing up construction start dates and end dates. It's the color that distinguishes which plants were completed before and after Three Mile Island. The red is before. The brown is after.
Costs exploded after Three Mile Island; your idea that cost increases slowed afterwards is the complete opposite of what happened. Can you really not see how the brown data points shoot up?
> for capacity factors that didn't exceed 58% until many more upgrades and repairs had been made over the course of years or decades.
I looked into your claim that earlier plants had lower capacity factors. This is true for demonstration plants in the 50s and early 60s, but not the 800+ MW production facilities that were built during the nuclear boom.
> You're mixing up construction start dates and end dates. It's the color that distinguishes which plants were completed before and after Three Mile Island. The red is before. The brown is after.
I am and always have been talking only about the bright red dots. I'm indulging your fantasy and demonstrating that the result of it is the opposite of what you claim. Yet again: the first large commercial reactor that didn't shut down after a handful of years was San Onofre, opening in 1968. This is the start point.
Every year in which nuclear reactors were being operated commercially, costs increased due to the discovery of all the ways it's really hard. Reactors which were under construction after 1968 and completed before 1979 got more expensive every year, at a rate of 23%, as per your own source. Your red cluster is an increasing function of time. The growth rate (as in the ratio of costs from year to year) actually slowed after TMI. This is the conclusion of the paper this image is from.
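Just to show what that growth rate implies if you compound it (the 23%/year figure is the one cited above; this is arithmetic on that figure, not an independent estimate):

```python
# Compounding the ~23%/year cost escalation cited above across the pre-TMI window.
rate = 0.23
base_year = 1968                     # first large commercial reactor, per the thread

for year in range(1968, 1980):       # completions 1968 through 1979
    multiple = (1 + rate) ** (year - base_year)
    print(f"{year}: {multiple:4.1f}x the {base_year} cost")

# (1.23)**11 ~= 9.7, so a plant at the end of the window costs roughly ten times
# one at the start: a strongly negative learning rate even before TMI.
```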
The top end of that line represents reactors started just before TMI, the reactors you claim are the cheapest of all time. Adjusted to 2022 dollars, they cost over $3000/kW.
92% is the capacity factor for US plants after decades of operation, including the costs incurred by TMI and Chernobyl and Fukushima. It also includes survivorship bias, as the reactors which were destroyed or were too unreliable and shut down early are excluded. The world average EAF according to the IAEA is 79%. You want a cut-rate nuclear program, you get the performance of a cut-rate nuclear program. France and South Korea are barely better than the early US program. Japan was much worse. The only outlier with a large sample where reports are even approaching reality is China, and the prices China reports for every major project are a tiny fraction of what anyone else reports.
The Phung paper has a list of plants and capacity factors up until 1985. The average is 58% and many are far below. Every reactor that wasn't cancelled or closed and didn't have an accident which destroyed an integral part of it has had substantial upgrades and repairs since then.
If we were being honest, then of the 16 reactors finished between 1976 and 1979 we'd include in the accounting the price of the 3 that failed or were closed decades early, as well as their cleanup and decommissioning costs, but I'm letting you have that one along with all the other unaccounted subsidies.
This alone is more expensive than nuclear power built during the nuclear boom.
> New 4 hour battery is around $2/W.
So 12 hours of battery, which is a minimum estimate of what we'll need, is $6/W. Also, this price is rising: https://www.utilitydive.com/news/battery-prices-to-rise-for-...
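Written out, the arithmetic behind this sub-thread (the $2/W-per-4-hours and 12-hour figures are the ones quoted above; the $3-4/W for the wind and solar build is my inference from the "$9-10 per watt combined" claim at the top, not a number stated here):

```python
# The storage arithmetic from this sub-thread, spelled out.
# $2/W per 4-hour battery block and 12 hours of storage are the quoted figures;
# the wind+solar cost is backed out from the "$9-10 per watt combined" claim
# and is an inference, not a figure stated in the thread.
battery_per_4h = 2.0                         # $/W for a 4-hour battery system
hours = 12                                   # assumed minimum hours of storage
storage = battery_per_4h * (hours / 4)       # $6/W of storage

gen_low, gen_high = 3.0, 4.0                 # $/W wind+solar build (inferred)
print(f"storage ~${storage:.0f}/W, "
      f"combined ~${storage + gen_low:.0f}-{storage + gen_high:.0f}/W")
```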