Nuclear plants use very little raw material relative to the amount of power they produce. When built at scale, nuclear plants have been delivered at prices around $1-2 billion per GW of capacity.
The cheapest form of carbon-free energy really depends on what the objective is: small reductions in a mostly fossil-fuel grid, or total replacement of fossil fuels? Renewables are great for the former: you can throw up some solar panels or wind turbines and displace a chunk of fossil fuel use. But once you start delivering significant portions of the grid through intermittent sources, the surplus energy starts to get wasted and the effectiveness drops.
Nuclear is roughly equal to wind on a modular foundation if you account for the fact that the tower and foundation outlast the nacelle. The "$2 billion/GW" nuclear reactors were all built by state-run agencies with opaque budgets, and in France's, Japan's, and South Korea's cases have all proved wildly unreliable, in addition to having opaque public subsidy on top of the very large visible subsidies in the supply chain and finance. If you think it's possible to match the prices China reports for megaprojects, I'd like to see any examples of projects in the global north with auditable accounting matching their figures in hydro, or highways, or rail, or ports or... basically anything.
In mediocre-to-good areas, with something like the PEG racking system, solar already uses about the same raw material as nuclear, and it's almost all sand. By the time a new nuclear plant came online it would be far less.
Both are recyclable. 12 hour storage adds negligible mass and can easily cover daily variation.
Intermittent power without storage can easily feed dispatchable loads like EV charging, chemical feedstock and heat production. These vastly exceed non-dispatchable electricity and can be used for virtual seasonal storage.
There are only a small handful of areas best served by nuclear, and most of them have hydro or nuclear already.
There's a narrow niche where nuclear is optimal:
Grid electricity between 50% and 80% penetration in the 50% of areas where hybrid CSP + e-fuel backup isn't better. This niche is rapidly shrinking and could easily be gone by the time one is built. More carbon can be removed faster and with fewer resources by throwing renewables at the other 10 or so TW of fossil fuels currently being burnt. Until those resources are committed, new nuclear just delays things.
> Nuclear is roughly equal to wind on a modular foundation if you account for the fact that the tower and foundation outlast the nacelle.
Except intermittent sources also need storage. They also need long distance transmission lines to bring power from remote areas of generation to places of demand (whereas you can just place nuclear plants next to areas of demand).
This is a common pattern in renewables discussion: laser focus on generation and ignoring the fact that wind and solar have storage and transmission requirements that other energy sources don't have.
> The "$2/GW" nuclear reactors were all built by state run agencies with opaque budgets
Nope, do more research. These were all built in the US with public cost history.
> 12 hour storage adds negligible mass and can easily cover daily variation
12 hours of storage for the world is 30,000 GWh. This is 70-100 times the annual global battery production output. "Negligible mass" is going to require a hundredfold increase in some extraction industries. By comparison, nuclear already produces 20% of the US's electricity and about a tenth of the world's electricity. A tenfold increase is much more manageable than a hundredfold increase.
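Rough arithmetic behind those ratios (the ~2.5 TW average demand and the ~350 GWh/yr cell output are my own round-number assumptions):

```python
# Rough scale check of "12 hours of global storage" (illustrative assumptions).
avg_world_demand_gw = 2_500          # assumed ~2.5 TW average electricity demand
storage_hours = 12
storage_needed_gwh = avg_world_demand_gw * storage_hours      # 30,000 GWh

annual_battery_output_gwh = 350      # assumed global cell production, GWh/yr
ratio = storage_needed_gwh / annual_battery_output_gwh

print(f"Storage needed: {storage_needed_gwh:,} GWh")
print(f"Roughly {ratio:.0f}x one year of global battery production")
```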
> Intermittent power without storage can easily feed dispatchable loads like EV charging, chemical feedstock and heat production. These vastly exceed non-dispatchable electricity and can be used for virtual seasonal storage.
If you're going to tell chemical industries and metallurgy plants that they'll have to cease production for part of the year when renewables are producing lower than average output, then that has to be factored into your costs. If the price of steel and ammonia goes up because they can't run their plants as usual, then that cost is ultimately borne by consumers. You can't just use load shifting as part of the plan and ignore the costs of load shifting. "Virtual seasonal storage" amounts to "tell industries to shut off during winter". And no, heat production is not dispatchable unless you're okay with people freezing to death.
> Except intermittent sources also need storage. They also need long distance transmission lines to bring power from remote areas of generation to places of demand
You are correct. They also need storage like nuclear for non-dispatchable loads in areas without good hydro or CSP resource.
For the remainder your battery production figures are off by at least a factor of two. China delivered 280GWh in H1 2022 at the peak of a market crunch in an industry that is growing at 50% YoY. There's no compelling reason to think the 5TWh/yr of factories under construction won't be completed on time as the renewable industry has been consistently over-delivering for a decade.
Your scaling for nuclear is new capacity. Which is around 5GW/yr right now. It has to increase tenfold to match the last year of new renewable generation, or fifteenfold to match the new capacity weighted installation.
> (whereas you can just place nuclear plants next to areas of demand).
Incorrect. Seismic activity, ground, water, temperature, security and many other concerns limit siting severely.
> If the price of steel and ammonia goes up because they can't run their plants as usual, then that cost is ultimately borne by consumers. You can't just use load shifting as part of the plan and ignore the costs of load shifting. "Virtual seasonal storage" amounts to "tell industries to shut off during winter". And no, heat production is not dispatchable unless you're okay with people freezing to death.
Hydrogen or ammonia continue to exist after you make them. Simply overprovision your $300/kW electrolyser slightly and use chemical energy as your buffer. This has the added advantage of being an emergency or low CF backup at minimal extra cost.
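A toy sketch of how that buffering works; everything except the $300/kW figure is a made-up assumption for illustration:

```python
# Toy model (all profile numbers are made up): an overprovisioned electrolyser
# turns surplus renewable power into hydrogen, which is later drawn down as
# chemical feedstock or backup fuel instead of being curtailed.
ELECTROLYSER_COST_PER_KW = 300       # $/kW, the figure quoted above
ELECTROLYSER_EFFICIENCY = 0.70       # assumed conversion efficiency

surplus_mw_by_month = [120, 110, 90, 60, 30, 10, 5, 10, 40, 70, 100, 115]  # assumed
hours_per_month = 730

electrolyser_mw = 100                # chosen (overprovisioned) capacity

h2_banked_gwh = 0.0
for surplus_mw in surplus_mw_by_month:
    absorbed_mw = min(surplus_mw, electrolyser_mw)   # can't exceed electrolyser size
    h2_banked_gwh += absorbed_mw * hours_per_month * ELECTROLYSER_EFFICIENCY / 1000

capex_musd = electrolyser_mw * 1000 * ELECTROLYSER_COST_PER_KW / 1e6
print(f"Hydrogen energy banked per year: {h2_banked_gwh:.0f} GWh")
print(f"Electrolyser capex: ${capex_musd:.0f}M")
```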
> They also need storage like nuclear for non-dispatchable loads in areas without good hydro or CSP resource.
Really? Show me all the storage facilities France built when they have >80% of their electricity coming from nuclear power? It'll be challenging for you to do so, since no such storage facilities were built. Nuclear power can be modulated. Plants try not to do this since they want to run at 100% as much as they can to make money, but there is no storage requirement for nuclear power.
> Incorrect. Seismic activity, ground, water, temperature, security and many other concerns limit siting severely.
All of which has been solved already. Seismically active areas have nuclear plants both in the US and around the world. Water is a non-issue since places with large energy use tend to be cities, which are populated by humans which also need water. Nuclear plants can also use wastewater or seawater (like the Palo Verde plant), it doesn't have to be potable water.
> Hydrogen or ammonia continue to exist after you make them. Simply overprovision your $300/kW electrolyser slightly and use chemical energy as your buffer. This has the added advantage of being an emergency or low CF backup at minimal extra cost.
Then show me the price history of commercial ammonia grid storage operators. Well, first you'll have to wait for such a facility to come online because none are operational. Proponents of intermittent sources keep hoping that some silver bullet will make storage nearly-free, since it's the only way to make wind and solar viable as primary sources of energy. But thus far, no silver bullet has come and it's unclear if it ever will.
Unlike nuclear, which has historical precedent of being built at scale and cheaply. If we had kept building nuclear plants at the same rate as we did in the 60s and 70s we'd have a completely decarbonized grid by now. We have no such historical precedent for building grid storage.
> Really? Show me all the storage facilities France built when they have >80% of their electricity coming from nuclear power?
It's called Europe. To make their nuclear less unaffordable they use other countries as seasonal and diurnal storage.
> Unlike nuclear, which has historical precedent of being built at scale and cheaply
Where? Show a single privately run Gen III or later commercial plant funded without government enforced monopoly, free loans, or direct funding that comes in at an affordable price.
> Unlike nuclear, which has historical precedent of being built at scale and cheaply. If we had kept building nuclear plants at the same rate as we did in the 60s and 70s we'd have a completely decarbonized grid by now. We have no such historical precedent for building grid storage.
No, there'd be no viable Uranium in the ground after about 1980.
> Nope, do more research. These were all built in the US with public cost history.
On top of not including the cost of finance, liability, or the upstream supply chain, which was provided by state military projects. The first one you linked had repeated safety violations and maintenance issues and there is no record of how much they cost to remedy. The quoted price leads here:
And is in nominal dollars not 1986 dollars. $763 million 1966 dollars is $7 billion, not $3 billion. Add in the cost of finance and you get $10-14 billion or around $7/watt for an unsafe, inefficient plant with corrupt management. And this was not a greenfield site, it already had work for unit 1.
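Roughly how that adjustment works out (the CPI multiplier and the finance range are approximate assumptions on my part):

```python
# Roughly how the nominal-to-real adjustment above works (CPI multiplier and
# finance range are approximate assumptions).
nominal_1966_musd = 763
cpi_1966_to_2022 = 9.2                # assumed CPI multiplier, 1966 -> 2022
overnight_2022_busd = nominal_1966_musd * cpi_1966_to_2022 / 1000    # ~7

finance_low, finance_high = 1.4, 2.0  # assumed interest-during-construction multipliers
print(f"~${overnight_2022_busd:.0f}B overnight in 2022 dollars")
print(f"~${overnight_2022_busd * finance_low:.0f}-"
      f"{overnight_2022_busd * finance_high:.0f}B including cost of finance")
```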
Do more research.
Every pro-nuclear claim turns out to be a lie when examined with even the slightest scrutiny. All of them.
Your article studies plants built after the nuclear boom, which of course leads to higher prices. See the cluster of plants built cheaply starting in the mid 60s [1]? That's the nuclear boom. Your article studies plants still in construction at the end of 1986, which is when the nuclear boom tapered off following Three Mile Island. Deliberately or not, you're pulling a sleight of hand here by shifting the time frame. But in the end, this helps reinforce my point: nuclear is expensive when built in small numbers as your study demonstrates, and cheaper when built at scale as the study I'll link below explains.
Finance liability is a fancy word for debt: this has nothing to do with construction costs, and everything to do with financing models. You're right, nuclear would be even cheaper if better financing was done. The upstream supply chain is accounted for by the downstream purchase costs. This is like saying wind turbine costs don't include the costs of mining copper for the dynamos. That cost is included when the wind turbine manufacturer pays for copper coils.
The costs you quoted were from a source linked on wiki talking about the first reactor on your list.
My source is a primary source from the DOE for that source.
It included every reactor built before 1986, including Palisades, which your article lists as $650/kWe but which it has down as $118 million, or $1300/kWe in 2022 dollars, excluding retrofits for subsequent safety standards. With capacity factor that's $1700/kW net for an inefficient and unsafe design before major safety standards were written.
If your downstream product is a byproduct of a military project that was built for a different purpose then you cannot claim it includes costs. If the US military needed a supply of worn out giant bearings, provided wind turbine designs that cost trillions for free, and was selling turbine blades and nacelles at low prices it would also be a subsidy.
Whatever your opinion on finance, it is included in renewable projects which are fully privately funded.
If your hypothesis about construction booms was true, then the price minimum would be either reactors started in 1982 when construction was at its peak, or if you want to claim TMI as a boogeyman, then reactors finished just before it.
More likely it is:
a) an artefact of whatever part of your chain of references didn't catch the bit in the DOE report that says it is in nominal dollars which depresses prices of early reactors by a factor of two, and
b) The fact that nuclear has a strong causal mechanism for negative learning rate. Each new reactor teaches you new things that can go wrong which you then have to retrofit to old reactors. This only appears in the sticker price if they are still under construction.
Take Arkansas Nuclear One Unit 2. It was $577m for 858MW at 80% CF.
Inflation was quite substantial, so we can't answer without knowing what rate it was disbursed at and at what interest during construction, but it is between $4500 and $6500 per kW net. Right in line with new reactors once cost of finance is included.
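For reference, the arithmetic that produces that band, with assumed CPI multipliers bracketing when the money was actually spent:

```python
# The arithmetic behind the $4500-6500/kW-net band (CPI multipliers are
# assumptions bracketing when the construction spend actually happened).
cost_nominal_musd = 577
capacity_mw = 858
capacity_factor = 0.80

per_kw_nominal = cost_nominal_musd * 1e6 / (capacity_mw * 1e3)   # ~$672/kW as spent

for label, cpi_multiplier in [("late-spend case", 5.5), ("early-spend case", 7.7)]:
    per_kw_2022 = per_kw_nominal * cpi_multiplier
    per_kw_net = per_kw_2022 / capacity_factor    # adjust for the 80% capacity factor
    print(f"{label}: ~${per_kw_net:,.0f}/kW net in 2022 dollars")
```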
US nuclear costs, and always has cost, around $10-12/W using modern accounting terminology, with a few outliers and a few early plants before the negative learning kicked in. You just taught me this by having me cursorily examine your lie. Thank you.
Edit: actually it might have gone down a bit after TMI, as there might be different accounting on later reports.
So $1.7 billion per GW. This is an exceptionally good price for a system of generation that is geographically independent, is non-intermittent, and is energy dense (and so does not have to involve long transmission lines moving electricity from solar fields and wind farms to cities).
The US averages ~500 GW of electricity generation, 25% of which already comes from nuclear or hydro. At a cost of $1.7 billion per GW the remaining 375 GW could be replaced with nuclear for just under $640 billion.
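Spelled out (same figures as above):

```python
# The fleet-replacement arithmetic from the paragraph above.
avg_generation_gw = 500
already_clean_share = 0.25            # existing nuclear + hydro
cost_per_gw_busd = 1.7

remaining_gw = avg_generation_gw * (1 - already_clean_share)   # 375 GW
total_busd = remaining_gw * cost_per_gw_busd
print(f"{remaining_gw:.0f} GW x ${cost_per_gw_busd}B/GW = ~${total_busd:.0f}B")
```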
> If your downstream product is a byproduct of a military project that was built for a different purpose then you cannot claim it includes costs. If the US military needed a supply of worn out giant bearings, provided wind turbine designs that cost trillions for free, and was selling turbine blades and nacelles at low prices it would also be a subsidy.
So solar panels' cost has to include all the military and communications satellites that pioneered solar panel tech? Most renewable systems also use electronic computers to some degree. This technology was originally pioneered for military encryption and firing computers. You could apply this kind of broken logic to anything. Military and civilian reactor designs are vastly different: military reactors are usually mobile, use highly enriched uranium, and are relatively small.
> If your hypothesis about construction booms was true, then the price minimum would be either reactors started in 1982 when construction was at its peak
Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.
> or if you want to claim TMI as a boogeyman, then reactors finished just before it.
Yes, this is exactly what's happened! Do you not see this big cluster of cheap plants built before 3 mile island and then plants got a lot more expensive afterward? Do you see how when the color shifts to dark brown they get a lot more expensive? The plants built just before 3 mile island were some of the cheapest forms of decarbonized energy we have ever deployed.
> Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.
>So $1.7 billion per GW. This is an exceptionally good price for a system of generation that is geographically independent, is non-intermittent, and is energy dense (and so does not have to involve long transmission lines moving electricity from solar fields and wind farms to cities).
You're really stretching here. That's a single pilot plant in an industry with a massive negative learning rate without necessary safety features which is the all-time outlier. I had to go out of my way to find it, and it is not the same metric as you're judging renewables on. You've picked the single ripest possible cherry. It was also the first turnkey plant, so it being cheap directly contradicts your hypothesis.
> So solar panels' cost has to include all the military and communications satellites that pioneered solar panel tech? Most renewable systems also use electronic computers to some degree. This technology was originally pioneered for military encryption and firing computers. You could apply this kind of broken logic to anything. Military and civilian reactor designs are vastly different: military reactors are usually mobile, use highly enriched uranium, and are relatively small.
If the PV on Jim Doe's roof was required to power the satellite, and the government sold the polysilicon and sent experts to Jinko to help design the manufacturing facility, and provided the funding then yeah.
> Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.
You appear to be struggling with the difference between start and finish. The largest capacity of plants ever finished in the US was in '82. The Arkansas plant I picked as an example was the last one finished before TMI and was wholly consistent with $6/W (or higher including cost of finance) and a negative learning rate since Palisades.
> Do you not see this big cluster of cheap plants built before 3 mile island and then plants got a lot more expensive afterward? I'll draw this in MS paint to make it easier for you: https://i.imgur.com/VD34Zhi.jpeg
I've pointed out a primary source which contradicts the numbers that graph is based on and posited a causal mechanism for the disparity. Refute the primary source, demonstrate that my understanding of their use of the term 'nominal dollars' is wrong, or find another primary source (or the primary source the paper uses).
> You're really stretching here. That's a single pilot plant in an industry with a massive negative learning rate without necessary safety features which is the all-time outlier. I had to go out of my way to find it, and it is not the same metric as you're judging renewables on.
If I were cherry picking I could pick even cheaper plants. Zion 1 and 2 were built for less, as were Oconee 1 and 2.
> You appear to be struggling with the difference between start and finish. The largest capacity of plants ever finished in the US was '82.
Most of which were delayed after the 3 mile island incident, and correspondingly experienced greater costs. Sure, if you want to get pedantic, the number of plants under construction at any one time peaked just after three mile island. But that's because so many plants were delayed, and this led to higher costs.
> I've pointed out a primary source which contradicts the numbers that graph is based on and posited a causal mechanism for the disparity. Refute the primary source, demonstrate that my understanding of their use of the term 'nominal dollars' is wrong, or find another primary source (or the primary source the paper uses).
I'm looking over the OSTI report and calculating the inflation adjusted numbers line by line. They match the costs listed in my source. It doesn't look like there's anything to refute: both of our sources show that nuclear plants built during the nuclear boom were some of the cheapest forms of decarbonized energy there is.
I don't have anything to refute, because your source agrees with my point. Your own source's data reinforces the claim that nuclear plants built during the nuclear boom (plants started after 1965 and built before three mile island) were often delivered at between 1 and 2 billion dollars (2010-adjusted) per GW of capacity, and some for even less than 1 billion.
Zion 1 and 2 were $276 million each at 58% CF, or between $3 and $4.2 per net watt. Better than the last plant to open before TMI, which supports a negative learning rate.
And again. This doesn't include safety retrofits, and it doesn't include O&M which is higher than new renewables.
Even after retrofit, it was destroyed due to a design and management failure in 1998.
All of those early plants are more expensive than you are saying, and they had state-controlled funding. They were inefficient, and they were unsafe when they opened.
Additionally they all had abysmal capacity factors in the 70s and 80s, around the 50-60% range so using lifetime CF is incredibly biased towards making them look good.
If we're counting capacity factor, then the cost of solar and wind increases by ~4x since they have capacity factors of ~25%, which is a lot less than nuclear's typical ~90% capacity factor [1]. Oconee's capacity factor is 81% over its life and 97% in a typical year. It's actually the opposite: focusing on lifetime capacity factor makes most nuclear plants look worse than in a typical year.
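To make the capacity-factor scaling explicit (the nameplate cost here is a placeholder, not a quoted price):

```python
# Making the capacity-factor adjustment explicit: cost per *average delivered*
# watt is nameplate cost divided by capacity factor. The $1/W nameplate figure
# is a placeholder, not a quote.
def per_average_watt(nameplate_cost_per_w: float, capacity_factor: float) -> float:
    return nameplate_cost_per_w / capacity_factor

print(per_average_watt(1.0, 0.25))   # solar/wind at ~25% CF -> 4.0x nameplate
print(per_average_watt(1.0, 0.90))   # nuclear at ~90% CF    -> ~1.1x nameplate
```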
For all their supposed lack of safety, nuclear power - including these early and supposedly unsafe designs - is safer than most renewables [2]. There's an immense double standard between renewable safety (nobody seems to care about the tens of thousands of people killed by dams) and nuclear power.
Lifetime capacities up to TMI are fair for a proposal to build what was built before TMI. Including reliability improvements deployed over cumulative decades of downtime at costs of billions per reactor isn't comparing the thing that was purchased before TMI.
Of course renewables should be capacity weighted. No one is saying they shouldn't. Capacity-weighted new solar in Germany is about $3.80/W and new onshore wind is about $3/W. These are both dropping 10-20% YoY. New 4-hour battery is around $2/W. The up-front cost is about the same, but the operating costs of NPPs exceed what many wind and solar projects are able to bid. Even if we assume the unrealistically short construction times of the 70s for a new Gen III+ reactor, the extra 6 years of operation will have the solar park half paid off by the time it opens.
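For reference, this is how those line items get combined into the two numbers that show up later in the thread (the 50/50 solar/wind split is an assumption of mine):

```python
# How the line items above get combined later in the thread: the ~$3.40/W
# "renewable blend" and the ~$9/W sum quoted in the reply. The 50/50
# solar/wind split is an assumption.
solar_per_w = 3.80     # capacity-weighted new solar, Germany
wind_per_w = 3.00      # capacity-weighted new onshore wind, Germany
battery_per_w = 2.00   # 4-hour battery

blend_per_w = 0.5 * solar_per_w + 0.5 * wind_per_w        # ~$3.40/W
sum_of_items = solar_per_w + wind_per_w + battery_per_w   # ~$8.80/W

print(f"50/50 capacity-weighted blend: ${blend_per_w:.2f}/W")
print(f"Sum of all three line items:   ${sum_of_items:.2f}/W")
```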
Those early designs were safe enough to mostly keep operating thanks to the exorbitantly expensive upgrades. This is an engineering feat, and a testament to the care and excellence of the US NRC, but it came at a cost which you are trying to pretend does not need paying. Gen III+ reactors are far more complex and so cost more on top of the additional costs incurred by not operating in the unique environment of the 60s.
Combined, these sources make for $9-10 per watt. Furthermore, they have lifespans far shorter than nuclear plants, meaning they'll have to be replaced more frequently. By comparison, your own source found that nuclear was built for $2-3 per watt during the nuclear boom. Again: your own sources contradict you.
You're cherry picking the data I cherry picked to help you, again. P919 says the average cost was $589/kW in 1983 dollars with $120/kW of non-TMI retrofits, and costs rose with time rather than going down. In the hypothetical where this is in some way related to something that could happen now, this is $3600/kW vs a renewable blend in Germany of $3400/kW. If we take the last few from each manufacturer that opened before 1979 and don't add retrofit costs it's about the same. Your argument about positive learning rates doesn't fit the data even slightly.
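Roughly how that $3,600/kW figure falls out (the CPI multiplier is my assumption; the 58% capacity factor is the pre-upgrade number used elsewhere in this thread):

```python
# Rough reconstruction of the $3,600/kW-net figure (the CPI multiplier is an
# assumption; the 58% capacity factor is the pre-upgrade figure cited elsewhere
# in the thread).
avg_cost_1983_per_kw = 589        # average overnight cost, 1983 dollars
retrofit_1983_per_kw = 120        # non-TMI reliability/safety retrofits, 1983 dollars
cpi_1983_to_2022 = 2.95           # assumed CPI multiplier
capacity_factor = 0.58

per_kw_2022 = (avg_cost_1983_per_kw + retrofit_1983_per_kw) * cpi_1983_to_2022
per_kw_net = per_kw_2022 / capacity_factor
print(f"~${per_kw_net:,.0f}/kW net in 2022 dollars")    # ~3,600
```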
Your absolute best argument, if I shuffle the goalposts all the way along for you and ignore the guaranteed money, the abandoned plants, the shutdowns that occurred under a decade after opening (all of which were paid for on the public dime), the military and government involvement, and the lack of liability, is that undoing 37 years of safety and efficiency improvements and reproducing reactor designs with similar capacity to wind and a much higher correlated forced outage rate than a renewable blend sans storage will allow you to come in at only 7% over the cost, and only 4-6 years later?
Then even after all that, operating it for two decades will cost more than the total cost of the renewable system.
All this in a country with mediocre wind and worse solar resource than Alberta, Canada. This is your argument?
Whatever "TMI retrofits" which you keep referring to (yet never actually backing it up with a source) are likely not necessary: 3 mile island's secondary containment worked and prevented any significant amount of radiation release.
The estimates you're giving for renewables exclude the cost of storage, or use fanciful figures of 4 hours' worth of storage, and also exclude the costs of transmission and load shifting.
> In the hypothetical where this is in some way related to something that could happen now, this is $3600/kW vs a renewable blend in Germany of $3400/kW. If we take the last few from each manufacturer that opened before 1979 and don't add retrofit costs it's about the same.
No, it doesn't. It comes out to $1600/kW. Average capacity factor of nuclear power is over 90%, not the 50% you claimed earlier. And again, your "renewable blend" omits the cost of storage, which will be immense if we're even able to build storage at the scale required at all.
The retrofits are reliability and safety upgrades excluding those that happened as a result of TMI. P920 in the Phung paper I linked. This adds about $120/kW or $200/kW net in 1983 dollars.
You don't get to use the price excluding 40 years of reliability and safety upgrades since TMI, in one of the strictest nuclear regulatory regimes with tens of billions of tax money spent on the public share of enforcement, and the performance including those upgrades. A Ford Pinto isn't a 2022 Lamborghini.
The prices are costs pre-TMI. The 58% is the lifetime capacity factor pre-TMI. If you want to use 92%, then find and source the cost of retrofits and interruptions between 1979 and 2022, as well as the cost of replacing all the plants that closed early and the cost of abandoned plants.
At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages. The renewable blend isn't as reliable as the modern fleet, but the 1979 fleet needs more storage, more backup, and more transmission to distribute the overprovision to where it is needed.
If you don't want to prevent more TMI incidents by adding all the stuff that happened after, you're also going to have to throw in a billion every 20 years or so to pay for cleanup and replacing the lost generation capacity.
Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom. This alone is enough to disprove your assertion that the cost difference is due to building in smaller numbers.
So, it adds less than $200 million per GW of capacity. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's a lot harder and more expensive to do a retrofit than it is to incorporate these changes in the initial construction.
And again, Three Mile Island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.
> At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages.
Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.
> Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom.
And again, you skew the timelines to fit your narrative. The nuclear boom started in 1965, and saw large price decreases relative to plants starting construction before then. That's the price drop brought about by the nuclear boom. Reactors continued to be cheap until Three Mile Island occurred and reactors that had started construction in the mid 70s had to deal with a whole ton of nuclear obstructionism. That's why prices increased among reactors that started construction in the mid 1970s and later. Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.
> So, it adds less than $200 million per GW of capacity. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's a lot harder and more expensive to do a retrofit than it is to incorporate these changes in the initial construction.
$200/kW in 1983 dollars is about $600/kW in 2022. So why are you claiming that it's the reason the plants finished in the 80s and 90s were more expensive? A much cheaper version of a quarter of an insignificant amount can't double the price. There must be a different explanation, like a negative learning rate.
> And again, Three Mile Island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.
It still wiped out the plant and cost billions to clean up. And these are all changes related to the countless minor incidents like fires and pipe bursts *before* TMI. The changes due to TMI before 1985 were not counted and were smaller. 40% of these modifications were initiated by the utilities to improve reliability and weren't even due to regulation. This was the cost of improving the early, extremely simple reactors to the same safety and reliability standards as TMI.
> Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.
Not the machines you are saying to build. If you want to build a machine with a third of the material, a quarter of the regulation, a sixth of the labour, many fewer redundancies, and less QA? You get one that performs like it. More regulation, more expensive designs, and another 40 years of upgrades improved the reliability.
The nuclear industry learnt by doing, and what they learnt is that if you half-ass things your reactor catches fire or starts leaking and needs repairs, like Browns Ferry or Rancho Seco or any of the N4 reactors in France, which were built and operated at similarly slapdash rates.
> Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.
In the primary source from the DOE, the prices increase. Same with the Phung paper. The retrofit costs I included are precisely and only the ones unrelated to TMI. Excluding a small fraction of FOAK plants, they went up in price the entire time for plants that opened from 1969 to 1979. Every year they got bigger and more complex and added more redundancies, because that was what was needed to keep them running more than half the time.
Every year the very first commercial plants were running, more problems were found that needed monkey patching and required adding complexity to subsequent and in progress designs. This necessarily can't have started before the first plant of each manufacturer was running.
The price per gigawatt dropped precipitously in plants with construction starting in 1965, and remained flat until Three Mile Island. They got bigger and more complex, but produced more power, and the cost per watt was the same. Only after Three Mile Island did costs start to balloon. Costs were almost entirely under $2 billion per GW until Three Mile Island. Your sources do not contradict this; I have checked.
The source you keep citing parts of, or citing directly, shows costs increasing 23% year on year after the opening of the first large commercial reactor. This increase then slowed after TMI according to that paper. The only cost decreases were demo reactors and small turnkey reactors, most of which were shut down not very long after. The countries where costs did not escalate all had far worse reliability than the US program after the 80s. You get what you pay for.
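For scale, here's what that growth rate compounds to over the window being discussed (purely illustrative):

```python
# What a 23%/yr escalation compounds to over the 1968-1979 window discussed here.
rate = 0.23
years = 1979 - 1968
multiplier = (1 + rate) ** years
print(f"~{multiplier:.1f}x cost escalation over {years} years at 23%/yr")
```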
In 2022 dollars the overnight cost for reactors coming online just prior to TMI is over $3000/kW for capacity factors that didn't exceed 58% until many more upgrades and repairs had been made over the course of years or decades.
Here is a simple model in the Arctic for a renewable mix of 100MW with higher capacity than those plants, at an all-in cost that is lower than the overnight cost in your fantasy scenario. It uses a capacity factor about 2% lower than the median for new wind and a solar panel angle that is off-optimal by 20 degrees for the latitude.
You're mixing up construction start dates and end dates. It's the color that distinguishes which plants were completed before and after Three Mile Island. The red is before. The brown is after.
Costs exploded after Three Mile Island; your idea that cost increases slowed afterward is the complete opposite of what happened. Can you really not see how the brown data points shoot up?
> for capacity factors that didn't exceed 58% until many more upgrades and repairs had been made over the course of years or decades.
I looked into your claim that earlier plants had lower capacity factors. This is true for demonstration plants in the 50s and early 60s, but not the 800+ MW production facilities that were built during the nuclear boom.
> You're mixing up construction start dates and end dates. It's the color that distinguishes which plants were completed before and after Three Mile Island. The red is before. The brown is after.
I am and always have been talking only about the bright red dots. I'm indulging your fantasy and demonstrating that the result of it is the opposite of what you claim. Yet again, the first large commercial reactor that didn't shut down after a handful of years was San Onofre, opening in 1968. This is the start point.
Every year in which nuclear reactors were being operated commercially, costs increased due to the discovery of all the ways it's really hard. Reactors which were under construction after 1968 and completed before 1979 were more expensive every year, at a rate of 23%, as per your own source. Your red cluster is an increasing function of time. The growth rate (as in the ratio of costs from year to year) actually slowed after TMI. This is the conclusion of the paper this image is from.
The top end of the line represents reactors started just before TMI, the reactors you claim are the cheapest of all time. Adjusted to 2022 dollars, they cost over $3000/kW.
92% is the capacity factor for US plants after decades of operation, including the costs incurred by TMI and Chernobyl and Fukushima. It also includes survivorship bias, as the reactors which were destroyed or were too unreliable and shut down early are excluded. World average EAF according to the IAEA is 79%. You want a cut-rate nuclear program, you get the performance of a cut-rate nuclear program. France and South Korea are barely better than the early US program. Japan was much worse. The only outlier with a large sample where reports even approach reality is China, and the prices China reports for every major project are a tiny fraction of what anyone else reports.
The Phung paper has a list of plants and capacity factors up until 1985. The average is 58% and many are far below. Every reactor that wasn't cancelled or closed and didn't have an accident which destroyed an integral part of it has had substantial upgrades and repairs since then.
If we were being honest, then of the 16 reactors finished between 1976 and 1979, we'd include the price of the 3 that failed or were closed decades early in the accounting, as well as their cleanup and decommissioning prices, but I'm letting you have that one along with all the other unaccounted subsidies.
Heating is trivially dispatchable over 24 hours. Put some sand, brick or water between what you want to heat and the heat source. This method has been used for thousands of years.