
Don't even bother explaining that it is the result of an explicit political decision aiming to force the nuclear power industry to subsidize renewables from its profits. In the end, regardless of your efforts, people will use the losses nuclear incurs to subsidize renewables as proof that nuclear is uneconomical, and that renewables beat it handily and are the way to go.



Wonder how much of this is related to the fact that France's energy ministers have been MBA and political science graduates for quite some time. The previous one was Hollande's partner, so it's just another kind of nepotism [1].

Macron selling off France's nuclear infrastructure probably doesn't help either.

Does anyone here have historical knowledge of Europe's glory days? Were there more actual scientists and engineers in key positions in Government in the 60s and 70s?

For example, in Germany Helmut Schmidt (Chancellor 1974-82) had a plan for the future in which Germany was to build out a fiber optic grid, which his successor Kohl scrapped because he did not like the influence of the public TV services and wanted copper for cable TV to counter it [2].

[1] https://en.wikipedia.org/wiki/Ségolène_Royal

[2] https://netzpolitik.org/2018/danke-helmut-kohl-kabelfernsehe...


Valery Giscard d'Estaing graduated from Polytechnique (best engineering school in France).

He was a key driver of things like nuclear power, high-speed trains, the Minitel, etc.


> Does anyone here have historical knowledge of Europe's glory days?

You mean the Renaissance? Since then it's been 400 years of brain drain to the US.


You mean the US has competitive advantages because its products don't carry social costs the way European ones do?


I think it is simpler than that.

The US gained a lot of highly qualified immigration around WWII, when Europe tore itself into shreds. Poles, Italians, Russians, Germans, you name it.

And it has been hard to disrupt the advantage of places like California ever since. Once you have top universities and top corporations somewhere, individuals will flock to them instead of trying to create competing hubs elsewhere. Plus, the dominance of the English language all but guarantees that English-speaking countries will be the net beneficiaries of this global movement.

For all their advantages, Germany, Japan et al. still struggle with their parochialism when attracting foreign talent, while the US does this really, really well. Take the entire roster of top IT people in the US and make a checkmark next to every immigrant or child of immigrants. Similar lists in Munich, Paris, Tokyo etc. would look very different. Most European countries struggle with the fact that recent immigrants tend to be overrepresented in prisons.


Probably not, once Europeans realize that solar panel manufacturing is dependent on China.


> people will use the losses nuclear incurs to subsidize renewables as proof that nuclear is uneconomical, and that renewables beat it handily

There is already plenty of proof around the world that renewables are cheaper than nuclear.


Maybe cheaper as of today, if we don't account for storage. But building renewables uses far more materials than nuclear, so if the fossil fuels that ensure cheap production and transport become scarce, or if base material extraction can't keep up with rising demand, I am not sure it would still be the case.


If nuclear used fewer raw materials, nuke plants wouldn't be huge - huge - capital projects, wouldn't take years to come online, and wouldn't have huge cleanup costs.

If the money spent on nukes had been spent on renewables and on developing storage we wouldn't have these problems.

This was predictable decades ago.

The reality is that nukes are a political solution to a political problem. It's nice that they sometimes generate energy for a while, but there is no sense in which they've ever been a rational economic choice.


Nuclear uses roughly an order of magnitude fewer resources than wind, without factoring in storage: https://imgur.com/a/Kc2h21O.

Source: https://www.energy.gov/sites/prod/files/2017/03/f34/quadrenn... (page 390)


You can't use a report from 2015, citing data from 2010, as indicative of a technology that has dropped in cost 10-fold and nearly doubled its capacity factor since then.

Additionally the overwhelming majority of that material is foundation and tower. Both of which can be reused by replacing the nacelle.


The Vestas V112-3MW from 2009 has a 42% capacity factor. I don't know of any modern wind turbine that has a capacity factor of 80%.

https://www.thewindpower.net/turbine_en_413_vestas_v112-3000...

https://en.m.wikipedia.org/wiki/Humber_Gateway_Wind_Farm


And this makes the point: a 2009-model turbine is already a huge improvement over a retrospective on plants that had been running for some time by 2010.

https://www.vestas.com/content/dam/vestas-com/global/en/sust...

A V112 has 40% lower material use per TWh (or ~30% for the lower-wind configuration) than your report, without considering modular foundations, which would drop it by another factor of 3, or reusing the foundation and tower at least once, which would drop it by another factor of 2.

On the other hand, you're not considering uranium mining or enrichment facilities or waste storage, or that the turbine's metals are all immediately recyclable, rather than half of them being LLW or SLIW.


Average capacity factor is basically an irrelevant metric. What is more important is the lowest seasonal capacity factor, since seasonal power storage doesn’t make sense. For the UK this is 25% for offshore wind: https://onlinelibrary.wiley.com/cms/asset/bd2bb73a-ef25-4c4b...

This means we need to overbuild wind by >200% to have the necessary power year round.
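
A rough back-of-envelope of what that overbuild means, as a minimal sketch with my own assumed numbers (not the commenter's exact method):

    # Sketch: sizing wind to its worst season (numbers assumed for illustration).
    demand_gw = 1.0        # steady demand to be met year round
    winter_cf = 0.25       # lowest seasonal capacity factor cited above
    nameplate_gw = demand_gw / winter_cf          # capacity needed in the worst season
    overbuild_pct = (nameplate_gw - demand_gw) / demand_gw * 100
    print(nameplate_gw, overbuild_pct)            # 4.0 GW nameplate, 300% overbuild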

Solar doesn't work, since it is basically useless at northern latitudes during winter, and it needs a lot of battery storage to cover the night.

There is also a week in February where there is no wind, which will require green methane production and gas peakers (edit: https://en.m.wikipedia.org/wiki/Dunkelflaute)

Nuclear can work year round with load following using a design like the Natrium reactor. This design also uses a lot less concrete than traditional NPPs.


Nice train you've got the goalposts on there.

The wind and solar capacity factors are anti-correlated.

There are areas in France where the December average is >2.6 kWh/d. At 54 kg/kWp, a modern panel on a lightweight racking system works out to about 1,800 t/TWh, and it's almost all glass. You can build the same net wattage now for half the price, then use the money you save to replace the panels in 30 years (or 50, if you take real-world degradation rather than predicted). Having to recycle some glass once in order to decarbonize now rather than in 20 years is a reasonable tradeoff for halving the costs. The gas plants are dirt cheap, and that much solar would easily power enough electrolysis to fill your February gap during the 11 months of the year when it produces far more. The electrolysers will be needed in either case for ammonia, shipping, and SAF.
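
To make the ~1,800 t/TWh arithmetic explicit, a sketch: the per-kWp reading of the December yield and the 30-year life are my assumptions about what's being computed.

    # Material intensity of solar from the figures above (assumptions flagged).
    kg_per_kwp = 54.0        # panel + lightweight racking, as stated
    kwh_per_kwp_day = 2.6    # assumed to be the December yield per kWp in those areas
    years = 30               # assumed service life before the one replacement
    lifetime_mwh = kwh_per_kwp_day * 365 * years / 1000
    print(kg_per_kwp / lifetime_mwh * 1000)   # kg/MWh * 1000 = t/TWh, ~1900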

If we're invoking technology that doesn't exist and costs several times more, just use something that does exist and costs several times more like a CSP plant and an HVDC cable.

Additionally for the vast majority of the world which is in transmission range of somewhere arid, CSP is strictly cheaper than the easy part of a Natrium reactor.


Nuclear plants use very few raw materials relative to the amount of power they produce. When built at scale, nuclear plants have been delivered at prices around $1-2 billion per GW of capacity.

The cheapest form of carbon-free energy really depends on what the objective is: small reductions in a mostly fossil-fuel grid, or total replacement of fossil fuels? Renewables are great for the former: you can throw up some solar panels or wind turbines and reduce a chunk of fossil fuel use. But once you try to start delivering significant portions of the energy grid through intermittent sources, the surplus energy starts to get wasted and the effectiveness drops.


Nuclear is roughly equal to wind on a modular foundation if you account for the fact that the tower and foundation outlast the nacelle. The "$2 billion per GW" nuclear reactors were all built by state-run agencies with opaque budgets, and in France's, Japan's, and South Korea's cases have all proved wildly unreliable, in addition to carrying opaque public subsidies on top of the very large visible subsidies in the supply chain and in finance. If you think it's possible to match the prices China reports for megaprojects, I'd like to see any example of a project in the global north with auditable accounting matching their figures in hydro, or highways, or rail, or ports, or basically anything.

In mediocre to good areas, with something like the PEG racking system, solar already uses about the same raw material as nuclear, and it's almost all sand. By the time a new nuke came online this will be far less.

Both are recyclable. 12 hour storage adds negligible mass and can easily cover daily variation.

Intermittent power without storage can easily feed dispatchable loads like EV charging, chemical feedstock and heat production. These vastly exceed non-dispatchable electricity and can be used for virtual seasonal storage.

There are only a small handful of areas best served by nuclear, and most of them have hydro or nuclear already.

There's a narrow niche where nuclear is optimal:

Grid electricity between 50% and 80% penetration in the 50% of areas where hybrid CSP + e-fuel backup isn't better. This niche is rapidly shrinking and could easily be gone by the time one is built. More carbon can be removed faster and with fewer resources by throwing renewables at the other 10 or so TW of fossil fuels currently being burnt. Until those resources are committed, new nuclear just delays things.


> Nuclear is roughly equal to wind on a modular foundation if you account for the fact that the tower and foundation outlast the nacelle.

Except intermittent sources also need storage. They also need long distance transmission lines to bring power from remote areas of generation to places of demand (whereas you can just place nuclear plants next to areas of demand).

This is a common pattern in renewables discussion: laser focus on generation and ignoring the fact that wind and solar have storage and transmission requirements that other energy sources don't have.

> The "$2/GW" nuclear reactors were all built by state run agencies with opaque budgets

Nope, do more research. These were all built in the US with public cost history.

https://en.m.wikipedia.org/wiki/Peach_Bottom_Nuclear_Generat...

https://en.m.wikipedia.org/wiki/Browns_Ferry_Nuclear_Plant

https://en.m.wikipedia.org/wiki/Byron_Nuclear_Generating_Sta...

https://en.m.wikipedia.org/wiki/McGuire_Nuclear_Station

https://en.m.wikipedia.org/wiki/Donald_C._Cook_Nuclear_Plant

https://en.m.wikipedia.org/wiki/Indian_Point_Energy_Center

https://en.m.wikipedia.org/wiki/Turkey_Point_Nuclear_Generat...

https://en.m.wikipedia.org/wiki/Arkansas_Nuclear_One

https://en.m.wikipedia.org/wiki/St._Lucie_Nuclear_Power_Plan...

https://en.m.wikipedia.org/wiki/North_Anna_Nuclear_Generatin...

> 12 hour storage adds negligible mass and can easily cover daily variation

12 hours of storage for the world is 30,000 GWh. This is 70-100 times the annual global battery production output. "Negligible mass" is going to have to see a hundredfold increase in some extraction industries. By comparison, nuclear already produces 20% of the US's electricity and about a tenth of the world's electricity. A tenfold increase is much more manageable than a hundredfold increase.
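
The arithmetic behind those multiples, as I read it (the battery production figure is my assumption, and it is disputed in a reply below):

    # 12 hours of world storage vs. annual battery production (illustrative).
    world_gwh_per_day = 60_000                  # figure used later in this thread
    storage_gwh = world_gwh_per_day * 12 / 24   # 30,000 GWh
    annual_battery_gwh = 350                    # assumed ~300-430 GWh/yr => the 70-100x
    print(storage_gwh, storage_gwh / annual_battery_gwh)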

> Intermittent power without storage can easily feed dispatchable loads like EV charging, chemical feedstock and heat production. These vastly exceed non-dispatchable electricity and can be used for virtual seasonal storage.

If you're going to tell chemical industries and metallurgy plants that they'll have to cease production for part of the year when renewables are producing lower than average output, then that has to be factored into your costs. If the price of steel and ammonia goes up because they can't run their plants as usual, then that cost is ultimately borne by consumers. You can't just use load shifting as part of the plan and ignore the costs of load shifting. "Virtual seasonal storage" amounts to "tell industries to shut off during winter". And no, heat production is not dispatchable unless you're okay with people freezing to death.


> Except intermittent sources also need storage. They also need long distance transmission lines to bring power from remote areas of generation to places of demand

You are correct. They also need storage, just as nuclear does, for non-dispatchable loads in areas without good hydro or CSP resources.

For the remainder, your battery production figures are off by at least a factor of two. China delivered 280 GWh in H1 2022, at the peak of a market crunch, in an industry that is growing at 50% YoY. There's no compelling reason to think the 5 TWh/yr of factories under construction won't be completed on time, as the renewable industry has been consistently over-delivering for a decade.
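
Annualizing the China figure shows why the multiple shrinks; a sketch using only the numbers in this exchange:

    # If H1 2022 deliveries were 280 GWh, the annualized rate is:
    h1_gwh = 280
    annual_gwh = h1_gwh * 2                 # 560 GWh/yr, before the 50% YoY growth
    print(30_000 / annual_gwh)              # ~54x, not the 70-100x claimed above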

Your scaling for nuclear is new capacity, which is around 5 GW/yr right now. It would have to increase tenfold to match the last year of new renewable generation, or fifteenfold to match the new capacity-weighted installation.

> (whereas you can just place nuclear plants next to areas of demand).

Incorrect. Seismic activity, ground, water, temperature, security and many other concerns limit siting severely.

> If the price of steel and ammonia goes up because they can't run their plants as usual, then that cost is ultimately borne by consumers. You can't just use load shifting as part of the plan and ignore the costs of load shifting. "Virtual seasonal storage" amounts to "tell industries to shut off during winter". And no, heat production is not non-dispatchable unless you're okay with people freezing to death.

Hydrogen or ammonia continue to exist after you make them. Simply overprovision your $300/kW electrolyser slightly and use chemical energy as your buffer. This has the added advantage of being an emergency or low CF backup at minimal extra cost.


> They also need storage like nuclear for non-dispatchable loads in areas without good hydro or CSP resource.

Really? Show me all the storage facilities France built when they had >80% of their electricity coming from nuclear power. It'll be challenging for you to do so, since no such storage facilities were built. Nuclear power can be modulated. Plants try not to do this, since they want to run at 100% as much as they can to make money, but there is no storage requirement for nuclear power.

> Incorrect. Seismic activity, ground, water, temperature, security and many other concerns limit siting severely.

All of which have been solved already. Seismically active areas have nuclear plants both in the US and around the world. Water is a non-issue, since places with large energy use tend to be cities, which are populated by humans, who also need water. Nuclear plants can also use wastewater or seawater (like the Palo Verde plant); it doesn't have to be potable water.

> Hydrogen or ammonia continue to exist after you make them. Simply overprovision your $300/kW electrolyser slightly and use chemical energy as your buffer. This has the added advantage of being an emergency or low CF backup at minimal extra cost.

Then show me the price history of commercial ammonia grid storage operators. Well, first you'll have to wait for such a facility to come online because none are operational. Proponents of intermittent sources keep hoping that some silver bullet will make storage nearly-free, since it's the only way to make wind and solar viable as primary sources of energy. But thus far, no silver bullet has come and it's unclear if it ever will.

Unlike nuclear, which has historical precedent of being built at scale and cheaply. If we had kept building nuclear plants at the same rate as we did in the 60s and 70s, we'd have a completely decarbonized grid by now. We have no such historical precedent for building grid storage.


> Really? Show me all the storage facilities France built when they have >80% of their electricity coming from nuclear power?

It's called Europe. To make their nuclear less unaffordable they use other countries as seasonal and diurnal storage.

> Unlike nuclear, which has historical precedent of being built at scale and cheaply

Where? Show a single privately run Gen III or later commercial plant funded without government enforced monopoly, free loans, or direct funding that comes in at an affordable price.

> Unlike nuclear, which has historical precedent of being built at scale and cheaply. If we had kept building nuclear plants at the same rate as we did in the 60s and 70s, we'd have a completely decarbonized grid by now. We have no such historical precedent for building grid storage.

No, there'd be no viable Uranium in the ground after about 1980.


> Nope, do more research. These were all built in the US with public cost history.

On top of not including the cost of finance, liability, or the upstream supply chain, which was provided by state military projects. The first one you linked had repeated safety violations and maintenance issues, and there is no record of how much they cost to remedy. The quoted price leads here:

https://www.osti.gov/biblio/6259203

And it is in nominal dollars, not 1986 dollars. $763 million in 1966 dollars is $7 billion, not $3 billion. Add in the cost of finance and you get $10-14 billion, or around $7/watt, for an unsafe, inefficient plant with corrupt management. And this was not a greenfield site; it already had work done for unit 1.
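
The nominal-to-real adjustment being argued, sketched with an approximate CPI multiplier (my estimate; spend spread over the construction years would shift it somewhat):

    # $763M in ~1966 dollars expressed in recent dollars (CPI factor assumed).
    nominal_musd = 763
    cpi_multiplier = 9.2          # rough 1966 -> 2022 CPI factor, an assumption
    print(nominal_musd * cpi_multiplier / 1000)   # ~7.0, i.e. ~$7 billion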

Do more research.

Every pro nuclear claim turns out to be a lie when examined even with the slightest scrutiny. All of them.


Your article studies plants built after the nuclear boom, which of course leads to higher prices. See the cluster of plants built cheaply starting in the mid 60s [1]? That's the nuclear boom. Your article covers plants still in construction at the end of 1986, which is when the boom tapered off following three mile island. Deliberately or not, you're pulling a sleight of hand here by shifting the time frame. But in the end, this helps reinforce my point: nuclear is expensive when built in small numbers, as your study demonstrates, and cheaper when built at scale, as the study I'll link below explains.

"Finance liability" is a fancy word for debt: this has nothing to do with construction costs, and everything to do with financing models. You're right, nuclear would be even cheaper with better financing. The upstream supply chain is accounted for in the downstream purchase costs. This is like saying wind turbine costs don't include the cost of mining copper for the dynamos. That cost is included when the wind turbine manufacturer pays for copper coils.

Research on nuclear's cost history overwhelmingly finds that costs are lower when built at scale: https://www.sciencedirect.com/science/article/pii/S030142151...

1. https://ars.els-cdn.com/content/image/1-s2.0-S03014215163001...

> Every pro nuclear claim turns out to be a lie when examined even with the slightest scrutiny. All of them.

Well, you sound like you're engaging with this topic in a well-adjusted and unbiased manner!


The costs you quoted were from a source linked on wiki talking about the first reactor on your list.

My source is the DOE primary source behind that one.

It includes every reactor built before 1986, including Palisades, which your article lists at $650/kWe but which it has down as $118 million, or $1300/kWe in 2022 dollars, excluding retrofits for subsequent safety standards. Accounting for capacity factor, that's $1700/kW net for an inefficient and unsafe design built before major safety standards were written.

If your downstream product is a byproduct of a military project that was built for a different purpose, then you cannot claim its price includes all costs. If the US military needed a supply of worn-out giant bearings, provided wind turbine designs that cost trillions for free, and was selling turbine blades and nacelles at low prices, that would also be a subsidy.

Whatever your opinion on finance, it is included in renewable projects which are fully privately funded.

If your hypothesis about construction booms were true, then the price minimum would be either among reactors started in 1982, when construction was at its peak, or, if you want to claim TMI as a boogeyman, among reactors finished just before it.

More likely it is:

a) an artefact of whatever part of your chain of references didn't catch the bit in the DOE report that says it is in nominal dollars, which depresses the prices of early reactors by a factor of two, and

b) the fact that nuclear has a strong causal mechanism for a negative learning rate. Each new reactor teaches you new things that can go wrong, which you then have to retrofit to old reactors. This only appears in the sticker price of reactors still under construction.

Taking Arkansas One unit 2: it was $577m for 858 MW at 80% CF.

Inflation was quite substantial, so we can't answer without knowing at what rate the money was disbursed and at what interest during construction, but it is between $4500 and $6500 per kW net. Right in line with new reactors once the cost of finance is included.
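
A rough reconstruction of that $4500-$6500/kW-net range; the inflation multipliers are my assumptions, standing in for the unknown disbursement profile:

    # Arkansas One unit 2: $577M nominal, 858 MW, 80% capacity factor.
    cost_usd = 577e6
    net_kw = 858e3 * 0.80                    # net of capacity factor
    for infl in (5.5, 8.0):                  # assumed nominal -> 2022 multipliers
        print(round(cost_usd * infl / net_kw))   # ~4600 and ~6700 $/kW net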

US nuclear costs, and always has cost, around $10-12/W using modern accounting terminology, with a few outliers and a few early plants before the negative learning kicked in. You just taught me this by having me cursorily examine your lie. Thank you.

Edit: actually it might have gone down a bit after TMI, as there might be different accounting in later reports.


> With capacity factor that's $1700/kW net

So $1.7 billion per GW. This is an exceptionally good price for a system of generation that is geographically independent, is non-intermittent, and is energy dense (and so does not have to involve long transmission lines moving electricity from solar fields and wind farms to cities).

The US averages ~500 GW of electricity generation, 25% of which already comes from nuclear or hydro. At a cost of $1.7 billion per GW, the remaining 375 GW could be replaced with nuclear for just under $640 billion.
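
The arithmetic, spelled out as a sketch of the claim as stated:

    # Replacing the rest of US average generation at the quoted build price.
    us_avg_gw = 500
    remaining_gw = us_avg_gw * 0.75      # 25% is already nuclear or hydro
    print(remaining_gw, remaining_gw * 1.7)   # 375 GW, ~$637.5 billion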

> If your downstream product is a byproduct of a military project that was built for a different purpose then you cannot claim it includes costs. If the US military needed a supply of worn out giant bearings, provided wind turbine designs that cost trillions for free, and was selling turbine blades and nacelles at low prices it would also be a subsidy.

So solar panels' cost has to include all the military and communications satellites that pioneered solar panel tech? Most renewable systems also use electronic computers to some degree. This technology was originally pioneered for military encryption and firing computers. You could apply this kind of broken logic to anything. Military and civilian reactor designs are vastly different: the former are usually mobile, use highly enriched uranium, and are relatively small.

> If your hypothesis about construction booms was true, then the price minimum would be either reactors started in 1982 when construction was at its peak

Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.

> or if you want to claim TMI as a boogeyman, then reactors finished just before it.

Yes, this is exactly what happened! Do you not see the big cluster of cheap plants built before 3 mile island, and how plants got a lot more expensive afterward? Do you see how, when the color shifts to dark brown, they get a lot more expensive? The plants built just before 3 mile island were some of the cheapest forms of decarbonized energy we have ever deployed.

I'll draw this in MS paint to make it easier for you: https://i.imgur.com/VD34Zhi.jpeg


> Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.

https://www.worldnuclearreport.org/reactors.html#tab=iso;reg...

>So $1.7 billion per GW. This is an exceptionally good price for a system of generation that is geographically independent, is non-intermittent, and is energy dense (and so does not have to involve long transmission lines moving electricity from solar fields and wind farms to cities).

You're really stretching here. That's a single pilot plant, without necessary safety features, in an industry with a massive negative learning rate; it is the all-time outlier. I had to go out of my way to find it, and it is not the same metric as you're judging renewables on. You've picked the single ripest possible cherry. It was also the first turnkey plant, so it being cheap directly contradicts your hypothesis.

> So solar panels' cost has to include all the military and communications satellites that pioneered solar panel tech? Most renewable systems also use electronic computers to some degree. This technology was originally pioneered for military encryption and firing computers. You could apply this kind of broken logic to anything. Military and civilian reactor designs are vastly different: the latter are usually mobile, use highly enriched uranium, and are relatively small.

If the PV on Jim Doe's roof was required to power the satellite, and the government sold the polysilicon and sent experts to Jinko to help design the manufacturing facility, and provided the funding, then yeah.

> Except the construction wasn't at its peak in 1982. Construction was at its peak during the early 1970s, and lurched to a halt after 3 mile island and nuclear panic took hold.

https://www.worldnuclearreport.org/reactors.html#tab=iso;reg...

You appear to be struggling with the difference between start and finish. The largest capacity of plants ever finished in the US was in '82. The Arkansas plant I picked as an example was the last one finished before TMI, and it was wholly consistent with $6/W (or higher including the cost of finance) and a negative learning rate since Palisades.

> Do you not see this big cluster of cheap plants built before 3 mile island and then plants got a lot more expensive afterward? I'll draw this in MS paint to make it easier for you: https://i.imgur.com/VD34Zhi.jpeg

I've pointed out a primary source which contradicts the numbers that graph is based on and posited a causal mechanism for the disparity. Refute the primary source, demonstrate that my understanding of their use of the term 'nominal dollars' is wrong, or find another primary source (or the primary source the paper uses).


> You're really stretching here. That's a single pilot plant in an industry with a massive negative learning rate without necessary safety features which is the all-time outlier. I had to go out of my way to find it, and it is not the same metric as you're judging renewables on.

If I were cherry picking I could pick even cheaper plants. Zion 1 and 2 were built for less, as were Oconee 1 and 2.

> You appear to be struggling with the difference between start and finish. The largest capacity of plants ever finished in the US was '82.

Most of which were delayed after the 3 mile island incident, and correspondingly experienced greater costs. Sure, if you want to get pedantic, the number of plants under construction at any one time peaked just after three mile island. But that's because so many plants were delayed, and this led to higher costs.

> I've pointed out a primary source which contradicts the numbers that graph is based on and posited a causal mechanism for the disparity. Refute the primary source, demonstrate that my understanding of their use of the term 'nominal dollars' is wrong, or find another primary source (or the primary source the paper uses).

I'm looking over the OSTI report and calculating the inflation-adjusted numbers line by line. They match the costs listed in my source. It doesn't look like there's anything to refute: both of our sources show that nuclear plants built during the nuclear boom were some of the cheapest forms of decarbonized energy there are.

I don't have anything to refute, because your source agrees with my point. Your own source's data reinforces the claim that nuclear plants built during the nuclear boom (started after 1965 and completed before three mile island) were often delivered for between $1 and $2 billion (2010-adjusted) per GW of capacity, and some for even less than $1 billion.


Zion 1 and 2 were $276 million each at 58% CF, or between $3 and $4.20 per net watt. Better than the last plant to open before TMI, which supports a negative learning rate.
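
A sketch of how that per-net-watt range falls out; the unit size and the inflation multipliers are my assumptions:

    # Zion 1/2: $276M nominal each, assumed ~1,040 MW each, 58% lifetime CF.
    cost_usd = 276e6
    net_w = 1040e6 * 0.58
    for infl in (6.5, 9.0):                  # assumed nominal -> 2022 multipliers
        print(round(cost_usd * infl / net_w, 2))   # ~2.97 and ~4.12 $/W net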

And again. This doesn't include safety retrofits, and it doesn't include O&M which is higher than new renewables.

Even after retrofit, it was destroyed due to a design and management failure in 1998.

All of those early plants are more expensive than you are saying, and they had state-controlled funding. They were inefficient, and they were unsafe when they opened.

Additionally, they all had abysmal capacity factors in the 70s and 80s, in the 50-60% range, so using lifetime CF is heavily biased towards making them look good.

The cost of retrofits, which was almost entirely unrelated to TMI, was about 40c/watt (https://www.sciencedirect.com/science/article/abs/pii/036054...), or about 80c per net watt, just for the retrofit to meet 1980s standards.

Include all the failed reactors, and stop looking at just the lowest cost ones, include the cost of the free loans, and you're back up around $6/W


If we're counting capacity factor, then the cost of solar and wind increases by ~4x, since they have capacity factors of ~25%, which is a lot less than nuclear's typical ~90% capacity factor [1]. Oconee's capacity factor is 81% over its life and 97% in a typical year. It's actually the opposite: focusing on lifetime capacity makes most nuclear plants look worse than a typical year does.
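
For concreteness, the capacity-factor adjustment being applied here, using the representative numbers from this comment:

    # Cost multiplier from capacity factor alone.
    nuclear_cf, vre_cf = 0.90, 0.25
    print(nuclear_cf / vre_cf)   # 3.6, i.e. the "~4x" above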

For all their supposed lack of safety, nuclear power - including these early and supposedly unsafe designs - is safer than most renewables [2]. There's an immense double standard between renewable safety (nobody seems to care about the tens of thousands of people killed by dams) and nuclear power.

1. https://www.energy.gov/ne/articles/what-generation-capacity#....

2. https://www.statista.com/statistics/494425/death-rate-worldw...


Lifetime capacity factors up to TMI are fair for a proposal to build what was built before TMI. Including reliability improvements deployed over cumulative decades of downtime, at costs of billions per reactor, isn't comparing against the thing that was actually purchased before TMI.

Of course renewables should be capacity weighted. No one is saying they shouldn't. Capacity-weighted new solar in Germany is about $3.80/W, and new onshore wind is about $3/W. Both are dropping 10-20% YoY. New 4-hour battery is around $2/W. The up-front cost is about the same, but the operating costs of an NPP exceed what many wind and solar projects are able to bid. Even if we assume the unrealistically short construction times of the 70s for a new Gen III+ reactor, the extra 6 years of operation will have the solar park half paid off by the time the reactor opens.

Those early designs were safe enough to mostly keep operating thanks to the exorbitantly expensive upgrades. This is an engineering feat, and a testament to the care and excellence of the US NRC, but it came at a cost which you are trying to pretend does not need paying. Gen III+ reactors are far more complex and so cost more on top of the additional costs incurred by not operating in the unique environment of the 60s.


> Capacity weighted new solar in germany is about $3.80/W or new onshore wind is about $3/W.

This alone is more expensive than nuclear power built during the nuclear boom.

> New 4 hour battery is around $2/W.

So 12 hours of battery, which is a minimum estimate of what we'll need, is $6/W. Also, this price is rising: https://www.utilitydive.com/news/battery-prices-to-rise-for-...

Combined, these sources come to $9-10 per watt. Furthermore, they have life spans far shorter than nuclear power's, meaning they'll have to be replaced more frequently. By comparison, your own source found that nuclear was built for $2-3 per watt during the nuclear boom. Again: your own sources contradict you.
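
Adding up the figures in this exchange, a sketch using the parent's own numbers:

    # Solar plus 12 hours of battery at the quoted unit prices.
    solar_per_w = 3.80                    # capacity-weighted German solar, per parent
    battery_per_w = 2.0 * (12 / 4)        # three 4-hour blocks at $2/W each
    print(solar_per_w + battery_per_w)    # ~$9.8/W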


You're cherry-picking the data I cherry-picked to help you, again. P919 says the average cost was $589/kW in 1983 dollars, with $120/kW of non-TMI retrofits, and that costs rose with time rather than going down. In the hypothetical where this is in some way related to something that could happen now, this is $3600/kW vs. a renewable blend in Germany at $3400/kW. If we take the last few reactors from each manufacturer that opened before 1979 and don't add retrofit costs, it's about the same. Your argument about positive learning rates doesn't fit the data even slightly.
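
One plausible reconstruction of the $3600/kW figure (my guess at the steps; the CPI multiplier and the use of the 58% pre-TMI capacity factor are assumptions):

    # 1983 overnight cost plus non-TMI retrofits, inflated and taken net of CF.
    overnight_1983 = 589 + 120            # $/kW, the p919-920 figures above
    cpi_1983_to_2022 = 3.0                # approximate multiplier, an assumption
    pre_tmi_cf = 0.58
    print(round(overnight_1983 * cpi_1983_to_2022 / pre_tmi_cf))   # ~3667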

Your absolute best argument, if I shuffle the goalposts all the way along for you and ignore the guaranteed money, the abandoned plants, the shutdowns that occurred under a decade after opening (all of which were paid for on the public dime), the military and government involvement, and the lack of liability, is that undoing 37 years of safety and efficiency improvements and reproducing reactor designs with a similar capacity factor to wind and a much higher correlated forced-outage rate than a renewable blend sans storage will let you come in at only 7% over the cost, and only 4-6 years later?

Then even after all that, operating it for two decades will cost more than the total cost of the renewable system.

All this in a country with mediocre wind and worse solar resource than Alberta, Canada. This is your argument?


Whatever "TMI retrofits" which you keep referring to (yet never actually backing it up with a source) are likely not necessary: 3 mile island's secondary containment worked and prevented any significant amount of radiation release.

The estimates you're giving for renewables exclude the cost of storage, or use fanciful figures of 4 hours' worth, and also exclude the costs of transmission and load shifting.

> In the hypothetical where this is in some way related to somethint that could happen now this is $3600/kW vs a renewable blend in germany of $3400/kW. If we take the last few from each manufacturer that opened before 1979 and don't add retrofit costs it's about the same.

No, it doesn't. It comes out to $1600/kW. Average capacity factor of nuclear power is over 90%, not the 50% you claimed earlier. And again, your "renewable blend" omits the cost of storage, which will be immense if we're even able to build storage at the scale required at all.


The retrofits are reliability and safety upgrades excluding those that happened as a result of TMI; see p920 in the Phung paper I linked. They add about $120/kW, or $200/kW net, in 1983 dollars.

You don't get to use the price that excludes 40 years of reliability and safety upgrades since TMI (in one of the strictest nuclear regulatory regimes, with tens of billions of tax money spent on the public share of enforcement) together with the performance that includes those upgrades. A Ford Pinto isn't a 2022 Lamborghini.

The prices are costs pre-TMI. The 58% is the lifetime capacity factor pre-TMI. If you want to use 92%, then find and source the cost of retrofits and interruptions between 1979 and 2022, as well as the cost of replacing all the plants that closed early and the cost of abandoned plants.

At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages. The renewable blend isn't as reliable as the modern fleet, but the 1979 fleet needs more storage, more backup, and more transmission to distribute the overprovision to where it is needed.

If you don't want to prevent more TMI incidents by adding all the stuff that happened after, you're also going to have to throw in a billion every 20 years or so to pay for cleanup and replacing the lost generation capacity.

Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom. This alone is enough to disprove your assertion that the cost difference is due to building fewer plants.


> $200/kW net in 1983 dollars.

So, it adds less than $200 million per GW. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's a lot harder and more expensive to do a retrofit than it is to incorporate these changes in the initial construction.

And again, three mile island was contained. Secondary containment worked. If these retrofits are expensive, they were evidently not necessary.

> At 58% capacity factor with ~20% forced outage rates you are going to have many, long, correlated outages.

Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.

> Also keep in mind that on the list of reactor prices from 1968 to 1979, the prices went up the entire time. Your learning rate is negative even in the nuclear boom.

And again, you skew the timelines to fit your narrative. The nuclear boom started in 1965, and saw large price decreases relative to plants that started construction before then. Those are the price drops brought about by the nuclear boom. Reactors continued to be cheap until three mile island occurred and reactors that had started construction in the mid 70s had to deal with a whole ton of nuclear obstructionism. That's why prices increase among reactors that started construction in the mid 1970s and later. Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily puts those in a different color, but this is apparently not obvious enough for you.


> So, it adds less than 200 million dollars per GW of storage. This is not large. It's also misleading to portray retrofitting plants as an expense that new nuclear builds would require: it's a lot harder and more expensive to do a retrofit than it is to incorporate these changes in the initial construction.

$200/kW in 1983 is $600/kW in 2022. So why are you claiming that it's the reason the plants finished in the 80s and 90s were more expensive? A much cheaper version of a quarter of an insignificant amount can't double the price. There must be a different explanation, like a negative learning rate.

> And again, three mile island was contained. Secondary storage worked. If these retrofits are expensive, they were evidently not necessary.

It still wiped out the plant and cost billions to clean up. And these are all changes related to the countless minor incidents like fires and pipe bursts *before* TMI. The changes due to TMI before 1985 were not counted and were smaller. 40% of these modifications were initiated by the utilities to improve reliability and weren't even due to regulation. This was the cost of improving the early, extremely simple reactors to the same safety and reliability standards as TMI.

> Good thing nuclear power averages a capacity factor of over 90%! By comparison what's the capacity factor for solar and wind? 25% depending on geography.

Not the machines you are proposing to build. If you build a machine with a third of the material, a quarter of the regulation, a sixth of the labour, many fewer redundancies, and less QA, you get one that performs like it. More regulation, more expensive designs, and another 40 years of upgrades are what improved the reliability.

The nuclear industry learnt by doing, and what they learnt is that if you half-ass things, your reactor catches fire or starts leaking and needs repairs, like Browns Ferry or Rancho Seco or any of the N4 reactors in France, which were built and operated at similarly slapdash rates.

> Restrict the query to reactors completed before three mile island and there's no increase in cost. The chart I posted handily put those in a different color, but this is apparently not obvious enough for you.

In the primary source from the DOE, the prices increase. Same with the Phung paper. The retrofit costs I included are precisely and only the ones unrelated to TMI. Excluding a small fraction of FOAK plants, they went up in price the entire time for plants that opened from 1969 to 1979. Every year they got bigger and more complex and added more redundancies, because that was what was needed to keep them running more than half the time.

Every year the very first commercial plants were running, more problems were found that needed monkey-patching and required adding complexity to subsequent and in-progress designs. This necessarily can't have started before the first plant of each manufacturer was running.


Heating is trivially dispatchable over 24 hours. Put some sand, brick, or water between what you want to heat and the heat source. This method has been used for thousands of years.

Scale it up a bit and you can do it seasonally.
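
For scale, the daily version is just sensible heat storage; standard physics, with illustrative numbers of my choosing:

    # Heat stored in one tonne of water warmed by 40 C.
    mass_kg, c_p, delta_t = 1000, 4186, 40      # c_p of water in J/(kg*K)
    print(mass_kg * c_p * delta_t / 3.6e6)      # ~46.5 kWh, roughly a day of home heating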


Not if you factor in the cost of wanting to have a nuclear weapons and submarine fleet…


They don't compete. Nuclear provides base load.


They both provide non-dispatchable power. Renewables have a slight edge at moderate penetration with no storage because you can turn them off whenever you want without incurring massive costs and solar output is biased towards peak time.

Then there's hybrid PV-CSP which is available in about half of the world and is dispatchable. I guess you're probably right in that nuclear doesn't compete because hybrid CSP is vastly cheaper even in FOAK form and dispatchable power is superior.


Nuclear is dispatchable.


No, it isn't. Ramping is slow and can't be done beyond 20% very often, or you destroy your fuel and control rods.

Reducing output doesn't reduce costs, it increases them. This is the opposite of dispatchable.

If you can only pay for your reactor by coercing people into buying daytime electricity for 20c/kWh rather than buying a solar panel that will pay for itself in 3 years then it's not dispatchable.


You don't need to alter the thermal output of the reactor to modulate a nuclear plant's electrical output. You can more aggressively cool the reactor to reduce the energy delivered to the turbine. This isn't often done since it's essentially deliberately reducing the efficiency of the plant.

> Ramping is slow and can't be done beyond 20% very often or you destroy your fuel and control rods

20% is all that's necessary to accommodate most load variations: https://www.eia.gov/todayinenergy/detail.php?id=42915


That is also slow.

Dispatchable generation ramps in tens of seconds and you don't pay $90-200/MWh for it when you're not using it.


It's not slow: the turbine water can be cooled more aggressively immediately, and will start reducing output within one circuit of the generation turbine. Also, modulation only needs to vary 20-30% over the span of entire days, not tens of seconds. And no, dispatchable generation does not ramp in tens of seconds. Natural gas plants - the most popular form of peaking generation - still take an hour to activate. But this isn't an issue, because electricity use doesn't fluctuate by 20% in a matter of tens of seconds.


Tell that to the power station engineers who have to watch for the ad breaks in British TV.

And it's still not dispatchable if not using it costs you anyway.


> And it's still not dispatchable if not using it costs you anyway.

No? This is just plain wrong. A dispatchable source is a dispatchable source, regardless of any associated costs. And with nuclear there isn't even any direct cost to running the plant at reduced capacity. There's only the opportunity cost of lost electricity sales, which would happen anyway because there isn't enough demand.

If there's 100 GW of peak demand and 80 GW of minimum demand, building 100 GW of nuclear plants and reducing output during periods of low consumption does not involve any increase in costs.


If there's no one to sell your $150/MWh electricity to because they took one look at the price and put a solar panel on their roof, then you're not selling $150/MWh electricity; you're selling $500/MWh electricity for the 20% of power they must buy. Then, when they take a look at the new price, they go buy a battery. The only way to pay it off is a government-enforced utility connection fee for a product nobody wants.

The only way to sell it for $150/MWh is to underprovision or to build storage or to find dispatchable loads. Just like renewables.


None of this has anything to do with dispatchability. Nuclear power is indeed dispatchable, which is why you're pivoting to this strawman about pricing. If we had a primarily nuclear grid, there'd be no need for solar panels anyway.

> Then when they take a look at the new price, they go buy a battery

You're making the same error a lot of renewable activists do: assuming that household electricity use is all there is. How do you power the pumps that run our sewage and plumbing systems? How about our telecommunications systems? We'll just deal with cell phones shutting off after dark?

Energy storage requirements are staggering. The world uses 60,000 GWh of electricity every day. Storage requirements are at least 12 hours for diurnal storage, and several days for seasonal storage. Just going out and buying a hundred terawatt hours' worth of batteries is a lot easier said than done.
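
A scale check on those figures, as a sketch ("several days" is taken as two here):

    # Global storage sizes implied by the numbers in this comment.
    daily_gwh = 60_000
    print(daily_gwh / 2)          # 30,000 GWh for 12 hours (diurnal)
    print(daily_gwh * 2 / 1000)   # 120 TWh for two days (toward "seasonal")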


The mines and aluminium smelters and arc furnaces and polysilicon plants are all building their own renewables. They're not going to buy your daytime energy either when they can make their own DC power at $10-30/MWh. The industries which require hydrogen or derivatives will just make it on site and store a few weeks' worth. The industries that need heat or steam will store it in a lump of iron ore wrapped in some fire bricks and rockwool.

Then you might want to just stop and think about how you might go about storing energy if you have a pump and a reservoir on a hill or a water tower. Just ponder that one for a few seconds.


> They're not going to buy your daytime energy either when they can make their own DC power at $10-30/MWh

Unless it's night time. Or cloudy. Or during the winter when the incidence of the sun reduces solar output. Again, this is why any plan that involves cutting power to mines, smelters, etc. needs to factor in the costs of shutting down these industries when renewables fail to produce energy.

> Then you might want to just stop and think about how you might go about storing energy if you have a pump and a reservoir on a hill or a water tower. Just ponder that one for a few seconds.

Right, except we just have to have a lake on a hill handy. Some places have it. Most do not.

Why don't we just use hydroelectricity for all of our power needs? Ditch nuclear, and ditch solar and wind. Just build dams. Problem solved.


> Unless it's night time. Or cloudy. Or during the winter when the incidence of the sun reduces solar output. Again, this is why any plan that involves cutting power to mines, smelters, etc. needs to factor in the costs of shutting down these industries when renewables fail to produce energy.

So they'll buy your night time energy for the few hours a day when the wind farm they contracted with for less than your O&M costs isn't producing. Still doesn't help the nuclear operator pay the bills for the other 22 hours. Unless you're suggesting we ban people from supplying their own energy or making contracts with fully privately funded wind generators? Sounds pretty un-free to me.

> Why don't we just use hydroelectricity for all of our power needs? Ditch nuclear, and ditch solar and wind. Just build dams. Problem solved

You cited a need to store energy in order to move water from a reservoir to where it is needed. Storing the water you already needed to store, just raised up a little, is a fairly well understood problem.


Blocking water with a dam is a well understood problem. Don't bother with wind nor solar nor nuclear nor storage. Just build dams, problem solved.


> Blocking water with a dam is a well understood problem

I feel like we still have some problems with dam building, because they keep failing. We struggle to get the building material (in particular, sand). Concrete is pretty awful in terms of CO2. Dams of all sizes cause problematic changes to the rivers they're on, and block flows of fish and other animals. Smaller low head weirs and dams kill humans.

Lots of time, money, and effort is going into removing smaller dams and low head weirs.


No need to have a tantrum just because you couldn't think of a way to claim every joule needed weeks-long chemical battery storage.


We won't need any storage. We'll just get all of our electricity from dams. Since it's a well understood problem we can build them anywhere we want in any quantity.


Nah, wind and solar are cheaper and safer and don't take as long. We can use the existing dams for dispatchable power, though. As well as CSP, whose unsubsidized LCOE has recently hit parity with the O&M cost of nuclear and is plummeting. Throw in some thermal storage as well; that's safe.


Nope, hydroelectricity is about the same levelized cost of energy. Include storage costs and it's vastly cheaper: https://www.eia.gov/outlooks/aeo/pdf/electricity_generation....

Look at the countries that produce all or nearly all their electricity from renewables. It's dominated by hydroelectricity: https://en.wikipedia.org/wiki/List_of_countries_by_renewable...

And since it's a well understood problem we can just build it everywhere, in arbitrary quantities.


This is quite the tantrum to be having in response to being told that you don't need a nuclear reactor to pump water downhill. Was it really so earth shattering to your world view?

"The mines and aluminium smelters and arc furnaces and polysilicon plants are all building their own renewables."

They most definitely are NOT. Microsoft is building a gas turbine to power a data center in Ireland though, because data centers NEED POWER AT ALL TIMES!


Well done. Great comparison. An industry that needs five nines of uptime on its power supply, in a country with worse solar resource than much of the Arctic, is totally representative of an industry which only needs to keep interruptions below 4 hours, has costs dominated by electricity, and is adding solar to reduce its bills.

What an incredible insight.

Meanwhile in things related to what I said:

https://www.riotinto.com/news/stories/First-solar-plant

https://www.nsenergybusiness.com/projects/hywind-tampen-floa...

https://www.microgridknowledge.com/google-news-feed/article/...

https://www.nsenergybusiness.com/features/microgrids-mining-...

And countless others so frequent they don't make the news. It's an absolute no-brainer, because solar is about the same price at any scale, but fossil fuel microgeneration is really expensive.


Not when you take intermittency into account.





