
Interestingly, both of these nuclear plants (Vogtle 3 and Olkiluoto 3) were built by companies (Westinghouse and Areva) that went bankrupt during construction.

We sorely need a safe, cost-effective and reproducible blueprint for manufacturing nuclear infrastructure at scale.




Other countries manage to build nuclear plants on time and on budget. China and South Korea are managing it.

In the West, nuclear plants were more affordable when built at scale. It's not just the reactors that are costly; specialty parts like steam generators and turbines are cheaper to produce in runs of 40 instead of 4. It's not so much the blueprint that makes a plant cheap. It's building two dozen of the same blueprint.


What's the price ratio between the first PWRs in the US and Canada and the ones started in the late 80s/early 90s, after the industry had maximum experience?

We'll just multiply that same ratio by the price of Vogtle and get... what? $30k/kW?
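
Roughly the arithmetic being gestured at, as a sketch with placeholder numbers (none of these figures are sourced; the question above is asking what the real ratio is):

    # Back-of-envelope version of the extrapolation above. Every number here is
    # a placeholder, not sourced data.
    early_pwr_cost = 2_000   # $/kW, assumed cost of the first US/Canadian PWRs
    late_pwr_cost = 6_000    # $/kW, assumed cost of units started in the late 80s/early 90s
    vogtle_cost = 10_000     # $/kW, rough order of magnitude for Vogtle 3/4

    ratio = late_pwr_cost / early_pwr_cost   # cost growth as experience accumulated
    print(f"cost growth ratio: {ratio:.1f}x")
    print(f"extrapolated next round: ${vogtle_cost * ratio:,.0f}/kW")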


Areva can't really go bankrupt: it is backed by the French government, its main owner. It still exists and is basically a public company that, critics argue, is structured in order to dismantle the industrial companies that were still under public control. But nuclear energy is very unlikely to be truly privatized in France. That's a very touchy political subject.


Olkiluoto 3 definitely bankrupted Areva. The company ran out of money and the liability from that one project became a risk for all the other (healthy) branches. They restructured it into different companies, with only the problematic Olkiluoto 3 project staying in the original French state-owned Areva. Once this project is complete, Areva will be defunct.


> We sorely need a safe, cost-effective and reproducible blueprint for manufacturing nuclear infrastructure at scale.

Well, that was part of Areva's branding circa 2009: the Nespresso of nuclear, selling fuel and reactor in the same package. Full vertical integration: uranium mining, enrichment, reactor building, recycling.


And do you know what went wrong?


I don't have a comprehensive answer. All I know is that Areva bought 3 uranium deposits in Africa for ~€2.5 billion (plus ~€1 billion for additional facilities Areva built later, like a desalination plant), but the deposits were ultimately not exploitable (the cost of extraction was too high because the uranium concentration in the deposits was too low).

Areva used to be top of its field (mine prospecting, geological work) and was then discredited. It ended up being bought back by EDF (which is to say, bought back by the French state).

The company (Uramin) that Areva bought (to get the uranium deposits) seems to have lied about their potential. This was before Fukushima sent the price of uranium down, so Areva was expecting a large return on investment from the move.

The whole affair is riddled with corruption, insider knowledge, betrayal and incompetence at some key high-level ranks at Areva. Too much easy money, if you ask me; then someone (Uramin plus an insider?) wanted a bigger piece of the pie and the whole cake turned bad.

Edit: also, too much money (~€10 billion) invested in different fields ultimately led Areva to bankruptcy.


They sold reactors for 3 billion apiece, but it cost them 11 billion to build the first one because they didn't have, and couldn't find, the necessary competence.


What if it's not cost effective? Can we still do it for the environment?


We could, and would have to, if we didn't have other options that were equally friendly to the environment.

But given two options, one twice (or more) the cost of the other, and with the more expensive option being slower and less scalable, why choose the hard and expensive route versus the cheap and easy route?


The cheap and easy route is wind+solar+battery?

And what about fusion? We spend a lot on R&D, but it's pretty clear it will be even more expensive than fission. If you were in charge, would you cancel that effort entirely and shift the funds elsewhere?


Wind+solar+battery is here today, and the faster we deploy it, the more we will save. Every day that we delay the transition is another day that we are overpaying for energy.

If fusion can compete once it happens, bring it on. But it should be targeting a cost of $1-5/MWh instead of $50/MWh.


Most DT fusion efforts should be cancelled. ITER is an abomination, for example. It has no chance of leading to anything remotely attractive as a power plant.

Helion's approach might make engineering sense, but it's still a longshot. But that's ok for research.


Even Sabine would not do it.


Wait, when did we solve power storage? I must have missed the memo...


What was holding back power storage was that it's largely unneeded when we're still burning fossil fuels.

No technological breakthroughs are needed for arbitrarily scalable power storage with good efficiency.

https://aip.scitation.org/doi/10.1063/1.4994054

"Insofar as the numbers I have presented in this paper are correct, they demonstrate that energy storage is a problem of 19th century science. No future laboratory breakthroughs or discoveries are required for solving it. All that is needed is fine engineering and assiduous attention to detail. Said poetically, this is 21st century rocket science.

Moreover, it is clear from Fig. 11 that the storage capacity of months becomes feasible once the engine (including the heat exchangers) exists as a product one can purchase at a known cost, particularly if the heat is further transferred into cheaper media for longer-term storage, such as rocks underground. Thus, pumped thermal storage with heat exchange is not a niche solution to the energy storage problem but a global one. This is the reason I think it will prevail."


I'm disappointed that paper contains no cost estimates. Those are the key thing with energy storage.


The paper does contain (rough) cost estimates, specifically the cost per unit of energy storage (about $13/kWh) and the cost per unit of power (from $0.20-0.27/W depending on the choice of gas.) See section V, "Cost".

I'm not impressed with the costing methodology used, but it's probably at least in the ballpark.
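
For scale, plugging those per-unit numbers into an arbitrary 100 MW / 10-hour plant (the sizing here is invented; only the per-unit figures come from the paper):

    # Applying the paper's per-unit figures (section V) to an example plant.
    # The 100 MW / 10 h sizing is chosen purely for illustration.
    power_w = 100e6            # 100 MW discharge power
    energy_kwh = 100_000 * 10  # 10 hours of storage = 1,000,000 kWh

    energy_cost = energy_kwh * 13                  # ~$13/kWh of storage capacity
    power_cost = (power_w * 0.20, power_w * 0.27)  # $0.20-0.27/W of power capacity

    print(f"energy-related cost: ${energy_cost/1e6:.0f}M")
    print(f"power-related cost:  ${power_cost[0]/1e6:.0f}M-${power_cost[1]/1e6:.0f}M")
    print(f"total: ${(energy_cost + power_cost[0])/1e6:.0f}M-${(energy_cost + power_cost[1])/1e6:.0f}M")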


Here is a framework for government backed research into the topic: https://www.energy.gov/energy-storage-grand-challenge/articl...


In the past few years, lithium ion battery storage has plummeted in cost and is seeing massive deployment all around the grid.

Most utilities use five-year resource plans, and even then they tend to rely on out-of-date publications for cost guidance, which themselves took several years to be written and get through peer review.

So traditional utility deployment is done on 10-year-old info. In more open markets, like Texas, storage is a huge share of the capital being deployed on the grid. And in places with more active residents who force the utility commissions to force the utilities to use realistic numbers, like California, storage is already deployed in the GW range. For example, existing grid storage was a bigger contributor than nuclear during California's recent and massive heat waves.

And one dirty secret they don't tell you about nuclear: it is also going to need storage. Nuclear is not dispatchable: it can't be turned down on demand and it can't be ramped up. But real power demand varies a huge amount throughout the day.

The only reason France was able to get up to 70% nuclear energy on their grid was by using the continental grid to trade energy with other countries. France has a small number of super expensive nuclear "peakers" but they can only deal with very small fluctuations in demand.

So if nuclear were ever going to be a really major power source, or the only power source, it would require lots of storage to balance load.
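
A toy version of that balancing problem, with a completely made-up hourly demand curve (only the shape matters):

    # Toy illustration of flat supply vs. varying demand. The hourly profile
    # below is invented; only its shape matters.
    demand = [70, 65, 62, 60, 60, 63, 72, 85, 95, 100, 102, 103,
              102, 100, 98, 97, 100, 108, 110, 105, 98, 90, 82, 75]  # GW, hypothetical

    flat_supply = sum(demand) / len(demand)  # nuclear fleet sized to average demand
    stored = sum(max(flat_supply - d, 0) for d in demand)    # GWh absorbed off-peak
    released = sum(max(d - flat_supply, 0) for d in demand)  # GWh discharged at peak

    print(f"flat supply: {flat_supply:.1f} GW")
    print(f"daily shift through storage: ~{stored:.0f} GWh in, ~{released:.0f} GWh out")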


> In the past few years, lithium ion battery storage has plummeted in cost and is seeing massive deployment all around the grid.

I advise doing the maths on this. Look at graphs of how much solar and wind vary, check total electricity consumption, look up the latest price of Li-ion batteries, and then do a bit of maths to see how much you need so that you get no blackouts over a 10-year period. Then realize you should use compressed air storage instead...

Last I checked, if we use the cheapest form of storage (compressed air) and assume there are enough suitable caves for the huge amount we want, we'd triple electricity costs by switching to renewables+storage.
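
A crude sketch of that exercise, with every number invented for illustration (a real sizing needs hourly weather and demand data):

    # Sketch of the exercise suggested above. Every input is an assumption made
    # for illustration only.
    avg_demand_gw = 50           # assumed average grid demand
    lull_days = 5                # assumed worst multi-day wind+solar lull per decade
    battery_usd_per_kwh = 200    # assumed installed Li-ion cost
    lifetime_years = 15          # assumed storage asset life

    storage_kwh = avg_demand_gw * 1e6 * 24 * lull_days
    capex = storage_kwh * battery_usd_per_kwh
    annual_demand_kwh = avg_demand_gw * 1e6 * 8760

    # crude amortization, ignoring financing, cycle limits and round-trip losses
    added_usd_per_kwh = capex / lifetime_years / annual_demand_kwh
    print(f"storage needed: {storage_kwh/1e6:,.0f} GWh, capex ~${capex/1e9:,.0f}B")
    print(f"added cost spread over all demand: ~{added_usd_per_kwh*100:.0f} cents/kWh")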

> And one dirty secret that they don't tell you about nuclear: it's also going to need storage.

Not really, you just need to be able to burn excess power. Which is a very easy thing to do (you can spend as much as you want on turning atmospheric CO2 and water into methane).


I have done the math, and that's why I think that nuclear will only be a niche, and expensive, form of energy for a few countries that do not have good renewable resources.

As for CAES, if it can scale and be cheap, great. But there isn't nearly as much evidence of that for CAES as there is for batteries, which are being deployed by the GWh on the grid now, and which have massive plans for expansion in areas where the grid is market based and profit driven, instead of a regulated monopoly that can rest on its laurels.

I occasionally hear about liquid air too, and though everybody I have encountered that works on it is a bit nuts, I am more optimistic about liquid air than CAES for massive scale, as liquid air can be deployed many many places.


I've always wondered: is there any reason we can't task the Navy with building these? They already make reactors for aircraft carriers and subs.


They are extremely expensive in LCOE terms.

Their core life relies on highly enriched uranium. Production and delivery at commercial power scale is a weapons proliferation risk in addition to being more expensive.

Refueling a naval reactor is a multi-year operation that happens only a tiny handful of times in the life of the ship. Current-gen submarines don't get refueled at all. They can get away with this in large part because they aren't running at 100% power. They are shut down in port, and even at sea they only operate at low power most of the time.

To reduce LCOE, commercial power reactors run at full power all the time. Refueling a commercial power reactor takes a month or so and happens every 1.5 years.
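
To make the capacity-factor point concrete, a sketch with assumed numbers (the naval duty cycle in particular is a guess):

    # Why capacity factor matters so much: nearly all nuclear costs are fixed,
    # so cost per MWh scales roughly as 1/capacity_factor. Figures are assumed.
    annual_fixed_cost = 400e6   # $/yr, assumed capital recovery + O&M for a 1 GW plant
    capacity_mw = 1000

    def cost_per_mwh(capacity_factor):
        return annual_fixed_cost / (capacity_mw * 8760 * capacity_factor)

    # commercial: ~1 month refueling outage every 18 months -> capacity factor ~0.94
    print(f"commercial duty cycle: ~${cost_per_mwh(0.94):.0f}/MWh")
    # naval-style: shut down in port, low power at sea -> guess ~0.3
    print(f"naval duty cycle:      ~${cost_per_mwh(0.30):.0f}/MWh")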


The newest US nuclear aircraft carrier uses Bechtel A1B reactors [1], which produce 125 MWe plus an additional 260 MW of mechanical turbine power, which we could optimistically convert into another 260 MWe. That would be 385 MWe in total.

Each of the two reactors mentioned in the article produces 1250 MWe.

OTOH maybe a row of smaller reactors could offer a better economy of scale for production, even if they require more parts and more maintenance overall.

[1]: https://en.wikipedia.org/wiki/A1B_reactor
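
Putting the numbers above side by side (with the optimistic 1:1 conversion assumed there):

    # The arithmetic above, using the optimistic assumption that the mechanical
    # 260 MW could be converted 1:1 into electricity.
    a1b_mwe = 125 + 260          # optimistic electrical output per A1B, MWe
    article_unit_mwe = 1250      # each of the two reactors in the article
    print(f"A1B-class reactors per large unit: {article_unit_mwe / a1b_mwe:.1f}")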


> We sorely need a safe, cost-effective and reproducible blueprint for manufacturing nuclear infrastructure at scale

Maybe. Or maybe nuclear is actually just too expensive, and we need to stop pretending it's a commercially-viable technology.



