The elephant in the room with any kind of fission reactor is that they are going to need a lot of security. Even when the fuel and waste products are not weapons-grade uranium, they are still highly radioactive and a great source of material for a dirty bomb. Basically anything that goes boom, combined with small amounts of radioactive material and a bit of wind, is a great way to depopulate large cities. So, having lots of small molten salt reactors all over the planet, which seems like the key selling point, is not a great idea from that point of view. Having them in or near any kind of conflict zone would be super risky. And according to this article there is actually some weapons-grade uranium being produced as well, which makes this even more problematic.
The security aspect would make this a lot more expensive, and cost is already on the high side even before you consider that. This is a problem with traditional reactors as well, but they have the advantage that they are huge facilities and that there are only a handful of them, so securing them is relatively easy.
People are suggesting this as complementing renewables, but the cold hard truth is that renewables are already dirt cheap and on track to keep dropping in price by orders of magnitude over the next decades. That includes battery storage, which is already quite cheap, also dropping in price, and typically already factored into e.g. new solar bids that are killing competing bids for coal and gas plants (or in some cases shutting them down prematurely).
Even at current prices, that's a problem for any kind of nuclear solution being contemplated right now. At the low end of the spectrum, we are talking 2 cents per kWh currently. Imagine this dropping to something like half a cent or even less. At those prices, the security alone would probably make nuclear too expensive. A 100 MW facility would basically be generating only about $500-2000 worth of energy per hour, and only at peak demand. There's no guarantee prices won't drop way below that either. Any kind of operational overhead would be a problem; needing 24x7 intense security would be very undesirable.
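The per-hour arithmetic is easy to sketch. A minimal example, using the 100 MW plant size and 0.5-2 cent/kWh price range from the paragraph above as illustrative assumptions, not market data:

```python
# Gross hourly revenue for a plant running at full capacity.
# The 100 MW size and 0.5-2 cent/kWh prices are illustrative
# assumptions taken from the discussion, not market data.

def revenue_per_hour(capacity_mw, price_usd_per_kwh):
    """Revenue for one hour of generation at full capacity."""
    kwh_per_hour = capacity_mw * 1_000  # 1 MW for 1 hour = 1,000 kWh
    return kwh_per_hour * price_usd_per_kwh

print(round(revenue_per_hour(100, 0.005)))  # -> 500 ($/hour at 0.5 c/kWh)
print(round(revenue_per_hour(100, 0.02)))   # -> 2000 ($/hour at 2 c/kWh)
```

Any 24x7 security detail costing more than a few hundred dollars an hour would eat most of that revenue.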
Are you suggesting that variable renewables plus batteries will approach prices that are... too cheap to meter?
Looking at current VRE cost trends, at a point when variable sources contribute only about 4% of world energy and are surrounded by massive amounts of cheap but high-carbon natural gas, and assuming that those trends will just continue exponentially downward without serious complication, is optimistic.
The cost of integrating variable sources plus batteries is expected, in many studies and somewhat intuitively, to skyrocket as market penetration increases. When you have enough VRE to cover 100% of demand, including the big evening peak, with batteries during a clear summer day, the extra generation you build to fill the other gaps gets curtailed. But you have to fill daily and then seasonal gaps, worldwide, including heat in winter and worldwide transportation (not just electricity). A battery that is only needed at all every third day is effectively 3x the price, yet we prefer that the power not brown out with that frequency. This is difficult. Already we're seeing NIMBYism with large solar installations in California and with transmission lines. That gets worse with scale.
Nuclear today is a hedge against the possibility that deeply carbonizing with variable renewables + storage at world scale will be harder than we all think and hope it will be.
That said, nuclear certainly needs to drop capital and O&M cost dramatically if it wants to play the game. And security is indeed a big factor in this. Its competition in the small-footprint dispatchable world is natural gas with full CCS, which is also looking pretty cheap.
Yes. Not indefinitely of course but we've seen nothing yet and there's enough in the research pipeline to suggest clear trends over the next two decades in terms of cost improvements, efficiency improvements, cheaper materials, etc. That 4% you mention was less than 1% not so long ago and will be closer to 40% not too long from now. Such is the nature of exponentials.
IMHO a 5x cost reduction is basically a done deal even just applying simple economies of scale. You can argue whether this happens in five or ten years, but arguing it isn't happening seems futile. Beyond that, 10x is extremely likely to happen as well. There's a lot of research happening that would need to be a complete failure/waste of time for that not to happen. At worst it will happen slower than I might like. Maybe this happens over the next decade. Or two. Or three even. From there, 20x might be quite possible as well.
Beyond that we're indeed talking rapidly diminishing returns. 10x would mean 0.2 cents/kWh. 20x would cut that to about 0.1 cents. This indeed gets you into territory where metering becomes more expensive than it's worth. The average household uses about 15,000 kWh per year, or about $15-30 worth of energy at these prices; much less outside the US. Much of that is going to be produced on people's roofs or in their back yards. At that point it turns into a fixed cost.
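A quick sketch of that household arithmetic, using the 2 c/kWh baseline and 15,000 kWh/yr usage figures from this thread:

```python
# Annual household energy cost under successive price reductions.
# Baseline 2 c/kWh and 15,000 kWh/yr usage come from the discussion.

BASELINE_USD_PER_KWH = 0.02
ANNUAL_KWH = 15_000

def annual_cost_usd(reduction_factor):
    """Yearly cost if prices drop by the given factor from baseline."""
    return ANNUAL_KWH * BASELINE_USD_PER_KWH / reduction_factor

for factor in (1, 10, 20):
    print(f"{factor:>2}x cheaper: ${annual_cost_usd(factor):.0f}/year")
# 10x -> $30/year, 20x -> $15/year: barely worth metering.
```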
Historically the same sources producing the studies you are referencing have been off by magnitudes predicting current adoption and prices for both solar and wind. So, I'd consider that the pessimist glass half empty point of view. Not to dismiss it entirely but arguing cost increases seems a bit far fetched in light of current trends in the market.
Aren't roof solar panels very inefficient? They can't track the sun and you can't have good density from them either. Furthermore, they don't work for office buildings and apartment buildings.
>Not to dismiss it entirely but arguing cost increases seems a bit far fetched in light of current trends in the market.
I think what the parent was saying is that because solar and wind don't reliably produce energy, we will need redundancy. As the market share of solar and wind grows we will need to increase the amount of storage and power generation far beyond what we normally need to operate.
Let's say you have a city that uses 1 GWh of electricity per day. If you were to generate all the electricity with natural gas then you need to have 1 GWh plus some extra for redundancy to fulfill the needs of the city. If you generate 5% with solar and the rest with natural gas then nothing really changes. If the weather is bad then the natural gas plant will step in.
As the share of your solar power generation grows, you'll need more and more redundant storage and solar power generation. If you rely 100% on solar power generation and the weather can be bad for a week in a row over the city, then you'll need storage that will last for more than that entire week. Not only that, you'll also need extra power generation capability to be able to fill and maintain that storage.
Just to put this into perspective: the US uses about 10 TWh of electricity per day.
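A back-of-envelope sizing of that worst case for the 1 GWh/day city in the example; the week-long duration and the assumption that bad weather cuts solar to 20% of normal output are illustrative, not from a grid study:

```python
# Storage needed for a city to ride through a stretch of bad weather.
# All figures are illustrative assumptions, not a grid study.

daily_demand_gwh = 1.0        # the example city's consumption
bad_weather_days = 7          # a week of poor weather in a row
cloudy_output_fraction = 0.2  # assume heavy cloud cuts solar to 20%

shortfall_per_day = daily_demand_gwh * (1 - cloudy_output_fraction)
storage_needed_gwh = shortfall_per_day * bad_weather_days
print(round(storage_needed_gwh, 1))  # -> 5.6 (GWh)
# ...plus the extra generation capacity needed to refill it afterwards.
```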
They are good enough. Low efficiency just means we have to buy more of them. Obviously we'd need solar plants (and maybe wind parks and other clean sources) to also power those with limited access to roof surface.
Of course, bad weather doesn't mean solar stops working; it just reduces the output. The effect is typically very local as well. So, all that means is that you need a bit extra capacity to cover for that or import energy from somewhere else where the weather isn't miserable.
When (not if) the price for solar and wind drops 10x, you'll be able to buy 10x more than what you need and still beat coal/gas/nuclear on price. 10x is an insane safety margin that would mean you produce more than you need even on the coldest, darkest, most miserable day imaginable. And of course solar is not the only source of clean cheap energy. Wind would be another popular option.
Bad weather in northern Europe means that everything is covered in snow for a long time. I think that's slightly more than just reduced output. The snow also tends to cover an area that's hundreds of kilometers in every direction.
I've lived in Sweden and Finland. It's actually the lack of sunlight that is causing issues with solar. The snow is much less of a concern. E.g. Helsinki only gets a few hours of daylight and the sun does not get much above the horizon in the winter months. The reverse is true as well; you get insane amounts of sunlight in the summer months. Luckily there's wind, hydro, wood pellets, and a few other alternatives.
Further south, in Germany, the Netherlands, etc., solar is usable throughout the year and quite common. Obviously output in the summer is going to be much better than during the winter.
> Aren't roof solar panels very inefficient? They can't track the sun
We are well past the point where tracking the sun was desirable. The apparatus that allows solar panels to track the sun is now much more expensive, especially when you consider the maintenance costs, than just using more static panels.
Actually, most new utility scale PV in the US southwest is now using single axis tracking systems. The advantage here is that the power curve is flattened over the day, which both makes better use of the inverters and pushes more production into higher value time periods.
With dirt-cheap renewables it will be more efficient to generate methanol from solar or wind electricity and to use it for long-term storage and redundancy than to burn natural gas or coal. And unlike with coal or natural gas, with methanol one can use fuel cells to convert it to electricity, so one can have relatively small backup plants for really long stretches of bad weather.
This is very, very optimistic. While the prices of panels are going down, the rest of the cost isn't (land, permitting, installation labour, etc.), and panels are only a small proportion of the cost now.
There is also only so much more efficient the panels can get per square meter. Right now we are approaching 20% efficiency, so the idea that they are going to get 10x cheaper with complete ease seems to be pushing it.
Regardless, solar and wind have a lot of externalities that aren't really priced into these 2c/kWh PPA prices. If you need to build a fleet of natural gas plants that are only operational 10% of the time, and therefore cost 50c/kWh, to take up the slack from low solar days, it needs to be priced in somehow.
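The arithmetic behind a figure like 50c/kWh is simple: if a backup plant's costs are mostly fixed, its cost per kWh scales inversely with how often it runs. The 5 c/kWh full-utilization baseline here is an assumption for illustration:

```python
# Effective cost of a mostly-fixed-cost plant at low capacity factor.
# The 5 c/kWh full-utilization baseline is an illustrative assumption.

FULL_UTILIZATION_USD_PER_KWH = 0.05

def effective_cost_usd_per_kwh(capacity_factor):
    """Spread the (assumed all-fixed) costs over the kWh actually produced."""
    return FULL_UTILIZATION_USD_PER_KWH / capacity_factor

for cf in (1.0, 0.5, 0.1):
    print(f"capacity factor {cf:.0%}: "
          f"{effective_cost_usd_per_kwh(cf) * 100:.0f} c/kWh")
# At a 10% capacity factor the same plant works out to 50 c/kWh.
```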
All the costs have been falling, at various rates.
It's amusing to see "permitting costs" being presented as a barrier, when at the same time "get rid of regulations" is spouted as the panacea for reducing nuclear's costs.
> That 4% you mention was less than 1% not so long ago and will be closer to 40% not too long from now. Such is the nature of exponentials.
But such is not the nature of the logistic function, which is usually a more accurate model of how technologies penetrate. (yes, in the beginning it looks like an exponential)
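The difference is easy to see numerically. A sketch with made-up parameters: both curves start from the same 1% penetration with the same growth rate, but the logistic saturates at 100%:

```python
# Exponential vs. logistic growth from the same starting point.
# x0 (initial share), r (growth rate) and the 100% cap are illustrative.
import math

def exponential(t, x0=0.01, r=0.3):
    return x0 * math.exp(r * t)

def logistic(t, x0=0.01, r=0.3, cap=1.0):
    return cap / (1 + (cap / x0 - 1) * math.exp(-r * t))

for t in (0, 5, 10, 20, 30):
    print(t, round(exponential(t), 3), round(logistic(t), 3))
# Early on the two are nearly identical; by t = 20-30 the exponential
# has blown far past the logistic's 100% ceiling.
```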
There is a final cost decline "baked in" to the curve, though. That's because the lifespan assumed for solar in cost calculations is currently maybe 20 years; anything more would not make sense while costs are declining so fast. When the tech matures, the final step is extending the lifespan to 40-60 years.
You can't run an electrical grid on wind and solar alone and expect it to be stable enough.
- The prospect of every household having a battery backup big enough to smooth over spikes and dips is ludicrously wasteful. We are already struggling just mining enough to make batteries for cars and phones.
- the sheer scale of the wind and solar farms will have a major environmental impact on wildlife. Look at how massive an area solar farms need, and know that the area has to be cleared and left desolate of life.
Nope, simply untrue. Cars use lithium ion batteries mainly for their energy density. For grid storage, other batteries and ways of storing energy are feasible and probably a lot cheaper as well. Wikipedia has a nice overview: https://en.wikipedia.org/wiki/Grid_energy_storage#Grid-orien...
The wildlife impact of solar is relatively minor compared to burning coal, intensive farming, and all the other things we do. We'd need to cover only a relatively small amount of land with solar to produce what we need. And ironically some of the most suitable land for this would be deserts that have relatively little wildlife to begin with. Of course the impact of wiping out coal, gas, and petrol usage would be enormously positive.
Windmills do indeed kill some birds. Yet, these things seem to have barely any impact on e.g. the seagull populations in places like Denmark, which has a ridiculous number of windmills.
(Molten salt reactor is a fluid fuel reactor. Fluid: liquid, gas and plasma.)
Molten salt reactors are a stepping stone to fusion reactors. We need to master liquid and gaseous reactors before going plasma. Our civilization should aim for larger goals. We need some form of nuclear power to terraform other planets and spread earth's precious life and consciousness everywhere.
Well, given that solar has an inverse-square relation between watts per area and distance from the sun, by the time you get to Mars solar is getting pretty piddly, and at the asteroid belt and the Jovian moons it's downright comical.
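The fall-off is easy to quantify. A sketch using the measured solar constant at Earth (about 1361 W/m²) and approximate mean orbital distances:

```python
# Solar flux vs. distance from the Sun (inverse-square law).
SOLAR_CONSTANT_W_M2 = 1361.0  # measured flux at Earth's distance (1 AU)

def flux_w_m2(distance_au):
    return SOLAR_CONSTANT_W_M2 / distance_au ** 2

for body, au in [("Earth", 1.0), ("Mars", 1.52),
                 ("asteroid belt", 2.7), ("Jupiter", 5.2)]:
    print(f"{body}: {flux_w_m2(au):.0f} W/m^2")
# Mars gets ~43% of Earth's flux; Jupiter only ~4%.
```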
Actually, solar still has huge advantages at the distance of Mars.
In space (actual space, not a planetary surface), it's difficult for nuclear to dissipate its prodigious amounts of waste heat. Solar, in contrast, can be made much lighter than on Earth, as there isn't wind and rain to deal with. And of course in general the Sun is available 100% of the time in space, further improving the economics.
But in any case, the argument wasn't that solar was necessarily better, it was that nuclear wasn't necessary. Solar can provide energy basically anywhere, with power beaming, even out into interstellar space.
Planets rotate, so backup power is needed during nights. A spacecraft has to avoid obstacles and has to move, so backup is needed. During an interstellar journey there is a time delay in communication, so it may take hours to adjust the power beam when the spacecraft moves a little bit. Fusion is like solar energy, just nearer to us, and MSRs will eventually be replaced by fusion reactors.
> Planets rotate so backup power is needed during nights.
This does not rule out use of solar energy.
> During an interstellar journey there is a time delay in communication, so it may take hours to adjust the power beam when the spacecraft moves a little bit
The vehicle tracks the beam, not vice versa.
> Fusion is like solar energy, just nearer to us, and MSRs will eventually be replaced by fusion reactors.
Fusion is ridiculous when examined closely. It's like fission, only far worse from an engineering and cost perspective.
> Stefan–Boltzmann law: Radiation heat transfer is proportional to 4th power of temperature.
Which means waste heat from a reactor in space has to be radiated at higher temperature than for a reactor here on Earth, which means nuclear is worse off there than here. In contrast, solar works better in space, as I argued upthread.
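A sketch of that trade-off: the required radiator area for a given waste-heat load follows directly from the Stefan–Boltzmann law. The 100 MW heat load and 0.9 emissivity below are illustrative assumptions:

```python
# Radiator area needed to reject waste heat in space, from the
# Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(waste_heat_w, temp_k, emissivity=0.9):
    """Area radiating from one side into empty space (no incoming flux)."""
    return waste_heat_w / (emissivity * SIGMA * temp_k ** 4)

for t in (300, 600, 900):
    print(f"{t} K: {radiator_area_m2(100e6, t):,.0f} m^2")
# Doubling the radiator temperature cuts the required area 16-fold,
# which is why space reactors must reject heat at high temperature.
```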
You could say the same thing about any airplane, even small planes or drones could be filled with explosives and hit strategic targets like a chemical factory.
You can place the nuclear generators in police stations, military bases, other locations that are already protected, there is no need to have a small nuclear generator in each electronic device like in Fallout.
I thought the LFTR was the coolest thing ever. MSRs could have revolutionized power generation... about two decades ago.
Solar/Wind/Storage are beating almost everything. Everything they aren't... they will, very soon.
It's possible that MSRs could be scaled down and their liquid nature means that the reactor could simply be replaced on a schedule and the entire old reactor "reprocessed". But the investment won't be there with wind/solar/battery eating everyone's lunch.
And that's IF regulatory was simplified and IF you were able to get enough starting fuel to initiate breeding of the thorium.
Renewables have the problem that the larger the portion of your power that's generated through them, the more storage you need, and the growth is exponential rather than linear, because you can have weather that doesn't cooperate for long periods of time, but the power can't go out.
Another issue is that solar and wind aren't really suitable everywhere. The further north you go the less economical solar becomes. A large chunk of Europe sits further north than the US.
What’s happening right now is that renewables are being paired with an increasing number of gas power plants. Since gas power can ramp up so quickly they’re a good match until storage can take over.
So I think we already have the solution to the problem you’re describing: just pay to maintain a decent number of the existing gas power plants. If they’re only there for backup, the impact on CO2 emissions will be negligible. You could even make the gas from renewable sources, since in this scenario the cost of the gas itself is negligible compared to the cost of keeping the power plant operational.
This is incidentally another reason (current) nuclear is a dead end going forward. They’re a bad match for renewables. What we need isn’t more base load, we need more peaker plants, load following plants, backup plants and energy storage. Nuclear isn’t a good solution in any of these categories.
And if people complain about the peakers releasing CO2, they could be converted to use hydrogen. Hydrogen can be produced either from renewables + electrolysis, or from 24/7 plants reforming methane to hydrogen with the carbon sequestered and the hydrogen stored underground for later use.
Solar is getting so cheap that it's becoming economical in places that might surprise you.
The UK sits further north than almost all of the USA (excl. Alaska), and has a climate not exactly known for its abundant sunshine. But large-scale solar farms are now commercially viable here without subsidy. (eg: https://www.clevehillsolar.com)
Up here the yearly sunshine is pretty good, close to Germany or so. The problem is that almost all of it comes during the summer. In the winter months when demand peaks, solar produces practically nothing. So without seasonal storage becoming economical, it's not a solution to decarbonizing.
Closer to the equator where seasonal variation is less and demand is driven more by air conditioning than heating solar is an excellent choice.
> "The problem is that almost all of it comes during the summer. In the winter months when demand peaks, solar produces practically nothing."
In the UK, this turns out to be seasonally complementary with our wind turbines. In the winter months, wind speeds are stronger and more consistent (especially off shore). In the summer, there is less wind, but the gap is made up by solar. The net result is pretty consistent renewables production year-round.
> "it's not a solution to decarbonizing."
Every kWh of energy produced by a solar panel in the UK (or wind turbine, for that matter) offsets a kWh that would otherwise be produced from fossil fuels, typically from imported natural gas.
I have a strong suspicion that a good chunk of the opposition to solar is good old fashioned white racism. Can't have an energy source win that benefits the brown skinned parts of the world, this thinking goes.
> Solar/Wind/Storage are beating almost everything.
Does anyone have numbers, how fast (yearly added kWh per capita) has any country built solar+wind lately? Or plans to build in the near future? When Sweden built nuclear in 1976–1986, they added 640 kWh per capita per year. Will some country or state beat that record?
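For what it's worth, the per-capita metric itself is simple to compute. The Swedish population and generation figures below are rough assumptions chosen only to show the arithmetic behind a ~640 kWh/capita/yr ballpark, not sourced statistics:

```python
# Per-capita annual build rate: generation added per person per year.

def build_rate_kwh_per_capita_yr(added_twh_per_yr, build_years, population):
    """Average annual generation added per person over the build-out."""
    return added_twh_per_yr * 1e9 / build_years / population

# Assumption: ~53 TWh/yr of nuclear generation added over ten years
# (1976-1986) for a population of ~8.3 million.
print(round(build_rate_kwh_per_capita_yr(53, 10, 8.3e6)))  # -> 639
```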
The cost of generating energy is not the only cost when you need to distribute energy on a large scale. Electrical grids are prone to oscillations, and having fewer, bigger power sources makes things a lot easier and more reliable (which also means cheaper). You can store energy, but that again increases the complexity and cost of the grid (even super-cheap storage will always be more expensive than no storage).

To partially solve this you can concentrate solar into large solar farms, but then you have a problem with the space available; you need a lot of land for that, which is perhaps less of a problem in the US, Australia, or Africa, but in densely populated areas like Europe (which is also located further north and gets less sun) or Japan this creates problems.

There is also the issue that our climate is changing and becoming more and more unstable, which could cause lots of engineering and operational problems for renewables (hurricanes, hail storms, unexpected drops in temperature, and periods of bad weather like Europe saw this May). So I don't think it's (yet) all that certain which is the best way to go long-term, and IMHO the clear win of renewables is more a picture painted by the media than solid science.
Grids must be very stable because there is no storage at the endpoints. If electricity consumers were OK with being without electricity for a few hours per day, that alone would make the grid much cheaper.
If every endpoint has enough storage to go days without being online, then that’s going to distribute a very large cost to the edge of the grid. Part of the reason that we have a power grid is so that we can pool usage and build capacity based on overall statistics, not absolute peak.
Additionally, the cost to operate the grid is based upon the size of the distribution network and maximum carrying capacity. If we all build local storage that takes us off the grid 90% of the time, we are still going to have to pay the utility roughly the same amount to maintain the network as we do today. Check your power bill, the generation costs are probably less than half of your monthly total, so the rest is going to come back to you one way or the other.
The ideal consumer, from the power network's point of view, is as constant as possible. Each time you cut off a consumer node it creates harmonics and surges that need to be compensated upstream. Lots of nodes going on and off simultaneously (and they will, as whole parts of a city get good sun at the same time) are an engineering nightmare; existing grids are just not built to handle it, so they'll need to be upgraded, plus the maintenance costs will go up.
This is exactly the issue. Nuclear simply missed the boat. It couldn't win against coal on a balance sheet and it's surely not going to catch up to renewables now.
There's nothing wrong with it. A cheap fission technology would be a good thing. But at this stage literally no one is interested in throwing money at a technology that is now into its eighth (!) decade of maturity. The low-hanging fruit got picked by our grandparents.
And the environmental costs for nuclear were internalized by regulation, while the same did not happen for coal. This is still well within the realm of economics.
As far as coal's negative externalities go, carbon emissions and their impact on climate change were only some of the most well-known. There's extensive air and water pollution from both mining and electricity production, with negative health effects for everyone from the miners who mine the coal to those living near the power plants themselves. Mining operations can also have pretty significant environmental impacts in addition to the pollution. Fossil-fuel power plants result in thousands of premature deaths each year, millions of lost workdays, and billions in annual healthcare spending.[0][1][2] And estimates are difficult, so it's easy to undervalue their costs; for C02 emissions, for example, more evidence in recent years suggests that we're drastically underestimating the social and economic damages of carbon emissions.[3]
Very, very few of these consequences have ever been priced into coal-fired power production; for decades, it's been a de facto subsidy. And while we're seeing some shifts needed to lower emissions to fight climate change, many of the identified externalities still aren't priced in. Nuclear energy never received that kind of benefit of the doubt, or the sort of friendly regulatory capture that let society ignore many of its own negative externalities and risks.
internalized by regulation would imply that the externalities (eg. Climate change) are effectively factored into the cost of producing power by burning coal due to the cost of complying to imposed regulations.
I was very disappointed, expecting you to show something about storage costs, but you just handwave that part, even though you go into very specific numbers and extrapolation for the actual pv's. May I suggest this is a major blindspot and probably where it completely falls down?
I don't know enough about storage costs to calculate them well, although several recent low-priced PV power-purchase agreements have included storage components, including the one we're talking about here. I look forward to seeing your calculations!
I don't know the calculation either, and have been trying to get hard numbers on various potential large scale storage such as https://en.wikipedia.org/wiki/Thermal_energy_storage and Lithium ion, as well as some sort of learning curve for costs. So far I've found a pretty compelling and well backed case for solar, but the storage component is always hand waved.
Well, sure, storage costs depend in part on how long you have to store the energy, and how often. But surely someone has computed some kind of curve about how much storage is needed for PV to displace increasingly large percentages of peaker and baseload generation, assuming no demand response or other stabilization measures like rolling blackouts?
I hope you aren't making the assumption that storage = batteries, then arguing because weeks and months of batteries would be needed, it's impossibly expensive.
The usual quip in the industry is that the remote maintenance is such a complicated robotics problem that whichever company is able to solve it would be better off converting to a robotics company and just driving Kuka, Fanuc, and ABB out of business.
Completely robotically maintaining a complex machine sounds like a hard robotics problem to me. Of course it can be designed for machine manipulation but that seems like a pretty complex fusion of robotics and nuclear reactor design. So I think it would be a lot of work.
It seems to me that the simplest reactors tend to be the most dangerous. I'm thinking of the Windscale fire and SL-1 in particular. One of the theories for what went wrong at SL-1 is that it was a murder-suicide, caused by the man tasked with physically manipulating the control rods of the reactor. I think robotic control is better for something like that. Robots are more predictable.
One of the aggravating factors at Windscale was their inability to perform maintenance tasks behind the reactor. They had canisters of radioactive material smashing open behind the reactor and it took them ages to even notice. That was a very simple reactor design; basically a nuclear pile. It was also an awful design that was irradiating England even before it caught fire.
The Windscale reactors were built in a hurry in 1950 to produce nuclear weapons material after Russia managed to test its first atom bomb in 1949, and are not really representative of modern power generating technology.
Obviously Windscale wasn't modern and SL-1 certainly wasn't either. My point was that simple designs are poor designs, that modern designs are complex for good reason. The added system complexity of automation is well worth it when it comes to nuclear power.
Mostly the hard robotics side. This is about replacing people, and you don’t send people into heavy radiation areas. It’s like cleaning skyscraper windows: the overhead of keeping people safe is significantly more of an issue than just washing the windows.
Decontamination and radiation mostly make it more expensive rather than more difficult.
From the article:
>if something goes wrong in a MSR and the temperature starts going up, a freeze plug can melt,
An MSR does not need any sort of valve to drain the fuel. The ORNL MSBR was designed to drain the core when the pumps stopped working. The real advantage of the freeze valve is that it uses no moving parts and is maintenance-free/friendly.
ORNL-4528: "The fuel salt pump and its sump, or pump tank, are below the reactor vessel, so that failure of the pump to develop the required head causes the salt to drain from the reactor vessel through the pump tank to the fuel salt drain tank."
I thought it was Uranium 235 which was the common weapons material, so I looked up Wikipedia [1]:
> ... While it is thus possible to use uranium-233 as the fissile material of a nuclear weapon, speculation[8] aside, there is scant publicly available information on this isotope actually having been weaponized ...
In the nuclear industry, U-233 is well known as an excellent weapons material [1]. It's simply a matter of its nuclear properties, which are well known.
"The fast critical mass of U-233 is almost identical to that for Pu-239 and the spontaneous fission rate is much lower, reducing to negligible levels the problem of a spontaneous fission neutron prematurely initiating the chain reaction -- even in a “gun-type” design such as used for the U-235 Hiroshima bomb (see Table 1)."
U-235 is the only naturally occurring fissile isotope of uranium. In raw uranium out of the ground, it makes up about 0.7%, the rest being U-238. U-233 is also fissile, but only really made in thorium breeder reactors.
The big problem with nuclear-driven steam turbines is that they cannot be cheaper than coal-driven steam turbines due to costs: construction, operation, and decommissioning. Their LCOE is just not great and not dropping quickly enough.
Even thermal coal is getting outpriced by solar/wind plus batteries. And the latter will always get cheaper due to economies of scale and learning-rate cost reductions, not to mention core technological advances.
Nuclear could have kept us out of the carbon energy mess decades ago, but it's not the answer in 2019.
>Tritium production: If lithium is used in the salt, tritium will be produced,
Not a disadvantage for all MSRs: both lithium and beryllium can be avoided, though FLiBe is required for the most efficient MSRs and the MSBR.
>Mobile fission products
It is the only disadvantage common to all molten-salt reactors and all fluid-fuel reactors. Pumps and pipes have to handle a hot radioactive liquid.
> Material Degradation
Common to all nuclear reactors, solar, coal boiler tubes, etc. Components used in a reactor core do not last long. MSRs dispose of nickel tubes, and LWRs dispose of zirconium tubes and uranium.
>Proliferation...The problem with MSRs, then, is that the fuel is already completely cut open and melted.
>it will be difficult for the IAEA to distinguish plate-out losses from actual proliferative losses.
The fuel salt loop can be sealed tamper-proof. The entire fuel salt loop is analogous to a fuel assembly: weigh the entire loop. The vapor pressure of actinide-halide salts is very low at operating temperatures, so actinides don't move out of this loop.
Regarding pumps and pipes, the Moltex design avoids them entirely. Liquid fuel is contained in vertical fuel tubes, open at the top to allow gaseous fission products (some of which are strong neutron absorbers) to escape. The tubes are mostly immersed in a pool of molten salt coolant.
> Molten salt fuels were first conceived of in the late 1940s, when people began thinking of nuclear powered airplanes!
> The idea was to have very long range bombers in the air at all times
I had never heard of this idea, and thought it was super interesting. Imagine aircraft that could stay in the air indefinitely (for some practical purposes), carrying massive loads without fuel being a concern.
Similar to nuclear aircraft carriers and submarines, at least conceptually.
In the long run nuclear jets could be really useful for the exploration of planets that don't have free oxygen in the atmosphere like Venus, Saturn, etc.
This is not a disadvantage for all MSRs. Only breeders need a chemical plant, and it's not fair to compare a solid-fuel burner to a liquid-fuel breeder.
Let us compare a liquid-fuel breeder with a solid-fuel breeder: reprocessing solid fuel involves a more complex chemical plant, with physical mechanisms to declad the fuel and convert it into a processable liquid, after which the processed liquid must be fabricated back into solid fuel and returned to the reactor. Solid-fuel breeders have additional physical and chemical processes/steps.
The largest reserves of thorium exist in India and China - two of the most energy hungry and polluting economies. One is on the UN Security Council and the other has a unique waiver from the US Congress on proliferation - so fissile material production is not the primary concern.
Both India and China have massive deployments of renewable energy, yet demand for energy is outstripping the projected build-out. LFTR is going to be one of the best answers if it can be made safe.
Nuclear is a great complement to photovoltaic. One of the reasons Japan was able to go large on PV (and cause prices to come down for the rest of us) was their huge pumped-water infrastructure that had been built for nuclear.
One problem with nuclear was all the excess power created at night, so they would pump water uphill at night and then run it down to generate power at the daytime peaks.
Of course with PV the timing was reversed, but the need was similar. If they hadn't decommissioned their nuclear plants, the combination would have been even better.
Nuclear is not a great complement to PV, if the PV is cheap. Cheap PV expands until there is no residual baseload demand left. At that point, nuclear power plants cannot maintain high capacity factor and their economics go all to hell.
Nuclear has effectively zero unit cost, it's all fixed cost. Once you build the reactor, you generate at full capacity all day and night and take whatever the market rate is.
That means you get paid less during the day when solar is the cheapest provider, but you still match its price and supply your full generation capacity even then, because you can, because anything is more than nothing and the incremental generation cost is effectively zero. Then you make more at night when solar requires storage, and even more still whenever the day was overcast and energy prices had to rise high enough to suppress demand enough that the depleted storage isn't fully exhausted before sunrise.
The capacity factor is always 100%, it's the price you get at any given time that varies. But price variability in itself is no problem either, as long as the average price is above average cost. And that average includes not only times that it rains for multiple days in a row, but also winter, and that with consideration of people also needing to switch from oil and gas to electric heat.
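A toy illustration of the point above, with invented prices: an always-on, zero-marginal-cost plant earns the time-averaged market price, so viability depends only on that average clearing its average cost.

```python
# Hypothetical sketch: an always-on, zero-marginal-cost plant sells every
# hour and so earns the time-averaged price. All prices here are invented.

def avg_revenue_per_mwh(hourly_prices):
    """Average revenue per MWh for a plant selling at full output every hour."""
    return sum(hourly_prices) / len(hourly_prices)

# Assumed daily price curve ($/MWh): cheap midday (solar), pricier evenings.
prices = [30] * 8 + [15] * 8 + [45] * 8  # night, midday, evening

print(avg_revenue_per_mwh(prices))  # 30.0 -> viable only if average cost is below this
```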
"as long as the average price is above average cost"
And that's the killer. Add enough solar and wind and the average price craters, even if solar and wind cannot handle everything (without help from dispatchable sources and/or storage).
The problem with that argument is that it proves too much. It applies to everything, because solar and wind are the same as nuclear, their incremental generation cost is effectively zero once the capital is paid, so you take whatever the market price is.
Which means you have a problem.
It's physically possible to build enough solar and wind and storage to even handle extended periods of low sun and low wind. But if you do that then on a normal day with a normal amount of sun and wind, you have a large oversupply and the price falls to zero and everybody goes bankrupt.
That happens less if you add real baseload like nuclear to the mix, because something with a fixed generation capacity doesn't have a periodic shortage requiring you to compensate with an overcapacity which bankrupts everybody on a normal day.
Solar and wind will build out until it is no longer profitable to do so. That means: when they have driven the price of electricity when they are operating, averaged over time, down to the point they can't get a return on investment. This price point is far below where new nuclear needs it to be for nuclear to earn back its construction and operating costs.
What drives all this is that the levelized cost of energy from solar and wind is much less (a factor of 3 or 4) than the levelized cost of energy from new nuclear plants.
Nuclear, or at least many existing nuclear plants, has an additional disadvantage: it cannot cut output rapidly. If power from renewables suddenly floods the market, prices can go negative. Renewables can just stop selling in that situation, but nuclear is forced to continue running and eat the negative revenue. The low power density of renewable sources, usually depicted as a negative, is the source of this advantage: sunlight absorbed in PV modules can simply be allowed to dissipate as heat there with no ill effects.
(In fairness, I should also mention that there are subsidies in the US that encourage renewables to keep generating even at negative prices. These subsidies will have to go at some point, and perhaps that point is now.)
What will be the final death knell for nuclear will be when short term storage gets cheap enough that the times when that's discharging will also be economic death zones for nuclear plants. I expect few of the existing operating nuclear power plants to survive after that.
> Solar and wind will build out until it is no longer profitable to do so. That means: when they have driven the price of electricity when they are operating, averaged over time, down to the point they can't get a return on investment.
It seems like you're expecting this to be a slope rather than a cliff.
The problem with generation methods with no incremental generation cost is that absent some coordination/collusion, you go straight from a price somewhere above breakeven to basically zero as soon as you have any significant amount of oversupply, because everybody would rather get something than nothing.
So even if the average wholesale price is currently above 2c/kWh and you can bring capacity online that generates at 2c/kWh, you won't, because the act of doing it would create oversupply, cause the average market price to fall to below 2c, and you and everybody else would lose their shirts.
It's basically a market that bankrupts everybody without some coordination, but part of the value of that coordination includes preferring some amount of stable generation capacity to avoid the high cost of supply emergencies when low supply from the unstable generation methods coincide with each other.
> What drives all this is that the levelized cost of energy from solar and wind is much less (a factor of 3 or 4) than the levelized cost of energy from new nuclear plants.
This isn't accounting for variable supply and demand. The price it costs to generate in the summer sun isn't the real price when the unmet demand is in the winter night, and it isn't going to be economical to shift the demand by six months using energy storage. But if you had enough solar to provide heat in cold climates in winter you would have so much oversupply the rest of the year that you wouldn't make a cent for nine months out of twelve.
> Nuclear has an additional disadvantage: it cannot cut output rapidly. If renewables suddenly flood the market, prices can go negative. Renewables can just stop selling in that situation, but nuclear is forced to continue running and eat the negative revenue.
If that actually started happening on a regular basis there would be obvious solutions like resistive heaters or on-site energy storage which can be charged during those periods and then sold for a profit when prices are higher. (Thermal storage could work really well considering the reactor generates heat to begin with and they already have existing heat-to-electricity systems on site.)
And that's assuming all of this "smart grid" stuff doesn't ultimately succeed in preventing that from happening by increasing consumption as prices fall so that they don't actually go negative to begin with.
> What will be the final death knell for nuclear will be when short term storage gets cheap enough that the times when that's discharging will also be economic death zones for nuclear plants.
It's only speculation that this will actually happen. And even now people like to use overly optimistic numbers. Storage costs a certain amount if you charge it up every day and then discharge it again every night, but if you want the storage to be able to handle generation undersupply over a period of a week or more, you need a lot more of it which will generally go idle, which requires you to charge higher average prices per kWh.
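The cycle-count point can be made concrete with a quick sketch. The $150/kWh capacity figure below is an invented round number, not a quote from anywhere; the point is only that the same capex looks cheap when cycled daily and very expensive when held for rare week-long gaps.

```python
# Sketch of the cycling argument: identical storage capex, amortized over
# daily cycling vs. rare long-duration use. The capex figure is invented.

def storage_cost_per_kwh(capex_per_kwh_capacity, cycles_over_lifetime):
    """Capital cost per kWh actually delivered, ignoring round-trip losses."""
    return capex_per_kwh_capacity / cycles_over_lifetime

capex = 150  # assumed $/kWh of storage capacity

daily = storage_cost_per_kwh(capex, 365 * 10)  # cycled daily for 10 years
rare = storage_cost_per_kwh(capex, 10)         # drawn on roughly once a year

print(round(daily, 3), rare)  # 0.041 15.0 -> per-kWh cost differs by ~365x
```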
> I expect few of the existing operating nuclear power plants to survive after that.
The existing nuclear power plants will keep going as long as the market price is above the operating cost. The capital costs are sunk.
It's remarkable you were able to read that into what I wrote, when that wasn't at all what I wrote.
The point I was making was that intermittent renewables can screw up the market for nuclear, even if the renewables themselves do not supply baseload.
You may be under the misapprehension that if there is a base level of demand on the grid, then that base level of demand can only be supplied by baseload power sources like nuclear or coal. This is not the case. It was the case in the past that baseload sources were the cheapest way to satisfy that demand, but there's no law of physics or economics that requires that to always be true. And increasingly it's NOT true.
Renewables can ruin the market for nuclear even without storage. Adding economical short term storage (we're close with batteries, and almost certainly will have it before any nuclear plant whose construction was green-lit today could be completed) and nuclear is not just in trouble, it is dead.
MIT studies suggest that deeply decarbonizing is actually cheaper when you do some nuclear alongside your variable renewables + storage [1]. The cost of filling those seasonal solar gaps and 10-day wind gaps gets pretty large without nukes.
Nuclear is never actually cheap though. In practice it costs a fortune to build, a fortune to run, a fortune to dismantle, and then you have to pay to store the waste forever. Even that assessment charitably assumes that there are no massively expensive accidents along the way.
Studies that purport to show that nuclear has a place due to inability of renewables to fill those gaps are usually assuming renewables + short term storage cannot fill those gaps. But renewables + short term storage + hydrogen can, and probably more cheaply than a system with nuclear reactors.
Hydrogen has low capital cost. The capital cost/energy of storing hydrogen underground will be much less than the cost of storing that energy in a battery. If you have a storage scenario where the energy is stored for very long times, there will be few cycles of that system over its economic lifespan, so minimizing capital cost (even if that means much lower round trip efficiency) is very important.
One would still use more efficient short term storage (and over-installation of renewables sources) for diurnal load leveling.
Hydrogen can also be turned back to electrical power (at lousy efficiency) with cheap hardware. In particular, simple cycle gas turbine power plants with efficiency of 40% cost maybe $400/kW. Compare this to $8-10K/kW for a new nuclear power plant.
> In particular, simple cycle gas turbine power plants with efficiency of 40% cost maybe $400/kW. Compare this to $8-10K/kW for a new nuclear power plant.
The problem being that it only operates ~2% of the time compared to ~100%, and has a shorter operating lifetime in practice, and that isn't counting the cost of storing the hydrogen nor the energy cost to produce it.
Yes, the power will be expensive during that 2%. But it will not contribute all that much to the total cost of operating the grid. In particular, it would be cheaper than forcing the consumers to pay for nuclear the other 98% of the time, just so it would be available during that 2%.
If it's dozens of times more expensive during that 2% because it has to recover 100% of its cost in 2% of the time then it does contribute quite a bit to the total cost of operating the grid, whereas nuclear only has to make up the difference in that time between the market price the rest of the time and its overall average cost.
Storage also has the further disadvantage that you have to over-spec it. It has to be built for the highest capacity you might need and the longest duration, which you don't know ahead of time. If you build less than you need you're in big trouble, but if you build more, you pay for it and get nothing.
Simple cycle gas turbines are 20 times cheaper than nuclear power plants of equal power output. So, no, your argument doesn't hold up when the actual numbers are examined.
Being 20 times cheaper while producing 2% of the kWh because they're turned off 98% of the time makes them 2.5 times as expensive per kWh. And that's comparing the capital cost for the entire nuclear plant to only the turbine, not including the capital required for the hydrogen storage, or the equipment to produce the hydrogen, or the energy cost of the original generation (with large conversion losses). Or, again, the deadweight losses from the safety margin you need for surplus capacity that you might but probably won't ever use, which could double the cost or more on top of everything else. Whereas if you spec additional nuclear, it means you're generating that much more useful electricity 100% of the time.
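A quick back-of-envelope check using the numbers from this thread ($400/kW simple-cycle turbine at ~2% capacity factor, $8,000/kW nuclear at ~100%). The 30-year lifetime is an assumption for illustration; it cancels out of the ratio anyway.

```python
# Back-of-envelope comparison of capital cost per kWh actually delivered,
# using the thread's assumed figures ($400/kW turbine, $8,000/kW nuclear).

def capital_cost_per_kwh(capex_per_kw, capacity_factor, lifetime_hours):
    """Capital cost spread over the kWh actually delivered."""
    return capex_per_kw / (capacity_factor * lifetime_hours)

hours = 30 * 8760  # assumed 30-year life for both plants (cancels in the ratio)

turbine = capital_cost_per_kwh(400, 0.02, hours)   # runs ~2% of the time
nuclear = capital_cost_per_kwh(8000, 1.00, hours)  # runs ~100% of the time

print(round(turbine / nuclear, 2))  # 2.5 -> 2.5x the capital cost per kWh delivered
```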
Meanwhile that kind of storage will have more difficulty finding investors, because if it turns out that some cheaper or better alternative comes along, an investment in nuclear might have to average generating power below levelized cost, but at least you recover most of the capital. Putting in $100 in capital only to have the net present value fall to $80 sucks, but not nearly as much as putting in $100 in capital for a complete write off because you were expecting to be selling to the grid 2% of the time when it turns out to be 0% between demand based pricing and better than expected competing storage technologies. Which means higher capital costs (meaning interest rates) that reduce relative competitiveness even further.
There are a lot of references to "plate out" in the article, and most of the results I find on Google are paywalled by Elsevier - can someone explain what the term means or point me to a resource?
When fuel nuclei split apart in a nuclear reactor, their split fragments form a variety of lighter elements. Some of these elements are so-called "noble" metals -- metals that tend toward remaining chemically stable as metals, rather than as chemical compounds with other elements. Silver, ruthenium, and palladium are some examples of noble metals. More typical metallic elements like potassium and iron, by contrast, tend to be found on Earth as oxidized chemical compounds rather than as metals.
In a molten salt reactor, most fission products either stay dissolved in the salt mix or are lost from the mix as stable gases. The noble metals are different. Their tendency is to reform as solid metal. They tend to accumulate as a metallic layer or plate of metal over other solid surfaces they come into contact with. That's what is meant by plating out.
Here is a document that specifically addresses noble metal plate-out in molten salt reactors:
From the article:
>Problems with Molten Salt Reactors...
>...but similar problems may show up in long-lived power reactors.
The author assumes that MSR components should last as long as the vessels and secondary heat exchangers of solid-fuel reactors. The author should understand that the vessel and primary heat exchanger of a fluid-fuel reactor are analogous to fuel rods. Solid-fuel reactors simply dispose of their primary heat exchangers, the fuel rods, every few years.
Example: Zircaloy tubing worth an MSR vessel plus heat exchanger is simply disposed of, along with the partially fissioned, degraded solid fuel, every 4.5 years in an LWR. Zircaloy (hafnium-separated nuclear-grade zirconium plus additives) is more expensive than commercially available nickel-based alloys.
Graphite is a solid with a crystal structure. Crystal structure degradation under radiation is permanent and there is nothing anyone can do to reverse it; solar panels degrade similarly. The nuclear industry handles solid fuel rods that are far more radioactive than MSR graphite, yet complains that it can't handle MSR graphite. That just shows that the industry is either incompetent or not interested in efficient fluid-fuel reactors. Does the nuclear industry aim for >1000 GW of nuclear capacity? Does it care to solve global energy issues? Efficiency really matters when we have >1000 GW of installed nuclear capacity. If all energy is obtained from nuclear (12,000-16,000 GW), even seawater uranium gets used up in 40-60 years with inefficient solid-fuel reactors.
> Efficiency really matters when we have >1000 GW of installed nuclear capacity. If all energy is obtained from nuclear (12,000-16,000 GW), even seawater uranium gets used up in 40-60 years with inefficient solid-fuel reactors.
That can't be right. About 200 tonnes of natural uranium is needed to produce 1 GWe per year in conventional reactors [1]. That's 3,200,000 tonnes per year if you mean 16000 GW in the form of electricity, or closer to 1 million tonnes per year if you're referring to primary (thermal) energy. Seawater contains about 4.5 billion tonnes of uranium [2]. That's well over a thousand years' worth of uranium, either way.
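The arithmetic behind that conclusion, using the figures cited above (~200 t of natural uranium per GWe-year [1], ~4.5 billion t of uranium in seawater [2], and the thread's high-end 16,000 GW demand figure):

```python
# Rough check of the seawater-uranium arithmetic using the cited figures.

tonnes_per_gwe_year = 200       # natural U per GWe-year in conventional reactors
seawater_uranium_t = 4.5e9      # total uranium dissolved in seawater, tonnes
demand_gwe = 16_000             # high-end all-nuclear demand from the thread

annual_use_t = tonnes_per_gwe_year * demand_gwe  # 3,200,000 t/year
years = seawater_uranium_t / annual_use_t

print(int(years))  # 1406 -> well over a thousand years, as claimed
```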
Firstly, heat from inefficient low-temperature solid-fuel reactors can't be used directly for many applications, so consider electricity. Cars, planes, and kitchen stoves can't use uranium or nuclear heat directly!
Secondly, all 4.5 billion tons can't be extracted. The more we extract, the lower the concentration becomes and the harder it gets. I keep asking this question: if seawater extraction of metals is practical, why aren't we extracting other costly metals now?
https://twitter.com/AchalHP/status/1011661441412337665
If extracting gold is only 300 times harder, that puts the lower bound cost at $108,000 per kilogram of gold. That's significantly more expensive than the market price of gold.
Indeed, extracting uranium itself from seawater is not cost competitive with conventional terrestrial mining at present. But the process has been demonstrated on a technical level. Either terrestrial uranium deposits will have to get closer to exhaustion or seawater extraction will have to be much more cost optimized (or both) before seawater extraction of uranium is economically competitive. I was only addressing your technical claims about the exhaustion of seawater uranium, not making economic claims.
You included my sentence: "Efficiency really matters when we have >1000 GW of installed nuclear capacity." I wanted to address that: efficiency matters.
Inefficient technologies will hit some constraints if not others.
We may run out of time: a pressure vessel production cycle is over one year [1]. By the time we reach >1000 GW, someone may make fusion practical, or China's competent nuclear industry may start selling MSRs to everyone.
We may run out of the highly skilled people required to build and operate these complex machines. Safety is highly dependent on operations and management throughout the life cycle of complex machines; from manufacturing to operations there are tiny margins for error. (Not a physical constraint.)
LWR technology is associated with military ships and submarines, so no one shares it. A few countries control the technology, and there are only a few places in the world that can make pressure vessels. (Not a physical constraint.)
We may run out of waste disposal sites. A geological repository needs special rock formations away from earthquake zones.
The article is written by a nuclear industry professional who claims that routine things the nuclear industry does today become a "disadvantage" when applied to MSRs. They open the lid of an LWR every 18 months and replace or shuffle highly radioactive fuel assemblies, yet say that it is a disadvantage for an MSR to replace graphite. That is totally unfair.
TL;DR
Even small pressure vessels (used by NuScale) have lead times of about a year. A megafactory can build maybe 1 or 2 vessels per month. At 1.5 GW/year/factory, we need roughly 22 factories to keep up with the production schedule if we are to reach 1000 GW of small-LWR capacity by, say, 2050.
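The factory-count estimate can be sanity-checked directly. The inputs below (1.5 GW of vessels per factory per year, a 1,000 GW target, roughly 30 years to 2050) are the thread's own assumptions:

```python
# Sanity check on the factory-count arithmetic, using the assumed inputs.

target_gw = 1000          # small-LWR capacity goal
years = 30                # roughly now until 2050
gw_per_factory_year = 1.5 # vessel output per megafactory

factories = target_gw / (years * gw_per_factory_year)
print(round(factories))  # 22 -> matches the ~22-factory estimate above
```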
Don't confuse solid fuel with traditional light water reactors. The highest temperature reactors are triso fueled helium cooled solid fuel reactors like HTTR with outlet temperatures over 1000C. Also, fast breeder reactors with solid fuel are just as sustainable as any fluid fuel breeder.
Molten salt is one of about a dozen advanced reactor techs that has huge potential.
Hard part is economics. Hazardous coolant has been a pain to maintain cheaply so far.
Also seawater uranium replenishes from erosion and plate tectonics so it will effectively never decrease in concentration, even if we pull it out at world scale.
The successor of LWR/HWR should be a fluid-fuel reactor.
Once we set up a pebble/advanced-fuel factory, closing it will take decades (because people may lose jobs), and the new solid-fuel factory will again pause nuclear innovation for another 100 years. The only way to continuously improve nuclear reactors is to go fluid-fuel: no engineered fuel, so no job losses, and reactor innovation becomes independent of the fuel factory. Nuclear fuel becomes a commodity instead of an engineered specialty. Example: the MSRE ran U-235 and U-233 without any modification.
Secondly, solid-fuel reactors throw away fuel along with its heat exchange surfaces (the cladding). The fuel also undergoes crystal structure degradation along with the other solid structures. Maybe there is enough fuel in seawater, but there may not be enough places suitable for geological repositories.
Solid-fuel reactors always need an excess reactivity reserve, and therefore always need control rods; if someone (or a bad actor) pulls all the control rods, the reactor goes supercritical. They need highly skilled people and they need security.
For emergency shutdown of a solid-fuel reactor, poison is added to the coolant, not the fuel. In a fluid-fuel reactor, poison can be added to the liquid fuel itself, permanently destroying the fuel but guaranteeing shutdown. The emergency can be anything from a natural disaster to a terrorist attack; fluid-fuel reactors offer unbeatable safety features against anything.
The highest-temperature fission reactors are the gas-core (vapor-core) and ion-core reactors. They are fluid-fuel reactors, but they have not been demonstrated; currently, solid-fuel reactors hold the record for the highest demonstrated temperatures.
A fusion reactor runs hotter, and again the fuel is in a fluid state (fluid: liquid, gas, and plasma), but it only runs for 10-100 minutes. Demonstrated fission reactors run continuously for thousands of hours.
Or make fuel. There are all sorts of ways to make carbon neutral diesel and jet fuel. They just aren't economic compared to pumping it out of the ground unless the externalities of burning virgin fuels are priced in with regulation.