This is nameplate capacity: the power a generator can deliver, full on, under ideal conditions. Solar and wind always look great when measured in nameplate capacity, which is why this is done.
If you want to know how much electricity they will generate, you have to multiply by the capacity factor. Nuclear, for instance, has a capacity factor of 0.9 in the US: over a year it delivers 90% of what its nameplate capacity implies.
In the US, PV solar capacity factor is about 0.24 (partly because lots of people live near very sunny regions like California and Texas). The National Renewable Energy Laboratory claims that in some parts of the US, utility-scale solar capacity factors reach 0.33 [0]. Wind is about 0.36, hydro also 0.36, nuclear about 0.93 [1]. If you apply these capacity factors to the nameplate capacities (ignoring batteries), you get expected generation shares of 58.5% solar, 19.7% wind, 15.1% natural gas, and 6.6% nuclear.
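For anyone who wants to check the arithmetic, here is a minimal sketch of that share calculation, assuming nameplate additions roughly like the article's (the numbers below are illustrative, and the gas capacity factor is a placeholder assumption for a plant run near-baseload):

    # Expected generation share = nameplate capacity x capacity factor, normalized.
    # All inputs below are illustrative assumptions, not official figures.
    nameplate_gw = {"solar": 36.4, "wind": 8.2, "natural gas": 2.5, "nuclear": 1.1}
    capacity_factor = {"solar": 0.24, "wind": 0.36, "natural gas": 0.90, "nuclear": 0.93}

    avg_gw = {k: nameplate_gw[k] * capacity_factor[k] for k in nameplate_gw}
    total = sum(avg_gw.values())
    for source, gw in sorted(avg_gw.items(), key=lambda kv: -kv[1]):
        print(f"{source}: {gw:.2f} GW average, {100 * gw / total:.1f}%")
    # -> solar ~58%, wind ~20%, natural gas ~15%, nuclear ~7%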
Of course those capacity factors hide one important bit: dispatchability. While the capacity factor in this list is the same for wind and hydro, wind and solar generation are capped by what nature provides at any given moment. When demand exceeds their natural availability, you need to dispatch some extra generation. Hydro and natural gas are well suited for this; coal can do it too, if dirtily. Nuclear generation could be made dispatchable, but because the marginal cost of running it all the time is essentially zero, limiting its output seldom makes sense and it is rarely used this way.
Electricity demand could be elastic too, but there's a limit to that. It's not always feasible to limit or shut down industrial processes for the few hours of expensive power. And home consumers are loath to give up using their stoves when they want to. There's potential in elastic demand, though.
The problem with running nuclear in a dispatchable fashion (apart from technical problems) is not the marginal cost, but the upfront capital cost. Dispatchable capacity needs to have relatively low upfront cost, because it's seldom used. The marginal cost is less important.
Nuclear has a high capital cost; it's a really complex technology.
I guess I earned the snark for not mentioning the batteries.
Weirdly the news item doesn't say what sort of energy storage capacity these battery installations have, only the peak power output of 14.3 GW for this year's addition. While that is certainly a gigantic addition no matter what the energy capacity of these battery installations turns out to be, running the whole grid on batteries over periods of low renewable generation is still going to require orders of magnitude more batteries. I suppose there's enough lithium in the ground for this, but elastic demand and dispatchable generation are probably going to be part of the equation for economic reasons.
A misconception I often see on YouTube, Reddit, and HackerNews is that batteries are equated with lithium-ion batteries. A battery is a chemical storage device for energy, and there are already many different types.
1. There are also working batteries without lithium, for example salt-based ones, which are already being tested in Swiss and German households and bring some advantages compared to lithium batteries, not least the price. One should always remember that the lower energy density is a problem for an electric vehicle, but it doesn't matter if we install a battery in a cellar. There, energy density plays a subordinate role because there is enough space.
2. Would it make more sense to talk about *energy storage* in general instead of just batteries (which are by definition chemical energy storage)? Kinetic, chemical, thermal and so on. Lithium-ion batteries should not be the only thing considered for back-up. We definitely need more choices, and we have them, mostly with today's technology, and they are definitely easier and faster to develop and install than any new nuclear reactor technology.
3. You need different types of storage: short-term, medium-term and long-term. There are different concepts for each use. Batteries, compressed-air storage, pumped storage, thermal storage, as well as power-to-X systems are able to absorb the increasing summer solar power, autumn wind, etc. and make the energy available again in the short term, the medium term, or seasonally shifted.
4. The best approach, however, is to build a decentralised grid that is also intercontinentally connected. This is the perfect way to compensate for any "dark lulls". There is research on this at some universities around the world that has already moved beyond laboratory status.
That's (sadly) true! I think the transition to a diverse heat and electricity storage landscape will take time. Citing from "Handbook Energy Storage SCCER" https://doi.org/10.3929/ethz-b-000445597:
"The SCCER has proven in numerous demonstrations that storage technologies are essentially available and usable. Now it is necessary, above all, for political decisions to be taken in the interests of a coherent energy policy in order to reduce the regulatory obstacles that currently impede or make impossible the economical use of energy storage. This can guide business models and investment decisions necessary to advance the technologies developed in the SCCER and bring them from the laboratory into the ultimate energy system of the Energy Strategy 2050."
In industry here in Europe I usually see it written as ESS/BESS (energy storage system or battery energy storage system). For new plants we usually simulate each of five or six technologies; however, in a lot of cases, yes, lithium-ion has many advantages.
Many of my texts and links are from the years 2019-2022 when I was researching for various publications on renewable energy. Most of them, unfortunately, without DOI links. I didn't check them before I uploaded them here. And I would have so much more that it would be enough for an entire book.
> no matter what the energy capacity of these battery installations turns out to be, running the whole grid on batteries over periods of low renewable generation is still going to require orders of magnitude more batteries.
I live in CA (Bay Area); the solar people just wandered up to my front door to "sell me" the other day. I do want to go solar in light of my PG&E bill.
The sales guy was super sharp and addressed one of the concerns I had (I have an unusual roof) and we got very nerdy.
Because PG&E has time-dependent pricing, their model is to use the battery not only to power the house during those windows but also to dump power back to the grid during them (and charge off solar when power is cheap).
So an independent installer is pitching a system to me (the end consumer) in response to the market price conditions that are going to push "more battery" for "peak demand" into the market.
Now do the economics of that system and their sales pitch make sense? I don't know, I'm still crunching those numbers (and they are some hard numbers to figure out), but at first blush I'm inclined to say "yes" cause fuck giving money to PG&E.
At least a rooftop PV installation breaks even* here in Finland with scarce sun and cheap power. Batteries are not there yet. If you don't need the battery as a UPS, I would wait still a few years before going for a home battery.
*: Easily half of the cost is the installation labor. If you can DIY at least parts of it, you can get decent ROI.
Here (Bay Area), the power prices are so obscene that it almost makes sense to install batteries first.
Those with homes with solar are going to end up saturating the grid (duck curve) to the point where renters can buy batteries and "coast" through the peaks of power cost.
Are spot electricity contracts available, so that you can do it like that? Makes sense then. And of course the installation cost of a battery is way smaller than for a PV plant, as no roof work is required.
AFAIK, there is no residential spot market with PG&E. Just different seasonal and hour-by-hour rate schedules with published rates.
There is an "electric home" rate plan that has three periods per day: off-peak, partial-peak, and peak with three rates. This can apply to a home with batteries, where you can shift your load to different periods. The spread can be up to $0.22 per kWh in summer and up to $0.04 per kWh in winter.
There is another rate plan with two periods per day: off-peak and peak. The spread here can be up to $0.09 per kWh in summer and up to $0.03 per kWh in winter. This is a typical plan for homes without solar or batteries and with moderate consumption. This plan has two pricing tiers: a lower rate for consumption up to a "baseline allowance" and then a higher price after that. This allowance is summed over a whole billing period, in contrast to the time-of-use variations each day.
The above discussion does not include any net metering, so you never sell power back to the grid. You just optimize your load during different hours of the day. With a currently available net-metering plan, PG&E will pay for excess power only around $0.02 to $0.04 per kWh.
Also, it seems PG&E distinguishes a "paired storage" net metering system, and requires special metering to track the solar generation that goes into the battery versus recharging from the grid. They will only credit solar production delivered back to the grid, and not off-peak grid energy reflected back during peak hours. So, I'm not sure why some posters seem to be talking about this arbitrage scenario.
For context, the actual per kWh rates are around $0.36 to $0.65 in the different seasons and rate plans. So these peak price differences may range around 5% to 25%. There isn't any of the wild fluctuation or negative numbers we've heard from other energy markets.
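To put those spreads in household terms, here is a rough sketch of what daily load-shifting could be worth under the three-period plan described above (battery size, efficiency, and the season split are assumptions for illustration, not PG&E figures):

    # Value of charging off-peak and discharging on-peak, once per day.
    battery_kwh = 13.5      # usable capacity of a typical home battery (assumed)
    round_trip = 0.90       # round-trip efficiency (assumed)
    summer_spread = 0.22    # $/kWh off-peak-to-peak spread (from above)
    winter_spread = 0.04

    def daily_savings(spread: float) -> float:
        return battery_kwh * round_trip * spread

    annual = 182 * daily_savings(summer_spread) + 183 * daily_savings(winter_spread)
    print(f"~${annual:.0f} per year with these inputs")  # on the order of $575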
> Weirdly the news item doesn't say what sort of energy storage capacity these battery installations have, only the peak power output of 14.3 GW for this year's addition.
As a rule of thumb, the capacity will be a few hours worth. So if the power rating is 14 GW, maybe that will be 60 GWh of capacity.
That's almost enough to smooth over the most regular fluctuations in solar power: the day-night cycle (especially when you remember that demand drops at night). Not close to being economical for storing power from summer through to winter.
A source [0]:
> The most common grid-scale battery solutions today are rated to provide either 2, 4, or 6 hours of electricity at their rated capacity
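That rule of thumb is just power times duration; as a quick sanity check:

    # Energy capacity = power rating x hours of duration.
    power_gw = 14.3
    for hours in (2, 4, 6):
        print(f"{hours} h -> {power_gw * hours:.0f} GWh")
    # 4 h of duration gives ~57 GWh, i.e. the "maybe 60 GWh" ballpark above.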
Plenty of batteries in the US are built out of water. They pump the water up during the day and it generates electricity at night. No lithium involved.
> Pumped storage is by far the largest-capacity form of grid energy storage available, and, as of 2020, the United States Department of Energy Global Energy Storage Database reports that PSH accounts for around 95% of all active tracked storage installations worldwide, with a total installed throughput capacity of over 181 GW, of which about 29 GW are in the United States, and a total installed storage capacity of over 1.6 TWh, of which about 250 GWh are in the United States.
Pumped storage is a type of grid storage, but I don't think TFA includes it in battery storage. Pumped storage is a fine technology, but is there a lot of build potential left for it?
> Pumped storage is a fine technology, but is there a lot of build potential left for it?
"A lot" is probably subjective, but the two best-known global estimates are Hunt et al. (2020) [1] from IIASA and Stocks et al. from ANU re100 (who incidentally also have some interactive maps [3]), which with different cost targets put the potential at 17.3 and 23 PWh respectively; that works out to about 2 MWh per person. For comparison, for the past decade the US has consumed about 13 MWh of electricity per person per year, down from a peak of slightly under 14 in 2000 and 2005. With very high levels of electrification, that could potentially rise to 24 to 28 MWh per person per year, or 8 or 9 PWh/yr for the whole country. Total primary energy use is a lot higher, around 90 MWh per person-year or 30 PWh total. This is because not everything could practically be electrified, and the things that can easily be electrified tend to be much more efficient when done electrically. Energy efficiency is also usually assumed to increase slightly in general.
The US specifically is actually above the world average at about 4.5 MWh per capita according to the ANU team's estimates, which is roughly 1.5 PWh in total. In any case, I would expect that there is likely, if not very likely, sufficient potential in most (if not all) grids for pumped hydro to be a significant part of medium-duration energy storage (if not all of it), though whether it actually would be depends on the costs of other technologies as well.
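A quick back-of-envelope check on those per-person figures (population numbers here are rough assumptions):

    # Global pumped-hydro potential per person, and the implied US total.
    world_pop, us_pop = 8e9, 333e6
    for name, pwh in {"Hunt et al. (2020)": 17.3, "ANU re100": 23.0}.items():
        print(f"{name}: {pwh * 1e9 / world_pop:.1f} MWh per person")
    print(f"US: ~{4.5 * us_pop / 1e9:.1f} PWh at 4.5 MWh per capita")
    # -> roughly 2-3 MWh per person globally, and ~1.5 PWh for the US.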
Technically, solar and wind are dispatchable as well, and much better at it than even natural gas. Grids where both are regulated on the sub-second scale are much more stable than is otherwise possible. The thing is just that we count the curtailed output as lost energy, even though it is simply the same as running any other power plant at less than full power. If wind were three times cheaper than natural gas and we hence built three times as much of it, the achievable capacity factor would not look that bad.
More on-site batteries can fix the need for dispatchable power on the grid side, and the cheaper batteries get, the more attractive such solutions start to look.
The German energy market, and therefore I as a consumer, have had to pay billions for redispatch operations.
The fast build-out of renewables has left the energy market with plenty of inefficiency, and battery projects that calculate with this excess are getting built, just not fast enough.
There are already a few projects at old coal plants, where grid connections already exist.
And from EV batteries alone there will soon be used but still very good batteries hitting this market, equivalent to the whole pumped-hydro storage capacity of Germany.
Nuclear had much lower capacity factors in the past, which brings the lifetime capacity factor down. But the relevant point is that the capacity factor of nuclear plants running today is >90%
US Nuclear had a capacity factor of 86.1% as recently as 2012, it just varies through time and over a 40+ year lifetime you don’t have outstanding results every single year. So sure you can argue 71.1% in 1997 no longer applies, but it was almost exactly the same fleet of reactors in use back then.
Yet the person I was replying to said “about 93%” when averaging over 93% has been achieved exactly once in 2019. That kind of nonsense is actively harmful when people hear something and then later realize it’s simply incorrect.
Effective advocacy requires accuracy including technology specific issues and how to mitigate them.
The larger question is what capacity factors would look like if you tried to double the amount of nuclear as many advocates wish. And what that would do to profitability / the need for subsidies.
Or as the industry has been concerned with for a decade, what happens when renewable energy is regularly sending wholesale prices near zero for hours a day.
I'm not a nuclear fanboy (well, I wish it made sense, since I like the idea of free energy, but renewables are just as good), but I think it's important to make accurate arguments.
With such a small difference, the burden of proof in this case is probably on the person arguing that it is a big difference, since you'd generally expect a roughly linear relationship between these.
In fact, I think I'd suspect that downtime is less relevant for nuclear, since I believe most of it will be schedulable, as opposed to renewables where downtime is random and based on conditions. Since it's schedulable, it could be done in the off season, or different plants could be planned to be out at different periods.
You actually have no idea about electricity transmission. For long term problems, look no further than France. Short term, look up how electricity generation, transmission, markets and grid stability are linked. Wikipedia is a good start. Followed by studies about 100% renewable grids, those explain why baseload is much less of an issue than people think. And too much inflexible capacity, aka baseload, can actually be a problem itself.
While I'm not a practicing economist, I have a bachelor's degree in economics and took multiple classes on the economics of electricity markets.
While all of those are certainly relevant if you're comparing nuclear and other sources of power, I fail to see how that's relevant to the question of whether there's a significant difference between 80 and 95% capacity factor over a year.
Now imagine you are working in controlling at a power plant operator, and your yearly generated output is 5% lower than whatever the plan said. What do you think would happen?
- investors, management and board saying "no biggy, 5% is not relevant"
- something else
Companies are doing restructuring and mass lay-offs to save less than 5% on bottom line costs, they are incredibly happy when the top line grows by 5% and worried, if not in crisis mode, if the top line declines by 5%. And for the financing part of a new power plant, those 5% are the difference between the investment being a good or a bad one...
Hold on, you are an economist and are telling me the difference between a capacity factor of 95% and 85% is not relevant? What do you think the ROIs and margins are for nuclear investments? Considering that nuclear is almost completely dominated by upfront investment, I'd be surprised if that difference is not the difference between comfortably making a profit on your investment and losing a large amount of money.
This discussion makes me realize we could have a 100% renewable energy grid and discussion will still be pushed with the same old “arguments” against renewable being feasible. Ridiculous.
I do not understand your math. How did you go from .36 for wind, apply it to nameplate capacity and get 15.1%? And percent of what? I thought applying the capacity factor to nameplate would result in some measure of energy produced, not a percentage.
Keep in mind that looking at the capacity factor for nuclear vs solar/wind can be misleading as well if you don't take into account even more factors.
Renewables are composed of a wide mix of thousands of power plants. And these days the construction of renewables goes along with construction of energy storage and grid upgrades (even taking into account those costs, renewables are now competitive). So any failures/downtimes is smoothed out over both time and space.
That ~10% of the time that nuclear is down, you get 100s of MWs going down in a single region. France recently had several nuclear power plants go down at the same time, making them dependent on imports for a long stretch of time.
In terms of energy I think it's worth looking at the total net energy added.
The fastest rate at which nuclear ever grew was around ~200 TWh per year, for some years in the 80s.
Solar grew by ~300 TWh from 2021 to 2022. Wind grew by ~200 TWh. And the growth rate is still increasing exponentially.
For reference the world population grew from ~5 to ~8billion from the 80s until now, if you want to take that into account when comparing the growth rates.
I don't think there can be any doubt anymore that renewables will completely dominate the energy production in the future.
Your range is wildly off for PV systems; concentrating solar thermal can be that low, but it's been abandoned for good reason. Utility-scale PV solar in the US averages 0.246, with several solar farms in the US having capacity factors over 0.32.
The discrepancy around PV solar is largely the degree of tracking and location. Fixed panels have the same maximum, but even 1 axis tracking significantly extends how long a panel is producing the maximum power. 2 axis tracking allows even better capacity factor assuming nothing breaks, but at higher cost per kWh.
That's not why it is done; it's because this is reporting on capacity, and that is the physical capacity being installed. That phrasing makes it sound nefarious, and then you end with a fairly low solar range (which varies substantially and exceeds that in sunnier parts of the US).
Bit of an odd take. You could say the same, or lower, of capacity factors on peaking oil or gas plants in some areas that run for only a couple percent of the year. Nameplate and utilization are just different things.
Maybe worth noting that despite lower capacity factors and despite it being mid-winter, Texas (ERCOT) keeps setting solar output records because they’re installing so much nameplate[1]. It’s not like it does nothing.
Considering nameplate capacity without capacity factor would be a bad prediction? If I need 1 GW of extra generation, getting 1 GW of solar would be a bad bet. Same if I am calculating the pollution generated next year.
Somehow, that is the latest pro-nuclear / anti-renewables talking point. Never mind that people in the field learn the principle of load factor in one of the first days of 101 lectures on the subject...
At least it is an evolution up from "solar isn't working at night" and similar deep and hot takes on the subject.
Pro-nuclear doesn't make one anti-renewables. They're complementary technologies, the rivalry is entirely one-sided. Not to mention incredibly aggravating to those of us who would like power to be reliable, with no emissions.
The arguments against renewables from your side since at least the '70s when I came across them first, have been persistent and loud, for half a century! You're gonna just pretend that never happened, walk away from it. The parent mentioned the "duh, sun don't shine at night" line because that's some shit your side was for real pushing, for fucking decades. You don't get to sidestep half a lifetime of that garbage and claim some high ground now.
Here in Finland, the national power grid operator provides a live graph of how much each source is outputting. It's kind of funny how solar produces almost no energy during the winter this far north. I'm very happy that we have that new 1,600 MW nuclear plant, so we no longer have to get that energy from coal.
People want a one-size-fits-all solution; nuclear makes the most sense for the high northern latitudes. Depending on the geography, wind, geothermal and hydro are viable too. Now if you are talking about Egypt, building nuclear over solar/batteries makes absolutely no sense.
Yeah, I just don't like how people here are providing a false dichotomy of "renewables vs nuclear" instead of "renewables+nuclear vs fossil fuels". We should put solar panels on every rooftop so we're not throwing away that free* energy, while having nuclear plants as baseload power to smooth out the duck curve. Polar regions need to slide the scale more toward nuclear, while the equatorial regions can go mostly solar.
The main reason for me that your scenario isn't ideal is that if we can pull it off, renewables remove significant risk compared to nuclear. When a football field of batteries goes up in smoke, some battery money is lost. When a Fukushima happens, an economy can be lost, and a nontrivial number of lives. Thousands or tens of thousands of lives, hundreds of billions or trillions in damages; nah, I'd like to opt out and instead do the extra work to make renewable and safer sources happen. That's how I vote. If you live above the arctic circle you'll likely vote differently, and hopefully your next meltdown doesn't reach my shores.
Yes, correcting for capacity factor is essential if you want to compare potential energy generated over a year (which is perhaps not the most common use of these numbers). Gas generators are typically less than 50% capacity factor, and batteries should of course have a 0% capacity factor for a comparison of energy created over a year.
However, your capacity factors for solar are far too low; the average CF is above 20%. There are roughly as many installs with CF>30% as there are with CF<15%.
Also omitted from this view is the cost per MWh delivered to the grid. A great overview of the current state of the technology, including costs ($40/MWh), is here:
For the new nuclear cost, it's actually really hard to dig up, because it's a bit too embarrassing to the industry. But by taking the factors from this slide deck:
And scaling the slide 14 numbers for $9k/kW overnight cost to the real cost of about $15k/kW at Vogtle, we get costs north of $180/MWh for nuclear when including the new subsidies from Biden's IRA legislation (north of $200/MWh unsubsidized).
The LCOE of nuclear is very sensitive to its capacity factor.
Nuclear costs are set to skyrocket because for a larger and larger part of the day they will not be profitable as renewables undercut the price of power.
With a lower and lower capacity factor, new nuclear plants will never pay for themselves.
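To see just how sensitive, here is a toy LCOE model; all inputs are illustrative assumptions (the $9k/kW overnight cost echoes the slide deck mentioned upthread, the fixed charge rate and O&M are made up):

    # LCOE ~ annualized capital cost spread over actual MWh generated, plus O&M.
    def lcoe(overnight_per_kw: float, fcr: float, om_per_mwh: float, cf: float) -> float:
        annual_capital_per_mw = overnight_per_kw * 1000 * fcr  # $ per MW-year
        mwh_per_mw_year = 8760 * cf
        return annual_capital_per_mw / mwh_per_mw_year + om_per_mwh

    for cf in (0.95, 0.85, 0.65):
        print(f"CF {cf:.0%}: ${lcoe(9000, 0.08, 25, cf):.0f}/MWh")
    # -> ~$112, ~$122, ~$151: capital dominates, so LCOE scales roughly with 1/CF.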
Refueling and maintenance are the "normal" things, and both are planned for the time of year with the least consumption/lowest electricity prices.
Then you have the rare unplanned outages, but once a new reactor has been running for a year or two and all the kinks have been worked out, they should be very rare.
Not an expert, but it's my impression that it's common to have reactors shut down about one month per year for refueling and maintenance. Of course, this can be planned so it's not a big deal.
The plants do have a worse correlated failure mode: when something bad is discovered in one plant, it can shut down multiple plants, because the safety case is based on making sure to an extremely high degree that nothing can go wrong - like grounded airplanes. Something like Fukushima can shut down plants all over the world temporarily until plans have been revisited and extra precautions possibly put in place.
Scheduled maintenance. Refueling, replacing worn-out components, that sort of thing. One nine of uptime is pretty good in the "continuously running power plants" business.
Emphasis on scheduled. Wind operators get no choice at all in when they're generating power, and this is true of solar as well, although the fluctuation is easier to predict, especially in very sunny climates.
A fleet of nuclear reactors can be taken offline on a schedule, with advance notice, so they alternate being out of commission. This makes it easier to supply reliable baseline power.
I would say cherry-picking data points is better than providing 0 data points.
edit:
Basically competent operators hit ~90%.
US is around 93%.
Sweden 84%
Switzerland 90%
France would hit that if they could run their plants at 100% when they are on but they run theirs in load following mode due to having so many of them. So they run at around 80%.
> Two additional units utilizing Westinghouse AP1000 reactors were under construction since 2009, with Unit 3 being completed in July 2023.[9][10][11] This last report blames the latest increase in costs on the contractor not completing work as scheduled. Another complicating factor in the construction process is the bankruptcy of Westinghouse in 2017.[12] In 2018 costs were estimated to be about $25 billion.[13] By 2021 they were estimated to be over $28.5 billion.[14] In 2023 costs had increased to $34 billion, with work still to be completed on Vogtle 4.[15]
Those numbers are simply mind blowing. Why are these still built? How much solar/wind/battery capacity can 34B USD buy!?
I feel the same about the two big nuke projects in the UK (Hinkley Point C & Sizewell C). It makes no sense compared to solar/wind/battery.
> How much solar/wind/battery capacity can 34B USD buy!?
According to How Big Things Get Done, at least 30B worth. Nuclear plants have some of the worst cost and time overruns for large projects, wind and solar some of the lowest (beaten only by roads).
And his theory as to why this happens, is that wind, solar, and roads are self-similar. Building the first 10% directly informs you on how to build the next 10%. Everything you learn along the way can be rolled back into the project in a feedback loop.
What you learn shingling a roof or building a nuclear reactor doesn't help this project all that much. It helps the next. And how many nuclear reactors does one worker build in their lifetime? 2? 4? That's not a lot of opportunity for process improvement.
According to this article, the staff for the regulator in Georgia (the only entity in the decision making process looking out for the customers, ratepayers) were banned from even considering the cost of renewables:
> Plant Vogtle proceeded with no cost-cap, no consumer protections, and Public Service Commission (PSC) staff were prohibited from conducting analysis comparing the costs of nuclear to clean energy alternatives.
At the other pair of reactors started at the same time and with the same design in South Carolina, executives are in jail for lying about the project. The SC reactors were also abandoned.
Nuclear in the US is so expensive that even starting a new reactor requires considerable corruption.
Just seeing your comment 10 days after it was posted, but I have many fond memories of your work, and it would be an honor for Asa Dotzler to use one of my lines.
> And how many nuclear reactors does one worker build in their lifetime? 2? 4? That's not a lot of opportunity for process improvement.
And reactors are inherently too big and complicated. A wind farm, solar farm, or a road is intrinsically much simpler.
And reactors will suffer from decision makers sitting around a table and changing designs for the next one because engineers have thought up new failure modes of the last design and nobody wants to be the person who says "no" and then has a failure. Along with the natural tendency of US management to increase the spending of their departments in order to increase the size of their own personal kingdom. You'll never manage to stamp out reactor after reactor all of the same design.
French nuclear power is still not particularly economical and they didn’t build as many plants as originally planned because of it, nor do they plan to build a sufficient number of new plants to sustain current capacity over the next decades. France justifies the expense with their nuclear bomb program that benefits from a strong civilian nuclear industry.
French nuclear power has a levelized cost of about $70/MWh - which is about the same cost as the levelized cost of solar or wind in France, without most the headaches of intermittency. (Remember the average French person lives further north than the average Canadian...)
> "French nuclear power has a levelized cost of about $70/MWh"
That may be true for the existing nuclear fleet, but the cost of new-build nuclear is significantly higher. Flamanville 3, which is expected to finally come online in mid-2024 after 17 years of construction, was estimated at €125/MWh in 2022.
France did not experience a significant fall in construction costs when they were building lots of reactors in the 1970s. There's not much hope of getting that €125/MWh below €100/MWh, and honestly that €125/MWh is likely a severe underestimate to start with.
Don't know about France but TVO in Finland is way under that. Their earlier reactors were built in the 70s.
They have been hitting 15 to 20 €/MWh for decades now; or at least, that is the price at which they have been selling to their owners without going bankrupt. The target with OL3 in the mix is 40 €/MWh, so OL3 alone is probably in the 50-60 €/MWh range. (And they got a really good deal with OL3.)
One big difference in the costs is that nowadays, new reactors are required to fund their own eventual decommissioning and long-term waste handling costs from operating income.
That wasn’t always the case! In the UK, taxpayers have been left with enormous liabilities for managing and cleaning up old nuclear sites:
Finland deserves credit for actually building a long-term waste storage repository, which helps solve one of the biggest ongoing issues/costs with nuclear decommissioning.
The operators have been paying into a fund since the 80s which is meant for decommissioning and spent fuel storage.
Basically it is a legal requirement (some small fraction of a cent for every kWh produced).
This is really the only sane way to do it.
Also, we got lucky with the ground under Olkiluoto being a good spot for nuclear waste storage, so there's not much NIMBY stuff for that, as it is already the site of the biggest nuclear power plant in the country. It is also a small town, so a huge % of the population there works at the plant or its subcontractors.
Paid off reactors can get by on O&M only costs. The trouble is getting through that initial period of paying off the loan and interest. Also, older reactors were far simpler and reliable to construct. (Though "simpler" does not mean simple, they are still extremely complex beasts of machines.)
Finland was smart enough, and France dumb enough, to sign a fixed-cost contract for OL3. This ended up in complete disaster for the French companies doing the building, with the French government buying up the failed builder.
So you can thank French taxpayers for a €70-€100/MWh subsidy for the new clean energy in Finland.
What you're saying is just true of the EPR, but not relevant at all in comparison to what happened when the nuclear plants were mass produced. As the parent said, nuclear has been very cheap for 3 decades.
> which is about the same cost as the levelized cost of solar or wind in France
As of 2024. Remember that solar and wind are still on a downward price trajectory, and the reduced investment in nuclear has meant that its prices went up.
So if you have in mind a figure of, for example, $70/MWh for PV and wind from a couple of years ago, that figure might be slightly outdated now and slightly more outdated tomorrow. PV might go under $20/MWh at the end of the decade, for instance. Nobody will be able to compete with that.
> PV might go under $20/MWh at the end of the decade
It's never clear to me when these numbers are cited, clearly you must be talking about PV + battery right? Otherwise it would not be a fair comparison.
That's true, because of the gulf stream, but the only thing that matters from a solar panel point of view is the angle to the sun. In fact you might get slightly more out of a Canadian solar panel than a French one at the same latitude because the Canadian one will be colder!
Indeed. But in either case, solar doesn't help too much on cold winter days at higher latitudes. Especially if we want to decarbonise heating, leading to increased grid demand on the coldest days.
The way ocean currents work helps Barcelona more than Chicago. Chicago is warmer than its western suburbs because the suburbs are farther from the lake. (There are many other confounding factors, so we cannot measure this.)
Did you try to find data? Or are you just reporting your intuition about this thing that someone else thought might be surprising, and that turned out to be surprising?
It seems like half of Canada's populations lives south of 45.5°N. [1]
Someone else calculated the mean population latitude of France to be just over 47°N [2]
Mean and median aren't the same, but in this case, the measures are far enough apart that it does seem like the average (however reasonably defined) French resident lives north of the average (same defined) Canadian resident.
Exactly, and don't forget that nuclear costs in France are being driven up by wind (and solar, but less so), because nuclear cannot run at its optimal power; it has to reduce output when there's wind.
Downvote all you want: when you artificially decrease the capacity factor from the average 90% to 65% because you have to accommodate wind, in an industry where fixed costs are the vast majority of costs, you mathematically raise the cost per MWh…
Wikipedia writes: "As of early September 2022, 32 of France's 56 nuclear reactors were shut down due to maintenance or technical problems" and "In 2022, Europe's driest summer in 500 years had serious consequences for power plant cooling systems, as the drought reduced the amount of river water available for cooling."
> "As of early September 2022, 32 of France's 56 nuclear reactors were shut down"
This is cherry picking. Look at the 10 years, or even lifetime, average availability. It's 90 or 95%. The reason for this bad number is because of delayed maintenance due to COVID.
> "serious consequences for power plant cooling systems, as the drought reduced the amount of river water available for cooling."
The reduction of power output of French nuclear was something like 0.30%. You read that right. So I would call "serious consequences" a blatant lie.
> "Look at the 10 years, or even lifetime, average availability. It's 90 or 95%."
Average load factor for nuclear reactors globally hovers around 80% according to IAEA data [1]. In France the average is actually lower than this due to load following: by design, many French reactors don't always operate at full capacity because there isn't enough demand at off-peak times.
Few, if any, reactors reach 95%: planned outages for refuelling and maintenance takes up more than 5% of their time.
Related and interesting thing - China appears to be building a trial reactor in the Gobi desert [0] because they think they have air cooling sorted out.
Or at least close enough. I haven't checked if it is technically in the desert.
There is still plenty of water; the problem is that you have to limit the heating of the river. A cooling tower can cool with or without evaporating water depending on the design, but always without heating the river. It is just cheaper to dump the heat directly into the water.
Dry cooling towers exist, but they are more expensive. Natural cooling with evaporation eventually runs into a problem with the build-up of salts, which forces you to replace the non-evaporated cooling water with fresh river water and dump the heated water into the river (the limiting factor, due to environmental concerns). The less water is flowing and the higher the temperature of the river, the sooner this point is reached.
My hunch is evaporative cooling doesn't use very much water. You could also fill a small dam when the water flow is higher. All of which means higher cost, of course.
There are detailed diagrams and formulas to calculate that. Engineering, not back-of-the-envelope high-school physics.
But guess what, people designing and building power plants know this. And whatever is built is the best compromise possible at the time, back-of-envelope calculus in 2023 or not.
Even after the summer heat the reactor availability in France had trouble inching above 2/3 for a number of reasons. One of them being age of the installations and related unplanned outages.
I've been anti-nuclear for a long time, and SMR is the least unpalatable choice.
Not without caveats though. Humans are shit at statistics. Their sense of the aggregate danger of putting 100 'safe' items in close proximity often underestimates the odds by half, and in some cases by an order of magnitude.
You might get lucky and lose two personal hard drives in your entire life, but the IT guy managing 100 drives is dealing with failed drives 'all the time'. Backblaze reports on something like 10,000 drives and they are claiming something on the order of 200-300 failures. And I think that's per quarter, not per year.
What do you do with an array of reactors where one right in the middle did not fail cleanly?
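The intuition gap is easy to show with the standard independence math (the per-unit failure rate here is made up purely for illustration):

    # P(at least one failure among N independent units) = 1 - (1 - p)^N.
    p = 0.01  # assumed 1% per-unit chance of failure per year
    for n in (1, 10, 100):
        print(f"N={n:>3}: {1 - (1 - p) ** n:.1%}")
    # -> 1.0%, 9.6%, 63.4%: a hundred individually "safe" units fail somewhere
    #    most years, which is the hard-drive-array effect described above.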
Indeed. I mean, I'm not sure the smallness is a key factor, but the modularity for sure and maybe the size limitation comes with that.
I'm not much of a fan of nuclear (pun not intended?) and would like to see solar + wind (+ other renewables) cover most of the load. But if enough resources were pooled to design an SMR, with extensive safety studies, lifetime planning, involvement of multiple parties (e.g. scientific and engineering teams from different world states) - I don't think I could really oppose, especially given the global warming situation.
Of course - there's "ideal nuclear" and there's real existing nuclear, which is quite different and often very problematic.
They absorb more indirect light, but the main advantage is that hot panels drop in efficiency, and standing them up improves heat dissipation substantially.
One of the bits that stood out to me is that it has an output curve that partially lines up with The Duck, because you get a period of low-angle exposure near sunset. I suspect you get quite a drop at high noon, but that's fine as long as not all solar panels in the grid are vertical.
What specifically is so expensive about the reactors? Having played the simulator[0] and now being an expert in running a nuclear power plant, the idea is "simple". Seemingly not much more complex than any other steam/heat power plant, so how do the costs run so high? Are the tolerances on everything turned to 11 to minimize chance of errors? Triple-checking all work?
As a point of comparison, the Large Hadron collider had a $9 billion budget, and that required a 27 kilometer circle to be dug.
I worked for a Campbell, CA nuclear engineering partial employee-owned co-op formed by mostly ex-General Electric engineers (Stanford and Berkeley alums) ultimately acquired by Curtiss-Wright Corp. (And yes, computers were named after Simpsons' characters.)
Costs are related to the assurance of safety through careful documentation and processes beginning with regulation and insurance underwriting of design engineering, construction, operations, and maintenance. This sort of thing only makes sense to do so at large scale to make it economical. Scale also has its costs. Safety isn't something that can be Boeing'ed because the NRC can and will shut the whole thing down if it were to be found unsafe.
> "I feel the same about the two big nuke projects in the UK (Hinkley Point C & Sizewell C). It makes no sense compared to solar/wind/battery."
Renewables are far cheaper, and can be built much more quickly. But there is an argument for (some) nuclear around energy security and diversification of supply. Batteries are great but can't really provide long-term storage (ie: multiple days/weeks). Do we want to rely solely on natural gas as a backup during the cold, calm, winter weather patterns which in some years can persist for weeks? Last year's energy crisis suggests maybe not. Or can we build enough interconnections and rely on imports from our neighbours?
Also worth considering that Hinkley Point C & Sizewell C are really only replacing existing nuclear plants that are shutting down. Even if both are built, the UK will still have only around half the nuclear capacity in the 2030s that we had in the 1990s.
> Those numbers are simply mind blowing. Why are these still built?
I'm guessing because there was no step in the process where it seemed like a better idea to stop construction and have spent whatever costs so far to get nothing of value, rather than accept the cost increase and end up with high availability, somewhat dispatchable generation.
There's not a lot of new projects in nuclear, because construction costs are high and subject to cost overruns as projects get delayed, and delays seem inevitable. That's why there's all the talk of modular nuclear and what not. If it was feasible to build these plants on time and on budget, the generation parameters are good --- doesn't use much fuel, tends to be available outside of scheduled maintenance, can modulate generation to follow demand, if the economics make sense (as-is, most of the plant expenses are fixed cost, so you may as well generate as much as you can to amortize the cost over more kWh)
According to the NREL (National Renewable Energy Laboratory) in 2022 it cost approximately $1.06 per watt to install utility scale solar. So for the cost of the 1.1 GW Vogtle 4 reactor, utilities could have installed approximately 32 GW of utility scale solar. Obviously this is an extremely simplified and naive calculation that doesn't account for cost or availability of land, transmission infrastructure, battery or other energy storage, etc.
So split the $35B (there's a recent $3B that hasn't made it to Wikipedia, as I understand it) half and half, and you get 17 GW of solar and 36 GWh of storage.
As far as translating this into per-kWh costs, most estimates I have seen put Vogtle at $0.17-$0.18/kWh. The equivalent for solar is $0.04/kWh. To charge a battery with that same solar, and then deliver it later, it's $0.13/kWh, when doing the napkin math with those NREL numbers up there.
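Roughly how those napkin numbers fall out (the storage unit cost below is the assumption implied by the 36 GWh figure, not an NREL quote):

    # Split a Vogtle-scale budget half-and-half between PV and batteries.
    budget = 35e9
    solar_usd_per_w = 1.06      # NREL 2022 utility-scale PV figure cited above
    storage_usd_per_kwh = 480   # assumed installed battery cost

    solar_gw = (budget / 2) / solar_usd_per_w / 1e9
    storage_gwh = (budget / 2) / storage_usd_per_kwh / 1e6
    print(f"~{solar_gw:.0f} GW of solar plus ~{storage_gwh:.0f} GWh of storage")
    # -> ~17 GW and ~36 GWh, matching the numbers above.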
Lifetime of panels and batteries vs modern nuclear plant? I guess also do the nuclear numbers use the real liability and disposal costs, or subsidized ones?
PV panels will produce for at least 25-30 years with limited degradation in output. At that time, consider what state of the art will be, and that you can repower an existing PV plant trivially; you de-energize segments, manual labor replaces panels, and you re-energize. Old panels will get shipped for recycling (shredded and materials sorted for reuse).
Providing "base load" is often touted as an advantage of nuclear power plants (NPP) here on HN. The reality is actually the opposite. As the International Atomic Energy Agency says[1]:
"Any unexpected sudden disconnect of the NPP from an otherwise stable electric grid could trigger a severe imbalance between power generation and consumption causing a sudden reduction in grid frequency and voltage. This could even cascade into the collapse of the grid if additional power sources are not connected to the grid in time."
Basically NPPs are designed to SCRAM for all sorts of reasons, then the sudden loss of multiple GW really ruins the grid managers' day. The first paragraphs of [1] make it clear that a large, stable, grid is a pre-requisite for NPPs not a result of NPPs.
It's not quite the same when failing to meet peak demand means pipes freeze or people die of heat exhaustion. There are areas in the country that are over a hundred degrees at night. It's going to take a lot of solar to cool all those homes.
Could we all live a different way, communally instead of in our own big boxes? It's physically possible but socially impossible. The truth is we'd rather burn the planet to the ground and we will. That's our nature, little use fighting it.
> When talking about the electrical grid you have to be able to generate energy in any amount whenever you want.
Not exactly. It is possible to manage the demand side to some degree.
For example, Octopus Energy in the UK has a "Intelligent Octopus Go" contract which offers much cheaper night rates, in exchange for giving up control over when and at what rate your EV charges. You just tell them what battery percentage you need by what time in the morning. They plan the charging within this constraint and get paid by the grid operator to balance the grid.
Another example are dynamically priced contracts where the prices vary hour by hour based on the day-ahead market prices. I have such a contract and I charge my EV only during the cheapest hours when other demand is low. Sometimes I postpone charging for a day or more because I have sufficient charge for my needs and I expect lower prices later, e.g. based on weather or upcoming weekend.
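The scheduling logic is simple enough to sketch; the prices and charger specs below are made up for illustration:

    # Pick the cheapest hours from day-ahead prices that cover the charge needed.
    import math

    def cheapest_hours(prices: list[float], hours_needed: int) -> list[int]:
        by_price = sorted(range(len(prices)), key=lambda h: prices[h])
        return sorted(by_price[:hours_needed])

    day_ahead = [0.12, 0.10, 0.08, 0.07, 0.07, 0.09, 0.14, 0.20,
                 0.22, 0.18, 0.15, 0.13, 0.12, 0.12, 0.13, 0.16,
                 0.21, 0.25, 0.24, 0.20, 0.17, 0.15, 0.13, 0.11]  # $/kWh, hour 0-23
    need_kwh, charger_kw = 30.0, 7.4
    hours = math.ceil(need_kwh / charger_kw)
    print(cheapest_hours(day_ahead, hours))  # -> [1, 2, 3, 4, 5] with these prices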
Network stability, night and bad weather hedging is a problem though with full solar/wind, isn't it? We still need a mix until tech is better to solve these problems.
Regulation plays a significant role in cost overruns. Also we (should) expect the nuclear plants to work for at least a century while the panels and wind turbines last what, 5-15 years assuming no major storms damage them?
Check your data: newish solar is expected to have less than 20% degradation over 30+ years, and afterwards it's still usable, though an exchange might be financially beneficial.
Wind turbines are rated for 20+ years, and that can be extended. The current generation is usually replaced because bigger models are available, with higher profit margins.
Design lifetimes for nuclear reactors are typically 30 to 40 years. Solar panels are often guaranteed to produce about 80% of their capacity after 20 years (and often exceed that) IIRC
> How much solar/wind/battery capacity can 34B USD buy!?
I'm guessing you'd run into space constraints pretty quickly. For one large project a nuclear reactor is incredibly compact compared to most alternatives except maybe hydro.
That's nice, but there is a long way to go. Also, it would probably be better to see batteries as part of the 'storage' bracket; after all, batteries don't generate electricity, they store it. It's a bit like lumping fields of grain in with bags for grain: the one is useful all by itself, the other can't function without some other generator doing its job first.
Typically the use for storage in the form of batteries is to play arbitrage with power prices around solar noon to farm it back out again when power is more expensive, for instance early evening or the next morning. Longer term storage also works but tends to be more capital expensive and isn't really an option just yet, though there are some edge cases where it may already work on (slightly) longer timescales.
I think it's actually brilliant to include battery together with energy generation.
We know that when renewables become more than 20% of power generation, every GW of renewables you add should be matched by a similar order of magnitude of batteries.
So if we start reporting this way, you can immediately see if we're building enough energy storage or not.
If you wanted to make the amount of power actually produced comparable, you could reduce the reported GW of both renewables and batteries by 50% so they add up to a realistic capacity number.
But capacity numbers are never really comparable anyway. Perhaps best to report raw numbers and assume readers are smart enough to understand that for every source you need to know the significance of what a "GW" means.
Like, a "GW" of a gas peaker plant does not mean much regarding how much energy it produces. That "GW" is actually much more comparable to battery energy storage "GW"
Sure, but batteries simply don't generate power: you store X kWh in them based on the rate of your charger and the rate-of-charge of your battery, and you can discharge them using an inverter (which, again, has a nameplate rating), limited by the rate of discharge of the batteries (usually pretty close to the charging rate, so this tends to be symmetrical).
You could have a massive storage with a relatively small inverter for discharge but a large charging capacity, the reverse and so on all highly dependent on the intended use case. So battery capacity doesn't say anything about the eventual grid capacity of the setup, for that you need to take into account all of the components (charger, battery size + charge/discharge rate + inverter). Only then you can put it into perspective.
Not in a power-grid context; it'll over-volt the grid sometimes and under-volt it other times.
In the specific context of "understanding generation supply to a grid", working in terms of "load supplied by a system of solar & batteries" makes the math drastically simpler.
Every modern inverter respects the local grid regulations, so overvolting and undervolting aren't an issue at all; and if they do happen, in principle they have absolutely nothing to do with solar power.
The cool thing about batteries is that they are great as an investment. You can charge them when power is cheap and sell back power when you have high demand and market prices for power.
Typically Peaker Plants fill this role but at much higher cost for electricity and emissions. Batteries just make much more sense for this kind of power.
This isn't really how batteries make money in the US right now, as I understand it. ~80% of battery revenue in ERCOT comes from what are called "ancillary services" [0] which is payment from the grid for being able to very quickly increase or decrease the amount of generation to ensure stability.
This revenue will quickly disappear, however, as there is limited need for ancillary services. Lithium batteries served this purpose well in PJM starting long ago (more than a decade?), even with very high battery prices of yesteryear.
New battery additions must be banking on limited ancillary services revenue. Unless Texas investors never bothered to learn the lessons of the storage experience in PJM, which seems unlikely to me.
Those early reg-d PJM batteries also got absolutely physically wrecked, so it wasn’t the best outcome.
ERCOT is quite different from PJM when it comes to AS - the market is very deep and the operational needs are increasing in a way they aren’t in PJM (yet). Couple that with a vastly easier permitting regime and hugely faster interconnection process for a facility (batteries) that require relatively little land compared to conventional generators, and Texas has enabled ERCOT’s queue to become absolutely stuffed full of battery applications.
PJM was ahead of the ball on market design, but that was a (relatively) long time ago. Now they’re in the midst of their massive backlog queue transition and also revamping (for the nth time) facets of their capacity market.
Why did Texas not start building out batteries for ancillary services until recently? Did/do they not need as much ancillary services as PJM? Also -- do you know what happened to ancillary service pricing as batteries became cheaper? Did grids become much more stable in practice?
That is beyond my knowledge; I know that PJM specifically set up a market for ancillary services that allowed battery operators to get paid. I assume ERCOT must have set up some sort of similar market, but I don't know the particulars...
Last time I ran the numbers for this, price arbitrage alone using lithium batteries is not gonna be profitable.
Current lithium batteries are just too expensive, store too little energy and degrade too quickly for true grid-scale storage.
That's why they typically offer services other than price arbitrage that actually make building them make sense. There's still no equivalent of a peaker plant using non-hydro storage.
Lots of energy companies across the globe seem to be disagreeing with you and are installing massive amounts of battery storage. Gas peaker plants are getting used a lot less on grids where this happens.
Reason: your numbers and assumptions are probably wrong.
One could have said the same thing 2 or 3 years ago about companies active in the renewable energy space like Orsted. I mean, they were one of the biggest producers of wind-turbines, what was there not to like in this ESG-focused investment environment? Until the numbers suddenly stopped making sense for them a few weeks ago [1].
Which is to say that the investment numbers may look "right" for a good few years until they suddenly don't and the reality catches up with all those involved.
It wasn't a "reality" check from having inaccurate estimates, it was rising interest rates that put a huge damper on the cost of projects where all the expense is front-loaded.
Higher interest rates are part of the game, or of reality, if you want to put it that way. So, yes, it was a reality check and it has proved that that company could only be viable under very particular economic circumstances (i.e. interest rates being close to zero)
Whoa, their viability as a company is not under question.
The only thing under question is whether future long-term agreements will include inflation protection.
When interest rates rise tremendously, long term investment gets pulled way back. That's the entire point of hiking interest rates, to make companies like Orsted slash their growth rates. It does not put Orsted at risk for collapse.
They've had a ~70% share price fall since the height of 2021, 40% lower compared to the summer of last year; of course their future (at least as an independent company) is under threat, and saying otherwise is just wishful thinking.
> That's the entire point of hiking interest rates, to make companies like Orsted slash their growth rates.
I highly doubt that that's the discussion being held at the meetings where the rates are being set, i.e. I've never heard the likes of the US Fed or of the ECB saying "we want to slow our most dynamic sector of the economy by increasing interest rates", but I could be wrong on that.
The question is whether such batteries' charge/discharge cycles degrade them faster than the investment pays back.
For home use, break-even is an acceptable outcome, but not for commercial use like the above. Is the price of batteries low enough at the moment to make a return on such an investment?
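A crude way to frame that question, with every input an assumption (and ignoring the ancillary-services revenue mentioned elsewhere in the thread):

    # Lifetime arbitrage revenue per kWh of storage vs. installed cost.
    cost_per_kwh = 400.0   # $ installed (assumed)
    cycle_life = 5000      # full cycles before end of life (assumed)
    round_trip = 0.88      # round-trip efficiency (assumed)
    spread = 0.05          # average buy-low/sell-high spread, $/kWh (assumed)

    revenue = cycle_life * round_trip * spread
    print(f"${revenue:.0f} lifetime revenue vs ${cost_per_kwh:.0f} cost per kWh")
    # -> $220 vs $400: with these inputs, pure arbitrage doesn't pay back,
    #    which is why batteries chase other revenue streams too.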
At this point I think you have to take into account battery recycling.
If you build a big battery storage system now, by the time it's fully degraded, battery recycling will be a massive and streamlined operation. Given the number of energy storage systems built today you'll have massive quantities of similar and easy-to-recycle cells going to these recycling operations.
So if you're a big grid operator you'll probably be looking at making a streamlined and efficient loop out of getting your old cells recycled and making new cells out of that material. The cost for the replacement storage system will be much lower, and given improvements in cell chemistries, the storage capacity will likely be higher.
Then again, it's possible that grid energy operators will transition to low-energy-density but cheap and durable chemistries, like Ambri's molten metal batteries, which essentially last forever.
These are all grid investments. About half of the storage is being installed in Texas, which is the closest thing the US has to an open market for investors.
The storage being installed in Texas is all being done purely for profit. Meaning that the investors have run the numbers and find batteries to be the highest return they think they can get for their money.
Storage in other places (specifically California) is being driven both by the profit motive, but also in some cases by legislation that mandates storage (not specifically batteries) be added as part of the grid mix. California has enough solar now that nearly all new installations include storage, in order to profit during the peak evening hours when electricity prices are highest.
It’s currently possible for batteries to be a great investment, but diminishing returns hit hard. The most cost-efficient setup is a solar farm where the panels produce DC that directly charges the batteries, so the loss from DC>AC conversion only happens once, before you send power to the grid. With the added benefit of only paying for a single set of DC>AC equipment.
However, the more such systems come online the less peaking power is worth and batteries aren’t competitive with current ultra low nighttime rates.
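For what it's worth, the conversion-loss argument is easy to sketch. The efficiency numbers here are typical ballpark assumptions, not measurements from any particular plant:

```python
# Compare battery-path efficiency of AC-coupled vs DC-coupled
# solar+storage. All efficiencies are ballpark assumptions.

inverter_eff = 0.97   # each DC->AC conversion (assumed)
rectifier_eff = 0.97  # each AC->DC conversion (assumed)
battery_eff = 0.92    # DC round trip through the cells (assumed)

# AC-coupled: panel DC -> AC -> DC into battery -> AC out to grid
ac_coupled = inverter_eff * rectifier_eff * battery_eff * inverter_eff

# DC-coupled: panel DC -> battery -> one shared inverter to the grid
dc_coupled = battery_eff * inverter_eff

print(f"AC-coupled: {ac_coupled:.1%}")  # ~84% of solar energy delivered
print(f"DC-coupled: {dc_coupled:.1%}")  # ~89%, plus one less set of hardware
```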
"The Gemini solar facility in Nevada plans to begin operating in 2024. With a planned photovoltaic capacity of 690 megawatts (MW) and battery storage of 380 MW, it is expected to be the largest solar project in the United States when fully operational."
That's about 60% of the capacity of one unit at Vogtle nuclear power station, and Vogtle has four units.
Also, adding the solar capacity to the battery capacity to get a gigawatt makes no sense. Either you're charging the battery, which takes power from the solar array, or you're discharging the battery because it's dark and the solar array is idle.
Gemini took 2 years and cost $1.3B. Vogtle cost $34B for 2 units and took 12 years for those 2 units.
Edit: Gemini took 4 years from project submission and 2 years from construction start. Vogtle took 19 years from project submission and 12 years from construction start.
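Put differently, using the figures above plus some assumptions of mine: a 0.30 capacity factor for solar-with-storage, and Vogtle's AP1000 units at roughly 1,117 MW each:

```python
# Rough $/kW comparison of Gemini vs Vogtle units 3 & 4, on both
# nameplate and capacity-factor-adjusted output. Ignores fuel, O&M,
# financing, and storage duration -- napkin math only.

plants = [
    # (name, cost in $, nameplate MW, assumed capacity factor)
    ("Gemini",     1.3e9,  690,      0.30),
    ("Vogtle 3&4", 34e9,   2 * 1117, 0.93),
]

for name, cost, mw, cf in plants:
    print(f"{name}: ${cost / (mw * 1e3):,.0f}/kW nameplate, "
          f"${cost / (mw * cf * 1e3):,.0f}/kW of average output")
# Gemini: ~$1,900/kW nameplate, ~$6,300/kW of average output
# Vogtle: ~$15,200/kW nameplate, ~$16,400/kW of average output
```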
In large part because of constantly shifting requirements, obstructionism, and various forms of lawfare. The US has become terribly inefficient at large infrastructural projects, see the CA HSR project which is massively delayed and over budget. China does not have this problem, and has been rapidly expanding their nuclear capacity.
A standardized reactor design and regulatory certainty would enable the economies of scale and accrual of institutional knowledge to efficiently build new power plants. In response to the 1973 oil crisis, France massively expanded their fleet of nuclear reactors which were producing more than 70% of their electricity in about 15 years. If the political will is there, it can be done.
There was no obstructionism or changing requirements to blame with Vogtle. It was all self-inflicted delays and costs by the designers and the builders.
France is experiencing exactly the same long delays and cost overruns today at Flamanville, despite having a welcoming regulatory environment, etc.
China also has lots of delays, and presumably cost overruns, but costs are a hard thing to pin down in China. China is barely expanding their nuclear program, only something like 50 new reactors are planned, a few orders of magnitude smaller than their plans for solar and wind.
CA HSR has experienced the obstructionism that Vogtle did not, but is still making good progress. The media narrative doesn't cover it well, but there are new sections completed all the time.
We have problems with big construction projects in the US, but nuclear takes those problems to the next level. And we have great non-construction alternatives for nuclear.
Gemini also qualifies as a large infrastructural project; it was delayed by concerns about the tortoises living in the area and challenges from local environmental groups.
China does similar projects in about 6 months. IOW, China does solar projects about 8X as fast as the US. OTOH it does nuclear projects in about 5 years, about 4X as fast as the US.
All the nuclear plants ever built, worldwide, have about the same combined capacity as the solar capacity installed just last year.
So your comparison doesn’t make much sense; nuclear is dead in the current energy market, it makes sense only for producing fissile material for nuclear bombs.
Looking at a single plant isn't very interesting. Solar and wind are fundamentally more distributed, so you're comparing apples to oranges. That distributed nature is both a good thing and a bad thing. Having just a few nuclear reactors go down at the same time in a single region can be problematic, as France recently experienced. Of course there are costs/upgrades associated with making a large share of renewables viable at all, but those come with a more distributed and resilient solution.
If you want to compare impact on actual energy generation, look at the growth in total net numbers:
The largest annual growth nuclear ever managed was around ~200 TWh/year, for some years in the 80s.
Solar grew by ~300 TWh from 2021 to 2022. Wind grew by ~200 TWh. And the growth rate is still increasing exponentially.
For reference the world population grew from ~5 to ~8billion from the 80s until now, if you want to take that into account when comparing the growth rates.
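Doing that per-capita adjustment with the round numbers above (my arithmetic, not an official statistic):

```python
# Normalize the growth rates by world population.
twh_to_kwh = 1e9

nuclear_growth = 200 * twh_to_kwh   # TWh/yr added per year, 1980s peak
solar_growth = 300 * twh_to_kwh     # TWh/yr added, 2021 -> 2022
pop_80s, pop_now = 5e9, 8e9

print(f"nuclear, 80s: {nuclear_growth / pop_80s:.0f} kWh/person/yr added")
print(f"solar, 2022:  {solar_growth / pop_now:.0f} kWh/person/yr added")
# ~40 vs ~38 kWh per person per year: per capita, solar's current
# growth already matches nuclear's historical best -- and solar is
# still accelerating, while that was nuclear's peak.
```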
It seems like this thread has sparked a lot of nuclear vs solar debate, but both have the potential to be net-zero energy sources, and most fault-tolerant grids are powered by a multitude of sources. We should be aggressively building solar, wind, battery, and nuclear, along with anything else that can reasonably be part of a net-zero supply chain that produces electricity in the not-too-distant future. They all have different tradeoffs, and we shouldn’t be debating which is better than the other. Rather we should focus on what percent of each fits best, balancing safety/resiliency/cost of adoption/long-term operating costs. And this varies widely by region and changes over time.
What’s needed are formulas and frameworks for helping make these wildly complex decisions more straightforward.
We are building solar and wind aggressively. The problem with nuclear power is, nobody manages to build it aggressively or even at enough scale to make a meaningful impact.
What we should do is keep existing nuclear plants running as long as economically and safely possible. Emphasis on economically and safely, because those two points meant that the German reactors did run as long as possible; they even got an extension granted by a Green minister.
That's absolutely not true. 68 GW nameplate capacity, or, say, 30 GW effective capacity, is a very small amount. The US consumed 4 trillion kWh in 2022 [1], which averages out to about 0.46·10^12 W; assuming a peak consumption of twice the average, that's roughly 0.9·10^12 W. 68 GW is 68·10^9 W, or about 7% of that peak (about 3% counting the 30 GW effective).
Even if my cocktail-napkin math is off by a factor of 2, that's still not much more than offsetting demand increases. And even if there were no demand increase - "aggressive" would mean 4x that amount, to be able to phase out fossil fuel power by 2030 or so.
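For anyone who wants to check the napkin, here's the same calculation spelled out; the 2x-average peak is the shaky assumption:

```python
# US 2022 consumption vs 68 GW of new nameplate capacity.
annual_kwh = 4e12          # ~4 trillion kWh consumed in 2022
hours_per_year = 8766

avg_watts = annual_kwh * 1e3 / hours_per_year   # kWh -> Wh, then / h
peak_watts = 2 * avg_watts                      # assumed peak = 2x average

new_nameplate = 68e9
print(f"average draw: {avg_watts / 1e12:.2f} TW")    # ~0.46 TW
print(f"assumed peak: {peak_watts / 1e12:.2f} TW")   # ~0.91 TW
print(f"68 GW is {new_nameplate / peak_watts:.1%} of assumed peak")
# ~7.5% of peak as nameplate, and roughly 3% after applying renewable
# capacity factors -- small against the scale of the whole grid.
```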
> nobody manages to build it aggressively or even at enough scale to make a meaningfull impact
China does. They will be completing 60-70 GW worth of reactors in the next decade, planning to take nuclear power from 2% of their electricity production to over 10% eventually. (While also building crazy amounts of solar, wind etc.)
You do know that 6-7 GW per year is close to nothing compared to the new wind and solar, not to mention other plants, China is installing? Nor does it change the general picture.
Presumably you can discharge the batteries while pv is also producing at full tilt.
But that'll rarely happen.
I absolutely hate the current state of energy reporting in general. The writers of these articles are not even trying to make sure they report accurately.
E.g. the battery stats: Is it MW or MWh? The two are very different and the distinction is important. A battery that can deliver 380 MW but only for 10 mins is pretty useless.
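The distinction is just energy = power × duration; a hypothetical 380 MW battery shows how much the missing MWh figure matters:

```python
# Same MW headline, very different grid roles depending on MWh.
power_mw = 380
for energy_mwh in (63, 380, 1520):   # hypothetical energy capacities
    minutes = energy_mwh / power_mw * 60
    print(f"{power_mw} MW, {energy_mwh:4d} MWh -> {minutes:4.0f} min at full power")
# ~10 min suits frequency regulation; 1 h covers short peaks;
# 4 h can shift solar output into the evening peak.
```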
This is the EIA. They are in charge of pretty much all energy tracking in the US. They are reporting the right units, and if they seem wrong, it's because the numbers are being used for something useful, but perhaps different than what you have in mind.
(That said, I do dislike units like BTU or "billions of kWh," but they are at least correct units for the quantity measured)
And now look at the amount of solar and wind installed vs. nuclear. Take net numbers, as most new nuclear capacity replaces decommissioned reactors in countries without a nuclear exit strategy.
I am too lazy to look up, and link, those publicly available numbers again...
I agree that summing the two values to a GW seems disingenuous, but there are more scenarios than the two you described.
For instance, it's possible to discharge the battery during the day as well. If peak load is in the afternoon, the plant could be charging the battery during the morning and mid-day when specific demand on that plant is less than 690 MW, then discharge the battery to get a (temporary) output of over 1 GW.
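A toy dispatch schedule shows the shape of that scenario (all hours and output fractions invented for illustration, hung on Gemini's ratings):

```python
# Morning: charge the battery off the array. Afternoon: discharge on
# top of full solar output. Toy numbers only.
solar_mw, battery_mw = 690, 380

schedule = [          # (hour, solar output fraction, battery mode)
    (9,  0.6, "charge"),
    (12, 1.0, "charge"),
    (15, 1.0, "discharge"),   # afternoon peak
    (18, 0.3, "discharge"),
]

for hour, frac, mode in schedule:
    solar_out = solar_mw * frac
    grid = solar_out - battery_mw if mode == "charge" else solar_out + battery_mw
    print(f"{hour:02d}:00  solar {solar_out:4.0f} MW  battery {mode:<9}  grid {grid:4.0f} MW")
# At 15:00 the plant briefly exports 690 + 380 = 1070 MW, the
# "temporary 1 GW" case, without ever charging from the grid.
```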
For the purposes of grid operation, batteries serve both generating and load functions.
Having a complete catalog of generating capacity is a standard metric that EIA tracks. So even though batteries can also consume from the grid, it really does make sense to add them to the list of "generation," at least according to the purpose that this generation metric has always served.
It’s even a bit funnier than you put it, since on net they consume electricity due to round-trip efficiency losses.
This can also provide something of an interesting discussion with local authorities when trying to site a battery somewhere that has banned new “generating” technologies (typically targeted at wind and solar), since batteries consume electricity and can be likened to transmission infrastructure (or physical trade assets, which I think some people are enjoying playing with).
In some scenarios the solar is only deployed because of batteries being deployed in the same network. The batteries are providing energy when the sun is down.
It seems that the thing they're measuring is the amount of instantaneous power that all the new "sources" could deliver at once, assuming they're able to deliver right then, and at max-output-power. In other words, it's an optimistic measure, but it's not meaningless.
I think the promising thing is that as we get more batteries on the grid, it will enable more renewables penetration as a percentage of the grid as a whole.
Yes, absolutely, this compound effect is very important. Also interesting: this effect improves with every EV that gets added to the grid, even the ones that can't deliver power back later, when it's economical for them to shift their charging. They effectively serve as capacitors.
Trends are often easier to understand from a rolling average or derivative.
Electricity might also seem fungible, but even under near-optimal conditions a rule of thumb is ~1% of power lost per 100 miles (~160 km) of transmission distance. https://en.wikipedia.org/wiki/Electric_power_transmission#Lo... Thus the value of any installation should be rated by its distance to loads and its load capacity under typical operating conditions. Solar is probably a good pair against AC units. Base load plants are still desirable.
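Compounding that rule of thumb over distance (purely illustrative; real losses depend on voltage, conductors, and load):

```python
# ~1% of power lost per 100 miles, compounded over the line length.
loss_per_100mi = 0.01

for miles in (100, 500, 1000, 2000):
    delivered = (1 - loss_per_100mi) ** (miles / 100)
    print(f"{miles:5d} mi: {delivered:.1%} delivered")
# Even 2000 mi delivers ~82% -- long lines are lossy but workable,
# so distance-to-load mostly shifts value rather than killing it.
```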
I saw that qualifier also. Does anyone know what percent of total capacity will be new next year (or better, what percent of total capacity will be solar next year)?
One thing I have wondered: does it make sense to just put most of our eggs in the solar basket in sunny states and then invest heavily in batteries and transmission lines to less sunny areas?
Wind seems so much more expensive and maintenance intensive.
Panel and battery prices have come down such that transmission losses, natural disasters hitting farms, last-mile power lines in storms, etc. become comparatively bigger concerns.
I think the end-use solar and battery model makes more and more sense every year from a consumer perspective.
Note that you don't necessarily have to put the panels and the batteries at the same place. You can, and there are benefits like DC <-> AC conversion, but you could in theory de-centralize solar more than you de-centralize battery storage. And there may be some scale advantages there, especially since some storage technologies involve putting things underground, or high temperatures which are a safety concern like with molten salt, etc. Caveat: I'm just a layman about this stuff.
A neat trend. But renewables are still a small fraction of electrical generation capacity in the US, which is dominated by gas, and even the renewable slice (< 20%) is dominated by hydroelectric, which hasn't changed in 50 years (it has even gone down a bit). Second is wind. Solar is a scrap of a scrap.
A long, long way to go to complete renewable energy.
The second link is interesting since it covers actual generated energy. The others are kind of convoluted and need to be clearer on what the capacity measurement is, since most renewables have significantly lower capacity factors than nuke, gas, and coal.
Which is why the battery uptake mentioned in this piece is of note. It firms renewables and steals grid service revenue from thermal generators. Pulls the uptake trajectory of renewables closer to vertical.
Only one coal-fired generator in the US is currently economical to run vs. new renewables. You don’t have to match capacity factor; it’s about economics: if renewables can run often enough, cheaply enough, they make other generators uneconomical even if those have superior capacity factors. It’s why nuclear is on life support in the US.
Isn't exponential growth where the growth rate becomes more rapid as the stock of the thing gets larger (and, more precisely, the growth rate is a function of the stock)? How is growth in solar capacity exponential? Wouldn't it actually be logarithmic, all other things equal?
The graph looks rather exponential (and decidedly _not_ logarithmic) to me.[0]
The story is actually "the growth rate becomes more rapid as you accumulate experience manufacturing, which also grows as a function of the stock." AKA Wright's Law.[1]
You may notice this is faster than your definition of exponential growth, because Wright's Law counts the decommissioned panels too. The industry still got that manufacturing experience, even if the panels don't count as 'stock' today. So technically the growth rate scales with total all-time production, not current stockpile size.
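A minimal Wright's Law sketch, assuming a ~20% learning rate (a commonly cited ballpark for PV, not a measured constant):

```python
import math

# Wright's Law: unit cost falls by a fixed fraction for every
# doubling of cumulative (all-time) production.
learning_rate = 0.20                 # cost drop per doubling (assumed)
b = -math.log2(1 - learning_rate)    # learning exponent, ~0.32

def unit_cost(cumulative, c0=100.0, q0=1.0):
    """Cost per unit after `cumulative` all-time units produced,
    starting from cost c0 at cumulative production q0."""
    return c0 * (cumulative / q0) ** -b

for doublings in range(0, 11, 2):
    q = 2.0 ** doublings
    print(f"{q:6.0f}x cumulative production -> unit cost {unit_cost(q):6.1f}")
# Ten doublings cut unit cost to ~11% of the original; cheaper panels
# fund more deployment, which is the feedback loop behind the curve.
```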
PV has been growing with a time constant short compared to the lifespan of PV modules, so most of the cumulative production is still in operation today.
This is also why PV recycling hasn't had a chance to really get going -- very limited potential input to the process yet.
Yes that's true. My point is simply that solar (in this early phase) does show exponential growth.
Spangry's "the growth rate becomes more rapid as the stock of the thing gets larger" was confused enough to made exponential growth sound vaguely absurd, but in fact it's perfectly expected (and indeed observed).
Because the more solar gets deployed, the less the costs are. So each increase in growth triggers a reduction in costs which then enables larger growth and so on.
There's no way solar can grow geometrically for long. It consumes real estate like there's no tomorrow. Land is limited; sunlight is limited. It has a hard, hard limit.
Unlike many other kinds of power generation, which are land-frugal and work 24 hours a day.
I suggest you provide actual evidence for your assertion. You will not be able to, because it is false. The land area in the US is enormous and land cost is only a small fraction of the cost of a PV field. And, of course, grids can be extended as needed.
Again with the hand-waving. "Land is big!" is no kind of argument or evidence. It all comes down to cost and practicality. Energy is not some game, it's business.
And with dropping panel cost, land is the last remaining hard nut in the equation. Not a small fraction, a large one, and becoming larger.
As for understanding math, the earth is not getting any bigger. That limit cannot be beat, no matter how breathless the rhetoric.
Now, try to find those level, unobstructed, stable, dry, cheap tracts of ground near grids, with access for maintenance. Near where folks need the electricity. And where the land isn't ecologically significant. And remember, storage or it's not helping for shit.
There are so many obstacles left! So many reasons it will slow down once the enthusiasm and price supports have dried up. Like everything else we've ever done.
I would have thought the US had several people capable of purchasing an area only a third of an Australian cattle station to corner the future market for solar energy production and build out HVDC delivery lines... but I forgot it's been all downhill since the space program.
The money is the same scale as oil & gas investments, so it's not exactly something that isn't being done already - the layout can be one big area or many smaller areas, the panels can be raised in the air to allow for crops | animals underneath, etc.
Roadsides would be good, poor quality agricultural land, lake beds that are drying up in the midwest and circa Utah, etc.
But yeah, you're right - too hard for the USofA that used to be great. /<sad>
Meanwhile, we're doing that here already - scaling up to power billion-tonne-per-annum mining operations to deliver resources to the rest of the world... where do you think China gets its iron, lithium, mineral sands, etc. from?
Isn't that a curious thing? The most successful businessmen on the planet, and they don't jump into large-scale solar like the pundits think they should?
Maybe that says something about the business model, hm? About how easy or how useful or how successful such an attempt would be.
Armchair energy experts can say anything, claim anything. But follow the money, that tells you what can work and what can't.
Not saying it will never work. But for now, large-scale solar is fraught with landmines. But I said that already and have been ignored, because it's easier to make wild claims than address hard realities.
When CO2 isn't being taxed, of course they don't jump directly into solar. It's like when emissions from a coal plant aren't penalized, they don't install scrubbers or filters. Never mind that the Clean Air Act caused reductions in emissions that were worth 40x the cost of the controls.
What you are just talking about there is that negative externalities are not controlled by market forces, but have to be regulated.
You are engaging in a stereotypical crank behavior here. Make a ludicrous claim without evidence, then demand detailed argument when someone calls you on it.
No, it is YOUR responsibility to argue your initial claim. Only when you have given that detailed argument can you require a detailed argument in rebuttal.
I remind you of Hitchens's razor: "what can be asserted without evidence can also be dismissed without evidence."
Luckily we'll have more than enough energy long, long before we run out of land or sunlight. We could satisfy a large part of our energy demand just by replacing energy crops by solar panels, with the added benefit of not needing fertilizer and pesticides.
If you are concerned about solar's use of land then you must be positively apoplectic about using corn for ethanol. That uses more than 40% of US corn production, an amount of acreage so vast it's difficult to comprehend.
The single most fragile ecosystem in the world is a desert. The casual way folks here throw that line out, again and again - shows a profound lack of understanding about what an ecosystem is.
Ultimately how useful is the high uptake in individual states? My understanding was that the US had a lot of weirdness going on about provision of utilities over state lines.
(Mind you, here in Oz I think we have our own state politics involved, but I guess at least we have fewer states)
Ah, it was probably me hearing about Texas and then extrapolating that to the rest of the US. Probably not a great idea on my part. Thanks for setting me straight :)
That's not the main use of grid batteries. The main use of batteries is 1) to soak up power that is otherwise not needed when there's too much of it. 2) to deliver that power back when there's too little of it. This is something that happens very often for typically short amount of times (hours). Batteries help smooth this out.
The mistake you are making is only thinking about when there's not enough power. The real challenge is dealing with the very regular situation that there's too much of it. That's energy that is wasted and lost.
Batteries improve the effective capacity factor of renewables (the fraction of their potential output that actually gets used).
So do electricity cables. Shortages and surpluses are highly localized. Germany, for example, has the problem that the demand is in the south and a lot of wind generation is in the north. So they are curtailing wind power when there's too much wind and firing up coal plants in the south, because they lack the cables to get the power from where there is too much of it to where it is needed. When Texas had its blackouts, other states had plenty of power. But Texas is not connected to those states by cables, so they had no way to get the power delivered. So blackouts happened.
Long-term storage is much less relevant currently and a market in its infancy. The overwhelming majority of grid batteries are for dealing with short-term dips and peaks in power generation. Most setups don't provide more than a few hours of power at best. But they can switch between charging and discharging in milliseconds and do both at high capacities.
This is why lithium-ion is popular in this space. It can deliver or soak up a lot of power very quickly. You can put cells in series or in parallel depending on the use case. You add more cells to deliver more power more quickly, not necessarily for longer. You can configure the same 1 GWh of cells to deliver 100 MW of power for 10 hours or 2 GW for half an hour. Most of these batteries are configured for high-capacity charging/discharging and relatively short storage.
There are some long-term storage solutions emerging as well, of course. Redox flow batteries are a good example: there's a fixed-size cathode and anode, and reservoirs of electrolyte that are pumped around. You can scale these by simply using larger reservoirs. They are cheap and can hold many days/weeks of power. Just add larger tanks. The caveat is that the power delivery is constant and typically low.
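The 1 GWh example is just the energy-to-power ratio at work (and note the correction below: duration comes from that ratio and the cell choice, not from series vs. parallel wiring as such):

```python
# Same 1 GWh of cells, different power ratings -> different durations.
energy_gwh = 1.0

for power_gw in (0.1, 0.5, 2.0):
    hours = energy_gwh / power_gw
    print(f"{energy_gwh:.0f} GWh at {power_gw:3.1f} GW lasts {hours:4.1f} h")
# 0.1 GW -> 10 h, 2 GW -> 0.5 h: the headline power figure says
# nothing about how long the battery can sustain it.
```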
Just a technical note, series/parallel has no effect on the power capability of a battery. This is largely linked to the specific cells chosen (whether it's a high energy or high power chemistry).
I think the confusion comes from associating more current capability (parallel) as meaning more power, but the same applies to voltage anyway, so it's not relevant.
I'm surprised that wind is such a small part of the US's projected future energy mix. Does anyone know why? Wind power works overnight, it leaves a lot more usable land than solar does, and there's a lot more capacity to be exploited. It's strange even considering the political backlash against wind power in some areas.
The US is a bit behind in catching up with the rest of the world on this front. There are some positive exceptions like Texas where they figured out quite early that it's cheaper to power refineries with renewables than with fossil fuel. So, Texas has a lot of solar and wind at this point. And why not, it's a sparsely populated state that is very suitable for tapping into both.
The complacency in the rest of the US of course has a lot to do with the fact that there's a very loud and active pro fossil fuel lobby that kept insisting coal was the future even while a lot of coal plants were going out of business. A lot of states invested in gas plants instead of wind generation because of that too. And now that renewables are clearly cheaper, a lot of those investments are starting to look pretty bad.
My knee-jerk thought is that you have to work significantly harder to secure land for windmills. They have to be distributed such that you require huge tracts of land and/or have to secure rights to install a tower on someone else's property. Each of those towers then needs additional transmission lines.
While solar might generate less energy per unit area, it is at least condensed so that you could get away with buying a plot and fully exploit the area. You can add additional (unconnected) plots as cheap land becomes available.
I could also see the advantage that solar has quite limited opex after installation. A fleet of windmills may require significantly more resources to keep operating after years of service.
Looking through LCOE+[^1] and references to old reports[^2], it looks like solar used to be significantly cheaper, but wind has seen a surprising dip in cost in the last 12-18 months. Onshore wind is now on par, if not cheaper, across the board. [^1]
Look at the stock prices of the companies installing, owning, and operating solar and wind farms. They are dogshit.
I think the boom is to capture the transmission system capacity near the best wind/solar sites, try to stay in business for the duration of the 20-year energy purchase agreement signed with the local utility, and then pursue mergers and acquisitions so that the larger entity can exert leverage upon renewal.
Power outages have a massive cost on society so the value of the marginal unit of electricity is very high.
Not necessarily a problem; recall that the way economics works is the market will have marginal businesses that are on the verge of being unprofitable to run - so seeing some borderline cases is not troubling.
It looks like solar is finally becoming economic, so it is reasonable that the marginal producers right now will be solar farms. But assuming the price trends continue, businesses that make good money will start to appear.
Although there does seem to be a serious risk to grid stability. Most of the energy emergencies in the last few years have been linked to areas that invested heavily in renewables.
Indeed. Arguably it shows just how good for the economy wind and solar are - they are so easy to install and build that companies that run businesses doing that just don't have any real edge.
By discharging energy when the grid operator calls for it, having charged when there was excess energy on the grid (typically what would’ve been wasted as curtailed renewables).
The closest analogy I can recall to all these tireless fission stans are the OS/2 fans who refused to shut up even after Windows was dancing on its grave. It is truly embarrassing for the tech commentariat to be so blinded to reality.
> If you want to know how much electricity they will generate, you have to multiply by the capacity factor. Nuclear, for instance, has a capacity factor of 0.9 in the US: it delivers 90% of its nameplate capacity in a given year.
Solar ranges between 0.08 and 0.15.