> but grid access fees should be equal for every household
No, they should vary depending on the grid connection type. Someone with a three-phase max 100A per phase connection should pay more than someone with a single-phase max 50A per phase connection.
Because the infrastructure can only handle a finite number of users, depending on their connection types.
If you’re running a 50kW kiln (or more realistically 4 x 12kW) in your back yard then your annual use may not be that high but the grid needs to handle larger demand spikes. Paying per kWh alone doesn’t adjust for the relevant infrastructure.
Demand charges (peak kW rather than kWh) are very common in commercial and industrial.
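For anyone who hasn't seen one of these tariffs, here's a minimal sketch of how the billing math works; the rates and the `monthly_bill` helper are made up for illustration:

```python
# Hypothetical tariff: a flat energy rate plus a charge on the monthly peak.
ENERGY_RATE = 0.12   # $/kWh, made-up number
DEMAND_RATE = 15.00  # $ per kW of monthly peak demand, made-up number

def monthly_bill(interval_kw, hours_per_interval=0.25):
    """interval_kw: average kW over each 15-minute metering interval."""
    energy_kwh = sum(kw * hours_per_interval for kw in interval_kw)
    peak_kw = max(interval_kw)
    return energy_kwh * ENERGY_RATE + peak_kw * DEMAND_RATE

# A 50 kW kiln fired for 2 hours a month adds roughly 100 kWh (~$12 of
# energy) but 50 kW * $15 = $750 of demand charge: the spike dominates.
flat = [2.0] * 2880                 # steady 2 kW for 30 days
spiky = [2.0] * 2872 + [50.0] * 8   # same month plus one 2-hour 50 kW firing
print(monthly_bill(flat), monthly_bill(spiky))  # ~$202.80 vs ~$934.32
```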
It's very rare for a residential house to have a 50kW kiln, so those kinds of spiky loads basically balance out over a subdivision.
Although interestingly, an unintended side-effect of aggressive time-of-use pricing is the entire subdivision has programmed their air conditioners or EV chargers to turn back on right at 9pm when peak pricing ends. That kind of unintended mass-coordination DOES create sometimes-problematic high demand spikes.
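A toy simulation of that rebound effect (the household count, charger size, and schedules are all invented):

```python
# Compare naturally-spread EV charging starts against everyone waiting for
# the 9pm end of peak pricing. All numbers are made up for illustration.
import random
random.seed(0)

N_HOMES = 500      # hypothetical subdivision
CHARGER_KW = 7.2   # a typical Level 2 charger
DURATION_H = 4     # hours each charging session runs

def total_kw(hour, start_hours):
    return sum(CHARGER_KW for s in start_hours if s <= hour < s + DURATION_H)

uncoordinated = [random.uniform(17, 23) for _ in range(N_HOMES)]  # natural spread
coordinated = [21.0] * N_HOMES                                    # all wait for 9pm

for hour in range(17, 25):
    print(f"{hour}:00  spread={total_kw(hour, uncoordinated):7.1f} kW"
          f"  synchronized={total_kw(hour, coordinated):7.1f} kW")
# The synchronized case slams the feeder with 500 * 7.2 = 3600 kW at 21:00.
```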
Ultimately, there's a need for customer-specific real-time pricing that's responded to by behind-the-meter "power control systems"... managing flexible loads, in a way that doesn't negatively affect occupants.
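As a rough sketch of what such a system might do, with hypothetical load names, a made-up price threshold, and invented comfort deadlines:

```python
# A behind-the-meter controller: shed only flexible loads when the real-time
# price is high, never loads that affect occupants. Everything here (load
# names, the 0.15 $/kWh threshold, deadlines) is an assumption.

FLEXIBLE_KW = {"ev_charger": 7.2, "water_heater": 4.5, "pool_pump": 1.1}
CRITICAL_KW = {"fridge": 0.2, "medical": 0.5}  # never shed

def loads_to_run(price_per_kwh, slack_hours):
    """Run a flexible load only if power is cheap or its deadline is near."""
    active = dict(CRITICAL_KW)  # critical loads always run
    for name, kw in FLEXIBLE_KW.items():
        if price_per_kwh < 0.15 or slack_hours[name] < 1.0:
            active[name] = kw  # cheap power, or can't defer any longer
    return active

# At $0.40/kWh the water heater and pool pump wait, but the EV still charges
# because the car has to be ready within the hour.
print(loads_to_run(0.40, {"ev_charger": 0.5, "water_heater": 6.0, "pool_pump": 12.0}))
```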
> If you’re running a 50kW kiln (or more realistically 4 x 12kW) in your back yard then your annual use may not be that high but the grid needs to handle larger demand spikes. Paying per kWh alone doesn’t adjust for the relevant infrastructure.
There are two alternatives here. One is that your spiky demand doesn't correlate with others, in which case it's irrelevant because although 50kW is a lot for one house it's really not a big deal for the power grid. The other is that it does correlate with other spiky demand, at which point you charge everyone a higher price per kWh at those times.
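A quick simulation makes the uncorrelated case concrete (the house count, spike size, and probability are all made up):

```python
# 1000 houses, each drawing a 50 kW spike 1% of the time on top of a 2 kW
# base. Independent spikes barely move the aggregate; synchronized ones add
# the full 50 MW.
import random
random.seed(1)

HOUSES, BASE_KW, SPIKE_KW, P = 1000, 2.0, 50.0, 0.01

def aggregate_kw(synchronized):
    if synchronized:
        spiking = HOUSES if random.random() < P else 0  # all spike together
    else:
        spiking = sum(random.random() < P for _ in range(HOUSES))
    return HOUSES * BASE_KW + spiking * SPIKE_KW

indep = [aggregate_kw(False) for _ in range(1000)]
synced = [aggregate_kw(True) for _ in range(1000)]
print(min(indep), max(indep))    # stays within a few hundred kW of 2.5 MW
print(min(synced), max(synced))  # swings between 2 MW and 52 MW
```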
Charging based on regional demand works great on the production side, but residential, industrial, and commercial customers experience different peak times, so distribution demand at the substation level doesn't necessarily correlate well.
You could model things with individual per-customer, per-minute rates, but charging based on connection size is a lot simpler.
Distribution costs and production costs are different things.
The grid itself needs to be paid for as does electricity production. A random neighborhood that wants a huge number of Christmas lights isn’t an issue from the production side, but it can be a real issue in terms of distribution without using enough kWh to pay for that infrastructure.
Is that true in practice, and is it more than simply proportional to the power consumed? In the US, most new residential electric services are 200 amps at 240 volts, so the maximum power that could be drawn is 48 kW. It seems like the potential variability from a single home is already enormous. Conversely, someone with a high-capacity connection could have very regular usage patterns.
The US is very different from Europe in this sense. My house uses a small fraction of what an equivalent family home in the United States uses. On the coldest days in winter I might go through 10 kWh of electricity and 15 cubic meters of natural gas. During a summer day, gas usage will be < 0.5 cubic meters (mostly for shower water) and 60 kWh or more of electricity will be returned to the grid.
Someone with smaller service won't be able to create as much variability in load as someone with a much more beefy hookup.
>Someone with smaller service won't be able to create as much variability in load as someone with a much more beefy hookup.
I think we are in dangerous waters when we base public policy on the "ability" to have an impact as opposed to the "actual impact". I think it is a genuinely interesting question whether, and how much, this variability contributes to grid capacity requirements.
You can basically think of individual variability as noise on an analog signal.
Does single user variability average out, and if so, on what scale?
How does this variability compare to other amplitude changes, like aggregate or seasonal daily use patterns?
I think it is entirely possible that this noise is negligible at most scales, but I obviously don't have the data.
However, someone with the actual data could easily do an ANOVA evaluation, and see what the actual numbers are.
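For what it's worth, here's roughly what that evaluation could look like with scipy on synthetic data standing in for real meter readings (the group means and variances are invented; a Levene test is arguably the more direct check, since the question is about variances rather than means):

```python
# One-way ANOVA across hypothetical connection-size groups, plus a Levene
# test since the question is really about variance, not mean demand.
import numpy as np
from scipy.stats import f_oneway, levene

rng = np.random.default_rng(42)

# Simulated daily peak demand (kW) per household; replace with real data.
small = rng.normal(loc=3.0, scale=1.0, size=200)   # e.g. 1x25A service
medium = rng.normal(loc=3.2, scale=1.5, size=200)  # e.g. 1x50A service
large = rng.normal(loc=3.5, scale=3.0, size=200)   # e.g. 3x63A service

anova = f_oneway(small, medium, large)   # do the group *means* differ?
spread = levene(small, medium, large)    # do the group *variances* differ?
print(f"ANOVA:  F={anova.statistic:.2f}, p={anova.pvalue:.4f}")
print(f"Levene: W={spread.statistic:.2f}, p={spread.pvalue:.4f}")
```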
It's simple physics: actual impact follows ability. In other words, you don't ask for a hookup larger than the one you intend to use, because you already pay more per month for that larger hookup.
You could have some unusually heavy usage during off-peak hours, and that doesn't require any additional grid infrastructure because there is already plenty of capacity during off-peak hours. If you want to use the same amount of power during peak hours, that does require more grid capacity, but in general the way to handle that is by charging a higher price per kWh during peak hours, giving everyone an incentive to use less then (and charging them appropriately if they don't or can't).
Impact does not follow ability, it is the exact opposite when we are talking about variability.
Total demand for power increases linearly with the number of users.
Percent variability of demand decreases with the number of users and approaches a limit of zero.
If we are talking about sizing power infrastructure, the capacity required increases with the number of users, but the safety factor required decreases with the number of users.
At the margin, infrastructure cost scales with per capita power usage, not with individual variability. The variability cancels out.
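You can watch that cancellation happen in a few lines of simulation (the base load, spike size, and probability are made up):

```python
# Aggregate demand grows linearly with the number of houses, while the
# percent variability of the aggregate falls off roughly as 1/sqrt(N).
import random
random.seed(0)

def demand_stats(n_houses, trials=2000):
    """Each house: 2 kW base plus a 10 kW spike 5% of the time (invented)."""
    totals = []
    for _ in range(trials):
        totals.append(sum(2.0 + (10.0 if random.random() < 0.05 else 0.0)
                          for _ in range(n_houses)))
    mean = sum(totals) / trials
    std = (sum((t - mean) ** 2 for t in totals) / trials) ** 0.5
    return mean, 100.0 * std / mean

for n in (1, 10, 100, 1000):
    mean, pct = demand_stats(n)
    print(f"{n:5d} houses: mean {mean:8.1f} kW, std = {pct:5.1f}% of mean")
```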
Not California, but in general terms this is not accurate. My local grid connection provider (the largest in the country), does not differentiate between 1x25A and 3x63A in cost. It's the same price.
That's a pretty big difference in available power for the same price. (5.8~43.5 kW)
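For anyone checking the arithmetic, those figures assume roughly 230 V per phase:

```python
# Connection capacity = phases * amps per phase * volts per phase.
def connection_kw(phases, amps, volts=230):
    return phases * amps * volts / 1000

print(connection_kw(1, 25))              # 1x25A -> 5.75 kW (~5.8)
print(connection_kw(3, 63))              # 3x63A -> 43.47 kW (~43.5)
print(connection_kw(1, 200, volts=240))  # the US 200 A service upthread -> 48 kW
```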
But the only differing cost in that scenario is the two extra conductors to the local transformer, or more likely just to the edge of the property. The three-phase power is still present in the area, as ideally alternating houses are on alternating phases to balance the load across the phases. The difference in cost would be a one-time hit during installation, and the ongoing maintenance would be the same as for three separate houses (one house per phase). That maintenance could be rolled into the cost per watt, so the more you have available, the more you can use and the more you would pay.
The installation cost should vary based on what the house wants access to, and the ongoing cost should be the same for every household. A standing charge for the cost of the infrastructure existing is ridiculous when that same infrastructure is what the power company relies on to deliver their chargeable commodity. It's effectively double dipping: how is it any different from ISPs charging for access and then charging for data on top?
I think an important consideration is that the overall grid is not designed for all houses to have the higher-capacity connections, so with enough such connections the utility is forced to make massive changes to the infrastructure.
That's not to say utility companies don't make obscene profits instead of reinvesting much of them into the infrastructure.
Regarding the ISP, it's really the same argument. If I want 2 Gb/s on a neighbourhood line that supports a maximum average service size of 1 Gb/s, then they would be forced to upgrade their lines to serve me. Granted, unlike power grids, they're liable to just not do that, let service quality degrade, and point to the "up to X Gbps" clause.