DC home electricity. If your future home has solar and local storage, and your domestic usage is mostly in appliances which convert back down to DC immediately behind the plug socket (most of us have this), then at some point we're going to start wanting to power our homes like houseboats or camper vans.
Yes, this won't help for (eg) appliances with heating elements; I'm probably talking about a second discrete wiring loop rather than a total replacement, and it's hard right now to find a TV with its DC transformer on the outside.
But it'd be quite a lot more efficient for almost everything else most of us do. Look around the room you're in now and count how many things use more than 19V DC internally. Where I'm sitting right now, it's None.
> and your domestic usage is mostly in appliances which convert back down to DC immediately behind the plug socket (most of us have this) then at some point we're going to start wanting to power our homes like houseboats or camper vans.
Most of these appliances use low voltage which travels poorly over long distances. Your car, camper van, and boat all use ultra-thick cables to move 12 volts. This is quite uneconomical for anything larger than a small studio apartment. (Copper isn't cheap.)
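To put rough numbers on the copper problem, here's a back-of-the-envelope sketch (the resistivity figure and the 3% drop budget are assumptions for illustration, not a wiring-code calculation):

```python
# Minimum copper cross-section needed to deliver 1.5 kW over a
# 15 m one-way run with at most a 3% voltage drop, at various
# supply voltages. Illustrative only.

RHO_CU = 1.68e-8  # ohm * m, resistivity of copper at ~20 C

def min_cross_section_mm2(power_w, volts, run_m, max_drop=0.03):
    """Minimum conductor cross-section (mm^2) for an out-and-back run."""
    current = power_w / volts
    max_r = (volts * max_drop) / current      # allowed loop resistance
    area_m2 = RHO_CU * (2 * run_m) / max_r    # conductor goes out and back
    return area_m2 * 1e6

for v in (12, 48, 120):
    print(f"{v:>3} V: {min_cross_section_mm2(1500, v, 15):8.1f} mm^2")
```

The required cross-section scales as 1/V², which is why whole-house 12 V wiring gets expensive fast, and why 48 V is a much more plausible compromise.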
Furthermore, note that I said "12 volts," which is what cars and campers use. (Not sure about boats.) Some DC appliances need 5 volts, some need 20... They'll all need converters.
So how are both of those problems solved? You'll probably send 100-200 volts, DC, through the wall! The big question is, does this really simplify anything? The big advantage with AC is that it's super-easy to change voltage with a simple transformer. What do we gain by going DC in the walls? Are there any real advantages in simplifying voltage conversion at appliances? Is it worth the added complexity of a whole-house AC-DC converter, or the complexity of a DC grid?
My thoughts exactly. AC has some important electromagnetic characteristics that make it a lot easier to transport. Particularly for long-distance lines, high voltage is critical, and transformers make it trivial to change voltage levels, such as stepping down to 120V for the house. It's also a lot easier to convert AC to DC than the other way around. If we start transporting with DC, we take on a number of problems/challenges that we don't deal with now.
I could see a point in time where there are AC outlets and DC outlets in a house depending on where the power comes from (power lines vs. solar panels/battery), but unless we radically decentralize (which I don't see happening) it seems unlikely to me that we switch to DC for long-distance power transmission.
Would it make sense to think about upping the line frequency? IIRC Engineering 101 said that 50~60 Hz are the frequencies most dangerous to humans; choosing a higher line frequency would be safer to work with and would (natch!) permit smaller, lighter transformers -- that's why aircraft use 400 Hz.
> it's also a lot easier to convert AC to DC than the other way around. If we start transporting with DC, we take on a number of problems/challenges that we don't deal with now.
Which brings up a very good point: What happens when grid-scale battery storage is common? Does a DC grid make a lot more sense then?
Isn't the safety story here actually a point against AC? At 50/60 Hz the alternation causes tetanic muscle contraction, so you often can't let go of a live conductor you've grabbed, whereas the let-go threshold for DC is much higher.
It seems that, other than a few power-hungry appliances (oven, kettle, washing machine, heat pump, water heater, gaming PC), the rest are low-power digital devices that need up to about 200W (the PS5 console's peak reported usage; even large TVs use much less than that).
I would love to see a comprehensive study of a home designed and built around the concept of using two energy sources: AC and 48V DC, backed by battery storage, with the power grid as backup for smaller installations or northern climates.
Would it make sense to do that on a large scale? Having a smaller, energy-efficient house should limit the need for long copper cables, and we would also exclude all those AC-DC converters from today's devices - leaving us with something similar to USB-C PD (working in the range 5-48V, which of course is still a converter, but a standardized DC-DC one).
If I am correct, the main advantages would also include not running the solar inverter all the time but only when there is a need for a lot of power (where it should be much more efficient), thus also extending its lifespan.
Having said that, I do not have enough knowledge to judge whether the possible gains would warrant going in this direction for future home installations. I would very much appreciate any comments and maybe some further reading material.
Sure, all my things may work with 12V DC internally, but they expect 120V AC (or w/e depending on country) at the plug, so I wouldn't be able to use them if I switch.
The idea being that consumer items will, more and more, be able to take DC directly instead of expecting AC.
For example, any device that takes in USB power through a 120V AC "wall wart" can just be used with a buck converter plug instead, or be powered directly from the DC supply.
LED lighting is taking in AC then converting to DC to power the LEDs. Your phone is taking in AC then converting to DC to power it. Your laptop is taking in AC then converting to DC to power it.
There are some household machines which will require some heavy-duty power draw, but most of the consumer products we use are powered off of DC to begin with. Powering directly off of DC would cut out the "middle man" of AC.
Power conversion efficiency is at best the 3rd or 4th most important factor here.
The first factor would be the hodgepodge of wall dongles one needs to own and maintain (plus the cost of buying a dongle for each device that doesn't have one, or multiple of them per device in case you want to charge your phone/laptop/etc in more than one location at home).
The second factor is the "smoothness" of your DC sources. Most common LED lamps have a pretty ugly output waveform, nowhere near a flat DC line. This is mostly unavoidable: converting AC to smooth DC is more expensive than converting AC to DC with a ton of 120Hz, 240Hz, ... ripple on top, so common LED lights tend to opt for cheaper electronics. People notice flickery LED lights to various degrees (some get headaches, some outright see the flicker, some claim to be totally oblivious to the difference). DC "quality" also affects some fairly sensitive electronic devices, which is why some AC->DC adapters are fairly sophisticated. A central, high-quality AC->DC converter (combined with DC wiring) scales better when you need to care about smoothness (it can be a basic quality-of-life matter for some people).
The third and fourth factors are power dissipation and conversion efficiency. They are two sides of the same thing, with two remedies: more $ to remedy the inefficiency (which is really small these days, if you go for switching converters), and designing so the heat dissipates properly (devices end up with pretty hot adapters).
It varies pretty wildly, often efficiency ends up being dependent on the load since power supplies usually get optimized for a certain load range. Individually, the numbers might not look too bad, but when you think about how many individual AC->DC supplies you have, the losses can add up.
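As a toy illustration of how the per-adapter losses add up (the device list and efficiency figures below are invented for the sketch, not measurements):

```python
# Sum of AC->DC adapter losses across a handful of typical devices.
# All loads and efficiencies here are assumed, illustrative values.

adapters = [
    # (name, average delivered load in W, efficiency at that load)
    ("phone charger",  5, 0.80),
    ("laptop brick",  45, 0.90),
    ("router",        10, 0.85),
    ("tv",            60, 0.88),
    ("led lighting",  40, 0.85),
]

total_load = sum(w for _, w, _ in adapters)           # W delivered to devices
total_draw = sum(w / eff for _, w, eff in adapters)   # W pulled from the wall
loss_w = total_draw - total_load

print(f"delivered {total_load} W, drawn {total_draw:.0f} W, "
      f"lost {loss_w:.0f} W ({loss_w / total_draw:.0%})")
```

Even with fairly generous per-adapter efficiencies, the aggregate loss lands in the 10-15% range for this made-up household.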
I've been involved in a side project developing a consumer-friendly rating of "power quality" for AC devices and AC->DC power supplies which summarizes efficiency over a range of loads, as well as incorporating power factor measurements. We've been testing common devices such as USB power supplies for phones and such, as well as things like laptop power supplies, due to how numerous they are. We've had a few surprises, for example, Apple power supplies generally don't fare that well.
Power Quality Score: https://pqs.app/
Detailed test data is public for some devices but not all, since we're trying to find paths to revenue starting with subscriptions for full test results. Let us know if you have feedback.
You might also be interested in the Youtube channel of my friend/PQS collaborator, where he's done some "deep dive" videos of testing some of the devices in the PQS database- particularly AC->DC USB power supplies due to how ubiquitous they are now- https://www.youtube.com/c/AllThingsOnePlace/ .
I'm not sure I'm really the person to answer this, but I would guess inverters and buck/boost converters are in the 90%-95% range. So chaining DC -> AC -> DC loses roughly anywhere from 10% to 19% end to end.
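A quick sanity check of the chaining arithmetic, taking the 90-95% per-stage figures as assumptions:

```python
# Chained conversion losses: multiply the stage efficiencies,
# then see what fraction of the input power never arrives.

def chain_loss(*stage_effs):
    """Fractional loss after chaining several conversion stages."""
    total = 1.0
    for eff in stage_effs:
        total *= eff
    return 1.0 - total

# DC -> AC (inverter) followed by AC -> DC (adapter), two stages:
print(f"two 90% stages: {chain_loss(0.90, 0.90):.1%} lost")
print(f"two 95% stages: {chain_loss(0.95, 0.95):.1%} lost")
```

Two 90% stages lose 19%; two 95% stages lose just under 10%. Adding a third conversion stage (e.g. a downstream buck regulator) pushes the worst case toward 27%.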
I think the better argument is one for reduced 'hardware complexity'. Instead of having an inverter that then goes through a rectifier, all you need is a buck converter.
Yes, it causes issues today. But tomorrow, I'm expecting that the transformer will be outside the device (as now with wall warts and most laptops) rather than inside (as with your TV). And that minority who are handy enough with tools can patch past the internal transformer with a soldering iron and a screwdriver in the meantime.
> I'm probably talking about a second discrete wiring loop rather than a total replacement
I have considered this sort of thing for the basics around the house (in my head at least). A separate lighting loop in each room + outside, a comms cupboard, and some USB/USB-C ports. Could be all powered by a couple of car batteries and not a lot of solar panels.
100% would do this sort of setup if I built a home office shed, but otherwise the plans remain in my head.
(Sorry, replying to myself but by way of example I recently discovered that "old"-style UK plugs -- BS546 ones -- are still rated for DC domestic supply. So in at least one or two corners of the world this whole notion is already supported with a semi-familiar interface.)
Ooh, fascinating. AC is better for long-distance transmission, which solar + local storage obviates. Do you happen to know the efficiency we might stand to gain from switching appliances to DC?
So you've two competing losses: transformers and voltage drops over long DC circuits.
When I was last looking into this myself -- and lamenting that you couldn't find PoE LED lights for love or money -- it sounded like (IANAEE) voltage drops start becoming a thing you have to care about around 50 or 100 feet, depending on the gauge; wiring a house with DC isn't impossible, but it requires a bit of care.
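To put a rough number on that (IANAEE either; this assumes 14 AWG copper at ~2.525 ohms per 1000 ft per conductor and a 100 W load, both illustrative choices):

```python
# Percent voltage drop on a DC branch circuit vs run length,
# for an assumed 14 AWG copper pair feeding a 100 W load.

R_PER_FT_14AWG = 2.525 / 1000  # ohms per foot, single conductor

def drop_percent(volts, watts, run_ft, r_per_ft=R_PER_FT_14AWG):
    """Percent voltage drop over an out-and-back run of run_ft feet."""
    current = watts / volts
    drop = current * r_per_ft * 2 * run_ft  # conductor out and back
    return 100 * drop / volts

for run in (25, 50, 100):
    print(f"{run:>3} ft @ 12 V: {drop_percent(12, 100, run):5.1f}%   "
          f"@ 48 V: {drop_percent(48, 100, run):4.2f}%")
```

At 12 V the drop is already well into double digits by 50 ft, while at 48 V it stays around 1-2% even at 100 ft -- which matches the intuition that a whole-house DC loop wants 48 V or more.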
Being honest it's not a massive amount of loss if your entire supply is coming from the grid. But if you're generating and storing energy at home, then it's much more significant because you add the transformer's losses to the additional losses spent in your inverter.
You need to move to higher voltage DC, which will present some challenges you don't get with AC; for example, if you get an arc in a switch or a circuit breaker with AC it normally extinguishes quickly at the current's zero crossing, whereas DC will just keep arcing until stuff starts to burn.
Long-distance HVDC has a small set of use cases where it is more efficient than HVAC. But yes, in some cases it can work better over distances of hundreds of km.