So Where’s My Low Voltage DC Wall Socket? (hackaday.com)
81 points by kyancey on Dec 16, 2019 | 113 comments



We don't need DC sockets, as they don't solve any problems. The stupid-simple AC mains distribution system is well tested and proven, and it provides up to 3 kW per socket. The "last meter" problem is perfectly solved using AC-DC wall warts and bricks. If the power supply dies, you buy a new one and just plug it in. No electrician required. Putting things in the wall or in central distribution is not a solution to any existing problem and introduces needless cost and complexity.

A better solution would be to get everyone on board with a single DC voltage and connector. Then we can buy larger multi socket bricks that are just AC-DC power strips. You can then add batteries and give it UPS functionality. But good luck getting everyone to agree on a single standard.


Wait, that exists. USB is the standard. The UPS with standardized DC-outs is a USB power bank with integrated pass-through charging.


USB is kind of a physical standard. I have a few things like an alarm clock that uses USB, but I feel like the lower build quality is way more obvious than if it just had a wall wart.

The problem with calling USB a DC power standard is that it often doesn't work. I have a lot of trouble trying to charge my PS4 controller using random USB cables when it's attached to my PS4--I'm not even introducing an a/c adapter. I've had things like Qi chargers where I had an insufficient a/c adapter so they behaved poorly. If I'm at an airport and need a quick charge on my phone I'm definitely not going to use their USB port because if it works it'll likely charge slowly.

USB-C is worse with all of this. With AC power, incompatibility is only an issue when traveling to other countries, and in many cases a physical adapter is good enough.


Slightly OT, but have you managed to actually find an acceptable-quality USB power bank (PD/Type C or otherwise) that actually supports this "UPS mode" of operation? Most of the ones I have tried cannot simultaneously output and input, or introduce a second of switching delay in between.


There's some discussion about pass-through charging on the Anker forum boards [1]. Pretty much everyone I know who carries a power bank only remembers to charge it at the same time they want to use it, so pass-through charging capitalizes upon this consumer behavior to use the power banks more frequently.

The possible dark pattern I see operating instead: manufacturers ship power banks to retail shelves already charged, and likely don't mind if people keep buying new units because they can't wait for the units they already have to charge up.

I tend to keep all my gear plugged into chargers whenever I'm stationary because I know all of them have charge protection circuits and I have a work profile closer to that of a digital nomad than an office-worker, so I'm an outlier with that common consumer behavior of only charging when one must.

[1] http://community.anker.com/t/any-powerbanks-that-allow-drain...


Also PoE.


> A better solution would be to get everyone on board with a single DC voltage and connector

Why? For many things 6V is enough as it will downscale without too much loss to 5V/3V3 which is what most electronics run on... but everything that draws a boatload of power (computers, laptops, LED strips, ...) will go on 12V up to 24V, so if you want a standard it's either 12 or 24V which will introduce a lot of regulation waste in the form of heat in the device.


Why? Routers, switches, small embedded computers, and a myriad of devices already use 12-24V DC and step it down internally to 3.3V or less for chip use. Laptops and even phones/tablets are no different. And conversion losses don't increase with voltage so long as you design your DC-DC converter properly.

5-6V is fine for small loads of around 10 watts, but you then have to deal with line losses. A 5 m / 16 foot 24 AWG USB cable will lose 38.5% of its power to line losses at 5V, 2.5A (a measly 15 watts). If you wanted to get a low 2-3% loss for 5V, 2.5A at 5m cable length, you would need 12 AWG, which is massive. 24V does better, but if you wanted 50+ watts you again have voltage drop issues.

So in conclusion, low DC voltages are not very practical for power cable runs of over 2 meters. Though it would be nice if the last meter could be more unified with a single DC voltage and connector (my vote is for 24V). Then we can consolidate multiple devices to one strip/brick and call it a day.
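For anyone who wants to play with the numbers, here's a minimal sketch of the I²R arithmetic behind those figures, assuming published copper resistances per gauge (real cables vary, and the exact loss fraction depends on how you model the source):

    # Rough I^2*R line-loss estimate for a DC power cable.
    # Resistance per metre of copper by gauge (approximate published values).
    AWG_OHMS_PER_M = {12: 0.00521, 16: 0.0132, 20: 0.0333, 24: 0.0842}

    def line_loss_fraction(volts, amps, length_m, awg):
        """Fraction of the source power dissipated in the cable itself."""
        r = AWG_OHMS_PER_M[awg] * length_m * 2  # two conductors: out and back
        return (amps ** 2 * r) / (volts * amps)

    # 5 m of 24 AWG at 5 V / 2.5 A: ~42%, in the ballpark of the figure above.
    print(f"{line_loss_fraction(5, 2.5, 5, 24):.1%}")
    # Same cable and current at 24 V: the loss shrinks to a few percent.
    print(f"{line_loss_fraction(24, 2.5, 5, 24):.1%}")
    # 12 AWG brings the 5 V case down to the 2-3% range, as claimed.
    print(f"{line_loss_fraction(5, 2.5, 5, 12):.1%}")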


That's true. My aging but still well-functioning WRT54G router had its 12V 2A wall wart die on me last year. At the same time, some USB cable had its micro B end accidentally ripped off.

So I thought, why not, opened the router, and lo and behold, the DC circuit is perfectly happy with 5V and 1.2A or so, and converts it down to I believe 3.3V.

So I drilled a hole in the router and soldered the working end of the damaged cable to the power input. The other end is plugged into the TV's otherwise-useless USB port. Has been working fine ever since. Neither TV nor router runs hot, which I couldn't say about the old wall wart.

In theory, USB should negotiate the required amount of power. It turns out the TV manufacturer decided to simply wire the port directly to the internal 5V rail connected to both USB ports, so it should be able to give at least 2x2A.

Yeah, it was probably cheaper to just buy a new router. Got a bit nerd-sniped there I guess ;-) .


Should've let this thing die. Do you actually have devices hooked up to this? The speeds you're getting and the likely security holes (especially if it isn't properly behind some other device) are concerning me.


Don't worry, there is a decent enough firewall built into the ISP modem. And my ISP is more than slow enough that the speed of the router doesn't matter much. And when all my computers are off, so are the router, the modem, the TV and all the other stuff.


Assuming you mean 5V or 3V DC, there would have to be a baseline level: wall AC would be converted to whatever the baseline DC standard is, then each device would have to convert again up or down from there, either by inverting back to AC and re-rectifying, or by using semiconductor converters that used to be expensive (don't know if they still are). Seems complicated. Why not just stick with what we have (in the US, AC @ 110V; it's not like it isn't widespread) and let all the device power adapters take care of themselves?

Or, you know, have better silicon components give us SDVC (Software Defined Voltage Control). I'm joking, but I also don't think you will save what you think you will in TCO - energy savings + device cost differential.

I could be wrong. I have been before. And I am operating on almost zero sleep for over a day, so odds are, I could be doing so again at greater than my average error rate.


The problem with starting from a low voltage is that even low power levels result in significant current requiring the use of large conductors - or at least a connector capable of supporting those conductors when needed.


Every device has a high efficiency switching regulator in it now. Regulator drop is not the issue.


Wouldn't it be more efficient to have e.g. one DC power supply for the house that can power everything that needs DC power, instead of lots of individual (and likely cheap/inefficient) power supplies?


Most electronic devices require low voltage DC, so resistance losses in cabling across even quite short distances (10+ metres) would be significant. This is why DC distribution in data centres, where it is used at all, is normally 48V.


It’s also more difficult to do straight transformation on DC voltages. It can be done, but the reason our power grid settled on AC is that getting high transmission voltages down to reasonable ones for use is simple with some wound coils.

It’s really easiest to just keep everything AC until the point of use; putting a transformer and a bridge rectifier in most devices is a better solution than needing switching power supplies everywhere to convert DC voltages (or worse options like linear regulators).

Switched mode power supplies also have EMI to deal with due to the higher switching frequencies, an AC transformer runs at line frequency (50/60Hz).


Dumb DC supplies using just a bridge rectifier and filter cap(s) aren't really ideal if you need a stable voltage. If the mains voltage sags, so does the output. Then factor in ripple from the cap bank under higher loads and the associated nonlinear harmonics and power factor issues from the rectification. You could add a linear regulator and increase the transformer output and burn the excess off as heat, but efficiency goes down the toilet.

Switchers, for all their complexity, are pretty damn clean in terms of output and input PF correction. Even audio and sensitive analog applications, where dumb supplies were once preferred, have moved to switchers. A good design goes a long way, and it's unfortunately easy to design junk switchers because of the complexity involved.


EDIT: Most consumer appliances would still need their own power supplies. You would still want to distribute DC through the house at a high voltage and do a last-step conversion. If an appliance operates at the distributed DC voltage directly, then it would need no power supply.

Yes, and cheaper to build!

You can achieve the same efficiency with DC; contrary to popular belief, AC vs. DC does not directly affect efficiency across the line. It is simply about what the voltage across the line is.

AC won because it allowed us to build transformers to step voltages up and down, providing efficient transmission across the line.

Nowadays, we can build DC transformers with power MOSFETs, which are cheaper and more efficient than their AC counterparts, both old (magnetics and inductors) and new (you can do AC transformation with MOSFETs with some extra steps/components).

Additionally, since most consumer appliances' first step is to rectify the voltage, most built in the past two decades are DC compatible. Newer ones could then drop the rectifier stage and become cheaper and more efficient.


>Yes, and cheaper to build! You can achieve the same efficiency with DC; contrary to popular belief, AC vs. DC does not directly affect efficiency across the line.

In fact there are higher losses with AC power transport (especially over long distances) due to the parasitic capacitance of the lines; that’s why some countries use DC power transmission lines now.


Line losses, see my reply below. Once you go beyond needing 10-15 watts and cable lengths beyond a few meters you start losing a lot of power through the wire.


The problem is the resistance of the wires which causes a measurable voltage drop. Try extending a USB cable 10 meters to have an idea.


It seems like USB A/C are going to be that. You can already buy outlets that have both mains and USB ports, there are even dual USB C outlets now, that can do 30 watts:

https://www.amazon.com/dp/B07PTWG5DV


I stayed in a newly built hotel a few years ago that had two USB-A sockets integrated in every single power outlet, both in the room and around the lobby.

What actually worries me about this is that they are all backed by tiny switch mode power supplies, and if not designed and built correctly they have a habit of exploding. So now your hotel has hundreds of potentially explody boxes hidden in the walls.


The author makes the valid point that USB-A is too low power, and USB-C requires active cables. (Translation: expensive or dangerous)


USB-C doesn't require active cables. Only 100W USB-PD and longer high-speed cables require active electronics. The marker chip for 5A charging cables is cheap.

Both USB-A and USB-C chargers require active logic. USB-C is more complicated because it includes both USB-C and legacy USB logic. USB-PD adds the most complexity.


USB-A can support power delivery. The standard predates USB-C.


In practice I haven't seen USB PD 1.0 in the wild. Most of the non-PD charge standards signal charge rates using varied voltages on the D+/D- lines.

When current manufacturers claim USB-PD they mean PD 2.0, which requires the CC line (only present in USB-C to USB-C cables) to signal and set the voltage. When PD 3.0 is included, they usually call it Quick Charge 4.0 or PPS.
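As a rough illustration of what that D+/D- signaling looks like in practice (voltages approximate, gathered from public reverse-engineering write-ups rather than any official spec):

    # Illustrative only: approximate D+/D- bias voltages used by common
    # legacy (non-PD) charger schemes to advertise available current.
    # BC 1.2 dedicated charging ports short D+ to D- instead of biasing them.
    LEGACY_SIGNALING = {
        # scheme:     (D+ volts, D- volts, advertised amps)
        "BC1.2 DCP":  ("short", "short", 1.5),
        "Apple 2.4A": (2.7, 2.7, 2.4),
        "Samsung 2A": (1.2, 1.2, 2.0),
    }

    def advertised_current(d_plus, d_minus):
        """Toy lookup: what a device might infer from measuring the lines."""
        for name, (dp, dm, amps) in LEGACY_SIGNALING.items():
            if (dp, dm) == (d_plus, d_minus):
                return name, amps
        return "unknown", 0.5  # fall back to the USB 2.0 default

    print(advertised_current(2.7, 2.7))  # ('Apple 2.4A', 2.4)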


If you want one port that can charge multiple devices of varying needs safely, you need smarter cables. Sounds like a reasonable tradeoff.

Otherwise every single cable would have to be built to handle the highest possible power delivery rate (100W for USB-C).


Looking at that outlet and how chunky the AC holes are: when access is limited and you can’t see, shoving a USB-C plug into an AC hole is going to happen often.


You'd have to shove awfully hard -- hard enough to crumple the metal shell of the USB connector, since it's wider than the opening in 120VAC socket. (I just tried it with a spare electrical outlet, and even with a fairly hard push and wiggling it around, I couldn't make it fit)


Yikes and thanks - I’m in New Zealand and our plugs are way smaller so you couldn’t make that mistake here, but it’s good to know the terminals are far enough in that it’s safe. I’ve seen some surprising things people have forced into the wrong port. USB into Ethernet etc.


The long vertical slot you could come in contact with is neutral. The bottom peg is ground. The top right vertical strip is hot but it also has recessed connectors and is smaller than the usb c connector.

I'd be impressed if you could cause a short on purpose with significant force let alone fumbling in the dark.


The problem is that you're pushing it into the hands of anonymous cable makers and clueless end users. It's really difficult to stick a standard plug into a 30 amp socket.

It's really easy to buy some substandard USB-C cable that lies about its capability.


The same issue would exist with any universal in-wall DC power supply -- if consumers buy a substandard product, it can destroy their device. Unless you want separate ports for different voltage/power outputs, in which case the user has to buy separate cables for each one.


Due to circumstances beyond our control, we had to replace all the outlets in my parents' condo, so we used the USB (A) outlets. It works quite well. I do wonder if we will see USB-C quick charging outlets.


I was going to link to that exact product. I put in the USB A one a few weeks ago and love it. When the iPhone 12 is released I’ll switch to USB C.


> When the iPhone 12 is released I’ll switch to USB C

Depending on what standards your existing ports support and what power input the iDevice can accept, you may find that a simple A->C adaptor will suffice. You may not get the fastest charge possible so in that case will want to move over sooner rather than later, but this may be an inexpensive (in terms of money and effort) stop-gap measure.


Why wait for the iPhone 12? The current iPhone already charges significantly faster over USB-C than USB-A, as do all iPads released in the past ~3 years.


I read here recently that fast charging is bad for batteries.


There's really no reason to worry about it. Battery replacements (at least from Apple) are $70, and all modern phones have sophisticated battery management controllers.

Also, fast charging your battery more often could be better since you're not fully discharging and recharging your battery.


Is "Digital Electricity" a real thing? Its billed as an alternative to AC or DC. This may be as good a place as any to ask this question, but I am a hotel developer, and we've had a few people reaching out to us trying to sell "digital electricity" that sends high voltage over cat5 (up to 2000w per the Belden materials) by essentially sending high voltage for very brief blips with the ability to switch off before any damage occurs if a short is detected. they claim it is supposed to be significantly more efficient, doesn't require electricians to run cables, and its safer. I declined... but im still curious. basically every device in the room is connected via lan to a power server.

https://www.belden.com/blog/smart-building/digital-electrici...


Looks like buzzword marketing bullshit to me. Apparently they're just sending pulsed DC power over Ethernet cables? Nothing else in that article is as new or exciting as they dressed it up to be. GFCI outlets are standard in modern houses, they claim to be safely transmitting "high voltage" but 120V is not, and the "digital" buzzword stinks of marketing.

>It transfers high levels of power over non-power cable

You can't beat physics. Wires heat up from current, regardless of whether it's AC, DC, or "digital" pulsed DC. I doubt Ethernet cable is rated for any significant amount of power, but searching for "cat6 max power" doesn't return any relevant results because search is broken in 2019.

It's also likely to be rather inefficient, as they are converting from AC to DC and then potentially back to AC? There may be an advantage to pulsing DC but I'm not sure; typically AC is more efficient for power transmission because of lower losses for the same power over distance and higher transformer efficiency (no need for an inverter).


It's apparently a bit smarter than GFCI. [1] says it measures the loss in the wire to determine if a person or whatever is touching it. So that would protect people against shocks from touching both wires, which GFCI can't do.

[1] http://magazine.connectedremag.com/publication/?i=488126#{%2...

I think this is a pretty nice idea except for the patents and only having one or two suppliers and lack of technical details.


You can get 100W over Cat5 with 802.3bt, but I can't tell if there are any special requirements for that cabling: https://en.wikipedia.org/wiki/Power_over_Ethernet#Standard_i...


It sounds like "digital electricity" is a more advanced sort of GFCI: instead of verifying that the two pins are carrying equal current, it verifies that the expected amount of power makes it all the way to the receiving end.

Normally, running high voltage through a wire is dangerous because a fault can dump power into the middle of the wire (starting a fire) or into a person (stopping their heart), but perhaps with end-to-end active monitoring, it's possible to operate safely at much higher voltages?

If this technology works, I think it would also be useful for powering a tethered drone from the ground.

[Edit: according to lopmotr's link, they are using this for drones and balloons already.]
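There are no public protocol details, but conceptually the safety mechanism described above might look something like this (names and thresholds entirely hypothetical):

    # Purely conceptual sketch of end-to-end power accounting; the vendor's
    # actual protocol, packet sizes, and trip thresholds are not public.
    MAX_LOSS_FRACTION = 0.05  # hypothetical: trip if more than 5% goes missing

    def next_packet_allowed(sent_joules, receiver_reported_joules):
        """Transmitter sends the next burst of power only if the receiver
        accounted for nearly all of the last one; a person or short in the
        middle of the wire shows up as unexplained loss."""
        loss = sent_joules - receiver_reported_joules
        return loss <= MAX_LOSS_FRACTION * sent_joules

    print(next_packet_allowed(1.0, 0.97))  # True: normal wire loss, keep going
    print(next_packet_allowed(1.0, 0.80))  # False: energy vanishing, shut off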


That sounds like the sort of thing your local electrical code authority might get very upset about. It's not clear what the actual voltage is? But if it's over "SELV" ratings different rules are going to apply.


From the material it sounds interesting but I'd be leery of the 600 Hz switching generating a ton of electrical noise and relying on the upstream systems to keep everyone safe. The idea sounds good but I can see a lot of ways this can go wrong outside of very controlled environments.


Oh wow, I would not want 2000w going through Ethernet. Both for the obvious electrical issues, and for the dangerous mixing of safe sockets and hot sockets.

No idea how this would be more efficient. Bursting voltage will lead to higher losses versus a continuous voltage. Are they claiming efficiency thanks to a centralized AC->DC converter?


Cat5 doesn't imply RJ45, so that confusion could be easily averted if people with brains developed that stuff. 2000W constant power transmission on Cat5 sounds dubious, though. I don't know at what voltage such a cable gives in and shorts wires. It would have to be pretty high to get the current down to something the wire cross-section can handle.


It's not the wattage that is going to be the problem, but either the high current or the high voltage (arcing).


I would check your local codes. Running Cat5 might not require an electrician, but that doesn't mean it doesn't require a licensed installer. Especially if you want to run network over that Cat5, you'll want someone who knows what they're doing.

That said...Belden is reputable, but this doesn't sound like it's fully baked.


My first impression is that it smells like BS but let's take a closer look.

CAT6 cabling uses 4 pairs of AWG 24 wire. This type of wire typically comes with two options for insulation, 250V and 600V. Let's be optimistic and use the latter. The maximum recommended current for AWG 24 wiring is 0.577A [1]; as a reference, PoE specs use 0.3A (if I remember correctly). Using these assumptions our maximum power transmission would be 4 (pairs of wire) * 600V * 0.577A = 1384.8W, and this doesn't include power losses inside the wire itself. Belden's descriptions seem to indicate that power transmission is multiplexed in the time domain, which would reduce the power transmission capability even further.

Current limits on wiring are semi-arbitrary, what it really comes down to is how much heat generation is tolerable in the wires and the environment in which they reside. If we were to ignore the suggested current limit of 0.577A per core then it is possible to transmit 2000W. More specifically you would need 0.833A at 600V in 4 pairs of wire to get 2000W.

TLDR: The claims seem dubious. It is possible if the current limits for AWG 24 wire are exceeded by roughly 1.4x.

[1] https://www.powerstream.com/Wire_Size.htm
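The same arithmetic as a script, for anyone who wants to vary the assumptions:

    # Back-of-envelope check of the 2000 W claim, using the numbers above.
    PAIRS = 4            # CAT6 has 4 pairs of AWG 24
    VOLTS = 600          # optimistic insulation rating
    AMPS_LIMIT = 0.577   # recommended max per AWG 24 conductor [1]

    max_power_w = PAIRS * VOLTS * AMPS_LIMIT
    print(max_power_w)               # 1384.8 W, short of the claimed 2000 W

    amps_needed = 2000 / (PAIRS * VOLTS)
    print(amps_needed)               # ~0.833 A per pair
    print(amps_needed / AMPS_LIMIT)  # ~1.4x the recommended current limit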


There are special thicker 16 and 18 AWG Digital Electricity cables [1], so you probably wouldn't get that 2kW with CAT6 unless it's just peak power with a low duty cycle. They are rated for 300V so they might actually use close to that.

[1] https://www.belden.com/products/enterprise/copper/cable/de-c...


Oh, that makes more sense!

4 * 300V * 2.3A = 2.76kW

It looks like it's totally doable with those cables! But if they're running 300V DC through there... may as well just run 120Vac. The only benefit I see is that they can detect faults on the transmitting end in a sophisticated manner.


I've come around to the idea that some type of power over Ethernet is probably the correct solution for low voltage house wiring. There is an automotive standard IEEE 802.3bu-2016 which supports up to 55 watts.

The other advantage is that it solves a lot of the IoT last-25-feet problem, because it's both power and a data link.


I always thought running low-voltage DC through a house from a single supply was doomed due to voltage drop over the distances required....


Yes, this, and it is also much harder to switch off a heavy DC load than an AC one. Once current starts flowing in DC and you try to open a switch, an arc forms that never gets extinguished, since the voltage doesn't cross 0 like it does in AC. Hence switches are always significantly derated for DC versus AC.


It absolutely is. You could do better with 48 or even 24 volts, but then you are out of luck if you want to power the majority of devices, which are 12V. So your only option is a USB-A wall socket, which is essentially a wall wart hidden in your wall. I do not see a good solution here.


A buck converter in the socket (cost: $1) with a voltage dial or selector would do the trick.


Well, there is less drop with more copper, so it's more about costs. The cost of cables, DC-DC converters, and power supplies, plus their efficiency, determine the optimal configuration. Going as high as cheap mass-market DC-DC converters and PSUs allow is probably a good rule of thumb for low voltage, so around 20-24V. Beyond that, better to go with high voltage.


For some reason, I feel like 48V DC over super-chunky aluminum bus bars is the only way you could keep whole-house DC wiring from costing more than the entire rest of the house.

As I understand it, the problems that make aluminum wire strictly inferior to copper wire--expansion coefficient, heat dissipation, non-conductive oxide, and strand breakage--are less significant in bus bars. The remaining concerns would then be bimetallic junction corrosion and work hardening.

Maybe a bus duct with polybenzimidazole spacers and insulation? Still sounds expensive and inefficient, though.


Boats and RVs do this but the cables are gigantic compared to AC cables. Also as they age voltage drop becomes a big issue, especially on boats since the environment is so corrosive.


You have to choose wire gauges carefully. A lot of cables that connect power supplies to their devices have thin wires that don't lose much voltage over the meter or so of cable. But if you were to extend that with an equal-gauge wire several meters in length, the connected device would likely brown out.


You could counteract that with heavier wires, but house wiring is already awkward and expensive.


I don't think the DC wall socket is the problem - it's the small electronics that each have their own power pack.

Ideally, to me, every LED lamp, set-top box, alarm clock, phone, and laptop would not come with a power supply box. It would just take a USB cable or similar standard. Then you could buy a good quality power supply adapter that you can reuse for lots of products - and avoid the abundance of random adapters everywhere.


That is sort of happening already. A lot of devices are just using USB standard for their power supplies.

I suspect that PD allows even lamps to negotiate that they need more power, maybe?


Interesting idea, but running an extra set of cables is a non-starter. The cost/benefit analysis would never make sense for existing structures, and newly built structures wouldn't do it unless it was almost certain to become a universal standard. This is a classic chicken-and-egg problem.

You would also still need something similar to a wall-wart to step down the voltage to whatever your device actually uses (e.g. from 12 V to 5 V to charge your phone), so it's not clear how much benefit there would really be from such a system.

Whatever specs we adopted would probably also end up not being appropriate for some future devices, so we'd probably go back to a substantial number of devices bundling their own PSUs anyway.

Interesting idea, unlikely to ever be worth the cost. If wall-warts really bother you, then just invest in some wall outlets with integrated USB sockets - there are many good options on the market.


Also, the biggest non-starter is just how inefficient it is to move 12V any reasonable length. 12V domestic wiring only makes sense in very small dwellings. (Think studio apartment or 3rd world country shack.)

Otherwise the voltage sag and resistance makes it cheaper and more efficient to do normal 120/240 voltage with device-specific converters. (For example, if I had to run 0-gauge cable from my breaker box throughout my house to support 12v, the cost of copper alone would be prohibitive.)

I honestly think standardizing on USB for devices that can use it is the best approach, given that we already have tiny AC->USB converters that don't block other plugs, and outlets with USB built-in are on the market. Furthermore, I wonder just how much we can shrink the typical wall-wart so it won't cover other plugs?

(I also think we're better off trying to do a global domestic 200 or 400 volt DC standard.)


The solar/RV/golf-cart world has an entire ecosystem of standardized 12V DC power distribution and the appliances that use it, if you really feel the need to have DC. Works pretty well... last year I wired up a friend's mountain cabin with 12V solar, batteries, an inverter for AC, LED lights, fans, and a dorm-room-sized DC fridge. Different manufacturers and everything popped together fine.


Most laptops can be charged from 12V auto-style sockets as well.

12V DC is fantastic for a solar setup (vacation cabin, off-grid home, or whatever). You can still have some standard AC outlets using an inverter, but there's no reason to have lights running off of AC -- especially if you have LEDs! Since inverters cause a slow but constant power drain, it makes sense to leave them off most of the time.
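To put a rough number on that standby drain (the idle draw here is a hypothetical round figure; real inverters vary by model):

    # Hypothetical illustration: 0.5 A idle draw at 12 V is a plausible
    # round number for a small inverter, not a measured spec.
    IDLE_AMPS = 0.5
    BUS_VOLTS = 12

    idle_watts = IDLE_AMPS * BUS_VOLTS  # 6 W of constant drain
    print(idle_watts * 24)              # 144 Wh/day just to stay on, a big
                                        # bite out of a 100 Ah (1200 Wh) bank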

Edit: I also wanted to mention the amazing "3-way" fridges, which I was not aware of until looking into solar a few years ago. They can run on AC, DC, and propane, giving you an alternative if the power goes out. The RV world has tons of this stuff, and the appliances usually aren't that much more expensive. Some are even cheaper than regular appliances since they are usually more compact.


Keep in mind that your typical RV refrigerator capable of running on AC or propane, and possibly DC, will run much less efficiently on electric power than a standard electric refrigerator. That might make them less-than-ideal for a solar + battery configuration. In an RV context one typically uses the electric mode only when connected to "shore power", where the lower efficiency is less of an issue.

Propane-based refrigerators use an absorption cycle which depends on heat produced by burning the propane; in electric mode they just substitute resistive heating. A pure electric refrigerator would use a compressor-driven heat pump, which is considerably more efficient.


What DC connector does that use? My impression is that solar panels use MC4 connectors which have separate female and male and need two connectors for DC. But there is a convention for polarity.

Amateur radio has standardized on the Powerpole connector for 12V DC. Powerpole is hermaphroditic, so you don't have to worry about which end is plug or socket, and it uses color coding for polarity.


I think that with low power LED lighting, as well as all rechargeable devices, there is a case for some (not all) 12VDC sockets around the house (and cars/vans/RVs). Actually, I think mobile applications will be first: in an RV, it makes no sense to have a big inverter to take your 12V batteries to 120V AC, just to plug in wall warts to charge the camera, phone, etc.

I have a friend who installed 12V in parts of his house, backed by a solar charged battery system. See my previous comment for some details:

https://news.ycombinator.com/item?id=21109247

Talking to this friend, we discussed the same problem as the OP. In RVs that are wired for battery power, they use wall-mounted cigarette lighter plugs, and you can buy some small appliances that use them. They are ugly and take up too much space, it would be so much better to have a plug of a regular shape and size. Of course it has to be polarized for DC, so something like ( - | ).

There is also the problem of breaking the arc in DC, so one prong needs to be longer and make contact further in, so it can't arc outside the plug. Or maybe it needs some internal mechanism such as plastic slot covers that close and break the arc (like child-protected sockets have now). Anyway, it definitely needs some research and design, but I think it would be cool to have a new 12VDC standard plug; then people could start designing products that use it to get the whole ecosystem started.


I don't get why they write off USB so quickly. It's small, it's reasonably convenient, and is almost ubiquitous already.


The article explains: USB A's power is too limited for a large percentage of household uses, and USB C is complicated and expensive, requiring digital circuitry to negotiate.


The current USB standards are great for small devices but for anything that requires more power it would be good to have something beefier, something less than the roughly 1500W a standard AC outlet provides but more than the current 15W or 20W of the typical USB port. 5V is just too low and requires a lot of compromise on the device side. I'd probably start with 12V or 24V to keep the current lower. I think there are newer USB ideas to provide something of that sort: https://www.digikey.com/en/articles/techzone/2017/mar/design...


> ...and incorporate a fuse to protect the appliance cable from fire. Those British BS1363, fused-mains-plug habits die hard.

Lots of car barrel connectors have fuses inside. I'm not seeing much reasoning behind not using the car barrel connectors, aside from 'it was designed for cars, not as a general purpose connector'.


They break easily. Spring loaded bits (actual springs). No properly defined standard and they vary a lot. Come out easily depending on quality. Easy to contact center pole to side poles on insertion. No ability to add earth.


12V plugs are indeed a standard, and can be quite robust. Other than size and cost, there are no such downsides. On DC there is seldom an 'earth'; ground is analogous to DC negative/neutral depending on the local standard.

ANSI/SAE J563 and UL2089

https://en.m.wikipedia.org/wiki/Cigarette_lighter_receptacle shows the UL standard.


The case was made against it as a DC plug in general. You are correct that 12V definitely does not require "earthing". "Earth" in this case is not ground per se, but some connection schemes (and standards) require either a third connector or an open return path for DC, which this connector cannot provide.

Apparently they decided to make it a standard in 1994, after years of mismatching equipment all over the world, which is longer than I would have ever guessed. Interesting to see. Apparently a UL "standard" does not translate well to an actual standard. My experience is that most devices fail on the spring center contact and do not fit all cars. Anecdotal, but still: even the expensive charger I have doesn't work in one car I've owned (an Opel) due to misalignment of the outer contacts, and it jumps out too much. It says it's UL2089 compatible, though, so I guess Opel was at fault. There are so many "extension cords" out in the wild which hardly work or go loose over time due to bad design, or because they are just a bit too wide. I own several of them and have thrown most away.

The funny thing is that the standard describes two sockets of differing sizes, which is not helpful in any way and looks like some manufacturers' legacy designs made into the standard.

The article also mentions other bad characteristics of the connector and plug, like bad contact causing high resistance, which is simply due to the design but might also be present with other designs, of course.

I mean, it's nice it became a standard, as it was in every car anyway since everybody smoked in them, but there are way better solutions which do not have the problems I already mentioned and are far better suited for home appliances. If there is not yet a proper standard, why choose one which is not designed for this usage and most definitely was never designed for what it is used for today, with all kinds of side effects (good or bad)?


I agree that these connectors are typically low-quality. I can't imagine ever needing to earth them, however.


Car connectors are ugly, oversized, and mechanically questionable. They do work though - I once used one to hook a solar cell to a 12V fan.


Crazy, USA needs higher voltage AC, not DC...

People in the 240V world get to pull 3KW from their wall sockets, I'm jealous.

http://wordpress.mrreid.org/2012/04/16/why-kettles-boil-slow...


No mention of Power-over-ethernet?


What is the maximum power provided by PoE? Wikipedia is surprisingly vague on the subject, probably because there are multiple standards. PoE was never intended to power more than just network equipment.


I am not sure how Wikipedia could be more explicit; they even have a table showing the voltages and wattages of the different standards. [0]

According to that table, the latest standard, 802.3bt Type 4, can provide 71 W to the device at the end of the cable (and inject 100W into the other side).

[0] https://en.wikipedia.org/wiki/Power_over_Ethernet
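For reference, a summary of that table's power figures (per the Wikipedia article above):

    # Figures from Wikipedia's "Comparison of PoE parameters" table:
    # (watts injected by the source, watts guaranteed at the powered device).
    POE_POWER_W = {
        "802.3af (Type 1)": (15.4, 12.95),
        "802.3at (Type 2)": (30.0, 25.5),
        "802.3bt (Type 3)": (60.0, 51.0),
        "802.3bt (Type 4)": (100.0, 71.3),
    }

    for std, (injected, delivered) in POE_POWER_W.items():
        print(f"{std}: {injected} W injected, {delivered} W at the device")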


Clicking on the link "Power capacity limits" brings you to nothing useful. Are you looking elsewhere?


Scroll down to the table "Comparison of PoE parameters".


I think it's something like 30W over a single ethernet cable. I'm running a few devices with PoE that are not network enabled, just conveniently located near an ethernet cable that's not near a power outlet.


Depends on how you define "network equipment" :)

Unifi is selling LED panels, that are powered over PoE and controlled over Ethernet.


I assumed the author dismissed it for the same reasons as USB-C: that it requires active negotiation and isn't "simple" enough.


I have to deal with this problem constantly and I haven't figured out a good solution yet. I'm not sure this is particularly meaningful in a normal mains-connected environment, but as someone whose base power source is a finite amount of 24V DC, I hate eating the inversion losses when I could save ~7-16% using switching supplies off the main bus.

Currently everything I have is connected up with a hodgepodge of those green pluggable terminal blocks. It's not an elegant solution but it's hard to standardize on something, especially when I have devices that run off 5, 9, 12, 24 and 48 volts. I still run an inverter most of the time (I have two, pure and modified depending on the load).

I think it's possible to devise a decent solution here, I like the XLR based designs but I'm afraid of plugging things into the wrong voltage. I think maybe the good solution is to make a color-mapping for the common voltages up to 48v (past that you require an electrician to do wiring in the US iirc) and then make cables that are XLR on one end, and whatever-plug on the other end with an LED that lights up corresponding to the voltage. That way you know if your barrel jack is 12 or 5. It doesn't have to be foolproof, or even customer friendly really. People setting up DC buses to run all their electronics should be able to accept the responsibility of frying something if they give it too much juice.
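Something like this hypothetical mapping, say (colors and voltage set chosen purely for illustration, nothing here is a real standard):

    # Hypothetical color code for the DC-bus cable scheme described above.
    VOLTAGE_COLORS = {
        5: "red",
        9: "orange",
        12: "yellow",
        24: "green",
        48: "blue",  # past this, licensed electrical work applies anyway
    }

    def led_color_for(volts):
        """Which LED color a cable should light for a given bus voltage."""
        return VOLTAGE_COLORS.get(volts, "flashing white")  # unknown = warn

    print(led_color_for(12))  # 'yellow': OK to plug into a 12 V barrel jack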


The telecom industry has 48 volts as a de facto standard, but never settled on a mechanical standard.

Thus you see daisy chains of mechanical adaptors in telecom closets.


Well...the daisy chains were the standard for mechanical connections. :-)

What there never was was a connection standard safe enough to put in a house, made to be (ab)used by normal people. 48v DC as it lives in the telco world is flat out dangerous if you aren't careful.


If none of the current plugs are usable, can we have a low-power plug standard based on PoE?

Having both ethernet and low power from one socket would be perfect for IoT devices, save people from having to run Ethernet wiring themselves afterwards, and cut down on wireless spectrum use.

And your 'smart fridge' would just plug into both outlets and use a relay to cut 230V if the motor doesn't need to run.


Wouldn't the labor/effort cost to run and terminate Ethernet vs. 120v wiring be significantly higher?

My perspective is just that of someone who's run a bit of both at an amateur level through my house, and finds terminating stupid-simple 120v wiring much, much easier than ethernet.


While PoE can run up to 120W, it requires digital circuitry for negotiation...


Stop talking sense! Can’t you see there’s money to be made here!


Yeah, we are kind of running low on actual high power devices to plug into residential outlets these days. Those that remain are pretty much confined to specific parts of specific rooms. Various stand alone cooking things on the kitchen counter. Hair dryer in the bathroom by the mirror. The vacuum cleaner seems to be the exception and there is a tendency toward built ins.

Heck, I don't know that there is even much need for high power ceiling lighting any more. Everything that people want to see in any detail is self-illuminated these days. Just scatter some low power LEDs around (you don't even have to provide a way to turn them off) and you are done. If you actually want to read a paper book or do some knitting, then an appropriate lamp is hardly going to take any power, as it would only have to illuminate a small area.


More than a low voltage plug (which I’d love), there is the need for in-wall smart plugs that have a wired connection rather than WiFi. I can’t believe I’m the only person that wants this, and while I’ve found a few in-wall smart plugs, I’ve never found a wired one.


There are classic home automation systems like X10 that potentially predate WiFi and offer wired solutions that use the power lines themselves for communication; you might want to check those out.


I've heard that a lot of these systems that transmit data over power lines have a tendency to trip the arc-fault breakers that are required in a lot of new construction.


I remember playing with x10 consumer devices I bought off the shelf from radio shack 25 years ago. The tech was slow but very functional even back then.


Powerline Gigabit Ethernet is a thing, but it's bordering on wireless (uses the power wiring as a "guideline" as to where its RF energy should go). This makes the network attachment unit costly and adoption low. For most applications WiFi is cheaper and more convenient. Anyone who wants really solid connectivity will lay Cat5.


The use case here is turning an outlet or light on/off, so latency matters, but it requires nearly zero bandwidth (assuming that you're not talking tcp). Granted, x10 security cameras exist.

x10 speaks a protocol designed for this purpose and requires a base station.


> USB C meanwhile requires active cables, sockets, and devices, sacrificing any pretence [sic] of simplicity.

Passive USB-C cables should be fine up to 60 watts. Beyond that you need a chip to certify safe construction but that's a good thing.


A DC solution is WAY overdue. Put your hand on your AC->DC converter, or your wall wart. Above ambient room temperature? Wasted energy. "It's only 2-10 watts." Multiply by 5 units (underestimated) per 100 million households. That's >= a gigawatt. 8700+ GWh / year.

That's aside from the environmental costs of making and disposing of these pernicious devices. Time for that decades-old solution to retire.
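The arithmetic, using the comment's own numbers:

    # Reproducing the estimate above with its stated inputs.
    UNITS_PER_HOUSEHOLD = 5   # described as an underestimate
    HOUSEHOLDS = 100e6
    WASTE_W_PER_UNIT = 2      # low end of the quoted 2-10 W range

    total_gw = UNITS_PER_HOUSEHOLD * HOUSEHOLDS * WASTE_W_PER_UNIT / 1e9
    print(total_gw)           # 1.0 GW of continuous waste heat
    print(total_gw * 8760)    # ~8760 GWh/year, matching the "8700+" figure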


I don't see how going to 12 V would help simplify devices. Most chips don't run on 12 V, so one will need a switch-mode power supply or linear regulators with big heat sinks inside each device anyway. The added expense of a rectifier and mains-capable transformer is minimal.

On the other hand, at least fewer people would be electrocuted by their crappy Amazon USB chargers.


We don't want low voltage distribution. It's dangerous. The currents have to be MUCH higher, which actually increases fire risk.


Comments below the original post are great.


Go look at copper prices...


2016



