I set up my own home network with a Vertiv Liebert Li-ion UPS a few years ago and was thinking about how inefficient the whole process is regarding power. The current goes from AC to DC back to AC back to DC. Straight from the UPS as DC would work much better, and as I was teaching myself more about networking equipment, I was surprised to learn that most of it doesn't accept DC input by default (i.e., each piece of equipment tends to come with built-in AC-DC conversion).
Then I started routing ethernet with PoE throughout my house and observed that other than a few large appliances, the majority of powered devices in a typical home in 2026 could be supplied with DC over PoE as well! Lighting, laptops, small/medium televisions. The current PoE spec allows up to 100 W, which covers like 80% of the powered devices in most homes. I think it would make more sense to have fewer AC outlets around the modern house and many more terminals for PoE instead (maybe with a more robust connector than RJ45). I wonder what sort of energy efficiency improvements this would yield. No more power bricks all over the place either.
The problem is that all of those DC devices don't operate on 48V either. The vast majority of chips require a 5V or lower input, so with a 48V DC supply you're still going to need a per-device PSU to do DC-DC conversion. In other words: no getting rid of power bricks.
Efficiency isn't as straightforward either. You're still being fed by 120V/230V AC, so you're going to need some kind of centralized rectifier and down converter. It'll need to be specced for peak use, but in practice it'll usually operate at a fraction of that load - which means it'll have a pretty poor efficiency. A per-device PSU can be designed exactly for the expected load, which means it'll operate at its peak efficiency.
We also don't use 5V DC grids because the wire losses would be horrible, so a domestic DC grid should probably operate at pretty close to regular AC voltage as well. In practice this means the most sensible option would be to have a centralized rectifier and a grid operating at whatever voltage it outputs - but what would be the point?
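To put rough numbers on the wire-loss point, here's a back-of-the-envelope sketch (Python; the 10 m run of 1.5 mm^2 copper and the 100 W load are assumed figures, and it ignores the drop's effect on the load itself):

```python
# I^2*R loss for delivering 100 W over a 10 m run of 1.5 mm^2 copper
# (~0.0115 ohm/m per conductor) at a few candidate distribution voltages.
# First-order only: the voltage drop's effect on the load is ignored.
LOAD_W = 100.0
LOOP_OHMS = 2 * 10 * 0.0115  # out-and-back resistance of the run

for volts in (5, 12, 48, 120):
    amps = LOAD_W / volts           # current needed at this voltage
    loss_w = amps**2 * LOOP_OHMS    # power burned in the copper
    print(f"{volts:>4} V: {amps:5.2f} A, wire loss {loss_w:6.2f} W "
          f"({100 * loss_w / LOAD_W:.0f}% of the load)")
```

The loss scales as 1/V^2, which is the whole argument in one line: the same run that loses a fraction of a percent at 120V loses most of the load at 5V.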
As to PoE: I personally really like the idea, but I don't believe it'll have a bright future. For its traditional use the main issue is that there doesn't seem to be a future for twisted-pair beyond 10Gbps. 25GBASE-T might exist as a standard on paper, but the hardware never took off due to complete disinterest from the datacenter market, and it is too limited to be of use in offices and homes. I fully expect that 25G will arrive in the home and office as some form of fiber-optic interconnect - with fiber+copper hybrid for things like access points.
On the other hand, for a lot of IoT applications PoE seems to be too complicated and too expensive. It makes sense for things like cameras, but individual lights, or things like smoke sensors are probably better served in office/industrial applications by either a regular AC supply or a local DC one, plus something like KNX, X10, CAN, or Modbus for comms: just being able to be wired as a bus rather than a star topology is already a massive advantage. And for domestic use the whole "has a wire" thing is of course a massive drawback - most consumers strongly prefer using Wifi over running a dedicated wire to every single little doodad.
> I set up my own home network with a Vertiv Liebert Li-ion UPS a few years ago and was thinking about how inefficient the whole process is regarding power. The current goes from AC to DC back to AC back to DC.
With double-conversion, generally yes.
I recently ran across the (patented?) concept of a delta conversion/transformer UPS that seems to eliminate/reduce the inefficiencies:
The double-conversion only occurs when there's a 'hiccup' from utility power, otherwise if power is clean the double-conversion is not done at all so the inefficiencies don't kick in.
"... throughout my house and observed that other than a few large appliances, the majority of powered devices in a typical home in 2026 could be supplied via PoE DC current as well!"
We installed 120 LED ceiling lights in our home circa 2020, all of which were run with high voltage (Romex) and accompanied by 120 little transformer boxes that mount inside the ceiling next to them.
Later ...
We installed outdoor lighting with low voltage, outdoor rated wiring, powered by a 12V transformer[1], and I felt the same way you did: why did we use a mile of Romex and install all of those little mini transformers when we could have powered the same lights with 12V and low-voltage wire?
I then learned that the energy draw of running the low-volt transformer all the time - especially one large enough to supply an entire house of lighting - would more than cancel out energy savings from powering lower voltage fixtures.
You don't have this problem with outdoor lighting because the entire transformer is on a switch leg and is off most of the time.
So ... I like the idea of removing a lot of unnecessary high voltage wire but it's not as simple as "just put all of your lights behind a transformer".
> I then learned that the energy draw of running the low-volt transformer all the time - especially one large enough to supply an entire house of lighting - would more than cancel out energy savings from powering lower voltage fixtures.
That's not a constraint of physics, you can absolutely build a DC power supply that is efficient in a wide load range. (Worst case it might involve paralleling and switching between multiple PSUs that target different load ranges.) But of course something like that is more expensive...
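A toy illustration of that paralleling/switching idea (Python; the efficiency curve is completely invented, just shaped like real converters: good near rated load, poor at light load):

```python
# Toy model: pick whichever PSU is most efficient at the current load.
# The curve below is invented: ~90% near rated load, bad below ~10%.
def efficiency(rated_w: float, load_w: float) -> float:
    frac = load_w / rated_w
    if frac <= 0 or frac > 1:
        return 0.0                      # load out of range for this PSU
    return 0.95 * frac / (frac + 0.05)  # light-load penalty

PSUS = [50.0, 200.0, 1000.0]            # rated outputs in watts

for load in (5, 30, 150, 800):
    best = max(PSUS, key=lambda rated: efficiency(rated, load))
    print(f"{load:>4} W load -> {best:6.0f} W PSU at "
          f"{efficiency(best, load):.2f}, vs {efficiency(PSUS[-1], load):.2f} "
          f"on the big one alone")
```

At a 5 W standby load the small unit manages ~63% while the big one is down below 10%, which is exactly the gap the switching scheme buys you.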
> Lighting, laptops, small/medium televisions. The current PoE spec allows up to 100 W, which covers like 80% of the powered devices in most homes.
I find it a little hard to imagine that those devices outnumber things like stoves, dishwashers, washers/dryers, kettles, hair dryers... by 4:1.
Unsure why PoE would be better for LED lighting than the standard approach of screwing a bulb directly into AC, either. How many lumens do you get out of strip lights these days? And you still have AC-DC conversion for whatever's sourcing power onto the Ethernet link.
PoE is also fairly bulky, requires large connectors, and either requires a wholly isolated PD or what's basically a class 2 DC/DC converter. That's why PoE-powered stuff usually has that big transformer cube in it with a lot of clearance, slotted PCB, 2-4 kV capacitors etc.
In practice PoE will have lower efficiency than mains powered, since it'll usually be at least double conversion, often three converters in series, plus the losses of the thin network wires, and the relatively high idle losses / poor low-load efficiency of the necessarily over-dimensioned PSE.
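For a sense of scale, here's the 802.3bt Type 4 power budget sketched out (Python; the 90 W source, 52 V worst-case voltage, and 6.25 ohm worst-case 4-pair loop resistance are my assumptions for the spec's corner case):

```python
# Rough 802.3bt Type 4 budget: PSE sources ~90 W at 52 V worst case,
# over a channel with ~6.25 ohm worst-case 4-pair loop resistance.
PSE_W, V_MIN, LOOP_OHMS = 90.0, 52.0, 6.25

amps = PSE_W / V_MIN              # ~1.73 A total across the four pairs
cable_loss = amps**2 * LOOP_OHMS  # ~18.7 W burned in the cable itself
pd_input = PSE_W - cable_loss     # ~71 W left at the powered device

print(f"cable loss ~{cable_loss:.1f} W, PD input ~{pd_input:.1f} W")
```

That's roughly 20% gone before the PD's own isolated converter(s) take their cut, and it lines up with the ~71 W delivered figure in the standard.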
I think Ubiquiti (makers of the UniFi Wi-Fi products, as well as some of the most popular managed PoE switches) also makes a ton of other PoE products: the usual stuff like cameras, IP phones, network switches, access card readers, door locks, and now ceiling lights (presumably due to the latest PoE standards delivering significant wattage).
It's super nice because you only need to put the UPS/ATS at the PoE switch and then you get power redundancy everywhere you have ethernet running (i.e. the phones don't go down).
I think we're slowly, slowly coming around to the idea of domestic DC distribution. The vast majority of consumer electronics would be perfectly happy to consume 12V. It's cheaper, safer, more efficient. Less design work and certification on inbuilt AC adapters.
I think it's highly unlikely we'll see mass scale retrofits, but if enough momentum builds up, I can see it as a great bonus feature for new builds.
I got lucky with my house and every room has a dedicated phone line meeting at a distribution panel (a couple of 2x4s with screw terminals) built in the 50s. I'm in the process of converting it to light-duty DC power. The wiring is only good for an amp or two, but at 48V that's still significant power transmission.
800V to each rackmount unit, with hot plugging of rack units? That's scary.
The usual setup at this voltage is that you throw a hulking big switch to cut the power, and that mechanically unlocks the cabinet. But that's not what these people have in mind.
They want hot-plugging of individual rackmount units.
GE has a paper about the power conversion design, but it doesn't mention the unit-to-rack electrical and mechanical interface. Liteon is working on that, but the animation is rather vague.[2] They hint at hot-plugging but hand-wave how the disconnects work.
Delta offers a few more hints.[3] There's a complex hot-plugging control unit to avoid inrush currents on plug-in and arcing on disconnect. This requires active management of the switching silicon carbide MOSFETs.
There ought to be a mechanical disconnect behind this, so that when someone pulls out a rackmount unit, a shutter drops behind it to protect people from 800V. All these papers are kind of hand-wavey about how the electrical safety works.
Plus, all this is liquid-cooled, and that has to hot-plug, too.
> When it is detected that the PDB starts to detach from the interface, the hot-swap controller quickly turns off the MOSFET to block the discharge path from Cin to the system. After the main power path is completely disconnected, the interface is physically detached, and no current flows at this time
> For insertion, long pins (typically for ground and control signals) make contact first to establish a stable reference and enable pre-insertion checks, while short pins (for power or sensitive signals) connect later once conditions are safe; during removal, the sequence is reversed, with short pins disconnecting first to minimize interference.
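Reading those two quotes together, the control flow seems to be roughly this (a Python sketch, not real firmware; the class and its checks are illustrative assumptions):

```python
# Sketch of the staggered-pin hot-swap sequence described above.
class HotSwapController:
    def __init__(self):
        self.mosfet_on = False  # series MOSFET in the main power path

    def insert(self, prechecks_ok: bool):
        # Long pins (ground/control) mate first: with a stable reference
        # established, run pre-insertion checks before any power flows.
        if not prechecks_ok:
            return
        # Short power pins mate last; soft-start the MOSFET to limit
        # inrush into the unit's input capacitance.
        self.mosfet_on = True

    def begin_removal(self):
        # Detach is detected while the long control pins are still mated;
        # the MOSFET is cut immediately, so by the time the short power
        # pins separate no current flows and there is nothing to arc.
        self.mosfet_on = False

ctrl = HotSwapController()
ctrl.insert(prechecks_ok=True)
print("after insert, MOSFET on:", ctrl.mosfet_on)    # True
ctrl.begin_removal()
print("during removal, MOSFET on:", ctrl.mosfet_on)  # False
```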
The power connectors will be on the far side of the rack from the service side, so they shouldn't be a problem for humans touching the third rail, so to speak.
With that sort of voltage you should be able to use a capacitive or inductive sensor to activate a relay.
EV chargers take a different approach. There is no power on the connector while you're plugging it in. It then locks in place before the contactor closes and power is delivered. Unplugging is the same, power is removed before the plug is unlocked for release.
As long as you can control for fire, electrical safety seems like a temporary concern: robots and intelligent machines are becoming a cheaper and more available long-term solution for hot-swapping blades in datacenter racks.
I think you're being downvoted for speaking of a complex future possibility ("robots and intelligent machines ... solution") as if it was a proven commodity. There will be many twists and turns in the path to the possible reliability, scalability, and cost effectiveness of robots and intelligent machines.
DC power has been an option for datacenter equipment since I was a young lad racking and stacking hardware. Cisco, Dell, HPE, IBM, and countless others all had DC supply options. Same with PDUs. What’s old is new again.
48VDC was common in phone exchanges. They filled the basement with lead-acid batteries and could run without the grid for a couple of weeks. In turn, the phone system was 99.999% reliable for decades.
I'm working on stuff in that market, and it still largely is. DC Power System Design For Telecommunications is still a must-read, and it doesn't even cover the last 15 years or so of development, notably lithium batteries and high-efficiency rectifiers.
I will say that this is a surprisingly deep and complex domain. The amount of flexibility, variety, and scalability you see in DC architectures is mind-boggling. They can span from a 3kW system that fits in 2U all the way to hundreds of kW spanning entire buildings, powered through any combination of grid, solar, and/or gas.
This reminds me of the early Google data centers that soldered those massive Duracell lantern batteries directly to the motherboards as a primitive battery backup. I'm struggling to google examples of it (this would have been back around 2008), but I have a vivid memory of it.
Not to be _that_ guy, but it was technically -48V DC.
Honestly, that was pretty surprising to me when I had to work with some telco equipment a couple of decades ago. To this day, I don't think I've encountered anything else that requires negative voltage relative to ground.
> If you went 48 straight to POL voltages then you would have horrific converter performance.
What's horrific converter performance in numbers?
An isolated flyback (to 12V) should be able to hit >92% and doesn't care if it's fed -48V or +48V or ±24V. TI webench gives me 95% though I'd only believe that if I'd built and measured it. What's the performance of your -48V → +48V?
[with the caveat that these frequently require custom transformers... not an issue with large runs, but finding something that can be done with an existing part for smaller runs is... meh]
-48 to 48 claims something like 97% (load dependent of course). It also needs to arbitrate between two input supplies for glitchless redundancy, plus have PMBus and other spec-mandated stuff. There is no technical reason why you can't go -48 -> 12 as you state with good efficiency, but we can't get hold of a part that ticks all the boxes.
Horrific performance by my definition would be 48V to, say, 1V. We only realistically use buck topologies for POL supplies, and such a ratio is really bad for current transients, not to mention issues like minimum on-times for the controller.
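To make the minimum on-time point concrete (Python; the 20-50 ns controller limit is an assumed ballpark, not a particular datasheet value):

```python
# Why 48 V -> 1 V in one buck stage is painful: duty cycle is Vout/Vin,
# and the resulting on-time collides with typical controller minimum
# on-times (the 20-50 ns range below is an assumed ballpark).
VIN, VOUT = 48.0, 1.0
duty = VOUT / VIN  # ~2.1%

for f_sw in (500e3, 1e6, 2e6):  # switching frequency in Hz
    t_on_ns = duty / f_sw * 1e9
    print(f"{f_sw / 1e6:.1f} MHz: on-time ~{t_on_ns:.0f} ns "
          f"(vs ~20-50 ns typical minimum on-time)")
```

At 1 MHz you're already at the edge, and at 2 MHz the controller simply can't produce a pulse that short, which is one reason intermediate-bus architectures (48 -> 12 -> 1) exist.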
I'm just surprised that either input isolation isn't on your spec, or it still somehow works out better with isolated to +48V than straight to 12V... but I guess if your spec requires other things, it makes sense.
Well if it's negative 48V the electricity flows out of your circuit and back to the grid, so you need to make it positive to have the electricity come in.
Yes, and that tiny little difference can cost you a lot of expensive gear if you run it off the battery and plug in a serial port or something like that. You'll also learn first hand what arc welding looks like without welding glass.
Some old guitar effects used -9V DC.[1] And the convention for guitar-effects power adapters is a center-negative barrel (motivated by making it easy to wire the socket's switch to a 9V battery inside).
Because the chassis is connected to ground (as in, a literal grounding rod hammered into the soil) and by definition your 0V reference point.
The crucial difference is the direction in which the current is flowing: is it going "in to", or "out of" a hot wire? This becomes rather important when those wires are leaving the building and are buried underground for miles, where they will inevitably develop minor faults.
With +48V corrosion will attack all those individual telephone wires, which will rapidly become a huge maintenance nightmare as you have to chase the precise location of each, dig it up, and patch it.
With -48V corrosion will attack the grounding rod at your exchange. Still not ideal, but monitoring it isn't too bad and replacing a corroded grounding rod isn't that difficult. Telephone wires will still develop minor faults, but it'll just cause some additional load rather than inevitably corroding away.
Does that mean when you have electronics and use multiple dc-dc converters all the inputs and outputs share the same ground, it's not just the values for that pair of wires?
And if I want to use a telephone on an incorrectly wired 48V DC circuit, I could switch the positive and negative wires, as long as the circuit in the telephone is isolated and never touches ground?
Thanks. Somehow I got in my head that all circuits were just about the delta from neutral and therefore nothing outside them mattered.
> all the inputs and outputs share the same ground, it's not just the values for that pair of wires?
No, it depends on the converter. There are converters that leave 160V on the DC power rail for a 110V AC input, and 155V on the DC "ground" rail.
They are economical, and you can find them where galvanic isolation is, at least in theory, not important, but they're terribly unsafe when used on PCBs that people might muck with.
If you have some "normal" converters and some of this kind, sharing the ground would be quite dangerous.
There is a true zero potential. You can detect this because two charged objects with zero delta between them will still repel each other.
I think a circuit should mostly care about the deltas, but when you’re talking about things like phone lines, the earth becomes part of your circuit. You can’t influence its potential (it’s almost exactly neutral because any charge imbalance gets removed by interaction with the interplanetary medium) so everything else is going to end up being determined by what you need for their relative potential to that.
Most large scale systems are AC because transformers are relatively cheap, low maintenance, and efficient. When the system is AC ground makes no difference.
With DC systems you generally think about the issues - which is why modern cars are negative ground. However, other than cars, most people never encounter power systems of any size - inside a computer the voltages and distances are usually small enough that it doesn't matter what ground is. Not to mention most computers don't even have a chassis ground plane (there are circuit board ground planes, but they are conceptually different), and with non-conductive (plastic) cases ground doesn't even make sense.
> When the system is AC ground makes no difference.
With AC it's about where the ground is attached along the length of the transformer secondary. In the EU they ground one of the ends of the secondary, in the US we ground the center point.
I don't get to say this very often ... but the US way is objectively safer with no downside: 99% of human shocks are via ground, and it halves the voltage to ground (120V vs 240V). A neutral isn't required if there aren't 120V loads.
In the EU it is quite common for houses to have three-phase power. If you squint a bit, the grounded neutral of the Y transformer isn't entirely unlike the grounded center tap in the US. The voltage is a lot higher, of course!
I agree that the US voltage is safer (with the tradeoff of lower output powers available at your outlets). However, I suspect this is more than negated by the US plug design, which carries a much larger risk of shocks than almost all EU plug designs (Schuko, British/Type G, etc...)
- uninsulated metal pins make contact with supply while partially exposed
- much smaller distance between metal pins and the edge of the plug
But there's no inherent power tradeoff: you can have 240V outlets in the US, with the two prongs both 120V to ground. They're just really uncommon in residences.
I ran into a guy at a hardware store who ran just such a power supply attached to our city's water (or was it natural gas?) infrastructure. I was incredulous, but the idea that it helped prevent corrosion did make sense.
Positive ground used to be in all cars. When they went from 6 volts to 12, the disadvantages became apparent fast, and so everyone went negative ground (mid 1950s). I am not clear why positive ground was bad (maybe corrosion?).
Yeah I always heard that the phone lines carried their own power, and in Florida the phones did keep working when the power went out, but I never knew why.
So the grid was always charging up the lead acid batteries, and the phone lines were always draining them? Or was there some kind of power switching going on where when the grid was available the batteries would just get "topped off" occasionally and were only drained when the power went out?
The phone grid predated the electrical grid. There was no other choice for power.
Actually, there was one. Even earlier phones had their own power: a dry-cell battery in each phone, and every 6 months the phone company would come around with a cart and replace everyone's battery. Central battery was found to be more convenient, since phone company employees didn't have to go around to everyone's site, and central offices could benefit from economies of scale and have actual generators feeding rechargeable batteries.
It's a pretty decent chunk of power down a POTS cable too, as it was designed to ring multiple big chunky metal bells in the days of yore.
I was wiring in a phone extension for my grandma once as a boy and grabbed the live cable instead of the extension and stripped the wire with my teeth (as you do). I've been shocked a great number of times by mains AC, but getting hit by that juicy DC was the best one yet. Jumped me 6ft across the room :D
The teeth. Yikes! But yeah, I remember having the rotary phone disassembled and touching the wires adjusting something when a ring came. Gave me enough of a jolt to remember.
Grid charging batteries, phone draining them, as I understand it. Of course there were switches all over the US, so I can't make blanket claims, but from what I hear that was normal.
The batteries and phone lines were one system at -48V, with power supplies converting AC power to DC while grid / generator is up.
The batteries are floated at the line voltage; nothing was really charging or discharging, and there was no switchover.
This is similar to your car's 12V DC power system: when the car is running, the alternator provides DC power and the battery floats, doing nothing except buffering large fluctuations and stabilizing the voltage.
Yeah, it used to be that you could still make calls (particularly to emergency services) even in complete power outages, for as long as your local exchange has batteries for. (AFAIR that tended to be on the order of hours, but probably differs quite a bit across locations and regulatory domains/countries.)
Another thing we lost in the age of VoIP landlines, but then again mobile towers also have batteries. Just don't be unlucky and have a power outage with 3% battery on your phone...
Power plant is the convention for any large company that has backup power. A few UPSes for the server room - they are the power plant. A backup generator - power plant. Sometimes even just the room with all the breaker boxes where the grid comes in is called the power plant (though normally power plant is reserved for backup power). It is extremely common for commercial buildings to have their own power plant. Most of their power comes from the grid in all cases, but they have a power plant. At commercial scale you can often save money by buying a backup generator powerful enough for your whole building, so you disconnect from the grid when grid power is in highest demand (see your utility, then your accountant, for details on whether you can afford a generator this large).
Obviously 48VDC has been around, and internally they will probably still step down to 48V. But these 48V islands are nowadays interconnected by the regular AC grid. They want to replace that interconnection with an 800V DC bus. I kind of assume they chose 800V DC because there's already a bunch of stuff available from EVs, which also have 800V DC battery packs now.
Much of the world's mains-voltage electronics run at 240V (historical) and have PFC circuits (which are essentially just boost converters) that run at ~400V DC link voltages. 650V-rated parts give you enough headroom to tolerate overshoots and still keep an 80% safety margin (devices running at no more than ~80% of rated voltage) with a single-level topology.
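A quick sanity check of those numbers (Python; reading the "80% safety margin" as running devices at no more than 80% of rated voltage, which is my interpretation):

```python
import math

# Worst-case rectified mains peak vs a ~400 V PFC DC link vs 650 V FETs
# derated to 80% of rating (the derating rule is an assumed convention).
V_MAINS = 240.0
v_peak_high = V_MAINS * 1.10 * math.sqrt(2)  # ~373 V at +10% mains

V_DC_LINK = 400.0        # typical PFC boost output
V_FET = 650.0            # device rating
v_allowed = 0.8 * V_FET  # ~520 V usable under the 80% rule

print(f"rectified peak at +10% mains: {v_peak_high:.0f} V")
print(f"DC link {V_DC_LINK:.0f} V vs {v_allowed:.0f} V allowed "
      f"-> ~{v_allowed - V_DC_LINK:.0f} V of overshoot headroom")
```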
This voltage also coincidentally is a convenient crossover point where silicon MOSFETs start to become inefficient and GaN FETs have recently become feasible and mass-produced.
90% of the power in our academic data center goes 13.8kV 3-phase -> 400V 3-phase, and then the machines run directly from one leg to neutral (230V). One transformer step, no UPS losses, and the server power supplies are more efficient at EU voltages.
But what about availability? If you ask most of our users whether they’d prefer 4 9s of availability or 10% more money to spend on CPUs, they choose the CPUs. We asked them.
There are a lot of availability-insensitive workloads in the commercial world, as well, like AI training. What matters in those cases is how much computing you get done by the end of the month, and for a fixed budget a UPS reduces this number.
> and then the machines run directly from one leg to neutral (230V)
And then every machine has a switching power supply to convert this to low-voltage DC, and then probably random point-of-load converters in various places (DC -> AC -> DC again) for stuff like the CPU / GPU core, RAM, etc. Each of these stages may be ~95% efficient with optimal load, but the losses add up, and get a lot worse outside a narrow envelope.
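The compounding is easy to underestimate, so here's the multiplication spelled out (Python; the per-stage figures are assumed for illustration):

```python
from math import prod

# Cascaded converter efficiency is the product of the stages.
design_load = [0.95, 0.95, 0.95]  # e.g. PSU, intermediate bus, point-of-load
light_load = [0.85, 0.88, 0.90]   # the same stages off their sweet spot

print(f"at design load: {prod(design_load):.3f} overall")  # ~0.857
print(f"at light load:  {prod(light_load):.3f} overall")   # ~0.673
```

Three stages at 95% each already means ~14% lost end to end, and a lightly loaded chain can be down around a third.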
I've been hearing this line for over a decade, now. "Immersion cooling will make data centers scale!" "Converting to DC at the perimeter increases density!"
Yes, of course both of those things are true, and yes, some data centers do engage in those processes for their unique advantages. The issue is that aside from specialty kit designed for that use (like the AWS Outposts with their DC conversion), the rank-and-file kit is still predominantly AC-driven, and that doesn't seem to be changing just yet.
While I'd love to see more DC-flavored kit accessible to the mainstream, it's a chicken-and-egg problem that neither the power vendors (APC, Eaton, etc.) nor the kit makers (Dell, Cisco, HP, Supermicro, etc.) seem to want to take the plunge on first. Until then, this remains a niche-feature for niche-users deal, I wager.
As seen on HN a few days ago, immersion cooling is dead: turns out the risk of getting sued into oblivion over widespread PFAS contamination isn't worth it. [0]
DC doesn't have such a killer. There are a decent bunch of benefits, and the main drawback is gear availability. However, the chicken-and-egg problem is being solved by hyperscalers. Like it or not, the rank-and-file of small & medium businesses is dying, and massive deployments like AWS/GCP/Azure/Meta are becoming the norm. Those four already account for 44% of data center capacity! If they switch to DC can you still call it "specialty kit", or would it perhaps be more accurate to call it "industry norm"?
It is becoming increasingly obvious that the rest of the industry is essentially getting Big Tech's leftovers. I wouldn't be surprised if DC became the norm for colocation over the next few decades.
Those vendors all have DC power supply options, to my knowledge. It’s hardly new; early telco datacenters had DC power rails, since Western Electric switching equipment ran on 48VDC.
That’s just it though, telco DCs != Compute DCs. Telcos had a vested interest in DC adoption because their wireline networks used it anyway, and the fewer conversions being done the more efficient their deployments were.
Every single DC I’ve worked in, from two racks to hundreds, has been AC-driven. It’s just cheaper to go after inefficiencies in consumption first with standard kit than to optimize for AC-DC conversion loss. I’m not saying DC isn’t the future so much as I’ve been hearing it’s the future for about as long as Elmo’s promised FSD is coming “next year”.
I think the real reason is because battery power didn't have to be converted twice to be able to run the gear in case of an outage, so you'd get longer runtime in case of a power failure, and it saves a bunch of money on supplies and inverters because you effectively only need a single giant supply for all of the gear and those tend to be more efficient (and easier to keep cool) than a whole raft of smaller ones.
Immersion cooling was/is so fucking impractical it is only useful for very specific issues. If you talk to any engineer who worked on CRAY machines that were full of liquid freon, they'll tell you how hard it is to do quick swaps of anything.
It's much cheaper, quicker, and easier to use cooling blocks with leak-proof quick connectors for liquid cooling. It means you can use normal equipment and don't need to reinforce the floor.
A lot of "edge" stuff has 12/48v screw terminals, which I suspect is because they are designed to be telco compatible.
For megawatt racks though, I'm still not really sure.
At least for servers, power supplies are highly modular. It just takes 1 moderately sized customer to commit to buying them, and a DC module will appear.
Looking at the manual for the first server line that came to mind, you can buy a Dell PowerEdge R730 today with a first-party supported DC power supply.
Surely if it makes sense for the big players, they will do it, and then the benefits will trickle down to the rest? Like how Formula 1 technology will end up in consumer vehicles.
It is weird to me how far from the state of the art mainstream server equipment is. I can't imagine anything worse than AC-AC UPS, active PDUs, and redundant AC-DC supplies in each rack unit, but that's still how people are doing it.
Well they're kinda transitioning back. When I grew up most DCs (and telecom facilities) were running on 48V DC. Easy to back up with a big room full of lead acid batteries (just keep an eye on that hydrogen gas lol)
Is there anything left in a modern home that really needs or is better on AC?
We have some old ceiling and exhaust fans, but I know those can be replaced. Our refrigerator is AC, but extended family with an off-grid home has a DC refrigerator that cycles way less, probably due to multiple design factors but I’m sure the lack of transformer heat is part of it. I’m not as sure about laundry machine or oven/cooktop options but I believe those are also running on DC in the off-grid home without inverters.
Most of these AC appliances also have transformers in them anyway for the control boards. It seems kind of insane to me that we are still doing things this way.
Off-grid homes are vastly more concerned with the energy efficiency of their appliances, and thus DC refrigerators generally have more insulation. Most AC customers prefer more internal volume for food over slightly increased efficiency.
AC motors use way more power than the piddly control boards in most home appliances. So you lose a little efficiency on conversion, but being 80% efficient doesn't matter much when it's 1-5% of the device's energy budget. You generally gain way more than that from similarly priced AC motors being more efficient.
I agree with everything you said, except it seems like a false dichotomy. We can clearly build DC refrigerators with more or less insulation. We can clearly build them large or small. If you want to prioritize volume, then surely you could do that with DC. Right?
I know that a long time ago DC-to-DC voltage converters were very large in size, which meant AC would win on space efficiency. But unless I’m mistaken, that’s no longer the case. Wouldn’t a DC refrigerator with equivalent insulation and interior volume have nearly identical exterior dimensions as an AC refrigerator?
> Wouldn’t a DC refrigerator with equivalent insulation and interior volume have nearly identical exterior dimensions as an AC refrigerator?
Sure, but it’s important to separate what could be built from what is being built based on consumer preferences and buying habits. The average refrigerator could be significantly quieter, but how often do people actually listen to what they are buying? People buying Tesla’s didn’t test drive the actual car they were buying so the company deprioritized panel gaps. And so forth, companies optimize in ways that maximize their profits not arbitrary metrics.
Any appliance with strong motors should be more efficient with AC supply. But almost anything else can be regarded as a heater that doesn't care much as long as it is fed with the correct voltage. Which is actually the core issue.
A DC household would have to choose a trade-off between multiple lines with different voltages or fewer voltages that need to be adapted to the appliances. And we're right back at the AC situation, but worse since DC voltages are more difficult to change.
But customers like datacenters can very well plan ahead and standardize on a single DC voltage. They already need beefy equipment to deal with interruptions, power surges, non-sinusoidal components, and brownouts, which already involves transformers, capacitors, and DC conversion for battery storage. Therefore almost no additional equipment is required.
What qualifies as a strong motor here? Are you comparing to a brushed DC motor? Do you think a washer/dryer would have worse overall efficiency with a BLDC in a DC home compared to what we have today? If so, that’s news to me. Where can I learn more about that?
The trade-off between, say, one (relatively) high voltage DC bus throughout the home vs many branches with lower discrete voltages is indeed a problem. With AC, we took the bus approach, running 120V everywhere (in the U.S., higher elsewhere). I'm inclined to say we should keep doing that for flexibility and predictability. But it's a trade-off, like you said. It would obviously help if regulatory and standards bodies came out with official recommendations.
Things like washing machines, dryers, dishwashers, air conditioners, or fridges spend a lot of energy running powerful electric motors, which should benefit from AC.
Everything else I can think of in a typical household is basically a mere heater that in principle works equally well with AC and DC of the correct voltage. Even computers can be said to mostly care about the correct voltage since AC->DC conversion is vastly easier than voltage conversion.
Probably 90% of my devices run 5V DC or similar, but you can't run that through a home so you're back to needing AC. If you're going to have AC and DC then you might as well just have AC.
It's obviously not new. ±400VDC architecture was presented at Open Compute last year, which is a fair indicator that the presenter had put it into practice at least 5 years prior to disclosing it. 48VDC distribution within a rack, and 48-to-1V direct regulators for CPUs, were both contributed to OCP 7 years ago, at which point they were both old hat. And 48VDC telco junk is, of course, totally ancient.
Not going to happen. For the same reason that the US never converted to a higher domestic voltage even though there are many practical advantages. The transition from one system to another at the consumer level would be terrible. Even if there were some advantage (and I'm not sure the one you list is even valid: you'd get DC-DC converters instead, because your consumers typically use a lower voltage than the house distribution network powering your sockets), it would be offset by the cost of maintaining two systems side by side for decades.
You could wire your house for 12, 24 or 48V DC tomorrow and some off-grid dwellers have done just that. But since inverters have become cheap enough such installations are becoming more and more rare. The only place where you still see that is in cars, trucks and vessels.
And if you thought boiling water in a camper on an inverter was tricky, wait until you start running things like washing machines and other large appliances off low-voltage DC. You'll be using massive cables, the cost of which will outweigh any savings.
> Not going to happen. For the same reason that the US never converted to a higher domestic voltage even though there are many practical advantages.
It would be relatively easy for the US to go to 240V: swap out single-pole breakers for double-pole, and change your NEMA 5 plugs for NEMA 6.
For a transition period you could easily have 240V and 120V plugs right next to each other (because of split phase you can 'splice in' 120V easily: just run cable like you would for a NEMA 14 plug: L1/L2/N/G).
What would be the real challenge would be going from 60 to 50Hz.
I suppose that still begs the question somewhat, since the US does have 240V (split-phase) already driving many appliances. Why hasn't it ever become standard for luxury kitchens to have a European-style outlet for use with a European kettle? I know the US already has a different 240V plug shape, so it might have to be an unlicensed installation, but surely someone wanted hot tea faster and did that calculus before?
I wired a UK kettle to an unused 240V range outlet in the US once. It was amazing, boiled a liter of water in just under a minute. Obviously kinda sketchy.
Well, as you say, it would not be according to code and the insurance company might have something to say about it. It's also single-phase, but not quite the way you do it in the USA: it would be a neutral and a phase, whereas in the USA I think it is 2x110. Finally, it's 50Hz rather than 60, which would work fine for resistive loads but not so well for inductive ones such as transformers and motors.
In all likelihood not worth the trouble. When I moved to Canada I gave away most of my power tools for that reason, and when I moved back I had to do that all over again.
> In all likelihood not worth the trouble. When I moved to Canada I gave away most of my power tools for that reason, and when I moved back I had to do that all over again.
If you ever have to do it again, you can probably get a transformer rated high enough for power-tools for cheaper than replacing all of your power tools.
The line frequency tends to screw with things with motors too. I moved from the US to Belgium back when the compact cassette was a common format for music.
Killed a few tapes with a transformer on a US tape deck before buying a 220V 50Hz unit. No, I don’t remember if the pitch was grossly off, but I’m guessing it wasn’t.
Of course you can. That's kind of obvious. It is also highly impractical. Besides the frequency delta, you end up having to lug a heavy transformer along, and then you have to alternate it across your tools so you don't end up frying the transformer.
You can run a 240V circuit to the kitchen for a kettle and put in a NEMA 6 outlet. But few people care about fast boiling and importing a European kettle. Most people use the microwave or stovetop, and 120V kettles are fine in most cases. It will never become a standard thing.
I think the answer to your question is that it mostly doesn't matter for personal mug size quantities of hot water and if it does matter to you there are readily available competing options such as dedicated taps for your kitchen sink.
Perhaps the biggest reason is that a traditional kettle on any half decent electric range will match if not exceed the power output of any imported electric kettle. Many even go well beyond that with one burner marked "quick boil" or similar.
No one in the USA drinks hot tea. The choice (and it tends to be regionally based) is sweet or unsweet tea. No need to boil a kettle quickly for that.
> The choice (and it tends to be regionally based) is sweet or unsweet tea.
... Unless you're buying it pre-made, does this not still start with making hot tea the regular way? Or what exactly are you doing with the tea bags and loose tea from the supermarket?
Perplexingly I was traveling in one of the iced tea regions of the country in need of a cup of hot tea, and they had no way to make it. Like, you have a commercial coffee maker and hot cups, the coffee maker has a hot(ish) water tap. All you need is a $4 box of teabags that’ll last until the heat death of the universe. Nope.
As a counter-argument, things like pour-over coffee are getting more popular in the US, and older drip coffee makers seem to be getting slightly less popular.
Still though, I don't seem to see most of those people seriously clamoring for the electric kettle to go a bit faster. The cost for the wiring difference and dealing with odd imported kettles just isn't worth it generally.
> I know the US already has a different 240V plug shape, so it might have to be an unlicensed installation, but surely someone wanted hot tea faster and did that calculus before?
How expensive would a proper AC->DC->AC brick for that power level be?
Not so simple: you'd have to use a 'dryer' or 'welder' socket for that, otherwise you won't have enough power. A single circuit in Europe is 240V 16A, or 3840W!
A pure sine-wave inverter for that kind of power is maybe 600 to 1000 bucks or so, then you'd still need the other side and maybe a smallish battery in the middle to stabilize the whole thing. Or you could use one of those single-phase inverters they use for motors.
I'm not sure it's likely, but I could see DC lighting start to happen in new construction. Have a single AC-to-DC converter off the main service entrance that powers hard-wired LED lighting fixtures in the house. Would probably be better than running the individual (and usually very low quality) converters in dozens of standard LED light bulbs. Would need to be standardized, codified, etc. so probably not happening soon.
Would be more practical to have a single 50-300W AC-DC 24V PSU per room or group of rooms, then pull relatively short DC cables to each light. A multichannel light controller could also be placed nearby, and then if you need fully-featured brightness and color control, only a small PWM amplifier could be placed at each light if distance from controller to each light is too long to transmit PWM power directly.
I just wish I could run my air conditioner and my desktop computer at the same time without tripping the breaker. The RTX 5090 is a space heater and will easily peg at the 600W it's rated for, and so with that and an air conditioner window unit, I have to run a long cable from another unused room if I want to do anything that stresses the video card.
Well, having spent some time operating a 12VDC system last year when I moved into some shacks, I will say that I find it a lot more convenient to run 120VAC.
I end up converting stuff anyhow, because all my loads run at different voltages. Even though I had my lights, vent fan, and heater fans running on 12V, I still ended up having to change voltages for most of the loads I wanted to run, or generate AC to charge my computer and run a rice cooker.
Not to mention that running anything that draws any real power quickly needs a much thicker wire at 12V. So you're either needing to run higher voltage DC than all your loads for distribution and then lowering the voltage when it gets to the device, or you simply can't draw much power.
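To sketch that trade-off in numbers (Python; the 600 W load, 5 m run, and 3% drop budget are all assumptions, and real wiring also has ampacity and code minimums this ignores):

```python
# Copper cross-section needed to hold voltage drop to 3% on a 5 m
# one-way run feeding a 600 W load, at a few distribution voltages.
RHO = 1.68e-8   # ohm*m, resistivity of copper
RUN_M = 5.0     # one-way length; the loop is twice this
LOAD_W = 600.0
DROP = 0.03     # allowed fractional voltage drop

for volts in (12.0, 48.0, 350.0):
    amps = LOAD_W / volts
    max_loop_ohms = DROP * volts / amps
    area_mm2 = RHO * (2 * RUN_M) / max_loop_ohms * 1e6
    print(f"{volts:>5.0f} V: {amps:5.1f} A -> >= {area_mm2:6.2f} mm^2 copper")
```

The required copper scales as 1/V^2, so the 12V case needs welding-cable-sized conductors for what a 350V run could handle on much thinner wire (in terms of drop, anyway).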
Not that you can't have higher-voltage DC; with my newer system the line from my solar panels to my charge controller is around 350VDC and I can use 10AWG for that... but none of the loads I own that draw much power (saws, Instapot, rice cooker, Hammond organ, tube guitar amp) take DC :D
Do you have a website with your system on it? I have an off-grid building I need to add solar to in the next year or so. After I fix the foundation and roof, of course. Naturally I’m exploring options for item 387 on the todo list instead of think about how I’m going to jack the building up.
- 4kW of panels, 400W 48V
- EG4 6000XP charge controller/inverter
- 3x EG4 LifePower4 48V batteries
- a Raspberry Pi running Solar Assistant
It feels like a bit of overkill, and there is still a whole MPPT unused on the 6000XP so I could still double my panel input. Also, Solar Assistant tells me that I rarely go below 75% battery storage. If I just wanted to run my fridge and assorted convenience loads (and ran things like table saws off a generator) then I could get away with a lot less of a system.
But I'm operating a recording studio, and there were a couple days this winter where I had a full-band session and a couple days of storms and got down to below 50%.
The lesser-known instance of this is RV power. When you're running off small batteries and solar, you want to make the best use of the watt-hours you have, and that means avoiding the DC-to-AC-to-DC loop wherever possible. So you run 12V (or in newer models, higher voltage) versions of everything, upconverting as necessary.
I am really skeptical that 12VDC power distribution in RVs actually saves power compared to a high-quality (hah!) higher voltage AC or DC system. 12V is absurdly low: you can easily lose quite a few percent in resistive losses even with fairly large cables, and those large cables are quite unpleasant to work with and rather dangerous.
Assuming you live in a "large" western home, it's impractical. Remember, Edison's first power grid operated at 110/220V DC to the home. If there was lower voltage (i.e., 12 volts) going from the street to your walls, the line loss would be significant. It only works in RVs and shacks because the wires are short.
Thus, even if you had DC in the walls, it would be 100+ volts, and you'd still have conversion down to the lower voltages that electronics use. If you look at the comments in this thread from people who work in telco, they talk about how voltage enters equipment at -48V and is then further lowered.
It's really wild to see all the AC-to-DC changes. For those of us who aren't electrical engineers / hardware hackers (like myself), one of the biggest "examples" I've seen of this has been ceiling fans.
Installing a ceiling fan used to be treacherous, and they were so heavy. Also loud and buzzy once installed. Now the fans in these things are so lightweight and easy.
Seeing the same in many more areas (lighting, etc.).
Would love to see more mainstream DC lighting options and an updated code to match. I just finished a remodel of my workshop and blew over a hundred bucks on 14/2 for a 15-amp lighting circuit that is unlikely to ever see more than a 1-amp load.
The irony is all the recessed lights I picked out are DC; they all have little AC-DC boxes hanging off them using a proprietary connector. If I hadn't needed to pass a rough-in inspection, going all DC would've been trivial.
It does make a whole lot of sense.
The amount of energy you lose converting AC to DC can be humongous.
And it's wasted if you produce your own power (normally already DC).
They're still converting from AC to DC at the datacenter, it just isn't being stepped down at the perimeter. There is no transmission of HVDC going on. This isn't really Edison's revenge, more like his consolation prize, ha!
I wonder how much of the benefit is simpler redundant power equipment. For AC, you have standby UPSes and line-interactive UPSes and frequency and phase synchronization. And everything needs a bit more hold-up time because, in case of failure, your new power supply might be at a zero crossing.
For 800V DC, a simple UPS could interface with the main supply using just a pair of (large) diodes, and a more complex and more efficient one could use some fancy solid state switches, but there’s no need for anything as complex as a line-interactive AC UPS.
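A toy model of that diode-OR behavior (Python; all voltages are assumed for illustration):

```python
# The bus simply follows whichever source is higher, minus a diode
# drop; no phase sync or transfer switch involved.
DIODE_DROP = 0.8  # volts, a guess for a large rectifier diode

def bus_voltage(rectifier_v: float, battery_v: float) -> float:
    return max(rectifier_v, battery_v) - DIODE_DROP

print(bus_voltage(800.0, 780.0))  # mains healthy: ~799 V, battery idle
print(bus_voltage(0.0, 780.0))    # mains lost: battery carries the bus, ~779 V
```

The float battery sits a bit below the rectifier output and picks up the load the instant the rectifier sags below it, with no switchover event at all.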
I don't understand why new houses don't just have one high quality AC/DC converter so you can just use LED lighting without every bulb needing its own AC/DC converter. I imagine the light bulb cartel wouldn't really like that.
With modern technologies, that's power over ethernet or USB-C. Other comments in this thread point out that the telephone service also routinely used 48V for the ring signal.
However, higher DC voltage is riskier, and it's not at all standard for electrical and building code reasons. In particular, breaking DC circuits is more difficult because there's no zero-crossing point to naturally extinguish an arc, and 170V (US/120VAC) or 340V (Europe/240VAC) is enough to start a substantial arc under the right circumstances.
Unfortunately for your lighting, it's also both simple and efficient to stack enough LEDs together such that their forward voltage drop is approximately the rectified peak (i.e. targeting that 170/340V peak). That means that the bulb needs only one serial string of LEDs without parallel balancing, making the rest of the circuitry (including voltage regulation, which would still be necessary in DC world) simpler.
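The string-length arithmetic, for concreteness (Python; ~3 V per white LED is an assumed typical forward drop):

```python
# How many LEDs in series to land near the rectified mains peak.
V_FORWARD = 3.0  # assumed forward drop per white LED
for v_peak in (170, 340):  # rectified peaks for 120 VAC and 240 VAC
    print(f"{v_peak} V peak -> ~{round(v_peak / V_FORWARD)} LEDs in series")
```

Filament-style bulbs do roughly this: each "filament" is dozens of LED dice in series, which is part of why they can get away with tiny drivers.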
> I don't understand why new houses don't just have one high quality AC/DC converter so you can just use LED lighting without every bulb needing its own AC/DC converter.
IEEE 802.3bt can deliver up to 71W at the destination: just pull Cat 5/6 everywhere.
In the commercial/industrial space this may be worth it: how long do these bulbs last? how much (per hour (equivalent)) do you pay your facilities folks? how much time does it take for employees or tenants to report an outage and for your folks to get a ladder (or scissor lift) to change the bulb?
Every decent LED would then need … a switching power supply. LEDs are current-driven devices, and you get the best efficiency if you use an actual current-controlled supply. And those ICs are very, very cheap now.
The part that would genuinely be cheaper is avoiding problematic flicker. It takes a reasonably high quality LED driver to avoid 120Hz flicker, but a DC-supplied driver could be simpler and cheaper.
LED light bulbs exist exclusively for compatibility with Edison sockets. Every LED fixture I have seen had a single transformer for the entire fixture; and that transformer was reasonably separate from the LEDs themselves.
What voltage do you use? Most DC stuff wants low voltage (5-48V), but appliances need higher voltage like AC-level to get enough power over existing wiring. The result is DC-DC converters every place that have transformers now.
The gain from DC-DC converters is small, and DC devices are a small part of usage compared to appliances. There is no way it will pay back the cost of replacing all the appliances.
It wouldn't work. LEDs need low voltages, meaning massive wires. You can do the voltage change on AC or DC; AC just needs a few capacitors to smooth the wave out.
That's traded off against the increased efficiency of LED lighting, at least compared to incandescent lighting. An LED "equivalent replacement" for a typical incandescent globe is down around 1/10th of the power. A 7-watt LED bulb is typically marketed as "60W equivalent". If that's configured as a bunch of LEDs in series (or series/parallel) that need 12VDC, it's right about the same current draw as the 120V 60W incandescent equivalent. (Or perhaps double the current for those of us who get 220VAC out of our walls.)
(Am I just showing my age here? How many of you have ever bought incandescent globes for house lighting? I vaguely recall it may be illegal to sell them here in .au these days. I really like quartz halogen globes, and use them in 4 or 5 desk lamps I have, but these days I need to get globes for them out of China instead of being able to pick them up from the supermarket like I could 10 or 20 years ago.)
If your house gets 800V DC you're still gonna need "bricks" to convert that to the 5VDC or 12VDC (or maybe 19VDC) that most of the things that currently have "bricks" need.
And if your house gets lower voltage DC, you're gonna have the problem of worth-stealing-sized wiring to run your stove, water heater, or car charger.
I reckon it'd be nice to have USB-C PD ports everywhere I have a 220VAC power point, but 5 years ago that'd have been a USB Type-A port - and even now those'd be getting close to useless. We use a Type I (AS/NZS 3112) power point plug here - and that hasn't needed to change in probably a century. I doubt there's ever been a low voltage DC plug/socket standard that's lasted in use for anything like that long - probably the old "car cigarette lighter" 12VDC thing? I'm glad I don't have a house full of those.
Something to consider, and something I got a vivid demonstration of while playing with solar panels, DC arcs aren't self-extinguishing, unlike AC arcs. At one point I stuck a voltage probe in, and the arc stuck with it as I pulled the probe away. It also vaporized the metal tip of the probe.
My understanding is that DC breakers are somewhat prone to fires for this reason, too.
Heh - I vaporised a fairly large soldering iron tip (probably 4mm copper cylindrical bar?) when I fucked up soldering a connector to a big 7-cell ~6000mAh LiPo battery and shorted the terminals. Quite how I didn't end up blind or in hospital I don't know. It reinforced just how much respect you need to pay to even low-ish voltage DC when the available current was likely able to exceed 700A by a fair margin momentarily. I think those cells were rated at 60C continuous and 120C for 5 seconds.
Heh man, I'm glad you got out of that easy; I definitely wore safety glasses 100% of the time after my experience. I think a lifetime of experience with dangerous wall outlets and harmless little 1.5V/9V DC cells teaches us the wrong lessons about DC safety. I've since heard stories of wrenches exploding when they fall across EV high voltage battery terminals. Wrenches aren't supposed to be explosive.
The electricians I was working with also told me stories about how with the really big breakers, you don't stand in front of it when you throw it, because sometimes it can turn into a cloud of molten metal vapor. And that's just using them as intended.
A bunch of those big breakers require two people. One person in a flash suit and another with a 2m long pole around the first person. That way if an arc flash happens, the second person can yank the first person to safety without also getting hurt.
Ruins the fun and interrupts instilling respect deep into the bones of interns.
Allegedly
While on "work experience" from high school I was put on washing power lines coming straight out of the local power station near the ocean - lots of salt buildups to clear.
Same deal, flashover suits and occasional arcs .. and much laughter from the ground operators who drifted the work bucket close.
This reminds me of the sailor who decided to measure his internal resistance by pushing probes through the skin on his thumbs and electrocuted himself with the 9V multimeter battery: https://darwinawards.com/darwin/darwin1999-50.html
Mythbusters time. Salty fluids can be remarkably conductive, and blood qualifies. What's interesting, though, is that you have to wonder if there isn't some contributing factor here; as a kid I did this quite a few times, so that's one more for the list of stuff that could have killed me. At the same time, I didn't have nice insulation-piercing tips back then (I do now), and that may be what saved me. I will definitely not try this again.
Another story along the same lines: I heard that a horse was killed by contact with a lantern battery, but I don't have any reference for that, just a story from a family member who collected coaches.
I have a couple of those narrow escapes one of which led me to put a significant chunk of Eastern Amsterdam out of power. Another involved Beryllium oxide. 9 lives are barely enough.
Ah! Perhaps you are a member of the gigawatt club? Eligible for entry once you have accidentally tripped off 1000 MW of load or generation! No sweeping that under the table
I'm the idiot that sent a fairly high-voltage spike into the grid, setting off a cascade. Even years later I do not fully understand how it could happen; you'd think the grid would be low-impedance enough to absorb a spike like that. But it set off a cascade on a part of the local grid that was known to be weak.
> DC arcs aren't self-extinguishing, unlike AC arcs. At one point I stuck a voltage probe in, and the arc stuck with it as I pulled the probe away. It also vaporized the metal tip of the probe.
It would have self-extinguished if you waited long enough for the probe to vaporize.
> My understanding is that DC breakers are somewhat prone to fires for this reason, too.
I think it's that DC breakers are more expensive, so people use AC-rated breakers instead. They are both rated for 400V @ 10A, it's the same thing, right?
It turns out they are not, and most people, even electronics types, rarely play with 200V+ DC.
I've worked overseas a lot and one thing that's really different from 2 decades ago is that I simply don't need a step-down transformer anymore because every single thing I plug in converts to DC (or otherwise accepts dual-voltage) anyways. So I have a giant collection of physical plug adapters because every device I use just needs to fit into the socket and takes care of it from there.
I spent a few years getting flown out around the world to service gear at different datacenters. I learned to pack an IEC 60320 C14 to NEMA 5-15R adapter cable and a dumb, un-protected* NEMA 5-15R power strip. While on-site at the datacenters, an empty PDU receptacle was often easy to find. At hotels, I'd bring home a native cable borrowed from or given to me by the native datacenter staff or I'd ask the hotel front desk to borrow a "computer power cable," (more often, I'd just show them a photo) and they generally were able to lend me one. It worked great. I never found a power supply that wasn't content with 208 or 240V.
*: Some fancier power strips with surge suppression have a MOV (metal-oxide varistor) for over-voltage clamping that may burn up if given 200V+, rendering the power strip useless. Hence, unprotected strips are necessary.
I've discussed this with people familiar with the matter, and they convinced me it's really not worth it for many reasons, the main one being safety: DC arcs are self-sustaining. AC voltage constantly goes to zero, so if an arc were to form, it gets auto-extinguished when the voltage drops. With DC this never happens, meaning every switch or plug socket can create these nice long arcs and is a potential fire hazard.
The 'what is safer' question for DC and AC at the same effective current and power has a mixed set of answers depending on conditions. For instance, DC is more likely to cause your muscles to contract and not let go (bad), but AC is more likely to send your heart into ventricular fibrillation (also bad).
AC arcs are easier to extinguish than DC arcs, but DC will creep much more easily than AC, and so on.
From a personal point of view: I've worked enough with both, up to about 1 kV at appreciable power levels and much higher than that at reduced power. Up to 50 V or so I'd rather work with DC than AC, but they're not much different. From there up to 400 V or so I'd much rather have AC, and above 400 V the answer is 'neither', because you're in a gray zone where creep is still low, so you won't know something is amiss until it is too late. Above 1 kV in normal settings (say, the picture tubes in old small b&w TVs, and higher up when they're color and larger) it will throw you right across the room, but you'll likely live because the currents are low.
HF HV... now that's a different matter, and I'm very respectful of anything in that domain; I still have a burn from a Tronser trimmer more than 45 years after it happened. Note to self: keep an eye on the SWR meter/spectrum analyzer and on finger position while trimming large end stages.
Thanks Jacques. So creepage is when current flows/arcs across the surface of an insulator, vs. through the air. And it's worse with DC due to its unidirectional nature, and worsens when pollution builds up or the surface degrades.
Indeed. And it's a really nasty thing to properly protect against, because that pollution, especially on stuff that is unattended for a long time, has a habit of ending up much worse than your worst fantasies. I've taken more than one electrocuted mouse out of the HV section of older color TVs, for instance. Up to 250 V or so it is manageable; above that you can get the weirdest problems, including completely invisible arcing where the only giveaway is the ozone smell and the occasional click. Looking at HV circuitry in the dark, or putting a flame near a suspect spot, is a great way to spot these kinds of issues.
Really depends on what we're talking about. A lot of electrical safety equipment has a DC rating, usually something like 90VDC/300VAC. Also, most DC equipment just isn't going to have the stored energy to generate a big arc. Well, except batteries, and we're already piling them all around us.
I mean, it depends, but dual-rated stuff has both a voltage and a current limit, and the DC ones are way lower.
Like, typically a 230 V/20 A AC switch can switch 24 VDC/2 A. And the energy is not in the equipment, it's in the mains (or batteries like you said, or PV panels).
Right, but that's why I mentioned safety equipment. Your common DIN-mount UL-489 branch circuit breaker will be rated for the same trip current, same short circuit current rating (SCCR), but lower voltage. So you can use the same wiring and breakers as you might have with AC and your 48V battery bank won't vaporize the $5 hardware store toggle switch that somehow became a shunt.
I mean, most AC circuit breakers use electromagnets to trip on overcurrent (as well as bimetallic strips, using thermal methods, for sustained high current).
Electromagnets don't work for DC, so your breaker will never trip. For thermal protection you need current, so that checks out, and it would make sense for it to be rated under 50 V, as that's considered the highest voltage that's not life-threatening on touch.
PV batteries in general have very high current (100s of A) at ~50 V, so I don't think there's a major use case for using household breakers with them.
I'm still not getting your point, BTW: switches and breakers are two separate things with different workings, and household (and datacenter) DC would, I think, be around 400-ish V, which is a bit higher than the peak voltage of AC but still within the arc limits of household wiring (at least in 230 V countries).
The advantage of DC is that you use your wiring more efficiently as the mean and peak wattage is the same at all times. Going with 48V would mean high resistive losses.
> Electromagnets don't work for DC, so your breaker will never trip.
If electromagnets don't work for DC then what am I supposed to do with this pile of DC solenoids and relays? ;)
> PV batteries in general have very high current (100s of A) at ~50 V, so I don't think there's a major use case for using household breakers with them.
That's what the SCCR rating is for. When there's a fault you're going to have a LOT of current flowing until your safety kicks in. Something like the grid or a battery bank will happily provide thousands of amps almost instantaneously. Breakers designed for protecting building wiring are rated for this. Now, most household breakers aren't dual DC/AC rated, but you can actually buy DC rated breakers that fit in a home panel (Square D QO series).
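To put a rough number on that fault-current point, here's a minimal Ohm's-law sketch; the 5 mΩ loop resistance is an assumed illustrative figure (bank internal resistance plus cabling), not from any spec:

    # Prospective fault current from a 48 V battery bank (illustrative).
    V_bank = 48.0     # battery bank voltage, volts
    R_loop = 0.005    # assumed total fault-loop resistance, ohms
    I_fault = V_bank / R_loop
    print(f"{I_fault:.0f} A prospective fault current")  # -> 9600 A

Even a modest bank can push four-figure amps into a bolted fault, which is exactly what the SCCR rating has to cover.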
> I'm still not getting your point, BTW: switches and breakers are two separate things with different workings, and household (and datacenter) DC would, I think, be around 400-ish V, which is a bit higher than the peak voltage of AC but still within the arc limits of household wiring (at least in 230 V countries).
My point is that there isn't any material reason why DC can't be as safe as AC, all the proper safety equipment already exists. Extinguishing a DC arc during a fault is a solved problem for equipment at household scale.
> The advantage of DC is that you use your wiring more efficiently as the mean and peak wattage is the same at all times. Going with 48V would mean high resistive losses.
I just mentioned 48V because it's a common equipment voltage for household DC systems. 400V would be good for big motors and resistive heating loads.
Regarding DC vs AC and wiring efficiency, talking about mean vs peak wattage just confuses the issue. 1 volt DC is 1 volt RMS. It is an apples-to-apples comparison. If you want to say "we can use 170VDC or 120VAC with the same insulation withstand rating, and at lower current for the same power", then that is absolutely true. But your common 600V THHN building wire won't care if you're using 400V AC or DC, so it's mostly immaterial.
That's actually a recent phenomenon. Before the age of electronics, most household appliances either worked equally well with AC or DC (like incandescent bulbs) or worked well with AC only, given the technology at the time (think anything with a motor: fans, HVAC compressors, etc.).
Taking it to an extreme, the house I lived in while in grad school had wall lamp fixtures that doubled as electric and gas lamps. At some point I imagine it would have been possible to choose between using electric or gas by either flipping the switch or turning a valve. They said "Edison Patent" on them. We could have lit the house on AC, DC, or gas.
Thinking about the failure modes gave me the heebie jeebies, but the gas had been disconnected ages prior.
It's kind of fun that light switches predate electricity. I think you used to turn a key; I guess you were turning a valve? Now that I think of it, using a key to operate a valve makes a lot of sense, but you don't see it too often. Well, I guess you want to be able to turn things off without needing to find a key...
I lived in a 19th century house in San Francisco that had gorgeous plaster work medallions on the ceilings - think cherubs and fruits - in the middle of which were the light fixtures. One day my dumb-ass flatmate made an ill-advised attempt to DIY his light fixture and cracked the still-active gas line embedded in the ceiling. Sometime in the 1920s - the date was printed on a sticker in the electrical panel - when they electrified the house, they'd wrapped the electrical wires around the gas pipes, and left them otherwise in situ. Crazy stuff.
I'm renovating a house, and have been considering 24V or 48V DC outlets in a few rooms. Semiconductors become more expensive above ~32V, so 24V might be the sweetspot.
However, there's also PoE (24 or 48V!), so maybe that's the right approach. It's not like each outlet is going to run a heater anyway.
Lower voltage makes voltage drop across the line proportionally worse. Depending on the purpose PoE is probably the way to go since the wiring and hardware is all standardized and safety certified.
Unless you mean running AC and installing inverters in the wall? What is this even for? All my electronics are DC but critically they all require different voltages. The only thing I might benefit from would be higher voltage service because there are times that 15 A at 120 V doesn't cut it.
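To put numbers on the voltage-drop point above, here's a minimal sketch comparing the same 100 W load fed at 24 V vs. 120 V over an assumed 15 m run of AWG 14 copper (all values illustrative, not from any wiring code):

    # Wire loss for a 100 W load at two supply voltages (illustrative).
    rho = 1.68e-8             # resistivity of copper, ohm*m
    area = 2.08e-6            # AWG 14 cross-section, m^2
    length = 2 * 15.0         # 15 m out and back, m
    R = rho * length / area   # ~0.24 ohm round trip
    for V in (24.0, 120.0):
        I = 100.0 / V         # current drawn by a 100 W load
        print(f"{V:>5.0f} V: drop {I*R:.2f} V ({100*I*R/V:.1f}%), "
              f"wire loss {I*I*R:.2f} W")
    # 24 V: ~1.0 V drop (4.2%), ~4.2 W lost in the wire
    # 120 V: ~0.2 V drop (0.2%), ~0.17 W lost

The loss scales as 1/V^2, which is why the same wire that's fine at 120 V gets wasteful at 24 V.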
AC is less efficient than DC at a given voltage. The advantage of AC is that voltage switching is cheap, easy, and efficient; switching DC voltage is way harder, more expensive, and less efficient. However, the switching costs are O(1) and the transmission losses are O(n), so past some distance (currently somewhere around 500 km) it's worth paying the switching cost to get super-high-voltage DC. The big thing that's changed in the last ~30 years is a ton of research into high-voltage transistors, and fast enough computers to do computer-controlled MHz switching of giant high-power transistors. These new switching technologies brought the switching costs down from ludicrous to merely annoyingly high.
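A back-of-envelope sketch of that O(1)-vs-O(n) tradeoff; the loss figures below are assumptions in a plausible ballpark, not quoted specs:

    # Where HVDC starts to beat HVAC (illustrative numbers only).
    conv_loss = 0.015      # assumed ~0.75% loss per converter station, x2
    ac_per_km = 7.0e-5     # assumed ~7% line loss per 1000 km for AC
    dc_per_km = 3.5e-5     # assumed ~3.5% per 1000 km for DC
    breakeven = conv_loss / (ac_per_km - dc_per_km)
    print(f"breakeven ~{breakeven:.0f} km")  # -> ~430 km

which lands in the same ballpark as the "somewhere around 500 km" figure.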
To expand on the parent comment: a given power line can only take a set maximum current and voltage before it becomes a problem. DC can stay at this maximum voltage constantly, while AC spends time going to zero voltage and back, so it's delivering less power on the same line.
Maybe, if by "same voltage" we mean DC voltage equal to AC peak voltage. When we talk about AC voltage we are referring to root-mean-square (RMS) voltage. It's kind of like an average, though for math reasons the plain average of an unbiased sine wave is 0. Anyhooo, 1 V RMS into a load will produce the same power as 1 V DC. If AC delivered less power than DC at the same voltage, life would be very confusing.
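A quick numeric sanity check of that claim; nothing here beyond the definition of RMS:

    import math
    # Average power of a 1 V-RMS sine into a 1-ohm load vs. 1 V DC.
    n = 100_000
    vpk = math.sqrt(2)   # peak of a 1 V-RMS sine, ~1.414 V
    p_ac = sum((vpk * math.sin(2 * math.pi * k / n)) ** 2
               for k in range(n)) / n   # mean of v^2 / R, with R = 1 ohm
    p_dc = 1.0 ** 2                     # 1 V DC into the same 1-ohm load
    print(round(p_ac, 4), p_dc)         # -> 1.0 1.0

Same average power, but note the AC waveform peaks at ~1.41x the RMS value, which is where the insulation argument elsewhere in the thread comes from.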
That's true, but my understanding is that the main contributor is the skin effect: AC travels only near the surface of the wire, while DC uses the whole cross-section, resulting in lower resistive loss (https://en.wikipedia.org/wiki/Skin_effect)
This, IIRC, is the smallest of the 3 problems. The other 2 are the skin effect (AC wires only carry current near the outside of the wire) and capacitive effects (a wire running parallel to the ground is a capacitor, and AC is equivalent to constantly charging and discharging that capacitor).
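To illustrate the capacitive point with assumed-but-plausible numbers (0.2 µF/km is a rough figure for HV underground cable, not a quoted spec):

    import math
    # Charging current of a buried AC cable (illustrative values).
    C_per_km = 0.2e-6                  # assumed cable capacitance, F/km
    V_phase = 230e3 / math.sqrt(3)     # phase-to-ground volts, 230 kV line
    f, length_km = 50.0, 50.0
    I_charge = 2 * math.pi * f * C_per_km * length_km * V_phase
    print(f"{I_charge:.0f} A")         # -> ~420 A just charging the cable

Hundreds of amps of the cable's ampacity doing nothing but sloshing charge back and forth; a DC line pays that cost once, at energization.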
The primary benefit of AC is that it's really easy to change the voltage up or down.
The transmission efficiency of AC comes from the fact that you can pretty trivially make a 1-megavolt AC line. The higher the voltage, the lower the current has to be to deliver the same amount of power, and lower current means less power lost in the line, due to how electricity be.
But that really is the only advantage of AC. DC at the same voltage will ultimately be more efficient, especially if it's humid or the line is underwater. Due to how electricity be, a change in the current of a line induces a current in nearby conductive materials, so a portion of AC power is drained simply because the current on the line is constantly alternating. DC doesn't alternate, so it never loses power that way.
Another key benefit of DC is that it can bridge grids. The problem with interconnecting grids is entirely due to the nature of AC power: AC has a frequency and a phase, and if two grids aren't synchronized in frequency and phase (Europe has multiple separate synchronous areas, as do the US interconnections), they cannot be connected directly. Otherwise the power generators end up fighting each other rather than providing power to the load.
In short, AC won because it was cheap and easy to make high-voltage AC. DC is coming back because it has only somewhat recently become affordable to do similar high-to-low and low-to-high voltage conversions with DC, and DC carries further benefits that AC does not.
An important factor is that AC at a given nominal voltage V swings between +1.41·V and -1.41·V, so it requires roughly 40% better/thicker insulation than the equivalent V-volt DC line. This is OK for overhead lines (just space the wires more) but is a pain for buried or undersea transmission lines; for that reason, those tend to use DC nowadays.
> I always thought AC’s primary benefit was its transmission efficiency??
There are many factors involved, and "efficiency" is only one. Cost is the real driver, as with everything.
AC is effective when you need to step down frequently. Think of the transformers on poles everywhere. Stepping down AC using transformers means you can use smaller, cheaper conductors to get from high-voltage transmission to lower-voltage distribution and, finally, to the consumer. Without this, you'd need massive conductors and/or high voltages and all the costs that go with them.
AC is less effective, for instance, when transmitting high power over long, uninterrupted distances or feeding high density DC loads. Here, the reactive[1] power penalty of AC begins to dominate. This is a far less common problem, and so "Tesla won" is the widely held mental shortcut. Physics doesn't care, however; the DC case remains and is applied when necessary to reduce cost.
How is DC better than a three-phase delta 800 Vrms at 400 Hz?
- Three conductors vs. two, but they can be a gauge thinner since the current flows on three conductors
- no significant skin effect at 400Hz -> use speaker wire, lol.
- large voltage/current DC breakers are... gnarly, and expensive. DC does not like to stop flowing
- The 400Hz distribution industry is massive; the entire aerospace industry runs on it. No need for niche or custom parts.
- Three-phase @ 400 Hz gives 6 pulses per cycle = 2.4 kHz. Six diodes will rectify it with almost no relevant amount of ripple (Vmin is 87% of Vmax), and very small caps will smooth it (numbers checked in the sketch after this list).
As an aside, with three (or more) phases you can use multi-tap transformers and get an arbitrary number of poles. 7 phases at 400 Hz -> 5.6 kHz. Your PSU is now 14 diodes and a ceramic cap.
- you still get to use step up/down transformers, but at 400Hz they're very small.
- merging power sources is a lot easier (but for the phase angle)
- DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
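A quick check of the ripple figures in the list above; the only math is that an ideal p-pulse rectifier's output rides overlapping sine crests, so Vmin/Vmax = cos(pi/p):

    import math
    # Ripple floor of an ideal p-pulse rectifier.
    for pulses in (6, 14):
        ratio = math.cos(math.pi / pulses)
        print(f"{pulses:>2}-pulse: Vmin = {100 * ratio:.0f}% of Vmax")
    # 6-pulse: 87% (the figure above); 14-pulse: ~97%

So the 87% figure for six diodes checks out, and the 7-phase/14-diode variant barely ripples at all.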
"now run that unshielded wire 50 meters past racks of GPUs and enjoy your EMI"
Multipole expansion: the field falls off faster than 1/r^2.
Also, I'm not in the field (clearly), but GPUs can't handle 2.4 kHz? The quarter wavelength is 30 km.
"nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms"
Current-wise, the catalog covers this range just fine. As to the voltages, well, that's the whole point of AC! The voltage you need is but a few loops of wire away.
"you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates"
So keep it? To clarify, this is the "we're too good for plebeian power, so we'll transform it AC->DC->AC" thing, right?
"SiC solid-state DC breakers are shipping today from every major vendor"
Of course they do. They're also pricey, have limited current capability (both capital-cost issues, and therefore irrelevant when the industry is awash with GCC money), and have lower conductance, and therefore higher heat.
They're really nice though.
"wide-bandgap converters are at 95%+ with no moving parts"
Transformers have no moving parts either. Loaded they can do 97%+ efficiency: versus a 95% converter, that's 2 MW of heat eliminated on a 100 MW center.
An advanced AI rack might use 100 kW = 800 V × 125 A, requiring AWG 2, a quarter inch in diameter: this isn't your lol speaker wire. Actually, I apologize, I realized I may be talking to a serious audiophile, didn't mean to disrespect your Monster cables.
The skin depth, by the way, is sqrt(2 · 1.7e-8 Ω·m / (2π · 400 Hz · μ0)) ≈ 3 mm for copper: OK for a single rack, but it starts to be significant for the type of bus bars that an aisle of racks might want.
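Same formula as a quick script, using the copper resistivity assumed above:

    import math
    # Skin depth: delta = sqrt(2*rho/(omega*mu0)) = sqrt(rho/(pi*f*mu0)).
    rho = 1.7e-8              # resistivity of copper, ohm*m (as above)
    mu0 = 4e-7 * math.pi      # vacuum permeability
    for f in (50.0, 60.0, 400.0):
        delta = math.sqrt(rho / (math.pi * f * mu0))
        print(f"{f:>5.0f} Hz: {1000 * delta:.1f} mm")
    # ~9.3 mm at 50 Hz, ~8.5 mm at 60 Hz, ~3.3 mm at 400 Hz

The ~3 mm figure falls out directly, as does the square-root scaling discussed further down.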
As for efficiency, both 400 Hz transformers AND fancy DC-DC converters are around 95% efficient, except that AC requires electronics to rectify it to DC, losing another few percent, so the slight advantage actually goes to DC.
As for merging power, remember that a DC-DC converter uses an internal AC stage, so it's the same: you can have multiple primary windings, just like with plain AC.
> I realized I may be talking to a serious audiophile, didn't mean to disrespect your Monster cables.
I am a recovering audiophool.
I do own a pair of 2m long Monster Cable speaker cables (with locking gold plated banana plugs). I am fairly certain I've used welders with smaller cables.
(In my defence, I bought those as a teenager in the late 80s. I am not so easily marketed to with snake oil these days. I hope.)
(On the other hand, I really like the idea of a reliably stable plus and minus 70V or maybe 100V DC power supply to my house. That'd make audio power amplifiers much easier and lighter...)
400Hz is an aircraft hack. In a data center, where batteries and most of the stuff behind the PSU already want DC, cutting conversion stages and a bunch of UPS weirdness is a boring win even if DC breakers are nastier and pricier. If you want switchgear with aerospace pricing in a building full of racks, AC at boutique frequencies is one way to get there.
>- no significant skin effect at 400Hz -> use speaker wire, lol.
What are you talking about? There's a very significant skin effect at 400Hz. Skin effect goes up with frequency. These datacenters use copper busbars, not cable, so skin effect is an important consideration.
At 100 000 A for a 100 MW data center at 1000 V, speaker wire is a joke.
You obviously need at least a dozen strands in parallel!!
Clearly skin effect scales with frequency, but 400 Hz is still low: the depth scales by the square root of frequency, so it's only ~2.5x thinner than at line frequency, about 3 mm. And 3 mm on each side makes for a pretty hefty rectangular cross-section.
If you could get that 100,000Amps flowing through your speaker wire, the vaporised copper and the plasma channel would probably keep your 100MW flowing, at least until your building caught fire.
This would be the funniest thing to do. 100 kA is doable, the question is for how long; that would be one very impressive bank of capacitors. And turning a 00-gauge cable into plasma would have some spectacular side effects, such as raining molten copper across a sizeable area. Your reading glasses would indeed not be enough; there probably isn't any PPE I would consider entirely safe other than sufficient distance from ground zero. But now I'm really curious. I have a spot welder that will do bursts of 5 kA, and that will happily throw the breaker every so many welds. 100 kA sustained would be a fair engineering challenge.
Ah, that Lego project... that was one I always wondered whether I should have industrialized, but sourcing enough Lego was a real problem.
Holy crap. That's a whole series of bad ideas extremely well executed. That guy has probably never seen what a lead-acid battery can do when it explodes. He keeps hiding from the hot metal but stays in the path of ~half of those batteries. Ignorance is bliss.
I think there's a regulatory "Low Voltage" definition of "below 50V", which has implications for whether you need to be a licensed electrician to install it. Anything above that is, for at least some purposes, considered "High Voltage".
Other people, of course, have other definitions of high voltage:
"This resonant tower is known as a Tesla coil. This particular one is just over 17 feet tall and it can generate about a million volts at 60,000 cycles per second."
and:
"This pulse forming network can deliver a shaped pulse of over 50,000 amps with a total energy of about 1,057 times the tower primary energy"
Transitioning? It already happened decades ago. Only smaller-scale/generic or less proficient "we bought all Dell and HP" shops use AC. At large scale it's been a ton of DC for literally decades, and for 70 years in telco and network gear.
AC is also waaaay safer for households: since the voltage drops to zero 100 times per second (at 50 Hz), switches are cheaper and safer, and electrocution is less likely to happen.
The large brick you have on all your tech when you plug it in is the converter. AC works great for some applications, none of them really technical in nature.
It is absolutely stupid to talk about this as Edison's revenge. If Tesla had had the modern high-power transistors needed to get high-voltage DC out of the AC produced by a spinning turbine, he would have been all for high-voltage DC too. Tesla understood that high voltage was needed for efficient long-range transmission. He also understood that transformers were the only remotely efficient way to climb up to and down from these high voltages, and transformers only work with AC. So he designed an AC system and even designed some better transformers for it.
If there had been anything like a high-power transistor back then, he would have used it. High-power transistors robust enough to handle the grid were designed only recently, over 100 years after the Tesla/Edison AC/DC argument.
>It is absolutely stupid to talk about this as Edison's revenge. If Tesla had had the modern high-power transistors needed to get high-voltage DC out of the AC produced by a spinning turbine, he would have been all for high-voltage DC too.
This!
The sooner people realize these facts the better. Pervasive high-rise buildings did not happen before the invention of modern cranes, either.
Exactly twenty years ago I was doing novel research on GaN characterization; my supervisors made a lot of money with consultations around the world and successfully founded a government-funded start-up company around the technology. Together with SiC, these are the two game-changing wide-bandgap power semiconductor technologies that are only maturing recently.
Heck, even the Nobel Prize-winning blue LED discovery was only made feasible by GaN. Watch the excellent video Veritasium made about this back story [1].
[1] Why It Was Almost Impossible to Make the Blue LED:
Gallium is expensive to extract because it is extremely dilute in the environment.
It accompanies aluminum and zinc in very low quantities, so it is extracted only in aluminum or zinc mines, as a byproduct.
However, the abundance of gallium is similar to that of lithium, while gallium is used in smaller amounts, so there is no risk of not having enough gallium in the near future.
On the other hand, all semiconductor devices with gallium also use some indium. Indium is used in even greater quantities in all LCD or OLED displays, to make transparent electrodes.
Indium is an extremely rare element in the entire universe, comparable with gold, so for indium there is a much greater risk that its reserves will become insufficient.
This could be mitigated by extracting such critical elements from dumped electronic devices, but that is very expensive: only small amounts of indium are used per device, so very large amounts of garbage would have to be processed in order to extract a sizable quantity.
There's a component of modern culture that trains and expects people to be extremely pessimistic about long-term human development. It results in situations like the above, where without any further information people just assume by default that we're going to run out of a thing and are on a collision course with not just one disaster, but every single conceivable one.
(Gallium is a byproduct of aluminum production. We aren't going to run out.)
My understanding of most elements is if we want more it’s either pretty easy to make from something else we have a lot of, or we need to redo the Big Bang, the latter being, in my opinion, a bit of a disaster scenario.
Even synthesizing helium is prohibitively expensive. Unless you want whatever heavy decay products we have from nuclear waste, synthesizing elements at industrial scale probably isn’t happening.
Unless by “make from something” else you mean extract the element from existing chemical compounds found in Earth, in which case we’re still just using existing deposits on Earth.
On the other hand, it is possible to run out of a metal when all of it is either somewhere in some device or scattered among landfills (i.e. not concentrated in a place like a mine).
That is true, but gallium is present in aluminum and zinc ores only in minute quantities.
We will not run out of gallium, but it is impossible to scale gallium production beyond the level set by the current production of aluminum and zinc.
So there is a maximum amount of gallium that can be extracted per year, and it would not be possible to increase the production of blue and white LEDs and of power transistors above that level.
Fortunately, the amount of gallium used per device is very small, so it is not likely that we will hit that level soon. A much more serious problem is the associated consumption of indium, for which the resources are much scarcer.
Practically speaking, sure. It's obviously not cost-effective to extract it. But it's there if someone can get it. I don't expect anyone to be extracting gold from ocean water, but there are other sources of other elements that may not be cost-effective now but could be in the future, or may simply become necessary despite the cost.
Cost scales with refinement effort, so it just results in more expensive TVs. That said, I'm pretty sure we'll have drowned the planet in landfilled TVs long before this becomes a serious issue.
From your earlier comment, your curiosity was more about what happens after we run out.
In your question you stated the running out as a given fact ("When" we run out, not "if").
Whether that was what you wanted to say, I can't tell; but that's definitely how it was received, and thus why you got the harsh response, since it reads a lot like doomsday thinking.
(Example: does that mean that when we run out of oxygen there are no more humans?)
Yes, my curiosity was about when we run out, because I didn’t know if we would run out. That was the whole point of the question. Have some leniency, we’re not all experts about everything.
Sidenote: whenever someone tells you that (vital) reserves of some resource are going to run out soonish (implying drastic consequences), you should be extremely skeptical:
Such predictions have an abysmal historic track record, because we tend to find workarounds both on the supply side (=> previously undiscovered reserves) and on the demand side (using substitutes).
This has applied historically to oil, lithium, rare-earth metals, and basically everything else.
edit: I'm not saying we're never gonna run out of anything-- I'm just saying to not expect sudden, cataclysmic shortages in general, but instead steadily rising prices and a somewhat smoothish transition to alternatives.
I always add "cheap" to the sentence. It seems they are always talking about the cheap version of anything. Going to run out of water? Or are we running out of the "cheap" version of water that does not have to be processed?
This is a valid point: quickly depleting reserves often indicate that pricing is not sustainable. Which is bad.
But non-sustainable pricing is very different from "cataclysmic collapse", and too many people expect the latter for too many things, which is just not realistic in my view (and historical precedent makes a strong case against that assumption, too).
A society where water prices gradually increase to "reverse-osmosis only" levels (instead of "pump-from-the-ground-everywhere") is very different from a society where water suddenly runs out.
> Such predictions have an abysmal historic track record, because we tend to find workarounds both on the supply side (=> previously undiscovered reserves) as well as flexibility on the demand side (using substitutes).
That's a classic example of the "preparedness paradox" [1]. When no one raises the alarm in time, or it is ignored, resources can be (effectively) exhausted before alternatives are found, and countries either need to pay extraordinary amounts of money or go to war outright. This has happened in the past with guano [2], which was used for fertilizer and gunpowder production for well over a century until the Haber-Bosch ammonia process was developed at the start of the 20th century.
And we're actually seeing a repeat of that happening right now. Economists and scientists have sounded the alarm for decades that oil and gas are finite resources and that geopolitical tensions may impact everyone... no one gave too much of a fuck because one could always "drill baby drill", and now look where we are: Iran has blasted about 20% of Qatar's LNG capacity to pieces and blocked off the Strait of Hormuz, sending oil prices skyrocketing.
I've seen articles from the 1880s claiming oil would run out by 1890. 140 years later...
Yes, we can run out of oil, but nobody really knows if or when that will happen. Right now I'm guessing we won't, because wind and solar are so much cheaper for most purposes that everyone is shifting anyway - this will take decades to play out.
I don't see the guano industry as a straight counter-example; it even illustrates my point:
If you had made predictions/scenarios in 1850 based on guano deposits running out within a decade or two, you would have mispredicted completely, because a lot of the industry just transitioned to sodium nitrate (before synthetic fertilizers took over). Today's media landscape would've gladly made such doom-and-gloom predictions for global agriculture back then.
I completely agree that quickly depleting reserves often indicate non-sustainable pricing of resources (which is obviously bad long-term), but that is very different from sudden collapse.
the internet really needs to stfu about tesla and get over that oatmeal comic that spawned a billion internet myths. dude was a decent inventor but suffered from chronic mental health issues and, in his lifetime, wasted a lot of time/energy/money and burned many bridges with his horrible attitude. there's a reason most people didn't like him in his day: he was a depressed asshole who alienated everyone around him, and yes, I know he was likely gay at a time when that wasn't accepted. the fact remains: his inventions are massively overblown by internet nerds.
the podcaster Sebastian Major from "Our Fake History" did a looonnngg patreon episode on tesla and debunked most of the weird myths around tesla. Sebastian doesn't have a vendetta or anything, it's just amazing how much of the Tesla stuff is just nonsense or is viewed through a very weird bias nowadays. Major also briefly touches on the weird Edison stuff and how the internet has twisted Edison into a villain.
Software engineers idolize Tesla because they see themselves as the Tesla (a selfless devotee of the abstract idea of technology) against evil Edisons (businessmen who only care about money and steal other people's ideas). They've basically projected the Jobs/Woz divide back onto two historical figures who, in reality, barely interacted.
The funniest part is that The Oatmeal comic didn't invent this concept, but drew on pre-Internet narratives put forward by The Tesla Society, who were mailing busts of Tesla to universities around the country since the 70s at least. And that organization is explicitly nationalistic and religious, tied to other Serbian-American heritage organizations, and doing events with the Orthodox church.
People need heroes. It's like the Keanu Reeves or Musk era: all the "badass" stories about this or that soldier / local hero / whatever are very often overblown and get further and further away from the initial facts every time they resurface.
No hate here, just noticing there's a weird visceral need to distill stories to their most essential form, good vs. evil, and the Tesla v. Edison thing embodies this perfectly, I think.
Keanu Reeves, and to a degree Nikola Tesla as well, are decent figures.
Aside from all the cult classics Keanu is part of, like John Wick and The Matrix, and even discounting those, he is a genuinely humble, good person in his own right, and might be one of the best people in Hollywood.
What I feel pissed about is that people like Andrew Tate took the concept of The Matrix and the contributions Keanu made within that movie and tried to capitalize on that cult classic decades later in the most toxic form; that might be the issue, if we are talking about an era.
To be honest, Nikola Tesla was also a great person within the context of his time. GGP's comment is still true, but Tesla's contributions can hardly be denied, and I'd much rather people believe these to be the heroes (Keanu/Tesla) than Tate/Musk etc.
If I take anything from Keanu, I would like it to be his humility.
Whilst I agree that Keanu is a most excellent human, he was hardly responsible for the concept of the Matrix. In my opinion, Philip K Dick was a major influence (I'm a fan and consider him the prophet of the modern age), though Gibson's Neuromancer was likely a big influence too. (Also, there's the old Doctor Who episode "The Deadly Assassin" which features the Matrix).
It always seems to me that the far right is bereft of original ideas and co-opts pre-existing concepts. There are exceptions, but I find right-wing works consistently lacking in humour or irony (c.f. Ayn Rand's works).
I mean, yeah, but it's not like the guy's "horrible attitude" came from nowhere. He naively romanticised migrating to the US, thinking the game was about scientific progress rather than capital, and so he got repeatedly screwed over by almost everyone around him for decades.
If I was in his position I'm not sure I'd have taken it as well as he did.
There's no way he suddenly developed autism or whatever mental illness plagued him upon arrival in America. Like most absolute geniuses, he struggled in other areas. He said he had visions as a child.
Tesla was an outstanding technologist, but a poor businessman. He had a "vision" (actually more than one) about how his ideas could transform the world. Some of his ideas were amazing, but he was swindled out of his patents because the investors knew he had a passion and wanted to see them in use. The polyphase AC motor or fluorescent light bulb could have made him millions.
IMHO, the vision he had of universal free electricity (transmitted wirelessly) was the dumbest. It was a novel idea, and he invested a lot (his own time and other people's money) in it. The problem with the idea is that there was no way to monetize it and profit from it. There were also the technical issues: power loss over distance (1/R^2), harm to the environment, and interference with radio communications.
Edison was quite a villain. He stole many of his "inventions", and orchestrated a PR campaign against Tesla touting the "evils" of AC power. AFAIK, the electric chair was either invented or inspired by him.
I know these things because I've read many books on various topics related to Tesla, and all of this knowledge predates the Internet.
Essentially none of this is true. The war of the currents was between Edison and Westinghouse, not Tesla. Tesla's downfall was that he turned into a crackpot who rejected modern science, such as Maxwell's equations, and started defrauding investors. Edison was an outspoken opponent of the death penalty, and the electric chair used AC simply because it is much more deadly.
> The war of the currents was between Edison and Westinghouse [...]
Thank you for quashing the gross misinformation. I was going to post this, but searched and found your comment. `\m/`
(I learned of the "Current War" in the 70's, since the Edison Museum was in my "backyard" -- and was a common destination of local school field trips.)
Edison did not invent the electric chair. When its inventors were trying to choose between AC and DC, he helped them decide on AC as part of his PR campaign.
Also, if anything would have been Edison's revenge it would have been HVDC, where they're sending power long distances with DC. (But as you said, even there it wouldn't make a ton of sense, since they were arguing in a different era).
The two primary reasons to do that are to allow the intertie of two AC grids that are not otherwise synchronized, and to take advantage of "earth return" paths when necessary to double the capacity of the line. The latter you may need to consider just to make the line cost effective over an equivalent AC span.
sure, and also Montezuma didn't actually plan on diarrhea ruining people's vacations, but vernacular usage being what it is we have the phrase Montezuma's revenge.
I only found Edison in the headline, I didn't find it anywhere in the body, nor did I find Tesla. Glancing through the article it almost seems like someone tried to make a catchy headline to get clicks.
Yeah, this isn't an argument. It was far simpler to wrap some copper wire around a chunk of metal than it was to fire up a MOSFET fabrication plant in the 1800s.
You can have the best idea in the world, but if you can't manufacture it you're SOL.
Note that one could email the mods to de-clickbait/de-enrage the title, especially with such a concrete point as this comment's. (I haven't done so, as TIL is a poor basis for such an argument.)
Yes, but a rectifier only rectifies. That's not going to give you DC-DC conversion - let alone conversion to a higher voltage for long-distance transmission.