Sometimes when working on a righteous hack, we get goosebumps while watching our code execute faster than we could ever possibly comprehend. Seeing the pixels of the LCD come alive, hearing the chatter of relays and the hum of fans…it’s an amazing thing what electricity can do. And it is equally amazing when you realize that it all started one hundred and thirty-five years ago, when [Thomas Edison] changed the world forever with the first practical electric light bulb.
That bulb was lit by a Direct Current – the same thing that runs the computer you’re reading this article on. The same thing that runs many of the hacks you read about here on Hack a Day, and almost all electronic devices in your house. But somewhere in the mix must exist a device that changes the Alternating Current from your wall outlet to the needed DC. Why? Why is it that we transport electricity as AC only to convert it to DC in our homes? You might answer:
“This argument was played out in the War of Currents back in the 1880s.”
Indeed, it was. But that was a long time ago. Technology has changed, so much so that the arguments in the War of Currents might no longer be valid. Join us after the break, where we rehash these arguments and explore the feasibility of an all-DC environment.
Let’s see…110 VAC in, 5 V DC out, 1000 mA…this should work. Quick check with the meter to make sure it’s actually 5 V and not 50, and you’re up and running. Each and every one of us has done this at some point in our lives. But why do we have to? Is there any reason we can’t have DC outlets? We’ve seen USB ports built into outlets while strolling the aisles of our favorite hardware stores, but those are really just switch-mode supplies tucked behind the faceplate.
You would still need AC for kitchen appliances and such. But consider changing these over to DC. Imagine a house where everything ran on DC!
Let’s take it further and imagine running DC from the power station to your house. This brings us back to the War of Currents. We all know that it’s relatively easy to step AC voltage up and down. You just need a transformer. But it’s not that easy with DC, so running DC over long distances is just not practical. Indeed, this was true in the late 1800s. But is it true today? The technology exists to step up DC to higher voltages. But can we do so efficiently?
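For a sense of why stepping up matters at all, here is a rough back-of-the-envelope sketch. The load, line resistance, and voltages below are made-up illustrative numbers, not figures from any real grid; the point is only that resistive loss scales with the square of the line current:

```python
# Rough illustration: delivering the same power over the same line at a
# higher voltage means less current, and resistive loss scales as I^2 * R.
# All numbers here are made up for the example.

def line_loss(power_w, volts, line_resistance_ohm):
    """I^2 * R loss for a given delivered power and line voltage."""
    current = power_w / volts
    return current ** 2 * line_resistance_ohm

POWER = 10_000.0    # 10 kW of load
R_LINE = 0.5        # total round-trip line resistance, ohms (illustrative)

for v in (120, 1_200, 12_000):
    loss = line_loss(POWER, v, R_LINE)
    print(f"{v:>6} V: {loss:8.1f} W lost ({100 * loss / POWER:.3f} % of the load)")
```

Whether the line carries AC or DC, the same arithmetic applies; the historical difference was simply that only AC had a cheap way (the transformer) to reach those higher voltages.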
Lots of questions still remain. Be sure to sound off in the comments on the idea of an all DC house. Good idea? Or not so good idea?
Want an all-DC home? Live in an RV.
I know this is late, but I think you guys are missing one important part.
Electricity is generally created by turbines (AC).
I guess you have never heard of solar panels… Also, wind turbines output “wild frequency” AC (at least the smaller ones), which generally has to be rectified for use.
Solar panels lack the ability to directly replace turbines due to lack of control. This is true of wind power as well. All sorts of schemes have been suggested to solve this (I’d like to see an industrial scale flywheel system that uses graphene or carbon-nanotubes for a higher maximum speed), but the only one that I know of actually being used is a pump+generator variant on hydropower, and only where you have both plenty of water, and somewhere convenient AND HIGH to pump massive amounts of it to. There are certainly things like the pumped-electrolyte (or maybe it was dielectric?) batteries, but the liquid is apparently too expensive for mass use.
Carbon-fuel electrical generation replacements for mass use mostly come down to variations on hydroelectric (many of the dam sites are either legally unusable, or used and unexpandable, leaving mostly the newer “alternate tech” systems), nuclear (which almost all environmentalists seem to oppose, despite the primary problem of half-life being solvable with legal changes), and some stuff that we haven’t achieved yet (net-power fusion (I have hopes for the Whiffle-ball design), solar-power satellites (environmentalists will oppose those too, as will NIMBYists, and you need space-borne mining, refineries, and factories anyway), jet-stream wind turbines).
So, as far as generation is concerned, it’s just not a convincing reason for in-house DC. The convincing reason is entirely the possibility of cheaper devices, and with USB we’re already seeing it delivered on.
Why not bring AC to the supply inlet of the building (since transmitting AC voltage is more cost efficient), then at the breaker box have specified circuits converted to DC at a specified voltage (or voltages) to power all other DC appliances? This could conceivably be done with a single rectifier and multi-tapped transformer. The DC outlets could either be labeled with their output voltage, or be color coded.
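As a hedged sketch of what those taps might look like (the tap voltages and diode drop below are assumed, not taken from any real design), a full-bridge rectifier on each secondary tap would give roughly:

```python
# Hedged sketch of the "multi-tapped transformer + single rectifier" idea:
# estimate the unloaded, filtered DC from a full-bridge rectifier on each
# secondary tap.  Tap voltages and diode drop are assumed values.
import math

DIODE_DROP = 0.7                          # volts; two diodes conduct at a time
SECONDARY_TAPS_RMS = [4.5, 9.5, 18.0]     # hypothetical tap voltages (RMS)

for v_rms in SECONDARY_TAPS_RMS:
    v_dc = v_rms * math.sqrt(2) - 2 * DIODE_DROP   # cap charges near the peak
    print(f"{v_rms:4.1f} V RMS tap -> roughly {v_dc:4.1f} V DC unloaded")
```

Taps around 4.5, 9.5 and 18 V RMS land close to the familiar 5, 12 and 24 V rails, which is why the labeled or color-coded outlets would be easy to standardize.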
Simple resistive load appliances, like low cost toasters, heaters, incandescent lighting, etc, do not require AC at all, and can run on 110 DC with no modification. I read about this years ago in an article about going off the grid with solar.
Switches may fail faster, due to a 100% chance of an arc with DC, compared with some lower chance on AC.
Long distance transmission of even high voltage DC has other issues besides bringing the voltage down to what you want to use: there is induction on the lines, which greatly limits DC transmission.
On getting shocked, it’s not AC that makes muscles contract, DC does, without relaxing, meaning you can’t let go, and causing burns and nerve and muscle damage quickly. AC causes contraction and relaxation at 60 Hz, and allows a decent chance to let go.
I’m building an off-grid house at the moment that uses DC.
All the lighting is LED, there will be USB plugs fitted as outlets, and where required there will be 240 V inverters for specific appliances (and a couple of spare ones).
The reason for me going this way is efficiency, longevity and cost. Installing a single large AC inverter is not only costly (1. buying the inverter, 2. paying an electrician to install the 240 V wiring), but the life span of an inverter under constant operation may also be as short as 5 years (though probably longer).
By distributing the load across many smaller inverters I get an increased life expectancy for each inverter and a high level of redundancy. Multiple small inverters for solar installations (micro-inverters attached to each panel) are starting to become a thing for those very reasons.
Except for the TV, we use USB to charge every entertainment/communication device we own. I’m converting the fridge and washing machine to run on DC, and everything else will be run from inverters (e.g. kettle, toaster).
Large wattage heaters and air conditioners won’t be required because the house has a 7.5 star energy rating and has passive heating and cooling features (thermal mass and a ground well heat pump). We’re building what is referred to as an earthship: http://en.wikipedia.org/wiki/Earthship
The plan is to use an H-bridge (or possibly mechanical) circuit to create a square wave, roughly 50 Hz, 48 V, 30 A AC supply for transmission from the battery bank to the house. This is because the solar panels, wind turbine and backup generator will be housed about 30 m from the house. That supply will then be rectified and converted to the various voltages required to run appliances. The wiring in the house will be as short as possible because the house is round, with a centralised power distribution point that can be accessed from most of the rooms directly.
I don’t have everything worked out and some things may have to change, but I think the power losses through conversion should be less than running a 240 V system, and the cost of installation and long-term maintenance should work out to a net benefit.
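For what it’s worth, here is a quick check on that 30 m run at 48 V and 30 A, assuming a 16 mm² copper cable (the cable size isn’t specified above, so that part is a guess):

```python
# Sanity check on the 30 m battery-shed-to-house run described above:
# voltage drop and loss at 48 V / 30 A.  The 16 mm^2 copper cable is an
# assumption, not something specified in the comment.
RHO_CU = 1.68e-8        # ohm * metre, copper resistivity
ONE_WAY_M = 30.0
AREA_M2 = 16e-6         # 16 mm^2 conductor
CURRENT_A = 30.0
VOLTS = 48.0

r_loop = RHO_CU * (2 * ONE_WAY_M) / AREA_M2     # out and back
v_drop = CURRENT_A * r_loop
p_loss = CURRENT_A ** 2 * r_loop
p_sent = VOLTS * CURRENT_A

print(f"loop resistance: {r_loop * 1000:.1f} milliohm")
print(f"voltage drop:    {v_drop:.2f} V of {VOLTS} V")
print(f"power lost:      {p_loss:.0f} W of {p_sent:.0f} W ({100 * p_loss / p_sent:.1f} %)")
```

Under those assumptions the run loses a few percent, which is the kind of number to weigh against the cost of a heavier cable or a higher transmission voltage.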
I have a few suggestions:
1) It’s been demonstrated that graphene flakes can be separated from graphite with water, soap, and a kitchen blender. Similarly it’s been shown that graphene sheets can be formed by mixing graphene into some particular petrochemical that’s suspended above water, and immersing a “wetting” surface (the tests used glass) into the mixture, at which point graphene starts creeping up the surface. I’d suggest that you run a test to see if you can duplicate both of these, and use them to create a small capacitor. If you can, then I think you should allocate some spare space for a future powerbank expansion (if I was building a new house I would definitely do this).
2) For power transmission I would advise you to look at higher frequencies (LOOK, I haven’t checked prices myself so it might not be worth doing). Switching regulators are supposedly RELIABLY the most efficient regulators (linear regulators have an inherent resistive loss which can’t be avoided, whereas switching regulators only have it as an unavoidable detail of real-world implementations), including for power purposes (the http://www.wallindustries.com/products/series/psak3000 apparently provides up to 3000 watts, which is ~ twice what you specified), and they work better at higher frequencies. If the price turned out good, it could be well worth doing (including if you intend to build it yourself). There’s a rough regulator comparison sketched after this list.
3) Allocate enough space to physically bring in conventional line power. You probably don’t foresee moving at this point, but plans can change unexpectedly, so it’s worth a few extra bucks to ensure that you can sell the house faster and for more if you get forced into it (don’t bother installing the line, of course: if any purely theoretical buyers want it, you’ll have already provided everything that they need for a problem-free install).
4) Don’t skimp on the foundation, have it poured by a professional (regardless of how you build the rest), and ensure that they do it right (if I didn’t personally know them, or know that they did high-quality work, I would stipulate in the actual contract that they would only get paid if I signed an affidavit stating that I WATCHED the foundation being poured). One of my relatives was a concrete contractor (a millionaire when it meant something, though embezzlement by the customer’s representative cleared him out when he was too old to restart), so I grew up with occasional mentions of some concrete contractors (and some general contractors on government housing jobs; as a result, he preferred avoiding those) who would go so far as to show the customer that the reinforcement was placed, wait for them to leave, and then remove it for the next job before starting the pour. Easier with a single design, but if the foundation goes you probably won’t be able to salvage an EarthShip, so make certain that it’s strong enough. Also: whether a thick layer of sand, a thick layer of gravel, or actual bedrock, the foundation must NOT be built on dirt.
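Here is the rough comparison behind the regulator point in suggestion 2, just as a sketch. The 90 % switcher efficiency and the load current are assumed ballpark figures, not datasheet numbers:

```python
# Ideal-case comparison of linear vs switching regulation from a 48 V bus.
# A linear regulator dissipates (Vin - Vout) * I, so its best-case
# efficiency is Vout / Vin; the switcher's 90 % figure is an assumption.
V_IN = 48.0
LOAD_A = 2.0
SWITCHER_EFF = 0.90     # assumed

for v_out in (5.0, 12.0, 24.0):
    lin_eff = v_out / V_IN
    lin_waste = (V_IN - v_out) * LOAD_A
    sw_waste = v_out * LOAD_A * (1 / SWITCHER_EFF - 1)
    print(f"{v_out:4.1f} V out: linear {lin_eff:4.0%} efficient ({lin_waste:4.0f} W wasted), "
          f"switcher ~{SWITCHER_EFF:.0%} ({sw_waste:4.1f} W wasted)")
```

The gap widens as the step-down ratio grows, which is why switchers are the default whenever the input voltage is much higher than the output.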
Thanks jam. I’m going to grind some pencils up and give it a go.
I’ve been thinking about caps for primary storage, and since space isn’t an issue (we have 8 acres), power density also isn’t a major concern. I’m going to experiment with a corrugated iron stacked cap. The overall size will be 3000 mm x 800 mm x 1000 mm. I’ll insulate each layer with corrugated plastic sheets (maybe two per layer). I’ll build the storage shed around it.
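A quick parallel-plate estimate of one layer of that stack, just to set expectations; the sheet spacing and the plastic’s permittivity are assumptions:

```python
# Hedged estimate of a single layer of the corrugated-iron stacked cap
# described above, treated as a parallel-plate capacitor.  Plate spacing
# and the plastic's relative permittivity are assumed values.
EPS0 = 8.854e-12        # F/m
EPS_R = 2.3             # assumed for corrugated plastic sheet
AREA_M2 = 3.0 * 0.8     # one 3000 mm x 800 mm plate
GAP_M = 4e-3            # assumed spacing between iron sheets

c_layer = EPS0 * EPS_R * AREA_M2 / GAP_M
energy_j = 0.5 * c_layer * 48.0 ** 2       # charged to 48 V

print(f"per layer: {c_layer * 1e9:.1f} nF, {energy_j * 1e6:.1f} microjoule at 48 V")
```

Under those assumptions each layer stores only microjoules, so it is worth running the numbers for your actual spacing before building the shed around it.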
That PSU is about right. I’m limited on the voltages I can legally put through wires in Australia; if it’s over 60 V you have to be an electrician to do it.
There is a power line running by the property, so if I ever leave (which I highly doubt) the next owners can absorb the cost of installing mains power. But the way things are going here, with monsieur dickhead and co repealing the carbon tax, removing renewable energy targets, lowering feed-in tariffs, and the never-ending rise of electricity prices, any new owners will probably look at buying specifically because it’s off grid.
I’m having the foundations engineered, and a friend who’s a builder is going to help (for a carton) with the laying. So no one is going to be doing the dodgy with me.
Excellent. There’s surely more stuff to worry over, but it sounds like you’re covered for the stuff I can think of.
I have thought about this for quite a while.
To me the way you solve many of the issues is to run high voltage DC.
Something like 300 volts or so in 240 V mains countries (basically whatever the peak voltage of your AC supply is, so 160 V or so in 110 V places?). Now all the equipment that has an AC-DC converter as the first stage of its input already handles that voltage without issue. The wiring should in theory be up to it (see second order effects). Converting from this high voltage DC into the old style AC signal would then be a relatively simple H-bridge, no (or very little) copper required.
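Those figures are just the peak of the mains sine, V_peak = sqrt(2) x V_RMS; a quick check:

```python
# The "peak of the AC supply" figures above are just V_peak = sqrt(2) * V_rms.
import math

for v_rms in (110, 120, 230, 240):
    print(f"{v_rms} V RMS mains -> about {v_rms * math.sqrt(2):.0f} V peak")
```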
I can see a DC grid eventually being rolled out like this much like how gas supplies were converted from town gas to natural gas (or vice versa)
Firstly power companies would convert their high tension lines to DC; you can carry more power over them without needing to run more wires, so when it’s time to upgrade the capacity of a trunk, they just buy silicon instead of transformers.
Then it’ll spread into the distribution grid, at the 11 kV level, again for the same reasons, and they will replace the pole transformers with inverters. Best bit is you can do all that whilst the grid is live, much as existing transformer replacements are done.
You can add some smarts to it as well: if it detects an arc forming (I imagine it would have a fairly obvious signature) it can shut down intelligently, and do any other fancy thing you care to think of.
Eventually they would start DC-converting houses, particularly in places like the USA where they have a transformer for every house or so. Again for efficiency of transmission. Firstly in an area they would deploy whole-house inverters, and once that has 100% penetration on the local grid, switch it to DC.
Then the nice man from the electric company comes out goes through your house and switches all your plugs over to DC and gives you a handful of wall wart style inverters for any legacy equipment.
Given that this would obviously be seen coming, equipment manufacturers would have 10 years to ensure the latest holographic TV can take a pure DC input between 48 and 400 volts in addition to AC.
I think it could well happen, it might take another 100 years, but if the power companies can see pennies in it they will push for it.
Of course, at least in Australia, if they keep going the way they are with pricing, the entire country is going to go off grid. The payback period for solar is under 4 years now, and battery tech is *almost* there in terms of cutting the cord totally and paying for itself.
Hi,
I’m converting my house to a mix of AC and DC. For the things that really need 240 VAC, I just leave them that way. But whatever can be used with 12 VDC, I’m converting. I’ve changed several light bulbs to 12 VDC LED bulbs, and I have a USB outlet that is regulated using an LM2576 to be as efficient as possible. My next step is to remove some wall adapters and feed the equipment with a regulated 12 VDC, for example some comms switches, routers, etc. I would also like to do the same for a mini-ITX PC I have.
I’ve wired the 12 VDC with 2.5 mm² wire, and all the wires come from a solar controller (145 W panel + 2 x 50 Ah stationary batteries).
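Out of curiosity, here is the voltage drop on 2.5 mm² copper at 12 V for a few assumed run lengths and currents (the actual runs and loads aren’t given above, so these are only examples):

```python
# Quick check on the 2.5 mm^2 / 12 VDC wiring mentioned above: voltage
# drop for a few assumed run lengths and load currents.
RHO_CU = 1.68e-8        # ohm * metre
AREA_M2 = 2.5e-6        # 2.5 mm^2
V_SUPPLY = 12.0

for run_m in (5, 10, 20):
    for amps in (2, 5):
        r_loop = RHO_CU * (2 * run_m) / AREA_M2
        drop = amps * r_loop
        print(f"{run_m:2d} m run, {amps} A: {drop:.2f} V drop "
              f"({100 * drop / V_SUPPLY:.1f} % of 12 V)")
```

Short runs at a couple of amps are fine, but long runs at higher currents lose a noticeable fraction of a 12 V supply, which is the usual argument for keeping low-voltage DC distribution short.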
I’m not sure where this article got the idea that the stepping up/down of voltages was why DC transmission was abandoned…. it was the fact that DC is very inefficient over long distances. Even with today’s tech AC is just cheaper and easier to transmit. So yeah, that bit is never going to happen. If your power is coming from the power station, then it’ll be AC.
The reason we don’t have an all-DC house is the same reason why it’s a bad idea to keep unused wall warts plugged in all the time: it’s wasteful, as the conversion process is constantly losing energy. So the only way that would happen is if the outlets and the appliances themselves were all radically redesigned so that they turned the transformers off at the AC end when not in use… this adds a lot of cost and complexity to an otherwise simple wiring job.
Remember K.I.S.S. (that’s two band names in this discussion).
On top of all of this, keep in mind that some devices just do better with AC power, and while converting AC to DC is fairly easy and inexpensive, doing the opposite is not… and if you don’t believe me, go price your cheapest power inverter and compare that to a DC transformer.
It’s all about cost, people. Like, for instance, brushless motors are mentioned a few times in these posts… yup, they work on DC… they also cost roughly 10X what a cheap AC motor costs. So yeah, if you want to pay 100 bucks for what would otherwise be a 10 dollar oscillating fan, have at it.
Small inverter drive fan motors are super cheap thanks to the computer industry. They totally blow away PSCs and (especially) shaded pole motors when it comes to efficiency. The fact that they can run faster than 3600RPM gives them better power density, too.
I live in an off-grid home with many thousands of watts of PV and wind. I have hashed this over for years and I am now in my third home.
I started by building a lighting system that ran off 48VDC, and some DC spread around the home (two buses 12VDC and 24VDC) for work benches, office, and garage.
I sold that place and built an AC-lighted house with no distributed DC. At that place I hooked up DC only for heating water.
I am now only using DC for storage.
I quickly learned that AC is the only way to go.
AC is much more controllable. It is much easier to switch (zero-point-crossing SSRs) and much less expensive to move around, especially at higher capacities.
Inverter technology is now so advanced that it really makes no sense to bother with running DC anywhere.
All the talk about wall wart, charger-type converters is just straight up a waste of time.
Almost no power is consumed on that part of the system. (<3%)
AC is just way more flexible, and you can purchase any AC to DC converter for any normal voltage (≤ 48 VDC) for reasonable prices.
About the only place to distribute DC is on the work bench: 3.3, 5, 12, 24, 48 VDC is super easy to accomplish on any work bench, with just the savings in wire needed to get it there from some central location in the home.
In reality the real power draws (depending on climate) are
1. heating,
2. cooling (fridges/AC),
3. cooking,
4. and then lighting.
Now if you want to live in a mini home or have a substandard living arrangement or you don't do anything with your life except text and watch movies then none of my experiences apply.
The future DC home will have a 400 V DC grid connection, 400 V DC in the house for heavy appliances, and will use USB 3.0 for appliances under 100 W. USB 3.0 can deliver 5 V/500 mA for dumb devices, up to 5 V 4 A with appropriate negotiation by DC on the data lines (think iPad method), and intelligent devices can up-negotiate the voltage to 25 V @ 4 A, for a total consumption of 100 W, sufficient for most, if not all, devices other than the 400 V ones.
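For reference, the arithmetic behind those power levels is just P = V x I, whatever the exact USB spec ends up calling each step:

```python
# P = V * I for each negotiation step described in the comment above.
profiles = [
    ("dumb device",        5.0, 0.5),
    ("negotiated 5 V",     5.0, 4.0),
    ("up-negotiated 25 V", 25.0, 4.0),
]
for name, volts, amps in profiles:
    print(f"{name:<20} {volts:5.1f} V x {amps:3.1f} A = {volts * amps:5.1f} W")
```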
Actually, All-DC homes are currently being built by a major Dutch construction company. (http://preview.tinyurl.com/ptkchmd)
The implementation is thought to be a three-step process (http://preview.tinyurl.com/qgjwevx): first there will be AC-mains-to-DC converters feeding DC in the house, with local conversion back to AC for those old appliances that are still a bit wasteful (who uses an electric boiler if you can have a heat pump?); then an all-DC home; and finally the links between the already-HVDC wind parks and the home will be DC as well.
In the Netherlands there was an interesting conference about all-DC. The lectures ranged from milliwatts in the standby mode of switching power supplies, via kW DC-DC converters for trains, to gigawatts (960 V/6500 A) for an all-DC powered cutter suction dredger vessel.
The tour across the various research projects afterward showed an application where the interconnection stations of wind parks, power plants and the land grid negotiated flow of energy, indicating that DC even at TW is not a problem.
The links (in Dutch) to the lectures (many in English) can be found here:
http://www.vermogenselektronicaonline.nl/programma/
Especially the circuit breakers and switchboards of the HUTA vessel (http://preview.tinyurl.com/kkxq8q6) with its 23 MW generator capacity are impressive.
Draw an arc, someone? Not with properly designed power electronics (the name of the conference translated). IGBT and SiC are the magic words here.
I still haven’t seen a good argument WHY we would want to switch to DC. All of the power supplies we have hanging around are there to convert 120VAC to somethingVDC, but the problem is that there’s no agreement on what a good value of DC would be best.
So I’ll come right out and say WHY we should switch to DC. Electrolytic capacitors. All of those wall warts and every other DC power supply we have, has to at some point – either before or after a transformer – rectify the current and filter it to remove the huge amount of ripple that you end up with when you rectify single-phase AC. These capacitors (which I’m going to guess add up to about half the cost of a typical AC adapter) could be a small fraction of their size if they didn’t have that huge amount of ripple to remove.
The solution is to use a three-phase rectifier to produce the HVDC that will power our houses. If you take three of the existing pole transformers, move two of them so all three are on one power pole, and rectify their output, you get about 320VDC, and you don’t need much capacitance to filter it because the three sine waves overlap. Now you can run that 320VDC to the houses that those pole transformers were supplying. Their wiring can take the higher voltage, because it won’t be +320V and ground; it will be +160 and -160. So there’s no problem there. Once it’s in the house, you do need to run it through a new breaker panel, because those AC breakers will NOT work on DC, as a couple people have noted.
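The ripple point can be eyeballed numerically. This sketch compares the unfiltered output of a single-phase full-wave bridge against a three-phase (six-pulse) bridge, each normalised to its own peak, before any capacitor is added:

```python
# Numerical check of the ripple argument above: unfiltered output of a
# single-phase full-wave bridge vs. a three-phase (six-pulse) bridge,
# each normalised to its own peak, before any filter capacitor.
import numpy as np

theta = np.linspace(0, 2 * np.pi, 10_000, endpoint=False)
phases = np.sin(theta[:, None] + np.array([0.0, 2.0, 4.0]) * np.pi / 3)

single = np.abs(np.sin(theta))                                   # 1-phase bridge
six_pulse = (phases.max(axis=1) - phases.min(axis=1)) / np.sqrt(3)

for name, wave in (("single-phase", single), ("three-phase", six_pulse)):
    ripple = (wave.max() - wave.min()) / wave.max()
    print(f"{name:>12}: dips to {wave.min():.3f} of peak "
          f"({100 * ripple:.0f} % peak-to-peak ripple)")
```

The single-phase output swings all the way to zero, while the six-pulse output only dips to about 87 % of its peak, which is why the three-phase rectifier needs so much less capacitance to smooth.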
NOW you can plug all of those AC adapters you’re already using into the new outlets, because all of them are designed to run on 100-240V, which means 338V peak. But any NEW appliances you get will be a little cheaper because they don’t need that big electrolytic in them. “Legacy” motor-driven appliances will need an inverter, or to have their motors replaced, but any new motor-driven appliances will use brushless DC motors (many already do, to handle the different voltages and frequencies around the world), which because they are really three-phase variable-frequency induction motors, will run more efficiently than the old single-phase induction motors.
For the power utilities, they can upgrade wherever they want, at their own pace, because by having a simple way of converting to semi-backwards-compatible power to the end user (i.e., the 320VDC standard), they can do the AC-to-DC conversion right at the power pole, and convert a block at a time to DC distribution, IF there’s a good reason to do that. I don’t think there is.
Don’t fix something that’s not broken
For certain devices AC is still required, but I would still wire my apartment with 12V DC (or maybe 24V) and set up outlets with buck DC-DC converters with iPhone-capable USB charging ports, or standard car lighter outlets.
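A small sketch of what those buck-converter outlets would be doing, assuming an ideal converter and a typical 12 W USB charging load (both assumptions, not specs):

```python
# Small sketch of the buck-converter USB outlets described above:
# ideal duty cycle is Vout / Vin, and for a given USB load the current
# drawn from the DC bus scales down with bus voltage (losses ignored).
USB_V, USB_A = 5.0, 2.4          # a typical "iPhone-capable" 12 W port
load_w = USB_V * USB_A

for v_bus in (12.0, 24.0):
    duty = USB_V / v_bus
    bus_amps = load_w / v_bus
    print(f"{v_bus:4.1f} V bus: duty ~{100 * duty:.0f} %, "
          f"draws {bus_amps:.2f} A from the bus for a {load_w:.0f} W port")
```

A 24 V bus halves the bus current for the same load, which is one argument for picking the higher of the two voltages.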
Seems to me that would be the smartest thing to do. We still need the 120 VAC for washing machines, refrigerators and what not, and air conditioners, clothes dryers and electric ovens all run on 220.
Nothing else needs to run on it however. Even the LED bulbs all contain a rectifier and step down transformer inside the bulb.
Eliminating that makes the bulbs cheaper, much cheaper, and then you can get away with Class 2 wiring for your lighting fixtures.
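As a minimal sketch of why the bulbs get cheaper: on a low-voltage DC lighting circuit a short LED string only needs a series resistor (or a tiny constant-current part) instead of a mains driver. The forward voltage and target current here are assumed typical values:

```python
# Minimal sketch, assuming typical values: a short LED string run straight
# from a 12 VDC lighting circuit needs only a series resistor, not a
# mains driver.  Forward drop and target current are assumptions.
V_SUPPLY = 12.0
VF_LED = 3.0            # assumed forward drop per white LED
N_LEDS = 3
I_TARGET = 0.020        # 20 mA

r_series = (V_SUPPLY - N_LEDS * VF_LED) / I_TARGET
p_resistor = I_TARGET ** 2 * r_series

print(f"series resistor: {r_series:.0f} ohm, dissipating {p_resistor * 1000:.0f} mW")
```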
So if every home had a high capacity step down and rectifier stage, then you could pipe all the 5VDC and 12-24 VDC lines piggybacked next to the relatively few 120 VAC outlets you’d still need.
The only thing missing is an agreed-upon standard for 12-24 volts DC, but I already know what would be ideal, because it is accepted in most of the industries that use that voltage.
XLR 4.
Its cousin, the XLR 3, is already a professional standard for audio.