
This is the frustrating thing about LEDs that I'm not sure we can change.

If there were a "DC" light socket in the house, we could have LEDs outlasting their owners, and for cheap. Nearly all the expense of an LED bulb is the power supply; everything else is dirt cheap. A single home DC power supply with ~200W of output could light an entire house, flicker free.

What's even more frustrating is that I think we could fix it. A national regulation for DC light sockets would do it. Mandate a voltage, shape, and max amperage, and BAM, you'd get 1000 different manufacturers making standard-compliant bulbs and home power supplies that last an eternity.



I am designing an off-grid cabin with a solar panel array charging a bank of batteries, with a propane generator as backup. I run Ethernet for power with a custom-designed PCB that terminates at the outlet side, where it exposes a 20 watt USB charging port and an Ethernet port.

The lights are all basically cut 12V light strips inside old light fixtures, with a custom controller that also terminates PoE. The 48 volts that most PoE standards specify is more than enough to push power down the line for <100 meter runs.
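A rough sanity check of that claim. The numbers below (24 AWG Cat5e conductors, a 20 W fixture load, two powered pairs) are my assumptions, not the commenter's:

```python
# Can passive PoE at 48 V feed a ~20 W light over a 100 m Cat5e run?
AWG24_OHMS_PER_M = 0.0842   # typical 24 AWG copper, one conductor (assumed)
RUN_M = 100
PAIRS = 2                   # standard PoE powers over two pairs, paralleled

# Loop resistance: out and back, each direction shared by PAIRS conductors
r_loop = 2 * RUN_M * AWG24_OHMS_PER_M / PAIRS

v_src, p_load = 48.0, 20.0
i = p_load / v_src          # first-order: ignore the drop's effect on current
v_drop = i * r_loop
p_loss = i**2 * r_loop

print(f"loop resistance: {r_loop:.1f} ohm")
print(f"current: {i*1000:.0f} mA, drop: {v_drop:.1f} V "
      f"({100*v_drop/v_src:.0f}%), loss: {p_loss:.1f} W")
```

Roughly a 3.5 V drop and ~1.5 W lost in the cable, leaving ~44.5 V at the fixture, which is plenty of headroom for a buck converter down to 12 V.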

The advantage of PoE here is that anything under 50 volts is considered low voltage and does not need to follow the same rules as normal house wiring. I didn't like that everything was hinging on a beefy PoE switch, so I made it passive PoE by design.


If you're willing to share your design, I'm sure there are other folks like myself who think this is a cool idea. I've wanted to do PoE (or passive PoE) for lights for a while now...


I am going to open source it. The goal was to have all the SMD parts available at JLCPCB so you can just send it off to be fabbed (with some through-hole components you'd solder yourself), or I would also sell them at cost + 10%. My brother has designed some 802.3at chips, and I was going to have him review my work first, as I don't want to send a poorly designed power system out into the world (there are enough of those out there, unfortunately).


Can't see any GitHub link in your profile. Any way I can get notified when you open source this? Thank you.


Sell them at cost + 100% so you actually do it. I reckon buyers won’t care.


I'm interested in installing your design in my cabin as well, could also be very useful for boats too.


We've had two standard DC outlets for a while now: the 12V cigarette lighter and 5V USB. You do often see them in odd places. But the voltage and wattage of those specs are too low to be useful, so they haven't evolved into DC power distribution.

USB-C PD is at a useful voltage & wattage level, and so is Ethernet PoE. I wouldn't be surprised to see them start to be used for general power distribution in niche applications, like RVs and off-grid cabins.

I don't think we're going to ever get a bulb standard, though.


Cars are starting to move to 48V DC. My under-cabinet lighting in the kitchen is powered by DC from a power supply in the basement.

I could definitely see this becoming more common. Powering the ~100 watts of fixed lighting spread across my whole house on ten different 15A 120v circuits, each with their own arcfault breaker and 12 gauge copper electrical lines running back to the panel is fabulously expensive for what could be done with a bunch of CAT5 in each floor running to some conveniently located “POE injector” type devices.

You would want to be able to take a standard fixture and just push DC through it and use special bulbs with a standard A19 base, but that’s problematic when the next owner tries to screw in a standard bulb - what happens when it sees 48V DC?

I would guess if for safety reasons it has to be a non-A19 connector, then your light fixture choices get cut down to almost nothing and no one will make the switch?

It’s really interesting to think about, most everything I’m plugging into AC outlets in my house, the first step is converting it to DC. A lot of my outlets I’ve switched to include USB ports so I don’t need the wall warts. If you have solar and battery backup even more-so you start to question why we are wasting so much money moving everything back and forth between DC/AC/DC within a house.


If you're introducing electrical incompatibility, why on Earth would you try to preserve mechanical compatibility?


This is a fair point; breaking mechanical compatibility will at least stop any electrically exciting goofs when someone plugs a low-voltage DC lamp into a (comparatively) high-voltage AC socket.


> but that’s problematic when the next owner tries to screw in a standard bulb - what happens when it sees 48V DC?

If by "standard" you mean an incandescent tungsten filament bulb, nothing at all.

For a true LED driver power supply, it would be constant current, so the tungsten filament would see 25mA (or whatever the constant current is set for) of DC, and nothing bad would happen (the filament also would not likely illuminate either).

Screwing in an LED bulb with integrated power supply, the external supply will still feed the constant current value, so what happens depends upon the design of the LED bulb's integrated power supply. If 25mA is enough to drive everything, the LED bulb might light up. If 25mA is not enough to drive everything, most likely nothing lights up.


48V without a current limit wouldn't be nothing, but you should expect less than 10% brightness.

For constant current, you'd need to drive at least 9 watts so it would be more like 250mA if not higher.

A 1600 lumen LED module might take as much or more current than a 60w incandescent. If your constant current supply can output between 0 volts and input volts, and it's set for a bulb with such a module, it would be able to power an incandescent bulb.


I suspect the results would be quite poor. Incandescent filaments increase their resistance when they get hotter, so driving them at constant RMS voltage means that the power will decrease as they heat up, which will give them a degree of stability. At constant current, though, the power will increase with increasing temperature.

(Of course, they’re quite hot and radiative cooling increases like T^4, so this isn’t necessarily a show stopper. But it’s probably not helpful.)


>screw in a standard bulb - what happens when it sees 48V DC?

Either it lights up or not? I don't see a problem here.

But I'm not sure moving part of power supply elsewhere will help that much, it needs current driver electronics anyway.


I wondered for a long time why we don't have standard built-in DC in building codes that could power our lights, and most electronic devices. Really the few things in my house that require full line voltage are all in the kitchen. Everything else has a transformer attached.


Me too - standard 12V and 5V rails run throughout the house would be great. I even thought about a wallpaper with conductive strips so the power could be invisibly delivered to any part of any room and "tapped" with a push-into-the-wall socket.


The usual counterargument is: voltage drop can become a problem. Trying to use one big power supply and use DC as your only distribution mechanism probably isn't a good idea.

But choosing a DC system for part of the house can make a lot of sense.

For one residential new construction room, it can be practical to have one shared power supply rather than one per LED. Say you have a 12 V, 5 A DC power supply. Using a star wiring topology, this can serve 10 lights (at 500 mA) fine with 16 AWG.
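Running the numbers on that star-topology example (the per-light run length is my assumption; the wire gauge and currents are from the comment above):

```python
# 12 V supply, ten lights at 500 mA each, one 16 AWG home run per light.
AWG16_OHMS_PER_M = 0.0132   # typical 16 AWG copper (assumed)
RUN_M = 8                   # assumed per-light run length within one room
I_LIGHT = 0.5               # A per light, from the example

r_loop = 2 * RUN_M * AWG16_OHMS_PER_M   # out-and-back loop resistance
v_drop = I_LIGHT * r_loop
print(f"per-run drop: {v_drop*1000:.0f} mV ({100*v_drop/12:.1f}% of 12 V)")
```

Around 0.1 V per run, under 1% of the supply voltage, so at room scale the usual voltage-drop objection doesn't bite.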


But how far can the power go before wire resistance causes too much Vdrop? Maybe one good transformer+rectifier per room? AC to the room and DC in the room. Those DC runs would be <5m each.


Even 12V over 50ft+ will have noticeable voltage drop unless you have huge wires. 5v? Not even worth considering.

And switch mode power supplies are relatively inexpensive and quite efficient.


The wires would be thick asf

Not practical


It's because of Ohm's Law. Any sophomore-level electrical engineering student could answer this question immediately.


Until I see someone defining a representative example and running the numbers, I'm skeptical of their DC vs AC commentary.

I say this, because I was guilty of this exact shortcut thinking (in another comment). But I paused and thought to myself "I should run the numbers before just repeating the usual voltage drop criticism".

So I compared scenarios and it depends a lot on the topology, lengths, costs, and situation (new vs renovation).

Sure, a whole house system doesn't typically make sense, but I don't think that's what people are really talking about. I think people are interested in hybrid systems; e.g. DC power supply for each room.

I don't know if you meant it, but the sentence about "any sophomore level electrical engineering student can solve this" can easily come across as dismissive. I also think it gives too much credit to sophomore students. :)

I would have more confidence in an electrician apprentice on this one. I think they'd have more practical experience when it comes to figuring out what are the right questions to ask.

I did EE in college and do a fair bit of hands on residential electrical work.

P.S. How many sophomore level engineering students learn to do a sensitivity analysis?


>Sure, a whole house system doesn't typically make sense, but I don't think that's what people are really talking about. I think people are interested in hybrid systems; e.g. DC power supply for each room.

I completely disagree. Where exactly are you going to put a power supply in a room? Make a special electrical box for it? Won't it be unsightly in many rooms, or need some huge special panel that looks like a breaker panel? The comments I see seem to be advocating a whole-house solution, where a power supply is mounted in the breaker panel to supply LVDC to the whole unit. But this makes no sense for several reasons, especially the voltage drop.

>I don't know if you meant it, but the sentence about "any sophomore level electrical engineering student can solve this" can easily come across as dismissive. I also think it gives too much credit to sophomore students. :)

It's supposed to be dismissive, because this whole discussion is a bunch of software people trying to make up solutions for a perceived problem when they obviously don't know one of the most basic things about electrical theory, which makes all of their solutions unworkable. It's like a bunch of people trying to make a new kind of personal vehicle to replace cars when they don't even understand Newton's Laws. It's really annoying, because I see this kind of discussion pop up every so often, over many many years.

I have another comment here I don't feel like copy-and-pasting, but basically this whole discussion is silly because people are trying to make a solution using a very expensive power supply to fix a problem they see because they're buying cheap $2 light bulbs that burn out quickly, instead of just buying light fixtures that were properly engineered in the first place. With modern SMPSs, you're not going to get any kind of benefit by centralizing the power supply to drive individual LEDs, you're only going to get problems. LEDs need a driver circuit to provide constant current, and that means the power supply needs to be matched to the emitters and kept very close to it.


> Where exactly are you going to put a power supply in a room? Make a special electrical box for it?

Switched-mode power supplies can be as small as your average Arduino board. They can fit inside the space used for wall outlets or light fixtures. Or you can put the DC transformer inside the light switch.


> I completely disagree. Where exactly are you going to put a power supply in a room? Make a special electrical box for it? Won't it be unsightly in many rooms, or need some huge special panel that looks like a breaker panel?

This sounds like a non-issue, especially considering the pervasive use of "unsightly" installations like air ducts, heating vents, radiators, electrical sockets, telecommunication service panels, routers, and even light fixtures.

If you intentionally dismiss obvious solutions, of course you only end up with problems without obvious solutions.

> It's supposed to be dismissive, because this whole discussion is (...)

If you have nothing to add, please add nothing.


> I'm skeptical of their DC vs AC commentary

It's not about DC vs AC, it's high voltage vs low voltage. The power dissipated in wire resistance scales with the square of the current (P = R * I^2), and low line voltage means that you need large currents to transmit the same amount of power.
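To make that concrete, here's the same load delivered at three voltages over the same wire (the wire resistance is an arbitrary assumed value):

```python
# Same 100 W load, same wire: loss scales with I^2, so 10x the voltage
# means 1/100th the wire loss.
R_WIRE = 0.5   # ohms, arbitrary fixed run (assumed)
P = 100.0      # watts delivered

losses = {}
for v in (12, 48, 120):
    i = P / v
    losses[v] = i**2 * R_WIRE
    print(f"{v:>3} V: I = {i:5.2f} A, wire loss = {losses[v]:6.2f} W")
```

At 12 V about a third of the delivered power is burned in the wire; at 120 V it's a third of a watt.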


Whether or not that's feasible is going to depend a lot on the application. I don't think we'd ever fully rid homes of AC sockets, it's too useful for things like vacuum cleaners or space heaters.

But what about the sub 100W or even 200W applications? That's where I think something like 48VDC would start to shine. Every light in a home, phone chargers, tablet chargers, computer monitors, televisions, computers? (maybe not gaming rigs, but certainly laptops and nucs).


>But what about the sub 100W or even 200W applications? That's where I think something like 48VDC would start to shine.

How so? Exactly what benefit does it have over the current AC mains? With 48VDC, you'd still need to use DC-to-DC converters to power everything. I fail to see how that's any kind of improvement over the current switch-mode power supplies used. Instead, it'll just be less efficient because you'll get higher line losses in the power lines in the walls and all the way from wherever that 48VDC is coming from. If that's from a big SMPS in a closet somewhere, that's going to have its own losses. Overall, the entire system will have lower efficiency compared to the current system.

Exactly what problem are you trying to solve with this idea? If you think you're going to eliminate SMPSs in all your electronic equipment, you're not; that's a fantasy. Everything needs a power supply because electronics only work at very low voltages (5V, 3.3V, even 1.8V in places, now 20V with USB-PD) and most equipment has some kind of peculiar voltage requirements, and usually multiple different requirements inside the same device. There's no improvement in efficiency by running a computer, for instance, from 48VDC vs. 120VAC or 240VAC; in fact it's probably worse.


Yes. Thanks for catching that.

Also, DC and AC have differences in power transmission independent of resistance, some due to first principles (reactance), and others related to devices for stepping voltage up or down (e.g. transformers).


Trying to cram all the infrastructure for an LED lamp into the shape of a light bulb is a bad idea, even if the input power is DC. Good designs for LED lighting have larger surface areas for heat dissipation and some physical/thermal separation between the LEDs and the power supply. A quality power supply does not produce flicker. As other comments have noted, dimming, or even predictable output requires some sort of power regulation even with DC input.

I think the way to change it is to replace sockets with hardwired LED fixtures. This is easy for something like a standalone ceiling light. It may be harder for other devices like ceiling fans that integrate a light bulb socket, but converting those devices to take DC power as in your proposal isn't easy either (most would just get discarded and replaced).

Doing it well is more expensive in the short-term than screw-in bulbs. A quick look on Amazon suggests integrated ceiling lights are about 10x the price of LED bulbs, though I suspect the longer service life pays for itself.


> Trying to cram all the infrastructure for an LED lamp into the shape of a light bulb is a bad idea, even if the input power is DC.

Absolutely; incandescent light bulbs have that shape for a reason: the screw is small because there is nothing to put in it and it doesn't heat up, and the bulb is large to radiate all the light and heat it generates. LED light bulbs have exactly the opposite problem: almost all of the heat is generated near the screw, while the bulb itself generates almost none, and the light emitter doesn't even need a bulb that large around it. Oh, and the casing around the screw is plastic, so the thermal conductivity is horrible. Honestly, it's a profoundly terrible form factor which we're now stuck with.


There are finned LED bulbs which can fit standard Edison Screw sockets, e.g.:

<https://www.designboom.com/technology/self-cooling-100-watt-...>

<https://i.pinimg.com/originals/b5/c2/c5/b5c2c5d69fb240a571ba...>

It's also helpful to recognise that existing lighting fixtures and lamps were designed around the constraints of incandescent bulbs. The first generation of LED bulbs and lamps largely conform to these. As LEDs mature, both fixtures and lamps which address the limitations and requirements of the technology (transformers, perhaps dedicated 12v circuits, heat dissipation for the transformer rather than lighting elements themselves, and better light-temperature and intensity regulation) should emerge.

We're presently in the somewhat-messy half-emerged state. Think horseless carriages, wireless, and the days of dual gas/electric lighting and lamping systems (yes, these existed, and yes, the failure modes were ... much as you might imagine).


I sure hope 12V doesn’t happen. 12V is absurdly low for lighting and needs extremely thick wires to get decent efficiency.

24V is okay. 48V would be nicer for indoor use.


Wiring up a house with 48V for lights and 120V for plugs would be such a pain. Pulling 2 different wires to every room. Weird circuit breakers. Yuck.


Already happens in New Zealand: lighting is usually low-current 1 mm² wiring, and everything else is heavier gauge. Circuit breakers mostly care about amps (all breakers could be rated to mains voltage if you wanted to avoid "weird").

Also low voltage wiring can legally be done by anyone in NZ (a bonus when doing your own work, and a pitfall when buying a house?)


Do not use AC breakers for DC! (They lack an arc extinguisher.)


Maybe for you, but I have been considering just this. I would love to have dedicated 24V for lighting and charging devices. My house already has various systems for lighting, such as xenon throughout the kitchen under the cabinets and also in the basement; both are driven from separate transformers. Then I've got the rest of the house with can lights using BR30 bulbs that are just a waste of 12 AWG. The one place I was able to replace with a dedicated LED fixture, I had to overpay for a decent product that would've been better off as a basic 24V LED light. When you consider that most HVAC systems operate at 24V, there is some real potential to create a decent standard serving multiple purposes.

And besides, I don't know if you have ever pulled 12 gauge wire, but it's a pain. I don't know any electrician who would agree that it would be a pain to cut back on heavy wire and pull light 22 AWG instead.


A lot of countries already have lighting on a separate circuit. It means that when something trips a breaker you don't lose all your lights as well.


Lighting (on AC 110v / 220v circuits) also typically is specced for a lower peak amperage than utility or appliance outlets. For US codes, generally 15A rather than 20A. Lighting may use 20A, but isn't required to.

Other circuits must be 20A, e.g., kitchen outlets serving appliances.

A summary of standards here: <https://www.thespruce.com/common-electrical-codes-by-room-11...>

A 15A lighting circuit can serve up to 14 100W bulbs. Or 144 LEDs drawing 10W each....
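Back-of-envelope for those figures, using the usual 80% continuous-load derating on a 120 V circuit:

```python
# Usable watts on a 15 A, 120 V lighting circuit with 80% derating.
circuit_w = 15 * 120 * 0.8
n_incandescent = int(circuit_w // 100)   # 100 W bulbs
n_led = int(circuit_w // 10)             # 10 W LED bulbs
print(f"{circuit_w:.0f} W -> {n_incandescent} x 100 W bulbs "
      f"or {n_led} x 10 W LEDs")
```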


Why? Modern code already requires separate lighting power runs.


Are there sockets for 24V or 48V bulbs that could be standardized on?


This is what I want. A standard 48VDC socket would be a game changer for lighting.

Heck, with such a standard you could have 120VAC -> 48VDC converters and you'd be in the same position we are today with LEDs, only better, because you'd just have to replace the converter and not the whole bulb.


> needs extremely thick wires

Not extremely thick. Wire losses remain similar at 12V as they were at 110V (replace a 100W bulb with a 10W bulb at 12V: the current remains ~1A, so wire losses stay about the same as they were). Wire losses might be, say, 1W for 1 mm² cabling. 240V example: https://ausinet.com.au/voltage-drop/
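Checking that claim: drop both the power and the voltage by roughly a factor of ten, and the current, and hence the I²R wire loss, is nearly unchanged.

```python
# Same fixture position, same wire, two eras of bulb:
i_old = 100 / 110   # 100 W incandescent on 110 V mains
i_new = 10 / 12     # 10 W LED on a 12 V run
print(f"{i_old:.2f} A (old) vs {i_new:.2f} A (new)")
```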

Agree that it is worth upping voltage to chase a few more percent savings, but still need to consider other constraints.


Fair enough. DC then, of some reasonable voltage.


There are also these type of "ceramic substrate" bulbs which claim to give longer life. I suspect other compromises in the construction may negate that.

https://www.sansiled.com/blogs/learn/what-are-the-benefits-o...


I don't think we're exactly stuck with the old form factor. We can start phasing them out. Replacement of screw sockets with modern fixtures is well within the capabilities of the average DIYer (though perhaps some places it's illegal for anyone but a professional electrician to touch anything hardwired).


Well, one of the main sales point of the LED bulbs was compatibility with existing E14/E27/etc sockets: no need to change the wiring, or the fixtures, just buy a new, better light bulb and screw it right in! It will also serve longer and be better for the environment, what's not to like? We'll even ban the sales of 100W and higher incandescent light bulbs to help you make the right choice!

That's also the pitch of the smart bulbs: a sane way would be to make a smart light switch but what if you can't do that (e.g., you rent the apartment)? So we'll shove the controller chip into a disposable light bulb, that's still perfectly fine for the environment.

By the way, I don't know how things turned out in your part of the world, but over here, after the ban went into force, the manufacturers of incandescent light bulbs started selling 95W light bulbs 8D


Probably going to sound crazy, but we could start running water pipes in front of the walls and under the ceilings and mounting the LEDs directly on the pipes for cooling. Creativity, thinking holistically... the entire contemporary Western house design frankly needs a rethink, from DC circuits to electrification to modular, mass-producible utility drop-in pods, all with an eye towards integrated systems design paired with scalable modularity.


In a normal house you're not going to need that much light that you'll need water cooling.

Simple metal fins are more than sufficient along with a high efficiency power supply.


One of the problems is that in some countries like the US, ceiling lamps are hard-wired and not "user-replaceable", so people have to resort to using those stupid bulbs in their old fixtures.

I live in Japan, and instead of just a pair of wires coming out of the ceiling, there is a standardized "ceiling socket" [0] which can also support the weight of a lamp. This means that swapping out light fixtures is plug and play, so the standard LED lamp is something like this [1], where you have a nice big flat metal backing plate that the hardware is mounted to for heat-sinking.

I don't own any LED bulbs at all - all our lamps are of this type so I wouldn't have anywhere to put one.

It was the same when I lived in Sweden - a standard ceiling light outlet (IIRC there is a EU standard for this now called DCL) so that replacing light fixtures was easy. Moving into an apartment, often they wouldn't even come with light fixtures, you'd bring your own.

[0] https://www.e-connect.jp/images/to_quickB.jpg

[1] https://www.irisplaza.co.jp/IMAGE/HK/PRODUCT/H246902.jpg


In the Netherlands we have just a pair of wires coming out of the ceiling but everyone replaces their own lamp fixtures anyway. Most people should be able to manage clamping or screwing down the brown or black wire to the L and the blue wire to N.


You can't have a low-voltage DC power supply supplying the entire home: the voltage drop between the supply and the LED would be huge. There's a reason we use higher voltages for long wire lengths: to increase efficiency and reduce line losses, since losses increase with the square of the current (P = R * I^2). Higher voltage means proportionally lower current, and quadratically lower losses.

And since we need high voltage (at least 100V) to keep line losses very low and allow the use of thinner-gauge copper wiring, we need a switching power supply at every light fixture, so it really doesn't matter if it's AC or DC, since modern SMPS (switch-mode power supplies) work equally well with either.

Finally, on top of all that, LEDs are current-driven devices, and need a constant-current power supply. So the power supply must be very close to the diodes, or else fluctuations in supply voltage will have very negative effects.


Low voltage DC lighting is a thing that has existed for a very, very long time. That most houses don't have it is more cultural than anything else, in my opinion.

That means it's totally fixable. You can install such a system in existing buildings right now, and it's not crazy expensive unless you want to run the wires inside the walls.

If we could shift cultural expectations around this, adding a LV system in new construction would not significantly increase the construction costs. It will start to be done if buyers start demanding it.


12V requires quite a lot of amps for enough light, so low-voltage DC is not optimal. Also, LEDs are current-driven devices, i.e. they will be sensitive to voltage changes (even with a current-limiting resistor).


Low-voltage doesn't necessarily mean 12V. I think it's anything below about 50, although lighting systems currently marketed as "low voltage" are usually 12 or 24 volts.

The constant current thing is true, but that's not a terribly difficult problem.


It's complicated. https://en.wikipedia.org/wiki/Low_voltage

Depending on who you ask, the limit for low voltage DC might be 42 or 50 or maybe 60 or 120 or 1500.


For example: the stairwell shin-height lights in this 90s house are 12 VDC. There's a transformer plugged into a wall outlet in the nearby storage closet.


That works OK because the transformer is relatively close to the lights. If it were a reasonably-large house, and the transformer were on the opposite side of the house, you'd have a problem with a noticeable voltage drop. All these ideas people are throwing out here involve a single whole-house power supply. If it were for 48VDC, it would probably be fine, but 12V would result in significant line losses.


We already see transformers for a run of e.g. track lights, low voltage lights on tension wires, and so on. That's been a thing ever since halogens came to market.

Having multiple transformers is perfectly doable and commercially viable -- though I would appreciate more product availability for something easy to stash in the hollow space of a ceiling, like recessed lighting is installed.


I still don't see the point of all this. If you have a handful of lights in a room, and drive them with a single power supply, you're still going to have big problems: the line lengths to each fixture will be different, resulting in different voltages. You can't drive LEDs that way with good results: they need fixed current. And you can't daisy-chain them either: if one emitter dies, then the remaining ones will suddenly have different current, and probably die quickly. The proper way to drive LEDs is with a power supply very close to the emitters and designed specifically for those emitters and the (short) wire length to them, not 4 meters away and not with some variable-length wire that can't be designed for.

Everyone here is complaining about ultra-cheap LEDs that don't last very long because they're poorly engineered, but that's exactly what you're all trying to do here by using a separate, shared power supply. You could get away with that in the 1980s using incandescent bulbs, but you can't do it now unless you want the same crappy lifespan and reliability you're all complaining about.

The solution is very simple: buy fixtures that are engineered well. Switch-mode power supply electronics are not expensive at all, but when mfgs cheap out or do a crappy job designing them, you get bad results, usually short lifetime of either the power supply or the LED. What you're trying to do here is buy a really expensive power supply, which has to be engineered to a far greater degree and for a far wider range of operating conditions (since they don't know what you're going to connect to it), just because you had a bad experience buying some $2 light bulb that had a crappy power supply built-in. This really makes no sense.


LEDs must be powered by a constant-current supply, and distribution does not work well at constant-current, and is always constant voltage. So no matter what you will need some sort of switching power supply.

LEDs are like 15% efficient and power supplies are >95%. They just need to be separated slightly so the LEDs aren't heating the power supply. Most recessed LED lighting now has a separate junction box with the power supply.


> LEDs must be powered by a constant-current supply, and distribution does not work well at constant-current, and is always constant voltage. So no matter what you will need some sort of switching power supply.

I think the biggest problem is that many cheap power supplies cycle at low frequencies that cause flickering, which is perceptible subconsciously. A modern switch-mode power supply might operate in the 50–500 kHz range, which will not cause perceivable flicker.


The really cheap stuff actually doesn't even have a power supply! There's a breed of LEDs that takes straight AC and rectifies it using the LEDs themselves. By using a large number of tiny LEDs in series (typically in COB form), you can easily reach close to 110 V or even 220 V, and then you add a small current-limiting controller in series that's dirt cheap compared to magnetics... These are super cheap, and appear bright, but they flicker at 120 Hz, which can be annoying when there's motion or if you're sensitive to it.

I'd say it's a very bad choice for a bedroom or living room light, but I have nothing against it for the outdoor lights, signage and a bunch of other applications where cost is king.
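Rough numbers behind those "driverless" AC LED strings (the per-LED forward voltage is my assumption; the line voltage and frequency are US mains):

```python
import math

# How many series LEDs does it take to stand off rectified mains?
V_RMS, F_LINE = 110, 60
VF = 3.0                      # typical white-LED forward voltage (assumed)

v_peak = V_RMS * math.sqrt(2)
n_leds = math.ceil(v_peak / VF)
print(f"~{n_leds} LEDs in series to stand off {v_peak:.0f} V peak")
print(f"full-wave rectified flicker: {2*F_LINE} Hz")
```

A full-wave rectified string goes dark twice per mains cycle, hence the 120 Hz flicker the comment above describes.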


I have a serious problem with it for outdoor lighting and signage: it gives me a headache. Enough exposure will make me feel actively sick. The effect is not subtle.

Just don’t use these devices, please.


Branding matters. If your brand is a light that flickers, you might want to consider the old adage penny wise, pound foolish. As a consumer, why would I choose to shop at an establishment that has flickering lights when I could shop at a different one that did not? Unless of course, I had no choice.

But then, a wise entrepreneur would recognize paying extra to have non-flickering signage would attract some customers.

Flickering lights can induce migraines in susceptible people, so literally, saving a penny here actively drives away business.


I think this falls apart in the details. LEDs want constant current power supplies, and their owners frequently want them to dim. So you will still need a power supply.

You can fudge it with resistors, like in an LED strip, but you lose efficiency and dimming quality.

That being said, I expect that power supplies with 48VDC input or so would be cheaper.


Maybe this could be a prosumer retrofit thing, where the AC voltage gets converted to DC in the junction box, and then DC is sent down to the fixture.

Probably with some sort of current sensing system to make it compatible with dimmers.

Pair that with DC A19 LED bulbs that have no internal power systems.

Probably expensive to put together and to install, but if the goal was to have LEDs that last longer, that would do it.


>Maybe this could be a prosumer retrofit thing, where the AC voltage gets converted to DC in the junction box, and then DC is sent down to the fixture.

The problem is that in 99.99% of homes outlets are on the same circuits as light fixtures, you would need to do some major rewiring.


No, I'm saying you put a module into the junction box that the light fixture is attached to that serves as an AC/DC adapter, current limiting driver, and possibly a dimming sensor that would then provide downstream DC voltage to retrofit A19 bulbs.

Those bulbs would then have no internal switching systems to burn out and rely entirely on the module hidden behind the wall to handle their power needs.


You will still need to current-drive the LEDs; DC alone won't help.


> This is the frustrating thing about LEDs that IDK we can change.

I think that non-bulb LED fixtures are relatively common. For example, a style exists where you cut a hole in the ceiling and friction-fit the LEDs with the power supply up in the attic (presumably with infinite convective airflow): https://www.lowes.com/pd/Utilitech-Canless-Color-Choice-Inte...

These power supplies aren't going to die from overheating because the power supply is nowhere near the heat-producing LEDs. And, it's not like $30 for your entire light fixture is going to break the bank.


We _could_ even have standard DC bulbs for lamps with built-in standard power supplies, but they don't really exist.


> A single home DC power supply with ~200W of output could light an entire house, flicker free.

How about power over ethernet?


Would this system need new electrical wiring ?


Yes. For the commercial DC lighting installations I've seen they were using power over ethernet. That's not necessarily the only way to deliver DC power but whatever you do it's going to be wired differently from 120 VAC.


I used to do electrical installs in commercial buildings, and this was starting to catch on, mainly because the practice of running Ethernet (including the 8P8C aka RJ45 connector, patch paneling, etc.) is already established. It always felt very roundabout, though, and requires expensive networking equipment just to run lights, which I don't personally like because it will just cause confusion.


You also get (the wiring for) remote control for free though, which is nice.

It seems like there would be a market for cheap as possible 10 mbit switches with 802.3bt/802.3af support though.

https://www.amazon.com/Gigabit-802-3af-100Mbps-250Meter-Unma... is pretty cheap as is, I'm sure you could buy something in bulk for cheaper.



