The push toward LED seems to be primarily driven by emissions targets. It is very hard to buy incandescent bulbs in the UK, even for those of us who accept the cost implications. Also, many less expensive LEDs flicker at twice the mains frequency (i.e. 100 or 120 Hz). This is very annoying, and it comes down to the near-instantaneous response of an LED versus the averaging effect of alternating current through an actual glowing hot filament, whose thermal mass smooths the output. The development of blue and white LED technology makes for interesting reading.
In the EU this was indeed done for energy efficiency/emissions. Incandescent bulbs were gradually banned from normal sale, starting with the most energy-hungry (diffused 100 W) and gradually expanding until only low-wattage and special-purpose bulbs were left. Special-purpose bulbs cover a large variety of uses where switching didn't make sense, like machine shops or historic buildings. LEDs aren't mandated per se, but they are the most attractive alternative. And because this all happened before Brexit, the UK has the same rules, unless it has revised any of them post-Brexit.
For the most part this was a very positive step. Prices for LED bulbs plunged when they went from the "premium" energy-efficient alternative to the default option. But you also get a lot of crap on the market, and stuffing LEDs in form factors designed for incandescent bulbs makes good electrical and thermal design challenging, even for the brands that actually try.
> The push toward LED seems to be primarily for emission target related reasons
Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
Some cheap LEDs do flicker (typically at 100 or 120 Hz, twice the mains frequency). But that’s fairly easily solved. I don’t think I’ve noticed the flicker since some cheap bulbs I bought in 2014 or so.
Well… (Sorry, let me put my tinfoil hat on.) Yeah, that "noticed" part is what is worrisome to me. I do worry that there is some effect on our brains even though we might not consciously perceive the flicker.
As an analogy, I got into those supposedly audiophile "Class D" (or "Class T") amplifiers over a decade ago. Every day I turned on the music in my office and coded with the T-amp playing. I would have told you at the time that, indeed, it sounded amazing.
Some time later I built a tube amplifier (The Darling, in case anyone cares—I've since built perhaps a dozen more).
When I brought it into the office and swapped it out for the T-amp, the change was subtle but immediately noticeable. I hate to fall back on audiophile terminology, but it's the best I have for the experience: I was suddenly aware of the "listening fatigue" that had been a component of the T-amp. I hadn't even known it had been fatiguing until I heard the tube amp in its place for days on end.
With the loss of color fidelity and the flickering issue, I'm embarrassed to say that incandescent is starting to look good to me again.
I might, as an experiment, replace only those lights that we turn on in the evening when we are relaxing, reading.
>Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
At least in the EU it's true. Citing from Wikipedia: "The 2005 Ecodesign directive covered energy-using products (EuP), which use, generate, transfer or measure energy, including consumer goods such as boilers, water heaters, computers, televisions, and industrial products such as transformers. The implementing measures focus on those products which have a high potential for reducing greenhouse gas emissions at low cost, through reduced energy demand."
If I were able to see the flicker of mains supplied LED lighting (which I cannot), then I would be very tempted to install low-voltage DC LED lighting, which presumably does not flicker.
It only doesn't flicker if there's no power-driving circuitry, e.g. just LEDs and a resistor.
Otherwise, if there is a power IC present, there is flicker, though usually fast enough that most humans don't perceive it directly (you can still check by waving your hand in front of the light and looking for the strobed afterimage).
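A quick way to put numbers on this is the IES "percent flicker" metric, which compares the peaks and troughs of the light waveform. A minimal sketch, under the simplifying assumption that LED light output tracks drive current, comparing an unsmoothed full-wave rectified drive against one with a crude model of a smoothing capacitor (the 20 ms time constant is an illustrative value, not a measured one):

```python
import numpy as np

def percent_flicker(wave):
    # IES "percent flicker": 100 * (max - min) / (max + min)
    return 100.0 * (wave.max() - wave.min()) / (wave.max() + wave.min())

# 200 ms of a 50 Hz mains sine, sampled at 100 kHz
t = np.linspace(0.0, 0.2, 20001)
rectified = np.abs(np.sin(2 * np.pi * 50 * t))  # full-wave rectified drive -> 100 Hz ripple

# First-order low-pass as a crude stand-in for a smoothing capacitor (tau = 20 ms)
tau, dt = 0.02, t[1] - t[0]
smoothed = np.empty_like(rectified)
smoothed[0] = rectified[0]
for i in range(1, len(rectified)):
    smoothed[i] = smoothed[i - 1] + (rectified[i] - smoothed[i - 1]) * dt / tau

steady = smoothed[len(smoothed) // 2:]   # discard the start-up transient
pf_raw = percent_flicker(rectified)      # unsmoothed: 100% flicker
pf_smooth = percent_flicker(steady)      # smoothed: only a few percent
```

The point is that the flicker depth is entirely a property of the driver, not of the LED itself, which is why some mains-powered bulbs are effectively flicker-free while others strobe badly.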
Was just discussing last week with a colleague how, for the same lumen rating, there was such a dramatic difference between LED and incandescent bulbs for ease of reading paper books.
I use 90 CRI LED lights where I study. They are Philips' non-flickering, dimmable, high-CRI bulbs, yet my classic 25 W halogen table lamp still provides a far better reading experience. The spiky spectrum of an LED light is not as comfortable for the eyes as a full-spectrum incandescent light, be it classic or halogen.
For reading black & white text, incandescent lamps are perfectly fine.
However, if your book has color illustrations, a high-quality neutral-white LED lamp is better than any unfiltered incandescent lamp.
A standard white illuminant with filtered incandescent lamps would be even better, but such lamps, as they were made a century ago, were extremely good space heaters, which may prevent their use for reading a book.
The OSRAM halogen lamp I use has special filters and looks greenish when it's turned off, so it might not be a "brute force, let everything through", 25¢ halogen lamp. The bad news is that it's not being produced anymore, but I have a couple of spares.
I went through university with the same lamp/bulb combo, so it doesn't create many unwanted reflections. Also, I still print everything in color, because a good color choice still boosts the understandability of the material at hand.
There are great differences in light quality between the various kinds of LED lamps.
You may have various LED lamps, all of which appear to have the same white color, but their spectra are very different. Those with narrow spectral peaks are very bad lighting sources, while those with wide spectral peaks, achieved by using multiple kinds of conversion phosphors, are much better lighting sources.
With the best LED lamps, there is not much difference in comparison with incandescent lamps. While incandescent lamps are best from the point of view of their continuous spectrum, their yellow light strongly modifies the perceived colors. Bluish white LEDs (e.g. of 6500 K color temperature) are also bad, but neutral white LEDs (e.g. 5000 K or 5500 K) provide much better color perception than incandescent lamps.
For home lighting I prefer a white that is only very slightly yellowish, i.e. 4000 K lamps, instead of LED lamps with a higher color temperature or of incandescent lamps, which are so obviously yellow that no clothes have the same color as in daylight.
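As an aside, the color temperatures being compared here can be estimated numerically from a lamp's measured CIE 1931 chromaticity using McCamy's well-known cubic approximation, which is good to a few tens of kelvin near the blackbody locus, i.e. over the whole 2700 K to 6500 K range discussed above. A minimal sketch, using the standard CIE illuminants A (incandescent) and D65 (daylight) as sanity checks:

```python
def mccamy_cct(x, y):
    # McCamy's approximation: CIE 1931 (x, y) chromaticity ->
    # correlated color temperature in kelvin
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

cct_incandescent = mccamy_cct(0.4476, 0.4074)  # CIE Illuminant A, defined at 2856 K
cct_daylight = mccamy_cct(0.3127, 0.3290)      # CIE Illuminant D65, ~6500 K
```

Note that two lamps with the same CCT can still have wildly different spectra, which is exactly the point made above about narrow versus wide spectral peaks.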
The best quality of lighting can be achieved by incandescent lamps in conjunction with frequency-selective filters, which modify their spectra to resemble the spectra of a blackbody with a higher temperature, like the Sun.
Such filtered incandescent lamps were used a very long time ago, to provide lighting for color photography, movies and television (e.g. for the original white point of NTSC), but they were abandoned due to high cost and due to an exceedingly low energy efficiency.
As I have mentioned in another comment, filtered incandescent lamps might see a revival, but implemented with very different technologies than those used a century ago.
Very interesting. I've always thought that there was something a bit "off" about LED torches and car headlamps; the brightness is there, but something about the light just doesn't seem to illuminate as well as an old dim incandescent or even fluorescent tube.
It's usually the Color Rendering Index (a measure of the spectrum of wavelengths that the light puts out). Incandescent bulbs more or less mimic that of the Sun; they are "black body radiators". Cheap LEDs tend to be missing a lot of the red end of the spectrum.
However, you can get LEDs that do this well. Look for one with a CRI of 95 or higher.
They're saying that the visual performance is indirectly affected by invisible wavelengths somehow. Not that you can see the difference between two types.
They are saying that, and most real-world LED lighting uses very cheap diodes (like 99.9999% of them), which render colour very poorly compared with incandescent bulbs, which have essentially perfect colour rendering.
It's a big thing, and you can buy LEDs which produce a better colour range, but they're much more expensive and not as energy efficient, because creating bold reds costs real energy, and no diode trick will ever get around that.
I get that they're more efficient in some sense, but man, the LED streetlights and other big lamps are so irritating and make things look like such ass compared to mercury vapor or even sodium lights.
True. Yet somehow more and more cities install them blindly because of efficiency. I remember when I moved to Odense, Denmark in 2013: they had LED street lights all over the place. I thought, this is the future compared to my underdeveloped post-Soviet Latvia. And yet I remember when I moved back, the streets at night looked so yellow because the city still relied on sodium lights. And my eyes felt much more comfortable. At the time I wrote it off to nostalgia or something, and here we are.
For LED lamps, the color must be controlled at the emission source, not by filtering, i.e. by using an adequate combination of different conversion phosphors, to ensure a neutral white with a quasi-continuous spectrum, instead of a bluish white with great narrow peaks in its spectrum.
Unfortunately, the phosphors for the latter variant are much cheaper than for the former, so the lamp vendors have the incentive to make the lamps as bad as possible.
I ask only because I was retrofitting some navigation lights on a sailboat - and you can’t just upgrade the original incandescent bulbs with LEDs (or aren’t supposed to).
You are either supposed to get a special LED (backing up what you’re saying) or there are some new red/green enclosures that are differently treated / tinted to then put a “white” led into.
But I am so far from an expert on that, I may be completely misunderstanding.
There is a 15-30% difference between the groups at baseline (fig 8c-9c, 8d-9d), about the same magnitude as the claimed effect of the experimental condition.
I think the result would be much stronger if these baselines were comparable, and if they showed they had accounted for other variables like time of day and light history. I am also skeptical of any effect in the retina lasting 6 weeks with no fading.
Consider that people are often exposed to much more infrared light outdoors, so "worked under a relatively dim incandescent lamp" is not a particularly novel stimulus. Imagine that any of these people spent time outdoors during the six weeks - thousands of times more infrared light there.
Just to note for anybody who comes here directly: the article has no relation at all to perceived illumination, color fidelity, or anything else people complain about with LEDs.
It's an interesting niche topic that you may want your workplace to take notice of if you work indoors.
This should also be true for fluorescent tube (TL) lights. That kinda contradicts common sense, seeing that those are used all over the place in offices, kitchens, and hospitals, and makes me think this paper is bogus.
Also, even limited to the visible spectrum, I have not seen any 99 CRI bulbs. The highest I have ever found are the 98 CRI ones by YujiLED, but you pay around $35 for a single bulb. It is absolutely not "easy" to get flicker-free high-CRI bulbs, let alone ones that cover the infrared range.
Philips, GE, Cree, and others sell high-CRI bulbs.
10 years ago you had to work to find high CRI bulbs but could still find Cree bulbs pretty easily. Now you can get high CRI bulbs at the grocery store.
High CRI bulbs generally have low or no flicker because high CRI is toward the premium end of the market.
Almost all of the bulbs you can find at a hardware store (let alone a grocery store) exhibit terrible 120 Hz flicker. I know because I've literally tried every single one. Also, it's not hard to get "high" (~90-94) CRI while nonetheless having terrible deep reds.
Out of the manufacturers you listed, only Philips Ultra Definition (95 CRI, R9 90) have low flicker and good R9. Unfortunately they are poorly made and I have to keep buying new packs each year but it's more cost effective than Yuji for lesser used areas.
Also the claim from TFA is that NIR component improves visual performance (and I've read elsewhere that NIR also has health benefits).
How about Philips flicker-free "warm glow" bulbs? I honestly have a hard time believing that they flicker, because I can literally unscrew the bulb and watch it dim gradually over the course of a second. That indicates to me that there's a capacitor in front of the LED drivers smoothing the current out. (Which I guess is required to be compatible with triac dimmers anyway.)
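That inference is easy to sanity-check with the standard back-of-the-envelope ripple formula: between the charging peaks of a full-wave rectified supply, which arrive at twice the mains frequency, the capacitor alone carries the load. A sketch with purely hypothetical component values (not Philips' actual design):

```python
def ripple_voltage_pp(i_load, c, f_mains):
    # Peak-to-peak ripple of a capacitor-smoothed full-wave rectifier:
    # dV = I * dt / C, with dt = 1 / (2 * f_mains) between charging peaks
    return i_load / (2.0 * f_mains * c)

# Hypothetical values: 20 mA LED string, 470 uF electrolytic, 60 Hz mains
dv = ripple_voltage_pp(0.020, 470e-6, 60.0)   # roughly a third of a volt
```

A capacitor big enough to keep ripple this low also explains the slow fade on power-off described above: the stored charge keeps the LEDs lit for a fraction of a second after the supply is removed.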
Never tried those, but speaking about flicker, some LED lamps flicker not because of the mains frequency (50/60 Hz depending on where you live) but because of their internal switching power supplies.
Huh, through experience with (mostly non-premium) LED bulbs, I've learned to interpret "gradually dims over the course of a second" as an early indicator of imminent bulb failure.
If you look at energy efficiency, it totally is. But the whole point in the discussion is that IR _might_ (according to the paper) have biological relevance.
You can't buy heat lamps? They are even more infrared and last longer.
Also, LED lighting can have infrared, have a significantly smoother spectrum curve, and still last 20k+ hours without burnout. The cheaper bulb spectra that they show are a blue LED + phosphor coating, but there are infrared LEDs, UV LEDs, and more. You can make quite a convincing sun simulation, even better than any incandescent bulb, but there is almost no demand for UV + infrared super-full-spectrum lighting, unfortunately. Only movie and theater lights come close.
>LED lighting can have infrared, have a significantly smoother spectrum curve and still last 20k+ hours without burnout
Do you have a link to a bulb that you can purchase meeting all these criteria? The only one I'm aware of was this obscure "StarLike" that was never actually sold in bulk. LEDs can be made good in theory sure, but in practice they are all terrible in light quality compared to a standard incandescent.
You would need to look at the spectra of the various LEDs available and create a mix, along with phosphor mixes. The closest thing is something like a BLAIR-CG light engine from Aputure, where they have something like 9 different colors of LEDs that mix together, but they don't put any infrared LEDs in them because they are for movies, and they don't put in any UVB or proper UVA LEDs either. But there are infrared, UVA, and UVB LEDs that you could apply the same kind of engineering principle to, to make something that closely follows the sun's spectrum.
No, you can't buy them as bulbs. The closest thing is those red light therapy panels that include them.
Yes, there is something obviously wrong with most LED lights, but it isn't too much short-wavelength light; on the contrary, it's the near absence of cyan light in most LEDs. Our eyes are by far the most sensitive to it, the majority of receptors in the eye are sensitive to it, and we may focus primarily on it (focus differs for different wavelengths). This is how you get the feeling that something is wrong with your vision as you, for example, walk into a mall, and so on.
If anything, higher temperature lights seem to make it better, not worse, but the problem will persist as long as the cyan hole stays there.
Sensitivity peak for humans is in cyan (~510 nm) only for low-light conditions (night vision / rod cells). In daylight (cone cells) it's green-yellow (555 nm).
https://www.giangrandi.ch/optics/eye/eye.shtml
I've been using incandescent more often. All my vanity lights are 40w appliance bulbs now. The difference at night is remarkable. The LED is just too much even at 2700k. I still prefer LED for high power situations like br30/40 can lights.
No mention of CRI, which seems kind of odd. LEDs for lighting are increasingly graded by how natural their emission spectrum is. Older lights are quite bad; newer ones sacrifice a tiny bit of performance for a more uniform spectrum.
They use Rf numbers (the TM-30 fidelity index), which is a newer standard, so that's probably good.
However, the experimental group (extra light sources) got Rf 91 bulbs, and the control ("LED lighting") got Rf 85 bulbs.
The two scales are not exactly comparable, but they both max out at 100. The only source I could find that discusses both says that CRI > 90 is "excellent" and just below that is "very good". It says Rf > 85 is "very good", which tells me it's comparable to a mid-80s CRI bulb.
If I accidentally buy a mid-80 CRI bulb, I either return it to the store, or just throw it away.
So, I'd say this study's experimental setup doesn't support any useful conclusions. They showed that so-painfully-bad-California-won't-subsidize-them LEDs are worse than passable LEDs with supplementation from another light source.
The passable LEDs in the study are probably comparable to the cheap ones at our local hardware store, but worse than the ones that cost $10-20 on amazon ten years ago.
This would have been much more interesting if they'd compared high-end LEDs with and without supplementation, and found a difference. (And by "high-end", I mean "still much cheaper than the electricity they save".)
CRI is a pretty bad rating system. They are showing the full spectrum graphs, which is what you'd want anyway. The Spectral Similarity Index (SSI) is the better number.
Sure, but I don't see them mention what they're actually using for LEDs at all. They mention a "colour fidelity index" but I'd expect a manufacturer part number or something so I can pull the datasheet.
Funny enough, the best evidence for this study is that they should probably move somewhere with more sunlight if they can't spell "color" right... /s
I think CRI is not important here, as that's a measure in the visible spectrum. The paper talks about all the missing wavelengths outside of the visible spectrum.
I found an interesting tidbit about this bigger issue, and I want to share how to check it more easily.
We often see people reporting that they can clearly see lower-quality LED lights flicker, that it's really distracting to them, and that it even causes them headaches.
Now, I didn't see this until recently (except in failing lights), and only in the right conditions: the light has to be very, very dim. For instance, with only one light on at night, standing in a room far away from the light so that it's extremely dim, I could finally really see it flicker.
I've replaced that light with a better one and the effect went away.
I have incandescent light bulbs at home I have to pretty much smuggle from China. It's amazing how we're replaying the asbestos playbook a century later. Only this time it's government mandated.
Asbestos was pushed as a magical solution to problems of fire in homes without paying attention to the health effects. It took 80 years for the obvious to become law.
LEDs are pushed as a solution to energy consumption by humans without paying any attention to the health effects. Hopefully it will be less than 80 years of cancers and metabolic disruption before the obvious is done.
But this time the regulation was captured pre-emptively, to the point that following the best scientific advice for your health is illegal in most of the developed world.
> But this time the regulation was captured pre-emptively, to the point that following the best scientific advice for your health is illegal in most of the developed world.
Please cite your sources then. And no, the other article you linked does not prove your claim.
There's a mostly-unsubstantiated-by-data belief that LED lighting can cause health problems by some combination of flickering and narrow color spectrum.
Where does this article mention LED lights vs other types of artificial light-at-night?
What I could find regarding light color:
> However, most studies relied on satellite-images with a very low resolution (1 to 5 km, from the Defense Meteorological Program [DMSP]) and without information on color of light
> noted that data quality suffered from many limitations due to the types of satellite images used and the focus in the vast majority on visual light levels only rather than considering the circadian-relevant blue light component, among others. Future studies should consider improved satellite-based ALAN technologies with improved resolution and information on spectral bands and apply these technologies to a variety of cancer sites to yield better estimates for the potential risks between ALAN and cancer.
So nothing conclusive about LED being bad for your health (vs other types of light).
It looks to me like _you_ conclude it's related to LED; I couldn't find that stated in the abstract. It might just be related to a general increase in artificial lighting, regardless of the source.
There's a mostly-unsubstantiated-by-data belief that LED lighting can NOT cause health problems by some combination of flickering and narrow color spectrum.
I’m guessing the Russian theory that asbestos is totally fine and isn’t harmful? The Russians still use asbestos and say it’s a plot by the west that we got rid of asbestos in our buildings. (Don’t shoot the messenger here, I have no dog in this fight and am not expressing an opinion)
Asbestos is totally safe as long as it’s not friable and you don’t sand it or disturb the fibers. Mesothelioma was a major problem if you were repeatedly exposed to asbestos. It’s present in nature, especially in soil, in small concentrations. What makes it dangerous is if you’re constantly breathing it in. You would be doing that if you ran a buffer over asbestos tile for years, or if you worked in a space with asbestos pipe insulation, or if your job was to install asbestos siding or sheet flooring or Formica (many adhesives contained it). Even gypsum wallboard contained asbestos up through the 80s. It’s precisely its ubiquity as a building material that makes it dangerous, because people are constantly disturbing it occupationally.
The problem is that not disturbing the fibers is impossible if you work with it at all, and workers in Russia still suffer from life-changing injuries. Disposing of it safely is also not realistically possible. The regulator just doesn't care, it's as simple as that. Of course they don't "think it's safe" as GP said; there's a ton of research and practice to the contrary, and they set a specific (pretty low) limit on the exposure. But they turn a blind eye to the fact that it's impossible to enforce and will never be followed in practice as long as asbestos is still being used anywhere. This is why asbestos use is banned everywhere, and this is the issue with Russian regulations: they give a tiny bit of the economy priority over public health, using convenient research that pretty much "stretches the owl over the globe" (i.e. forces the facts to fit) in trying to downplay the hazard, if you actually read the relevant studies in Russian.
Why is it that right now there is still an "article found flawed after 6k citations" on the front page ( https://statmodeling.stat.columbia.edu/2026/01/22/aking/ ), but this random article coming out of nowhere makes the front page on the same day?
People really should get this and stop sharing newly published papers with the general public. The value of one single academic paper is exactly 0. Even a handful of such articles still has 0 value to the general public. This is only of interest to other academics (or labs, countries, etc.) who may have the power to reproduce it in a controlled environment.
Be very skeptical of correlations like this that have dubious or poorly understood causation. Be even more skeptical if they are about day-to-day stuff that would likely have large swaths of people able to reproduce something like it on huge scales yet they haven't. Extraordinary claims require extraordinary evidence.
This article is not making an extraordinary claim, and your objection is hyperbolic. Analysis of research should not be restricted to academia, but one should be careful not to cherry-pick research.
Considering the percentage of live mitochondria that are exposed to external light in a human, this seems like an enormous effect. The effect we'd expect from publication bias, though, is already pretty big. I'm going to go with the latter until we've got some replication and a plausible mechanism (like: why wouldn't whales be badly sick if this were a thing?).
The article mentions that unlike visible light, which is mostly absorbed by the skin, near infrared light penetrates deep into the body and the lowest frequencies of the Solar spectrum pass through the entire body.
This explains why most mitochondria are exposed to infrared light, even those deep in the body.
The article also mentions an inhibiting effect of blue and violet light upon mitochondria. For that, what you say should hold: the effect can happen only in the superficial layers of the body, because both skin and blood strongly absorb such light.
I don't know about the mitochondria bit, but it is plausible that work performance is affected by light spectrum. The n=22 is too small, but replication or larger studies are an obvious next step. Let's hope the researchers in this field use preregistration.
Scientific Reports is a junk journal fyi. Not conclusive, but indicative.
Despite saying the visible flux component is "small" and that the tungsten lamps "were not expected to [be used] as task lamps," Figure 6 (a) and (c) show... desk lamps right at the workstations, like task lamps! Not only is this experimentally unblinded, but the visible light immediately in front of the test subjects is noticeably brighter and warmer. The effect could simply be due to reduced eye strain.
What would James Randi do? "Extraordinary claims require extraordinary proof," and unfortunately this isn't it.
This would be more interesting if they add a visible light filter on the lamps so they only emit infrared radiation, and have an identical double-blind control with a 60 watt heater bulb so it emits no SWIR but the same radiant heat (which could confound and/or unblind).
It should be noted that even if we assume that the conclusion of this study is correct, i.e. that artificial lighting should have a wide spectrum including near-infrared light, that does not mean that returning to classic incandescent lamps is the right solution for this problem.
The incandescent lamps with tungsten filaments have a much lower temperature than the Sun, thus much more energy is radiated in infrared than needed.
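The point about filament temperature can be made quantitative with Planck's law: integrating the blackbody spectrum shows how little of a ~2700 K filament's output falls in the visible band compared with the ~5800 K Sun. A rough numerical sketch (treating both as ideal blackbodies, which a tungsten filament only approximates):

```python
import numpy as np

H, C, K = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # Planck, light speed, Boltzmann

def planck(lam, T):
    # Spectral radiance of a blackbody at wavelength lam (m), temperature T (K)
    return (2.0 * H * C**2 / lam**5) / np.expm1(H * C / (lam * K * T))

def band_fraction(T, lo=380e-9, hi=780e-9):
    # Fraction of total radiated power between lo and hi, by trapezoid
    # integration; 0.1 um .. 100 um captures essentially all emission here
    lam = np.linspace(100e-9, 100e-6, 200_000)
    y = planck(lam, T)
    total = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(lam))
    mask = (lam >= lo) & (lam <= hi)
    yb, lb = y[mask], lam[mask]
    return np.sum(0.5 * (yb[1:] + yb[:-1]) * np.diff(lb)) / total

frac_filament = band_fraction(2700.0)  # tungsten filament: under 10% visible
frac_sun = band_fraction(5800.0)       # Sun's surface: nearly half visible
```

So even before accounting for the eye's sensitivity curve, the filament wastes most of its radiation in the infrared simply because it runs at less than half the Sun's temperature.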
About a year or two ago there was a discussion of a very interesting research paper that reported results from testing an improved kind of incandescent lamp, with energy efficiency and lifetime comparable to LED lamps.
The high energy efficiency was achieved by enclosing the lamp in a reflecting surface, which prevented energy loss by radiation, except for a window that let light out, which was frequency-selective, so only visible light got out, while infrared stayed inside. The lamp used a carbon filament in an environment that prevented the evaporation of the filament.
With such a lamp, one can make a tradeoff between energy efficiency and the content of healthy near infrared light, by a judicious choice of the frequency cutoff for the window through which light exits the lamp.
Even with enough near-infrared light, the efficiency should be a few times higher than for classic incandescent lamps, though not as good as for LED lamps. Presumably, one could reach an efficiency similar to that of the compact fluorescent lamps (which was about half of that of LED lamps), for such an incandescent lamp that also provides near-infrared light.
How does enclosing the lamp in reflective material help with the energy efficiency? Isn't the infrared radiation emitted anyway? Doesn't that make the lamp overheat?
If the reflective material is ideal, by definition no infrared or other radiation is emitted.
Perhaps I was not clear, but the reflective surface was the interior surface, so it reflected any light, visible or infrared, back towards the emitting filament, while the front window reflected only the infrared and transmitted the visible light.
The lamp does not overheat, because the filament is kept at a constant temperature, the same as in a classic incandescent lamp. The difference is that you need a much lower electrical current to maintain that temperature, because most of the heat is not lost, as it is in a classic lamp. The fact that you need a much smaller electrical current for the same temperature is the source of the greater energy efficiency.
Only if you used the same electrical current as in a classic lamp would the lamp overheat and the filament be destroyed, but you have no reason to do that, just as you would not run a classic lamp at a current higher than nominal, which would likewise overheat and destroy it.
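That energy balance can be sketched as a toy model: in steady state the electrical input only has to replace the power actually leaving the enclosure, i.e. the visible light the window transmits plus whatever infrared leaks out or is conducted away. The numbers below are illustrative assumptions, not measurements of the actual lamp:

```python
def radiant_efficiency(visible_out, leak_out):
    # Steady-state toy model: electrical input = visible_out + leak_out,
    # so the useful (visible) share is visible_out / (visible_out + leak_out)
    return visible_out / (visible_out + leak_out)

# A classic filament radiates roughly 92% of its power as infrared heat
# (assumed visible share ~8%, in line with a ~2700 K blackbody)
classic = radiant_efficiency(0.08, 0.92)

# If the enclosure recycled all but, say, 10% of that infrared back to the
# filament, the same visible output would need far less input power
recycled = radiant_efficiency(0.08, 0.10 * 0.92)
```

Under these assumed numbers the recycling enclosure multiplies the radiant efficiency several times over, which is the effect the paper reported; the real gain depends on how selective the window is and how much heat conduction the design allows.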