This news article doesn't even mention the paper's title, the authors, or when it was published.
It was actually published in January, and has just now been picked up by the Telegraph.
Here is the citation:
Ognjen Ilic, Peter Bermel, Gang Chen, John D. Joannopoulos, Ivan Celanovic, and Marin Soljačić. Tailoring high-temperature radiation and the resurrection of the incandescent source. Nature Nanotechnology, 2016. DOI: 10.1038/nnano.2015.309
The Telegraph's explanation is terrible: "with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through."
The tungsten filament is sandwiched between two plates made up of layers of oxides, designed to selectively reflect infrared radiation and transmit visible light.
They used a numerical model to design and evaluate various candidate compositions for these plates. For a proof-of-concept, they chose one which uses layers of silicon oxide and tantalum oxide, with 90 layers in total per plate. This reflected about 90% of infrared radiation, producing a luminous efficiency of about 6.6%. This is comparable to commercial LEDs and compact fluorescents, though far from state-of-the-art.
However, the results closely matched their numerical model, and a more complex structure composed of layers of silicon dioxide, aluminium oxide, tantalum oxide and titanium dioxide, with 300 layers in total, should produce a luminous efficiency of 40%. This is significantly better than the state-of-the-art in LEDs (about 15-30%). They did not, however, actually build that one.
I have no idea how expensive this would be to commercialise. It doesn't sound like the physics is particularly complex, but manufacturing costs could be prohibitive. I think it's safe to say we won't be seeing it outside the laboratory any time soon.
"The Telegraph's explanation is terrible: "with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through."
The tungsten filament is sandwiched between two plates made up of layers of oxides, designed to selectively reflect infrared radiation and transmit visible light."
What's terrible about that? It's the same as what you said, using words that could be understood by a child.
Ask the child to draw it, you'll notice the problem. One description makes me think that the glass has some new treatment, the other makes me think that there is a new structure within the bulb.
I don't understand how that's supposed to make the bulb more efficient. If you're trying to keep the interior of the bulb hot, mission accomplished. But the problem we're trying to solve is that when we run a current through the filament, some of the energy we supply becomes visible light and some becomes infrared light. We want more of it to become visible light and less of it to become infrared light. How does preventing infrared light from leaving the bulb help with that?
The reflected IR light will be absorbed by the filament, so less energy is required over time to keep the filament at temperature. That thermal energy, while lower than the input energy, maintains a baseline temperature, so you only need some of the new energy to output more photons, instead of all of it.
Bingo, this is how incandescence works: heating any solid material above a certain temperature will make it glow. All we need to worry about is finding a material that can withstand the temperature and making it into an efficient package.
Gas mantles work pretty much the same way. A cage of ceramic metal oxides (formed by firing a cotton bag dipped in thorium or yttrium nitrate; if thorium is used, every use releases a few microrems of radioactive radon gas) is heated to incandescence by a gas or oil flame; because the temperature is much higher than in a light bulb, gas lamps actually waste less heat as infrared.
Here’s my take on this (corrections welcome): the thing you need to get light out is a hot filament. You use electricity to a) make it hot and b) keep it hot. Both are easier the better the filament is thermally isolated.
Caveat: if that argument is correct, I would think a bulb with triple glazing would help, too. Has that been tried?
The thermal mirror needs to selectively reflect infrared while being transparent to visible light, and it needs to bounce radiation back and forth, which requires a certain geometry. The novelty of this paper is creating a big flat filament sandwiched between high-temperature plates that reflect infrared and can be deposited as layered coatings.
You're trying to keep the filament very hot while spending the least amount of power. Apparently, reflecting the thermal infrared back at the filament lets you spend less power while keeping the filament equally hot and equally luminous in the visible band.
The IR is only a loss if it escapes. Their design tries to reflect it back at the filament to reuse it. A hot filament encased in a perfect window that reflects everything but visible light is highly efficient, simply because there's nowhere for waste heat to go.
Check out the Wikipedia article on "luminous efficacy". Hypothetical black-body sources with their output truncated to the band where photopic sensitivity is at least 2% of peak are excellent, and that's what this group is trying to approximate.
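To put rough numbers on the "truncated black body" idea, here's a quick Python sketch (mine, not from the paper or the Wikipedia article) that integrates Planck's law against a standard Gaussian fit to the photopic sensitivity curve V(λ):

    import numpy as np

    def planck(lam_m, T):
        # Black-body spectral radiance; the absolute scale doesn't matter here
        # because we only ever take a ratio.
        h, c, k = 6.626e-34, 2.998e8, 1.381e-23
        return (2 * h * c**2 / lam_m**5) / np.expm1(h * c / (lam_m * k * T))

    def visible_fraction(T):
        # Fraction of total radiated power the photopic eye responds to, using
        # a common Gaussian fit to the sensitivity curve:
        # V(lam) ~ 1.019 * exp(-285.4 * (lam_um - 0.5591)**2), lam in micrometers.
        lam = np.linspace(0.05e-6, 50e-6, 400_000)  # uniform grid, far into the IR
        B = planck(lam, T)
        V = 1.019 * np.exp(-285.4 * (lam * 1e6 - 0.5591) ** 2)
        return (B * V).sum() / B.sum()  # grid spacing cancels in the ratio

    for T in (2800, 3000, 5800):
        print(f"{T} K: {100 * visible_fraction(T):.1f}% of radiated power is useful light")

A 2800 K filament comes out around 2%, the usual "incandescents are a few percent efficient" figure, and that other ~98% is exactly the waste this design tries to recycle.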
Think of it like a blanket for the filament. The IR radiation helps keep the filament at the appropriate temperature, so the amount of ohmic heating from electricity can be less.
I also found the Telegraph's explanation lacking. The explanation they give makes me think that the interior of the glass bulb has some coating or special composition.
This impression is quite distinct from the reality of the filament between plates with alternating layers of SiO2, TiO2, Al2O3 or Ta2O5.
The Telegraph could have at least said that the advance involved sandwiching the filament between layers of transparent material that transmit light but reflect IR. Not to mention conflating crystal structure and glass. (Glasses are distinguished from crystals by a lack of repeating long-range structure.)
The manufacturing costs won't be too bad, I'd imagine. At least, in terms of the normal shocking costs of manufacturing. If the design is simple enough, those 300 layers may be just 75 repetitions of four layers (no doubt not actually the case, but the machine spraying material onto the plates can adapt).
I'm a bit disappointed that the recipe wasn't included; I would have liked to see what the design entailed. All the layers are dielectrics, and they're all oxides, so the process chambers won't be too hard to control. And I say that knowing full well some poor coater engineer is actually just going to have a devil of a time with it (or not! quarter-wave filters are fairly standard stuff).
Most of all, the thing I'd be interested in is: how resistant to error is the design? It sure looks like it doesn't need to be too perfect per layer. For a pair of microscope lenses, precision is everything. In a lightbulb you can probably get away with a lot of error per layer (as a whole, you could easily make the color something goofy). Of course, if its color swings wildly with minor errors in certain layers, well, that could hold the product up in R&D hell.
> Most of all, the thing I'd be interested in is: how resistant to error is the design?
I feel like this is what keeps the most impressive research out of real world application. Yeah, manufacturing is expensive, but tolerances are just so much more controllable in a lab.
Manufacturing also produces revenue, whereas funding the years of research necessary just to figure out a new nanometer-precise process is a hard investment for most companies to swallow, especially if the end product isn't guaranteed to produce a profit itself.
The good news is that the processes I described above are already well established. The trick is always in the dirty, nasty details, but in this case I'd be bullishly optimistic. I haven't personally worked on a large batch coater, so I can't attest to how hard it is to tune the things in, but your typical coater engineer is a bit eccentric anyhow and will no doubt make it work.
See, the weird bit of how these coatings work is that it's fairly easy to empirically tune the damn things to be angstrom accurate (-ish), but the geometry of how the material is sprayed messes up the uniformity, which in turn ruins your yields (one square centimeter in the center is good, chuck the rest!). But if the coating can withstand some variation between layers, tuning it such that the layers sorta cancel out in the bulk, you'll lose the efficiency, but your yield can absolutely skyrocket. There will likely be key layers at the optical half, quarter, eighth, etc. thicknesses, but if you get enough layers you can probably get away with a lot. You can also just build really neat source geometries to sorta flatten out the spray over a larger area, but that's something I never worked on.
My ignorance of this sort of coating has me worried about it immediately destroying itself from the insane material stresses... though I have no idea how well it'll anneal after 1000 hours of constant heat (or worse: on-off heating and cooling cycles!). But since they're all oxides and likely put down together while the substrate stays hot, I bet it works out OK.
Each of those oxide layers is part of a "quarter wave stack" (from the paper), which means they're somewhere between 200 and 1000 nanometers thick (depending on the wavelength of light they want to transmit or reflect).
So if one of those layers is, say, 200 nanometers, you could end up reflecting the visible and transmitting the infrared, which is the opposite of what you want to do.
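(For a sense of scale, a quarter-wave layer is λ/(4n) thick. The indices below are generic textbook values for these oxides, not the paper's actual recipe:)

    # Quarter-wave layer thickness: d = wavelength / (4 * n).
    # Refractive indices are rough textbook values, not the paper's design data.
    materials = {"SiO2": 1.45, "Ta2O5": 2.10}

    for name, n in materials.items():
        for lam_nm in (550, 1500, 3000):  # visible, near-IR, mid-IR targets
            print(f"{name} quarter-wave layer for {lam_nm} nm: {lam_nm / (4 * n):.0f} nm")

So visible-band layers come out under ~100 nm, while layers tuned for mid-IR wavelengths land at several hundred nanometers.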
True, but when you have an awful lot of layers, you can play some silly buggers with it. Even if you botch a layer, it won't necessarily ruin the whole thing, since the interference between the sets of layers also acts as a filter. I haven't worked on these sorts of stacks personally, but I'd be surprised if the whole stack dies because one layer was a bit screwed up (I'd expect a degradation in the efficiency, but not necessarily a total reversal; now ask me what happens if your goal is to tune to a neutral color and you make a slight error, and I'll flip sides and say "rainbow").
What I'd really expect is that if you make one layer slightly larger, your process is almost certainly making all the repetitions of that layer larger too, and now, yeah, your bulk properties are just hosed.
An additional and important application discussed in the paper for these (relatively) high-efficiency infrared reflectors is thermophotovoltaic systems, which many think will be generating a lot of our electricity in 50 years.
There's nothing state-of-the-art about 15-30% efficient LEDs; maybe a few years ago, but not in 2016. Blue LEDs are at almost 59%, and even the whites are at about 50%.
I'm confused. If reflecting 90% of infrared radiation only produces a luminous efficiency of 6.6%, how can any further increase in reflection lead to 40% efficiency?
Obvious but maybe wrong answer: because it's not linear. If you trap nearly all the IR, it stays until converted to visible; otherwise it still escapes after a couple of bounces.
If you reflected all the infrared light, the efficiency would be 100%, since all the light leaving the bulb would be visible. Normally, the vast majority of the light leaving the bulb is infrared, but if you keep 90% of it in the bulb, then 6.6% of the light leaving the bulb is visible light.
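A toy energy balance shows why it's so nonlinear. Suppose a fraction f of the filament's output is visible and a fraction R of the IR is reflected back and fully reabsorbed (my simplification; in the real device not all reflected IR makes it back to the filament, which is part of why they measured 6.6% rather than these numbers):

    def efficiency(f_visible, r_ir):
        # Visible power out over total power escaping the bulb, assuming every
        # reflected IR photon is reabsorbed by the filament and re-emitted.
        ir_escaping = (1 - r_ir) * (1 - f_visible)
        return f_visible / (f_visible + ir_escaping)

    f = 0.02  # roughly 2% of a bare tungsten filament's emission is visible
    for r in (0.0, 0.9, 0.99, 1.0):
        print(f"IR reflectance {r:.2f} -> luminous efficiency {100 * efficiency(f, r):.1f}%")

What matters is the escaping fraction (1 - R), so going from 90% to 99% reflectance cuts the waste by nearly 10x, not by 9 percentage points.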
Fluorescent lights don't use filaments, they use electrically excited gases.
The reason this technique improves the efficiency of incandescent lights is that they produce their light by heating the filament. The hotter the filament, the more light it produces.
So by reflecting the heat back towards the filament, you can achieve a hotter filament with less input energy.
Fluorescent lights do use filaments. At each end of the tube there's a hot cathode - essentially a small incandescent filament designed to emit electrons rather than light. This is the source of the electrons that excite the gases.
While there are some designs that use cold cathodes these are not the common type as they are less efficient.
Still, that doesn't mean this filament treatment would help - sandwiching the cathode between layers would probably block the electrons.
So it sounds like this new bulb will have the same problem people complain about with LEDs and CFLs: slow warm-up times. The filament will take longer to warm up to full operating temperature, since you're relying on thermal reflection in the glass to let the heat build up. I guess they could get around this by having a power control circuit in the bulb, to give it a large inrush current to heat up quickly, and then back off once it's warmed up.
Is that the same oxide as used in Tantalum capacitors? Those things have a tendency to explode randomly if inrush current isn't controlled appropriately.
They're also expensive and somewhat of a conflict material, from what I understand from the little reading I've done on them (but they're awesome for power circuits).
People complaining about LEDs being too "bright" or "clinical/sterile" couldn't be bothered to look into the different color temperature options. The annoyingly bright bulbs are around 5000K color temperature, which would make a home at night feel like a fully lit classroom. I have 2700K bulbs which give off a much cozier incandescent glow: http://blog.batteriesplus.com/2013/seeing-things-in-a-differ...
Sure, the bulbs take about 1/5 of a second to turn on, which is not instant, but the cheaper LEDs can take up to 2 seconds to reach full brightness. You have to buy a better made bulb to get the quicker "on", which means spending more money. People only trying the cheapest LEDs available are the ones complaining the most about them, I'd assume.
Color temperature isn't the only thing, CRI is important too (though less so). It's hard to find CRI specs on most bulbs, nobody mentions it unless they have a really good number to brag about (90+).
One thing never mentioned is that LED color temperature will vary over time. It's a two-part system with complementary colors, the LED itself is blue and it's coated or surrounded with a yellow phosphor. The phosphor will dim eventually.
I've never seen a LED bulb that was so slow to light up that I noticed it, unlike some fluorescents that are really atrocious.
I've already given up on fluorescents, if only for the hassle of recycling them due to the mercury content.
Normally when people complain of LED bulbs being slow to turn on, it's actually the dimmer at fault. Most dimmer switches have only two wires, and so charge their caps through the bulb. Since LEDs draw far less current, this takes longer, and so the bulb takes longer to appear on.
I have some stairways I must light up and they need to light up instantly for safety reasons. I make sure one of the bulbs in the fixture is incandescent, and CFL/LED the rest.
Bizarre that the LEDs would have a delay then, or that including an incandescent would make any difference in that delay. (Or are you saying it doesn't make a difference in how quickly the leds turn on, but it itself turns on faster? I guess if they're cheap ones with poorly tuned internal rectifiers that could be the case.)
My Cree Type A LED replacement bulbs are over 90 CRI.
Incidentally, is CRI a totally bullshit measurement, or does it mean something? I find it oddly coincidental that an incandescent supposedly has a CRI of 100 and matches daylight. How is this possible for such an old tech? Did they just get really lucky with the first lightbulb? Or have the gases and filament material improved over time to bring CRI up to 100? I have a hard time believing CRI was the ultimate goal of incandescent development, but I could be wrong.
CRI measures how close (using a terrible metric) oranger lamps get to a black body radiator, or how close bluer lamps get to an approximation for daylight from the 60s, which each have a CRI of 100 by definition. Incandescent bulbs are a black body radiator, so unless you put some kind of filter in front, they’ll have a CRI of 100. It’s not a 0–100 scale though. 100 is an arbitrarily selected number.
There are reasons to be dissatisfied with the light produced by LEDs and fluorescent lamps, but CRI is only very marginally useful.
Better is to look at the spectral power distribution, and try to find lamps with a broad emission spectrum and no sharp spikes. For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.
CRI measures how natural objects' colors appear under an artificial light source, compared against the same objects' colors under a natural light source (standardized daylight).
The color temperature and luminosity of natural light sources vary tremendously over the course of the day or in different parts of the world, yet the objects' colors remain consistent. This is a property of human vision called chromatic adaptation. https://en.wikipedia.org/wiki/Chromatic_adaptation
Sources which emit a continuous spectrum have better CRI than tri-color sources, especially on artificially colored objects (dyed/pigmented) due to metamerism. https://en.wikipedia.org/wiki/Metamerism_(color)
If you want to get technical, the algorithm for computing CRI is:
- take a specific set of arbitrary paint chips, and record their colors when illuminated by the target lamp (using the CIE 1931 2° standard observer, and the 1960s CIEUVW color space)
- perform an outdated and not very effective type of chromatic adaptation transformation so that the white point matches either a black body radiator (like an incandescent bulb) for lamps of correlated color temperature of <5000K, or a point on the “Illuminant D” series of approximations to daylight for lamps of CCT >5000K
- measure the distances in UVW space between those target colors and the colors of the paint chips as illuminated by the reference black body radiator or D series illuminant.
- sum up all the color distances, multiply by an arbitrary factor, and subtract from 100.
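The "arbitrary factor" in that last step is 4.6, so the scoring step fits in a few lines of Python (the ΔE values here are made up for illustration):

    def cri_general(delta_e_uvw):
        # Special index per sample: R_i = 100 - 4.6 * dE_i, then the general
        # CRI (Ra) is the mean over the eight standard samples.
        special = [100 - 4.6 * de for de in delta_e_uvw]
        return sum(special) / len(special)

    # Hypothetical U*V*W* color shifts for some middling lamp:
    print(f"Ra = {cri_general([1.5, 2.0, 3.1, 4.4, 2.2, 1.8, 2.9, 3.6]):.0f}")  # -> 88

Individual R_i values can even go negative for bad lamps, which is part of why it's not really a 0-100 scale.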
Everything about this process was totally ad hoc and arbitrary 50 years ago when it was first invented, and is absurdly outdated today.
There was an attempt at the CIE to replace the CRI with something better in IIRC the late 1990s, but the lighting industry didn’t want to follow the recommendations of the color scientists, so it fell apart. (I could be misrepresenting what happened; I’m not an expert in CIE politics or the history here, and haven’t ever researched it in detail.)
> For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.
Moonlight has significant energy at wavelengths below 500 nm [1]. Does this light disrupt the sleep cycles of other primates, or is this something that developed in humans only after we invented "indoors" and started sleeping there away from sources of bluish light?
Yes, if you stare directly at the full moon, it will be harder to see into the shadows for a while afterward until your eyes adapt back to the dark, and you can probably (I’m not sure this has been scientifically tested) push back when you start feeling sleepy.
In general, the moon is high overhead, so it won’t be directly in your field of view while it lights up your surroundings.
Light in the short-wavelength part of the spectrum, especially bright sources of glare, saturates the "rod" light detectors in the eye, causing vision to become bright-adapted. Adapting back to fully capable night vision takes something like 20–30 minutes.
Additionally, beyond rods and cones, there is a third set of light detectors in the eye, the "photosensitive retinal ganglion cells", which regulate melatonin and sleep cycles. These are particularly sensitive to light in the 400–500 nm range. Looking at such a light source at nighttime can suppress melatonin production until about an hour after you stop the light. This is the reason that smartphone, television, and computer displays are all so disruptive to sleep when used before bed.
> matches daylight. How is this possible for such an old tech?
Kind of makes sense when you realize that the sun and an incandescent bulb are both generating light as a small slice of their overall thermal radiation envelope. It's just heat, made visible in both cases. And like sibling commenter said, the CRI was defined based on the incandescent.
Though the filament would have to be at about the temperature of the sun's visible surface (with some adjustment for atmospheric absorption and scattering) to have the same spectrum. That is about 5700K, and tungsten melts at 3695K. As was mentioned above, the scattering changes quite a lot through the day.
True, though I guess I'm explaining more the continuous nature of the spectrum. LEDs are very notchy or peaky, while light produced from glowing tungsten is a smooth output across the visible spectrum that it covers. The fact that bluer light is scattered by the atmosphere also helps bring sunlight closer to incandescent light in terms of color temperature though, right?
Incandescents work more similarly to the sun and are basically a natural full-spectrum light source. Fluorescents and LEDs are not, so they need to be tweaked/hacked/refined to produce an ideal spectrum. Fluorescents are known to use mercury to help with this, making them bad for the environment. And though the prices are coming down, high-CRI LEDs have been a trailing technology, with high-CRI 5000K bulbs becoming affordable/viable only recently.
Yep, the spectrum of light emitted is what matters. A histogram or spectrum intensity curve printed on the cardboard package would be ideal. Sadly, I've never actually seen that in consumer grade stuff.
I've found that most LED bulbs you can find in supermarkets are really quite bad in this respect.
For my home office, I got something called "True-Light" E27 bulbs. The specs are 12W, 5500K, 920lm, CRI 94. They emit a passable bright, white light.
My favourite light source are still E11 halogens. Just pepper the ceiling with them, add dimmers, and hope it doesn't heat up the room too much…
I’m one of those goofball hipsters who is in love with those “Edison filament” bulbs. In the past year, passable LED facsimiles have been turning up everywhere. No idea what the color temperature is, but if you like that cozy, dim feel…
Hey, those look really nice. I may turn to the Victorian style ones for my lighting needs. The site advertises the colour temperature as 2200K to 2700K depending on the model.
Yep. I've replaced a lot of the bulbs in my house with LEDs. The key is to buy a number of them, find ones that you like, and then put the ones you don't like in places where the color doesn't matter as much (garage, storage areas, etc).
For me, a lot of the Philips LED Soft White (2700k) bulbs worked.
Agree. While tastes differ, to me the Philips models (the ones that look like an Erlenmeyer flask) give the most pleasant light by far. They also seem to be much better constructed than the knock-off cheapies you find in bulk packs at Lowes or Home Depot.
I've found that that Walmart "Great Value" brand 8.5 W, 800 lumen, 2700K, 20000 hour bulbs are nearly indistinguishable from the 60 W incandescent bulbs I had before, except that these LEDs are not dimmable.
They are nominally about $2 each, although they frequently drop the price to an astounding $0.17 each. I'm not sure if these price drops are due to some sort of automatic at-the-register rebate from the local power company, or just an internal Walmart effort to promote efficient lighting.
The company that manufactures these for Walmart is called TCP, and they sell the bulbs on Amazon as well. I've bought about 30 of them.
Pretty happy with them so far. The only problem I have had (as with all brands so far) is that LED bulbs seem to interfere with my garage door receiver signal (if placed right next to it).
Philips makes lots of different bulbs, even with the same specs. I suspect they have very little in common, other than the name on the box. This is a time of very high churn in the industry.
I wanted to like 2700K, but I find it to be extremely yellow. I remember, even in tungsten times, finding lights annoyingly yellow. It's even more apparent when working at an LED screen, then turning around to see your devastatingly yellow lights. Then again, I find 5000K lights too blue, and I have found that 3500K-4000K is the most pleasant temperature, but the industry has settled on yellow and blue lights. It's a shame.
I've tried two brands so far (GE and Hyperikon) for LED bulbs at 2700k and 3000k and they all were incredibly white and nothing like my incandescent bulbs- and on top of that, even the "dimmable" LED bulbs suck. I want to have LED all over my house, but there's honestly almost an "uncanny valley" feel to the LED bulbs I've tried, and so far I'm sticking with my incredibly inefficient bulbs.
I've tried plenty of LED bulbs and so far they've all been rubbish. The light of 2700K bulbs is definitely whiter than incandescent bulbs. Stuff in the room is not the right color. In addition, the bulbs seem to "flicker" and/or make strange noises when dimmed. Good thing some shops in smaller towns still sell incandescent bulbs.
That can happen if 1) you have an older dimmer not rated for LEDs 2) the dimmer is mis-wired; some are very specific where you hookup the hot and load wires especially if you have a 3-way circuit for the light.
Cree TW series are the best I've found. They were the only bulbs to meet California's voluntary requirement for LED lights when I bought them. The requirements cover color rendering, temperature, dimmability, and other important things.
I have Cree recessed modules in my kitchen, and they're fine when not dimmed. They don't change color temperature when they dim, which incandescents do.
I tested ~10 different bulbs for my kitchen lighting about 5 years ago. I don't know how much the situation has changed since then, but at that time I had a few key requirements - high output (150W equivalent), warmish (but not super-warm for the kitchen) color temperature, fast turn-on speed, and dimmable.
Much to my surprise, the (almost) cheapest bulbs (some Utilitech model that doesn't seem to exist anymore) ended up being the winners. Instant on (no perceptible delay, compared to all others I tried), great color temp, no flickering, wide range of dimmability. I tried many different brands, including Philips at 3x the price.
It's just a single anecdote, but I cared a lot about the performance of the bulbs, and I was surprised to find the cheap ones actually worked best. I think there's a lot of variability out there and you can't rely on cost and you definitely can't rely on brand name (Philips was the worst).
Lately, I've found Ikea's LED bulbs are good and quite cheap (with imperceptible delay on turning on).
As another anecdote, I have several LED bulbs from Philips (LPrize-PRO and Hue), GE (GE Link) and Ikea.
All of them are bright, work fine, and start instantly, but the Ikea drivers have a constant buzz. I've relegated those to bathroom fixtures that are only used briefly so that the sound won't drive me crazy.
Indeed. I recently replaced all the incandescent cans in my house with LED cans and was very certain to specify a color temperature. I've also got "LED filament" bulbs in one room at 2200K (designed to replace those clear 'vintage' bulbs), which give a really cozy feel, if a little too yellow. Yet for my garage, I love the high color temperature bulbs.
I've done countless tests with different temperatures and I've found that 3500k is the perfect one for my house but it's really hard to find bulbs of that color temperature.
I've used some of the better tested LED bulbs (per http://www.ledbenchmark.com/ 2700K, 80+ CRI, "nice" looking spectrum, "flicker free"). They are very good, and have improved over bulbs from years back. Definitely better than any CFLs I've used. Still a bit less vibrant and warm compared to incandescent. I personally don't need huge amounts of light outside of bathroom/garage, and find most of the LED bulbs too bright if swapped out 1:1 per the equivalent wattage labeling in my existing fixtures.
I've tried multiple different LEDs rated at 2700K to replace the GU10 halogen lights in my kitchen - literally every single option I've tried is too "white". Nothing gives off that nice warm colour of a halogen, even expensive Philips LEDs can't match right colour warmth.
They use considerably more energy, since each one uses 50W; times 8 halogens in the kitchen, that's 400W just for the lights, vs. 40W if they were LEDs. But then again, the energy saving is not worth anything if the light annoys me. And they are not expensive: a halogen GU10 costs $1 each.
Halogen bulbs (as a sub-set of incandescent bulbs) are actually more efficient than normal incandescents, because they operate at a slightly higher temperature, and the glass usually stays clear for longer. This is possible because the halogen inside the bulb prevents evaporated tungsten from condensing on the glass. This also means that an appreciable amount of ultra-violet light is produced, so most halogen lamps will have a UV filter.
The reason people need more power going into halogen lighting is usually because of all the other arrangements around the bulb that reflect the light into the wrong place. A bare halogen bulb (without a reflector) will light a room better than the same amount of power going into a bare normal incandescent bulb, but it will be a harsher bluer light.
The colour issue can be true for CFLs too, when I was living in Asia last year I had a really hard time finding warm-white bulbs. It seems like they prefer cold-white bulbs there (you even can see the difference when flying over Asia vs Europe). It's great for a kitchen where you need to see clearly what you are doing, but not so much when you want a cosy living room.
I assume most LED bulb factories are in China at the moment, so they are just targeting their own preferences for colour.
> People complaining about LEDs being too "bright" or "clinical/sterile" couldn't be bothered to look into the different color temperature options
And when I'm buying these lights at the supermarket, where does the information about other color temperature options present itself? I've bought lighting for camera work, but I've never seen anything about color temperature at a local store.
In Australia at least, pretty much every single different bulb has its colour temperature printed on the box, as well as a usually having little bar chart thing showing where in the visible spectrum it sits.
They all have descriptions like "Cool White", "White" and "Warm White" too.
It works pretty well - I haven't been disappointed in LED globes I've bought, whereas I bought a few CFLs back in the day that I couldn't stand.
You're not going to get a ton of granularity, but you should be able to find at least ~2700K and ~5000K bulbs in your local supermarket, if they've got a decent selection at all-- "soft white" bulbs are going to be ~2700K and "daylight" bulbs are ~5000K.
Stores with larger selections may have a few options in between, too; "cool white" and "bright white" designations aren't uncommon (although exactly what they mean may vary; check the packaging if you're not sure).
I guess my problem is the information discovery. How does the average person buying bulbs in Target or Walmart know to look for this new thing? "Soft White" and "Daylight" are good terms but I wonder how you know the difference? Daylight bulbs had a bit of a different meaning in my mind, but I guess I grew up in a rural area.
Incandescent bulbs never had any variation, they're all around 2700K. The GE Reveal bulbs I believe use a color filter to influence the appearance, but the filament still glows the same as everybody else's.
> Incandescent bulbs never had any variation, they're all around 2700K.
That's basically my point. Of course people will complain about the light since LEDs act differently and buying options didn't properly explain how to get back to something that looks like what people had.
I have two different brands of 2700K LED bulbs, and they are like night and day. The ones that shipped with a lamp are actually quite bluish. Haven't bothered to measure them, though...
"Alexa, I'm ready for bed." is like I live in the future (warms up the lamp colors like flux on my mac). Still working on tying the coffee pot into the morning wakeup alarm though.
I'm in the process of figuring out how to talk to my color changing LEDs so I can change them over time like flux, but for now I'm just turning them a dim, dull red at bedtime and I am sleeping better than I have in years.
We switched to LEDs about two years ago and have switched back to halogen bulbs recently for a few reasons.
1) It didn't actually make much difference on our electricity bill. I believe this is because we use most of our lighting during the winter, and the wasted heat from the traditional bulbs isn't really wasted at all -- we just shifted more work to our electric heating system.
2) Many of the bulbs had burnt out. I am now very skeptical of all the "20 year lifetime" claims since we lost many expensive bulbs in two years.
3) The light quality just isn't great. People will argue this point, and I can't point to any evidence to support my claim, but our perception was that the light was either too blue, or too yellow, and just "looked weird." I wonder if it's related to the strobing of the LEDs.
> 1) It didn't actually make much difference on our electricity bill. I believe this is because we use most of our lighting during the winter, and the wasted heat from the traditional bulbs isn't really wasted at all -- we just shifted more work to our electric heating system.
The only reason people expect them to is the, IMO, obsolete idea that lighting load is a significant driver of power usage.
> 2) Many of the bulbs had burnt out. I am now very skeptical of all the "20 year lifetime" claims since we've lost lots of expensive bulbs in two years.
Maybe I have just been lucky, but the LED floods I put on the exterior of my house have been working like champs. I used to go through halogen and CFL bulbs like crazy out there. The Cree LED retrofits for my dimmable can lights have also been solid.
I'm not sure this is an obsolete idea. The lighting in my house is simply an astounding use of electricity. In fact, I think people underestimate how much power it consumes.
My house's baseline load is about 200w; networking/computer equipment mostly. The fridge that cycles on and off, and an instant-hot tap that cycles on and off to keep up to temperature.
Each bedroom had 60w floodlights in 4 cans. Sure, we don't have them on all the time, but 4x3x60 = 720watts. That's half a hairdryer! Our living room+hallway was even worse. There were a bunch of MR16s at 50w. With 16 of them in the house, that's another 800w. When we're home in the evenings, the living room lights are on all the time, because it's in the main part of the house. To be fair, the payback on the $500+ in LED bulbs I put in is way out there, but a big driver in my spending was to keep temps down during the summer. It seems so silly to artificially heat the house with wasted electricity.
In the winter, we don't use much heat, and the gas forced air heat is a hell of a lot better than using lightbulbs.
I don't know offhand what my baseline is. Still, this works out to about 4.8 kWh per day of baseline energy usage for you.
> Each bedroom had 60w floodlights in 4 cans. Sure, we don't have them on all the time, but 4x3x60 = 720watts. That's half a hairdryer! Our living room+hallway was even worse. There were a bunch of MR16s at 50w. With 16 of them in the house, that's another 800w. When we're home in the evenings, the living room lights are on all the time, because it's in the main part of the house.
That's a hell of a lot more lighting than I have. Our bedrooms have single-bulb overhead lights. The guest bedroom is rarely used, the master has two 25W lamps that are used infrequently, and the third is my lab with has light on maybe two hours a day (I prefer it off unless I am at my workbench). We have a bunch of cans, but except for one that is always dimmed they are usually off. The lights that are typically on when we are home are T8 fluorescent tubes in the kitchen.
Anyway, going back to your example, the typical assumption for lighting is that a given fixture is used on average 3 hours per day. Using that assumption for all your lights gives about 4.6 kWh per day for lighting, or roughly the same as your baseline load. Noticeable, but far from astounding.
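Sketching that arithmetic out (wattages from your comment, 3 hours/day assumed):

    installed_w = 4 * 3 * 60 + 16 * 50     # bedroom cans + MR16s = 1520 W installed
    lighting_kwh = installed_w * 3 / 1000  # at 3 h/day -> about 4.6 kWh/day
    baseline_kwh = 200 * 24 / 1000         # 200 W around the clock -> 4.8 kWh/day
    print(f"lighting {lighting_kwh:.2f} kWh/day vs baseline {baseline_kwh:.2f} kWh/day")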
At much greater cost, I might add. I ran my kill-a-watt for a few days, and, IIRC, it uses something negligible like 7 cents a day. I'm looking at Emerson's webpage and they're quoting 6 cents per day at $0.0986/kWh, so my 7 cents seems about right in a more temperate climate but with electricity costs more like 15-16 cents/kWh. Their numbers say 0.6 kWh a day, so call it 219 kWh in a year. Or, in other words, 1/365th of my house's normal standby power consumption when nobody is home.
And I've got a tankless water heater for our main hot water supply, which of course saves orders of magnitude more than is 'lost' by the instant hot.
The instant hot also probably uses negligibly more energy than the other ways we heat water; microwave (for beverages) or running a burner on the stove on full blast for minutes at a time just to get a large pot of water up to boiling.
Of course, you can see from the fact that I bother measuring electricity of new devices with my kill-a-watt (and replacing all bulbs with LEDs at great cost) that I'm somewhat sensitive to electricity consumption. But I choose to spend my consumption where I think it makes the most sense.
Sorry but I can't understand your first paragraph since I can't figure out the pronouns and specifically what objects you're measuring/looking up the wattages of. Can you clarify?
Good points. I still use LEDs outside and they last a very long time. I suspect it's because they stay cool, which extends the lifetime of LEDs. I'm also not as sensitive to light quality for outdoor lighting.
It's easy to do the science behind the light quality from different bulbs, but no one seems to bother doing it. You just need to print a graph of the spectrum that the bulb emits and compare it to whatever your favorite bulb is like. Would love to have this information available on the box, but of course that would make comparison too easy.
Instead we're stuck with a simplification where a very high-dimensional property is reduced to a single number, and even that with low precision since few LED bulbs have emission spectra that resemble a black body.
Here's hoping TM-30-15 gets wide adoption. If every bulb listed its fidelity index, gamut index, and color vector diagram, you'd actually be able to tell which ones are awful.
The problem is that comparing a spectrum isn't actually meaningful. I really don't care about the spectral shape of the light if it's not perceptible to me. You want some sort of quantitative estimate of what you care about.
I'm not disputing the fact that there is a difference. But whether it's perceptible depends on where in the spectrum the differences are in relation to what you are looking at and the response curves of your cones. Not to mention your expectations.
I'm a pretty picky person when it comes to color when working with photos. But I don't have a single incandescent in my house and I can't even remember the time when I've been bothered by not being able to discriminate colors.
I have to agree. Every few years I'll try again with LEDs and energy efficient bulbs. I'll look online, buy a few different brands and specifically look for the ones with a "cool" colour temperature that should look natural.
In every instance the lighting was terrible. I really haven't yet found an energy efficient bulb that didn't make my place look like a jail cell.
My current place has halogen track lighting and I really like it.
Halogen is very much not a cool color temperature. If you're trying to match your existing halogen lighting, stick to lights labeled "warm white". If they list a color temperature, 2700K is a reasonable option, 3000K works in a pinch. Halogen bulbs are typically a bit over 3000K, but they'll shift warmer as they're dimmed, which is a pretty common use case.
The vocabulary is a bit weird, since "warm white" corresponds to "low color temperature". It's because we mentally associate blue with cold/ice and orange with fire, but the color temperature scales are based on the color an object glows as you heat it. To make a black body glow blue you have to heat it pretty damn hot.
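Wien's displacement law puts numbers on that: the peak emission wavelength of a black body is λ_max = b/T, with b ≈ 2.898×10^-3 m·K. A quick sketch:

    WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

    for T in (2700, 3000, 5500, 10000):
        peak_nm = WIEN_B / T * 1e9
        print(f"{T:>5} K black body -> peak emission at {peak_nm:.0f} nm")

A 2700 K filament actually peaks out in the near-infrared; you have to go well past daylight temperatures before the peak reaches the blue.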
You don't want "natural" lighting if you want it to look like an incandescent. "Natural", ie "day" light is very blue. You specifically want a non-natural, warm color temperature.
In architectural lighting, the reference spectrum depends on color temperature. Warm whites are measured against the black body spectrum, cool whites are measured against the daylight spectrum.
That's done because it's very unusual to get a black body radiator that hot on Earth, so if you're talking about something on the cool white end odds are high it's daylight or trying to approximate it, not a tungsten filament or a campfire.
If you are looking for "natural" indoor lighting (like for a bedroom or living room), you probably want a "warm"er color, in the range of 2400K to 2600K.
I don't know if the 20 year lifetime claim will pan out, but I replaced as many lights in my apartment that I could with LEDs a couple of years ago and haven't had a single one die yet, including in the bathroom, where I used to have to replace a bulb almost once a month.
Most photographers understand the color temperature differences between incandescent/tungsten light and daylight, and have to account for them by setting their camera's white-balance (manually, or automatically).
So it comes as a shock to me that a traditional incandescent bulb's coloring matches that of natural daylight.
I think this is a difficult concept for people because our brains do such a good job compensating for mixed lighting. Our built in auto-white balance is too good. One example I use to explain this is the classic Hollywood 'moonlight' trick, which is achieved by using a daylight balanced source like a large HMI, and then using film or a camera that is balanced for tungsten. Looks like daytime to the eye, but on camera it looks like moonlight.
There are even special gels just for correcting various light sources CTO (color temp orange) CTB (color temp blue) and minus green (correct gross flo lights). The rest are usually regarded as 'party colors' since they are for non-technical corrections.
You're conflating two concepts, color temperature and CRI. Together they roughly describe the spectrum output; color temperature tells you where the peak output is along the black-body temperature line, and CRI tells you how lumpy the response is. A low CRI can lead to metameric failure[1].
I was going to say the same thing. People are confusing "continuous spectrum" with "solar spectrum". Try comparing a "daylight" bulb with a normal incandescent. The color temperature of the sun is ~5500K. The daylight one will look insanely blue, very much like a "cool white" LED or fluorescent.
The people that are claiming that incandescent light is "natural" are out to lunch, imho. Maybe it appeals to the cave man inside them, because it's closer to the color of fire?
Our eyes can adapt pretty readily to differences in color temperature. What is harder to adapt to (or maybe theoretically impossible) is the non-continuous spectrum. Even if some things look "natural" under such lighting, other things won't, it will shift colors in weird ways. It depends on the pigment etc.
Yes, the spectral power distribution is smooth for any approximate black body radiator. That low temperature (2700K) incandescent and high temperature (5000K-6500K) daylight have a range of "color temperatures" doesn't negate the smoothness of the distribution. Human chromatic adaptation accounts for this difference automatically very nicely.
I guarantee you it's very straightforward to reproduce color mismatches with 5000K "daylight" fluorescents that have CRI 90 and higher, compared to actual daylight. The CRI is just short of complete utter nonsense. It's very useful in specific comparison contexts, but requires qualification. It's almost like having your kid come home with a report card that says B+ and you think, yeah, OK, not bad, kid! But then realize they got A's in everything except a D in history. Oops.
CRI is next to b.s. by itself, without further qualification. This is how we get craptastic lights where socks mismatch under a CRI 90+ light but not under daylight. The whole point of CRI is to try to turn a complex evaluation into a simple one, and encourages ignoring the dirty filthy truth about certain kinds of light sources, i.e. its emission lines. If you compare the spectral power distribution of incandescent and daylight (sun light on earth filtered through the atmosphere) they don't look anything alike except that both are very smooth spectra.
Whereas LED and fluorescent have very spikey spectra. We tend to not notice these spikes, but sometimes, depending on their location and depending on the materials being viewed under such a light source, they can have a pronounced effect that isn't predicted by CRI.
CRI has been a pretty hot potato for some time with a lot of scientists trying to fix it, butting up against a lighting industry that doesn't give a flying crap about accurate color reproduction, they just want a high number CRI so they can sell their shitty lights.
>CRI compares a light source to an ideal black body, not necessarily daylight.
Isn't the sun a pretty-close-to-ideal black body emitter? Is there something specific you are thinking of that makes daylight deviate significantly from black-body radiation? (Rayleigh scattering in the upper atmosphere?)
The sun provides a spectrum with a temperature of roughly 6500K, domestic lightbulbs produce something more like 2700K up to 3000K. The biggest difference between incandescent and CF/LED lights is that the spectrum from an incandescent bulb is quite smooth, whereas other lights tend to have a much spikier spectrum meaning you may see some odd colour effects from them and certain materials.
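You can see the smoothness straight from Planck's law; here's a quick sketch (mine) comparing normalized black-body spectra across the visible band at those two temperatures:

    import numpy as np

    def planck(lam_nm, T):
        # Black-body spectral radiance at wavelength lam_nm (nanometers).
        h, c, k = 6.626e-34, 2.998e8, 1.381e-23
        lam = lam_nm * 1e-9
        return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

    lam = np.arange(400, 701, 50)  # sample across the visible band
    for T in (2700, 6500):
        spd = planck(lam, T)
        print(f"{T} K:", " ".join(f"{x:.2f}" for x in spd / spd.max()))

Both curves vary gently across the band (the 2700K one just ramps toward the red); neither has the sharp spikes you see in fluorescent or phosphor-LED spectra.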
Indeed it is, but that doesn't mean anything illuminated by a blackbody will "match the hue of objects seen in natural daylight." CRI has much more to do with the ability to discriminate colors. The absolute hue is a strong function of the color temperature of the illumination, but the eye/brain is also very good at normalizing that away.
This is precisely why I don't think people complaining about "nasty cool light" are based in fact. The color of the environment around you will quickly be normalized to be neutral by your brain, and it will be impossible for you to tell whether you are illuminated by a 3000K or 5500K light. You can really only tell color differences accurately.
> 5 seconds of research into CRI says that the first sentence is false. CRI compares a light source to an ideal black body, not necessarily daylight.
I'll clarify this a bit further: At color temperatures below 5000K (warm/neutral whites), it's compared against an ideal black body. For 5000K and up (cool whites) it's compared against CIE Standard Illuminant D[1], representing a daylight spectrum.
Two interesting notes on CRI:
1) This makes CRI discontinuous at 5000K. Two bulbs with very similar color spectrums at 4995K and 5005K can have significantly different CRI scores.
2) CRI of 100 doesn't mean it's "perfect", it means it's the same as a black body emitter (or illuminant D for cool whites). There's some recent research suggesting that people prefer oversaturated reds relative to the black body emitter, but if you put more red in the spectrum it actually lowers your CRI.
As for the two quotes you pulled from the article:
> Traditional incandescent bulbs have a ‘colour rendering index’ rating of 100, because they match the hue of objects seen in natural daylight.
Yep, totally wrong. A tungsten filament bulb is effectively a black body radiator. Being well below 5000K color temperature, it's gauged against the black body spectrum. It's the same spectrum as the reference! Of course it scores well!
> Previously researchers have warned that the blue light emitted by modern bulbs could be stopping people from getting to sleep at night
This one's a more reasonable statement. Daylight doesn't keep you up at night because there's no daylight at night. The fact that you're running a blue pumped phosphor LED at 11 PM is the potential issue. They have a giant spike of blue in the emission spectrum[2]. Exactly how that affects melatonin levels or your circadian clock is out of my expertise, but it's definitely a real concern that we need to investigate and get a better handle on.
Yes, but the sun automatically goes down at night.
Most people don't turn off their lights at sunset. I believe what the author is saying is that if you have a lot of artificial light after sunset giving off a lot of blue light, it can interfere with your sleep.
You missed my point. The author simultaneously claimed that incandescent lights are like daylight, and that non-incandescent light is worse because it has lots of blue in its spectrum. All I am saying is that, given that sunlight contains significant blue light, these things cannot simultaneously be true.
There's insufficient evidence that the problem results from blue light specifically, rather than the absolute amount of light. I would not be surprised if any amount of light that results in mesopic, let alone photopic, adaptation were a sleep inhibitor. The human visual system is very adaptive, and we're not nocturnal animals, so I'd expect pretty much anything brighter than moonlight to be a potential trigger for "time to be awake".
In addition, not only is it an observed effect, but it can be harnessed. This study showed increase in salivary melatonin by using glasses that filtered lower wavelengths of light:
http://www.ncbi.nlm.nih.gov/pubmed/15713707
The effect is such a strong one that it was discovered despite not being one of the measurements in an unrelated experiment on breast cancer.
And throughout antiquity, the only light at night we were exposed to was that of fire, which is a longer-wavelength source.
I think the author has somewhat conflated the concept of CRI and color temperature in the article. An incandescent bulb or a candle has 100 CRI because they are almost by definition close to the ideal. That said a typical incandescent bulb is going to be somewhere in the 2500-3500K color temp. Lots of cheap flo bulbs tend to be both relatively low CRI and have a color temp in the 5000-6000K range which is more like 'daylight without benefits'. They tend to have lots of blue and greens and nothing else. Lots/most LED lights fall in this range as well. My personal opinion is that lighting products should be required to list CRI and color temp, just like nutrition information on food. :)
Yeah, there are fluorescent lights with very high CRIs (for example those in professional light booths) but they cost more because of the phosphor sets.
Daylight has a CRI of 100 because the sun is as close to an ideal black body as we're going to get. Perhaps absorption and reflection within the atmosphere lowers it a bit, but I doubt it's significant.
As for daylight containing lots of blue, I think that's the point - our bodies have adapted to recognizing blue as the defining characteristic of daylight, and keep us awake for it.
This article read an awful lot like a marketing puff piece and failed to answer some obvious questions. There is way too much "those evil fluorescent bulbs are making your life miserable" in the article.
1. How long are the bulbs going to last?
2. How expensive would they be to manufacture?
3. When could we see these in stores?
Oftentimes there is a long road from laboratory prototype to production, and the supposed cost savings in electricity won't matter if they only last 1,000 hours and cost $100/bulb.
Stop buying such shitty bulbs. I moved into my house 8 years ago and replaced the incandescents as they burned out, most within the first year. Of all the bulbs I've replaced them with, not a single one has failed. The only ones that have failed me are some that came packaged with a new light fixture. Those were the pits.
Worse, the bulbs in the lamps came from my previous home, and have been going strong for a decade now. I've been wanting to upgrade to LED bulbs when they finally burn out, but they've been too damn reliable.
There is a light at the end of the tunnel (hah) with a couple of bulbs that now won't turn on unless you tap them a bit. They're roughly a decade old and have been used for many hours, so I guess it's finally time.
It also depends on the quality of electricity in one's house. I've lived in places that chewed through CFLs at a rapid pace, and yet the same sourced bulbs would last for years in a different house.
Their aging properties are different from incandescents'. Incandescents tend to just fail when the filament breaks; there isn't much change in the spectral power distribution, so the overall stability of the light source is quite good. Whereas with fluorescent and even LED, the phosphors decay, so while it'll produce light for a longer time (probably), the quality of the light will suffer immensely, to the point that I think the longevity claims are approximately useless crap, unless of course you don't care at all about the quality of light.
Fluorescent bulbs have been superseded; LED bulbs are the best ones to get at the moment. There's very little reason to buy fluorescent bulbs anymore: LED bulbs last longer, are more efficient, and are easily affordable.
I don't know, this may be anecdotal, much like your experience. But I have three fluorescent bulbs that have been running for at least the past five years. One of them runs all night from dusk till dawn, even through New York winters; another is in the garage; and the third is above a stove and runs all night as well.
I wonder how much is also because they might be comparing apples to oranges: the guy with long-lived fluorescents sounds like he's using the traditional long tube kind, which are indeed known for very long lifespans, which is why they've been used in offices for many decades. The other guy complaining probably used CFLs, which are notorious for the cheap ones having very short lifespans, not because of the fluorescent bulb itself, but because of the cheap, crappy electronic ballast packaged inside. These bulbs almost always fail because of the electronics. On a traditional fluorescent system, the bulbs are just bulbs, with no electronics at all, and the ballast is a separate part that's built to last decades (the life of the fixture). Only the bulb is routinely replaced. When they went to CFLs, this changed, since they were trying to retrofit CFLs into sockets meant for incandescent bulbs, so they packaged a small electronic ballast into the base of the bulb assembly. And of course, to keep costs down, they massively cheaped out on the ballast electronics. With high voltages (needed to ionize the gas), it's easy for marginal components to fail early.
>> They are considered toxic waste with special clean-up and disposal considerations.
>That isn't true, it's one of those urban myths that spread by email.
It can be hazardous waste, you can't categorically say that it isn't true. Both of you are technically wrong. While it's correct that low mercury bulbs can be disposed of in dumpsters according to EPA, some states have stricter rules with respect to even low mercury bulbs. I believe CA, WA, and VT all require specific recycling of low mercury bulbs, for example.
Beyond that, looking at bulbs that are NOT low mercury, those DO become regulated hazardous waste after they burn out according to federal laws. So those MUST be recycled properly to comply with federal and state laws.
In any case, 100% of fluorescent bulbs contain mercury, which is really bad for the environment. So even if you're legally allowed to throw them in dumpsters, it's really not something you should be doing if you care about the environment or the people in it.
Bottom line: Depending on the bulb and jurisdiction, a spent/broken bulb might be classified as hazardous waste legally requiring special cleanup considerations, or it might legally be able to be thrown in a dumpster. But regardless of the law you should properly recycle 100% of bulbs because they contain mercury (even if in small amounts).
I tried to figure this out for my household...
1) The squirrel-cage blower on the HVAC unit uses about 600 watts.
2) Couldn't convince everyone in the house to turn off TVs when they leave their rooms.
3) Laundry (gas dryer, but again there is a motor on it).
4) Dishwasher.
5) Fridge/freezer.
I added up the cost of leaving the lights on in all the rooms (10 60-watt bulbs) 24x7, and that would only cost $43 a month. Typically only half would be on, and only for a few hours in the evening; 5 bulbs at 4 hours a night and 10 cents/kWh comes to $3.60 a month.
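For reference, that arithmetic as a quick Python sketch (the bulb counts, hours, and 10¢/kWh rate are the figures above):

    # Monthly lighting cost: bulbs * watts * hours/day * days -> kWh * rate
    RATE = 0.10  # dollars per kWh, as stated above

    def monthly_cost(bulbs, watts, hours_per_day, days=30):
        kwh = bulbs * watts * hours_per_day * days / 1000
        return kwh * RATE

    print(monthly_cost(10, 60, 24))  # worst case, 24x7: ~$43.20/month
    print(monthly_cost(5, 60, 4))    # realistic case: ~$3.60/month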
Some rooms have more than one light, but typically only one would be used. Examples:
Master bedroom -- 2 lamps, but one is mine, which is hardly ever turned on.
Kids' rooms -- one overhead hanging (swag) light each. Plus the closet light, but I'm not counting that, as it is already a low-voltage bulb (code requirement).
Living room -- the foyer has a hanging light with 3 candelabra bulbs, which are 20 watts each. The dining room (the other side of the great room) has a hanging light with 5 bulbs, again 20 watts each. So one room is 60 watts, the other 100.
Kitchen/family room -- overhead long-tube fluorescent (120 watts total); the family room has a torchiere (torch) lamp.
Bathrooms have multiple bulbs (4, 6, and 10 bulbs for the 3 bathrooms), but most of them are burned out and/or unscrewed. And they were 25 watt bulbs. So I'm counting an average of 60 watts per bathroom.
Basement has 6 bulbs total, but only a couple are used on the way to the laundry.
So slightly more than the equivalent of 10 60-watt bulbs for the lights that are used most often. At first I thought that I could save a lot with compact fluorescent bulbs, but I really didn't see any change in the electric bill after changing out all that I could -- the chandeliers take the candelabra bulbs, and while I could get fluorescents to fit, they would be way too bright. Same with the bathrooms: I didn't want 10 bulbs in one bathroom putting out the same lumens as 10 60-watt incandescents, and I couldn't readily find any fluorescent or LED bulbs equivalent to a 20-watt incandescent.
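A rough tally of the rooms above, for anyone checking the math; the wattages for fixtures the comment leaves unstated are my guesses:

    # Hypothetical tally; unstated wattages are assumptions, and the
    # torchiere is excluded since its wattage isn't given.
    loads_watts = {
        "master bedroom (1 lamp in use)": 60,
        "kids' rooms (2 x 60 W swag, assumed)": 120,
        "foyer (3 x 20 W candelabra)": 60,
        "dining (5 x 20 W candelabra)": 100,
        "kitchen (tube fluorescent)": 120,
        "bathrooms (3 x ~60 W average)": 180,
        "basement (2 x 60 W in use, assumed)": 120,
    }
    total = sum(loads_watts.values())
    print(total, "W, or about", round(total / 60, 1), "60 W bulbs")  # 760 W, ~12.7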
This article also doesn't mention the current U.S. law which says it is now illegal to sell incandescent bulbs across state lines. The law should have been written to mandate efficiency, not implementation. Now there must be a legal fight to allow incandescent bulbs again (for the U.S., at least).
> (C) Candelabra incandescent lamps and intermediate base incandescent lamps.--
> (i) Candelabra base incandescent lamps.--A candelabra base incandescent lamp shall not exceed 60 rated watts.
> (ii) Intermediate base incandescent lamps.--An intermediate base incandescent lamp shall not exceed 40 rated watts.
Anything over 60 watts was to become unavailable by 2012, with a schedule to retire the 40 and 60 watt bulbs. Ironically, this meant that to get the same lighting levels, you were permitted to buy more 40w bulbs, but not a single 100w bulb.
As to the "why cover US law?" question, I thought it relevant because the discovery was made by a US team at MIT, and for there to be a "return" of incandescent bulbs would require both US and UK/European markets to be opened back up. Even if only for the irony, juxtaposed with the location of the discovery, it was related.
Now the law as stated would seemingly permit incandescent bulbs of equivalent brightness to old 100w bulbs, but only if they are 27w or less. Or, you know, you could just buy a gaggle of 27 watt bulbs.
What is your point about a candelabra incandescent lamp?
That concerns only lamps which use a "candelabra screw base as described in ANSI C81.61–2006, Specifications for Electric Bases, common designations E11 and E12." Moreover, that is from a section which specifically allows them to be incandescent, so long as they don't exceed a given power limit.
You also seem to have missed the part where it says:
> The rulemaking— (I) shall not be limited to incandescent lamp technologies; and (II) shall include consideration of a minimum efficacy standard of 45 lumens per watt.
or the "Backstop requirement":
> If the Secretary fails to complete a rulemaking in accordance with clauses (i) through (iv) or if the final rule does not produce savings that are greater than or equal to the savings from a minimum efficacy standard of 45 lumens per watt, effective beginning January 1, 2020, the Secretary shall prohibit the sale of any general service lamp that emits less than 300 percent of the average lumens per watt emitted by a 100-watt incandescent general service lamp that is commercially available on the date of enactment of this clause.
This allows incandescent bulbs so long as they are significantly more efficient than those sold before 2007.
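For a sense of scale, a sketch of that backstop arithmetic, assuming a typical 100 W incandescent general service lamp emits roughly 1,700 lumens (a common ballpark figure, not from the statute):

    # Backstop threshold: 300% of the lm/W of a 100 W incandescent.
    baseline_lm_per_w = 1700 / 100       # ~17 lm/W (assumed ballpark)
    backstop = 3 * baseline_lm_per_w     # "300 percent" -> ~51 lm/W

    print(backstop)                      # ~51 lm/W
    print(15 >= backstop)   # False: traditional incandescent (~15 lm/W) fails
    print(90 >= backstop)   # True: a typical LED bulb (~90 lm/W) passes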
As to why UK newspaper writers are supposed to understand the details of domestic US law, and why UK newspaper readers are supposed to care, in order to get an extra moment of irony - well, that seems like a lot of work for very little gain. Especially when the EU has very similar laws, so UK readers may assume by default that the US is already in the same boat, leaving little to no irony available.
There seem to be many on sale here: www.amazon.com/b/ref=sr_aj?node=328865011&ajr=0
Maybe the order would fail at checkout if the relevant warehouse is outside of your state. (I once had an order for smoke detectors fail since they were illegal in California)
From the paper: "This experimental device is a proof-of-concept, at the low end of performance that could be ultimately achieved by this approach."
Somebody really needs to get MIT's PR department under control. The hype level is so high it's embarrassing to a good school. Especially in materials science articles. It's like reading the National Enquirer of science.
> The clinical white beam of LEDs and frustrating time-delay of ‘green’ lighting has left many hankering after the instant, bright warm glow of traditional filament bulbs.
Sheer garbage. LEDs come on instantly, and are available in various color temperatures thanks to their phosphor coatings: you can have them in 2700K. If your LED isn't coming on instantly, it has some problem with the power supply, probably because you bought the lowest bargain-bin crap you could get your hands on.
I've had Philips flood lights (3000K) in my kitchen and living room for several years. They come on instantly and put out a warmly colored light that is consistent from the center of the spot to the fringes.
And your apples and oranges are the wrong color, because the color rendition on them is so poor that Philips doesn't even publish the number. The only downlight I could find on their site has a CRI of 80 -- basically garbage. CRI is imperfect, but things glow green until you get to the low 90s.
I built a house a couple of years ago where the lighting is predominantly LED bulbs, and I absolutely hate it. The pre-determined lighting plan for the home was presumably based on the brightness given off by certain incandescent bulbs, and placing supposedly wattage-equivalent LED bulbs in those locations has resulted in a fairly dark house with what I would consider low-quality light. Many of the bulbs in certain fixtures also emit a noticeable high-pitched noise when on.
I have no idea what a good quality brand is. And if I did find better bulbs it would cost a fortune to replace all of the bulbs I just bought two years ago.
Sounds like somebody spent a lot of money without doing any research.
I must have tried samples of a half-dozen different brands before buying enough LED bulbs to retrofit all of my ceiling cans. Actually, I sort of wasted my time, because the (subsidized) $2.99 65W-replacement bulbs from the hardware store on the corner proved to be as good as any of them. No failures a year later (N = about 36), so they seem to be working out OK.
If they're on dimmer circuits, they could just be cheap LED bulbs. Even if the dimmer is all the way up, and even if the LED bulb is marketed as dimmable.
Didn't see it mentioned in other comments, but a particular failing of LED lamps is their unsuitability for critical color-matching applications. Fluorescents and CFLs are available for that use, but to date I haven't seen LEDs that are satisfactory.
This is important in art studios, museums, and other venues. For many years I've used specialized 48" T12 fluorescents with good results. Incandescent/halogen lamps are also essential because color selection can depend on the target environment; there's a world of difference between indoor illumination and daylight in this respect.
The high-efficiency lamps described in the article will likely be very useful and a welcome refresh of a light source that's so far been hard to emulate with LEDs.
With IR energy being pumped back into the filament, maybe it's an option to run a hotter filament emitting a higher color temperature. More likely, the input energy is just decreased so the resulting filament temperature remains in the conventional range. There still might be an issue with time to filament burnout not differing from conventional lamps, unless the filament is modified in some way.
In the long run I imagine LEDs will be improved, curbing the disproportionate blue output, while enhancing and smoothing the long end of the spectrum. LED dominance will probably be complete when color rendering is optimized, they work properly with electronic dimmer switches, and lamp envelopes produce illumination as diffuse as classic sources, e.g., like "frosted" incandescents.
So, if I understand dkbrk correctly, this is a glass that transmits light but reflects infrared. This seems to have a huge number of other applications (assuming it is practical). Make windows out of it, so that your house stays cooler in the summer and warmer in the winter. Keep your car cooler in the summer. Better face shields for fire fighters. Put a window in your refrigerator. Let people get closer to metal foundries/casting metal.
No, the bans were stupid to begin with and should be unwound. Instead, there should be highly granular electrical billing, if the electric company cares to really bill people properly for the overcapacity needed for that ~1 hour a day when demand spikes. I'd let them charge 1000% surge pricing if they wanted. People will absolutely alter their behavior, with no negative effect on people who can't afford it, so long as the typical cost per kWh is affordable.
It is interesting to think about how this commercial product might have to fight the anti-incandescent laws. Would this bulb be illegal in many states/countries right now?
No. Just let consumers choose. If I could have super-efficient incandescent bulbs, I would prefer them precisely because of the better colors and instant-on startup. (This assumes production ramp-up has kept these new incandescent prices well below that of LEDs)
That wasn't the previous solution, and I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns. I get the feeling the answer is no, which makes me wonder about the original reason.
> That wasn't the previous solution and I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns?
Yes, it was: the so-called "ban on incandescent bulbs" was actually an efficiency requirement which no existing incandescent bulbs met. Much of the research into high-efficiency incandescent bulbs (which IIRC actually had produced some bulbs that met it prior to the requirement going into effect, but which were not cost competitive with CFL or even LED bulbs) was directly spurred by the requirement.
> I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns?
I'm pretty sure that most of the people in the public who were fine with the government setting efficiency targets that could be met by a variety of technologies on the market (but not, at the time, cost-effectively by incandescent bulbs) would be just fine with moving the efficiency requirements up as new technologies become available.
At any given time, which particular industry players would support and oppose such a move would, of course, vary based on the relative efficiency of what each particular industry player was invested in.
Traditional incandescent lights have terrible efficiency, 2.5% or so. The benefit of LEDs over old incandescents is huge (5x to 10+x), but there is just not much room between the best LEDs and 100% efficiency, making the remaining gains less important. https://en.wikipedia.org/wiki/Luminous_efficacy
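The conversion behind those percentages, per the linked Wikipedia page: luminous efficiency is luminous efficacy divided by the theoretical maximum of 683 lm/W (monochromatic 555 nm green light):

    # Luminous efficiency = efficacy / 683 lm/W (the 555 nm maximum).
    MAX_EFFICACY = 683.0  # lm/W

    def efficiency(lm_per_watt):
        return lm_per_watt / MAX_EFFICACY

    print(f"{efficiency(15):.1%}")   # traditional incandescent (~15 lm/W): ~2.2%
    print(f"{efficiency(100):.1%}")  # a good LED bulb (~100 lm/W): ~14.6%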
PS: Power is kind of an odd thing. Direct costs are so cheap most people ignore it, but the external costs are high enough to push efficiency gains. Taxes would be the best solution, but politics is odd.
There you go. Set the "sales" tax rate on a luminous efficacy sliding scale. The lower the efficacy, the higher the tax %. It'd have to be some kind of wholesale/manufacturer tax, at least in the U.S., so that it's reflected in the sticker price before checkout.
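A minimal sketch of what such a schedule might look like; the brackets and rates here are entirely invented for illustration:

    # Hypothetical efficacy-based tax schedule; every number is made up.
    def tax_rate(lm_per_watt):
        """Lower luminous efficacy -> higher wholesale tax rate."""
        if lm_per_watt >= 90:
            return 0.00   # efficient LEDs: no tax
        elif lm_per_watt >= 45:
            return 0.10
        elif lm_per_watt >= 20:
            return 0.50
        else:
            return 2.00   # traditional incandescents: 200% tax

    print(tax_rate(15))   # 2.0  (a ~15 lm/W incandescent)
    print(tax_rate(100))  # 0.0  (a ~100 lm/W LED)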
It would not surprise me one bit if it were discovered the lobbying effort for such laws were pushed by the very lighting industry that needed a way to compel people to buy much more expensive light bulbs with a higher profit margin.
It reeks to me of the same reasons we have ethanol mandates for fuel in some cities.
I was and am perfectly fine with the old incandescents going away, and I'll be perfectly happy when LEDs are legally phased out in favor of something better.
>Researchers at MIT have shown that by surrounding the filament with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through.
Why can't/don't they use those same crystals on LED lights?
This was posted a while back and picked apart in the comments, as it has been here too. These are all just "potential" improvements, just like the "crazy new battery technologies" we've been hearing about for the past few years with next to nothing to show for it. Always good to see new tech and ideas, but let's not kid ourselves into thinking this will make its way back into homes anytime soon.
"Return of incandescent light bulbs" is an incredibly misleading headline that implies that due to this new tech, we're already starting to use these great new efficient bulbs- um, no.
Not to mention, the prototype/initial version is nowhere near the "potential" level of efficiency they are claiming.
> The clinical white beam of LEDs and frustrating time-delay of ‘green’ lighting
I hear a lot about the "delays" in incandescent alternatives, but all of mine are barely perceptible. My Flux bulb has something like a 400ms delay; it's noticeable, but how could it ever annoy me? How often were you trying to illuminate something with a light bulb within 400ms and missed?
And of course, many of the adjustable/smart LEDs can mimic incandescent light temperature with ease. They cost more, but not necessarily over their lifetime.
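A back-of-the-envelope lifetime comparison; the prices, wattages, and rated lives below are illustrative assumptions, not quoted figures:

    # Assumed: $5 LED (9 W, 15,000 h) vs $1 incandescent (60 W, 1,000 h),
    # electricity at an assumed $0.12/kWh.
    RATE = 0.12  # $/kWh

    def lifetime_cost(price, watts, life_h, horizon_h=15000):
        bulbs = -(-horizon_h // life_h)          # replacements needed (ceiling)
        energy_cost = watts * horizon_h / 1000 * RATE
        return bulbs * price + energy_cost

    print(lifetime_cost(5, 9, 15000))   # LED: ~$21 over 15,000 hours
    print(lifetime_cost(1, 60, 1000))   # incandescent: ~$123 over 15,000 hours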
"Green" and "energy saving" are often used to refer to florescent lighting in the UK. Florescent bulbs take some time to warm up before they reach full brightness, which is what they are complaining about. LED bulbs like your Flux don't have a delay.
Ah, thanks. The article does a poor job here, too, since it uses LED/CFL interchangeably. But if that's a known shorthand, it gets a pass on this one :)
Instant start is not what the comment you're replying to refers to. Instant start is about how an arc is struck to get the gas to glow initially. The comment above is referring to how the light gets brighter as it runs longer, well after the gas begins to fluoresce.
I know, I was not referring to instant start either. (I said instant on, so I can see how that might be confusing.)
The CFLs in my house are at full brightness immediately. I suppose a lab could measure a change in brightness, but it's nothing that is visible to the eye.
Zero-warmup fluorescent fixtures do exist, but I've never seen them installed in a home. They work by continuously running a small current through the filaments to keep the bulbs warm, which requires a separate always-on power feed in addition to switched power. Normal (instant-start) fluorescents start producing light instantly, and in sufficiently hot weather don't need to warm up, but if it's below freezing, expect less than half of rated brightness for the first five or ten minutes. It's worse for CFL bulbs than for straight tubes, and worse still for CFL bulbs installed upside-down, because the mercury settles.
So it's like the "stand by" mode of a tube amplifier, which keeps the filaments on.
Even though fluorescent lamps are called cold cathodes, "[a] cold cathode does not necessarily operate at a low temperature: it is often heated to its operating temperature by other methods, such as the current passing from the cathode into the gas." (Wikipedia, Cold Cathode).
Obviously, the tech you're referring to requires a special circuit, since an ordinary light socket goes completely open-circuit when switched off. (I.e., anyone who claims to have these zero-warmup CFLs in ordinary light fixtures is confused.)
I find that the slow warm-up time of regular CFLs is excellent for bathrooms. When you have to "go" in the wee hours of the morning, you don't want the full glare in your sleepy eyes.
Your notes about the ambient temperature effect on warm-up time are spot on. People who are not seeing the effect with CFL's may be living in a warm climate or well heated home.
Am I the only person that doesn't like "warm" light? I wish I could have all the lights in my apartment be D65 like my monitor. It makes everything look like a calm, cloudy day. At least with my monitor calibrated to that temperature, anyway.
There is still one that has been active for over 114 years, and it has a 'live' feed: http://www.centennialbulb.org/photos.htm (1 million hours, and it's not that efficient anymore)
Depends. If your electricity comes from coal/gas fired power stations, yes it is wasteful - just burn the fossils locally instead. If it comes from hydroelectricity, solar, nuclear, or wind, then no it is sensible.
Incandescent lights, in situations where heating would be used anyway, are 100% efficient.
They provide 3% light output and 97% heat output. A single one, turned on, provides a great amount of heat and can easily provide spot heat where humans are.
Instead there's CFLs and LEDs. And CFLs are a great way to spread mercury pollution across a wide area. Snopes has a decent article about hazards and response: http://www.snopes.com/medical/toxins/cfl.asp
Direct electric is a really inefficient way of heating anything. I ripped out a direct electric heater system and replaced it with a ground-source heat pump, which gives 4 kW of heat for every 1 kW of electricity (used to run the pump and compressor, etc.), even in the cold depths of the Swedish winter.
It's true that where heating would be used, if the bulb is mounted in such a way that all that heat is available (i.e., not recessed into a ceiling fixture), the 97% is useful. (But not necessarily maximally efficient, because if you wanted heat in the first place, you'd be better off not turning your energy into electricity, a process with ~50% efficiency at the power plant, and then incurring further distribution losses.)
On the other hand, if cooling would be used, it's even less efficient, because now you have to cool away all of that 97%.
The electricity for my lights comes from thermal power plants with maybe 40% efficiency and then loses a little bit more in transmission. If I use LED bulbs, then that heat is replaced by my natural gas furnace, which is somewhere around 90% efficient. Net result, even in the dead of winter it saves energy to use more efficient bulbs. In summer, when I'm using electricity to cool my house, the difference is even greater.
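The arithmetic behind that, in units of fuel energy burned per unit of heat delivered into the house (using the 40% plant and 90% furnace figures above, and ignoring transmission losses, which only make the electric case worse):

    # Fuel burned per joule of heat delivered into the house.
    PLANT_EFF = 0.40    # thermal power plant, as stated
    FURNACE_EFF = 0.90  # natural gas furnace, as stated

    fuel_per_heat_electric = 1 / PLANT_EFF   # 2.5: resistive heat via bulbs
    fuel_per_heat_gas = 1 / FURNACE_EFF      # ~1.11: gas furnace

    print(fuel_per_heat_electric / fuel_per_heat_gas)  # ~2.25x more fuel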
Resistive heating with electricity is just terribly inefficient. If you have to do it anyway (because you need to run equipment, and that equipment is already as efficient as it'll reasonably get) then you can benefit a little by ensuring that this heat is put somewhere it's wanted. But if you don't have to do it, don't do it.
CFLs suck, but LEDs are great. I fear that the CFL interregnum may have hurt the cause of efficient lighting in general.
I have both. There's basically no difference, except the LEDs are hot on the base and CFL's are hot on the bulb.
Power consumption is identical, CRI is identical, and cost per hour (before failing) is basically identical (assuming the hours listed on the package are the truth).
> I fear that the CFL interregnum may have hurt the cause of efficient lighting in general.
It has not. I've been using CFLs for about 20 years (almost since the day they came out), and they work just fine.
CFLs take longer to come on, don't reach full brightness immediately, are less efficient, don't last as long, and contain mercury.
I don't know what your personal statement about using CFLs for 20 years has to do with my fear about the cause of efficient lighting being hurt. Your own preferences don't necessarily reflect those of people in general.
I don't know what to say, besides that your statements run contrary to my experience where they conflict, and I don't understand why you think it's irrelevant that CFLs die quicker and are more poisonous.
A whole lot of people experienced CFLs and came away with the conclusion that they suck. I fear that they will assume LEDs also suck.
> I don't know what to say, besides that your statements run contrary to my experience where they conflict
Did you buy CFLs recently or only more than 10 years ago? What brand? I've had excellent results with GE and GreatValue. Mixed results with Feit.
> and I don't understand why you think it's irrelevant that CFLs die quicker
Because I don't (yet) trust LEDs to live as long as they say. So that advantage doesn't exist as far as I'm concerned.
> and are more poisonous
Because it makes no difference in actual use. You can toss them in the trash if you want, or take them to Lowes/Homedepot to recycle. The amount of mercury is too small to hurt anyone if they break.
Sure, but when air-conditioning would be used that equation flips around. Not only do you lose electricity as heat, you need to use more electricity to counteract that heat.
And in most places, far more money is spent on cooling than heating. Even in the frozen wastelands of Minnesota where I live, air-conditioning costs over the summer are about equal to heating costs over the winter.
As others have mentioned, heat pumps and gas furnaces are much more economical than resistive heating via incandescent lights.
"Abstract
Energy demand for climate control was analyzed for Miami (the warmest large metropolitan
area in the US) and Minneapolis (the coldest large metropolitan area). The following relevant
parameters were included in the analysis: (1) climatological deviations from the desired indoor
temperature as expressed in heating and cooling degree days, (2) efficiencies of heating and
cooling appliances, and (3) efficiencies of power-generating plants. The results indicate that
climate control in Minneapolis is about 3.5 times as energy demanding as in Miami. This
finding suggests that, in the US, living in cold climates is more energy demanding than living
in hot climates."
Of course, the bulbs would add 97% waste heat when heat isn't wanted. But most places here in the US have some sort of heating system for the winter; taking advantage of the bulb heat would pretty much require changing out the bulbs seasonally.
During sleeping? Scraping the bottom of the barrel much?
I never said that incandescents would eliminate heating, only that it is 100% efficient when used appropriately.
I think it's a fair point; these bulbs are not used for heating because that's not their primary function (despite being more efficient at it than lighting). The inverse would apply to, say, a fireplace. Light is an acceptable (even desired) side effect of the primary function, but it would not be convenient to rely on it for that side effect.
And here is the actual paper, via Sci-Hub: http://sci-hub.cc/10.1038/nnano.2015.309