Hacker News

CRI measures how close (using a terrible metric) oranger lamps get to a black body radiator, or how close bluer lamps get to an approximation for daylight from the 60s, which each have a CRI of 100 by definition. Incandescent bulbs are a black body radiator, so unless you put some kind of filter in front, they’ll have a CRI of 100. It’s not a 0–100 scale though. 100 is an arbitrarily selected number.

There are reasons to be dissatisfied with the light produced by LEDs and fluorescent lamps, but CRI is only very marginally useful.

Better is to look at the spectral power distribution, and try to find lamps with a broad emission spectrum and no sharp spikes. For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.
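That advice can be checked numerically. Below is a hedged sketch: given a lamp's spectral power distribution (SPD) sampled on a uniform wavelength grid, it estimates what fraction of the power is emitted below 500 nm. The two-Gaussian "white LED" shape is made-up illustration data, not a measurement; with a real lamp you would load the datasheet SPD instead.

```python
import numpy as np

wl = np.arange(380, 781, 5)  # wavelengths in nm, uniform 5 nm grid

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Typical white-LED shape: a sharp blue pump near 450 nm plus a broad
# phosphor hump around 560 nm (toy numbers, not a real datasheet SPD).
spd = gaussian(wl, 450, 10) + 0.6 * gaussian(wl, 560, 50)

def fraction_below(spd, cutoff=500):
    """Share of total power at wavelengths below the cutoff (uniform grid)."""
    return spd[wl < cutoff].sum() / spd.sum()

print(f"{fraction_below(spd):.0%} of this lamp's power is below 500 nm")
```

The sharp 450 nm pump is exactly the kind of spike the comment above warns about: even when it carries a minority of the total power, it sits squarely in the sub-500 nm band.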




CRI measures how natural objects’ colors appear under an artificial light source, compared with the same objects’ colors under a natural light source (standardized daylight).

The color temperature and luminosity of natural light sources vary tremendously over the course of the day and in different parts of the world, yet objects’ colors remain consistent. This is a property of human vision called chromatic adaptation. https://en.wikipedia.org/wiki/Chromatic_adaptation

Sources which emit a continuous spectrum have better CRI than tri-color sources, especially on artificially colored objects (dyed/pigmented) due to metamerism. https://en.wikipedia.org/wiki/Metamerism_(color)


If you want to get technical, the algorithm for computing CRI is:

- take a specific set of arbitrary paint chips, and record their colors when illuminated by the target lamp (using the CIE 1931 2° standard observer, and the 1960s CIEUVW color space)

- perform an outdated and not very effective type of chromatic adaptation transformation so that the white point matches either a black body radiator (like an incandescent bulb) for lamps with a correlated color temperature (CCT) below 5000 K, or a point on the “Illuminant D” series of approximations to daylight for lamps with CCT above 5000 K

- measure the distances in UVW space between those target colors and the colors of the paint chips as illuminated by the reference black body radiator or D series illuminant.

- sum up all the color distances, multiply by an arbitrary factor, and subtract from 100.
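The steps above can be sketched in code. This is a toy illustration, not the official procedure: the real computation (published as CIE 13.3) needs the CIE 1931 2° observer tables, the reflectance data for the 8 standard test-color samples, a reference illuminant matched to the lamp’s CCT, and the von Kries-style adaptation step, all of which are stubbed out or replaced with made-up numbers here.

```python
import numpy as np

def uvw_from_xyz(X, Y, Z, Xr, Yr, Zr):
    """CIEUVW (U*, V*, W*) coordinates relative to a reference white.
    Y is assumed scaled so the reference white has Y = 100."""
    def uv(X, Y, Z):  # CIE 1960 (u, v) chromaticity
        d = X + 15 * Y + 3 * Z
        return 4 * X / d, 6 * Y / d
    u, v = uv(X, Y, Z)
    u0, v0 = uv(Xr, Yr, Zr)
    W = 25 * Y ** (1 / 3) - 17
    return np.array([13 * W * (u - u0), 13 * W * (v - v0), W])

def special_cri(xyz_test, xyz_ref, white_test, white_ref):
    """R_i = 100 - 4.6 * dE in UVW space for one paint chip
    (the chromatic adaptation transform is omitted in this sketch).
    The general CRI, Ra, is the mean of R_i over the 8 standard chips."""
    a = uvw_from_xyz(*xyz_test, *white_test)
    b = uvw_from_xyz(*xyz_ref, *white_ref)
    return 100 - 4.6 * np.linalg.norm(a - b)

# A chip that measures identically under the test lamp and the reference
# illuminant scores a perfect 100 (toy XYZ numbers, not real chip data):
white = (95.0, 100.0, 108.0)
chip = (40.0, 35.0, 30.0)
print(round(special_cri(chip, chip, white, white)))  # -> 100
```

The 4.6 multiplier is the “arbitrary factor” mentioned above; it was chosen so that a standard warm-white fluorescent lamp of the era came out near Ra = 50.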

Everything about this process was totally ad hoc and arbitrary 50 years ago when it was first invented, and is absurdly outdated today.

There was an attempt at the CIE to replace the CRI with something better in IIRC the late 1990s, but the lighting industry didn’t want to follow the recommendations of the color scientists, so it fell apart. (I could be misrepresenting what happened; I’m not an expert in CIE politics or the history here, and haven’t ever researched it in detail.)


> For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.

Moonlight has significant energy at wavelengths below 500 nm [1]. Does this light disrupt the sleep cycles of other primates, or is this something that developed in humans only after we invented "indoors" and started sleeping there away from sources of bluish light?

[1] http://www.olino.org/us/articles/2015/10/05/spectrum-of-moon...


Yes, if you stare directly at the full moon, it will be harder to see into the shadows for a while afterward, until your eyes adapt back to the dark, and it can probably (I’m not sure this has been scientifically tested) push back the point at which you start feeling sleepy.

In general, the moon is high overhead, so it won’t be directly in your field of view while it lights up your surroundings.


I would suppose that the difference is having these <500 nm wavelength light sources pretty much right up in our faces.


> try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.

Citation please. What does “below 500 nm” mean? 400 nm or 600 nm? 400 nm is violet, 600 nm is orange. Which of those “knocks out” what?


Light in the short-wavelength part of the spectrum, especially bright sources of glare, saturates the “rod” light detectors in the eye, causing vision to become bright-adapted. Adapting back to fully capable night vision takes something like 20–30 minutes.

Additionally, beyond rods and cones, there is a third set of light detectors in the eye, the “photosensitive retinal ganglion cells”, which regulate melatonin and sleep cycles. These are particularly sensitive to light in the 400–500 nm range. Looking at such a light source at night can suppress melatonin production until about an hour after the exposure stops. This is why smartphone, television, and computer displays are so disruptive to sleep when used before bed.
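That 400–500 nm sensitivity suggests weighting a lamp’s spectrum rather than applying a hard cutoff. Below is a hedged sketch: the Gaussian “sensitivity curve” peaking near 490 nm is a crude stand-in I chose purely for illustration; the standardized curve for this purpose is the CIE S 026 melanopic action spectrum, which you would substitute for real work.

```python
import numpy as np

wl = np.arange(380, 781, 5)  # wavelengths in nm

# Crude stand-in for ipRGC (melanopic) sensitivity: a Gaussian near 490 nm.
# This is an assumption for illustration, not the CIE S 026 curve.
melanopic_approx = np.exp(-0.5 * ((wl - 490) / 40) ** 2)

def melanopic_content(spd):
    """Fraction of a lamp's power weighted by the approximate ipRGC curve."""
    return (spd * melanopic_approx).sum() / spd.sum()

# A warm source centered near 600 nm stimulates ipRGCs far less than a
# cool source centered near 460 nm (both are toy Gaussian SPDs):
warm = np.exp(-0.5 * ((wl - 600) / 60) ** 2)
cool = np.exp(-0.5 * ((wl - 460) / 30) ** 2)
print(melanopic_content(warm) < melanopic_content(cool))  # -> True
```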



