Hacker News

HDR content is pretty damn mainstream and uses 10 bits per channel for 30-bit color. Some variants can do 12 bits per channel, but I believe this isn't often used.

Moreover, this content often comes with dynamic tone mapping, which gives a movie access to many more colors overall while remaining limited to 30-bit color per frame (or, usually, per scene).
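To put those bit depths in perspective, here's a quick back-of-the-envelope sketch (just 2^bits arithmetic, nothing HDR-specific):

```python
# Distinct levels per channel and total colors for common bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits   # values per channel
    total = levels ** 3  # three channels (R, G, B)
    print(f"{bits}-bit: {levels} levels/channel, "
          f"{total:,} colors ({3 * bits}-bit total)")
```

So 10-bit gives 1,024 levels per channel and ~1.07 billion colors, versus 256 levels and ~16.8 million colors for 8-bit; 12-bit pushes that to ~68.7 billion.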




> Some variants can do 12 bits per channel, but I believe this isn't often used.

Dolby Vision has 12-bit colour channels, but it's reportedly not as great a leap as going from 8-bit to HDR.

iPhones (12 onward) apparently support this standard, so anyone with such a device can see it for themselves.


Dolby Vision can use 12-bit channels, and they are proud of it, but it also has a 10-bit mode, and that is what's used essentially everywhere.





