I love how, if you do view it accurately, it just makes every other white look washed out and grey in comparison. Our visual system is just one big comparison machine.
It's really striking if you have a projector. Before you turn on a projector in a room that isn't dark, a white wall looks pretty white. Once the projector is on, that "pretty white" wall is now playing the role of black in the projected scene.
This is actually super interesting: for displays that can get bright enough for HDR content, but do not have self-illuminated pixels/local dimming zones/any other mechanism to display very dark content alongside very bright (like, say, most modern iPad and Mac screens), this is exactly what is happening.
macOS/iPadOS cranks the backlight brightness up but then adds a black filter to the non-bright content to dim it back to “normal” levels.
Yeah, it seems to auto-dim the "SDR" brightness range a bit too much; the jump in the white background brightness ends up being clearly noticeable when e.g. I'm switching to and from the superbright QR code tab in the browser.
Next time you see a scene in a movie where people enter a house and you can see both the interior and exterior, think about the 1000x ratio and how much artificial light was needed inside the house to balance the lighting in the shot. (Assuming it's a real sunlit exterior and not a sound stage, of course.)
In the modern digital era, that has actually changed quite a bit. Now cameras are really sensitive – the equivalent of what used to be 2500 to 4000 ISO is now really common – so instead of adding light, often you have to flag off light in areas you want dark.
So instead of many bright lights you just have duvetyne and flags everywhere, which many DoPs have complained is harder to work with.
A stop is double the amount of light. It's useful for equivalence across different ways of changing exposure. So if I halve my shutter speed, I’ve lost a stop. But if I open my aperture from f/4 to f/2.8, I’ve gained that stop back. Or I can double the sensitivity of my film or sensor (e.g. ISO 400 to 800).
So yes, in terms of data, if using a linear encoding, stops are exactly equivalent to bits. This is why log gamma curves are used to store, e.g., 16 stops of light information in a 12-bit format.
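If it helps, the same arithmetic as a quick Python sketch (function names are just mine, for illustration):

    import math

    def stops_from_shutter(old_time, new_time):
        # Doubling the exposure time doubles the light: +1 stop.
        return math.log2(new_time / old_time)

    def stops_from_aperture(old_f, new_f):
        # Light gathered scales with aperture area, i.e. 1/f^2,
        # so f/4 -> f/2.8 is roughly +1 stop.
        return math.log2((old_f / new_f) ** 2)

    def stops_from_iso(old_iso, new_iso):
        # Doubling the sensitivity is +1 stop.
        return math.log2(new_iso / old_iso)

    print(stops_from_shutter(1/100, 1/200))  # -1.0 (halved the shutter time)
    print(stops_from_aperture(4, 2.8))       # ~ +1.03 (opened up a stop)
    print(stops_from_iso(400, 800))          # +1.0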
Imagine light represented as a floating point number of the type a * 2 ^ b
Stops describe the range of the exponent b.
With a typical cinema camera, max_b > min_b + 14
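Or, in the same spirit, a tiny sketch of how stops map to that exponent range (the numbers are just examples):

    import math

    def stops_of_range(max_luminance, min_luminance):
        # Dynamic range in stops is the log2 of the contrast ratio,
        # i.e. how far the exponent b can swing.
        return math.log2(max_luminance / min_luminance)

    print(stops_of_range(1000, 1))  # ~10 stops for a 1000:1 scene
    print(2 ** 14)                  # a "14+ stop" camera covers a >16000:1 ratio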
You can use tricks like gradient ND filters to increase the perceived range within a frame, or variable NDs to slowly adjust over time (like our eyes do)
Sorry for self replying but I just had a look at the repo and it's definitely worth fully automating this.
A js/python snippet converting pngs to superwhite video frames should be fairly easy to implement.
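Something roughly like this could do it (an untested sketch that just shells out to ffmpeg; the exact flags may need tweaking for your build, and Safari generally wants hvc1-tagged HEVC):

    import subprocess

    def png_to_hdr_video(png_path, out_path):
        # Wrap a PNG in a one-frame, 10-bit HEVC clip tagged as BT.2020 + PQ,
        # so that full-white pixels get rendered as HDR white.
        subprocess.run([
            "ffmpeg", "-y",
            "-loop", "1", "-i", png_path,
            "-frames:v", "1",
            "-c:v", "libx265",
            "-pix_fmt", "yuv420p10le",
            "-color_primaries", "bt2020",
            "-color_trc", "smpte2084",   # PQ transfer function
            "-colorspace", "bt2020nc",
            "-tag:v", "hvc1",            # helps Safari/QuickTime recognize it
            out_path,
        ], check=True)

    png_to_hdr_video("qr.png", "qr-hdr.mp4")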
You also only really need the video to push the browser to do things in HDR. You can’t express a CSS colour way outside the normal range, but blending calculations are not clamped, so they can be used to get the colour. https://github.com/kiding/wanna-see-a-whiter-white was posted here a while ago and demonstrated the technique.
I'm not sure why you're being downvoted when that's exactly what's happening. The QR code comes from the mask-image in the inline-style, not the video.
It’s already in use at washingtonpost.com. About 4-6 months ago (IIRC) they had some superbright inline ads that made me doubt my eyes - I couldn’t understand how they were so vivid. Now I know how they did that.
This comment makes me wonder why ad companies haven't already used this to show ads. They are notorious for using any technology to grab users' attention.
This HDR QR code currently (AFAIK) only works on Apple platforms, where a user already has an option to long-press a QR code in place. Maybe going forward we should spread easy in-place interactions instead of spreading a different feature (HDR) and then abusing it?
It works by embedding a video element with a one-frame mp4 file. That would technically turn the ad into a "video ad". I can imagine that video ads already have tighter restrictions in ad networks today. At least I'd hope so...
HDR content still explodes my M1 MacBook. The cursor jumps into the corner of the screen, triggering Exposé, so I have to move out of the corner and back into it to undo the explosion. Now the HDR video is as bright as the sun while everything else looks washed out. So I close the HDR content and have Exposé trigger again. I don't know whose fault this is, but I despise HDR content because of that.
I think one possibly underused utility of HDR is to have more saturated bright colors in various contexts. I would love a more saturated light blue color for text than is normally available in sRGB, just because #0000ff is too dark and I would like to just crank up its brightness instead of mixing in green and red. Think syntax highlighting or terminal colors.
What you're describing is a more intense (brighter, higher energy) blue, not a more saturated one. For more saturated colours, you need different RGB primaries than the ones defined in sRGB. The DCI-P3 colourspace used in HDR displays is exactly that, offering something halfway between sRGB and Adobe RGB. Of course it pales in comparison to filmic colour spaces like ACEScg, which can represent almost the entire visible gamut.
Thanks! Maybe I didn't express myself clearly, but what I meant is more saturated "light blue" than what's available in sRGB, with the same brightness. For me #0000ff doesn't qualify as "light blue", so the point of comparison would be something like #aaaaff.
If you're using a DCI-P3 (wide-gamut HDR) display, then #0000ff (or #aaaaff) in that display's colour space will be emitted as more saturated blue light than what an sRGB display is capable of. I am not sure if you will perceive them as exactly equally bright; brightness is subjective and depends on the reaction of your retina to the light hitting it, which is why #0000ff (blue) appears darker than #00ff00 (green). However, the objective energy of the emitted light should be equal between displays of different gamuts, i.e. #ff0000, #00ff00, #0000ff should all result in the same amount of light being emitted on every correctly calibrated display regardless of the RGB primaries (aka gamut) it uses. The whole colour space topic is pretty deep; if you want to learn more about it and how colours are represented and converted between spaces, I encourage you to go and check out https://www.colour-science.org/ and the linked resources.
Mixing in green and red is equivalent to brightening #0000FF with added white. Try #1AFFFF, for example, it's a magnificently bright light blue. You've not muddied it with brown as you would have by adding an equal amount of green paint and a tenth of red paint; RGB illumination color spaces don't work that way.
It still makes it less saturated though and makes it pale blue instead of what it could have been by just really cranking up the light intensity behind that blue subpixel.
The author says that it can’t be represented with CSS, but I believe HDR is supported in Safari within Display P3, where you can give a value higher than 1.0?
"HDR" contains multiple components:
- Electro-Optical Transfer Function. That's the famous "gamma" of the display which is no longer a gamma
- color space (which real-life color does RGB = 100%, 0%, 0% match)
- Absolute brightness. There is metadata in the video files saying that RGB = 100%, 100%, 100% means, say, 2000 cd/m² (plus things like the average frame brightness, e.g. 10%, which helps scale for displays not capable of 2000 cd/m² everywhere)
- Increased bit depth
Display-P3 is only the color space (and maybe the EOTF). This video uses the "absolute brightness" feature of HDR to increase brightness.
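For a feel of the "absolute brightness" part: the PQ transfer function maps absolute nits to code values. A rough Python sketch of the inverse EOTF (constants quoted from ST 2084 as I remember them, so double-check before relying on this):

    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    def pq_encode(nits):
        # Absolute luminance (cd/m^2) -> 0..1 signal; PQ tops out at 10,000 nits.
        y = max(nits, 0.0) / 10000.0
        ym = y ** m1
        return ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2

    print(pq_encode(100))    # ~0.51: roughly where SDR-ish white sits
    print(pq_encode(1000))   # ~0.75
    print(pq_encode(10000))  # 1.0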
They've made the odd choice of placing the QR code as a mask on the video via CSS, rather than positioning a black & transparent image over the video (which would allow long press).
Not sure why they've done this, perhaps something about the HDR approach requires both the black and white to be a part of the same video render.
Also - I'm not sure if it's a browser default or some other website CSS but imo there's no real reason long press shouldn't work on a video anyway... videos need accessibility too.
I am on an iPhone 13 and to my surprise I actually saw the HDR QR code displaying way more brightly than the rest of the page, and sure enough when I turned my power saving mode on, it went back to looking like the other non-HDR QR code. Impressive! It will be interesting to see if CSS ever gets full support for it.
When I look at the two codes side-by-side with normal (full) brightness, the HDR one looks quite a bit brighter. If I dim my screen brightness substantially, the HDR one also gets a lot dimmer. Not sure if this solves that problem, though it could be an issue with my particular screen?
I will never understand HDR. A friend has tried to explain it many times to me, but I always fail to get it. A display just has so much dynamic range (from full light off to full light on). HDR can't give you more range, so what does it actually do?
The only thing I've seen it do is override my brightness setting to make my screen go to full brightness when I've set it lower.
There are generally TWO things going on, and we use one term. I think that’s because generally displays do both so it’s just easier.
One is that HDR displays can usually get MUCH brighter. An iPhone 5 could do 500 nits. An iPhone 14 Pro can do 1,000 - 2,000 nits.
That’s what’s used here. You get all the same colors as before, but max brightness is way brighter.
The other thing is color depth (usually bundled with a wider gamut). The standard we had for a long time was 24-bit color, so ~16.7 million colors. Instead of having 8 bits per channel, screens may now have 10 or 12 bits per channel. I can’t find just how many a modern iPhone has.
This means there are more shades between black and 100% bright red. There are more variations between blueish green and greenish blue. Gradients can be smoother. Objects that are mostly one color (yellow corn, a red apple, etc.) now have more options they can use to provide definition and details.
In a very dark scene, there are now more dark shades to show things with. When looking at bright clouds, they don’t have to be all white and washed out.
And combined with the increased brightness a scene can show definition in both dark and bright areas without having to wash everything out.
To clarify slightly - the max brightness is the real feature, and everything else, including more bits per channel, is just a bonus used to describe colours that are dimmer or brighter in more detail than was possible before. Essentially more bits per channel means greater precision, but the real game changer is how the range has expanded: you can display more detail in dim scenes, more detail in bright scenes, and, very key for the HDR effect, have both very bright and very dim content displayed next to each other. As a side effect, just like how you don’t want your seats to rattle in a movie theatre every time music plays, there’s a difference in visual brightness between “normal” sRGB content and brighter (or dimmer) than usual P3 content. This QR code trick exploits that difference.
BTW there are, or at least were, wide gamut only displays.
I have one Dell made a few years ago. It supports the additional colors of P3 but isn’t any brighter than any other quality normal display of the time.
Technically, then, that display is like OLED TVs pre-2018, which could only display about 70-85% of P3 wide gamut because they could not go bright enough across the entire display. They were amazingly bright if a small part of the panel needed to be lit, though, with excellent contrast ratios. By comparison, most LED panels that are wide gamut but don’t have dimming zones can only display a uniform brightness or dimness and often struggle to make colours pop against darker backdrops the way they do on MacBook screens (with dimming zones) or iPhones (with OLED screens). Point is, just because something is wide gamut doesn’t mean it can display the entire gamut nor does it mean it can display the entire gamut at the same time. Its quality can vary by image and lighting technology (since the quality and brightness of colours all rely on their lighting source, etc.) as well as the image content demanded to be displayed and the lighting of the room (since your eyes have to perceive the screen too, against whatever backdrop or lighting your room has).
The way it is implemented on recent Apple devices is that the maximum brightness your display hardware is actually capable of is much higher than the color #FFFFFF at the maximum brightness available in settings. In other words, there's quite a lot of headroom that is only ever used for HDR content. This results in quite a strange sight of something (a QR code in this case) being much brighter than all of its surrounding UI.
That's odd, though, right? Why would they constantly cap the brightness of my screen, which I could really do with in bright light, just in case I watch some HDR content at some point?
If you applied HDR-white to all the UI on your screen, it'd likely lose brightness, because most HDR displays have limits on sustained brightness that are far lower than the peak brightness they can achieve on a small portion of the screen. HDR video doesn't really have this problem because people are mastering their HDR video to be significantly darker than SDR[0].
I will say that it is really annoying to have an HDR video just be way brighter for no reason, and I kind of hate HDR for this and this alone.
[0] This is also why streaming services have shows that are WAY TOO DAMNED DARK. Related: the people mastering the audio have also decided to make all the dialogue way too low because fuck people with hearing disabilities[1].
[1] An audio engineer was asked about this and he outright said he doesn't master for substandard audio setups. No I don't remember the source, it was from one of those articles that show up on the Firefox new tab page. Yes I am kind of reading into things and getting angry about it.
"The only platform I’m interested in talking about is theatrical exhibition."
and
"“We made the decision a couple of films ago that we weren’t going to mix films for substandard theaters... We’re mixing for well-aligned, great theaters... At a certain point, you have to decide if you’ve made the best possible version of the film and you’re trying to account for inadequacies in presentation... That’s chasing the tail. It doesn’t work. I will say, with our sound mixes, we spent a lot of time and attention making sure that they work in as predictable a way possible."
Doesn’t that happen because shows are mastered for 5.1 systems and the dialogue is put in the centre channel, whereas most of us are watching in stereo with poorly set-up automatic remixing?
On my MacBook I have VS Code on the left and Chrome on the right. When I click on this link with the right side of my screen covered up, I can visibly see VS Code on the left get a couple of shades darker (animated over a second or so).
So it looks like it's not only a question of hidden max brightness, it looks like the device adjusts the whole screen to enhance contrast for the HDR content as part of the strategy.
Their idea is that you won't ever need your UI to be this bright, I guess? It's Apple, that's what they do — they build things that work optimally for most people. They aren't wrong about it in this particular case either. The MacBook Pro display does get bright enough for me as is to be readable in direct sunlight.
But if you do want to "use the full potential of your hardware", there was some third-party app that used private APIs to set the screen brightness above that limit. I don't remember its name.
This sounds incredibly stupid to me, giving some unspecified range of content providers access to things that users don't have access to. But that's Apple for you I guess.
The problem is when this was first rolled out no content was designed for it.
So if they had just mapped everything to the new brightness, everything, everywhere, would look wrong. And people would complain that the iPhone is broken. And developers would have to redo all their websites/apps. And when they did, those would look wrong on every other device.
This is the only sane way. It has to be something people opt-in to. That’s what Apple did.
The colors are meant to be within a calibrated (sRGB) color space. Images and videos can request to be in a different color space which includes the HDR range. CSS can also request colors outside that space, but that is done by using extended RGB triples (e.g. RGB(999,999,999) for ultra white).
The difference is while most things support sRGB, those other color spaces may just be outside what the display can handle. My Ultrafine 5k for instance does not show a discernible difference between the two QR codes.
You also have the issue that static images displayed at higher brightness will use more power and require quicker mitigations to prevent burn-in, so an 'ultra white' background may just not be something supported for a web page.
> The colors are meant to be within a calibrated (sRGB) color space.
If this were true, monitor brightness would be hard-set at 80 cd/m², which would be borderline unusable during the day and way too bright in the dark. But hey, true sRGB colours!
What do you mean by "some unspecified range of content providers"? You can edit your own HDR videos on it too. Affinity Photo also allows using the HDR mode for viewing raw photos. The APIs to sear user's retinas are there, they are public and available to all native apps, it's just that there's a very strict distinction between SDR and HDR content.
Maybe it makes sense inside the Apple reality distortion field, but in the rest of the world the monitor's job is to represent colours the way it can, from the current black to the highest possible white, utilizing its complete dynamic range, and it's the tonemapper's job to convert HDR to monitor colours.
I imagine very few people, i.e. graphics designers, want true sRGB colours. The rest (i.e. normal people) adjust the brightness to the ambient conditions, adjust their eyes to the current "white" and expect everything to follow suit.
> Maybe it makes sense inside the Apple reality distortion field
This is not Apple specific, and not what HDR is designed for. No implementation works as you expect. Linux doesn't even support HDR at all. When I plug in my non-Apple HDR monitor into my Linux desktop the brightness remains capped at SDR since there is no hardware support. Even when Linux does eventually support the hardware, it is unlikely that any UI will use HDR by default. The UI would need to be redesigned for HDR specifically. It is way too bright for sustained usage on a UI. Uncomfortably bright to the point where it may cause damage to eyesight with prolonged use. It is intended for dynamic scenes that are occasionally bright in some parts of the image, as in TV and movies. I would never use that as a default global brightness level.
Every (compliant) screen will produce that range, when given numbers between the minimum and maximum. It's nowhere near every color we can see (the dark gray area), but it is a decent manufacturing target and it covers a very practical area of our vision.
Screens could just produce their maximum range all the time... but then you'd get screwed up colors, with every display being different than the others (some redder, some greener, etc.). Hence standard color ranges, like sRGB, so bananas used for color scale always look the same shade of yellow. That triangle looks the same on my screen as it does on yours, assuming the manufacturer cared at all about consistency. (Remember the stupidly red OLEDs when they were new? They looked awful; humans looked like Oompa Loompas.)
So screens are capable of more than they normally display, sometimes by a pretty large margin. Somehow you have to tell the screen to go beyond its normal range, to show yellower yellows for super-bananas - that's HDR.
> HDR can't give you more range, so what does it actually do?
It does give you more range.
A lot of modern screens literally can't run at their full dynamic range, they would draw too much power and/or get too hot (some professional displays I use at work have a power cable as thick as your thumb and radiate heat like an oven with the door open. They also require following warm-down procedures to prevent the electronics from failing if they cool down too quickly when you power them off).
They can, however, run at full brightness for a small section of the display or brief periods of time. HDR videos are one way to instruct the screen to do that.
(Obviously, how much range it can give you depends on what screen technology you have - Apple's software has supported this for a long time, but even some current-model displays they sell can't really do HDR properly - though many that "can't" do HDR are able to make a best effort using complex software tricks.)
HDR increases the granularity of available brightness levels, from 0-255 to some higher level. (1023, usually.) It then defines the "old" 100% level to be some value less than the new maximum.
It is usually combined with new display technologies that can emit unusually high brightness levels. On a traditional display there is obviously no point.
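A toy illustration of that "old 100% pinned below the new max" idea (the headroom number here is made up):

    SDR_MAX_CODE = 255        # 8-bit SDR
    HDR_MAX_CODE = 1023       # 10-bit HDR
    SDR_WHITE_IN_HDR = 600    # hypothetical code value where "old" white lands

    def sdr_to_hdr_code(sdr):
        # Map the SDR range into its slice of the HDR range;
        # everything above SDR_WHITE_IN_HDR is reserved for highlights.
        return round(sdr / SDR_MAX_CODE * SDR_WHITE_IN_HDR)

    print(sdr_to_hdr_code(255))             # 600: old "100% white"
    print(HDR_MAX_CODE - SDR_WHITE_IN_HDR)  # 423 codes of highlight headroom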
Ah, this kind of makes sense, but then why wouldn't they map the old 100% to the new maximum?
I can understand the increased granularity, if you're making monitors that go much brighter, then you get more light levels, but why define the old maximum to be less than the new maximum?
Yes that's essentially how I understand it: hardware-wise HDR is basically "a really bright screen" coupled with good black levels. To ensure that you always get the HDR effect at any brightness, you need to reserve some headroom. It should be noted that you probably could not sustain this peak brightness for long anyhow due to thermal issues.
Basically your eyes can see much more than SDR. There are lots of colors that are smooshed together on an SDR monitor that would be distinct on an HDR monitor. Sometimes this means more vivid colors, or sometimes this means much more subtle gradients, or sometimes darker darks that don't get killed by neighboring brights.
Whether the effect is significant depends on a lot of stuff; how the display is built, how the codec works, your viewing conditions, and not least of all, whether the content "looks better" artistically with that much more range.
It's just like how the audio compression effect clips the dynamic range of music, and how some pieces of music sound more expressive when the range in amplitude is used well for artistic effect.
But this only tells me that some monitors can show more colors than others. If you made a monitor that can show more colors, does it matter what you call it? SDR or HDR, it can still show more colors than other monitors.
The idea is that "full red" should always be about the same brightness (user controlled, so "same"). Not "red" on an SDR screen and "iris burning bright red" on a HDR screen. Essentially, you want to extend your color parameter space to give the iris-burning range new names so that SDR and HDR images can both be displayed sensibly.
If you have done a little C programming, you will know (or can check) that a 32-bit `int` represents values from -2147483648 to 2147483647 while a 32-bit `float` goes from -340282346638528859811704183484516925440. to 340282346638528859811704183484516925440. - clearly more "range". This is almost exactly what is happening here.
(If you haven't done a little C programming, you should!)
> The only thing I've seen it do is override my brightness setting to make my screen go to full brightness when I've set it lower.
That is too bad. My experience is very good: only the QR code becomes much more striking, while the rest of the display remains the same. Perhaps it is implemented by increasing the brightness on your display to compensate for the reduced range available due to your brightness setting, but if that's what is happening here, something else is also compensating the colours to make the non-HDR areas of my display darker so that I cannot notice.
A 32-bit float can only represent 2^32 numbers (actually fewer), same as a 32-bit int.
What’s happening here is more like using a short (16-bit) rather than a byte (8-bit) for each channel, although not quite to the same extent. HDR allows you to represent 2^6, 2^12, or 2^18 times as many colors as SDR.
HDR10 is 10 bits per component, and there are plenty of 16-bit images that aren't HDR, for example: http://i.imgur.com/Wm2kSxd.png -- the main thing about using HDR is to increase the range of values, i.e. what the biggest and smallest values are, not merely how many distinct values you can represent.
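To put numbers on the "more values vs. more range" distinction:

    # Counting code values: 10 bits per channel vs 8 bits per channel.
    sdr_colors = (2 ** 8) ** 3     # 16,777,216
    hdr10_colors = (2 ** 10) ** 3  # 1,073,741,824
    print(hdr10_colors // sdr_colors)  # 64 == 2**6 times as many code values

    # More code values is just precision; the extra *range* comes from the
    # transfer function (PQ/HLG) and the peak brightness the display can hit.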
A lot of HDR tech is actually about making sure that old conventional SDR sRGB content looks "correct" (what people expect) on these newfangled displays. A lot of the problems stem from a lot of tech basically ignoring color management completely, even though wide-gamut displays were already a thing before HDR.
Basically, if sRGB content were just naively stretched to Rec. 2020 it'd look garish and oversaturated instead of what the designer/photographer/videographer intended. If it were additionally stretched to HDR, it'd look garish and eye-searingly bright.
Because the differences between HDR displays and SDR are so dramatic, everyone is being forced to scramble to do color management, as the result would be pretty unusable otherwise.
Is higher granularity all HDR is, then? Other commenters mention that the HDR brightness max is more than what the SDR brightness max is defined as, which seems odd to me. It's as if you're crippling the display for SDR permanently.
Rec.709 actually defines a peak luma of 100 nits. sRGB has it defined as 80 nits. These numbers are really only for mastering, but as you push further from the standard you start to see artifacts, namely banding in gradients.
HDR standards require higher bit depth and define higher peak brightness. In addition, and this is probably technically the toughest bit, they define dynamic metadata formats which define tone mapping as the content changes [0].
One thing you may be missing about the particular effect we see on this website is that it disappears as you turn your device brightness up. So you can still get that brightness with SDR content.
Ideal HDR goes up to 10000 nits or higher. It would be ridiculous to put SDR white anywhere near that level.
And there is no need to cripple anything. Brightness adjustment can/should let you put SDR white all the way up to the max full-screen brightness. If a manufacturer cripples that it's not HDR's fault.
But it's also important to note that "max full-screen brightness" is often a lot less than maximum spot brightness.
I'm sitting in front of both a 2012 Macbook Pro (no HDR) and a 2020 M1 Macbook Air (HDR), and I'm expectedly seeing the difference in brightness between the QR codes on the newish Air, while they look the same on the old Pro. But what's interesting is that both of the QR codes on the old Pro look just as bright as the big bright HDR QR code on the new Air.
Basically, the only observable difference I'm noticing between the two laptops is that the non-HDR QR code looks dimmer on the HDR-supporting machine. I'm sure this is some kind of optical illusion, but it's certainly not doing much to make me value the existence of HDR.
What you're seeing here is Apple's EDR implementation [1], which, on non-local dimming displays (every Mac display except the M1/M2 14" and 16" MacBook Pros and the Pro Display XDR), works by bumping up the whole display brightness (and adjusting brightness of all non-HDR content downwards) to display colors brighter than standard SRGB white.
So, yeah, both QR codes probably are brighter on your old Pro, if its display brightness is set higher. And, if you adjust display brightness on the Air, you'll see them converge to the same brightness there, too (on my M2 Air, it seems that Apple leaves a little headroom for the HDR "white" to appear a bit brighter than UI white at full brightness; not sure if the M1 Air behaves the same or not as it has an older display).
Displays with local dimming (and OLEDs) can display, in small areas, colors that are brighter than the maximum full-screen brightness. That's where HDR content gets really compelling-- if all you can do is dim the whole screen, you're unlikely to see much of anything that you couldn't do just by bumping up the brightness with SDR content.
Does this mean some type of monkey patch could force the M2 screen to display a bit brighter? Because that’s my biggest gripe with it, I wish the screen would get to at least double the current max brightness.
Iirc, the 2020 Air maxes out at or below 500 nits, whereas a new MBP maxes out at something like 1600 nits (and only for HDR content - the standard SDR white-point is below this even at max screen brightness).
...while other devices could simply crank up the brightness of the whole display. I personally wouldn't mind, as long as the QR code is then readable (or rather, scannable) outside in the sunshine. I have a public transport app that does this when you show your ticket to be scanned.
Safari also applies “HDR” to CSS filter: brightness() with a value greater than 1. Instead of the over-exposure effect, this portion of the screen will be brighter.
You must have an HDR or EDR-capable screen and there must be an HDR video playing to activate the HDR context (can be <video> playing somewhere in a webpage).
Interesting. Safari also adjusts the HDR over a few seconds, so when you alt-tab to it, the QR code becomes a lot brighter. Also, Firefox looks really bland right next to it.
Also, I just found a good reason not to use this hack with Chrome: it maxes out the CPU. I left this in the background for an hour or so and noticed my laptop getting warm. Checked with htop and Chrome was the culprit. Closed it and it went back to normal. The only open tab: this thing.
Does that only work with white or with other colors in general?
I would not have thought that QR codes could become a hot topic in 2023 but combining that with the new Stable Diffusion Controlnet generated QR codes [1] could actually be pretty interesting
How do you even represent HDR colors? I've tried Googling this and I can never really find an answer. Is it basically just eight hex digits instead of six?
Depends on the format, but 10 or 12 bits per color primary is typical; moreover, HDR video standards typically use limited-range YCbCr encodings instead of full-range RGB, so certain low and high values are defined as "blacker than black", "whiter than white", or are otherwise reserved.
So in terms of hex digits, three sets (Y, Cb, Cr) of three, with not all values representing valid colors.
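Concretely, for 10-bit limited-range video the legal bands are (if I remember the numbers right):

    # 10-bit "video"/limited range: values outside these bands are reserved,
    # "blacker than black", or "whiter than white".
    Y_MIN, Y_MAX = 64, 940   # luma
    C_MIN, C_MAX = 64, 960   # chroma (Cb/Cr)

    def legal_luma(y):
        return Y_MIN <= y <= Y_MAX

    print(legal_luma(1000))  # False: "whiter than white"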
Compared to SDR standards like sRGB, HDR formats also typically use larger color spaces (Rec. 2020[1] is typical) and far more extreme transfer ("gamma") functions (PQ[2] or HLG[3]).
Finally, note that it is common for the encoded values to represent colors and intensities that far exceed the capabilities of most, if not all, display hardware, so the mapping from encoded values to actual displayed pixels can be rather complicated. Google "HDR tone mapping" for more than you ever wanted to know.
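If you just want the flavor of tone mapping, here's a minimal sketch (a plain Reinhard curve, nothing like a production mapper):

    def reinhard(nits, display_peak=600.0):
        # Squeeze scene luminance into a 0..1 display signal with a soft
        # roll-off; real tone mappers are far more sophisticated than this.
        x = nits / display_peak
        return x / (1.0 + x)

    for nits in (10, 100, 600, 4000):
        print(nits, round(reinhard(nits), 3))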
HDR is less about the colors and more about achieving a wider range of brightness levels (hence increasing the total dynamic range). It's typically encoded using 10 or 12-bit color depth, and metadata is added to the media on how to map the colors' luminance/brightness values (e.g., what the darkest and brightest values of an image/video should be). This is then used to transform HDR content into appropriate color and luminance values for your specific monitor (e.g., the reference monitor used for grading might have a peak brightness of 1000 nits, but yours might be different, or have a different luminance-response curve, or support a different color space).
In SDR content, colors are encoded with 8-bit ranges. So, 24 bits for the three color channels. With HDR, we usually use 10 bits instead (this is where the "10" comes from in "HDR10"). See https://en.wikipedia.org/wiki/Color_depth#Deep_color_(30-bit....
The "bright" one has metadata that tells the operating system to render white at the maximum possible brightness, instead of whatever brightness it would normally render white at.
It's broadly supported on Apple devices, though how well it works depends on the hardware you have.
You're correct in part, as both image sources (the picture and the video) have maxed out pixel data. If it's HDR10, it does not have 24 but 30 bits of color though, so it maxes out at #3fffffff.
I think specifically mobile safari refuses to pass through the HDR properties in most cases.
I think Apple is trying to keep HDR for native apps only, like many other platform exclusive features like faceid, fingerprint reading, etc - which are all unavailable to webapps. Video is probably an exception.
Viewing this page (with the HDR element) resets my cursor to the bottom left. Also triggers this behavior when changing brightness whilst the HDR element is visible on screen. Anyone have a similar experience? On MacBook Air (M1, 2020).
Why does this work only on Apple devices? My Android phone has a super bright screen (~1000 nits for HDR content). It supports HDR photos and videos as far as I can tell. But both QR codes appear to be the same brightness.
This is pretty neat. I tried the "Digital Colour Meter" app on macOS, but both whites show exactly the same values, and it is also impossible to take a screenshot of. So I wonder how this works in the OS.
Doesn't seem to work (which I see as a positive) for me on an iPhone 13 Pro Max, using native Safari, low power mode is off as indicated. I guess I have another reason to thank Lockdown mode?
Neat trick. For one device I had to reduce contrast on the QR codes to yield an increase in detection. I assumed too much contrast was less desirable for the optics.
I wouldn’t want all whites to be displayed that brightly on my monitor, because of battery drain and eye fatigue. So I’m fine with reserving HDR for certain applications like video, and using “normal” levels for everything else.
Maybe there should be some CSS extensions for using HDR directly on web pages, though, if the developer really wants to.
I mean that a designer may think, hey just give me the brightest white, but the most obvious way to do so is not “correct”. But is it impossible? No. It is just insane.