> Domain experts sometimes don't realize when their knowledge is out of date.
Sigh. Another unique pleasure of HN.
Sometimes there's a difference between practical and possible. I did not say it was not possible. While you're technically correct, the worst kind of correct, very little content and few delivery systems actually utilize that capability -- look how much you had to type just to get there. Then how are you going to deliver that alternate-colorspace content? To wit: I note you've overlooked ATSC and other delivery mechanisms, as well as the fact that HDMI is a specification designed to handle all cases, including non-NTSC-legacy regions that had a different colorspace all along (notably PAL, which defined "blanking" and "black" as equivalent).
Practically speaking, pretty much no content you'll ever play makes use of wide gamut in North American contexts. If you're cooking a file for yourself, that's one thing, but once you're producing something that's mass consumed there's a lot of compatibility issues to worry about. We still have fractional frame rates and picture safe zones, for crying out loud, despite drop frames being pointless for decades and overscan being a word nobody under 30 has heard. What's the point of leveraging wide gamut in HDMI if all your capture equipment, editing tech, broadcast delivery, playback equipment, and displays cannot handle it? For the general case you're counting on Vizio adding that ability into a $200 TV and Comcast handling it in their CPE. Good luck.
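(If you've never run into drop frames: a rough back-of-the-envelope in Python with the usual NTSC numbers, recalled from memory rather than quoted from the SMPTE spec, just to show what the convention is compensating for.)

    # Why "drop-frame" timecode exists: NTSC video runs at a fractional rate,
    # so plain 30 fps timecode labels drift ahead of the wall clock.
    NOMINAL_FPS = 30
    ACTUAL_FPS = 30000 / 1001                      # ~29.97 fps

    real_frames_per_hour = ACTUAL_FPS * 3600       # ~107892.1 frames actually shown
    labels_per_hour = NOMINAL_FPS * 3600           # 108000 non-drop timecode labels

    drift = (labels_per_hour - real_frames_per_hour) / ACTUAL_FPS
    print(f"non-drop timecode runs ~{drift:.1f} s ahead per hour")    # ~3.6 s

    # Drop-frame skips labels :00 and :01 at the top of each minute,
    # except every tenth minute: 2 * 54 = 108 labels dropped per hour.
    print(labels_per_hour - 108, round(real_frames_per_hour, 1))      # 107892 vs 107892.1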
Why does that matter? When producing video content, handling a wide array of distribution media matters. Except for circumstances where you control the entire system end to end, such as a planetarium or some other kind of venue, you always have to plan for your content potentially being sent over lame, legacy-style HDTV broadcasting. That's the floor. Sometimes you even have to think about SDTV transmission; God help you, and have fun with chyrons...
Not even Rec. 2020, which is designed for UHD and is way better, changed this[0]. It's not a concept that's in dispute, despite whatever HDMI -- one standard of several hundred that are involved in delivering you a single TV broadcast -- can do.
[0]: https://en.wikipedia.org/wiki/Rec._2020#Digital_representati...
1. You asserted that 0/0% in image codecs was superblack. The most common codec for which that is actually true is WebP, which I'll bet no one in this thread actually cares about.
2. In digital colorimetry, gamut and range are orthogonal. Gamut is the space of real colors a given colorspace encompasses; range is whether the digital signal is quantized to roughly 85% of the code values available at its bit depth, and is what we've been talking about. Rec. 2020 very much does increase gamut, as mentioned in your link.
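(To put numbers on that: in 8-bit limited/"video" range, luma nominally occupies codes 16-235, which is where the ~85% figure comes from. A quick Python sketch with the standard numbers; gamut never enters into it.)

    # Range vs. gamut with the usual 8-bit numbers. Range is purely about how
    # code values are quantized; which real-world colors those codes map to
    # (gamut) is a separate axis entirely.
    BITS = 8
    FULL_MIN, FULL_MAX = 0, 2**BITS - 1        # 0..255, "full"/"PC" range
    LIM_MIN, LIM_MAX = 16, 235                 # nominal luma in "limited"/"TV" range

    used = (LIM_MAX - LIM_MIN + 1) / (FULL_MAX - FULL_MIN + 1)
    print(f"limited range uses about {used:.0%} of the available codes")   # ~86%

    # Codes below 16 ("superblack") and above 235 still exist in the signal;
    # they just fall outside the nominal video range.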
3. Studios are actually starting to master in full-range 48-bit RGB (or XYZ technically, I think...). But yeah, that's irrelevant to end-users and this discussion.
4. Yes, everyone has been talking about PCs and how they render / output to TVs. That's why images and MPC were mentioned; broadcast doesn't deal with images. Computers deal with full-range RGB; that's literally the framebuffer format everyone renders everything to. As such, they convert video's YCbCr to full-range RGB before HDMI or TVs come into the picture; MPC's setting mentioned earlier is to ignore what the file says about its YCbCr range when it performs that conversion. Because while there are video files on the internet that set the wrong range, there are also internet video files that are legitimately full-range. And I'm not even talking about ancient codecs using palettes or VQ.
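(Here's roughly what that conversion step looks like, using the standard BT.709 8-bit coefficients rounded from memory; the function is just an illustration of the decode path, not anyone's actual player code. The only thing that changes between "limited" and "full" interpretation is the offset and scale applied to the inputs, which is exactly the knob that MPC setting overrides.)

    # Sketch of the YCbCr -> full-range RGB conversion a PC player performs.
    def ycbcr_to_rgb(y, cb, cr, limited=True):
        if limited:                        # video/"TV" range: Y 16..235, C 16..240
            yv = 1.164 * (y - 16)
            cbv, crv = cb - 128.0, cr - 128.0
            r = yv + 1.793 * crv
            g = yv - 0.213 * cbv - 0.533 * crv
            b = yv + 2.112 * cbv
        else:                              # full/"PC" range: Y and C use 0..255
            yv, cbv, crv = float(y), cb - 128.0, cr - 128.0
            r = yv + 1.575 * crv
            g = yv - 0.187 * cbv - 0.468 * crv
            b = yv + 1.856 * cbv
        return tuple(max(0, min(255, round(v))) for v in (r, g, b))

    print(ycbcr_to_rgb(16, 128, 128, limited=True))    # (0, 0, 0): video black -> RGB black
    print(ycbcr_to_rgb(16, 128, 128, limited=False))   # (16, 16, 16): same code word read
                                                       # with the wrong range -> grey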
And since computers have traditionally outputted full-range digital RGB signals (over DVI, DisplayPort, LVDS, etc), they do indeed negotiate full-range RGB over HDMI when possible. And yes, even $200 TVs generally support that nowadays.
Heck, if you created a full-range H.264 file, tossed it on a thumb drive, and stuck it in your TV, I'd be willing to bet most if not all modern TVs play it correctly.
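(For the curious, something like this would cook that test file, assuming ffmpeg is on your path and "source.mp4" stands in for whatever limited-range source you have; the flags are from memory, so treat it as a sketch rather than a recipe.)

    # Hypothetical way to produce a full-range H.264 file with ffmpeg:
    # rescale the code values from limited (16-235) to full (0-255) and
    # flag the stream as full range so players know what they're getting.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "source.mp4",
        "-vf", "scale=in_range=limited:out_range=full",
        "-color_range", "pc",
        "-c:v", "libx264", "-c:a", "copy",
        "full_range_test.mp4",
    ], check=True)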
And absolutely, positively none of that matters to what the last episode of Scandal looks like as delivered over Netflix alongside an image of RGB black, which was the entire point of the thread. My perspective was about the production of the average content that a user would be doing that comparison against, not your damned thumb drive that absolutely works and therefore, obviously, invalidates everything I'm saying. I even said, repeatedly, that the content just doesn't exist. That's all I said. Let me repeat that, just so we're abundantly clear: yes, video engineer, it is technically possible, but it is pretty much never used.
I did not assert that 0% in image codecs was superblack. I specifically mentioned Photoshop to head off pedantry from someone coming along and saying "but nonlinear editors won't emit superblack unless you force them to!", not to talk about image codecs. You did that, and I ended up with more pedantry than I had bargained for in this entire thread, so that was a pointless exercise. As was sharing my perspective at all, honestly; I'm a professional video editor with credits (a field where "gamut" is the common term for this phenomenon, your doctorate in colorimetry notwithstanding), not a video engineer. I apologize for speaking out of turn, and will defer to your expertise in the future.
The irony is that you responded to someone making a joke about the exact thing that you're doing. Please, just stop.
And other people who do different things have different perspectives and discussion to offer! Amazing! Are you satisfied with your correctness?
You are the reason I hate this community. "Oh, neat, a thread I know something about. Let me try contributing. Oh, look, now I'm in a slapfight with a guy who develops codecs for a living who completely missed the point of what I was saying and the context thereof, and wants to correct my usage of an incorrect term because it's his area of expertise. Let me rush to contribute more."
HN: The Game of Being More Right Than Everyone Else. Congratulations on winning. I'll go play something productive.
Edit: This was left when the parent comment said "Sorry, I only develop video codecs for a living," and it has now been rewritten. I'm leaving mine, and I'd rather my whole subthread just be detached and deleted at this point.