HDR-to-SDR tone mapping has long been the bane of hardware transcoding for Plex. Even when it kinda works (which is NVIDIA on Linux only), it hammers my GTX 1650 in both the RAM and the compute required. Normally NVENC can handle three 4K streams or many, many 1080p streams, constrained only by RAM.
On Quick Sync (UHD 600 series iGPUs and above, i.e. Kaby Lake/Gemini Lake), it can do it without an issue. On Gemini Lake cores I've done four simultaneous HDR tone maps (UHD Blu-ray source to AVC SDR) without a problem.
I have a real-world use case: a full Atmos audio setup (7.1.4) and a projector for video. Now, projectors don't do HDR very well[1], and to be honest I'm more interested in immersive audio than shiny highlights in the picture.
Unfortunately, a good few film releases only put the Atmos track on the UHD disc along with HDR, and running those discs through my rig while forcing SDR yields a picture that is too dark compared to the original HD SDR copy (and letting it attempt HDR isn't much better).
So I'm wondering if this will let me rip and convert to a good SDR UHD copy of a film with the Atmos audio?
--
[1] ...and 100 inch OLED TVs are stupidly expensive.
It looks too dark because you're forcing light levels encoded on a curve that goes up to 10,000 nits through a device that has a roughly linear gamma and is probably more than an order of magnitude dimmer. This process should help because it manually remaps HDR reference white to Rec. 709/sRGB max white, transforms the curve back into linear, and forces the display to clip anything brighter to max white — an acceptable way of doing things when compressing a gamut.
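To put rough numbers on the "too dark" effect: here's a small sketch (assuming a 100-nit gamma-2.2 SDR display; the figures are illustrative, not a model of any particular player) that decodes the SMPTE ST 2084 (PQ) curve and compares it with what happens if the same code value is naively treated as SDR gamma.

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal (0..1) to absolute luminance in nits."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

# HDR reference white sits around 58% PQ signal (~203 nits per ITU BT.2408)
print(pq_eotf(0.58))        # ≈ 200 nits

# A display that naively treats that same 0.58 code value as gamma-2.2 SDR
# on a 100-nit panel puts "white" at roughly 30 nits -- hence the dark picture
print(0.58 ** 2.2 * 100)    # ≈ 30 nits
```

Proper tone mapping re-anchors that ~200-nit reference white to the display's max white instead of letting the mismatch crush everything.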
Your output device's color space needs to match the color space the display expects. We've been in an SDR world where everything is sRGB/Rec. 709 for so long that we don't even notice when something is putting out a mismatched color space. Normally, if you have something like an Apple TV, it will do the transform automatically from whatever the media is to the color space of the screen it's connected to. But if you're using something like a PC that assumes sRGB, and you force out video in the BT.2020 color space at sRGB levels, you're going to have a bad time.
Thanks for that info. I'm currently putting the discs into a mid-range Panasonic UHD player, which is supposed to be able to convert HDR to SDR properly, but clearly it's not doing a good job of it.
I don't really understand this so I asked it to explain:
how does this fix the colorspace?
It replied:
The RGB to YUV conversion is done using a "sigmoidal" function, which basically means that it applies a non-linear transformation to the input video. This transformation is designed to "compress" the bright colors in the video, making them more manageable for SDR displays.
The colorlevels filter is also adjusting the black and white points of the video. This is done to ensure that the video can be displayed correctly on SDR displays, which have a limited dynamic range.
Finally, the target_range and source_range options are used to ensure that the video is encoded in the correct range for SDR displays.
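For comparison, a commonly circulated software tone-mapping chain uses ffmpeg's zscale and tonemap filters rather than the sigmoidal/colorlevels approach described above — so treat the filter and encoder choices here as assumptions, not the exact command that reply was explaining:

```shell
# Hedged sketch: linearize the PQ input (npl = nominal peak luminance),
# convert to float RGB, move to BT.709 primaries, tone map with the Hable
# operator, then re-encode as ordinary TV-range BT.709 SDR.
ffmpeg -i input_hdr.mkv \
  -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
  -c:v libx264 -crf 18 -c:a copy output_sdr.mkv
```

The tonemap filter needs linear-light float input, which is why the chain linearizes and converts to gbrpf32le before tone mapping.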
Does anyone know if DV-oriented tone mapping is in development (i.e. DV to HDR10 or DV to SDR), so DV files can play on non-DV screens and look correct?
Dolby Vision is a proprietary format, so unless Dolby develops something (highly doubtful), it's unlikely anything will be developed. It would be incredibly complex, and almost certainly lead to legal action from Dolby.
Yes, but that support is limited to playback. All it does is pass the Dolby Vision metadata to the output, which means you need a Dolby Vision display to actually use it.
Nope. You don't need a DV-compatible display with the latest mpv. Its new gpu-next renderer can tone map DV to SDR or HDR so you can watch it on an old display.
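For reference, something along these lines should exercise that path (assuming a recent mpv built with libplacebo; the tone-mapping curve choice is just an example, and `input_dolby_vision.mkv` is a placeholder filename):

```shell
# gpu-next is the libplacebo-based renderer; mpv picks a tone-mapping
# operator automatically, or you can force one explicitly.
mpv --vo=gpu-next --tone-mapping=auto input_dolby_vision.mkv
```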
DV is an additional metadata layer on top of HDR10 PQ. If you play it back on a non-Dolby Vision device, it will just play back the same as an HDR10 file would. No tone mapping is required.
In 4.1, an OpenCL variant filter was added for GPU processing.
In 4.2, threading support was added to the tonemap CPU filter.
In 4.3, a VAAPI variant filter was added for GPU processing on Linux.
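For the VAAPI path, a sketch of a fully hardware-accelerated HDR10-to-SDR transcode might look like this (the render-node path is a typical Intel default and an assumption; bitrate and filenames are placeholders):

```shell
# Decode, tone map, and encode entirely on the GPU via VAAPI.
# tonemap_vaapi converts to BT.709 transfer/matrix/primaries in NV12.
ffmpeg -vaapi_device /dev/dri/renderD128 \
  -hwaccel vaapi -hwaccel_output_format vaapi \
  -i input_hdr.mkv \
  -vf 'tonemap_vaapi=format=nv12:t=bt709:m=bt709:p=bt709' \
  -c:v h264_vaapi -b:v 8M -c:a copy output_sdr.mkv
```

Because frames stay in GPU memory end to end, this avoids the copy-back overhead that makes the CPU tonemap filter so expensive.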