
> Most of the desktop applications (well, literally all right now) are using Standard Dynamic Range (SDR).

Wait, does that mean that Firefox, GIMP, VLC... when run in X11, do not currently support HDR? What about gamuts other than sRGB?

Anyway, it sounds like this means that my old wide-gamut monitor should eventually be able to at least partially show the 'HDR gamuts' (and luminosities)? That would be great!



Gamut is orthogonal to dynamic range in many three-dimensional color spaces. A wide-gamut CRT will tend to have less dynamic range than a wide-gamut OLED. Dynamic range is a description of contrast; gamut is a description of color.


In marketing-speak, the various 'HDR' standards refer to a bunch of things, not just to dynamic range: https://www.cnet.com/news/dolby-vision-hdr10-advanced-hdr-an...

IIRC most of those are using Rec. 2020?


Early drafts of my comment kept going off into the weeds of "high dynamic range" versus how people tend to use "HDR" to describe pictures made by reducing more bits into fewer bits while maintaining the perception of more bits, which in photography is basically every black-and-white print made from a well-exposed black-and-white negative of a scene with at least moderate contrast... obviously I am letting this comment go into the weeds.

More recently, "HDR" can also refer to mapping fewer bits in the source into more bits on the output in the case of displays.

Anyway, disentangling dynamic range from gamut just a bit seemed like it might add some light.
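The "more bits into fewer bits while maintaining the perception of more bits" idea can be sketched with a simple global tone-mapping operator. This is my own illustration using the classic Reinhard curve, not anything specific to the standards above:

```python
def reinhard(luminance):
    """Reinhard global operator: compress unbounded HDR luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

def to_8bit(luminance):
    """Tone-map an HDR luminance value, then quantize to an 8-bit code."""
    return round(reinhard(luminance) * 255)

# A scene spanning four orders of magnitude still lands in 0..255,
# with relative contrast preserved at every step.
codes = [to_8bit(v) for v in (0.01, 0.1, 1.0, 10.0, 100.0)]
```

The point is that the mapping is monotonic but nonlinear: it spends most of the output codes on the mid-tones, which is exactly what a good darkroom print does.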


Practically nothing in the PC world properly supports either HDR, wide gamut colour, or even just mapping to the physical display gamut in general.

There are about half a dozen things you can do, in some configurations, sometimes, where everything will "work". Anything else is a total shitshow.

It all goes back to generations of developers assuming that a pixel is an RGB triple of bytes in the range 0..255 in the sRGB colour space. Millions of lines of code have been written where "images" are "byte[]" or "struct {r: byte; g: byte; b: byte} []" or the equivalent.

First of all, there's an implicit assumption by most programmers that this is a linear colour space. It isn't. The (127,127,127) pixel colour is not 50% grey! Not even close. Even Adobe Photoshop made this mistake.
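The nonlinearity is easy to demonstrate with the piecewise sRGB decode function (a sketch; the constants are from the sRGB transfer function):

```python
def srgb_to_linear(code):
    """Decode an 8-bit sRGB channel code to linear light in [0.0, 1.0]."""
    c = code / 255.0
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# sRGB (127,127,127) is only about 21% linear light, nowhere near 50% grey.
# Code 188 is the one that actually lands near 50% linear.
mid_grey = srgb_to_linear(127)
```

Averaging, blurring, or resizing on the raw byte values, as most software does, is therefore doing math in the wrong (non-linear) domain.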

Second, monitor manufacturers like to stretch colours to the maximum (native) display capability, because it makes them stand out more in the shop. Everyone wants the monitor that makes the colours "pop". Unfortunately, this means that with modern wide-gamut monitors, skin tones are so stretched that everyone looks like they're in clown make-up.

Third, Microsoft is the laziest company you can possibly imagine. They will not lift a finger to do anything other than the bare minimum, which then they will never touch again unless forced to at gunpoint. Hence, Windows had colour management technically added to it back around the year 2000, in the sense that it has a handful of colour management pieces that never do anything unless explicitly invoked by applications such as Adobe Photoshop. Nothing has changed since, other than a new "HDR mode" that blurs text, turns the brightness down, and is only suitable for gaming in a dark room.

Fourth, the 10-bit-per-channel displays required for HDR were sold only to professionals, so of course NVIDIA and ATI milked this market for all it was worth. Until very recently, 10-bit output was available on most consumer cards but disabled in the drivers. You had to buy the Quadro "professional" cards with identical chips, but different drivers. As you can imagine, practically nothing in Windows (or Linux) can output 10 bit, other than a handful of video viewing apps. Web browsers certainly can't. Even games that do compositing with 16-bit buffers output 8 bit only unless they were explicitly made for HDR. Even those still refuse to output 10 bit SDR or wide-gamut SDR even if it's available.
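The benefit being withheld here is purely about quantization steps: 8 bits per channel gives 256 levels, 10 bits gives 1024, so smooth gradients band four times less. A toy illustration (my own, nothing platform-specific):

```python
def quantize(value, bits):
    """Quantize a normalized [0, 1] value to an integer code at a given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels)

def distinct_codes(bits, samples=4096):
    """Count how many distinct codes a smooth 0..1 gradient survives as."""
    return len({quantize(i / (samples - 1), bits) for i in range(samples)})

# An 8-bit ramp collapses to 256 steps; a 10-bit one keeps 1024.
eight_bit, ten_bit = distinct_codes(8), distinct_codes(10)
```

On a large, bright display those 256 steps are individually visible in sky gradients and dark scenes, which is why the 10-bit paywall stung.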

Fifth, because of the deliberate hardware limitations and the completely uncalibrated $150 budget displays most people buy, support for anything other than sRGB and SDR was pointless in file formats, so nobody bothered to lift a finger. Not the standards bodies, not Microsoft, not Linux, not anyone[1].

As of 2021, it is impossible to do any of the following in the general case:

- Send someone an HDR still image file.

- Send someone an HDR video and expect it to work in anything other than the YouTube app on a handful of platforms.

- Send someone a 10-bit or better file and expect it to actually deliver a benefit.

- Send a wider-than-sRGB file and expect it to display correctly, even on an sRGB monitor. Expect clowns or zombies, not proper skin tones.

(I wrote a similar rant back in 2020, and 2019, and 2018, back all the way to 2010. This will not change in 2022. Or 2023. Or...)

[1] Okay, I tell a lie: Apple did actually make all of the above work! Nobody else, though. Right now, within the Apple ecosystem only, I can:

- Send someone a 10-bit, Display-P3 image and it'll display perfectly. Better colour, calibrated, smoother gradients.

- Send a wider-than-Display-P3 image and it'll also be correctly displayed, not stretched unpredictably like it would be on a PC.

- Send an HDR video in 10-bit Dolby Vision as a chat message and it'll work.

Etc...

Everywhere else though, no chance.


WCG (wide colour gamut) has been a thing for much longer than HDR, so support for it is more established. Firefox and GIMP do support color management via ICC profiles.



