

I don’t actually believe this to be the case; if it were, people who use custom ICCs would get extremely wonky results, which doesn’t typically happen
They wouldn’t, because applying ICC profiles is opt-in for each application. Games and at least many video players don’t apply ICC profiles, so they don’t see negative side effects of the profile being handled wrong (unless the VCGT is calibrated to follow the piece-wise TF).
With Windows Advanced Color, of course, that may change.
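To make the “opt-in” part concrete: an application has to load the display’s ICC profile and convert its own pixels through it, nothing else in the stack does that for it. Here is a minimal illustrative sketch using Pillow’s ImageCms (an lcms2 wrapper); the profile path and file names are placeholders, and this is not what any specific player actually does:

```python
from PIL import Image, ImageCms

# The application itself has to do this; if it never loads the profile,
# the ICC has no effect on its output.
display_profile = ImageCms.getOpenProfile("/path/to/display.icc")  # placeholder path
srgb_profile = ImageCms.createProfile("sRGB")

# Build an sRGB -> display transform and run the image through it before presenting it.
transform = ImageCms.buildTransform(srgb_profile, display_profile, "RGB", "RGB")
image = Image.open("frame.png").convert("RGB")
corrected = ImageCms.applyTransform(image, transform)
corrected.save("frame_color_managed.png")
```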
I think I am a bit confused by the laptop analogy then; could you elaborate on it?
What analogy?
How monitors typically handle this is beyond me, I will admit, but I have seen some really bonkers ways of handling it, so I couldn’t really comment on whether or not this holds true one way or another. Just so I am not misinterpreting you, are you saying that “if you feed 300 nits of PQ, the monitor will not allow it to go above its 300 nits”? If so, this is not what happens on my TV unless I am in “creator/PC” mode. In other modes it will allow it to go brighter or dimmer.
Yes, that’s exactly what happens. TVs do random nonsense to make the image look “better”, and one of those image optimizations is to boost brightness. In this case it’s far from always nonsense of course (on my TV it was, though; it made the normal desktop waaay too bright).
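For concreteness: 300 nits corresponds to one specific PQ signal value, and a display that tracks PQ exactly (a monitor, or a TV in a reference mode) should decode it back to 300 nits, never brighter. A quick sketch of the SMPTE ST 2084 inverse EOTF, with the constants from the spec:

```python
def pq_inverse_eotf(nits: float) -> float:
    """Encode absolute luminance (cd/m²) to a normalized PQ signal value (SMPTE ST 2084)."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    y = nits / 10000.0       # PQ encodes absolute luminance up to 10,000 nits
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

# 300 nits always encodes to the same signal value (≈ 0.62); a display that follows
# PQ exactly shows it at 300 nits, while a TV in a "vivid"-style mode may not.
print(pq_inverse_eotf(300))
```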
unless I am in “creator/PC” mode
Almost certainly just trying to copy what monitors do.
With libjxl it doesn’t really default to the “SDR white == 203” reference from the “reference white == SDR white” common… choice? Not sure how to word it… Anyway, libjxl defaults to “SDR white = 255” or something along those lines, I can’t quite remember. The reasoning for this was simple: that was what they were tuning butteraugli on.
Heh, when it came time to merge the Wayland protocol and we needed implementations for all its features, I was searching for a video or image standard that did exactly that. The protocol has a feature where you can specify a non-default reference luminance to handle these cases.
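The practical effect of such a reference luminance is basically a scaling step when mixing content with different reference whites. A rough illustrative sketch (not actual compositor code; the 255-nit figure is just libjxl’s default intensity target mentioned above):

```python
def map_reference_white(content_nits: float,
                        content_reference: float = 255.0,   # e.g. libjxl's default SDR white
                        output_reference: float = 203.0) -> float:
    """Scale absolute luminance so the content's reference white lands on the output's reference white."""
    return content_nits * (output_reference / content_reference)

# Content-space 255 nits ("SDR white" for this image) ends up at the output's 203-nit reference white.
print(map_reference_white(255.0))  # 203.0
```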
It is indeed the case that users won’t know what transfer function the content is using, but they absolutely do see a difference other than “HDR gets brighter than SDR”, and that is “it’s smoother in the dark areas”, because that is equally true.
That is technically speaking true, but no one actually sees that. People do often get confused about bit depth vs. HDR, but that’s more to do with marketing conflating the two than with people actually noticing a lack of banding in HDR content. With the terrible bitrates videos often use nowadays, you can even get banding in HDR videos too :/
When you play an HDR and an SDR video side by side on a desktop OS, the only normally visible differences are that the HDR video sometimes gets a lot brighter than the SDR one, and that (with a color-managed video player…) the colors may be more intense.
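The step sizes can at least be quantified: near black, adjacent 10-bit PQ code values are a few times closer together in luminance than adjacent 8-bit gamma-encoded SDR codes, whether or not anyone notices that in practice. A rough sketch (the 100-nit SDR white and the pure 2.2 gamma are simplifying assumptions):

```python
def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal value to absolute luminance in cd/m² (SMPTE ST 2084)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def gamma22_nits(code: int, white_nits: float = 100.0) -> float:
    """Decode an 8-bit SDR code value with a simple 2.2 gamma and an assumed 100-nit white."""
    return white_nits * (code / 255.0) ** 2.2

# Luminance step between adjacent code values in a dark region (around 0.3 nits):
print(gamma22_nits(21) - gamma22_nits(20))       # 8-bit SDR: ≈ 0.04 nits per step
print(pq_eotf(100 / 1023) - pq_eotf(99 / 1023))  # 10-bit PQ: ≈ 0.008 nits per step
```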
This sounds like a bug that was fixed some time ago - the desktop window steals focus when it gets created, which happens every time the display reconnects to the PC.
Because you’re on Debian with Plasma 5.27.5, you don’t have that fix though.