That's odd, though, right? Why would they constantly cap the brightness of my screen, which I could really do with in bright light, just in case I watch some HDR content at some point?
If you applied HDR-white to all the UI on your screen, it'd likely lose brightness, because most HDR displays have limits on sustained brightness that are far lower than the peak brightness they can achieve on a small portion of the screen. HDR video doesn't really have this problem because people are mastering their HDR video to be significantly darker than SDR[0].
I will say that it is really annoying to have an HDR video just be way brighter for no reason, and I kind of hate HDR for this and this alone.
[0] This is also why streaming services have shows that are WAY TOO DAMNED DARK. Related: the people mastering the audio have also decided to make all the dialogue way too low because fuck people with hearing disabilities[1].
[1] An audio engineer was asked about this, and he outright said he doesn't master for substandard audio setups. No, I don't remember the source; it was one of those articles that show up on the Firefox new tab page. Yes, I am kind of reading into things and getting angry about it.
"The only platform I’m interested in talking about is theatrical exhibition."
and
"We made the decision a couple of films ago that we weren't going to mix films for substandard theaters... We're mixing for well-aligned, great theaters... At a certain point, you have to decide if you've made the best possible version of the film and you're trying to account for inadequacies in presentation... That's chasing the tail. It doesn't work. I will say, with our sound mixes, we spent a lot of time and attention making sure that they work in as predictable a way as possible."
Doesn't that happen because shows are mastered for 5.1 systems, with the dialogue placed in the centre channel, whereas most of us are watching in stereo with a poorly set up automatic downmix?
On my MacBook, with VS Code on the left and Chrome on the right: when I click this link with the right side of my screen covered up, I can visibly see VS Code on the left get a couple of shades darker (animated over a second or so).
So it's not only a question of a hidden max brightness; the device also seems to dim the rest of the screen to enhance contrast for the HDR content as part of the strategy.
Their idea is that you won't ever need your UI to be this bright, I guess? It's Apple, that's what they do — they build things that work optimally for most people. They aren't wrong about it in this particular case either. The MacBook Pro display does get bright enough for me as is to be readable in direct sunlight.
But if you do want to "use the full potential of your hardware", there was some third-party app that used private APIs to set the screen brightness above that limit. I don't remember its name.
This sounds incredibly stupid to me, giving some unspecified range of content providers access to things that users don't have access to. But that's Apple for you I guess.
The problem is that when this was first rolled out, no content was designed for it.
So if they had just mapped everything to the new brightness range, everything, everywhere, would look wrong. People would complain that the iPhone is broken, and content creators would have to redo all their websites/apps. And when they did, those would look wrong on every other device.
This is the only sane way: it has to be something people opt in to. That's what Apple did.
The colors are meant to be within a calibrated (sRGB) color space. Images and videos can request to be in a different color space which includes the HDR range. CSS can also request colors outside that space, but that is done by using extended RGB triples (e.g. RGB(999,999,999) for ultra white).
The difference is that while most things support sRGB, those other color spaces may just be outside what the display can handle. My UltraFine 5K, for instance, does not show a discernible difference between the two QR codes.
You also have the issue that static images displayed at higher brightness will use more power and require quicker mitigations to prevent burn-in, so an 'ultra white' background may just not be something supported for a web page.
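To make the "CSS can request colors outside sRGB" point concrete: CSS Media Queries Level 5 defines a `dynamic-range` query, so a page can at least ask whether the display claims HDR capability before requesting an out-of-gamut color via the CSS Color 4 `color()` syntax. A sketch (class name is made up; whether the browser/OS actually renders the extended value brighter depends on its tonemapping):

```css
/* Fallback: ordinary SDR white, inside sRGB. */
.highlight {
  background: rgb(255, 255, 255);
}

/* On displays that report HDR capability, request a color outside
   the sRGB range using extended component values. */
@media (dynamic-range: high) {
  .highlight {
    background: color(srgb 2 2 2);
  }
}
```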
> The colors are meant to be within a calibrated (sRGB) color space.
If this were true, monitor brightness would be hard-set at 80 cd/m², which would be borderline unusable during the day and way too bright in the dark. But hey, true sRGB colours!
What do you mean by "some unspecified range of content providers"? You can edit your own HDR videos on it too. Affinity Photo also allows using the HDR mode for viewing raw photos. The APIs to sear users' retinas are there; they are public and available to all native apps. It's just that there's a very strict distinction between SDR and HDR content.
Maybe it makes sense inside the Apple reality distortion field, but in the rest of the world the monitor's job is to represent colours as best it can, from the current black to the highest possible white, utilizing its complete dynamic range, and it's the tonemapper's job to convert HDR to monitor colours.
I imagine very few people (mainly graphic designers) want true sRGB colours. The rest (i.e. normal people) adjust the brightness to the ambient conditions, adapt their eyes to the current "white", and expect everything to follow suit.
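A minimal sketch of what "the tonemapper's job" means, using the classic Reinhard operator as a stand-in (real pipelines like macOS EDR are far more sophisticated, and operate per-channel in a proper color space):

```typescript
// Map a linear HDR luminance value (1.0 = SDR reference white,
// unbounded above) into the display's 0..1 range with the
// Reinhard curve: L_out = L / (1 + L). Bright highlights are
// compressed hard; SDR-range content is only mildly darkened.
function reinhard(hdr: number): number {
  return hdr / (1 + hdr);
}

// An HDR highlight at 4x reference white still fits on screen...
console.log(reinhard(4).toFixed(2)); // "0.80"
// ...while mid-range SDR content is compressed much less severely.
console.log(reinhard(0.5).toFixed(2)); // "0.33"
```

The point of a curve like this is exactly the expectation described above: everything keeps its relative ordering against the current "white", and the monitor's full range gets used, instead of hard-clamping anything above SDR white.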
> Maybe it makes sense inside the Apple reality distortion field
This is not Apple specific, and it's not what HDR is designed for. No implementation works the way you expect. Linux doesn't even support HDR at all: when I plug my non-Apple HDR monitor into my Linux desktop, the brightness remains capped at SDR since there is no software support. Even when Linux does eventually support the hardware, it is unlikely that any UI will use HDR by default. The UI would need to be redesigned specifically for HDR: it is way too bright for sustained use in a UI, uncomfortably bright to the point where it may damage eyesight with prolonged use. It is intended for dynamic scenes that are occasionally bright in some parts of the image, as in TV and movies. I would never use that as a default global brightness level.