Did NVIDIA silently downgrade the G-Sync Ultimate HDR specification?

I don't think it's a question - they definitely did. That being said, I don't think anyone buying a G-Sync Ultimate display is going to do so without knowing the HDR level, but it's not like that matters because the HDR level doesn't even mean anything (see below). Also, the 1000 nit brightness requirement is kind of dumb when you have OLED displays that won't hit 1000 nits but obviously do HDR extremely well. https://displayhdr.org/general/not-all-hdr-is-created-equal/
The next question, or confusion, that we often see is when people compare HDR-600 and DisplayHDR 600, or HDR-1000 and DisplayHDR 1000. Are these the same thing? The answer is no, and this time not because one is a protocol and the other a performance spec (both are in fact performance specs). However, HDR-600 and HDR-1000 are undefined specs. What does that actually mean? One might assume that HDR-600 means that a display would be able to handle an HDR signal and some part of the screen would be able to achieve 600 cd/m2 luminance levels. However, do you know how much of the screen? Do you know the test patterns? Do you know how long the display is required to display at this level before power or thermal regulation kicks in and reduces the luminance level? No, you don’t. No one does. HDR-600 and HDR-1000 are undefined specs. Nothing about the testing methodology is disclosed, and so it’s impossible to really know what they mean. Further, do they indicate the color gamut coverage, or the black level of the HDR dimming, the speed of response time for the HDR dimming, or the accuracy of the luminance and color output from the display? Again, the answer is no – there is no definition at all.
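To make the quoted point concrete: a defined performance tier is something you can actually test a panel against, while a bare "HDR-600" label gives you nothing to check. Below is a minimal Python sketch of that difference; the field names and threshold numbers are illustrative assumptions, not VESA's official figures, so treat them as placeholders rather than the real DisplayHDR criteria.

```python
# Minimal sketch (illustrative only): a defined tier spec is checkable,
# an undefined "HDR-600"-style label is not. Thresholds below are assumed
# for illustration -- consult VESA's published DisplayHDR spec for the real
# test patterns, durations and limits.

from dataclasses import dataclass
from typing import Optional


@dataclass
class TierSpec:
    """A defined tier: every pass/fail criterion is spelled out."""
    peak_luminance_nits: float        # minimum peak on a defined test patch
    sustained_luminance_nits: float   # minimum full-screen, long-duration level
    max_black_level_nits: float       # maximum allowed black level
    min_dci_p3_coverage: float        # minimum colour gamut coverage (0-1)


@dataclass
class MeasuredDisplay:
    peak_luminance_nits: float
    sustained_luminance_nits: float
    black_level_nits: float
    dci_p3_coverage: float


# Illustrative thresholds only -- not the official VESA numbers.
DISPLAYHDR_600_LIKE = TierSpec(600, 350, 0.10, 0.90)
DISPLAYHDR_1000_LIKE = TierSpec(1000, 600, 0.05, 0.90)

# An undefined marketing label carries a single number and nothing else,
# so there is nothing to test against.
UNDEFINED_HDR_600: Optional[TierSpec] = None


def meets_tier(display: MeasuredDisplay, spec: Optional[TierSpec]) -> Optional[bool]:
    """Return True/False against a defined spec, or None if the label is undefined."""
    if spec is None:
        return None  # "HDR-600": no methodology disclosed, no verdict possible
    return (
        display.peak_luminance_nits >= spec.peak_luminance_nits
        and display.sustained_luminance_nits >= spec.sustained_luminance_nits
        and display.black_level_nits <= spec.max_black_level_nits
        and display.dci_p3_coverage >= spec.min_dci_p3_coverage
    )


if __name__ == "__main__":
    panel = MeasuredDisplay(650, 400, 0.08, 0.92)
    print(meets_tier(panel, DISPLAYHDR_600_LIKE))   # True
    print(meets_tier(panel, DISPLAYHDR_1000_LIKE))  # False
    print(meets_tier(panel, UNDEFINED_HDR_600))     # None
```

The point of the sketch is simply that a certification is only as meaningful as the list of measurable criteria behind it; drop the criteria and the label becomes a marketing string.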
They probably got one sample per quarter with standards that high. Let's see what happens with mini-LED.
Denial:

I don't think it's a question - they definitely did. That being said, I don't think anyone buying a G-Sync Ultimate display is going to do so without knowing the HDR level, but it's not like that matters because the HDR level doesn't even mean anything (see below). Also, the 1000 nit brightness requirement is kind of dumb when you have OLED displays that won't hit 1000 nits but obviously do HDR extremely well. https://displayhdr.org/general/not-all-hdr-is-created-equal/
I agree with you that there generally needs to be a better, more meaningful standard for conveying HDR support to consumers. If you create a certification standard like this, it should be against the law to change it mid-stream. If you need to replace it, or it's not granular enough, then you find another name. The only argument against calling it something else now that it's changed would be that they put all this marketing work and spin into the "G-Sync Ultimate" name, in which case, yeah, that's exactly why you should be forced to call it something else. The only thing worse than an opaque, useless standard is an opaque, useless, malleable standard. Once again Nvidia manages to fuck everyone.
After experiencing HDR for the first time on my 4K TV, and reading a lot about the problems the technology has (and its mediocre performance on PC monitors), I gotta say it looks really great when it is working properly and care has been invested in the implementation. I will be buying my first HDR 4K monitor in about three to four years, when the technology will be properly supported and implemented. Right now not even 4K is mainstream, but I think it will be in about 2.5 to 3 years. My money is on VESA; Dolby Vision will fail...
Yep, I just bought the AW3821dw and it's listed as G-Sync Ultimate even though it only does DisplayHDR 600. I was pretty confused, because G-Sync Ultimate used to mean DisplayHDR 1000 with FALD - not anymore, apparently.
Reardan:

I agree with you that there generally needs to be a better, more meaningful standard for conveying HDR support to consumers. If you create a certification standard like this, it should be against the law to change it mid-stream. If you need to replace it, or it's not granular enough, then you find another name. The only argument against calling it something else now that it's changed would be that they put all this marketing work and spin into the "G-Sync Ultimate" name, in which case, yeah, that's exactly why you should be forced to call it something else. The only thing worse than an opaque, useless standard is an opaque, useless, malleable standard. Once again Nvidia manages to frack everyone.
You're agreeing in one sentence that HDR is a mess and that HDR standards right now are useless, and then in another sentence blaming Nvidia for not keeping an unrealistic standard that barely anybody is willing to meet - which is exactly what made it useless in the first place. It's not such a simple black-and-white situation. A standard is rarely, if ever, based on merit when it comes to for-profit products and companies; it's simply the number of hands raised by the interested parties of any particular industry. "Yes, we can do this, it won't take too much R&D and won't eat into our profits" - there, now you've got a standard. If that group overshot their expectations of what they could do and now cannot meet them, then that standard is useless and has to be changed. This is also how LTE became a thing.
DerSchniffles:

Yep, I just bought the AW3821dw and it's listed as G-Sync Ultimate even though it only does DisplayHDR 600. I was pretty confused, because G-Sync Ultimate used to mean DisplayHDR 1000 with FALD - not anymore, apparently.
That's a really nice monitor regardless. I have the prior model and it's been a joy to game on.
Does it matter, though? Who cares about Nvidia's stamp? I am sure my two G-Sync Compatible LG OLED TVs are superior to any mere Nvidia-badged "Ultimate" PC monitor. Anything Nvidia-related is overrated, as always. PS: I also enjoy my weak Odyssey G7 monitor.