VESA is a little irritated: the HDR2000 certification listed in Asian e-tail is not an existing standard

Well, more than VESA, whoever pays for actual certification should be irritated.
2000 nits HDR can be a thing even if you don't pay VESA for a badge of approval 🙂 Likely amazing monitors.
There are HDR2000 colorimeters being sold, which had to follow a certification program somewhere...
VESA certification sucks anyway.
Talk about being pissed when a company beats you to the proverbial punch?!?
I'd be irritated with VESA for not providing the standard! It's laughable that they state their trademark is misused when they neglected to do their job and provide it. HDR xxxx has become the standard nomenclature. You'd think they would want it to remain so? Try harder, VESA, you're late to your own party.
Camaxide:

2000 nits HDR can be a thing even if you don't pay VESA for a badge of approval 🙂 Likely amazing monitors.
Peak brightness is like grey-to-grey response times and contrast ratios - the numbers might be true under ideal conditions, but they have almost zero bearing on actual use. In other words, it's all marketing BS - monitor companies lie about this stuff all the time. For example, the Samsung Odyssey G7, which is HDR600 certified, could only reach 528 peak nits even when displaying just a 2% window when Rtings tested it. And the Acer X27, which isn't even certified, could reach a much more reasonable 978 nits at a 25% window (which is closer to how HDR is actually used in games and movies - rarely does more than 25% of the screen need to be at full brightness in any scene, which is where peak brightness comes into play).

To be frank, if Samsung is willing to lie about the specs of their monitors when they *are* certified, I really doubt they'll be telling the truth about brightness on their new ones when they're already lying about holding a certification that doesn't even exist. And given how crappy the backlight zones were on their last super-ultrawide, I wouldn't touch these panels at all. HDR without appropriate local dimming looks like garbage at monitor distance.

FWIW, as someone who uses an OLED TV that can actually hit 1000 nits peak as a monitor, at computer viewing distance 1000 nits is already an eyeball-searing level of brightness. I don't think going brighter would make that big of a difference, since you have to tweak everything down in HDR games to avoid washouts even at 1000, but I could be wrong - nobody has seen what a brighter screen can really do.

I will say that you should probably look into spending a couple hundred bucks on Hue Sync to control your room lighting and then getting a less expensive, well-reviewed HDR monitor or a small 2020 OLED TV, rather than splurging several extra thousand dollars on a crazy-bright monitor. You'd be shocked at how much impact ambient room lighting has on the perceived brightness of your screen.
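To put that gap in numbers, here's a minimal back-of-the-envelope sketch (my own illustrative Python, not anything from Rtings; the measured figures and window sizes are the ones quoted above, while the nominal values - the HDR600 tier for the G7 and the X27's usual 1000-nit marketing figure - are assumptions made for the comparison):

```python
# Illustrative arithmetic only: compares nominal peak nits with the
# Rtings measurements quoted in the post above. The "nominal" values
# are my assumption (certification tier / marketing), not measured data.

monitors = [
    # (name,                           nominal nits, measured nits, window %)
    ("Samsung Odyssey G7 (HDR600)",             600,           528,        2),
    ("Acer X27 (1000-nit marketing)",          1000,           978,       25),
]

for name, nominal, measured, window in monitors:
    shortfall = (nominal - measured) / nominal * 100
    print(f"{name}: {measured} nits at a {window}% window, "
          f"{shortfall:.1f}% short of the {nominal}-nit figure")
```

Run as-is, it shows the certified G7 missing its tier by about 12% while the uncertified X27 comes within about 2% of its headline number, which is the point being made here.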
It's irritated since HDR2000 would be a much better name than the True Black variants.
Standards are there for a reason. Imagine if every company had their own version of a DisplayPort cable or monitor-mounting configuration. It would be like a thousand Apple companies gone wild. Heck, if your monitor can display HDR2000 then just say it's capable of HDR2000, but don't state it was VESA tested if it wasn't. That's BS, plain and simple.
asturur:

Well, more than VESA, whoever pays for actual certification should be irritated.
Why? The whole point is so you know with 100% certainty that what you're getting is legit and actually conforms to a standard. Otherwise, you can just pull an Acer and invent your own standard, or just make up some bogus claim about HDR like so many cheap displays do. These tests aren't free to run.
Universal standards do have a place... especially in plumbing and computer hardware.
Most "HDR" claims on LCDs are deceptive, especially on monitors.
Well, now that Apple is pushing MiniLED, hopefully the rest of the industry will follow suit and we'll actually get decent monitors soon.
EspHack:

Most "HDR" claims on LCDs are deceptive, especially on monitors.
This. So why does it even matter what VESA thinks, unless this is just about their name being used? I mean, most HDR monitor claims are way off the mark from what the monitors actually do.
Their brand is being used without authorization, and for a standard that they do not currently have; of course they will be pissed, that's their whole business, trademarking. They are "non-profit" but they have facilities, staff, bills... people are getting paid, livelihoods are still at stake.
illrigger:

Peak brightness is like grey-to-grey response times and contrast ratios - the numbers might be true under ideal conditions, but they have almost zero bearing on actual use. [...]
Their last panels were micro-LED? With two-thousand-plus zones of active backlighting? This monitor is the shite even if it hits 1.5k nits. And the blooming/bleeding will be nonexistent on these monitors in comparison to my Viotek 5120x1440, and that monitor damn near has none. Why all the hate on this monitor from so many people? Isn't them saying "VESA" standard just like saying they're "number one" in the industry? It's more of a statement than a fact. But yeah... "several extra thousand dollars" would mean one had bought a couple of these monitors over a 55" OLED. Last I checked, a great OLED today costs $1.5k for the newest HDMI 2.1 standard on an Evo panel attaining the brightness you speak of. Again, even if this monitor hits 1.5k nits, that's insane! And the way EVERYTHING is tested for brightness is on a "percentage" window, guy. So everyone is lying about their overall screen brightness. People need to stop their opinions from running away and becoming facts.
Denial:

Well, now that Apple is pushing MiniLED, hopefully the rest of the industry will follow suit and we'll actually get decent monitors soon.
MiniLED is mostly marketing BS as well. The only real benefit you get from it is a more uniform backlight, since there's no drop in brightness the farther you get from the screen edge like on edge-lit displays. Even increased brightness isn't guaranteed with the tech; a company could source cheap LEDs or power delivery and end up with a dimmer screen than a good edge-lit LED backlight can provide. And even if they do everything right and get a super-bright mini-LED backlight, if they slap a crap TN LCD panel over the top of it you will end up with muddy blacks and washed-out colors, just like on an edge-lit backlight.

It *could* be used to improve FALD as well, but the tech itself is just a dense field of white LED backlights without any local dimming - adding FALD capability isn't part of the tech, it's an add-on that requires extra expenditure from manufacturers. And FALD in general has its own set of issues with synchronizing the backlight to the panel on high-refresh displays, so even that tech is a toss-up.

There's no magic bullet right now for high-refresh HDR monitors. Until MicroLED emerges as a useful tech, there are trade-offs no matter what you try, so don't fall for any of the marketing about these things. Wait for reviews and make an informed decision rather than getting overly excited about stuff that will never measure up to the hype.