ASUS and Acer UHD G-Sync HDR Monitors Forced to Use Color Compression at 120/144 Hz

I haven't studied HDR yet, but I assumed it was as simple as another 8 bits per pixel acting as a kind of exponent for the pixel's RGB bits. Or an HSL expansion, which would need even fewer additional bits. There are many methods that can easily achieve a very wide HDR range. I guess they used something more bandwidth-hungry.
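For what it's worth, the "extra bits as an exponent" idea is roughly how shared-exponent HDR encodings such as Radiance's RGBE work. A minimal sketch of that concept (purely illustrative; the function names are made up, and this is not how G-Sync HDR actually transports pixels):

```python
import math

def rgbe_encode(r, g, b):
    """Pack linear RGB floats into 8-bit mantissas plus one shared exponent byte.
    Illustrative only -- real HDR video links use other encodings (e.g. PQ/HDR10)."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)
    e = math.ceil(math.log2(m))            # shared exponent covering the brightest channel
    scale = 255.0 / (2.0 ** e)
    return (round(r * scale), round(g * scale), round(b * scale), e + 128)

def rgbe_decode(r8, g8, b8, e8):
    """Reverse the packing back into linear floats."""
    if e8 == 0:
        return (0.0, 0.0, 0.0)
    scale = (2.0 ** (e8 - 128)) / 255.0
    return (r8 * scale, g8 * scale, b8 * scale)

# 32 bits per pixel, yet it spans a huge dynamic range (at the cost of precision
# for channels much dimmer than the brightest one).
print(rgbe_decode(*rgbe_encode(0.5, 12.0, 900.0)))
```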
I can't believe that the interconnect is STILL the limiting factor. People are paying tons of money for GPUs and monitors and they are bottlenecked by a cheap cable. I would love to get a 4K, HDR, high refresh rate monitor, but I will wait.
Early adopter syndrome... this happens when display tech is ahead of other factors (here, connections). Couldn't USB-C work for this? I don't know the numbers tbh
Didn't they say "visually lossless compression" was supported by 1.4? Anyways, it's a good idea to skip this generation and wait for DP 1.5 or Gsync with HDMI 2.1 support.
I'll wait for a 16:9, 23/24" FreeSync 2 monitor below 200€. Thank you.
4:2:2 looks very bad on my UHD TV. Only 4:4:4 or RGB is the way to go.
It's not even a 10-bit panel. It's an 8-bit panel with processing.
Silva:

I'll wait for a 16:9, 23/24" FreeSync 2 monitor below 200€. Thank you.
Yes, we all know you want the very best for pennies. Anyway, there are already FreeSync 1 monitors at that price. Why do you want FS2? FS2 adds HDR support and forces manufacturers to implement LFC (which you can also find on some FreeSync 1 monitors). Do you think you'll get a high refresh rate with HDR support for cheap? You'll be waiting a long time, probably indefinitely; even if they were that cheap, I can guarantee they would be junk monitors, as most are at that price point.
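For anyone wondering what LFC actually does: when the game's frame rate drops below the panel's minimum variable-refresh rate, the driver repeats each frame an integer number of times so the panel stays inside its VRR window. A rough sketch of the idea (the window values here are made up, and the real driver heuristics are more involved):

```python
def lfc_refresh(fps, vrr_min=48.0, vrr_max=144.0):
    """Pick a refresh rate for a given content frame rate, repeating frames when
    the content falls below the panel's VRR floor. The 48-144 Hz window is an
    assumed example; actual LFC logic in the driver is more sophisticated."""
    if fps >= vrr_min:
        return min(fps, vrr_max), 1           # drive the panel at the content rate
    multiple = 2
    while fps * multiple < vrr_min:
        multiple += 1
    return fps * multiple, multiple           # each frame shown `multiple` times

for fps in (120, 60, 40, 24, 15):
    hz, n = lfc_refresh(fps)
    print(f"{fps:3} fps -> panel at {hz:.0f} Hz, each frame shown {n}x")
```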
fantaskarsef:

Early adopter syndrome... this happens when display tech is ahead of other factors (here, connections). Couldn't USB-C work for this? I don't know the numbers tbh
Yes, early adopter syndrome and "smart" hardware manufacturers selling Ferraris without wheels. 😀 USB-C could be a solution, but it's another huge mess by itself (keep a couple of Tylenol at hand... just in case): http://blog.fosketts.net/2016/10/29/total-nightmare-usb-c-thunderbolt-3/ https://www.digitaltrends.com/computing/usb-c-implementation-messy-and-unclear/ HDMI 2.1 is the proper solution for TVs and monitors, and it won't arrive before Q3 2018...
Selling 2500€ monitors with these kinds of issues is a very bad joke 🙁
Most 4K HDR movies stream at no more than 60 Hz, so there is some reason to their madness. I don't agree with what they did, but still, early tech is doing what it said it would. I'm sure the specs mention it as well (although I haven't checked). Everything is always in the details, but if the specs say 4K HDR @ 144 Hz then people will have a case against them.
As is mentioned above, isn't the panel actually an 8-bit panel with processing (8-bit + FRC)? Doesn't that mean you actually cannot see the difference between an 8-bit and a 10-bit image on this monitor? If that is the case, DP 1.4 supports up to 4K 120 Hz at 8-bit, whereas 10-bit is limited to 98 Hz, as the article says. But if you cannot see the difference on this panel, due to it being 8-bit + FRC and not true 10-bit, then I was thinking I'd just run it at 120 Hz and leave it at that.
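The arithmetic behind those limits is easy to sanity-check. A back-of-the-envelope sketch, assuming an HBR3 payload of roughly 25.9 Gbit/s after 8b/10b encoding and a crude ~2% allowance for blanking (the exact CVT-R2 timings shift the numbers a little):

```python
# Back-of-the-envelope DisplayPort 1.4 bandwidth check. Blanking overhead is
# approximated with a single factor; exact timings shift things slightly.
DP14_PAYLOAD_GBPS = 32.4 * 0.8        # HBR3 raw 32.4 Gbit/s minus 8b/10b overhead
BLANKING = 1.02                        # assumed ~2% overhead for reduced blanking

def needed_gbps(w, h, hz, bits_per_channel, chroma="4:4:4"):
    samples_per_px = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return w * h * hz * bits_per_channel * samples_per_px * BLANKING / 1e9

for hz, bpc, chroma in [(98, 10, "4:4:4"), (120, 8, "4:4:4"),
                        (144, 10, "4:4:4"), (144, 10, "4:2:2")]:
    need = needed_gbps(3840, 2160, hz, bpc, chroma)
    fits = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"4K {hz} Hz {bpc}-bit {chroma}: {need:5.1f} Gbit/s -> {fits} in {DP14_PAYLOAD_GBPS:.1f} Gbit/s")
```

Which lines up with the thread: 4:4:4 fits at 98 Hz for 10-bit and 120 Hz for 8-bit, while 144 Hz 10-bit only fits after dropping to 4:2:2.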
H83:

Selling 2500€ monitors with these kinds of issues is a very bad joke 🙁
It was just a matter of time; I'd be surprised if there weren't more issues.
4K Blu-rays are 4:2:0 and I've never seen anyone upset about that. Now 4:2:2 is not enough?
Am I naïve in thinking that... well, just use 2 cables from GPU to monitor? I get that surround/multi-monitor enthusiasts will be left out in the cold, but it'd be a good temporary solution, I think? I remember that Dell 8K monitor used 2 or even 4 DP cables.
Administrator
FeDaYin:

4K Blu-rays are 4:2:0 and I've never seen anyone upset about that. Now 4:2:2 is not enough?
It's a different level of source material. Apples versus oranges and the like. 4:2:2 and 4:2:0 are indeed commonly applied to Blu-rays (H.265 as well). However, movie material suffers little from it, as it is already a bit washed out by compression and thus the loss is less likely to be noticed. Fine detail in a game, things like thin lines, filled polygons and text, will just look much fuzzier and more washed out due to the color compression. I have yet to see it for myself though, and the fact that these monitors are 27" will make the effect less visible due to the pixel density. But the fact that people who purchased such a monitor actually noticed it and complained about it when switching to 120 or 144 Hz says enough. What's worse, the display manufacturers simply did not include this info in their spec sheets, while it clearly is a compromise they proactively made.
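To make the "thin lines and text" point concrete, here is a toy illustration of what 4:2:2 does: luma stays per pixel, but each horizontal pair of pixels shares one chroma sample, so a one-pixel-wide coloured detail gets its colour diluted and smeared onto its neighbour. (Real scalers filter more cleverly than this plain averaging, but the halved chroma resolution, and therefore the information loss, is the same.)

```python
# Toy 4:2:2 demo: luma kept per pixel, chroma averaged over each horizontal pair.
def subsample_422(row):
    out = []
    for i in range(0, len(row), 2):
        (y0, cb0, cr0), (y1, cb1, cr1) = row[i], row[i + 1]
        cb = (cb0 + cb1) / 2
        cr = (cr0 + cr1) / 2
        out += [(y0, cb, cr), (y1, cb, cr)]   # the pair shares one chroma sample
    return out

# A 1-pixel-wide red detail (high Cr) on a grey background (neutral chroma).
row = [(128, 128, 128), (76, 84, 255), (128, 128, 128), (128, 128, 128)]
print(subsample_422(row))
# -> the red pixel's chroma is halved in strength and smeared onto its grey neighbour
```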
sammarbella:

Yes, early adopter syndrome and "smart" hardware manufacturers selling Ferraris without wheels. 😀 USB-C could be a solution, but it's another huge mess by itself (keep a couple of Tylenol at hand... just in case): http://blog.fosketts.net/2016/10/29/total-nightmare-usb-c-thunderbolt-3/ https://www.digitaltrends.com/computing/usb-c-implementation-messy-and-unclear/ HDMI 2.1 is the proper solution for TVs and monitors, and it won't arrive before Q3 2018...
Thanks for the reads. Yes, it looks like, once again, the industry fails to apply a standard consistently and kills it off right away.
WhiskeyOmega:

It's not even a 10-bit panel. It's an 8-bit panel with processing.
A GTX can output only 8-bit; Quadros are for 10-bit display. Even with SDI you can only display 8-bit with a GTX ))
GxCx:

A GTX can output only 8-bit; Quadros are for 10-bit display. Even with SDI you can only display 8-bit with a GTX ))
A GTX can output 10-bit colour in DirectX applications. You need a Quadro for 10-bit in OpenGL. As for the limitation of the monitors and their price... https://i.imgur.com/dLUEc2dh.jpg
Same with 4:4:4 sampling: YCbCr is not the same as RGB. I think HDMI allows that, but I'm not sure.