ASUS and Acer UHD G-Sync HDR Monitors Forced to Use Color Compression at 120/144 Hz
Fox2232
I haven't studied HDR yet, but I assumed it was as simple as another 8 bits per pixel acting as a kind of shared exponent for the pixel's RGB bits.
Or an HSL-style expansion, which would need even fewer additional bits. There are many methods that can achieve a very wide HDR range cheaply.
I guess they used something more bandwidth-hungry.
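A shared-exponent scheme along those lines does exist: Radiance's RGBE format stores 8-bit mantissas plus a common 8-bit exponent. A minimal Python sketch of the idea (function names are mine, and it's purely illustrative; the link in question carries plain 10-bit-per-channel video, not a packed HDR format):

import math

def float_to_rgbe(r, g, b):
    # Encode linear float RGB as three 8-bit mantissas plus a shared
    # 8-bit exponent, i.e. the "extra exponent byte" idea.
    v = max(r, g, b)
    if v < 1e-32:
        return (0, 0, 0, 0)
    m, e = math.frexp(v)              # v = m * 2**e, with 0.5 <= m < 1
    scale = m * 256.0 / v
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_to_float(rm, gm, bm, e):
    # Decode back to linear float RGB.
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)  # 2**(e - 128) / 256
    return (rm * f, gm * f, bm * f)

print(float_to_rgbe(0.5, 1.0, 8.0))               # -> (8, 16, 128, 132)
print(rgbe_to_float(*float_to_rgbe(0.5, 1.0, 8.0)))  # -> (0.5, 1.0, 8.0)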
Timothy D Smith
I can't believe the interconnect is STILL the limiting factor. People pay tons of money for GPUs and monitors, and they're bottlenecked by a cheap cable. I would love a 4K, high-refresh-rate HDR monitor, but I'll wait.
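The arithmetic behind that bottleneck is easy to check. A quick sketch, assuming DP 1.4's ~25.92 Gbit/s of effective HBR3 bandwidth and ignoring blanking overhead (real timings need slightly more):

DP14_EFFECTIVE_GBPS = 25.92  # HBR3: 32.4 Gbit/s raw minus 8b/10b encoding overhead

def data_rate_gbps(width, height, hz, bits_per_pixel):
    # Uncompressed video data rate; blanking intervals ignored.
    return width * height * hz * bits_per_pixel / 1e9

for label, hz, bpp in [("4K 144 Hz 10-bit RGB",   144, 30),
                       ("4K 120 Hz 10-bit RGB",   120, 30),
                       ("4K  98 Hz 10-bit RGB",    98, 30),
                       ("4K 120 Hz  8-bit RGB",   120, 24),
                       ("4K 144 Hz 10-bit 4:2:2", 144, 20)]:
    rate = data_rate_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if rate <= DP14_EFFECTIVE_GBPS else "exceeds DP 1.4"
    print(f"{label:<24} {rate:5.1f} Gbit/s ({verdict})")

Which matches the article's figures: 10-bit RGB fits only up to 98 Hz, 8-bit RGB up to 120 Hz, and 144 Hz needs 4:2:2 subsampling.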
fantaskarsef
Early adopter syndrome... this happens when display tech is ahead of other factors (here, connections).
Couldn't USB-C work for this? I don't know the numbers tbh
Agent-A01
Didn't they say "visually lossless compression" was supported by 1.4?
Anyway, it's a good idea to skip this generation and wait for DP 1.5 or G-Sync with HDMI 2.1 support.
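For context, DP 1.4 does define DSC (Display Stream Compression), which is the "visually lossless" compression in question, targeting roughly 3:1. The first G-Sync HDR modules apparently don't implement it, hence the chroma subsampling, but the headroom it would buy is easy to estimate (a back-of-the-envelope sketch, blanking ignored):

# 4K 144 Hz 10-bit RGB, uncompressed, in Gbit/s
uncompressed = 3840 * 2160 * 144 * 30 / 1e9   # ~35.8
print(uncompressed / 3)                       # ~11.9 with ~3:1 DSC, well under HBR3's 25.92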
Silva
I'll wait for a 16:9, 23/24'', FreeSync 2 monitor below 200€. Thank you.
GameLord
4:2:2 looks very bad on my UHD TV. Only 4:4:4 or RGB is the way to go.
WhiskeyOmega
It's not even a 10-bit panel; it's an 8-bit panel with processing.
sammarbella
HDMI 2.1 is the proper solution for TVs and monitors, and it won't arrive before Q3 2018...
Yes, early adopter syndrome and "smart" hardware manufacturers selling Ferraris without wheels. 😀
USB-C could be a solution, but it's another huge mess all by itself (keep a couple of Tylenol at hand... just in case):
http://blog.fosketts.net/2016/10/29/total-nightmare-usb-c-thunderbolt-3/
https://www.digitaltrends.com/computing/usb-c-implementation-messy-and-unclear/
H83
Selling 2500€ monitors with this kind of issue is a very bad joke 🙁
Brit90
Most movies at 4K HDR stream at no more than 60 Hz, so there's a method to their madness. I don't agree with what they did, but still, early tech is doing what it said it would.
I'm sure the specs mention it as well (although I haven't checked).
Everything is always in the details, but if the spec sheet says 4K HDR @ 144 Hz, then people will have a case against them.
Pale
As mentioned above, isn't the panel actually an 8-bit panel with processing (8-bit + FRC)? Doesn't that mean you can't actually see the difference between an 8-bit and a 10-bit image on this monitor? If so, DP 1.4 supports up to 4K 120 Hz at 8-bit, whereas 10-bit is limited to 98 Hz, as the article says. But if you can't see the difference on this panel, since it's 8-bit + FRC and not true 10-bit, I was thinking I'd just run it at 120 Hz and leave it at that.
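On the 8-bit + FRC point: FRC is temporal dithering — the panel alternates the two nearest 8-bit levels so their time-average lands on the 10-bit value. A toy Python sketch of the idea (the function is mine for illustration; real panels use more elaborate spatio-temporal patterns):

def frc_frames(level_10bit, n_frames=4):
    # Approximate a 10-bit level on an 8-bit panel by driving some of
    # every 4 frames one 8-bit step higher, so the time-average matches.
    base, frac = divmod(level_10bit, 4)   # 4 ten-bit steps per 8-bit step
    return [min(base + (1 if f < frac else 0), 255) for f in range(n_frames)]

print(frc_frames(513))              # 10-bit 513 ~ 8-bit 128.25 -> [129, 128, 128, 128]
print(sum(frc_frames(513)) / 4)     # time-average 128.25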
FeDaYin
4K Blu-rays are 4:2:0, and I've never seen anyone upset about that.
Now 4:2:2 isn't enough?
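On the notation: the n:n:n figures describe how chroma (Cb/Cr) samples are shared between pixels while luma (Y) stays at full resolution. A small sketch of the per-pixel cost with 10-bit components, plus what subsampling actually averages away (illustrative numbers):

# Per-pixel cost: Y is kept for every pixel, Cb/Cr are shared.
print("4:4:4:", 10 + 10 + 10)          # 30 bpp
print("4:2:2:", 10 + (10 + 10) / 2)    # 20 bpp: 2 pixels share one Cb/Cr pair
print("4:2:0:", 10 + (10 + 10) / 4)    # 15 bpp: 4 pixels share one pair

# 4:4:4 -> 4:2:2: average each horizontal pair of chroma samples
cb_row = [100, 104, 200, 210, 50, 54]
print([(cb_row[i] + cb_row[i + 1]) // 2 for i in range(0, len(cb_row), 2)])
# -> [102, 205, 52]; luma keeps full resolution, only colour detail is halved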
Chess
Am I naïve in thinking that... well, why not just use two cables from the GPU to the monitor?
I get that surround/multi-monitor enthusiasts would be left out in the cold, but it'd be a good temporary solution, I think?
I remember that Dell 8K monitor using two or even four DP cables.
Memorian
https://i.imgur.com/dLUEc2dh.jpg
GTX cards can output 10-bit colour in DirectX applications; you need a Quadro for 10-bit in OpenGL.
As for the limitation of the monitors and their price..
GxCx
Same goes for 4:4:4 sampling; YCbCr is not the same as RGB.
I think HDMI allows that, but I'm not sure.
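To expand on that: YCbCr 4:4:4 keeps a full set of samples per pixel, but it's a matrix transform of RGB, and quantizing the converted values to integers is where small errors creep in — which is why purists prefer straight RGB. A sketch using the BT.709 coefficients (full-range floats for simplicity; function names are mine):

def rgb_to_ycbcr(r, g, b):
    # BT.709 luma plus colour-difference channels.
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

def ycbcr_to_rgb(y, cb, cr):
    # Exact inverse in float; integer video levels would round here.
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return r, g, b

print(ycbcr_to_rgb(*rgb_to_ycbcr(0.25, 0.5, 0.75)))  # ~ (0.25, 0.5, 0.75)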