HDR Gaming on AMD RX 400 Series Cards Limited to 8-bit via HDMI
SirDremor
And AMD trolls will NOW start to bash Nvidia for doing exactly the same as AMD...
Denial
Has anyone here played SW2 on an HDR display? If so, how is it? I watched Marco Polo on Netflix on my HDR TV - at first I liked it, it was noticeably more vibrant and nicer looking, but after a while the brightness just kind of hurt my eyes. My screen is calibrated for the room, but I'm not really sure how HDR affects that calibration.
moeppel
For all I care HDMI belongs in the dumpster right there with DVI, insofar as they really intend to get rid of DVI - which at least AMD seems to be doing.
DisplayPort 1.3 will hopefully remedy the bandwidth issues of 'current gen' ports.
Prince Valiant
ruiner13
@Hilbert, I don't think this has much effect on Ultra HD movies since those are generally 24FPS, not 60, so there should be enough bandwidth to support 4:4:4 over HDMI 2.0. It is only a problem for PCs that use 60Hz to upsample video (24 -> 60Hz upsample can be crappy) or for gaming of course. However, your movie comment doesn't seem to apply.
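The 24 fps point checks out on paper. A quick back-of-envelope sketch, assuming the standard CTA-861 4K24 timing (297 MHz pixel clock) and HDMI 2.0's 600 MHz maximum TMDS character rate (the character rate scales with bit depth relative to the 8-bit baseline):

```python
# Rough check that 24 fps UHD fits within HDMI 2.0, assuming the
# CTA-861 4K24 timing (297 MHz pixel clock) and HDMI 2.0's 600 MHz
# max TMDS character rate. 10-bit deep color scales the rate by 10/8.

MAX_TMDS_MHZ = 600          # HDMI 2.0 ceiling
PIXEL_CLOCK_4K24 = 297.0    # MHz, 3840x2160 @ 24 Hz incl. blanking

tmds_10bit = PIXEL_CLOCK_4K24 * 10 / 8   # character rate scales with bit depth
print(f"4K24 4:4:4 10-bit needs {tmds_10bit:.2f} MHz TMDS "
      f"({'fits' if tmds_10bit <= MAX_TMDS_MHZ else 'exceeds'} HDMI 2.0)")
```

So 371.25 MHz, comfortably under the 600 MHz limit - which is why movies at native 24 fps aren't the problem, only 60 Hz output is.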
Hilbert Hagedoorn
Administrator
Toss3
So does this only apply to 4k resolutions, or basically all HDMI-monitors?
Marios145
Why is HDMI still alive? Can't everyone (TVs/monitors/GPUs) just make it legacy and convert to DP???
Tree Dude
How does this happen when the XB1S supports 10-bit HDR over HDMI? Is it just 8-bit when at 4K and 10-bit at 1080p?
EDIT - It helps if I read the article. NVM!
Denial
MegaFalloutFan
fry178
@Marios145
why?
just so every cheap no name 40in for 300$ has it?
ever thought that it might cost more (than hdmi), which will affect prices for the millions of ppl on this planet buying their tv/monitor based on price?
and they might not even care/have/use a hdr disc/player/tv.
not even talking about all those business/office computers, they sure all need HDR capable UHD screens to check their email.
yeah, why not equip every desktop with a 1080 SLI setup for that matter, right?
MegaFalloutFan
rl66
fry178
@MegaFalloutFan
Yes there is. How about the billions of ppl on this planet that are not using hdr in any way (no gaming/movies, or office pc's)?
You expect everyone to switch out half of their hardware because of one application bottlenecking (hdr gaming @ FHD/UHD)?
Right.
MegaFalloutFan
Angrycrab
It's not just HDR gaming - even standard 4K 4:4:4 10-bit @ 60 Hz can't be achieved with HDMI 2.0, which is disgusting for people using a PC on a 4K HDR TV.
P.S. Movies are always encoded at 4:2:0, so even if your device is set to display 4:4:4 it will downconvert to 4:2:0 when playing a video source.
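For anyone who wants the numbers behind this, here's a rough sketch, assuming the CTA-861 4K60 timing (594 MHz pixel clock) and HDMI 2.0's 600 MHz maximum TMDS character rate - the rate scales up with bit depth and is halved in 4:2:0 mode:

```python
# Back-of-envelope check of why 4K60 10-bit 4:4:4 doesn't fit HDMI 2.0.
# Assumes the CTA-861 4K60 timing (594 MHz pixel clock) and HDMI 2.0's
# 600 MHz max TMDS character rate; 4:2:0 subsampling halves the rate.

MAX_TMDS_MHZ = 600
PIXEL_CLOCK_4K60 = 594.0  # MHz, 3840x2160 @ 60 Hz incl. blanking

def tmds_mhz(pixel_clock, bits_per_component, chroma_factor=1.0):
    # Character rate scales with bit depth (relative to 8-bit baseline)
    # and drops with chroma subsampling (4:2:0 -> 0.5).
    return pixel_clock * bits_per_component / 8 * chroma_factor

for label, bpc, cf in [("8-bit 4:4:4", 8, 1.0),
                       ("10-bit 4:4:4", 10, 1.0),
                       ("10-bit 4:2:0", 10, 0.5)]:
    clk = tmds_mhz(PIXEL_CLOCK_4K60, bpc, cf)
    verdict = "fits" if clk <= MAX_TMDS_MHZ else "exceeds HDMI 2.0"
    print(f"4K60 {label}: {clk:.1f} MHz -> {verdict}")
```

8-bit RGB lands at 594 MHz (just barely under the cap), 10-bit 4:4:4 needs 742.5 MHz, so the only way to keep 10-bit at 60 Hz is to drop to 4:2:0 - exactly the trade-off the article describes.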
Fender178
wiak
nothing new, hdmi is a dead end. fun fact: you can connect a DisplayPort to HDMI 2.0 adapter to a 5 year old HD 7900 series card and get full hdmi 2.0 with RGB 8-bit 4:4:4 ... lol
why hdmi is still a thing amazes me, long live the displayport overlords
also VGA & DVI are DEAD, been that way for a while. the writing has been on the freaking wall for the past 5 YEARS - both amd and intel have removed those from their products, you will only find those on partner cards
didn't nvidia also do some funky things with their first so-called hdmi 2.0 cards? going 4:2:0 or something instead of 4:4:4?
http://www.club-3d.com/index.php/products/reader.en/product/displayport-12-to-hdmi-20-uhd-active-adapter.html
JonasBeckman
http://en.community.dell.com/support-forums/peripherals/f/3529/t/19987837 (Though it could of course be one of those "Community" guys who don't really have the technical knowledge on these products.)
EDIT: Basically for now we'll just have to deal with this limitation until HDMI 2.1 or whatever has the needed bandwidth for this.
(Come to think of it DP 1.4 specifically might not be all that widespread on PC monitors yet, my own from last year uses DP 1.2 for example though it doesn't really need the extra bandwidth either that later updates offer.)
DisplayPort certainly has the bandwidth, and on a PC monitor it's less of an issue as the format is pretty extensively used in most if not all newer monitors.
However HDMI is still more popular on TVs, which affects console hardware as well. I'm guessing some of the newest TVs might support both HDMI 2.x and DP 1.4x, though they are likely very expensive.
(Granted, a true 10-bit - or 10-bit with dithering for 12-bit, or true 12-bit? - HDR compatible TV is likely already in a fairly high price class; hell, even on PC monitors fully true 10-bit displays are rare and most use 8-bit with dithering for 10-bit support. Funnily enough I've even seen e.g. Dell representatives claim their monitors as true 10-bit, but reviews and similar point out pretty clearly that it's 8-bit with dithering. ->