HDR Gaming on AMD RX 400 Series Cards Limited to 8-bit via HDMI

And the AMD trolls will NOW start to bash Nvidia for doing exactly the same thing as AMD...
Has anyone here played SW2 on an HDR display? If so, how is it? I watched Marco Polo on Netflix on my HDR TV. At first I liked it, it was noticeably more vibrant and nice looking, but after a while the brightness just kind of hurt my eyes. My screen is calibrated for the room, but I'm not really sure how HDR affects that calibration.
For all I care, HDMI belongs in the dumpster right there with DVI, insofar as they really intend to get rid of DVI, which at least AMD seems to. DisplayPort 1.3 will hopefully remedy the bandwidth issues of 'current gen' ports.
Quote: "Has anyone here played SW2 on an HDR display? If so, how is it? [...] I'm not really sure how HDR affects that calibration."
I believe the spec demands some absurd maximum achievable brightness for displays to be branded with it. It shouldn't affect anything if you calibrated the TV to a specific max brightness and any dynamic brightness/backlight mode isn't on.
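For context, HDR10 encodes absolute luminance with the SMPTE ST 2084 (PQ) transfer function, whose scale tops out at 10,000 cd/m²; that is where the "absurd maximum brightness" in the spec comes from. Below is a minimal sketch of the PQ EOTF in Python, using the constants from the published spec (the sample code values in the loop are just illustrative):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1]
# to absolute luminance in cd/m^2 (nits). Constants per the spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK_NITS = 10000.0

def pq_eotf(n: float) -> float:
    """Luminance in nits for a normalized PQ code value n in [0, 1]."""
    p = n ** (1 / M2)
    return PEAK_NITS * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

if __name__ == "__main__":
    for code in (0, 128, 512, 767, 1023):   # 10-bit code values
        print(f"10-bit code {code:4d} -> {pq_eotf(code / 1023):10.2f} nits")
```

Half of the 10-bit code range maps to only about 95 nits; the top of the range is reserved for highlights far beyond what a calibrated SDR setting uses.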
@Hilbert, I don't think this has much effect on Ultra HD movies, since those are generally 24 FPS, not 60, so there should be enough bandwidth to support 4:4:4 over HDMI 2.0. It is only a problem for PCs that output at 60 Hz and upsample the video (24 -> 60 Hz upsampling can be crappy), or for gaming of course. However, your movie comment doesn't seem to apply.
Administrator
Quote: "@Hilbert, I don't think this has much effect on Ultra HD movies, since those are generally 24 FPS, not 60 [...]"
Depends a little: if the link is fully connected with an HDMI 2.0 handshake at 60 Hz, it might be an issue. In a lot of situations, say where you'd use KODI set to a variable refresh rate, it lowers the UHD TV refresh rate to match the source at, say, 50 / 30 or indeed 24 Hz, and in such situations there would be enough bandwidth. But sure, chances are that for movie playback this is not an issue.
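The arithmetic bears this out. Here is a back-of-the-envelope Python sketch of the TMDS bandwidth math; the raster totals (blanking included) are the standard CTA-861 4K timings and the 10/8 factor is the TMDS 8b/10b encoding overhead, so treat the numbers as rough rather than a spec citation:

```python
# Back-of-the-envelope HDMI TMDS bandwidth check.
# HDMI 2.0 tops out at 18 Gbps of TMDS bandwidth; the pixel clock
# includes blanking (CTA-861 4K rasters: 4400x2250 @ 60 Hz -> 594 MHz,
# 5500x2250 @ 24 Hz -> 297 MHz).
HDMI_20_LIMIT = 18e9  # bits per second, TMDS

# component samples per pixel for each chroma subsampling mode
SUBSAMPLING = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def tmds_rate(pixel_clock_hz, bit_depth, subsampling):
    """TMDS bit rate: pixel clock x bits/pixel x 10/8 (8b/10b encoding)."""
    bpp = SUBSAMPLING[subsampling] * bit_depth
    return pixel_clock_hz * bpp * 10 / 8

cases = [
    ("4K60  8-bit 4:4:4", 594e6, 8,  "4:4:4"),
    ("4K60 10-bit 4:4:4", 594e6, 10, "4:4:4"),
    ("4K60 10-bit 4:2:0", 594e6, 10, "4:2:0"),
    ("4K24 10-bit 4:4:4", 297e6, 10, "4:4:4"),
]
for name, clock, depth, sub in cases:
    rate = tmds_rate(clock, depth, sub)
    ok = "fits" if rate <= HDMI_20_LIMIT else "exceeds"
    print(f"{name}: {rate / 1e9:6.2f} Gbps -> {ok} HDMI 2.0")
```

8-bit 4:4:4 just squeezes under the 18 Gbps ceiling at 60 Hz (hence the article's 8-bit limit), 10-bit 4:4:4 does not, and at 24 Hz there is headroom to spare, which is why movie playback is largely unaffected.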
So does this only apply to 4K resolutions, or to basically all HDMI monitors?
Why is HDMI still alive? Can't everyone (TVs/monitors/GPUs) just make it legacy and convert to DP???
How does this happen when the XB1S supports 10-bit HDR over HDMI? Is it just 8-bit at 4K and 10-bit at 1080p? EDIT - It helps if I read the article. NVM!
Quote: "How does this happen when the XB1S supports 10-bit HDR? Is it just 8-bit at 4K and 10-bit at 1080p?"
Yeah, this only affects UHD.
Quote: "Depends a little: if the link is fully connected with an HDMI 2.0 handshake at 60 Hz, it might be an issue. [...]"
Guys, guys, sorry to disappoint you, but both 2K Blu-ray and 4K Blu-ray are 4:2:0. It's the Blu-ray standard. In that mode, you can do 10-bit at 4K and 60 Hz.
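For anyone unsure what 4:2:0 actually means: luma is stored at full resolution while each chroma plane is stored at quarter resolution (halved in both axes), so a frame carries 1.5 component samples per pixel instead of 3. A toy Python sketch of the chroma downsampling step, run on a made-up 4x4 plane:

```python
# Toy illustration of 4:2:0 chroma subsampling: luma (Y) keeps full
# resolution, each chroma plane (Cb, Cr) is averaged over 2x2 blocks.
def subsample_420(plane):
    """Average a full-resolution chroma plane over 2x2 blocks."""
    h, w = len(plane), len(plane[0])
    return [
        [
            (plane[y][x] + plane[y][x + 1]
             + plane[y + 1][x] + plane[y + 1][x + 1]) // 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# Hypothetical 4x4 chroma plane (10-bit values).
cb = [
    [512, 516, 600, 604],
    [514, 518, 602, 606],
    [300, 304, 700, 704],
    [302, 306, 702, 706],
]
print(subsample_420(cb))   # [[515, 603], [303, 703]]
```

That halved sample count is exactly why 4K60 10-bit fits into HDMI 2.0 at 4:2:0 but not at 4:4:4.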
@Marios145 why? Just so every cheap no-name 40-inch for $300 has it? Ever thought that it might cost more (than HDMI), which would affect prices for the millions of people on this planet buying their TV/monitor based on price? And they might not even care about / have / use an HDR disc/player/TV. Not even talking about all those business/office computers; they sure all need HDR-capable UHD screens to check their email. Yeah, why not equip every desktop with a 1080 SLI setup while we're at it, right?
Quote: "@Marios145 why? Just so every cheap no-name 40-inch for $300 has it? [...]"
You are wrong, there is no point in using HDMI! 1. Fact: DP has more bandwidth and always beats HDMI at adding new features. 2. Fact: you have to PAY for HDMI, and DP is FREE! The FREE part is what's important! The only reason we still have HDMI is backwards compatibility; they can't kill it. If electronics companies cared enough about progress, they could kill HDMI in next year's models and bundle HDMI 2.0b-to-DP dongles with every 2016 and 2017 product, then in 2018 offer the dongles as a paid accessory, and that's it, forget about HDMI.
...Let's review the fees associated with being an HDMI Adopter, namely annual fees and royalty fees. There are two annual fee structures:
- High-volume (more than 10,000 units) HDMI Adopter Agreement: $10k/year.
- Low-volume (10,000 units or less) HDMI Adopter Agreement: $5k/year plus a flat $1/unit administration fee.
The annual fee is due upon execution of the Adopter Agreement and must be paid on the anniversary of that date each year thereafter. The royalty fee structure is the same for all volumes; the variable per-unit royalty is device-based and not dependent on the number of ports, chips or connectors:
- US$0.15 for each end-user licensed product.
- US$0.05 if the HDMI logo is used on the product and promotional material (use of the HDMI logo requires compliance testing).
- US$0.04 if HDCP is implemented and the HDMI logo is used.
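Taking that quoted fee schedule at face value, the totals are easy to sanity-check. A toy Python calculator (the function name and the tier interpretation are mine, purely illustrative):

```python
# Toy HDMI adopter cost calculator based on the fee schedule quoted
# above (annual fee + per-unit royalty); purely illustrative.
def hdmi_annual_cost(units, uses_logo=False, implements_hdcp=False):
    """Total yearly cost in USD for an HDMI adopter shipping `units`."""
    if units > 10_000:                 # high-volume agreement
        annual, admin_per_unit = 10_000, 0.0
    else:                              # low-volume agreement
        annual, admin_per_unit = 5_000, 1.0
    if uses_logo and implements_hdcp:
        royalty = 0.04
    elif uses_logo:
        royalty = 0.05
    else:
        royalty = 0.15
    return annual + units * (admin_per_unit + royalty)

# A million TVs a year with logo + HDCP: $10k + 1,000,000 * $0.04
print(hdmi_annual_cost(1_000_000, uses_logo=True, implements_hdcp=True))
# -> 50000.0
```

So a high-volume TV maker shipping a million units with logo and HDCP pays on the order of $50k a year, about four cents per device.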
Quote: "Why is HDMI still alive? Can't everyone (TVs/monitors/GPUs) just make it legacy and convert to DP???"
HDMI is the standard for home entertainment: it's cheap, it's secured (yeah, the majors love DRM)... etc. etc. DVI is still THE standard for Asian-market computers (even though it loses ground to DP every year). DP is for everything pro and/or computer, and it is more versatile than those upstarts.
@MegaFalloutFan Yes there is. How about the billions of people on this planet that are not using HDR in any way (no HDR gaming/movies, or office PCs)? You expect everyone to switch out half of their hardware because of one application bottlenecking (HDR gaming @ FHD/UHD)? Right.
Quote: "@MegaFalloutFan Yes there is. How about the billions of people on this planet that are not using HDR in any way [...]"
It's not just about HDR; there are a ton of features, current and future. For starters, NO ONE is taking away your current monitor, so your comment has zero logic. Also, every PC monitor in the last 5 years moved on to DP. Everything 1440p and higher used DP from day one; HDMI 2.0 is new, and the ones that do have HDMI 2.0 usually have one such port and the rest are DP. So PC monitors are NOT the issue; we ALREADY moved on. The issue is everything else: TVs, consoles, Blu-ray players, etc.
It's not just HDR gaming: even standard 4K 4:4:4 10-bit @ 60 Hz can't be achieved with HDMI 2.0, which is disgusting for people using a PC on a 4K HDR TV. P.S. Movies are always encoded at 4:2:0, so even if your device is set to output 4:4:4, it will down-convert to 4:2:0 when playing a video source.
Quote: "Why is HDMI still alive? Can't everyone (TVs/monitors/GPUs) just make it legacy and convert to DP???"
It has become the standard for the home entertainment industry, and changing that would cause a bunch of problems for manufacturers as well as consumers. Maybe the manufacturers of TVs and other devices that use HDMI have a deal where it doesn't cost them a lot to use HDMI, since they might have bought HDMI licenses in bulk.
Nothing new, HDMI is a dead end. Fun fact: you can connect a DisplayPort-to-HDMI 2.0 adapter to a five-year-old HD 7900 series card and get full HDMI 2.0 with RGB 8-bit 4:4:4... lol. Why HDMI is still a thing amazes me; long live the DisplayPort overlords. Also, VGA & DVI are DEAD, and have been for a while; the writing has been on the freaking wall for the past 5 YEARS, and both AMD and Intel have removed them from their products, so you will only find them on partner cards. Didn't Nvidia also do some funky things with their first so-called HDMI 2.0 cards, going 4:2:0 or something instead of 4:4:4? http://www.club-3d.com/index.php/products/reader.en/product/displayport-12-to-hdmi-20-uhd-active-adapter.html
Quote: "Nothing new, HDMI is a dead end. Fun fact: you can connect a DisplayPort-to-HDMI 2.0 adapter to a five-year-old HD 7900 series card and get full HDMI 2.0 with RGB 8-bit 4:4:4 [...]"
DisplayPort certainly has the bandwidth, and on a PC monitor it's less of an issue, as the format is used pretty extensively in most if not all newer monitors. However, HDMI is still the more popular connector on TVs, which affects console hardware as well. I'm guessing some of the newest TVs might support both HDMI 2.x and DP 1.4x, though they are likely very expensive. (Granted, a true 10-bit HDR-compatible TV, or one doing 12-bit as 10-bit with dithering (or true 12-bit?), is likely already in a fairly high price class. Hell, even among PC monitors, fully true 10-bit displays are rare, and most use 8-bit with dithering for their 10-bit support. Funnily enough, I've even seen e.g. Dell representatives claim their monitors are true 10-bit, while reviews and similar point out pretty clearly that it's 8-bit with dithering -> http://en.community.dell.com/support-forums/peripherals/f/3529/t/19987837 Though it could of course be one of those "Community" guys who don't really have the technical knowledge on these products.) EDIT: Basically, for now we'll just have to deal with this limitation until HDMI 2.1 or whatever has the needed bandwidth. (Come to think of it, DP 1.4 specifically might not be all that widespread on PC monitors yet; my own from last year uses DP 1.2, for example, though it doesn't really need the extra bandwidth that the later updates offer either.)
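On the 8-bit-plus-dithering point: the usual trick is to quantize each 10-bit value down to 8 bits while varying the rounding decision across neighbouring pixels (and/or frames), so the average perceived level keeps the extra precision. A minimal Python sketch using a 2x2 ordered (Bayer) dither; the flat test frame is made up:

```python
# Minimal sketch of 10-bit -> 8-bit ordered dithering, the technique
# panels use to fake 10-bit on an 8-bit matrix. 2x2 Bayer thresholds.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # thresholds compared against the 2 dropped bits

def dither_10_to_8(frame):
    """Quantize 10-bit samples to 8 bits with a 2x2 ordered dither."""
    out = []
    for y, row in enumerate(frame):
        out_row = []
        for x, v in enumerate(row):
            base, frac = v >> 2, v & 0b11        # top 8 bits + dropped 2
            if frac > BAYER_2X2[y % 2][x % 2]:   # spatially varying round-up
                base += 1
            out_row.append(min(base, 255))
        out.append(out_row)
    return out

# A flat 10-bit value of 514 (= 128.5 in 8-bit terms).
frame = [[514] * 4 for _ in range(4)]
print(dither_10_to_8(frame))
```

In the example, the constant 10-bit value comes out as a checkerboard of 128s and 129s, so the 2x2 average recovers the half-step an 8-bit panel cannot display directly.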