AMD Radeon Fury X doesn't have HDMI 2.0 support
So if you had a peek at everything presented over the past few days, one thing you will have noticed: the new architecture does not appear to offer HDMI 2.0 support, instead sticking with HDMI 1.4a. This means that with Fury GPU based products, like the new Project Quantum for example (which really is a small form factor PC), you can't fully use them on an Ultra HD TV.
See, HDMI 2.0 offers the bandwidth for Ultra HD at 60 Hz; on 1.4a you drop back to a measly 30 Hz. A big miss if you ask me, as HDMI 2.0 is the answer for products in the living room. Especially with the Nano and Project Quantum in mind, this might not have been the smartest move from AMD's engineering team.
It's a bit of a problem for 4K gaming in the living room, I'd say.
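For those who want the numbers, here is a quick back-of-the-envelope sketch in Python. It is purely illustrative: the pixel clocks are the standard CEA-861 timings for 2160p and the link limits are the spec maximums, so this says nothing about how AMD's display engine is actually implemented, only why 4K at 60 Hz simply doesn't fit in HDMI 1.4a's bandwidth budget.

```python
# Back-of-the-envelope check: why 2160p60 does not fit over HDMI 1.4a.
# Pixel clocks are the standard CEA-861 timings for 2160p (297 / 594 MHz);
# link limits are the spec maximums. Illustrative figures only.

HDMI_14A_MAX_TMDS_GBPS = 10.2   # HDMI 1.4a/b: 340 MHz TMDS clock ceiling
HDMI_20_MAX_TMDS_GBPS = 18.0    # HDMI 2.0: 600 MHz TMDS character rate

def tmds_needed_gbps(pixel_clock_mhz, bits_per_component=8):
    # 3 TMDS channels, 10 bits on the wire per 8 bits of pixel data (8b/10b)
    return pixel_clock_mhz * 1e6 * 3 * bits_per_component * 10 / 8 / 1e9

for label, clock_mhz in (("2160p30", 297.0), ("2160p60", 594.0)):
    need = tmds_needed_gbps(clock_mhz)
    ok_14 = "fits" if need <= HDMI_14A_MAX_TMDS_GBPS else "does NOT fit"
    ok_20 = "fits" if need <= HDMI_20_MAX_TMDS_GBPS else "does NOT fit"
    print(f"{label}: {need:.2f} Gbps on the wire, "
          f"{ok_14} in HDMI 1.4a, {ok_20} in HDMI 2.0")

# For reference, DisplayPort 1.2 (HBR2, 4 lanes) offers roughly 17.28 Gbps
# of usable payload bandwidth, comfortably above the ~14.26 Gbps of raw
# 8-bit RGB pixel data that 2160p60 requires, which is why AMD points to
# DP 1.2a for 4K at 60 Hz.
```

Running it shows 2160p30 needing about 8.9 Gbps (fine over 1.4a) while 2160p60 needs about 17.8 Gbps, which only HDMI 2.0 or DisplayPort can carry.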
The question was answered by AMD Matt on our forums:
Q: Does the AMD Radeon R9 Fury X GPU have HDMI 2.0?
A: No. AMD recommends and uses DisplayPort 1.2a for 4K60 content.
Senior Member
Posts: 6682
Joined: 2004-05-13
You do realize that even if DP managed to kill HDMI there would be many upset users. Many users who use HDTVs as a monitor (or second screen) probably still connect to an older HDTV that supports 1080p over HDMI only (or VGA/DVI). DisplayPort is only supported on some UHD 4K screens. Just look at the AMD Fury thread and see how many are upset that the card lacks DVI, which is even older than HDMI. Just because you don't use it doesn't mean that the need or want for it is gone.
Moderator
Posts: 30226
Joined: 2007-09-19
30 Hz, lol, that's crap.
Junior Member
Posts: 15
Joined: 2014-08-29
I think this is a really bad move on AMD's part.
Look at the sales charts. There are A LOT of 4K TVs being sold right now, and most of the 4K content is on PC, not on 4K Blu-Ray or Smart TV networks yet.
Watching 60Hz content at 4K 30Hz on HDMI 1.4 is crappy, I know from experience.
I bought a GTX 980 for the PC attached to my Vizio P702ui-B3 and it's great. Controller based games that aren't very demanding look great. Video looks great. If the Fury X was out when I got it, I probably would have bought it instead if it had HDMI 2.0 (Dust is a problem in media center PCs and AMD's radiator paired with an easily accessible filter would have been a great solution).
I wonder how much cost is saved by not using HDMI 2.0; I bet the lost revenue amounts to more.
Senior Member
Posts: 3655
Joined: 2007-05-31
DP is very versatile and does everything that DVI does (DL-DVI and DVI-I included), but I agree with those who complain that there is no HDMI 2.0. This card is the flagship of the brand... it's not a mid-range part, and those who buy it are very particular about the specs they want.
Lots of people play on their TV, and DP is not on their screen (TVs are also less expensive than computer monitors at these resolutions, only the refresh rates are a lot lower), so they can only use HDMI.
Btw, I was expecting the same from NVIDIA.
Senior Member
Posts: 11808
Joined: 2012-07-20
AMD promotes AdaptiveSync; by the time they get it into the HDMI standard, or hack it in, they'll bring their cards up to that standard.
I personally preferred DL-DVI over HDMI since image quality is the same and I use separate sound.
(And I was not happy that DP did not manage to kill HDMI, but that is the interface which matters to me now.)
DP all the way; if it can do 1440p @ 144 Hz then it is good enough.