AMD Radeon Fury X doesn't have HDMI 2.0 support

https://forums.guru3d.com/data/avatars/m/243/243702.jpg
AMD promotes AdaptiveSync; once they get it into the HDMI standard, or hack it in, they'll bring their cards up to that standard. I personally preferred DL-DVI over HDMI since the image quality is the same and I use separate sound. (And I was not happy that DP did not manage to kill HDMI, but that is the interface which matters to me now.) DP all the way; if it can do 1440p @ 144Hz then it is good enough.
https://forums.guru3d.com/data/avatars/m/79/79987.jpg
You do realize that even if DP had managed to kill HDMI, there would be many upset users. Many people who use an HDTV as a monitor (or second screen) probably still connect to an older 1080p set that is HDMI only (or VGA/DVI). DisplayPort is only found on some UHD 4K screens. Just look at the AMD Fury thread and see how many are upset that the card lacks DVI, which is even older than HDMI. Just because you don't use it doesn't mean the need or want for it is gone.
https://forums.guru3d.com/data/avatars/m/180/180832.jpg
Moderator
30Hz? lol, that's crap
data/avatar/default/avatar29.webp
I think this is a really bad move on AMD's part. Look at the sales charts. There are A LOT of 4K TVs being sold right now, and most of the 4K content is on PC, not on 4K Blu-Ray or Smart TV networks yet. Watching 60Hz content at 4K 30Hz on HDMI 1.4 is crappy, I know from experience. I bought a GTX 980 for the PC attached to my Vizio P702ui-B3 and it's great. Controller based games that aren't very demanding look great. Video looks great. If the Fury X was out when I got it, I probably would have bought it instead if it had HDMI 2.0 (Dust is a problem in media center PCs and AMD's radiator paired with an easily accessible filter would have been a great solution). I wonder how much cost is saved by not using HDMI 2.0 -- I bet lost revenue is more.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
DP is very versatile and does everything DVI does (DL-DVI and DVI-I included), but I agree with those who are unhappy that there is no HDMI 2.0. This card is the flagship of the brand, not the middle segment, and the people who will buy it are very precise about the specs they want. A lot of people play on their TV, which has no DP (and TVs are less expensive than computer screens at these resolutions, only the refresh rates are a lot lower), so they can only use HDMI. BTW, I was expecting the same from NVIDIA.
data/avatar/default/avatar32.webp
The right question is: which video card has true HDMI 2.0? Only one problem: there are no graphics cards with full HDMI 2.0 in 2015. The AMD Radeon Fury X doesn't have HDMI 2.0 support, and everything else is cheating, with false and fake tests. NVIDIA doesn't support HDMI 2.0 either: they don't support 18 Gbps, and to process 18 Gbps you need 1.8 times the power. NVIDIA has sold tens of thousands of video cards to customers with a promise of HDMI 2.0. It's a lie! They don't have HDMI 2.0. NVIDIA doesn't support DCI-P3 or HDR because they don't have enough bandwidth, only 10.2 Gbps, like HDMI 1.4. Some people bring up tests claiming NVIDIA supports HDMI 2.0, but those tests are wrong; they do not test bandwidth.
https://forums.guru3d.com/data/avatars/m/230/230424.jpg
The right question is: which video card has true HDMI 2.0? Only one problem: there are no graphics cards with full HDMI 2.0 in 2015. The AMD Radeon Fury X doesn't have HDMI 2.0 support, and everything else is cheating, with false and fake tests. NVIDIA doesn't support HDMI 2.0 either: they don't support 18 Gbps, and to process 18 Gbps you need 1.8 times the power. NVIDIA has sold tens of thousands of video cards to customers with a promise of HDMI 2.0. It's a lie! They don't have HDMI 2.0. NVIDIA doesn't support DCI-P3 or HDR because they don't have enough bandwidth, only 10.2 Gbps, like HDMI 1.4. Some people bring up tests claiming NVIDIA supports HDMI 2.0, but those tests are wrong; they do not test bandwidth.
Even the GTX 960 has HDMI 2.0: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
AMD just keeps on making bad decisions.
data/avatar/default/avatar19.webp
A BizLink DisplayPort to HDMI 2.0 4K@60Hz adapter was shown at CES, so if you absolutely need HDMI 2.0 because you already have a 4K TV set with HDMI 2.0, there is a way, or at least there will be very soon. The total mess-up of the HDMI 2.0 standard is horrific: complaints everywhere, uncompressed color formats failing with HDCP 2.2, etc. I'll never buy a UHD TV without DisplayPort, ever.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Even the GTX 960 has HDMI 2.0: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications
I think that is what he is trying to say... His statement, shortened: NVIDIA states that its cards have HDMI 2.0, but in reality they don't have all the HDMI 2.0 capabilities to drive every HDMI 2.0 feature. (Not that it would be the first time NVIDIA declared something that is not quite true. Even the DX12_1 feature set has to be tested first to see if it is real HW or just a driver-level = SW implementation.) But I think that by the time DX12_1 goes live in games, even the Fury X/Titan X will be kind of weak.
data/avatar/default/avatar32.webp
I think that is what he is trying to say... His statement, shortened: NVIDIA states that its cards have HDMI 2.0, but in reality they don't have all the HDMI 2.0 capabilities to drive every HDMI 2.0 feature.
No GPU ever supported all HDMI features. CEC, anyone? Yet NVIDIA does 4K @ 60Hz (at 4:4:4, of course) on the 900 series, which is the important part, and contrary to what that guy says, it requires about 12 Gbps, which wouldn't work over an HDMI 1.4 link (which only does 10.2 Gbps). Many professional review sites have confirmed this working, so don't take the word of some anonymous online troll for it.
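Those figures are easy to sanity-check yourself. The sketch below is a back-of-the-envelope calculation over active pixels only, ignoring blanking intervals and link-encoding overhead, so the real link has to carry somewhat more than these numbers:

```python
# Rough data-rate check for 4K@60Hz over HDMI. Active pixels only;
# blanking and 8b/10b encoding overhead are ignored, so these are
# lower bounds on what the link must actually carry.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed active-video data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 8-bit RGB / 4:4:4 is 24 bits per pixel; 4:2:0 subsampling averages
# 12 bits per pixel because four pixels share one chroma sample pair.
full_4k60 = data_rate_gbps(3840, 2160, 60, 24)  # ~11.9 Gbps
sub_4k60  = data_rate_gbps(3840, 2160, 60, 12)  # ~6.0 Gbps

HDMI_1_4_TMDS = 10.2  # Gbps, total HDMI 1.4 TMDS throughput

print(full_4k60 > HDMI_1_4_TMDS)  # True: 4:4:4 4K60 exceeds HDMI 1.4
print(sub_4k60 < HDMI_1_4_TMDS)   # True: 4:2:0 4K60 squeezes under it
```

This is exactly why ~12 Gbps 4:4:4 output cannot be a rebranded HDMI 1.4 link, while the 4:2:0 mode discussed further down can fit one.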
data/avatar/default/avatar18.webp
That is completely and utterly *insane*. Good luck getting that into any Steam Boxes or HTPCs.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
No GPU ever supported all HDMI features. CEC, anyone? Yet NVIDIA does 4K @ 60Hz (at 4:4:4, of course) on the 900 series, which is the important part, and contrary to what that guy says, it requires about 12 Gbps, which wouldn't work over an HDMI 1.4 link (which only does 10.2 Gbps). Many professional review sites have confirmed this working, so don't take the word of some anonymous online troll for it.
There are things and then there are things. Those DVI/HDMI links can be overclocked a bit, and NVIDIA apparently used that in the past: Tom's Hardware. AMD has 4:2:0 available too if someone wants to use it; I never did.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
The right question is: which video card has true HDMI 2.0? Only one problem: there are no graphics cards with full HDMI 2.0 in 2015. The AMD Radeon Fury X doesn't have HDMI 2.0 support, and everything else is cheating, with false and fake tests. NVIDIA doesn't support HDMI 2.0 either: they don't support 18 Gbps, and to process 18 Gbps you need 1.8 times the power. NVIDIA has sold tens of thousands of video cards to customers with a promise of HDMI 2.0. It's a lie! They don't have HDMI 2.0. NVIDIA doesn't support DCI-P3 or HDR because they don't have enough bandwidth, only 10.2 Gbps, like HDMI 1.4. Some people bring up tests claiming NVIDIA supports HDMI 2.0, but those tests are wrong; they do not test bandwidth.
... of course, if you read ONLY the AMD tests... On the pro side it has supported 4K since shortly after release, and on the normal consumer side, as already stated, even the 960 is capable (despite not being intended for it; I guess it might be horrible except in video playback, being a middle/low-segment card)... As for power and bandwidth, you haven't got the right info...
https://forums.guru3d.com/data/avatars/m/180/180081.jpg
Buy a dongle and win?
https://forums.guru3d.com/data/avatars/m/152/152580.jpg
"In any case, with 4:2:0 4K TVs already on the market, NVIDIA has confirmed that they are enabling 4:2:0 4K output on Kepler cards with their R340 drivers. What this means is that Kepler cards can drive 4:2:0 4K TVs at 60Hz today, but they are doing so in a manner that’s only useful for video. For HTPCs this ends up being a good compromise and as far as we can gather this is a clever move on NVIDIA’s part. But for anyone who is seeing the news of NVIDIA supporting 4K@60Hz over HDMI and hoping to use a TV as a desktop monitor, this will still come up short. Until the next generation of video cards and TVs hit the market with full HDMI 2.0 support (4:4:4 and/or RGB), DisplayPort 1.2 will remain the only way to transmit a full resolution 4K image." http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
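The timing arithmetic behind that compromise is simple. The sketch below uses the standard 594 MHz pixel clock for 3840x2160@60Hz and HDMI 1.4's 340 MHz TMDS ceiling (both published spec figures); the calculation itself is just an illustration of why 4:2:0 slips through where 4:4:4 cannot:

```python
# Why the 4:2:0 trick works on HDMI 1.4-era silicon: 3840x2160@60Hz
# normally needs a 594 MHz TMDS clock, but 4:2:0 packs the samples of
# two pixels into one TMDS character, halving the required clock.

TMDS_4K60_444  = 594.0              # MHz, standard 4K60 timing
TMDS_4K60_420  = TMDS_4K60_444 / 2  # 297 MHz after 4:2:0 packing
HDMI_1_4_LIMIT = 340.0              # MHz, max TMDS clock in HDMI 1.4

print(TMDS_4K60_444 <= HDMI_1_4_LIMIT)  # False: full 4:4:4 won't fit
print(TMDS_4K60_420 <= HDMI_1_4_LIMIT)  # True: 4:2:0 slips under
```

That 297 MHz result is why a driver update could enable 4:2:0 4K60 on existing Kepler hardware, while full-fat HDMI 2.0 (4:4:4/RGB at 594 MHz) needs new transmitter silicon.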
data/avatar/default/avatar10.webp
Then what about the AMD Quantum, for which "we're talking 60 to 90 fps in all games when played at 4K (3840*2160) resolution" and which has a Fury X2? They promote it as a living-room set, right? Also, maybe Fury's DisplayPort can send (via an adapter) a 4K@60 4:4:4 signal that can be correctly recognized by a TV?
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
The Fury X is an AMD-manufactured card, right? When MSI / Gigabyte / XFX and the others I can't recall off the top of my head do their versions, maybe they will put HDMI 2.0 on their cards? Or won't there be other manufacturers making this one?
https://forums.guru3d.com/data/avatars/m/247/247729.jpg
Lots of you people don't even come close to having the GPU power to drive 4K ... some of you can't even get 30 fps in games 1-2 years old at any decent quality settings, yet all I see is people complaining about not getting 60Hz over the HDMI port on these cards. Where's all this elitist c**p coming from? ... why should they bother implementing something that less than 0.1% of people use (and I'm sure it's much less than 0.1%)? ... if a product is good for 99.9% of people, that's all that matters ... those 0.1% don't matter ...
I think this is a really bad move on AMD's part. Look at the sales charts. There are A LOT of 4K TVs being sold right now, and most of the 4K content is on PC, not on 4K Blu-Ray or Smart TV networks yet.
The majority of people who buy a 4K TV buy it to watch TV (upscaled Blu-ray looks great), not to connect it to a PC, so I doubt they care much about the few who actually DO connect them to PCs.
But I think that by the time DX12_1 goes live in games, even the Fury X/Titan X will be kind of weak.
I seriously doubt they will be weak by any stretch of the imagination ... I have a GTX 780 that drives most games maxed out at 1440p above 45-50 fps, and it's a fairly old card at this point. If 700+ euro cards were considered weak 1-2 years down the road, there would be little reason to buy one unless you were swimming in cash ... it would be much better to get a cheaper card and upgrade more often.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Then what about the AMD Quantum, for which "we're talking 60 to 90 fps in all games when played at 4K (3840*2160) resolution" and which has a Fury X2? They promote it as a living-room set, right? Also, maybe Fury's DisplayPort can send (via an adapter) a 4K@60 4:4:4 signal that can be correctly recognized by a TV?
If you can afford that kind of performance in such a small form factor, you can afford a 4K TV with DisplayPort.