AMD Radeon Fury X doesn't have HDMI 2.0 support
Fox2232
AMD promotes Adaptive-Sync; once they get it into the HDMI standard, or hack it in, they'll bring their cards up to that standard.
I personally preferred DL-DVI over HDMI, since image quality is the same and I use separate audio.
(And I was not happy that DP did not manage to kill HDMI, but DP is the interface that matters to me now.)
DP all the way; if it can do 1440p @ 144Hz then it is good enough.
pimp_gimp
You do realize that even if DP had managed to kill HDMI, there would be many upset users. Many people who use HDTVs as a monitor (or second screen) probably still connect to an older HDTV that supports 1080p and is HDMI-only (or VGA/DVI). DisplayPort is only supported on some UHD 4K screens. Just look at the AMD Fury thread and see how many are upset that the card lacks DVI, which is even older than HDMI. Just because you don't use it doesn't mean the need or want for it is gone.
WhiteLightning
Moderator
30Hz? Lol, that's crap.
raptor15sc
I think this is a really bad move on AMD's part.
Look at the sales charts. There are A LOT of 4K TVs being sold right now, and most of the 4K content is on PC, not on 4K Blu-Ray or Smart TV networks yet.
Watching 60Hz content at 4K 30Hz on HDMI 1.4 is crappy, I know from experience.
I bought a GTX 980 for the PC attached to my Vizio P702ui-B3 and it's great. Controller based games that aren't very demanding look great. Video looks great. If the Fury X was out when I got it, I probably would have bought it instead if it had HDMI 2.0 (Dust is a problem in media center PCs and AMD's radiator paired with an easily accessible filter would have been a great solution).
I wonder how much cost is saved by not using HDMI 2.0 -- I bet lost revenue is more.
rl66
DP is very versatile and does everything that DVI does (DL-DVI and DVI-I included), but I agree with those who complain that there is no HDMI 2.0. This card is the flagship of the brand; it's not mid-range, and those who buy it are very particular about the specs they want.
Lots of people play on their TV, and DP is not on their screen (TVs are also less expensive than computer monitors at these resolutions, only the refresh rates are a lot lower), so they can only use HDMI.
BTW, I was expecting the same from NVIDIA.
nohdmi2
The right question is: which video card has true HDMI 2.0?
Only one problem: no graphics card in 2015 has true HDMI 2.0.
Everything about NVIDIA's claim is cheating, with false and fake tests.
NVIDIA does not support HDMI 2.0.
NVIDIA does not support 18 Gbps; to process 18 Gbps you need 1.8 times the power.
NVIDIA has sold tens of thousands of video cards to customers with the promise of HDMI 2.0. It's a lie! They do not have HDMI 2.0.
They do not support DCI-P3, because they do not have enough bandwidth: only 10.2 Gbps, like HDMI 1.4.
They do not support HDR, because they do not have enough bandwidth: only 10.2 Gbps, like HDMI 1.4.
Some people brought up tests claiming NVIDIA supports HDMI 2.0. That is the wrong test; it does not test bandwidth.
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications
Even the GTX 960 has HDMI 2.0.
Spets
AMD just keeps on making bad decisions.
Ourasi
A BizLink DisplayPort-to-HDMI 2.0 4K@60Hz adapter was shown at CES, so if you absolutely need HDMI 2.0 because you already have a 4K TV set with HDMI 2.0, there is a way, or at least there will be very soon.
The total mess-up of the HDMI 2.0 standard is horrific: complaints everywhere, uncompressed color formats failing with HDCP 2.2, etc. I'll never buy a UHD TV without DisplayPort, ever.
Fox2232
nevcairiel
CDJay
That is completely and utterly *insane*.
Good luck getting that into any Steam Boxes or HTPCs.
Fox2232
TOM's-HW
AMD has 4:2:0 available too if someone wants to use it, I never did.
There are limits and limits. Those DVI/HDMI links can be overclocked a bit, and NVIDIA apparently used this in the past:
rl66
AlmondMan
Buy a dongle and win?
leszy
"In any case, with 4:2:0 4K TVs already on the market, NVIDIA has confirmed that they are enabling 4:2:0 4K output on Kepler cards with their R340 drivers. What this means is that Kepler cards can drive 4:2:0 4K TVs at 60Hz today, but they are doing so in a manner that’s only useful for video. For HTPCs this ends up being a good compromise and as far as we can gather this is a clever move on NVIDIA’s part. But for anyone who is seeing the news of NVIDIA supporting 4K@60Hz over HDMI and hoping to use a TV as a desktop monitor, this will still come up short. Until the next generation of video cards and TVs hit the market with full HDMI 2.0 support (4:4:4 and/or RGB), DisplayPort 1.2 will remain the only way to transmit a full resolution 4K image."
http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
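The bandwidth gap the quote describes can be sanity-checked with a back-of-envelope calculation. This is only a sketch, assuming the standard CTA-861 4K60 timing (4400x2250 total pixels including blanking), 8 bits per component, and HDMI's 8b/10b TMDS coding:

```python
# Rough TMDS bandwidth check for 4K60 over HDMI.
# Assumptions: CTA-861 4K60 timing (4400 x 2250 total pixels, 594 MHz
# pixel clock), 8 bits per component, 8b/10b TMDS coding, 3 data channels.

def tmds_gbps(pixel_clock_hz, bits_per_component=8):
    """Raw TMDS bit rate across the three data channels, in Gbit/s."""
    # 8b/10b coding expands every 8-bit symbol to 10 bits per channel.
    return pixel_clock_hz * 3 * bits_per_component * 10 / 8 / 1e9

# 4K60 RGB / 4:4:4 uses the full 594 MHz pixel clock.
full = tmds_gbps(4400 * 2250 * 60)
print(round(full, 2))        # ~17.82 Gbit/s: needs HDMI 2.0 (18 Gbit/s)

# 4:2:0 halves the data per pixel, so the effective clock drops to 297 MHz.
subsampled = tmds_gbps(4400 * 2250 * 60 / 2)
print(round(subsampled, 2))  # ~8.91 Gbit/s: fits HDMI 1.4 (10.2 Gbit/s)
```

Which matches the article: 4:2:0 squeezes 4K@60Hz under HDMI 1.4's 10.2 Gbit/s TMDS limit, while full 4:4:4/RGB needs the 18 Gbit/s of real HDMI 2.0.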
Oversemper
Then what about the AMD Quantum, for which "we're talking 60 to 90 fps in all games when played at 4K (3840*2160) resolution" and which has a Fury X2? They promote it as a living-room set, right?
Also, maybe Fury's DisplayPort can send (via an adapter) a 4K@60 4:4:4 signal that a TV can correctly recognize?
KissSh0t
The Fury X is an AMD-manufactured card, right? What about MSI / Gigabyte / XFX and the others I can't recall off the top of my head? When they do their own versions, maybe they will put HDMI 2.0 on their cards?
Or won't there be other manufacturers making this one?
NamelesONEMail
Lots of you people don't even come close to having the GPU power to drive 4K. Some of you can't even get 30 fps in games 1-2 years old at any decent quality settings, yet all I see is people complaining about not getting 60Hz over the HDMI port on these cards.
Where's all this elitist c**p coming from? Why should they bother implementing something that less than 0.1% of people use (and I'm sure it's much less than 0.1%)? If a product is good for 99.9% of people, that's all that matters; those 0.1% don't matter.
The majority of people who buy a 4K TV buy it to watch TV (upscaled Blu-ray looks great), not to connect it to a PC, so I doubt they care much about the few who actually DO connect them to PCs.
I seriously doubt they will be weak by any stretch of the imagination. I have a GTX 780 that drives most games maxed out at 1440p above 45-50 fps, and it's a fairly old card at this point.
If 700+ euro cards were considered weak 1-2 years down the road, there would be little reason to buy one unless you were swimming in cash; it would be much better to get a cheaper card and upgrade more often.
Fox2232