AMD Radeon R9 Fury X review



Final Words & Conclusion


You know, for AMD the Radeon Fury X is one of the biggest releases ever, and I mean that both figuratively and literally. AMD has been thinking outside the box, and I like that very much. They are the first ever to place HBM onto an enthusiast-class consumer GPU, and it actually works rather nicely. Now, every advantage has a disadvantage, and that rule does apply to the Radeon Fury X, but overall it is an extremely powerful product that can compete with a reference-clocked GeForce GTX 980 Ti quite well. I name that product specifically because AMD is pricing the Radeon Fury X at exactly the same level, so comparing the Titan X to the Fury X is not really in order. The Fiji XT based GPU is a tough GPU to deal with: it is grand in performance, but only starts to show its true power at the most demanding image quality settings and resolutions, and we'll talk about that in a second. I absolutely love the overall design and the small form factor, and granted, the liquid cooler works out really nicely. In the end the performance the Radeon Fury X is able to deliver is exceptionally good, given the right circumstances.

The Fiji GPU

Credit where credit is due. The 20nm fab node should have been here by now, but it failed, so the GPU industry is now waiting on 14nm nodes to become ready for the big-gun GPUs. So what do you do until that happens? Well, both Nvidia and AMD decided to put Godzilla on a chip; for AMD that is Fiji, and the end result is a GPU with close to 9 billion transistors (and that's excluding the HBM), which is HUGE. The complete GPU-plus-HBM package measures roughly 5x5 cm; imagine that at 14nm, it would be roughly half the size with a lower voltage requirement. Regardless of the size and complexity on the 28nm node AND the HBM added onto the package, you have to bow to AMD as they did manage to create a chip as powerful as Fiji is with its 4096 shader processors. Despite its size, AMD also managed to keep power consumption under control; it's roughly similar to the R9 290X / 390X. And that is impressive, as you need to account for the liquid cooling system in there as well. Overall, hats off and kudos to AMD for Fiji. I've had it in my hands, and even just looking at it (see photo below) it is an impressive thing.


[Photo: the Radeon Fury's Fiji GPU]


Cooling & Noise Levels

So, Fiji is a big chip. Big chips produce heat when clocked high, and at 1050 MHz it is clocked high. The challenge at hand is that with HBM the PCB can be much smaller and thus cheaper to produce, but the downside is that you have far less surface area for a heatsink to cool the GPU. There will be a regular Fury with an air-based cooler, but for their top-of-the-line product the proper way to go for AMD was liquid cooling. And it does work well. The liquid cooler is self-contained; you screw it into the system with four screws and that's really it, you do not need to worry about wires etc., it's all already hooked up to the graphics card. The end result is a product package that keeps the graphics processor at a maximum of 50 degrees C, and that really works out well. You have seen the thermal imaging results; that's picture-perfect stuff. Noise levels, then, are quite okay: once housed inside a PC you can barely hear the fan. There is one slight concern though, as there is more noise coming from the graphics card itself than from the fan. Some coil noise is audible, though it is mostly dulled by the card's cover/casing. Not irritating at all, but an audiophile might get annoyed by it. Our overall dBA readings are OK, in line with a normal PC making normal noises. I cannot qualify the Fury X as a silent solution, that's the honest truth, but it certainly isn't loud either; normal is the word for it. BTW, once you crank up the fan RPM (it stays fairly quiet) you can even chill the GPU down towards 40 degrees C. Please do make sure you have proper ventilation inside your chassis, as the liquid cooler's radiator relies on the ambient air temperature inside it.


[Photo: the AMD Radeon R9 Fury X]

Aesthetics

The looks might have been a big challenge for AMD, but they succeeded. The card itself is small, close to Mini-ITX form factor, and looks tiny and cute inside any PC. The coloring is dark with red accents, with the subtle LED-lit Radeon logo at the top of the card. The phase LEDs might be a little bright, but they are color-configurable and, for whatever reason, I do seem to like them. The Fury X is covered from all sides; you will not see any exposed components. Obviously, as small as the card itself is, there is a liquid cooling unit attached to it that you will need to house. But again, it is all black, including the (sleeved) tubing and wiring, so the total package will look nice in any PC as far as I am concerned.

Power Consumption

The overall power consumption is spicy; we rate the card at a 275 to 300 Watt TDP, depending on the workload. Compared to the last-gen 290X and 390X, that is roughly the same power draw, so while it's a substantial wattage, the performance-per-watt ratio has improved significantly given that those cards use the same amount of power. Again, with such a huge chip and the included liquid cooling (which draws power as well), things are not looking bad at all. That TDP will make running multi-GPU solutions a bit more complicated; with two cards in Crossfire mode we think an 850~900 Watt PSU would be sufficient. So yeah, it's not great to have a GPU consuming this much power, but it could have been a lot worse, and I am good with this.
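As a quick sanity check on that PSU figure, here is a rough back-of-the-envelope sketch in Python. The per-card number is the 300 Watt upper end of our TDP estimate; the CPU, rest-of-system and headroom figures are assumptions picked purely for illustration.

```python
# Rough PSU sizing for a two-card Crossfire build.
# gpu_tdp comes from the review's 275-300 W estimate; cpu_tdp, rest and
# headroom are illustrative assumptions, not measured values.

def recommended_psu_watts(gpu_tdp=300, gpu_count=2, cpu_tdp=125, rest=50, headroom=0.15):
    """Sum worst-case component draw and add headroom so the PSU is not
    pinned at 100% load while gaming."""
    system_draw = gpu_tdp * gpu_count + cpu_tdp + rest
    return system_draw * (1 + headroom)

print(round(recommended_psu_watts()))  # -> 891, in line with an 850~900 W unit
```

Under those assumptions you land right inside the 850~900 Watt bracket; a hungrier CPU or a heavy overclock would obviously push that figure up.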

Game Performance

The Radeon Fury X is a beast, but it needs the right circumstances to really shine. We think a driver tweak or two is still in order, as up to a monitor resolution of 2560x1440 the performance definitely is grand, but it lacks a little here and there compared to what the competition is showing; it looks and feels like driver API overhead. The thing is, once you pass 2560x1440 and head onwards to Ultra HD, that's where the card all of a sudden becomes very competitive and starts to really shine. As such, we rate this card as a genuinely viable Ultra HD product. At lower resolutions the card was often a notch slower, but still... you are looking at 120 FPS instead of 135 FPS, and I doubt that would be a real hindrance for the true AMD Radeon aficionado. In most scenarios (depending on how demanding the game is) the Radeon Fury X will perform close to the GeForce GTX 980 Ti, with the usual exceptions here and there. And once Ultra HD kicks in, things equalize or improve real fast at very acceptable framerates.

That leaves open the 4GB discussion. In Ultra HD you are bound to fill up graphics memory fairly fast. Remember though, most of the data that resides in graphics memory is cached data, there to prevent things like swapping textures in from your HDD/SSD. Not all cached data is actually used, and in most scenarios running out of graphics memory simply does not matter. It does start to matter when you use Ultra HD texture packs and exotic AA modes. If you want 8x MSAA, sure, render targets and assets will start being shuffled back and forth in graphics memory, which would normally cause a slowdown. However, with HBM you have extremely low-latency graphics memory that is seriously fast, so when data is moved around in the framebuffer it gains from the increased bandwidth, and that can alleviate the performance loss you'd normally see. It is a bit brute force, but everything about Fiji is just that, brute force performance.
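To put that 4GB discussion into perspective, here is a rough sketch of how much memory the render targets themselves occupy at Ultra HD. The buffer layout below (triple-buffered RGBA8 color plus one 32-bit depth/stencil target, with optional 8x MSAA surfaces) is purely an illustrative assumption; real engines allocate memory very differently.

```python
# Rough render-target arithmetic at Ultra HD (3840x2160), to put the 4 GB
# figure in perspective. The buffer counts below are assumptions for
# illustration, not how any specific game actually allocates memory.

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # 32-bit color or depth/stencil

def megabytes(buffers, msaa=1):
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * buffers * msaa / (1024 ** 2)

plain = megabytes(buffers=4)                  # triple-buffered color + depth/stencil
msaa8 = plain + megabytes(buffers=2, msaa=8)  # add 8x multisampled color + depth targets

print(f"plain render targets: ~{plain:.0f} MB")   # ~127 MB
print(f"with 8x MSAA targets: ~{msaa8:.0f} MB")   # ~633 MB
```

Even with generous multisampled targets the render targets themselves stay well under a gigabyte; the bulk of the 4 GB is textures and cached assets, which is exactly the data that can be dropped or re-streamed when memory gets tight.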

So, I feel the 4 GB definitely is sufficient, and the numbers back that up. But sure, to be a little more secure at Ultra HD, memory-wise I would have liked to have seen a 6GB or 8GB model. That probably would have been too complicated in terms of interposer wiring and expense, so we can understand the decision made. But hey, performance-wise there's really not one game that won't run seriously well at Ultra HD, and that remains a fact.

No HDMI 2.0 and no DVI

We just have to cover this: once the news got out that HDMI 2.0 is not supported and that the card does not have a DVI connector, things got feisty in our forums, with some people not caring, some liking the choices and others outright hating them. My take on this, then: the Fury X is AMD's most modern and advanced graphics card ever made and it comes with a very hefty price tag, so I feel HDMI 2.0 (60 Hz Ultra HD) should have been implemented. The Fury X is a gaming graphics card that will mostly end up in PCs, so there we're all good, right? However, AMD is to release small form factor Fury based products, and these are intended for the living room. In the living room you'll have that nice Ultra HD telly, and these do not have DisplayPort; they require HDMI 2.0 for 60 Hz Ultra HD. AMD sticks to HDMI 1.4a, meaning only 30 Hz is supported at Ultra HD. For movies that's not a big deal, but we do not believe for a second that you'd buy a Fiji based setup for movies; this is all about gaming at Ultra HD, and that's where a limit of 30 Hz / 30 FPS kicks in. It's just too low, and disable VSYNC and it'll be a tearing freak fest. So in short, for a gaming PC this is all fine through DisplayPort, but in a living room situation HDMI will fall short.

DVI then. AMD opted to provide three DisplayPort connectors and one HDMI, which means that if you have a monitor with only dual-link DVI (and a lot of people have these) then you can't connect that monitor to the card. You could use an HDMI to DVI cable, but that would be single-link, and your 2560x1440 monitor would be forced down to 1920x1080 or, best case scenario, 1920x1200. There are complicated active adapters available, but you'd probably be better off buying a new monitor. There is definitely enough space at the backside of the Fury X for a DVI connector, but perhaps the display engine did not allow for a fifth output, I dunno. This, however, is problematic for end-users with a monitor that only has a dual-link DVI connector.
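The math behind both complaints comes down to pixel clocks. As a rough sketch, the snippet below compares the commonly quoted link limits against the display modes in question; the exact figures depend on the blanking/timing standard used, so treat them as approximate.

```python
# Rough pixel-clock arithmetic behind the HDMI 2.0 / DVI complaints.
# Link limits and mode clocks are the commonly quoted figures (CTA-861
# timings for the HDMI modes, CVT-RB for 1440p); treat them as approximate.

LINK_LIMITS_MHZ = {
    "HDMI 1.4 (what Fury X ships with)": 340,
    "HDMI 2.0 (what a 4K60 TV needs)": 600,
    "single-link DVI (passive HDMI adapter)": 165,
}

MODE_CLOCKS_MHZ = {
    "3840x2160 @ 30 Hz": 297.0,
    "3840x2160 @ 60 Hz": 594.0,
    "2560x1440 @ 60 Hz": 241.5,
    "1920x1080 @ 60 Hz": 148.5,
}

for mode, clock in MODE_CLOCKS_MHZ.items():
    fits = [name for name, limit in LINK_LIMITS_MHZ.items() if clock <= limit]
    print(f"{mode}: {', '.join(fits) if fits else 'none of the listed links'}")
```

In short: Ultra HD at 30 Hz squeezes through HDMI 1.4, 60 Hz does not, and a passive DVI adapter cannot even carry 2560x1440; only the card's DisplayPort outputs (or an active adapter) sidestep all of this.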

Overclocking

Overclocking then. Third-party tools currently do not offer full tweaking support for the Radeon Fury X; it is simply too new a product. Current limitations include the lack of voltage adjustment for both the GPU and memory, as well as the inability to change the HBM frequency. In the future we'll have to see how that support pans out. GPU voltage control is going to matter here though, as we could only mildly tweak the GPU to 1125 MHz (+75 MHz); after that it started to become unstable. Whether or not you need to tweak the memory... well, with that much bandwidth at hand you have to wonder if it would even make a difference. As always, overclocking results can vary per production batch, your PC configuration and your cooling.

 


Concluding

Alright, it's time to wrap things up. AMD unleashed Fury, and that pretty much sums things up. The Radeon R9 Fury X is a little beast with a lot of the variables in the right places: great performance, good frametime results, fairly silent, and a nice design combined with a solid cooling solution. That said, we do hope to see some driver tweaks that lift game performance in the Full HD to WQHD domain a bit more, as getting as close as possible to the reference GTX 980 Ti is what this release is all about. If you plan to go for an Ultra HD build, then the Radeon R9 Fury X could be a feisty match alright. I am still in doubt about the 4GB of memory though; yes, it works fine and fast at Ultra HD, but I do feel this 649 USD product would be a little more future-proof with 6GB of graphics memory. We understand AMD's decision making here, but in the end the consumer does not care whether that memory is HBM or sits on a PCB that is a little longer; it is the end result plus future-proofing that counts.

We like the liquid cooling: no longer do we have a reference product that runs at 95 degrees C, this puppy will run at 50 degrees C at its default settings. If you purchase the product, here's a small tip: add another silent fan in a push/pull setup and it might shave off a few extra degrees. Fluid gameplay is what you get in return whilst you enable the most intensive image quality settings, and isn't that what PC gaming is all about? The final MSRP of this product is 649 USD and, depending on volume and availability, prices are going to vary here and there in the long run. It is a lot of money, sure. But the enthusiast space has always been expensive, and considering this product is lined up against the GeForce GTX 980 Ti, the pricing makes a lot of sense.

Overall we can certainly recommend the Radeon R9 Fury X if it fits your budget. For 649 USD you receive a product with liquid cooling factory installed; that right there is 100 bucks by itself and something I can only applaud. There are some small oversights, like the missing HDMI 2.0 and the lack of a DVI connector, but 'nuff said about that. If AMD can bring performance up with a few driver tweaks at the lower resolutions, they will have an extremely competitive card on their hands. It might not be perfect or a competition-slaughtering product at release, but it is close enough. We like it very much.

