Asus GeForce RTX 2080 Ti ROG Strix review



Conclusion


While testing the Strix you will have noticed that the product performs extremely close to the Founders Edition. New this year is that the standard Founders Edition clock frequencies sit closer to board partner tweaks and settings. In that sense, you could say that NVIDIA is more actively competing with its own board partners, which is pretty freaky when you think about it. That said, the STRIX ships with an incredibly shy factory tweak, running a mere 15 MHz higher on the boost clock. ASUS does, however, apply a little more on the power limiter, giving the card an overall higher boost frequency; we see this product sit in the 1800~1850 MHz range depending on game title and limiters. The new STRIX revision has grown thicker and longer, and the cooling on this puppy is simply badass, but we'll talk about that in a second. Right now, all registers turn green for me: it comes with an excellent, silent cooling solution, the looks are familiar but nice (albeit subjective, of course), and it has a very nice LED implementation. The reality remains that the performance differences between all tweaked 2080 Ti models are going to be negligible.

DXR gaming and DLSS

So yes, I would have loved to show you some DXR (raytracing) performance, but all that NVIDIA provided was the internal Star Wars Reflections demo you've probably already seen, which is lovely to look at but simply irrelevant to benchmark. It was also very cool that UL contacted us and allowed us to test and showcase where they are at with their raytracing demo; you've seen an exclusive there, though it is a work in progress. So that's it for raytracing for now. Once we actually see games release with it, and once Microsoft releases DXR (in the fall update), we'll create additional content to show you the potential of the new RT cores, because really, this release is all about the RT cores, my jolly good friends. I am not dismissing the RT cores here; honestly, I am super excited about the technology and what it can do for image quality in gaming. It's just that I objectively cannot comment or give you advice based on something we cannot test in an actual game (which is what this is all about, eh?).

Deep Learning Super Sampling: totally lovely, really. But here again we've been dealing with the fact that there is no software to test aside from precisely two titles; you've seen the results and image quality in this article with the help of Ultra HD videos. I'll say this though: despite the little we've been able to visually inspect and measure, we are impressed by the technology. When you use DLSS, you no longer need to enable AA in your shader engine, as DLSS takes care of that for you on the Tensor cores; that work is completely offloaded, which brings in a good chunk of additional performance you'd normally miss out on. Even in its current state, I can confirm the technology is impressive. More relevant still is the Epic Infiltrator DLSS demo I showed you; I mean, that's really good, right? Plus it simply runs much faster compared to TAA, so that's a win-win right there. And with that out of the way, I need to close out the primary new features, that DNA that is RTX and AI, and turn back to traditional rasterized shading.

Gaming performance

Face it, the RTX 2080 Ti still is a beast. We're seeing some caps at lower-resolution gaming, but for Ultra HD, the next step in performance has been made. Depending on title and workload, you'll see 25% to maybe even 40% performance increases at that resolution. The RTX 2080 Ti really starts to flex its muscles at 2560x1440; anything below that is normalized due to driver and processor bottlenecks. We cannot think of a game that won't run really well at the best image quality settings. You should pair this card with at least a 24" monitor, of course; 2560x1440/1600 or better would be a nice fit. The 11 GB of graphics memory is excellent, making the product very future proof. In terms of multi-GPU setups, Turing will support 2-way SLI, but really, that's it. NVIDIA did not have the new NVLink connector available, so we've not been able to test it.

Aesthetics

The STRIX still is a very nice looking card, though it is slowly becoming more of the same look. There's little more to say, really. I like the dark design and, sure, the LEDs. All the extra RGB LED functionality, like the RGB LED strip connector, is not needed for me, but hey: haters will hate, lovers will love it. On the backside you will now find a switch that allows you to completely disable the LED system. Nice to see is the meshed back-plate, with openings in some areas (GPU/VRM) for venting; I'd like to see more vents at the GPU position, though. I remain skeptical about back-plates, as they can potentially trap heat and thus warm up the PCB. The flip side is that they can protect your PCB and components from damage and, well, they look nice, so they have an aesthetic appeal as well. Consumer demand is always decisive, and you guys clearly like graphics cards with back-plates. Both the front IO plate and back-plate are matte black, which certainly gives the card that premium feel.



Cooling & Noise Levels

Excellent: that is the word I am starting with. You can opt for a performance or a silent mode. Both modes offer the same game performance; however, performance mode topped out at 63 degrees C and silent mode at roughly 73 degrees C. Whatever your preference is, you'll be fine with either. In performance mode you sit in the 39~40 dBA range, which means you can slightly hear the product, but it is still considered silent. In silent mode, however, we measured just 33 dBA, and for a flagship GPU that is impressive, as that is totally silent. So yeah, while the card might be a bit thicker due to the added radiator surface area, it cools the TU102 really well, plus you get options; we like that.

Power Consumption

Any TU102 Turing GPU, and thus any graphics card based on one, is rated at roughly 260 Watts TDP under full stress; our measurements show it to be in the 275 Watt range when gaming. We think a 650 Watt PSU would be a nice match for these cards. Remember, when purchasing a PSU, aim to double up in wattage, as your PSU is most efficient when it is at around 50% load. Again, keep in mind that we measure peak power consumption; average power consumption is a good notch lower, depending on GPU utilization. Also, if you plan to overclock the CPU, memory and/or GPU with added voltage, please do purchase a power supply with enough reserve. People often underestimate this, but if you tweak all three aforementioned variables, you can easily add 150 Watts to your peak power budget, as increasing voltages and clocks increases power consumption.
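The sizing rules above can be sketched in a few lines. The only measured figure here is the card's ~275 Watt gaming peak; the rest-of-system budget is an assumption for illustration, so adjust it for your own rig.

```python
# Rough PSU sizing sketch based on the rules in this review.
# GPU_PEAK_W is the measured figure; SYSTEM_REST_W is an assumed,
# deliberately modest budget for the rest of the system.
GPU_PEAK_W = 275      # peak gaming draw measured for this card
SYSTEM_REST_W = 50    # assumed CPU/board/drives budget -- adjust for your rig
OC_EXTRA_W = 150      # extra headroom when tweaking CPU, memory and GPU voltages

def recommended_psu(peak_w: int) -> int:
    # A PSU is typically most efficient around 50% load, so double the peak draw.
    return 2 * peak_w

peak = GPU_PEAK_W + SYSTEM_REST_W
print(recommended_psu(peak))                # 650 -- in line with the 650 W suggestion
print(recommended_psu(peak + OC_EXTRA_W))   # 950 -- with a heavy all-round overclock
```

The point is not the exact numbers but the two habits: double your peak draw for efficiency, and budget extra wattage before you start raising voltages.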

Overclocking

We like the new OC Scanner that you will see throughout most GPU tweak utilities; for us, that is Afterburner (download). While it only overclocks your GPU a notch, it does create a reliable tweaking curve. A combination of memory, power and voltage settings will bring you a proper overclock, and once you've applied it, you get more performance. The real gurus, of course, will likely prefer a manual tweak, and here we cannot complain either. We gained an extra 135 MHz on the clock frequency, and with the increased power limiter you'll now see that dynamic boost frequency hovering at and over the 2050 MHz mark. Mind you, that frequency can be higher or lower depending on game and benchmark title. The memory clock tweak was rather good: a full +1000 MHz, double that for the double data rate and, yes, we had 16 Gbps running stable, which is still nice when you think about it for a few seconds. In closing, the new OC Scanner functionality is exciting. Honestly, it probably needs a bit more development and fine-tuning, but I can see this becoming the primary methodology for automated overclocking in the future. Right now it is a little conservative, but yes, promising.
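For anyone puzzling over how +1000 MHz becomes 16 Gbps, here is the arithmetic spelled out. It follows the framing used above: the 2080 Ti's GDDR6 ships at a 7000 MHz data clock (14 Gbps effective), the offset is applied to that clock, and the double-data-rate transfer doubles it again.

```python
# Sketch of the memory overclock arithmetic: stock data clock plus the
# applied offset, doubled because GDDR6 transfers data on both clock edges.
BASE_DATA_CLOCK_MHZ = 7000   # stock GDDR6 data clock on the RTX 2080 Ti (14 Gbps effective)
OFFSET_MHZ = 1000            # the +1000 MHz tweak applied in this review

def effective_gbps(data_clock_mhz: int, offset_mhz: int) -> float:
    # Two transfers per clock cycle, then convert MHz -> Gbps per pin.
    return (data_clock_mhz + offset_mhz) * 2 / 1000

print(effective_gbps(BASE_DATA_CLOCK_MHZ, OFFSET_MHZ))  # 16.0 (Gbps)
```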

 


 

Concluding

The new 11 GB RTX 2080 Ti STRIX offers a small step up in performance over NVIDIA's Founders Edition; it really is a small step. However, the cooler is much better, and we think the VRM area is cooled really well too. We expected a better factory tweak, as these cards can easily be configured at a 1700 MHz boost, but honestly, I have a gut feeling that value will be changing. We expect the card to retail in the 1250 to 1300 EUR/USD range, and while I would recommend this product overall, I am continuously fighting with that price level as set by NVIDIA, as it is just crazy money. So yes, once again, for ASUS all lights are green for the STRIX, but the current inflated pricing raises a massive red flag.

The 2080 Ti, seen from a 1080 Ti and purely based on shading performance, is impressive, but the big question remains: is that extra 25 to 40% performance worth the price tag? Purely based on rasterized/shaded game performance, I would say no. The extra money needs to be justified by the RT and Tensor cores, thus raytracing, DLSS and everything new that will follow from those cores. DLSS I am enthusiastic about, at least from what I have seen at the NVIDIA event and here with FFXV, though that is a limited scope of software to form an objective opinion on. Currently, however, I do not know whether the RT cores are fast enough, or plentiful enough, to produce playable frame rates for hardware-assisted raytracing in rasterized games, and that is my biggest dilemma for this conclusion. I will say this though: raytraced gaming blew me away when I briefly tested it at the NVIDIA event. It is the dawn of a new era and likely the path that defines the next decade of gaming. But we need games to support it before we can say anything solid about it.

ASUS delivers a really good product here, with an excellent cooler and that dark style as well. It's really silent, cools great, and offers performance, and we assume GPU tweaking levels, that will be really good. Typically, products like these end up with a Top Pick or Best Hardware award; this time we're giving it a Recommended award, mainly due to the limited testing and, yes, the price level. It's a beauty though.

