GeForce RTX 2080 Ti Founders Edition review



Conclusion

You know, it's been a rollercoaster ride. Ever since NVIDIA announced the RTX cards last month, we (the media) all figured, hey, there's plenty of time to test and review two reference samples. Well, that idea went belly up :) We also expected to get some example software in the form of a real game, say a pre-release build of Battlefield V or an RTX-enabled Tomb Raider. None of that happened. On top of that, NVIDIA has been moving embargoes around and decided to have all board partners launch their cards on the same day as well. As if that wasn't challenging enough, NVIDIA only released the driver last Friday, so nobody has been able to test the cards earlier. Meanwhile, we in the press get visits from (or need to visit) board partners for presentations and so on. Basically, what I am trying to say here is that there has been far too little time to properly test these cards, and they sincerely deserve more than what we are showing today. We gave it our best, though, and these articles will get updates over time, specifically for DLSS and DXR gaming of course, as that is where the money is at.

DXR Gaming

So yes, I would have loved to show you some DXR (raytracing) performance, but all that NVIDIA provided was the internal Star Wars reflections demo you've probably already seen, which is lovely to look at, but of course simply irrelevant as a benchmark. It was also very cool that UL contacted us and allowed us to test and showcase where they are at with their raytracing demo; you have seen an exclusive there, albeit a work in progress. So yeah, that's it for raytracing for now. Once we actually see games released, and once Microsoft releases DXR (in the Windows 10 fall update), we'll create additional content to show you the potential of the new RT cores, because really, this release is all about the RT cores, my jolly good friends. I am not dismissing the RT cores here; honestly, I am super excited about the technology and what it can do for image quality in gaming. It's just that I objectively cannot comment or give you advice based on something we cannot test in an actual game (which is what this is all about, eh?).

DLSS

Then there's that other feature, Deep Learning Super Sampling (DLSS). Totally lovely, really, but here again we've been dealing with the fact that there is no software to test aside from precisely two titles; you've seen the results and image quality in this article with the help of Ultra HD videos. I'll say this though: even though we've only been able to properly inspect and measure one title visually, we are impressed by the technology. When you use DLSS, you don't need to enable AA in your shader engine anymore, as DLSS takes care of that for you using the Tensor cores. That work is completely offloaded, which brings in a good chunk of additional performance that you'd normally miss out on. The technology in its current state, I can confirm, is impressive. The Epic Infiltrator DLSS demo I showed you is perhaps the most relevant example; it looks really good and simply runs much faster compared to TAA. That's a win-win right there. And with that out of the way, I need to close out the primary new features, that DNA that is RTX and AI, and turn back to traditional rasterized shading.
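To make that offloading argument concrete, here is a minimal frame-time sketch in Python. Every number in it is hypothetical, not a measurement from this review; it only illustrates why moving AA off the shader engine returns frame rate.

```python
# Hypothetical frame-time budget: illustrates why offloading AA helps.
# None of these numbers are measurements; they are made up for the example.

shading_ms = 14.0  # shader/raster work per frame at Ultra HD (assumed)
taa_ms = 1.5       # post-process TAA cost on the shader engine (assumed)

# Traditional path: TAA runs on the same shader hardware.
frame_taa = shading_ms + taa_ms

# DLSS path: the AA step is handled by the Tensor cores instead, so
# (to a first approximation) it no longer adds to shader frame time.
frame_dlss = shading_ms

fps_taa = 1000.0 / frame_taa
fps_dlss = 1000.0 / frame_dlss
print(f"TAA : {fps_taa:5.1f} fps")
print(f"DLSS: {fps_dlss:5.1f} fps ({100 * (fps_dlss / fps_taa - 1):.0f}% faster)")
```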

Gaming performance

Face it, the RTX 2080 Ti is a beast. We're seeing some caps at lower gaming resolutions, but for Ultra HD the next step in performance has been made. Depending on title and workload, you'll see 25% to maybe even 40% performance increases at that resolution. The RTX 2080 Ti really starts to flex its muscles at 2560x1440; anything below that is normalized due to driver and processor bottlenecks. We cannot think of a game that won't run really well combined with the best image quality settings. You should game with at least a 24" monitor, of course; 2560x1440/1600 or better would be a nice fit. The 11 GB of graphics memory is super sweet, making the product very future-proof. In terms of multi-GPU setups, Turing will support 2-way SLI, but that's it. NVIDIA did not have the new NVLink connector available, so we've not been able to test it.
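As a quick sanity check of what such an uplift means in frames per second, here is a tiny sketch; the baseline frame rates are hypothetical examples, not figures from our benchmark charts.

```python
# Hypothetical 1080 Ti Ultra HD baselines (not from our charts), used only
# to translate a 25-40% generational uplift into frame-rate terms.
baselines_fps = {"Title A": 45.0, "Title B": 60.0, "Title C": 75.0}

for title, fps in baselines_fps.items():
    low, high = fps * 1.25, fps * 1.40
    print(f"{title}: {fps:.0f} fps -> {low:.0f}..{high:.0f} fps on the 2080 Ti")
```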


Aesthetics

We feel the new looks work out well; the RTX Founders Edition cards are very nice looking. We like the metal and black design. It has a very nice looking backplate, however, that one will trap heat. The dual-fan design I am fine with as well. Taste differs, of course, and luckily there are a dozen AIB partners that will have multiple designs available as well.




Cooling & Noise Levels

The reference (Founders Edition) design of the RTX 2080 Ti is set at a temperature target of roughly 80 degrees C, but will not reach that value; that's good. We reached roughly 74 degrees C under load/stress, and as long as the card remains at that value, it will hardly throttle. Of course, there are many other factors in play for a card to throttle these days, but in that particular respect, we have to say the cooling works well enough. Where the GeForce RTX 2080 remains a silent card, the 2080 Ti exhibits normal noise levels. Under heavy load we measured 40~41 dBA, which means you can hear some airflow, but that's it.

Power Consumption

Any TU102 Turing GPU, and thus any graphics card based on it, is rated at roughly 260 Watts TDP under full stress, and our measurements show it to be in that range when gaming. We think a 650 Watt PSU would be a nice match for these cards. Remember, when purchasing a PSU, aim to double up in wattage, as your PSU is most efficient when it is at around 50% load. Here, again, keep in mind that we measure peak power consumption; the average power consumption is a good notch lower, depending on GPU utilization. Also, if you plan to overclock the CPU/memory and/or GPU with added voltage, please do purchase a power supply with enough reserve. People often underestimate it, but if you tweak all three of the aforementioned variables, you can easily add 150 Watts to your peak power budget, as increasing voltages and clocks increases your power consumption.
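As a back-of-the-envelope sketch of that sizing rule: the function below is our own illustration, not an official calculator, and the rest-of-system draw is an assumed figure.

```python
# PSU sizing sketch using the "most efficient near 50% load" rule of thumb.
# The ~90 W rest-of-system draw is an assumption for this example.

def suggested_psu_watts(peak_system_draw_w: float) -> float:
    # Double the expected peak draw so the PSU sits near its 50% sweet spot.
    return 2.0 * peak_system_draw_w

gpu_w = 260.0            # TU102 TDP, roughly what we measured while gaming
rest_of_system_w = 90.0  # assumption: CPU, motherboard, drives, fans
oc_headroom_w = 150.0    # extra budget when tweaking CPU, memory and GPU voltages

print(suggested_psu_watts(gpu_w + rest_of_system_w))                  # -> 700.0
print(suggested_psu_watts(gpu_w + rest_of_system_w + oc_headroom_w))  # -> 1000.0
```

By that strict rule you land a bit above our 650 Watt suggestion; in practice, a quality 650 Watt unit still has plenty of headroom at stock clocks.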

Overclocking

We like the new OC Scanner functionality that you will see throughout most GPU tweak utilities. While it only overclocks your GPU a notch, it does create a reliable tweaking curve; once you apply it, you get a few percent more performance. Honestly, it needs some more development, and right now it is a little conservative, but I can see it becoming the primary methodology for automated overclocking in the future. Promising. The real gurus, of course, will overclock manually, and here we cannot complain. We gained an extra 190 MHz on the clock frequency, and with the increased power limit you'll now see the dynamic boost frequency hovering in that 2000 MHz range. Mind you, that frequency can be higher or lower depending on the game or benchmark title. The memory overclock was insane: we've been able to add 1000 MHz, double that for the double data rate, and yes, we had 16,000 MHz effective (16 Gbps) running stable.
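For clarity on that memory math, a tiny sketch; the 14,000 MHz stock effective data rate is the reference 14 Gbps GDDR6 spec for this card, and the doubling follows from GDDR's double data rate.

```python
# GDDR6 memory overclock arithmetic for the result above.
stock_effective_mhz = 14_000  # reference 14 Gbps GDDR6 on the 2080 Ti
offset_mhz = 1_000            # the offset we applied in the tweak utility

# GDDR6 transfers data on both clock edges (double data rate), so a
# clock offset counts twice in the effective data-rate figure.
effective_mhz = stock_effective_mhz + 2 * offset_mhz
print(f"{effective_mhz} MHz effective ({effective_mhz / 1000:.0f} Gbps)")  # 16000 MHz / 16 Gbps
```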

  


  

Concluding

It is time to conclude, folks. With the Founders reviews, I've been battling two things: time, and the lack of RT software. RTX is all about hybrid raytracing, and at this stage it cannot be properly tested in a game. The good news is that in the upcoming month or so that will change, and the same goes for DLSS, which is something I do appreciate. Both techniques have seen a dozen or so game announcements, so once those get released, your viewpoint on these technologies is going to change, as they really are game changers.

Seen from the 1080 Ti, the 2080 Ti is impressive purely on shading performance, but the big question remains: is that extra 25 to 40% performance worth the price tag? Purely based on rasterized/shaded game performance, I would say no. The extra money needs to be justified by the RT and Tensor cores, thus raytracing, DLSS, and everything new that will follow from those cores. On DLSS I am enthusiastic, at least from what I have seen at the NVIDIA event and here with FFXV and the Epic demos, but that is a limited scope of software to form an objective opinion on. I am struggling with one thing, though: I do not know whether the RT cores are fast enough, or plentiful enough, to produce high enough frame rates for hardware-assisted raytracing in rasterized games, and that is my biggest dilemma for this conclusion. I will say this though: raytraced gaming blew me away when I briefly tested it at the NVIDIA event. This is the dawn of a new gaming era and likely the path that defines the next decade of gaming. But yes, we need games to support it before we can say anything solid about it.

The GeForce RTX 2080 Ti card itself looks lovely, was a tad too noisy for me (we're still checking whether that was an isolated issue with our sample), but overall runs at pretty normal temperatures. We're fine with the power consumption and its current performance level (it'll probably speed up a bit more over time with drivers), and the tweaking experience to squeeze out a bit more FPS was lovely as well. The elephant in the room, however, is pricing: we're talking $1,199 for the Founders Edition model as tested today, and that makes it completely out of reach for the vast majority of people, as that is just a motherload of money for a consumer product.

Some last words then, as after 40 pages my brain could use some Tensor cores as well. Overall, the 2080 Ti has the potential to be something incredible, and I am not joking about that. But we do need more time and software to give a solid recommendation on that. Typically, products like these end up with a top pick or best hardware award; this time we're giving it a recommended award, mainly due to the limited testing as well as the very high price level. But man, what a beast, and what a nice-looking Founders card this round as well.

