Gigabyte GeForce RTX 2080 GAMING OC 8G review





You know, I think the RTX 2080 will be the hardest sell of the 2070, 2080 and 2080 Ti range. It offers heaps of performance, above the GeForce GTX 1080 Ti level, but misses that little bit of extra edge compared to the Founders Edition card from NVIDIA. Typically a board-partner card offers a good chunk, say 10%, of extra performance over the reference/Founders Edition cards, and as I have stated a couple of times in these reviews, it's almost as if NVIDIA is now trying to compete with its AIC/AIB partners; the performance difference is just too narrow. The new GAMING OC 8G shows a really shy factory tweak compared to the Founders card and runs merely 15 MHz higher on its boost clock. While the Gigabyte software allows you to use an OC button, that too is merely a 15 MHz step. The two combined probably will not even bring you a single extra FPS on average. We see this product sitting in that ~1900 MHz range (depending on game title and limiters), which is great, but yeah, too close to the Founders cards. As for the other characteristics, the cooler, simply put, is good, but we'll talk about that in a moment.

DirectX raytracing and DLSS

So yes, I would have loved to show you some DXR (raytracing) performance, but all that NVIDIA provided was an internal Star Wars Reflections demo you've probably already seen, which is lovely to watch but simply irrelevant as a benchmark. Also, very cool that UL contacted us and allowed us to test and showcase where they are at with their raytracing demo; you have seen an exclusive there, albeit a work in progress. So that's it for raytracing for now. Once we actually see games ship with it, and once Microsoft releases DXR (in the fall update), we'll create additional content to show you the potential of the new RT cores, as really, this release is all about the RT cores, my jolly good friends. I am not dismissing the RT cores here; honestly, I am super excited about the technology and what it can do for image quality in gaming. It's just that I objectively cannot comment or give you advice based on something we cannot test in an actual game (which is what this is all about, eh?).

Deep Learning Super Sampling is totally lovely, really, but here again we've been dealing with the fact that there is no software to test aside from precisely two titles; you've seen the results and image quality in this article with the help of Ultra HD videos. I'll say this though: even though we've only been able to visually inspect and measure one title, we are impressed by the technology. When you use DLSS, you no longer need to enable AA in your shader engine, as DLSS takes care of that for you using the Tensor cores; anti-aliasing is completely offloaded, which brings back a good chunk of performance that you'd normally miss out on. In its current state, I can confirm the technology is impressive. And with that out of the way, I need to close out the primary new features, the DNA that is RTX and AI, and turn back to traditional rasterized shading. More relevant still is the Epic Infiltrator DLSS demo I showed you; that's really good, right? Plus it just runs much faster compared to TAA, so that's a win-win right there.

Gaming performance

The GeForce RTX 2080 is a bit hard to spot in the benchmarks; it hides between the 1080 Ti and Titan Xp most of the time. We've seen some anomalies in some titles at lower resolutions, but that's common with new hardware and the current state of the drivers. The overall performance range, however, is nice: this is the new high end. What's also interesting is that the harder you make it on the GPU, the better it performs, so these cards tend to flex their muscles at Ultra HD. We're seeing some caps in lower-resolution gaming, but for WQHD and Ultra HD the next step in performance has been made. Depending on title and workload you'll see 25% to maybe even 30% performance increases at that resolution compared to the GTX 1080. Again, this GPU really starts to flex its muscles beyond 2560x1440; anything below that is normalized due to driver and processor bottlenecks, as you've passed well beyond 100 FPS on average. While I like the 11 GB of graphics memory on the Ti, admittedly we feel 8 GB is a great amount of VRAM to have. In terms of multi-GPU setups, Turing will support 2-way SLI, but that's it. NVIDIA did not have the new NVLink connector available, so we've not been able to test it.


Gigabyte tweaked the design and basically revamped the 82mm-fan-based WindForce 3X cooler; combined with its black design it certainly looks good, though I liked the Aorus look better. As any manufacturer will do, the product also comes with RGB LED lighting control on the top side; switch it on/off or to any color and animation you prefer, the choice is yours. A nice touch is that backplate; however, it lacks openings in the proper areas (GPU/VRM) for venting. As you can tell, I remain skeptical about backplates, as they can potentially trap heat and thus warm up the PCB.



Cooling & Noise Levels

The WindForce 3X reached 67 Degrees C with its three fans; we do wonder if that number could have been a notch lower if the backplate wasn't completely closed. Backplates do look much better, make the PCB more sturdy (it bends less or not at all) and can protect your PCB and components from damage. Consumer demand is always decisive, and you guys clearly like graphics cards with backplates. Granted, 67 Degrees C is still really good and below the reference/Founders products. So once the fans kick in, you can expect to hover at that 65~70 Degrees C marker with seriously demanding games. Please do note that you will need proper ventilation inside your chassis to achieve that number, as most heat is vented into the chassis. Nice to see is that at idle the card remains passive, and thus inaudible. Noise-wise we can't complain about the cooling whatsoever; at 38 dBA during gaming you need to listen carefully to be able to hear it. Also, we have not heard any coil whine.

Power Consumption

Graphics cards with a TU104 sit at a 225 Watt TDP under full stress, and our measurements show it to be in that range when gaming. We think a 600 Watt PSU would be a nice match for these cards paired with a modern PC. Remember: when purchasing a PSU, aim to double up in wattage, as your PSU is most efficient at around 50% load. Here again, keep in mind that we measure peak power consumption; the average power consumption is a good notch lower, depending on GPU utilization. Also, if you plan to overclock the CPU, memory and/or GPU with added voltage, please do purchase a power supply with enough reserve. People often underestimate it, but if you tweak all three of those variables, you can easily add 150 Watts to your peak power budget, as increasing voltages and clocks increases power consumption.
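To make that sizing rule of thumb concrete, here is a quick back-of-the-envelope sketch; the rest-of-system figure is an illustrative assumption for a typical gaming build, not a measurement from this review.

```python
# Back-of-the-envelope PSU sizing, following the "double up in wattage"
# rule of thumb. The rest-of-system estimate is an illustrative
# assumption, not a measurement from this review.
GPU_PEAK_W = 225        # TU104 TDP under full stress
REST_OF_SYSTEM_W = 75   # rough allowance for CPU, board, drives at gaming load
OC_HEADROOM_W = 150     # extra peak budget when CPU/memory/GPU are overvolted

def recommended_psu_w(peak_load_w: float) -> float:
    """Size the PSU at ~2x peak load so it runs near its 50%-load efficiency sweet spot."""
    return 2.0 * peak_load_w

stock_peak = GPU_PEAK_W + REST_OF_SYSTEM_W            # 300 W
print(recommended_psu_w(stock_peak))                  # -> 600.0, matching the 600 W suggestion
print(recommended_psu_w(stock_peak + OC_HEADROOM_W))  # -> 900.0 with heavy overclocking
```

The same logic explains why an aggressively overclocked system wants a unit with real reserve rather than one sized to the stock peak.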


We like the new OC Scanner that you will see throughout most GPU tweak utilities. While it only overclocks your GPU a notch, it does create a reliable tweaking curve; combined with memory, power and voltage settings it will bring you a proper overclock, and once applied you get a few percent more performance. The real gurus, of course, will overclock manually, and here we cannot complain either. We gained an extra 140 MHz on the clock frequency, and with the increased power limiter you'll now see the dynamic boost frequency hovering at and over the 2000 MHz mark. Mind you, that frequency can be higher or lower depending on the game or benchmark title. The memory clocks pretty spectacularly: we've been able to add 950 MHz, double that for the double data rate, and yes, we had close to 16 Gbps running stable, which is still very impressive when you think about it for a few seconds. So, in closing, the new OC Scanner functionality is exciting. Honestly, it probably needs a bit more development and fine-tuning, but I can see it becoming the primary methodology for automated overclocking in the future. Right now it is a little conservative, but yes, promising.
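To make that memory math explicit: the 7000 MHz base clock below is the standard figure tweak utilities report for 14 Gbps GDDR6, assumed here rather than quoted from the text above.

```python
# Effective GDDR6 data rate implied by the memory overclock above.
# The 7000 MHz base clock is the standard value tweak utilities report
# for 14 Gbps GDDR6 - an assumption, not a number quoted in the review.
BASE_MEM_MHZ = 7000   # stock memory clock as most tweak tools display it
OC_OFFSET_MHZ = 950   # the offset we were able to add

# Double for the double data rate, then convert MHz to Gbps per pin.
effective_gbps = (BASE_MEM_MHZ + OC_OFFSET_MHZ) * 2 / 1000
print(f"{effective_gbps:.1f} Gbps")  # -> 15.9 Gbps, close to 16 Gbps
```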





The GeForce RTX 2080 is having a bit of a rough(er) launch, IMHO. Performance-wise it sits in a decent spot, but for many the pricing is off. That extra money needs to be justified by DirectX Raytracing or DLSS, which can hardly be tested right now. The 2080 carries a steep 799 USD price tag, and that pricing level is the main issue for this product series. Back to the Gaming OC 8G: we felt a need for a better factory tweak, as these cards can quite easily be tweaked a notch higher, but again, NVIDIA upped its game, bringing the Founders cards closer to AIB performance. Seen from the 1080, the 2080 offers a nice increase purely based on shading performance, but that mostly only pays off at Ultra HD. So yes, the extra money needs to be found in the RT and Tensor cores, thus raytracing, DLSS and everything new that will follow from those cores. DLSS I am savvy about, at least from what I have seen at the NVIDIA event and here with FFXV, though that is a limited scope of software on which to form an objective opinion. Currently, however, I do not know whether the RT cores are fast enough, or plentiful enough, to produce playable framerates for hardware-assisted raytracing in rasterized games, and that is my biggest dilemma for this conclusion. I will say this though: raytraced gaming blew me away when I briefly tested it at the NVIDIA event. It is the dawn of a new era and likely the path that defines the next decade of gaming. But we need games to support it before we can say anything solid about it.

Gigabyte offers an overall solid product here, with a nicely performing WindForce-based cooler and a dark style as well. It's silent enough and cools lovely in that 67 Degrees C range under load. For those that want to be first into that next-generation gaming experience, the card could make a lot of sense. Realistically though, my advice always is and has been that with totally new technology you are often better off with a second-generation product series. And as I write this, I cannot tell you how good or bad the RT performance will be in actual games, and for a reviewer that tries to be as objective as can be, that's a hard and harsh thing to experience. So let me close a little subjectively then. The little bit of DLSS experience that I had was very enjoyable, so thumbs up on that one. The raytracing I've seen brought a smile to my face; that's also the honest truth. Overall, the Gigabyte RTX 2080 Gaming OC 8G is a very nice product ticking the right boxes, though more time and supported games are needed for it to start making proper sense. We can still happily recommend it, as it's a lovely product overall for those that like to be the first next-gen adopters of hybrid raytracing technology, which will become the new standard for the decade to come.

Typically, products like these end up with a Top Pick or Best Hardware award; at this time we're giving it a Recommended award, mainly due to the lack of RT testing as well as its very high price level.
