NVIDIA Reveals RTX 4090 Overwatch 2 Performance Numbers

data/avatar/default/avatar07.webp
Sorry, I would like to understand what is meant in the graphs, as they read 500 FPS with a latency of 8 ms. My 6900 XT does 240 FPS at 5 ms... Am I misinterpreting what is written? What is meant: system latency, or just the graphics card?
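For anyone else puzzling over that, here's a rough back-of-the-envelope sketch. The one assumption (the chart itself doesn't say) is that NVIDIA's figure is end-to-end system latency, input to display, rather than the GPU's render time alone:

```python
# Rough arithmetic on frametime vs. latency. Assumption: the chart's
# latency figure is end-to-end system latency (input to display),
# not just the GPU's per-frame render time.

def frametime_ms(fps: float) -> float:
    """Milliseconds the GPU spends producing one frame at a given FPS."""
    return 1000.0 / fps

print(frametime_ms(500))  # 2.0 ms, well under the quoted 8 ms
print(frametime_ms(240))  # ~4.2 ms, close to the quoted 5 ms

# Since 8 ms > 2 ms, the 8 ms figure cannot be frametime alone; the
# extra ~6 ms would be the rest of the pipeline (input sampling, CPU
# queueing, display scanout), i.e. what "system latency" usually means.
```

So the two numbers are most likely measuring different things: 8 ms of whole-system latency at 500 FPS, versus a 5 ms figure that may simply be a per-frame render time from the Radeon overlay.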
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
+18% 3080 to 4080 lol
https://forums.guru3d.com/data/avatars/m/246/246564.jpg
In a cherry-picked benchmark, no less. Bloody hell.
https://forums.guru3d.com/data/avatars/m/72/72189.jpg
With DLSS 3's fake doubled framerate?
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
That 192-bit bus is really killing the 4070's... cough cough... 4080 12GB's performance there lol :P
https://forums.guru3d.com/data/avatars/m/225/225084.jpg
Why a non-pro would want 600 FPS is beyond me. Even if I could reach 600 FPS, it would just be a waste of electricity. I'd prefer locking it down to 144 or 240 Hz anyway. The days of just leaving the card to max out FPS are over for me. These days I just try to find a balanced result between maximum and minimum FPS with V-sync.
https://forums.guru3d.com/data/avatars/m/270/270091.jpg
So in this game at least, the 4080 16GB isn't quite 2x as fast as the 3070. I'll wait for the 5070/5080 series, since my next GPU ideally has to be at least 3x as fast as my old one (I went from an RX 570 to the current 3070).
https://forums.guru3d.com/data/avatars/m/270/270091.jpg
CPC_RedDawn:

That 192-bit bus is really killing the 4070's... cough cough... 4080 12GB's performance there lol 😛
That, and it also has a smaller L2 cache: 48 MB versus 64 MB on the 16GB card.
https://forums.guru3d.com/data/avatars/m/270/270091.jpg
Sukovsky:

And still some people say tHe 4090 iS thE NeW TiTAn. It's just the high-end card and you're getting f'd with that pricing.
To be fair, it's using its own fat die with a 384-bit bus and way more shaders and L2 cache than the 256-bit-bus 4080 16GB, unlike the 3080/3080 12GB, 3080 Ti, 3090 and 3090 Ti, which all used the same die with either a 320- or 384-bit bus.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
What's up with this marketing? If I had a 4090, all I would want to do is push high resolutions with all the ray-tracing effects. Isn't that what these cards are made for?
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
Denial:

It still can be. The 4090 is a huge jump in performance, about what you'd expect from a regular release. Make that card $1200. Take the 4080 16GB and make it $700. Take the 4080 12GB and make it a 4070 at $500. Lineup fixed. The problem is the cost/naming/marketing bullshit they are trying to pull here, probably because they have such a severe overstock of 3000-series cards they need to clear. The "4070" is the biggest joke ever. It should be completely shamed and shunned until Nvidia either renames it with a price drop, or recalls it and issues an apology from the leather jacket, in real life and as a 3D version rendered on a replacement.
The 4080 12GB is not a 4070, it's a 4080. The 4080 16GB is a 4080 Ti.
https://forums.guru3d.com/data/avatars/m/259/259564.jpg
Astyanax:

The 4080 12GB is not a 4070, it's a 4080. The 4080 16GB is a 4080 Ti.
Shader cores:

Titan RTX: 4608
2080 Ti: 4352
2080: 2944
2070: 2304
2060 Super: 2176

3090 Ti: 10752
3090: 10496
3080 Ti: 10240
3080 12GB: 8960
3080: 8704
3070: 5888
3060 Ti: 4864
3060: 3584

4090: 16384
4080 16GB: 9728
4080 12GB: 7680

You're right, he is wrong: he should be calling it a 4060.
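To put that list in perspective, here's a crude tier check: each card's shader count as a share of its generation's flagship. (Shader count alone ignores clocks, cache and bus width, so treat it as a rough yardstick, not a verdict.)

```python
# Each card's CUDA-core count as a share of its generation's flagship,
# using the shader counts from the list above.
ampere_flagship = 10752  # RTX 3090 Ti
ada_flagship = 16384     # RTX 4090

ratios = {
    "3070 vs 3090 Ti":    5888 / ampere_flagship,  # ~55%
    "3060 Ti vs 3090 Ti": 4864 / ampere_flagship,  # ~45%
    "4080 16GB vs 4090":  9728 / ada_flagship,     # ~59%
    "4080 12GB vs 4090":  7680 / ada_flagship,     # ~47%
}
for name, share in ratios.items():
    print(f"{name}: {share:.0%}")
```

By that yardstick the 4080 12GB lands between the previous generation's 3060 Ti and 3070 tiers, which is exactly the joke being made above.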
https://forums.guru3d.com/data/avatars/m/294/294824.jpg
Another bullshit marketing move: big dumb numbers for no reason. The monitors that can do 300 Hz+ are dumb and stupidly overpriced. I reckon I could beat someone at 100 FPS against their 500 FPS, because that stuff doesn't matter when you're playing online with the lag compensation and round-trip ping estimation every game does in online play. Competitive play is hilarious; this is just another bullshit marketing tactic. Is this with DLSS 3.0? Because you don't want to be running that at 1440p, it looks like trash. Nvidia, you are getting desperate, trying to get as many "Nvidia reviews" out ahead of the real ones that will come out next week. What I also find amazing is that they are not allowing reviewers to review the cards before release. I wonder why: the card is not as good as they claim it is, so they've got to keep it a secret until people pre-order.
https://forums.guru3d.com/data/avatars/m/230/230258.jpg
Netherwind:

nVidia keeps making the same typo for some odd reason: "4080 12GB" doesn't exist, it should say "4070".
Yeah. Let's call the "4080 12GB" a 4070 from now on.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Crazy Joe:

As an RTX 3090 owner I feel wronged by these graphs. Not that I ever plan to play Overwatch 2, mind you, but at least an acknowledgement of where my card stacks up would have been nice!
The 3090 is around 9% faster than a 3080, which puts it at roughly 323 FPS.
https://forums.guru3d.com/data/avatars/m/280/280620.jpg
Hmm, then I need to buy at least a 4090 for Overwatch 2 if I want to play it more smoothly.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Astyanax:

The 4080 12GB is not a 4070, it's a 4080. The 4080 16GB is a 4080 Ti.
The x080 Ti has historically been a cut-down big chip, so I think the 4080 Ti is still to come, towards the end of next year.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
southamptonfc:

The x080 Ti has historically been a cut-down big chip, so I think the 4080 Ti is still to come, towards the end of next year.
Almost definitely. There's plenty of room to slot in both a 4080 Ti and a 4090 Ti: AD102 goes up to 144 SMs, and the 4090 only has 128 enabled.
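A quick sketch of how much headroom that is, using numbers already posted in this thread; the one added assumption is Ada's 128 CUDA cores per SM:

```python
# How much AD102 silicon is left above the 4090, per the figures quoted
# above. Assumption: 128 CUDA cores per SM on Ada.
cores_per_sm = 128
full_ad102 = 144 * cores_per_sm   # 18432 shaders on a fully enabled die
rtx_4090 = 128 * cores_per_sm     # 16384, matching the list earlier
rtx_4080_16gb = 9728              # from the same list

print(full_ad102 - rtx_4090)      # 2048 shaders still fused off (4090 Ti room)
print(rtx_4090 - rtx_4080_16gb)   # a 6656-shader gap for a 4080 Ti to fill
```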
Astyanax:

The 4080 12GB is not a 4070, it's a 4080. The 4080 16GB is a 4080 Ti.
Disagree. From the CUDA count, to relative performance, to the chip name itself. And personally, I don't even care that they call it a 4080, but the price is outrageous for what it is.
data/avatar/default/avatar39.webp
AlmondMan:

Why do we want 600 FPS?
Being able to run a game at silly FPS when it already runs just fine on my old card is not enough reason to spend silly money on a new one.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Dribble:

Being able to run a game at silly FPS when it already runs just fine on my old card is not enough reason to spend silly money on a new one.
Yeah, but you can't heat your room by running at 600 W and 500 FPS 😛