GeForce RTX 4070 Specifications Confirmed by GPU-Z Validation

They can call it a 4070 if they want, but by any previous metric it wouldn't even be an xx60. In fact, if we go back to the 20 series and compare TFLOPs, it looks even worse. A 4090 is 82.58 TFLOPs. If this does hit 29 TFLOPs, your 4090 is worth roughly 2.85 4070s in computational power. Back in the 20 series, I have to compare the 2080 Ti to a fucking 2050 Max-Q mobile chip to see that kind of ratio against the best GPU (26.9/9.462). The 30 series isn't much better; the ratio lands somewhere between 3060 and 3060 Ti levels.
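For anyone who wants to check the arithmetic, a quick sketch; 82.58 is the 4090's listed FP32 spec, while the 29 TFLOPs for the 4070 is the rumored figure quoted above:

```python
# FP32 throughput ratios from the figures quoted above.
# (Paper TFLOPs = 2 ops, i.e. one FMA, per shader per clock.)
print(82.58 / 29.0)    # ~2.85: one 4090 ~= 2.85 rumored 4070s
print(26.9 / 9.462)    # ~2.84: 2080 Ti vs 2050 Max-Q, same ratio territory
```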
Yeah, Nvidia has shifted all the numbers up, the 4080 included. All cards below the 4090 are too severely cut down to be what they claim to be. At the same time, prices per model name went up significantly. Not even Turing was as badly priced as the 4000 series 🙁
Very much underwhelming, especially for the price. Nvidia must be in full rip-off mode. I'm curious to see where it lands performance-wise relative to the 3000 series; I'm thinking it might be roughly on par with the 3070/Ti?
Spec-wise this should be the RTX 4060, at approximately half the projected price. Come on, NGreedia, wake the f... up.
Unfortunately the 4070 is starting to look quite disappointing. I'm not awfully surprised, considering the specs seemed to suggest it was supposed to be the 4060 (Ti) originally, like others have already said. If higher clocks are all it has going for it compared to the 3070, the performance increase is quite limited. Oh well, it also has the $100 higher price "working" for it. The worst thing about the 4070 is that AMD has seemingly disappeared from the face of the Earth, so nobody knows how AMD is planning to respond to Nvidia's offerings. If it's planning anything at all. Maybe Nvidia knew beforehand that AMD is dead, and that's why Nvidia's products below the 4090 are increasingly lackluster.
It was never supposed to be a 4060; the only people saying that are assuming it ignorantly based on a single factor, the memory interface.
Kaarme:

The worst thing about the 4070 is that AMD has seemingly disappeared from the face of the Earth, so nobody knows how AMD is planning to respond to Nvidia's offerings. If it's planning anything at all. Maybe Nvidia knew beforehand that AMD is dead, and that's why Nvidia's products below the 4090 are increasingly lackluster.
They must have known something, IMO; no way they'd go so low on the 4080/4070 Ti CUDA counts if they didn't know RDNA3 struggles with dual-issue FP32/wave64 performance and power (see the sketch at the end of this post). Same as they knew how much trouble RDNA2 was going to be for Ampere, and made a GA102-based 3080. Worst of all, no sign of N32, not even a single leak; they must be having trouble fixing the architecture.
tsunami231:

I'm curious to see where it lands performance-wise relative to the 3000 series; I'm thinking it might be roughly on par with the 3070/Ti?
3080-level, with 12 GB and a single 8-pin: https://videocardz.com/newz/nvidia-claims-geforce-rtx-4070-and-rtx-3080-offer-equal-dlss-performance-without-frame-generation Not great, but all AMD has against it at the moment is a retail sale on the 6800 XT/6900 XT, 300-400 W cards that'll only compete with Ada in rasterization.
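To put numbers on the dual-issue point: RDNA3 can dual-issue FP32, which doubles the paper TFLOPs without doubling real game throughput. A rough sketch; the 6144-shader/2.5 GHz inputs are the 7900 XTX's listed specs:

```python
# Paper FP32 TFLOPs; dual_issue=2 models RDNA3's VOPD/wave64 dual-issue.
def tflops(shaders, boost_ghz, dual_issue=1):
    return 2 * shaders * boost_ghz * dual_issue / 1000

print(tflops(6144, 2.5))                # ~30.7: "classic" single-issue count
print(tflops(6144, 2.5, dual_issue=2))  # ~61.4: the marketing number, rarely realized in games
```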
cucaulay malkin:

Worst of all, no sign of N32, not even a single leak; they must be having trouble fixing the architecture.
The 7950 XT and XTX will show their faces when AMD is good and ready; don't expect miracles, though.
Don't care about $1600 cards; $500-700 is where I buy. I'm good with my 6800 at the moment, though.
Astyanax:

It was never supposed to be a 4060; the only people saying that are assuming it ignorantly based on a single factor, the memory interface.
Sure, every single card is exactly what Jensen decides it to be, so nothing is supposed to be anything. But if you compare the spec increase from the 2070 to the 3070, and then from the 3070 to the 4070, the 4070 represents very little progress indeed. So, historically, it would make more sense for what's going to be the 4070 to be the 4060 Ti.
Kaarme:

Sure, every single card is exactly what Jensen decides it to be, so nothing is supposed to be anything. But if you compare the spec increase from the 2070 to the 3070, and then from the 3070 to the 4070, the 4070 represents very little progress indeed. So, historically, it would make more sense for what's going to be the 4070 to be the 4060 Ti.
Spec changes and performance increases have never been a guaranteed percentage, and this is the biggest thing tech speculators get wrong every single time. You get the occasional architecture with an above-normal increase, and then several lesser increments where power utilization and efficiency are the focus. Ampere would actually have been a bigger increase, and Ada an even smaller gain, if the Samsung node hadn't been leaky garbage. The full AD104 would have been the 4080, with the 103 part a Ti, under the naming schemes prior to Ampere.
Hopefully the 4070 performs well for the price. I really feel like for anything below the 4070 Ti, Nvidia is going to rely on DLSS 3 and frame generation to bring the performance. On paper they are really cutting the 4070 and lower back, and if you have frame generation to bring the performance back... why not, right? It's a win-win for them, of course.
wavetrex:

https://videocardz.com/newz/nvidia-geforce-rtx-4070-matches-geforce-rtx-3080-in-leaked-tests ~20% faster than a 3070 Ti, and roughly matching a 3080 in synthetics and older games. I suspect it might be worse than a 3080 in more modern games with bigger textures, due to the inferior memory bandwidth. 192-bit vs 320-bit will hurt, despite the higher-clocked memory. If this were $400, it would be a great GPU, but at $600 and up: bleh, stagnation and involution.
You keep forgetting the 36 MB L2 cache, which compensates for the memory bandwidth.
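For reference, the raw bus math behind that comparison; 21 Gbps and 19 Gbps are the listed GDDR6X speeds of the 4070 and 3080, respectively:

```python
# GDDR bandwidth = (bus width in bits / 8) bytes * effective Gbps per pin.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(192, 21))  # RTX 4070: 504 GB/s
print(bandwidth_gb_s(320, 19))  # RTX 3080: 760 GB/s
```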
wavetrex:

https://videocardz.com/newz/nvidia-geforce-rtx-4070-matches-geforce-rtx-3080-in-leaked-tests ~20% faster than a 3070 Ti, and roughly matching a 3080 in synthetics and older games. I suspect it might be worse than a 3080 in more modern games with bigger textures, due to the inferior memory bandwidth. 192-bit vs 320-bit will hurt, despite the higher-clocked memory. If this were $400, it would be a great GPU, but at $600 and up: bleh, stagnation and involution.
Nvidia is really trying to milk gamers as hard as possible... 😕
Pinstripe:

You keep forgetting the 36 MB L2 cache, which compensates for the memory bandwidth.
The larger L2 cache is going to help, but it won't totally compensate for the lost memory bandwidth. Nvidia doesn't want cheaper cards "punching" above their weight like in the past...
Pinstripe:

You keep forgetting the 36 MB L2 cache, which compensates for the memory bandwidth.
Yeah, uhm, I'm not so sure about that. It's not the 128 MB L3 that the 6800/6900/6950 have, plus whatever size L2 on top. That 36 MB seems tiny in comparison. I guess we'll see in the reviews; not long left.
wavetrex:

Yeah, uhm, I'm not so sure about that. It's not the 128 MB L3 that the 6800/6900/6950 have, plus whatever size L2 on top. That 36 MB seems tiny in comparison. I guess we'll see in the reviews; not long left.
Besides, the larger L2 didn't help the 4070 Ti at higher resolutions; it still loses to the 3090 and 3090 Ti.
In order not to lose performance compared to a 1 TB/s card, you need 1 TB/s of bandwidth. Even the 7900 XT, with 800 GB/s and Infinity Cache, sees its lead over the 3090 Ti shrink from ~15% at 1080p down to 5%, and those are still very high-end specs; the 4070 is mid-range. The lead for the 7900 XT vs the 3090 Ti starts decreasing at 1440p already. BTW, look how CPU-limited the 4090 is at 4K; that's crazy, and they're testing on a watercooled 12900K with 6400 RAM. https://i.imgur.com/XiMvdPk.jpeg
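A toy model of why a big cache only partly covers for a narrow bus, and why the effect fades with resolution. The hit rates below are made-up illustrative values, not measured figures:

```python
# Toy model: with cache hit rate h, only (1 - h) of memory traffic reaches
# DRAM, so the card behaves roughly as if it had raw_bw / (1 - h) bandwidth.
def effective_bw(raw_gb_s, hit_rate):
    return raw_gb_s / (1 - hit_rate)

# Hypothetical hit rates: the working set grows with resolution, so hits
# get rarer, which is consistent with the shrinking lead described above.
print(effective_bw(504, 0.35))  # ~775 GB/s -- a 1080p-ish guess
print(effective_bw(504, 0.15))  # ~593 GB/s -- a 4K-ish guess
```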