Are these the GeForce RTX 4090 and RTX 4080 Final (leaked) Specifications

I'm ready.
My PSU doesn't have a 12VHPWR. I'm not ready 🙁
12GB GeForce RTX 4080: Finally, the GeForce RTX 4080 12GB is the GPU previously known as the RTX 4070
Heh. Why do I smell a high price tag for the whole lineup? IMHO, if they had put the 4070 at $700-800, people would have revolted; now they can sell it as "a 4080" to justify the price tag. Already we have the same people justifying a $600+ price tag on the 4060.
southamptonfc:

My PSU doesn't have a 12VHPWR. I'm not ready 🙁
You will be, brother. Our current systems are still heavy hitters for 99% of the games out there at max settings. Heck, 100% at 1440p. 😀 My next PSU is 1600 watts. I'll be done for a while.
southamptonfc:

My PSU doesn't have a 12VHPWR. I'm not ready 🙁
Hopefully PSU companies allow us to buy 12VHPWR cables so we don't have to replace our already good PSUs.
TheDigitalJedi:

You will be, brother. Our current systems are still heavy hitters for 99% of the games out there at max settings. Heck, 100% at 1440p. 😀 My next PSU is 1600 watts. I'll be done for a while.
Yeah, it's like the iPhone now: you don't need it, you just want the latest for an inflated price.
Nah, they're too hungry
gQx:

Yeah, it's like the iPhone now: you don't need it, you just want the latest for an inflated price.
I could appreciate 100% more FPS in most games that I play.
Given the huge difference in CUDA cores between the 4090 and the 4080 16GB, shouldn't that mean there will be a 4080 Ti somewhere down the road with perhaps 12-13k CUDA cores? Could be worth waiting for.
southamptonfc:

My PSU doesn't have a 12VHPWR. I'm not ready 🙁
You don't need a new PSU: your 1 kW PSU without a 12VHPWR connector is fine for the RTX 4090, and the card will include an adapter from the new 12VHPWR connector to a few 8-pin PCIe connectors. Don't worry. 🙂
Fediuld:

Heh. Why do I smell a high price tag for the whole lineup? IMHO, if they had put the 4070 at $700-800, people would have revolted; now they can sell it as "a 4080" to justify the price tag. Already we have the same people justifying a $600+ price tag on the 4060.
Unfortunately, I think you're correct. It's the only logical explanation for this ridiculous naming scheme...
I suspect my decision will once again come down to availability, 4090 vs 7900 XT, as it did with the 3090 vs 6900 XT.
Fediuld:

Heh. Why do I smell a high price tag for the whole lineup? IMHO, if they had put the 4070 at $700-800, people would have revolted; now they can sell it as "a 4080" to justify the price tag. Already we have the same people justifying a $600+ price tag on the 4060.
If that's the case, then I'm happily switching to AMD; the 7000 series will have great RT, so no more worries there. Screw Nvidia and their never-ending greed.
Building a whole new PC, but I'll have to hold off on the GPU. Not buying a 7900 XT if there's a 7950 XT just around the corner...
Slot in that 12GB GDDR6, 220W 4070 and we're good.
So the 4090 has more CUDA cores and 8 more gigs of VRAM. I feel like this is going to be the same as what happened with the 3080 and 3090: not much difference in gaming performance. The 4090 will be just that, a "creators card": worth it if you're going to use the extra 8 gigs of VRAM and CUDA cores for rendering or AI stuff, but in games expect only a few extra fps.
Fediuld:

Heh. Why do I smell a high price tag for the whole lineup? IMHO, if they had put the 4070 at $700-800, people would have revolted; now they can sell it as "a 4080" to justify the price tag. Already we have the same people justifying a $600+ price tag on the 4060.
This will be hilarious. Indeed, if they changed the 4070 into the 4080 12GB with a higher price, they obviously need a new 4070. It can only be the 4060 Ti. The new 4060 Ti will then be what was supposed to be the 4060. This would actually 100% explain the rumours of Nvidia's mainstream 4000-series segment bringing a smaller performance increase than the flagship is expected to bring.

Based on this, Nvidia must really expect AMD's 7000 series to suck long noodles. Or maybe Nvidia trusts that nobody will switch over to AMD's side, no matter how pricey Nvidia's offerings are. Thanks to Intel's own failures to get anything onto the market, nobody expects anything from Intel anymore. Of course, the other alternative is that the mainstream 4000 cards will suddenly become cheaper compared to the 3000 series, but that's something I'd need to see with my own eyes to believe.
Kaarme:

It can only be 4060 Ti.
um. no.

4090: 14 layers, PG139-SKU330, AD102-300-A1, 16384 FP32, 24G 21Gbps GDDR6X, 450W, TSE <20100
4080 16GB: 12 layers, PG136/139-SKU360, AD103-300-A1, 9728 FP32, 16GB 23Gbps GDDR6X, 340W, TSE <15000
4080 12GB: 12 layers, PG141-SKU340/341, AD104-400-A1, 7680 FP32, 12G 21Gbps GDDR6X, total card power 285W, TSE <11000
4070: 10 layers, PG141-SKU336/337, AD104-275-A1, 7168 FP32, 12G (or 10) 18Gbps GDDR6, total card power 220W, TSE <10000

GDDR6 doesn't need as many PCB layers as 6X.
Astyanax:

um. no.

4080 12GB: PG141-SKU340/341, 7680 FP32, 12G 21Gbps GDDR6X, total card power 285W, TSE <11000
4070: PG141-SKU336/337, 7168 FP32, 12G (or 10) 18Gbps GDDR6, total card power 220W, TSE <10000
So they will cut the chip down even further for the 4060 Ti? If they jammed the new 4070 between the new 4080 12GB (the original 4070) and the 4060 Ti, it would most likely mean the 4060 Ti is made a bit less powerful to justify the higher prices of the 4070 and the 4080 12GB. After all, the 4060 Ti is supposed to be a real mainstream card, and even Nvidia likely (hopefully) feels it can't be abhorrently expensive. But who would buy a 4070 if the 4060 Ti were practically as strong?
Meathelix1:

So the 4090 has more CUDA cores and 8 more gigs of VRAM. I feel like this is going to be the same as what happened with the 3080 and 3090: not much difference in gaming performance. The 4090 will be just that, a "creators card": worth it if you're going to use the extra 8 gigs of VRAM and CUDA cores for rendering or AI stuff, but in games expect only a few extra fps.
Uh? No... do the math: the difference between the 12 GB version of the 4080 and the 4090 is 8,704 CUDA cores (16,384 vs 7,680). It's HUGE, more than triple the gap between the 3080 and the 3090. I think that's exactly what Nvidia did not want this time: the two cards too close in performance. By the looks of it, the 4090 will be a real king of the hill, unlike the 3090 Ti, which was so close in performance to the 3090 and 3080 Ti that it was a weak, fabled king (AMD again didn't count for the title, unfortunately).

Also, if the rumors are true that Nvidia has been stockpiling 4xxx GPUs in warehouses since August to avoid a paper launch, then the price will be decent, and with sometimes double the performance over the last generation, what else is out there?

To me it seems we will now choose graphics cards according to how much electricity we can afford on "Nvidia's electricity bill plan" with their variable consumption: choose not only on price and performance but also on how much electricity you can afford to pay monthly. The card may finally be affordable, but you could still face financial ruin based on the price of energy and how much these cards eat every second of a demanding game. Got a bonus at work this month? Go play Metro Exodus or Call of Duty on your brand new 4090. No bonus? Drop the power limit to minimum and you can only afford Windows Solitaire! I'm exaggerating, of course, but it seems Nvidia finally thought about the consumer: you choose how much power you spend.
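The core-count gaps and the electricity-bill worry being argued over can both be sanity-checked with a quick script. The Ada core counts are the leaked figures quoted earlier in the thread (unconfirmed); the Ampere counts are the official specs; the gaming hours and electricity rate at the end are made-up assumptions for illustration:

```python
# CUDA core counts: Ada numbers are leaked/unconfirmed, Ampere are official.
cores = {
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
    "RTX 3090": 10496,
    "RTX 3080": 8704,
}

# Gap between flagship and the next card down, this gen vs last gen.
ada_gap = cores["RTX 4090"] - cores["RTX 4080 12GB"]   # 8704
ampere_gap = cores["RTX 3090"] - cores["RTX 3080"]     # 1792
print(ada_gap, ampere_gap, round(ada_gap / ampere_gap, 2))  # 8704 1792 4.86

# Rough monthly energy cost for a 450 W card: assuming 4 h/day of gaming,
# 30 days, at an assumed $0.40/kWh (both figures are illustrative only).
kwh = 0.450 * 4 * 30
print(f"~{kwh:.0f} kWh, ~${kwh * 0.40:.2f}/month")  # ~54 kWh, ~$21.60/month
```

Even against the 16 GB 4080 the gap is 6,656 cores, so "more than triple" the 3080-to-3090 gap holds either way.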