NVIDIA cancels the GeForce RTX 4080 12GB

https://forums.guru3d.com/data/avatars/m/204/204717.jpg
Imo, it should've been illegal for them to sell a product called "a 4080" that is entirely different from the other 4080. It would've been fun to buy it on Amazon (or somewhere) and return it for a misleading description lol.
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
Krizby:

Probably some jebait moment because Nvidia knew something about upcoming RX 7000 price/perf LOL
All this talk of RDNA3 being weak is seriously weird to me. The 6900XT had 5120 cores and the 3090 had 10496. Sure, the architectures are totally different, but AMD still managed to nearly match or beat the 3090 in general raster performance with this architecture. The 7900XT is rumoured to have around 12288 cores, which is OVER DOUBLE that of the 6900XT. Now factor in other architecture changes (MCM) and clock speed increases: the 6900XT could already hit 2.6GHz easily, so 3GHz+ isn't out of the question on the 7900XT. Now go to the 4090 with 16k cores, which is nowhere near double that of the 3090. Factor in architecture changes and, more importantly, the huge clock speed increases and insane power draws, and it's easy to see where the 40 series gets its performance from. Unless something went majorly wrong in RDNA3 development, given the current rumoured specs it would be extremely surprising to see them not at least match the 4090 in raster performance. RT performance is another story, but with the MCM design maybe AMD will use more of the die space for RT enhancements. DLSS 3 with its frame generation is, I think, where Nvidia could really pull ahead of AMD.
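To put rough numbers on that argument, here's a quick back-of-envelope sketch (Python, purely illustrative). It assumes the rumoured 12288-core / ~3.0GHz figures for the 7900XT and naive linear scaling with cores × clock, which real GPUs never achieve, so treat the results as upper bounds, not predictions. The ~2.7GHz (4090) and ~1.9GHz (3090) boost clocks are likewise rough assumptions:

```python
# Back-of-envelope GPU scaling: assumes raster performance scales linearly
# with cores * clock. Real scaling is always worse, so these are upper bounds.

def naive_scale(cores_new: int, clock_new: float,
                cores_old: int, clock_old: float) -> float:
    """Relative throughput if performance scaled perfectly with cores * clock."""
    return (cores_new * clock_new) / (cores_old * clock_old)

# 7900XT (rumoured: 12288 cores @ ~3.0 GHz) vs 6900XT (5120 cores @ ~2.6 GHz)
print(f"7900XT vs 6900XT: {naive_scale(12288, 3.0, 5120, 2.6):.2f}x")   # ~2.77x

# 4090 (16384 cores @ ~2.7 GHz) vs 3090 (10496 cores @ ~1.9 GHz)
print(f"4090 vs 3090:     {naive_scale(16384, 2.7, 10496, 1.9):.2f}x")  # ~2.22x
```

Even heavily discounted for real-world scaling losses, the cores-times-clock jump on AMD's side is the bigger one, which is the point being made above.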
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Undying:

Or they knew how bad the card was and how people would rip it apart in the reviews.
Slikar:

Now they're gonna release the same chip and name it 4070, but keep the same price, which would suck even more than naming it a 4080. Or maybe, just maybe, they'll lower the price as well for this chip.
For fairness' sake, if you review an x070, you won't expect performance nearly as good as from an x080. So Nvidia wouldn't need to be that embarrassed about the performance, whatever it turns out to be. The only mistake was calling it a 4080 when there already was a stronger 4080. As for the prices, as we have seen, they are unprecedented, so it is what it is. Nvidia alone decides the prices; nothing prevents them from making the 4070 cost 1000 euros. In such a case, consumers can only pray AMD will offer something equal for a cheaper price. If virtually nobody is willing to pay the price and 1000-euro 4070s collect dust on shop/warehouse shelves, it will send Nvidia the only message that matters.
https://forums.guru3d.com/data/avatars/m/263/263271.jpg
fredgml7:

Maybe price cutting is the next step. If people don't buy, I would bet on that.
They better do that.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
Aahahhahahhahaha... hahahaha... ...hahaha...haha, haha, haha 😀 ....hahhahaha...hahaha... 😀 Sorry, I'll stop. ... I can't stop. HAHAHAHAHA!
data/avatar/default/avatar27.webp
CPC_RedDawn:

All this talk of RDNA3 being weak is seriously weird to me. The 6900XT had 5120 cores and the 3090 had 10496. Sure, the architectures are totally different, but AMD still managed to nearly match or beat the 3090 in general raster performance with this architecture. The 7900XT is rumoured to have around 12288 cores, which is OVER DOUBLE that of the 6900XT. Now factor in other architecture changes (MCM) and clock speed increases: the 6900XT could already hit 2.6GHz easily, so 3GHz+ isn't out of the question on the 7900XT. Now go to the 4090 with 16k cores, which is nowhere near double that of the 3090. Factor in architecture changes and, more importantly, the huge clock speed increases and insane power draws, and it's easy to see where the 40 series gets its performance from. Unless something went majorly wrong in RDNA3 development, given the current rumoured specs it would be extremely surprising to see them not at least match the 4090 in raster performance. RT performance is another story, but with the MCM design maybe AMD will use more of the die space for RT enhancements. DLSS 3 with its frame generation is, I think, where Nvidia could really pull ahead of AMD.
Well, it doesn't always translate between architectures, as you know. The 2080 Ti has 4352 cores, whereas the 3090 has 10496 cores... however, the 3090 is "only" about 70% faster, whereas from what I've tested, my 4090 Strix is 85% faster than my 3090 Strix, despite only having 60% more cores. The 7900XT might end up having faster rasterization performance than the 4090, but I'm just saying that you can't conclude it based on the specs alone. And fyi, the 4090 doesn't actually have crazy power draw; my 4090 Strix draws less power than my 3090 Strix 🙂
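A quick illustration of that point, using the same rough figures (Python, purely illustrative; the ~70% and ~85% speedups are the estimates from the post above, not measured benchmark data):

```python
# How much of the raw core-count increase showed up as actual speedup?
# 1.0 means perfectly linear scaling with core count.

def scaling_efficiency(speedup: float, cores_new: int, cores_old: int) -> float:
    """Observed speedup divided by the raw core-count ratio."""
    return speedup / (cores_new / cores_old)

# 3090 vs 2080 Ti: ~1.70x faster despite having 2.41x the cores (10496 vs 4352)
print(f"{scaling_efficiency(1.70, 10496, 4352):.2f}")   # ~0.70, well below linear

# 4090 vs 3090: ~1.85x faster with only 1.56x the cores (16384 vs 10496)
print(f"{scaling_efficiency(1.85, 16384, 10496):.2f}")  # ~1.19, above linear (clocks help)
```

Two consecutive generations, two very different outcomes from adding cores, which is exactly why specs alone can't settle the 7900XT-vs-4090 question.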
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Karma jacket. xD Now make it a €600 4070 and it just might sell.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
Are there any estimates of 4090 sales compared to previous launches? Somehow I have a feeling not many are willing to shell out upwards of €2000 for a GPU these days. The US is probably not faring any better...
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
JamesSneed:

I have leaked photos of the new card:
I reckon the board partners can order stylish stickers from China, delivered in a week, that are handy to cover the "4080" and replace it with "4070". Adventurous buyers can take a hairdryer and carefully peel the sticker off, instantly feeling better about their overpriced 4070 when it transforms into a 4080, at least as far as the label goes.
https://forums.guru3d.com/data/avatars/m/34/34585.jpg
The 4080 12GB was too gimped to be in the 4080 stack: reduced memory bandwidth, reduced cores, reduced RT cores, reduced tensor cores, reduced cache, and for what? Cards Nvidia is already selling, just without DLSS 3, which has its own problems and is something I would not use anyway because the latency hit is too much.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Dazz:

The 4080 12GB was too gimped to be in the 4080 stack: reduced memory bandwidth, reduced cores, reduced RT cores, reduced tensor cores, reduced cache, and for what? Cards Nvidia is already selling, just without DLSS 3, which has its own problems and is something I would not use anyway because the latency hit is too much.
I guess Jensen bit off more than he could chew, and also wasn't willing to sell the two-year-old 3000 series below MSRP. Serves him right.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Undying:

I think this decision is ultimately bad for us. I would rather see them rename it to a 4070/4070 Ti than cancel it. Now we've ended up with a $1200 4080 and a $1600 4090, so the 40 series is out of reach for most people.
mackintosh:

I'm assuming they're just going to rebadge it and relaunch it as a 4070, or keep it for a 4070Ti later.
I also think they are going to rename it 4070, like it was supposed to be initially. After all, they are not going to throw away the chips they stored for the fake 4080. In the end, I think Nvidia did this because of all the backlash so far regarding the card, so this means we still have (some) power!
https://forums.guru3d.com/data/avatars/m/246/246564.jpg
I think it's a mix of both: the backlash, and the risk of getting hit with misleading-advertising complaints from local regulators, as the name implied the only difference was in memory capacity, which clearly wasn't the case.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
There was no FE "4080 12GB" (aka 4060), so it's the AIB partners that have to eat the cost of this "unlaunch" (whoever came up with that weasel word, please jump off a cliff onto jagged rocks). EVGA are probably laughing pretty hard right now.
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
Dragam1337:

Well, it doesn't always translate between architectures, as you know. The 2080 Ti has 4352 cores, whereas the 3090 has 10496 cores... however, the 3090 is "only" about 70% faster, whereas from what I've tested, my 4090 Strix is 85% faster than my 3090 Strix, despite only having 60% more cores. The 7900XT might end up having faster rasterization performance than the 4090, but I'm just saying that you can't conclude it based on the specs alone. And fyi, the 4090 doesn't actually have crazy power draw; my 4090 Strix draws less power than my 3090 Strix 🙂
Yeah, I totally agree. It was more of a general comparison based on the previous generation and how each side compared in general raster performance. Very basic and rudimentary analysis :P
https://forums.guru3d.com/data/avatars/m/165/165326.jpg
OMG! We all knew it was not an RTX 4080 all this time lol, you can barely call it an RTX 4070. Mr. Leather Jacket is smoking some good shit 😀. What an epic fail for ngreedia! And yes, EVGA made the right choice at the right time, and they are having the last laugh 😛
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
… May I have your attention, please? May I have your attention, please? Will the real Forty-Eighty please stand up? I repeat, will the real Forty-Eighty please stand up? We're gonna have a problem here. --- The reality is that none of these cards are real 4080s... one is the 4060 (Ti?) and one is the 4070 Ti; that AD103 chip doesn't even have all of its SMs enabled (the full die probably has 80, and this one has 76). Close, but no cigar. And the performance "uplift" is a disaster; it appears to be barely on par with the 3080 12GB at anything less than maxed-out 4K settings. https://forums.guru3d.com/attachments/upload_2022-10-15_1-29-1-png.13840/?temp_hash=53a505d24748e838cdeea8e160ae3c5b EPIC FAIL indeed.
https://forums.guru3d.com/data/avatars/m/259/259045.jpg
Good, I'm glad they cancelled that name for it. It would have been confusing af for some people. Just the online benchmarks and videos of the 16GB card not matching a customer's 12GB card would have been a PR nightmare. But does this mean we will see the most expensive 4070/4070 Ti and 4060 Ti ever? Whatever happened to midrange cards? At this point, I would totally understand people flocking back to the latest consoles. Hopefully AMD releases something truly competitive price- and performance-wise.
https://forums.guru3d.com/data/avatars/m/218/218363.jpg
nVidia, come on, you're better than this...