Rumor: GeForce RTX 4090 66% quicker than RTX 3090 Ti and 82% faster than RTX 3090.
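Taken at face value, the two rumored figures also imply a gap between the 3090 Ti and the 3090. A quick back-of-the-envelope check (assuming both percentages refer to the same workload, which the rumor does not state):

```python
# If the 4090 is 1.66x a 3090 Ti and 1.82x a 3090 on the same workload,
# the implied 3090 Ti advantage over the 3090 falls out by division.
ratio_4090_vs_3090ti = 1.66
ratio_4090_vs_3090 = 1.82

implied_3090ti_vs_3090 = ratio_4090_vs_3090 / ratio_4090_vs_3090ti
print(f"Implied 3090 Ti over 3090: {(implied_3090ti_vs_3090 - 1) * 100:.1f}%")
# roughly +9.6%, in line with typical 3090 Ti vs 3090 benchmark gaps
```

The ~10% implied delta is consistent with published 3090 Ti reviews, so the two rumored numbers at least don't contradict each other.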

https://forums.guru3d.com/data/avatars/m/246/246088.jpg
TheDigitalJedi:

The mid to high end 3000 and 6000 series cards are still phenomenal performers.
Something people forget when a new card comes out: the performance of current cards in current games doesn't drop just because a newer, faster thing is out. Your 60fps@4K today in whatever game will still be 60fps@4K next year in the same game. I changed my 1080Ti to a 6900XT so I could get a constant 60fps in DCS (60Hz monitor), so I wouldn't have upgraded to a 7900XT. BUT, I've now got a 144Hz monitor, so I'm again chasing those frames 😀
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
If true, Nvidia really went all out with the 4090. Which to me means... Navi 31 will also be a killer GPU. Nvidia knows their competition better than anyone else. And the 4090 is their all out attempt to avoid being upstaged by N31.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
alanm:

If true, Nvidia really went all out with the 4090. Which to me means... Navi 31 will also be a killer GPU. Nvidia knows their competition better than anyone else. And the 4090 is their all out attempt to avoid being upstaged by N31.
N31 has chiplets, a 384-bit bus, and V-Cache; hell, it could be even faster than the 4090. It's better for us anyway, let them compete. 🙂
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
Undying:

It's better for us anyway, let them compete. 🙂
Exactly. Nothing advances unless there's an opposing force to exert pressure.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
TheDigitalJedi:

The mid to high end 3000 and 6000 series cards are still phenomenal performers. Some have stated they don't care about the latest and greatest top-of-the-line cards; they just want to see a decent bump to enjoy their games better. I told this particular bunch to grab an Nvidia or AMD card now while prices are down. Even with next gen around the corner, they don't care about that. Some of them have 960M laptops and desktops with 1050-1060 cards in them. Going from that to a 3060 - 6700 or higher GPU would be a huge leap for them.
It seems like the shops here in Finland foresaw the situation and stopped ordering new stock. The prices are actually getting higher again and the models offered are few. So, unless you order from abroad, this is no time to get a new card over here, apart from entry-level ones, perhaps. From all the posts I'm reading here, it seems like in other countries the stores didn't orchestrate things like this and ended up with a lot of cards they need to get rid of before the next gen.
https://forums.guru3d.com/data/avatars/m/250/250418.jpg
Undying:

Apparently stock is huge as no one is buying them. No wonder the 4090 will be the only 40-series card released this year.
Apparently gamers aren't stupid, and now that Ngreedia has lost the miners buying everything, prices will have to come down to reasonable levels (hopefully?).
https://forums.guru3d.com/data/avatars/m/116/116362.jpg
I've had my 2080Ti for over 3 years now. Never before did I not really need to upgrade at all. I can play everything I want on my 1440p 144hz monitor. Looks good, plays good. Games just aren't progressing that much graphically. Never thought I'd say it has been good value, LOL!
https://forums.guru3d.com/data/avatars/m/260/260828.jpg
Sukovsky:

I've had my 2080Ti for over 3 years now. Never before did I not really need to upgrade at all. I can play everything I want on my 1440p 144hz monitor. Looks good, plays good. Games just aren't progressing that much graphically. Never thought I'd say it has been good value, LOL!
You probably won't need to change cards until the next gen of consoles, or unless you buy a 4K 144Hz monitor.
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Sukovsky:

I've had my 2080Ti for over 3 years now. Never before did I not really need to upgrade at all. I can play everything I want on my 1440p 144hz monitor. Looks good, plays good. Games just aren't progressing that much graphically. Never thought I'd say it has been good value, LOL!
Still got a GTX 1070 myself, but it's on its last legs when it comes to performance. I also want to move to 1440p in the very near future. Sitting it out until the RTX 4060 reviews. Longest I've ever had a GPU.
https://forums.guru3d.com/data/avatars/m/246/246564.jpg
I have to agree, never expected the 2080Ti to last this long. It turned out to be one of the best purchases I've ever made. I don't play any competitive multiplayer or many AAA games anymore, so I might get even more life out of it yet. Unless I just upgrade for the sake of upgrading... unfortunately, I still suffer from that bug every now and then.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
mackintosh:

I have to agree, never expected the 2080Ti to last this long. It turned out to be one of the best purchases I've ever made. I don't play any competitive multiplayer or many AAA games anymore, so I might get even more life out of it yet. Unless I just upgrade for the sake of upgrading... unfortunately, I still suffer from that bug every now and then.
And this is why buying the best card you can just about afford can be a decent long-term investment; my 1080Ti at £600 (on release day) turned out to be good value considering I only replaced it six months ago.
https://forums.guru3d.com/data/avatars/m/256/256969.jpg
pegasus1:

And this is why buying the best card you can just about afford can be a decent long-term investment; my 1080Ti at £600 (on release day) turned out to be good value considering I only replaced it six months ago.
I do believe buying the higher-tier card at a given time is the lesser evil if you take into account resale value, the expected time before the next generation comes out, and performance longevity.
data/avatar/default/avatar03.webp
icedman:

Still waiting for something in the $500-600 CAD range that offers a reasonable jump in performance; getting a 6650 XT or 3060 just feels wrong when I got my 1080 for $550 CAD.
Agreed, and I'm hoping the prices of the 3060Ti/3070/Ti and 3080 come down a bit more, and on the AMD side the 6750 and 6800, which for 1080p high refresh and 1440p are immense and really more than enough for a long time to come. The 3060Ti is down to GBP 440 new in the UK and circa GBP 370 on eBay. AMD is even better, with the 6700XT down below GBP 350 on eBay and new at GBP 349! The 6800 is down to GBP 500 on eBay and GBP 580 new. I have a feeling that prices will come down a little further, especially on eBay, where a decent bargain can be picked up!
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
vestibule:

The power consumption thing is also getting me down. Like, I feel we need to lobby Nvidia to keep energy consumption at 200 watts or less. Save the planet and all that. But then that doesn't help or change anything... So all in all a pointless post from me. :)
AMD needs to be lobbied too. 200W is asking too much but I'd say 300W is a good stock maximum, since that works with 2x 8-pin connectors and can be comfortably cooled with a dual-slot heatsink. It's power hungry but not outrageous. It is possible to get a lot of performance out of 300W but so long as both Nvidia and AMD keep moving goal posts, they don't have to try that hard, they just have to get better performance-per-watt than the other. Even with a 300W limit, nothing should prevent AIB partners from pushing the limits for overclockers and enthusiasts. I suspect the next generation might be the highest wattage parts we'll ever see. They're just way too impractical and expensive to justify for gamers. I suspect even wealthy people are going to have a hard time justifying it, considering the heat they dump into the room. With mining dying out, these likely won't sell well.
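The 300 W stock figure in the post above lines up with the PCIe power-delivery limits: an x16 slot supplies up to 75 W and each 8-pin auxiliary connector up to 150 W. A quick sketch of that budget:

```python
# Power budget for a card fed by the PCIe slot plus two 8-pin connectors,
# using the PCIe CEM connector limits referenced in the post above.
SLOT_W = 75          # x16 slot limit
EIGHT_PIN_W = 150    # per 8-pin (6+2) connector

max_board_power = SLOT_W + 2 * EIGHT_PIN_W
print(f"Max board power with 2x 8-pin: {max_board_power} W")
# 375 W total, so a 300 W stock limit leaves ~25% headroom for boost/OC
```

That headroom is why 2× 8-pin designs could comfortably run 300 W cards without the new 12VHPWR connector.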
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Undying:

Apparently stock is huge as no one is buying them. No wonder 4090 will be only 40 series card released this year.
No one is buying them because they are still priced for miners, not for gamers. Not to mention that the cards are almost 2 years old... If they drop the prices accordingly, then they will start to sell the cards again.
https://forums.guru3d.com/data/avatars/m/250/250418.jpg
schmidtbag:

AMD needs to be lobbied too. 200W is asking too much but I'd say 300W is a good stock maximum, since that works with 2x 8-pin connectors and can be comfortably cooled with a dual-slot heatsink. It's power hungry but not outrageous. It is possible to get a lot of performance out of 300W but so long as both Nvidia and AMD keep moving goal posts, they don't have to try that hard, they just have to get better performance-per-watt than the other. Even with a 300W limit, nothing should prevent AIB partners from pushing the limits for overclockers and enthusiasts. I suspect the next generation might be the highest wattage parts we'll ever see. They're just way too impractical and expensive to justify for gamers. I suspect even wealthy people are going to have a hard time justifying it, considering the heat they dump into the room. With mining dying out, these likely won't sell well.
Don't underestimate wealthy people: is it stupid to have a 450W+ TDP card and then a powerful AC to cool your room? Yes. But they can afford it easily. Personally, I had AC too (it's broken now), but I'm more conscious about it. I don't run 4K or need 240 FPS, so there's no way I'm ever going past 200W TDP. Electricity is expensive too.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
At this rate, I can imagine that GPU sales are no longer "just" the current lineup (read: the 7000-series AMD lineup and 4000-series Nvidia lineup), with old cards hardly wanted; instead, a two-layer market of fresh GPUs might be a thing, in addition to used mining GPUs and whatever aftermarket there is. As in, 3000-series cards that just dipped below MSRP might be popular, especially since a lot of people seem to prioritize lower TDP. Then the 4000 series for enthusiasts and general new buyers, and used cards dating back to Nvidia's 2000 series. All of that with GPU manufacturers complaining "they don't buy our GPUs!" in the near future, because cards are still priced like they should have been at their 2019 release, to maximize profit per SKU sold rather than total SKU sales.
https://forums.guru3d.com/data/avatars/m/267/267641.jpg
Well, I want to know the performance improvement at the same power. Adding more performance through more power is cheap.
https://forums.guru3d.com/data/avatars/m/271/271560.jpg
I actually believe this rumor. But right now Nvidia has scaled the mountain of monolithic chip design, which is good, but double-edged. The halo card (until the 4090Ti) is at the extreme end of "acceptable" power draw and is sure to have a lower yield than any AMD design (as the die area is maxed out), but it is a uArch masterpiece nevertheless. If the RX7700 rumors are true (the last monolithic GPU in the 7xxx series), then Nvidia is going to have a very, very hard time dealing with the RX7800/7900 despite the marvelous generational uptick. The RTX 4090 cannot compete on price, as the RX7800/7900 will be at least 25% less expensive to produce, with an over 30% higher yield for the chiplets vs any monolithic design. And as far as power draw goes, the MCM draw may look large to AMD fans, but it will be less than the 4090 while having the equivalent of several GPUs inside. The real winner this series is the midrange cards from both Nvidia and AMD.
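The yield argument in the post above can be illustrated with the standard Poisson defect-density model, Y = exp(-D·A). The die areas and defect density below are illustrative assumptions, not actual AD102 or Navi 31 figures:

```python
import math

# Toy Poisson yield model: Y = exp(-D * A), with D the defect density
# (defects per cm^2) and A the die area (in mm^2, converted to cm^2).
# Die sizes and defect density are hypothetical, for illustration only.
def die_yield(area_mm2: float, defects_per_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * area_mm2 / 100)

D = 0.1  # assumed defects per cm^2
mono = die_yield(600, D)      # one large monolithic die
chiplet = die_yield(300, D)   # one of several smaller chiplets

print(f"monolithic yield: {mono:.1%}")   # ~54.9%
print(f"chiplet yield:    {chiplet:.1%}")  # ~74.1%
print(f"relative gain:    {(chiplet / mono - 1):.0%}")  # ~35%
```

Under these assumed numbers, halving the die area improves per-die yield by roughly a third, which is the shape of the ">30% higher yield" claim, though the real advantage depends on actual defect densities and the cost of packaging the chiplets back together.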
data/avatar/default/avatar09.webp
Cooling this thing is going to suck, even for those of us with large water loops.