NVIDIA Preps RTX 3060 8GB, 3070 16GB and RTX 3080 20GB Graphics cards

Hilbert Hagedoorn:

You're going to be seeing Gigabyte mentioned in a couple of news items today. This news is not per se 'new', as we all know they're coming, but Gigabyte did kind of confirm it. ... NVIDIA Preps RTX 3060 8GB, 3070 16GB and RTX 3080 20GB Graphics cards
This spot indeed confirms what we've been saying for weeks now: there will be a 20GB version of the RTX 2080. Really? A 20GB 2080? :P
Is the RTX 3090 just 9% faster than the RTX 3080? And if so, why? The first answer that comes to my mind is that the increase in core count in the 3000 series is just too big for software stacks to handle. While the drivers will (probably) have been updated to handle the massive throughput, game code and engines have to scale up to take advantage of the available processing power as well. This is a bit like games being optimized primarily to take advantage of just one core and not scaling perfectly.
https://wccftech.com/nvidia-geforce-rtx-3090-teclab-review-leaked/
Undying:

It will also add power consumption. We are looking at 400W+ for the 20GB version, ouch.
Don't care - I have a Corsair AX1200 that I can finally put to good use. What I care about is the price increase for this version, and if/when there will be a Ti version of the 3080, or whether this "S" 20GB version is all that we will get.
I'm beginning to wonder how seriously the 3090 is crippled by air cooling, when the true performance gain is only 10% with 20% more shaders. Sounds like throttling to me, either due to temperature or power limits of some sort. But seriously, if they push a $1500 card to beat the 3080 by 10%, they'd better save themselves the manufacturing costs. Or they should start to market that thing as an AIO or custom-loop beast, where it shines. The 3080 to 3090 only being a 10% difference would be a total fail. Nobody besides professional users would buy that; there's no need to rank it with the gaming cards besides the bragging about "having the fastest card" (by way too small a margin over their own product).
fantaskarsef:

I'm beginning to wonder how seriously the 3090 is crippled by air cooling, when the true performance gain is only 10% with 20% more shaders. Sounds like throttling to me, either due to temperature or power limits of some sort. But seriously, if they push a $1500 card to beat the 3080 by 10%, they'd better save themselves the manufacturing costs. Or they should start to market that thing as an AIO or custom-loop beast, where it shines. The 3080 to 3090 only being a 10% difference would be a total fail. Nobody besides professional users would buy that; there's no need to rank it with the gaming cards besides the bragging about "having the fastest card" (by way too small a margin over their own product).
I don't know if the games we have today can "wake up" the 3090, so in my opinion it is a software bottleneck. Like the article says, maybe after driver optimization and games scaling up their software (so you need some game updates), things will be different.
barbacot:

I don't know if the games we have today can "wake up" the 3090, so in my opinion it is a software bottleneck. Like the article says, maybe after driver optimization and games scaling up their software (so you need some game updates), things will be different.
Hmm, might be, but I'm sceptical... benchmarks with RTX and DLSS off should do the trick for now. As for the sentiment of buying: preordering a $1500 GPU for "performance about to come with future updates" sounds a bit ridiculous, with all due respect to such a 10k-shader beast 😀
robredz:

A 3060 will likely be the big seller, depending on price; it needs to be a £320 card at most for the mainstream. That is, unless they are going to bring hardware ray tracing to a possible 3050 at the low end; the new Nvidia Broadcast app for streamers needs RTX to work, as it utilises the Tensor Cores for the AI.
Supposedly, all of the 3000 series cards will have RT/Tensor cores, even the low-end cards (i.e. a 3050 or something).
First 10%, now 19%. Give me legitimate reviews, please.
We have the first benchmark results for the upcoming flagship TITAN-class graphics card, the GeForce RTX 3090. NVIDIA GeForce RTX 3090 is ~19% faster than RTX 3080: the GeForce RTX 3090 scores 20387 'graphics' points in the Time Spy 1440p preset. The same card scores 10328 points in the Time Spy Extreme preset. This means that the card is on average 19% faster than the reference-clocked RTX 3080.
https://videocardz.com/newz/nvidia-geforce-rtx-3090-3dmark-time-spy-scores-leaked
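A quick sanity check on those leaked numbers: the two 3090 scores plus the "~19% faster" claim imply roughly the following reference RTX 3080 scores. Note the 3080 figures below are inferred from the percentage, not part of the leak itself:

```python
# Back-of-the-envelope check of the leaked Time Spy numbers quoted above.
# The reference RTX 3080 scores were not in the leak, so they are inferred
# here from the "~19% faster" claim (an assumption, not leaked data).
rtx3090_timespy = 20387   # 'graphics' points, Time Spy 1440p preset (leaked)
rtx3090_extreme = 10328   # Time Spy Extreme preset (leaked)
speedup = 1.19            # claimed average advantage over a reference 3080

implied_3080_timespy = round(rtx3090_timespy / speedup)
implied_3080_extreme = round(rtx3090_extreme / speedup)
print(implied_3080_timespy, implied_3080_extreme)
```

So the claim only holds if a stock 3080 lands somewhere around 17.1k graphics points in Time Spy; anyone with a reference-clocked card can compare against their own run.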
And I have yet to see what the larger VRAM versions have to do with the "S" label. No S card I remember got more VRAM than the non-S version, so I'm not sure why people here are even "pushing" the idea that those two are connected, completely ignoring that there is more than one word that starts with an S besides "super".
barbacot:

I don't know if the games we have today can "wake up" the 3090, so in my opinion it is a software bottleneck. Like the article says, maybe after driver optimization and games scaling up their software (so you need some game updates), things will be different.
You can say that of every product, though. If games focused more on multi-threading past 4-6 cores, maybe AMD CPUs would be on top. But they don't, and the reality remains that Intel still dominates gaming. You can always blame optimization, but you can only evaluate a product based on how it performs in real life versus how much it costs.
Looks like it's probably gonna be at least March or even summer 2021 before I even consider making the switch to Ampere.
It's looking like a pretty good choice to just wait until AMD cards are out and reviewed. By then we might have larger VRAM options from Nvidia too. Patience is a virtue.
JamesSneed:

It's looking like a pretty good choice to just wait until AMD cards are out and reviewed. By then we might have larger VRAM options from Nvidia too. Patience is a virtue.
That's what I was already planning on doing, but it looks like everyone else doesn't have a choice.
The 3090 is a Titan-like card; it will mainly be used in rendering workstations. That is why it has a lot of RAM, has SLI support, and even has blower-type cards being released. As always, gamers who want the best at any price will buy it too. The lack of a performance upgrade in games is not a problem; for pro apps, scaling will be much better. Why did they call it a 3090? To have it reviewed by everyone and put up against Big Navi, and to ensure the performance crown stays with Nvidia.
sbacchetta:

The 3090 is a Titan-like card; it will mainly be used in rendering workstations. That is why it has a lot of RAM, has SLI support, and even has blower-type cards being released. As always, gamers who want the best at any price will buy it too. The lack of a performance upgrade in games is not a problem; for pro apps, scaling will be much better. Why did they call it a 3090? To have it reviewed by everyone and put up against Big Navi, and to ensure the performance crown stays with Nvidia.
It's possible, and you are right about the 3090 use cases, but I have no doubt that the 3080 will still be the performance king versus Big Navi, despite the rumors. For many years AMD has been competing well on quality/price, but the top of the range has been unattainable for them, and even more so if we include DLSS in the mix...
clopezi:

It's possible, and you are right about the 3090 use cases, but I have no doubt that the 3080 will still be the performance king versus Big Navi, despite the rumors. For many years AMD has been competing well on quality/price, but the top of the range has been unattainable for them, and even more so if we include DLSS in the mix...
I have no particular opinion right now on whether Big Navi will beat the 3080 or not. I just wanted to say that Nvidia saw Big Navi as a potential threat and wanted to be sure of keeping the performance crown in the consumer space.
HybOj:

They have every reason to be angry if they buy a $700 card which is made obsolete a few months later by the same company that sold them the first one.
The performance and the specs of the card you bought don't change just because a new one came out. That's the entire point. What if AMD releases the best video card ever, making everything else "obsolete" as you put it; will those people then blame either AMD or NVIDIA for not warning them beforehand that this might happen? Every sensible media outlet will tell you not to buy right now but to wait.

Surely a 20GB model will cost quite a bit more. I would assume at least a cool hundred on top, probably more. Does that really constitute "obsoleting" the 10GB model, considering there are many cases where the extra memory isn't needed (and it isn't even conclusively proven that it is needed at all)? It's the same nonsense as with 20-series buyers supposedly being sad about how much better the 30 series is, or something. If you bought one in the last couple of months or so, you just didn't do your research, or you desperately needed one. But if you've had it for a while, why would anyone complain? You got the hardware you paid for, and something better coming out doesn't change that. If you don't want your $1200 2080 Ti to lose massively in value, then you shouldn't buy one, because peak products like that lose their value the fastest, since their price/performance ratio is already terrible.

Consider this: would it really change anything if NVIDIA flat out told you that a 20GB variant is coming in a few months? We did practically know this before launch, just not "officially". But I don't think it would change anything. People would still buy it on launch day, and still complain that it's not enough. Ultimately, there will always be a better card or a better price in the future. That's just the progress of tech.

TL;DR: Don't impulse buy. Do the research when spending hundreds of dollars on a commodity of all things. And if you game at 4K and think 10GB isn't enough for you, then wait for another card with more memory. You don't *need* a 3080 on the first day. Chances are you didn't get one anyway, so this is a good chance to take a breather and consider what it is you really want.
Will be interesting to see what the S variants turn out to be, but chances are they're the rumored Ti variants. And yeah, there's going to be a bit of a price premium for those, for both the plain higher-memory variants and the higher-memory S-model GPUs; going by the Gigabyte naming database that was shown, there are going to be both. Doesn't really invalidate the 3080s or 3070s though, still excellent cards. 🙂 EDIT: Unsure what the 3090's position will be though, since it costs a fair bit more for a small performance gain over the 3080, so that's an interesting position. Eh, it'll probably still be the fastest GPU overall, but instead of up to 20% faster it might be up to 10%, and with 4 GB of extra VRAM instead of 14 GB.
It's kinda like the Fury X's RAM problem. Even that card is still supported despite its RAM shortcomings. My feeling on this situation is that I wouldn't be able to afford the 20GB version anyway, and I would've been more upset if there wasn't a sub-£600-700 option in the initial announcement. The pricing of the RTX 3080 as-is is what got people interested in it. Now imagine the 10GB card doesn't exist: Nvidia already showed us the top card with 24GB of RAM, but for a lot more money. Everything between that card and the RTX 3080 will cost less than the RTX 3090 but more than the RTX 3080. I don't think it's a scam; these are still great products for the price. If it had been £1000 for the RTX 3080 from the get-go, I would've said no and probably ranted like everyone else. Nope, they gave us something more of us could afford. Although, for transparency and longevity, I can see why people wanted more acknowledgment that different RAM-size versions would become available once production ramped up. It's a rock and a hard place tbh. I'd be happy to see 12 and 16GB versions for a much smaller increase in price.