NVIDIA Could Release RTX 3080 20GB and RTX 3070 16GB in December
Chert
If this version of the 3070 has 16GB, does that mean its memory bus will increase to 320-bit (or more), or will it stay 256-bit like the 8GB version?
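For context on the question: with standard GDDR6/GDDR6X parts, each memory chip has a 32-bit interface, so bus width is fixed by the chip count while capacity scales with per-chip density. A rough arithmetic sketch (chip counts and densities here are illustrative, not a confirmed board layout):

```python
# Sketch of the GDDR6/GDDR6X relationship between chip layout and specs:
# bus width = chips x 32 bits, capacity = chips x per-chip density.
# So a 16GB card can keep a 256-bit bus by using denser (2GB) chips.

def gpu_memory_config(num_chips, chip_density_gb):
    """Return (capacity in GB, bus width in bits) for a given chip layout."""
    return num_chips * chip_density_gb, num_chips * 32

# A 256-bit board has 8 chips:
print(gpu_memory_config(8, 1))   # (8, 256)  -> an 8GB, 256-bit card
print(gpu_memory_config(8, 2))   # (16, 256) -> a 16GB card on the same 256-bit bus
# A 320-bit board (10 chips) at 1GB per chip gives 10GB, as on the 3080:
print(gpu_memory_config(10, 1))  # (10, 320)
```

In other words, doubling capacity does not require widening the bus; it only requires higher-density chips (or clamshell mounting, with two chips sharing each 32-bit channel).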
fry178
@Undying
did the performance of the non-S cards drop after the S versions (of each model) were released?
and: are you new to the fact that almost everything gets a faster/better version released at some later point?
so I can assume you went to the dealership and complained that the new model had more horsepower/features than the one you bought?
Calmmo
The 3060/Ti/3070 are GA104, the 3090/3080 are GA102: different silicon.
Fox2232
With the GPU shortage and the number of variants of each card across so many AIBs, potentially doubling the model count may result in even lower availability of many models.
And the only thing that will change as a result is the asking price.
LevelSteam
I bet the ten people that get their hands on that 20gb model will be very impressed.
The Reeferman
On topic: I would be more comfortable getting a 16 or 20GB VRAM graphics card as opposed to 10GB. There were already one or two games, when the 16GB Radeon VII was released, that ran into a VRAM limit on the GTX 1080 Ti/RTX 2080 Ti. I believe it was Tomb Raider at 4K ultra settings. All stutter caused by hitting the VRAM capacity was gone with the 16GB Radeon, even though the Nvidia cards were otherwise faster. I expect more games to exceed 12GB of VRAM in the future. And since I tend to use my hardware for a long time (performance doesn't advance the way it used to, making more than three years feasible), it had better be future-proof.
PrMinisterGR
I just watched a short demo of Shadow of the Tomb Raider with everything on max (with RT), and it was using 10GB of VRAM on a 3090 (so no "it's just caching" excuse). I really think 10GB is the bare minimum for these cards, and a 20GB model, maybe with some extra shaders or clocks, would be a good "Ti" for NVIDIA.
Also, that 3090 was constantly drawing almost 400W. WTF.
Zooke
I'm very interested in some real-life tests of the 20GB vs 10GB 3080s.
As a barely informed consumer, the 20GB model seems a no-brainer to wait for, but I suspect I'm just buying into the hype, however much I try to stay level-headed.
TheSissyOfFremont
It seems like a mental situation to me.
Either 10GB is enough, and doubling it to 20GB is absurd overkill (which will significantly raise the price),
Or
10GB isn't enough, and Nvidia launched a flagship card without enough VRAM, either cynically or incompetently.
I mean, is there really any evidence that we're going to need a doubling of VRAM in the next 2-3 years? Game/GFX professionals weigh in here...
The things that could cause that are (with my layman's understanding): Textures, Resolution, and general asset quality/density?
I guess we will continue to see moderate increases in texture and asset quality to match the standardisation of 4K as the high-quality resolution for PC and consoles, as well as a continued increase in the density of assets in the environment.
How much does VRAM volume affect RT? Not all that much, I would have thought?
Is this something that will allow DirectStorage to do something it otherwise wouldn't be able to?
I'm really curious, because I feel like the next 5-10 years are going to see some big leaps in the technical development of games. But I had assumed that was going to necessitate architectural innovation and significant raw-power increases rather than memory expansion (as a priority, anyway).