NVIDIA Could Release RTX 3080 20GB and RTX 3070 16GB in December

Undying:

Especially when it comes to Nvidia.
And AMD (5700 XT), where many had to wait months to get the black screen issue resolved. Also, "fine wine" means early adopters have to wait a long time for it 😀.
alanm:

And AMD (5700 XT), where many had to wait months to get the black screen issue resolved. Also, "fine wine" means early adopters have to wait a long time for it 😀.
Software issues are there, as you saw, even with Nvidia. They both have driver issues, but at least the 5700 XT wasn't replaced with a 5700 XT Super shortly after. 😉 How would a 3080 user feel after he waited months to get the card and then it's replaced with the actually more future-proof 20GB version?
Undying:

Software issues are there, as you saw, even with Nvidia. They both have driver issues, but at least the 5700 XT wasn't replaced with a 5700 XT Super shortly after. 😉
Yes, SW issues that take months to resolve vs. a few days for Nvidia.
How would a 3080 user feel after he waited months to get the card and then it's replaced with the actually more future-proof 20GB version?
Everyone knows what they are getting beforehand, and there is nothing wrong with the 10GB 3080 for its asking price. You are basically just trolling, as the majority of your 13,000 posts are anti-Nvidia tirades. 🙄

If this version of the 3070 has 16GB, does this mean that its memory bus will increase to 320-bit (or more), or will it still be 256-bit like the 8GB version?
Chert:

If this version of the 3070 has 16GB, does this mean that its memory bus will increase to 320-bit (or more), or will it still be 256-bit like the 8GB version?
No, it's just 8x 16Gb-density GDDR6 modules (2GB per chip) on the same 256-bit bus.
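For anyone curious about the arithmetic behind that answer: each GDDR6 (or GDDR6X) package exposes a 32-bit interface, so the bus width is fixed by the chip count and the capacity by the per-chip density. A minimal sketch of that math in Python, using the rumored configurations from this thread (the card names and chip counts are the thread's assumptions, not confirmed specs):

```python
# Back-of-envelope GDDR6/GDDR6X layout math: every package has a 32-bit
# interface, so bus width follows from chip count, capacity from density.
GDDR6_BITS_PER_CHIP = 32

def memory_config(chips: int, density_gbit: int) -> tuple[int, int]:
    """Return (capacity in GB, bus width in bits) for a given chip layout."""
    capacity_gb = chips * density_gbit // 8  # gigabits per chip -> gigabytes
    bus_bits = chips * GDDR6_BITS_PER_CHIP
    return capacity_gb, bus_bits

# Rumored configurations discussed in the thread (assumed, not confirmed):
for name, chips, density in [
    ("RTX 3070 8GB  (8Gb chips)", 8, 8),
    ("RTX 3070 16GB (16Gb chips)", 8, 16),
    ("RTX 3080 10GB (8Gb chips)", 10, 8),
    ("RTX 3080 20GB (16Gb chips)", 10, 16),
]:
    gb, bits = memory_config(chips, density)
    print(f"{name}: {gb} GB on a {bits}-bit bus")
```

Doubling the density leaves the bus untouched, which is why a 16GB 3070 would stay at 256-bit.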
Undying:

Software issues are there, as you saw, even with Nvidia. They both have driver issues, but at least the 5700 XT wasn't replaced with a 5700 XT Super shortly after. 😉 How would a 3080 user feel after he waited months to get the card and then it's replaced with the actually more future-proof 20GB version?
Until AMD can fix their black screen and FreeSync issues, I have no desire to leave team green.

@Undying Did the performance of the non-S cards drop after the -S versions (of each model) were released? And are you new to the fact that almost everything will have a faster/better version released at a later point? So I can assume you went to the dealership and complained that the new model had more horsepower/features than the one you bought?
Babel-17:

A 3070 Ti could be configured as a salvage setup using 3080 chips, with both 9 and 18 gigabytes of RAM being an option. Nvidia could drop those into their lineup so as to rain on AMD's parade.
The 3080 is a salvaged chip already.

The 3060/Ti/3070 use GA104 and the 3090/3080 use GA102. Different silicon.
Undying:

How would a 3080 user feel after he waited months to get the card and then it's replaced with the actually more future-proof 20GB version?
I feel like the more valid question is how the people who overpaid for a 20GB version of a card will feel in 3-5 years, when they realize they paid more for something that didn't benefit them at all and could have saved that money for a new GPU that actually matters by then.

With the GPU shortage and the number of variants of each card from so many AIBs... potentially doubling the model count may result in even lower availability for many models. And the only thing that will change as a result will be the asking price.

I bet the ten people that get their hands on that 20GB model will be very impressed.

On topic: I would be more comfortable getting a 16 or 20GB VRAM graphics card as opposed to 10GB. There were already one or two games, when the 16GB Radeon VII was released, that ran into a VRAM limit on the GTX 1080 Ti/RTX 2080 Ti. I believe it was Tomb Raider at 4K ultra settings. All the stutter caused by hitting the VRAM capacity was gone with the 16GB Radeon, while the Nvidia cards were faster. I expect more games to go past 12GB of VRAM in the future. And since I tend to use my hardware pretty long (performance doesn't advance as it did back in the day, making >3 years feasible), it had better be future-proof.
The Reeferman:

Later he also tasked this forensic witness statement analyst with analysing the press conference with the three astronauts who were the first to (allegedly) set foot on the moon.
I managed to watch about 1 minute. Very, very hard to watch due to being unable to punch a hole in the "analyst's" face. The idea that you can tell whether someone is lying by checking whether their responses match the expected responses, and the notion that there is anything objective about this, so much so that we should use "analysts" to determine one's guilt in court in an objective manner, is reprehensible. And while we're at it... 90% of all these analysts in the social sciences, psychology, and economics are nothing but glorified straight-out-of-my-ass opinion givers. And yeah, that is just my opinion 🙂 After watching this guy analyse the astronaut for 1 minute, I did not learn anything about the astronaut or the Apollo mission; the analyst did convince me beyond any reasonable doubt that I want to break his jaw.
LevelSteam:

I bet the ten people that get their hands on that 20GB model will be very impressed.
By the time they release the higher-memory models, they should have fixed their production and stock levels, I think.

I just watched a short demo of Shadow of the Tomb Raider with everything on max (with RT), and it was using 10GB of VRAM on a 3090 (so no "it's just caching" crap). I really think that 10GB is the bare minimum for these cards, and a 20GB model, maybe with some extra shaders or clocks, would be a good "Ti" for Nvidia. Also, that 3090 was constantly drawing almost 400W. WTF.
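If you want to watch that number yourself, NVIDIA exposes allocated VRAM through NVML, and the pynvml Python bindings can poll it while a game runs. A minimal sketch (keep in mind NVML reports memory that is allocated, which is exactly why the caching question raised below is hard to settle this way):

```python
# Poll VRAM usage on the first GPU once a second via NVML (pip install pynvml).
# NVML reports allocated memory, so this alone cannot distinguish assets a
# frame actually needs from data a game caches just because the space is free.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB allocated")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```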
PrMinisterGR:

I just watched a short demo of Shadow of the Tomb Raider with everything on max (with RT), and it was using 10GB of VRAM on a 3090 (so no "it's just caching" crap). I really think that 10GB is the bare minimum for these cards, and a 20GB model, maybe with some extra shaders or clocks, would be a good "Ti" for Nvidia. Also, that 3090 was constantly drawing almost 400W. WTF.
Lots of people here would even buy a 500W+ capable GPU board for bragging rights. WTF isn't enough.
PrMinisterGR:

I just watched a short demo of Shadow of the Tomb Raider with everything on max (with RT), and it was using 10GB of VRAM on a 3090 (so no "it's just caching" crap). I really think that 10GB is the bare minimum for these cards, and a 20GB model, maybe with some extra shaders or clocks, would be a good "Ti" for Nvidia.
You don't really know if the 10GB "used" is entirely used or needed, or whether caching occurs to some degree despite it being a 24GB card. Caching does not necessarily mean it will use up the entire VRAM. Again, the real answers will be known when 8 and 16GB versions, or 10 and 20GB cards, of the same model are tested. The results must show a difference in actual performance (fps, frame times, etc.) to be meaningful, NOT how much VRAM they appear to "use".
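For what it's worth, that comparison is easy to score once you have per-frame frame times from both cards. A rough sketch, assuming plain-text logs with one frame time in milliseconds per line (the file names and log format here are hypothetical; tools like PresentMon or CapFrameX export something similar as CSV):

```python
# Compare average fps and 1% lows between two capture runs. Running out of
# VRAM tends to leave averages alone and show up in the 1% lows as stutter.
import statistics

def fps_summary(frametimes_ms: list[float]) -> tuple[float, float]:
    """Return (average fps, 1% low fps) for one capture run."""
    avg_fps = 1000.0 / statistics.fmean(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    slowest_1pct = worst[: max(1, len(worst) // 100)]
    low_1pct_fps = 1000.0 / statistics.fmean(slowest_1pct)
    return avg_fps, low_1pct_fps

def load_frametimes(path: str) -> list[float]:
    """One frame time in milliseconds per line (hypothetical log format)."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

for label, path in [("8GB card", "run_8gb.txt"), ("16GB card", "run_16gb.txt")]:
    avg, low = fps_summary(load_frametimes(path))
    print(f"{label}: {avg:.1f} fps average, {low:.1f} fps 1% low")
```

If the larger-VRAM card only wins in the 1% lows, that is the signature of the smaller card evicting and re-fetching assets over PCIe rather than a raw performance gap.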

I'm very interested in some real-life tests of 20GB vs 10GB 3080s. As a barely informed consumer, the 20GB model seems a no-brainer to wait for, but I do suspect I am just believing the hype, however much I try to keep myself level-headed.

It seems like a mental situation to me. Either 10GB is enough, and doubling that with a 20GB model is absurd overkill (and it will significantly raise the price), or 10GB isn't enough and Nvidia launched a flagship card without enough RAM, either cynically or incompetently.

I mean, is there really any evidence that we're going to need a doubling of VRAM in the next 2-3 years? Game/GFX professionals, weigh in here... The things that could cause that are (with my layman's understanding): textures, resolution, and general asset quality/density? I guess we will continue to see moderate increases in texture and asset quality to match the standardisation of 4K as the high-quality resolution for PC and consoles, as well as the continued increase in the density of assets in the environment.

How much does VRAM volume affect RT? Not all that much, I would have thought? Is this something that will allow DirectStorage to do something it otherwise wouldn't be able to?

I'm really curious, because I feel like the next 5-10 years are going to see some big leaps in the technical development side of games. But I had assumed that was going to necessitate architectural innovation and significant raw power increases rather than memory expansion (as a priority, anyway).
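For a rough sense of scale on the texture question, some back-of-envelope math (my own illustrative figures, not numbers from the thread): block-compressed textures cost about a byte per texel, a full mip chain adds roughly a third on top, and even a 4K HDR render target is small next to a few hundred high-resolution textures.

```python
# Rough VRAM cost of common GPU resources. BC7 block compression stores a
# 4x4 texel block in 16 bytes (1 byte/texel); a full mip pyramid adds ~1/3.
MIP_CHAIN_FACTOR = 4 / 3

def texture_mb(size_px: int, bytes_per_texel: float) -> float:
    """Estimated size of a square texture with a full mip chain, in MB."""
    return size_px * size_px * bytes_per_texel * MIP_CHAIN_FACTOR / 1024**2

print(f"4096x4096 BC7 texture:        {texture_mb(4096, 1.0):>5.0f} MB")  # ~21 MB
print(f"4096x4096 uncompressed RGBA8: {texture_mb(4096, 4.0):>5.0f} MB")  # ~85 MB

# A 3840x2160 RGBA16F render target (8 bytes per pixel, no mips):
rt_mb = 3840 * 2160 * 8 / 1024**2
print(f"4K HDR render target:         {rt_mb:>5.0f} MB")  # ~63 MB
```

At roughly 21 MB per compressed 4K texture, a few hundred unique textures already add up to several gigabytes, which is why texture resolution and asset density are usually the first places a VRAM budget goes.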