Rumor: Nvidia RTX 3080 Ti could have more Graphics Memory than ever

https://forums.guru3d.com/data/avatars/m/273/273838.jpg
I haven't seen a game that actually uses more than 9GB at 4K. Then again, Nvidia could market it as a 5K or 8K "capable" GPU. I think the new Unreal engine is going to be very VRAM-heavy though, so let's wait and see. It's going to be a tough task to justify the added cost of that amount of memory on an already very expensive GPU. As for the "future-proof" argument, I think that on most occasions the GPU chip itself becomes the bottleneck, not the VRAM the card packs.
https://forums.guru3d.com/data/avatars/m/260/260048.jpg
fry178:

@cryohellinc lol. Funny how you say you don't care what/how others are gaming, yet turn around stating 4K gaming is overrated. Maybe to you, but that's irrelevant to the planet and millions of gamers.
I'm a horrible, disgusting, biased old Slav. And 4K gaming IS overrated - anything smaller than 30 inches is tinfoil-hat territory when it comes to 4K and screen size. Most people won't be able to distinguish 4K from 1440p at that size - slap on top of it the actual hardware requirements needed to drive those pixels. And yes, this is my probably biased opinion based on practical usage. If someone wants to build a 3080 Ti rig for 4K gaming @120+fps - by all means. For me, if I had done that, I would first of all get one of those BFG-type monitors (43+ inches) to actually get a good experience, instead of sitting with horsepower like that on a 27-inch screen (my friend has that).
https://forums.guru3d.com/data/avatars/m/236/236506.jpg
cryohellinc:

4K gaming is overrated - it makes sense only on a screen larger than 35 inches. 1440p is still the perfect middle ground. Slap on top of it upcoming ray-traced titles, and any upcoming GPU most likely won't be able to handle even 60fps at 4K + ray tracing. A similar situation is with those 13-inch laptops with 4K screens - 99.9% won't be able to see the difference between 1080p and 4K at that scale - but "muh screen size"....
4K gaming is a pain in the a55 for sure but I've been using a big screen tv since the launch of GTX 285 and can't go back to sitting at a monitor. Here it's 1080p or 4K, really, and I don't want 1080p at all now. 4K is tough to drive, DLSS will hopefully help more going forward. These leaks are properly doing my head in at this point. The previous leak suggested these cards would have between 10 and 12GB, now memory for days. This bull5h17 will be over soon.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
I don't care for memory and resolution. I just need a gpu to give me ~240 fps at 1440p. Will 3080ti be able to provide me that kind of power?
data/avatar/default/avatar40.webp
Kaarme:

Those are some super optimistic price predictions you have there.
Yeah, first thing I thought when I saw that post XD. Slap 100-150 more on there and you might be close, perhaps even 500 more for the Ti if it ships with that amount of RAM (the 11GB card is already 1000+; not sure whether the price of VRAM has plummeted, but I doubt we'll get 24GB for ~1000 just a few years later).
https://forums.guru3d.com/data/avatars/m/272/272918.jpg
Netherwind:

I can't even remember seeing VRAM usage @ 10-11GB on my card. Is anything above 12GB useful unless you're playing at 8K which no GPU can muster anyway?
CoD MW maxed out can pull more than 10GB on my card.
https://forums.guru3d.com/data/avatars/m/272/272918.jpg
itpro:

I don't care for memory and resolution. I just need a gpu to give me ~240 fps at 1440p. Will 3080ti be able to provide me that kind of power?
You got a 240Hz monitor? It'd be tearing all over otherwise.
https://forums.guru3d.com/data/avatars/m/260/260048.jpg
Supertribble:

4K gaming is a pain in the a55 for sure but I've been using a big screen tv since the launch of GTX 285 and can't go back to sitting at a monitor. Here it's 1080p or 4K, really, and I don't want 1080p at all now. 4K is tough to drive, DLSS will hopefully help more going forward. These leaks are properly doing my head in at this point. The previous leak suggested these cards would have between 10 and 12GB, now memory for days. This bull5h17 will be over soon.
If you want my opinion on this, which I fully share with yours, check my comment that is literally above yours. 4k on a large screen = amazing. 4k on a small "g@m3r" screen is just silly - that's my point. As a 4k 55inch TV owner - console gaming on it is a joy, and indeed DLSS is the future. Fingers crossed that leaks about Sony patenting this tech for its console will work out in the end.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
itpro:

I don't care for memory and resolution. I just need a gpu to give me ~240 fps at 1440p. Will 3080ti be able to provide me that kind of power?
Unlikely. But it depends on the game. Paladins will likely do it at max details. But in a game like that, it's not about 1080p vs 1440p vs 4K from a visual standpoint. Games with high-detail shaders that can provide additional surface detail at higher resolutions are going to have trouble reaching 240fps. The 2080 Ti was the limiting factor for those high-detail games at 1440p, and it's likely the 3080 Ti will be the same. I even think the frame rate at 1080p with all DX-R effects enabled won't be anything special.
insp1re2600:

CoD MW maxed out can pull more than 10GB on my card.
Cached, not actively used every frame for rendering.
cryohellinc:

As a 4k 55inch TV owner - console gaming on it is a joy, and indeed DLSS is the future. Fingers crossed that leaks about Sony patenting this tech for its console will work out in the end.
DLSS is no miracle. Some parts look better, some worse. If fine details are not produced at the lower resolution, upscaling will not add them. In other words, it is good for macroscopic objects but bad for microscopic details.
https://forums.guru3d.com/data/avatars/m/274/274577.jpg
I could do with a GPU with more than 8GB.
https://forums.guru3d.com/data/avatars/m/236/236506.jpg
cryohellinc:

If you want my opinion on this, which I fully share with yours, check my comment that is literally above yours. 4k on a large screen = amazing. 4k on a small "g@m3r" screen is just silly - that's my point. As a 4k 55inch TV owner - console gaming on it is a joy, and indeed DLSS is the future. Fingers crossed that leaks about Sony patenting this tech for its console will work out in the end.
The thing I was trying to justify, more to myself really, is that there is no middle ground when using a tv, it's 4K or 1080p, and 1080p isn't where I want to be now. Having a 1440p high refresh television would be ideal for my personal use case but that isn't going to happen. 4K for console is a separate thing, a console game at 4K or whatever arbitrary screen resolution it's outputting at 30fps is more straightforward and obviously less of a headache than trying to run the latest AAA game on PC at a 4K/60fps minimum.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Supertribble:

The thing I was trying to justify, more to myself really, is that there is no middle ground when using a tv, it's 4K or 1080p, and 1080p isn't where I want to be now. Having a 1440p high refresh television would be ideal for my personal use case but that isn't going to happen. 4K for console is a separate thing, a console game at 4K or whatever arbitrary screen resolution it's outputting at 30fps is more straightforward and obviously less of a headache than trying to run the latest AAA game on PC at a 4K/60fps minimum.
The OLED panels seem to do 1440p really well from what I hear.
https://forums.guru3d.com/data/avatars/m/172/172989.jpg
There's more to it than a 4K screen, no? I'm thinking ultrawide 1440p screens, and VR like the Valve Index, HP Reverb G2 and future HMDs @90Hz or more. Even that -smirk- Apple 8K screen. No, that last one wasn't serious 😉. Other than that, I'd like to see how heavy HDR and ray tracing are on the memory.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
insp1re2600:

You got a 240Hz monitor? It'd be tearing all over otherwise.
80-240Hz G-Sync/FreeSync range. It has quite a lot of headroom, tbh; 3x the minimum isn't bad.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Still waiting for the most important numbers: the prices...
https://forums.guru3d.com/data/avatars/m/236/236506.jpg
Loophole35:

The OLED panels seem to do 1440p really well from what I hear.
I'd love an OLED, but I still use the PC for general use, browsing and whatnot, and screen burn-in would be a problem, so I'm stuck with LED, which really burns my a55 to be honest. Looking at game videos on my phone, which uses OLED in HDR, makes me want to cry.
data/avatar/default/avatar16.webp
So many hypocrites. Last year some of the above were saying that the 16GB of HBM2 the RVII had wasn't needed! Also, what happened to the 3090? :P
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
cryohellinc:

4K gaming is overrated - it makes sense only on a screen larger than 35 inches. 1440p is still the perfect middle ground. Slap on top of it upcoming ray-traced titles, and any upcoming GPU most likely won't be able to handle even 60fps at 4K + ray tracing. A similar situation is with those 13-inch laptops with 4K screens - 99.9% won't be able to see the difference between 1080p and 4K at that scale - but "muh screen size"....
4K isn't overrated, like you said, it just depends on what the display is. I myself have a 40" 1080p and I can tell you right now, it's not a great experience, and I'm confident going to 1440p isn't going to cut it. On a laptop though, 1080p is plenty good enough to me. Hell, on a 13" display, I think even 1366x768 is adequate for everyday use (I don't play games on a laptop). Most software and websites are still built to comfortably fit in these lower resolutions with 100% scaling and I don't see that changing any time soon. The only crappy part about 1366x768 is watching videos, and even then, it's not a terrible experience.
Netherwind:

I can't even remember seeing VRAM usage @ 10-11GB on my card. Is anything above 12GB useful unless you're playing at 8K which no GPU can muster anyway?
Increasing your resolution doesn't make that big of a difference in VRAM usage. The vast majority of your VRAM is used by instructions and assets. With the direction of next-gen consoles, I think Nvidia is hinting at what we're going to see for newer games. Looks to me like 8GB is going to be the new bare-minimum.
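A quick back-of-envelope sketch supports that point. Assuming a plain 4-bytes-per-pixel RGBA8 color buffer (real engines use more render targets and varied formats, so treat this as a lower-bound illustration):

```python
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Raw size of one RGBA8 color buffer at the given resolution."""
    return width * height * bytes_per_pixel

MIB = 2 ** 20
fb_4k = framebuffer_bytes(3840, 2160)     # one 4K color buffer
fb_1440p = framebuffer_bytes(2560, 1440)  # one 1440p color buffer

print(f"4K buffer:    {fb_4k / MIB:.1f} MiB")      # ~31.6 MiB
print(f"1440p buffer: {fb_1440p / MIB:.1f} MiB")   # ~14.1 MiB

# Even assuming ~10 resolution-dependent render targets, stepping
# from 1440p up to 4K adds only a couple hundred MiB:
print(f"Delta x10:    {(fb_4k - fb_1440p) * 10 / MIB:.0f} MiB")  # ~176 MiB
```

So the resolution-dependent buffers are measured in hundreds of MiB at most, while textures and other assets occupy multiple GiB, which is why resolution alone doesn't move VRAM usage much.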
https://forums.guru3d.com/data/avatars/m/160/160436.jpg
Lol @ 4K being overrated. Even on my 1440p monitor, upscaling to 4K has a noticeable impact on image quality. In many games it looks better than traditional AA. For example, in Total Warhammer 2, 4x MSAA has a significant performance impact, vs. me running 4K upscaling, which looks great and has a roughly equivalent performance hit.