4GB VRAM isn't cutting it anymore says AMD

Frankly speaking, neither is 8.
cryohellinc:

Frankly speaking, neither is 8.
Laughs (cries?) in heavily modified, 8K-texture Skyrim mods.
My APU can allocate 8.5GB...very useful. 😛
sverek:

wtf, there is another interview? Link please.
[spoiler][youtube=ijTFAHRO8WM][/spoiler]
Back in the day I was choosing between the 8GB AMD 390 and the 3.5GB Nvidia 970.
Kaarme:

Back in the day I was choosing between the 8GB AMD 390 and the 3.5GB Nvidia 970.
Now that would be an interesting revisit. On the other hand, a lot of people had already replaced them before it started to matter.
8GB of VRAM is needed even for 1080p. Doom Eternal can't be maxed out on a 6GB card.
My 680 was 8GB; I bought it specially for Skyrim 😛 Then I found out modding Bethesda games is a bad idea, because they're unstable out of the box.
TheDeeGee:

My 680 was 8GB; I bought it specially for Skyrim 😛 Then I found out modding Bethesda games is a bad idea, because they're unstable out of the box.
A 680 with 8GB, are you sure? The base model had 2GB.
buhehe:

A 680 with 8GB, are you sure? The base model had 2GB.
He probably meant the 4GB model, as an 8GB version doesn't exist.
Uhm, right, 4GB... I was still waking up.
4 GB already felt limited back with the AMD Fury, partly because of the optional texture packs some titles shipped. Nowadays even the lower settings land around 2-3 GB and scale up to 4 or even 6 GB once texture packs, display resolution, and shader or shadow settings add their share, and some engines simply reserve and use as much as they can of 75-85% of the total available video memory, so even a higher-end GPU can still see problematic streaming and slow loading. (Asset loading and less conservative use of both RAM and VRAM have changed things a bit.)

Curious to see how that develops, but 8 GB will probably be pretty close to the baseline, if not more, before too long, and hopefully developers remember to scale things both up and down. There's the usual disparity between reasonable settings and the Ultra presets, although some titles come closer to justifying the steeper performance hit with real visual changes instead of small but costly shader or shadow tweaks.

Interesting to hear, even if I assumed 4 GB was at its end years ago; I guess they mean that even lower-end hardware, at settings scaled for it, gets close to 3-3.5 GB, plus whatever the OS itself likes to keep. 😀 (Or the browser going "This is mine!" when hardware acceleration is enabled, and good luck getting it to share at least a bit.)
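(As an aside on the "reserve most of what's available" behaviour mentioned above: on Windows, the amount of dedicated VRAM the OS is currently willing to hand a process can be queried through DXGI. The sketch below is only a minimal illustration, assuming Windows 10 with the DXGI 1.4 runtime and the primary adapter at index 0; the 80% figure is an example of the kind of budget target described in the post, not something any specific engine is known to use.)

[code]
// Minimal sketch: query this process's dedicated VRAM budget and current usage
// via DXGI 1.4 and print an illustrative 80% "use most of the budget" target.
// Build (MSVC): cl /EHsc vram_budget.cpp dxgi.lib
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))   // primary adapter (assumption)
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;                     // DXGI 1.4 interface
    if (FAILED(adapter.As(&adapter3)))
        return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    // Budget: how much dedicated VRAM the OS will currently let this process use.
    // CurrentUsage: how much the process has actually committed so far.
    const double GiB = 1024.0 * 1024.0 * 1024.0;
    std::printf("VRAM budget   : %.2f GiB\n", info.Budget / GiB);
    std::printf("VRAM in use   : %.2f GiB\n", info.CurrentUsage / GiB);
    std::printf("80%% of budget : %.2f GiB\n", info.Budget * 0.8 / GiB);
    return 0;
}
[/code]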
Kaarme:

Back in the day I was choosing between the 8GB AMD 390 and the 3.5GB Nvidia 970.
You made the right choice.
cryohellinc:

Frankly speaking, neither is 8.
Not with web browsers and CEF applications eating VRAM these days.
HBC dead? A year ago they were mansplaining that allocating data does not mean actually using it. https://abload.de/img/vega20final20presentamyjcq.png Although it has to be said: other than the Fury X experiment, AMD has pretty much always been leading the way on the amount of VRAM.
Venix:

Now that would be an interesting revisit. On the other hand, a lot of people had already replaced them before it started to matter.
Yeah. I was also pretty close to upgrading twice (to a 1080 and then a 2070). Now I'm happy I didn't, because waiting for a more mature ray tracing solution is more interesting.
Noisiv:

HBC dead? A year ago they were mansplaining that allocating data does not mean actually using it.
That picture seems to be from 2017. Quite a bit more than a year ago.
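(On the "allocating data does not mean actually using it" point above: OS-level VRAM counters report committed allocations rather than bytes that were ever read or written, which is one reason allocation figures can overstate what a game actually needs. Below is a minimal sketch under the same Windows/DXGI assumptions as the earlier snippet; the 1 GiB buffer size is arbitrary and purely illustrative. It creates a committed D3D12 buffer in dedicated VRAM, never touches it, and shows the reported per-process usage growing anyway.)

[code]
// Minimal sketch: show that reported VRAM "usage" grows as soon as memory is
// committed, even if the application never reads or writes a single byte of it.
// Build (MSVC): cl /EHsc alloc_vs_use.cpp d3d12.lib dxgi.lib
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

static UINT64 CurrentUsage(IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    return info.CurrentUsage;   // dedicated VRAM committed by this process, in bytes
}

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);        // primary adapter (assumption)
    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    const UINT64 before = CurrentUsage(adapter3.Get());

    // Commit a 1 GiB buffer in dedicated VRAM (DEFAULT heap) and then never touch it.
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = 1ull << 30;         // 1 GiB, arbitrary size for illustration
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_UNKNOWN;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ComPtr<ID3D12Resource> buffer;
    if (FAILED(device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
            D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&buffer))))
        return 1;

    const UINT64 after = CurrentUsage(adapter3.Get());
    std::printf("Reported usage grew by %.2f GiB without any of the data being used.\n",
                (after - before) / (1024.0 * 1024.0 * 1024.0));
    return 0;
}
[/code]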