AMD's Emphasis on Memory Capacity in Modern GPUs for Optimal Performance

Rather show us your 4K-capable 16GB card at 600 USD, AMD. It's easy to point out flaws in the competition; Nvidia could point some out the same way. It's a lot harder to come up with a better alternative. So far all they have for 500-700 EUR is the discounted RX 6000 series. Decent at these prices, but really not what we expected from AMD in mid-2023. The 6800 XT's successor starts at 850 EUR for the reference card, and for an AIB model I have to buy from Germany or pay 950+ EUR here. If an entry-level 200W 4070 is available at MSRP, people will certainly ignore all that talk and the policy of waiting indefinitely in favor of buying what they can. Any news on N32 would help AMD a thousand times more.
Nvidia: we sell a GPU with VRAM on the side.
AMD: we sell VRAM with a GPU on the side.
Either run out of money, out of VRAM or out of GPU horsepower... The customer has all the options. o_O
One of the few things AMD is doing better than Nvidia. But only in the mid-range. At the ultra high end, the 4090 and 7900 XTX have the same amount of VRAM, but the 4090 trounces the 7900 XTX in everything else. And at the low end, AMD has disasters like the 6500 and 6400. So not much to brag about there.
If only their mid-range wasn't discounted RDNA2 they're offloading... I definitely will be more careful with VRAM though, to avoid the 3070 situation. Ideally I would like a 7800 XT with 16GB, but if that doesn't even get the slightest mention in leaks and the 4070 will be available at 600 in a few days, then I won't bother waiting. 12GB is fine for 1440p, and my next display will still be a 1440p high-refresh one, just an OLED.
One of the reasons that I picked an RTX 3090 during the pandemic (apart from the hope of better availability) was that I was coming from a GTX 980 with 4GB of VRAM, which was clearly showing its age in more modern games (VRAM limitations were much more of a problem than GPU performance), so with the 24 GB on the RTX 3090 it would cover me for the foreseeable future.
All this shows is how shit these lazy PC ports are.
The same AMD that still sells shit graphics cards with 4GB of VRAM and 4x PCIe lanes?
The more gurus compare and contrast, the more I want to buy a PS5+PSVR2...
fantaskarsef:

Either run out of money, out of VRAM or out of GPU horsepower... The customer has all the options. o_O
F***, I loled so hard at this one!
TheDeeGee:

All this shows is how crap these lazy PC ports are.
True, game makers seem to think that everyone has a 4090 on their system... Anyway, AMD is just pointing out an obvious advantage over Nvidia, just like Nvidia loves to shout about their RT superiority.
Alessio1989:

the same AMD that still sells crap graphics cards with 4GB of VRAM and 4x PCIe lanes?
It will play 99% of games out there. The vast majority of games are not recent AAA titles. There are lots of very basic, simple indie games, as well as old games that play fine with 1GB or less of VRAM. Remember, there were thousands of games out there back when we had sub-1GB cards many years ago. Those games are still there.
TheDeeGee:

All this shows is how crap these lazy PC ports are.
I've got to agree with this. How much memory do the Xbox Series X and PS5 have? The PS5 has 16GB of GDDR6 shared between the GPU and CPU. The Xbox Series X has 16GB of shared GDDR6: 10GB at full bandwidth and 6GB at lower bandwidth. If game designers can't make texture streaming work between 16GB of DDR4 system memory (I would think the most common amount these days, with 32GB not far behind) and 8GB of GPU memory, then they need to try harder. There's really no merit in saying a game setting is for "future" computers or will only work on £1000+ GPUs. I always think of Doom and Doom Eternal as proof of how a game can look fantastic and play even better.
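As an aside, the console-versus-PC memory comparison in that comment can be sketched with simple budget arithmetic. The totals are the figures quoted in the thread; the console OS reservation and the non-texture VRAM usage on PC are labeled assumptions, not numbers anyone in the thread gave.

```python
def usable_texture_budget(vram_gb: float, other_gpu_use_gb: float) -> float:
    """VRAM left for textures after framebuffers, geometry, etc."""
    return max(0.0, vram_gb - other_gpu_use_gb)

# PS5: 16 GB GDDR6 shared between CPU and GPU (figure from the thread).
ps5_total_gb = 16.0
ps5_os_reserved_gb = 2.5  # assumption: rough OS reservation, not from the thread
ps5_game_budget_gb = ps5_total_gb - ps5_os_reserved_gb

# Typical PC from the comment: 16 GB DDR4 system RAM plus an 8 GB GPU.
pc_vram_gb = 8.0
pc_texture_budget_gb = usable_texture_budget(pc_vram_gb, other_gpu_use_gb=3.0)  # 3 GB assumed

print(f"PS5 budget shared by CPU+GPU: {ps5_game_budget_gb:.1f} GB")
print(f"PC texture budget inside 8 GB VRAM: {pc_texture_budget_gb:.1f} GB")
```

Under these assumptions the numbers end up in the same ballpark, which is the commenter's point: with streaming from system RAM, an 8GB card should not be hopeless.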
This is the reason I learned to wait before playing newer games. I don't care if I play a game one or two years later if that means better stability and better hardware. It's just a matter of patience. I started Cyberpunk a few days ago; it's not perfect, but it's better now and I'm enjoying it. I did that with other games too. The last games I bought on day one were Doom 2016 and Overwatch, and both ran quite nicely on mid-tier hardware back then (I had an FX 6300 and a GTX 960 4GB).
Is AMD daring Nvidia to add more VRAM? Are they asking to be left behind even more?
Upgrade every gen. The more you buy, the more you save.
Maybe the Xilinx acquisition will mean AMD can add fancy things to their GPU boards, like RT cores.
TheDeeGee:

All this shows is how crap these lazy PC ports are.
Does it though? If developers have more than 8GB of VRAM available when they design (or redesign) a game, why is it their fault that a PC user has less VRAM than the console counterpart? There is a simple solution to the VRAM issue: turn down the quality. The options are there to make the game run fine in an 8GB buffer. People are just mad that they can't have all the bells and whistles in that small buffer. 8GB is EOL when it comes to having the highest quality settings. Going forward this is going to be an issue.
cucaulay malkin:

Rather show us your 4K-capable 16GB card at 600 USD, AMD. It's easy to point out flaws in the competition; Nvidia could point some out the same way. It's a lot harder to come up with a better alternative. So far all they have for 500-700 EUR is the discounted RX 6000 series. Decent at these prices, but really not what we expected from AMD in mid-2023. The 6800 XT's successor starts at 850 EUR for the reference card, and for an AIB model I have to buy from Germany or pay 950+ EUR here. If an entry-level 200W 4070 is available at MSRP, people will certainly ignore all that talk and the policy of waiting indefinitely in favor of buying what they can. Any news on N32 would help AMD a thousand times more.
True, but AMD has a valid point: Nvidia ships too little VRAM below the 4080. Both sides have problems this generation.
mattm4:

Does it though? If developers have more than 8GB of VRAM available when they design (or redesign) a game, why is it their fault that a PC user has less VRAM than the console counterpart? There is a simple solution to the VRAM issue: turn down the quality. The options are there to make the game run fine in an 8GB buffer. People are just mad that they can't have all the bells and whistles in that small buffer. 8GB is EOL when it comes to having the highest quality settings. Going forward this is going to be an issue.
Going over 8GB is very likely not a lazy-dev thing. Going over, say, 16GB at 4K likely is. They develop to targets, and 8GB dates from 2015-2016, so it's easy to see why they would use a higher standard for 1440p and 4K.
nGreedia will sell you more VRAM; last time 2GB cost 200 USD (3080 10GB vs 12GB).