AMD's Emphasis on Memory Capacity in Modern GPUs for Optimal Performance

mattm4:

Does it though? If developers have more than 8GB of VRAM available when they design (or redesign) a game, why is it their fault that a PC user has less VRAM than the console counterpart? There is a simple solution to the VRAM issue: turn down the quality. The options are there to make the game run fine on an 8GB buffer. It's just that people are mad they can't have all the bells and whistles on said small buffer. 8GB is EOL when it comes to the highest quality settings, and going forward this is going to be an issue.
If they have more than 8GB available, it means they can throw optimization out the window and become lazy.
mattm4:

Does it though? If developers have more than 8GB of VRAM available when they design (or redesign) a game, why is it their fault that a PC user has less VRAM than the console counterpart? There is a simple solution to the VRAM issue: turn down the quality. The options are there to make the game run fine on an 8GB buffer. It's just that people are mad they can't have all the bells and whistles on said small buffer. 8GB is EOL when it comes to the highest quality settings, and going forward this is going to be an issue.
For me the problem is that recent games have suddenly become much more demanding on hardware, but they don't look so much better as to justify those requirements. And then we have cases like The Last of Us, which was a PS3 game, has been remade two or three times, and now requires an absurd amount of RAM... So to me this looks like a case of lazy PC ports. It also shows that publishers don't mind releasing games in very poor condition...
In two years' time, right before Blackwell, both the 4070 and 4070 Ti will suffer the same fate as the 3070/3070 Ti. 16GB will become the norm and ray tracing will not even be an option due to lack of VRAM. On the other hand, the 7900 XT and the upcoming 7800 XT will hold their own and still be relevant. Buy smart if you plan on keeping your card longer.
Undying:

In two years' time, right before Blackwell, both the 4070 and 4070 Ti will suffer the same fate as the 3070/3070 Ti. 16GB will become the norm and ray tracing will not even be an option due to lack of VRAM. On the other hand, the 7900 XT and the upcoming 7800 XT will hold their own and still be relevant. Buy smart if you plan on keeping your card longer.
Just a few months ago you were on a 2080 Super 8GB, then moved to a 3070 8GB. Then suddenly the flood of reviews of recent demanding games caused you to switch to a 12GB card. Things have progressed quite rapidly from when 8GB did not seem like much of an issue to many people, including myself. 😀
TheDeeGee:

All this shows is how crap these lazy PC ports are.
Came here for this. People like to argue with me on this, but games have a lot of room for more optimization. Textures are an easy thing to neglect, because disk space is cheap and they have very little impact on the performance of the GPU core. So long as you have enough VRAM (which devs typically will), they're practically free enhanced detail.

While I don't know which practices are commonly used to minimize the disk and memory footprint of textures, I have a strong feeling not enough of them are being used. In some cases, I'm sure lossy compression could be applied with no noteworthy fidelity loss but an immense reduction in data (some rough numbers are sketched below). Other times, I'm sure the textures are much larger than they ever need to be, for objects you can't get close enough to for the extra detail to show.

You could argue "what about 8K displays?", but we're still barely able to achieve desirable framerates at 4K, and besides: if you lower the resolution of a texture and can't see a difference at 4K, it's probably not going to distract you at 8K, even if it can technically be noticed. Depending on what you're playing (or watching, including regular videos), 1080p with good AA sometimes hardly looks any different from 4K at a normal viewing distance.
Alessio1989:

The same AMD that still sells crap graphics cards with 4GB of VRAM and 4x PCIe lanes?
While it was stupid of AMD to pair both 4GB and 4x lanes at the price they were asking, so long as you don't use high-res textures, the performance seemed totally fine for those GPUs. I don't really understand why people always want to crank up texture detail in modern games when they're playing at 1080p, since you won't get a visual improvement but you will take a big performance loss.
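To put some rough, illustrative numbers behind the two points above (these are my own back-of-the-envelope figures, not anything from the thread): a block-compressed format such as BC7 stores roughly 1 byte per texel versus 4 bytes for uncompressed RGBA8, and a full mip chain adds about a third on top of the base level. A minimal Python sketch under those assumptions:

```python
# Rough texture-memory math; the bytes-per-texel and mip-chain factors are
# standard approximations, not figures taken from the thread.

def texture_vram_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Approximate VRAM footprint of one texture, optionally with a mip chain."""
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly 1/3 on top of the base level (1/4 + 1/16 + ...).
    return base * 4 / 3 if mipmaps else base

MiB = 1024 * 1024

for res in (1024, 2048, 4096):
    rgba8 = texture_vram_bytes(res, res, 4)  # uncompressed RGBA8: 4 bytes/texel
    bc7 = texture_vram_bytes(res, res, 1)    # BC7/BC3-class: ~1 byte/texel
    print(f"{res}x{res}: RGBA8 ~{rgba8 / MiB:5.1f} MiB, BC7 ~{bc7 / MiB:5.1f} MiB")

# Texels available vs. pixels on screen: why 4K-class textures rarely pay off at 1080p.
print(f"1080p pixels: {1920 * 1080:,}   4K pixels: {3840 * 2160:,}   "
      f"4096^2 texels: {4096 * 4096:,}")
```

Halving a texture's resolution cuts its footprint to a quarter, and even a full-screen surface at 1080p only has about 2 million pixels with which to display the 16.8 million texels of a 4096x4096 texture, so the mip chain usually gets sampled well below the top level anyway.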
alanm:

Just a few months ago you were on a 2080 Super 8GB, then moved to a 3070 8GB. Then suddenly the flood of reviews of recent demanding games caused you to switch to a 12GB card. Things have progressed quite rapidly from when 8GB did not seem like much of an issue to many people, including myself. 😀
I did what I could, and I switched not because I didn't like Nvidia, but because I started to have issues with newer games and, ironically, was not able to use RT due to lack of memory.
Memory capacity is cool.... Gimping it with a small bus width is not cool.
Undying:

I did what I could, and I switched not because I didn't like Nvidia, but because I started to have issues with newer games and, ironically, was not able to use RT due to lack of memory.
I don't want to be that guy... but those 12GB seem insufficient for 2024. Time for a new GPU! :D :p
Wow, they are going all in. This is all they have left! Shame they never learned from their past mistakes: they have always had more VRAM, going back to the ATI days, and it has never mattered.
AlmondMan:

Maybe the Xilinx acquisition will mean AMD can add fancy things to their GPU boards, like RT cores.
Already happening with Instinct, but those are AI accelerators. I fully expect RDNA 4 to have FPGA RT cores also... Game engines are just as important to RT performance: UE5 has the smallest differential between the 4090 and the 7900 XTX, often less than 5% in modern games using UE5.
tunejunky:

Already happening with Instinct, but those are AI accelerators. I fully expect RDNA 4 to have FPGA RT cores also... Game engines are just as important to RT performance: UE5 has the smallest differential between the 4090 and the 7900 XTX, often less than 5% in modern games using UE5.
Kind of amusing, given that UE has historically run so much better on Nvidia chips. I'm looking forward to seeing UE5 in more games - hopefully all the shader compilation issues are resolved as of 5.1, because it's clearly the biggest issue with the engine.
Denial:

Kind of amusing, given that UE has historically run so much better on Nvidia chips. I'm looking forward to seeing UE5 in more games - hopefully all the shader compilation issues are resolved as of 5.1, because it's clearly the biggest issue with the engine.
I don't know for sure if all the shader issues are resolved, but I can say my 6900 XT gets better FPS on UE5 than on any other game engine... and the numbers I saw for RDNA 3 are even better.
AMD is just kicking up some dirt about 8GB cards, and with good reason. The 3070 8GB gets stomped on by the 6800 16GB. [youtube=Rh7kFgHe21k] When the memory runs out, the performance and visual hit is brutal compared to the 6800. The 4060 8GB is out there in the future and is likely going to be a repeat of the 3070 8GB problems shown by Hardware Unboxed.
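A minimal back-of-the-envelope sketch of why the hit is so brutal once the 8GB buffer overflows (my own approximate bandwidth figures, not numbers from the video): a 3070-class card reads its local GDDR6 at roughly 448 GB/s, while anything spilled to system RAM has to cross PCIe 4.0 x16 at perhaps 25-30 GB/s of practical throughput.

```python
# Back-of-the-envelope: cost of touching assets that no longer fit in VRAM.
# Bandwidth figures below are rough assumptions, not measurements.
VRAM_BW_GBPS = 448.0   # ~RTX 3070-class GDDR6 (14 Gbps x 256-bit bus), approximate
PCIE_BW_GBPS = 26.0    # practical PCIe 4.0 x16 throughput, approximate

spill_gb = 2.0         # hypothetical 2 GB of textures that overflow an 8 GB card

vram_ms = spill_gb / VRAM_BW_GBPS * 1000
pcie_ms = spill_gb / PCIE_BW_GBPS * 1000

print(f"Touching {spill_gb:.0f} GB from local VRAM: {vram_ms:6.2f} ms")
print(f"Touching {spill_gb:.0f} GB over PCIe:       {pcie_ms:6.2f} ms")
print(f"Slowdown: ~{pcie_ms / vram_ms:.0f}x (a 60 fps frame budget is ~16.7 ms)")
```

Even if only a fraction of that spilled data is touched in any given frame, it is easy to blow a 16.7 ms frame budget, which is exactly where the 1% lows fall apart.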
Undying:

In two years' time, right before Blackwell, both the 4070 and 4070 Ti will suffer the same fate as the 3070/3070 Ti. 16GB will become the norm and ray tracing will not even be an option due to lack of VRAM. On the other hand, the 7900 XT and the upcoming 7800 XT will hold their own and still be relevant. Buy smart if you plan on keeping your card longer.
Frame generation also increases VRAM usage.
Just 2 pages? C'mon boys, it's your favourite subject! :p
TLD LARS:

AMD is just kicking up some dirt about 8GB cards, and with good reason. The 3070 8GB gets stomped on by the 6800 16GB. [youtube=Rh7kFgHe21k] When the memory runs out, the performance and visual hit is brutal compared to the 6800. The 4060 8GB is out there in the future and is likely going to be a repeat of the 3070 8GB problems shown by Hardware Unboxed.
What. The. Fuck. Did people forget that the 6800 XT is a 3080 competitor? The fact that a 6800 XT is being directly compared to a 3070 speaks more in favor of the 3070 in EVERY way. But the brainwashed fools who actually listen to that biased idiot Steve in the video forget this. They think it's fair to compare cards now that would not even have been considered a comparison two years ago. This video is a joke. HUB is a joke. Stop posting this crap, it means nothing! *Respond. I wanna hear your response.
TimmyP:

What. The. frack. Did people forget that the 6800 XT is a 3080 competitor? The fact that a 6800 XT is being directly compared to a 3070 speaks more in favor of the 3070 in EVERY way. But the brainwashed fools who actually listen to that biased idiot Steve in the video forget this. They think it's fair to compare cards now that would not even have been considered a comparison two years ago. This video is a joke. HUB is a joke. Stop posting this crap, it means nothing! *Respond. I wanna hear your response.
It's a 6800, not a 6800 XT. The RX 6800 was $80 more expensive than the 3070 at the time, which he mentions in the video. He also mentions that he recommended people get the 6800 over the 3070, despite the cost, because he felt the additional VRAM would be useful in future titles and worth the $80. Either way, the problem in the video is that the 1% lows on the 3070 are abysmal. It would have been more interesting to me to see a 3070 vs. 6700 XT and how that turns out with 1% lows across both cards, but either way I think the comparison is fair given his comments in the original video. I think @TLD LARS could have done a better job stating this, though.
TimmyP:

It was $150 more. A 3080 competitor.

[attached MSRP screenshots]

$579 - $499 = $80. You can't even math, bro.

[attached MSRP screenshot]

$699 - $579 = $120. Edit: Lol, you deleted the post.
Denial:

It's a 6800, not a 6800 XT. The RX 6800 was $80 more expensive than the 3070 at the time, which he mentions in the video. He also mentions that he recommended people get the 6800 over the 3070, despite the cost, because he felt the additional VRAM would be useful in future titles and worth the $80. Either way, the problem in the video is that the 1% lows on the 3070 are abysmal. It would have been more interesting to me to see a 3070 vs. 6700 XT and how that turns out with 1% lows across both cards, but either way I think the comparison is fair given his comments in the original video. I think @TLD LARS could have done a better job stating this, though.
It was still a 3080 competitor, not a 3070 competitor. Not a fair comparison. *Yeah, I deleted it because I had the wrong card again.
TimmyP:

It was still a 3080 competitor, not a 3070 competitor. Not a fair comparison.
It's $40 closer to being a 3070 competitor than a 3080 competitor. You're just wrong. Just type it, it's easy: "I was wrong, I'm sorry - I wish @TLD LARS had done a better job representing the mismatch, but given the context it actually makes sense to compare the two cards." Everyone would respect it so much more.