Leaked Details Emerge for AMD Radeon RX 7600 GPU - 2048 Stream Processors

Any 8GB card is a DOA card. The minimum should be 10GB now. An 8GB 3070 holds less value for future-proofing than a 12GB 3060.
AMD releasing a GPU with 8GB of VRAM, after spending so much time bragging about having more VRAM, just feels like they are taking the piss out of PC gamers.
For $250 and someone playing at 1080p, this card would be a fair deal... even with 8GB of VRAM. But I doubt it will sell at that price when the 4060 will cost more.
Horus-Anhur:

AMD releasing a GPU with 8GB of VRAM, after spending so much time bragging about having more VRAM, just feels like they are taking the piss out of PC gamers.
At least they don't ask $399 for it like Nvidia.
Undying:

At least they don't ask $399 for it like Nvidia.
$350 to $375 isn't that much of a difference. Considering that 1GB of GDDR6 is now going for around $3.40, there is no excuse for Nvidia or AMD to release a GPU with only 8GB.
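For what it's worth, the memory-cost claim above is easy to put numbers on. A quick sketch, taking the ~$3.40/GB spot price quoted in the post at face value (chip cost only; the wider memory bus, PCB routing, and validation add to the real cost of a higher-capacity SKU):

```python
# Rough VRAM bill-of-materials arithmetic, using the ~$3.40/GB spot
# price cited above. This is memory-chip cost only, not total board cost.
PRICE_PER_GB = 3.40

def vram_chip_cost(gigabytes: int) -> float:
    """Memory-chip cost alone for a given VRAM capacity."""
    return gigabytes * PRICE_PER_GB

print(f"8 GB:  ${vram_chip_cost(8):.2f}")   # $27.20
print(f"16 GB: ${vram_chip_cost(16):.2f}")  # $54.40
```

At that spot price, even doubling to 16GB adds under $30 in memory chips, which is the poster's point.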
I blame AI. It's making all the decisions. Like with recruitment, where no one gets a job.
Need to see reviews for these and the 4060, and then probably end up buying a 7800 XT and eating beans for a year.
Horus-Anhur:

AMD releasing a GPU with 8GB of VRAM, after spending so much time bragging about having more VRAM, just feels like they are taking the piss out of PC gamers.
As Embra pointed out, if it's marketed as a 1080p (with DXR on) GPU and priced appropriately, the VRAM is fine. Underwhelming, but fine. If the MSRP is over $300 and/or this is marketed as a 1440p GPU, then I agree with you.
Embra:

For $250 and someone playing at 1080p, this card would be a fair deal... even with 8GB of VRAM. But I doubt it will sell at that price when the 4060 will cost more.
I agree. Not every card needs 20GB of VRAM...
vestibule:

I blame AI. It's making all the decisions. Like with recruitment, where no one gets a job.
That would explain a lot... 😱
daffy101:

Need to see reviews for these and the 4060, and then probably end up buying a 7800 XT and eating beans for a year.
:) That brings back memories - you may be saying this for the fun of it, but back in my youth, when I bought an Athlon X2 4800+, I remember eating Heinz canned beans for three weeks. I can't stand beans now... This is an entry-level card, and for entry-level (1080p) use 8GB is enough, as was already said here. It doesn't have DLSS and probably has weaker ray tracing than the competition, but also a lower price - anyway, the price should be much lower, but who knows these days...
Whether 8GB is enough or not depends on the price and the marketing. The 3070 was sold as a 2080 Ti killer and a 4K ray-tracing card, and went for over a grand in many countries. In that context, 8GB was clearly not enough. As long as a card is sold as a 1080p-only card at less than $300 USD, then 8GB is perfectly fine. This card will likely be more than $300 USD though, so yeah, shame on AMD if that's the case.
@H83 There is no getting away from it: there is something going on with AMD & Ngreedia that cannot be easily explained or understood. On a less serious note: Bahhh to them. 😀 I think I need to let this go. 😛 and so I will. 🙂 Game on. 😉
Same CUs as the 6600 XT, lower clocks... hmmm, so this will be about as fast as a 6650 XT at best, I reckon. I mean, what kind of IPC improvements does RDNA 3 have over RDNA 2? Now, about people who say 8GB is enough for a $300 USD 1080p card... really, since when was it the norm to have to lower texture quality to medium from day one on a $300 USD card? That hasn't been the case since the GTX 460 days! Maybe even further back!
Venix:

Now, about people who say 8GB is enough for a $300 USD 1080p card... really, since when was it the norm to have to lower texture quality to medium from day one on a $300 USD card? That hasn't been the case since the GTX 460 days! Maybe even further back!
8GB has always been enough for 1080p and perhaps always will be. The problem is, texture detail (which seemingly accounts for the majority of VRAM consumption) is always measured as "low, medium, high, ultra" when really it should be based on the resolution it is most optimal for. You don't need to play a modern game with ultra texture detail at 1080p; it's simply unnecessary, because those textures were meant to look crisp at 4K. Do side-by-side comparisons at 1080p with different texture settings and, unless you're point-blank with a wall, you won't see an appreciable difference. Now, to be clear: I'm not happy that for almost a decade, 1080p GPUs have cost around $300. That is definitely a problem, though I'm a little less irritated when that's 1080p with DXR enabled. For 1080p without DXR, even $250 with 8GB is asking too much as far as I'm concerned.
schmidtbag:

8GB has always been enough for 1080p and perhaps always will be. The problem is, texture detail (which seemingly accounts for the majority of VRAM consumption) is always measured as "low, medium, high, ultra" when really it should be based on the resolution it is most optimal for. You don't need to play a modern game with ultra texture detail at 1080p; it's simply unnecessary, because those textures were meant to look crisp at 4K. Do side-by-side comparisons at 1080p with different texture settings and, unless you're point-blank with a wall, you won't see an appreciable difference. Now, to be clear: I'm not happy that for almost a decade, 1080p GPUs have cost around $300. That is definitely a problem, though I'm a little less irritated when that's 1080p with DXR enabled. For 1080p without DXR, even $250 with 8GB is asking too much as far as I'm concerned.
You clearly don't fully understand how textures work. Resolution impacts VRAM a lot more than texture quality, due to the mipmaps being loaded. That's why 4K eats so much VRAM. That said, most of the time High and Ultra look almost the same, and there's a big penalty to take. Also shadows: they never look good on the highest setting; I always prefer medium. As for a lot of other stuff, I like to tune it to my taste and optimize. At the end of the day, I usually get 2x the advertised FPS and IQ is virtually the same (or at least I'm satisfied with it). Pricing is a joke; depending on how they perform, I'd rather pay 223€ for an RX 6600. Still waiting to see if the RX 6700 comes down in price. Come on AMD: release the 7700/7800 too!
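For what it's worth, the mipmap overhead being argued about here can be put in numbers with a quick sketch: a full mip chain adds roughly a third on top of the base level, while halving the base texture resolution cuts the whole chain by about 4x (uncompressed RGBA8 at 4 bytes/texel is assumed; real games use block compression, which shrinks everything but preserves the ratios):

```python
# Texture memory sketch: total bytes for a square RGBA8 texture and
# its full mip chain, assuming uncompressed 4 bytes/texel.
BYTES_PER_TEXEL = 4

def mip_chain_bytes(size: int) -> int:
    """Bytes for the base level plus every mip level down to 1x1."""
    total = 0
    while size >= 1:
        total += size * size * BYTES_PER_TEXEL
        size //= 2
    return total

base = 4096 * 4096 * BYTES_PER_TEXEL
print(mip_chain_bytes(4096) / base)                   # ~1.33: mips add ~1/3
print(mip_chain_bytes(4096) / mip_chain_bytes(2048))  # ~4.0 per halving
```

So both levers matter: the mip chain itself is a fixed ~33% tax, but the base texture resolution (set by the texture-quality slider) scales the whole budget quadratically.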
Silva:

You clearly don't fully understand how textures work. Resolution impacts VRAM a lot more than texture quality, due to the mipmaps being loaded. That's why 4K eats so much VRAM. That said, most of the time High and Ultra look almost the same, and there's a big penalty to take. Also shadows: they never look good on the highest setting; I always prefer medium. As for a lot of other stuff, I like to tune it to my taste and optimize. At the end of the day, I usually get 2x the advertised FPS and IQ is virtually the same (or at least I'm satisfied with it). Pricing is a joke; depending on how they perform, I'd rather pay 223€ for an RX 6600. Still waiting to see if the RX 6700 comes down in price. Come on AMD: release the 7700/7800 too!
Yup, shadows are usually unnaturally crisp and detailed at ultra, while in reality they're softer and not as dark. BTW, I've seen 6700 XTs at 320€ new here - I think it was the ASRock Challenger. IMO you'll soon find one at 300.
If it's true that it's only 8GB, AMD missed an opportunity here to come out with a clearly better card than its Nvidia counterpart. Very disappointing...
schmidtbag:

8GB has always been enough for 1080p and perhaps always will be. The problem is, texture detail (which seemingly accounts for the majority of VRAM consumption) is always measured as "low, medium, high, ultra" when really it should be based on the resolution it is most optimal for. You don't need to play a modern game with ultra texture detail at 1080p; it's simply unnecessary, because those textures were meant to look crisp at 4K. Do side-by-side comparisons at 1080p with different texture settings and, unless you're point-blank with a wall, you won't see an appreciable difference.
Textures are not just the diffuse map... that was the DX7 era! A lot of techniques use far more textures than people realize - techniques that add "almost free" eye candy, as long as you have enough VRAM, with our beloved materials. Say you have a 2-million-triangle face in Blender and you drop that to 2k triangles. With just the diffuse map, the difference would be colossal; but if you bake a normal map from the high-poly model and apply it to the low-poly model, you can fake the missing geometry and get 99.9% the same visual results for a fraction of the computational cost. Now, the normal map alone is not enough: together with it you have to make a specular map as well, which contains the information about how light behaves on the surface. So one surface already needs three textures, not just one. Since then, more and more maps have been added - roughness maps, height maps, alpha maps - so scaling textures from, say, 1024x1024 to 2048x2048 might mean six textures for each surface (although not every map has to match the diffuse texture's resolution). And that is why VRAM requirements go up. Now, Nanite and Lumen kiiiinda throw all that out the window: you use the ultra-high poly count out of the box, so you don't need materials to fake geometry, and Lumen handles the lighting great as well. So when you use those, VRAM usage actually drops (only if you drop the materials usage! Otherwise it goes waaaay up) - but it doesn't drop as much as you'd expect: high-poly models are highly compressible but huge, so compressed down they end up a bit smaller than using fully fledged materials in-game. Anyway, long story short: VRAM usage is a very complex matter and not easy to nail down, but one thing is certain - over time it goes up, and even if it stalled for a while and might stall again for a bit, it will keep going up and up (unless we find some miracle compression techniques)!
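The multi-map point above is easy to put numbers on. A sketch, with an illustrative map list (the names and the uncompressed 4-bytes/texel sizing are assumptions, not from any particular engine; real engines compress, and not every map is stored at full resolution, as the post notes):

```python
# Sketch of the material cost described above: one surface carries
# several maps, and doubling resolution quadruples each of them.
BYTES_PER_TEXEL = 4  # uncompressed, illustrative
MAPS = ["diffuse", "normal", "specular", "roughness", "height", "alpha"]

def material_mib(resolution: int, num_maps: int = len(MAPS)) -> float:
    """MiB for one material if every map is stored at resolution^2."""
    return num_maps * resolution * resolution * BYTES_PER_TEXEL / 2**20

print(material_mib(1024))  # 24.0 MiB for the six-map set at 1024^2
print(material_mib(2048))  # 96.0 MiB at 2048^2: a 4x jump, not 2x
```

That quadratic jump per resolution doubling, multiplied across every map in every material, is why "one texture slider notch" moves VRAM budgets so much.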
Silva:

You clearly don't fully understand how textures work. Resolution impacts VRAM a lot more than texture quality, due to the mipmaps being loaded. That's why 4K eats so much VRAM. That said, most of the time High and Ultra look almost the same, and there's a big penalty to take. Also shadows: they never look good on the highest setting; I always prefer medium. As for a lot of other stuff, I like to tune it to my taste and optimize. At the end of the day, I usually get 2x the advertised FPS and IQ is virtually the same (or at least I'm satisfied with it).
Clearly, eh? Bit of a strong accusation. First of all, it varies pretty drastically between games. In some cases going from 1080p to 4K might only be a 10% increase in VRAM, while in others it's a 60% increase. In a lot of situations, keeping AA on causes the major penalty to VRAM (and often to performance in general), where you can sometimes double your VRAM usage. Secondly, while I'm not familiar with today's development practices, as far as I understand, mipmaps depend on the maximum texture detail level. Even if a game can dynamically scale the texture based on your resolution, so that the maximum level of detail isn't used, as far as I understand you're still loading all of it into VRAM. Considering benchmarks of VRAM usage across different resolutions and/or texture settings, the evidence seems to point to lowering texture detail having a more significant impact on VRAM usage than dropping from 4K to 1080p, more often than not. The fact of the matter is, even at 4K and max detail, many (if not most) modern games barely demand (as in, not just cache) more than 8GB already. Of the small handful of games that use more than 8GB of VRAM at 1080p, they're either:
A. So poorly optimized that even 12GB wouldn't be enough
B. Going to run like crap on a 7600 regardless of VRAM usage
C. Just barely overflowing VRAM, where DRAM can keep up just fine
So, of those games that are violently bursting at the seams of VRAM at 1080p, is it really worth raising the bar of what's an acceptable amount of VRAM, just to compensate for lazy development practices? When there are games that fit within 8GB and look better, I don't know if I'd want to give my money to developers lazy enough not to do the same.
Venix:

Textures are not just the diffuse map... that was the DX7 era! A lot of techniques use far more textures than people realize - techniques that add "almost free" eye candy, as long as you have enough VRAM, with our beloved materials. Say you have a 2-million-triangle face in Blender and you drop that to 2k triangles. With just the diffuse map, the difference would be colossal; but if you bake a normal map from the high-poly model and apply it to the low-poly model, you can fake the missing geometry and get 99.9% the same visual results for a fraction of the computational cost. Now, the normal map alone is not enough: together with it you have to make a specular map as well, which contains the information about how light behaves on the surface. So one surface already needs three textures, not just one. Since then, more and more maps have been added - roughness maps, height maps, alpha maps - so scaling textures from, say, 1024x1024 to 2048x2048 might mean six textures for each surface (although not every map has to match the diffuse texture's resolution). And that is why VRAM requirements go up. Now, Nanite and Lumen kiiiinda throw all that out the window: you use the ultra-high poly count out of the box, so you don't need materials to fake geometry, and Lumen handles the lighting great as well. So when you use those, VRAM usage actually drops (only if you drop the materials usage! Otherwise it goes waaaay up) - but it doesn't drop as much as you'd expect: high-poly models are highly compressible but huge, so compressed down they end up a bit smaller than using fully fledged materials in-game. Anyway, long story short: VRAM usage is a very complex matter and not easy to nail down, but one thing is certain - over time it goes up, and even if it stalled for a while and might stall again for a bit, it will keep going up and up (unless we find some miracle compression techniques)!
Meshes, relatively speaking, don't have that big of an impact on VRAM. As you said, there are techniques to improve quality, and some of those techniques involve making the polygon density seem higher than the asset defines. You're also right that VRAM usage isn't so simple, particularly because, as I mentioned earlier in this post, there are a lot of unoptimized development practices out there. But since there is compelling evidence that VRAM usage goes down when you lower texture detail, and very little evidence that visual fidelity is compromised, to me it just makes sense to lower it.