PowerColor Teases Radeon RX 7800/7700 XT Graphics Cards: Insights into Hellhound and Fighter Design

I could be interested in the 7800 XT. I'm hoping it comes with 20GB of memory though.
Maddness:

I could be interested in the 7800 XT. I'm hoping it comes with 20GB of memory though.
It's a 16GB card. Not that exciting, but it could be a decent card if priced correctly.
Maddness:

I could be interested in the 7800 XT. I'm hoping it comes with 20GB of memory though.
It will be a re-release of the 6800XT with slightly better raytracing and support for FSR3 frame generation. Other than that, same performance, or worse. RDNA3 is, overall, disappointing. Hoping that they get their sh|t together for RDNA4, otherwise I'll just keep my card until it breaks.
Maddness:

I could be interested in the 7800 XT. I'm hoping it comes with 20GB of memory though.
Undying:

It's a 16GB card. Not that exciting, but it could be a decent card if priced correctly.
Why would an upper-mid-range GPU need 20GB of VRAM? I'm sure there are no titles out there that would be both playable and use more than 16GB on such a GPU.
wavetrex:

It will be a re-release of the 6800XT with slightly better raytracing and support for FSR3 frame generation. Other than that, same performance, or worse.
I agree it will probably just be a 6800XT with better RT (and video codecs) but what makes you think the performance would be worse?
schmidtbag:

Why would an upper-mid-range GPU need 20GB of VRAM? I'm sure there are no titles out there that would be both playable and use more than 16GB on such a GPU. I agree it will probably just be a 6800XT with better RT (and video codecs) but what makes you think the performance would be worse?
16GB is indeed plenty for the 1440p gaming this card will be designed for. Also, the 7800 XT has 60 CUs where the 6800 XT has 72 CUs, but the core frequency and memory are clocked higher. RDNA3 has a lot better RT performance and AI accelerators (its version of tensor cores); who knows, maybe FSR3 will make use of them.
schmidtbag:

but what makes you think the performance would be worse?
Well, for one, this is supposed to have 3840 shaders or something like that. 6800XT has...
[screenshot of RX 6800 XT specs: 4608 shaders]
Second, I'm not seeing any clock speed miracles in the leaks; both of them will boost to roughly the same numbers, 2400-2500 MHz. Third, it has LESS cache and uses chiplets with higher latency. Fourth, the 7900 GRE was benchmarked and it was just barely beating a 6800 XT, or roughly equal. And that one has a lot more shaders (5120). https://www.techspot.com/review/2721-amd-radeon-7900-gre/ I fail to see how the 7800 XT, with even fewer shaders, can catch up, unless it's factory overclocked to the moon and back... (like 2.9-3 GHz), with ridiculous power consumption.
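To put rough numbers on that shader-count concern, here is a back-of-the-envelope sketch using the 3840 vs. 4608 shader counts and the ~2.4-2.5 GHz boost clocks cited above (leaked figures at the time, not confirmed specs), counting plain single-issue FP32 only:

```python
# Paper FP32 throughput at equal clocks: shaders x 2 FLOPs/clock (FMA) x GHz.
# Shader counts and the ~2.45 GHz clock are the figures cited in this thread.
def paper_tflops_fp32(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0  # GFLOPS -> TFLOPS

print(f"RX 6800 XT: 4608 shaders @ ~2.45 GHz -> ~{paper_tflops_fp32(4608, 2.45):.1f} TFLOPS")
print(f"RX 7800 XT: 3840 shaders @ ~2.45 GHz -> ~{paper_tflops_fp32(3840, 2.45):.1f} TFLOPS")
```

At equal clocks, roughly 17% fewer shaders means roughly 17% less paper throughput, which is the gap being questioned here; RDNA3's dual-issue FP32 is the main wildcard that could claw some of that back.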
The biggest gain the 7xxx series has over the 6xxx series is better RT performance. Otherwise these will be similar. Certainly not an upgrade from the matching 6xxx card.
wavetrex:

Well, for one, this is supposed to have 3840 shaders or something like that. 6800XT has...
[screenshot of RX 6800 XT specs: 4608 shaders]
Second, I'm not seeing any clock speed miracles in the leaks; both of them will boost to roughly the same numbers, 2400-2500 MHz. Third, it has LESS cache and uses chiplets with higher latency. Fourth, the 7900 GRE was benchmarked and it was just barely beating a 6800 XT, or roughly equal. And that one has a lot more shaders (5120). https://www.techspot.com/review/2721-amd-radeon-7900-gre/ I fail to see how the 7800 XT, with even fewer shaders, can catch up, unless it's factory overclocked to the moon and back... (like 2.9-3 GHz), with ridiculous power consumption.
Seems to me clock speeds make much more of a difference than you expect. For the 7900 GRE, the only way it's worse than the 6800 XT is the lower clock speeds. Otherwise, it's better in just about every way, so it should be more than just a few percent better. Clearly, those clock speeds matter. The 6700 XT has very similar specs to my ancient R9 290, but clock speeds are one of the major differences. The 6700 XT obviously benefits a lot from more VRAM, but even where VRAM isn't a bottleneck, it's still more than twice as fast. So given those two examples, it seems to me AMD GPUs benefit much more from MHz than they do from cores. At least on Windows anyway - seems to me their Windows drivers are perpetually in need of optimization.

Theoretically, the 7800 XT ought to perform better despite clock speed and transistor size being the only real improvements:
https://www.techpowerup.com/gpu-specs/radeon-rx-6800-xt.c3694
https://www.techpowerup.com/gpu-specs/radeon-rx-7800-xt.c3839
But FP16 is where the 7800 XT appears to really stand out, which is likely where the improved RT performance comes in. While I think the 7800 XT ought to be branded the 7800, I think AMD's goal is to compare in terms of RT performance, where it very likely will be an upgrade. For me personally, I still think we're a few years away from enabling RT on mid-tier GPUs (at least at 4K), so I'm probably going to opt for a 6800 when these new cards come out.
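On the FP16 point, a rough sketch of where the spec-sheet gap comes from, assuming packed FP16 runs at twice the FP32 rate on both generations and that RDNA3's FP32 figure already counts dual-issue (4608/3840 shaders as cited above; 2.25 GHz is the 6800 XT's official boost, 2.45 GHz sits in the leaked range for the 7800 XT):

```python
# Spec-sheet FP16: packed FP16 runs at 2x the FP32 rate on both RDNA2 and RDNA3,
# but RDNA3's FP32 rate itself is doubled by dual-issue, so the FP16 gap looks huge.
def paper_tflops(shaders: int, ghz: float, fp32_per_clock: int) -> float:
    return shaders * fp32_per_clock * ghz / 1000.0

fp32_6800xt = paper_tflops(4608, 2.25, 2)   # RDNA2: 2 FP32 ops/clock per shader
fp32_7800xt = paper_tflops(3840, 2.45, 4)   # RDNA3: 4 with dual-issue counted
print(f"RX 6800 XT FP16: ~{2 * fp32_6800xt:.0f} TFLOPS")
print(f"RX 7800 XT FP16: ~{2 * fp32_7800xt:.0f} TFLOPS")
```

That lands close to the spec-sheet figures on the TechPowerUp pages linked above (about 41 vs. 75 TFLOPS), but since the doubling comes from dual-issue rather than extra hardware units, it shows up mainly in compute-style workloads (and potentially RT/FSR paths) rather than in plain rasterization.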
wavetrex:

Well, for one, this is supposed to have 3840 shaders or something like that. 6800XT has...
[screenshot of RX 6800 XT specs: 4608 shaders]
Second, I'm not seeing any clock speed miracles in the leaks; both of them will boost to roughly the same numbers, 2400-2500 MHz. Third, it has LESS cache and uses chiplets with higher latency. Fourth, the 7900 GRE was benchmarked and it was just barely beating a 6800 XT, or roughly equal. And that one has a lot more shaders (5120). https://www.techspot.com/review/2721-amd-radeon-7900-gre/ I fail to see how the 7800 XT, with even fewer shaders, can catch up, unless it's factory overclocked to the moon and back... (like 2.9-3 GHz), with ridiculous power consumption.
Well, given that it's a smaller chip, it will use a lot less power overall. The 7900 cards are choked by power pretty badly; all of them can hit 3 GHz+, but with 96 CUs it's too much. At 60 CUs those higher clock speeds are much more attainable within a reasonable power envelope, since there is simply less chip. Smaller chips also have better yields, which means better bins will be available. It just depends on what TDP AMD will allow. If they allow partners to do 350 W, then you might even see some models with >3.3 GHz.
wavetrex:

So you're saying this thing will be faster than the 7900 GRE, despite having 25% fewer shaders. https://media.tenor.com/0KEvxoQb5a4AAAAC/doubt-press-x.gif
I didn't say that, but it could be closer than you think. To elaborate: a 7900 XTX will do 3.3-3.4 GHz at 600-700 W. If you take that it has 60% more CUs than the 60-CU die and scale down that power target, you end up with 375-437 W. This is incomplete, but it puts it in perspective.
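Spelling out that arithmetic as a quick check (a very crude model that assumes power scales linearly with CU count and ignores that clocks and voltage dominate real power draw):

```python
# user1's scaling: take the quoted 7900 XTX power at 3.3-3.4 GHz and shrink it by
# the CU ratio of the rumored 60-CU Navi 32 die. Purely illustrative.
xtx_cus, navi32_cus = 96, 60
for xtx_watts in (600, 700):
    scaled = xtx_watts * navi32_cus / xtx_cus
    print(f"{xtx_watts} W x {navi32_cus}/{xtx_cus} CUs ~= {scaled:.1f} W")
```

Which matches the 375-437 W range quoted above; whether AMD or its board partners would actually permit that kind of TDP on Navi 32 is a separate question.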
user1:

I didn't say that, but it could be closer than you think.
Still Mega Doubt. But I guess we'll see in a few days...
wavetrex:

Still Mega Doubt. But I guess we'll see in a few days...
Didn't Igor's Lab test the Pro version of Navi 32? He overclocked the card to match the theoretical gaming clock, but I think it still had room to improve. It ended up 10% faster than the 6800 XT; that's probably where the 7800 XT will sit. If AMD prices it at $499, as I think it will, that's not bad if you want to upgrade from, let's say, a 6700 XT like I want to.
schmidtbag:

Why would an upper-mid-range GPU need 20GB of VRAM? I'm sure there are no titles out there that would be both playable and use more than 16GB on such a GPU. I agree it will probably just be a 6800XT with better RT (and video codecs) but what makes you think the performance would be worse?
That's easy. ATM 16GB is plenty for gaming. When I bought the 3080 almost 3 years ago, 10GB was more than enough for 4K gaming. But I want to use the next card for at least 2 years and then have it still be relevant to hand down to my son. The way VRAM usage is heading, either because of how complex games are getting or because of lazy devs, I'm not sure 16GB will cut it, and I would rather invest in a 20GB card at minimum. I'd love a 7900 XTX or 4090, but both are absurdly priced in my country.
Maddness:

That's easy. ATM 16GB is plenty for gaming. When I bought the 3080 almost 3 years ago, 10GB was more than enough for 4K gaming. But I want to use the next card for at least 2 years and then have it still be relevant to hand down to my son. The way VRAM usage is heading, either because of how complex games are getting or because of lazy devs, I'm not sure 16GB will cut it, and I would rather invest in a 20GB card at minimum. I'd love a 7900 XTX or 4090, but both are absurdly priced in my country.
The 3080 when it was new had an underwhelming amount of VRAM; it definitely was not widely regarded as "more than enough". In 2020, 10GB was simply "sufficient" for native 4K, if you ignored poorly optimized games. If you were to use DLSS, then 10GB was plenty. Lazy devs are a large reason for the VRAM increases (and I personally refuse to financially support such games, because I want to discourage such behavior), but as I've said over and over again: a lot of that memory usage comes down to 4K-ready textures and AA. You don't need 16x AA at 4K and you don't need maxed-out textures at 1080p (or even 1440p, depending on the game). We're not talking about 4080+ level performance here, so why are you determining the potential needs of a game using graphical settings that your GPU can't render? I find it so odd and confusing - you want to pay more out of principle? Because like it or not, that's what you're asking for.

The RX 7700 is likely to have similar performance to the RTX 4060 Ti, and yet the latter proved to have very little benefit going from 8GB to 16GB. Wherever 16GB yielded more than a 1% improvement, the frame rate was below 60 FPS, and just because the framerate went up, that doesn't mean all 16GB was needed. The main reason I'm arguing this at all is because I don't feel like spending more money on mainstream GPUs because people keep asking for unnecessary expenses. It's the same reason we're upcharged for things like unicorn vomit and heatsinks that could substitute for the intercooler of a Mini Cooper.
I wasn't talking about the RX 7700, I was specifically talking about the RX 7800 XT as posted. Yes, you're right that the 3080 should have had more VRAM, but I have had it for almost 3 years and the card has been excellent for my needs. I personally wanted the 16GB 6900 XT, but that was vaporware at the time. The rest I agree on.
The RX 7800 XT/7700 XT would be good alternatives for gamers looking to upgrade from RX 6700 class cards and lower; I think RX 6800 and higher owners have little reason to look at 'em as upgrade choices. I hope the full functionality of FSR3 will be available to the RX 6000 class cards, though I do have my doubts... my RX 6900 XT in my 2nd rig still handles games at 3440x1440 just fine, pure rasterized gaming performance only.
PowerColor has the RX 7800 XT 'Red Devil' and 'Hellhound' cards, the latter being lower in price. For the RX 7700 XT there are not only these two versions but also the 'Fighter' range; I guess this should be priced as close as possible to (maybe even the same as) AMD's reference model.