AMD publishes Radeon VII benchmark results from 26 games

I'd like to see some info on power consumption under load.
emperorsfist:

I'd like to see some info on power consumption under load.
Well, Lisa said it's the same as Vega 64. Which is not too bad considering the performance increase.
OMG. AMD, stop using the i7-7700K for any kind of comparison tests. There are quite a few games this thing will bottleneck even at 4K.
Maddness:

Well, Lisa said it's the same as Vega 64. Which is not too bad considering the performance increase.
Sure, but I'd still wait for some 3rd party benchmarks.
Some of these FPS numbers are pretty close to the performance of the RTX 2080. I wonder what the full die would do in games? I guess we will see the fully enabled GPU in action sooner or later 🙂
Fox2232:

OMG. AMD, stop using the i7-7700K for any kind of comparison tests. There are quite a few games this thing will bottleneck even at 4K.
I find the fact that they used an Intel processor weird... the main point would be the difference between the GPUs; the CPU could have been an AMD. lol
Aura89:

I find the fact that they used an Intel processor weird... the main point would be the difference between the GPUs; the CPU could have been an AMD. lol
I think they'll stop once Ryzen 3000 chips are available.
Well, back in September I got my new Gigabyte Gaming OC 1080 Ti for $629.00. I knew I got a steal of a deal; this is just icing on the cake.
HARDRESET:

Well, back in September I got my new Gigabyte Gaming OC 1080 Ti for $629.00. I knew I got a steal of a deal; this is just icing on the cake.
Lucky ba$tard...
emperorsfist:

Lucky ba$tard...
Yes SIR!
I fail to see why so many people in these enthusiast forums are so picky about power use. The difference between even 150W and 300W will make a negligible difference on your power bills at the end of the year, and as long as the card has a decent aftermarket cooler, it makes no difference.
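As a rough sanity check of that claim, here is a back-of-envelope sketch in Python; the 3 hours/day of load and the $0.13/kWh electricity rate are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope yearly electricity cost of a GPU's load power draw.
# Assumptions (hypothetical): 3 hours of gaming per day, $0.13 per kWh.
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.13  # USD, illustrative rate

def yearly_cost(watts: float) -> float:
    """Yearly cost of a component drawing `watts` during gaming hours."""
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH

print(f"150 W card: ${yearly_cost(150):.2f}/year")  # ~$21
print(f"300 W card: ${yearly_cost(300):.2f}/year")  # ~$43
print(f"Difference: ${yearly_cost(300) - yearly_cost(150):.2f}/year")
```

Under those assumptions the 150W gap works out to roughly $21 a year, which supports the "negligible" characterization; heavier usage or pricier electricity scales the figure proportionally.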
Pretty stupid to show benchmarks at 4K, which the GPU is obviously not capable of running. But understandable, since 4K is where AMD is strongest.
FrostNixon:

Pretty stupid to show benchmarks at 4K, which the GPU is obviously not capable of running. But understandable, since 4K is where AMD is strongest.
The higher the resolution, the less likely the CPU is to limit the performance of the GPU. This can of course make other things the bottleneck, such as RAM, so it's important to judge a GPU where it's not being hindered by a CPU and not being hindered by itself (unless you're simply interested in that specific situation and want to know the difference). For Vega 64 vs Radeon VII, 4K is the most logical place to test, to see the differences between the cards and not other random factors.
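That bottleneck argument can be illustrated with a toy model in which frame time is roughly the maximum of the CPU's and the GPU's per-frame cost; all the millisecond figures below are invented for illustration:

```python
# Toy model: frame time ~= max(CPU time, GPU time) per frame.
# GPU time scales with pixel count; CPU time is roughly resolution-independent.
# All millisecond figures are hypothetical, chosen only to show the effect.
CPU_MS = 8.0             # assumed CPU cost per frame
GPU_MS_PER_MPIXEL = 3.0  # assumed GPU cost per megapixel

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MPIXEL * (w * h) / 1e6
    frame_ms = max(CPU_MS, gpu_ms)
    limiter = "CPU" if CPU_MS > gpu_ms else "GPU"
    print(f"{name}: {1000 / frame_ms:5.1f} fps, {limiter}-bound")
```

In this model the 1080p result is capped by the CPU, so two different GPUs would post the same number there; at 4K the GPU term dominates, so the comparison actually measures the card.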
icedman:

I fail to see why so many people in these enthusiast forums are so picky about power use. The difference between even 150W and 300W will make a negligible difference on your power bills at the end of the year, and as long as the card has a decent aftermarket cooler, it makes no difference.
I completely agree with you. My last system was 290X CFX; I didn't care how many watts it used, but my utility costs were lower in the winter.
residentour:

Radeon VII is actually a desktop version of the "Radeon Instinct MI50": https://www.amd.com/en/products/professional-graphics/instinct-mi50 But there is also a "Radeon Instinct MI60" with 64 compute units and 4096 stream processors: https://www.amd.com/en/products/professional-graphics/instinct-mi60 So there is still hope for a possible RTX 2080 Ti competitor.
No, sorry. 4 more compute units are not going to give you an 8-9% boost each. The RTX 2080 Ti is on average ~32-36% faster than the RTX 2080, which is equal to or faster than the Radeon VII.
residentour:

Radeon VII is actually a desktop version of the "Radeon Instinct MI50": https://www.amd.com/en/products/professional-graphics/instinct-mi50 But there is also a "Radeon Instinct MI60" with 64 compute units and 4096 stream processors: https://www.amd.com/en/products/professional-graphics/instinct-mi60 So there is still hope for a possible RTX 2080 Ti competitor.
It's not likely the MI60 desktop/gaming version will ever be released, and personally I feel there is a very simple reason for this. An MI60 version of the Radeon VII would have 32GB of HBM2 memory... way too costly to produce a consumer-grade card with, especially since that extra RAM would not likely help gaming performance all that much. Maybe if you're trying to play games at 8K? And if it doesn't come with 32GB of HBM2, then the only difference would be the compute units... 4 extra compute units, at the same frequency, would be about a 6% performance boost. Where would the pricing for this card be, for a 6% performance increase? $750 instead of $699? That would simply be... odd. The only way it'd make sense would be if it came with the 4 extra compute units and a 200-300MHz base clock increase, but I doubt that'd even be possible, as the Radeon VII is likely already hitting its "comfortable" max frequency.
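The ~6% figure is just the ratio of the compute-unit counts (60 CUs on the Radeon VII vs 64 on the full MI60 die, per the thread). A quick sketch of that arithmetic, under the optimistic assumption of perfectly linear scaling with CU count at identical clocks:

```python
# Sanity check of the compute-unit scaling argument above.
# CU counts from the thread: Radeon VII (cut-down Vega 20) vs MI60 (full die).
RADEON_VII_CUS = 60
MI60_CUS = 64

# Assuming (optimistically) perfectly linear scaling with CU count at the
# same clocks -- real games typically scale worse than this.
cu_uplift = MI60_CUS / RADEON_VII_CUS - 1
print(f"Best-case uplift from 4 extra CUs: {cu_uplift:.1%}")  # ~6.7%

# For comparison, the RTX 2080 -> RTX 2080 Ti gap quoted in the thread is
# roughly 32-36%, far beyond what 4 extra CUs alone could buy.
print("Gap to close for a 2080 Ti competitor: ~32-36%")
```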
Aura89:

It's not likely the MI60 desktop/gaming version will ever be released, and personally I feel there is a very simple reason for this. An MI60 version of the Radeon VII would have 32GB of HBM2 memory... way too costly to produce a consumer-grade card with, especially since that extra RAM would not likely help gaming performance all that much. Maybe if you're trying to play games at 8K? And if it doesn't come with 32GB of HBM2, then the only difference would be the compute units... 4 extra compute units, at the same frequency, would be about a 6% performance boost. Where would the pricing for this card be, for a 6% performance increase? $750 instead of $699? That would simply be... odd. The only way it'd make sense would be if it came with the 4 extra compute units and a 200-300MHz base clock increase, but I doubt that'd even be possible, as the Radeon VII is likely already hitting its "comfortable" max frequency.
Maybe an updated version could add HDMI 2.1 and the full set of compute units. Not a new model existing alongside the current one, but a replacement, like the updated GTX 1080 with better memory.
Aura89:

I find the fact that they used an Intel processor weird... the main point would be the difference between the GPUs; the CPU could have been an AMD. lol
On the other hand, most motherboards for AMD use an Intel LAN chip too... the two companies aren't that opposed in the real world (even more so in the pro segment, where there are Intel motherboards with a lot of AMD inside and AMD motherboards with a lot of Intel inside). If you want to release a benchmark, you should use the most-used CPU even if it is from the other company; also, Ryzen is very hard to get in some countries, while Intel is well distributed all around the world. But I agree that they could have added one of their own CPUs (then again, look at AMD's "update hardware advisor" app... on a friend's (Intel) computer it rated every Ryzen as less powerful than a 6th-gen Intel for the program he owns... AMD should fire the programmer lol!!!)
rl66:

On the other hand, most motherboards for AMD use an Intel LAN chip too... the two companies aren't that opposed in the real world (even more so in the pro segment, where there are Intel motherboards with a lot of AMD inside and AMD motherboards with a lot of Intel inside).
I feel that comment would hold more ground if AMD actually made an AMD-branded motherboard with Intel LAN or something else on it. Otherwise, Intel LAN on "AMD" motherboards has less to do with AMD and more to do with the motherboard manufacturer. That being said, I feel like this testing must have been done by some third-party company, as AMD is more likely to have AMD parts lying around than Intel parts. The fact they did it on Intel parts kind of makes it seem like AMD doesn't believe in its own products. But at 4K that doesn't make sense, since CPUs from both companies would perform very, very similarly. Like I said, I find it... weird. It's not really a problem, just weird. I very much doubt that if/when Intel makes their GPU, they will ever test (officially and publicly) on an AMD system lol