GPUScore Relic Of Life benchmarks with 22 GPUs


Almost as powerful as a 6800 XT.

[benchmark screenshots attached]
"So here we see the GeForce RTX 3080 is faster on the Vulkan API, at the same workload but only marginally." But the graph shows that Vulkan is slower?
Gigabyte RTX 3080 Gaming OC, Res: 2560x1440
Vulkan 1.2: Linux (Manjaro) 5643, Windows 11 5751
DirectX 12: Windows 11 5796
Why is anyone complaining about this when:
A. This is a synthetic benchmark, and those don't tend to represent realistic workloads.
B. It's pushing the latest technology. It's not exactly news that AMD lags when it comes to RT performance. Anyone who bought such a GPU is assumed to not care that much about RT, in which case, this benchmark isn't relevant to you.
schmidtbag:

B. It's pushing the latest technology. It's not exactly news that AMD lags when it comes to RT performance. Anyone who bought such a GPU is assumed to not care that much about RT, in which case, this benchmark isn't relevant to you.
It's relevant when they want to post claims of favouritism.
Nvidia bet big on RT and risked quite a bit, but it is starting to pay off, especially in conjunction with DLSS. RT is showing up more and more, and will be mainstream now that the consoles have added RT and are promoting it, even though it is pretty much RT lite! I am sure AMD will get on top of it with the next gen, but for now Nvidia has the edge, with AMD competing more than well enough on the rasterization side.
vMax1965:

Nvidia bet big on RT and risked quite a bit, but it is starting to pay off, especially in conjunction with DLSS. RT is showing up more and more, and will be mainstream now that the consoles have added RT and are promoting it, even though it is pretty much RT lite! I am sure AMD will get on top of it with the next gen, but for now Nvidia has the edge, with AMD competing more than well enough on the rasterization side.
Actually, I will disagree. While the architectural design started long before the 2xxx series debuted in 2018, Nvidia pushed prices high and introduced DLSS and RTX in a generation where the rasterisation performance crown was never in danger of being contested. So they had the feature advantage, and still do, but at the time there was not even an alternative for top raster performance: you could think RTX was not for you, maybe a gimmick or whatever, and it did not matter at all, because if you wanted one of the top three raster cards, Nvidia was the only option there too. So yes, they took a bet, but dear lord, when they launched the 2xxx series it was THE ideal time for such a thing 😛.
vMax1965:

Nvidia bet big on RT and risked quite a bit, but it is starting to pay off, especially in conjunction with DLSS. RT is showing up more and more, and will be mainstream now that the consoles have added RT and are promoting it, even though it is pretty much RT lite! I am sure AMD will get on top of it with the next gen, but for now Nvidia has the edge, with AMD competing more than well enough on the rasterization side.
The fact is, Nvidia's implementation of RT and DLSS is proprietary to them; it's not something a number of producers can just dip into and provide a comparison. It's hardware that needs the software to be written in a particular way to work and bring out its best, for which they charge a nice licence amount. AMD has the same issue in that its approach is unique, but the main difference is they give the tech away free of charge, much like G-Sync versus FreeSync. For this reason the benchmark is flawed: it focuses too much on one competitor's implementation, not on a universal test that all competitors use.

RT in games is pointless right now, even with my 3090; it just doesn't make a massive difference to my experience. Running my 6900 XT and the 3090 side by side, same monitor etc., they look identical. Honestly, I am hard pressed to see what RT brings to the game other than a much larger power draw, and both systems run the game within a few frames of each other, especially as the resolution ramps. Benchmarks like this provide a nice chart where one producer sits at the top, but they don't offer a fair comparison; in fact, they just confirm what we already know: Nvidia's RT cores are better than AMD's at the moment. Raster performance, however, is another matter; try a raster comparison and the table will look a lot different.
suty455:

The fact is, Nvidia's implementation of RT and DLSS is proprietary to them; it's not something a number of producers can just dip into and provide a comparison. It's hardware that needs the software to be written in a particular way to work and bring out its best, for which they charge a nice licence amount. AMD has the same issue in that its approach is unique, but the main difference is they give the tech away free of charge, much like G-Sync versus FreeSync. For this reason the benchmark is flawed: it focuses too much on one competitor's implementation, not on a universal test that all competitors use.

RT in games is pointless right now, even with my 3090; it just doesn't make a massive difference to my experience. Running my 6900 XT and the 3090 side by side, same monitor etc., they look identical. Honestly, I am hard pressed to see what RT brings to the game other than a much larger power draw, and both systems run the game within a few frames of each other, especially as the resolution ramps. Benchmarks like this provide a nice chart where one producer sits at the top, but they don't offer a fair comparison; in fact, they just confirm what we already know: Nvidia's RT cores are better than AMD's at the moment. Raster performance, however, is another matter; try a raster comparison and the table will look a lot different.
What RT games have you played? Godfall? RE Village?
Don't know why G3D has such high scores for the 3070; I only get about 3400.
Kaleid:

Don't know why G3D has such high scores for the 3070; I only get about 3400.
They always seem to be a bit higher. I figure it's because of super-optimized testbench setups, whereas most of us are running it on our daily drivers. I get 630-ish in CPU-Z single-core on my 5800X, for instance... the reviews have it at 660.
Kaleid:

Don't know why G3D has such high scores for the 3070; I only get about 3400.
My scores for my RTX 3090 were a bit higher than Guru3D's: 7258 for DX12 and 6936 for Vulkan, on Windows 11 Enterprise.
I'm just waiting for the Radeon 7xxx series to launch and be good at ray tracing, to see all the fanboys changing their minds that RT is good now. You could already see it before the 6xxx series launched: when there was no info on RT performance, nothing was said, and only after it turned out that AMD is not that great at it was RT brushed off as an unwanted and unneeded feature again, same as when the RTX 2xxx series launched.

I've had a 6900 XT; it was a really performant card in raster, but RT was not good enough. AMD is well aware of it, and all the games that use RT and are sponsored by AMD, like FC6, DIRT 5 and RE: Village, use RT in a light way, with for example half-resolution reflections and no advanced features like GI.

It's not a matter of the benchmark preferring one manufacturer over the other. The RT cores in RTX cards do the full BVH traversal calculation plus the triangle/bounding-box intersection tests, while AMD's solution does just the intersection part in hardware. I think their RT solution in RDNA3 will be a full-blown one that is as good as Nvidia's or better, but for now, while they have it and can do it, it's not on par with Nvidia's solution, and every benchmark that uses RT will show it.
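The traversal-versus-intersection split described above can be sketched in a few lines of Python. This is an illustrative model, not any vendor's actual API: `hit_aabb` stands in for the box-intersection test that both vendors accelerate in hardware, while the `traverse` loop is the part that runs as fixed-function logic on Nvidia's RT cores but as shader code on RDNA2's Ray Accelerators.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def hit_aabb(origin, inv_dir, box):
    """Slab test for a ray against a box. RDNA2 accelerates roughly
    this step in hardware (plus the triangle test at the leaves)."""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        t1 = (box.lo[a] - origin[a]) * inv_dir[a]
        t2 = (box.hi[a] - origin[a]) * inv_dir[a]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

@dataclass
class Node:
    box: AABB
    left: "Node" = None
    right: "Node" = None
    prim: int = None  # leaf primitive id, None for inner nodes

def traverse(origin, direction, root):
    """Stack-based BVH walk. On Nvidia RT cores this whole loop is
    fixed-function; on RDNA2 it is issued from shader code, with only
    the hit_aabb-style tests offloaded to hardware."""
    inv = tuple(1.0 / d if d != 0 else float("inf") for d in direction)
    stack, hits = [root], []
    while stack:
        n = stack.pop()
        if not hit_aabb(origin, inv, n.box):
            continue  # cull this subtree
        if n.prim is not None:
            hits.append(n.prim)  # a real tracer would run the triangle test here
        else:
            stack += [n.left, n.right]
    return hits
```

The per-ray loop (stack management, node fetches, branching) is exactly the work that stays on the shader cores in AMD's scheme, which is one plausible reason RT-heavy benchmarks show the gap the comment describes.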
Is it only me, or does anyone else think this benchmark looks really bad? Talking about the animation, design, geometry and textures, of course. Combining reflections with shadows, global illumination and volumetric lighting, then multiplying them all in a mirror, doesn't sound smart at all, only taxing. Maybe that is what they intend, but it seems silly to me, because they went with really simplistic models and everything they put together. Maybe I didn't get the point, but I think they failed in their promise of delivering "high-end gamelike 3D content". [youtube=2yzLQpHQHxg]
Krizby:

What RT games have you played? Godfall? RE Village?
Cyberpunk, Doom and a few others, on a 4K and an ultrawide monitor. I just don't see a massive change from AMD to Nvidia unless of course RT and DLSS are on, and even then it's a small amount of candy, and I mean very small, if at all noticeable whilst playing.
suty455:

The fact is, Nvidia's implementation of RT and DLSS is proprietary to them; it's not something a number of producers can just dip into and provide a comparison. It's hardware that needs the software to be written in a particular way to work and bring out its best, for which they charge a nice licence amount. AMD has the same issue in that its approach is unique, but the main difference is they give the tech away free of charge, much like G-Sync versus FreeSync. For this reason the benchmark is flawed: it focuses too much on one competitor's implementation, not on a universal test that all competitors use.

RT in games is pointless right now, even with my 3090; it just doesn't make a massive difference to my experience. Running my 6900 XT and the 3090 side by side, same monitor etc., they look identical. Honestly, I am hard pressed to see what RT brings to the game other than a much larger power draw, and both systems run the game within a few frames of each other, especially as the resolution ramps. Benchmarks like this provide a nice chart where one producer sits at the top, but they don't offer a fair comparison; in fact, they just confirm what we already know: Nvidia's RT cores are better than AMD's at the moment. Raster performance, however, is another matter; try a raster comparison and the table will look a lot different.
Your statement is false. The code of this benchmark (or any other) is the same for all vendors; the code must therefore be hardware agnostic. The 3DMark DX12 RT benchmark (Port Royal) shows a similar difference, and so do the games that use diverse RT characteristics. You already know the real situation, given your own claims about traditional rasterization, where a minimal difference favours AMD.
suty455:

Cyberpunk, Doom and a few others, on a 4K and an ultrawide monitor. I just don't see a massive change from AMD to Nvidia unless of course RT and DLSS are on, and even then it's a small amount of candy, and I mean very small, if at all noticeable whilst playing.
Not true; you can see on this site: https://gamegpu.com/ https://gamegpu.com/action-/-fps-/-tps/cyberpunk-2077-v-1-5-test-gpu-cpu
RT test without DLSS or FSR (1080p and 1440p):

[chart attachment]

Same at 2160p:

[chart attachment]

2160p and RT with DLSS or FSR, only playable on the 3080, 3080 Ti and 3090:

[chart attachment]
AlmondMan:

"So here we see the GeForce RTX 3080 is faster on the Vulkan API, at the same workload but only marginally." But the graph shows that Vulkan is slower?
This!! There are numerous passages in this short review saying Vulkan is faster, yet all the graphs show DX12 being faster????
AlmondMan:

They always seem to be a bit higher. I figure it's because of super-optimized testbench setups, whereas most of us are running it on our daily drivers. I get 630-ish in CPU-Z single-core on my 5800X, for instance... the reviews have it at 660.
I do 674 single-core with PBO on and single-core clocks hitting 5.05 GHz.
suty455:

Cyberpunk, Doom and a few others, on a 4K and an ultrawide monitor. I just don't see a massive change from AMD to Nvidia unless of course RT and DLSS are on, and even then it's a small amount of candy, and I mean very small, if at all noticeable whilst playing.
As shown in the results above, RT alone already makes a huge difference. DLSS is just an added bonus to get more fps at the cost of some detail, almost the same as FSR, with the exception that FSR doesn't add anything close to AA without a specific AA mode enabled. Both have their ups and downs too, although the issues are mainly caused by how different game engines are developed and updated, which unfortunately does make FSR look much worse than DLSS in certain game implementations. If LOD etc. were done properly in those games it wouldn't be as bad at all.
RT can look great. It's not just reflections, as many people I speak to think it is, but it depends on how it's done: some games don't look better, and then there's the added performance hit. DLSS is a massive thing for Nvidia; anything over 4K is a pure waste of GPU power as well. Hopefully in a couple of generations for AMD and Nvidia, RT will be awesome in its full range of effects. AMD needs custom cores in its next line and needs to come out swinging.