HDR Benchmarks perf impact on AMD (-2%) vs Nvidia (-10%) - 12 Games at 4K HDR

1440p, AC Origins HDR with the GTX 1080 Ti: -2 fps.
This looks like a bench made to prove something in favor of AMD. Anyway, the fact is that the GTX 1080 is not the green team's flagship graphics card, while the RX Vega 64 is the red team's flagship. If we want to be objective, then let's run SDR/HDR tests of the 1080 Ti vs the RX Vega 64. The point of these tests is just to make out that the RX Vega 64 is a better card than the 1080, which simply is not true at all IMO. The RX Vega 64 has a 295 W TDP, while the 1080 has a 180 W TDP, which is a 48.4% difference, and the GTX 1080 still overclocks much more than the RX Vega 64. I would be very happy to see AMD beating Nvidia in TDP and power consumption, meaning much less power-hungry and much higher-performance graphics cards from AMD. More efficiency from AMD is the name of the game.
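(Aside: the 48.4% figure above is the symmetric percentage difference, taken against the midpoint of the two TDPs; relative to the GTX 1080's 180 W the gap is closer to 64%. A quick sanity check in Python, using only the TDPs quoted in the post:)

```python
# TDP gap between RX Vega 64 (295 W) and GTX 1080 (180 W),
# computed two ways to show where the 48.4% figure comes from.
vega, gtx = 295.0, 180.0

# Relative to the GTX 1080's TDP (the usual reading): ~63.9% higher.
print(f"Vega 64 draws {(vega - gtx) / gtx:.1%} more than the GTX 1080")

# Symmetric percentage difference (relative to the midpoint): ~48.4%.
mid = (vega + gtx) / 2
print(f"Symmetric difference: {(vega - gtx) / mid:.1%}")
```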
robintson:

This looks like a bench made to prove something in favor of AMD. […]
I think you've missed the point of these benchmarks. This test isn't meant to determine the better card, but rather to showcase the performance penalty the RX Vega 64 and the GTX 1080 each suffer when HDR is enabled. It looks to me as if something within Nvidia's driver is amiss, and they'll have this sorted out. There's absolutely no favoritism or bias here. The only people who would find these results offensive are fanboys.
robintson:

This looks like a bench made to prove something in favor of AMD. […]
The GTX 1080 and Vega 64 are on the same performance level in general, and the benchmark is made to show the discrepancies between HDR and SDR. AMD never saw the Vega 64 as a "flagship" product (it is a compute chip, after all) to compete with the likes of the Titan and the Ti. As for power consumption, spare us. A factory-overclocked Vega 64 Nitro+ at 1630 MHz never goes over 276 W power consumption. A GTX 1080 Ti Xtreme (and all factory-overclocked models at 2012 MHz) burns north of 330 W constantly. (I have both.) And when was the last time you checked the massive power consumption of an overclocked GTX 1080? These things do not consume power linearly; there is a sharp knee above the 1780 MHz clock. A watercooled GTX 1080 Armor at 2190 MHz core was burning 325 W. For the same performance level, the Vega 64 has to run at 1724 MHz, burning exactly the same amount of power (330 W).
robintson:

This looks like a bench made to prove something in favor of AMD. […]
What you said has absolutely nothing to do with what is shown in these graphs. The facts show that AMD's performance is a lot higher in these tests than Nvidia's. This has nothing to do with a 1080 not being top of the range. By your miscalculation, you are saying that if the top card had been in play, HDR would have made no difference, and those who bought lesser cards can suffer. Fact: Nvidia's drivers right now suck. Hopefully Nvidia can fix it with a driver update; all the people who bought G-Sync Premium monitors may be a bit upset that they get this big a loss, and will also pray that a driver will fix it.
I can confirm zero performance impact in Far Cry 5 with HDR on a Vega 64. I think the fact that Nvidia's drivers install with limited-range color somehow has something to do with this result. They hide something, and HDR reveals it. Just a guess, but maybe it has something to do with the fact that Nvidia has to drop chroma info in HDR. Not sure, but it's interesting.
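(Aside: "limited-range color" refers to video levels, where 8-bit black and white sit at 16 and 235 instead of 0 and 255. A minimal sketch of that mapping, for reference:)

```python
# Minimal sketch of full-range (PC, 0-255) vs limited-range (video, 16-235)
# 8-bit levels -- the "limited colour" setting the comment refers to.
def full_to_limited(v: int) -> int:
    """Map a full-range 8-bit value onto limited-range video levels."""
    return round(16 + (v / 255) * (235 - 16))

print(full_to_limited(0))    # 16  (black)
print(full_to_limited(255))  # 235 (white)
```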
robintson:

This looks like a bench made to prove something in favor of AMD. […]
TDP... if you drop clock speeds on a Vega 64 by just 5% and compensate by overclocking the HBM2, you will slash power in half. I'm not talking about just an undervolt; I'm talking about dropping the clock speed as well. I'm not exaggerating: you slash power in half. AMD set their clocks too high, into the red zone; the cards gain huge amounts of efficiency when you drop the clock speeds.
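(Aside: the mechanism behind this is the classic dynamic-power relation P ≈ C·V²·f; lowering the clock lets the voltage drop too, so power falls much faster than frequency. A rough sketch of that scaling; the voltages below are illustrative assumptions, not measured Vega 64 values:)

```python
# Rough sketch of why a small clock drop can cut GPU power so sharply:
# dynamic power scales as P ~ C * V^2 * f, and a lower clock permits a
# lower voltage. The voltages here are assumptions for illustration.
def rel_power(freq_mhz: float, volts: float,
              f0: float = 1630.0, v0: float = 1.20) -> float:
    """Dynamic power relative to an assumed stock point (f0, v0)."""
    return (freq_mhz / f0) * (volts / v0) ** 2

# 5% lower clock combined with an (assumed) undervolt 1.20 V -> 0.95 V:
print(f"{rel_power(1630 * 0.95, 0.95):.0%} of stock power")  # ~60%
```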
Maybe nVidia's memory compression methods don't work very well with the bright pixels, so the 256-bit memory bus on that 1080 returns to what it actually is, i.e. $250 videocard material. Just beating around the bush, of course...
Same settings? Is Nvidia HDR 10-bit full range?
Vega 64 is better than I remember.
I wonder if Vulkan will ever have HDR. Just keep in mind, peeps: if you buy an HDR screen, avoid anything higher than 600 nits unless it's a TV. Honestly, 600 nits is blinding from 2 feet away; anything more is folly.
Maybe the answer is easy: Vega 64 at 12.6 TFLOPS vs the GTX 1080 Ti at 10.6 TFLOPS?
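(Aside: those figures follow from shader count × 2 FLOPs per FMA × clock. A quick check against the reference specs; note the 10.6 TFLOPS figure corresponds to the 1080 Ti's base clock, and the card actually benchmarked here was the GTX 1080:)

```python
# FP32 throughput = shader count * 2 ops (fused multiply-add) * clock.
# Clocks below are the reference specs; real cards boost higher.
def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"RX Vega 64:  {tflops(4096, 1546):.1f} TFLOPS")  # ~12.7 (boost)
print(f"GTX 1080 Ti: {tflops(3584, 1480):.1f} TFLOPS")  # ~10.6 (base)
print(f"GTX 1080:    {tflops(2560, 1733):.1f} TFLOPS")  # ~8.9 (boost)
```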
robintson:

This looks like a bench made to prove something in favor of AMD. […]
If they had used a 1080 Ti instead, I assume you would have been here posting that the whole point of the test was to make Nvidia look better and AMD worse. Right?
HH, any chance you can run your own tests that are more reliable? Last time I checked, HDR should have little to no performance hit; it's just color reproduction, right?
Ricepudding:

HH, any chance you can run your own tests that are more reliable? Last time I checked, HDR should have little to no performance hit; it's just color reproduction, right?
Flawed test. The performance loss was due to the way Nvidia converts RGB to reduced-chroma YCbCr 4:2:2 in HDR, which was used for their HDR test. If they had used 4:4:4 or RGB, the performance would be the same as SDR. Nice to know.
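(Aside: a plausible reason a driver ends up at 4:2:2 is link bandwidth. A rough HDMI 2.0 link-budget sketch, assuming a 4K60 display on the standard 594 MHz pixel clock; the per-format factors are a simplification of how HDMI packs pixels onto the TMDS channels:)

```python
# Why HDMI 2.0 pushes 4K60 HDR toward chroma subsampling: a link-budget sketch.
# HDMI 2.0 tops out at a 600 MHz TMDS character rate per channel, and 4K60
# (CTA-861 timing, including blanking) needs a 594 MHz pixel clock. Deep color
# multiplies the character rate; 4:2:2 (up to 12-bit) packs into the base rate.
pixel_clock = 594e6  # Hz, 3840x2160 @ 60 Hz with blanking
tmds_limit = 600e6   # Hz, HDMI 2.0 per-channel ceiling

for fmt, factor in [("RGB/4:4:4  8-bit ", 8 / 8),
                    ("RGB/4:4:4 10-bit ", 10 / 8),
                    ("YCbCr 4:2:2 10/12-bit", 8 / 8)]:
    rate = pixel_clock * factor
    verdict = "OK" if rate <= tmds_limit else "exceeds HDMI 2.0"
    print(f"{fmt}: {rate / 1e6:.1f} MHz -> {verdict}")
```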
Texter:

Maybe nVidia's memory compression methods don't work very well with the bright pixels, so the 256-bit memory bus on that 1080 returns to what it actually is, i.e. $250 videocard material. Just beating around the bush, of course...
Could be, rabbit, could be... if it comes down to bus size, anyway. They just love cutting down the bus width to save money; it really pissed me off when they put out the 7600 GT to replace the 6800 with a 128-bit bus. It doesn't hurt as much until the resolution goes up, then they crap out, lol. Can any 1080 Ti users say different on the performance hit?
nizzen:

Flawed test. The performance loss was due to the way Nvidia converts RGB to reduced-chroma YCbCr 4:2:2 in HDR, which was used for their HDR test. If they had used 4:4:4 or RGB, the performance would be the same as SDR. Nice to know.
Could be, rabbit, could be... just a setting?
Honestly, this is hardly front-page news. At the very least, Hilbert should update the post to reflect the oversight in Computerbase's testing methods.