HDR Benchmarks perf impact on AMD (-2%) vs Nvidia (-10%) - 12 Games at 4K HDR

Pimpiklem:

I wonder if this is related to this problem? http://www.guru3d.com/news-story/asus-and-acer-uhd-g-sync-hdr-monitors-forced-to-use-color-compression-at-120144-hz.html Much wiser and smarter people in here can answer that for me.
Not really, no. HDMI 2.0 carries up to 18 Gbit/s and DisplayPort 1.4 up to 32.4 Gbit/s, but the testing here runs at 60 Hz, so both links provide enough bandwidth. The catch with G-Sync is that it only works over DisplayPort, but that isn't the issue we are seeing here since HDMI 2.0 is being used.
DP 1.2 has 21.6 Gbit/s raw (17.28 Gbit/s effective), DP 1.4 has 32.4 Gbit/s raw (25.92 Gbit/s effective), HDMI 2.0 has a max of 18 Gbit/s raw (14.4 Gbit/s effective), and HDMI 2.1 has a theoretical ~42.6 Gbit/s effective.
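To put those figures in context for the 4K60 HDR case being tested here, below is a rough back-of-the-envelope sketch in Python. The CTA-861 4K60 timing of 4400x2250 at a 594 MHz pixel clock and the effective link rates above are assumptions on my part, not figures from the article:

```python
# Rough link-bandwidth check for a 4K60 HDR signal (a sketch, not a spec tool).
# Assumed: the standard CTA-861 4K60 timing (4400 x 2250 total, 594 MHz pixel
# clock) and effective post-encoding link rates of 14.4 Gbit/s for HDMI 2.0
# and 25.92 Gbit/s for DisplayPort 1.4 HBR3.

PIXEL_CLOCK_HZ = 4400 * 2250 * 60  # = 594 MHz

LINKS = {
    "HDMI 2.0 (14.4 Gbit/s effective)": 14.4e9,
    "DisplayPort 1.4 (25.92 Gbit/s effective)": 25.92e9,
}

FORMATS = {
    # bits actually sent per pixel
    "RGB 4:4:4, 8-bit": 24,
    "RGB 4:4:4, 10-bit": 30,
    "YCbCr 4:2:2, 10-bit (24-bit container)": 24,
    "YCbCr 4:2:0, 10-bit": 15,
}

for fmt, bpp in FORMATS.items():
    required = PIXEL_CLOCK_HZ * bpp  # bit/s on the wire
    fits = [name for name, rate in LINKS.items() if required <= rate]
    print(f"{fmt}: {required / 1e9:.2f} Gbit/s -> fits: {', '.join(fits) or 'neither'}")
```

This is only an approximation, but it matches the practical outcome at 60 Hz: 10-bit RGB does not fit on HDMI 2.0, which is why HDR TVs fall back to 4:2:2 or 4:2:0 over HDMI, while DisplayPort 1.4 carries it with headroom. Either way, bandwidth alone doesn't explain the performance gap in the article.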
Corbus:

Vega 64 is better than I remember.
Yes it is. Almost 10 months' worth of driver updates have improved performance a lot. Even the difference between 18.6.1 and 18.7.1 is around 5% on average. Many games have also received driver optimisations since launch, some gaining 20%+.
Haven't AMD's hardware and drivers been processing the equivalent of HDR internally for years now, before downconverting for the display? It shouldn't really impact AMD cards at all beyond the margin of error for benchmarking.
robintson:

This looks like a bench made to prove something in favor of AMD. Anyway, the fact is that the GTX 1080 is not the green team's flagship graphics card, while the RX Vega 64 is the red team's flagship. If we want to be objective, then let's run SDR/HDR tests of the 1080 Ti vs the RX Vega 64. The point of these tests is just to make out that the RX Vega 64 is a better card than the 1080, which simply is not true at all IMO. The RX Vega 64 has a 295 W TDP, while the 1080 has a 180 W TDP, which is a 48.4% difference, and the GTX 1080 still overclocks much more than the RX Vega 64. I would be very happy to see AMD beating Nvidia in TDP/power consumption, meaning much less power-hungry and much higher-performance graphics cards from AMD. More efficiency from AMD is the name of the game.
It doesn't have to be. Nvidia used to win tessellation tests quite easily because they were stronger at that; AMD even had to add a slider to cut down tessellation levels to keep up. Not that it really mattered much, because not all games used tessellation, but I do remember Crysis 2 taking a huge hit on the AMD card I had (the tessellation addition was sponsored by Nvidia, though). The same is true here: how many people have HDR screens? By the time they become more common, Nvidia will put in the extra effort to catch up.
Krzysztof:

Maybe the answer is easy: Vega at 12.6 TFLOPS vs the GTX 1080 Ti at 10.6 TFLOPS?
It is not as easy as that... AMD and Nvidia don't render the same thing the same way. It's like comparing an American muscle car with a German hybrid GT: both do the job, but not the same way.
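For anyone wondering where peak TFLOPS figures like those come from, they are just shader count x 2 FLOPs per clock (FMA) x boost clock. A quick sketch using the public reference boost clocks (my numbers, not the article's):

```python
# Peak FP32 throughput = shaders * 2 FLOPs per clock (fused multiply-add) * boost clock.
# Boost clocks below are the public reference specs; actual clocks vary per board.

CARDS = {
    "RX Vega 64":  (4096, 1.546e9),   # (shader count, boost clock in Hz)
    "GTX 1080":    (2560, 1.733e9),
    "GTX 1080 Ti": (3584, 1.582e9),
}

for name, (shaders, boost_hz) in CARDS.items():
    tflops = shaders * 2 * boost_hz / 1e12
    print(f"{name}: {tflops:.2f} TFLOPS peak FP32")
```

As the reply above says, those peak numbers don't translate directly into frame rates, since the two architectures spend their FLOPs very differently.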
Kaleid:

It doesn't have to be. Nvidia used to win tessellation tests quite easily because they were stronger at that; AMD even had to add a slider to cut down tessellation levels to keep up. Not that it really mattered much, because not all games used tessellation, but I do remember Crysis 2 taking a huge hit on the AMD card I had (the tessellation addition was sponsored by Nvidia, though). The same is true here: how many people have HDR screens? By the time they become more common, Nvidia will put in the extra effort to catch up.
I agree, but as 4K displays are now at last year's 1080p prices, HDR shouldn't be too far behind in terms of mass production either... we will see.
Interesting findings. The thing that puzzles me about this whole HDR craze is that you can pick up a 55" LG 4K HDR TV for less than $400 USD, but when it comes to a PC monitor, HDR is a $500+ premium. What a joke! It's the single most overpriced feature on monitors these days. It even beats the premium people pay for G-Sync. Pure ridiculousness.
Again, this isn't an issue. As long as the display is using RGB Full at 4K when rendering HDR and it is set as such in the NVCP (which I posted in my previous reply), there is no performance loss.
ubercake:

Interesting findings. The thing that puzzles me about this whole HDR craze is that you can pick up a 55" LG 4K HDR TV for less than $400 USD, but when it comes to a PC monitor, HDR is a $500+ premium. What a joke! It's the single most overpriced feature on monitors these days. It even beats the premium people pay for G-Sync. Pure ridiculousness.
You can find some good ones out there for not too bad a price. I got one recently on sale for $440 and it's been great, granted the HDR implementation isn't the best out there: the Acer XZ321QU. It's basically a Predator, except it has HDR instead of G-Sync, which was a bigger deal to me. Even so, it looks amazing when I have HDR turned on and is well worth it. I'm still surprised to hear HDR takes an FPS hit, although I haven't noticed it myself.
ubercake:

Interesting findings. The thing that puzzles me about this whole HDR craze is that you can pick up a 55" LG 4K HDR TV for less than $400 USD, but when it comes to a PC monitor, HDR is a $500+ premium. What a joke! It's the single most overpriced feature on monitors these days. It even beats the premium people pay for G-Sync. Pure ridiculousness.
Well, for instance, my 800-euro Philips 55" 4K HDR TV does not support the same 4:4:4 colour that my monitor does, despite the monitor being 8-bit and the TV 10-bit. And when I scroll, I can see more problems with text, for instance; it's not as high quality as the monitor on many levels.
Yakk:

Haven't AMD's hardware and drivers been processing the equivalent of HDR internally for years now, before downconverting for the display? It shouldn't really impact AMD cards at all beyond the margin of error for benchmarking.
Both vendors process at up to 11 bits internally.
The whole thing looks a bit fishy...
But but but... RTX ray tracing! IDK, it's like Nvidia doesn't care except with their own tech. I'm sure when the HDR adoption rate increases, they'll consider looking into it. Until then: oh look, GameWorks!
There is only one important thing to note: less than 60 fps on average, with dips to pretty low fps. 4K HDR is too demanding even for high-end cards.
maikai:

Yup, the premium the manufacturers are charging is absolutely absurd and straight highway robbery. Interestingly, my Vizio P55-C1 shows no such issue when scrolling web pages etc. I finally have a screen I prefer to my ASUS VG278H, which came out in 2011.
It takes some time to notice this, and perhaps my settings were slightly wrong... but I don't think there is any TV that matches a proper monitor. It also seems harder to get a TV with no backlight bleeding compared to monitors, or perhaps I've just been lucky.
DxVinxGado:

Again, this isn't an issue. As long as the display is using RGB Full at 4K when rendering HDR and it is set as such in the NVCP (which I posted in my previous reply), there is no performance loss.
? You can't use RGB with HDR, and you can't use Full range with HDR. You can only use YCbCr 4:2:2 with Limited range when using HDR. Also, Dolby Vision is only 4:2:0 Limited, so I don't understand where you got this information. That is also what Nvidia does automatically; if you don't use those settings, the HDR signal won't work on the TV or monitor.
X7007:

? You can't use RGB with HDR, and you can't use Full range with HDR. You can only use YCbCr 4:2:2 with Limited range when using HDR. Also, Dolby Vision is only 4:2:0 Limited, so I don't understand where you got this information. That is also what Nvidia does automatically; if you don't use those settings, the HDR signal won't work on the TV or monitor.
? The Asus PG27UQ works like this: 4:4:4 HDR at 98 Hz, 4:2:2 HDR at 120 Hz, overclocked 4:2:2 HDR at 144 Hz. Set 4K, 120 Hz, 8-bit, SDR, RGB for normal usage; set 4K, 98 Hz, 10-bit, HDR, 4:4:4 for HDR usage.
At the risk of further complicating things, let's quickly go over the supported refresh rates and settings for SDR and HDR content. Here are the ones for SDR:
98 Hz, RGB (4:4:4), 10-bit color depth
120 Hz, RGB (4:4:4), 8-bit color depth
144 Hz, YCbCr 4:2:2, 8-bit color depth
And here are the supported modes and settings for HDR content:
98 Hz, RGB (4:4:4), 10-bit color depth
120 Hz, YCbCr 4:2:2, 10-bit color depth
144 Hz, YCbCr 4:2:2, 10-bit color depth
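Those PG27UQ modes line up with what DisplayPort 1.4 can physically carry. Here is a hedged sketch that checks each HDR mode against the 25.92 Gbit/s effective HBR3 rate; the reduced-blanking total of roughly 3920 x 2222 pixels is my assumption, not a figure from the post:

```python
# Check which 4K HDR modes fit inside DisplayPort 1.4's usable bandwidth.
# Assumed: four HBR3 lanes (32.4 Gbit/s raw, 25.92 Gbit/s after 8b/10b coding)
# and a CVT-R2-style reduced-blanking total of about 3920 x 2222 pixels.

DP14_EFFECTIVE = 25.92e9        # usable bit/s
H_TOTAL, V_TOTAL = 3920, 2222   # active 3840 x 2160 plus blanking

MODES = [
    # (label, refresh rate in Hz, bits sent per pixel)
    ("98 Hz, RGB 4:4:4, 10-bit",     98, 30),
    ("120 Hz, RGB 4:4:4, 10-bit",   120, 30),
    ("120 Hz, YCbCr 4:2:2, 10-bit", 120, 20),
    ("144 Hz, YCbCr 4:2:2, 10-bit", 144, 20),
]

for label, hz, bpp in MODES:
    required = H_TOTAL * V_TOTAL * hz * bpp
    verdict = "fits" if required <= DP14_EFFECTIVE else "does NOT fit"
    print(f"{label}: {required / 1e9:.1f} Gbit/s -> {verdict}")
```

That is exactly the pattern in the list above: 10-bit RGB tops out around 98 Hz on this link, and anything faster has to drop to 4:2:2 chroma subsampling.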
You can tell when people don't own HDR. I do, and I'm loving it. Right now my settings read RGB 4:4:4, 10-bit, 144 Hz, on a 32", 1 ms, 600-nit FreeSync 2 panel. Sorry Nizzen, your reference is flawed.