Far Cry 5 PC graphics settings unveiled
Hughesy
So is there no HDR support on PC? If there isn't, but there is on consoles, I know which version I'll be getting. Really frustrating if that's the case.
rcole134
Well, I'm going to go for the one with better performance and better graphics, so, PC for me.
alanm
Amx85
So is it DX12? I hope it supports the "explicit multi-adapter" flag (multi-GPU); I'm going to buy an APU and keep my old R7 260X.
Greetings
Hughesy
alanm
https://www.pcgamer.com/what-hdr-means-for-pc-gaming/
Dude, you need an HDR-capable monitor or TV. If you're talking about emulated HDR gimmickry done in software, that's something else.
Aura89
schmidtbag
signex
Emulated or not, it's still HDR, and with HDR enabled you still see a difference. You just won't get the full experience of real HDR, that's all.
My monitor is also HDR capable, even though it's 8-bit and doesn't have the backlight zones that expensive premium HDR TVs have. And the difference is phenomenal.
Serotonin
HDR or not, I hope it's good. I'm not as worried about brighter colors as I am about fun gameplay.
Denial
I'm actually surprised about the missing HDR, mostly because this is an AMD-sponsored game and most of the HDR monitors available right now (I think all of them, actually) are FreeSync. Plus AMD has that FreeSync 2 low-latency HDR tech - this would be a good game to showcase it in.
rcole134
Aura89
schmidtbag
illrigger
OK, hold on now. HDR only deals with brightness and contrast; it has nothing to do with color. What people call "HDR color" is actually wide color gamut (WCG). Terminology is important.
Now, let's get some things straight about WCG. The standards set up to handle 10-bit color (Rec. 2020) don't give a crap whether your screen is capable of all 1.07 billion colors - the source feeds 10-bit and the panel maps the signal onto whatever gamut it can display. Since there are literally zero fully Rec. 2020-capable displays in the consumer space right now, that's the best we can hope for. Most TVs, phones, and monitors aim for 100% of DCI-P3, which is around 60% of Rec. 2020. Even the most capable professional monitors out there only cover around 75% of full Rec. 2020.
Knowing that, you need to understand that the entire discussion around this is complete BS.
The thing is, the difference between a basic 4K HDR TV and a professional-level monitor is huge, but it makes very little difference for games and movie watching compared to the difference between a non-WCG 8-bit panel and that same basic HDR TV. A basic 8-bit panel can display 16.8 million colors, but even a basic Vizio E-series 4K HDR TV with its 8-bit+FRC panel can display over 500 million (around 50% of Rec. 2020) in WCG mode - roughly 30 times as many colors. Sure, that professional monitor can display 700 million, but that's only about 200 million more than the basic TV, versus the nearly 500 million extra that even a low-end Vizio gives you over a non-WCG-capable screen. That means much, much smoother color gradients without software needing to step in.
So basically, all these "blah isn't even really 10-bit" comments are pretty meaningless. You DO want your games to support HDR, because you DO want those extra hundreds of millions of colors, and you DO want a WCG-capable monitor to see them, even if it's a cheap entry-level TV. Everything else is gravy.
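The color-count arithmetic in the comment above is easy to sanity-check. A quick sketch (function name is mine; the 500 million and 700 million figures are the ones quoted in the comment, treated as given rather than measured):

```python
# Total addressable colors grow as 2^(3 * bits_per_channel):
# three channels (R, G, B), each with 2^bits levels.
def addressable_colors(bits_per_channel: int) -> int:
    return 2 ** (3 * bits_per_channel)

eight_bit = addressable_colors(8)    # 16,777,216   (~16.8 million)
ten_bit = addressable_colors(10)     # 1,073,741,824 (~1.07 billion)

# Figures quoted in the comment (assumptions, not measurements):
basic_hdr_tv = 500_000_000           # ~50% of the 10-bit space
pro_monitor = 700_000_000

print(ten_bit // eight_bit)          # 64 - 10-bit addresses 64x the colors
print(basic_hdr_tv / eight_bit)      # ~29.8 - the "roughly 30x" jump
print(pro_monitor - basic_hdr_tv)    # 200,000,000 - the much smaller second step
```

The point of the comparison: the big jump is from a plain 8-bit panel to any WCG-capable display; the step from there to a professional monitor is comparatively small.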
xrodney
signex
illrigger
illrigger
Pinscher