Far Cry 5 PC graphics settings unveiled

https://forums.guru3d.com/data/avatars/m/242/242371.jpg
So is there no HDR support on PC? If there isn't, but there is on consoles, I know which version I'll be getting. It's really frustrating if that is the case.
https://forums.guru3d.com/data/avatars/m/249/249685.jpg
Well, I'm going to go for the one with better performance and better graphics, so, PC for me.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Hughesy:

So is there no HDR support on PC? If there isn't, but there is on consoles, I know which version I'll be getting. It's really frustrating if that is the case.
Makes sense. A lot of modern TVs support HDR, very few PC monitors do.
https://forums.guru3d.com/data/avatars/m/268/268759.jpg
Is it DX12, then? I hope it supports the "explicit multi adapter" feature (multi-GPU); I'm going to buy an APU and pair it with my old R7 260X. Greetings.
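For reference, "explicit multi adapter" isn't really a single flag so much as D3D12's model in which the game itself enumerates every GPU and decides how to split work between them (an APU plus a discrete card would both show up). A minimal sketch of that first step, assuming a Windows/DXGI environment; nothing here is confirmed about Far Cry 5's renderer:

```cpp
// Sketch: enumerate all GPUs via DXGI, the starting point of D3D12's
// explicit multi-adapter model. Assumes Windows with dxgi.lib linked.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Each adapter listed here could get its own D3D12 device; the engine
        // decides how (or whether) to distribute rendering work across them.
        wprintf(L"Adapter %u: %ls (%llu MB VRAM)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
    }
    return 0;
}
```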
https://forums.guru3d.com/data/avatars/m/242/242371.jpg
alanm:

Makes sense. A lot of modern TVs support HDR, very few PC monitors do.
That argument makes no sense. Most people can’t use ultra graphics settings, but they still include them. I’d hazard a guess that there are as many people using HDR TVs as there are people using ultra graphics settings. When the code is there, there is no reason to not include it. When done right, HDR has a bigger impact on the look of a game compared to the highest graphics settings.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Hughesy:

That argument makes no sense. Most people can’t use ultra graphics settings, but they still include them. I’d hazard a guess that there are as many people using HDR TVs as there are people using ultra graphics settings. When the code is there, there is no reason to not include it. When done right, HDR has a bigger impact on the look of a game compared to the highest graphics settings.
Dude, you need an HDR-capable monitor or TV. If you are talking about emulated HDR gimmickry done in software, that's something else. https://www.pcgamer.com/what-hdr-means-for-pc-gaming/
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Hughesy:

That argument makes no sense. Most people can’t use ultra graphics settings, but they still include them. I’d hazard a guess that there are as many people using HDR TVs as there are people using ultra graphics settings. When the code is there, there is no reason to not include it. When done right, HDR has a bigger impact on the look of a game compared to the highest graphics settings.
Except the point was that very few monitors support HDR.

As to the "code being there": you don't know what the code is, do you? Do you know for certain whether there would have to be different code for every single graphics change, something that wouldn't be needed on a console? I'm not saying there would be, just as an FYI; I'm saying you and I do not know how easy it would be to implement HDR in the PC release of the game, for a community in which 99% of people would likely never use it, since they don't have an HDR monitor or connect to an HDR TV.

It's always interesting how people make something seem so simple and easy, and are certain it's simple and easy, even though they don't know diddly squat about the subject and can't make even a remotely educated statement.

Also, don't take this as me saying PC gamers shouldn't have access to HDR in games; I think they should. It's something developers need to start including so that monitor manufacturers have a reason to introduce HDR to their lineups. But just because I want that doesn't mean I'm going to start making uneducated comments about how "easy" something is when I know zero information about it.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
alanm:

Makes sense. A lot of modern TVs support HDR, very few PC monitors do.
A lot of PC gamers use TVs; I'm one of them. I suspect this is one of the reasons so many GPUs come with HDMI, or even prioritize it over DP, despite it not being royalty-free.
Aura89:

As to the "code being there", you don't know what the code is do you? Do you for certain know if there would have to be different code for every single graphics change? Something that wouldn't be needed for a console?
I know your post isn't directed at me, and I take your points, but playing devil's advocate:
* Isn't that kind of the point of Vulkan/DX12? Those APIs remove a lot of the responsibility from the OS and drivers, and the GPU is supposed to do most of the work. To my recollection, aside from reducing CPU overhead, they were also supposed to add more uniformity in how games run. This is why, for example, you can mix and match GPUs, including ones from different brands. These APIs operate at a low level, as do console games.
* Console hardware isn't a whole lot different from PC hardware. Even sloppy console ports nowadays are playable on PC. Given the hardware similarities, I think it's reasonable to assume that HDR-capable hardware exists on the PC side too.
However... when you consider hardware limited to DX11/OpenGL 4 and older, it likely doesn't have any compatibility with HDR displays, and perhaps validating against that older hardware added too much complication.
It's always interesting how people make something seem so simple and easy, and are certain it's simple and easy, even though they don't know diddly squat about the subject and can't make even a remotely educated statement.
That seems a bit harsh. Seeing as the code was already implemented for the consoles, and given the two points I made above, it is reasonable to wonder why HDR isn't found in the PC version. Yes, it's true that we don't know how easily it translates between hardware, but more often than not I've seen the inverse of this situation. For example, PC often gets additional graphical settings that consoles might not have. Back in the day, PC got hardware-accelerated PhysX, whereas consoles didn't. Some PC versions nowadays have VR support, whereas consoles don't. Very often, PC versions get support for multiple APIs (various versions of DX, OpenGL, and Vulkan), none of which translate trivially. Seeing as Ubisoft cares about making this game PC-optimized, it is a bit questionable why they would remove something.
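On the question of whether the HDR plumbing exists on the PC side: on Windows, an application can simply ask DXGI whether the connected display is currently in an HDR10 (BT.2020 primaries, ST.2084/PQ curve) mode. A minimal sketch of that check, purely illustrative and saying nothing about what Ubisoft's engine actually does; it assumes Windows 10 with dxgi1_6.h available:

```cpp
// Sketch: query whether the primary output of the first GPU reports an
// HDR10 color space, plus its advertised peak luminance.
#include <windows.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    ComPtr<IDXGIOutput> output;
    if (FAILED(factory->EnumAdapters1(0, &adapter)) ||
        FAILED(adapter->EnumOutputs(0, &output))) return 1;

    ComPtr<IDXGIOutput6> output6;
    if (FAILED(output.As(&output6))) return 1;  // needs Windows 10 Creators Update or later

    DXGI_OUTPUT_DESC1 desc;
    output6->GetDesc1(&desc);

    const bool hdr10 = (desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020);
    printf("HDR10 output: %s, peak luminance: %.0f nits\n",
           hdr10 ? "yes" : "no", desc.MaxLuminance);
    return 0;
}
```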
https://forums.guru3d.com/data/avatars/m/113/113386.jpg
Emulated or not, it's still HDR, and with HDR enabled you still see a difference. You just won't get the full experience of real HDR, that's all. My monitor is also HDR capable, even though it's 8-bit and doesn't have the backlight zones the expensive premium HDR TVs have. And the difference is phenomenal.
https://forums.guru3d.com/data/avatars/m/238/238795.jpg
HDR or not, I hope it's good. I'm not worried about brighter colors as much as I am about fun gameplay.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I'm actually surprised about the HDR situation, mostly because this is an AMD-sponsored game and most of the HDR monitors available right now (I think all of them, actually) are FreeSync. Plus, AMD has that FreeSync 2 HDR low-latency tech; this would be a good game to showcase it in.
https://forums.guru3d.com/data/avatars/m/249/249685.jpg
Hughesy:

That argument makes no sense. Most people can’t use ultra graphics settings, but they still include them. I’d hazard a guess that there are as many people using HDR TVs as there are people using ultra graphics settings. When the code is there, there is no reason to not include it. When done right, HDR has a bigger impact on the look of a game compared to the highest graphics settings.
And of course the majority of supposedly HDR-capable monitors and TVs aren't actually true HDR anyway, simply because they're only 8-bit panels (10-bit FRC is just 8-bit), only have a static contrast ratio of 1000:1 (dynamic is bull), and of course lack the peak nit brightness to be considered true HDR. Basically, contrast and brightness are the most important factors, with colour being secondary.
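For what it's worth, the "true HDR" bar being described roughly matches the commonly quoted Ultra HD Premium certification figures. A small sketch of that kind of spec check; the cutoff numbers below are assumptions for illustration, not something from the article:

```cpp
// Sketch: flag whether a display's specs plausibly qualify as "true HDR".
// Thresholds follow commonly quoted Ultra HD Premium figures and are
// assumptions for illustration, not an official implementation.
#include <cstdio>

struct PanelSpecs {
    int    bitDepth;   // effective input bit depth (8, 10, ...)
    double peakNits;   // peak brightness
    double blackNits;  // black level
    bool   isOled;
};

bool looksLikeTrueHdr(const PanelSpecs& p) {
    if (p.bitDepth < 10) return false;  // 10-bit (or 8-bit + FRC) minimum
    if (p.isOled)
        return p.peakNits >= 540.0 && p.blackNits <= 0.0005;
    return p.peakNits >= 1000.0 && p.blackNits <= 0.05;
}

int main() {
    // Typical "HDR-capable" desktop monitor: 8-bit, ~350 nits, 1000:1 contrast.
    PanelSpecs cheapMonitor{8, 350.0, 0.35, false};
    // High-end full-array local dimming LCD TV.
    PanelSpecs premiumTv{10, 1200.0, 0.04, false};

    printf("cheap monitor true HDR? %s\n", looksLikeTrueHdr(cheapMonitor) ? "yes" : "no");
    printf("premium TV true HDR?    %s\n", looksLikeTrueHdr(premiumTv) ? "yes" : "no");
    return 0;
}
```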
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
schmidtbag:

A lot of PC gamers use TVs; I'm one of them. I suspect this is one of the reasons so many GPUs come with HDMI, or even prioritize it over DP, despite it not being royalty-free.
Most monitors have HDMI these days, so I'm not sure that's valid. And I won't say there aren't people who use TVs as monitors, but they are a very, very small percentage. By definition I "use" a TV as a monitor for my laptop when I play games with my wife on it, but I would not even remotely consider myself to be using a TV as a display, since it's by no means my main one.
schmidtbag:

Seeing as Ubisoft cares about making this game PC-optimized, it is a bit questionable why they would remove something.
Who says it's "removed"? How do we know it's not the PlayStation or Xbox that is providing the HDR experience for games that support it? The simple fact is: we don't know. Unless a developer comes out and tells us why games that offer HDR on consoles don't offer it on PC, we simply do not know how hard or easy it is. The fact that there have been issues with HDR on PC (per the link provided in a previous post) tells me it's not as simple as one might think when comparing console HDR to PC HDR.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Aura89:

Most monitors have HDMI these days, so I'm not sure that's valid. And I won't say there aren't people who use TVs as monitors, but they are a very, very small percentage.
I don't think it's as small of a percentage as you suggest. It was incredibly rare for people to use TVs as their primary display back before HDMI was available, but off the top of my head, I can think of 3 people other than myself who use a TV as their primary display, two of which do not use their PCs for games. I also know of 2 more people who have connected their laptops to their TVs as a temporary larger display. But, I understand not everyone accepts personal anecdotes, so, consider HTPCs. They are pretty popular, and I'm willing to bet most of them are connected to TVs.
Who says it's "removed"? How do we know it's not the PlayStation or Xbox that is providing the HDR experience for games that support it? The simple fact is: we don't know. Unless a developer comes out and tells us why games that offer HDR on consoles don't offer it on PC, we simply do not know how hard or easy it is. The fact that there have been issues with HDR on PC (per the link provided in a previous post) tells me it's not as simple as one might think when comparing console HDR to PC HDR.
If the console itself provides the feature, shouldn't all games on consoles have support for HDR? To my knowledge, that isn't the case, but I'm not a console gamer so I'm not really sure.
data/avatar/default/avatar16.webp
OK, hold on now. HDR, strictly speaking, is just about brightness and contrast; it has nothing to do with color. "HDR color" is actually called wide color gamut (WCG). Terminology is important.

Now, let's get some things straight about WCG. The standards set up to handle 10-bit color (rec.2020) don't give a crap whether your screen is capable of all 1.07 billion colors; they feed 10-bit and the panel maps the signal onto whatever color gamut it can display. Since there are literally zero displays in the consumer space right now that can show the full range, that's the best we can hope for. Most TVs, phones, and monitors aim for 100% of DCI-P3, which is around 60% of the color volume of rec.2020. Even the most capable professional monitors out there only manage around 75% of full rec.2020.

Knowing that, you need to understand that the entire discussion around this is complete BS. Yes, the difference between a basic 4K HDR TV and a professional-level monitor is huge, but it makes very little difference in games and movies compared to the gap between a non-WCG 8-bit panel and that same basic HDR TV. A basic 8-bit panel can display 16.8 million colors, but even a basic Vizio E-series 4K HDR TV with its 8-bit+FRC panel can display over 500 million (50% of rec.2020) in WCG mode; that's roughly 30 times the number of colors. Sure, the professional monitor can display 700 million, but that's only about 200 million more, compared with the 400+ million more that even a low-end Vizio TV can do versus a non-WCG screen. That means much, much smoother color gradients without software needing to step in.

So basically, all these "blah is not even really 10-bit" comments are pretty meaningless. You DO want your games to support HDR because you DO want those extra 400+ million colors, and you DO want a WCG-capable monitor to see them, even if it's a crap entry-level TV. Everything else is gravy.
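The bit-depth arithmetic in the post above is easy to sanity-check. A short sketch that reproduces it, taking the poster's ~50% rec.2020 coverage figure at face value:

```cpp
// Sketch: reproduce the bit-depth arithmetic from the post above.
// The 50% rec.2020 coverage figure is the poster's, used only to
// illustrate the scale of the difference, not measured display data.
#include <cstdio>
#include <cstdint>

int main() {
    const std::uint64_t colors8  = 1ull << 24;  // 8 bits x 3 channels  = 16,777,216
    const std::uint64_t colors10 = 1ull << 30;  // 10 bits x 3 channels ~ 1.07 billion

    const double coverage  = 0.50;                 // "about 50% of rec.2020"
    const double wcgColors = colors10 * coverage;  // ~537 million usable colors

    printf("8-bit panel:        %llu colors\n", (unsigned long long)colors8);
    printf("10-bit signal:      %llu colors\n", (unsigned long long)colors10);
    printf("entry-level HDR TV: ~%.0f million colors\n", wcgColors / 1e6);
    printf("ratio vs 8-bit:     ~%.0fx\n", wcgColors / colors8);
    return 0;
}
```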
data/avatar/default/avatar09.webp
illrigger:

OK, hold on now. HDR, strictly speaking, is just about brightness and contrast; it has nothing to do with color. "HDR color" is actually called wide color gamut (WCG). Terminology is important.

Now, let's get some things straight about WCG. The standards set up to handle 10-bit color (rec.2020) don't give a crap whether your screen is capable of all 1.07 billion colors; they feed 10-bit and the panel maps the signal onto whatever color gamut it can display. Since there are literally zero displays in the consumer space right now that can show the full range, that's the best we can hope for. Most TVs, phones, and monitors aim for 100% of DCI-P3, which is around 60% of the color volume of rec.2020. Even the most capable professional monitors out there only manage around 75% of full rec.2020.

Knowing that, you need to understand that the entire discussion around this is complete BS. Yes, the difference between a basic 4K HDR TV and a professional-level monitor is huge, but it makes very little difference in games and movies compared to the gap between a non-WCG 8-bit panel and that same basic HDR TV. A basic 8-bit panel can display 16.8 million colors, but even a basic Vizio E-series 4K HDR TV with its 8-bit+FRC panel can display over 500 million (50% of rec.2020) in WCG mode. Sure, the professional monitor can display 700 million, but that's only about 200 million more, compared with the 400+ million more that even a low-end Vizio TV can do versus a non-WCG screen. That means much, much smoother color gradients without software needing to step in.

So basically, all these "blah is not even really 10-bit" comments are pretty meaningless. You DO want your games to support HDR because you DO want those extra 400+ million colors, and you DO want a WCG-capable monitor to see them, even if it's a crap entry-level TV. Everything else is gravy.
There is a reason to actually require 10-bit color (8-bit + 2-bit FRC dithering as a minimum) for HDR. The main purpose and benefit of HDR is to extend the visible range so more detail is visible, and that's done by increasing the color space, contrast, and brightness. If you do that with 8-bit, you might get more detail, but you also make the overall picture look terrible due to increased color banding. Also, HDR displays often accept and process 12- or even 14-bit color and use a lookup table to display the most accurate color they can. I'm not sure what you mean by a basic 4K HDR TV, but the difference in color and accuracy between cheap TVs and top HDR models from LG (with their NanoCell panels) or Samsung (with their quantum dots) is quite noticeable, and the latter can even beat the most affordable professional monitors if correctly calibrated.
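The banding point can be illustrated directly: quantize the same gradient at 8 and 10 bits and count how many distinct steps fall inside a narrow shadow band, where a stretched HDR range makes coarse steps most visible. The band limits below are arbitrary, chosen just for the illustration:

```cpp
// Sketch: why 8-bit quantization bands more than 10-bit when the signal
// range is stretched. We count distinct code values inside a narrow dark
// slice of a 0..1 gradient; the slice boundaries are arbitrary.
#include <cstdio>
#include <cmath>
#include <set>

int countStepsInBand(int bits, double lo, double hi) {
    const int levels = (1 << bits) - 1;
    std::set<int> codes;
    // Sample the band densely and record which quantized codes appear.
    for (int i = 0; i <= 10000; ++i) {
        double v = lo + (hi - lo) * i / 10000.0;
        codes.insert((int)std::lround(v * levels));
    }
    return (int)codes.size();
}

int main() {
    const double lo = 0.02, hi = 0.06;  // a narrow shadow band of the gradient
    printf("distinct 8-bit steps in band:  %d\n", countStepsInBand(8, lo, hi));
    printf("distinct 10-bit steps in band: %d\n", countStepsInBand(10, lo, hi));
    // Roughly four times as many steps means four times finer gradations,
    // i.e. far less visible banding in dark, stretched-out regions.
    return 0;
}
```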
https://forums.guru3d.com/data/avatars/m/113/113386.jpg
xrodney:

There is a reason to actually require 10-bit color (8-bit + 2-bit FRC dithering as a minimum) for HDR. The main purpose and benefit of HDR is to extend the visible range so more detail is visible, and that's done by increasing the color space, contrast, and brightness. If you do that with 8-bit, you might get more detail, but you also make the overall picture look terrible due to increased color banding. Also, HDR displays often accept and process 12- or even 14-bit color and use a lookup table to display the most accurate color they can. I'm not sure what you mean by a basic 4K HDR TV, but the difference in color and accuracy between cheap TVs and top HDR models from LG (with their NanoCell panels) or Samsung (with their quantum dots) is quite noticeable, and the latter can even beat the most affordable professional monitors if correctly calibrated.
This is the first thing I checked on my monitor; there isn't any visible color banding that I could see. My TV is also 8-bit with HDR and has been shown to display 10-bit color without banding (the model ships with different panels, and I happen to have the good one, luckily). I can't recall the source, but they do TV reviews, have a good reputation, and have the hardware to measure this stuff.
data/avatar/default/avatar32.webp
xrodney:

There is a reason to actually require 10-bit color (8-bit + 2-bit FRC dithering as a minimum) for HDR. The main purpose and benefit of HDR is to extend the visible range so more detail is visible, and that's done by increasing the color space, contrast, and brightness. If you do that with 8-bit, you might get more detail, but you also make the overall picture look terrible due to increased color banding. Also, HDR displays often accept and process 12- or even 14-bit color and use a lookup table to display the most accurate color they can. I'm not sure what you mean by a basic 4K HDR TV, but the difference in color and accuracy between cheap TVs and top HDR models from LG (with their NanoCell panels) or Samsung (with their quantum dots) is quite noticeable, and the latter can even beat the most affordable professional monitors if correctly calibrated.
That's exactly what I just explained, except you keep using the term HDR when you should be using wide color gamut. HDR and WCG are completely separate things: HDR defines the extended brightness and contrast range of a display; WCG defines the extended color range.

The biggest differences between entry-level screens (ones with 8-bit+FRC panels) and high-end ones are the actual HDR properties (brightness and contrast) and color accuracy, not the number of colors being displayed. OLED and quantum-dot displays have extraordinary black levels and contrast, which make their colors pop more, but in general they don't do much better at displaying those colors than an accurately calibrated lower-end TV.

That's primarily because most content isn't mastered at a high enough depth to matter much. Many 4K Blu-rays are actually mastered at rec.709 8-bit, with only the really big names being fully mastered to the 10-bit 4:2:0 standard for WCG TV content (BT.2020), and most 4K set-top boxes and Blu-ray players can't output higher than 10-bit 4:2:2 anyway. Even the Nvidia Shield is capped at 10-bit 4:2:2 or 12-bit 4:2:0. Only PCs are doing 10-bit 4:4:4 at this point, and most games (actually all, as far as I am aware) aren't mastered to make use of it, because they usually just reuse the mastering from the PS4 Pro and Xbox One X, which only support 10-bit 4:2:0 or 4:2:2. It just doesn't make sense to spend extra time and effort mastering content that such a small subset of devices can display when it's easier and faster to do it for the more common configuration, especially when that configuration is already so much better than the previous standard.
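The 4:4:4 / 4:2:2 / 4:2:0 distinction above comes down to how many chroma samples survive per 2x2 block of pixels, which maps directly onto link bandwidth. A rough sketch of that arithmetic for a 10-bit 4K60 signal (uncompressed and ignoring blanking, so the numbers are ballpark only):

```cpp
// Sketch: approximate uncompressed video bandwidth for different chroma
// subsampling modes at 4K60, 10-bit. Blanking/overhead is ignored, so
// these are ballpark figures, not exact HDMI/DP link rates.
#include <cstdio>

int main() {
    const double width = 3840, height = 2160, fps = 60, bitsPerSample = 10;

    // Samples per pixel: luma is always 1; chroma depends on subsampling.
    // 4:4:4 -> 2 chroma/pixel, 4:2:2 -> 1, 4:2:0 -> 0.5 (shared over 2x2 blocks).
    struct Mode { const char* name; double samplesPerPixel; };
    const Mode modes[] = {
        {"4:4:4", 3.0},
        {"4:2:2", 2.0},
        {"4:2:0", 1.5},
    };

    for (const Mode& m : modes) {
        double gbps = width * height * fps * bitsPerSample * m.samplesPerPixel / 1e9;
        printf("%s at 4K60 10-bit: ~%.1f Gbit/s\n", m.name, gbps);
    }
    return 0;
}
```

At roughly 15 Gbit/s, the 4:4:4 case exceeds HDMI 2.0's usable payload (about 14.4 Gbit/s), which is broadly why 4K60 10-bit HDR sources tend to fall back to 4:2:2 or 4:2:0.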
data/avatar/default/avatar05.webp
signex:

This is the first thing I checked on my monitor; there isn't any visible color banding that I could see. My TV is also 8-bit with HDR and has been shown to display 10-bit color without banding (the model ships with different panels, and I happen to have the good one, luckily). I can't recall the source, but they do TV reviews, have a good reputation, and have the hardware to measure this stuff.
That would be rtings.com - an awesome source for TV info. 😉
data/avatar/default/avatar33.webp
alanm:

Makes sense. A lot of modern TVs support HDR, very few PC monitors do.
HDR support on the PC platform is effectively native. The moniker really only applies to the TV industry, which needed a new label to segment its offerings, and even then HDR isn't always HDR on TVs, since they have different levels of HDR capability. Does it make you feel better to know that you have an HDR set? If it does, then the TV industry's decision was justified.

The HDR moniker used in video games is really sad; if a game was able to receive an "HDR update", then it was subpar to begin with. "We are releasing an HDR update for our game." These days, with the sheer size of textures, there is little reason to try to save room by using non-HDR textures. Who's trying to save space these days? "Let's make a dull-looking game loaded with dull-looking textures," said no one ever.

Your GPU has almost always been capable of HDR output (I say almost, because for a while Nvidia handicapped its consumer cards to a lower bit depth versus its pro cards; this has since been rectified, and the competition never did such a disservice to its customers), though your monitor may not be able to handle the extended range. Traditionally, only professional monitors offered a wide range of colours and luminance, since in most situations any 32-bit screen has enough colours for the average user, some 16,777,216.

In 2018, I don't see how this is even a point of conversation. Any TV or monitor that doesn't have HDR is purposely built for one reason and one reason only: to create a product hierarchy, the illusion of superiority (or the lack of it), to entice the consumer to spend more for what they feel they deserve.