Remedy lowers the PC requirements for Control
cyberfredxxx
Why is G-Sync or FreeSync support listed in the additional specs????
I thought it was completely independent of the game.
All my games work perfectly with G-Sync/FreeSync and it's never in the specifications (fortunately!)
Noisiv
https://abload.de/img/screenshot2019-03-270n1jlz.png
AMD's Windmill demo doesn't work with FreeSync. Pendulum does.
It's not a given. FreeSync has a mind of its own.
FreeSync was the best and the worst thing about my 290/FreeSync-monitor combo.
Awesome when it's working. A constant nuisance having to check all the time and fret over whether it's working or not.
And it makes perfect sense when you remember that the FreeSync badge has been given to every single monitor that yelled "me too", no matter how awful its FreeSync range or how unsuitable it was otherwise (a rough sketch of the range problem follows below).
And when you remember that, when examined in lab conditions by Nvidia, only a handful passed the VESA Adaptive-Sync check.
PS: I'm ready.
They say Control is the best RTX showcase yet. Ray tracing so sweet. 😀
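To make the "awful FreeSync range" complaint concrete: one common sanity check is whether a panel's variable-refresh window is wide enough for AMD's Low Framerate Compensation, which (as far as I know) generally needs the maximum refresh rate to be roughly 2x the minimum. A minimal sketch, assuming that rule of thumb; the VrrRange struct and the sample ranges are illustrative, not a real driver API:

```cpp
// Hedged sketch: a common sanity check on a FreeSync range. AMD's Low
// Framerate Compensation (LFC) is generally understood to need
// max refresh >= ~2x min, so a narrow window like 48-60 Hz cannot
// frame-double below its floor. VrrRange and the values are illustrative.
#include <cstdio>

struct VrrRange {
    double min_hz;
    double max_hz;
};

static bool supports_lfc(const VrrRange& r) {
    // Frame doubling only works if 2 * min still fits inside the range.
    return r.max_hz >= 2.0 * r.min_hz;
}

int main() {
    const VrrRange narrow{48.0, 60.0};  // typical budget "FreeSync badge" panel
    const VrrRange wide{30.0, 144.0};   // a range wide enough for LFC

    std::printf("48-60 Hz  LFC: %s\n", supports_lfc(narrow) ? "yes" : "no");
    std::printf("30-144 Hz LFC: %s\n", supports_lfc(wide) ? "yes" : "no");
}
```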
Astyanax
AMD's Windmill demo is absolute garbage; it never even made it to a 1.0 build because it was so broken.
No matter what FPS you had, it always ran at 99% GPU utilization on either vendor's card, and the framerate it reported didn't match external FPS counters, whether the one built into the monitor itself or RTSS.
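If you want to sanity-check a demo's reported framerate the way RTSS or a monitor's built-in counter does, a minimal sketch is to time frames yourself against a steady clock. The render() call below is a hypothetical stand-in for the demo's per-frame work, not anything from the Windmill demo itself:

```cpp
// Minimal sketch: independently measuring frame rate with std::chrono,
// the same basic idea an external overlay like RTSS relies on.
// render() is a hypothetical stand-in for the real per-frame GPU work.
#include <chrono>
#include <cstdio>
#include <thread>

static void render() {
    // Stand-in for real frame work; sleep ~16 ms (~60 FPS).
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

int main() {
    using clock = std::chrono::steady_clock;
    auto window_start = clock::now();
    int frames = 0;

    for (int i = 0; i < 300; ++i) {
        render();
        ++frames;

        auto now = clock::now();
        std::chrono::duration<double> elapsed = now - window_start;
        if (elapsed.count() >= 1.0) {
            // FPS averaged over the last ~1 s window; if the demo's own
            // counter disagrees with this, the demo's counter is suspect.
            std::printf("measured FPS: %.1f\n", frames / elapsed.count());
            frames = 0;
            window_start = now;
        }
    }
}
```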
Netherwind
@man_daddio
Not my normal genre/type of game, but I'm willing to give it a go and enjoy it. Got it free with my RTX card. And I am the type of person who must have nice graphics in my games.
Consoles, toaster PCs and competitive games are the things really keeping us from ultra-realistic graphics. People will say hardware is the limiting factor, but it's not. Consumers drive the market.
We should be way past 1080p gaming by now, but we are not even close.
CPC_RedDawn
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-AMD-RX-580/4027vs3923
I would honestly love to be proven wrong by the devs, and for them to show me how they optimised the game to get such a massive difference in recommended specs, because that would be nothing short of amazing. But I am still on the side of this being complete BS. If the recommended specs had gone from a 2080Ti to a 2070/1080/Vega 56, I would believe "optimisation" a lot more.

The only game I can remember with a big performance jump after release was the original STALKER. Hardware was on the cusp of going from single-core CPUs to proper dual-core CPUs with Intel's Conroe architecture, and the devs released a patch that allowed a second shader stream to run on the second CPU core if one was present in the system (roughly the idea sketched below). That patch gave me an insane increase in performance, going from 45fps to over 80fps (if my memory serves me well). Other than that, I can't remember any game with such a big performance increase right after release.

It just boggles my mind why any company would want to show off their game in such a horrible state, or publish specs that fewer than 1% of the player base will ever actually be able to run. Only Crysis comes to mind here, but again the industry was at a massive turning point in hardware: going from discrete shader stages (pixel, geometry, vertex) to fully unified shaders with the G80 8800GTX and 2900XT cards, plus DX10 after we'd had DX9 for years.

Sure, we have RTX now with this release, but seriously, look at the game with RTX. It doesn't look that good at all: very narrow corridor settings, bland environments, etc. I'm not saying the game won't be good, just that the original specs were so insane that going from those to this revised spec sheet doesn't seem right.

I know this isn't representative of real-world use, but just look at how much faster the 2080Ti is compared to an RX580 in the benchmark comparison linked above.
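That STALKER patch is a nice illustration of conditional multithreading: only offload work to a second thread when a second hardware core actually exists. A minimal C++ sketch of the idea, under my own assumptions; process_chunk() and the 50/50 job split are illustrative, not the actual engine code:

```cpp
// Sketch of the "use the second core only if it's there" pattern the
// STALKER patch reportedly used. process_chunk() stands in for the
// per-frame work (e.g. a second shader stream); it is hypothetical.
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

static long long process_chunk(const std::vector<int>& data,
                               size_t lo, size_t hi) {
    // Stand-in for real per-frame work: sum a slice of the buffer.
    return std::accumulate(data.begin() + lo, data.begin() + hi, 0LL);
}

int main() {
    std::vector<int> work(1'000'000, 1);
    long long total = 0;

    if (std::thread::hardware_concurrency() >= 2) {
        // Second core available: run half the work on a worker thread.
        long long partial = 0;
        std::thread worker([&] {
            partial = process_chunk(work, work.size() / 2, work.size());
        });
        total = process_chunk(work, 0, work.size() / 2);
        worker.join();  // partial is only read after the join
        total += partial;
    } else {
        // Single-core fallback: do everything on the main thread.
        total = process_chunk(work, 0, work.size());
    }
    std::printf("result: %lld\n", total);
}
```

Note that std::thread::hardware_concurrency() can return 0 when the core count is unknown, in which case this sketch simply falls back to the single-threaded path, which is the safe default.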