Remedy lowers the PC requirements for Control

Why is G-Sync or FreeSync support listed in the additional specs? I thought it was completely independent of the game. All my games work perfectly with G-Sync/FreeSync, and it's not in their specifications (fortunately!)
cyberfredxxx:

Why is G-Sync or FreeSync support listed in the additional specs? I thought it was completely independent of the game. All my games work perfectly with G-Sync/FreeSync, and it's not in their specifications (fortunately!)
AMD's Windmill demo doesn't work with FreeSync. Pendulum does. It's not a given; FreeSync has a mind of its own. FreeSync was the best and the worst thing about my 290/FreeSync monitor combo: awesome when it's working, a constant nuisance having to check all the time and fret over whether it's working or not. And it makes perfect sense when you remember that the FreeSync badge has been given to every single monitor that yelled "me too," no matter how awful its FreeSync range or how unsuitable it was otherwise. And when you remember that, when examined in lab conditions by Nvidia, only a handful passed the VESA Adaptive-Sync check. PS: I'm ready. They say Control is the best RTX showcase yet. Ray tracing so sweet. 😀 https://abload.de/img/screenshot2019-03-270n1jlz.png
AMD's Windmill demo is absolute garbage. It never made it to a 1.0 build because it was so broken: no matter what FPS you had, it always ran at 99% GPU utilization on either vendor's card, and the framerate it reported didn't match the FPS counters built into the monitor itself or RTSS.
Astyanax:

AMD's Windmill demo is absolute garbage. It never made it to a 1.0 build because it was so broken: no matter what FPS you had, it always ran at 99% GPU utilization on either vendor's card, and the framerate it reported didn't match the FPS counters built into the monitor itself or RTSS.
They are tech demos, not games. Pendulum runs at 99% as well; however, it works and does what it's supposed to, while Adrenalin fixed and subsequently broke Windmill god knows how many times. It's hilarious that the 290 does not work with the FreeSync showcase app, aka Windmill, yet it works with NV's Pendulum 🙂
cyberfredxxx:

Why is G-Sync or FreeSync support listed in the additional specs? I thought it was completely independent of the game. All my games work perfectly with G-Sync/FreeSync, and it's not in their specifications (fortunately!)
It actually can be pretty game-specific. For instance, Skyrim Special Edition is fairly difficult (but not impossible) to get working with G-Sync.
Noisiv:

AMD's Windmill demo doesn't work with FreeSync. Pendulum does. It's not a given; FreeSync has a mind of its own. FreeSync was the best and the worst thing about my 290/FreeSync monitor combo: awesome when it's working, a constant nuisance having to check all the time and fret over whether it's working or not. And it makes perfect sense when you remember that the FreeSync badge has been given to every single monitor that yelled "me too," no matter how awful its FreeSync range or how unsuitable it was otherwise. And when you remember that, when examined in lab conditions by Nvidia, only a handful passed the VESA Adaptive-Sync check. PS: I'm ready. They say Control is the best RTX showcase yet. Ray tracing so sweet. 😀
It's so funny; my experience is exactly the opposite. I found G-Sync to be extremely unreliable, but FreeSync I never need to think about.
Reardan:

It's so funny; my experience is exactly the opposite. I found G-Sync to be extremely unreliable, but FreeSync I never need to think about.
Hey, at least I didn't pay royalties for my FreeSync :P VESA Adaptive Sync needs to be listed in the UN charter of human rights!
Margalus:

No, that is not a good thing. It's not code "optimization." I hate the word "optimization" in computing. Most gamers seem to think you can magically make things run on inferior equipment just by "optimizing." Nothing could be further from the truth. The ugly truth is that if they lowered the required specs, they either lowered the quality or just arbitrarily chose to say the game requires less while changing nothing.
You have no idea what you're talking about, therefore stop.
airbud7:

Yay, I can play it now!
We could play it even with the previous system requirements, but where we'd have been at minimum before, we're now at recommended. Isn't that nice? 😀
ViperAnaf:

Can you smell it? Downgrade....
...or they've lowered the resolution/FPS target, which, by the way, is still unknown. So instead of going for 1080p@60fps, they'd go for 1080p@40fps.
Not my normal genre/type of game, but I'm willing to give it a go and enjoy it. Got it free with my RTX card. And I am the type of person who must have nice graphics in my games. Consoles, toaster PCs, and competitive games are the things really keeping us from getting to ultra-realistic graphics. People will say hardware is the limiting factor, but it's not; consumers drive the market. We should be way past 1080p gaming by now, but we're not even close.
yasamoka:

You have no idea what you're talking about, therefore stop.
I do know what I am talking about. And no, this is a public forum; I have the right to contribute what I think. It's sad that there are people like you who think they have the right to control what other people post.
Margalus:

I do know what I am talking about.
You really don't. Anyone who thinks that all code is as optimized as it will ever be literally has zero clue what coding is. I'm not saying you don't know how to code; maybe you do, maybe you don't, but I wouldn't trust you to write code for anyone with that mindset.
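To make that concrete, here's a minimal, hypothetical sketch (the entities and functions are invented for this example, not from any real engine) of the kind of algorithmic fix that produces big "optimization" wins: the same question answered two ways, with no change to the output, only to the cost.

```cpp
#include <cstdio>
#include <unordered_map>
#include <vector>

struct Entity { int cellX, cellY; };

// Naive pass: test every entity against every other entity -- O(n^2).
int countSameCellPairsNaive(const std::vector<Entity>& es) {
    int pairs = 0;
    for (size_t i = 0; i < es.size(); ++i)
        for (size_t j = i + 1; j < es.size(); ++j)
            if (es[i].cellX == es[j].cellX && es[i].cellY == es[j].cellY)
                ++pairs;
    return pairs;
}

// Bucketed pass: count entities per cell, then combine counts -- roughly O(n).
int countSameCellPairsBucketed(const std::vector<Entity>& es) {
    std::unordered_map<long long, int> perCell;
    for (const Entity& e : es)
        ++perCell[(static_cast<long long>(e.cellX) << 32)
                  ^ static_cast<unsigned int>(e.cellY)];
    int pairs = 0;
    for (const auto& kv : perCell)
        pairs += kv.second * (kv.second - 1) / 2;  // n-choose-2 within a cell
    return pairs;
}

int main() {
    std::vector<Entity> entities(5000, Entity{3, 7});
    // Identical answers; wildly different cost as entity counts grow.
    std::printf("%d %d\n", countSameCellPairsNaive(entities),
                countSameCellPairsBucketed(entities));
}
```

Nothing visual gets downgraded there; the naive version was simply doing far more work than it needed to.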
Aura89:

We've seen games go from 20fps before release, to 80fps on the same hardware at release, to 100fps a few patches after release. Again, all on the same hardware. So to say that optimizations don't happen, or aren't that big, and that this must be a downgrade, is pure nonsense. There's no reason to claim "DOWNGRADE" unless you actually have evidence of a downgrade.
The only game I can remember that had a big performance jump like that was the original STALKER. Hardware was on the cusp of going from single-core CPUs to proper dual-core CPUs with Intel's Conroe architecture, and the devs released a patch that allowed a second shader stream to run on the second CPU core if one was present in the system. That patch gave me an insane increase in performance, going from 45fps to over 80fps (if my memory serves me well). Other than that, I can't remember any game that had such a big performance increase right after release.

It just boggles my mind why any company would want to show off their game in such a horrible state, or publish specs that less than 1% of the player base would ever actually be able to meet. Only Crysis comes to mind here, but again, the industry was at a massive turning point in terms of hardware: going from separate vertex and pixel shader units to unified shaders with the G80 8800GTX and 2900XT cards, and moving to DX10 after years of DX9. Sure, we have RTX now with this release, but seriously, look at the game with RTX: it doesn't look that good at all, with very narrow, small corridor settings, bland environments, etc.

I'm not saying the game won't be good, just that the original specs were so insane that going from them to this new revised spec sheet doesn't seem right. I know this isn't representative of real-world use, but just look at how much faster the 2080Ti is compared to an RX580: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-AMD-RX-580/4027vs3923

I would honestly love to be proven wrong by the devs, and for them to show how they optimised the game to get such a massive difference in recommended specs, because that would be nothing short of amazing. But I am still on the side of this being complete BS. If the recommended specs had gone from a 2080Ti to a 2070/1080/Vega56, I would believe "optimisation" a lot more.
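For anyone curious what that STALKER-style patch looks like in spirit, here's a minimal sketch, with made-up workload names standing in for the engine's two shader streams (this is not the actual engine code): the second workload moves off the main thread onto a second core, so the two costs overlap instead of adding up.

```cpp
#include <cmath>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical stand-ins for the two per-frame CPU workloads.
void buildShaderStreamA(std::vector<float>& out) {
    for (int i = 0; i < 1000000; ++i) out.push_back(std::sin(i * 0.001f));
}
void buildShaderStreamB(std::vector<float>& out) {
    for (int i = 0; i < 1000000; ++i) out.push_back(std::cos(i * 0.001f));
}

int main() {
    std::vector<float> streamA, streamB;

    // Pre-patch, single-core path: both streams built back to back.
    //   buildShaderStreamA(streamA);
    //   buildShaderStreamB(streamB);

    // Post-patch, dual-core path: the second stream runs on another core
    // while the main thread builds the first.
    std::thread worker(buildShaderStreamB, std::ref(streamB));
    buildShaderStreamA(streamA);
    worker.join();
}
```

On a dual-core machine, the CPU-side cost of this stage drops toward whichever stream is slower, which is how a single patch can nearly double the framerate of a CPU-bound game.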
Aura89:

You really don't. Anyone who thinks that all code is as optimized as it will ever be literally has zero clue what coding is. I'm not saying you don't know how to code; maybe you do, maybe you don't, but I wouldn't trust you to write code for anyone with that mindset.
I never said anything of the sort; somebody else said I said that.
Margalus:

I never said anything of the sort; somebody else said I said that.
This is you, stating exactly that.
Margalus:

No, that is not a good thing. It's not code "optimization." I hate the word "optimization" in computing. Most gamers seem to think you can magically make things run on inferior equipment just by "optimizing." Nothing could be further from the truth. The ugly truth is that if they lowered the required specs, they either lowered the quality or just arbitrarily chose to say the game requires less while changing nothing.
And if you're saying that's not you stating exactly that, then explain: how is it that "code optimizations" don't provide better performance, while at the same time you're saying code can be optimized? You're either saying code can't be optimized or that it can; it can't be both.
CPC_RedDawn:

Other than that, I can't remember any game that had such a big performance increase right after release.
I've seen a few games get much better FPS after release with no noticeable graphics differences, but no, the gains are generally not that drastic. We don't get to see most games in their "very much not optimized" state. The exception is early access, where quite a few titles ship very early. The remaster of The Ship, for instance, is not that good-looking a game, yet it performed pretty badly on initial release; it got better looking (still not amazing, it's The Ship after all) and performed better as time went on. That's my only point: optimizations don't have to be a visual downgrade, and sometimes a game ends up both looking better and performing better.

When they published the original estimates for this game, we don't know whether there was some major issue they thought couldn't be resolved, or didn't even know was causing a problem. Maybe they were/are still developing their Northlight engine and that's what fixed it. Or maybe they downgraded; who knows. No one can know, and all this talk of "they must have downgraded!" without proof, alongside people claiming "optimizations don't happen," is just pointless. Optimizations do happen, and not everything is a downgrade.
Aura89:

This is you, stating exactly that. And if you're saying that's not you stating exactly that, then explain: how is it that "code optimizations" don't provide better performance, while at the same time you're saying code can be optimized? You're either saying code can't be optimized or that it can; it can't be both.
There is a difference between what optimization is and what some people think it is. You can sometimes optimize code and get a little better performance, but more often than not the developers are just decreasing graphical quality to improve performance. Sometimes the change isn't noticeable; other times you can see the difference. I have seen games recently that people thought were "optimized" when all the devs really did was lower the graphics quality. Unfortunately, there are people who think you can increase a game's performance by 400-500% with a little "optimization." That is a blatant falsehood and cannot be done.
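As an illustration of that modest kind of optimization, here's a hypothetical sketch (the function and its parameters are invented for the example): identical output, with redundant per-iteration work hoisted out of the loop. Wins like this are real, but they're measured in percent, not in multiples.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Sums samples scaled by an attenuation factor. The divide-by-sqrt used to
// happen once per sample; hoisting it out removes the redundant work
// without changing the result.
float sumAttenuated(const std::vector<float>& samples, float falloff) {
    const float scale = 1.0f / std::sqrt(falloff);  // hoisted out of the loop
    float total = 0.0f;
    for (float s : samples)
        total += s * scale;  // before: total += s / std::sqrt(falloff);
    return total;
}

int main() {
    std::vector<float> samples(1000000, 0.5f);
    std::printf("%f\n", sumAttenuated(samples, 4.0f));
}
```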
Margalus:

There is a difference between what optimization is and what some people think it is. You can sometimes optimize code and get a little better performance, but more often than not the developers are just decreasing graphical quality to improve performance. Sometimes the change isn't noticeable; other times you can see the difference. I have seen games recently that people thought were "optimized" when all the devs really did was lower the graphics quality. Unfortunately, there are people who think you can increase a game's performance by 400-500% with a little "optimization." That is a blatant falsehood and cannot be done.
On that I agree; sometimes people's expectations are higher than is realistic, and sometimes people claim a game is unoptimized simply because "this game looks better (subjective) and runs better, why doesn't that one?" So realistically, I don't think we disagree; your original post just made it sound like you were saying everything is as optimized as it will ever be.