AMD Polaris (Radeon RX 400/500) users unable to play Forspoken

data/avatar/default/avatar21.webp
Astyanax:

No, you're wrong. The developers have targeted and are using Feature Level 12_1, and Polaris does not support this feature level.
You say I'm wrong, then say the developers chose not to support 400/500 cards. It's got nothing to do with AMD. As always, I AM RIGHT.
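For readers wondering what "Feature Level 12_1" actually gates: Direct3D feature levels are bundles of required GPU capabilities, and a device either meets the whole bundle or falls back to a lower level. The sketch below is a simplified model, not the real D3D12 capability tables; it names only the two 12_1 features discussed later in this thread.

```python
# Simplified model of D3D12 feature levels -- only the two features
# at issue in this thread, not the full capability tables.
FEATURE_LEVELS = {
    "12_0": set(),  # treated as the baseline for this sketch
    "12_1": {"conservative_rasterization", "rasterizer_ordered_views"},
}

def highest_feature_level(gpu_features):
    """Return the highest feature level whose requirements are all met."""
    supported = [level for level, required in FEATURE_LEVELS.items()
                 if required <= gpu_features]
    return max(supported, default=None)

# Polaris (RX 400/500) exposes FL 12_0 but lacks both 12_1 features;
# Vega and later AMD architectures add them.
polaris = set()
vega = {"conservative_rasterization", "rasterizer_ordered_views"}

print(highest_feature_level(polaris))  # 12_0
print(highest_feature_level(vega))     # 12_1
```

So a game that hard-requires 12_1 at device creation will refuse to start on Polaris even though the card is otherwise a capable DX12 part.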
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
On the one hand, this is a 6.5 year old GPU, and we can't expect old hardware to support all modern games. On the other hand, the GPU market has been a freaking mess for nearly 3 years, so a lot of people just can't upgrade. And finally, this is not a good game, so gamers who can't play it are not missing out on something special.
https://forums.guru3d.com/data/avatars/m/263/263841.jpg
Horus-Anhur:

On the one hand, this is a 6.5 year old GPU, and we can't expect old hardware to support all modern games. On the other hand, the GPU market has been a freaking mess for nearly 3 years, so a lot of people just can't upgrade. And finally, this is not a good game, so gamers who can't play it are not missing out on something special.
Indeed. When DX12 was being introduced there was quite a bit of talk about how these different feature levels would cause some "DX12 cards" to become glorified paperweights. We knew it was going to happen at some point, I'm just surprised it took nearly 8 years. And that my 290x would still be kicking by then!
data/avatar/default/avatar01.webp
Horus-Anhur:

On the one hand, this is a 6.5 year old GPU, and we can't expect old hardware to support all modern games. On the other hand, the GPU market has been a freaking mess for nearly 3 years, so a lot of people just can't upgrade. And finally, this is not a good game, so gamers who can't play it are not missing out on something special.
Considering that in the first 2 years of covid most GPUs sold were mouldy shit silicon left in abandoned storage, and that new GPUs were basically just for rich kids, you should count these cards as 4.5 years old at least. Yes, the first 2 years of covid lowered the spread of new hardware instead of raising it. And hardware is still expensive today: the cheaper GPUs now cost what mid-tier hardware did pre-covid, with almost no performance improvement, sometimes even a regression (e.g. RTX 3050 vs RX 5700).
https://forums.guru3d.com/data/avatars/m/295/295892.jpg
My opinion: if you want to play modern games, buy modern hardware.
data/avatar/default/avatar26.webp
Lol, on Ubuntu with Proton it actually runs on Polaris XD [youtube=w7T5yMG2N7Y] Probably skipping some ROV or conservative rasterization calls... if there are any.
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
Yogi:

Indeed. When DX12 was being introduced there was quite a bit of talk about how these different feature levels would cause some "DX12 cards" to become glorified paperweights. We knew it was going to happen at some point, I'm just surprised it took nearly 8 years. And that my 290x would still be kicking by then!
The features in question, rasterizer-ordered views and conservative rasterization, aren't essential things; they are useful mostly for improving performance/optimization. Not exactly make or break, imo. Seems kinda silly to break compatibility for such things. Reminds me of the issue with POPCNT and SSSE3 breaking compatibility for Core 2 and Phenom II chips: neither instruction (set) is particularly important or essential, but developers still set their compiler targets to the equivalent of a first-gen Core i7, and it breaks compatibility when it probably won't change performance much, if at all, since all of the important instructions are there.
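To illustrate the POPCNT point specifically: the instruction just counts set bits, and a compiler targeting an older CPU baseline emits an ordinary bit-twiddling loop instead of the dedicated instruction. A minimal sketch of that software fallback:

```python
def popcount_fallback(x: int) -> int:
    """Count set bits the way pre-POPCNT code would:
    Kernighan's clear-the-lowest-set-bit loop, one iteration
    per 1 bit in the input."""
    count = 0
    while x:
        x &= x - 1  # clears the lowest set bit
        count += 1
    return count

# Matches the "hardware" answer for an arbitrary value:
print(popcount_fallback(0b1011_0101))  # 5
print(bin(0b1011_0101).count("1"))     # 5
```

The hardware instruction is faster per call, but unless bit counting sits on a hot path the difference is rarely decisive, which is the thrust of the compatibility complaint above.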
https://forums.guru3d.com/data/avatars/m/295/295892.jpg
Alessio1989:

Modern games have to consider the current hardware situation. If they are built to run well at 60fps in 4K on an RTX 4090, they will sell very few copies. https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
And who says that if you have a 4090 you must play at native 4K? 😕 Use 4K + DLSS and, OMG, I have 120 FPS. A Plague Tale: Requiem: everything maxed + 4K + DLSS 3 + DLSS Balanced = a continuous 120 FPS with a 4080. You don't just have to buy the hardware, you also have to know how to use it.
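For anyone doing the math on that claim: DLSS renders internally at a lower resolution and upscales to the output size. A back-of-the-envelope calculation, assuming the commonly cited per-axis scale factor of 0.58 for Balanced mode (an assumption here, not an official constant):

```python
def dlss_internal_resolution(width: int, height: int, scale: float):
    """Internal render resolution before DLSS upscales to the output
    size. `scale` is the per-axis factor; 0.58 for Balanced is a
    commonly cited figure, used here as an assumption."""
    return round(width * scale), round(height * scale)

# "4K + DLSS Balanced" actually renders at roughly:
print(dlss_internal_resolution(3840, 2160, 0.58))  # (2227, 1253)
```

In other words, the GPU is shading closer to 1440p than to 4K, which is why the frame rate jumps so much.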
data/avatar/default/avatar19.webp
user1:

The features in question, rasterizer-ordered views and conservative rasterization, aren't essential things; they are useful mostly for improving performance/optimization. Not exactly make or break, imo. Seems kinda silly to break compatibility for such things. Reminds me of the issue with POPCNT and SSSE3 breaking compatibility for Core 2 and Phenom II chips: neither instruction (set) is particularly important or essential, but developers still set their compiler targets to the equivalent of a first-gen Core i7, and it breaks compatibility when it probably won't change performance much, if at all, since all of the important instructions are there.
I had a Phenom 965BE X4 at the time and, like others, complained that I couldn't play Far Cry; we were told it was due to SSE4.2a or 4.1a. Once they patched it a month or so later (after a large number of owners like me complained), my game finally made it past the splash screen and ran well with an RX 560 and 8GB of Ripjaws DDR2 RAM, yes DDR2. 1080p, high settings, 43-70 FPS if I recall.
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
Espionage724:

I don't know anything about Forspoken gameplay, but it's definitely interesting from a technical perspective 😛 I wonder if something could be hacked-up or if it would just-work with VKD3D?
I don't think anything special has to happen, though the DRM might not like using the standalone vkd3d DLLs on Windows.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Kaarme:

If Polaris graphics cards haven't got enough muscle to run it anyway, does it really matter?
This is how Pascal runs the game. I mean, a GTX 1060 at 720p is dropping below 30 FPS at the lowest settings. [youtube=14idRgbV6zY]
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
Imagine still using an RX 400 series video card in 2023..... pfft..
https://forums.guru3d.com/data/avatars/m/220/220188.jpg
Yeah, this happens every now and then, nothing new. Just wait till they all fire the artists doing baked-in lighting because of RT. What kinda bothers me is how most of these new games that run like crap also look like crap. Years ago, the demanding games had the looks/physics to justify it, whereas today's demanding games look like what my Sandy Bridge could've run if the 2010 laptop APUs hadn't held things back.
https://forums.guru3d.com/data/avatars/m/279/279250.jpg
GREGIX:

As far as I know, this game runs badly on any hardware, so Polaris users could expect 5-10 FPS; what's the point of even trying... Like, a 4090 renders 80-ish. So what would Polaris scores be?
The game ran just fine for me when i played the demo.
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
EspHack:

Yeah, this happens every now and then, nothing new. Just wait till they all fire the artists doing baked-in lighting because of RT. What kinda bothers me is how most of these new games that run like crap also look like crap. Years ago, the demanding games had the looks/physics to justify it, whereas today's demanding games look like what my Sandy Bridge could've run if the 2010 laptop APUs hadn't held things back.
I feel like it's because of the shift to studios just using off-the-shelf engines, like Unreal... and then a lot of the people using those off-the-shelf engines don't know programming the way video game developers of the past did, when they were the ones building the engine, with the ability to optimize to get better performance out of the code. I first realized this with the release of Mass Effect: Andromeda; if memory serves, they had a beginner in charge of animation.
data/avatar/default/avatar15.webp
KissSh0t:

Imagine still using an RX 400 series video card in 2023..... pfft..
Imagine ignoring what the GPU market was like over the last 3 years, and what GPUs potential customers currently use. Imagine ignoring that the best-selling cards on Amazon during the first 2 years of covid were low-end EOL products...
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
Alessio1989:

Imagine ignoring what the GPU market was like over the last 3 years, and what GPUs potential customers currently use. Imagine ignoring that the best-selling cards on Amazon during the first 2 years of covid were low-end EOL products...
Heheh... I watched the GTX 1050 get more expensive than my current card was when I bought it new, and the GTX 1050 has never come back down to what it should be priced at... lol. The way I see it, Forspoken is a PS5 game that was badly ported to PC. The developers made the minimum effort getting it to run on PC: basically, whatever can run it can run it, and screw anyone else who can't... they should all go out and buy a $1000+ video card for the ability to play this one AMAZING video game.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
user1:

But I'm pretty sure that neither AMD nor Nvidia knew exactly what 12_1 was going to look like when they started developing their architectures.
AMD knew that conservative rasterization (an optional feature way back when) would matter one day; they just stuck their head in the sand over it. Nvidia got caught lacking 11.1 (and thus 12_0) compliance with Kepler, and hasn't been caught out that way since.