AMD FSR 3 Update: Forspoken and Immortals of Aveum to Receive Frame Generation Technology Today
AuerX
https://cdn.discordapp.com/attachments/749302480633987152/1157282009316593775/2023-09-29_07_43_43-Steam_Charts_Most_Played_Games_on_Steam_SteamDB_-_Brave.png?ex=65180a5c&is=6516b8dc&hm=bcdc76186c8517fb5b82fed339d4f739869290a9ad58c3080e12b8e3961f6e00&
Anything in the Steam top 20 would have made sense.
09-29-2023 7:45AM
Digilator
6700XT with Ray Tracing:
[youtube=08lIzIwRb7c]
Nvidia 3050:
[youtube=Zzxc3QIqlPM]
TheDigitalJedi
I just tried out the Forspoken demo with FSR3. There were some image quality issues, with major blurring/ghosting when I panned side to side. The gameplay was stuttery as well. The game played better with FSR3 turned off. It's almost as if FSR3 has some kind of syncing issue. FidelityFX is much smoother. Maybe it's just this game and the tech will show better in others.
I'm still very happy this tech exists and that it's finally released. It's going to take some time. DLSS didn't get great overnight. I'm very curious to see the advancements and future improvements.
Despite what I'm seeing now, there is still a major conversation to be had: does an enthusiast gamer spend $900 to $1,000 on a top-of-the-line AMD card with FSR3, or $1,600 to $2,000 on the best Nvidia card with DLSS 3/Frame Generation?
Horus-Anhur
251 concurrent players.
This is like a 100X increase 😀
https://media4.giphy.com/media/kSlJtVrqxDYKk/giphy.gif?cid=ecf05e47qwy2qkvggyjemnmt93tkeftetcd49puvop9sdfyy&ep=v1_gifs_search&rid=giphy.gif&ct=g
The Forspoken demo, in the last few days, had 0-4 concurrent players. Now it's at 251.
Mufflore
GeniusPr0
https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-30-afmf-tech-preview
Mufflore
Horus-Anhur
https://media0.giphy.com/media/RJo6Uas77p4zzcEj5I/giphy.gif?cid=ecf05e47lfgsbtr0nqocsnjwd6cxvzhh8o0lkrtpg4skjge9&ep=v1_gifs_search&rid=giphy.gif&ct=g
Nah. It's still tiny.
Truder
I find it amusing how pissed off people are that their proprietary features weren't used in one game - as if Nvidia doesn't shove black-box features into countless games that have caused many a headache for users who were on previous-generation NV cards or AMD cards. Anyone remember GameWorks? The Witcher 3 and its HairWorks, Fallout 4 with its ridiculous god rays, FFXV and its always-on GameWorks features (despite those settings supposedly being turned off in the game's menu). Many will say "but those GameWorks features were open to all users", and yes, that's true, but we all know it was a marketing strategy: by crippling performance on older cards and competitors' cards, users feel compelled to upgrade to the current generation for those games. So regarding these features, do we blame the games' developers or Nvidia? Taking the Starfield haters as an example, then, we should hate on Nvidia..... Food for thought?
All in all, AMD is just copying Nvidia, and now you guys know how sh*t it is when it happens to you.... As for my opinion? I hate how AMD and Nvidia are both behaving this way - a stupid duopoly. Call it my tinfoil-hat conspiracy theory, but I'm sure there's collusion in the background, and price fixing too, with financial backhanders being passed around.