New Rendering Mode for Cyberpunk 2077 with Significant Performance Hit - 16 FPS

data/avatar/default/avatar20.webp
If only developers abandoned all this RT bullshit and focused on proper optimisation, instead of dragging the already shit performance down even further...
data/avatar/default/avatar20.webp
D1stRU3T0R:

If only developers abandoned all this RT bullshit and focused on proper optimisation, instead of dragging the already crap performance down even further...
Nonsense. There are several other modes to play and path tracing is the future. This is how the industry moves forward.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
RT has nothing to do with it; most non-RT games are the same mess or worse. Especially recently, nothing runs as it should and everything needs patches right after launch. Serves people right for buying unfinished crap at full price.
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
cucaulay malkin:

RT has nothing to do with it; most non-RT games are the same mess or worse. Especially recently, nothing runs as it should and everything needs patches right after launch. Serves people right for buying unfinished crap at full price.
Forspoken, Stutter Protocol and now The Last of Us, but people are blaming RT for some reason...
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
D1stRU3T0R:

If only developers abandoned all this RT bullshit and focused on proper optimisation, instead of dragging the already crap performance down even further...
When implemented correctly, RT has a hugely beneficial impact on a game's atmosphere and immersion. Walk through a dusty building in Metro EE, with sunlight coming through gaps in the boarded-up windows and flickering lamps casting light and shadows on the wet floors and walls.
https://forums.guru3d.com/data/avatars/m/180/180832.jpg
Moderator
Don't worry, we will be able to play this properly in 2077 😀
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
AMD paid to have RT and DLSS removed from Boundary, so we're in for a flawless launch with no VRAM-leak patches within a few days, like TLOU or Hogwarts.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
Krizby:

Forspoken, Stutter Protocol and now The Last of Us, but people are blaming RT for some reason...
They clearly missed the turn at the sign for the AdoredTV comment section and ended up here.
data/avatar/default/avatar31.webp
Even back in my college years in 2008, we were told in our course that ray tracing was the future of graphical fidelity. Some games even have a toggle for ray-traced audio, which is interesting.
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
16 fps, lmao. But why? Sure, it's the future, but why now? Why not wait for the hardware to catch up so it can actually be played properly? I'm old enough to remember the days when PC gamers would be up in arms at stuff like this, or would laugh in the face of upscaling and frame generation. I love it when people say "I have this game totally maxed out at 4K with DLSS at Balanced"... erm, that's not 4K, bud. So when will this mode actually be playable? The 50 series could be 2x faster, and even if it scales 100% that's still only 32 fps in this mode, which is far from playable. Or is Nvidia now going to sell us on 30 fps being the new norm while a 70-class card costs $2,000, lol.
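For anyone who wants to sanity-check that scaling argument, here is a minimal back-of-the-envelope sketch in Python. The 16 fps baseline is the figure from the article; the generational speedup factors are purely hypothetical assumptions for illustration, not measured numbers.

# Project the 16 fps path-tracing result under hypothetical generational speedups.
# The speedup factors below are assumptions, not benchmarks.
BASELINE_FPS = 16
TARGET_FPS = 60

for label, speedup in [("current gen", 1.0), ("hypothetical 2x gen", 2.0), ("hypothetical 4x gen", 4.0)]:
    projected = BASELINE_FPS * speedup
    verdict = "meets" if projected >= TARGET_FPS else "misses"
    print(f"{label}: {projected:.0f} fps ({verdict} a {TARGET_FPS} fps target)")

Under those assumed numbers, even a straight 2x generational gain only reaches 32 fps, which is the commenter's point; it would take roughly a 4x uplift to hit a 60 fps target at these settings.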
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
Money. R&D investments have to start paying off sooner than ten years from now. You're paying for this on AMD cards too; I wonder why people think they aren't, just because Nvidia runs RT faster.
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Yeah, let's not buy any GPU now; we should wait until we're about to die to enjoy the best gaming experience of a lifetime. /s
data/avatar/default/avatar10.webp
pegasus1:

When implemented correctly, RT has a hugely beneficial impact on a game's atmosphere and immersion. Walk through a dusty building in Metro EE, with sunlight coming through gaps in the boarded-up windows and flickering lamps casting light and shadows on the wet floors and walls.
We've had all the effects you mention in games for something like 20 years. They can all be simulated with raster techniques rather well, and usually with much better performance too. RT results can be more accurate, but you often need to flip between screenshots to notice the difference, so it's usually not worth a 50%-or-greater performance hit.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Nvidia... the way we stay in business...
[attachment: Nvidia milking time.jpg]
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
Glottiz:

We've had all the effects you mention in games for something like 20 years. They can all be simulated with raster techniques rather well, and usually with much better performance too. RT results can be more accurate, but you often need to flip between screenshots to notice the difference, so it's usually not worth a 50%-or-greater performance hit.
I've been gaming since the Atari console days and I've never seen anything that looks even remotely like MEE. If you're saying we had 4K RT graphics 20 years ago, you must have been taking a lot of drugs at the time.
data/avatar/default/avatar17.webp
pegasus1:

I've been gaming since the Atari console days and I've never seen anything that looks even remotely like MEE. If you're saying we had 4K RT graphics 20 years ago, you must have been taking a lot of drugs at the time.
What are you going on about, old timer?
https://forums.guru3d.com/data/avatars/m/53/53598.jpg
How long has Nvidia been pushing RT cards now? Three generations, and it's still about as popular as a rat-p%$s-covered sponge cake. Lol
data/avatar/default/avatar29.webp
Oh, here we go again with people complaining that RT "isn't worth 50%", and the funny part is people talking about 20-year-old games on PC. Twenty years ago you were lucky to get 30 fps running Unreal on a Voodoo 1, and we didn't complain; it looked awesome and it was the future. The same thing happened with T&L and Shader Model 3: every graphical enhancement starts somewhere, and NOBODY IS FORCING YOU TO ACTIVATE RAY TRACING.

It's also NOT true that you need something like a 4090 to play with RT on; you just need to lower your expectations. Even with a 2060 you can play most games at 60 fps with RT, and I have tons of videos on my channel to prove it. Heck, even the Arc A750 is a beast when it comes to RT. Just yesterday I tested a 3060 and could play most games with RT: Hogwarts Legacy on High settings with High RT running at a freakish 80 fps at 1440p with DLSS Balanced, and Returnal at 4K on Epic quality. The problem is that people don't set expectations; you can find a good balance/compromise between raster and RT and make the game look better than with raster alone. But nowadays people just set everything to Ultra and then complain when it doesn't run correctly on their 3050s. Come on, people, honestly.

RT is the future. This Cyberpunk mode is an extreme case, but one I'm glad exists: it proves it's possible, and RT is awesome. No form of raster is going to be "close enough"; light is damn hard to simulate, and artists need to put in a lot more work. Have you played Spider-Man with RT reflections? Damn, it's beautiful: buildings, floors, everything reflecting the way it should. Good thing other people are driving graphics technology into the future, or we would still be in the Atari 2600 era.
data/avatar/default/avatar39.webp
Krizby:

Forspoken, Stutter Protocol and now The Last of Us, but people are blaming RT for some reason...
Ok. But The Last of Us does NOT have RT support. lol