Battlefield 2042 will get DLSS and Nvidia Reflex support (but not raytracing?)

nizzen:

Frostbite is THE best game engine IF you have the hardware to drive it. Frostbite sucks for poor people 😉 CPU and memory performance (latency) are the most important factors for high fps in Frostbite.
Tell that to the developers of Anthem....
nizzen:

Frostbite is THE best game engine IF you have the hardware to drive it. Frostbite sucks for poor people 😉 CPU and memory performance (latency) are the most important factors for high fps in Frostbite.
That's one helluva disgusting comment there bud, good job.
No one is going to use ray tracing in a competitive game; they know this, so they kept it out. That isn't a bad thing, it's a good thing. Sense prevails for once.
Martin5000:

No one is going to use ray tracing in a competitive game; they know this, so they kept it out. That isn't a bad thing, it's a good thing. Sense prevails for once.
Lol, "competitive game"... Leaving it out doesn't make any sense; just have it as an option.
I'd prefer that DICE spend more time creating maps, optimizing the netcode, and fixing bugs than implementing RT in a multiplayer game.
Horus-Anhur:

I'd prefer that DICE spend more time creating maps, optimizing the netcode, and fixing bugs than implementing RT in a multiplayer game.
None of these things overlap in AAA game development lol
Denial:

None of these things overlap in AAA game development lol
No matter how you twist and turn it, implementing ray tracing takes resources, and more or less everyone would rather see those resources spent on something else. It would be interesting to see data on what percentage of the userbase used ray tracing in BF5... a minuscule amount, I'd wager, which is likely why DICE concluded that it's a waste of resources to include.
Dragam1337:

It would be interesting to see data on what percentage of the userbase used ray tracing in BF5... a minuscule amount, I'd wager, which is likely why DICE concluded that it's a waste of resources to include.
Well, the people in marketing needed something of interest to tell the press about BF5, because, well, y'know.
Dragam1337:

No matter how you twist and turn it, implementing ray tracing takes resources, and more or less everyone would rather see those resources spent on something else. It would be interesting to see data on what percentage of the userbase used ray tracing in BF5... a minuscule amount, I'd wager, which is likely why DICE concluded that it's a waste of resources to include.
No, sorry - that's not how this works. DICE has an engineering team in Sweden that handles rendering and engine work. Those people are hired whether the game includes RT or not. It's not like skipping RT suddenly lets the company hire a few more/better map designers - they have open positions for those people in LA regardless. That's not even to mention that the RT implementations in Unity and Unreal literally come down to a dropdown in the former and a checkbox in the latter. They don't even have a campaign in this game to do a separate lighting master on.
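
As an aside on that "checkbox" claim: in Unreal Engine 4 (4.22+), enabling ray tracing support really does reduce to a couple of project settings, which land in the project's DefaultEngine.ini. The excerpt below is a hypothetical illustration of that claim, not anything from DICE's codebase (Frostbite is EA's in-house engine and its internals aren't public):

    ; DefaultEngine.ini - hypothetical excerpt illustrating UE4's "checkbox"
    [/Script/WindowsTargetPlatform.WindowsTargetSettings]
    ; Ray tracing requires the DX12 RHI
    DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

    [/Script/Engine.RendererSettings]
    ; The project-settings checkbox writes this flag
    r.RayTracing=1
    ; UE4 also requires the compute skin cache for RT with skinned meshes
    r.SkinCache.CompileShaders=1

Whether Frostbite's RT path is anywhere near that turnkey is exactly what this thread is arguing about; the snippet only shows what "a checkbox" means in Unreal's case.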
Denial:

No, sorry - that's not how this works. DICE has an engineering team in Sweden that handles rendering and engine work. Those people are hired whether the game includes RT or not.
Yes, and then they can spend their time optimizing other aspects of the engine, rather than wasting their time on something no one is actually going to use.
Dragam1337:

Yes, and then they can spend their time optimizing other aspects of the engine, rather than wasting their time on something no one is actually going to use.
I'd use it, though I acknowledge I'm probably not representative of the norm. Given all the work they put into RT in BFV, I'd be very surprised if they just dropped it by the wayside in BF6.
bobnewels:

First Battlefield game I will not be buying - no real single-player. RTX is dead, but it will be a big seller without me. I am OK with that.
I too am one of those "weirdos" who actually cares about the single-player. In fact, for me it's the main draw, so BF1 was the last good one for me, as it had a campaign of decent length and quality. V's campaign was anemic and very short.
Dragam1337:

Yes, and then they can spend their time optimizing other aspects of the engine, rather than wasting their time on something no one is actually going to use.
No one? Speak for yourself, because *I* would use it - together with a lower-resolution DLSS screen mode, same as I do with Cyberpunk. And it DOES make a gigantic improvement to visuals when done as well as in Cyberpunk. Now, whether DICE will use anything more than the most basic reflections-only implementation is another thing. It's real shadows and/or full global illumination that would be very nice to look at. Obviously no RTX is needed if all you care about is FPS and kill rates in multiplayer. But at least give us the on/off option.
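
For anyone wondering what a "lower-resolution DLSS screen mode" actually renders at: DLSS 2 draws the frame at a reduced internal resolution and reconstructs it to the output resolution. A minimal sketch, assuming the commonly cited per-axis scale factors (individual games can deviate from these defaults):

    # Minimal sketch: internal render resolution for DLSS 2 quality modes.
    # Scale factors are the commonly cited per-axis defaults; individual
    # games may deviate from them.
    DLSS_SCALE = {
        "Quality": 2 / 3,             # ~66.7% per axis
        "Balanced": 0.58,             # ~58% per axis
        "Performance": 0.50,          # 50% per axis
        "Ultra Performance": 1 / 3,   # ~33.3% per axis
    }

    def internal_resolution(out_w, out_h, mode):
        """Resolution the GPU actually renders before DLSS reconstructs it."""
        s = DLSS_SCALE[mode]
        return round(out_w * s), round(out_h * s)

    for mode in DLSS_SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at {w}x{h}")

At a 4K output, for example, Quality mode renders internally at 2560x1440, which is where the fps headroom for ray tracing comes from.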
geogan:

No one? Speak for yourself, because *I* would use it - together with a lower-resolution DLSS screen mode, same as I do with Cyberpunk. And it DOES make a gigantic improvement to visuals when done as well as in Cyberpunk. Now, whether DICE will use anything more than the most basic reflections-only implementation is another thing. It's real shadows and/or full global illumination that would be very nice to look at. Obviously no RTX is needed if all you care about is FPS and kill rates in multiplayer. But at least give us the on/off option.
Yeah, 'cause introducing a ton of lag in an online FPS is a great idea... have fun dying.
Dragam1337:

Yeah, 'cause introducing a ton of lag in an online FPS is a great idea... have fun dying.
Lol, it's Battlefield... Battlefield has never been a game where I intentionally lower settings to get better performance. CS 1.6/Source/GO? All low, playing at half my monitor's native res with a custom config that lowers the quality even more than what's offered in game. Siege, same thing - everything low, but I actually ran it at native 200+ fps. Valorant? Optimized from the start for 200+ fps, because even on high it looks mediocre. Those are actual competitive games that are worth extracting every bit of performance out of if you're playing in a league.

Battlefield? I want awesome-looking explosions, photogrammetric textures, beautiful sweeping vistas, high-quality assets, incredible lighting, etc. - and the great part about Battlefield, historically, is that even with all the settings turned up and a top-end graphics card, you still get 70+ fps. Which is more than enough to have people calling me a cheater all game.

I didn't play BF5 because Battlefield gameplay is so stagnant, but with my 3080 I have no problem hitting 70+ fps with RT on, without DLSS. And that's a DXR 1.0, first-tier implementation of RT reflections only. Games now are running multiple RT effects with better optimization across the board, so there's no reason why DICE can't do better, given their track record. Not to mention that six months after the game comes out we'll probably be getting a whole new generation of RT cards. And worst case, you just TURN IT OFF.

By your logic, why even include "Very High" settings, or whatever it's called in BF? Why include any setting that lowers performance? Texture Detail High? Have fun dying, said no one ever.
Denial:

Lol, it's Battlefield... Battlefield has never been a game where I intentionally lower settings to get better performance.
The issue isn't fps - the issue is that ray tracing + DLSS introduces quite a bit of lag, which you don't want in an online shooter, regardless of wanting it to look good. BF5 doesn't get the lag with RT, 'cause its "RT" solution doesn't use the RT cores.
Dragam1337:

The issue isn't fps - the issue is that ray tracing + DLSS introduces quite a bit of lag, which you don't want in an online shooter, regardless of wanting it to look good. BF5 doesn't get the lag with RT, 'cause its "RT" solution doesn't use the RT cores.
The latency of a frame with RT/DLSS at 100 fps is almost identical to one with just RT, or even no RT, at 100 fps, given that everything else is the same. It 100% uses the RT cores if they are available - there is a reason why it runs significantly faster on RT-enabled hardware than not.
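
Denial's arithmetic here is easy to sanity-check: the render latency a frame contributes is just its frame time, 1000/fps in milliseconds, so any two configurations holding the same fps contribute the same render latency however the frame was produced. A toy calculation with illustrative figures:

    # Toy calculation: per-frame render latency is set by frame rate,
    # not by which effects produced the frame. All figures illustrative.
    def frame_time_ms(fps: float) -> float:
        """Time one frame takes to render at a given frame rate."""
        return 1000.0 / fps

    scenarios = {
        "no RT, 140 fps": 140,
        "RT + DLSS, 100 fps": 100,
        "RT, no DLSS, 70 fps": 70,
    }

    for name, fps in scenarios.items():
        print(f"{name}: {frame_time_ms(fps):.1f} ms per frame")

    # At equal fps the frames arrive equally fast; what RT really costs
    # is the fps it takes away (70 fps -> ~14.3 ms vs 10.0 ms at 100 fps).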
Denial:

The latency of a frame with RT/DLSS at 100 fps is almost identical to one with just RT, or even no RT, at 100 fps, given that everything else is the same. It 100% uses the RT cores if they are available.
Have you tried running Cyberpunk / Metro Exodus / Control with ray tracing? Deffo a lot more lag with ray tracing enabled vs. disabled, even when the fps is higher due to DLSS.
Dragam1337:

Have you tried running Cyberpunk / Metro Exodus / Control with ray tracing? Deffo a lot more lag with ray tracing enabled vs. disabled, even when the fps is higher due to DLSS.
If that's true (I don't think it is), then your argument is even worse. In this post you're saying RT always adds latency whether DLSS is on or off... but in the previous post you said BF5 has some magical RT implementation that never adds latency. So which is it? Does RT add latency, or is there a way to add RT without latency? Why can't BF2042 use this magical no-latency, no-RT-core RT that DICE has previously created?

I've played Metro/Control and now Doom with RT. I play on a 144 Hz monitor and I've previously played games at fairly competitive levels (CS in both CEVO-M and CAL-P in 1.6). I'm older now, but I still play Siege and other competitive shooters and maintain better-than-average levels, so I'm still sensitive to latency and prefer higher framerates in these titles. Outside of lowering the framerate (which is always going to increase latency), I don't notice any significant induced latency in the frames. I've also never seen any evidence from anyone else of increased latency per frame (if you have a source, I'd love to read it and learn about it).

Either way, I'm not sure why it's relevant to the discussion here. Simply increasing framerate lowers frame latency - so why not argue that there should be no high-quality settings? Why not demand the game be optimized for 200+ fps like other competitive shooters? The reason no one cares is that there are lots of people who like Battlefield games to have great graphics. People who want performance and low latency - less "lag" - can just TURN THE FEATURES OFF. It's really that simple.
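
Worth noting, since the article is also about Reflex support: the "lag" in these arguments conflates frame time with end-to-end input latency. A common simplified model sums input sampling, queued frames, render time, and display scanout, and it shows why both higher fps and a shorter render queue (roughly what Reflex targets) cut click-to-photon latency. All stage costs below are made up purely for illustration:

    # Toy click-to-photon model (all stage costs made up for illustration):
    # latency ~ input sampling + queued frames + render time + display scanout.
    def click_to_photon_ms(fps: float, queued_frames: int,
                           input_ms: float = 2.0, display_ms: float = 5.0) -> float:
        frame_ms = 1000.0 / fps
        return input_ms + queued_frames * frame_ms + frame_ms + display_ms

    # Deep GPU render queue vs. a Reflex-style near-empty queue:
    print(click_to_photon_ms(fps=100, queued_frames=2))  # ~37 ms
    print(click_to_photon_ms(fps=100, queued_frames=0))  # ~17 ms
    # Raising fps helps too, with or without RT in the frame:
    print(click_to_photon_ms(fps=200, queued_frames=0))  # ~12 ms

Under this toy model, trimming the render queue can recover more latency than the fps hit from RT costs, which is presumably why DICE is shipping Reflex alongside DLSS.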