Cyberpunk 2077 runs at 30 FPS with ray tracing and DLSS in 8K on RTX 3090

https://forums.guru3d.com/data/avatars/m/196/196426.jpg
DLSS "Ultra Performance" means it's actually rendered at barely 1440p... ... with quite horrible upscaling quality. This "Gaming at 8K" is nothing more than a bad meme. Move along, nothing to see here folks ! (Correction, what we should see here is the fantastic 6.6 fps slideshow with DLSS off, that's the real benchmark)
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
wavetrex:

DLSS "Ultra Performance" means it's actually rendered at barely 1440p... ... with quite horrible upscaling quality. This "Gaming at 8K" is nothing more than a bad meme. Move along, nothing to see here folks ! (Correction, what we should see here is the fantastic 6.6 fps slideshow with DLSS off, that's the real benchmark)
You're confusing upscaling quality with render resolution. Ultra Performance looks horrible at lower resolutions, but if it's rendering from 1440p it's going to do very well. Here's 1080p native vs 1080p reconstructed; DLSS makes a night-and-day difference at 1080p. Reconstructing from 1440p is going to look great, though it will not be a native 8K image. https://i.imgur.com/sPYDMYI.jpg
data/avatar/default/avatar12.webp
The sharpness of 8K is wasted at 30 fps... screenshots will look nice, but in motion...
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
cucaulay malkin:

Here's 1080p native vs 1080p reconstructed; DLSS makes a night-and-day difference at 1080p. Reconstructing from 1440p is going to look great, though it will not be a native 8K image.
That's Performance DLSS you're showing, not Ultra Performance, which looks much, much worse. The RTX 3080 shits itself and dies at 1440p with no DLSS, and the RTX 3090 is barely any faster. Forget 4K, and LMFAO at 8K, fuck off nVidia. In fact, despite CDPR's claim to the contrary, I get enough of a performance uplift with an R9 3900X + RTX 3080 using the SMT enabler mod that it's halfway to being the same as using the 3900X + a 3090.
https://forums.guru3d.com/data/avatars/m/272/272918.jpg
No chance the 3090 can do nice 8K gaming; it can barely run Watch Dogs: Legion smoothly at 1440p.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
Neo Cyrus:

That's Performance DLSS you're showing, not Ultra Performance, which looks much, much worse.
Do you understand how this works at all? DLSS presets are internal resolutions, e.g. 1080p Quality is the same as 1440p Performance: https://i.imgur.com/OvVKWTW.jpg You're getting the same image from an internal 1440p, whether you call it 4K Quality or 8K Ultra Performance. So yeah, it is very misleading, since NVIDIA is calling "8K" what is basically 1440p + DLSS in their 4K Quality preset.
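For anyone who wants to check the arithmetic behind this, here is a minimal sketch assuming the commonly reported DLSS 2.x scale factors (roughly 0.67 for Quality, 0.58 for Balanced, 0.5 for Performance, 0.33 for Ultra Performance); the table and resolution names below are illustrative, not taken from NVIDIA documentation or this thread.

```python
# Sketch: map DLSS output resolution + preset to the approximate internal render resolution.
# Scale factors are the commonly reported DLSS 2.x values (an assumption, not official data).

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

OUTPUTS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

def internal_resolution(output: str, preset: str) -> tuple:
    """Approximate internal render resolution for a given output/preset pair."""
    w, h = OUTPUTS[output]
    s = DLSS_SCALE[preset]
    return round(w * s), round(h * s)

for combo in [("8K", "Ultra Performance"), ("4K", "Quality"),
              ("1440p", "Performance"), ("1080p", "Quality")]:
    print(combo, "->", internal_resolution(*combo))
```

Running it shows 8K Ultra Performance and 4K Quality both landing at roughly 2560x1440 internally, just as 1080p Quality and 1440p Performance share a roughly 1280x720 input, which is the point being made above.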
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
cucaulay malkin:

Do you understand how this works at all? DLSS presets are internal resolutions, e.g. 1080p Quality is the same as 1440p Performance. You're getting the same image from an internal 1440p, whether you call it 4K Quality or 8K Ultra Performance. So yeah, it is very misleading, since NVIDIA is calling "8K" what is basically 1440p + DLSS in their 4K Quality preset.
My bad, I forgot about that, and my brain replaced the 4K in your image with 8K because it's a troll that hates me. But for now there's a caveat... the effective result is not exactly the same, since 8K is going to be on a giant fucking TV, because that's the only way to get 8K right now. Meanwhile you can still have 4K on a 27" monitor. No one is going to sit far enough away, even if there's enough room, to make those the same effective quality. I'm assuming most PC players use monitors anyway. nVidia really needs to STFU with these 8K lies.
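The screen-size-and-distance point can be made concrete with a rough angular-resolution estimate. A minimal sketch follows, assuming 16:9 panels and purely illustrative diagonal sizes and viewing distances (none of these numbers come from the thread); the only takeaway is that perceived pixel density depends on both the panel size and how far away you sit.

```python
import math

def pixels_per_degree(horizontal_pixels, diagonal_inches, distance_inches):
    """Approximate angular pixel density of a 16:9 panel viewed head-on."""
    width = diagonal_inches * 16 / math.hypot(16, 9)                   # physical panel width
    fov = 2 * math.degrees(math.atan(width / (2 * distance_inches)))   # horizontal field of view
    return horizontal_pixels / fov

# Purely illustrative setups (sizes and distances are assumptions, not from the thread):
print('4K, 27" monitor at ~24 in:', round(pixels_per_degree(3840, 27, 24)), "px/deg")
print('8K, 65" TV at ~96 in:     ', round(pixels_per_degree(7680, 65, 96)), "px/deg")
```

Plug in different distances and the comparison swings a long way in either direction, which is why "effective quality" arguments like this one are hard to settle in the abstract.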
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
Neo Cyrus:

My bad, I forgot about that, and my brain replaced the 4K in your image with 8K because it's a troll that hates me. But for now there's a caveat... the effective result is not exactly the same, since 8K is going to be on a giant fucking TV, because that's the only way to get 8K right now. Meanwhile you can still have 4K on a 27" monitor. No one is going to sit far enough away, even if there's enough room, to make those the same effective quality. I'm assuming most PC players use monitors anyway. nVidia really needs to STFU with these 8K lies.
That's marketing, so who cares; you only listen to it if you don't know what's going on. But I guess they shot themselves in the foot, because if they advertised it for what it is, a great way to get 4K-level detail while running much faster, people like you wouldn't be throwing fits. For me they can call it whatever; I still like the idea of a middle-ground solution between 1440p and 4K that produces image quality near 4K with performance closer to 1440p. I would never consider a 4K monitor otherwise, because the performance hit compared to the image quality increase is nowhere near worth it.
data/avatar/default/avatar16.webp
cucaulay malkin:

Do you understand how this works at all? DLSS presets are internal resolutions, e.g. 1080p Quality is the same as 1440p Performance: https://i.imgur.com/OvVKWTW.jpg You're getting the same image from an internal 1440p, whether you call it 4K Quality or 8K Ultra Performance. So yeah, it is very misleading, since NVIDIA is calling "8K" what is basically 1440p + DLSS in their 4K Quality preset.
The base image is what DLSS works off of, but the processing is different. Also, DLSS uses 8K assets at 8K. 8K Ultra Performance and 4K Quality will perform differently and have different image quality despite having the same base resolution.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
kcthebrewer:

The base image is what DLSS works off of, but the processing is different. Also, DLSS uses 8K assets at 8K. 8K Ultra Performance and 4K Quality will perform differently and have different image quality despite having the same base resolution.
Yup, turns out that is correct. Higher output = bigger performance hit and better quality. So yes, 8K DLSS Ultra Performance will cost more and look better than 4K DLSS Quality, just like 1440p Performance will look better than 1080p Quality but won't run as fast.
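A quick back-of-the-envelope for the "higher output = bigger performance hit" point: both presets reconstruct from the same ~2560x1440 input, but the 8K pass has to produce four times as many output pixels as the 4K pass. A minimal sketch of that arithmetic (the idea that reconstruction cost scales with output pixel count is an assumption here, not a measured figure):

```python
# Same internal resolution, different amount of output work for the DLSS pass.
INTERNAL = (2560, 1440)   # shared input of 4K Quality and 8K Ultra Performance
OUTPUTS = {"4K Quality": (3840, 2160), "8K Ultra Performance": (7680, 4320)}

internal_px = INTERNAL[0] * INTERNAL[1]
for name, (w, h) in OUTPUTS.items():
    out_px = w * h
    print(f"{name}: {out_px / 1e6:.1f}M output pixels, "
          f"{out_px / internal_px:.1f}x the rendered pixel count")
```

That works out to roughly 8.3M output pixels for 4K Quality versus 33.2M for 8K Ultra Performance, consistent with the 8K preset costing more despite the identical base resolution.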
https://forums.guru3d.com/data/avatars/m/90/90026.jpg
DLSS is like giving a handicapped person a crutch. I admit it adds some performance and makes RT playable, especially in CP2077, where RT really adds something. But there are always artifacts. Granted, when you're on the run you will probably have trouble spotting them (unless there's an obviously flickering neon sign in view), but stop for a moment, watch people move, cars, etc., and there is always a trail behind them. And other things. Nothing beats native resolution. And IF native is somewhat less sharp, like in the screenshots above, I will just say: fuk it, it was made that way on purpose, to make DLSS look better.
https://forums.guru3d.com/data/avatars/m/165/165018.jpg
One of these days Nvidia is going to have to get something like a 200% performance uplift out of a next-gen release to make any of this stuff usable for anything other than screenshots.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
GREGIX:

I will just say: fuk it, it was made that way on purpose, to make DLSS look better.
No, it was not made that way on purpose, it's just the way DLSS works. Everything looks sharper, even at 1440p DLSS Quality (960p internal), because it reconstructs a lot of fine lines, shapes, patterns or detail that would otherwise blend in or fade at native. I'm not saying better, I'm saying more defined and "in your face". The DLSS image takes getting used to for me personally; whether or not it looks better is personal preference, but it definitely looks different.

Playing Control, it struck me how defined everything is. It's not really higher resolution, but it does produce a very good impression on objects that it manages to reconstruct correctly. The problem is you almost want to tone it down a little; not everything needs to be so defined, otherwise some scenes will just overwhelm you. IMO the technology is very good at the individual-object level, but the whole picture, I don't know. Native just reads more subtly to the viewer's eye. On the other hand, it does successfully get rid of flickering on those faint lines, something that has been an absolute plague in so many games. All in all, I want to have it, but whether I'll always want to use it will depend.
https://forums.guru3d.com/data/avatars/m/256/256969.jpg
cucaulay malkin:

You're confusing upscaling quality with render resolution. Ultra Performance looks horrible at lower resolutions, but if it's rendering from 1440p it's going to do very well. Here's 1080p native vs 1080p reconstructed; DLSS makes a night-and-day difference at 1080p. Reconstructing from 1440p is going to look great, though it will not be a native 8K image. https://i.imgur.com/sPYDMYI.jpg
I see you're citing the GamersNexus article on the topic; I have to note that they failed to disable TAA/FXAA on their reference native shots, which is possible by adding a couple of commands in the engine file. I think it's interesting to truly see the signal input for the DLSS 'magic'. This is a shot I made using DLSS Quality + Nvidia Control Panel sharpening (80% sharpen, 20% ignore film grain) (no spoiler, just hiding the picture): [SPOILER] https://images-wixmp-ed30a86b8c4ca887773594c2.wixmp.com/f/fb278791-4258-45db-9944-65439602904a/deayox2-b0fbdfc8-bed6-4645-9d9a-37f904e1687c.png?token=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJ1cm46YXBwOiIsImlzcyI6InVybjphcHA6Iiwib2JqIjpbW3sicGF0aCI6IlwvZlwvZmIyNzg3OTEtNDI1OC00NWRiLTk5NDQtNjU0Mzk2MDI5MDRhXC9kZWF5b3gyLWIwZmJkZmM4LWJlZDYtNDY0NS05ZDlhLTM3ZjkwNGUxNjg3Yy5wbmcifV1dLCJhdWQiOlsidXJuOnNlcnZpY2U6ZmlsZS5kb3dubG9hZCJdfQ.MxoxIb9-Srz3hIA1ZlDTWZVfc_hC4J_Z-GdgYuPmwNs [/SPOILER]
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
I put my brother off getting a 3090 to do 8K for reasons already cited here. He has stuck with his 2080 Ti at 4K or less and is happy with the decision. But then I ended up getting a 3090 for myself because it's impossible to get a 3080 🙂 There's no point bothering with 8K, not for probably 4 to 6 years. And even then, who cares? 4K is a plenty high enough res.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
XenthorX:

I see you're citing the GamersNexus article on the topic; I have to note that they failed to disable TAA/FXAA on their reference native shots, which is possible by adding a couple of commands in the engine file. I think it's interesting to truly see the signal input for the DLSS 'magic'. This is a shot I made using DLSS Quality + Nvidia Control Panel sharpening (80% sharpen, 20% ignore film grain) (no spoiler, just hiding the picture): [SPOILER] https://images-wixmp-ed30a86b8c4ca887773594c2.wixmp.com/f/fb278791-4258-45db-9944-65439602904a/deayox2-b0fbdfc8-bed6-4645-9d9a-37f904e1687c.png?token=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJ1cm46YXBwOiIsImlzcyI6InVybjphcHA6Iiwib2JqIjpbW3sicGF0aCI6IlwvZlwvZmIyNzg3OTEtNDI1OC00NWRiLTk5NDQtNjU0Mzk2MDI5MDRhXC9kZWF5b3gyLWIwZmJkZmM4LWJlZDYtNDY0NS05ZDlhLTM3ZjkwNGUxNjg3Yy5wbmcifV1dLCJhdWQiOlsidXJuOnNlcnZpY2U6ZmlsZS5kb3dubG9hZCJdfQ.MxoxIb9-Srz3hIA1ZlDTWZVfc_hC4J_Z-GdgYuPmwNs [/SPOILER]
How does it compare to the native shot you took?
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Mufflore:

I put my brother off getting a 3090 to do 8K for reasons already cited here. He has stuck with his 2080 Ti at 4K or less and is happy with the decision. But then I ended up getting a 3090 for myself because it's impossible to get a 3080 🙂 There's no point bothering with 8K, not for probably 4 to 6 years. And even then, who cares? 4K is a plenty high enough res.
I still think 1440p at a high refresh rate is the way to go. 4K was such a stupid leap that TVs and consoles pushed for; 1440p at high fps will always be the better choice, though at least with older games the 3090 can do 4K 120 quite easily.
https://forums.guru3d.com/data/avatars/m/201/201426.jpg
cucaulay malkin, holy hell, are you some kind of paid NVIDIA DLSS fanboy? Like, seriously. A bunch of us think it sucks, don't like it, etc. Deal with it.
https://forums.guru3d.com/data/avatars/m/90/90667.jpg
It can barely keep up at 1440p with ultra settings and RTX on with DLSS, let alone 8K...
data/avatar/default/avatar03.webp
The whole title is based on a lie. It isn't 8K; it's rendered at a lower resolution to emulate 8K.