Nvidia DLSS 3 Only Works With GeForce RTX 40-Series GPUs

half_empty_soul:

I'm 100% sure they're just blocking backwards compatibility to pump up sales of the 4000 series. Typical NGreedia
Several YouTube tech outlets have stated that the 3000 series cards do have the hardware required to run DLSS 3; Nvidia's position seems to be that the 3000 series hardware is just slower at that specific task than the 4000 series (there's a dedicated hardware engine of some kind for it, iirc). I'm speculating of course, what do I know, but it seems to me that Nvidia probably "could" get DLSS 3 running on at least the 3000 series, albeit at a somewhat higher runtime cost. They just have a very bad, anti-consumer habit of screwing over last-gen cards. For example, even though there's no real reason Resizable BAR (AMD's "Smart Access Memory") couldn't work on the 2000 series GPUs, they just didn't bother. That kind of thing is a real blow imo and will push a lot of consumers to AMD, who haven't been burning their customers like this recently (you can also get a reasonable amount of VRAM on the AMD side without spending ludicrous amounts of cash -- VRAM skimping has been Nvidia's flaw for a while now imo).

Obviously DLSS 3 sounds amazing on paper, but imo they should support it on the 3000 series if at all possible, with the caveat that you may gain less performance from it than on the 4000 series. Digital Foundry's Alex actually talked about something like this a while back: giving people the option to run reconstruction tech on shader cores even when the performance gain is lower (I'm butchering what he said, and he was referring to other reconstruction techniques, but the point stands). It seems like they "could" run DLSS 3 on the 3000 series with smaller performance gains; it would just take effort, and they'd rather use it as marketing for their new overpriced cards. I could be wrong though *shrug*
goat1:

How is that greedy? You should try running a business and see how it works. It's called a selling point. There are a lot of kids on this forum who want stuff for nothing. I'm not defending Nvidia, they can irritate me too, but they have shareholders to please. They care about the shareholders, not us. If they can sell 10,000 units to miners, guess what? Now they love miners, temporarily.
I was under the impression most people on this forum were working adults; it seems to skew older than, say, reddit. From a business point of view I understand why Nvidia does what they do: they're the market leader in performance, so like Intel with CPUs when they were on top, Nvidia can get away with a lot. But if they push too hard they will lose more customers to AMD. They're probably fine with that, and I expect Nvidia will sell out of these cards anyway.

Personally, if Nvidia keeps pulling these stunts where every generation introduces new tech that isn't compatible with cards even one generation back, I will move to AMD, whose focus has been on technologies everyone can use. It's just more cost efficient for most people. There's nothing wrong with criticizing a company for doing something anti-consumer, or for not enabling a feature when they technically could. It's a kick in the teeth for existing customers, and if you do it too often they'll stop buying your GPUs and move to a manufacturer that doesn't bite them in the rear so much. I don't think it's reasonable to say people want something for nothing here: most of these people bought their 3000 series GPU within the last year, and a purchase like that reasonably comes with certain expectations for feature support (these GPUs are not cheap; they typically cost more than an entire console).

This is also something of a unique case, since several PC-focused YouTube outlets have stated that the hardware engine DLSS 3 needs is technically present in the 3000 series cards, just somewhat slower than the one in the 4000 series. I'm speculating and could be wrong, but my bet is that they probably "could" get the feature working on those cards (albeit with less of a performance uplift than the 4000 series) but won't bother, since it would take resources and they want the advertising bullet point for the new cards, as you say. Still sucks if you own a 3000 series card, and there's nothing wrong with saying so imo.
Glottiz:

DLSS, FSR and all the other upscaling techniques are very favorable for marketers and YouTubers. If you stand still and take a screenshot, of course the image looks perfect (TechPowerUp likes doing these comparisons; IMO they are nearly useless), and if you make a video for YouTube it will also look great, because video compression punishes the native-resolution image and flatters the DLSS/FSR one. I remember DF singing the praises of DLSS since their 2080 Ti coverage, and it was hard to see any negatives in their videos; it did look nearly perfect. It's only when I got an Nvidia GPU myself and started playing on a big 4K TV that I started seeing the deficiencies of DLSS/FSR. It's been really hit and miss so far. It's really hard to take seriously those who proclaim that DLSS is always better than native rendering; they either have no attention to detail at all or are just marketing victims.
In my experience playing at 1440p, FSR (even 2.0) has noticeable quality loss compared to native resolution + TAA (though of course that tech is fairly new and I appreciate that it's hardware agnostic). Early DLSS looked weird and painterly in games like BFV, and it looked really "soft" until it was patched imo. The "hand-tuned" 1.9 version of DLSS in Control was very interesting, but it broke down a lot in motion and I didn't care for it much: lots of noticeable artifacts around hair, fan blades, and when moving the camera. DLSS 2.0 was when it really started to get good to my perception, and Quality mode with 1440p output looked pretty solid in games like Cyberpunk to my eye (though there were still some issues like ghosting trails and weird softness from time to time). In DOOM Eternal, though, I preferred native + TAA; there was just something off about how DLSS handled detailed surfaces in motion and the game came out looking sort of "soft" -- not bad, mind you, but noticeably worse than native to me. The latest iteration of DLSS (2.3 or 2.4? can't recall) looks amazing with the right sharpening settings and even holds up pretty well in motion in the few games I've tested it on. This is all assuming Quality mode, however; turn it lower and native + TAA starts looking better again to me. I wouldn't say DLSS 2.4 is "better" than native + TAA, but considering the performance uplift it's now "worth it" for me, assuming I'm GPU bound (in some games like Spider-Man I'm more CPU bound).
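For reference on what "Quality mode with 1440p output" actually renders internally, here is a minimal sketch using the commonly cited DLSS 2 per-axis scale factors. The factors themselves are an assumption on my part (they come from general coverage of DLSS, not from this thread), so treat the numbers as approximate.

```python
# Rough sketch: internal render resolutions implied by the commonly cited
# DLSS 2 per-axis scale factors. The factors are an assumption here,
# not something stated in the posts above.
DLSS_SCALE = {
    "Quality": 2 / 3,        # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for mode in DLSS_SCALE:
        w, h = internal_resolution(2560, 1440, mode)
        # e.g. Quality at 1440p renders roughly 1707x960 before reconstruction
        print(f"{mode:>17}: {w}x{h}")
```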
BlindBison:

In my experience playing at 1440p, FSR (even 2.0) has noticeable quality loss compared to native resolution + TAA (though of course that tech is fairly new and I appreciate that it's hardware agnostic).
The FSR 2.1 mod looks better than native TAA in e.g. Doom Eternal or Dying Light 2, and the native FSR 2.0 implementation looks better than the old UE TAA in Wonderlands. But yes, there is still room for improvement with ghosting in some situations, particle effects, and disocclusion.
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
I hope there's a way to distinguish between normal frames and interpolated frames; the render latency would be at whatever the normal frame rate is with DLSS 2. I'm more concerned about strange interpolation motion -- I haven't seen a good implementation yet, but that's something that will have to be tested hands-on. I'm sure Blur Busters and Battle(non)sense will be testing it in depth.
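For a rough feel of how that plays out, here's a back-of-the-envelope sketch: the presented frame rate roughly doubles while input/render latency still tracks the base (DLSS 2) frame rate. The one-base-frame buffering term is an assumption on my part (interpolation needs the next rendered frame before it can emit the in-between one), not a measured figure, and Reflex or pipeline details will shift the real numbers.

```python
# Back-of-the-envelope sketch: frame generation roughly doubles the *presented*
# frame rate, but input/render latency still tracks the base (DLSS 2) frame rate.
# The extra one-frame buffering term is an assumption -- interpolation needs the
# next rendered frame before it can emit the in-between frame.

def frame_generation_estimate(base_fps: float) -> dict[str, float]:
    base_frametime_ms = 1000.0 / base_fps
    return {
        "base_fps": base_fps,
        "presented_fps": base_fps * 2,                 # one generated frame per rendered frame
        "base_frametime_ms": base_frametime_ms,
        "approx_added_latency_ms": base_frametime_ms,  # assumed ~1 base frame of buffering
    }

if __name__ == "__main__":
    for fps in (45, 60, 90):
        est = frame_generation_estimate(fps)
        print(f"{fps:>3} FPS rendered -> ~{est['presented_fps']:.0f} FPS presented, "
              f"~{est['approx_added_latency_ms']:.1f} ms extra latency from buffering")
```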
schmidtbag:

The weird thing is, frame interpolation seems like it'd be less computationally expensive than everything else DLSS does, so I'm kind of surprised it wouldn't be available. However, older GPUs might not have enough tensor cores to handle all of DLSS 3, but it'd be cool if you could at least choose between supersampling and frame interpolation. For example, maybe your GPU is powerful enough to play a game at 4K and 45FPS. You don't want the quality loss of lowering your resolution and doing supersampling, but you'd still like those extra 15FPS. At 45FPS, frame interpolation shouldn't be too distracting since you're already 75% of the way there.
Most modern TVs, even cheap ones, do motion interpolation so it is really nothing new, and it doesn't seem to require that much processing power considering how limited the CPUs and graphics hardware in TVs tend to be. I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards. Boasting of 3X-4X increases in performance sounds impressive on paper, but if most of that comes from image upscaling and "fake" doubling of the framerate then in my view it isn't really impressive at all. DLSS 2 already has obvious compromises to visual quality when upscaling, in the form of visual shimmering on fine details and so on, so I can only imagine what DLSS 3 looks like with those issues plus motion interpolation artefacts as well. These things are great for consoles in my opinion, where you typically sit much further away from the TV and are therefore less likely to notice the glitches, but I would wager that most PC gamers sit directly in front of a monitor, so the issues are far more obvious... and, in my experience, can be distracting/annoying.
Darren Hodgson:

Most modern TVs, even cheap ones, do motion interpolation so it is really nothing new, and it doesn't seem to require that much processing power considering how limited the CPUs and graphics hardware in TVs tend to be.
That's true, but what the frame interpolation on TVs does is hot garbage and (like hot garbage) usually really distracting. Seems to me what Nvidia has done is basically still crumpled-up paper -- a sign of something on the right track, but maybe not quite good enough. Since the GPU itself is producing the image, it can figure out more intelligently how to apply the sub-frames, and hopefully you can dial in how "aggressive" it is.
I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards. Boasting of 3X-4X increases in performance sounds impressive on paper, but if most of that comes from image upscaling and "fake" doubling of the framerate then in my view it isn't really impressive at all. DLSS 2 already has obvious compromises to visual quality when upscaling, in the form of visual shimmering on fine details and so on, so I can only imagine what DLSS 3 looks like with those issues plus motion interpolation artefacts as well. These things are great for consoles in my opinion, where you typically sit much further away from the TV and are therefore less likely to notice the glitches, but I would wager that most PC gamers sit directly in front of a monitor, so the issues are far more obvious... and, in my experience, can be distracting/annoying.
I agree, though typically I've found these AI-based enhancers to only be distracting if you go looking where they weren't optimized. At least from what I've seen, Nvidia's FI is only distractingly bad if you look at it frame by frame, but at that point you're not really playing the game. When you just let it do its thing, it's "okay". I would be fine with the occasional distortion if it meant I could have a consistent 60FPS experience. If you look at hand-drawn animation, you'd be surprised how weird everything can look between frames, but when fully animated you either don't catch such details or you kind of get used to them. It's the same idea as playing games on an old CRT TV - it's unbearable at first, but after about half an hour your brain sort of adjusts and suddenly it's not so awful to look at anymore.
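To illustrate the difference being argued here -- a TV only has the final video frames to guess from, while the GPU already knows per-pixel motion -- here's a deliberately tiny 1-D toy in Python. It is purely illustrative: real optical-flow/frame-generation hardware is far more sophisticated, and none of this reflects Nvidia's actual implementation.

```python
# Toy comparison: a TV-style 50/50 blend of two frames vs. placing the
# in-between frame using motion vectors the engine already has.
import numpy as np

def naive_blend(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """TV-style blend: moving detail shows up as two ghosted half-bright copies."""
    return (a.astype(np.float32) + b.astype(np.float32)) / 2

def motion_vector_midpoint(a: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Place each lit sample halfway along its known motion vector instead."""
    mid = np.zeros_like(a)
    for src in np.flatnonzero(a):                    # only move the lit samples
        mid[src + motion[src] // 2] = a[src]
    return mid

if __name__ == "__main__":
    # A single bright sample moving 8 positions to the right between two frames.
    a = np.zeros(16, dtype=np.uint8); a[2] = 255
    b = np.zeros(16, dtype=np.uint8); b[10] = 255
    motion = np.zeros(16, dtype=np.int64); motion[2] = 8
    print("blend     :", naive_blend(a, b))                   # two half-bright ghosts at 2 and 10
    print("reproject :", motion_vector_midpoint(a, motion))   # one full-bright sample at 6
```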
Darren Hodgson:

I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards.
Idk -- like yeah, obviously value-add features are going to be used to justify higher prices, but there are other factors too: R&D cost for the G80 architecture (8800 GTX) was ~$450M over 4 years, the R&D cost for a modern card is easily in the billions, and Nvidia is spending close to $5B per year on R&D now. The price of manufacturing is also increasing exponentially: https://cdn.wccftech.com/wp-content/uploads/2019/04/Screen-Shot-2019-04-19-at-7.41.50-PM-1480x781.png

Then you have general inflation, tariffs, the ban on chip exports to China, etc. All of that gets passed on to the customer in some form. So you'd have to decouple the "justifiable" increases in price from Nvidia's value-add features and mining influences, and I'm not sure how you do that. I know the prices are really high and I think the 4080 is insanely priced, but I also think expecting a 4080 16GB to be $600-650 is probably equally far-fetched. There exists some price point that's fair to both customers and the company -- $1200 isn't it lol.

And I said this in the other thread, but the reality is that technologies like DLSS, AI-based image reconstruction, and interpolation are probably the future whether you like it or not. Physics is limiting manufacturing; there's no way to just brute-force performance increases anymore without massively increasing power requirements. You need to get fancy with the software, and that's the route every company is going. The best we can hope for is that Nvidia/AMD/Intel and whoever else start solving the issues with these technologies -- I don't think we as gamers should be pushing against them.
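Just to put the R&D figures quoted above side by side (taking those numbers at face value, with no inflation adjustment -- a quick sketch, not an analysis):

```python
# Quick arithmetic on the figures quoted in the post above, taken at face value.
g80_rd_total_usd = 450e6          # ~$450M over 4 years (figure from the post)
g80_years = 4
current_rd_per_year_usd = 5e9     # ~$5B per year (figure from the post)

g80_rd_per_year = g80_rd_total_usd / g80_years
ratio = current_rd_per_year_usd / g80_rd_per_year

print(f"G80-era R&D: ~${g80_rd_per_year / 1e6:.0f}M per year")
print(f"Today:       ~${current_rd_per_year_usd / 1e9:.0f}B per year "
      f"(~{ratio:.0f}x higher, not adjusting for inflation)")
```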
Denial:

Then you have general inflation, tariffs, the ban on chip exports to China, etc. All of that gets passed on to the customer in some form. So you'd have to decouple the "justifiable" increases in price from Nvidia's value-add features and mining influences, and I'm not sure how you do that. I know the prices are really high and I think the 4080 is insanely priced, but I also think expecting a 4080 16GB to be $600-650 is probably equally far-fetched. There exists some price point that's fair to both customers and the company -- $1200 isn't it lol.
While R&D has grown exponentially, their net income has also grown quite substantially. So, while a 4080 16GB at $650 is farfetched, $750 would be totally justified given Nvidia's pricing history.
schmidtbag:

While R&D has grown exponentially, their net income has also grown quite substantially. So, while a 4080 16GB at $650 is farfetched, $750 would be totally justified given Nvidia's pricing history.
I agree -- I think $750-800 is a fair price for the card. $1200 is absurd.
Denial:

I agree -- I think $750-800 is a fair price for the card. $1200 is absurd.
It will sell at those absurd prices nonetheless, because people, brains and sense don't combine well. I'm more likely to RMA my GTS 450 back to EVGA (it runs way too hot at idle), get a new card even if it's a 3050, and use that till it dies, because prices are absurd. DLSS/FSR is slowly becoming a crutch for the industry, which I don't like. It's great if you're running, say, a 4K monitor and can't actually "push" 4K at your target of 60fps, but native-res performance seems to be taking a back seat to DLSS, which I'm not a fan of. Useful, yes, but it cannot and will not ever replace native. There have been more advances in DLSS performance over the last three generations than in actual native performance. Don't get me wrong, I like the idea of DLSS for certain use cases, but Nvidia seems more concerned with DLSS advances at this point.
BlindBison:

Several YouTube tech outlets have stated that the 3000 series cards do have the hardware required to run DLSS 3; Nvidia's position seems to be that the 3000 series hardware is just slower at that specific task than the 4000 series (there's a dedicated hardware engine of some kind for it, iirc). I'm speculating of course, what do I know, but it seems to me that Nvidia probably "could" get DLSS 3 running on at least the 3000 series, albeit at a somewhat higher runtime cost. They just have a very bad, anti-consumer habit of screwing over last-gen cards. For example, even though there's no real reason Resizable BAR (AMD's "Smart Access Memory") couldn't work on the 2000 series GPUs, they just didn't bother. That kind of thing is a real blow imo and will push a lot of consumers to AMD, who haven't been burning their customers like this recently (you can also get a reasonable amount of VRAM on the AMD side without spending ludicrous amounts of cash -- VRAM skimping has been Nvidia's flaw for a while now imo).

Obviously DLSS 3 sounds amazing on paper, but imo they should support it on the 3000 series if at all possible, with the caveat that you may gain less performance from it than on the 4000 series. Digital Foundry's Alex actually talked about something like this a while back: giving people the option to run reconstruction tech on shader cores even when the performance gain is lower (I'm butchering what he said, and he was referring to other reconstruction techniques, but the point stands). It seems like they "could" run DLSS 3 on the 3000 series with smaller performance gains; it would just take effort, and they'd rather use it as marketing for their new overpriced cards. I could be wrong though *shrug*
Yeah, it's that new optical flow accelerator engine -- it speeds that task up a lot. [youtube=_lQNl0h7EIo]