
Nvidia DLSS 3 Only Works With GeForce RTX 40-Series GPUs

by Hilbert Hagedoorn on: 09/22/2022 09:10 AM | 33 comment(s)

New Tensor cores bring new functionality. Nvidia has announced its GeForce RTX 40-series (Ada Lovelace) graphics cards alongside the company's latest DLSS 3 technology, and with the chipmaker boasting a 4X performance gain, DLSS 3 drew much attention.

Will DLSS 3 be available for earlier GeForce graphics cards? Nvidia's Vice President of Applied Deep Learning Research, Bryan Catanzaro, revealed that DLSS 3 is currently supported only by GeForce RTX 40-series graphics cards. The Nvidia engineer did leave the door open for backward compatibility, indicating that DLSS 3 may theoretically function on prior-generation GeForce graphics cards such as the RTX 30-series (Ampere) or RTX 20-series (Turing).

Nvidia's DLSS 3 depends on Ada's fourth-generation Tensor cores and the Optical Flow Accelerator (OFA) to function. The OFA is not an invention new to Ada; it has been present since the days of Turing. The difference is that Nvidia has vastly enhanced it in Ada compared to Ampere, resulting in higher performance and higher quality: according to reports, the OFA in Ada is 2 to 2.5 times faster than Ampere's, and Nvidia appears to have made some algorithmic modifications as well. Ampere and Turing could theoretically run DLSS 3, but not to the same effect. Catanzaro believes DLSS 3 would not improve frame rates on Ampere or Turing and would instead cause laggy gameplay and poor visual fidelity.
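
For readers curious what "optical flow" buys you here, the sketch below is a deliberately simplified, hypothetical illustration (classic OpenCV flow on the CPU, nothing like Nvidia's hardware OFA plus trained network): estimate per-pixel motion between two rendered frames, then warp halfway along that motion to synthesize an in-between frame.

```python
# Toy optical-flow frame interpolation. Purely illustrative: DLSS 3
# Frame Generation uses dedicated OFA hardware and a neural network,
# not OpenCV's Farneback estimator.
import cv2
import numpy as np

def generate_midframe(frame_a, frame_b):
    """Synthesize a frame roughly halfway between two rendered frames."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense per-pixel motion vectors from frame A to frame B (H x W x 2).
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # Backward-sample frame A at half the motion; a crude stand-in for
    # a proper forward warp, so expect artifacts at occlusions.
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```

The quality and speed of the flow estimate is exactly where the hardware generations differ, which is why a 2 to 2.5 times faster OFA matters so much to the end result.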

According to Nvidia, DLSS 3 can reduce latency by up to 2X compared to native rendering. In a demo of Cyberpunk 2077's Ray Tracing: Overdrive mode, DLSS 3, which combines DLSS Super Resolution, DLSS Frame Generation, and NVIDIA Reflex, demonstrated 4X higher performance and 2X improved responsiveness.
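
As a back-of-the-envelope illustration of how those multipliers can compound (every number below is a made-up assumption, not an Nvidia measurement):

```python
# Hypothetical decomposition of the advertised "4X" figure; all
# numbers are illustrative assumptions, not benchmark results.
native_fps = 20.0        # assumed native path-traced frame rate
sr_speedup = 2.0         # assumed gain from DLSS Super Resolution
fg_multiplier = 2.0      # Frame Generation presents ~2x the frames

rendered_fps = native_fps * sr_speedup          # 40 fps actually rendered
presented_fps = rendered_fps * fg_multiplier    # 80 fps on screen

print(f"perceived speedup: {presented_fps / native_fps:.0f}x")  # -> 4x
```

Note that the responsiveness claim rests on Reflex and the faster rendered frames; the generated frames are interpolated, so they do not reduce input latency on their own.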

DLSS 3 has been shown to benefit CPU-limited games such as Microsoft Flight Simulator. DLSS 3 will be released on October 12, alongside the flagship GeForce RTX 4090, and Nvidia anticipates that more than 35 games and applications will support it at launch. More importantly, DLSS 3 builds on the foundations of DLSS 2, making it simple for game developers to enable DLSS 3 in existing games that already support DLSS 2 or Nvidia Streamline.







Spets
Senior Member



Posts: 3433
Joined: 2011-05-10

#6053719 Posted on: 09/23/2022 12:26 PM
I hope there's a way to distinguish between normal frames and interpolated frames; the render latency would be at whatever the normal frames are with DLSS 2.
I'm more concerned by the strange interpolation motion. I haven't seen a good implementation yet, but that's something that'll have to be tested hands-on.

I'm sure Blur Busters and Battle(non)sense will be testing that in depth.

Darren Hodgson
Senior Member



Posts: 16249
Joined: 2004-08-18

#6053737 Posted on: 09/23/2022 01:54 PM
The weird thing is, frame interpolation seems like it'd be less computationally expensive than everything else DLSS does, so I'm kind of surprised it wouldn't be available. However, older GPUs might not have enough Tensor cores to handle all of DLSS 3, but it'd be cool if you could at least choose between supersampling and frame interpolation. For example, maybe your GPU is powerful enough to play a game at 4K and 45FPS. You don't want the quality loss of lowering your resolution and doing supersampling, but you'd still like those extra 15FPS. At 45FPS, frame interpolation shouldn't be too distracting since you're already 75% of the way there.


Most modern TVs, even cheap ones, do motion interpolation, so it is really nothing new, and it doesn't seem to require that much processing power considering how limited the CPUs and graphics hardware in TVs tend to be.

I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards. Boasting of 3X-4X increases in performance sounds impressive on paper, but if most of that comes from image upscaling and "fake" doubling of the framerate then, in my view, it isn't really impressive at all. DLSS 2 already makes obvious compromises to visual quality when upscaling, in the form of shimmering on fine details and so on, so I can only imagine what DLSS 3 looks like with those issues plus motion interpolation artefacts as well. These things are great for consoles in my opinion, where you typically sit much further away from the TV and are therefore less likely to notice the glitches, but I would wager that most PC gamers sit directly in front of a monitor, so the issues are far more obvious... and, in my experience, can be distracting/annoying.

schmidtbag
Senior Member



Posts: 7161
Joined: 2012-11-10

#6053755 Posted on: 09/23/2022 03:10 PM
Most modern TVs, even cheap ones, do motion interpolation, so it is really nothing new, and it doesn't seem to require that much processing power considering how limited the CPUs and graphics hardware in TVs tend to be.

That's true, but what the FI in TVs does is hot garbage and (like hot garbage) usually really distracting. Seems to me what Nvidia has done is basically just crumpled-up paper - a sign of something on the right track, but maybe not quite good enough. Since the GPU itself is producing the image, it can figure out more intelligently how to apply the sub-frames, and hopefully you can dial in how "aggressive" it is.
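
To make that concrete, here's a toy sketch (hypothetical code, not Nvidia's pipeline) of the difference: a TV can only blend or guess from the final pixels, while the renderer can hand the interpolator its own per-pixel motion vectors.

```python
# Illustrative contrast between TV-style and engine-aware interpolation.
# All names, shapes, and logic here are hypothetical simplifications.
import numpy as np

def tv_style_midframe(prev_pixels, next_pixels):
    # A TV sees only the two finished images. Without real motion data
    # it must guess; a naive blend just ghosts moving objects.
    return 0.5 * prev_pixels + 0.5 * next_pixels

def engine_aware_midframe(prev_pixels, motion_vectors):
    # A renderer already knows how each pixel moved (motion vectors are
    # a byproduct of the engine), so it can reproject instead of guess.
    h, w, _ = prev_pixels.shape
    ys, xs = np.mgrid[0:h, 0:w]
    tx = np.clip(xs + (0.5 * motion_vectors[..., 0]).round().astype(int), 0, w - 1)
    ty = np.clip(ys + (0.5 * motion_vectors[..., 1]).round().astype(int), 0, h - 1)
    out = np.zeros_like(prev_pixels)
    out[ty, tx] = prev_pixels  # forward splat; holes remain where pixels
                               # were disoccluded - the hard part in practice
    return out
```

Filling those disocclusion holes plausibly is exactly where the flow estimate and the trained network earn their keep.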
I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards. Boasting of 3X-4X increases in performance sounds impressive on paper, but if most of that comes from image upscaling and "fake" doubling of the framerate then, in my view, it isn't really impressive at all. DLSS 2 already makes obvious compromises to visual quality when upscaling, in the form of shimmering on fine details and so on, so I can only imagine what DLSS 3 looks like with those issues plus motion interpolation artefacts as well. These things are great for consoles in my opinion, where you typically sit much further away from the TV and are therefore less likely to notice the glitches, but I would wager that most PC gamers sit directly in front of a monitor, so the issues are far more obvious... and, in my experience, can be distracting/annoying.

I agree, though typically I've found these AI-based enhancers to be distracting only when you look where they weren't optimized. At least from what I've seen, Nvidia's FI is only distractingly bad if you inspect it frame by frame, but at that point you're not really playing the game. When you just let it do its thing, it's "okay". I would be fine with the occasional distortion if it meant I could have a consistent 60FPS experience. If you look at hand-drawn animation, you'd be surprised how weird everything can look between frames, but when fully animated you either don't catch such details or you kind of get used to them.
It's the same idea as playing games on an old CRT TV: it's unbearable at first, but after about half an hour your brain sort of adjusts and suddenly it's not so awful to look at anymore.

Denial
Senior Member



Posts: 14010
Joined: 2004-05-16

#6053757 Posted on: 09/23/2022 03:21 PM
I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards.


Idk - yeah, obviously value-add features are going to be used to increase prices, but there are other reasons that factor in:

R&D cost for the G80 architecture (8800 GTX) was ~$450M over 4 years. The R&D cost for a modern card is easily in the billions; Nvidia is now spending close to $5B per year on R&D.

The price of manufacturing is also increasing exponentially.

Then you have general inflation, tariffs, the ban on chip exports to China, etc. All of this gets passed on to the customer in some form.

So you'd have to decouple the "justifiable" increases in price from Nvidia's value-add features and mining influences, and I'm not sure how you do that. I know the prices are really high and I think the 4080 is insanely priced, but I also think expecting a 4080 16GB to be like $600-650 is probably equally far-fetched. There exists some price point that's fair to both customers and the companies. $1200 isn't it lol.

__

And I said this in the other thread, but the reality is that technologies like DLSS - AI-based image reconstruction and interpolation - are probably the future whether you like it or not. Physics is limiting manufacturing; there's no way to brute-force performance increases anymore without massively increasing power requirements. You need to get fancy with the software, and that's the route every company is going. The best we can hope for is that Nvidia, AMD, Intel, and whoever else start solving the issues with these technologies - I don't think we as gamers should be pushing against them.

schmidtbag
Senior Member



Posts: 7161
Joined: 2012-11-10

#6053766 Posted on: 09/23/2022 03:41 PM
Then you have general inflation, tariffs, the ban on chip exports to China, etc. All of this gets passed on to the customer in some form.

So you'd have to decouple the "justifiable" increases in price from Nvidia's value-add features and mining influences, and I'm not sure how you do that. I know the prices are really high and I think the 4080 is insanely priced, but I also think expecting a 4080 16GB to be like $600-650 is probably equally far-fetched. There exists some price point that's fair to both customers and the companies. $1200 isn't it lol.
While R&D costs have grown exponentially, Nvidia's net income has also grown quite substantially. So, while a 4080 16GB at $650 is far-fetched, $750 would be totally justified given Nvidia's pricing history.
