Nvidia promises to improve image quality issues with DLSS

This whole RTX generation is one big scam, so I'm not impressed with the awful features and pricing. Definitely the easiest generation to ignore and skip; nVidia will probably abandon it themselves once they stabilize a 7nm architecture and provide much better cards at better prices (knowing Nvidia, that probably won't happen, but well...).
HardwareCaps:

So they basically admit that DLSS is bad... yikes. Nvidia, shouldn't you test something before spending billions on it?
They admit that it needs work. I'm not sure why you think it cost them billions for an autoencoder that they already use in self-driving. The only real thing they figured out was how to anchor it to triangles in the scene. The whitepaper for the SIGGRAPH presentation lists two Nvidia researchers lol
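For context, since "autoencoder" gets thrown around a lot: it is just an encoder-decoder network trained to reconstruct (or upscale) its input. A minimal, purely illustrative PyTorch sketch, with arbitrary layer sizes that are not taken from Nvidia's DLSS paper, looks something like this:

import torch
import torch.nn as nn

class TinyAutoencoder(nn.Module):
    """Toy convolutional autoencoder: compress the image, then reconstruct it.
    Layer widths are arbitrary; this is not Nvidia's DLSS network."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),   # downsample
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # downsample again
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),  # upsample
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),   # back to RGB
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Training minimizes reconstruction error against a ground-truth (e.g. higher-quality) frame.
model = TinyAutoencoder()
frame = torch.rand(1, 3, 128, 128)                      # dummy input frame
loss = nn.functional.mse_loss(model(frame), frame)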
You can't polish a........................
Rich_Guy:

You can't polish a........................
Well, at least Nvidia has something to polish. AMD cannot even come close to the 2080 with their RVII; fine wine is a bigger lie than DLSS.
DLSS is potentially a great bit of tech. In its current state it's an option most people won't take, but with further improvements and continual support it could turn out pretty good. Time will tell.
RTX and DLSS are NEW... they want to work the kinks out so they're ready for the next gen, when consoles take on RTX and it's more refined. This entire gen is about trying beta versions of these things, and anyone who's been in this hobby long enough already knew that and is fine with it. PC gamers are funny: you complain you don't get enough features and that there's too much parity with consoles, then when games do give you features, you complain they don't run at 100 fps or look gorgeous (depending on how the developer implemented them) and go straight into the "Nvidia is to blame" rant. DLSS needs improvement. But it's in, what, 2 games? Give it time. At this point, they'll probably buckle under pressure and just forcefully upgrade the quality, which will drop the frame rate gains and make it just another AA option.
Rich_Guy:

You can't polish a........................
You can polish anything... dorodango 😀
warlord:

Well, at least Nvidia has something to polish. AMD cannot even come close to the 2080 with their RVII; fine wine is a bigger lie than DLSS.
Fine Wine (tm):
- not made by AMD, but by us (the consumers)
- the cards it applies to showed a tremendous performance difference between the initial launch driver and drivers released a year later

I have seen quite a few wannabe Fine Wine testers who took a random driver from the middle of a GPU's life cycle, compared it to the newest one, and concluded either that Fine Wine does not exist or that the improvement over time is the same in both camps. If you want to compare something, compare the initial driver benchmarks, on which people based their purchase, against actual performance over time. That's Fine Wine.

As for the Radeon VII, I do not really expect it to get better, since Vega 64 is practically the same chip except for a few fixes and higher bandwidth, and the bandwidth will not improve things over time because its effect is already in place. But all GCN cards can get a meaningful boost if AMD changes the driver to use compute for some feature that otherwise performs poorly, and in that case the Radeon VII would get the biggest boost. If we look at the compute performance of 1st-gen GCN (HD7850/70, HD7950/70, ...) then and now, there is a great improvement. But it is mostly untapped potential.
Robbo9999:

To be honest, I think it's OK to be pretty toxic towards the Radeon VII: it's noisy, a power guzzler, performance is at 1080 Ti level (but 2 years after that card launched!), and all that while having the advantage of being on 7nm, which should give it an edge over Nvidia's 16nm (1080 Ti)! And did I mention it's an overpriced product, even in relation to Nvidia's RTX cards? Oh, and lots of bugs at launch. It's really a pretty poor product, and a poor buying choice when there are better options out there for the cost/performance/features.
While I agree that it's a power guzzler and noisy, the part in bold is the one that everyone with a negative opinion (based on initial reviews) has echoed. Yes, AMD could have and should have done better with the launch of this card, but that problem has been resolved with the 19.2.1 Radeon VII-only driver. I'm speaking as someone who is using the product, not based on reviews and hearsay. Edit: I do hope that Nvidia keeps its word about improving DLSS, and that they're not just saying it to sell more RTX cards.
SerotoNiN:

RTX and DLSS are NEW... they want to work the kinks out so they're ready for the next gen, when consoles take on RTX and it's more refined. This entire gen is about trying beta versions of these things, and anyone who's been in this hobby long enough already knew that and is fine with it. PC gamers are funny: you complain you don't get enough features and that there's too much parity with consoles, then when games do give you features, you complain they don't run at 100 fps or look gorgeous (depending on how the developer implemented them) and go straight into the "Nvidia is to blame" rant. DLSS needs improvement. But it's in, what, 2 games? Give it time. At this point, they'll probably buckle under pressure and just forcefully upgrade the quality, which will drop the frame rate gains and make it just another AA option.
huh?
warlord:

Well, at least Nvidia has something to polish. AMD cannot even come close to the 2080 with their RVII; fine wine is a bigger lie than DLSS.
Agreed that fine wine is overblown, but there is a huge difference: fine wine is something the users said, it never came from AMD's mouth. DLSS, on the other hand, was presented by Nvidia as the messiah of 4K gaming.
@Fox2232 @Venix If that fan-made fine wine really worked, the Fury X should beat the 980 Ti by now; it has so much raw power. Likewise, the Vega 64 Liquid Edition never beat the 1080 Ti. People should never believe fanboys blindly again. I was a victim too.
What we really need is DLSS 2X in all games. However, after seeing DLSS in action, I highly doubt Nvidia will be able to improve image quality with it. An object one meter away from you in-game has no detail at all when using DLSS in Metro (even after the update).
Bigbeard1986:

What we really need is DLSS 2X in all games. However, after seeing DLSS in action, I highly doubt Nvidia will be able to improve image quality with it. An object one meter away from you in-game has no detail at all when using DLSS in Metro (even after the update).
It is too early for criticism. People should really consider the RTX 20 series an early-access pass and an investment in the future. The next gen will surely be polished to a level capable of blowing us away.
Xtreme1979:

So much Nvidia hate. Face it, they are the first to support DXR, and if you've seen it first-hand in BFV you can't help but be impressed. The reflections in glass, water, mirrors, car paint, etc. are a big step forward graphically. Granted, I don't use it in MP, but in SP 70-90 fps at Ultra 2560x1080 is more than enough. Now, DLSS in its current form isn't that great, so don't use it. It's an option; at least it's there in the first place! Also, for those cards that can't hit 60 fps at GPU-intensive resolutions with DXR enabled, it could be viable. It's not like anyone is being forced to use either.
The hate comes from the false promises and the price increase for a feature that is clearly not ready. Since RTX is embedded in the hardware, you can't avoid paying for it, and it feels bad to "spend" money on something that doesn't work.
Nvidia deserves criticism for DLSS; there's nothing hateful about that.
Yep, and NV themselves said that Turing was 10 years in development, yet devs still took a long time to add support: only 1~2 games a few months after launch, and the results are still very bad. Skip this gen and wait for the next 7nm GPUs.
warlord:

@Fox2232 @Venix If that fan-made fine wine really worked, the Fury X should beat the 980 Ti by now; it has so much raw power. Likewise, the Vega 64 Liquid Edition never beat the 1080 Ti. People should never believe fanboys blindly again. I was a victim too.
Take the original review performance, or look at my post #5. There is quite a bit of evidence of improvements over time. For example:
- DX11 overhead: launch driver = 664k MT vs. 15.200.1055.0 and newer, where the score is over 1M MT
- BF4, Multiplayer - Lancang Dam (64-man; everything maxed except Deferred AA), launch driver:
  1080p: 88~119 fps
  1440p: 72~97 fps
  2160p: 55~75 fps
  Same map and settings on W10 with 15.200.1023.10:
  1080p: 101~146 fps (74% GPU utilization)
  1440p: 87~128 fps (96% GPU utilization)
  2160p: 58~75 fps (100% GPU utilization)
  (AMD's drivers at the time still did not allow full GPU utilization at 1080p with an i5 @ 4.5GHz.)
- Dead Island: launch driver drifted between 107~115 fps at 1440p, while 15.200.1023.10 had a minimum of 198 fps
- Luxmark table:
Test 1        262K Triangles        score        util.
HD7970        AMD release           13266        100%
HD7970 GHz    AMD 2013-10b          17641         97%
Fury X        15.150.0.0 W8=>10     29370         93%
Fury X        16.150.1009-160213a   33070        100%

Test 2        488K Triangles
HD7970        AMD release           1534        100%
HD7970 GHz    AMD 2013-11b8         2523         98%
Fury X        15.150.0.0 W8=>10     3474         95%
Fury X        16.150.1009-160213a   4013        100%

Test 3        2016K Triangles
HD7970        AMD release           827         100%
HD7970 GHz    AMD 2013-10b2         1258         99%
Fury X        15.150.0.0 W8=>10     1753        100%
Fury X        16.150.1009-160213a   1841        100%
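For what it's worth, reducing the Fury X rows above to percentage gains makes the point concrete (the HD7970 rows compare two different card variants, so they are left out); a quick Python sketch using only the scores quoted in the table:

# Driver-to-driver Luxmark gains for the Fury X (15.150.0.0 -> 16.150.1009-160213a).
fury_x_scores = {
    "Test 1 (262K triangles)": (29370, 33070),
    "Test 2 (488K triangles)": (3474, 4013),
    "Test 3 (2016K triangles)": (1753, 1841),
}

for test, (old, new) in fury_x_scores.items():
    gain_pct = (new - old) / old * 100
    print(f"{test}: {old} -> {new} ({gain_pct:+.1f}%)")

# Prints gains of roughly +12.6%, +15.5%, and +5.0% respectively.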
Every card can actually do something similar to DLSS - just upscale + sharpen
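To make that concrete, here is roughly what "upscale + sharpen" means in practice; a minimal sketch assuming Pillow is installed, with made-up file names and arbitrary filter parameters:

from PIL import Image, ImageFilter

# Hypothetical input: a frame rendered at 1440p that we want to show at 4K.
frame = Image.open("frame_1440p.png")                     # placeholder file name
upscaled = frame.resize((3840, 2160), Image.BICUBIC)      # plain bicubic upscale

# Unsharp mask to bring back some apparent detail; parameters are arbitrary.
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
sharpened.save("frame_4k_sharpened.png")

Of course this only sharpens what is already there; it cannot reconstruct detail the way a trained network at least attempts to.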
warlord:

Well, at least Nvidia has something to polish. AMD cannot even come close to the 2080 with their RVII; fine wine is a bigger lie than DLSS.
I do not think the Fury X will ever beat the 980 Ti on average, especially with just 4 GB of memory. The fine wine thing *in my opinion* came out of how well the 7xxx and 2xx generations aged: GCN was new, the drivers for it were immature to put it mildly, and over time they got better. It helped that the next generations were also GCN, so improvements trickled down. On the other hand, time was not kind at all to the 69xx cards or Fury.
Xtreme1979:

How much did you spend on DLSS? If you actually spent a dime then buyer beware. I bought my 2070 because it was a huge leap in performance from my aged GTX 680, not on the premise of an unreleased AA alternative. The RAW performance fit my price point compared to the competition. DXR, DLSS, and the other Turing features are gravy. Some like gravy (NVIDIA), and some like their potatoes plain (AMD).
You do spend money on DLSS indirectly: DLSS requires the tensor cores, which take up die space, increase complexity, reduce yield, and so on. The Turing architecture is great, but Nvidia was marketing RTX; they justified the price increase with RTX, and they demoed ONLY RTX in their CES keynote. If you buy GPUs based on "gravy", then you're certainly not in the majority. People don't like being manipulated and lied to.