Raytracing without RTX: Nvidia Pascal receives DXR support via driver

I suspect this move is meant to counter Crytek's.
The comments at wccftech hurt my brain
theoneofgod:

The comments at wccftech hurt my brain
You will see some of the most colorful comments on earth right there!
Nvidia just turned your 1080 Ti into a 2080! Please don't get upset. Glad they did this, though. Maybe game devs can actually use the software in much wiser ways now that there is actually a market for RT. I see little point, but many are smarter than I am, and I'm sure some good will come out of this. RT, the way Nvidia wants it, is doomed to fail. Let's see what the devs can do. That is where it's at.
Killian38:

Nvidia just turned your 1080 Ti into a 2080! Please don't get upset.
Except ray tracing on the 1080 Ti won't work even remotely well, performance-wise. It's great they're going to enable this, if only because it'll show how much of a leap dedicated hardware currently provides, but from an actual usability standpoint there won't be any reason to try to use it. Maybe on the 1660s there will be some usefulness, since they can run FP32 and INT32 concurrently.

Since apparently you didn't read the article: https://cdn.wccftech.com/wp-content/uploads/2019/03/2019-03-18_23-13-22-1480x827.png https://cdn.wccftech.com/wp-content/uploads/2019/03/2019-03-18_23-14-13-1480x823.png

Nvidia just turned your 1080 Ti into a 2080, but you get 43 fewer FPS! Yay! Always wanted to play games at 18fps! Again the only point of this is to showcase how important and necessary it is to have dedicated hardware.
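To put some shape on why "dedicated hardware" matters here: without RT cores, every ray test runs as ordinary shader arithmetic. Below is a minimal Python sketch (my own illustration, not from the article) of Möller–Trumbore ray/triangle intersection, the kind of per-ray math that DXR on Pascal grinds through in plain FP32 shader code and that Turing's RT cores accelerate in fixed-function hardware:

```python
# Illustration only: Moller-Trumbore ray/triangle intersection.

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit distance t along the ray, or None on a miss."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) / det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) / det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) / det         # distance along the ray
    return t if t > eps else None

# A ray fired straight at a triangle in the z=0 plane hits at t = 1.0;
# a ray fired the other way misses. A single frame of ray tracing
# repeats tests like this millions of times.
hit = ray_triangle((0, 0, -1), (0, 0, 1), (-1, -1, 0), (1, -1, 0), (0, 1, 0))
miss = ray_triangle((0, 0, -1), (0, 0, -1), (-1, -1, 0), (1, -1, 0), (0, 1, 0))
```

(Real DXR traversal also walks a BVH so it doesn't have to test every triangle; that traversal is exactly the work the RT cores offload, which is why the compute-only fallback falls so far behind.)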
Killian38:

Nvidia just turned your 1080 Ti into a 2080! Please don't get upset.
My 1080 Ti is out of control, won't stop humping my leg; better go play to calm him down.
Oh no way, I must have missed that news. Too bad I'm rolling on a 750 Ti.
Seems like a marketing pitch; Pascal doesn't have RT cores or Tensor cores. I question Nvidia's intentions sometimes... Wouldn't they want people to buy new RTX cards? From a business standpoint, it seems as if this would slow RTX sales more. Don't even get me started on the 1660 Ti and 1660, just why... They're competing with their own cards. Buy a damn RTX 2060 or GTX 1070.
dannyo969:

Seems like a marketing pitch; Pascal doesn't have RT cores or Tensor cores. I question Nvidia's intentions sometimes... Wouldn't they want people to buy new RTX cards? From a business standpoint, it seems as if this would slow RTX sales more. Don't even get me started on the 1660 Ti and 1660, just why... They're competing with their own cards. Buy a damn RTX 2060 or GTX 1070.
This should have been this way from the beginning. The biggest things people have said about RTX cards are that they don't like the pricing, they don't like the performance relative to Pascal, and they don't like the performance of ray tracing.

Since no one has had anything but Nvidia's word about "how much faster" Turing is compared to Pascal at ray tracing, many people simply didn't care. They looked at the "up to 6 times the performance compared to Pascal on ray tracing" as a meaningless number, since it's not something that they or any reviewer could verify and legitimize. Now that'll change, and we'll see exactly where it stands and exactly how much of a leap the RTX cards actually are at ray tracing. Yes, there will still be people saying it's not good enough; there always are. But for some, hopefully, it'll showcase where the future may go.

People have to remember: current RTX cards are a first iteration, first iterations are always the worst, and the technology generally leaps from there once they finally figure out how to do it. Remember tessellation, which is hardly mentioned anymore because it's not the performance hit it once was in its first iteration. But I'm sure there will still be people complaining for the sake of complaining.
Another disappointing keynote with way too much focus on AI, data analytics and cloud "gaming"...
Makes no sense, because the 10-series cards won't be able to handle it. The RTX cards can barely handle it natively. Unless DirectX can handle a lot of the overhead needed, but that is still pushing it. As mentioned, maybe the 16xx-based cards could possibly handle it because of the new technologies they have.
MS allows certain DX12 games to work on Windows 7, and now Nvidia is enabling DXR on Pascal? Wut. Performance will be awful, so it won't even matter.
lucidus:

MS allows certain DX12 games to work on Windows 7, and now Nvidia is enabling DXR on Pascal? Wut. Performance will be awful, so it won't even matter.
You have tried both already? Nice, tell us more about it! What performance issues have you experienced in Win7 with DX12? And how about DXR? Was it under-performing in some specific scenarios? Can't wait to hear your feedback!
RzrTrek:

Another disappointing keynote with way too much focus on AI, data analytics and cloud "gaming"...
It's GTC; it's not a consumer-focused show. AI and data analytics are essentially the key focus points of GTC as a whole. Maybe there will be some consumer-focused news later this week from GDC, which is happening at the same time.
I watched for like 25 minutes last night, but I was tired before the presentation started and eventually had to give up. While the target audience wasn't consumers, it was one hell of a bad presentation. Last year's RTX unveil, which I considered bad, was somewhat better, despite some rambling and a lack of focus on JHH's part. It was more enjoyable to read today's news of what happened last night. 🙂

While I like that Nvidia actually chose to support Microsoft's DXR, it still makes the RTX series even more pointless for those of us looking to upgrade. In Nvidia's words, Pascal GPUs are too slow for raytracing, and the RTX series leaves a lot to be desired performance-wise, especially with raytracing enabled. The biggest win will be more GPUs on the market that support some form of raytracing, thus making more developers interested in supporting it.

I'll enjoy trying out how well Quake II raytraced works. Considering how badly it runs even on an RTX 2080 Ti, I'm guessing it won't be that enjoyable. But still, getting to try it without shelling out for a 2080 Ti is great.

Something I've been wondering about is whether it would be possible to do the old dedicated-PhysX thing and have a second Pascal card in the system, or even an RTX 2060, to offload all the DXR stuff from one's primary GPU. It won't happen, most likely; Nvidia wants us to buy a new, more expensive RTX card instead. But I'd have enjoyed the possibility. [youtube=vrq1T93uLag]
The GTX 1060 3 GB and GTX 1050 Ti 4 GB are out of the DXR ray tracing equation. The AMD RX 570 4 GB is "peasant" tier by Nvidia's own words but has more compute power... AMD also supports the same thing from top to bottom of its mainstream lineup, but Nvidia doesn't... just my two cents.
I don't know if this is a response to Crytek or what, but this is a hilarious move. I am all for increasing feature sets, but it seems this will simply confuse the product lines further, and I wonder if this will ever be real-world functional on the 16xx cards. Shame the graphs don't include the 1660 Ti vs the 2070 and 2080 for RTX workloads, so we could see whether there's any improvement over the 1080 Ti there. I am also confused: if this is done on INT32 cores, what exactly do the RT cores bring to the table? Also, 3 times the performance compared to DLSS enabled? Still going for the hard sell on DLSS, it seems.
Aura89:

Always wanted to play games at 18fps! Again the only point of this is to showcase how important and necessary it is to have dedicated hardware
Or we could change the resolution to 1080p and get the cinematic 24fps! \m/ \m/ Yeah, it's a marketing ploy. I'm still not buying your overpraised RTX gimmick cards, Nvidia 😀
If they really wanted RTX to take off, they would have priced their new RTX cards way lower to encourage mass adoption; then developers would care to implement more of it. But no. Let's charge an arm and a leg and pay off a few devs to implement something that only justifies the purchase for a few rich people. RTX will die or be replaced by something else if they continue like this.