NVIDIA Titan V Raytraces Battlefield V in RTX mode at proper perf (but does not have any RT cores).

https://forums.guru3d.com/data/avatars/m/243/243702.jpg
"However, the findings raise the question, are dedicated RT in hardware cores actually needed?" That's not the question I would ask. The question I do ask is: "Are there actually dedicated RT cores in RTX cards?" I asked this before, but it does not hurt to repeat the question: "Can anyone who knows the details of DXR say what kind of mathematical instructions DXR requires?" Because if it is something like FP16, then the Titan V is twice as fast as the 1080 Ti and still a bit faster than the RTX 2080 Ti. It should be noted that DXR is part of Microsoft's DX12, therefore the instructions should not be proprietary. Are any nVidia guys here with BF V willing to trick the game into thinking you have an RTX card?
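On the "what instructions does it need" question: the core arithmetic of ray tracing (ray generation, BVH traversal, box/triangle intersection) is ordinary floating-point multiply-add and comparison work, not an exotic instruction set, which is part of why a compute-heavy card can attempt it at all. As a rough illustration only (plain Python, not taken from any actual DXR implementation), here is the standard slab test for ray/AABB intersection used during BVH traversal:

```python
def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    origin, inv_dir, box_min, box_max are (x, y, z) tuples;
    inv_dir holds 1/d per component (precomputed, as traversal
    kernels typically do). Note the operations involved: only
    subtracts, multiplies, min/max and compares.
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv_d, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv_d
        t2 = (hi - o) * inv_d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near = max(t_near, t1)
        t_far = min(t_far, t2)
    return t_near <= t_far  # hit if the slab intervals overlap


# A ray along +x from the origin hits a unit box sitting ahead of it:
hit = ray_aabb_intersect((0, 0, 0), (1.0, float("inf"), float("inf")),
                         (1, -1, -1), (2, 1, 1))
```

Whether that maps to FP32, FP16 or something else is up to the hardware and driver; DXR itself specifies the API, not the instructions.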
data/avatar/default/avatar08.webp
I wonder if it'll work for an AMD card o_O...
data/avatar/default/avatar37.webp
I knew it, I f-ing knew it that something like this would be possible. Now let's see how quickly Nvidia releases new "hotfix" drivers that block all possible RTX hacks on Pascal GPUs 😀
data/avatar/default/avatar13.webp
Very interesting.
data/avatar/default/avatar07.webp
How did they do it if the game refuses to enable RTX on anything other than RTX 2000-series cards? UFD Tech already tried it, and they could not manage it even by starting the game on a 2080 Ti and then switching to another graphics card. To me it looks fake...
https://forums.guru3d.com/data/avatars/m/219/219428.jpg
flashmozzg:

I wonder if it'll work for an AMD card o_O...
Raytracing has already been working on AMD cards for quite a while. GCN has a ton of compute power to handle raytracing. https://gpuopen.com/gaming-product/radeon-rays/ Dubbed Radeon Rays, the company’s ray tracing developer suite will now support real-time ray tracing in Radeon Rays 2.0. The new engine is compatible with OpenCL 1.2. Built on Vulkan, Radeon Rays 2.0 leverages the API’s advanced support for asynchronous compute to make real-time ray tracing a reality.
https://forums.guru3d.com/data/avatars/m/274/274129.jpg
I really do think that Nvidia has handled the RTX card launch really badly. Too expensive, not enough ray tracing support. I'm hoping the new 7 nanometer cards will drive RTX prices down. But it won't be soon.
data/avatar/default/avatar15.webp
Great White Shark:

I really do think that Nvidia has handled the RTX card launch really badly. Too expensive, not enough ray tracing support. I'm hoping the new 7 nanometer cards will drive RTX prices down. But it won't be soon.
And that raises the question: why? They have all the money to release a more powerful card, or they could simply deliver a 700-to-900-series-like generational leap with a price decrease, as happened back then.
data/avatar/default/avatar35.webp
GlennB:

Raytracing has already been working on AMD cards for quite a while. GCN has a ton of compute power to handle raytracing. https://gpuopen.com/gaming-product/radeon-rays/ Dubbed Radeon Rays, the company’s ray tracing developer suite will now support real-time ray tracing in Radeon Rays 2.0. The new engine is compatible with OpenCL 1.2. Built on Vulkan, Radeon Rays 2.0 leverages the API’s advanced support for asynchronous compute to make real-time ray tracing a reality.
The number of rays they advertise is an order of magnitude lower than NVIDIA's, however, which makes the real difference between real-time gaming use and "just" using it in 3D graphics design and rendering. Doing ray tracing itself isn't "new"; doing it really fast is what was missing.
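The gap matters because the real-time budget is tight. As a back-of-the-envelope estimate (my own numbers, not from the thread): at 1080p and 60 fps, even a single reflection ray per pixel already demands well over a hundred million rays per second, before shadows, bounces, or higher resolutions are added.

```python
# Back-of-the-envelope ray budget for real-time ray tracing.
width, height = 1920, 1080   # 1080p render resolution
fps = 60                     # target frame rate
rays_per_pixel = 1           # one reflection ray, no bounces

rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e6:.0f} Mrays/s")  # prints "124 Mrays/s"
```

Add a shadow ray and one bounce and the budget triples, which is where an order-of-magnitude throughput difference decides whether the effect is playable.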
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
BReal85:

And that raises the question: why? They have all the money to release a more powerful card, or they could simply deliver a 700-to-900-series-like generational leap with a price decrease, as happened back then.
If I had to take a wild guess why they did that... my speculation is that the GPU is a huge monolithic die, and I really, really doubt their yields on 100% functional ones are the best around. So whatever is not good enough to make it as a Quadro trickles down to the gaming GPUs instead of being thrown away.
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
nevcairiel:

The number of rays they advertise is an order of magnitude lower than NVIDIA's, however, which makes the real difference between real-time gaming use and "just" using it in 3D graphics design and rendering. Doing ray tracing itself isn't "new"; doing it really fast is what was missing.
It doesn't matter how fast your system can render rays; the option is locked out for all other cards in BF V, and whether there is a good reason for that, we do not know!?
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
MK80:

How did they do it if the game refuses to enable RTX on anything other than RTX 2000-series cards? UFD Tech already tried it, and they could not manage it even by starting the game on a 2080 Ti and then switching to another graphics card. To me it looks fake...
That's not the way to do it. Did they not learn a thing about graphics drivers? That's why you can't use modern drivers on an ancient graphics card: there are no code paths for the given architectures. The way to go is to persuade the driver that your GTX card is actually an RTX card. One way is simply to change the HW ID in the driver's .inf file before installation. Another is to change the HW ID in the vBIOS. I did that a few times in the past, especially around the Fury X release, when performance was not exactly as expected; making the driver believe it was a different card yielded different performance results. When Polaris launched with some new features, I just changed the HW IDs in the vBIOS and had mixed results.

To put it simply: HW <-> Driver <-> Windows libraries <-> Game. The driver takes the HW and exposes its abilities to the Windows libraries, in this case Direct3D, which reports the supported features to the game. This is done per device, therefore having 2 different cards will result in 2 different sets of features being exposed.

The hack can be done in multiple places. Easiest is to trick the driver. A bit more risky is a vBIOS mod. And the most reliable is actually creating a wrapper library between the game and the Direct3D libraries. It will report whatever it (you) wants to the game and then do whatever it (you) wants with the instructions returned by the game.
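That wrapper idea can be shown in miniature. This is only an analogy in Python (a real wrapper would be a proxy DLL sitting between the game and the Direct3D runtime, and all the names below are made up), but the principle is the same: the layer in the middle answers the capability query however it likes and forwards everything else untouched.

```python
class FakeCapabilityWrapper:
    """Sits between 'game' and 'device', lying about one feature.

    Analogy only: in reality this would be a DLL interposed
    between the game and Direct3D. All names are hypothetical.
    """
    def __init__(self, real_device):
        self._real = real_device

    def supports_raytracing(self):
        return True  # report support regardless of the real hardware

    def __getattr__(self, name):
        # Any call we don't override is forwarded to the real device.
        return getattr(self._real, name)


class GtxDevice:
    def supports_raytracing(self):
        return False  # the real hardware says no

    def draw(self):
        return "drawn"


device = FakeCapabilityWrapper(GtxDevice())
device.supports_raytracing()  # True -- the "game" believes it has RTX
device.draw()                 # "drawn" -- normal work passes through
```

The same interception pattern is why the quoted post calls the wrapper approach the most reliable: it needs no driver or vBIOS modification at all.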
https://forums.guru3d.com/data/avatars/m/263/263710.jpg
I was wondering about this. While playing Shadow of the Tomb Raider I also noticed the reflections (on my 1070 Ti)... Is that really software or hardware?
https://forums.guru3d.com/data/avatars/m/261/261343.jpg
Is this not expected? Wasn't NVIDIA's RTX technology originally developed and demonstrated on Volta GPUs? (I imagine the RT core SM was originally prototyped on the Volta SM, accelerated in tandem with Nvidia's software component for real-time ray tracing?) With 8 years of development, wouldn't Volta benefit from the final version of Nvidia's RTX software? I'm kind of wondering how the original Nvidia ray tracing demos that ran on Volta (like the Star Wars "Reflections" demo) might run today, as the software has probably matured a lot since release. [youtube=lMSuGoYcT3s] Quote: "The demonstration is powered by NVIDIA's RTX technology for Volta GPUs, available via Microsoft's DirectX Ray Tracing API (DXR)..." So I imagine RTX is supported on Volta as well??? (Or hopefully the RTX component used to develop the tech on Volta hardware has not been abandoned.)
data/avatar/default/avatar32.webp
Fox2232:

"However, the findings raise the question, are dedicated RT in hardware cores actually needed?" That's not the question I would ask. The question I do ask is: "Are there actually dedicated RT cores in RTX cards?"
Turing has roughly the same relative transistor density as Pascal, which suggests that whatever dedicated RT logic is in Turing is a rather simple ad-hoc addition to the existing compute resources. The rest of the architectural changes are far more significant than RTX.
Fox2232:

I asked this before, but it does not hurt to repeat the question: "Can anyone who knows the details of DXR say what kind of mathematical instructions DXR requires?" Because if it is something like FP16, then the Titan V is twice as fast as the 1080 Ti and still a bit faster than the RTX 2080 Ti.
Ray tracing is also very demanding on the memory subsystem, and I can only guess that the HBM2 used in Volta is a major contributor to the RTX performance numbers, along with the large internal caches and register files. In the case of Battlefield V, ray-traced reflections require a lot more work from the graphics pipeline, since each reflection is actually a rendered "projection", accumulated along the ray path. The ray traversal alone is not that costly (though it is memory-latency sensitive), particularly at a single sample per pixel and when used for one effect only.
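The memory-latency point can be seen in the shape of the traversal loop itself: each step fetches a node whose address depends on the previous test, so incoherent rays become pointer-chasing that caches and HBM2 bandwidth help with far more than raw FLOPS do. A toy sketch (plain Python, illustrative only, not any real BVH format):

```python
def traverse(nodes, root, ray_hits_box, hit_leaf):
    """Toy stack-based BVH traversal.

    nodes maps id -> node dict; inner nodes carry 'left'/'right'
    child ids, leaves carry 'leaf': True. ray_hits_box(node) and
    hit_leaf(node) stand in for the real intersection tests.
    Every iteration loads a node chosen by the previous test --
    a dependent-load chain, which is why traversal tends to be
    latency-bound rather than compute-bound.
    """
    stack, hits = [root], []
    while stack:
        node = nodes[stack.pop()]  # dependent "memory" load
        if not ray_hits_box(node):
            continue               # prune this whole subtree
        if node.get("leaf"):
            if hit_leaf(node):
                hits.append(node["id"])
        else:
            stack.append(node["left"])
            stack.append(node["right"])
    return hits


# Tiny two-leaf tree; the "ray" hits every box but only leaf 2's geometry:
nodes = {
    0: {"id": 0, "left": 1, "right": 2},
    1: {"id": 1, "leaf": True},
    2: {"id": 2, "leaf": True},
}
hits = traverse(nodes, 0, lambda n: True, lambda n: n["id"] == 2)
```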
data/avatar/default/avatar25.webp
Volta was originally intended to run ray tracing, wasn't it?
https://forums.guru3d.com/data/avatars/m/252/252732.jpg
DXR in BF:V was initially developed on Titan Vs before 2080 Tis were available to DICE, so it's not really that surprising.
https://forums.guru3d.com/data/avatars/m/269/269560.jpg
anxious_f0x:

DXR in BF:V was initially developed on Titan Vs before 2080 Tis were available to DICE, so it's not really that surprising.
Do you mean BF:V's ray tracing might only be supported in software, not actually using the RT cores? So we would not have really seen the true power of the RTX architecture yet (2019 wishful thinking...)?
data/avatar/default/avatar11.webp
The Nvidia guy seems to have had a reason to put his signature on the Titan V JHH CEO Edition. It is still the top dog for every single application out there. You can't say the same for the Titan RTX. One more reason to drag Nvidia to court: the Titan RTX should never have been released at that price. What a scam!
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Here's what I don't understand: how did Nvidia not have the foresight that people were going to test this? In general, I'm actually a bit surprised people haven't tested this sooner, and haven't attempted ray tracing on even more non-RTX GPUs. In fact, has anyone tried testing RTX with and without the RT cores on a 2080 Ti? Seems like an experiment worth conducting. Although the Titan V is functionally very different from a 2080 Ti, it still doesn't look good that a piece of hardware with no RTX compatibility is pushing out comparable results. As far as I'm concerned, the tensor cores aren't used for RTX, so that really just leaves the Titan V's memory bandwidth as the major difference. Part of me wonders if the RT cores are actually functioning properly.