NVIDIA Titan V Raytraces Battlefield V in RTX mode at proper perf (but does not have any RT cores).

Delete pls, I am done...
Now we all need a Titan V RT edition: an improved Volta with RT cores at $4,000. That would destroy everything until 2020. Start a petition.
RTX - The cash grab story!
So what did I pay $1200 for? I knew RTX was overpriced, but now it sounds like we got scammed. This could turn into a big lawsuit against Nvidia.
Fox2232:

It should be known that DXR is part of Microsoft's DX12, therefore the instructions should not be proprietary. Are there some nV guys here with BF V willing to trick the game into thinking you have an RTX card?
It's not about instructions nor about hardware: Microsoft DirectX Raytracing is simply an API, not an ABI, which means it doesn't matter how IHVs implement it; the library only requires driver support. There is also a fallback layer based on compute shaders, but it is primarily intended for development purposes only. If the driver handles it correctly, it can be used in production code, but mostly for very small computations, since the overhead is not negligible. The real question is: how much does driver support on Volta differ from Pascal's?
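For context, a minimal C++/D3D12 sketch of what "only requires driver support" looks like in practice: the application asks the driver for a raytracing tier through the ordinary feature-support query and never sees how the IHV implements it. (Error handling trimmed; the compute-shader fallback layer is a separate library and is not reported by this query.)

```cpp
// Minimal sketch: querying DXR support at the API level (C++ / D3D12).
// How the driver implements raytracing (RT cores, shaders, ...) is invisible here.
#include <d3d12.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main()
{
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))))
    {
        // D3D12_RAYTRACING_TIER_NOT_SUPPORTED (0) means the driver exposes
        // no DXR path at all; any tier above that is driver-provided.
        printf("Raytracing tier: %d\n", static_cast<int>(opts5.RaytracingTier));
    }
    device->Release();
    return 0;
}
```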
LOL, gotta love INTEL and NVIDIA. Keep doing $hit like this to pi$$ off your fans, and the ones smart enough will jump to AMD, but most INTEL & NVIDIA fans are too ignorant and stubborn to accept this reality. Keep buying that Vaseline.
Maybe ray-tracing has been rendering in software this entire time, making those RT cores useless, which would probably explain the huge performance hit with ray-tracing.
Can someone with an RTX card and BF5 try:
1) Disable raytracing in the game menu, but keep DX12
2) Add the following parameter to the executable shortcut: -DxrEnable

Potentially test combinations too: -DxrNullEnable -DxrLeakPsoEnable

This can confirm whether that's the correct parameter to enable DXR. The game itself checks the following, therefore it may or may not help on GTX cards and additional persuasion may be needed (a sketch of how such a check typically looks follows below):
detectedVendorId
detectedDeviceId
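To illustrate the detectedVendorId/detectedDeviceId part, here is a sketch of how an engine typically reads those IDs through DXGI. Whether Frostbite actually uses this exact path is an assumption; 0x10DE is NVIDIA's PCI vendor ID.

```cpp
// Sketch (assumption): the usual DXGI way an engine obtains the vendor and
// device IDs it gates features on. 0x10DE = NVIDIA, 0x1002 = AMD.
#include <dxgi.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // A game that whitelists RTX cards would compare these two values.
        printf("VendorId: 0x%04X  DeviceId: 0x%04X\n",
               desc.VendorId, desc.DeviceId);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```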
icedman:

Maybe ray-tracing has been rendering in software this entire time, making those RT cores useless, which would probably explain the huge performance hit with ray-tracing.
Why would they release such an expensive graphics card, market the hell out of its RT cores, and not make use of them until somebody actually found out about it? That'll run them into the ground even further. More so, why isn't AMD on this boat already if it's really true? Vega 64 might get the same results...
Alessio1989:

It's not about instructions nor about hardware: Microsoft DirectX Raytracing is simply an API, not an ABI, which means it doesn't matter how IHVs implement it; the library only requires driver support. There is also a fallback layer based on compute shaders, but it is primarily intended for development purposes only. If the driver handles it correctly, it can be used in production code, but mostly for very small computations, since the overhead is not negligible. The real question is: how much does driver support on Volta differ from Pascal's?
That's ignoring an important part: the actual instructions used determine performance. No RT cores on the Titan V, but FP16 performance equal to the 2080 Ti's, hints that this may actually be what's used. Considering Pascal's mediocre FP16 and nVidia's comparison... Vega 64 has FP16 performance between the 2080 and the 2080 Ti, and therefore AMD's implementation would do just fine. That's why it is important to know what kind of instructions are actually required.
I don't know, but that screenshot looks like simple screen-space reflection and not true path tracing like it's used on RTX cards. We need video as proof. [youtube=Kqn_mJRv5B4] Looks like DXR is an old thing and was tested on a 1080 Ti, where it runs at only about half the speed of the Titan V. So if the RTX 2080 Ti is a bit faster than the Titan V, we can probably assume the RT cores are working?
Glottiz:

I knew it, I f-ing knew it that something like this would be possible. Now let's see how quickly Nvidia releases new "hotfix" drivers that block all possible RTX hacks on Pascal GPUs 😀
But what are you excited about? You knew what? That a $3k card can perform similarly to a $1,200 card with ray tracing? This really isn't that hard to believe; the Titan V has a ton of tensor cores and CUDA cores, as well as 12 GB of HBM2 memory. It's not like they ran DXR on a 1080 Ti and matched 2080 or 2080 Ti performance. This finding, to me, shows that RT cores are in fact necessary, as they do a more efficient job of handling DXR. Why would Nvidia hotfix this issue? Would they be worried about people buying a $3k card over a $1,200 card?
Great White Shark:

I really do think that Nvidia have handled the RTX launch really badly. Too expensive, not enough ray tracing support. I'm hoping the new 7 nanometer cards will drive the prices down for RTX. But it won't be soon.
I really don't agree with this statement. Sure, RTX cards may not be what we all hoped for, because it seems most just wanted increased rasterized performance for cheaper. Nvidia gave us cards at existing price points (except for the 2080 Ti, but maybe just think of it as the old Titans) and added some new tech to them. Sure, you don't get the 1080 Ti performance for $400 you hoped for, but this is the first time we have seen ray tracing in video games officially. Sure, from here on out ray tracing should get cheaper, but who else is offering DXR-compatible cards right now? We can point fingers at Nvidia, but where is AMD? I don't think there is anything wrong with RTX cards; I think it is great new gaming tech. Am I going to run out and buy one? No, I own a 1080 Ti. If I had something lower end I was still hanging onto, a 2080 would make for a great new card.
Fox2232:

That's ignoring an important part: the actual instructions used determine performance. No RT cores on the Titan V, but FP16 performance equal to the 2080 Ti's, hints that this may actually be what's used. Considering Pascal's mediocre FP16 and nVidia's comparison... Vega 64 has FP16 performance between the 2080 and the 2080 Ti, and therefore AMD's implementation would do just fine. That's why it is important to know what kind of instructions are actually required.
I agree that it is important to know what instructions are actually required, but I highly doubt that a Vega 64 could run DXR anywhere close to a 2080 Ti. Nvidia chose RT cores for the job because they are developed to run that tech most efficiently (cost to performance). The extra compute of Vega rarely shines in gaming as it is; I don't think adding real-time ray tracing to the mix would go well. I would love to see it though. We just don't know enough about what is going on here. Who knows what part of the GV100 is providing the performance. Not directed at you, but I am not sure what the excitement is about a $3,000 Volta card handling ray tracing like a $1,200 card. For me personally, I learned that the RT cores are extremely valuable for bringing down the cost of ray tracing in games. Much like other forms of computing, there is hardware that completes the job, and then there is hardware that completes the job much more efficiently.
After reading this thread I'm more confused than ever why people are surprised by this. Volta was highly marketed for its ray tracing capabilities, and there's not much difference between Volta and Turing:

- Titan V has 5120 CUDA cores, whereas the RTX 2080 Ti has 4352
- Titan V has faster memory
- Titan V has more tensor cores

The Titan V has pretty much more of everything versus the RTX 2080 Ti, and even the Titan RTX. The Titan V should be faster than the RTX 2080 Ti in most scenarios, but its clock speed doesn't really help it out in this case, and it ends up being generally slower. So the fact that the Titan V, a card I'll remind people again was DESIGNED for ray tracing, is able to match, OVERCLOCKED (according to the article), an RTX 2080 Ti without dedicated RT cores, shouldn't be surprising.

The question should not be: are RT cores needed?
The question should be: what would the Titan V perform like in RTX games if it HAD RT cores?

We also already know that Battlefield V specifically does not fully utilize RTX, so realistically, it's not the best game to see the full differences either. Personally, I read through the thread posted in the news article and it's hard to tell what is being tested and what isn't, and what is in comparison to what; it'd be nice if an actual reviewer were to do it at some point.
nevcairiel:

The number of rays they advertise is an order of magnitude lower than NVIDIA's, however, which makes the real difference between real-time gaming use and "just" using it in 3D graphics design and rendering. Doing ray tracing itself isn't "new"; doing it really fast is what was missing.
Mrays per second by itself does not say a lot. There is no specification of whether these numbers come from the RT cores alone, whether CUDA was used, or whether any post-processing is needed afterwards. We might need another two generations of GPUs before we have sufficient performance to do proper raytracing even on lower-end cards.
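To put rough numbers on why a raw Mrays/s figure says so little, here is a back-of-the-envelope ray budget; the samples-per-pixel and bounce counts are purely illustrative assumptions, not figures from any vendor.

```cpp
// Back-of-the-envelope ray budget for real-time raytracing (all values assumed).
#include <cstdio>

int main()
{
    const double pixels  = 1920.0 * 1080.0; // 1080p frame
    const double fps     = 60.0;
    const double spp     = 2.0;             // primary samples per pixel (assumed)
    const double bounces = 2.0;             // secondary rays per sample (assumed)

    const double rays_per_sec = pixels * fps * spp * (1.0 + bounces);
    printf("Needed: ~%.2f Grays/s\n", rays_per_sec / 1e9); // ~0.75 Grays/s

    // Even this modest setting lands in the Grays/s range, and the figure says
    // nothing about how expensive each ray is (shading, BVH quality, denoising).
    return 0;
}
```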
This is not entirely surprising. Nvidia accelerates raytracing using both the tensor cores and the so-called RT cores; a good chunk of it (specifically denoising) is done by the tensor cores. All this implies is that, at least in BFV, the specialized RT hardware is not heavily utilized, or the implementation via DXR does not leverage it.
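As a rough illustration of why denoising is its own pass: a low-sample-count ray-traced frame is noisy, and a filter has to smooth it afterwards. The real pipeline runs a learned denoiser on the tensor cores; the naive box filter below is only a stand-in to show the structure of that step, not NVIDIA's actual algorithm.

```cpp
// Toy stand-in for the denoising pass: a 3x3 box filter over a noisy
// single-channel image. The real denoiser is a learned filter on tensor
// cores; this only shows that denoising is a separate post-process.
#include <vector>
#include <algorithm>

std::vector<float> denoise(const std::vector<float>& img, int w, int h)
{
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
        {
            float sum = 0.0f;
            int   n   = 0;
            // Average the 3x3 neighborhood, clamping at the image border.
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                {
                    const int nx = std::clamp(x + dx, 0, w - 1);
                    const int ny = std::clamp(y + dy, 0, h - 1);
                    sum += img[ny * w + nx];
                    ++n;
                }
            out[y * w + x] = sum / n;
        }
    return out;
}
```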
Goutan:

So what did I pay $1200 for? I knew RTX was overpriced, but now it sounds like we got scammed. This could turn into a big lawsuit against Nvidia.
You paid $1200 for a life lesson about patience, and about how never to trust a corporate entity that exists solely to create ever-increasing amounts of profit.
oh dear here we go again 😛 ! What are you doing ngreedia ??? 😕
user1:

This is not entirely surprising. Nvidia accelerates raytracing using both the tensor cores and the so-called RT cores; a good chunk of it (specifically denoising) is done by the tensor cores. All this implies is that, at least in BFV, the specialized RT hardware is not heavily utilized, or the implementation via DXR does not leverage it.
Wasn't that what DICE said as well? They were not currently utilizing the RT cores, although this was back in the alpha or beta version, shortly after the initial presentation, and it's also changed since a few patches back. (The implementation now also uses SSR - in addition to the ray-tracing itself, that is - to cover more reflections, in addition to tweaking it for performance and other improvements or changes.) I wonder what state the SDK is in; it's being implemented and used in various games (DLSS, and NVIDIA's take on ray-tracing through DX12 and also Vulkan via extensions), though I guess that's covered by NDAs and it's still very early on. (Something's going on with Shadow of the Tomb Raider too, but I doubt we'll ever know what, and it'll eventually make its way into the release build of the game via some future patch.) EDIT: Ah, it's covered in an earlier reply already. 🙂