3DMark Time Spy Raytracing Benchmark Update To Arrive Next Month

https://forums.guru3d.com/data/avatars/m/199/199386.jpg
ooo
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
Cool, I'm gonna run this at 320x240 on my GTX 1070, should run pretty darn good, don't worry I've done the math!
https://forums.guru3d.com/data/avatars/m/200/200296.jpg
Isn't it DirectX Raytracing (DXR)? 🙂
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Robbo9999:

Cool, I'm gonna run this at 320x240 on my GTX 1070, should run pretty darn good, don't worry I've done the math!
Don't go there. I was playing ES: Oblivion at 720x480 with all the eye candy for a while on a notebook, just to see what I was missing at 1280x800.
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
Fox2232:

Don't go there. I was playing ES: Oblivion at 720x480 with all the eye candy for a while on a notebook, just to see what I was missing at 1280x800.
Ha, I did the same back in perhaps 2010 or 2011, with a notebook... an M1530 with an overclocked 8600M GT, Crysis 2 at max settings. I think it was something like ---x512 (some weird custom resolution I tried!). I soon realised it was too blurry!
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
UL... When you run this benchmark, it will tell you if your PC is properly grounded, up to electrical standards, and all in all safe to use. It might also tell you something about the performance.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Someone called it here as a joke... and lookilooki xD
data/avatar/default/avatar39.webp
Man, AMD is not gonna look great in these benchmarks... I'm starting to think that maybe AMD gave up on consumer graphics: they are doing great in consoles, Ryzen is a strong product, and datacenter/AI is far more profitable.
data/avatar/default/avatar35.webp
AMD will release something sooner or later. Also, there's no game that current cards can't run at a decent resolution and framerate. In fact, this was a really bad year for PC gaming; I don't remember anything special being released.
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Just to note, this is running on DXR, not RTX. RTX is the library Nvidia uses for Turing's RT cores to communicate with DXR; without that support, the benchmark won't use them. The only requirement for DXR is DX12-level hardware.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
HardwareCaps:

Man, AMD is not gonna look great in these benchmarks... I'm starting to think that maybe AMD gave up on consumer graphics: they are doing great in consoles, Ryzen is a strong product, and datacenter/AI is far more profitable.
Guess what AMD is making for the next PS console?
data/avatar/default/avatar30.webp
HardwareCaps:

Man, AMD is not gonna look great in these benchmarks... I'm starting to think that maybe AMD gave up on consumer graphics: they are doing great in consoles, Ryzen is a strong product, and datacenter/AI is far more profitable.
I don't think they have; they're just having a hard time. They have the traditional console market, but with razor-thin margins. Nvidia and Intel both laughed at how little they are making, but maybe AMD just wants Sony's and Microsoft's help?
https://forums.guru3d.com/data/avatars/m/273/273835.jpg
Just make everything reflect! I wanna see reflecting matte surfaces in games.
data/avatar/default/avatar32.webp
So I will go from "Your pc is better than 99% of all results" to "You are a console peasant"!
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
vbetts:

Just to note, this is running on DXR, not RTX. RTX is the library Nvidia uses for Turing's RT cores to communicate with DXR; without that support, the benchmark won't use them. The only requirement for DXR is DX12-level hardware.
Ah right, does that mean that Turing won't be able to use its RT cores in this benchmark? i.e. Turing will not get a massive performance bump (e.g. 10x) over Pascal here? EDIT: Sorry, didn't read your post carefully enough - you said that without RTX the RT cores can't be used - well, that sucks. So Turing will not get a massive bump over Pascal beyond the supposed 40-60% basic increase in Turing's general rendering power over Pascal.
data/avatar/default/avatar27.webp
Well, you can't expect benchmark programs to benchmark proprietary technology; that would just be unfair. Why do you think we haven't seen this (Serra)? Because it would be unfair to Nvidia. You see some of this used in the Forza 7 game; it's why the Vega won. https://www.overclock3d.net/news/gpu_displays/amd_rx_vega_-_what_is_rapid_packed_math/1 https://www.guru3d.com/news-story/forza-7-pc-graphics-performance-benchmarks.html https://www.overclock3d.net/gfx/articles/2017/07/31102128179l.jpg
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
Pimpiklem:

Well, you can't expect benchmark programs to benchmark proprietary technology; that would just be unfair. Why do you think we haven't seen this (Serra)? Because it would be unfair to Nvidia. You see some of this used in the Forza 7 game; it's why the Vega won. https://www.overclock3d.net/news/gpu_displays/amd_rx_vega_-_what_is_rapid_packed_math/1 https://www.guru3d.com/news-story/forza-7-pc-graphics-performance-benchmarks.html https://www.overclock3d.net/gfx/articles/2017/07/31102128179l.jpg
Yes, I guess that would be unfair. But if they could design the ray tracing benchmark to support the proprietary ray tracing technologies of both Nvidia and AMD (if AMD develops one), then I think that would be fair.
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Robbo9999:

Ah right, does that mean that Turing won't be able to use its RT cores in this benchmark? i.e. Turing will not get a massive performance bump (e.g. 10x) over Pascal here? EDIT: Sorry, didn't read your post carefully enough - you said that without RTX the RT cores can't be used - well, that sucks. So Turing will not get a massive bump over Pascal beyond the supposed 40-60% basic increase in Turing's general rendering power over Pascal.
This was an interesting article Engadget posted on it: https://www.engadget.com/2018/08/25/nvidia-rtx-speed-claims-rt-turing-cores-video/ To me it seems like theoretical power: with RTX, Turing can put out 10x the power in ray tracing.
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
vbetts:

This was an interesting article Engadget posted on it: https://www.engadget.com/2018/08/25/nvidia-rtx-speed-claims-rt-turing-cores-video/ To me it seems like theoretical power: with RTX, Turing can put out 10x the power in ray tracing.
Yep, I knew about RTX Turing being 10x more powerful than Pascal in ray tracing, which is why I was joking earlier that I'd need to run this benchmark on my GTX 1070 at 320x240 to get decent frame rates vs Turing at 1920x1080. But given that you say this benchmark doesn't support the RTX ray tracing cores, perhaps it will run better than I expected on my GTX 1070; otherwise it would run rubbish on Turing GPUs too, considering they can't use their ray tracing cores.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
HardwareCaps:

Man, AMD is not gonna look great in these benchmarks... I'm starting to think that maybe AMD gave up on consumer graphics: they are doing great in consoles, Ryzen is a strong product, and datacenter/AI is far more profitable.
Not really. AMD still offers very nice cards for every market with the exception of the high end. Their problem was that they bet heavily on HBM memory, and that was a huge failure, at least for now... And with Nvidia on a roll, things got even worse, of course. But as long as they continue to make good cards at good prices, they will continue to sell them. As for the high end, maybe they should create a new dual card, like a Vega 64 X2. Of course, they have to fix their power usage first...