NVIDIA GeForce Ampere 3DMark Time Spy Benchmarks show it to be 30 percent faster than RTX 2080 Ti

by Hilbert Hagedoorn on: 06/22/2020 08:20 AM | source: hardwareleaks | 119 comment(s)

File this one under rumor: news hit the web today that an alleged NVIDIA GeForce "Ampere" GPU made an appearance in 3DMark Time Spy. No product name is noted in the entry, so it is assumed to be a GeForce RTX 3080 or RTX 3090.

Rumored specs have been doing the rounds for a while now, but this time a 3DMark Time Spy score was dug up by _rogame (Hardware Leaks). The listed card scored 18257 points, roughly 30 percent faster than the RTX 2080 Ti Founders Edition. Futuremark SystemInfo reports a GPU clock of 1935 MHz and a memory clock of 6000 MHz. That last figure is interesting, as actual memory clocks are typically listed at 1750 MHz for 14 Gbps GDDR6. Please keep the usual disclaimers in mind.
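To put those figures in perspective, here is a minimal sanity-check sketch in Python; the implied 2080 Ti baseline is derived from the 30 percent claim, and the 8x multiplier is the standard GDDR6 clock-to-data-rate relation, so neither value comes from the leak itself:

# Sanity-checking the leaked figures (derived values, not leak data).
ampere_score = 18257   # leaked 3DMark Time Spy score
claimed_gain = 0.30    # "roughly 30 percent faster"

# Implied RTX 2080 Ti FE baseline if the 30 percent claim holds.
print(f"Implied 2080 Ti FE score: {ampere_score / (1 + claimed_gain):.0f}")  # ~14044

def gddr6_rate_gbps(actual_clock_mhz: float) -> float:
    """GDDR6 transfers 8 bits per pin per actual memory-clock cycle."""
    return actual_clock_mhz * 8 / 1000

print(gddr6_rate_gbps(1750))  # 14.0 -> matches typical 14 Gbps GDDR6
print(gddr6_rate_gbps(6000))  # 48.0 -> implausible in 2020, so the 6000 MHz
                              # readout looks like a SystemInfo reporting quirk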

Rogame claims to know with 100 percent certainty that this test was run by an Nvidia employee.

  








Related Stories

NVIDIA GeForce RTX 3080 Now Better Visualized with 3D Renders - 06/08/2020 09:32 AM
The bubble burst over the weekend when photos of the alleged GeForce RTX 3080 leaked; the new front-and-back fan design is pretty unique. A Reddit user has now made some 3D renderings of the card, pr...

Here are the PC requirements for F1 2020, AMD Ryzen 5 2600X and Nvidia GeForce GTX 1660 Ti - 06/04/2020 07:34 AM
Codemasters announced the minimum and recommended requirements for its annual F1 game, in this case, F1 2020. The requirements are reasonable for the times, taking into account the visual quality i...

Games will only be on Nvidia GeForce Now after developer opts-in - 05/28/2020 08:12 AM
GeForce NOW is an extension of the PC ecosystem. There is no cost for developers — games just run without difficult porting requirements — helping them reach millions of players who don’t have g...

Rumor: NVIDIA GeForce RTX 3070 and 3080 Coming Q3 2020 + Specs - 04/29/2020 02:25 PM
The rumor mill can't stop the chatter about NVIDIA's series 3000 cards, and that will probably stay that way until they are announced. Today yet another Twitter user posted a chart, with names and specificat...

NVIDIA GeForce Now Game Streaming Service Loses Xbox Game Studios and Warner Brothers - 04/21/2020 10:42 AM
Diversification often leaves a market all splintered up, each company holding its own ground and tied exclusively to a service. Take for example streaming services like (initially) Netflix, now d...




PrMinisterGR
Senior Member



Posts: 8099
Joined: 2014-09-27

#5803145 Posted on: 06/25/2020 11:51 AM
"B's driver devs try to follow the spec more closely than Vendor A, but in the end this tends to do them no good because most devs just use Vendor A's driver for development and when things don't work on Vendor B they blame the vendor, not the state of GL itself."

Thanks for quoting someone you consider an authority.

It's beautiful how you completely ignore the rest though. They couldn't even do texture fetching properly.

"This vendor can't get key stuff like queries or syncs to work reliably. So any extension that relies on syncs for CPU/GPU synchronization aren't workable. The driver devs remaining at this vendor pine to work at Vendor A.

Vendor B can't update its driver without breaking something. They will send you updates or hotfixes that fix one thing but break two other things. If you single step into one of this driver's entrypoints you'll notice layers upon layers of cruft tacked on over the years by devs who are no longer at the company. Nobody remaining at vendor B understands these barnacle-like software layers enough to safely change them."

"I've occasionally seen bizarre things happen on Vendor B's driver when replaying GL call streams of shipped titles into this driver using voglreplay. The game itself will work fine, but when the GL callstream is replayed we'll see massive framebuffer corruption (that goes away if we flush the GL pipeline after every draw). My guess: this driver is probably using app profiles to just turn off entire features that are just too buggy."

Also no response about how their Linux OpenGL driver is actually good, while the Windows driver is crap. Meanwhile it's all an "Evil Nvidia Plot to corrupt our fingers while we code the OpenGL driver in Windows" :p

Why do you feel the need to excuse them, when they're so obviously at fault? You pay them and they do a shitty job.

As for someone "I consider an authority", just lel
"Co-owner of Binomial LLC, game and open source developer, graphics programmer, lossless data and GPU texture compression specialist. Worked previously at SpaceX, Forgotten Empires, DICE, Microsoft Ensemble Studios, Valve, and Boss Fight Entertainment."

What would he know right? It's like Tim Sweeney all over again. "What do all these Unreal Engine guys know about graphics architectures".
Jesus Christ.

Fox2232
Senior Member



Posts: 11808
Joined: 2012-07-20

#5803149 Posted on: 06/25/2020 12:06 PM
PrMinisterGR wrote (quoted in full in the post above): "It's beautiful how you completely ignore the rest though. [...] What would he know right? It's like Tim Sweeney all over again. Jesus Christ."
The issue here is that you are biased and go looking for evidence to support that bias, while the very people you cite say something else.
And that quote is from 2014 on top of it. That's the beginning of GCN, a time when nobody sane even touched OpenGL outside of Linux.

If you look at current DX, only an imbecile would consider OpenGL over it or Vulkan.
PrMinisterGR wrote: "Thanks for quoting someone you consider an authority."
I wrote this because you can't just ignore half of something you intended to weaponize. This way I can have the very same person tell you that you are biased.

OGL was always like DX10.

Fox2232
Senior Member



Posts: 11808
Joined: 2012-07-20

#5803160 Posted on: 06/25/2020 12:38 PM
PrMinisterGR wrote: "Why do you feel the need to excuse them, when they're so obviously at fault? You pay them and they do a shitty job."

This one needs separate addressing, because it shows there is something very wrong with your views.

AMD paid tons of money for ATi (they actually overpaid, as that was the only way), and that is the reason nVidia does not have a GPU monopoly.
That was a big financial blow taken for your sake!

And for your information, AMD has Intel on one side and nVidia on the other.
That's a revenue of $75B against AMD's CPU division, and a revenue of $11B against a GPU division where AMD's revenue is $7B.
Go and tell me what a bad job they do in comparison to how they are "paid".

Then go to Intel and ask them about their GPU division, which is surely at least twice as big as AMD's entire CPU+GPU division, and better paid.
- - - -
If AMD were not in the GPU business, you would be just fine. You would be sitting on a GTX 680, maybe a 4 GB version by now.
But the rest of us would feel the damage from a business strategy similar to Intel's. We would maybe be at GTX 680 + 50% performance on top of it.
Instead we have GPUs 4 times stronger and more widely available... thanks to the fact that AMD "IS" competitive enough where it matters to move the industry this way in 6 years.
- - - -
If AMD did what you want them to do, so that you would praise them, we would be nowhere. OpenGL would still suck for the industry (and be irrelevant for gamers) no matter how well it ran on AMD's HW. On top of that, AMD's HW would suck in return on DX, where it actually matters for AMD's business.
You would phase AMD out of the GPU business.

Chasing the 5% is what got AMD into trouble. It took a change of leadership and way of thinking to get them back on track. AMD is doing what matters, because they do not have the finances to pursue the 5%. And they never did, because when they had superior products, the industry prevented them from cashing in on them.

TheSissyOfFremont
Senior Member



Posts: 252
Joined: 2020-06-12

#5803166 Posted on: 06/25/2020 12:58 PM
Quote: "Ray tracing and DLSS are two completely different things. Also sacrificing post processing for raw pixel count becomes more and more pointless."


Not sure I'd necessarily agree with that - I'm currently playing MGSV at 5120x2880 DSR on my 1440p-native monitor and the image quality increase is enormous. It's incredibly impressive.
Something I'd happily prioritise over many other features; I think it depends entirely on the specific game you are playing. I've not seen 8K in real life, and I don't know where the real point of significantly diminishing returns (for people with perfect eyesight) is, but I think we're a way off.

I completely agree re: rasterisation performance though - non-RT benchmarks are redundant going forward. I think within 5 years you'll see almost all games use some degree of RT, and at least a small minority of games use it heavily (way beyond what we've seen thus far).
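For context on the resolutions in the post above: 5120x2880 is exactly four times the pixel count of a 2560x1440 panel (DSR's 4.00x factor, i.e. 2x per axis), which is where the supersampled image quality comes from. A quick sketch of the arithmetic in Python:

# DSR renders the game at a higher resolution, then downsamples the result
# to the native panel. Pixel-count ratio for the setup described above:
native = (2560, 1440)   # 1440p monitor
dsr = (5120, 2880)      # DSR render resolution

factor = (dsr[0] * dsr[1]) / (native[0] * native[1])
print(f"DSR factor: {factor:.2f}x")                       # 4.00x (2x per axis)
print(f"Pixels rendered per frame: {dsr[0] * dsr[1]:,}")  # 14,745,600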

Dragam1337
Senior Member



Posts: 5553
Joined: 2017-11-23

#5803182 Posted on: 06/25/2020 01:50 PM
Quote: "I wouldn't call exclusive features which can in many instances dramatically improve image quality and realism or increase frame rates on higher resolution via upscaling as "gimmicks". AMD fanboys used to always say that about PhysX... up until it was made open a few years ago. But coming from their perspective, i'd probably be a bit bitter seeing tech i can't use because i saved a few bucks and bought a competitors product."


Imo it is a gimmick in line with PhysX - even in Control, which everyone says is the poster child for RTX, it only looks marginally better than screen-space reflections, and at an absurd performance hit. DLSS is just trash, pure and simple... no one buys a 4K screen to play at sub-4K image quality. Native res or downsampling; anything else is fail.


