NVIDIA GeForce Ampere 3DMark Time Spy Benchmarks show it to be 30 percent faster than RTX 2080 Ti
Filed firmly under rumor: news hit the web today that an alleged NVIDIA GeForce "Ampere" GPU made an appearance in 3DMark Time Spy. No name details are noted down, so it is assumed to be a GeForce RTX 3080 or RTX 3090.
Rumored specs have been doing the rounds for a while now, but this time a 3DMark Time Spy score was dug up by _rogame (Hardware Leaks). The card listed scored 18257 points, roughly 30 percent faster than the RTX 2080 Ti Founders Edition. Futuremark SystemInfo does bring a readout of the GPU clock at 1935 MHz with a memory clock at 6000 MHz. That last one is interesting, as typical actual memory clocks are listed as 1750 MHz for 14 Gbps GDDR6. Please keep the usual disclaimers in mind.
_rogame claims to know with 100 percent certainty that this test was run by an NVIDIA employee.
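As a quick sanity check on that memory readout, here is a minimal sketch in C, assuming the usual reporting convention where monitoring tools list the actual GDDR6 clock and the effective per-pin data rate is 8x that figure (the 2080 Ti baseline is simply back-calculated from the claimed 30 percent):

#include <stdio.h>

int main(void)
{
    /* Usual GDDR6 convention: tools report the actual memory clock, and the
       effective per-pin data rate is 8x that figure.                        */
    const double typical_clock_mhz  = 1750.0;  /* typical 14 Gbps GDDR6 */
    const double reported_clock_mhz = 6000.0;  /* SystemInfo readout    */

    printf("1750 MHz actual     -> %4.0f Gbps effective\n",
           typical_clock_mhz * 8.0 / 1000.0);   /* 14 Gbps */
    printf("6000 MHz, same rule -> %4.0f Gbps effective\n",
           reported_clock_mhz * 8.0 / 1000.0);  /* 48 Gbps, implausible, so the
                                                   readout likely does not follow
                                                   this convention            */

    /* The "roughly 30 percent faster" claim implies a 2080 Ti FE baseline of
       about 18257 / 1.30 Time Spy points.                                   */
    printf("implied 2080 Ti FE baseline: ~%.0f points\n", 18257.0 / 1.30);
    return 0;
}

None of this changes the rumor status; it only illustrates why the 6000 MHz figure stands out.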
NVIDIA GeForce RTX 3080 Now Better Visualized with 3D Renders - 06/08/2020 09:32 AM
The bubble busted over the weekend when photos leaked of the alleged GeForce RTX 3080, the new front and back fan design is pretty unique. A Reddit user now has made some 3D renderings of the card, pr...
Here are the PC requirements for F1 2020, AMD Ryzen 5 2600X and Nvidia GeForce GTX 1660 Ti - 06/04/2020 07:34 AM
Codemasters announced the minimum and recommended requirements for its annual F1 game, in this case, F1 2020. The requirements are reasonable for the times and taking into account the visual quality i...
Games will only be on Nvidia GeForce Now after developer opts-in - 05/28/2020 08:12 AM
GeForce NOW is an extension of the PC ecosystem. There is no cost for developers — games just run without difficult porting requirements — helping them reach millions of players who don’t have g...
Rumor: NVIDIA GeForce RTX 3070 and 3080 Coming Q3 2020 + Specs - 04/29/2020 02:25 PM
The rumor mill can't stop the chatter about NVIDIA's series 3000 cards, and that will probably stay that way until announced. Today yet another twitter user posted a chart, with names and specificat...
NVIDIA GeForce Now Game Streaming Service Loses Xbox Game Studios and Warner Brothers - 04/21/2020 10:42 AM
Diversification often leaves a market all splintered up, each company holding its own ground and tied exclusively towards a service. Take for example Streaming services like (initially) Netflix, now d...
Senior Member
Posts: 11808
Joined: 2012-07-20
It's beautiful how you completely ignore the rest though. They couldn't even do texture fetching properly.
Also no response about how their Linux OpenGL driver is actually good, while the Windows driver is crap. Meanwhile it's all an "Evil Nvidia Plot to corrupt our fingers while we code the OpenGL driver in Windows" :p
Why do you feel the need to excuse them, when they're so obviously at fault? You pay them and they do a shitty job.
As for someone "I consider an authority", just lel
"Co-owner of Binomial LLC, game and open source developer, graphics programmer, lossless data and GPU texture compression specialist. Worked previously at SpaceX, Forgotten Empires, DICE, Microsoft Ensemble Studios, Valve, and Boss Fight Entertainment."
What would he know right? It's like Tim Sweeney all over again. "What do all these Unreal Engine guys know about graphics architectures".
Jesus Christ.
Issue here is that you are biased and look for supporting evidence, while the very people you cite say something else.
And that's from 2014 on top of it. That's the beginning of GCN, at a time when nobody sane even touched OGL outside of Linux.
If you look at current DX, only an imbecile would consider OGL over that or Vulkan.
Thanks for quoting someone you consider authority.
I wrote this because you can't go and ignore half of something you intended to weaponize. That way I can have the very same person tell you that you are biased.
OGL was always like DX10.
Senior Member
Posts: 11808
Joined: 2012-07-20
And this one needs separate addressing, because it shows there is something very wrong with your views.
AMD paid tons of money for ATi (they actually overpaid, as that was the only way), and that is the reason why nVidia does not have a GPU monopoly.
That was a big financial blow taken for your sake as a result!
And for your information, AMD has intel on one side and nVidia on the other.
That's intel's revenue of $75B against AMD's CPU division, and nVidia's revenue of $11B against AMD's GPU division, while AMD's entire revenue is $7B.
Go and tell me what a bad job they do in comparison to how they are "paid".
And then go to intel and ask them about their GPU division, which is surely at least twice as big as AMD's entire CPU+GPU division and paid better.
- - - -
If AMD were not in the GPU business, you would be just fine. You would be sitting on a GTX 680, maybe a 4GB version by now.
But the rest of us would feel the damage from a business strategy similar to intel's. We would maybe be at a GTX 680 + 50% performance on top of it.
Instead we have GPUs 4 times stronger and more widely available... thanks to the fact that AMD "IS" competitive enough where it matters to move the industry this way in 6 years.
- - - -
If AMD did what you want them to do, just so you would praise them, we would be nowhere. OGL would still suck for the industry (and be irrelevant for gamers) no matter how well it ran on AMD's HW. On top of that, AMD's HW would suck in return on DX, where it actually matters for AMD's business.
You would phase AMD out of the GPU business.
Chasing the 5% was what got AMD into problems. It took a change of leadership and way of thinking to get them back on track. AMD is doing what matters, because they do not have the finances to pursue the 5%. And they never did, because when they had superior products, they were prevented from cashing in on them by the industry.
Senior Member
Posts: 252
Joined: 2020-06-12
Not necessarily sure I'd agree with that - I'm currently playing MGSV at 5120x2880 DSR on my 1440p native monitor and the image quality increase is enormous. It's incredibly impressive.
Something I'd happily prioritise over many other features. I think it depends entirely on the specific game you are playing. I've not seen 8K in real life, and I don't know where the real point of significantly diminishing returns (for people with perfect eyesight) is, but I think we're a way off.
I completely agree re: rasterisation performance though; non-RT benchmarks are redundant going forward. I think within 5 years you'll see almost all games use some degree of RT, and at least a small minority of games use it heavily (way beyond what we've seen thus far).
Senior Member
Posts: 5553
Joined: 2017-11-23
Imo it is a gimmick in line with PhysX - even in Control, which everyone says is the poster child for RTX, it only looks marginally better than screen-space reflections, and at an absurd performance hit. DLSS is just trash, pure and simple... no one buys a 4K screen to play at sub-4K image quality. Native res or downsampling; anything else is a fail.
Senior Member
Posts: 8099
Joined: 2014-09-27
"B's driver devs try to follow the spec more closely than Vendor A, but in the end this tends to do them no good because most devs just use Vendor A's driver for development and when things don't work on Vendor B they blame the vendor, not the state of GL itself."
Thanks for quoting someone you consider authority.
It's beautiful how you completely ignore the rest though. They couldn't even do texture fetching properly.
This vendor can't get key stuff like queries or syncs to work reliably. So any extension that relies on syncs for CPU/GPU synchronization aren't workable. The driver devs remaining at this vendor pine to work at Vendor A.
Vendor B can't update its driver without breaking something. They will send you updates or hotfixes that fix one thing but break two other things. If you single step into one of this driver's entrypoints you'll notice layers upon layers of cruft tacked on over the years by devs who are no longer at the company. Nobody remaining at vendor B understands these barnacle-like software layers enough to safely change them.
I've occasionally seen bizarre things happen on Vendor B's driver when replaying GL call streams of shipped titles into this driver using voglreplay. The game itself will work fine, but when the GL callstream is replayed we'll see massive framebuffer corruption (that goes away if we flush the GL pipeline after every draw). My guess: this driver is probably using app profiles to just turn off entire features that are just too buggy.
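For readers who have not touched this corner of GL: the "syncs" in that quote are OpenGL fence sync objects used to make the CPU wait on GPU progress. A minimal sketch of the pattern in C, assuming an already-created GL 3.2+ context and a loader such as GLEW (illustrative only, not code from the quoted post or tool):

#include <GL/glew.h>  /* any GL 3.2+ function loader works; GLEW is just an example */

/* Issue GPU work, then block the CPU until the GPU has finished it.
   This is the "sync for CPU/GPU synchronization" pattern the quote says
   was unreliable on that vendor's driver.                               */
static void wait_for_gpu(void)
{
    GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);

    /* Flush so the fence is actually submitted, then wait up to 1 second. */
    GLenum status = glClientWaitSync(fence,
                                     GL_SYNC_FLUSH_COMMANDS_BIT,
                                     1000000000ull /* ns */);
    if (status == GL_TIMEOUT_EXPIRED || status == GL_WAIT_FAILED) {
        /* On a buggy driver this is where things go wrong. */
    }
    glDeleteSync(fence);
}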
Also no response about how their Linux OpenGL driver is actually good, while the Windows driver is crap. Meanwhile it's all an "Evil Nvidia Plot to corrupt our fingers while we code the OpenGL driver in Windows" :p
Why do you feel the need to excuse them, when they're so obviously at fault? You pay them and they do a shitty job.
As for someone "I consider an authority", just lel
"Co-owner of Binomial LLC, game and open source developer, graphics programmer, lossless data and GPU texture compression specialist. Worked previously at SpaceX, Forgotten Empires, DICE, Microsoft Ensemble Studios, Valve, and Boss Fight Entertainment."
What would he know right? It's like Tim Sweeney all over again. "What do all these Unreal Engine guys know about graphics architectures".
Jesus Christ.