RUMOR: NVIDIA to announce the GeForce RTX 3000 in August
According to the latest leaked gossip (and it is just that), NVIDIA will now reportedly launch its new GeForce RTX 3000 series at the end of August, so that its partners can show their custom models at Computex 2020, which is now scheduled for September 28 to 30.
Of course, it remains SUPER uncertain whether Computex will take place at all. The company is expected to launch Tesla and Quadro models first, based on the Ampere architecture. We have already seen leaks mentioning up to 8192 CUDA cores and 48GB of HBM2E memory.
For the gaming line, rumors indicate that the GeForce RTX 3080 Ti would get 5376 CUDA cores, a 384-bit memory bus, and 12GB of VRAM, offering an estimated 40% performance increase over the RTX 2080 Ti. The RTX 3080 could feature 3840 CUDA cores, a 320-bit bus, 10GB of VRAM, and roughly 10% better performance than the RTX 2080 Ti. It will be interesting to see whether this information ends up being true, or just another false rumor.
Senior Member
Posts: 14623
Joined: 2014-07-21
So now this is a gimping discussion again? I'm sure AMD is still optimizing GCN just as much as RDNA(2). Not. Even from an economic perspective, it would be stupid.
I find it amazing how people think their 5-year-old $250 cards would always perform miraculously well due to driver optimizations, if only the companies would do them.
Senior Member
Posts: 4822
Joined: 2009-09-08
Same thing.
They gimp older cards by not optimizing them equally to the newer cards, call it soft gimping if you like, but they are certainly doing it.
This happens in every industry. Eventually Nvidia and others have to leave older cards behind so they can focus on the new ones, because optimizing costs a lot of time and money. It's no conspiracy, it's just the way it is. You can always argue that they should provide more support over more time, but that's another issue.
Senior Member
Posts: 1678
Joined: 2017-02-14
So in summary, nobody has a clue how Nvidia's 3000 series or AMD's Big Navi will perform.

Senior Member
Posts: 11681
Joined: 2004-05-10
Nobody will know how these cards will perform until about a month before release, which is about when we start seeing leaks. Both Nvidia and AMD guard performance figures very tightly. But when cards are in the manufacturing or distribution stage, that's when we might see the odd leak or two come out.
Posts: 22472
Joined: 2008-07-14
Same thing.
They gimp older cards by not optimizing them equally to the newer cards, call it soft gimping if you like, but they are certainly doing it.
There is a limit to how much they can "optimize" drivers for any given architecture. The cards aren't being gimped, they're simply aging....
How do you know? How would you even prove this?
Unlike the idea of manufacturers holding back performance, which could actually be proven, it's impossible to prove that Nvidia isn't providing the same level of optimization to all supported generations of products. At least for anyone not working at Nvidia as a driver dev, anyway....
Some people just don't want to believe that their previous generation product can't keep pace with the current generation replacement and would rather blame the manufacturer for "gimping" said product.....