RUMOR: NVIDIA to announce the GeForce RTX 3000 in August

by Hilbert Hagedoorn on: 03/26/2020 07:34 PM | source: tweaktown | 76 comment(s)

According to the latest leaked gossip (and it is just that), NVIDIA would now launch its new GeForce RTX 3000 series at the end of August, so that its partners can display their custom models at Computex 2020, which is now slated to take place from September 28 to 30.

Of course, it remains SUPER uncertain whether Computex will open its doors at all. The company is expected to launch Tesla and Quadro models first, based on the Ampere architecture; we have already seen leaks claiming up to 8192 CUDA cores and 48GB of HBM2E memory.

For the gaming lineup, rumors indicate that the GeForce RTX 3080 Ti would get 5376 CUDA cores, a 384-bit memory bus and 12GB of VRAM, offering an estimated 40% performance increase over the RTX 2080 Ti. The RTX 3080 could feature 3840 CUDA cores, a 320-bit bus, 10GB of VRAM and roughly 10% better performance than the RTX 2080 Ti. It will be interesting to see whether this information ends up being true or turns out to be just another false rumor.
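For a rough sense of how such estimates tend to be derived, the naive back-of-the-envelope below scales raw shader throughput as CUDA cores times boost clock. The clocks assigned to the rumored cards are assumptions for illustration only (the leak names no clock speeds), and real performance also depends on IPC, memory bandwidth and drivers, so treat this as a sketch rather than a prediction.

```python
# Naive throughput estimate: CUDA cores x boost clock (GHz).
# The rumored cards' clocks are illustrative assumptions, not leaked specs.
cards = {
    "RTX 2080 Ti":           (4352, 1.545),  # known shipping spec
    "RTX 3080 (rumored)":    (3840, 1.90),   # assumed clock
    "RTX 3080 Ti (rumored)": (5376, 1.80),   # assumed clock
}

base_cores, base_clock = cards["RTX 2080 Ti"]
base = base_cores * base_clock

for name, (cores, clock) in cards.items():
    gain = cores * clock / base - 1.0
    print(f"{name}: {cores} cores @ {clock:.3f} GHz -> {gain:+.0%} vs RTX 2080 Ti")
```

With those assumed clocks the naive math lands near the rumored +10% and +40% figures, which hints the leak itself may be little more than core counts multiplied by a guessed clock.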


Related Stories

Rumor: NVIDIA GeForce Ampere to be fabbed at 10nm, all cards RTX ? - 03/12/2020 05:42 PM
We'll probably go back and forth a bit on the topic of Ampere until NVIDIA lifts all the mystery. The expectation was that NVIDIA's upcoming GPUs would be fabbed at 7nm. However, that fabrication...

Rumor: NVIDIA Ampere GeForce RTX 3070 and RTX 3080 specs surface - 01/20/2020 01:04 PM
It has been a crazy year, with lots of leaks and info mostly coming through unknown Twitter accounts. Today we can add to that, as alleged specifications of the GeForce RTX 3070 and RTX 3080 have surfaced....

Rumor: Nvidia Reportedly To Acquire Mellanox For $7B - 03/11/2019 08:56 AM
And let me state, this is a very solid rumor. Nvidia is said to have purchased the American company Mellanox Technologies, the manufacturer of InfiniBand and Ethernet network equipment. Reportedly, Nvidia...

Rumor: Nvidia GeForce GTX 1180 Expected in July? - 05/18/2018 08:06 AM
A new generation of graphics cards from Nvidia would become available in July, according to close-lipped sources from Tom's Hardware. The next-gen GPUs would be based on the Turing platform, but de...

Rumor: Nvidia Ampere might launch as GeForce GTX 2070 and 2080 on April 12th - 02/09/2018 04:07 PM
It's been quiet on the graphics card front from Nvidia. We've mentioned it a couple of times already: we really do not expect Volta with HBM2 to reach consumer-grade graphics anytime soon. The...




Dellers
Member



Posts: 68
Joined: 2014-04-01

#5774618 Posted on: 03/30/2020 12:30 AM
I'm pretty much just interested in seeing the prices, to determine how big of a nope it'll be. After recent currency movements etc., the 2080 Ti alone is currently significantly more expensive than the best non-SLI gaming rig you could get here a few years back. Having to spend a huge part of my income on a computer is simply a big no. I could barely afford my current rig, and now the prices are simply not within my range at all. Good computers have gone from something I could save up for rather quickly to something I can't justify buying.

Denial
Senior Member



Posts: 14091
Joined: 2004-05-16

#5774627 Posted on: 03/30/2020 01:04 AM

Did you just pull those numbers out of your arse? The 1080 ti was 60-80% faster than the 980 ti, depending on the title, as was the 980 ti over the 780 ti.

He explicitly and repeatedly defined the uarch, not the optimization within the uarch, as the defining metric for his claim. I can read just fine, obviously. Sheesh. Why do folks do this?

I was comparing the relative performance of the cards - which was a mistake because @Robbo9999 is just talking about uplift - I was wrong.

Regardless, before we slid off into the performance stuff - the entire point I was trying to make originally is that there is still a threshold of some kind that Nvidia isn't going to cross but could - so in reality they are always "holding back performance", but it simply doesn't make sense to cross that threshold for the customer. For example, nothing is stopping Nvidia from developing a 1000W graphics card. Nothing is stopping Nvidia from shipping a 2,000 mm² die - but they don't, because customers expect their 600-1000W PSUs to run the entire computer, not just a GPU. They don't because 2,000 mm² would be astronomically priced... yet no one sat there for the last two decades saying "Nvidia is holding back performance because they could ship a better card than a person could afford", because we understand that there are limitations that don't make financial/practical sense to bypass.

Cards like the 1080 Ti got away with what they did because the clock rate went from ~1200 MHz on the 980 Ti to ~1700 MHz on the 1080 Ti, which is an extremely rare jump, due to Nvidia using optimized cell libraries (plus CPO). I doubt we're going to see a massive jump like that again.

Cost scaling is gone with 7nm - which actually costs more per 100M gates than 16nm - so any card with more than the ~19B transistors of the 2080 Ti is simply going to cost more. Unless Nvidia can get clock rates up to 2.5GHz or pulls off some really magical architecture improvements, even hitting that 40% is probably really difficult without the cost going up massively (transistor count). Density is no longer helping with cost the way it has for generations prior.
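To put the cost argument in concrete terms, here's a toy calculation - every figure in it is an illustrative assumption, not actual foundry pricing. If cost per gate no longer drops at 7nm, raw silicon cost scales roughly linearly with transistor count:

```python
# Toy die-cost comparison. The next-gen transistor count and the
# cost-per-gate ratio are made-up illustrative numbers, not foundry data.
transistors_tu102 = 18.6e9    # RTX 2080 Ti (TU102), 12nm
transistors_next  = 26.0e9    # hypothetical larger Ampere die
cost_per_gate_ratio = 1.10    # assumed: 7nm ~10% pricier per 100M gates

relative_cost = (transistors_next / transistors_tu102) * cost_per_gate_ratio
print(f"Relative silicon cost vs TU102: {relative_cost:.2f}x")  # ~1.54x
```

Under those assumptions, a die with roughly 40% more transistors costs about 54% more in raw silicon, which is exactly the squeeze being described.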

_

Anyway, it doesn't even matter because, again, the 40% number comes from a random person. Might as well argue about a 200% performance increase because I said Ampere is going to be 200% faster.

HeavyHemi
Senior Member



Posts: 6952
Joined: 2008-10-27

#5774651 Posted on: 03/30/2020 04:04 AM

I'm sorry but what you're saying doesn't mean anything.

He compared a 1080 Ti to a 980 Ti and a 980 Ti to a 780 Ti. There is no 880 Ti; these are simply comparisons from one GPU year (generation) to another.

8800 GTX to 9800 GTX, no difference.

If you want to read it a specific way because you want to do that, by all means do that. However, it doesn't mean it was stated that way at all.

He did not state Kepler to Maxwell or Maxwell to Pascal.

Even if he had, it wouldn't mean much; as has been stated before, Pascal is a refined Maxwell. It's not a brand-new architecture. It doesn't matter that it has a new name; that's meaningless.

In fact, the GTX 10 series vs. the 9 series is very similar to the 8000 vs. the 9000 series: both transitions had limited differences between architectures, and the main difference was the node shrink.

The only REAL difference was that Nvidia decided to actually build up from the lower node, which they could have done with the 9000 series, but didn't. (I mean, technically they did. The G80 has 681 million transistors whereas the G92/b had 754 million, so it was far from "just" a die shrink. They simply did not go all out with it like they did from Maxwell to Pascal, and they could have.)

You do realize your post agrees with what I just posted, including my critique of your literacy. He was comparing uarchs, not enhancements within a uarch, when making his percentage claims. This is a fact. All you did was post a lot of noise in an effort to obscure that he said:

"The 1080 ti was 60-80% faster than the 980 ti, depending on the title, as was the 980 ti over the 780 ti.


Arguing that he used the MODEL number within the uarch and not the uarch NAME is your defense? Not a thing you blabber at me is going to change what he said into what you're arguing. The end. 880 Ti, LMAO... lame. I mean, a 10-year-old could make a less inept defense.

HeavyHemi
Senior Member



Posts: 6952
Joined: 2008-10-27

#5774652 Posted on: 03/30/2020 04:08 AM

Please do not start this garbage. As one who worked in the industry, I am offended every time one of you 'civilians' accuses us of cheating you out of performance for profit. That is literally backwards... well, except for Apple. ;)

And you're right, the performance is speculation at this time.

Denial
Senior Member



Posts: 14091
Joined: 2004-05-16

#5774968 Posted on: 03/31/2020 12:47 AM
Same thing.
They gimp older cards by not optimizing them equally to the newer cards; call it soft gimping if you like, but they are certainly doing it.

How do you know? How would you even prove this?


