RUMOR: NVIDIA to announce the GeForce RTX 3000 in August

Undying:
Isn't that a little underwhelming compared to the first leaks? A 10nm chip with 12GB VRAM and probably subpar ray tracing performance once again (no mention of RT cores). By that time AMD will have desktop 5nm RDNA2 GPUs with 16GB VRAM (looking at the new-gen RDNA 1.5 consoles). The tide can change right there.
TheDeeGee:
€1500 anyone?
Let's just assume the die size of the biggest chip is under 500mm^2 this time... that should limit pricing.
D1stRU3T0R:
Undying:

Isn't that a little underwhelming compared to the first leaks? A 10nm chip with 12GB VRAM and probably subpar ray tracing performance once again (no mention of RT cores). By that time AMD will have desktop 5nm RDNA2 GPUs with 16GB VRAM (looking at the new-gen RDNA 1.5 consoles). The tide can change right there.
Desktop RDNA2 on 5nm? Any source on that?
Undying:

Isn't that a little underwhelming compared to the first leaks? A 10nm chip with 12GB VRAM and probably subpar ray tracing performance once again (no mention of RT cores). By that time AMD will have desktop 5nm RDNA2 GPUs with 16GB VRAM (looking at the new-gen RDNA 1.5 consoles). The tide can change right there.
The consoles are RDNA2, and the chips that launch at the end of the year will be at best 7nm+, though plain 7nm is looking more likely. It's dumb posts like this that start the hype train that inevitably makes the actual product AMD releases feel subpar. In other words, AMD's rabid fanboys hurt their image more than the "failures" of the products do.
D1stRU3T0R:

Desktop RDNA2 on 5nm? Any source on that?
There is no source; he's talking out of his ass again. The last I saw, 5nm was slated for the end of 2021 with Zen 4 on EPYC.
That_Guy:
How is 5376 cores vs. the 2080 Ti's 4352 cores a 40% increase? More like 25% at the top, with the rest of the specs being pretty similar. Really eager to see benchmarks, but I doubt there will be a huge gap, more like 1080 Ti vs. 2080 Ti. Pricing is what's left to know.
I just want to point out that the new info says nothing about the configuration of the cards. The 10nm/12GB stuff is all still from a random Twitter guy. The new leak doesn't confirm that or even mention it: https://www.bloomberg.com/opinion/articles/2020-03-25/nvidia-s-next-generation-chips-may-be-biggest-pleasant-surprise Should also point out that Tweaktown somehow took that article and turned it into a source for this: https://www.tweaktown.com/news/71467/nvidia-geforce-rtx-3080-august-2020-reveal-launch-at-computex/index.html Which blows my mind.
Undying:

Isn't that a little underwhelming compared to the first leaks? A 10nm chip with 12GB VRAM and probably subpar ray tracing performance once again (no mention of RT cores). By that time AMD will have desktop 5nm RDNA2 GPUs with 16GB VRAM (looking at the new-gen RDNA 1.5 consoles). The tide can change right there.
Where do you come up with this stuff? AMD itself calls the consoles' GPU architecture RDNA2. 5nm isn't even in mass production yet, and you think AMD is going to have 300+mm^2 GPUs shipping on it by September? AMD has slides from 20 days ago saying it's on 7nm: https://www.extremetech.com/wp-content/uploads/2020/03/RDNA-Roadmap.jpg https://pc.watch.impress.co.jp/img/pcw/docs/1239/287/19_o.jpg So where is your source for 5nm, for the consoles being 1.5, and for a new architecture coming out for desktop?
That_Guy:

How is 5376 cores vs. the 2080 Ti's 4352 cores a 40% increase? More like 25% at the top, with the rest of the specs being pretty similar. Really eager to see benchmarks, but I doubt there will be a huge gap, more like 1080 Ti vs. 2080 Ti. Pricing is what's left to know.
Everything is what's left to know, because citing core and spec counts from random Twitter people is not knowing anything.
TheDeeGee:

€1500 anyone?
Seems cheap to me.
That_Guy:

How is 5376 cores vs. the 2080 Ti's 4352 cores a 40% increase? More like 25% at the top, with the rest of the specs being pretty similar. Really eager to see benchmarks, but I doubt there will be a huge gap, more like 1080 Ti vs. 2080 Ti. Pricing is what's left to know.
Optimizations. You can't just look at the number of cores and such.
I personally hope the rumour is false; I was really hoping for a bigger jump in performance, more in line with a 60-80% boost. Though it's curious why there is an 8k-core model but the 3080 Ti is only meant to have 5k cores; they have never had that big a gap between the Titan series and the xx80 Ti series. Part of me wonders if the 5376 figure is actually for the non-Ti 3080, though maybe that's just me hoping for a bigger model.
TheDeeGee:

€1500 anyone?
I've been saving money since September 2018, when we bought 2080 Tis. €1500, I don't care 😛 I'll sell the two 2080 Tis when the new generation comes, so the upgrade cost won't be too bad 🙂
Hilbert Hagedoorn:

According to the latest leaked gossip (and it is just that), NVIDIA would now be launching its new GeForce RTX 3000 at the end of August so that its partners can display their custom models at Compute... RUMOR: NVIDIA to announce the GeForce RTX 3000 in August
NVIDIA needs monster GPUs, a 3060, 3070, 3080 and 3080 Ti, to compete with the new consoles. The consoles offer 12 TFLOPS for $399.
That_Guy:

How is 5376 cores vs. the 2080 Ti's 4352 cores a 40% increase? More like 25% at the top, with the rest of the specs being pretty similar. Really eager to see benchmarks, but I doubt there will be a huge gap, more like 1080 Ti vs. 2080 Ti. Pricing is what's left to know.
It's not all about the number of CUDA cores, but efficiency. A GTX 1060 (192-bit, 6GB) has 1280 cores, yet performs basically identically to a GTX 980 (256-bit) with 2048 cores. Anyway, I'm looking for something around $300 with performance close to a 2070 (non-Super) and better-optimized RT. Wishful thinking, but I think it's very possible.
Heh, a lot of things that get released during this crisis will be very much irrelevant to a lot of people out there. I'd wait until next year instead and come back with a bang: announcement, details, prices and release dates. Another thing to consider: if things actually get worse instead of better over time (until a vaccine comes out), the console launch might be delayed until next year. If the consoles don't come out this year, there's literally no point in releasing a 'competitive' GPU for the PC, since the current cards do the job just fine.
Undying:

Isn't that a little underwhelming compared to the first leaks? A 10nm chip with 12GB VRAM and probably subpar ray tracing performance once again (no mention of RT cores). By that time AMD will have desktop 5nm RDNA2 GPUs with 16GB VRAM (looking at the new-gen RDNA 1.5 consoles). The tide can change right there.
I'm sorry, but... nothing in your post makes sense. 10nm NVIDIA? That's a rumor that holds no ground, yet you call it a leak as though it's guaranteed? How does that work? RDNA2 on 5nm? This is the first I'm hearing of it. To my knowledge, RDNA2 is slated to be on 7nm or 7nm+, if they distinguish it from RDNA's 7nm. RDNA3 is supposed to be on an "advanced node", and it has not been specifically stated what that entails. So, are you claiming RDNA3 is delayed? Are you claiming RDNA2 is skipping 7nm, which would mean it's also delayed? What is it that you are claiming throughout this post?
Undying:

Isn't that a little underwhelming compared to the first leaks? A 10nm chip with 12GB VRAM and probably subpar ray tracing performance once again (no mention of RT cores).
The source article says TSMC 7nm+ in this leak, either way. But for memory, there is a bit of a conundrum in the future. Capacity and interface bit depth are tightly linked, and chip size dictates capacity. The 2080 Ti was 11x 32-bit 1GB chips = 11GB at 352-bit. 12GB with a 12x32 interface would work, or 24GB, which would be a larger jump and drive up costs. Or they could make the interface bigger, but a full 512-bit memory interface, e.g. 16x32 at 1GB each, would be very costly and complex, and they haven't done such a wide interface in forever. They can't mix chip sizes, as that ends up with a 970 situation.

Therefore, memory size scaling is problematic: you lack intermediate steps, since chips are typically made in 2x steps. 512MB, 1GB, 2GB, not 1.5GB.

PS: The memory interface on those console GPUs is freakin' weird, with different speeds for different areas. But it also appears to serve as both GPU and system memory, so it's a very custom solution, and that's how they achieve 16GB without an expensive 512-bit bus.
AMD will have RDNA2 cards out this fall, and they will be on enhanced 7nm, i.e. not EUV. Just thought I would chime in, given all of the 5nm talk. We don't even know if RDNA3 at the end of 2021 will be on 5nm, as all AMD will say right now is "advanced node".
Poor Ampere. It looks like yet another RADEON win. On the planet of Wet Dreams, where physics is forbidden by law and all you need to design a top-of-the-line GPU is to be generally clueless while having an armchair 😀
Astyanax:
RDNA2 only just competes with big Turing...
Astyanax:

RDNA2 only just competes with big Turing...
No it won't.