Rumor: NVIDIA GeForce RTX 3070 and 3080 Coming Q3 2020 + Specs

EL1TE:

Maybe, maybe not. Normally we would expect the 3070 to be like a 2080; that's how it usually works: the next-gen x70 matches the previous-gen x80.
Not in recent times. The 1060 6GB = 980, and the 2060 = 1080. Speaking of non-Ti versions, obviously.
I have a GTX 1080 (600 USD MSRP, released in May 2016). After almost 4 years, if I can get a good upgrade with a 600 USD 2020 GPU, then I might end up upgrading (with either an Nvidia or AMD GPU), because with a 3700X the GPU is the bottleneck now. (Before, I had an i5-4590 and the CPU was the bottleneck.) If I still can't get a good upgrade for the same price after 4 years, then I'll just keep waiting 1 or 2 more years. I'm still fine with the GTX 1080 --> I don't desire Ultra quality settings because I prioritize FPS above ~90 for heavy games (1440p 144Hz VRR monitor). In the last 2 months I've been playing MHW (I haven't been addicted to a game in many years). With the 1080 I can play at 90-110 FPS at 1440p with medium settings and stay inside the VRR range (48-144), except for some really crazy situations.
Whatever it is, I hope it's "BIG" from both teams, red and green.
karma777police:

If those specs are true, it is going to wipe the floor with RDNA 2.
If those specs are true, the only thing it's going to wipe is our wallets 🙂
A 3080 Ti with GA100, not GA102? This looks like a fake to me. Or, if it is true, a big improvement.
Too big of a spec gap between the 3080 and the Ti, which means the perf and price gaps may be similarly lopsided. It doesn't make sense to leave that large an unfilled gap in Nvidia's lineup. It's something AMD would surely take advantage of, which Nvidia will probably not let happen. Again, it doesn't make sense.
Smells like the 3080 Ti will be a dual-chiplet edition.
RT cores do not take much space at all, and they are not the bottleneck in ray-traced gameplay anyway; scaling them up by raising SM counts is a better idea than just putting an extra one per SM.
alanm:

Too big of a spec gap between the 3080 and the Ti, which means the perf and price gaps may be similarly lopsided. It doesn't make sense to leave that large an unfilled gap in Nvidia's lineup. It's something AMD would surely take advantage of, which Nvidia will probably not let happen. Again, it doesn't make sense.
A gap for a 3080 Super, or whatever naming scheme they will use to make it even more confusing?
H83:

The 3080 Ti having almost twice the cores of the 3080 seems a little fishy...
Yep, it is fishy. The current 12nm behemoths cannot simply shrink to 7nm, as they would still be too big (over 500mm2, since many parts do not scale linearly on a smaller node), making yields atrocious at best and manufacturing impossible at worst. Doubling the core count would also mean an 800-1000mm2 GPU at 7nm, when TSMC has stated that 500-520mm2 is the biggest chip they can make. The MI60 is 480mm2 with atrocious yields, making it very expensive to manufacture. The Xbox Series X APU is 495mm2 with 56 CUs (and also includes an 8-core Zen 2 and I/O), of which 4 are disabled (whichever are the faulty ones) to improve yields, and further-faulty APUs will then power a lesser Xbox. (This also suggests the PS5 will be far cheaper: with 36 CUs, the APU is much smaller, so it's cheaper and easier to make.) In comparison, the 5700 XT is 251mm2.
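To put some rough numbers on that yield argument, here is a minimal back-of-the-envelope sketch using the classic Poisson die-yield model, Y = e^(-A*D0). The defect density D0 below is an illustrative assumption (not a published TSMC figure), and the die areas are the figures claimed in this post, which are disputed further down the thread:

```python
import math

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Fraction of fully working dies under a simple Poisson defect model."""
    return math.exp(-(die_area_mm2 / 100.0) * d0_per_cm2)  # mm^2 -> cm^2

D0 = 0.2  # assumed defects per cm^2 -- illustrative only

for name, area in [("5700 XT", 251),
                   ("Series X APU (claimed)", 495),
                   ("hypothetical big Ampere", 800)]:
    print(f"{name} ({area}mm2): {poisson_yield(area, D0):.0%} perfect dies")
```

Note that salvaging, as described above (fusing off faulty CUs/SMs and selling the die as a lesser part), recovers many of the non-perfect dies, which is exactly why big chips ship with units disabled.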
Looks like total BS: No reason to wait on the xx80 Ti cards anymore, since they were typically used to combat new AMD products. Given Nvidia has effectively lapped AMD, they don't need to hold these back anymore; the more time they're on shelves, the more units they can sell, and the more money Nvidia can make off them. Launching the xx60 card many months after the xx70/xx80 cards makes no sense; they need an offering in the $200-$300 price range, and it should be aligned with the current product line. Keeping older cards on shelves just gives consumers the impression they're old products that no one wanted. And 8000 cores just seems silly for the Ti: they can sell a Ti at a 20-30% boost over the xx80 cards, so why put in more performance than is needed? That's just eating into future product sales.
Fediuld:

Yep, it is fishy. The current 12nm behemoths cannot simply shrink to 7nm, as they would still be too big (over 500mm2, since many parts do not scale linearly on a smaller node), making yields atrocious at best and manufacturing impossible at worst. Doubling the core count would also mean an 800-1000mm2 GPU at 7nm, when TSMC has stated that 500-520mm2 is the biggest chip they can make. The MI60 is 480mm2 with atrocious yields, making it very expensive to manufacture. The Xbox Series X APU is 495mm2 with 56 CUs (and also includes an 8-core Zen 2 and I/O), of which 4 are disabled (whichever are the faulty ones) to improve yields, and further-faulty APUs will then power a lesser Xbox. (This also suggests the PS5 will be far cheaper: with 36 CUs, the APU is much smaller, so it's cheaper and easier to make.) In comparison, the 5700 XT is 251mm2.
Where are you getting these die sizes from? Are you including the interposers or something? Both the MI60 and Xbox Series X numbers are smaller than what you're saying.
@EL1TE Neither my 2080 nor my 1080 has had problems running everything in UHD/4K. You make it sound like those cards are crap, when not everyone on this planet is using them to run the latest AAA titles with maxed-out settings. It's the type of games YOU play; I doubt anyone has trouble running Minecraft at 60fps@UHD, which you say they really can't.
I also call BS on this. 'Gaping holes' are not something a rational player like Nvidia would commit to in its product lineup. For reference, the old lineup: 2080 = 2944 cores, 2080 Super = 3072, 2080 Ti = 4352.
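For a sense of how lopsided the rumored gap would be, here is a quick sketch comparing core-count jumps. The Turing counts are from the post above; the Ampere counts (~4000 for the 3080, ~8000 for the Ti) are hypothetical, pieced together from the "almost twice the cores" and "8000 cores" comments earlier in the thread:

```python
turing = {"2080": 2944, "2080 Super": 3072, "2080 Ti": 4352}
ampere_rumored = {"3080": 4000, "3080 Ti": 8000}  # hypothetical figures

def gap(lo: int, hi: int) -> float:
    """Relative core-count jump from the lower part to the higher one."""
    return (hi - lo) / lo

print(f"Turing 2080 -> 2080 Ti: +{gap(turing['2080'], turing['2080 Ti']):.0%} cores")
print(f"Rumored 3080 -> 3080 Ti: +{gap(ampere_rumored['3080'], ampere_rumored['3080 Ti']):.0%} cores")
```

A ~48% Turing-style step versus a ~100% rumored step is exactly the 'gaping hole' being objected to here.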
Or this is complete bogus, and instead what NV will do is push clocks to 3+ GHz. We already know that TSMC 7nm is capable of 4.2+ GHz on very complex Ryzen CPUs, so I don't see why 3 GHz couldn't be viable on GPUs. The last time NV moved to a new process, clocks jumped from 1.3-1.4 GHz to basically 2 GHz (Maxwell -> Pascal), which is why Pascal became so much faster even though the core counts didn't increase much and it's basically the same arch. A fine-tuned Turing-like GPU (similar core counts) that runs at 3 GHz would smoke current-gen Turing in anything, without needing that many more transistors. But we'll see... hopefully soon.
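As a toy illustration of that clock argument (not a real performance model), here is a naive cores-times-clock throughput proxy. The Maxwell/Pascal clocks are the rough figures from the post; the core counts and Turing stock clock are assumptions, and the proxy ignores IPC, memory bandwidth, and power:

```python
def throughput_proxy(cores: int, clock_ghz: float) -> float:
    """Naive proxy: throughput ~ cores x clock (ignores IPC, bandwidth, power)."""
    return cores * clock_ghz

# Maxwell -> Pascal, using the rough clocks from the post above
maxwell = throughput_proxy(2816, 1.35)  # 980 Ti-class part (assumed core count)
pascal = throughput_proxy(3584, 2.0)    # 1080 Ti-class part (assumed core count)

# Hypothetical: a Turing-sized GPU (4352 cores) at 3 GHz vs an assumed ~1.6 GHz stock boost
turing_stock = throughput_proxy(4352, 1.6)
turing_3ghz = throughput_proxy(4352, 3.0)

print(f"Maxwell -> Pascal: {pascal / maxwell:.2f}x")
print(f"Turing @ 3 GHz vs stock: {turing_3ghz / turing_stock:.2f}x")
```

Under this crude proxy, the frequency jump alone delivers roughly the same ~1.9x step Pascal got, with no extra cores at all.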
I don't know about everyone here, but I only max graphics settings in single-player games. In nearly every online game I prefer low-ish settings. I only have 2 iiyama 24" 75Hz 1ms monitors, which I find perfect for my current setup. Pushing 75 fps is, for me, a perfect balance of frames and power draw; my system hardly notices gaming at 75fps/Hz, and the fans don't ramp up or anything. Of course I know 144Hz is a better experience, but trying to get 144 fps all the time would be difficult and very power hungry. I have enough money saved up to buy any PC parts I want, and this is probably the year I go full send and buy a beast. Probably a Zen 3 4700X and an Nvidia 3080, but even with a beastly PC I mostly want lower power draw. So I'll have to pick parts around 65W for the CPU and hopefully nothing more than 180W for the graphics card. I doubt I'll get a Ti even though I can afford one. I'm hoping the 3080 will use less than 180W, but we'll have to see.
jbscotchman:

Not in recent times. The 1060 6GB = 980, and the 2060 = 1080. Speaking of non-Ti versions, obviously.
Hmm, nope: the 2070 is on par with the 1080 while being slightly better in some games, and the 2060 is kinda behind: https://www.guru3d.com/articles_pages/geforce_rtx_2060_review_(founder),22.html https://www.guru3d.com/articles_pages/geforce_rtx_2060_review_(founder),17.html
fry178:

@EL1TE Neither my 2080 nor my 1080 has had problems running everything in UHD/4K. You make it sound like those cards are crap, when not everyone on this planet is using them to run the latest AAA titles with maxed-out settings. It's the type of games YOU play; I doubt anyone has trouble running Minecraft at 60fps@UHD, which you say they really can't.
If you have trouble understanding the basics of communication, such as the fact that when people post something (besides reports, news articles, etc.) they are giving their opinion, maybe you shouldn't bother replying to people. I was typing something, but I decided it's not even worth bothering, because you'll probably behave the same way.
Seeing is believing... Let's wait... Fingers crossed...
I have about £550 put aside for a GPU upgrade from my still-faithful 980 Ti; the way Nvidia prices are going, I might be able to afford the 3050... ;_;