Rumor: NVIDIA Ampere GeForce RTX 3070 and RTX 3080 specs surface

It was definitely a weaker release - at launch the 2080Ti was only ~30% faster than a 1080Ti. By comparison, the 1080Ti was ~50% faster than a 980Ti. Pricing obviously didn't help. That being said, Turing also brought a bunch of features to the table. It clearly kicked off the whole RT phase of GPUs, which AMD/Intel are now latching onto. DLSS had a rocky start, but more recent implementations have been quite good - Wolfenstein: Youngblood in particular is fantastic. Mesh shaders have yet to be utilized for the most part but should bring some performance/quality improvements. VRS, when utilized properly, is also pretty nice.
Nicked_Wicked:

Yes, very impressive. ~25% extra performance with a small ~33% price increase.
To be fair, the price increase isn't really relevant to their argument.
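Still, on the perf/price point being quoted, a quick back-of-envelope in Python (using only the ~25%/~33% figures quoted above; purely illustrative, not measured data) shows why that combination reads as a step backwards in value:

# Rough perf-per-dollar comparison using the percentages quoted above.
base_perf, base_price = 1.0, 1.0     # previous-gen card, normalized
new_perf = base_perf * 1.25          # ~25% more performance
new_price = base_price * 1.33        # ~33% higher price

old_value = base_perf / base_price   # 1.00 perf per normalized dollar
new_value = new_perf / new_price     # ~0.94 -> roughly 6% worse value

print(f"perf/$ change: {(new_value / old_value - 1) * 100:+.1f}%")  # about -6%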
wavetrex:

What this new generation should have is a much better performance/price ratio. Even if the 3080 is only "slightly" faster than the 2080 Ti (or even a bit slower), if it costs only $600 (or less!), then it will be a great product! And the 3070 needs to return to the $400 price range; if it is considerably faster than the 2080, then a LOT of people will buy it. Turing really stretched it with pricing - even NV fanboys like myself said... NO, JUST NO. WAY too much.
4K gaming is still not a thing, so I'd say there is a lot of room for improvement.
oxidized:

How exactly?
Big Navi...
The 3080 should launch at $699 and the 3070 at $399. Nvidia won't fiddle with pricing too much now, since they got greedy on the previous gen and there is competition on the horizon. AMD's 'Big Navi', as they call it (lol), should really be competitive. Remember, a 5700 XT is at 1080 Ti performance (which is basically 10% slower than a 2080); later we'll get the 5800 XT and 5900 XT, which should be 3070 and 3080 competitors, and Big Navi will probably sit smack in the middle between the 3080 and 3080 Ti. I don't see Big Navi beating the Ti. Big Navi may very well be a 5950 XT or called something else entirely. I don't believe they will go dual GPU like in the past - that ship has sailed. Multi-GPU is almost dead.
We need a 30% performance increase coupled with at least a 10-20% price reduction to make these new cards interesting...
Ampere will determine if hardware ray tracing is a win or a fail.
Heck, let's first manage a stable 144Hz at 1440p in recent games. 4K monitors are visually pleasing, but the semi-decent ones are still quite expensive. 4K in general is nice to have, but 1440p with a higher refresh rate is still more popular. And I sincerely hope they unlock the VBIOS this time around - overclocking on RTX cards and the 10xx series has been a MAJOR disappointment.
jbscotchman:

Ampere will determine if hardware ray tracing is a win or a fail.
Doubt it - AMD/Intel are already committing to it, and I'm sure all the game studios who are tired of faking all their lighting are too.
Denial:

Doubt it - AMD/Intel are already committing to it, and I'm sure all the game studios who are tired of faking all their lighting are too.
True, but CryEngine has already proven it can do simulated ray tracing with a much smaller performance hit.
barbacot:

Big Navi...
Idk - it reminds me of "poor Volta" honestly. I think they'll have something that may compete with the 2080Ti, but I think Nvidia's perf/W will still be significantly better, allowing them to hit closer to the 300W wall with better performance.
jbscotchman:

True, but CryEngine has already proven it can do simulated ray tracing with a much smaller performance hit.
CryEngine proved that if you have a ton of time, on an on-rails demo, you can optimize the hell out of like 5 different techniques and combine them to make it look good. It's been done before, but it's not a realistic approach to development.
jbscotchman:

Ampere will determine if hardware ray tracing is a win or a fail.
Hmmm... I think the gaming industry will determine that by the rate of adoption - right now things don't look good...
barbacot:

Hmmm... I think the gaming industry will determine that by the rate of adoption - right now things don't look good...
Lol, how do you figure they don't look good? There is an article on the front page right now about RTX adoption being high. AMD is doing RT with RDNA2, Intel is doing it with their GPU, and the next-gen consoles have it. In what world does that not look good? It's a chicken-and-egg problem - game developers aren't going to build out software for non-existent hardware. Now all the hardware guys are building and committing to it. The devs will follow along because it saves them tons of time.
jbscotchman:

Ampere will determine if hardware ray tracing is a win or a fail.
From what I understand, the adoption of RT is just a question of time because it's much easier to implement than the alternatives. And from what I've read so far, the current RT implementations are just for show or gimmicks, so it may take a long time before "real" RT appears in games.
Astyanax:

um... ok.
Thanks for the graph - yes, that's what people don't get. The 2080Ti existed at that price because it was THE only card to give high fps at 4K; zero competition on that front. Nvidia knows well that those who made the mistake (my opinion) of buying a 4K monitor, or who (a better reason) want to play on their high-end 4K TV, only have this choice. And I agree with wavetrex on the pricing - even if we can pay, we don't want to... I had 680 SLI, 780 SLI, 980 SLI, 1080 SLI -> single 1080 Ti, and... wait and see. You can't say I didn't throw money at Nvidia >< Doubt they care though - 1080Tis were constantly out of stock, while 2080Tis gather dust because very few people buy them.
I hope the 3080 Ti won't be $1200; we need a big increase in performance while staying in the same price segment, unlike Turing prices.
I am hoping that the 3060 or 3070 is reasonably priced, because that would be a pretty good upgrade for me coming from a 1070. Even if those cards do come out, I am not getting them anytime soon - I am going to get a new PC before I upgrade the GPU.
I bought the 1080 quite late, and within about 6 months after that I bought the Ti version. Since then I've 'downgraded' my monitor from 4K 60Hz to 1440p 144Hz, and I'm still very glad I took that step. Maybe, I say maybe, I'll save some money for an upgrade to my 1080 Ti, like a 3080 (Ti) or an AMD Navi-based GPU. Let's see what my crystal ball says. 😉
Dragam1337:

Eh, no... it's a massive fail if it doesn't deliver considerably faster performance than Turing, as Turing barely upped the performance over Pascal...
Don't hold your breath. The age of 50%+ improvements gen to gen is over. We're lucky if we can get 25% within the same class... I would, however, take a 50% performance/price improvement; that's definitely possible, if not for the maxxximum greed of the leather jacket and his investors.
Ryu5uzaku:

Hmmh, interesting. How big of a change is Ampere arch-wise? I would guess 3072 cores would perform somewhere around the 2080 Super's 3072 cores at the same clocks, all else being equal. You know, the 3070 is around a 2080 Super.
Yeah, exactly, good question... I imagine the new architecture itself won't bring massive improvements in IPC, so I would think the number of cores and the frequencies of those cores are going to be a pretty close way of comparing performance.
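A minimal Python sketch of that cores-times-clock intuition. The 2080 Super figures (3072 cores, ~1815 MHz boost) are its real specs; the "3070" clock below is purely a placeholder assumption, and the model ignores any IPC or memory-bandwidth changes Ampere might bring:

def relative_perf(cores, clock_mhz, ref_cores, ref_clock_mhz):
    # Naive scaling model: performance ~ cores * clock, flat IPC assumed.
    return (cores * clock_mhz) / (ref_cores * ref_clock_mhz)

# Reference: RTX 2080 Super (3072 cores, ~1815 MHz boost).
# Hypothetical rumored 3070: same 3072 cores, assumed ~1900 MHz boost.
estimate = relative_perf(3072, 1900, 3072, 1815)
print(f"estimated delta vs 2080 Super: {(estimate - 1) * 100:+.0f}%")  # about +5%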
Dragam1337:

Eh, no... it's a massive fail if it doesn't deliver considerably faster performance than Turing, as Turing barely upped the performance over Pascal...
Also yeah, exactly!
SniperX:

Yeah, but will Nvidia release Super versions of the 3070 and the 3080? My expectations: the 3080 will be around 20% faster than a 2080. Pricing: eye-watering.
Rollocks! 🙂