NVIDIA to Announce Ampere at GTC in 2020?
Aside from fast 360 Hz monitors, things have been quiet for NVIDIA at CES on the desktop front. However, new industry reports indicate that NVIDIA has penciled in an announcement of its new Ampere architecture GPUs for GTC 2020, which will take place from March 22 to 26.
The news, however, remains based on logical expectations and rumors. NVIDIA's CEO earlier mentioned that a 7 nm product is coming, and a recent analyst report recommending an investment in NVIDIA stated the company "will have their first releases to 7 nm over the next 9 months."
Other rumors have indicated that Ampere will offer 50% more performance at half the power consumption of Turing, thanks to architectural improvements and the shrink to 7 nm. Yet another rumor mentioned that the architecture will be announced in March, adding that we will not see products based on it until June.
Senior Member
Posts: 803
Joined: 2015-05-19
Well, it will be interesting to see how the "50% more performance at 50% less power consumption" pitch will hold up in reviews once the cards hit the market. Sounds too good to be anything but a sales pitch. Prices will be another thing as well...
That's not even a sales pitch, that's just an unfounded rumor.
Even a 50% total improvement sounds unlikely. Nvidia isn't going to pull an Intel and wait for AMD to catch up, but they don't need to try this hard either.
Or maybe 50% more isn't even trying that hard, if a better architecture is combined with a node shrink. And it's not like we could ever have enough graphics performance: between the strong push for 4K and RTX, performance relative to demand keeps going down, not up.
Senior Member
Posts: 13725
Joined: 2004-05-16
As said in another Ampere thread: it's likely to be a 50% total improvement.
So if it's 50% faster without using more watts than the 2080Ti, that's technically 50% more efficient, because it's basically more performance for free. But the phrasing is ambiguous. 50% more efficient could mean it uses half the power of the 2080Ti while also being 50% faster, which is absurd and not going to happen.
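To put numbers on the two readings, here is a minimal sketch in Python (hypothetical figures; the 100 fps at 250 W baseline for the 2080 Ti is an assumption for illustration, not a measurement):

```python
# Hypothetical baseline for illustration: 2080 Ti at 100 fps drawing 250 W.
base_fps, base_watts = 100.0, 250.0
base_eff = base_fps / base_watts                 # 0.40 fps/W

# Reading 1: 50% faster at the same power draw.
r1_eff = (base_fps * 1.5) / base_watts           # 0.60 fps/W

# Reading 2: 50% faster at half the power draw.
r2_eff = (base_fps * 1.5) / (base_watts * 0.5)   # 1.20 fps/W

print(f"reading 1: {r1_eff / base_eff - 1:.0%} more efficient")  # 50%
print(f"reading 2: {r2_eff / base_eff - 1:.0%} more efficient")  # 200%
```

Reading 1 lines up with a plausible "50% more efficient" claim; reading 2 would mean tripling performance per watt in a single generation, which is why it's not going to happen.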
Even a 50% total improvement sounds unlikely. Nvidia isn't going to pull an Intel and wait for AMD to catch up, but they don't need to try this hard either.
They kind of do because they are falling behind in deep learning performance and they typically use the same architecture across their entire product stack. I predict at some point they'll split the server stuff completely off and go MCM with that but I don't know if that's happening with Ampere.
Keep in mind that this is a double node shrink for Nvidia, and they are probably going straight to 7nm+ EUV. IIRC, Radeon VII was a ~30% improvement over Vega 64 from just the die shrink, and 7nm+ EUV adds another 10-15% over plain 7nm. So even if you ignore architecture improvements, Nvidia stands to gain something like 40% from the node alone.
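As a rough back-of-the-envelope check on that figure (a sketch only; the ~30% and 10-15% numbers are the recollections above, not measured data):

```python
# Compound the recalled node gains multiplicatively.
shrink_gain = 1.30           # ~30%, Radeon VII over Vega 64
euv_lo, euv_hi = 1.10, 1.15  # 7nm+ EUV over plain 7nm

# Prints the combined node-only gain range, roughly 43% to 50%.
print(f"node-only gain: {shrink_gain * euv_lo - 1:.0%} "
      f"to {shrink_gain * euv_hi - 1:.0%}")
```

1.30 × 1.10 ≈ 1.43 and 1.30 × 1.15 ≈ 1.50, so "like 40%" from the node alone is, if anything, conservative under those assumptions.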
Senior Member
Posts: 611
Joined: 2007-09-24
When Nvidia changed nodes, they always improved quite a lot; see the 1080 Ti, which wiped the floor with everything Nvidia had before. So, moving to 7 nm now, it may well be possible again, based on their history.
Beware the leather jacket man!

Senior Member
Posts: 1578
Joined: 2012-10-07
I'm glad Nvidia continues to dominate; it makes AMD and Intel try their best to follow. I'm only scared Nvidia will bump the prices even more.
If the 3080 beats the 2080 Ti with some additional RT cores/improvements, you can only imagine how much it will cost. Same goes for the 3070. I'm sure many of us won't even be able to afford a 3060.

Don't think like that... NVIDIA will be lowering the prices of their old 2xxx stock when the 3xxx cards launch, and they should launch the 3xxx cards at no more than the 2xxx launch prices.

Yep, these 3xxx cards will be the ones to get... I predict they will be far less disappointing than the 2xxx series!