NVIDIA to Announce Ampere at GTC in 2020?

Further discussion also HERE
Well, it will be interesting to see how the "50% more performance at 50% less power consumption" pitch will hold up in reviews once the cards hit the market. Sounds too good to be anything but a sales pitch. Prices will be another thing as well...
Let's hope this rumor is true:
this information comes shortly after an unknown AMD Radeon appeared on the OpenVR benchmarking software offering a score higher than an Nvidia GeForce RTX 2080 Ti.
Because then maybe this rumor will be true:
the Ampere cards are allegedly going to market cheaper than the Turing cards (especially the high-end RTX 3080 and RTX 3080 Ti), presumably to compete with AMD.
I think that if Ampere performs as the rumor claims, then I'm absolutely sure it will not be cheaper at all, and will very likely be far more expensive than the current lineup. NVIDIA produces huge chips compared to AMD, and they are really good at optimizing their designs to work around the problems that come with that, but producing one large chip is still going to be expensive, and even more so on a smaller node.
I doubt they will have 50% less power consumption and 50% more performance at the same time. It can be assumed they mean 50% more performance at 50% less power at a specific clock. That doesn't mean the card will be equally efficient at 100% performance. If NVIDIA really achieved such efficiency, we would get the same story as with the GTX 680, where NVIDIA achieved higher efficiency but, instead of offering a significantly better product to the consumer, offered a mid-range chip for the price of a high-end product. Thus started the insane price-increase era for the GPU market. They did it once, they will do it again, UNLESS AMD comes up with a highly competitive product. Otherwise, the next gen will be yet another 10% performance increase, with 50% more profitability for NVIDIA, as once again they will be able to sell lower-class silicon as a high-end version.
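To make that concrete: dynamic power scales roughly with C·V²·f, so an efficiency figure quoted at one (low) operating point does not carry over to the card running flat out. A minimal sketch with made-up clock and voltage numbers, purely illustrative:

```python
# Dynamic power scales roughly as C * V^2 * f. Illustrative numbers only:
# the same chip run 20% faster, needing 10% more voltage to hold the clock.
base_freq, base_volt = 1.00, 1.00   # normalised baseline operating point
fast_freq, fast_volt = 1.20, 1.10   # +20% clock, +10% voltage (assumed)

base_power = base_volt ** 2 * base_freq
fast_power = fast_volt ** 2 * fast_freq

print(f"Performance: +{(fast_freq / base_freq - 1) * 100:.0f}%")    # +20%
print(f"Power:       +{(fast_power / base_power - 1) * 100:.0f}%")  # ~+45%
print(f"Perf/W:      {(fast_freq / fast_power) / (base_freq / base_power):.2f}x")  # ~0.83x
```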
Not sure about the performance or cost but the time frame seems reasonable. All those 1080 owners who have been hanging on might be happy .....
The market has never really recovered from the mining craze. Even the RX 5700 XT - which is a nice card by today's standards and offers the best price/perf ratio in its performance segment - is too expensive, let alone ANYTHING above its level from Nvidia. Let's hope it gets better with the next generation of cards in 2020. I will hold on to my dear GTX 970 until this settles down somehow. It's still really good for 60 fps gameplay.
I thought 7nm was so cutting edge that wafer yields were not great. Other companies like AMD only have to produce relatively small chiplets, which gets around the yield issue since it doesn't matter if a few of them fail per wafer. Nvidia has to produce a few great big GPU dies on each wafer, so the chance of a lot of them having small defects will be much greater, yields will be lower, and the cost per GPU of those that do work perfectly will be much higher, I'd say. There will be an awful lot of semi-broken GPU dies available to them - which they will probably be able to flog as lower-end GPUs like the 3050, 3060, 3070 etc. - but I'd say fully working complete GPUs will be rare enough per wafer, so the high-end 3080 Ti, Titan etc. will be expensive.
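To illustrate the die-size argument, here is a minimal sketch using a simple Poisson yield model; the defect density and die areas are purely illustrative, not actual TSMC figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson yield model."""
    defects_per_mm2 = defects_per_cm2 / 100.0
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.2  # assumed defects per cm^2 - purely illustrative
for name, area in [("small chiplet", 75), ("mid-size GPU", 300), ("big GPU die", 600)]:
    print(f"{name:13s} {area:4d} mm^2 -> ~{poisson_yield(area, D0) * 100:.0f}% defect-free dies")
```

Even with made-up numbers, the share of perfect dies falls off quickly as the die grows, which is why salvaged, partially disabled dies end up as the lower-end SKUs.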
I very much doubt we'll see anything related to gaming. GTC is a Deep Learning and AI conference. If they do show something, it is most likely to be for the professional market.
Andy Watson:

Not sure about the performance or cost but the time frame seems reasonable. All those 1080 owners who have been hanging on might be happy .....
Imo there's nothing to hang on for... I mean, my 1070 works wonders at 2560x1440, no issues so far. As for RTX, unless it matures a bit I can go without it. We'll see when Cyberpunk releases whether there's a need for an upgrade :P
I'm glad Nvidia continues to dominate; it makes AMD and Intel try their best to follow. I'm only scared Nvidia will bump prices even more. If the 3080 beats the 2080 Ti with some additional RT cores/improvements, you can only imagine how much it will cost. Same goes for the 3070. I'm sure many of us won't even be able to afford a 3060. 😀
@Undying Your comment is sad and reveals why the market has come to this state. You're glad Nvidia dominates, but you're scared they'll bump prices even more. What do you think will happen when a company has a monopoly: that they'll offer you a plane ticket and two weeks in Hawaii?
I will never buy blue or green products. All red forever, plus consoles. Until Intel and Nvidia ask for reasonable money, I have no reason to brag about PC hardware, show my friends anything about it, or beg for upgrade discounts and part swaps. Real life dominates the digital one. AMD has been showing the way for years now. They won the CPU battle; if only they could also win the GPU front. We hope Nvidia "learns" from Intel's mistakes: when you overspend on R&D you should never pass the costs on to consumers, only to partners etc.
Undying:

I'm glad Nvidia continues to dominate; it makes AMD and Intel try their best to follow. I'm only scared Nvidia will bump prices even more. If the 3080 beats the 2080 Ti with some additional RT cores/improvements, you can only imagine how much it will cost. Same goes for the 3070. I'm sure many of us won't even be able to afford a 3060. 😀
2K for a gaming GPU is nuts.
Silva:

@Undying Your comment is sad and reveals why the market has come to this state. You're glad Nvidia dominates, but you're scared they'll bump prices even more. What do you think will happen when a company has a monopoly: that they'll offer you a plane ticket and two weeks in Hawaii?
This so much.
As said in another Ampere thread: it's likely to be a 50% total improvement. So if it's 50% faster without using more watts than the 2080 Ti, that's technically 50% more efficient, because it's basically more performance for free. But the phrasing is ambiguous: 50% more efficient could mean it uses half the power of the 2080 Ti while also being 50% faster, which is absurd and not going to happen. Even a 50% total improvement sounds unlikely. Nvidia isn't going to pull an Intel and wait for AMD to catch up, but they don't need to try this hard either.
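For concreteness, a minimal sketch of the two readings, using a nominal ~250 W 2080 Ti as the baseline (illustrative numbers only):

```python
# Two readings of "50% more performance and 50% better efficiency",
# against a nominal ~250 W 2080 Ti baseline (illustrative numbers).
base_perf, base_power = 1.0, 250.0

# Reading 1: 50% faster at the SAME power -> 1.5x perf/W.
r1_perf, r1_power = base_perf * 1.5, base_power
# Reading 2: 50% faster at HALF the power -> 3x perf/W (the absurd reading).
r2_perf, r2_power = base_perf * 1.5, base_power * 0.5

base_eff = base_perf / base_power
print(f"Reading 1: {(r1_perf / r1_power) / base_eff:.1f}x perf/W")  # 1.5x
print(f"Reading 2: {(r2_perf / r2_power) / base_eff:.1f}x perf/W")  # 3.0x
```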
emperorsfist:

Well, it will be interesting to see how the "50% more performance at 50% less power consumption" pitch will hold up in reviews once the cards hit the market. Sounds too good to be anything but a sales pitch. Prices will be another thing as well...
That's not even a sales pitch, it's just unfounded rumors.
schmidtbag:

Even a 50% total improvement sounds unlikely. Nvidia isn't going to pull an Intel and wait for AMD to catch up, but they don't need to try this hard either.
Or maybe 50% more isn't even trying that hard, if a better architecture is combined with a node shrink. And it's not like we could ever have enough graphics performance: between the strong push for 4K and RTX, framerates keep going down, not up.
schmidtbag:

As said in another Ampere thread: it's likely to be a 50% total improvement. So if it's 50% faster without using more watts than the 2080 Ti, that's technically 50% more efficient, because it's basically more performance for free. But the phrasing is ambiguous: 50% more efficient could mean it uses half the power of the 2080 Ti while also being 50% faster, which is absurd and not going to happen. Even a 50% total improvement sounds unlikely. Nvidia isn't going to pull an Intel and wait for AMD to catch up, but they don't need to try this hard either.
They kind of do, because they are falling behind in deep learning performance and they typically use the same architecture across their entire product stack. I predict at some point they'll split the server stuff off completely and go MCM with that, but I don't know if that's happening with Ampere. Keep in mind that this is a double node shrink for Nvidia, and they are probably going straight to 7nm+ EUV. IIRC the Radeon VII was a ~30% improvement over Vega 64 with just a die shrink, and 7nm+ EUV adds another 10-15% over plain 7nm. So even if you ignore architecture improvements, Nvidia is going to gain something like 40% just from the node.
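For what it's worth, a quick sketch of how those rough figures compound, treating the quoted ~30% and 10-15% as multipliers rather than measured data:

```python
# Compounding the rough figures quoted above: ~30% from the die shrink
# itself, plus another 10-15% from 7nm+ EUV over plain 7nm.
shrink_gain = 1.30
for euv_gain in (1.10, 1.15):
    total = shrink_gain * euv_gain
    print(f"shrink +30%, EUV +{(euv_gain - 1) * 100:.0f}% -> total ~+{(total - 1) * 100:.0f}%")
```

That works out to roughly 40-50% from the process alone, before any architectural changes.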
When Nvidia changed nodes they always improved quite a lot - see the 1080 Ti, which wiped the floor with everything Nvidia had before - so with the change to 7nm it may happen again, going by their history. Beware the leather jacket man! https://i.imgur.com/6sOcyZC.jpg
Undying:

I'm glad Nvidia continues to dominate; it makes AMD and Intel try their best to follow. I'm only scared Nvidia will bump prices even more. If the 3080 beats the 2080 Ti with some additional RT cores/improvements, you can only imagine how much it will cost. Same goes for the 3070. I'm sure many of us won't even be able to afford a 3060. 😀
Don't think like that...NVidia will be lowering the prices of their old 2xxx stock when the 3xxx cards launch, and they should launch the 3xxx cards at no more than the 2xxx launch prices.
Denial:

They kind of do, because they are falling behind in deep learning performance and they typically use the same architecture across their entire product stack. I predict at some point they'll split the server stuff off completely and go MCM with that, but I don't know if that's happening with Ampere. Keep in mind that this is a double node shrink for Nvidia, and they are probably going straight to 7nm+ EUV. IIRC the Radeon VII was a ~30% improvement over Vega 64 with just a die shrink, and 7nm+ EUV adds another 10-15% over plain 7nm. So even if you ignore architecture improvements, Nvidia is going to gain something like 40% just from the node.
barbacot:

When Nvidia changed nodes they always improved quite a lot - see the 1080 Ti, which wiped the floor with everything Nvidia had before - so with the change to 7nm it may happen again, going by their history. Beware the leather jacket man! https://i.imgur.com/6sOcyZC.jpg
Yep, these 3xxx cards will be the ones to get... I predict they will be far less disappointing than the 2xxx series!