Next Generation NVIDIA Ampere reportedly to offer 50% more perf at half the power

IchimA:

And maybe half the price 😀 A man can dream
With a 50% performance increase overall will come a 50% increase in price... it's simple: they are the "kings" of the show and drive prices, like Intel did until AMD gave them a serious blow with Ryzen. Do you remember the prices of i5 and i7 CPUs and how they've been cut in half? The same must happen in the GPU industry, otherwise Nvidia will keep asking for more and more. Nobody here wants what's good for you; they all want your money, and without competition they can ask for more each year...
I know that, man! That's why I said the "a man can dream" thing. I'm just fed up with these Nvidia prices.
Good. Finally I can upgrade!
Undying:

How much faster was the 2080 than the 1080 Ti? It was something like 2%, if any, and it even had less VRAM. Everyone is talking about older gens, but Pascal-to-Turing was disappointing. This 50% jump in performance probably means using raytracing.
8%, and probably closer to 20% now. Turing has a lot of architecture changes that took a while to show, some of which still aren't really being utilized. Regardless, Turing wasn't really a density improvement node-wise, and here we're basically getting a double node jump plus more, as presumably Nvidia is going straight to TSMC's 7nm EUV. Either way, I never believe rumors this far from launch. So probably fake.
I'm looking at this as primarily an increase in SM counts at similar clocks. 25% more SMs could bring around 50-70% improvements in both traditional raster and hybrid raytraced totals. This brings SMs up from 48 on TU104 to 60 on GA104, and from 72 on TU102 to 90 on GA102: 3840 and 5760 shaders, 60 and 90 RT cores, 480 and 720 Tensor cores, 96 and 128 ROPs, 240 and 360 TMUs. Nvidia has just about always held massive shader count increases for improved nodes.
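The unit counts in the post above all follow from the SM count times Turing's per-SM ratios (64 shaders, 1 RT core, 8 Tensor cores, 4 TMUs per SM); ROPs are the exception, since they scale with the memory bus instead. A quick sketch of that arithmetic, using the speculated GA104/GA102 SM counts as assumptions:

```python
# Per-SM ratios taken from the Turing architecture; the GA10x SM counts
# below are speculation from the post, not confirmed specs.
PER_SM = {"shaders": 64, "rt_cores": 1, "tensor_cores": 8, "tmus": 4}

def scale_specs(sm_count):
    """Return the unit counts implied by a given SM count."""
    return {unit: per_sm * sm_count for unit, per_sm in PER_SM.items()}

for name, sms in [("GA104 (speculated)", 60), ("GA102 (speculated)", 90)]:
    print(name, scale_specs(sms))
# 60 SMs -> 3840 shaders, 60 RT cores, 480 Tensor cores, 240 TMUs
# 90 SMs -> 5760 shaders, 90 RT cores, 720 Tensor cores, 360 TMUs
```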
Astyanax:

I'm looking at this as primarily an increase in SM counts at similar clocks. 25% more SMs could bring around 50-70% improvements in both traditional raster and hybrid raytraced totals. This brings SMs up from 48 on TU104 to 60 on GA104, and from 72 on TU102 to 90 on GA102: 3840 and 5760 shaders, 60 and 90 RT cores, 480 and 720 Tensor cores, 96 and 128 ROPs, 240 and 360 TMUs.
You seem to know very much, yeah. Nvidia employees know less; you and Huang only. Nvidia always does it right. AMD RDNA2 cannot deliver linear extra performance, correct? Lmao. You were so sure RDNA2 was a flop, but Nvidia SMs deliver twice the efficiency next gen. I believe ya. Great marketing there, pal. You're something.
I've never commented on RDNA2. RDNA had a horrible launch and, so far, borderline terrible drivers, but the architecture itself is a significant improvement over GCN, on paper resolving a number of issues that have been covered by Scali and other developers over the years in the parts of the chip that should have scaled and operated in parallel.
Astyanax:

the 8800 was 100% faster than the 7800 in some cases.
Completely different architecture though. I had a 7800 GTX and an 8800 GTX, both watercooled; the 7 series was the most disappointing card ever.
Ok, well I think I'd rather have 80% more performance at the same power consumption than 50% more performance at half the power consumption! I hope they still produce products which hit the same wattage targets as our current cards while just jacking up the performance. I'm not interested in seeing 150W flagship cards at 50% more performance when they could use the rest of that power budget, up to say 300W, to deliver even more performance. We don't have issues keeping GPUs cool in PCs, so there'd be no point in just releasing cards at half the power consumption... perhaps they don't mean that... but either way you get what I mean: let's not leave performance on the table for the sake of power consumption.
50% more with 50% less power, cool. Now in what circumstances and with what hardware? Far too vague, and more spin from Nvidia. Considering Jensen ragged on the next-gen consoles because a 2080 Ti would shit all over them, I'm not expecting much in the way of logic from Nvidia at the moment.
I wish they'd get rid of power draw limitations. My 1080 Ti @ 2050 MHz never goes over 50C; it's so limited by what it's allowed to draw from the PCIe port.
Kaerar:

50% more with 50% less power, cool. Now in what circumstances and with what hardware? Far too vague, and more spin from Nvidia. Considering Jensen ragged on the next-gen consoles because a 2080 Ti would crap all over them, I'm not expecting much in the way of logic from Nvidia at the moment.
(RTX 2080 Max-Q)
Kaarme:

There's no benefit for Nvidia in going for 50% more performance just like that. AMD can barely put up a fight, so Nvidia isn't threatened by the market situation, unlike Intel. If the performance really goes up by 50% throughout the product line, the already high prices will also jump majorly. If the 3080 Ti really is 50% better than the 2080 Ti, I guess it will cost a full 2000 euros. As for anything below the flagship, I'll be really surprised if they get the 50% muscle increase, even if Nvidia could do it.
Don't forget Nvidia's GPUs are also made for supercomputing and servers, where customers need all the performance they can get and are willing to pay big bucks for the privilege. Even in the desktop space, there are lots of users still on their 1080 Ti or older who want a reason to upgrade, so Nvidia needs to give them one, and a big performance jump is the best reason possible.
barbacot:

Try not... So they will have enough supply for chips in order to avoid the problems with RTX 2XXX.
You do understand that without changes a 2080 Ti needs 520mm2 on 7nm? That's a huge chip, prone to very low yields. AMD had to cut the Navi 10 VRAM bus, squeezing it down to 251mm2 to make a profitable chip, yet everyone cries that it's overpriced. If Nvidia doesn't stick to 12nm and goes to 7nm with those behemoths (irrelevant whether TSMC or Samsung can manufacture them given their spec sheets), the yields would be so low that the $2000-2200 mark would be the baseline for the 3080 Ti. And if Nvidia adds more Tensor/RT cores as rumoured (to add 50% more RT performance, as per the initial rumour before it got lost in translation), it would be even more expensive, as the chip would be bigger than the 520mm2 a direct 2080 Ti shrink to 7nm is. To put it in perspective, the Zen 2 chiplet is 74mm2. Expect to pay a lot of money for those chips, and the same applies to AMD: if they try to make a 500mm2 behemoth on N7+, it won't come cheap at sub-$1000. As for 5nm, forget anything bigger than 100mm2 given TSMC yields are 30% at that size; a multi-chip solution for GPUs is the only way to have affordable cards, similar to what AMD did on the CPU side.
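The die-size/yield trade-off above can be sketched with the simple Poisson yield model, Y = exp(-A * D0), where A is die area and D0 is defect density. The defect densities below are illustrative assumptions, not TSMC figures, but they show why a ~520mm2 die is so sensitive to process maturity:

```python
import math

def poisson_yield(area_mm2, d0_per_cm2):
    """Fraction of dies expected defect-free under the Poisson model.

    area_mm2: die area in mm^2 (converted to cm^2 internally)
    d0_per_cm2: assumed defect density per cm^2 (hypothetical values)
    """
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

# A hypothetical 520mm2 die at a mature vs. an immature process:
for d0 in (0.1, 0.5):
    print(f"D0={d0}/cm^2 -> yield {poisson_yield(520, d0):.1%}")
```

At an assumed 0.1 defects/cm^2 roughly six dies in ten come out clean, while at 0.5 defects/cm^2 it collapses to under one in ten, which is the intuition behind the "$2000-2200 baseline" argument (real yields also depend on partial-die salvage, which this sketch ignores).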
Is this Nvidia acknowledging Navi's potential to get to 2080 Ti levels of performance? If so, that more than justifies the 50% more performance claims: they want to keep the crown and crush AMD, again. Although we know that the 500+ price point is mostly out of reach for the average consumer, mindshare plays an important role. Let's hope AMD releases RDNA2 on par with Ampere.
Fediuld:

You do understand that without changes a 2080 Ti needs 520mm2 on 7nm? That's a huge chip, prone to very low yields. AMD had to cut the Navi 10 VRAM bus, squeezing it down to 251mm2 to make a profitable chip, yet everyone cries that it's overpriced. If Nvidia doesn't stick to 12nm and goes to 7nm with those behemoths (irrelevant whether TSMC or Samsung can manufacture them given their spec sheets), the yields would be so low that the $2000-2200 mark would be the baseline for the 3080 Ti. And if Nvidia adds more Tensor/RT cores as rumoured (to add 50% more RT performance, as per the initial rumour before it got lost in translation), it would be even more expensive, as the chip would be bigger than the 520mm2 a direct 2080 Ti shrink to 7nm is. To put it in perspective, the Zen 2 chiplet is 74mm2. Expect to pay a lot of money for those chips, and the same applies to AMD if they try to make a 500mm2 behemoth on N7+
I'm not really sure this will still be accurate by the time these chips are shipping, though. 7nm non-EUV yields were extremely good, N7+ yields were similar, and EUV was supposed to further improve them. TSMC was reporting that the D0 rate (at smaller die sizes, to be fair) was similar to 16nm back in April last year. I mean, they'll probably be overpriced, but I don't think it's going to be as bad as you think.
My ears and passive PSU will love it.
pegasus1:

I wish they get rid of power draw limitations, my 1080Ti @2050mhz never goes over 50C, its so limited by what its allowed to draw of the PCIe port.
If your card was drawing power from the PCIe slot only, you would be limited to 75W and maybe 10-12 fps 😀 The card has 2x power connectors allowing for around 300W. Ohhh wait, are you using it without those plugged in? 😀
I don't think Nvidia will disappoint on 7nm. There should be some big ray tracing gains, I imagine. I think the RTX performance of the 2080 Ti could possibly be found on closer-to-the-lower-end products like the 60 and 70 series.
Of course, going by past bullsh#t Nvidia marketing, a 'performance' description is never limited to rasterisation; expect RT cores etc. to make up the bulk of this. The big, and only, issue for the vast percentage of consumers is pricing (/performance): Ngreedia has, unfortunately, already raised that bar to pathetic levels (particularly for 1080/Ti owners) using multiple broken RTX promises. While I'm sure AMD's RDNA2 will ultimately be capable of delivering at least 2080 Ti performance, and be 'priced aggressively', I think it's unlikely they'll give up on this pricing opportunity, though I might be wrong, since... competition is good 🙂 AMD have already proven this with Intel, literally forcing Intel to slash current and future pricing by up to half. Can AMD do the same with Nvidia? RDNA2's much larger die size puts it out of any wafer/yield comparison with Zen 2 (3... 4...), and we have significant power issues. Interesting times ahead.