Nvidia Ampere GA100 GPU would get 8192 cores and boost speeds up to 2200 MHz

Even if it's "only" $1000 MSRP, it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination. In b4 someone comes in talking about how HBM2 costs more than an ocean of virgin blood, that their R&D costs more than 10 billion pure souls, then links to nVidia's ballsack's reddit page stating they pay everyone in their staff and homeless people outside $10K per minute therefore they need to charge insulting amounts. 🙄 I'll drop to 30 fps gaming or buy a console before I pay nVidia's mafia monopoly prices. Or better yet, bust out my backlog of old games. Too bad, I was looking forward to Cyberpunk on PC (no, I won't play it on a console). Guess I'll play that in 2026 or so. I can wait.
Neo Cyrus:

it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination.
Because you're market ignorant.
Expecting 50-70% more performance over a 1080 Ti and 40% more over a 2080 Ti? Expecting to reach 2.4 GHz with a stable overclock on air/water? Expecting much better RTX performance as well. Will it undervolt stably to lower the temps? Find out more when Nvidia releases its new 3000-series graphics cards, which will buttkick your wallet all the way to the outer reaches of the solar system.
Hulk12:

Maybe no PCIe 4.0 support for the top- or high-end Ampere GPUs, because Intel hasn't announced PCIe 4.0 for desktop this year. 😉
Why not, though? That's the only segment where it would even make sense. And you can put PCIe 4.0 cards in 3.0 slots anyway.
I still have my doubts that this is the gaming card we will see; it's more likely a workstation card. And thus, I don't really expect to see those 8192 cores in the 3080 Ti either.
fantaskarsef:

I still have my doubts that this is the gaming card we will see; it's more likely a workstation card. And thus, I don't really expect to see those 8192 cores in the 3080 Ti either.
No, that would probably be the 3xxx-series Titan card. I guess for the Ti and lower cards they'll disable some SMs and go with GDDR6 instead of HBM2e, just to save costs on the memory.
Crazy Joe:

No, that would probably be the 3xxx-series Titan card. I guess for the Ti and lower cards they'll disable some SMs and go with GDDR6 instead of HBM2e, just to save costs on the memory.
Indeed, like they usually do. Hence my post; I concur.
Astyanax:

Because you're market ignorant.
No, he is right. The only ignorance here is your NV fanboyism and the need to defend it. 😛
-Tj-:

No, he is right. The only ignorance here is your NV fanboyism and the need to defend it. 😛
You're both market ignorant.
Is that "market ignorant" a new word you like to toss around like your placebos? lol, fail. xD
Neo Cyrus:

Even if it's "only" $1000 MSRP, it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination. In b4 someone comes in talking about how HBM2 costs more than an ocean of virgin blood, that their R&D costs more than 10 billion pure souls, then links to nVidia's ballsack's reddit page stating they pay everyone in their staff and homeless people outside $10K per minute therefore they need to charge insulting amounts. 🙄 I'll drop to 30 fps gaming or buy a console before I pay nVidia's mafia monopoly prices. Or better yet, bust out my backlog of old games. Too bad, I was looking forward to Cyberpunk on PC (no, I won't play it on a console). Guess I'll play that in 2026 or so. I can wait.
Yeah, $1000 GPUs are a horrible thing if that holds into next gen, but I doubt the massive dies used in them leave much room to come down to Pascal-level prices.
-Tj-:

Is that "market ignorant" a new word you like to toss around like your placebos? lol, fail. xD
The failure is you. https://www.guru3d.com/news-story/gddr6-significantly-more-expensive-than-gddr5.html https://www.reddit.com/r/nvidia/comments/99r2x3/attempting_to_work_out_rtx_2080ti_die_cost_long/ And that doesn't even include costings for PCB complexity.
I think you're making valid points, but $115 is an immense difference for the die alone. The costs don't end there, because you need to factor in the cost of validating the die and ensuring that it runs error-free at its rated clock speed. All those faulty dies still need to be checked, and people are getting paid to do so. Then you need to consider that the Ti card uses a 352-bit PCB vs a 256-bit PCB on the 1080. Those are more expensive, and you also need 11 memory chips instead of 8 to run on that kind of bus. A single 1 GB GDDR6 chip cost ~$25 the last time I checked, so the final tally is now possibly a $250 difference on just these three components. It all adds up, and we're still only looking at the bill-of-materials side of it. Factor in R&D and the cost of software engineers and those costs quickly start rising. Only, this time, we're not just talking about rasterization but ray tracing and AI as well.

At the risk of seeming insensitive: if you cannot afford a $700 GPU, then you also cannot afford a $600 GPU. My own utilities bill includes $110 to my ISP and Netflix every single month, which means I'll be spending $2640 over the next two years on just those two things. The RTX 2080 should easily last me the same amount of time, and costing $700 instead of $600 is completely meaningless to me. Someone complaining about a $100 increase just tells me that he isn't the person paying for utilities in his house and that he shouldn't be spending $600 on a GPU in the first place.
'Nuff said 😎 Turing is priced exactly where it should be for the people who were passing over Pascal as an upgrade. The people who won't buy Turing weren't going to buy it anyway, regardless of the price.
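As a sanity check, the arithmetic in the quoted passage works out in a few lines of Python. The die and memory figures come straight from the quote; the PCB delta is an assumed placeholder, since the quote only implies it.

```python
# Back-of-the-envelope BOM delta between a 2080 Ti-class and a 1080-class
# card, using the figures from the quoted post above.
die_delta = 115        # $ extra for the bigger die (per the linked Reddit analysis)
gddr6_per_chip = 25    # $ per 1 GB GDDR6 chip ("~$25 the last time I checked")
chips_352bit = 11      # 352-bit bus -> 11 memory chips
chips_256bit = 8       # 256-bit bus -> 8 memory chips
pcb_delta = 60         # $ assumed extra for the 352-bit PCB (placeholder value)

memory_delta = gddr6_per_chip * (chips_352bit - chips_256bit)  # $75
total_delta = die_delta + memory_delta + pcb_delta             # ~$250, matching the quote

print(f"Memory delta:    ${memory_delta}")
print(f"Total BOM delta: ~${total_delta}")
print(f"Two years of ISP + Netflix at $110/month: ${110 * 24}")  # $2640, as quoted
```

With the placeholder PCB figure, the total lands on the ~$250 the quote arrives at, and the last line reproduces the $2640 utilities comparison.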
So RAM was more expensive, and now that made the GPU 50% more expensive.. right.. nice trolling. AMD's HBM costs more, and in the end those cards got slashed back down to their pre-mining-craze prices and all.. That said, the only price hike on NV's side was, yes, them taking advantage of the mining market. Greed has no limits.. but it took a toll on them for sure. I saw some news a while ago that high-end GPUs won't be as expensive as they are now; the lower end like the xx60 and xx70 will stay roughly the same.. the top tier will get price cuts.
-Tj-:

So RAM was more expensive, and now that made the GPU 50% more expensive.. right.. nice trolling.
You were given two links and a third component to consider, and you only picked on the one whose cost scales directly with the amount used. https://www.reddit.com/r/nvidia/comments/9o6256/2080ti_allocation_problems/ A good part of the freaking cost is the component shortage; another part is the per-wafer price of buying TSMC fab time. If Ampere is cheaper, I'll be surprised, since EUV takes longer to fab and backdrilling costs time and money.
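Since the Reddit link cited earlier is an attempt to work out the 2080 Ti die cost from wafer prices, here is a minimal sketch of how that kind of estimate goes. The dies-per-wafer approximation and the Poisson yield model are standard textbook formulas, but the wafer price and defect density below are placeholder assumptions, not TSMC figures.

```python
import math

# Illustrative sketch of why per-wafer fab price dominates big-die cost.
# Wafer price and defect density are assumptions made up for the arithmetic.
wafer_diameter = 300.0   # mm, standard wafer size
die_area = 754.0         # mm^2, roughly the TU102 die used in the 2080 Ti
wafer_price = 8000.0     # $ per wafer (assumed)
defect_density = 0.1     # defects per cm^2 (assumed)

# Standard dies-per-wafer approximation, accounting for edge loss.
r = wafer_diameter / 2
dies_per_wafer = math.floor(
    math.pi * r**2 / die_area - math.pi * wafer_diameter / math.sqrt(2 * die_area)
)

# Simple Poisson yield model: bigger dies catch more defects.
yield_rate = math.exp(-defect_density * die_area / 100)  # area converted to cm^2
good_dies = dies_per_wafer * yield_rate

print(f"Candidate dies per wafer: {dies_per_wafer}")   # ~69 for a 754 mm^2 die
print(f"Estimated yield: {yield_rate:.1%}")            # ~47% with these inputs
print(f"Cost per good die: ${wafer_price / good_dies:.0f}")
```

With these made-up inputs the cost per good die lands in the low hundreds of dollars. The model is also pessimistic: big-die cards lean on salvage, and the 2080 Ti is itself a cut-down TU102, so partially defective dies still become sellable chips.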
Ok... so I'm a bit confused here. Why does everyone believe that nVidia is making these Ampere chips on 10nm? The leather jacket himself clearly stated that these chips will be made primarily on TSMC 7nm, with some others on Samsung's 7nm. Did nVidia make a statement in the past couple of months saying otherwise? I personally think they switched from Samsung to TSMC for the bulk of these upcoming chips because they plan to use TSMC's CoWoS technology.
-Tj-:

Is that "market ignorant" a new word you like to toss around like your placebos? lol, fail. xD
The availability of, and consumer demand for, an item dictate its price. Anyone who doesn't understand that concept is "market ignorant" and/or what Bernie Sanders calls a "democratic socialist"...
Andrew LB:

The availability of, and consumer demand for, an item dictate its price. Anyone who doesn't understand that concept is "market ignorant" and/or what Bernie Sanders calls a "democratic socialist"...
Yeah - because the demand for a video card is just like healthcare... You can make points without bringing politics into it.