
Nvidia Ampere GA100-GPU would get 8192 cores and boost speed up to 2200 MHz

by Hilbert Hagedoorn on: 03/07/2020 01:51 PM | source: Stage1 | 37 comment(s)

As GDC closes in, most people expect some NVIDIA Ampere announcements during the keynote; time will tell. Meanwhile, specs and benchmarks of what we assume to be Ampere HPC parts have already leaked, and new info has now been spotted on the web.

When those benchmark entries were spotted, I already explained that most GPUs work in multiples of eight, and that I would not be surprised to see 128 shader clusters (SMs) and thus a possible 8192 shader cores for a fully enabled GPU. While it is nothing more than a user posting some info on a forum, new speculation is drawing attention. A user on the Chinese forum Stage1, who apparently has shared reliable info in the past, this round talks about a chip called 'GA100', with GA obviously standing for GeForce Ampere. Here is what he posted for GA100:

  • 128 SMs, 8192 CUDA cores, 24/48 GB HBM2e, boost frequency up to 2.2 GHz, double the tensor cores, 300 W TDP

GA100 would get far more compute performance than expected: the 7nm-fabbed GA100 would indeed get 8192 cores and thus 128 SMs, meaning NVIDIA is pretty much staying monolithic and doubling up on its transistor budget, which, going from 12nm and 14nm to 7nm, is very possible. Very interesting is the mention of a boost clock of up to 2200 MHz, which seems high, especially with a 300 W TDP.

All that, in combination with 24, 32 or even 48 GB of HBM2e graphics memory, could deliver roughly 32 teraflops of performance. Obviously we're talking data-center and supercomputer-centric products here, but Ampere paves the way, architecture-wise, for the consumer products as well. Apparently the number of tensor cores would double as well.
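As a quick sanity check on that figure, FP32 throughput follows from the standard formula of shader cores × 2 FLOPs per FMA × clock; the GA100 numbers below are the leaked ones, not anything confirmed by NVIDIA:

```python
# Rough FP32 throughput check for the rumored GA100 spec. All figures here
# are leaked/speculative, not confirmed by NVIDIA.
def fp32_tflops(shader_cores: int, boost_ghz: float) -> float:
    # Each CUDA core retires one FMA (2 FLOPs) per clock.
    return shader_cores * 2 * boost_ghz / 1000.0

print(fp32_tflops(8192, 2.2))   # ~36 TFLOPS at the full 2.2 GHz boost
print(fp32_tflops(8192, 1.95))  # ~32 TFLOPS, matching the quoted figure
```

At the full 2.2 GHz boost the same math yields closer to 36 TFLOPS, so the 32-teraflop figure implies a sustained clock just under 2 GHz.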

Relativity can be a bitch - of course, remember, this info is based upon one post in a forum. Grab some salt, but with the number of leaks and GDC so close, I can't rule this info out either.

  








Related Stories

Rumor: NVIDIA Ampere GeForce RTX 3070 and RTX 3080 specs surface - 01/20/2020 01:04 PM
It has been a crazy year, lots of leaks and info mostly coming through unknown Twitter accounts. Today we can add to that as alleged specifications of the GeForce RTX 3070 and RTX 3080 have surfaced....

NVIDIA Announces New 360Hz refresh rate G-SYNC Esports Displays - 01/06/2020 12:04 PM
NVIDIA today unveiled new G-SYNC® displays with a 360Hz refresh rate, providing esports enthusiasts and competitive gamers with the fastest gaming displays ever made. At 360Hz, game frames are displa...

Next Generation NVIDIA Ampere reportedly to offer 50% more perf at half the power - 01/03/2020 11:59 AM
An interesting set of quotes have been made in the Taipei Times. This year we expect the NVIDIA Ampere series GPUs to land, and with increased competition from AMD and maybe Intel, they might be beefi...

NVIDIA and Tech Leaders Team to Build GPU-Accelerated Arm Servers for New HPC Architectures - 11/19/2019 06:42 PM
NVIDIA today introduced a reference design platform that enables companies to quickly build GPU-accelerated Arm®-based servers, driving a new era of high performance computing for a growing range of ...

NVIDIA Announces Jetson Xavier NX, Smallest Computer for AI at the Edge - 11/07/2019 09:08 AM
NVIDIA today introduced Jetson Xavier NX, the world's smallest, most powerful AI supercomputer for robotic and embedded computing devices at the edge. With a compact form factor smaller than the size...




alanm
Senior Member



Posts: 11452
Joined: 2004-05-10

#5767602 Posted on: 03/10/2020 01:22 AM
Even if it's "only" $1000 MSRP, it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination. In b4 someone comes in talking about how HBM2 costs more than an ocean of virgin blood, that their R&D costs more than 10 billion pure souls, then links to nVidia's ballsack's reddit page stating they pay everyone in their staff and homeless people outside $10K per minute therefore they need to charge insulting amounts. :rolleyes:

I'll drop to 30 fps gaming or buy a console before I pay nVidia's mafia monopoly prices. Or better yet, bust out my backlog of old games. Too bad, I was looking forward to Cyberpunk on PC (no, I won't play it on a console). Guess I'll play that in 2026 or so. I can wait.
Yeah, $1000 GPUs are a horrible thing if it holds into next gen, but I doubt the massive dies used in them allow much room to come down in price to Pascal levels.

Astyanax
Senior Member



Posts: 15007
Joined: 2018-03-21

#5767681 Posted on: 03/10/2020 05:37 AM
Is that "market ignorant" a new word you like to toss around like your placebos?

lol, fail. xD

The failure is you.

https://www.guru3d.com/news-story/gddr6-significantly-more-expensive-than-gddr5.html
https://www.reddit.com/r/nvidia/comments/99r2x3/attempting-to-work-out-rtx-2080ti-die-cost-long/

And that doesn't even include costings for PCB complexity.



I think you're making valid points but $115 is an immense difference for the die alone. The costs also don't end there because you need to factor in the cost of validating the die and ensuring that it runs error free at its rated clock speed. All those faulty dies still need to be checked and people are getting paid to do so.

Then you need to consider that the Ti card uses a 352-bit PCB vs a 256-bit PCB on the 1080. Those are more expensive, while you also need 11 memory chips instead of 8 to run on that kind of bus. A single 1GB GDDR6 chip cost ~$25 the last time I checked, so the final tally is now possibly a $250 difference on just these three components.

It all adds up and we're still only looking at the bill-of-materials side of it. Factor in R&D and the cost of software engineers and those costs quickly start rising. Only, this time, we're not just talking about rasterization but ray-tracing and AI as well.

At the risk of seeming insensitive, if you cannot afford a $700 GPU then you can also not afford a $600 GPU. My own utilities bill includes $110 to my ISP and Netflix every single month, which means that I'll be spending $2640 over the next two years on just these two things. The RTX 2080 should easily last me the same amount of time, and costing $700 instead of $600 is completely meaningless to me. Someone complaining about a $100 increase just tells me that he isn't the person paying for utilities in his house and he shouldn't be spending $600 on a GPU in the first place.



nuff said :cool:
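For what it's worth, the component math quoted above tallies in a few lines; the die and memory figures come from the quoted post, while the PCB delta is purely an assumed placeholder, not a sourced number:

```python
# Back-of-envelope BOM delta between a Ti-class card and its non-Ti sibling,
# using the figures quoted in the thread. The PCB delta is a placeholder
# assumption, not a sourced price.
die_delta = 115                 # larger die, per the linked reddit estimate
memory_delta = (11 - 8) * 25    # three extra 1 GB GDDR6 chips at ~$25 each
pcb_delta = 60                  # 352-bit vs 256-bit board, assumed figure

total = die_delta + memory_delta + pcb_delta
print(total)  # ~$250 before validation, R&D and software costs
```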

Turing is priced exactly where it should be for the people who were passing over pascal as an upgrade.

The people who won't buy turing weren't going to buy turing anyway regardless of the price.

-Tj-
Senior Member



Posts: 17811
Joined: 2012-05-18

#5767804 Posted on: 03/10/2020 02:00 PM
So RAM was more expensive and now it made the GPU 50% more expensive.. right.. nice trolling.

AMD's HBM costs more, and in the end prices got slashed back down to where they were before the mining craze and all..


That said, the only price hike on NV's side was, yes, them taking advantage of the mining market. Greed has no limits.. but it took a toll on them for sure.

I saw some news a while ago that high-end GPUs won't stay as expensive as they are now; lower-end ones like the xx60 and xx70 will stay roughly the same, while top-tier cards will see price cuts.

Astyanax
Senior Member



Posts: 15007
Joined: 2018-03-21

#5767830 Posted on: 03/10/2020 03:04 PM
So ram was more expensive and now it made gpu 50% more expensive.. right..nice trolling.


You were given 2 links, and a third component to consider, and you only picked on the one whose cost scales directly with the amount used.

https://www.reddit.com/r/nvidia/comments/9o6256/2080ti-allocation-problems/

A good part of the freaking cost is the component shortage; another part is the price per wafer to buy TSMC fab time.

If Ampere is cheaper I'll be surprised, since EUV takes longer to fab and backdrilling costs time and money.

Astyanax
Senior Member



Posts: 15007
Joined: 2018-03-21

#5768661 Posted on: 03/12/2020 09:08 AM


102 = 3080ti, RTX Titan A, Quadro RTX A8000
103 = 3080 Full/3070
104 = 3060ti Full/3060
106 = 3050ti Full/3050 160bit interface?
107 = 3030(no nvenc?)





Guru3D.com © 2023