

NVIDIA Pascal GP104 Die Photo

by Hilbert Hagedoorn on: 04/11/2016 10:39 AM | 23 comment(s)

Last week Nvidia announced the Tesla P100 data-center GPU, and if you looked closely at the photos you could already clearly see the big 15-billion-transistor Pascal GPU in use. A new photo has now surfaced, this time showing the GP104 GPU, the successor to the GM204 found in the GTX 970/980.

According to the leak, the photo shown below is the GP104 (the GM204 successor) intended for Nvidia's pending high-end products, and it measures roughly 290-300 mm². On the right side of the photo you can see Samsung K4G80325FB memory: 1.5 V, 8 Gb GDDR5 rated at 8 Gbps (8000 MHz effective). Word on the street is that this GPU holds 2,560 shader processors.

The rectangular GP104 die was measured at 15.35 mm x 19.18 mm, which should house (very speculatively) a transistor count of 7.4-7.9 billion. Interestingly, this chip is paired with Samsung 8-gigabit GDDR5 memory chips. These ICs run at an effective 8 Gbps per pin (plain GDDR5, not GDDR5X). On a 256-bit bus you'd be looking at 256 GB/s of bandwidth. The shader processor count of the GP104 could be closer to 2,560 than the 4,096 from an older report.
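
To sanity-check those numbers, here is a quick back-of-the-envelope calculation (a small Python sketch, not from the source); the bus width, per-pin rate and die dimensions are the leaked/measured values quoted above, the rest is plain arithmetic.

```python
# Back-of-the-envelope check of the leaked GP104 figures quoted above.
bus_width_bits = 256      # rumored memory interface width
data_rate_gbps = 8        # effective per-pin rate of the Samsung 8 Gbps GDDR5

# Bandwidth = bus width (bits) x per-pin rate (Gbit/s) / 8 bits per byte
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")   # -> 256 GB/s

# Die area from the measured 15.35 mm x 19.18 mm rectangle
die_area_mm2 = 15.35 * 19.18
print(f"Die area: {die_area_mm2:.1f} mm^2")             # -> ~294.4 mm^2, within the quoted 290-300 mm^2
```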

Earlier it was speculated that graphics cards based on these GPUs would be called X70 and X80; the specs reported so far do not quite line up with that, though.



NVIDIA Pascal GP104 Die Photo





Related Stories

Judge rules that Samsung and Qualcomm do not violate Nvidia patents - 10/12/2015 09:10 AM
According to a ruling by an International Trade Commission judge, Samsung and Qualcomm do not violate Nvidia patents. Nvidia announced this themselves on their blog. Nvidia will appeal this r...

AMD and Nvidia prep for next-gen DirectX 12 - 03/24/2015 11:19 AM
Several new sets of slides from AMD and Nvidia surfaced on the web via SlideShare; the decks show what the companies have been up to with regard to DirectX 12, and they do look pretty exc...

Nvidia PhysX Source Code Now Available Free - 03/06/2015 09:03 AM
NVIDIA today put more than a decade of research, development and investment in gaming physics into the hands of game developers – by offering free source code for NVIDIA PhysX on GitHub. Thi...

New Nvidia PhysX FleX features - 12/11/2014 10:20 AM
PhysX FleX is a particle based simulation technique for real-time visual effects. So far we showed examples for rigid body stacking, particle piles, soft bodies and fluids. In this video we'll be sho...

Ubisoft and NVIDIA Partner Up for Assassin's Creed Unity Far Cry 4 And More - 06/05/2014 09:03 AM
Ubisoft and NVIDIA today announced the next chapter in their strategic partnership bringing amazing PC gaming experiences to life in Ubisoft's highly anticipated upcoming titles including Assassin's...




PrMinisterGR
Senior Member



Posts: 7975
Joined: 2014-09-27

#5256549 Posted on: 04/11/2016 04:46 PM
This is such complete speculation. It all depends on the clocks too. This is more or less a shrunk 980Ti. If it hits 2GHz then it's fine.

xIcarus
Senior Member



Posts: 954
Joined: 2010-08-24

#5256555 Posted on: 04/11/2016 04:57 PM
Quote: "I don't see the problem as the GTX brand has been diluted to the point of being meaningless. It used to be only the high end models were GTX, like *70 and *80 series cards. Now you have GTX 950's out the *ss. Seriously? It's a 950."


Word. The 950 should not get the GTX badge.
But at least they dropped the GT vs Ultra suffixes, those were a bit confusing.

It should be as it was with the 500 series. 550 and under was GT, 560 and above was GTX.
At least they're not overinflating their numbers like AMD does. 7850, 7870, 7950, 7970, 7990. Versus 650, 660, 670, 680, 690. WTF srsly. It's very confusing and even now I have to look up benchmarks every damn time I want to compare the cards.

rl66
Senior Member



Posts: 3369
Joined: 2007-05-31

#5256645 Posted on: 04/11/2016 07:49 PM

Quote: "I haven't seen conclusive proof that HBM actually helps the Fury X to a tangible level. If anyone has such proof, please share it with me."

Yes, obviously it pushes bandwidth to the top, but sadly for us AMD made the terrible choice of HBM1, which is limited to 4 GB...

So performance is crippled by the lack of memory space (and in some cases even the previous-gen high end does better).

Now about the "GDDR5 only" drama: in the picture I just see a prototype or sample. If it works with GDDR5 then it works with GDDR5X too, it's pin-to-pin compatible (I can already smell old GPU stock rebranded with GDDR5X and a new name... lol).

And GDDR5X is clearly nice RAM, and costs less than HBM2 (which will be used for the high end in red and green flavors for sure).

"Don't sell the bear's skin before you've caught it": no GPU from either company is ready yet... let them come out and be tested :)

rl66
Senior Member



Posts: 3369
Joined: 2007-05-31

#5256654 Posted on: 04/11/2016 08:08 PM
Quote: "I don't see the problem as the GTX brand has been diluted to the point of being meaningless. It used to be only the high end models were GTX, like *70 and *80 series cards. Now you have GTX 950's out the *ss. Seriously? It's a 950."

GTX means gaming, not only high end at 4K over 100 fps :)

There are people who don't need that much power (or money, lol).
The GTX 950 is aimed at 1080p 60 fps for MOBAs and games like WoT etc... and it does that pretty well, and even in some top games from last year it performs really well (I was clearly impressed the first time I paired it with an i3... you get a lot for the money... I understand its success).

It is not a high-end GPU, but it is clearly a GTX.

Shadowdane
Senior Member



Posts: 1433
Joined: 2004-02-14

#5256660 Posted on: 04/11/2016 08:19 PM
I kinda get the feeling big Pascal GP100 will be only for Tesla GPUs. Those big chips are just too expensive to sell in consumer gear. The DGX-1 box costs around $130,000 with 8 of those GPUs. If you take off around $30K for the other server components, you're looking at around $12,500 per GPU! I don't think we'll see anything like this chip for consumers until the 16nm process is mature. I bet the chip yields are horrible right now, hence why they cost a fortune.

I bet the GP104 chip will be a different design entirely. Well, still based on the GP100, but with some changes. They will probably cut all the double-precision (DP) units; games don't really make use of that hardware. That silicon space would be much better utilized for single-precision (SP) shader cores. Currently half of the silicon space on the GP100 is used for DP units, so using that space for SP units would be a much better design for a gaming GPU.
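
For what it's worth, the per-GPU estimate in that comment is straightforward arithmetic; a minimal sketch using the commenter's own numbers (the ~$30K deducted for non-GPU server hardware is their assumption, not an official figure):

```python
# Rough implied cost per GP100, following the reasoning in the comment above.
dgx1_price_usd = 130_000       # quoted price of the 8-GPU DGX-1 box
other_hw_usd = 30_000          # commenter's estimate for the non-GPU components
gpu_count = 8

cost_per_gpu = (dgx1_price_usd - other_hw_usd) / gpu_count
print(f"Implied cost per GPU: ${cost_per_gpu:,.0f}")   # -> $12,500
```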




