GeForce GTX TITAN-X 3DMark Benchmarks Surface
Okay, take this with a BIG grain of salt, but these could be real. Somebody posted a couple of screenshots of Futuremark 3DMark 11 results, and that somebody apparently has four GeForce GTX Titan X graphics cards and published the results online.
The cards were tested with a Core i7-5960X eight-core processor, and there are scores for both a single GPU and 4-way SLI in 3DMark 11 with its "Extreme" (X) preset. The card scored X7994 points in a single-GPU run and a whopping X24064 points in 4-way SLI, again in Extreme mode. The screenshots that leaked through Videocardz confirm the announced 12 GB of memory, but they also list a boost core clock of 1222 MHz and a memory clock of 1863 MHz (x4) for the quad-SLI setup. We do assume the card was overclocked or tweaked a little, but these are impressive clocks for an 8-billion-transistor product!
Clock-wise, the single GTX Titan X would run at around 1.0 GHz, as the screenshot indicates. The memory is clocked at 1753 MHz (x4), which works out to 7.0 GHz effective, and these numbers do make more sense. The GTX Titan X is expected to get 3072 shader processors, 192 TMUs, 96 ROPs, and a 384-bit wide GDDR5 memory bus with 12 GB of memory. The majority of the specs are based on the GM200 Quadro counterpart. What remains odd, though, is that the entry is listed as 'Generic VGA'; without a driver you simply cannot activate SLI or even the single GPU, so it really should state something like 'GeForce'.
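As a quick sanity check on those leaked memory figures: GDDR5 is quad-pumped, so the effective data rate is four times the reported memory clock, and peak bandwidth then follows from the bus width. The little sketch below just runs that arithmetic with the numbers from the leak (1753 MHz clock, 384-bit bus); the function names are ours, not from any tool.

```python
def gddr5_effective_mhz(memory_clock_mhz: float) -> float:
    """GDDR5 transfers four data words per clock (quad-pumped)."""
    return memory_clock_mhz * 4

def peak_bandwidth_gbs(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: data rate (MT/s) * bus width in bytes / 1000."""
    return effective_mhz * (bus_width_bits / 8) / 1000

effective = gddr5_effective_mhz(1753)            # 7012 MT/s, i.e. ~7.0 GHz effective
bandwidth = peak_bandwidth_gbs(effective, 384)   # ~336.6 GB/s theoretical peak
print(effective, round(bandwidth, 1))
```

That ~336 GB/s figure lines up with what you would expect from a 384-bit GDDR5 card at 7 Gbps.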
New Photos of GeForce GTX Titan X - 03/08/2015 10:14 AM
We've already shown you a good bunch of photos of Nvidia's latest and greatest, the GeForce GTX Titan X, here. But some new photos have surfaced and you really do want to take a look at them!...
Gigabyte now offers GeForce GTX 960 in 4GB versions - 03/06/2015 07:41 PM
Gigabyte now offers the GTX 960 with 4 GB of graphics memory. The larger frame buffer can be useful when playing at higher resolutions, or when playing with higher-resolution textures installed...
NVIDIA GeForce TITAN-X Revealed (Updated) - 03/05/2015 07:09 PM
Well, kinda. Over at GDC there was a quickie graphics card announcement. The GeForce GTX TITAN-X will be based on Maxwell, but with a massive transistor count: it would have a 12 GB frame buffer, 3072 shader processors and 8 billion transistors.
Unreal Engine 4 Kite Demo - Running on GeForce TITAN X - 03/05/2015 09:57 AM
NVIDIA has opened up PhysX Code to UE4 Developers. When you experience the huge, realistic world of our Kite demo running on NVIDIA TITAN X, you see the result of Epic’s longstanding commi...
Download Nvidia GeForce 347.71 driver - 02/27/2015 07:50 PM
You can now download the Nvidia GeForce 347.71 driver. This is a GeForce Hot Fix driver. More details on the download page. Download here...
Senior Member
Posts: 11809
Joined: 2012-07-20
Great card, awesome performance, but far too expensive. Of course, 1000 bucks in the US means it is going to be 1200 in Canada. That is just insanity. AMD needs to get their act together and release the 3xx cards already; it is time to put an end to this sickening price gouging. There is no way this card should be more than 750. This is what the US prices should be:
GTX 970 - 279.00
GTX 980 - 429.00
GTX 970 Ti (when released) - 379.00
GTX 980 Ti (when released) - 529.00
Titan X - 749.00
The issue here is apparent...
There is no performance gap between the 970 and 980, therefore no need for a 970 Ti. Other than that, the Titan X should cost about the same as the r9-390x will, since it is not actually a Titan.
nVidia pulled a trick on people this time around. They took the chip which should have been named 970 Ti, cut it down to the 970, and instead sold it to you as the 980/970.
Now the chip which should have been the 980 Ti, with its cut-down version as the 980, is instead the Titan X + 980 Ti.
Same silicon, higher naming, higher prices. People fell for it.
When I look at a chip and its price, I consider transistor count and die size to estimate the manufacturing cost and where it should sit in comparison to already-released GPUs, then add 20% as a new-product premium and you are there. (This goes for a full chip; for a cut-down chip the premium should be at most 10%.)
The 980 is a 5.2B-transistor chip, so its price should sit between the 4.3B r9-280x and the 6.2B r9-290x, as both are full chips, but closer to the r9-290x since its die area is very close to that (lower transistor density, which is likely the reason for its good OC). The GTX 970, as a cut-down chip priced like the r9-290x, is OK due to the "new product premium".
For me, the Titan X (full GM200) launch price should be $850-900 (but everyone knows that nVidia has a fixed price for the Titan brand until the day they stop selling it).
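The pricing heuristic described above (interpolate a "fair" price between two already-released full chips by transistor count, then add a new-product premium of 20% for a full chip or 10% for a cut-down part) can be sketched roughly like this. The function is purely illustrative, and the reference prices in the example are placeholder numbers, not real launch MSRPs:

```python
def estimate_price(transistors_b: float,
                   ref_low: tuple,    # (transistors in billions, price) of cheaper reference GPU
                   ref_high: tuple,   # (transistors in billions, price) of pricier reference GPU
                   full_chip: bool = True) -> float:
    """Estimate a 'fair' launch price by linear interpolation on transistor
    count between two reference GPUs, plus a new-product premium."""
    t_lo, p_lo = ref_low
    t_hi, p_hi = ref_high
    base = p_lo + (transistors_b - t_lo) * (p_hi - p_lo) / (t_hi - t_lo)
    premium = 1.20 if full_chip else 1.10
    return base * premium

# Example with made-up reference prices for the 4.3B r9-280x
# and 6.2B r9-290x mentioned in the post, estimating a 5.2B chip:
print(round(estimate_price(5.2, (4.3, 300.0), (6.2, 550.0)), 2))
```

The interpolation lands between the two reference prices and the premium then lifts it, which matches the poster's "between the two, plus 20%" reasoning.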
Senior Member
Posts: 446
Joined: 2008-02-14
The official TDP is not the same as the power actually used, or the real BIOS TDP. I was not saying you were wrong in that post; I assumed the TDP was 250 W as well. I was saying it uses more according to tests online (like the advertised 165 W TDP versus the real 180 W TDP of the GTX 980).
My Galax GTX 980 is fun, ;oP. I was nervous about noise from the fan, but it is silent. I have a sound-resistant case, but it sits on my desktop and is still quiet enough. (Just putting this in because "is the Galax GTX 980 V2 noisy" was what I was searching for last week; it might help someone.)