GeForce GTX TITAN-X 3DMark Benchmarks Surface
Okay, BIG grain of salt here, but they could be real. Somebody posted a couple of screenshots of Futuremark 3DMark 11 results; that somebody also apparently has four GeForce GTX Titan X graphics cards and published the results online.
The cards were tested properly with a Core i7-5960X eight-core processor, and scores were posted for both single-GPU and 4-way SLI runs on 3DMark 11 with its "Extreme" (X) preset. The card scored X7994 points in a single-GPU run and a whopping X24064 points in 4-way SLI, again in Extreme mode. The screenshots that leaked through Videocardz confirm the announced 12 GB of memory, but they also list a boost clock of 1222 MHz and a memory clock of 1863 MHz (x4) for the quad-SLI setup. We do assume the card was overclocked or tweaked a little, but these clocks are yummie for an 8-billion-transistor product!
Clock-wise, a single GTX Titan X would run at around 1.0 GHz, as the screenshot indicates. The memory is clocked at 1753 MHz (x4) = 7.0 GHz effective, and these numbers do make more sense. The GTX Titan X is expected to get 3072 shader processors, 192 TMUs, 96 ROPs, and a 384-bit wide GDDR5 memory bus with 12 GB of memory. The majority of the specs are based on its GM200 Quadro counterpart. What remains weird, though, is that the entry is listed as 'Generic VGA'; without a driver you simply cannot activate SLI, or even the single GPU, so it really should state something like 'GeForce'.
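Those memory numbers can be sanity-checked with a quick calculation. This is a sketch assuming the rumored 384-bit bus width, and the fact that GDDR5 transfers data four times per clock (hence the "x4"):

```python
# Sanity check on the rumored GTX Titan X memory specs.
base_clock_mhz = 1753                # memory clock shown in the screenshot
effective_mhz = base_clock_mhz * 4   # GDDR5 is quad-pumped (4 transfers/clock)
bus_width_bits = 384                 # rumored bus width

effective_ghz = effective_mhz / 1000
# bandwidth = data rate (transfers/s) * bus width in bytes
bandwidth_gbps = effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(effective_ghz)    # 7.012 -> the "7.0 GHz effective" quoted above
print(bandwidth_gbps)   # 336.576 -> peak theoretical bandwidth in GB/s
```

That ~336 GB/s figure matches what a 384-bit GDDR5 card at 7 GHz effective should deliver on paper, which is why the single-card clocks "make more sense" than the quad-SLI readout.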
New Photos of GeForce GTX Titan X - 03/08/2015 10:14 AM
We've already shown you a good bunch of photos of Nvidia's latest and greatest, the GeForce GTX Titan X, here. But some new photos have surfaced, and you really do want to take a look at them!...
Gigabyte now offers GeForce GTX 960 in 4GB versions - 03/06/2015 07:41 PM
Gigabyte now offers a GTX 960 available with 4 GB graphics memory. The larger frame buffer can be useful when playing at higher resolutions, or when playing with higher-resolution textures installed...
NVIDIA GeForce TITAN-X Revealed (Updated) - 03/05/2015 07:09 PM
Well, kinda: over at GDC there was a quick graphics card announcement. The GeForce GTX TITAN-X will be based on Maxwell, but with a massive transistor count. It would have a 12 GB frame buffer, 3072 shader processors and 8 billion transistors.
Unreal Engine 4 Kite Demo - Running on GeForce TITAN X - 03/05/2015 09:57 AM
NVIDIA has opened up PhysX Code to UE4 Developers. When you experience the huge, realistic world of our Kite demo running on NVIDIA TITAN X, you see the result of Epic’s longstanding commi...
Download Nvidia GeForce 347.71 driver - 02/27/2015 07:50 PM
You can now download the Nvidia GeForce 347.71 driver. This is a GeForce Hot Fix driver. More details on the download page. Download here...
Senior Member
Posts: 3395
Joined: 2007-05-31
Because it was cheap and because a lot of people were positive about it, I bought an HD 7950... one of my worst ideas. The card has been sitting in a box for about a year now, and even at €50 no one wants it.
The card itself is great... on Linux without the AMD driver (and a friend gets nice results on a Mac).
On Windows the drivers **** everything up:
- bugs with OpenGL and OpenCL (which is unacceptable, as AMD promotes those)
- catastrophic shader performance
- etc., etc.
Of course, for the price you get power (the GPU and its design are not at fault)... but because of the drivers, next time I would buy a lower-end Nvidia card instead.
Senior Member
Posts: 446
Joined: 2008-02-14
Pixel fill rate is the same as the Northern Islands series; this is what I look at most (along with everything else).
The Titan X uses 258 watts, so it looks like I am getting a 980 for my 380-watt PSU, hehehe. Maybe I should wait for a Titan Y, which might be under 200 W.
Senior Member
Posts: 11809
Joined: 2012-07-20
Pixel fill rate is the same as the Northern Islands series; this is what I look at most (along with everything else).
The Titan X uses 258 watts, so it looks like I am getting a 980 for my 380-watt PSU, hehehe. Maybe I should wait for a Titan Y, which might be under 200 W.
Good luck moving from a 151 W GPU whose power draw is stable to a GTX 980, which draws 190 W on average and has huge spikes up and down as power is needed.
New AMD GPUs will have similar inter-frame power-saving features, but that really creates power-draw spikes and requires an accordingly better PSU.
Senior Member
Posts: 292
Joined: 2013-08-16
I am looking forward to new AMD drivers, but only for new features (VSR on older cards, FreeSync support for newer ones), not for fixes, since I need none at the moment.
Dummies calling the R9 390X hot before they get their hands on it are just dummies who don't realize their 970/980 is only a 5-billion-transistor chip, while the Titan X is an 8-billion one, plus a 50% wider memory bus and therefore 50% more memory chips.
It will for sure be a 300 W card; those who OC it, as usual, will have a 450 W+ card, and they'll not complain about that.
Heat haters should crawl back under their cold rocks.
ROFL, you do realize it's impossible for that card to draw a single watt more than 300 from those power connections? It only has a single 6-pin and another 8-pin connector; even adding in the power the PCIe slot itself can deliver, the maximum is 300 W. 450 W OC? Please, gtfo.
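For reference, that 300 W ceiling comes straight from the PCI Express power-delivery limits: 75 W from the x16 slot, 75 W from a 6-pin auxiliary connector, and 150 W from an 8-pin one. A quick sketch of the arithmetic:

```python
# In-spec PCI Express power-delivery limits per source.
slot_w = 75        # PCIe x16 slot
six_pin_w = 75     # 6-pin auxiliary connector
eight_pin_w = 150  # 8-pin auxiliary connector

# A card with one 6-pin and one 8-pin connector (like the Titan X)
max_board_power_w = slot_w + six_pin_w + eight_pin_w
print(max_board_power_w)  # 300 -> in-spec maximum board power in watts
```

Cards can and do exceed these figures out of spec when heavily overclocked, which is what the 450 W claim above is implicitly assuming; in-spec, 300 W is the ceiling for this connector layout.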

Senior Member
Posts: 11809
Joined: 2012-07-20
I am looking forward to new AMD drivers, but only for new features (VSR on older cards, FreeSync support for newer ones), not for fixes, since I need none at the moment.
Dummies calling the R9 390X hot before they get their hands on it are just dummies who don't realize their 970/980 is only a 5-billion-transistor chip, while the Titan X is an 8-billion one, plus a 50% wider memory bus and therefore 50% more memory chips.
It will for sure be a 300 W card; those who OC it, as usual, will have a 450 W+ card, and they'll not complain about that.
Heat haters should crawl back under their cold rocks.