Nvidia GTX 1080 Ti vs Titan X PCB Explored

Interesting. Think it will affect overclocking any?

> Interesting. Think it will affect overclocking any?
Nope. Reference cards overclock the same as even the best boards; the limiting factor these days is the silicon lottery.

"Board partners will likely add DVI." Please, don't.

I think for the first time I might actually go with a reference/FE card. If the OC potential is the same as on the non-Ti 1080, I'd be happy with a stock cooler that exhausts heat out of the case while overclocking about as well as the aftermarket cards. I've personally found my G1 quite loud under full load, and the stock cooler is actually quieter... win/win.

Edit: The 1080 Ti PCB looks better than the Titan one. Interesting, so apart from 1 GB less memory and a slightly cut-down memory bus, you can get a better card for half the price. :banana: Glad I waited a bit and didn't jump the gun on the 1080 😀

> Interesting, so apart from 1 GB less memory and a slightly cut-down memory bus, you can get a better card for half the price.
8 fewer ROPs as well. The TXP may end up being better at 4K at the same clocks.

> 8 fewer ROPs as well. The TXP may end up being better at 4K at the same clocks.
Hmm, true, but we're talking a couple of percent here and there, at half the price 🤓

Some customs may be louder, but also a lot cooler. Stock coolers seem to aim for the 80°C mark, which is too much IMO; 75°C max.

> Stock coolers seem to aim for the 80°C mark, which is too much IMO; 75°C max.
http://i.imgur.com/v5dqxZi.jpg Obviously the ambient temperature, and it most likely being an open test bed, affect the performance, but it does seem like the stock cooling solution was improved. Either that, or removing the DVI port did far more for temps than I would expect.

Edit: By comparison, the 180 W GTX 1080 FE loads at ~80°C.

> Hmm, true, but we're talking a couple of percent here and there, at half the price 🤓
Very true. No doubt the TXP is obsolete with the 1080 Ti coming. Then again, don't be surprised if we see a TXP Black or Titan Ultra with 3,840 shader cores, 96 ROPs, 12 GB of 11 GHz GDDR5X and a $1,200 MSRP.
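
A quick back-of-the-envelope check on that "cut-down memory bus" point, as a minimal Python sketch. The bus widths, memory speeds, ROP counts and launch MSRPs below are the published reference specs; the comparison itself is only illustrative.

```python
# Rough memory-subsystem comparison of the two cards.
# Bandwidth (GB/s) = bus width (bits) / 8 * effective memory data rate (Gbps).

cards = {
    # name: (bus width in bits, data rate in Gbps, ROPs, launch MSRP in USD)
    "GTX 1080 Ti":      (352, 11.0, 88,  699),
    "Titan X (Pascal)": (384, 10.0, 96, 1200),
}

for name, (bus_bits, rate_gbps, rops, msrp) in cards.items():
    bandwidth_gbs = bus_bits / 8 * rate_gbps
    print(f"{name:17s} {bandwidth_gbs:5.0f} GB/s, {rops} ROPs, ${msrp}")

# GTX 1080 Ti         484 GB/s, 88 ROPs, $699
# Titan X (Pascal)    480 GB/s, 96 ROPs, $1200
```

The narrower 352-bit bus is more than offset by the faster 11 Gbps GDDR5X, so the cheaper card actually ends up with slightly more bandwidth on paper; the 8 missing ROPs are the only real deficit.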

> Obviously the ambient temperature, and it most likely being an open test bed, affect the performance, but it does seem like the stock cooling solution was improved.
Yeah, no DVI does help for sure; that's at least 50% extra exhaust space now. Btw, is that test just the GPU-Z render, or is anything extra running in the background? It's not the most demanding load; I saw that when I tested my custom-cooled card: ~56°C in GPU-Z, max in-game ~67°C.
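
For anyone wanting to log the same temperature readings outside of GPU-Z, here is a minimal sketch using NVIDIA's NVML bindings. It assumes the nvidia-ml-py package (`pip install nvidia-ml-py`); nothing in the thread implies the posters used this, it simply reads the same sensors.

```python
# Minimal GPU temperature/fan logger via NVML (the sensors GPU-Z also reads).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes
    name = name.decode()

try:
    for _ in range(10):  # sample once per second for ~10 s
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(handle)  # percent of max speed
        print(f"{name}: {temp} °C, fan {fan}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Run it while a game or stress test is active, since, as noted above, the GPU-Z render alone is not the most demanding load.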

> Stock coolers seem to aim for the 80°C mark, which is too much IMO; 75°C max.
Personally I don't mind that much; if the card is actually made to run at that temp (and can do it for a couple of years), I honestly don't care. Especially with fans that exhaust the hot air out the back and not into the case 🙂

No DVI also means single-slot water cooling, just like the Fury X.

> "Board partners will likely add DVI." Please, don't.
I still use DVI.

> I think for the first time I might actually go with a reference/FE card.
Definitely going for the FE for the first time ever, as I did my first water-cooling build a few weeks ago with my older kit and my 980 Ti, so I'm confident replacing water blocks. If I don't buy this generation's Ti, it will definitely be the next one, but 100% FE 😉

Can we finally let DVI die off now? At least with these top-end cards; almost no new monitors even have this old connection. It's like the cards that kept VGA for way longer than they should have :') But seriously, let DVI die. As much as I loved it back in the day, those cables were massive compared to DP or HDMI.

I'm with everyone else on letting DVI die, but wow, when this was the Fury X everyone went nuts; now Nvidia does it and it's the smartest thing ever.

Well, the Fury (X) did come out in June 2015, coming up on two years ago... though DVI shouldn't have been on that card either. I can understand DVI on the lower-end models, but not on the top end. I'd rather the space be left for extra venting or another DP/HDMI port.

> I'm with everyone else on letting DVI die, but wow, when this was the Fury X everyone went nuts; now Nvidia does it and it's the smartest thing ever.
The thing with the Fury X was that not only did it lack DVI, it also lacked HDMI 2.0.
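
A quick worked calculation shows why the missing HDMI 2.0 stung on 4K TVs; the ~20% blanking overhead factor below is an approximation, and the link rates are the nominal HDMI 1.4/2.0 maximums.

```python
# Why HDMI 2.0 mattered: uncompressed 4K60 doesn't fit through HDMI 1.4.
def video_rate_gbps(width, height, hz, bits_per_pixel=24, blanking=1.2):
    """Approximate video data rate in Gbit/s, including a rough ~20%
    allowance for horizontal/vertical blanking intervals."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

needed = video_rate_gbps(3840, 2160, 60)
print(f"4K @ 60 Hz needs roughly {needed:.1f} Gbit/s")  # ~14.3 Gbit/s
# HDMI 1.4 tops out around 10.2 Gbit/s, HDMI 2.0 at 18 Gbit/s,
# so a Fury X feeding a 4K TV over HDMI 1.4 was limited to 30 Hz.
```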

> I still use DVI.
It will be handled with a "dongle" (adapter) for people who need access to a DVI connection.

> I'm with everyone else on letting DVI die, but wow, when this was the Fury X everyone went nuts; now Nvidia does it and it's the smartest thing ever.
Yeah.. <_<