NVIDIA Moving to TSMC 16nm FinFET process

I hope Skylake is available by the time the GPUs on 16nm FinFET are released.
Aaaand I just bought a 970. Haha crap.
Makes ya wonder why TSMC even wasted resources on 20nm R&D when only a few products use it. Might as well have focused totally on 16nm and gotten there a bit faster.
Aaaand I just bought a 970. Haha crap.
By the looks of it you have plenty of time :P
I'm waiting for Skylake myself, but I don't think we'll see it before 2016. Broadwell is next, and the desktop/K versions probably won't be out till late Q1/Q2 2015..
It's on schedule at the moment for Q3/Q4; I'm just hoping it's more Q3 than Q4. http://www.extremetech.com/wp-content/uploads/2014/11/Intel-Desktop.jpg http://www.extremetech.com/computing/194925-intels-comprehensive-roadmap-update-4g-tablets-skylake-on-track-for-h2-2015
I don't know why, but Skylake is calling out to me. Seems like a logical upgrade from my 2600k for some reason.
By the looks of it you have plenty of time :P
Yeah I guess you're right. The old 560 wasn't cutting it anymore.
I don't know why, but Skylake is calling out to me. Seems like a logical upgrade from my 2600k for some reason.
I second that.
I don't know why, but Skylake is calling out to me. Seems like a logical upgrade from my 2600k for some reason.
Meh, I doubt even Skylake will be that big of an upgrade from Sandy Bridge. Even from the Gulftown CPU I have it won't be that big a deal. Unless you're doing time-critical heavy-duty stuff with your PC, there's no reason to upgrade unless you're still on a Core 2 Duo/Core 2 Quad. It would be nice if Intel actually upgraded their performance for real, not these BS 10-15% upgrades at best, and that's being generous; usually the performance gain is much less than that. What happened to the 50-100% gains?
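A back-of-the-envelope sketch of how even those "BS" per-generation gains stack up: assuming a hypothetical 12% average gain per generation (my own illustrative number, not a measured benchmark), gains multiply rather than add.

```python
# Hypothetical per-generation uplift -- illustrative only, not a measured figure.
per_gen_gain = 0.12

# e.g. Sandy Bridge -> Ivy Bridge -> Haswell -> Broadwell -> Skylake
gens = 4

# Gains compound multiplicatively: (1 + g)^n
cumulative = (1 + per_gen_gain) ** gens
print(f"{gens} generations at {per_gen_gain:.0%} each -> {cumulative - 1:.0%} total")
# -> 4 generations at 12% each -> 57% total
```

So even sub-20% steps eventually reach the old "50-100%" range; the complaint is really about how many generations that now takes.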
Skylake is reportedly about 3 percent faster than Broadwell. The first locked Skylake CPUs will be released alongside the unlocked Broadwell CPUs, so that suggests the performance differential. Unlocked Skylake may not eventuate; they may try to push people who overclock onto the -E platform. No competition! However, if the AMD post-Excavator CPU is decent, especially if it's VISC-based, Intel will quickly lift their game. Wonder what process AMD will use next.
I wouldn't mind if they locked out all overclocking on the mainstream chipset. I only build on the enthusiast platform anyway.
Skylake is reportedly about 3 percent faster than Broadwell. The first locked Skylake CPUs will be released alongside the unlocked Broadwell CPUs, so that suggests the performance differential. Unlocked Skylake may not eventuate; they may try to push people who overclock onto the -E platform. No competition! However, if the AMD post-Excavator CPU is decent, especially if it's VISC-based, Intel will quickly lift their game. Wonder what process AMD will use next.
They haven't had competition for a long time and they still brought out their products. I don't see it being any different.
I don't know why, but Skylake is calling out to me. Seems like a logical upgrade from my 2600k for some reason.
Same here, I just want it to be ready when the new GPUs are, though, so I can make a killer rig :banana:
I wonder how this 16nm iteration will change GPU performance, or whether NVIDIA uses it for its mobile device chips (have they shut that down, or do I recall it wrong?). Because even if AMD switches to 20nm before NVIDIA does, they'd still have to wait until they get to 16nm after the 390X, for instance.
I see a lot of people using the numbers 28/20/16/14. But not many of those who use them know what those nodes actually are or what types of devices they're meant for. Nor do they know the limitations of each specific node and its variations.
16nm FinFET is actually an enhanced 20nm process.
16nm FinFET is actually an enhanced 20nm process.
The same is true of 14nm FinFET, but these nodes still decrease leakage and lower power consumption. The only downside is that performance won't increase much from it -- though big performance gains haven't really been coming directly from die shrinks anyway.
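For context on the leakage/power point, the classic first-order CMOS relation for dynamic power is P ≈ a·C·V²·f, so power falls with the square of the supply voltage. A minimal sketch with made-up, normalised numbers (the 0.9 and 0.85 factors below are my own assumptions for illustration, not TSMC figures):

```python
def dynamic_power(c_eff, v_dd, freq, activity=1.0):
    """First-order CMOS dynamic power: P = activity * C * V^2 * f."""
    return activity * c_eff * v_dd ** 2 * freq

# Hypothetical: suppose the new node gives ~10% lower effective capacitance
# and lets the chip run at 0.85x the old supply voltage, at the same clock.
p_old = dynamic_power(c_eff=1.0, v_dd=1.0, freq=1.0)
p_new = dynamic_power(c_eff=0.9, v_dd=0.85, freq=1.0)
print(f"relative dynamic power: {p_new / p_old:.2f}")
# -> relative dynamic power: 0.65
```

The quadratic voltage term is why a modest Vdd drop cuts power so much even when clocks (and thus raw performance) barely move.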
Could be just Tegra SoCs.
Aaaand I just bought a 970. Haha crap.
I bought 2x a month ago and I don't regret it one bit; the best bang-for-your-buck price/performance cards I've had in years. At the same time I'm looking forward to the 16nm cards, and hope they will land hard. Already dreaming of an MSI Lightning 16nm flagship beast. 🤓
TSMC is ready to move to volume production of their 16nm FinFET process, and Nvidia is joining them based on a recent report. This is interesting news for several reasons, including the one that is called ... NVIDIA Moving to TSMC 16nm FinFET process
I'm sure my next GPU will probably be on the next nm size down, seeing how long I wait between GPUs. Impressive nonetheless.
I'm sure my next GPU will probably be on the next nm size down, seeing how long I wait between GPUs. Impressive nonetheless.
If I still had a 1920x1080 monitor I would wait for the next shrink; unfortunately at 2560x1600 a 5870 just doesn't cut it, plus the card is acting weird and I think it's gonna die soon. I'm barely able to hold on for 390X vs GM200 to see what I'm going to upgrade to. Whichever card wins the performance/energy use + heat ratio will be my next card; price is disregarded. Zero brand loyalty here.
All this nm talk is nothing; bring the 16nm/20nm GPUs already.. :P
Whichever card wins the performance/energy use + heat ratio will be my next card; price is disregarded. Zero brand loyalty here.
Power consumption is overrated; I personally buy a GPU for performance, not to save power. Unless it runs hot, but then there are custom coolers. Maxwell turned out not so power-friendly, but everyone praises it like GK104 (mostly diehard NV fans). Useless.
Could be just Tegra SoCs.
The problem with this type of information is that they don't specify at all what product it is, just that it will be available. You have a different type of process for each node: HP (high performance), HPL (HP low power), and LP (low power, generally used on SoCs). But with FinFET I think there's less difference between the types of process used. Then there are two versions of 16nm FinFET (FinFET and FinFET+). To be honest, TSMC cares more about saying Apple will be their client (only for 10% of the next A9 SoC; 90% will be made by Samsung on 16nm FinFET, which could mean Samsung+GloFo have a good lead on it)... Or today's SoCs ...
What happened to the 50-100% gains?
Depends where on the curve this development is. The Law of Diminishing Returns will always make its presence known.