TSMC Ramping up 2nm Wafer Fabrication Development
Brasky
Is there a size limit where it can't physically get any smaller?
Hilbert Hagedoorn
wavetrex
Fox2232
There was a time, not so long ago, when 5 nm was considered an impossible target. And now they have 3 nm and are talking about 2 nm.
As @Hilbert Hagedoorn wrote, at this point it's being counted in atoms. Freaking atoms.
Some day in the far future, we may have CPUs built atom by atom, like from a 3D printer. And maybe in a few thousand years even atoms will be manufactured through different energy fields.
A low enough operating voltage and clock will take care of it.
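The "counted in atoms" point holds up to simple arithmetic. The silicon lattice constant (~0.543 nm) is a known physical value; the snippet below is just an illustration of the scale involved:

```python
# Rough illustration: how many silicon unit cells span a 2 nm feature.
SI_LATTICE_NM = 0.543  # silicon lattice constant, ~0.543 nm

feature_nm = 2.0
cells = feature_nm / SI_LATTICE_NM
print(f"A 2 nm feature spans roughly {cells:.1f} silicon unit cells")
```

In other words, a literal 2 nm feature would be under four unit cells wide, which is why node names at this scale are marketing labels rather than physical dimensions.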
thesebastian
Last year I got a 7nm 3700X CPU and also got a smartphone with a 7 nm SoC (although Google adds so many services to Android that I rarely see a real benefit compared to older Nexus phones with a much less dense SoC and the same battery capacity/mAh).
I can't wait to have hardware with 2 nm CPUs.
sverek
Astyanax
Fun fact: 2 nm won't technically be 2 nm.
Kaarme
I'd really like to see optical computing start taking more concrete steps. Maybe it will happen when traditional silicon tech reaches its nm endpoint, probably first in hybrid solutions. Since PCIe 4.0 already gives developers trouble, I imagine replacing electrical PCIe with optical communication would be satisfying. But then again, I'm not an engineer.
Noisiv
Astyanax
:D
Graphene and carbon nanotubes ahoy
Silva
Every company calls its products whatever it wishes, for marketing reasons.
Intel's enhanced 14nm is much denser than TSMC's 12nm, and even slightly denser than GF's 12nm. Source: https://en.wikipedia.org/wiki/14_nm_process
Again, we could compare TSMC 7nm to Intel 10nm: very similar (minus the fact that there are no Intel products using 10nm at all). Source: https://en.wikipedia.org/wiki/7_nm_process
That means that when TSMC calls something 2nm, it doesn't mean it's actually 2nm.
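The node-name mismatch can be made concrete with transistor densities. The figures below are approximate, commonly quoted peak values (million transistors per mm²) for each node, used here purely for illustration:

```python
# Approximate peak transistor densities (million transistors / mm^2).
# These are ballpark figures commonly quoted for each node name.
density_mtr_mm2 = {
    "Intel 14nm":   37.5,
    "TSMC 16/12nm": 28.9,
    "TSMC 7nm":     91.2,
    "Intel 10nm":  100.8,
}

# Node *names* don't line up with density: Intel's "10nm" is about
# as dense as (slightly denser than) TSMC's "7nm", and Intel's
# "14nm" beats TSMC's "16/12nm".
ratio = density_mtr_mm2["Intel 10nm"] / density_mtr_mm2["TSMC 7nm"]
print(f"Intel 10nm vs TSMC 7nm density ratio: {ratio:.2f}x")
```

So a "2nm" label tells you roughly where a process sits in one foundry's own lineup, not how it compares across foundries.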
What really matters is how to contain the flow of electrons. We need clean 1s and 0s, and we can't have leaks or we get errors. Smaller means less voltage, and probably a point at which we get less frequency (maybe that's why Intel 10nm failed). I think we're already hitting an economic wall at 7nm, and they're hammering at it with lots of science money. I don't know where the limit is, but economics will play a big part.
As for what we would do next, I think 3D is one option. Heat dissipation could be an issue, but if we put the low-power components at the bottom and the high-power ones on top, we could get away with a first generation. Also, lowering voltages across the board in favour of a denser chip, or reinventing a better cooling solution to keep it cooler.
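The voltage point can be made concrete with the textbook first-order relation for CMOS dynamic power, P ≈ α·C·V²·f. This is only a first-order model (it ignores leakage), and the capacitance/frequency numbers below are made up for illustration:

```python
# First-order CMOS dynamic power: P = alpha * C * V^2 * f.
def dynamic_power(c_eff_farads, v_volts, f_hz, alpha=1.0):
    """Switching power of a CMOS circuit (first-order model)."""
    return alpha * c_eff_farads * v_volts**2 * f_hz

base = dynamic_power(1e-9, 1.2, 4e9)  # 1 nF effective, 1.2 V, 4 GHz
lowv = dynamic_power(1e-9, 0.9, 4e9)  # same chip at 0.9 V
print(f"Dropping 1.2 V -> 0.9 V cuts dynamic power by {1 - lowv/base:.0%}")
```

Because voltage enters squared, even a modest voltage drop buys a large power reduction, which is exactly why denser chips push operating voltage down.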
Ricardo
slyphnier
What I'm more curious about is transistor aging.
Last year I read:
https://semiengineering.com/transistor-aging-intensifies-10nm/
https://semiengineering.com/transistor-options-beyond-3nm/
So far there haven't been many reports about it, other than reports from people who overclocked their CPUs and saw them degrade quite fast. I know many of those people overclocked above the safe voltage range, though, so that doesn't really mean smaller nm = faster degradation either.
But I believe there are some trade-offs made for a more efficient chip in some ways.
Well, I suppose they design the chip to work at least through the warranty period, so around 5 years, before seeing some degradation.
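Aging effects like NBTI are often summarized with a power-law threshold-voltage drift, ΔVth ≈ A·tⁿ, with the exponent n typically quoted around 0.15–0.25. The constants A and n below are illustrative placeholders, not measured values for any real process:

```python
# Toy power-law aging model: delta_Vth = A * t^n (NBTI-style drift).
# A and n are illustrative placeholders, not measured values.
def vth_shift_mv(t_years, A=10.0, n=0.2):
    """Threshold-voltage shift in mV after t_years of stress."""
    return A * t_years ** n

# Sub-linear growth: most of the drift happens early in the chip's life.
year1, year5 = vth_shift_mv(1), vth_shift_mv(5)
print(f"shift after 1y: {year1:.1f} mV, after 5y: {year5:.1f} mV")
```

The sub-linear shape is the practical point: a chip that survives its first year or two of stress degrades much more slowly afterwards, which is roughly how vendors can budget aging margin against a multi-year warranty.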
Middleman
So back in the day, about 20 years ago, I read a leaked document about microchip development and its planned introduction into human beings. The report stated that the goal was to achieve a 2nm manufacturing node, and at that point they could start integrating the chips into people.
schmidtbag
David Lake
What happened to the 10nm limit of silicon?
Aura89
Jespi
I think we will reach the limits of silicon rather than the limits of physics. I think that around 1nm - 0.8nm we will have to switch from silicon to other materials; I've already read something about graphene and other complex substances, which are too expensive. I think manufacturers are trying to squeeze the maximum out of silicon just because the prices are "low" ... nobody would pay 5000€ for a mid-range desktop processor based on, let's say, graphene.
People are mentioning quantum computing, but I think that's rather 15-20 years into the future (when we will be able to build a quantum computer that can do general computing, not pre-defined specific things).
lmimmfn
It's wafer thin!!!
JamesSneed