Researchers develop ultra-fast light-based microprocessor

https://forums.guru3d.com/data/avatars/m/145/145176.jpg
It's an interesting idea but, unless my interpretation is incorrect, it seems a little too far from market to be viable. By the time they can implement it on relatively high-end chips at relatively affordable costs, quantum processors will probably have made further strides. Would there still be a point in investing in this type of technology by then?
data/avatar/default/avatar24.webp
The major point of photonic logic is the significant power savings, which directly affect the perf/Watt ratio. Even if applied only to the memory and peripheral I/O it would still be major progress, since moving data over external interfaces is one of the most power-demanding operations. Also, optical signals can work at orders-of-magnitude higher frequencies, expanding the available bandwidth without the need for expensive and power-hungry wide copper wiring on ICs and PCBs. That is pretty much evident in long-range wired communications -- all of the seafloor cables are fibre-optic for the same reasons.
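To put rough numbers on the perf/Watt argument above, here is a quick back-of-envelope comparison of the energy cost of moving data over a link. The per-bit energy figures are illustrative assumptions (copper off-chip links are often quoted in the tens of pJ/bit, silicon-photonic links around 1 pJ/bit), not measured values from the article.

```python
# Back-of-envelope: energy to move one gigabyte over an interface.
# The per-bit energy costs below are illustrative assumptions, not measurements.

def joules_per_gigabyte(pj_per_bit: float) -> float:
    """Convert an energy cost in picojoules per bit to joules per gigabyte moved."""
    bits_per_gb = 8e9
    return pj_per_bit * 1e-12 * bits_per_gb

copper_j = joules_per_gigabyte(20.0)   # assumed 20 pJ/bit electrical link
optical_j = joules_per_gigabyte(1.0)   # assumed 1 pJ/bit optical link

print(f"Copper:  {copper_j:.3f} J per GB")   # 0.160 J per GB
print(f"Optical: {optical_j:.3f} J per GB")  # 0.008 J per GB
print(f"Ratio:   {copper_j / optical_j:.0f}x")
```

Under those assumed figures the optical link moves the same gigabyte for about 20x less energy, which is the kind of gap that matters when I/O dominates the power budget.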
https://forums.guru3d.com/data/avatars/m/216/216490.jpg
Either way, whichever technology prevails, one thing is sure: the future of technology is getting more and more interesting as it advances. :thumbup:
data/avatar/default/avatar36.webp
It's an interesting idea but, unless my interpretation is incorrect, it seems a little too far from market to be viable...
I don't understand your interpretation? Could you clarify, please? I just watched the video, as I'm sure you have, and they show it working on a dual-core processor chip with RISC-V compilers. I'm pretty sure it was running a hello-world and a graphical program using photonic circuits on that dual-core chip. So that is why I'm confused about why you think it's too far from market. Also, they said it would cost no more to manufacture than chips do today.
https://forums.guru3d.com/data/avatars/m/241/241896.jpg
Don't you just love the word "photonic"? 🙂 Technology is advancing faster and faster as time passes. One can only dream of what devices we will be using 20 years from now, how they will be incorporated into our daily lives, and how the technology will also be used in the medical field, e.g. prosthetic mechanical limbs. I'm sure that one day we will be able to have microprocessors implanted directly in our brains (hope they're overclockable) lol.
https://forums.guru3d.com/data/avatars/m/174/174929.jpg
I'm sure that one day we will be able to have microprocessors implanted directly in our brains (hope they're overclockable) lol.
To overclock your brain you would need a Scorpius-class cooler.
data/avatar/default/avatar28.webp
What happens if it is overclocked too high? Does it shine? Colour-shift? Yes, it would be a colour shift, and then it wouldn't work, I guess. But it's so fast no one would think about it, unless quantum computing takes effect. If it is just the pulse frequency of the light, then it would have headroom for OC.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
I don't understand your interpretation? Could you clarify please?...
In the next 5 years, I do not see myself putting a processor into a socket that has only a few pins for power delivery, then doing the same for memory, graphics card, sound card, LAN, and all the other stuff, and then plugging in optical cables, making an ugly octopus in the process. Unless it is as easy to install and as reliable as today's systems, it will never take off.
data/avatar/default/avatar31.webp
In next 5 years, I do not see myself putting processor into socket which has only few pins for power delivery...
All things considered, LGA sounds like the way to go, but instead of pins it's a flat array of optical-grade transparent contact ends, flush with the top layer of the surface pressing against the chip. (I suck at describing stuff)
https://forums.guru3d.com/data/avatars/m/258/258688.jpg
I've been reading about light-based processors for the last five years (or more.) Every time a company feels the need for more capital investment they'll put out one of these little blurbs promising "breakthroughs" that force you to read the fine print to discover that they are at least a decade away (or more) from any actual marketable products that are based on the "breakthrough." (Doesn't matter the field or the product.) Translation: don't hold your breath waiting on this, but please give us your money so we can keep working on it. I may be cynical, but a "breakthrough" for me is: "Hi! We've designed an optical processing cpu that will hit the market in 6-9 months and it will forever change your perspective on computing performance. Hang on!" That's a breakthrough...;) (Or, it will be in 6-9 months.)
https://forums.guru3d.com/data/avatars/m/194/194703.jpg
But can it run Crysis?
https://forums.guru3d.com/data/avatars/m/175/175739.jpg
That supposed joke lost all meaning considering all modern computers can run Crysis. That joke is as dead as a dodo.
https://forums.guru3d.com/data/avatars/m/194/194703.jpg
That supposed joke lost all meaning considering all Modern computers can run Crysis. That joke is as dead as a dodo.
Well I just revived it. The classics never die.:pc1:
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
That supposed joke lost all meaning considering all Modern computers can run Crysis. That joke is as dead as a dodo.
Well I just revived it. The classics never die.:pc1:
Yes, and HonoredShadow just confirmed that the chip in the article is not that modern.
https://forums.guru3d.com/data/avatars/m/227/227853.jpg
In next 5 years, I do not see myself putting processor into socket which has only few pins for power delivery...
Be serious, it would be nothing like that. Mainboards with tiny fibre-optic channels replacing the current PCBs' electrical circuitry come to mind. Actually a hybrid of both, since you still need electrical power: sockets with optical channels and just a few pins for power. This could also be applied to volatile memory, but I don't see a way to bring this technology to non-volatile memory. It has potential.
https://forums.guru3d.com/data/avatars/m/265/265607.jpg
I've been reading about light-based processors...
That's because it's currently impossible to make an entirely optical processor, as there is no way to make optical memory. No one has even come up with a principle for it; the best we can do is a delay line for multiplexers. Most of the processing is still done by electrical circuits. This is exactly the same: all they did is switch the electrical buses to optical ones. Which is good as an idea, because it can improve throughput, but I think it's quite impractical, as you still need at least a decent diode for the transmitter and receiver on top of the optical fibre (or a waveguide), and both would be quite hard to integrate. It's not impossible, just really expensive, as it would take up a lot of space on the chip. Now, if someone made a fully optical memory, we would be able to switch to fully optical processors in, say, a decade. But as it is right now, we can only mix the two, and I think the benefit is not worth the price and effort. It is still something worth exploring, though, as quite soon we will reach the limit of integration and miniaturisation and will need to change the technology: there is only about 1.5 nm between two atomic layers, and some quantum effects can be observed even now at 8 nm integration.
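The delay-line point above can be quantified: a loop of fibre only "stores" as many bits as are in flight inside it at once, which is why it is no substitute for real memory. A minimal sketch, with the group index and the example figures being assumptions for illustration:

```python
# How many bits a fibre delay line holds "in flight" at once.
# Storage capacity = (propagation time through the loop) x (bit rate).

C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

def bits_in_flight(length_m: float, bitrate_bps: float, n_group: float = 1.5) -> float:
    """Bits stored in a fibre loop of the given length.

    n_group ~1.5 is a typical group index for silica fibre (assumption).
    """
    transit_s = length_m * n_group / C_VACUUM
    return transit_s * bitrate_bps

# Even a full kilometre of fibre at 100 Gbit/s holds only ~500 kbit,
# and the data constantly circulates -- a long way from a RAM chip.
print(bits_in_flight(1000.0, 100e9))
```

So a delay line scales storage with physical length, which is exactly why a workable optical memory cell would be the real breakthrough here.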
https://forums.guru3d.com/data/avatars/m/145/145176.jpg
I don't understand your interpretation? Could you clarify please?...
Getting optical components on the same chip as electronics is particularly hard, reports MIT Technology Review: "Researchers have been able to combine only very simple circuits with optical parts, and these systems have remained pricey," says Sun.
And others, like Backstabak, have further elaborated on what I meant. Still, I do thank Fellix for the insight on where the technology might prove useful. I don't know much about the topic, which is why I included the disclaimer and worded the second part of my post as a question.
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
Sorry, but the headline is wrong. Any quad-channel Intel (consumer) platform will reach those throughput numbers with DDR3. And if it's about the energy savings (not the performance), wouldn't "Researchers develop ultra-efficient optical microprocessor" make more sense?!
https://forums.guru3d.com/data/avatars/m/72/72189.jpg
Converting electricity to light and back adds a small delay. So, fail.
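To put that conversion delay in perspective, here is a quick calculation of how many clock cycles a fixed electro-optical round trip would cost. The 100 ps conversion penalty is an assumed illustrative figure, not a number from the article:

```python
# How many clock cycles a fixed E/O + O/E conversion delay costs.

def cycles_lost(conversion_delay_s: float, clock_hz: float) -> float:
    """Clock cycles consumed by a fixed conversion delay at a given clock rate."""
    return conversion_delay_s * clock_hz

# Assumed 100 ps total for an electrical->optical->electrical round trip.
print(cycles_lost(100e-12, 4e9))  # at 4 GHz: 0.4 cycles
```

Under that assumption the penalty is well under one cycle per transfer, so whether it is a "fail" depends on how often you cross the boundary, not on the conversion itself.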