IBM demonstrates light-based chip communication

Jos

Staff

IBM researchers have announced an important breakthrough today that could change the way computer chips communicate with each other in the future. The company has created a low-power device that can transfer data at high speeds using light instead of electrical signals over copper wires. The light pulses are transmitted via silicon circuits and can reportedly handle data transfers of up to 40Gbit/s with a 1.5 volt power supply.
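For a sense of scale, here is a quick back-of-the-envelope sketch. The 40 Gbit/s figure comes from the article; the file size is just an illustrative example:

```python
# Rough throughput sketch: how long would a 25 GB payload take to
# move over a single 40 Gbit/s optical link? Ideal link, no protocol
# overhead -- just the headline number from the article.

LINK_GBIT_S = 40            # claimed data rate, gigabits per second
FILE_GBYTES = 25            # example payload size, gigabytes

file_gbits = FILE_GBYTES * 8            # convert bytes to bits
seconds = file_gbits / LINK_GBIT_S      # ideal transfer time

print(f"{FILE_GBYTES} GB over {LINK_GBIT_S} Gbit/s: {seconds:.0f} s")
# 25 * 8 / 40 = 5 seconds
```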

The device is called a nanophotonic avalanche photodetector and is made of silicon and germanium -- both used in current microprocessor chips. The term comes from the way the exchange of information happens, as explained by IBM: "Analogous to a snow avalanche on a steep mountain slope, an incoming light pulse initially frees just a few charge carriers which in turn free others until the original signal is amplified many times."
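IBM's snow-avalanche analogy describes multiplicative gain: each freed carrier frees more in turn. A toy model of that process (the numbers are purely illustrative, not IBM's actual device parameters) might look like:

```python
# Toy model of avalanche multiplication, loosely following IBM's
# snow-avalanche analogy: each carrier crossing a stage of the
# multiplication region frees, on average, k additional carriers.
# Illustrative only -- not real device physics.

def avalanche_gain(seed_carriers, k_per_stage, stages):
    """Return the carrier count after `stages` rounds of multiplication."""
    carriers = seed_carriers
    for _ in range(stages):
        carriers += carriers * k_per_stage   # each carrier frees k more
    return carriers

# A few photo-generated carriers become many: gain = (1 + k) ** stages.
print(avalanche_gain(seed_carriers=3, k_per_stage=1, stages=5))  # 96
```

With k = 1 the carrier count doubles each stage, so a weak incoming light pulse is amplified many times over, which is the effect the article describes.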

The rather complicated concept is not new, but IBM claims it has been able to overcome the speed limitations of previous systems from the likes of Intel and others. It will be a while before we see this integrated into mainstream manufacturing, though: IBM estimates about five years for high-end servers and another five for video game systems and cell phones.


 
Provided the scaling is consistent with current standard manufacturing techniques, this technology could mean huge boosts in mobile computing, both in efficiency and speed. Using a fraction of the power to run the same operations: imagine the battery life.
 
Once again IBM are making improvements on the next revolution. I just wonder, though: if this ever makes it into consumer electronics, how will they stop bits of dirt getting in the way of the light? Not everyone's pockets are clean (mine are).

I can see this working in very, very clean environments where there isn't even the smallest speck of dirt, but out in the real world, it's another matter that needs solving.
 
Combine this with the concept of 'spintronics' (that is, harnessing the spin of the electron rather than its charge) and we are well on our way to all kinds of things. Now about that warp drive...
 
Armanian said:
Once again IBM are making improvements on the next revolution. I just wonder, though: if this ever makes it into consumer electronics, how will they stop bits of dirt getting in the way of the light? Not everyone's pockets are clean (mine are).

I can see this working in very, very clean environments where there isn't even the smallest speck of dirt, but out in the real world, it's another matter that needs solving.

Based on what I watched, I'd figure the light would have to be transmitted via some sort of transmission line (fiber optic or similar), or the entire circuit would have to be sealed from airborne particles.
 
This seems like a great way to extend battery life on mobiles and make systems a lot faster overall, but 5 to 6 years seems like forever :( Still, this is fascinating and I will watch the tech closely.
 
How would this affect the cooling of the chips? Would they even run as hot with this new architecture?
 
With that kind of power consumption I wouldn't even need a desktop, I could do everything on my laptop! Mobile gaming would be revolutionized!
 
I still want to know when we will be able to use the optical out on the I/O panel of my computer.
 
Armanian said:
Once again IBM are making improvements on the next revolution. I just wonder, though: if this ever makes it into consumer electronics, how will they stop bits of dirt getting in the way of the light? Not everyone's pockets are clean (mine are).

I can see this working in very, very clean environments where there isn't even the smallest speck of dirt, but out in the real world, it's another matter that needs solving.

I would like to know about this situation as well. Companies would have to make tiny vacuums of space in between the chips, which would presumably be prone to breaking, and that is REALLY not an easy thing to fix, one would think.
 
If this type of chip is used in PCs, for CPU-to-chipset and other device links, I think PCs and laptops could shrink smaller and be so much faster... that would be sweet... so many possibilities...
 
As I recall, this technology has been announced for quite a few years now, so it's not really groundbreaking news.

IMO, room-temp superconductors may be a better bet, although I think that one may be much farther down the road.
 
The light signal still has to be converted to electrical before it's of any use, so they need a pure photonic transistor before this can do anything inside CPUs. Where this will be useful is transmitting information: maybe a future USB, SATA, or PCIe type of bus that is optical.
 
I would think the designers would have built in error correction to ensure that, should a particle of dirt get in the way, the signal would be retransmitted. This would mean that while it would need to operate in a clean environment, it would not need a vacuum, reducing the fragility of the device it would be used in.
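The detect-and-retransmit idea above can be sketched in a few lines. This is purely illustrative (a simple CRC check, not how any real optical link is specified): append a checksum to each frame, and if the receiver's check fails, say because a dust mote corrupted the pulse, the sender transmits the frame again.

```python
# Minimal sketch of retransmit-on-error: frame = payload + CRC32.
# A failed check signals corruption and the caller retransmits.
import zlib

def send_frame(payload: bytes) -> bytes:
    """Append a 4-byte CRC32 checksum to the payload."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def receive_frame(frame: bytes):
    """Return the payload if the checksum matches, else None."""
    payload, crc = frame[:-4], frame[-4:]
    if zlib.crc32(payload).to_bytes(4, "big") != crc:
        return None          # corrupted: caller should retransmit
    return payload

frame = send_frame(b"light pulse data")
assert receive_frame(frame) == b"light pulse data"   # clean link: accepted

corrupted = bytes([frame[0] ^ 0xFF]) + frame[1:]     # flip bits in transit
assert receive_frame(corrupted) is None              # detected -> retransmit
```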
 
This might even be used to break the limit of Moore's law, since circuits could then get a lot smaller. The only problems are the converters, which can't keep shrinking indefinitely, and the transistors, which would still have to be made photonic, as peas said.
 
I thought this already existed in the form of fibre optic lines? But just not the 40Gbps bit...
 
This reminds me of when fiber optics were first introduced to replace copper wires. Welcome to the photonic world.
 
buttus said:
How would this affect the cooling of the chips? Would they even run as hot with this new architecture?

This is just communication between chips. I'm assuming this will be just like fibre optics and will be used for communication between the northbridge, CPU, southbridge, maybe even the RAM. Imagine how fast this is going to be.
 
Looks interesting, but this sounds like the sort of thing that never makes it into actual products despite IBM saying 5 to 10 years. Kind of like all those multilayered discs we've seen announced in the last few years (e.g. 200GB, 8-layer Blu-ray discs and the like).
 
Yeah, I guess I am certainly skeptical in the near term. I am currently integrating a cutting-edge RF-to-fiber conversion system, and it definitely has some power/heat issues for the targeted application. I know we are working on a slightly different problem, but I would be very surprised if they were able to get this into a reasonably sized package in 5 years.
 