Scientists build the first light-based hardware that competes with silicon

mongeese

In context: AMD was first to market with a 1 GHz processor back in March 2000. Intel beat them to 2 GHz not long after, in August 2001, and got to 3 GHz first in November 2002. After nine long years, AMD hit back with the 4 GHz-capable FX-6200, and it took two more years to cobble together the 5 GHz FX-9590 in 2013. In the six years since, things haven’t sped up much from a clock-rate point of view.

The rate of progress is slowing, and in the not-too-distant future we’ll need something new. That something new could be chip-scale photonic computing – the integration of light-based hardware to improve performance.

A team of scientists sponsored by the Nippon Telegraph and Telephone Corporation, a Japanese telecom, has made a major breakthrough with photonic technology. For the first time, photonics has performance and power figures competitive with electronic hardware.

The photonics you might see in the next decade will use light to transfer information and electronics to process it. For example, an electrical signal will reach an Electric-to-Optic (E-O) device, which converts it into light. That light is transmitted until it hits an Optic-to-Electric (O-E) device, which converts it back into a current that can be processed or sent on to the next E-O device.

The main challenges scientists have faced are power requirements, which can exceed a thousand times those of electronic processing, and speed: each time the light is absorbed, its energy must charge a capacitor, and that capacitor must charge and discharge fully to pass the signal on. Until now, it has been very challenging to build a capacitor small enough for that to happen quickly.
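To get a feel for why capacitance is the bottleneck, here’s a back-of-the-envelope sketch of the RC settling time. The 5 kΩ load resistance is a hypothetical round number chosen purely for illustration, not a figure from the paper; the takeaway is simply that the achievable symbol rate scales inversely with capacitance.

```python
# Rough sketch: capacitance caps the signaling rate via the RC time
# constant. R_LOAD is a hypothetical value for illustration only,
# not a figure from the paper.

R_LOAD = 5e3      # ohms (assumed load resistance)
SETTLE_TAUS = 5   # RC time constants for a ~99% charge/discharge

for name, cap_farads in [("~2 fF (earlier devices)", 2e-15),
                         ("<1 fF (new modulator)", 1e-15)]:
    tau = R_LOAD * cap_farads    # RC time constant, seconds
    settle = SETTLE_TAUS * tau   # settling time per symbol
    ceiling = 1 / settle         # crude upper bound on symbol rate
    print(f"{name}: tau = {tau * 1e12:.0f} ps, "
          f"ceiling ~{ceiling / 1e9:.0f} Gbaud")
```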

The research team has now progressed by leaps and bounds, finally matching silicon hardware in terms of performance and power requirements.

They were able to build an Electro-Optic Modulator (E-O) that runs at 40 Gbps while consuming just 42 attojoules per bit, over an order of magnitude less power than the best previous experiments. It outperforms them on capacitance too, coming in below a femtofarad, roughly half that of earlier devices.

They then constructed a photoreceiver (O-E) based on the same technologies. It runs at 10 Gbps using just 1.6 femtojoules per bit, two orders of magnitude less power than other optical systems. It’s also the first that doesn’t require an amplifier (which saves power), and its capacitance is low at just a few femtofarads.
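For a sense of scale, those energy-per-bit figures translate directly into steady-state power draw (power = bit rate × energy per bit). A quick sketch using the article’s numbers:

```python
# Convert the reported energy-per-bit figures into continuous power
# draw at the quoted line rates: power = bit_rate * energy_per_bit.

devices = {
    "E-O modulator":     (40e9, 42e-18),   # 40 Gbps, 42 attojoules/bit
    "O-E photoreceiver": (10e9, 1.6e-15),  # 10 Gbps, 1.6 femtojoules/bit
}

for name, (bit_rate, energy_per_bit) in devices.items():
    power_watts = bit_rate * energy_per_bit
    print(f"{name}: {power_watts * 1e6:.2f} microwatts")

# E-O modulator:     1.68 microwatts
# O-E photoreceiver: 16.00 microwatts
```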

Combining the two, they demonstrated the world’s first O-E-O ‘transistor.’ It can function as an all-optical switch, a wavelength converter, and a repeater. That versatility makes it the first photonic device to offer benefits over electronic hardware at chip scale. The researchers suggest it could be used for inter-core communication and to maintain cache coherence.

The scientists made this breakthrough by developing a new type of photonic crystal (a synthetic structure that controls how light propagates): a piece of silicon with a pattern of holes drilled into it. The holes are arranged so that light passing through them interferes destructively with itself and cancels out. If a line of holes is blocked, the light follows that path instead and is funneled into a light-absorbing material that converts it into a current. The same system also works in reverse.
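The cancellation the hole pattern exploits is ordinary destructive interference. As a toy illustration (assuming nothing about the actual device geometry), two equal-amplitude waves offset by half a wavelength sum to zero:

```python
import numpy as np

# Toy illustration of destructive interference: two equal-amplitude
# waves a half-wavelength (pi radians) out of phase cancel exactly.
x = np.linspace(0, 4 * np.pi, 1000)
wave_a = np.sin(x)
wave_b = np.sin(x + np.pi)  # half-wavelength path difference

print(np.max(np.abs(wave_a + wave_b)))  # ~1e-16, total cancellation
```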

It’s hard to overstate just how exciting this breakthrough is. Until now, the only role photonics has played in the data center is long-range communication, over distances from 500 m to 10 km. Recent announcements like Intel’s 400G have shrunk that distance to room scale, with board-scale parts known to be in the works. But bringing photonics down to chip scale makes the technology accessible to consumers and has the potential to rewrite the rulebook when it comes to performance. After all, light is faster than electronics.

While photonic technology is only just matching electronic hardware, it’s in its infancy and should improve rapidly. That said, it could be a decade before chip-scale photonics gets into the hands of the public, but it will be an exciting day when it does.

Femtofarad optoelectronic integration demonstrating energy-saving signal conversion and nonlinear functions, Nature Photonics (April 2019)


 
Interesting .... shortly after RISC processors were deemed impractical, Cray announced they were headed this route .... apparently they must have dropped it, or the reporter didn't mention it. Of course, that was over 10 years ago, so it may have just dropped off everyone's radar ......
 
Electricity propagates at ~95% of the speed of light; it's a common misconception that speed is a major differentiator. The energy savings, and the prospect that we might be able to clock signals beyond what is currently possible, seem very promising though.
 
Some functions, once optimized, may utilize this tech blended with existing tech, if it can be made affordably. The biggest differences are not just the cost of electricity but the cost of manufacturing. Extreme UV lithography is well established and produces affordable processors. The article only hints at it, but "optical" circuits aren't purely optical: millions of conversions need to take place in physical circuits, which makes the commercialization of a useful chip at an affordable price less likely. Hybrids are another thing, and may move us into the future.
 
It was always going to go the way of optical computing, and I believe an optical RAM is where they struggle most.

However, in tandem with progress like this, I see the main future-changer as the point when AI can be tasked with writing programs. I.e., software writing software.

Rather than all these woefully high-overhead high-level languages that people use (due to ease of use), we might task an AI-enabled machine with our list of required inputs/outputs and have it write assembler, or whatever low-level language (true binary?), to achieve the task. So while such a program-writing task might require an entire 'war effort' and the computational resources of a nation-state for humans, AI would just silently get on with it and spit out tiny code for what might normally have required all sorts of libraries/add-ins/hooks.

I noticed in an optical publication a few years ago that they were trying new filter-creation methods, and someone tasked a machine with creating the perfect filter, giving it the number of ports and the expected band-pass and band-stop requirements for each port. The physical characteristics the software came up with looked like something only nature could have created, rather than something a machine could have thought up. Sorry, I cannot find the link.
 
Electricity propagates at ~95% of the speed of light; it's a common misconception that speed is a major differentiator. The energy savings, and the prospect that we might be able to clock signals beyond what is currently possible, seem very promising though.
The biggest difference is that there is no interference. Light "cannot be disturbed".
 
The speed difference between an electron and a photon is negligible in terms of processing speed. Where the advantage comes from is thermal performance (for processing) and energy consumption (for communication and processing).
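A quick sanity check on that point, taking the 95%-of-c figure quoted above and an assumed 20 mm on-chip signal path:

```python
# Propagation delay over a chip-scale distance: light in vacuum vs. an
# electrical signal at 0.95c (the figure quoted above). The 20 mm path
# is an assumed value, roughly a large die edge to edge.

C_VACUUM = 299_792_458  # speed of light, m/s
PATH = 20e-3            # metres (assumed)

t_light = PATH / C_VACUUM
t_electric = PATH / (0.95 * C_VACUUM)
print(f"light:    {t_light * 1e12:.2f} ps")                 # ~66.71 ps
print(f"electric: {t_electric * 1e12:.2f} ps")              # ~70.22 ps
print(f"delta:    {(t_electric - t_light) * 1e12:.2f} ps")  # ~3.51 ps
```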

Make an all-optical processor, and it likely won't need anywhere near the same amount of cooling as an equivalent silicon processor.
 
Another interesting factor will be the reduced heat signature of the chips, and thus less cooling needed. It will be curious to see what that energy saving amounts to.
 