Optical Deep Learning: Learning with light

Artificial neural networks mimic the way the brain learns from an accumulation of examples. A research team at the Massachusetts Institute of Technology (MIT) has come up with a novel approach to deep learning that uses a nanophotonic processor, which they claim can vastly improve the performance and energy efficiency of processing artificial neural networks.


Optical computers have been conceived before, but they are usually aimed at more general-purpose processing. In this case, the researchers have narrowed the application domain considerably. Not only have they restricted their approach to deep learning, but they have also limited this initial work to inference on neural networks, rather than the more computationally demanding process of training.
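To make that distinction concrete, here is a minimal sketch in Python (not from the MIT work; the layer sizes, names, and data are purely illustrative) of why inference is the lighter workload: it is a fixed sequence of matrix multiplies and elementwise nonlinearities, whereas training additionally requires computing gradients and updating the weights.

```python
import numpy as np

# Hypothetical toy network: names, sizes, and data are illustrative only.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 4)), np.zeros(16)   # layer 1 weights/bias
W2, b2 = rng.normal(size=(3, 16)), np.zeros(3)    # layer 2 weights/bias

def infer(x):
    """Inference: a fixed sequence of matrix multiplies and nonlinearities."""
    h = np.tanh(W1 @ x + b1)          # matrix multiply + elementwise nonlinearity
    return W2 @ h + b2                # another matrix multiply

def train_step(x, y, lr=0.01):
    """Training also needs gradients and weight updates, which is why it is
    far more demanding than inference."""
    global W1, b1, W2, b2
    h = np.tanh(W1 @ x + b1)
    y_hat = W2 @ h + b2
    err = y_hat - y                   # gradient of a squared-error loss
    # Backpropagation: extra matrix multiplies per layer, then updates.
    dW2 = np.outer(err, h)
    dh = W2.T @ err
    dW1 = np.outer(dh * (1 - h**2), x)
    W2 -= lr * dW2; b2 -= lr * err
    W1 -= lr * dW1; b1 -= lr * (dh * (1 - h**2))

x = rng.normal(size=4)
print(infer(x))                        # inference only touches the forward pass
train_step(x, rng.normal(size=3))      # training also touches gradients and updates
```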

The optical chip contains multiple waveguides that shoot beams of light at one another simultaneously, creating interference patterns that correspond to mathematical results. “The chip, once you tune it, can carry out matrix multiplication with, in principle, zero energy, instantly,” said Marin Soljacic, a professor of physics at MIT, in a statement. In an initial vowel-recognition demonstration the accuracy leaves something to be desired, but the nanophotonic chip is still a work in progress.
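The principle behind "matrix multiplication by interference" can be illustrated with ordinary linear algebra. Photonic meshes of this kind are usually described as implementing an arbitrary weight matrix through its singular value decomposition: the two unitary factors are lossless transforms of exactly the sort interference can realize, and the diagonal factor is a per-channel attenuation. The Python sketch below is illustrative only (not the team's code) and checks just the mathematics, not the optics.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 4))        # weight matrix to be "programmed" into the chip
x = rng.normal(size=4)             # input vector, e.g. light amplitudes in 4 waveguides

# Singular value decomposition: W = U @ diag(s) @ Vh.
# U and Vh are unitary, i.e. lossless transforms of the kind an interferometer
# mesh can realize purely by interference; diag(s) is per-channel scaling
# (attenuation or gain).
U, s, Vh = np.linalg.svd(W)

y_optical_style = U @ (s * (Vh @ x))    # three passive/analog stages
y_direct = W @ x                        # ordinary digital matrix multiply

print(np.allclose(y_optical_style, y_direct))   # True: same result
```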

As noted above, the new approach directs multiple light beams so that their waves interact with each other, producing interference patterns that convey the result of the intended operation. The researchers call the resulting device a programmable nanophotonic processor. Optical chips built on this architecture could, in principle, carry out the calculations performed in typical artificial intelligence algorithms much faster than conventional electronic chips, while using less than one-thousandth as much energy per operation.
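A rough operation count suggests why targeting the matrix multiply pays off: in a fully connected layer, the multiply-accumulates of the matrix product vastly outnumber the elementwise operations of the nonlinearity, so that is where nearly all of the time and energy goes. The layer size below is hypothetical, chosen only to show the ratio.

```python
# Rough operation count for one fully connected layer with n inputs and
# m outputs: the matrix multiply needs m*n multiply-accumulates, while the
# nonlinearity needs only m cheap elementwise operations.
n, m = 1024, 1024                     # hypothetical layer size
matmul_ops = m * n                    # ~1e6 multiply-accumulates
nonlinearity_ops = m                  # ~1e3 elementwise ops
print(matmul_ops / nonlinearity_ops)  # the matrix multiply dominates, and that is
                                      # the part the optical mesh would take over
```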
The team says it will still take a great deal more effort and time to make this system useful; however, once it is scaled up and fully functional, it could find many uses, such as in data centers or security systems. The system could also be a boon for self-driving cars or drones, says Nicholas Harris, one of the researchers, or “whenever you need to do a lot of computation but you don’t have a lot of power or time.”