Photonic computers: The future of computing is… analogue

jamahir

ELITE MEMBER
The Lightmatter Blade rack system containing the company's Envise photonic processor. Image: Lightmatter.

The future is optical. Photonic processors promise blazing fast calculation speeds with much lower power demands, and they could revolutionise machine learning.

Photonic computing is, as the name suggests, a computer system that uses pulses of light rather than electrical transistors to form the basis of its logic gates. If it can be made to work in such a way that processors can be mass-produced at a practical size, it has the potential to revolutionise machine learning and other specific types of computing task. The emphasis is on the word if. However, there are some intriguing-sounding products close to coming to market that could change things drastically.

The idea behind photonic computers is not a new one; optical matrix multiplication was first demonstrated in the 1970s, but nobody has yet solved many of the roadblocks to getting it to work at a practical level that can be integrated as easily as transistor-based systems. Using photons is an obvious choice to help speed things up. After all, new homes in the UK are built with fibre to the home for a reason: fibre-optic cables are superior to aluminium or copper wires for the modern world of digital data communication. They can transmit more information, faster, and over longer distances without signal degradation than metal wiring can. However, transmitting data from A to B is a whole different kettle of fish from fabricating such optical pipelines onto a chip in a way that allows for matrix processing, even though some data centres already use optical cables for faster internal data transfer over short distances.

In an article for IEEE Spectrum, Ryan Hamerly puts forward the case for photonic chips, along with proposals for solving the key issues, of which there are still many.

First and foremost is the way in which traditional processors work. Traditional transistor-based chips are non-linear. Hamerly states, “Nonlinearity is what lets transistors switch on and off, allowing them to be fashioned into logic gates. This switching is easy to accomplish with electronics, for which nonlinearities are a dime a dozen. But photons follow Maxwell’s equations, which are annoyingly linear, meaning that the output of an optical device is typically proportional to its inputs. The trick is to use the linearity of optical devices to do the one thing that deep learning relies on most: linear algebra.”

For his own solution, Hamerly proposes using an optical beam splitter that allows two perpendicular beams of light to be fired at it from opposite directions. Each beam of light is split by allowing half of the light to pass through to the other side of the beam splitter mirror, while the remaining light is bounced at 90 degrees from its origin.

The result is that this beam-splitter design allows for two inputs and two outputs, which in turn makes matrix multiplication possible, which is how a traditional computer performs its calculations. Hamerly states that to make this work, two light beams are generated with electric-field intensities proportional to the two numbers you wish to multiply, which he calls field intensities x and y. The two beams are fired into the beam splitter and combined in such a way that it produces two outputs.

Hamerly says, “In addition to the beam splitter, this analog multiplier requires two simple electronic components—photodetectors—to measure the two output beams. They don’t measure the electric field intensity of those beams, though. They measure the power of a beam, which is proportional to the square of its electric-field intensity.”
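The arithmetic behind this multiplier can be sketched numerically. Below is a minimal Python sketch, assuming an ideal lossless 50/50 beam splitter and noiseless photodetectors; the function name is illustrative and not any real hardware API:

```python
import math

def beam_splitter_multiply(x, y):
    """Multiply two numbers the way the optical scheme described above does:
    encode them as electric-field amplitudes, combine them on an ideal
    50/50 beam splitter, and subtract the two detector powers."""
    # A 50/50 beam splitter produces the sum and difference of the two
    # input fields, each scaled by 1/sqrt(2).
    out_plus = (x + y) / math.sqrt(2)
    out_minus = (x - y) / math.sqrt(2)
    # Photodetectors measure power, i.e. the square of the field.
    p_plus = out_plus ** 2
    p_minus = out_minus ** 2
    # (x+y)^2/2 - (x-y)^2/2 = 2xy, so halving the difference recovers xy.
    return (p_plus - p_minus) / 2

print(beam_splitter_multiply(3.0, 4.0))  # 12.0
```

The subtraction is the key trick: squaring at the detectors is the only nonlinearity needed, and the cross-term it produces is exactly the product of the two encoded numbers.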

He goes on to describe how the light can be pulsed rather than fired as one continuous beam, allowing a ‘pulse then accumulate’ operation that feeds the output signal into a capacitor. This can be carried out many times in rapid-fire fashion.
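Under the same idealised assumptions, ‘pulse then accumulate’ amounts to a multiply-accumulate loop: each pulse pair multiplies one element of each vector, and the capacitor, modelled here as a running sum, integrates the products into a dot product. The function name is illustrative:

```python
import math

def pulse_accumulate_dot(xs, ys):
    """Model a 'pulse then accumulate' dot product: one optical
    multiplication per pulse, summed into a capacitor's charge."""
    charge = 0.0  # stands in for the capacitor's accumulated charge
    for x, y in zip(xs, ys):
        p_plus = ((x + y) / math.sqrt(2)) ** 2   # power at one detector
        p_minus = ((x - y) / math.sqrt(2)) ** 2  # power at the other
        charge += (p_plus - p_minus) / 2         # one product per pulse
    return charge

print(pulse_accumulate_dot([1, 2, 3], [4, 5, 6]))  # 32.0
```

A dot product per capacitor, repeated across many splitters in parallel, is exactly the matrix-vector multiplication that dominates neural-network inference.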

The potential benefits of such a system, should it work, could be huge. Claims of neural-network calculations many thousands of times better than current systems are being mooted, and that’s just based on currently available technology. Hamerly does admit, though, that there are still huge obstacles to be overcome. One is the limited accuracy and dynamic range of current analogue calculations, a combination of noise and the limited precision of current A/D converters. As a result, he states that neural networks beyond 10-bit accuracy may not be possible; there are currently 8-bit systems, but much more precision is required in order to really push things forward. Then there are the problems of placing optical components onto chips in the first place. Not only do they take up much more space than transistors, they cannot be packed anywhere near as densely either.
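The precision ceiling can be felt with a toy model of an n-bit A/D converter. This is only an illustration of quantisation error, not a model of any specific hardware:

```python
def quantize(value, bits, full_scale=1.0):
    """Round an analogue value to the nearest level an ideal n-bit A/D
    converter can represent over [-full_scale, +full_scale]."""
    step = 2 * full_scale / (2 ** bits)  # width of one quantisation level
    return round(value / step) * step

# The same analogue readout at different A/D precisions: the error is
# bounded by half a quantisation step, which shrinks as bits increase.
exact = 0.123456
for bits in (6, 8, 10):
    q = quantize(exact, bits)
    print(f"{bits}-bit readout: {q:.6f} (error {abs(q - exact):.6f})")
```

Every extra bit halves the worst-case readout error, which is why pushing analogue photonic systems past the 8-to-10-bit range is so consequential for training larger networks.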

This creates a rather large problem: either you end up with huge chips, or you keep them small and limit the size of the matrices that can be processed in parallel. But although the challenges being faced are still huge, advancing the technology could bring benefits that would leave us floundering if we continued to rely entirely on transistor-based computers.

The Lightmatter Envise chip

With all that being said, there are already a number of companies developing photonic processors and accelerators, including MIT start-ups Lightmatter and Lightelligence. Companies like these, along with Optalysys with its Etile processor, and Luminous, are taking differing approaches to solving the present issues. In fact, Lightmatter plans to release an optical accelerator board for sale later in the year. No doubt at a price beyond mere mortals.

Lightmatter Envise chip.

Lightmatter's photonic accelerator chip, called Envise. Image: Lightmatter.

Lightmatter claims that its photonic chip, called Envise, is five times faster than the Nvidia A100 Tensor Core GPU found inside some of the world’s most powerful data centres, with seven times better energy efficiency. It also claims several times more compute density than the Nvidia DGX-A100. The company’s Blade system, which contains 16 Envise chips in a 4-U server configuration, apparently uses only 3 kW of power, and further claims “3 times higher inferences/second than the Nvidia DGX-A100 with 7 times the inferences/second/Watt on BERT-Base with the SQuAD dataset.”

Lightmatter isn’t playing around. Earlier this year it raised $80m to help bring its chip to market, and Dropbox’s former chief operating officer, Olivia Nottebohm, has joined the company’s board of directors.

The chip is aimed at improving everything from autonomous vehicles to natural language processing, pharmaceutical development, digital customer-service assistants and more. Big claims, but it is notable that Lightmatter does appear to have an actual working product rather than just a laboratory technology showcase or scientific paper.

Lightmatter Blade rack system.

The Lightmatter 4-U configured Blade rack system containing 16 Envise photonic processors. Image: Lightmatter.

The need to solve issues such as power draw and CO2 emissions could be a powerful instigator in this latest ‘space race’, as the issue has now become almost mainstream news. As computing demands keep rising with the growing prominence of machine learning, so too do the demands for offsetting the environmental impact. Even Hamerly doesn’t believe we’ll ever end up with a 100% optical photonic computer; more likely a hybrid. But he makes no bones about its importance, concluding: “First, deep learning is genuinely useful now, not just an academic curiosity. Second, we can’t rely on Moore’s Law alone to continue improving electronics. And finally, we have a new technology that was not available to earlier generations: integrated photonics.”

Read Ryan Hamerly’s full article on the IEEE Spectrum website.
 

Splurgenxs

SENIOR MEMBER
It might sound great from a marketing perspective, but it's not much of a leap and adoption will be negligible.
 

fitpOsitive

ELITE MEMBER
May 27, 2015
9,921
14
11,426
Country
Pakistan
Location
Yemen
The speed of light is the same as that of electricity. The main advancement in computing is quantum computing, and that too is only good for stored-data processing.
 

jamahir

ELITE MEMBER
It might sound great from a marketing perspective, but it's not much of a leap and adoption will be negligible.
The speed of light is the same as that of electricity.
1. Do you think that photonic processors will:

a. Have lower power consumption and not become hot?

b. Be more resistant to radiation?
2. What about data storage? Will that remain regular SRAM and DRAM, or will it have to take another form?

3. What about chip prototyping? Can regular FPGAs be used?

4. What about the operating system?

The main advancement in computing is quantum computing, and that too is only good for stored-data processing.
What do you mean by stored-data processing?
 

fitpOsitive

ELITE MEMBER
What do you mean by stored-data processing?
We have two broad kinds of data: real-time and stored. Stored data is any data that is captured and can be used at other times as well. Such data can be processed very fast through qubits.
For real-time processing, quantum processing is as good as binary.
Germany is working on light-based storage devices. Apparently the only advantages are size and power consumption. But essentially, as far as speed is concerned, according to my understanding, it will not have any significant advantage.
 

Splurgenxs

SENIOR MEMBER
1. Do you think that photonic processors will:
a. Have lower power consumption and not become hot?
I am not so sure about the heating aspect; it is hard to really differentiate between marketing gimmicks and reality.
Unless real-world consumer tests are driving these results, one can always speculate.

The diodes emitting or receiving the photons are still subject to transmission drawbacks like heat and attenuation. Maybe the medium transmitting the photons would not be affected, so my general sense says it is more optimised for transmission than processing.

I could not find how it manages logic; the answer might lie in how it managed to emulate P and N (source, drain, gate, bulk) etc., and how it managed to create switches and gates.

b. Be more resistant to radiation?
Radiation-wise it should be no different, since it's going to be a small, closed, shielded environment.
2. What about data storage? Will that remain regular SRAM and DRAM, or will it have to take another form?
It can handle the processing for information moving back and forth from the standard buffer/cache architectures that are mainstream.
I don't see why that would need a revision.

3. What about chip prototyping? Can regular FPGAs be used?
No idea.

4. What about the operating system?
As long as the interfaces and extensions are the same, the software should work the same.
It's the responsibility of the manufacturers to make it so; otherwise adoption will suffer.
 

jamahir

ELITE MEMBER
so my general sense says it is more optimised for transmission than processing.
But the company Lightmatter seems to already have a product - the Envise chip :
Lightmatter claims that its photonic chip, called Envise, is five times faster than the Nvidia A100 Tensor Core GPU that is inside some of the world’s most powerful data centres, with seven times better energy efficiency. It is also claiming several times more compute density than the Nvidia DGX-A100. The company’s Blade system, which contains 16 Envise chips in a 4-U server configuration apparently uses only 3kW of power, and further claims “3 times higher inferences/second than the Nvidia DGX-A100 with 7 times the inferences/second/Watt on BERT-Base with the SQuAD dataset.”
The chip is aimed at improving everything from autonomous vehicles through to natural language processing, pharmaceutical development, through to digital customer service assistants and more. Big claims, but it is notable that Lightmatter does appear to have an actual product that works rather than just a laboratory based technology showcase or scientific paper.

I could not find how it manages logic; the answer might lie in how it managed to emulate P and N (source, drain, gate, bulk) etc., and how it managed to create switches and gates.
What about the section in the OP about the splitter? Is that related?
 

Splurgenxs

SENIOR MEMBER
But the company Lightmatter seems to already have a product - the Envise chip :
What about the section in the OP about the splitter ? Is that related ?
Will have to dig deeper; this is too vague.
My general product instincts are pointing towards empty promises.
Let it go through user scrutiny before we start believing.
 
