What if there were no lag in anything you clicked on your PC, laptop, phone, or smartwatch? Latency so low that the human brain would not even notice it. That would be information flowing and being processed at the speed of light, enabled by an optical computer built on photonic technology, right there at your fingertips.
The system you are using right now relies on electrical signals to perform calculations; optical computing uses light instead, which makes it faster, more efficient, and more compact. Could it be the next big thing for DeepTech VCs? 8X Ventures conducted research to map the evolution of hardware computing and where it is heading. Let's find out through the prism of computing history.
The first electronic computer
The Electronic Numerical Integrator and Computer (ENIAC), built from vacuum tubes during World War II in the 1940s, was the first general-purpose electronic computer. It might sound like a real estate hazard if I told you how big it was: it filled an entire room, measuring 8 ft high, 3 ft deep, and roughly 100 ft long. Don't listen to people complaining about the ergonomics of a smartwatch on their wrist. What we have achieved in those 80 years is no less than magic.
Though it was our first electronic computer, and its very first task was running calculations for the hydrogen bomb, it was still highly inefficient, slow, and took up a lot of space. So, in the decades that followed, we needed technology that could overcome those limitations, and that is when electronic computers based on transistor technology came into being. Before we talk about them: if you are a history buff, a tech buff, or both, you can find a portion of that giant machine on exhibit at the Smithsonian Institution in Washington, D.C.
The age of binary computation
Transistors, typically made of silicon, implement the logic of present computing systems. Billions of microscopic transistors on a modern chip switch electrical current on and off, enabling the binary system of 0s and 1s, the language our computers understand. ENIAC was definitely an odd bird: it relied on ten-digit decimal arithmetic. It was the invention of the transistor that changed the game.
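If you want to see that binary logic in action, here is a toy Python sketch. It is purely illustrative: the boolean gates below stand in for what transistor circuits do in hardware, composed into a half-adder that adds two single bits.

```python
# Toy illustration: transistor switches act like boolean gates, and gates
# compose into binary arithmetic. A half-adder adds two single bits.

def half_adder(a: int, b: int) -> tuple[int, int]:
    total = a ^ b    # XOR gate -> sum bit
    carry = a & b    # AND gate -> carry bit
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        total, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum={total}, carry={carry}")
```

Chain enough of these gates together and you get the arithmetic units inside every chip you own.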
Compared to vacuum tubes, transistors are compact, fast, and efficient, and they form the backbone of our digital economy: the data centers. But a report by the International Energy Agency (IEA) suggests that data centers and data transmission networks each account for 1-1.5% of global electricity use. The whole internet infrastructure? Take it closer to 10%.
Most data centers spend almost as much power on non-computing overhead (cooling and power delivery) as they do on running their servers. Even though transistor-based processors are exponentially better than vacuum-tube machines and are continually improving, the technology's growth curve is approaching saturation. Moore's law is not dead, but it is decaying.
At some point, transistors on a chip will approach the size of individual atoms, and it will no longer be possible to keep shrinking them and increasing their density. Scaling would become almost impossible, power consumption would rise, and chips would stop getting cheaper. We won't be able to fit more silicon transistors on a chip the way we have for the last few decades.
What do we do then? We look to emerging technologies such as quantum computing, neuromorphic chips, photonics, and new materials. The progress in optical computing makes it the strongest contender to revolutionize hardware computing once again. Let's find out how.
Optical computing: The power of God
A typical computer is the result of three things it does really well: compute, communicate, and store. In the electronic configuration, those jobs are done by manipulating electric current with transistors, capacitors, resistors, and other components. In photonic computing, light is manipulated instead, using photodetectors, phase modulators, waveguides, and more. These are the building blocks of electronic and optical computers, respectively.
Unlike electronic computing, which works by manipulating electrons, photonic computing relies on the properties of photons. The technology is based on the idea that light can perform many of the same functions as an electrical current in a computer, such as performing calculations, storing and retrieving data, and communicating with other devices.
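To make "calculating with light" a little more concrete, here is a toy numerical model in Python. It is a sketch under simple assumptions, not any real chip's design: inputs are encoded as optical field amplitudes, the mesh of waveguides and phase shifters is modeled as one fixed complex matrix, and photodetectors read out the intensities at the end.

```python
import numpy as np

# Toy model (illustrative assumptions only): a photonic processor encodes
# input values as optical field amplitudes, a mesh of waveguides and phase
# modulators applies a fixed linear transform, and photodetectors measure
# the resulting output power.

rng = np.random.default_rng(0)

x = np.array([0.8, 0.3, 0.5])                     # inputs encoded as field amplitudes
W = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))  # transform set by phase shifters

fields_out = W @ x                                 # interference performs the matrix-vector product "in flight"
intensities = np.abs(fields_out) ** 2              # photodetectors measure |field|^2

print(intensities)
```

The point of the sketch is that the linear algebra happens as the light propagates, rather than in a sequence of clocked transistor operations.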
So, how is it better if it performs the same functions and still speaks the same binary language?
Higher bandwidth:
The wave properties of light allow many signals to travel in parallel, for instance on different wavelengths, so an optical link can pack in far more information and therefore offers much higher bandwidth. This lets optical computers stay compact while processing far more complex data.
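A back-of-the-envelope illustration of that parallelism: when many wavelengths share one optical link, their data rates simply add up. The channel count and per-channel rate below are assumed numbers, not any product's spec.

```python
# Illustrative only: aggregate bandwidth when many wavelengths share one
# optical link (wavelength-division multiplexing). Both numbers are assumptions.

channels = 64                 # independent wavelengths on one link (assumed)
rate_per_channel_gbps = 100   # data rate per wavelength in Gb/s (assumed)

aggregate_tbps = channels * rate_per_channel_gbps / 1000
print(f"{channels} channels x {rate_per_channel_gbps} Gb/s = {aggregate_tbps:.1f} Tb/s on a single link")
```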
Highly efficient:
Light is also less prone to transmission losses than electrical current, so it does not generate the same level of heat as electronic computing, which makes optical computing highly energy efficient. And since optical signals are immune to electromagnetic interference, electrical short circuits are not a concern.
Faster processing:
Even under perfect conditions, an electrical signal travels at only 50-95% of the speed of light, and that head start is part of what makes optical computers faster than the prevalent ones.
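A quick worked comparison using the velocity range quoted above and an assumed 30 cm interconnect (both the 0.6c figure and the distance are illustrative, not measurements):

```python
# Illustrative propagation-delay comparison over an assumed 30 cm interconnect.
# The 0.6c electrical velocity is picked from the 50-95% range quoted above.

C = 299_792_458            # speed of light in vacuum, m/s
distance_m = 0.30          # assumed interconnect length

t_light_ns = distance_m / C * 1e9
t_electrical_ns = distance_m / (0.6 * C) * 1e9

print(f"light:      {t_light_ns:.2f} ns")
print(f"electrical: {t_electrical_ns:.2f} ns (at 0.6c)")
```

Nanoseconds sound tiny, but multiplied across billions of operations and signals per second, they add up.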
How far are we?
The image at the top of this article depicts the final destination of the optical computer: a crystal slab with no screen, just holographic projection in the air for input and output. It will take decades to get there; that is a moonshot as of now.
But applications that are already realizable can be seen in near-edge computing and data centers. With near-edge computing capabilities, a 5G-enabled IoT device at a retail store could compute and store a portion of the data it generates then and there, instead of transferring all the raw data to a faraway data center. The result: low latency and low transmission losses.
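Here is a minimal sketch of that pattern in Python. The readings, the aggregation choice, and the function name are all made up for illustration; the point is simply that a small summary, not the raw stream, crosses the network.

```python
# Illustrative edge-computing pattern: summarize raw readings locally and
# ship only the summary upstream, instead of every raw sample.

from statistics import mean

def summarize_locally(readings: list[float]) -> dict:
    """Aggregate raw sensor readings on the edge device."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
    }

raw_readings = [21.4, 21.9, 22.1, 35.0, 21.7, 21.6]   # made-up in-store sensor samples
summary = summarize_locally(raw_readings)

# Only the small summary crosses the network to the faraway data center.
print(f"raw samples kept on device: {len(raw_readings)}")
print(f"payload sent upstream: {summary}")
```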
Lightspeed Photonics, a deeptech startup in Singapore, is building next-generation optical interconnects that beam data directly into chips with lasers (no cables!) and integrate computing chips for high-bandwidth data processing at low power. It is a far more near-term example of optical computing than the technology's full potential.
Conclusion
Optical computing is not going to completely replace electronic computing over the next few decades, but integrating the two is already heavily augmenting electronics and removing roadblocks we face today. The progress of that integration gives DeepTech investors the chance to bet early on startups working to transform the computing industry.
Featured Image Credit: Created with DALL·E 2 – OpenAI; Provided by the Author; Thank you!