Highlights:

  • The startup’s Photonic Fabric interconnect uses light, rather than electricity, to carry data between memory chips and processors.
  • Celestial AI says Photonic Fabric can also help data center operators lower hardware costs.

Celestial AI Inc., a startup that develops optical chip technology to speed up artificial intelligence models, has announced that it raised USD 100 million in funding.

Porsche AG and Samsung Electronics Co. Ltd.’s venture capital divisions contributed to the round, as did IMEC, one of the world’s largest nanoelectronics research labs, and roughly a half-dozen other investors.

In data centers, advanced AI models don’t always operate as quickly as they could. The root cause is a performance issue known as the memory wall. Celestial AI claims to have solved the problem with its Photonic Fabric optical connectivity technology.

The memory wall problem arises because memory chip performance has not kept pace with that of processors, particularly graphics processing units. As a result, memory chips struggle to keep up with data center workloads, and the gap is beginning to slow how quickly AI models process data.

A graphics card that can theoretically process two gigabytes of data per second may, for example, receive only 800 megabytes per second from memory. Put another way, the GPU’s processing power cannot be utilized fully, which limits how quickly AI models can run.
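To make the bottleneck concrete, here is a minimal back-of-the-envelope sketch in Python using the article’s illustrative figures of two gigabytes per second of theoretical GPU throughput and 800 megabytes per second of delivered memory bandwidth; the numbers are for demonstration only, not vendor specifications.

```python
# Illustrative only: the memory wall as a utilization calculation.
# Figures follow the article's example, not Celestial AI or GPU vendor specs.

def effective_utilization(compute_gb_per_s: float, memory_gb_per_s: float) -> float:
    """Fraction of the processor's data-handling capacity that memory can actually feed."""
    return min(memory_gb_per_s / compute_gb_per_s, 1.0)

if __name__ == "__main__":
    util = effective_utilization(compute_gb_per_s=2.0, memory_gb_per_s=0.8)
    print(f"Memory bandwidth limits the GPU to {util:.0%} of its theoretical throughput")
    # -> Memory bandwidth limits the GPU to 40% of its theoretical throughput
```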

Celestial AI’s answer is to increase the speed at which processors can send data to and receive data from memory. The startup’s Photonic Fabric interconnect uses light, rather than the conventional electrical signals, to carry data between memory chips and processors. Because light travels faster than electricity, interconnect speeds rise.

According to Celestial AI, Photonic Fabric can deliver terabits per second of compute-to-memory bandwidth at nanosecond-scale latencies. Although it is not the first company to use optics to speed up compute-to-memory data transfer, it asserts that its technology provides up to 25 times more bandwidth than competing approaches.
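To show why both headline numbers matter, the following sketch models a single transfer as a fixed latency plus serialization time over the link. The 1 Tb/s bandwidth and 100 ns latency are assumed placeholders in the spirit of the article’s claims, not published Photonic Fabric specifications.

```python
# Hypothetical link parameters chosen only to illustrate the
# latency-plus-bandwidth model; not Celestial AI's published figures.

def transfer_time_us(payload_bytes: int, bandwidth_tbps: float, latency_ns: float) -> float:
    """Time in microseconds to move one payload: fixed link latency + serialization time."""
    bits = payload_bytes * 8
    serialization_us = bits / (bandwidth_tbps * 1e12) * 1e6
    return latency_ns / 1e3 + serialization_us

if __name__ == "__main__":
    for size in (4 * 1024, 1024 * 1024):  # a small burst and a 1 MB tensor tile
        t = transfer_time_us(size, bandwidth_tbps=1.0, latency_ns=100.0)
        print(f"{size // 1024} KB over a 1 Tb/s link with 100 ns latency: {t:.3f} us")
```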

Some optical interconnects move data through a chip’s “beachfront,” the edges of the processor die. Celestial AI’s Photonic Fabric is instead designed to be deployed as a base layer beneath the semiconductor rather than connecting at the beachfront. The startup says this approach delivers better performance because it increases the amount of chip surface area through which data can be carried.

The company claims its technology offers benefits beyond speeding up AI models: Celestial AI says Photonic Fabric can also help data center operators lower hardware costs.

Because of certain technical limitations, expanding the memory in an AI cluster often requires adding more processors as well. Celestial AI says Photonic Fabric lets businesses add memory without buying new systems. As a result, less hardware is needed, which brings down data center costs.
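As a rough illustration of that cost argument, the sketch below compares buying whole GPU servers just to gain memory against adding disaggregated memory over an optical interconnect. Every capacity and price in it is an assumed placeholder, not data from Celestial AI or any vendor.

```python
import math

# All capacities and prices are made-up placeholders used only to illustrate
# the bundled-vs-disaggregated memory cost argument.

def servers_needed(extra_memory_gb: int, memory_per_server_gb: int) -> int:
    """Whole servers required when memory only ships bundled with compute."""
    return math.ceil(extra_memory_gb / memory_per_server_gb)

if __name__ == "__main__":
    extra_memory_gb = 2048            # additional memory the AI cluster needs
    memory_per_server_gb = 640        # assumed memory bundled with each GPU server
    server_cost_usd = 250_000         # assumed price per GPU server
    pooled_cost_per_gb_usd = 40       # assumed price per GB of fabric-attached memory

    bundled = servers_needed(extra_memory_gb, memory_per_server_gb) * server_cost_usd
    pooled = extra_memory_gb * pooled_cost_per_gb_usd
    print(f"Adding {extra_memory_gb} GB by buying whole servers: ${bundled:,}")
    print(f"Adding {extra_memory_gb} GB as pooled memory over the fabric: ${pooled:,}")
```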

Dave Lazovsky, co-founder and Chief Executive Officer of Celestial AI, stated, “This next wave of data center infrastructure is being architected to deliver tremendous advancements in AI workload efficiencies, resulting from disaggregation of memory and compute resources which is enabled by optical interconnectivity.”

The startup says its technology supports other use cases as well. According to Celestial AI, Photonic Fabric can move data not only between processors and memory chips but also from one processor to another. The company further asserts that the technology can manage the flow of data within an individual processor.

The company intends to license its technology to businesses including semiconductor manufacturers, and it says Photonic Fabric can be integrated into CPUs and GPUs. As part of its efforts to monetize the technology, it has also built an in-house AI accelerator processor called Orion that leverages Photonic Fabric to manage data movement.