A team of researchers in Russia is outlining a plan to build, by 2027, a computer whose core calculations are performed at the speed of light. The project unites the National Center for Physics and Mathematics (NCFM), the Russian Federal Nuclear Center's All-Russian Research Institute of Experimental Physics (RFNC-VNIIEF), and Samara State University, with researchers connected to the Russian Academy of Sciences. The aim is bold: to shift data processing toward a regime where speed is set by light rather than by electricity alone. By merging electronics with photonics, the project seeks to create a processing stack that can tackle neural network workloads with far fewer bottlenecks than traditional silicon-based systems. The international implications are clear: researchers and industry observers in North America and beyond are watching the work for signals about what becomes possible when optical and electronic computing merge. (Source: TASS)
Alexander Sergeev, who leads the scientific team, describes the computing system as an ultra-fast platform built on optical neural networks. The system, called ECPS, is meant to handle data streams with light-driven co-processors performing key calculations at the speed of light. Sergeev emphasizes that this architecture fuses electronic controllers with photonic processing units to reduce latency and boost throughput for complex AI tasks. (Source: TASS)
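TASS does not describe the internals of ECPS, but the division of labor Sergeev outlines, electronic control wrapped around optical linear algebra, can be pictured with a short toy model. The Python sketch below is an illustration under that assumption, not the team's design: the hypothetical photonic_matvec stands in for an optical co-processor that performs a matrix-vector product in a single pass of light, while the electronic host applies nonlinearities and sequences the layers, with a small noise term nodding to the analog nature of optical hardware.

```python
# Toy model of a hybrid electronic-photonic neural-network pipeline.
# Illustrative only: all names and structure are assumptions, not the ECPS design.
import numpy as np

def photonic_matvec(weights: np.ndarray, signal: np.ndarray, noise: float = 0.01) -> np.ndarray:
    """Stand-in for an optical co-processor: a matrix-vector product of the kind an
    interferometer mesh could evaluate in one optical pass, plus a small Gaussian
    term representing analog noise."""
    return weights @ signal + noise * np.random.randn(weights.shape[0])

def electronic_layer(x: np.ndarray) -> np.ndarray:
    """The electronic side handles the nonlinearity (ReLU here) and data marshaling."""
    return np.maximum(x, 0.0)

def hybrid_forward(layers, x):
    """Electronic controller loops over layers, offloading each linear step
    to the (simulated) photonic unit."""
    for w in layers:
        x = electronic_layer(photonic_matvec(w, x))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layers = [rng.standard_normal((32, 64)), rng.standard_normal((8, 32))]
    out = hybrid_forward(layers, rng.standard_normal(64))
    print(out.shape)  # (8,)
```

In a real hybrid system, the hard engineering sits exactly at this boundary: converting between electrical and optical domains, calibrating analog weights, and keeping noise low enough for inference to stay accurate, which is consistent with the caveats raised later in this piece.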
Over the next two years, the team plans to deliver a prototype intended to outperform today's leading hardware on standard benchmarks. If the timeline holds, the prototype would demonstrate the practical feasibility of the ECPS approach and offer a reference point for hardware researchers worldwide. The work sits at the intersection of photonics, AI hardware, and neuromorphic design, reflecting a broader trend toward hybrid systems that can support larger models and real-time inference. (Source: TASS)
Separately, researchers from the University of Melbourne have explored a method for creating scalable arrays of atoms in silicon devices that could form the basis of future quantum computers. This line of inquiry underscores the diversity of approaches under discussion globally as scientists seek hardware foundations that can eventually unlock more powerful AI capabilities. (Source: University of Melbourne)
On the commercial and public policy front, Elon Musk has signaled interest in applying artificial intelligence to game development, reflecting a broader push to connect advanced AI techniques with consumer software and interactive experiences. (Source: public statements)
For North America, including Canada and the United States, the Russian announcement resonates amid a wave of investments and research programs in photonics and AI hardware. Canadian researchers and US labs are advancing silicon photonics, neuromorphic chips, and high-bandwidth interconnects as part of national strategies to strengthen AI infrastructure. The aim is to support faster training and lower-latency inference for enterprise workloads, cloud services, and edge devices. (Source: industry reports)
Despite the excitement, experts caution that turning a proof-of-concept into a reliable, scalable device will require breakthroughs in materials science, packaging, thermal management, and error handling. The path from a prototype to a production-ready system is long and involves multi-disciplinary teams across physics, engineering, and computer science. (Source: technical analyses)