2026-05-01

Photonic quantum computing: Hybrid chips bridge the gap

Researchers demonstrate a deterministic pick-and-place method to integrate high-efficiency quantum dots onto silicon chips at telecom wavelengths.

Researchers took a significant step toward scalable photonic quantum computing by using a pick-and-place technique to deterministically integrate InAs/InP quantum dots onto silicon chips with nanoscale precision for high-efficiency telecom emission.

— BrunoSan Quantum Intelligence · 2026-05-01 · 6 min read · 1347 words
quantum computing · nanotechnology · silicon photonics · research

The central challenge of photonic quantum computing has long been a mismatch of materials. To build a quantum computer that uses light, you need two things that rarely play well together: a perfect source of single photons and a complex circuit to route them. For years, physicists have faced a binary choice. They could use indium arsenide quantum dots, which are world-class photon emitters but difficult to arrange in large numbers, or they could use silicon, the gold standard for scalable circuitry that unfortunately lacks the ability to generate high-quality quantum light on its own. The inability to marry these two platforms has stalled the development of large-scale quantum optical processors. [doi:10.1021/acs.nanolett.7b03220]

A research team at the KTH Royal Institute of Technology has now bridged this divide by developing a technique to transplant individual quantum emitters onto silicon chips with unprecedented control. By treating the problem as a high-precision assembly task rather than a growth challenge, they have bypassed the chemical incompatibilities that usually prevent these materials from coexisting. This breakthrough addresses the fundamental bottleneck of scalability: how to populate a massive silicon photonic circuit with dozens or hundreds of identical, high-performance light sources without relying on the luck of the draw.

The Core Finding

The researchers successfully demonstrated the hybrid integration of solid-state quantum emitters onto a silicon photonic chip using a deterministic "pick-and-place" technique. They began with epitaxially grown InAs/InP quantum dots, which are prized for emitting single photons at the 1550-nanometer telecom wavelength, the same frequency used in global fiber-optic networks. Using a specialized nanoprobe, the team moved these dots from their original growth substrate and positioned them onto a silicon-on-insulator (SOI) waveguide with nanoscale precision. This is not a random distribution; it is a deliberate, engineered placement that ensures each emitter is exactly where it needs to be to interact with the chip's circuitry.

To ensure the light actually enters the silicon circuit rather than scattering into the environment, the team employed an adiabatic tapering approach. Think of it like a highway on-ramp designed to let a car merge into high-speed traffic without a collision; the geometry of the materials narrows so gradually that the photon is nudged from the quantum dot into the silicon waveguide with high efficiency. The team confirmed the success of this integration by incorporating an on-chip beamsplitter to perform a Hanbury Brown and Twiss measurement, a gold-standard test in quantum optics. The abstract notes the primary achievement: they can "position epitaxially grown InAs/InP quantum dots... on a silicon photonic chip deterministically with nanoscale precision."
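For readers curious what the Hanbury Brown and Twiss test actually measures, here is a minimal sketch (illustrative only, not the team's analysis code): it estimates the zero-delay second-order correlation g2(0) from two detectors' click timestamps. Uncorrelated laser-like light gives g2(0) ≈ 1, while a true single-photon source, which can never send a photon to both arms at once, drives g2(0) toward zero.

```python
import numpy as np

def g2_zero(t_a, t_b, window, t_total, rate_a, rate_b):
    """Estimate g2(0) from two detectors' click timestamps.

    Coincidences within +/-window are counted, then normalized by the
    rate expected for two uncorrelated (Poissonian) click streams.
    """
    idx = np.searchsorted(t_b, t_a)
    coinc = 0
    for i, t in zip(idx, t_a):
        # Check the two nearest detector-B clicks around each A click.
        for j in (i - 1, i):
            if 0 <= j < len(t_b) and abs(t_b[j] - t) <= window:
                coinc += 1
    # Expected accidental coincidences for uncorrelated streams.
    expected = rate_a * rate_b * 2 * window * t_total
    return coinc / expected

rng = np.random.default_rng(0)
t_total, rate = 10.0, 1e4  # seconds, counts per second (toy numbers)

# Uncorrelated Poissonian clicks (laser-like light): g2(0) should be ~1.
t_a = np.sort(rng.uniform(0, t_total, int(rate * t_total)))
t_b = np.sort(rng.uniform(0, t_total, int(rate * t_total)))
print(g2_zero(t_a, t_b, 1e-6, t_total, rate, rate))  # ~1.0

# A single-photon source routes each photon to only one arm of the
# beamsplitter, so zero-delay coincidences vanish and g2(0) -> 0.
```

In a real experiment the clicks come from superconducting nanowire detectors and the normalization is done against coincidences at large delays, but the logic is the same: g2(0) well below 0.5 certifies single-photon emission.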

The State of the Field

Before this work, the field was largely divided between monolithic and stochastic approaches. Groups led by Dirk Englund at MIT and others have explored various ways to integrate emitters, but often faced the "yield problem." In traditional growth methods, quantum dots appear at random locations on a wafer, meaning engineers have to build circuits around the dots wherever they happen to land. This makes designing complex, multi-emitter systems nearly impossible. Earlier attempts at hybrid integration often suffered from high signal loss at the interface between the III-V semiconductor (the emitter) and the Group IV silicon (the waveguide).

This new approach changes the landscape by decoupling the growth of the emitter from the fabrication of the circuit. In the broader quantum computing landscape, this move toward modularity is essential. While superconducting qubits, championed by IBM and Google, currently lead in raw qubit count, they require massive dilution refrigerators to operate. Photonic quantum computing offers a path toward room-temperature operation and easier networking between quantum processors. By proving that high-performance emitters can be manually integrated into the silicon ecosystem, this research provides a blueprint for scaling photonic systems beyond the few-qubit limit.

From Lab to Reality

For the scientific community, this technique unlocks the ability to pre-characterize emitters. Researchers can now test a thousand quantum dots, select only the ten with the most identical optical properties, and place them on a single chip. This "homogeneity" is the holy grail of quantum interference, which is required for the logic gates that power a quantum computer. For engineers, this means the existing multi-billion dollar silicon manufacturing infrastructure can now be used to host quantum components, potentially accelerating the timeline for integrated quantum transceivers.
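The selection step described above is straightforward to sketch. Assuming a wafer survey has produced an emission-wavelength reading for each dot (the function name and numbers below are illustrative, not from the paper), the most uniform subset of k emitters is found by sorting and sliding a window, since the tightest k-subset of sorted values is always contiguous:

```python
import numpy as np

def pick_most_uniform(wavelengths_nm, k):
    """Choose the k emitters with the smallest wavelength spread.

    After sorting, the optimal k-subset is contiguous, so a sliding
    window over the sorted array finds it in O(n log n).
    """
    w = np.sort(np.asarray(wavelengths_nm))
    spreads = w[k - 1:] - w[: len(w) - k + 1]  # span of each window
    i = int(np.argmin(spreads))
    return w[i : i + k], float(spreads[i])

rng = np.random.default_rng(1)
# Hypothetical wafer survey: 1000 dots near 1550 nm with ~1 nm
# inhomogeneous broadening (illustrative numbers).
survey = rng.normal(1550.0, 1.0, 1000)
chosen, spread = pick_most_uniform(survey, 10)
print(f"spread of best 10 dots: {spread * 1e3:.1f} pm")
```

The payoff is exactly the homogeneity argument in the text: out of a thousand random dots, the best ten can be orders of magnitude closer in wavelength than a randomly grown ten would be.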

From an investment perspective, this research directly impacts the quantum interconnect and secure communications markets. The ability to work at telecom wavelengths means these chips are natively compatible with existing fiber-optic infrastructure. As the quantum networking market (estimated to reach several billion dollars by the early 2030s) matures, the demand for integrated, high-efficiency single-photon sources will become acute. This hybrid integration method provides a viable manufacturing path for the hardware that will underpin the "Quantum Internet," moving the technology from bulky laboratory setups to compact, ruggedized chips.

What Still Needs to Happen

Despite the success of the pick-and-place method, two major technical hurdles remain. First is the throughput of the assembly process. While nanoscale precision is possible, moving emitters one by one is currently a slow, artisanal process. To build a processor with 10,000 emitters, the industry will need automated, parallelized transfer techniques, a challenge currently being explored by groups specializing in micro-transfer printing. Second, while the photons are emitted at telecom wavelengths, the quantum dots themselves still typically require cryogenic cooling to maintain their quantum properties and prevent thermal noise from ruining the single-photon purity.
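The throughput problem is easy to quantify with back-of-envelope arithmetic. The per-emitter time below is an assumption for illustration (the paper does not report one), but it shows why parallelized transfer is non-negotiable at scale:

```python
def assembly_days(n_emitters, secs_per_placement, parallel_heads=1):
    """Back-of-envelope total assembly time for a pick-and-place run."""
    total_seconds = n_emitters * secs_per_placement / parallel_heads
    return total_seconds / 86400  # seconds in a day

# Assumed figure: 30 minutes per emitter for careful nanoprobe work.
print(assembly_days(10_000, 1800))        # serial: ~208 days
print(assembly_days(10_000, 1800, 100))   # 100 parallel heads: ~2 days
```

Even generous per-dot speedups leave a serial process impractically slow at the 10,000-emitter scale, which is why micro-transfer printing and other massively parallel techniques are the focus of follow-on work.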

Groups at the University of Bristol and various startups like Xanadu are working on architectural workarounds for these limitations, such as using multiplexing to compensate for lower efficiencies. However, a truly room-temperature, high-efficiency integrated source remains the "North Star" of the field. We are likely five to ten years away from seeing these hybrid chips in a commercial quantum-secured communication hub, as the transition from a single-emitter demonstration to a mass-produced system requires significant refinements in robotic assembly and cryogenic packaging.

Conclusion

The integration of high-performance semiconductor light sources with silicon's scalable architecture represents a pivotal shift in how we construct quantum hardware. By moving away from random growth and toward deterministic assembly, we gain the control necessary to build truly complex quantum machines. In short: this research shows a credible path to scalable photonic quantum computing, using a pick-and-place technique to integrate telecom-wavelength quantum dots onto silicon chips with nanoscale precision.

Frequently Asked Questions

What is a solid-state quantum emitter?
A solid-state quantum emitter is a tiny structure within a semiconductor, such as a quantum dot, that acts like an artificial atom. When excited by a laser, it releases energy in the form of exactly one photon at a time. These single photons are the fundamental units of information, or qubits, in light-based quantum computing. This specific research uses indium arsenide (InAs) dots grown on an indium phosphide (InP) base.
How does the pick-and-place technique work?
The technique involves using a high-precision nanoprobe to physically lift a pre-grown quantum dot from its original surface and move it to a new location. The researchers use a microscope to guide the probe, allowing them to place the emitter onto a silicon waveguide with accuracy measured in nanometers. This ensures the emitter is perfectly aligned with the chip's optical paths. The process allows for the creation of custom circuits using only the best-performing emitters.
How does this compare to previous integration methods?
Previous methods usually relied on 'stochastic' growth, where quantum dots were grown directly on the chip in random locations, leading to low yields and wasted space. Other hybrid methods struggled with high light loss when the photon moved from the emitter to the silicon circuit. This new approach uses an 'adiabatic taper', a gradual structural change, to transfer light with high efficiency. It also allows for 'pre-characterization,' meaning scientists only pick the dots they know work perfectly.
When could this be commercially relevant?
While the laboratory demonstration is a success, commercial application is likely five to ten years away. The current process of moving emitters one by one is too slow for mass production and must be automated. Furthermore, the system still requires cooling to very low temperatures to function. However, it provides a clear roadmap for the telecommunications industry to begin integrating quantum features into standard silicon hardware.
Which industries would benefit most from this research?
The telecommunications and cybersecurity industries are the primary beneficiaries because the system operates at 1550-nanometer wavelengths. This wavelength is the standard for existing fiber-optic cables, meaning these chips could be used for secure quantum key distribution over current networks. Additionally, the aerospace and defense sectors would benefit from the miniaturization of quantum sensors and clocks. The move to silicon-based chips makes these sensitive quantum tools more portable and rugged.
What are the current limitations of this research?
The most significant limitation is the manual nature of the assembly, which prevents immediate large-scale manufacturing. There is also the ongoing requirement for cryogenic cooling, as the quantum dots do not produce high-quality single photons at room temperature. Finally, while the light transfer is efficient, any interface between different materials introduces some degree of signal loss. Researchers must still improve the long-term stability of the bond between the dot and the silicon chip.

Follow photonic quantum computing Intelligence

BrunoSan Quantum Intelligence tracks photonic quantum computing and 44+ quantum computing signals daily — arXiv papers, Nature, APS, IonQ, IBM, Rigetti and more. Updated every cycle.

Explore Quantum MCP →