Open Access Article

Neural network learning with photonics and for photonic circuit design

Daniel Brunner, Miguel C. Soriano, and Shanhui Fan
Published/Copyright: March 2, 2023

This special issue covers works that lie at the interface between machine learning, spearheaded by the computing power of artificial neural networks (NNs), and photonic technologies. In the past few years, interest in this promising field has been renewed by a number of successful experimental demonstrations of advanced computing functionalities [1, 2] and by the design of optimized nanophotonic devices [3, 4]. An example of the cross-fertilization between machine learning concepts and advances in photonic fabrication using novel materials is the development of hardware accelerators for vector-matrix multiplication, which benefit from software-hardware co-design [5].
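As a purely illustrative aside (not drawn from any of the cited works; the function name and all parameter values below are hypothetical), the following minimal Python sketch shows how such hardware-software co-design is often reasoned about in simulation: a vector-matrix product is evaluated with quantized weights and additive readout noise, standing in for the finite precision and noise floor of an analog photonic accelerator.

```python
import numpy as np

def photonic_matvec(W, x, bits=6, noise_std=0.01, rng=None):
    """Toy model of an analog photonic vector-matrix multiply.

    Weights are quantized to a limited bit depth and the readout is
    perturbed by additive Gaussian noise; both stand in for nonidealities
    that hardware-software co-design must account for. Purely
    illustrative, not a model of any specific device.
    """
    rng = np.random.default_rng() if rng is None else rng
    levels = 2 ** bits - 1
    w_max = max(np.max(np.abs(W)), 1e-12)
    W_q = np.round(W / w_max * levels) / levels * w_max   # quantized weights
    y = W_q @ x                                           # ideal analog product
    return y + noise_std * rng.normal(size=y.shape)       # detector/readout noise

# Compare the noisy analog estimate with the exact digital result.
rng = np.random.default_rng(0)
W, x = rng.normal(size=(4, 8)), rng.normal(size=8)
print(photonic_matvec(W, x, rng=rng))
print(W @ x)
```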

Here, we have identified that current trends in the community can be conceptually divided into two distinct research directions. On the one hand, photonic systems and devices can serve as a hardware substrate that naturally suits the characteristic properties of artificial NN topologies [6]. Advantages brought by photonics in this context include the potential for parallelization, high-speed operation, and low power consumption. On the other hand, machine learning can aid in the design of photonic devices [7] or components [8] and accelerate the search for promising structures. Artificial NNs can also assist in the processing of optically acquired data, with the ultimate goal of adding new functionalities and enhancing performance [9].

Dinc et al. [10] provide a tomography-centered review of the role of NNs in photonic circuit design, offering the scientific community a valuable general perspective as well as candidate topologies for future 3D optical design. The potential to simultaneously exploit the physical dimensions of time, wavelength, and space is reviewed in the context of recent advances in optical NNs by Bai et al. [11]. Brückerhoff-Plückelmann et al. [12] illustrate how charge accumulation can potentially be an ingredient for enabling large-scale photonic matrix processors; Gu et al. [13] provide a vision of 3D vertically integrated photonic NNs based on vertical-cavity surface-emitting laser (VCSEL) arrays, while Buckley et al. [14] examine online learning paradigms for photonic NNs, in which the machinery for training is built deeply into the hardware itself.

Li et al. [15] demonstrate how a periodically poled thin-film lithium niobate nanophotonic waveguide can be used to implement an ultra-fast and highly efficient optical neuron with a rectified linear unit (ReLU) activation function. By coupling a VCSEL to a resonant tunneling diode, Hejda et al. [16] demonstrate how to realize an optoelectronic excitable neuron. Hasegawa et al. [17] investigate parallel and deep all-optical reservoir computing with delayed feedback-coupled semiconductor lasers by combining multiple reservoirs in potentially hybrid configurations. Miri and Menon [18] show how to harness the collective behavior of laser networks for storing and retrieving a large number of phase patterns, where nonreciprocal coupling is shown to be important for resource efficiency.

How to realize learning of multiple tasks in deep diffractive NNs by leveraging multiple wavelengths is investigated and demonstrated by Duan et al. [19]. Continuing with diffractive optical networks, Mengu et al. [20] implement permutation matrices and find indications that the capacity of diffractive optical networks to approximate a given permutation operation increases proportionally with the number of diffractive layers. The application of integrated photonic reservoirs to 64-level quadrature-amplitude-modulated, directly detected signals in a Kramers–Kronig receiver scheme is reported by Masaad et al. [21]. Hülser et al. [22] create a link between high-level information-processing metrics and the performance in a particular task, which they investigate based on a delay-reservoir architecture comprising coupled Stuart–Landau oscillators. How to use transfer learning in a novel way is shown by Bauwens et al. [23], who demonstrate that the concept can increase an analog hardware reservoir's robustness against parameter drift. Other important aspects, such as the limited system size of photonic circuits and a potentially lower bit precision, are addressed by Giamougiannis et al. [24], who experimentally demonstrate a speed-optimized dynamic-precision NN including tiled matrix multiplication. Basani et al. [25] investigate self-similar topologies of multiport interferometers based on integrated beamsplitter meshes following sine-cosine fractal decompositions of unitary matrices, evaluating their compactness and robustness against fabrication nonidealities. Finally, Pai et al. [26] experimentally report the optimization, i.e., the training, of such multiport interferometer meshes based on a drastically simplified yet highly performant protocol.
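Several of the reservoir computing contributions above [21]-[23] share one training paradigm: a fixed nonlinear dynamical system (in hardware, a photonic reservoir) transforms the input, and only a linear readout is trained, typically by ridge regression. The sketch below illustrates this paradigm with a generic software echo state network rather than any of the photonic implementations discussed here; the network size, the toy memory task, and all hyperparameters are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random reservoir (echo state network); only W_out is trained.
n_in, n_res, washout = 1, 200, 100
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy target: a nonlinear function of a past input value, requiring memory.
u = rng.uniform(-1, 1, size=2000)
y = np.roll(u, 5) ** 2

X = run_reservoir(u)[washout:]
Y = y[washout:]

# Ridge-regression readout (the only trained part of the system).
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ Y)
print("train NMSE:", np.mean((X @ W_out - Y) ** 2) / np.var(Y))
```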

Yesilyurt et al. [27] propose an NN-based inverse design technique, enabled by a differentiable analytical solver, that mitigates common fabrication challenges by including simulated systematic and random nonidealities. Multi-task topology optimization of photonic devices utilizing only low-spatial-frequency components in conjunction with deep NNs is introduced and evaluated by Mao et al. [28].
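To give a flavor of what such fabrication-conscious optimization means in practice, the toy sketch below (not the method of [27] or [28]; the stack, objective, and all numerical values are arbitrary assumptions) optimizes the layer thicknesses of a two-material thin-film stack for high reflectance at a single wavelength while averaging the objective over sampled thickness errors, so that the optimizer favors designs robust to fabrication imperfections. The forward model is a standard normal-incidence transfer-matrix calculation, and gradients are taken by finite differences for simplicity.

```python
import numpy as np

def reflectance(thicknesses, indices, wavelength=1.55, n_in=1.0, n_sub=1.45):
    """Normal-incidence reflectance of a thin-film stack (characteristic-matrix method)."""
    M = np.eye(2, dtype=complex)
    for d, n in zip(thicknesses, indices):
        delta = 2 * np.pi * n * d / wavelength
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return np.abs(r) ** 2

rng = np.random.default_rng(2)
indices = np.tile([2.0, 1.5], 4)                       # alternating high/low index, 8 layers
d = np.full(8, 0.2)                                    # initial thicknesses (micrometres)
perturbations = 0.01 * rng.normal(size=(16, d.size))   # fixed sampled thickness errors

def robust_objective(d):
    """Mean reflectance over the sampled fabrication errors (to be maximized)."""
    return np.mean([reflectance(d + p, indices) for p in perturbations])

# Finite-difference gradient ascent on the noise-averaged objective.
lr, eps = 0.02, 1e-4
for _ in range(200):
    grad = np.array([(robust_objective(d + eps * e) - robust_objective(d - eps * e)) / (2 * eps)
                     for e in np.eye(d.size)])
    d = np.clip(d + lr * grad, 0.02, 0.5)              # keep thicknesses in a fabricable range

print("robust mean reflectance:", robust_objective(d))
```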

In conclusion, this special issue provides reviews, perspectives, and a variety of research articles on the current themes emerging from the interplay between photonics and NNs. We hope that this collection of articles, covering the many relevant scientific and technological aspects, will be of help to researchers entering the field as well as to those who are already established. Particularly in this fast-moving field with its current rate of innovation, regular inventories of the trending research directions are of significant benefit. We would like to thank the Nanophotonics publishing editor Dennis Couwenberg and publishing assistant Tara Dorrian for their constant support and technical assistance.


Corresponding author: Daniel Brunner, FEMTO-ST/Optics Dept., UMR CNRS 6174, Univ. Franche-Comté, 15B avenue des Montboucons, 25030 Besançon Cedex, France, E-mail:

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Research funding: None declared.

  3. Conflict of interest statement: The authors declare no conflicts of interest regarding this article.

References

[1] Y. Shen, N. C. Harris, S. Skirlo, et al., “Deep learning with coherent nanophotonic circuits,” Nat. Photonics, vol. 11, no. 7, pp. 441–446, 2017. https://doi.org/10.1038/nphoton.2017.93.

[2] J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature, vol. 569, no. 7755, pp. 208–214, 2019. https://doi.org/10.1038/s41586-019-1157-8.

[3] D. Liu, Y. Tan, E. Khoram, and Z. Yu, “Training deep neural networks for the inverse design of nanophotonic structures,” ACS Photonics, vol. 5, no. 4, pp. 1365–1369, 2018. https://doi.org/10.1021/acsphotonics.7b01377.

[4] Y. Chen, L. Lu, G. E. Karniadakis, and L. Dal Negro, “Physics-informed neural networks for inverse problems in nano-optics and metamaterials,” Opt. Express, vol. 28, no. 8, pp. 11618–11633, 2020. https://doi.org/10.1364/oe.384875.

[5] B. J. Shastri, A. N. Tait, T. Ferreira de Lima, et al., “Photonics for artificial intelligence and neuromorphic computing,” Nat. Photonics, vol. 15, no. 2, pp. 102–114, 2021. https://doi.org/10.1038/s41566-020-00754-y.

[6] G. Wetzstein, A. Ozcan, S. Gigan, et al., “Inference in artificial intelligence with deep optics and photonics,” Nature, vol. 588, no. 7836, pp. 39–47, 2020. https://doi.org/10.1038/s41586-020-2973-6.

[7] S. So, T. Badloe, J. Noh, J. Bravo-Abad, and J. Rho, “Deep learning enabled inverse design in nanophotonics,” Nanophotonics, vol. 9, no. 5, pp. 1041–1057, 2020. https://doi.org/10.1515/nanoph-2019-0474.

[8] D. Melati, Y. Grinberg, M. Kamandar Dezfouli, et al., “Mapping the global design space of nanophotonic components using machine learning pattern recognition,” Nat. Commun., vol. 10, no. 1, p. 4775, 2019. https://doi.org/10.1038/s41467-019-12698-1.

[9] G. Genty, L. Salmela, J. M. Dudley, et al., “Machine learning and applications in ultrafast photonics,” Nat. Photonics, vol. 15, no. 2, pp. 91–101, 2021. https://doi.org/10.1038/s41566-020-00716-4.

[10] N. Ulas Dinc, A. Saba, J. Madrid-Wolff, et al., “From 3D to 2D and back again,” Nanophotonics, vol. 12, no. 5, pp. 777–793, 2023. https://doi.org/10.1515/nanoph-2022-0512.

[11] Y. Bai, X. Xu, M. Tan, et al., “Photonic multiplexing techniques for neuromorphic computing,” Nanophotonics, vol. 12, no. 5, pp. 795–817, 2023. https://doi.org/10.1515/nanoph-2022-0485.

[12] F. Brückerhoff-Plückelmann, I. Bente, D. Wendland, et al., “A large scale photonic matrix processor enabled by charge accumulation,” Nanophotonics, vol. 12, no. 5, pp. 819–825, 2023. https://doi.org/10.1515/nanoph-2022-0441.

[13] M. Gu, Y. Dong, H. Yu, H. Luan, and Q. Zhang, “Perspective on 3D vertically-integrated photonic neural networks based on VCSEL arrays,” Nanophotonics, vol. 12, no. 5, pp. 827–832, 2023. https://doi.org/10.1515/nanoph-2022-0437.

[14] S. M. Buckley, A. N. Tait, A. N. McCaughan, and B. J. Shastri, “Photonic online learning: a perspective,” Nanophotonics, vol. 12, no. 5, pp. 833–845, 2023. https://doi.org/10.1515/nanoph-2022-0553.

[15] G. H. Y. Li, R. Sekine, R. Nehra, et al., “All-optical ultrafast ReLU function for energy-efficient nanophotonic deep learning,” Nanophotonics, vol. 12, no. 5, pp. 847–855, 2023. https://doi.org/10.1515/nanoph-2022-0137.

[16] M. Hejda, E. Malysheva, D. Owen-Newns, et al., “Artificial optoelectronic spiking neuron based on a resonant tunnelling diode coupled to a vertical cavity surface emitting laser,” Nanophotonics, vol. 12, no. 5, pp. 857–867, 2023, arXiv:2206.11044. https://doi.org/10.1515/nanoph-2022-0362.

[17] H. Hasegawa, K. Kanno, and A. Uchida, “Parallel and deep reservoir computing using semiconductor lasers with optical feedback,” Nanophotonics, vol. 12, no. 5, pp. 869–881, 2023. https://doi.org/10.1515/nanoph-2022-0440.

[18] M. A. Miri and V. Menon, “Neural computing with coherent laser networks,” Nanophotonics, vol. 12, no. 5, pp. 883–892, 2023. https://doi.org/10.1515/nanoph-2022-0367.

[19] Z. Duan, H. Chen, and X. Lin, “Optical multi-task learning using multi-wavelength diffractive deep neural networks,” Nanophotonics, vol. 12, no. 5, pp. 893–903, 2023. https://doi.org/10.1515/nanoph-2022-0615.

[20] D. Mengu, Y. Zhao, A. Tabassum, M. Jarrahi, and A. Ozcan, “Diffractive interconnects: all-optical permutation operation using diffractive networks,” Nanophotonics, vol. 12, no. 5, pp. 905–923, 2023. https://doi.org/10.1515/nanoph-2022-0358.

[21] S. Masaad, E. Gooskens, S. Sackesyn, J. Dambre, and P. Bienstman, “Photonic reservoir computing for nonlinear equalization of 64-QAM signals with a Kramers–Kronig receiver,” Nanophotonics, vol. 12, no. 5, pp. 925–935, 2023. https://doi.org/10.1515/nanoph-2022-0426.

[22] T. Hülser, F. Koster, K. Ludge, and L. Jaurigue, “Deriving task specific performance from the information processing capacity of a reservoir computer,” Nanophotonics, vol. 12, no. 5, pp. 937–947, 2023. https://doi.org/10.1515/nanoph-2022-0415.

[23] I. Bauwens, K. Harkhoe, P. Bienstman, G. Verschaffelt, and G. Van der Sande, “Transfer learning for photonic delay-based reservoir computing to compensate parameter drift,” Nanophotonics, vol. 12, no. 5, pp. 949–961, 2023. https://doi.org/10.1515/nanoph-2022-0399.

[24] G. Giamougiannis, A. Tsakyridis, M. Moralis-Pegios, et al., “Analog nanophotonic computing going practical: silicon photonic deep learning engines for tiled optical matrix multiplication with dynamic precision,” Nanophotonics, vol. 12, no. 5, pp. 963–973, 2023. https://doi.org/10.1515/nanoph-2022-0423.

[25] J. Raj Basani, S. K. Vadlamani, S. Bandyopadhyay, D. R. Englund, and R. Hamerly, “A self-similar sine-cosine fractal architecture for multiport interferometers,” Nanophotonics, vol. 12, no. 5, pp. 975–984, 2023, arXiv:2209.03335. https://doi.org/10.1515/nanoph-2022-0525.

[26] S. Pai, C. Valdez, T. Park, et al., “Power monitoring in a feedforward photonic network using two output detectors,” Nanophotonics, vol. 12, no. 5, pp. 985–991, 2023. https://doi.org/10.1515/nanoph-2022-0527.

[27] O. Yesilyurt, S. Peana, V. Mkhitaryan, et al., “Fabrication-conscious neural network based inverse design of single-material variable-index multilayer films,” Nanophotonics, vol. 12, no. 5, pp. 993–1006, 2023. https://doi.org/10.1515/nanoph-2022-0537.

[28] S. Mao, L. Cheng, H. Chen, et al., “Multi-task topology optimization of photonic devices in low-dimensional Fourier domain via deep learning,” Nanophotonics, vol. 12, no. 5, pp. 1007–1018, 2023. https://doi.org/10.1515/nanoph-2022-0361.

Published Online: 2023-03-02

© 2023 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
