In-situ Backpropagation in Photonic Neural Networks

Abstract

Recently, integrated optics has gained interest as a hardware platform for implementing machine learning algorithms. Here, we introduce a method that enables highly efficient, in situ training of a photonic artificial neural network. We use the adjoint variable method to derive the photonic analogue of the backpropagation algorithm, which is the standard method for computing gradients for conventional neural networks. We further show how these gradients can be obtained exactly through intensity measurements inside the device. Beyond the training of photonic machine learning implementations, our method may also be of broad interest to experimental sensitivity analysis of photonic systems and the optimization of reconfigurable optics platforms.
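Below is a minimal numerical sketch of the adjoint-gradient relation the abstract describes, written for a simplified layered model: fixed random unitary mixing layers, banks of tunable phase shifters, and a mean-squared-error loss. The model, variable names, and loss choice are illustrative assumptions, not the paper's device or derivation; the paper derives the analogous relation for the full electromagnetic problem via the adjoint variable method. The key point the sketch reproduces is that each gradient entry reduces to a local product of the forward field and a backward-propagated adjoint field, which on-chip could be obtained from intensity (interference) measurements rather than computed off-line.

```python
# Sketch (assumptions: NumPy, random unitary mixing layers, vector phase
# shifters, loss L = ||y - target||^2). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    """Fixed, passive mixing layer (stand-in for a frozen interferometer mesh)."""
    q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    return q

n, n_layers = 4, 3
U = [random_unitary(n) for _ in range(n_layers)]               # fixed mixing unitaries
phi = [rng.uniform(0, 2 * np.pi, n) for _ in range(n_layers)]  # trainable phases

x_in = rng.normal(size=n) + 1j * rng.normal(size=n)
target = rng.normal(size=n) + 1j * rng.normal(size=n)

def forward(phi):
    """Propagate the input field, recording the field just after each phase-shifter bank."""
    x, z_fields = x_in, []
    for U_k, phi_k in zip(U, phi):
        z = np.exp(1j * phi_k) * x      # field right after the tunable phase shifters
        z_fields.append(z)
        x = U_k @ z                     # passive mixing
    return x, z_fields

def adjoint_gradient(phi):
    """Gradient of L = ||y - target||^2 w.r.t. every phase, from one backward (adjoint) pass."""
    y, z_fields = forward(phi)
    a = y - target                      # adjoint source = dL/dy* at the output ports
    grads = []
    for U_k, phi_k, z in zip(reversed(U), reversed(phi), reversed(z_fields)):
        a = U_k.conj().T @ a            # back-propagate through the mixing layer
        # dL/dphi_{k,j} = -2 Im[z_j * conj(a_j)]: a local forward-adjoint overlap,
        # the quantity an interference (intensity) measurement would provide on-chip.
        grads.append(-2 * np.imag(z * np.conj(a)))
        a = np.exp(-1j * phi_k) * a     # back-propagate through the phase shifters
    return list(reversed(grads))

def loss(phi):
    y, _ = forward(phi)
    return np.sum(np.abs(y - target) ** 2)

# Check one gradient entry against a centered finite difference.
g = adjoint_gradient(phi)
eps, k, j = 1e-6, 1, 2
phi_p = [p.copy() for p in phi]; phi_p[k][j] += eps
phi_m = [p.copy() for p in phi]; phi_m[k][j] -= eps
fd = (loss(phi_p) - loss(phi_m)) / (2 * eps)
print(g[k][j], fd)                      # the two values should agree to ~1e-6
```

In this toy setting the backward pass applies the conjugate transpose of each layer, which physically corresponds to sending the adjoint (error) field backward through the reciprocal network, so one forward and one backward propagation yield every gradient entry at once.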

© 2018 The Author(s)
