An anti-Hebbian local learning algorithm for two-layer optical neural networks is introduced. Under this learning rule, the update for a given weight depends only on the input and output of that connection and on a global scalar error signal, so the backpropagation of error signals through the network, as required by the widely used back-error-propagation algorithm, is avoided. The rule nevertheless guarantees that the synaptic weights are updated in the error-descent direction. Besides its apparent advantage of simpler optical implementation, this learning rule is also shown by simulations to be computationally effective.
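The abstract specifies only the structure of the rule: each weight update uses that connection's own input and output together with a single global scalar error, with no layer-by-layer error propagation. A minimal sketch of such a locally gated update is given below; the particular anti-Hebbian product form, the learning rate, and the network sizes are illustrative assumptions, not the paper's published rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, W2):
    """Two-layer network with saturating (tanh) units."""
    h = np.tanh(W1 @ x)   # hidden-layer activations
    y = np.tanh(W2 @ h)   # output-layer activations
    return h, y

def local_update(W, pre, post, e, lr=0.1):
    # Assumed anti-Hebbian form: each weight change uses only its own
    # pre- and post-synaptic activity, scaled by the global scalar error e.
    # No per-weight error signal is propagated backward through the network.
    return W - lr * e * np.outer(post, pre)

# Toy example: one update step on random data.
x = rng.normal(size=4)
target = np.array([1.0, -1.0])
W1 = rng.normal(scale=0.5, size=(3, 4))
W2 = rng.normal(scale=0.5, size=(2, 3))

h, y = forward(x, W1, W2)
e = float(np.sum((y - target) ** 2))  # one scalar error broadcast to both layers
W1 = local_update(W1, x, h, e)
W2 = local_update(W2, h, y, e)
```

The point of the sketch is the information flow: the same scalar `e` reaches every connection, which is what makes the rule attractive for optical hardware, where routing per-weight backward error signals is costly.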
© 1992 Optical Society of America