Abstract
A two-dimensional amorphous silicon photoconductor array and a liquid-crystal display form the core components of a hardware system for the implementation of a multilayer perceptron neural network. All connections between layers, as well as the nonlinear transfer characteristics associated with the hidden- and output-layer neurons, are implemented in analog circuitry, so that the network, once trained, behaves as a stand-alone processor. Subject to a standard backpropagation training algorithm, the network is shown to train successfully. Training of the network is studied under different levels of weight quantization, neuron output resolution, and random weight-defect probability. A computer simulation of the hardware network is also performed, and excellent agreement is shown between the results of the hardware network and those of the computer simulation. It is concluded that the training capability of the present hardware network is only slightly degraded by its nonidealities, including the level of weight quantization and the limited neuron output resolution.
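The training procedure described above, backpropagation with the weights rounded to a finite set of levels after each update, can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the XOR task, the 10-bit weight depth, the weight range, and the learning rate are all illustrative assumptions.

```python
import numpy as np

def quantize(w, bits=10, w_max=8.0):
    # Round each weight to the nearest representable level,
    # mimicking a finite-resolution analog weight store.
    # (bits and w_max are illustrative assumptions.)
    step = 2.0 * w_max / (2 ** bits)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy XOR task standing in for the paper's training problems.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small 2-4-1 perceptron; weights start on the quantized grid.
W1 = quantize(rng.normal(0.0, 0.5, (2, 4)))
b1 = np.zeros((1, 4))
W2 = quantize(rng.normal(0.0, 0.5, (4, 1)))
b2 = np.zeros((1, 1))

lr = 1.0
losses = []
for epoch in range(3000):
    # Forward pass through the two sigmoid layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    losses.append(float(np.mean(err ** 2)))

    # Standard backpropagation of the squared error.
    d_out = err * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Each weight update is re-quantized, as an analog weight
    # store with finite resolution would force.
    W2 = quantize(W2 - lr * (h.T @ d_out))
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 = quantize(W1 - lr * (X.T @ d_h))
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

With a moderate bit depth the quantization step is small relative to typical weight updates, so training proceeds much as in the unquantized case, consistent with the abstract's conclusion that quantization only slightly degrades training.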
© 1993 Optical Society of America