Abstract

Processing images with a neural network means applying a repeated sequence of operations to the images. The sequence consists of a general linear transformation and a nonlinear mapping of the pixel intensities. The general (shift-variant) linear transformation is time consuming for large images when done on a serial computer. A shift-invariant linear transformation can be implemented much more easily, by fast Fourier transform or optically, but it has fewer degrees of freedom because its coupling matrix is Toeplitz. We present a neural convolution network with shift-invariant coupling that nevertheless exhibits autoassociative restoration of distorted images. Besides its simple implementation, the network has a further advantage: associative recall does not depend on object position.
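The circulation described in the abstract — a shift-invariant linear transformation followed by a pointwise nonlinearity — can be sketched in a few lines of NumPy. This is an illustrative sketch only: the Gaussian transfer function and the clipping nonlinearity below are placeholder choices, not the ones used in the paper.

```python
import numpy as np

def feedback_step(u, h_tilde, nl):
    """One circulation of the network: u(t+1) = NL[h * u(t)].
    The circular convolution with the kernel h is done in the Fourier
    domain, so the circulant (Toeplitz) coupling matrix never has to
    be stored or multiplied explicitly."""
    u_filtered = np.real(np.fft.ifft2(h_tilde * np.fft.fft2(u)))
    return nl(u_filtered)

# Placeholder system parameters (not from the paper): a Gaussian
# low-pass transfer function and a clipping nonlinearity.
n = 64
fx, fy = np.fft.fftfreq(n), np.fft.fftfreq(n)
FX, FY = np.meshgrid(fx, fy, indexing="ij")
h_tilde = np.exp(-(FX**2 + FY**2) / (2 * 0.1**2))
nl = lambda x: np.clip(2.0 * x, 0.0, 1.0)

# Iterate the feedback loop on a random start image.
u = np.random.default_rng(0).random((n, n))
for _ in range(20):
    u = feedback_step(u, h_tilde, nl)
```

Because the linear step is a single FFT/multiply/inverse-FFT, the cost per circulation is O(N log N) rather than the O(N²) of a general shift-variant coupling matrix.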

© 1990 Optical Society of America


References


  1. G. Häusler, G. Seckmeyer, T. Weiss, “Chaos and Cooperation in Nonlinear Pictorial Feedback Systems. 1: Experiments,” Appl. Opt. 25, 4656–4663 (1986).
  2. R. P. Lippmann, “An Introduction to Computing with Neural Nets,” IEEE ASSP Magazine 4, 4–22 (1987).
  3. R. M. May, “Simple Mathematical Models with Very Complicated Dynamics,” Nature (London) 261, 459–473 (1976).
  4. M. Fang, G. Häusler, “Chaos and Cooperation in Nonlinear Pictorial Feedback Systems,” Proc. Soc. Photo-Opt. Instrum. Eng. 667, 214–219 (1986).
  5. J. R. Fienup, “Reconstruction of an Object from the Modulus of Its Fourier Transform,” Opt. Lett. 3, 27–29 (1978).
  6. R. W. Gerchberg, W. O. Saxton, “A Practical Algorithm for the Determination of Phase from Image and Diffraction Plane Pictures,” Optik 35, 237–246 (1972).
  7. G. R. Ayers, J. C. Dainty, “Iterative Blind Deconvolution Method and Its Applications,” Opt. Lett. 13, 547–549 (1988).
  8. L. M. Kani, J. C. Dainty, “Super Resolution Using the Gerchberg Algorithm,” Opt. Commun. 68, 11–17 (1988).
  9. D. Psaltis, J. Hong, “Shift-Invariant Optical Associative Memories,” Opt. Eng. 26, 10–15 (1987).
  10. G. J. Dunning, E. Marom, Y. Owechko, B. H. Soffer, “All-Optical Associative Memory with Shift Invariance and Multiple-Image Recall,” Opt. Lett. 12, 346–351 (1987).
  11. K. Nakano, “A Model of Associative Memory,” IEEE Trans. Syst. Man Cybern. SMC-2, 380–388 (1972).
  12. S. I. Amari, “Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements,” IEEE Trans. Comput. C-21, 1197–1206 (1972).
  13. S. I. Amari, “Neural Theory of Association and Concept Formation,” Biol. Cybern. 26, 175–186 (1977).
  14. J. Anderson, J. Silverstein, S. Ritz, R. Jones, “Distinctive Features, Categorical Perception and Probability Learning: Some Applications of a Neural Model,” Psychol. Rev. 84, 413–451 (1977).
  15. J. J. Hopfield, “Neural Networks and Physical Systems with Emergent Collective Computational Abilities,” Proc. Natl. Acad. Sci. USA 79, 2554–2558 (1982).




Figures (12)

Fig. 1

Pictorial feedback system.

Fig. 2

General shift invariant nonlinear feedback system.

Fig. 3

Pictorial feedback systems exhibit a broad spectrum of behavior. During the process of iteration either deterministic chaos (left column) or stable structures (right column) may evolve. The behavior depends upon the parameters of the system, i.e., upon the nonlinearity and the convolution kernel. The numbers on the images indicate the number of circulations.

Fig. 4

Autoassociativity in pictorial feedback systems: (a) represents the original image distorted by a bar; (b), (c), and (d) represent that same image after running it through the system represented by Eqs. (3) and (7) two, ten and twenty times. After 20 iteration cycles the image is autoassociatively restored.

Fig. 5

Unstable fixed point in a feedback system that uses an inverse filter. (a) Original image. (b) Original image after the sigmoid nonlinearity. (c) Original image after nonlinearity and inverse filter; the original image is restored. Therefore, the image is a fixed point of the feedback system. (d) Original image after 20 iteration cycles; the desired fixed point is unstable, and the system terminated at a different, stable fixed point. (e) The spectrum of the undesired fixed point and (f) the transfer function $\tilde{h}$ are similar.

Fig. 6

(a) The nonlinearity has the fixed points 1/2 and 1. (b) $2^n$-fold application of the nonlinearity displays the basins of attraction of the fixed points. Each intensity $x$ within the range $[\frac{1}{4}(3-\sqrt{5}),\ \frac{1}{4}(1+\sqrt{5})]$ ends up at the fixed point 1/2. All other intensities within the range [0,1] end up at the fixed point 1.

Fig. 7

The image (a) is a stable fixed point of a system that uses the convolution kernel (b). The spectrum (c) of the stable image contains only frequencies that are almost perfectly transmitted by the transfer function (d).

Fig. 8

Examples of the relationship between the spectral amplitude $\tilde{u}$ of the learned pattern and the transmittance $\tilde{h}$ of the filter. Function (a) represents the first calculation rule [Eq. (11)]. Function (b) illustrates the more effective calculation rule [Eq. (12)]. Function (c) shows the matched filter.

Fig. 9

Restoration of noisy (SNR = 1:1) and low-pass blurred/shifted images.

Fig. 10

Application of the feedback network to pattern recognition: (a) restoration of input image during 50 iterations, (b) the hard clip removes remaining noise, (c) the correlator determines the shift of the restored input image and compares it with the stored image, (d) the final clip operation decides whether the image was completely restored, i.e., recognized, or not.
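Step (c) of this pipeline, the correlator, can be realized by locating the peak of the circular cross-correlation, computed in the Fourier domain. The following is an illustrative sketch under that assumption (the function name and the random test pattern are not from the paper):

```python
import numpy as np

def detect_shift(restored, stored):
    """Estimate the cyclic shift between the restored input and the
    stored pattern from the peak of their circular cross-correlation,
    and return the peak position plus a normalized match score."""
    corr = np.real(np.fft.ifft2(np.fft.fft2(restored) *
                                np.conj(np.fft.fft2(stored))))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    score = corr[peak] / np.sqrt(np.sum(restored**2) * np.sum(stored**2))
    return tuple(int(i) for i in peak), score

# A perfectly restored but shifted pattern should correlate with
# score 1 at exactly its shift.
stored = np.random.default_rng(1).random((32, 32))
restored = np.roll(stored, shift=(5, 7), axis=(0, 1))
shift, score = detect_shift(restored, stored)
```

Thresholding the normalized score then plays the role of the final clip operation (d): only a near-perfect match counts as a recognition.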

Fig. 11

Confusion matrix. First column: noisy input patterns (SNR = 1.5:1). Columns 2–6: restoration results of the subsystems. Each subsystem of the pattern recognition system answers with the learned pattern only if the input pattern is the (distorted) learned pattern; there is no crosstalk.

Fig. 12

Restoration of a very noisy image fails: (a) noisy input image (SNR = 1:2). (b) Same image after 200 iterations.

Equations (16)


$$u_i(t+1) = \mathrm{NL}\!\left[\sum_{j=0}^{N-1} T_{ij}\, u_j(t)\right],$$
$$T_{ij} = h_{(i-j) \bmod N},$$
$$u(t+1) = \mathrm{NL}\left[h * u(t)\right],$$
$$u_i(t+1) = \mathrm{NL}\!\left[\sum_{j=0}^{N-1} h_{(i-j) \bmod N}\, u_j(t)\right].$$
$$\tilde{u}'_k(t) = \tilde{h}_k \cdot \tilde{u}_k(t), \qquad u(t+1) = \mathrm{NL}\left[u'(t)\right],$$
$$\tilde{u}_k(t) = \frac{1}{N} \sum_{j=0}^{N-1} u_j(t) \exp(-2\pi i\, jk/N).$$
$$\mathrm{NL}(x) = 4\alpha x(1-x) + \beta,$$
$$u = h * u',$$
$$u' = \mathrm{NL}(u).$$
$$\tilde{h}_k = \frac{\tilde{u}_k}{\tilde{u}'_k}.$$
$$T_{ij} = \delta_{ij}, \qquad h_i = \delta_{i0}.$$
$$\tilde{h}_k = \begin{cases} 1 & \text{if } |\tilde{u}_k| \geq s \\ d & \text{otherwise,} \end{cases} \qquad \text{with } 0 \leq d \leq 1.$$
$$\tilde{h}_k = \Theta\!\left(1 - \frac{c}{|\tilde{u}_k|}\right), \qquad \Theta(x) = \begin{cases} x & \text{if } x \geq 0 \\ 0 & \text{otherwise.} \end{cases}$$
$$\tilde{u}'_k = \tilde{u}_k \cdot \tilde{h}_k = \frac{\tilde{u}_k}{|\tilde{u}_k|} \cdot \Theta\left(|\tilde{u}_k| - c\right).$$
$$\sum_{k=0}^{N-1} |u'_k - u_k|^2 \leq N c^2.$$
$$\tilde{n}'_k = \tilde{n}_k \cdot \tilde{h}_k = \frac{\tilde{n}_k}{|\tilde{n}_k|} \cdot \Theta\!\left(|\tilde{n}_k| - c\,\frac{|\tilde{n}_k|}{|\tilde{u}_k|}\right).$$
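One of the filter calculation rules above sets $\tilde{h}_k = \Theta(1 - c/|\tilde{u}_k|)$, which implies $\tilde{u}_k \tilde{h}_k = (\tilde{u}_k/|\tilde{u}_k|)\,\Theta(|\tilde{u}_k| - c)$. That algebraic identity can be checked numerically; in this sketch the stored pattern and the threshold $c$ are illustrative values, not taken from the paper:

```python
import numpy as np

# Ramp function Theta(x) = x for x >= 0, else 0.
theta = lambda x: np.where(x >= 0.0, x, 0.0)

# Illustrative stored pattern and threshold (assumed values).
u = np.random.default_rng(2).random(128)
u_spec = np.fft.fft(u) / len(u)          # DFT with the 1/N convention above
mag = np.abs(u_spec)
c = 0.01

# Transfer function from the threshold rule, and the filtered spectrum.
h_spec = theta(1.0 - c / mag)
filtered = h_spec * u_spec

# Equivalent closed form: (u~_k / |u~_k|) * Theta(|u~_k| - c).
alt = u_spec / mag * theta(mag - c)
```

Spectral components weaker than $c$ are suppressed entirely, while stronger components keep their phase and lose exactly an amplitude $c$, which is what bounds the distortion of the stored pattern.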
