Abstract

Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as its nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.

© 2012 OSA


References


  1. H. Jaeger, “The ‘echo state’ approach to analysing and training recurrent neural networks,” GMD Report 148 (German National Research Center for Information Technology, 2001).
  2. H. Jaeger, “Short term memory in echo state networks,” GMD Report 152 (German National Research Center for Information Technology, 2001).
  3. W. Maass, T. Natschläger, and H. Markram, “Real-time computing without stable states: a new framework for neural computation based on perturbations,” Neural Comput. 14(11), 2531–2560 (2002).
  4. H. Jaeger and H. Haas, “Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication,” Science 304(5667), 78–80 (2004).
  5. J. J. Steil, “Backpropagation-decorrelation: online recurrent learning with O(N) complexity,” in Proceedings of the IEEE International Joint Conference on Neural Networks (IEEE, 2004), pp. 843–848.
  6. R. Legenstein and W. Maass, “What makes a dynamical system computationally powerful?” in New Directions in Statistical Signal Processing: From Systems to Brain (MIT Press, 2005), pp. 127–154.
  7. D. Verstraeten, B. Schrauwen, M. D’Haene, and D. Stroobandt, “An experimental unification of reservoir computing methods,” Neural Netw. 20(3), 391–403 (2007).
  8. W. Maass, P. Joshi, and E. D. Sontag, “Computational aspects of feedback in neural circuits,” PLOS Comput. Biol. 3(1), e165 (2007).
  9. H. Jaeger, M. Lukoševičius, D. Popovici, and U. Siewert, “Optimization and applications of echo state networks with leaky-integrator neurons,” Neural Netw. 20(3), 335–352 (2007).
  10. D. V. Buonomano and W. Maass, “State-dependent computations: spatiotemporal processing in cortical networks,” Nat. Rev. Neurosci. 10(2), 113–125 (2009).
  11. M. Lukoševičius and H. Jaeger, “Reservoir computing approaches to recurrent neural network training,” Comput. Sci. Rev. 3(3), 127–149 (2009).
  12. B. Hammer, B. Schrauwen, and J. J. Steil, “Recent advances in efficient learning of recurrent networks,” in Proceedings of the European Symposium on Artificial Neural Networks (2009), pp. 213–216.
  13. F. Triefenbach, A. Jalalvand, B. Schrauwen, and J. Martens, “Phoneme recognition with large hierarchical reservoirs,” Adv. Neural Inf. Process. Syst. 23, 1–9 (2010).
  14. F. Wyffels and B. Schrauwen, “A comparative study of reservoir computing strategies for monthly time series prediction,” Neurocomputing 73(10–12), 1958–1964 (2010).
  15. M. Lukoševičius, H. Jaeger, and B. Schrauwen, “Reservoir computing trends,” KI - Künstliche Intelligenz (2012), pp. 1–7.
  16. C. Fernando and S. Sojakka, “Pattern recognition in a bucket,” in Proceedings of the 7th European Conference on Artificial Life, W. Banzhaf, J. Ziegler, T. Christaller, P. Dittrich, and J. Kim, eds. (2003), Vol. 2801, pp. 588–597.
  17. F. Schürmann, K. Meier, and J. Schemmel, “Edge of chaos computation in mixed-mode VLSI - a hard liquid,” in Proceedings of Advances in Neural Information Processing Systems, L. K. Saul, Y. Weiss, and L. Bottou, eds. (MIT Press, 2005).
  18. Y. Paquot, B. Schrauwen, J. Dambre, M. Haelterman, and S. Massar, “Reservoir computing: a photonic neural network for information processing,” Proc. SPIE 7728, 77280B (2010).
  19. A. Rodan and P. Tino, “Simple deterministically constructed recurrent neural networks,” in Intelligent Data Engineering and Automated Learning (IDEAL 2010), pp. 267–274.
  20. A. Rodan and P. Tino, “Minimum complexity echo state network,” IEEE Trans. Neural Netw. 22(1), 131–144 (2011).
  21. L. Appeltant, M. C. Soriano, G. Van der Sande, J. Danckaert, S. Massar, J. Dambre, B. Schrauwen, C. R. Mirasso, and I. Fischer, “Information processing using a single dynamical node as complex system,” Nat. Commun. 2, 468 (2011). http://www.nature.com/ncomms/journal/v2/n9/full/ncomms1476.html
  22. Y. Paquot, F. Duport, A. Smerieri, J. Dambre, B. Schrauwen, M. Haelterman, and S. Massar, “Optoelectronic reservoir computing,” Sci. Rep. 2, 287 (2012). http://www.nature.com/srep/2012/120227/srep00287/full/srep00287.html
  23. L. Larger, M. C. Soriano, D. Brunner, L. Appeltant, J. M. Gutierrez, L. Pesquera, C. R. Mirasso, and I. Fischer, “Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing,” Opt. Express 20(3), 3241–3249 (2012). http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-20-3-3241
  24. K. Vandoorne, W. Dierckx, B. Schrauwen, D. Verstraeten, R. Baets, P. Bienstman, and J. Van Campenhout, “Toward optical signal processing using photonic reservoir computing,” Opt. Express 16(15), 11182–11192 (2008).
  25. K. Vandoorne, J. Dambre, D. Verstraeten, B. Schrauwen, and P. Bienstman, “Parallel reservoir computing using optical amplifiers,” IEEE Trans. Neural Netw. 22(9), 1469–1481 (2011).
  26. J. Dambre, D. Verstraeten, B. Schrauwen, and S. Massar, “Information processing capacity of dynamical systems,” Sci. Rep. 2, 514 (2012).
  27. H. Jaeger, “Adaptive nonlinear system identification with echo state networks,” Adv. Neural Inf. Process. Syst. 8, 593–600 (2002).
  28. V. J. Mathews, “Adaptive algorithms for bilinear filtering,” Proc. SPIE 2296(1), 317–327 (1994).
  29. The McMaster University IPIX radar database, http://soma.ece.mcmaster.ca/ipix/dartmouth/datasets.html
  30. D. Verstraeten, B. Schrauwen, and D. Stroobandt, “Isolated word recognition using a liquid state machine,” in Proceedings of the 13th European Symposium on Artificial Neural Networks (ESANN) (2005), pp. 435–440.
  31. Texas Instruments-Developed 46-Word Speaker-Dependent Isolated Word Corpus (TI46), September 1991, NIST Speech Disc 7-1.1 (1 disc) (1991).
  32. R. Lyon, “A computational model of filtering, detection, and compression in the cochlea,” in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (1982), pp. 1282–1285.


Figures (6)

Fig. 1

Schematic of the experimental set-up of the all-optical reservoir. The reservoir consists of off-the-shelf fiber-optic components operating in the telecommunication C-band and is based on a nonlinear all-optical loop operating in the incoherent regime. Optical components are shown in red, electronic components in green. The all-optical loop is driven by the input optical signal. A Superluminescent Light Emitting Diode (SLED) generates broadband white light. An electronic signal corresponding to the time-dependent input multiplied by the input mask is generated by the Arbitrary Waveform Generator (AWG). This electronic signal drives an integrated Lithium Niobate Mach-Zehnder intensity modulator (MZ), thereby producing a time-dependent input optical signal whose intensity is adjusted with a variable attenuator. The input optical signal is injected into the cavity by means of a 50/50 fiber coupler. The cavity itself consists of an isolator, a Semiconductor Optical Amplifier (SOA), a variable optical attenuator, and a fiber spool that acts as a delay line. The cavity operates below the lasing threshold. A 90/10 fiber coupler sends 10% of the cavity intensity to a readout photodiode and then to a digitizer. Coarse Wavelength Division Multiplexers (CWDM, boxes with waves) select a wavelength band near the emission maximum of the SLED and the gain maximum of the SOA.

Fig. 2

Incoherent light amplification characteristics of the SOA. Left panel: output power versus input power. The response is almost linear at low input powers and bends over at input powers above the saturation level. The overall output power increases with increasing injection current. Due to spontaneous emission, the output power is nonzero even at zero input power. From bottom to top, the injection current is a) 160 mA, b) 200 mA, c) 240 mA, d) 300 mA, e) 340 mA, f) 400 mA, and g) 520 mA. (The hysteresis is due to the high-pass filter of the modulator’s driver used to record these curves.) Right panel: normalized input-output relation. The data are the same as in the left panel, but the input and output powers are normalized to lie in the interval [0,1]. It is this function that characterizes the nonlinearity of the reservoir, and it can be tuned by changing the pump current.
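For readers who want to reproduce the shape of these curves numerically, the short Python sketch below models the SOA as a simple homogeneously saturable gain with a constant spontaneous-emission floor, then applies the same [0,1] normalization used for the right panel. The model and all parameter values (small-signal gain, saturation power, ASE floor) are illustrative assumptions, not fitted values for the device used in the experiment.

```python
import numpy as np

def soa_output(p_in, g0=100.0, p_sat=0.2e-3, p_ase=5e-6):
    """Toy model of incoherent amplification in an SOA: a homogeneously
    saturable gain plus a constant amplified-spontaneous-emission floor.
    All parameter values are illustrative, not measured ones."""
    return g0 * p_in / (1.0 + p_in / p_sat) + p_ase

# Sweep the input power and normalize input and output to [0, 1],
# as done for the right panel of Fig. 2.
p_in = np.linspace(0.0, 1e-3, 200)            # input power (W)
p_out = soa_output(p_in)

p_in_norm = (p_in - p_in.min()) / (p_in.max() - p_in.min())
p_out_norm = (p_out - p_out.min()) / (p_out.max() - p_out.min())

# The normalized curve plays the role of the reservoir nonlinearity:
# nearly linear at low power, bending over above saturation.
```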

Fig. 3

Example response of the all-optical reservoir. The amount of light injected into the reservoir is kept constant (normalized value 0.5). At time t = 0, the input signal (blue curve) is momentarily set to 1, then to 0, before returning to its initial value of 0.5. The response of the reservoir is recorded in green. The nonlinearity of the reservoir, due to the saturation of the SOA, is clear from the initial response (the positive excursion is smaller than the negative one). The fading memory of the all-optical system is also apparent. Experimental parameters: injection current 187 mA, feedback gain −1.6 dB, and average input power (rescaled to its value at the entrance of the SOA) of 350.8 µW (−4.55 dBm).

Fig. 4

Linear, quadratic, and cross memory capacities for the all-optical reservoir, as a function of the feedback gain α and the input gain β, for an injection current of 187 mA. Note that the color scale is different for each capacity. The + signs locate the optimal working points for the channel equalization task at the different signal-to-noise ratios. The × signs locate the optimal working points for the radar task at low sea state over the 10 prediction delays (some delays have identical optimal working points). This shows that the capacities are correlated with, but do not fully explain, the optimal working points.
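For completeness, the sketch below illustrates one simple way to estimate such memory capacities from recorded (or simulated) reservoir states, in the spirit of [26]: a linear readout is trained by least squares to reconstruct delayed inputs u(n−k), their squares, and products of two delayed inputs, and each capacity is taken as the squared correlation between the readout output and the target, summed over delays. This is a simplified score, not the exact estimator of [26] (which in particular evaluates on held-out data and corrects for estimation bias); the arrays `states` and `u` are placeholders for experimental time series.

```python
import numpy as np

def capacity(states, target):
    """Capacity for one target function: train a linear readout by least
    squares and return the squared correlation between readout and target
    (1 = perfect reconstruction, 0 = no information)."""
    X = np.hstack([states, np.ones((len(states), 1))])    # add bias column
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    c = np.corrcoef(X @ w, target)[0, 1]
    return 0.0 if np.isnan(c) else c**2

def memory_capacities(states, u, max_delay=20):
    """Linear, quadratic, and cross memory capacities summed over delays.
    `states` has shape (T, N); `u` is the input sequence of length T."""
    lin = quad = cross = 0.0
    for k in range(1, max_delay + 1):
        target = u[:-k]                          # u(n - k)
        lin += capacity(states[k:], target)
        quad += capacity(states[k:], target**2)
    for k1 in range(1, max_delay + 1):
        for k2 in range(k1 + 1, max_delay + 1):
            target = u[k2 - k1:-k1] * u[:-k2]    # u(n - k1) * u(n - k2)
            cross += capacity(states[k2:], target)
    return lin, quad, cross
```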

Fig. 5

Performance of the all-optical reservoir on the channel equalization task with 50 nodes at a 187 mA pump current (Optical RC), compared with a simulation of the same system (Simulation) and with an optoelectronic reservoir of 50 nodes (OptoElectronic RC). Error bars are statistical.

Fig. 6

Prediction of the radar signal with a reservoir of 50 nodes at a pump current of 187 mA (Optical RC) and with the 50-node optoelectronic reservoir reported in [22] (OptoElectronic RC).

Tables (1)


Table 1. Linear, quadratic, cross, and total memory capacities of the system with 50 internal variables and a pump current of 187 mA, compared to the memory capacities of an optoelectronic reservoir with the same number of internal variables. Each quantity is reported for the optimal choice of the feedback gain α and the input gain β. The last line gives the sum of the first three quantities, once again maximized over α and β.

Equations (5)


(1)  x_i(n) = F_{\mathrm{NL}}\!\left( \sum_{j=1}^{N} \alpha A_{ij}\, x_j(n-1) + \beta m_i u(n) \right), \qquad i = 1, 2, \ldots, N

(2)  \hat{y}(n) = \sum_{i} W_i x_i(n)

(3)  \mathrm{NMSE} = \dfrac{\left\langle \left( y(n) - \hat{y}(n) \right)^2 \right\rangle_n}{\left\langle \left( y(n) - \langle y \rangle_n \right)^2 \right\rangle_n}

(4)  x_i(n) = \begin{cases} F_{\mathrm{NL}}\!\left( \alpha x_{i-1}(n-1) + \beta m_i u(n) \right) + \nu_i(n), & 2 \le i \le N \\ F_{\mathrm{NL}}\!\left( \alpha x_{N+i-1}(n-2) + \beta m_i u(n) \right) + \nu_i(n), & i = 1 \end{cases}

(5)  q(n) = 0.08\, d(n+2) - 0.12\, d(n+1) + d(n) + 0.18\, d(n-1) - 0.1\, d(n-2) + 0.091\, d(n-3) - 0.05\, d(n-4) + 0.04\, d(n-5) + 0.03\, d(n-6) + 0.01\, d(n-7)
     u(n) = q(n) + 0.036\, q^2(n) - 0.011\, q^3(n) + \text{noise}
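To make these equations concrete, the following minimal Python sketch simulates a delay-line reservoir of the form of Eq. (4) on the channel equalization task of Eq. (5), trains the linear readout of Eq. (2) by ridge regression, and reports the NMSE of Eq. (3) together with the symbol error rate. The saturating nonlinearity (here a tanh stand-in for the SOA response), the noise level, and the gain and mask values are illustrative assumptions, not the measured characteristics of the experimental loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Channel equalization task, Eq. (5) ---
def make_channel_data(T, snr_db=28):
    d = rng.choice([-3.0, -1.0, 1.0, 3.0], size=T + 10)   # transmitted symbols
    n = np.arange(7, T + 7)                                # room for d(n+2)..d(n-7)
    q = (0.08 * d[n + 2] - 0.12 * d[n + 1] + d[n] + 0.18 * d[n - 1]
         - 0.1 * d[n - 2] + 0.091 * d[n - 3] - 0.05 * d[n - 4]
         + 0.04 * d[n - 5] + 0.03 * d[n - 6] + 0.01 * d[n - 7])
    u = q + 0.036 * q**2 - 0.011 * q**3
    noise_power = np.mean(u**2) / 10**(snr_db / 10)
    u = u + rng.normal(scale=np.sqrt(noise_power), size=T)
    return u, d[n]                                         # input u(n), target d(n)

# --- Delay-line reservoir, Eq. (4), with an illustrative saturating F_NL ---
def run_reservoir(u, N=50, alpha=0.8, beta=0.5, noise=1e-4):
    m = rng.uniform(-1.0, 1.0, N)                          # input mask m_i
    f_nl = np.tanh                                         # stand-in for the SOA saturation
    T = len(u)
    x = np.zeros((T, N))
    for t in range(2, T):
        x[t, 0] = f_nl(alpha * x[t - 2, N - 1] + beta * m[0] * u[t])   # i = 1 case
        for i in range(1, N):
            x[t, i] = f_nl(alpha * x[t - 1, i - 1] + beta * m[i] * u[t])
        x[t] += noise * rng.normal(size=N)                 # state noise nu_i(n)
    return x

# --- Readout (Eq. (2)) trained by ridge regression, scored by Eq. (3) ---
u, d = make_channel_data(6000)
x = run_reservoir(u)
X = np.hstack([x, np.ones((len(x), 1))])                   # states plus bias
train, test = slice(100, 3000), slice(3000, 6000)
ridge = 1e-6
W = np.linalg.solve(X[train].T @ X[train] + ridge * np.eye(X.shape[1]),
                    X[train].T @ d[train])
y_hat = X[test] @ W
nmse = np.mean((d[test] - y_hat)**2) / np.var(d[test])
symbols = np.array([-3.0, -1.0, 1.0, 3.0])
d_hat = symbols[np.argmin(np.abs(y_hat[:, None] - symbols[None, :]), axis=1)]
ser = np.mean(d_hat != d[test])
print(f"NMSE = {nmse:.3f}, symbol error rate = {ser:.4f}")
```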
