Abstract

We propose a simple all-in-line single-shot scheme for diagnostics of ultrashort laser pulses, consisting of a multimode fiber, a nonlinear crystal, and a camera. The system records a 2D spatial intensity pattern, from which the pulse shape (amplitude and phase) is recovered by a fast deep-learning algorithm. We explore this scheme in simulations and demonstrate the recovery of ultrashort pulses, as well as robustness to noise in the measurements and to inaccuracies in the parameters of the system components. Our technique obviates the need for the commonly used iterative-optimization reconstruction methods, which are usually slow and hampered by the presence of noise. These features make our concept system advantageous for real-time probing of ultrafast processes and for operation under noisy conditions. Moreover, this work exemplifies how deep learning can unlock new types of systems for pulse recovery.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement


References


  1. S. X. Hu and L. A. Collins, “Attosecond Pump Probe: Exploring Ultrafast Electron Motion inside an Atom,” Phys. Rev. Lett. 96(7), 073004 (2006).
    [Crossref]
  2. W. Demtroder, “Laser Spectroscopy: Basic Concepts and Instrumentation, Second Enlarged Edition,” Opt. Eng. 35(11), 3361 (1996).
    [Crossref]
  3. K. E. Sheetz and J. Squier, “Ultrafast optics: Imaging and manipulating biological systems,” J. Appl. Phys. 105(5), 051101 (2009).
    [Crossref]
  4. R. Trebino, Frequency-Resolved Optical Gating: The Measurement of Ultrashort Laser Pulses (Springer US, 2000), Vol. 62.
  5. M. Miranda, C. L. Arnold, T. Fordell, F. Silva, B. Alonso, R. Weigand, A. L’Huillier, and H. Crespo, “Characterization of broadband few-cycle laser pulses with the d-scan technique,” Opt. Express 20(17), 18732 (2012).
    [Crossref]
  6. N. C. Geib, M. Zilk, T. Pertsch, and F. Eilenberger, “Common pulse retrieval algorithm: a fast and universal method to retrieve ultrashort pulses,” Optica 6(4), 495 (2019).
    [Crossref]
  7. A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks,” Adv. Neural Inf. Process. Syst. 60(6), 87–90 (2017).
    [Crossref]
  8. A. Esteva, B. Kuprel, R. A. Novoa, J. Ko, S. M. Swetter, H. M. Blau, and S. Thrun, “Dermatologist-level classification of skin cancer with deep neural networks,” Nature 542(7639), 115–118 (2017).
    [Crossref]
  9. T. Zahavy, A. Dikopoltsev, D. Moss, G. I. Haham, O. Cohen, S. Mannor, and M. Segev, “Deep learning reconstruction of ultrashort pulses,” Optica 5(5), 666–673 (2018).
    [Crossref]
  10. S. Kleinert, A. Tajalli, T. Nagy, and U. Morgner, “Rapid phase retrieval of ultrashort pulses from dispersion scan traces using deep neural networks,” Opt. Lett. 44(4), 979 (2019).
    [Crossref]
  11. J. White and Z. Chang, “Attosecond streaking phase retrieval with neural network,” Opt. Express 27(4), 4799 (2019).
    [Crossref]
  12. P. Vincent, H. Larochelle, I. Lajoie, Y. Bengio, and P. A. Manzagol, “Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion,” J. Mach. Learn. Res. 11, 3371–3408 (2010).
  13. C. Horn, M. Wollenhaupt, M. Krug, T. Baumert, R. de Nalda, and L. Bañares, “Adaptive control of molecular alignment,” Phys. Rev. A 73(3), 031401 (2006).
    [Crossref]
  14. Y.-H. Chen, S. Varma, I. Alexeev, and H. Milchberg, “Measurement of transient nonlinear refractive index in gases using xenon supercontinuum single-shot spectral interferometry,” Opt. Express 15(12), 7458 (2007).
    [Crossref]
  15. P. O’Shea, M. Kimmel, X. Gu, and R. Trebino, “Highly simplified device for ultrashort-pulse measurement,” Opt. Lett. 26(12), 932 (2001).
    [Crossref]
  16. D. Fabris, W. Holgado, F. Silva, T. Witting, J. W. G. Tisch, and H. Crespo, “Single-shot implementation of dispersion-scan for the characterization of ultrashort laser pulses,” Opt. Express 23(25), 32803 (2015).
    [Crossref]
  17. C. Iaconis and I. A. Walmsley, “Spectral phase interferometry for direct electric-field reconstruction of ultrashort optical pulses,” Opt. Lett. 23(10), 792 (1998).
    [Crossref]
  18. B. Redding and H. Cao, “Using a multimode fiber as a high-resolution, low-loss spectrometer,” Opt. Lett. 37(16), 3384–3386 (2012).
    [Crossref]
  19. K. Okamoto, Fundamentals of Optical Waveguides (2006), Chap. 3.
  20. D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by back-propagating errors,” Nature 323(6088), 533–536 (1986).
    [Crossref]
  21. Y. LeCun, K. Kavukcuoglu, and C. Farabet, “Convolutional networks and applications in vision,” in ISCAS 2010 - 2010 IEEE International Symposium on Circuits and Systems: Nano-Bio Circuit Fabrics and Systems (2010), pp. 253–256.
  22. D. P. Kingma and J. L. Ba, “Adam: A method for stochastic optimization,” in International Conference on Learning Representations (ICLR), 1–15 (2015).
  23. P. Sidorenko, O. Lahav, Z. Avnat, and O. Cohen, “Ptychographic reconstruction algorithm for frequency-resolved optical gating: super-resolution and supreme robustness,” Optica 3(12), 1320 (2016).
    [Crossref]
  24. C. Bourassin-Bouchet and M.-E. Couprie, “Partially coherent ultrafast spectrography,” Nat. Commun. 6(1), 6465 (2015).
    [Crossref]
  25. G. I. Haham, P. Sidorenko, O. Lahav, and O. Cohen, “Multiplexed FROG,” Opt. Express 25(26), 33007 (2017).
    [Crossref]
  26. I. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, “Generative Adversarial Nets,” Adv. Neural Inf. Process. Syst. 27, 2672–2680 (2014).
  27. W. Xiong, B. Redding, S. Gertler, Y. Bromberg, H. Tagare, and H. Cao, “Deep learning of ultrafast pulses with a multimode fiber,” arXiv:1911.00649 (2019).



Figures (5)

Fig. 1. Proposed system for ultrafast pulse measurement. The ultrashort pulse is passed through a single mode (SM) fiber, and subsequently through a coupler that couples to multiple modes of a multimode fiber. The resultant spatio-temporal pulse exiting the multimode fiber is passed through a nonlinear crystal (NLC) and then through a high-pass frequency filter (HPF) that passes only the sum-frequencies (blocking the frequencies associated with the spectrum of the original pulse). The ensuing interference pattern is directly imaged onto a camera.
Fig. 2. (A) Regression network architecture: four CNN layers followed by three fully connected layers (a hedged code sketch of such a network is given after the figure captions). The input to this network is a sum-frequency interference pattern, which is passed through the computational layers until a final output is produced in the form of a vector of the real and imaginary parts of the temporal electric field. (B) Block diagram of the sum-frequency interference measurement and of the label generation from an input spectral pulse, as described in Sections 2 and 3. The simulated interference pattern is passed on to the regression network. (C) Supervised training of the regression network. Each interference pattern is passed through the network to create a reconstruction. The error between a reconstructed pulse and its ground-truth pulse shape is used, via backpropagation and gradient descent, to update the network parameters.
Fig. 3. Three examples of pulse reconstruction of varying complexity (top: simplest; middle: intermediate; bottom: most complex). (A) Temporal amplitude and phase of the original and reconstructed pulses. (B) The nonlinear interference pattern of the original pulse and of its reconstructed counterpart.
Fig. 4. (A) Error as a function of SNR for the reconstruction of 30 pulses using ptychographic reconstruction of FROG measurements (black line) and using the two DeepMMF models: one trained with additive white Gaussian noise (red line) and the other trained without noise (blue line), both trained on measurements of pulses in our system. The error bars are $\sigma_{STD}$ of the error. (B) Reconstruction examples for one pulse, using the two reconstruction algorithms at two SNR values. Left column: DeepMMF reconstruction; right column: ptychography. Top row: 20 dB SNR; bottom row: 40 dB SNR.
Fig. 5. Probability distribution of the normalized root-mean-square error (NRMSE) of the reconstructions for the three test sets, evaluated on 7000 pulses.
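As a concrete illustration of Fig. 2, the following is a minimal PyTorch sketch of a regression network with four convolutional layers followed by three fully connected layers, ReLU activations (Eq. (7) below), an L1 loss between the predicted and ground-truth field vectors (Eq. (6) below), and the Adam optimizer [22]. The layer widths, kernel sizes, input resolution, and pulse length are illustrative placeholders, not the authors' exact hyperparameters.

```python
import torch
import torch.nn as nn

class PulseRegressionNet(nn.Module):
    """Sketch of a Fig. 2(A)-style regressor: 4 CNN layers + 3 fully connected layers."""
    def __init__(self, n_t=128):
        super().__init__()
        # Four convolutional layers acting on the 1-channel k x k interference pattern.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Three fully connected layers; output = [Re E(t), Im E(t)] of length 2*n_t.
        self.regressor = nn.Sequential(
            nn.Linear(64 * 4 * 4, 512), nn.ReLU(),
            nn.Linear(512, 512), nn.ReLU(),
            nn.Linear(512, 2 * n_t),
        )

    def forward(self, x):
        return self.regressor(self.features(x).flatten(1))

# One supervised training step as in Fig. 2(C): L1 loss, backpropagation, Adam update.
model = PulseRegressionNet(n_t=128)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

patterns = torch.randn(8, 1, 64, 64)   # placeholder batch of simulated camera patterns
targets = torch.randn(8, 2 * 128)      # placeholder ground-truth [Re, Im] field vectors

optimizer.zero_grad()
loss = loss_fn(model(patterns), targets)
loss.backward()
optimizer.step()
print(float(loss))
```

In actual training (Fig. 2(C)), the random placeholder tensors above would be replaced by simulated interference patterns and their ground-truth field vectors.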

Equations (7)


$$T(r,\theta,z,\omega)=\sum_{l,m}\alpha_{l,m}(\omega)\,E_{l,m}(r,\theta,z,\omega)\tag{1}$$

$$E(r,\theta,z,\omega)=E_{\mathrm{input}}(\omega)\,T(r,\theta,z,\omega)=A(\omega)e^{i\phi(\omega)}\sum_{l,m}\alpha_{l,m}(\omega)\,E_{l,m}(r,\theta,z,\omega)\tag{2}$$

$$M_{\mathrm{nonlinear}}(r,\theta)=\int I^{2}(r,\theta,z=L,t)\,dt=\int\left|E(r,\theta,z=L,t)\right|^{4}dt\tag{3}$$

$$M_{\mathrm{nonlinear}}(r,\theta)=\int\left|\int E(r,\theta,z=L,\omega)\,e^{i\omega t}\,d\omega\right|^{4}dt=\int\left|\int A(\omega)e^{i\phi(\omega)}\,T(r,\theta,z=L,\omega)\,e^{i\omega t}\,d\omega\right|^{4}dt\tag{4}$$

$$\Gamma:\;E(\omega)\mapsto M(r,\theta),\qquad E(\omega)\in\mathbb{C}^{n},\quad M(r,\theta)\in\mathbb{R}^{k\times k}\tag{5}$$

$$w=\arg\min_{w}\left\|\mathrm{DNN}(I;w)-E(t)\right\|_{1}\tag{6}$$

$$f(x)=\max(0,x)\tag{7}$$
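Below is a minimal numerical sketch of the forward model of Eqs. (1)–(4): an input spectral field $A(\omega)e^{i\phi(\omega)}$ is decomposed onto a few fiber modes, each mode accumulates its own spectral phase along the fiber, and the camera records the time-integrated squared intensity at each transverse position. The number of modes, mode profiles, dispersion coefficients, and the SNR convention for the added noise are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

# --- spectral grid and an example input pulse A(w)*exp(i*phi(w)) ---
N = 256
w = np.linspace(-1.0, 1.0, N)                  # normalized angular-frequency offset
A = np.exp(-(w / 0.3) ** 2)                    # Gaussian spectral amplitude (placeholder)
phi = 5.0 * w ** 2 + 2.0 * w ** 3              # example quadratic + cubic spectral phase
E_in = A * np.exp(1j * phi)                    # E_input(omega)

# --- toy multimode fiber: each mode has a spatial profile and its own spectral phase ---
n_modes = 6
L = 1.0                                        # normalized fiber length
grid = 64                                      # camera resolution (k x k)
x = np.linspace(-1, 1, grid)
X, Y = np.meshgrid(x, x)

rng = np.random.default_rng(0)
betas = rng.uniform(20.0, 60.0, n_modes)       # per-mode delay/dispersion coefficients (placeholders)
profiles = [np.exp(-(X ** 2 + Y ** 2)) * np.cos(m * np.arctan2(Y, X)) for m in range(n_modes)]

# --- propagate: E(x,y,L,w) = E_in(w) * sum_m alpha_m(w) * E_m(x,y)   (Eqs. (1)-(2)) ---
E_out_w = np.zeros((grid, grid, N), dtype=complex)
for m in range(n_modes):
    alpha_m = np.exp(1j * betas[m] * w + 0.5j * betas[m] * w ** 2 * L)  # modal spectral phase
    E_out_w += profiles[m][:, :, None] * (alpha_m * E_in)[None, None, :]

# --- to the time domain, then the camera signal M(x,y) = int |E|^4 dt   (Eqs. (3)-(4)) ---
E_out_t = np.fft.ifft(np.fft.ifftshift(E_out_w, axes=-1), axis=-1)
M = np.sum(np.abs(E_out_t) ** 4, axis=-1)      # time-integrated squared intensity per pixel

# Optional: additive white Gaussian noise at a chosen SNR (one common convention),
# mimicking the robustness study of Fig. 4.
snr_db = 30.0
noise_power = np.mean(M ** 2) / (10 ** (snr_db / 10))
M_noisy = M + rng.normal(0.0, np.sqrt(noise_power), M.shape)

print(M.shape)   # (64, 64) pattern of the kind fed to the regression network of Fig. 2
```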
