
Temporal depth imaging


Abstract

Temporal optics is an emerging field in which optical signals are treated similarly to objects in spatial optics. Indeed, temporal magnification, temporal Fourier transforms, and temporal signal processing have been demonstrated by adapting optical schemes from space to time. However, temporal imaging has so far focused on the equivalent of two-dimensional spatial imaging schemes, while ignoring the depth of the input. Here we adapt the concept of three-dimensional (3D) imaging to the time domain, providing a new dimension in temporal optics. We developed the concept of temporal 3D objects and demonstrated temporal depth imaging. First, we define signals with temporal depth as signals where each point in time has a different dispersion value. Next, we demonstrate how to measure these signals with a moving time lens. Finally, we present a time lens array and use it to realize temporal depth imaging with a single measurement. Our temporal depth imaging concept will enable measurements of ultrafast nonperiodic phenomena, such as optical rogue waves and the evolution of ultrafast pulses in fiber lasers, with a temporal resolution that was not possible until now.

© 2017 Optical Society of America

1. INTRODUCTION

Retrieving the depth information of objects with three-dimensional (3D) imaging systems has resulted in novel imaging devices. Different methods for 3D imaging and recording have been invented, including holography [1], light-field imaging and projection [2], parallax imaging [3], optical coherence tomography [4], and time-of-flight cameras [5]. Today, with the flourishing of 3D displays and virtual reality devices, different methods for 3D recording have been developed and implemented in smartphones and compact cameras [6]. In addition, with the spread of 3D printers, the need for replicating objects is spurring the development of different devices for obtaining depth information with high accuracy [7]. We developed the concept of non-flat temporal signals and adapted the depth imaging approach to the time domain.

The concept of temporal depth will open a new avenue in temporal optics, and temporal depth imaging systems will enable the investigation of the dynamics of ultrafast nonperiodic phenomena, such as optical rogue waves and ultrafast pulses in fiber lasers. Optical rogue waves in optical fibers result from the combined effects of nonlinearity and dispersion [8–10]. Thus, with a temporal depth imaging system, it is possible to investigate the evolution and dynamics of rogue waves with a temporal resolution that was not possible before. Also, the evolution of ultrafast pulses in fiber lasers strongly depends on the dispersion in the cavity [11]. Our temporal depth imaging will enable the investigation of the dynamics and evolution of pulses in the cavity, which may improve future ultrafast fiber lasers.

There is a mathematical equivalence between light diffraction in space and pulse dispersion in time, which arises from the similarity between the equations describing these two phenomena. The diffraction of the optical field E(x,z) is described by

\[ \frac{\partial E(x,z)}{\partial z} = \frac{i}{2k}\frac{\partial^2 E(x,z)}{\partial x^2}, \tag{1} \]

where k is the propagation parameter. The dispersion of a pulse envelope A(t,z) as it propagates in a material is described by

\[ \frac{\partial A(t,z)}{\partial z} = \frac{i\beta_2}{2}\frac{\partial^2 A(t,z)}{\partial t^2}, \tag{2} \]

where β2 is the group velocity dispersion of the material. This duality between light diffraction in space and pulse dispersion in time leads to time lenses, which are the temporal equivalent of a lens [12]. In spatial imaging, a lens imposes a quadratic phase in space. When placed at a distance u from an object, an image is created at a distance v from the lens according to 1/f = 1/u + 1/v. Therefore, the temporal equivalent of an imaging lens imposes a quadratic phase shift in time, which compensates for dispersion [12,13]. Thus, a temporal imaging system starts by propagating an input signal in a dispersive material, then imposes on it a quadratic phase shift in time, and finally propagates it in more dispersive material to reach the image plane. We focus on time lenses that impose the quadratic phase shift via a four-wave mixing process between a signal wave and a chirped pump wave. Such time lenses are highly robust and can impose large quadratic phase shifts on ultrafast signals [14–16]. These time lenses have been utilized for compressing or stretching signals [17], for time-to-frequency mapping [16,18], and for temporal cloaking [19]. However, the time lenses in all these schemes are the temporal equivalent of imaging flat objects. We developed a temporal depth imaging system that is the temporal equivalent of a 3D imaging system.
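The dispersion equation above can be propagated numerically to make the space-time duality concrete. The sketch below (all units and parameter values are illustrative choices of ours, not taken from the paper) solves the dispersion equation in the Fourier domain and checks the resulting Gaussian pulse broadening against the well-known analytic formula:

```python
import numpy as np

def propagate(A, t, beta2, z):
    """Propagate an envelope A(t) a distance z under the dispersion
    equation dA/dz = (i*beta2/2) * d^2A/dt^2, solved in Fourier space."""
    dt = t[1] - t[0]
    omega = 2 * np.pi * np.fft.fftfreq(len(t), dt)
    # d^2/dt^2 maps to -omega^2 in the Fourier domain
    return np.fft.ifft(np.fft.fft(A) * np.exp(-0.5j * beta2 * omega**2 * z))

# Gaussian input pulse; T0, beta2, z are illustrative, dimensionless values
T0, beta2, z = 1.0, 1.0, 3.0
t = np.linspace(-50, 50, 4096)
A0 = np.exp(-t**2 / (2 * T0**2))
Az = propagate(A0, t, beta2, z)

# Analytic Gaussian broadening: T(z) = T0*sqrt(1 + (beta2*z/T0**2)**2)
T_analytic = T0 * np.sqrt(1 + (beta2 * z / T0**2)**2)
I = np.abs(Az)**2
T_numeric = np.sqrt(np.sum(t**2 * I) / np.sum(I))  # RMS width = T(z)/sqrt(2)
```

The same spectral-phase bookkeeping is what a time lens manipulates: the quadratic temporal phase it imposes plays the role that a lens's quadratic spatial phase plays against diffraction.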

2. DEPTH IMAGING IN SPACE AND TIME

Any depth imaging system is based on the ability to distinguish between two object points that are separated along the z axis. One method to obtain this depth information is to utilize a lens array so that each lens images the object from a slightly different angle [20,21]. A schematic of this configuration is presented in Fig. 1, which shows that object points with different depths are imaged to different locations on the image plane as a function of the lens position. We denote the image plane distance from the lens array as v and the object plane distance as u. The axial offset of an object point from the object plane is denoted as Δu, which creates an image point at v+Δv, as presented in Fig. 1, obeying

\[ \frac{1}{u+\Delta u} + \frac{1}{v+\Delta v} = \frac{1}{f}, \tag{3} \]

where f is the focal distance of the lens. Since we are measuring the output at the plane v, the object point is blurred and transversely shifted, as illustrated in Fig. 1. The shift of the image point xi as a function of the lens position xl is written as

\[ \Delta x_i = x_l\frac{M_s\,\Delta u}{u+\Delta u}, \tag{4} \]

where Ms = v/u is the imaging magnification. By measuring Δxi from each lens in the lens array together with the position of each lens, we can retrieve the depth of each point and reconstruct the full depth information of the object.
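This retrieval amounts to inverting the shift relation for Δu. The helper below is a hypothetical sketch (the function name and geometry values are ours), assuming the shift follows Δx_i = x_l·M_s·Δu/(u+Δu) with M_s = v/u:

```python
def depth_from_shift(shift_slope, u, v):
    """Recover the axial offset du of an object point from the measured
    slope s = d(Delta x_i)/d(x_l) of image-point shift vs. lens position,
    by inverting s = M_s*du/(u + du) with magnification M_s = v/u."""
    Ms = v / u
    return shift_slope * u / (Ms - shift_slope)

# Round-trip check with made-up geometry: u = 100 mm, v = 200 mm, du = 5 mm
u, v, du = 100.0, 200.0, 5.0
s = (v / u) * du / (u + du)  # forward model: predicted shift slope
recovered = depth_from_shift(s, u, v)
```

Each lens in the array supplies one (x_l, Δx_i) pair, so the slope is estimated from a fit across the array rather than from a single lens.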


Fig. 1. Scheme of deviation in an image point position when shifting an object point from the object plane. The distance between the object point and lens axis is xl. The object point is shifted by Δu from the object plane u, which causes the image point to shift by Δv from the image plane v. Since we measure along the image plane, the image point is not in focus and is shifted upward by Δxi.


In temporal optics, free-space propagation is equivalent to dispersion. Therefore, the temporal equivalent of an object that is spread along the z axis is an input signal where each signal point in time acquires a different dispersion value. An illustration of an input signal composed of two pulses, where each pulse acquires a different dispersion value, is presented in Fig. 2. We start with a single pulse, presented in Fig. 2(a), which propagates in a dispersive material and is joined by another pulse in Fig. 2(b). The signal continues to propagate in the dispersive material until it reaches the time lens array in Fig. 2(c). Each time lens imposes a quadratic phase shift on the signal and generates an idler wave. The output idler wave is presented in Fig. 2(d). The idler propagates in additional dispersive material until it reaches the image plane in Fig. 2(e) where three images are formed, each with a different separation between the pulses denoted by τ1, τ2, and τ3. The difference between pulse separations results from the different timing of each time lens and enables us to deduce the dispersion acquired by each pulse. Thus, we designed a temporal depth imaging scheme.


Fig. 2. Schematics of a time lens array for temporal depth imaging where different points in the input signal acquire different dispersion values. (a) An input pulse travels in a dispersive fiber until (b) a point where a second pulse joins. Therefore, the signal is composed of both pulses with different dispersion values. The two pulses propagate in additional dispersive material and reach the time lens array in (c). Each time lens in the array imposes a quadratic phase shift on part of the signal. The output idler, presented in (d), propagates in the dispersive material until it reaches the image plane in (e) where three images are formed. From the separations between the two pulses in the three images, denoted as τ1, τ2, and τ3, we deduce the dispersion difference between the two pulses. Thus, by analyzing the difference between the images, we obtain the temporal depth information of the signal. (Bottom) Experimental setup of a time lens array for demonstrating depth imaging.


By measuring the delay between the features in the idler wave as a function of the time lens timing, we retrieve the acquired dispersion of each point in the signal. The timing of an idler feature τi as a function of time lens timing τl is written as

\[ \tau_i = \tau_l\frac{M_t\,D\Delta L_s}{D L_s + D\Delta L_s}, \tag{5} \]
where Mt is the magnification of the time lens, Ls is the fiber length of the signal before the time lens, ΔLs is the extra fiber beyond what is needed for the imaging condition, and D is the dispersion of the extra fiber. Thus, by comparing the different images from the time lens array, we reconstruct the input signal and the dispersion acquired by each point in the signal.
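This retrieval can be sketched numerically. In the snippet below we synthesize idler timings from the relation in Eq. (5), fit the separation derivative, and invert for the extra dispersion. The magnification Mt = 56 matches the experimental value, while D_Ls and the noise level are assumed, illustrative values:

```python
import numpy as np

# Illustrative parameters (Mt from the experiment; D_Ls is an assumed value)
Mt, D_Ls, D_dLs_true = 56.0, 6.4, 1.2   # magnification, ps/nm, ps/nm

# Synthetic measurement: idler timing vs. time-lens timing per Eq. (5),
# tau_i = tau_l * Mt * D_dLs / (D_Ls + D_dLs), plus a little noise
rng = np.random.default_rng(0)
tau_l = np.linspace(-9, 9, 19)  # time-lens timings, ps
tau_i = tau_l * Mt * D_dLs_true / (D_Ls + D_dLs_true) \
        + rng.normal(0, 0.5, tau_l.size)

slope = np.polyfit(tau_l, tau_i, 1)[0]   # separation derivative, ps/ps
D_dLs = slope * D_Ls / (Mt - slope)      # Eq. (5) solved for D*dLs
```

Solving Eq. (5) for the extra dispersion gives DΔLs = s·DLs/(Mt − s), where s is the fitted slope; the assumed numbers were chosen so the slope lands near the experimental 8.8 ps/ps.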

3. MEASURED RESULTS OF TEMPORAL DEPTH IMAGING

To demonstrate this concept, we first measured the change in a temporal image as a function of the timing of the time lens. The experimental setup is presented in Fig. 3. We start with a laser pulse of 70 fs and 85 mW (Toptica FemtoFErb 1560) and split it into a pump wave (1553 nm) and a signal wave (1565 nm). The signal wave is composed of two pulses of 4 ps, where one of the pulses acquires extra dispersion by passing through 80 m of single-mode fiber (SMF), denoted as SMFx. The signal is then imaged with a small-aperture time lens that has a temporal magnification of 56 [22]. We measured the separation between the two pulses in the imaged idler wave as a function of the time lens timing using a fast photodetector (Agilent 86116C) connected to a sampling scope (Agilent 86100D). This setup is the temporal equivalent of a spatial lens that is shifted transversely to obtain the depth information of an object.


Fig. 3. Schematic of temporal imaging of double pulses with tunable pump timing. We start with an ultrafast fiber laser and split the pulse into a signal and a pump wave. The dispersion compensating fibers (DCFs), denoted as DCFp and DCFs, and the SMFi are tailored for obtaining temporal imaging with a magnification of Mt=56. Specifically, DCFp is 348 m, DCFs is 160 m, and SMFi is 23 km. We impose an extra dispersion on one of the pulses with SMFx and measure the separation between the idler pulses as a function of the pump wave timing.


The measured results of the separation between the two pulses as a function of the time lens timing are presented in Fig. 4. We shift the timing of the time lens from −9 ps to +9 ps relative to the signal wave and extract the separation between the idler wave's pulses. Two representative results of the idler wave as a function of time are presented in the insets. The upper left inset presents the idler wave when the time lens is shifted by −9 ps relative to the input signal wave, showing a separation of 460 ps between the pulses. The lower right inset presents the measured idler wave when the time lens is shifted by +9 ps, showing a separation of 620 ps between the pulses. A linear fitted curve is presented as a red solid line with a slope of

\[ \frac{\Delta\tau_i}{\tau_l} = 8.8\ \mathrm{ps/ps}, \tag{6} \]

denoted as the separation derivative. This separation derivative of 8.8 ps/ps indicates that the dispersion difference between the two input pulses is DΔLs = 1.2 ps/nm according to Eq. (5), which corresponds to ΔLs = 80 m. Thus, we recovered the dispersion difference between points in the signal wave by comparing images from different timings of the time lens.


Fig. 4. Measured separation between the idler pulses as a function of the timing of the time lens when the dispersion difference between them corresponds to 80 m of SMF. The insets show representative measured idler waves for time lens timings of −9 ps and +9 ps. The red curve denotes a linear fitted line with a slope of 8.8 ps/ps, which indicates that the dispersion difference between the two input pulses is 1.26 ps/nm. The slope of the linear fitted curve is denoted as the separation derivative.


We repeated the measurements with different lengths of the extra fiber SMFx to investigate the sensitivity of our method to changes in signal dispersion. We replaced the 80 m SMF denoted as SMFx with SMF lengths ranging from 4 m to 140 m, and measured the separation in the idler wave pulses as a function of the time lens timing. We present these measured results in the inset in Fig. 5 together with fitting curves. Next, we evaluated the derivative of each curve in the inset and obtained the separation derivative as a function of the dispersion difference between the pulses, which is presented in Fig. 5 as blue asterisks. The results reveal a linear increase in the separation derivative, indicating that it is possible to distinguish between signal waves with dispersion differences from 0.2 ps/nm to 2.5 ps/nm. We also calculated the separation derivative without any fitting parameters and present it as a red curve in Fig. 5, which shows good agreement between the calculated and measured results.


Fig. 5. Separation derivative as a function of the dispersion difference between the pulses. Measured peak separations as functions of pump time delay for different SMF lengths are presented in the inset. The derivatives of these curves are denoted as the separation derivatives and are presented as blue asterisks. Calculated results are presented as the red solid curve.


Next, we designed a time lens array scheme for obtaining temporal depth information with a single measurement. In an array of time lenses, each image must be a compressed copy of the signal wave to prevent an overlap of adjacent images, as presented in Fig. 2. Thus, to read out the time lens array output, we resorted to time-to-frequency mapping. First, we redesigned the dispersion of the pump, signal, and idler waves in the time lens into a 2f system, which maps the temporal signal into the frequency domain so that the output can be measured with an optical spectrum analyzer. Specifically, we replaced DCFp with a 160 m dispersion compensating fiber (DCF) and DCFs with an 80 m DCF. This time lens maps two input pulses separated by Δτ into an idler wave with two spectral peaks separated by Δλ, according to [23]

\[ \Delta\lambda = \left(\frac{\lambda_i}{\lambda_s}\right)^2\frac{1}{DL}\,\Delta\tau, \tag{7} \]
where DL is the total dispersion before the time lens, and λs and λi are the signal and idler wavelengths, respectively. The timing of the time lens sets the central wavelength of the output idler [24]. Therefore, two small time lenses, each with a different timing, result in two images with two different central frequencies, which are resolved with an optical spectrum analyzer [22].
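Equation (7) can be inverted to recover the temporal separation from a measured spectral one. A minimal sketch follows; the idler wavelength and the dispersion product DL below are assumed, illustrative values, not taken from the setup:

```python
def spectral_to_temporal(d_lambda, lam_i, lam_s, DL):
    """Invert Eq. (7), d_lambda = (lam_i/lam_s)**2 * d_tau / (D*L),
    to map a measured idler spectral separation (nm) back to the input
    temporal separation (ps). DL is the total dispersion before the
    time lens (ps/nm)."""
    return d_lambda * DL * (lam_s / lam_i)**2

# Round trip with illustrative values (lam_i and DL are assumptions)
lam_s, lam_i, DL = 1565.0, 1541.0, 3.2   # nm, nm, ps/nm
d_tau = 20.0                              # ps
d_lam = (lam_i / lam_s)**2 * d_tau / DL   # forward mapping, Eq. (7)
recovered = spectral_to_temporal(d_lam, lam_i, lam_s, DL)
```

Because each time lens in the array has its own central idler wavelength, the (λi/λs)² factor must be evaluated per lens when comparing the images, as the measurement below illustrates.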

We measured the output idler spectrum of two pulses separated by 20 ps imaged with a lens array of two time lenses. First, we measured the spectrum for input pulses with the same dispersion and present the results in Fig. 6(a). Each image presents double pulses separated by Δλ1 = 6.1 nm and Δλ2 = 5 nm, which correspond to a time separation of 20 ps for both images according to Eq. (7) [18,25–27]. The difference between Δλ1 and Δλ2 arises from the difference in the central frequency of the two time lenses, namely, the term (λi/λs)² in Eq. (7). The second signal is composed of two pulses separated by 20 ps, but one pulse passes through 20 m more SMF than the other, which induces an extra dispersion of 0.36 ps/nm. The separation in the first image is 5.7 nm, corresponding to a temporal separation of 22 ps; the separation in the second image is 3 nm, which corresponds to a temporal separation of 11 ps. The timing separation of the pumps is 5 ps; hence, the separation derivative is 2 ps/ps, which corresponds to a dispersion difference of 0.4 ps/nm and agrees with the length of SMFx. Thus, we demonstrated that our time lens array can retrieve the dispersion of each point in the input signal by comparing the temporal images from the array. Another option to read out the output of the time lens array is to resort to a second stage of a time lens, which magnifies the output of the time lens array. The benefit of a two-stage time lens array over time-to-frequency mapping is that it can also image single-shot signals.


Fig. 6. Output spectra of a time lens array with two lenses imaging a double-pulse signal wave. (a) Measured spectrum when the two signal pulses have the same dispersion. (b) Measured spectrum when one signal pulse has 0.36 ps/nm more dispersion than the other. In both spectra, we identify the two images according to the time lens timing [22]. In (a), the spectral separation in the images is Δλ1 = 6.1 nm and Δλ2 = 5 nm, indicating a timing separation of 20 ps and no dispersion difference according to Eq. (7). In (b), the spectral separation is Δλ3 = 5.7 nm and Δλ4 = 3 nm, indicating timing separations of 22 ps and 11 ps, respectively, corresponding to a dispersion difference of 0.4 ps/nm between the pulses.


We considered two types of resolution in our system, namely, temporal resolution and depth resolution. Temporal resolution denotes the smallest temporal separation that can be resolved with our time lenses. Depth resolution denotes the smallest dispersion difference between two points that can be resolved with our time lens array. These two types of resolution impose conflicting demands on the time lens array. To improve the temporal resolution, we need wider time lenses, since the minimum temporal feature Δτmin as a function of the time lens width τp follows [22]

\[ \Delta\tau_{\min} = \left|\frac{2\pi\phi}{\tau_p}\right|, \tag{8} \]

where ϕ is the group delay dispersion of the signal wave. On the other hand, to improve the depth resolution, we need narrower time lenses, since narrower lenses can be separated farther apart, which increases τl in Eq. (5). In our experiment, the accuracy of the spectral peak measurement was 0.2 nm, which translates to 1 ps accuracy in the temporal separation between the peaks. Thus, to obtain a depth resolution of 0.1 ps/nm and a temporal resolution of 10 ps, the time lens width should be between 7 ps and 10 ps. We also considered high-order dispersion terms as temporal aberrations but found them to be negligible because of the short fiber lengths in the time lenses [23].
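The temporal-resolution side of this trade-off follows directly from Eq. (8): the lens must be wide enough that Δτmin stays below the target resolution. A hypothetical helper (the group-delay-dispersion value below is an assumed one, chosen only to land near the 7–10 ps range quoted above):

```python
import numpy as np

def min_lens_width(phi, d_tau_target):
    """Smallest time-lens width tau_p (ps) that still resolves features
    of d_tau_target (ps), from Eq. (8): d_tau_min = |2*pi*phi / tau_p|,
    where phi is the group delay dispersion of the signal (ps^2)."""
    return abs(2 * np.pi * phi / d_tau_target)

# With an assumed phi of 11 ps^2, a 10 ps target temporal resolution
# requires a lens width of at least ~6.9 ps
tau_p_min = min_lens_width(11.0, 10.0)
```

The depth-resolution side then caps the width from above, since wider lenses leave less room to separate the lens timings within the array.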

4. CONCLUSIONS

To conclude, we developed the concept of temporal depth imaging. We defined non-flat signals as signals whose dispersion value varies as a function of time. We demonstrated how shifting the timing of a time lens makes it possible to retrieve the dispersion value of each point in the signal, which is equivalent to a 3D imaging system. Finally, we demonstrated how a time lens array can retrieve these values with a single measurement by comparing the different images obtained with the time lens array.

The concept of temporal depth in general opens a new avenue in temporal optics. Specifically, temporal depth imaging will allow the investigation of ultrafast nonperiodic phenomena with a temporal resolution that has not been possible so far: for example, rogue waves, which require both nonlinearity and dispersion, and pulse evolution in a fiber cavity, where the dispersion plays an important role.

REFERENCES

1. D. Gabor, “Holography, 1948–1971,” Proc. IEEE 60, 655–668 (1972).

2. M. Levoy and P. Hanrahan, “Light field rendering,” in Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques (ACM, 1996), pp. 31–42.

3. H. Higuchi and J. Hamasaki, “Real-time transmission of 3-D images formed by parallax panoramagrams,” Appl. Opt. 17, 3895–3902 (1978).

4. D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254, 1178–1181 (1991).

5. T. Oggier, M. Lehmann, R. Kaufmann, M. Schweizer, M. Richter, P. Metzler, G. Lang, F. Lustenberger, and N. Blanc, “An all-solid-state optical range camera for 3D real-time imaging with sub-centimeter depth resolution (SwissRanger),” Proc. SPIE 5249, 534–545 (2003).

6. G. Sansoni, M. Trebeschi, and F. Docchio, “State-of-the-art and applications of 3D imaging sensors in industry, cultural heritage, medicine, and criminal investigation,” Sensors 9, 568–601 (2009).

7. B. C. Gross, J. L. Erkal, S. Y. Lockwood, C. Chen, and D. M. Spence, “Evaluation of 3D printing and its potential impact on biotechnology and the chemical sciences,” Anal. Chem. 86, 3240–3253 (2014).

8. N. Akhmediev, B. Kibler, F. Baronio, M. Belić, W.-P. Zhong, Y. Zhang, W. Chang, J. M. Soto-Crespo, P. Vouzas, P. Grelu, C. Lecaplain, K. Hammani, S. Rica, A. Picozzi, M. Tlidi, K. Panajotov, A. Mussot, A. Bendahmane, P. Szriftgiser, G. Genty, J. Dudley, A. Kudlinski, A. Demircan, U. Morgner, S. Amiraranashvili, C. Bree, G. Steinmeyer, C. Masoller, N. G. R. Broderick, A. F. J. Runge, M. Erkintalo, S. Residori, U. Bortolozzo, F. T. Arecchi, S. Wabnitz, C. G. Tiofack, S. Coulibaly, and M. Taki, “Roadmap on optical rogue waves and extreme events,” J. Opt. 18, 063001 (2016).

9. M. Närhi, B. Wetzel, C. Billet, S. Toenger, T. Sylvestre, J.-M. Merolla, R. Morandotti, F. Dias, G. Genty, and J. M. Dudley, “Real-time measurements of spontaneous breathers and rogue wave events in optical fibre modulation instability,” Nat. Commun. 7, 13675 (2016).

10. D. Solli, C. Ropers, P. Koonath, and B. Jalali, “Optical rogue waves,” Nature 450, 1054–1057 (2007).

11. Y. Du and X. Shu, “Pulse dynamics in all-normal dispersion ultrafast fiber lasers,” J. Opt. Soc. Am. B 34, 553–558 (2017).

12. B. H. Kolner and M. Nazarathy, “Temporal imaging with a time lens,” Opt. Lett. 14, 630–632 (1989).

13. B. H. Kolner, “Space-time duality and the theory of temporal imaging,” IEEE J. Quantum Electron. 30, 1951–1963 (1994).

14. C. Zhang, P. Chui, and K. K. Wong, “Comparison of state-of-art phase modulators and parametric mixers in time-lens applications under different repetition rates,” Appl. Opt. 52, 8817–8826 (2013).

15. Y. Okawachi, R. Salem, A. R. Johnson, K. Saha, J. S. Levy, M. Lipson, and A. L. Gaeta, “Asynchronous single-shot characterization of high-repetition-rate ultrafast waveforms using a time-lens-based temporal magnifier,” Opt. Lett. 37, 4892–4894 (2012).

16. C. Zhang, B. Li, and K. K.-Y. Wong, “Ultrafast spectroscopy based on temporal focusing and its applications,” IEEE J. Sel. Top. Quantum Electron. 22, 295–306 (2016).

17. R. Salem, M. A. Foster, A. C. Turner-Foster, D. F. Geraghty, M. Lipson, and A. L. Gaeta, “High-speed optical sampling using a silicon-chip temporal magnifier,” Opt. Express 17, 4324–4329 (2009).

18. J. Azana, N. K. Berger, B. Levit, and B. Fischer, “Spectral Fraunhofer regime: time-to-frequency conversion by the action of a single time lens on an optical pulse,” Appl. Opt. 43, 483–490 (2004).

19. M. Fridman, A. Farsi, Y. Okawachi, and A. L. Gaeta, “Demonstration of temporal cloaking,” Nature 481, 62–65 (2012).

20. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [invited],” Appl. Opt. 52, 546–560 (2013).

21. J.-H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48, H77–H94 (2009).

22. T. Yaron, A. Klein, H. Duadi, and M. Fridman, “Temporal superresolution based on a localization microscopy algorithm,” Appl. Opt. 56, D24–D28 (2017).

23. J. Schröder, F. Wang, A. Clarke, E. Ryckeboer, M. Pelusi, M. A. Roelens, and B. J. Eggleton, “Aberration-free ultra-fast optical oscilloscope using a four-wave mixing based time-lens,” Opt. Commun. 283, 2611–2614 (2010).

24. M. Fridman, Y. Okawachi, S. Clemmen, M. Ménard, M. Lipson, and A. L. Gaeta, “Waveguide-based single-shot temporal cross-correlator,” J. Opt. 17, 035501 (2015).

25. T. T. Ng, F. Parmigiani, M. Ibsen, Z. Zhang, P. Petropoulos, and D. J. Richardson, “Compensation of linear distortions by using XPM with parabolic pulses as a time lens,” IEEE Photon. Technol. Lett. 20, 1097–1099 (2008).

26. K. G. Petrillo and M. A. Foster, “Scalable ultrahigh-speed optical transmultiplexer using a time lens,” Opt. Express 19, 14051–14059 (2011).

27. Z. Wu, L. Lei, J. Dong, J. Hou, and X. Zhang, “Reconfigurable temporal Fourier transformation and temporal imaging,” J. Lightwave Technol. 32, 3963–3968 (2014).
