Optica Publishing Group

Single-shot ptychographic imaging of non-repetitive ultrafast events

Open Access

Abstract

We experimentally demonstrate high-speed ptychographic imaging of non-repetitive complex-valued events. Three time-resolved complex-valued frames are reconstructed from data recorded in a single camera snapshot. The temporal resolution of the microscope is determined by the delays between the illuminating pulses. The ability to image the amplitude and phase of non-repetitive events with ultrafast temporal resolution will open new opportunities in science and technology.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

Introduction. Ptychography is a variant of the Coherent Diffraction Imaging (CDI) technique, in which a sample is scanned relative to the illuminating beam in a stepwise fashion [1]. At each step of the scan, the far-field diffraction pattern is recorded. The illumination spot at each step overlaps substantially with the adjacent spots, leading to significant redundancy in the collected data. This redundancy is crucial for reliable algorithmic reconstruction of both the amplitudes and phases of the object and the illuminating field without any prior information [2,3].

An inherent limitation of conventional ptychography is the slow scanning process. To address this limitation, researchers have proposed and demonstrated single-shot ptychography (SSP) schemes [4,5] and applied them to various applications, including 3D imaging [6] and spatio-spectral diagnostics of ultrashort laser beams [7]. In SSP, the object is illuminated by multiple beams simultaneously, and the whole ptychographic dataset is recorded in a single camera snapshot. The recorded data is divided into regions, each of which corresponds to a diffraction pattern originating from a specific illuminated region of the object, and conventional algorithms are used for the reconstruction. Notably, ultrafast SSP was demonstrated experimentally by employing a single sub-nanosecond pulse for illumination [8].

An extension of the SSP system, termed Time-resolved Imaging by Multiplexed Ptychography (TIMP), was developed to capture ultrafast dynamics of complex-valued objects [8,9]. In TIMP, the SSP system is illuminated by a short burst of pulses with a duration much shorter than the sensor's integration time. As a result, the diffraction patterns from all the pulses are summed incoherently and captured in a single camera snapshot. If the spatial profiles of the pulses in the burst are significantly different (for example, mutually orthogonal), then the redundancy in the acquired data enables the reconstruction of multiple frames of the object, each frame corresponding to the illumination time of an individual pulse in the burst. The reconstruction is accomplished using a Multi-state Ptychographic Algorithm (MsPA) [10–13]. TIMP has several advantages for ultrahigh-speed imaging [14,15]. First, it requires a relatively simple setup and is applicable across the electromagnetic spectrum, including the extreme UV and x-ray regions, potentially enabling nanometric-scale spatial resolution. Second, the spatial resolution is not coupled to the temporal resolution; the latter depends only on the intra-burst pulse separation, enabling high resolution at a high frame rate. Third, while only one camera snapshot is taken, this approach allows the capture of non-repetitive complex-valued dynamic events. TIMP was experimentally demonstrated to capture thirty-six frames of a dynamic complex-valued object from data recorded in a single camera snapshot [9]. However, in that experiment, the beam encoding was done with a spatial light modulator, which limited the temporal resolution to milliseconds. Development toward ultrafast TIMP was reported recently in [16].

Here we present an experimental demonstration of a TIMP system capable of capturing dynamics on the femtosecond scale. Our system is based on an SSP microscope and a fiber-based photonic lantern that spatially encodes the ultrafast pulses. We reconstruct a three-frame sequence that captures the dynamics of a thermal ablation process [17] from a single measured diffraction pattern. Each reconstructed frame corresponds to the complex-valued (amplitude and phase) transmission function of the object under investigation. In this experiment, the frame rate was set by the temporal length of the dynamic phenomenon, but the system's frame rate is limited only by the temporal overlap of adjacent pulses in the burst. This study marks an important step toward a ptychographic microscope with ultrahigh-speed capabilities for capturing and analyzing the dynamics of ultrafast non-repetitive events.

Experimental setup. Our TIMP setup is based on SSP through a 4f system (4fSSP) [4] (Fig. 1). The SSP microscope comprises a two-dimensional diffractive optical element (DOE) [18], a collimating lens, and a 4f system. The DOE (MS-817-J-Y-A, Holo-Or Ltd.) is placed at the entrance of the optical system and divides the incoming beam into 5 × 5 identical replicas with a separation angle of ${\theta _s} = 0.97^\circ$. Each replica corresponds to a different diffraction order $[{ - 2{,\; } - 1{,\; }0{,\; }1{,\; }2} ]$ along the vertical and horizontal axes. The collimating lens, with focal length ${f_{col}} = 150\,\textrm{mm}$, is located one focal length after the DOE, ensuring that all beam replicas are effectively parallel when entering the 4f system. The distance between the centers of neighboring beams along the horizontal and vertical axes at the entrance plane of the 4f system is $b = \tan ({\theta _s}){f_{col}} = \tan ({0.97^\circ}) \times 150\,\textrm{mm} = 2.54\,\textrm{mm}$. The input plane of the 4f system is located at the back focal plane of the collimating lens. The focal lengths of the two lenses in the 4f system, L1 and L2, are ${f_1} = 100\,\textrm{mm}$ and ${f_2} = 60\,\textrm{mm}$, respectively. The first lens, L1, focuses the beam replicas toward the object, which is located a distance $d = 10\,\textrm{mm}$ before the Fourier plane of the 4f system. This configuration ensures that neighboring replicas illuminate different, yet partially overlapping, areas of the object (overlap ≅77%; for details see Supplement 1). The second lens, L2, maps the Fourier transform of the field diffracted by the object onto the camera (IDS UI-3200-M-GL, 3006 × 4104 pixels, $3.45\,\mathrm{\mu}\textrm{m}$ pixel size, 12-bit dynamic range), operated at $150\,\mathrm{\mu}\textrm{s}$ exposure time. If the object's spatial power spectrum is confined, the pattern at the camera can be divided into clearly distinguishable blocks [4]. Each block contains a diffraction pattern associated with a specific replica and carries information about the region of the object illuminated by that replica.
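The geometry above can be checked with a few lines of arithmetic. The beam spacing $b$ follows directly from the numbers in the text; the spot spacing on the object, estimated here as $b\,d/f_1$, is our own small-angle approximation and is not a value quoted in the paper:

```python
import math

theta_s = 0.97   # DOE separation angle, degrees
f_col = 150.0    # collimating lens focal length, mm
f1 = 100.0       # first 4f lens focal length, mm
d = 10.0         # object offset before the Fourier plane, mm

# Spacing between neighboring beam replicas at the 4f input plane
b = math.tan(math.radians(theta_s)) * f_col

# Approximate spacing of the illumination spots on the object: parallel
# beams offset by b converge to a common focus, so at a distance d before
# the Fourier plane their centers sit roughly b*d/f1 apart (our estimate,
# not a number stated in the text).
spot_spacing = b * d / f1

print(f"b = {b:.2f} mm, spot spacing ~ {spot_spacing:.3f} mm")
```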


Fig. 1. SSP schematic diagram. The probe pulse illuminates a Diffractive Optical Element (DOE); the replicas are collimated by a collimating lens and then inserted into a 4f imaging system. The object is located near the Fourier plane of the 4f system. The detector is located at the output plane of the 4f system.


To perform time-resolved imaging with the SSP system, we illuminate it with a burst of pulses (pulse energy ∼1.5 nJ). We assume that the object is static during each pulse but varies from pulse to pulse. Notably, the diffracted fields of consecutive pulses do not overlap in time on the camera, while the integration time of the camera is longer than the entire pulse burst. As a result, the captured pattern corresponds to the incoherent sum of the intensity patterns, each produced by a single pulse in the burst [8].
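This incoherent summation can be written as a one-line forward model: the snapshot is the sum of the single-pulse far-field intensities. A minimal numerical sketch, in which the probe and object arrays are random hypothetical stand-ins for the real fields:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 64, 3  # grid size; number of pulses in the burst

# Hypothetical complex probes and object frames (stand-ins for the real fields)
probes = rng.standard_normal((K, N, N)) + 1j * rng.standard_normal((K, N, N))
frames = rng.standard_normal((K, N, N)) + 1j * rng.standard_normal((K, N, N))

# Each pulse yields a far-field intensity pattern; the camera integrates over
# the whole burst, so the snapshot is the incoherent (intensity) sum.
snapshot = np.zeros((N, N))
for P, O in zip(probes, frames):
    far_field = np.fft.fftshift(np.fft.fft2(P * O))
    snapshot += np.abs(far_field) ** 2

print(snapshot.shape)  # (64, 64)
```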

To create a burst of three spatially coded pulses for TIMP, we use a Photonic Lantern (PL) device [19] (Fig. 2). Our PL consists of three input channels, each a single-mode (SM) fiber, and one output channel, a few-mode (FM) fiber that supports the three lowest-order modes of a step-index fiber. At the PL's output, every SM input is converted into a separate mode of the FM fiber in a one-to-one manner, ensuring a significant level of mutual orthogonality. To control the delays between the pulses, we used different lengths for the SM input channels, yielding delays of ${\tau _1} = 200\,\textrm{ns}$ and ${\tau _2} = 900\,\textrm{ns}$ relative to the first pulse. We use a femtosecond laser (Light Conversion, Pharos PH1) with a central wavelength of ${\lambda _c} = 1030\,\textrm{nm}$. While the spectrum of the pulses can support a 180 fs duration, the pulses were chirped to 20 ps to avoid nonlinear spectral broadening and to prevent damage to the fibers.
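These delays map directly onto extra fiber length through the group velocity in the fiber. In the sketch below, the group index n_g = 1.468 is our assumed value for silica fiber near 1030 nm, not a number from the paper:

```python
c = 299_792_458.0  # speed of light in vacuum, m/s
n_g = 1.468        # assumed group index of silica fiber near 1030 nm

def delay_to_length(tau_s):
    """Extra single-mode-fiber length giving an inter-pulse delay of tau_s seconds."""
    return c * tau_s / n_g

for tau in (200e-9, 900e-9):
    print(f"{tau * 1e9:.0f} ns -> {delay_to_length(tau):.1f} m of extra fiber")
```

Under this assumption, the 200 ns and 900 ns delays correspond to roughly 41 m and 184 m of additional fiber, respectively.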


Fig. 2. Schematic diagram of the system for the generation of a burst of coded pulses. The pulses pass through single-mode fibers of different lengths and are encoded by a Photonic Lantern into different spatial modes of a few-mode fiber.


The probed dynamic object consists of laser-induced holes in a thin soot layer covering a glass microscope slide. The soot, which fully blocks the probe pulses, was deposited using a standard paraffin candle. To prevent scattering of the pump pulses onto the camera, the pump pulses were converted to the second harmonic (515 nm) using a $2\,\textrm{mm}$ BBO crystal, and a bandpass filter (Thorlabs FLH1030-10) was placed before the camera. The energies of the pump pulses (∼20 ps pulse duration, fundamental-mode beams) are 130 µJ and 50 µJ, respectively, and each pump pulse illuminates the probed object at a different location and at a different time [see Figs. 3(a) and 3(b)]. The pump pulses are directed through the clear glass side and focused on the object using ${f_a} = 250\,\textrm{mm}$ and ${f_b} = 175\,\textrm{mm}$ focusing lenses for the first and second pump, respectively, arriving $d\tau \approx 3\,\textrm{ns}$ after the first and second probes, respectively. The initial object contains a single hole, and two more holes are created by the two pump pulses [Fig. 3(c)].


Fig. 3. Schematic diagram of the experiment. (a) Train of spatially orthogonal probes illuminates the SSP microscope. The probed event is laser-burned holes in a soot layer. (b) Time sequence of the pump and probe pulses that illuminate the object. (c) Illustration of the holes burned in the soot layer (gray circles).


Algorithmic process. The recorded intensity pattern is shown in Fig. 4(a). For calibration, we also took two reference images: one without an object, to recover the locations of the beams' centers at the object plane, and one without any laser illumination, for background noise reduction [20]. To retrieve the three frames of the object and the probes, it is important to remove the background noise and isolate the blocks in the measured data. We used the following background-subtraction procedure (Fig. 4): we convolved the acquired image and the background (BG) image (taken without laser illumination) separately with a 9 × 9, $\sigma = 0.6$ Gaussian filter, to equalize the dark-current camera noise in both, and then subtracted the BG noise using the method described in Ref. [20]. We found that this step prevents artifacts in the reconstruction procedure.
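The noise-equalization step can be sketched as follows: blur the data frame and the background frame with the same 9 × 9, σ = 0.6 Gaussian before subtracting. This is a simplified stand-in for the full procedure of Ref. [20], with synthetic Poisson frames in place of real camera data:

```python
import numpy as np

def gaussian_kernel(size=9, sigma=0.6):
    """Normalized 2D Gaussian, matching the paper's 9 x 9, sigma = 0.6 filter."""
    ax = np.arange(size) - size // 2
    g = np.exp(-ax ** 2 / (2 * sigma ** 2))
    k = np.outer(g, g)
    return k / k.sum()

def blur(img, k):
    # Brute-force 'same' convolution with zero padding (fine for a sketch)
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    padded = np.pad(img.astype(float), ((ph, ph), (pw, pw)))
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

def subtract_background(data, bg):
    """Blur data and BG with the same kernel so their dark-current noise
    statistics match, then subtract and clip negative values."""
    k = gaussian_kernel()
    return np.clip(blur(data, k) - blur(bg, k), 0.0, None)

rng = np.random.default_rng(1)
data = rng.poisson(50, (32, 32))   # stand-in for the recorded snapshot
bg = rng.poisson(40, (32, 32))     # stand-in for the no-laser background frame
clean = subtract_background(data, bg)
print(clean.shape, clean.min() >= 0)
```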


Fig. 4. Snapshot of the experimental data at different stages of processing. (a) Snapshot recorded by the CMOS camera without processing. (b) The snapshot after convolution with a Gaussian filter ($\mathrm{9\ \times 9\;\ pixels},\; \sigma \textrm{ = }0.6)$. (c) The snapshot after background noise reduction. (d) The snapshot after data thresholding. All panels show the same block from the same location. The original intensity snapshot is scaled by a factor of 0.25 for clarity.


Next, we cropped the data into 25 blocks around the calculated centers, to iterate over the data using the MsPA algorithm. The algorithm requires initial guesses for the probes and for the object as input. The object is unknown, but the guess for the probes was chosen carefully to avoid local minima during convergence. To construct the initial guess of the probes, we applied two steps. First, we removed the DOE and the object, placed the camera at the object plane, and imaged the intensity of each probe in turn at this plane [see Figs. 6(d)–6(f)]. Second, we returned the system to its original state, placed a known static object in the object plane, and took three HDR images. Using the recorded probe intensities as initial guesses, we reconstructed the probes' phases by applying 500 iterations of the ePIE algorithm [2]. These reconstructions served as the initial guesses for the probes in the MsPA, which is specifically designed for extracting multiplexed information from highly redundant ptychographic data acquired with different probes.

We applied the MsPA algorithm in three steps. First, after BG noise subtraction, we used only the data above a threshold. For the object frames, all that is required of the initial guesses is that they not be identical, to avoid degeneracy in the reconstruction process. To this end, simple linear functions $val({x,y} )= ax + by + c$ with different values of $a,b,c$ were chosen for each frame's guess. We ran 200 iterations over the data, allowing the probes to be updated only after the first 40 iterations. The resulting reconstructions contained only the brighter and larger elements of the object; we used them as the initial guess for the object in the next step. In the second step, we used the data without thresholding for another 100 iterations, reconstructing the finer elements. This method helped reduce artifacts in the reconstruction. Finally, in the third step, we applied another 100 iterations, in which the saturated camera pixels were omitted from the update step of the algorithm [21,22].
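The multi-state update at the heart of the MsPA, and the linear initial guesses val(x, y) = ax + by + c, can be sketched as below. This is our own simplified rendering of the idea in [10–13] (a single probe, one diffraction block, and an ePIE-style object step), not the authors' code; the saturation mask illustrates how untrusted pixels are skipped:

```python
import numpy as np

def linear_guess(shape, a, b, c):
    """Non-identical initial object frames, val(x, y) = a*x + b*y + c."""
    y, x = np.indices(shape)
    return (a * x + b * y + c).astype(complex)

def mspa_update(frames, probe, I_meas, mask=None, alpha=0.5):
    """One multi-state iteration for a single diffraction block: the K object
    frames share one measured intensity, modeled as the incoherent sum of
    their far-field intensities; `mask` marks trusted (non-saturated) pixels."""
    F = [np.fft.fft2(probe * O) for O in frames]
    I_model = sum(np.abs(f) ** 2 for f in F) + 1e-12
    scale = np.sqrt(I_meas / I_model)          # shared modulus correction
    if mask is not None:
        scale = np.where(mask, scale, 1.0)     # leave saturated pixels untouched
    updated = []
    for O, f in zip(frames, F):
        psi = probe * O
        psi_new = np.fft.ifft2(f * scale)
        # ePIE-style object update [2]
        O = O + alpha * np.conj(probe) * (psi_new - psi) / (np.abs(probe).max() ** 2 + 1e-12)
        updated.append(O)
    return updated

rng = np.random.default_rng(0)
N = 32
probe = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
frames = [linear_guess((N, N), *abc) for abc in ((1, 0, 1), (0, 1, 1), (1, 1, 2))]
I_meas = rng.random((N, N)) * N * N            # stand-in measured block
frames = mspa_update(frames, probe, I_meas)
print(len(frames), frames[0].shape)
```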

Results. The reconstructions of the complex transmission function of the soot layer are shown in Figs. 5(a)–5(c). For comparison, we reconstructed an image, taken a few seconds after the time-resolved experiment, using single-pulse SSP and the ePIE algorithm [Fig. 5(d)]. This reference image serves as ground truth for the holes' locations in the TIMP reconstructions. We clearly observe new holes appearing in both the second and third frames. In addition, we observe the thermal ablation process by comparing the sizes of the holes and their growth: for hole number two [marked in (b)] in frames (b)–(d), and for hole number three [marked in (c)] in frames (c)–(d). The initial hole [marked number one in (a)] did not change its size, as expected. The locations of the emerging holes in the TIMP series match those in the ground-truth image (d). Notably, hole number two grew much faster than hole number three, due to the energy difference between the pump pulses that created them (130 µJ versus 50 µJ).


Fig. 5. TIMP reconstruction of three video frames of the complex-valued dynamic object. (a)–(c) Reconstructions of the object transmission field, amplitude and phase, ordered by time, at the moments the probes passed through it. The red arrows point to the new hole in each frame. In these images, color represents the relative phase. (d) Ground truth: reconstruction of the object transmission field after the thermal ablation process. The phase scale in (a) applies to all images.


Alongside the reconstruction of the object, Figs. 6(a)–6(c) present the reconstructions of the complex-valued spatial profiles of the burst of probes as they appear at the object plane. The reconstructions closely resemble the three complex-valued lowest-order modes of a step-index fiber, as expected from the PL device. For comparison, Figs. 6(d)–6(f) show the intensity profiles of the three probes, captured by placing the detector at the object plane (without the object and DOE). Clearly, there is a good match between the measured intensity profiles and the complex-valued reconstructed profiles.


Fig. 6. Experimental reconstructions of the coded probes. (a)–(c) The complex-valued spatial profiles of the probes. In these images, color represents the relative phase. (d)–(f) Independently measured intensity spatial profiles of the probes.


Summary. In this study, we present an experimental demonstration of a TIMP system capable of capturing ultrafast dynamics of complex-valued, non-repetitive events. From a single measured diffraction pattern, we reconstruct a three-frame sequence capturing the dynamics of a thermal ablation process. Each reconstructed frame represents the complex-valued (amplitude and phase) transmission function of the object under investigation. Importantly, the frame rate of the system is determined only by the temporal overlap of adjacent pulses in the burst and is completely decoupled from the spatial resolution of the system. This work is an important milestone in the development of an ultrafast TIMP microscope able to record various ultrafast non-repetitive events with unprecedented temporal and spatial resolution. Exploring the dynamics of non-repetitive events is essential for gaining a comprehensive understanding of the natural world and will advance various fields of science and technology.

Funding

European Research Council (819440-TIMP); Army Research Office (W911NF1710553); National Aeronautics and Space Administration (80NSSC21K0624).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

Supplemental document

See Supplement 1 for supporting content.

REFERENCES

1. J. M. Rodenburg, in Advances in Imaging and Electron Physics (Elsevier, 2008), Vol. 150, pp. 87–184.

2. A. M. Maiden and J. M. Rodenburg, Ultramicroscopy 109, 1256 (2009). [CrossRef]  

3. A. Maiden, D. Johnson, and P. Li, Optica 4, 736 (2017). [CrossRef]  

4. P. Sidorenko and O. Cohen, Optica 3, 9 (2016). [CrossRef]  

5. X. Pan, C. Liu, and J. Zhu, Appl. Phys. Lett. 103, 171105 (2013). [CrossRef]  

6. D. Goldberger, J. Barolak, C. G. Durfee, and D. E. Adams, Opt. Express 28, 18887 (2020). [CrossRef]  

7. D. Goldberger, J. Barolak, C. S. Bevis, B. Ivanic, D. Schmidt, Y. Lei, P. Kazansky, G. F. Mancini, C. G. Durfee, and D. E. Adams, Optica 9, 894 (2022). [CrossRef]  

8. P. Sidorenko, O. Lahav, and O. Cohen, Opt. Express 25, 10997 (2017). [CrossRef]  

9. O. Wengrowicz, O. Peleg, B. Loevsky, B. K. Chen, G. I. Haham, U. S. Sainadh, and O. Cohen, Opt. Express 27, 24568 (2019). [CrossRef]  

10. P. Thibault and A. Menzel, Nature 494, 68 (2013). [CrossRef]  

11. D. J. Batey, D. Claus, and J. M. Rodenburg, Ultramicroscopy 138, 13 (2014). [CrossRef]  

12. P. Li, T. Edo, D. Batey, J. Rodenburg, and A. Maiden, Opt. Express 24, 9038 (2016). [CrossRef]  

13. J. Barolak, D. Goldberger, J. Squier, Y. Bellouard, C. Durfee, and D. Adams, Ultramicroscopy 233, 113418 (2022). [CrossRef]  

14. X. Zeng, X. Lu, C. Wang, K. Wu, Y. Cai, H. Zhong, Q. Lin, J. Lin, R. Ye, and S. Xu, Ultrafast Sci. 3, 0020 (2023). [CrossRef]  

15. J. Liang and L. V. Wang, Optica 5, 1113 (2018). [CrossRef]  

16. J. Barolak, D. Goldberger, B. Ivanic, C. Durfee, and D. Adams, in Imaging and Applied Optics Congress 2022 (3D, AOA, COSI, ISA, PcAOP) (Optica Publishing Group, 2022), paper CF1D.4.

17. Y. L. Yao, H. Chen, and W. Zhang, in Proceedings of the NSF Workshop on Research Needs in Thermal Aspects of Material Removal Processes (2003), Vol. 1, pp. 247–256.

18. S. Katz, N. Kaplan, and I. Grossinger, Opt. Photonik 13, 83 (2018). [CrossRef]  

19. S. G. Leon-Saval, N. K. Fontaine, and R. Amezcua-Correa, Opt. Fiber Technol. 35, 46 (2017). [CrossRef]  

20. C. Wang, Z. Xu, H. Liu, Y. Wang, J. Wang, and R. Tai, Appl. Opt. 56, 2099 (2017). [CrossRef]  

21. X. Pan, C. Liu, and J. Zhu, Opt. Express 26, 21929 (2018). [CrossRef]  

22. X. Pan, S. P. Veetil, B. Wang, C. Liu, and J. Zhu, J. Mod. Opt. 62, 1270 (2015). [CrossRef]  

