Optica Publishing Group

Ptychographic ultrahigh-speed imaging

Open Access

Abstract

We propose and demonstrate numerically a simple method for ultrahigh-speed imaging of complex (amplitude and phase) samples. Our method exploits redundancy in single-shot ptychography (SSP) to reconstruct multiple frames from a single camera snapshot. We term the method Time-resolved Imaging by Multiplexed Ptychography (TIMP). We demonstrate TIMP numerically by reconstructing 15 frames of a complex-valued dynamic object from a single noisy camera snapshot. Experimentally, we demonstrate SSP with single-pulse illumination using a pulse of 150 ps duration whose spectral bandwidth can support 30 fs pulses.

© 2017 Optical Society of America

1. Introduction

A key property of imaging systems is their frame rate. While the pump-probe technique [1] is employed for exploring repetitive ultrafast events, imaging of non-repetitive ultrafast dynamical objects requires complicated ultrahigh-speed cameras or microscopes. Such ultrahigh-speed imaging systems are essential for many applications in scientific research, including plasma physics [2], chemistry [3,4], phononics [5,6], spintronics [7,8], fluidics [9,10], and life science [11], as well as for technology development, clinical diagnostics, and monitoring of industrial processes.

The introduction of electronic imaging sensors based on charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) technology revolutionized high-speed imaging, enabling acquisition rates of up to 10^7 frames per second. Further increase in the frame rate of CCD or CMOS sensors is limited by their on-chip storage and electronic readout speed [12]. Over the last decade, various optical imaging techniques have been developed to allow frame rates that exceed the limits posed by the detectors. For example, a frame rate of up to 4.4 THz for 6 frames has been demonstrated with sequentially timed all-optical mapping photography, which maps the image all-optically onto a burst stream of sequentially timed photographs using spatial and temporal dispersion [13]. In compressed ultrafast photography, the image is reconstructed from a single scan of a streak camera with an open slit using a compressed-sensing approach [14]; 350 frames at a 100 GHz frame rate were demonstrated. A system based on a rotating mirror that directs the incoming image through a group of lens pairs onto 64 CCD cameras has been demonstrated, delivering 128 frames at a 25 GHz frame rate [10]. Finally, time-stretch imaging [15] temporally stretches broadband pulses using the dispersive properties of light in both the spatial and temporal domains, achieving continuous image acquisition at frame rates of up to 1 GHz [16]. While all these techniques opened new opportunities in imaging, they add significant complexity to the imaging system, which for many applications is impractical or undesirable. Moreover, the required complicated modules inevitably introduce new limitations: the frame rate in these techniques is generally strongly and negatively coupled to the spatial resolution and field of view.
Also, they do not yield full characterization of complex (amplitude and phase) objects, which is extremely important, for example, in live-cell [17] and magnetic [18] imaging. Finally, it is technically very challenging to apply these techniques in the short-wavelength spectral regions, including the extreme UV and x-rays.

Here we propose a new ultrahigh-speed imaging approach based on ptychography [19], a powerful type of coherent diffraction imaging [20]. In ptychography, a complex-valued object is scanned stepwise through a localized coherent beam, and in each scanning step the far-field diffraction pattern of the object is measured. The set of diffraction patterns is used to reconstruct the complex transfer function of the object. Critically, the illumination spot in each step overlaps substantially with neighboring spots. The recorded information is therefore highly redundant, which gives ptychography several advantages over ordinary coherent diffraction imaging: improved robustness to noise, no requirement for prior information (e.g. a support) on the object, simultaneous reconstruction of both the imaged sample and the probe beam, and generally faster and more reliable reconstruction algorithms [21,22]. Recently, single-shot ptychography (SSP), in which the required far-field intensity patterns from multiple overlapping regions of the sample are recorded simultaneously in a single CCD exposure, was proposed and demonstrated [23–25]. Another exciting recent development was the ptychographical information multiplexing (PIM) concept for recovering high-dimensional data [26]. PIM has been used for coherent mode decomposition of the probe beam [26,27] and for recovering the spectral response of the sample under investigation [28,29].

Here, we propose to exploit ptychographical information multiplexing in SSP illuminated by a burst of pulses, reconstructing multiple temporal frames of the object from a single recorded intensity pattern. In this method, termed Time-resolved Imaging by Multiplexed Ptychography (TIMP), multiple complex-valued frames of the object are recovered algorithmically from the data measured in a single CCD exposure of an SSP system. The frame rate and temporal resolution in TIMP are determined by the light source and not by the imaging system, making the method very flexible. Notably, the TIMP imaging system consists of simple elements, hence it can be implemented across the electromagnetic spectrum (including the extreme UV and x-rays) as well as with other waves. We numerically demonstrate recovery of 15 different complex frames from a single noisy camera snapshot. We also present progress toward an experimental demonstration of TIMP by demonstrating static imaging with an SSP system and single short-pulse illumination. Owing to its simplicity and versatility, we expect TIMP to open numerous ultrahigh-speed imaging applications and to pave the way to imaging at sub-femtosecond frame times using bursts of attosecond pulses.

The paper is organized as follows. Section 2 presents the concept of TIMP. Sections 3 and 4 then present numerical examples of TIMP for phase-only objects and complex-valued objects, respectively. An important step toward proving the feasibility of TIMP and its experimental implementation is presented in section 5, where single-pulse SSP is demonstrated experimentally. In section 6, we propose an alternative scheme for SSP and TIMP. Finally, we conclude and discuss research directions for improving TIMP in section 7.

2. Concept of Time-resolved Imaging by Multiplexed Ptychography (TIMP)

TIMP is based on SSP [24]. For a long time, ptychography was considered a slow (scanning-based) technique [19]. Recently, we proposed and demonstrated experimentally SSP, allowing ultrafast ptychographic imaging. Figure 1(a) shows the ray tracing in the SSP microscope, which is based on a 4f system (hence we name this setup 4fSSP).


Fig. 1 (a) Schematic diagram of the single-shot ptychographic microscope with ray tracing. An array of pinholes is located at the input plane of a 4f system. Lens L1 focuses the beams that diffract from the array onto the object, which is located at distance d before the back focal plane of lens L1. Lens L2 focuses the light diffracted from the object onto the CCD, which is located at the output plane of the 4f system, resulting in blocks of diffraction patterns, where each block corresponds to a region of the object illuminated by the beam originating from one of the pinholes. (b) Schematic diagram of TIMP based on the single-shot ptychographic microscope. The 4fSSP microscope is illuminated by a burst of several pulses. A single camera snapshot records a pattern that corresponds to the sum of the diffraction intensity patterns, where each pattern results from a different pulse. Multi-frame images, where each image corresponds to a temporal snapshot of the sample illuminated by a different pulse, are reconstructed from the single recorded pattern.


The object is illuminated simultaneously by multiple (N^2) partially-overlapping beams originating from a pinhole array located at the input plane of a 4f system. The object is located at distance d + f2 (or f2 − d) before lens L2, which transfers the field diffracted by the object to the k-space domain at the CCD plane. Under appropriate conditions [24], the detected intensity pattern consists of N^2 clearly distinguished blocks, where the pattern in each block results from a beam originating from a specific pinhole and illuminating the object at a certain region. The intensity distribution in each block is thus given by

$$I_m(\nu) = \left|\mathcal{F}\!\left[P(\mathbf{r}-\mathbf{R}_m)\,O(\mathbf{r})\right]\right|^2 \tag{1}$$
In Eq. (1), ν and r are the spatial vectors in the CCD and object planes, respectively, m = 1,2,3…N^2 is the block/pinhole index and N^2 is the total number of blocks/pinholes. F stands for the two-dimensional spatial Fourier operator, O is the complex transmission function of the object, P is the complex envelope of the localized probe beam that illuminates the object (the field originating from a single pinhole), and R_m is the center of the illumination spot originating from the pinhole with index m. Importantly, beams from neighboring pinholes illuminate different but overlapping regions of the object. The degree of overlap can be tuned by varying d and the pitch of the pinhole array. Increasing the overlap increases the redundancy, which in SSP is reflected by the fact that the number of measured pixels exceeds the number of pixels in the sought signal (somewhat similar to diffraction pattern oversampling in coherent diffraction imaging [30]).
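As an illustration, the forward model of Eq. (1) can be sketched in a few lines of Python; the array size, Gaussian probe, and random phase object below are arbitrary choices for the example, not the parameters of the actual setup:

```python
import numpy as np

def block_intensity(probe, obj, shift):
    """Far-field intensity of one SSP block, Eq. (1):
    I_m(nu) = |F[P(r - R_m) O(r)]|^2, with the probe centre R_m in pixels."""
    shifted_probe = np.roll(probe, shift, axis=(0, 1))   # P(r - R_m)
    exit_wave = shifted_probe * obj                      # field right after the object
    far_field = np.fft.fftshift(np.fft.fft2(exit_wave))  # 2D spatial Fourier transform
    return np.abs(far_field) ** 2

# Toy example: Gaussian probe, random phase-only object (hypothetical values)
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
probe = np.exp(-(x**2 + y**2) / (2 * 6.0**2)).astype(complex)
rng = np.random.default_rng(0)
obj = np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n)))
pattern = block_intensity(probe, obj, (4, 4))            # block for one pinhole
```

Note that the phase of the exit wave is lost in the modulus-squared measurement; recovering it from the redundancy between overlapping blocks is exactly what the ptychographic algorithm does.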

We modify SSP into TIMP by illuminating the imaging system with a burst of ultrashort laser pulses (Fig. 1(b)). We assume that the object is static within the duration of each pulse, but may vary from pulse to pulse. We also assume that the diffraction patterns formed by subsequent pulses do not overlap temporally on the CCD. In this case, the pattern recorded in a single camera snapshot corresponds to the sum of the diffraction intensity patterns, where each intensity pattern originates from a different pulse (assuming that the camera integrates over a time interval longer than the duration of the pulse burst). The intensity distribution measured by the CCD in each block is thus given by

$$I_m(\nu) = \sum_{k=1}^{K}\left|\mathcal{F}\!\left[P(\mathbf{r}-\mathbf{R}_m)\,O_k(\mathbf{r})\right]\right|^2 \tag{2}$$
where k is the pulse index within the burst. To reconstruct the multiple frames O_k(r) from the single recorded camera snapshot, we use the Multi-state Ptychographic Algorithm (MsPA) [26], which was developed for recovering multiplexed information from highly redundant ptychographical data. In this work, we apply information multiplexing in SSP to a completely new application: time-resolved multi-frame ultrafast imaging.
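The full MsPA update is given in [26]; its core ingredient, the multi-state modulus constraint, can be sketched as follows (a minimal illustration of the idea, not the authors' implementation): all K exit-wave estimates share one measured intensity, so each Fourier component is rescaled so that the summed intensity of the current estimates matches the data of Eq. (2).

```python
import numpy as np

def multistate_modulus_projection(exit_waves, measured):
    """Multi-state modulus constraint: rescale each state's Fourier
    spectrum so that the *summed* intensity matches the measurement."""
    spectra = [np.fft.fft2(psi) for psi in exit_waves]
    total = sum(np.abs(s) ** 2 for s in spectra)          # current model of I_m
    scale = np.sqrt(measured / np.maximum(total, 1e-12))  # avoid division by zero
    return [np.fft.ifft2(s * scale) for s in spectra]

# If the estimates already explain the data, the projection is (essentially)
# the identity:
rng = np.random.default_rng(0)
waves = [rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32))
         for _ in range(3)]
data = sum(np.abs(np.fft.fft2(w)) ** 2 for w in waves)
projected = multistate_modulus_projection(waves, data)
```

In a full reconstruction this projection alternates with overlap-constraint updates of the object frames across the blocks, as in standard ptychographic engines.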

3. Numerical results for purely phase objects

Additional prior information is required in order to remove inherent, non-trivial ambiguities in the reconstruction of multiple objects O_k(r) from a single measurement described by Eq. (2) [31]. One such prior is that the imaged object is phase-only [31]. In this case, the MsPA algorithm can successfully reconstruct several completely different phase objects from a single camera snapshot of the SSP setup. To demonstrate this numerically, we simulate a 4fSSP setup with the following parameters: N = 6, f1 = 75 mm, d = 15 mm. While the MsPA algorithm can, in principle, recover the illumination probes simultaneously with the objects [26], we assume here known illumination (which can be measured or calculated before the imaging experiment). The overlap between neighboring illuminations P(r − R_m) and P(r − R_m+1) is ~85%. We repeat the simulation for different numbers of frames (equal to the number of pulses) and Signal-to-Noise Ratio (SNR) values. Figure 2(a) presents the mean Normalized Mean Square Error (NMSE) between the original and recovered frames as a function of the number of frames for different SNR values (notably, in this case the correct order of the frames is assumed, not recovered algorithmically). The mean NMSE is calculated by arithmetic averaging over the NMSEs of the individual frames in the simulation, and the error bars correspond to the standard deviation. The black curve in Fig. 2(a) indicates that we can correctly recover up to 15 frames from a single noiseless camera snapshot. While the sharp transition in this curve (at 15 frames) is clearly visible, the behavior of the other curves might initially seem unexpected, but is easily understood after recalling that we added noise to I_m(ν), which is the sum of the diffraction intensity patterns.
Consequently, the noise level per frame increases with the number of frames (linearly, if the total power is distributed equally between the frames), even though the noise level of each curve is constant. This explains the linear tendency of these curves in the range of small numbers of frames. Figure 2(b) shows the NMSE between the diffraction patterns fed to the algorithm and the recovered diffraction patterns (denoted NMSE_F) at each iteration of the algorithm. The algorithm stops after 10^4 iterations or once it reaches NMSE_F < 10^-4. Clearly, NMSE_F stagnates for all the simulations with noise, which indicates stable convergence. Figure 2(c) shows exemplary reconstruction results: the first horizontal panel presents the original set of 6 purely phase frames; the second panel presents the noiseless reconstruction of the set of 6 frames (point marked by a green circle on the black curve in Fig. 2(a)); the third and bottom horizontal panels show reconstructions of the frames from data with SNR = 45 dB (green triangle on the purple curve in Fig. 2(a)) and SNR = 25 dB (green square on the blue curve in Fig. 2(a)), respectively.
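The paper does not spell out its exact NMSE definition. One common choice for complex-valued frames, which also removes the global-phase ambiguity inherent to ptychographic reconstructions, is sketched below (an assumed definition, for illustration only):

```python
import numpy as np

def nmse(recovered, original):
    """Normalized mean square error between complex frames, after removing
    the best-fit global phase (reconstructions are defined only up to a
    constant phase). One standard definition; the paper's may differ."""
    overlap = np.vdot(recovered, original)            # <recovered, original>
    phase = overlap / np.abs(overlap) if np.abs(overlap) > 0 else 1.0
    diff = original - recovered * phase               # phase-aligned residual
    return np.sum(np.abs(diff) ** 2) / np.sum(np.abs(original) ** 2)
```

Under this definition a perfect reconstruction, even one multiplied by an arbitrary global phase e^{iθ}, gives NMSE = 0.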


Fig. 2 Numerical demonstration of TIMP with a burst of spatially identical pulses for imaging the dynamics of a phase-only object. (a) Mean NMSE between the original and recovered frames as a function of the number of frames (equal to the number of pulses) for different SNR values. (b) NMSE between the diffraction patterns fed to the algorithm and the recovered diffraction patterns at each iteration of the algorithm (denoted NMSE_F). The number of frames is encoded by the line colors and the SNR values by the line dash types. (c) Exemplary reconstruction results: the first horizontal panel presents the original set of 6 purely phase frames; the second panel presents the reconstruction of the set from a noiseless diffraction pattern (green circle mark on the black curve in Fig. 2(a)); the third and bottom horizontal panels show reconstructions of the frames from data with SNR = 45 dB (green triangle mark on the purple curve in Fig. 2(a)) and SNR = 25 dB (green square mark on the blue curve in Fig. 2(a)), respectively. Note that in this case the order of the frames was not recovered algorithmically.


4. Numerical results for complex-valued objects

Generally, complex-valued objects cannot be recovered uniquely from measurements described by Eq. (2). However, it was shown that an orthogonal set of probes leads to unique reconstruction of both the probes and the objects [31]. To utilize this property, we present another modification of the 4fSSP setup which allows reconstruction of multiple general complex-valued objects. Specifically, we assume that each pulse in the burst experiences an effectively different pinhole shape in the array. This scenario can be obtained, for example, by using pulses with different spectra (or different polarizations) and a pinhole array with a varying spectral (or polarization) response. Thus, each pulse illuminates the object with a different (and measurable) spatial structure. In this case, the intensity diffraction pattern in block m is given by

$$I_m(\nu) = \sum_{k=1}^{K}\left|\mathcal{F}\!\left[P_k(\mathbf{r}-\mathbf{R}_m)\,O_k(\mathbf{r})\right]\right|^2 \tag{3}$$
Next, we simulate TIMP with the same parameters as in the previous section, but assuming that each pulse in the burst experiences the same array but with different (known) pinhole shapes. Figure 3, which has the same structure as Fig. 2, presents a numerical example of TIMP with a complex-valued object. Figure 3(a) presents the mean NMSE between the original and recovered complex frames as a function of the number of frames for different SNR values (notably, in contrast to the previous case, here the order of the frames is recovered algorithmically). The black curve in Fig. 3(a) indicates that we can correctly recover up to 15 complex frames from a single noiseless camera snapshot. The curves that correspond to simulations with noise show the same linear tendency as in Fig. 2(a) (its origin is explained in the previous section). Figure 3(b) shows the NMSE between the diffraction patterns fed to the algorithm and the recovered diffraction patterns at each iteration of the algorithm, indicating stable convergence of the complex-valued frames. Figure 3(c) shows exemplary reconstruction results: the first horizontal panel presents the original set of 6 complex-valued frames; the second panel presents the reconstruction of the set of 6 frames from noiseless data (point marked by a green circle on the black curve in Fig. 3(a)); the third and bottom horizontal panels show reconstructions of the frames from data with SNR = 45 dB (green triangle on the purple curve in Fig. 3(a)) and SNR = 25 dB (green square on the blue curve in Fig. 3(a)), respectively.
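The measurement of Eq. (3) differs from Eq. (2) only in that each pulse carries its own probe. A sketch of this forward model (with hypothetical toy probes and frames) makes the structure explicit:

```python
import numpy as np

def timp_block(probes, frames, shift):
    """Eq. (3): one recorded block is the incoherent sum over pulses,
    I_m(nu) = sum_k |F[P_k(r - R_m) O_k(r)]|^2,
    where pulse k sees its own (known) probe P_k, realized e.g. by
    spectrally or polarization selective pinholes."""
    total = np.zeros(frames[0].shape)
    for probe_k, obj_k in zip(probes, frames):
        shifted = np.roll(probe_k, shift, axis=(0, 1))   # P_k(r - R_m)
        total += np.abs(np.fft.fft2(shifted * obj_k)) ** 2
    return total

# Hypothetical toy data: 3 pulses, 3 distinct probes, 3 object frames
n, K = 32, 3
rng = np.random.default_rng(2)
probes = [rng.normal(size=(n, n)) + 0j for _ in range(K)]
frames = [np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n))) for _ in range(K)]
block = timp_block(probes, frames, (3, 3))
```

Because the probes P_k are distinct and known, the per-pulse terms in the sum are no longer interchangeable, which is what restores uniqueness (and the frame order) for general complex-valued objects.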


Fig. 3 Numerical demonstration of TIMP for imaging the ultrafast dynamics of a complex-valued object. (a) Mean NMSE between the original and recovered frames as a function of the number of frames (equal to the number of pulses) for different SNR values. (b) NMSE between the diffraction patterns fed to the algorithm and the recovered diffraction patterns at each iteration of the algorithm (denoted NMSE_F). The number of frames is encoded by the line colors and the SNR values by the line dash types. (c) Exemplary reconstruction results: the first horizontal panel presents the original set of 6 complex frames (amplitude and phase); the second panel presents the reconstruction from a noiseless data set (green circle mark on the black curve in Fig. 3(a)); the third and bottom horizontal panels show reconstructions of the frames from data with SNR = 45 dB (green triangle mark on the purple curve in Fig. 3(a)) and SNR = 25 dB (green square mark on the blue curve in Fig. 3(a)), respectively.


5. Experimental demonstration of ultrafast ptychography

In previous experiments, SSP was demonstrated with practically monochromatic continuous-wave illumination [23,24]. As a step toward an experimental demonstration of TIMP, we demonstrate single-pulse SSP, proving that a spectrally broad ultrashort laser pulse with limited flux (limited by the damage threshold of the samples) can be used in SSP. The experimental setup is shown schematically in Fig. 4(a). A single pulse is picked by an electro-optic modulator from a pulse train produced by a Ti:sapphire chirped-pulse-amplifier laser system. The pulse width is ~150 ps, yet its spectrum, centered at 800 nm, can support a 30 fs pulse duration (we used a chirped pulse to avoid nonlinear effects associated with high peak power). Part of the pulse is converted to its second harmonic by a beta barium borate (BBO) crystal (we use the second harmonic of 800 nm because our optical setup was designed for a 405 nm wavelength [24]). A dichroic beam splitter then separates the blue pulse from the leftover 800 nm pulse. The blue pulse illuminates an array of 21 circular pinholes (placed on a square grid with 1.6 mm pitch) located at the input plane of a symmetric 4f lens system with f = 75 mm. The object (a 1951 USAF resolution target) is located d = 15 mm before the Fourier plane of the 4f system. Light diffracted by the object is collected by lens L2 and captured by a CCD camera. Figure 4(b) presents the square root of the intensity pattern captured by the camera without the object (used for calibrating the imaging system, i.e. for locating the centers of the blocks and for power normalization of each block [24]). The square root of the pattern measured with the object is presented in Fig. 4(c); the 21 diffraction blocks are clearly distinguishable. Next, we apply the extended Ptychographical Iterative Engine (ePIE) [22] reconstruction algorithm and obtain the results in Figs. 4(d)-4(g).
Figures 4(d) and 4(e) display the reconstructed intensity and phase of the object, respectively, matching well a known section of the 1951 USAF resolution target. Figures 4(f) and 4(g) show the reconstructed probe beam intensity and phase, respectively, resembling the structure of diffraction from a circular pinhole. This section demonstrates the viability of single-pulse SSP in a real experiment with an illumination spectrum broad enough to support femtosecond pulses.


Fig. 4 Experimental demonstration of ultrafast single-pulse SSP. (a) Scheme of the setup. A single 150 ps, 800 nm pulse from a Ti:sapphire laser amplifier is converted in a BBO crystal to a 400 nm pulse that illuminates a square array of 21 pinholes. The array is located at the input plane of a symmetric 4f system with f = 75 mm. The object (a 1951 USAF resolution target) is located d = 15 mm before the Fourier plane of the 4f system. The CCD is located at the output face of the 4f system. Measured diffraction patterns without (b) and with (c) the object. (d) and (e) display the reconstructed intensity and phase of the object, respectively, while (f) and (g) show the reconstructed probe beam intensity and phase, respectively.


6. Alternative scheme for SSP and TIMP

Finally, we propose an alternative scheme for SSP (and TIMP) in which the imaged object can be outside, possibly far away from, the imaging system. This scheme is based on splitting the incoming beam into two (or more) imaging arms, hence we name it BsSSP. The incoming coherent beam, reflected from, transmitted through, or emitted by the imaged object, is split into two (or more) arms (Fig. 5(a)). In each arm, a blocked grating array (GA) diffracts different sections of the beam to different zones on the CCD, so the CCD records a set of intensity patterns originating from different (non-overlapping) sections of the beam that illuminated the GA. To obtain the overlap required for ptychography, the GA in arm 2 is shifted with respect to the GA in arm 1. The total set of recorded patterns is used for ptychographic reconstruction of the complex beam, E(x,y), that illuminated the GAs. The complex-valued image of the object is obtained by back-propagating the recovered field to the object plane.


Fig. 5 Schematic diagram of BsSSP and TIMP. (a) The incoming coherent beam, which is reflected, transmitted or emitted from the object, is split into the two arms of the BsSSP setup. In each arm, the incoming beam passes through a grating array described by Eq. (4). The diffracted intensity patterns are captured by cameras in the far field. The different phase factors of the gratings in the array induce sufficiently different propagation angles for each grating, resulting in a block structure of the intensity pattern on the camera. The intensity pattern in each block corresponds to the magnitude of the Fourier transform of the near field after each grating (Fraunhofer approximation). Thus, the patterns captured by the cameras can be described by Eq. (6), which resembles the form of ptychographic measurements. Partial overlap between the sections contributing to different diffraction patterns is obtained by introducing a transversal shift between the GAs. (b) Schematic diagram of TIMP based on BsSSP.


Mathematically, the transmission function of the GA is given by:

$$G(\mathbf{r}) = \sum_{m=1}^{M}\hat{G}(\mathbf{r}-\mathbf{R}_m)\,e^{i\mathbf{k}_m\cdot\mathbf{r}} \tag{4}$$
Here Ĝ(r − R_m) is a pupil function, identical for each grating in the array, the exp(i k_m·r) terms are linear phase factors that differ between gratings, and m = 1,2,3…M is the index of the grating in the array (M is the total number of gratings). The diffracted intensity pattern captured by a camera after free-space propagation (we assume here the Fraunhofer approximation; alternatively one can use a lens) is described by
$$I(\nu) = \left|\mathcal{F}\!\left[P(\mathbf{r})\sum_{m=1}^{M}\hat{G}(\mathbf{r}-\mathbf{R}_m)\,e^{i\mathbf{k}_m\cdot\mathbf{r}}\right]\right|^2 \tag{5}$$
In Eq. (5), ν and r are the spatial vectors in the CCD and GA planes, respectively, F stands for the two-dimensional spatial Fourier operator and P is the complex incoming beam at the GA plane. Since the effect of the linear phase factors exp(i k_m·r) is to shift the diffraction patterns laterally in the camera plane, the captured intensity pattern, Eq. (5), consists of well-separated diffraction patterns located in M blocks. Thus, we can rewrite Eq. (5) in the form
$$I_m(\nu) = \left|\mathcal{F}\!\left[P(\mathbf{r})\,\hat{G}(\mathbf{r}-\mathbf{R}_m)\right]\right|^2 \tag{6}$$
where I_m(ν) is the intensity pattern in each block. Equation (6) has the form of typical ptychographic measurements [19], but there is no overlap between adjacent pupil functions Ĝ. To obtain the required overlap, the GA in the second arm should have the same structure but be displaced transversally with respect to the first GA (Fig. 5(a)). The measurements in the two (or more) CCDs then constitute a ptychographic set, and standard algorithms [21,22] can be used to recover the complex field at the GA. Finally, to obtain the complex image of the object, one propagates the recovered complex field at the GA plane backward to the object plane.
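The step from Eq. (5) to Eq. (6) relies only on the Fourier shift theorem: the linear phase of each grating rigidly translates its diffraction pattern on the camera, so a single exposure separates into M ptychography-style blocks. A quick numerical check of this property (arbitrary toy sizes, a circular pupil standing in for one grating):

```python
import numpy as np

# Shift theorem behind Eqs. (5)-(6): multiplying the near field by
# exp(i k_m . r) only translates its far-field intensity pattern.
n = 64
rng = np.random.default_rng(3)
beam = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))  # incoming field P(r)
yy, xx = np.mgrid[0:n, 0:n]
pupil = (((xx - n // 2) ** 2 + (yy - n // 2) ** 2) < 10 ** 2)  # pupil G_hat(r - R_m)
kx, ky = 5, 9                                                  # grating carrier (integer cycles)
ramp = np.exp(2j * np.pi * (kx * xx + ky * yy) / n)            # exp(i k_m . r)

with_ramp = np.abs(np.fft.fft2(beam * pupil * ramp)) ** 2
without = np.abs(np.fft.fft2(beam * pupil)) ** 2
shifted = np.roll(without, (ky, kx), axis=(0, 1))              # pattern moved by (ky, kx) pixels
```

Here `with_ramp` and `shifted` agree to machine precision, which is exactly the block separation exploited in Eq. (6); choosing sufficiently different k_m keeps the M blocks from overlapping on the camera.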

7. Conclusions and outlook

In summary, we proposed and demonstrated numerically TIMP, a simple and versatile scheme for ultrahigh-speed imaging. Experimentally, we demonstrated a single-frame ultrafast ptychographic microscope with 150 ps temporal resolution. The proposed scheme should allow retrieving multiple frames of complex (i.e. amplitude and phase) objects from a single camera snapshot. Notably, SSP and TIMP options can be added to commercial confocal microscopes by inserting a pinhole array at the correct plane. As in SSP, the frame rate of this scanning-less microscope would be determined by the detector. Converting it to TIMP, by using a proper train of pulses for the illumination, can lead to a significant increase of the frame rate, e.g. retrieving 10 frames for each camera snapshot. Remarkably, SSP and TIMP can be implemented in every spectral region and for every type of wave for which focusing elements (lenses or zone plates; in some spectral regions it can be useful to replace the 'Fourier transforming lens' by free propagation) and pulsed illumination are accessible. These new capabilities should open many new opportunities in high-speed imaging.

It is worth noting some directions that can improve and extend the scope of TIMP: 1) In this work, different pulses did not temporally overlap on the detector, hence MsPA can be used for multi-framing. However, one can consider scenarios in which diffraction from the trailing edge of a pulse overlaps with diffraction from the leading edge of the consecutive pulse, giving rise to an interference pattern in the recorded data. This extra information may be valuable, and this line of research may eventually lead to TIMP with a single chirped pulse. 2) Another exciting direction is to utilize structure-based prior knowledge of the object in order to enhance the resolution of the recovered images [32–34]. This direction may yield ultrahigh-speed sub-wavelength imaging. 3) Utilizing the fact that the differences between consecutive frames are most often small (sparse), it should be possible to increase the number of recoverable frames per camera snapshot [35]. 4) Combining TIMP with recently developed 3D (volume) ptychography [36] should be straightforward, providing 4D microscopy of complex objects with high spatiotemporal resolution and speed, which may open new opportunities in live-cell imaging. 5) Adapting the holography-guided ptychography method [37] for TIMP should lead to significantly improved dynamic range and robustness to noise. 6) It will be interesting to explore the influence of other types of pinhole arrays (lattice structure and shape of the pinholes) on the performance of TIMP. For example, an array with diverse pinhole shapes located on a Fermat spiral grid should lead to improved robustness to noise, faster algorithmic convergence and a larger field of view [38,39]. 7) Finally, it may be possible to implement the multi-framing (i.e. creation of multiple frames) approach in other types of single-shot multiplexed measurements, e.g. digital holography [40,41], structured illumination [42] and Fourier ptychography [43,44] (to the best of our knowledge, single-shot structured illumination and single-shot Fourier ptychography have not been demonstrated yet).

Funding

Israeli Center of Research Excellence ‘Circle of Light’ (1802/12); The Wolfson Foundation.

References and links

1. M. C. Fischer, J. W. Wilson, F. E. Robles, and W. S. Warren, “Invited review article: pump-probe microscopy,” Rev. Sci. Instrum. 87(3), 031101 (2016). [CrossRef]   [PubMed]  

2. R. Kodama, P. A. Norreys, K. Mima, A. E. Dangor, R. G. Evans, H. Fujita, Y. Kitagawa, K. Krushelnick, T. Miyakoshi, N. Miyanaga, T. Norimatsu, S. J. Rose, T. Shozaki, K. Shigemori, A. Sunahara, M. Tampo, K. A. Tanaka, Y. Toyama, T. Yamanaka, and M. Zepf, “Fast heating of ultrahigh-density plasma as a step towards laser fusion ignition,” Nature 412(6849), 798–802 (2001). [CrossRef]   [PubMed]  

3. P. Hockett, C. Z. Bisgaard, O. J. Clarkin, and A. Stolow, “Time-resolved imaging of purely valence-electron dynamics during a chemical reaction,” Nat. Phys. 7(8), 612–615 (2011).

4. C. Y. Wong, R. M. Alvey, D. B. Turner, K. E. Wilk, D. A. Bryant, P. M. G. Curmi, R. J. Silbey, and G. D. Scholes, “Electronic coherence lineshapes reveal hidden excitonic correlations in photosynthetic light harvesting,” Nat. Chem. 4(5), 396–404 (2012).

5. T. Feurer, J. C. Vaughan, and K. A. Nelson, “Spatiotemporal coherent control of lattice vibrational waves,” Science 299(5605), 374–377 (2003).

6. M. Maldovan, “Sound and heat revolutions in phononics,” Nature 503(7475), 209–217 (2013).

7. Y. Acremann, C. H. Back, M. Buess, O. Portmann, A. Vaterlaus, D. Pescia, and H. Melchior, “Imaging precessional motion of the magnetization vector,” Science 290(5491), 492–495 (2000).

8. I. Radu, K. Vahaplar, C. Stamm, T. Kachel, N. Pontius, H. A. Dürr, T. A. Ostler, J. Barker, R. F. L. Evans, R. W. Chantrell, A. Tsukamoto, A. Itoh, A. Kirilyuk, T. Rasing, and A. V. Kimel, “Transient ferromagnetic-like state mediating ultrafast reversal of antiferromagnetically coupled spins,” Nature 472(7342), 205–208 (2011).

9. K. Goda, A. Ayazi, D. R. Gossett, J. Sadasivam, C. K. Lonappan, E. Sollier, A. M. Fard, S. C. Hur, J. Adam, C. Murray, C. Wang, N. Brackbill, D. Di Carlo, and B. Jalali, “High-throughput single-microparticle imaging flow analyzer,” Proc. Natl. Acad. Sci. U.S.A. 109(29), 11630–11635 (2012).

10. X. Chen, J. Wang, M. Versluis, N. de Jong, and F. S. Villanueva, “Ultra-fast bright field and fluorescence imaging of the dynamics of micrometer-sized objects,” Rev. Sci. Instrum. 84(6), 063701 (2013).

11. H. R. Petty, “Spatiotemporal chemical dynamics in living cells: from information trafficking to cell physiology,” Biosystems 83(2-3), 217–224 (2006).

12. M. El-Desouki, M. J. Deen, Q. Fang, L. Liu, F. Tse, and D. Armstrong, “CMOS image sensors for high speed applications,” Sensors (Basel) 9(1), 430–444 (2009).

13. K. Nakagawa, A. Iwasaki, Y. Oishi, R. Horisaki, A. Tsukamoto, A. Nakamura, K. Hirosawa, H. Liao, T. Ushida, K. Goda, F. Kannari, and I. Sakuma, “Sequentially timed all-optical mapping photography (STAMP),” Nat. Photonics 8(9), 695–700 (2014).

14. L. Gao, J. Liang, C. Li, and L. V. Wang, “Single-shot compressed ultrafast photography at one hundred billion frames per second,” Nature 516(7529), 74–77 (2014).

15. K. Goda, K. K. Tsia, and B. Jalali, “Amplified dispersive Fourier-transform imaging for ultrafast displacement sensing and barcode reading,” Appl. Phys. Lett. 93(13), 131109 (2008).

16. C. Lei, B. Guo, Z. Cheng, and K. Goda, “Optical time-stretch imaging: Principles and applications,” Appl. Phys. Rev. 3(1), 011102 (2016).

17. J. Marrison, L. Räty, P. Marriott, and P. O’Toole, “Ptychography – a label free, high-contrast imaging technique for live cells using quantitative phase information,” Sci. Rep. 3, 2369 (2013).

18. X. Zhu, A. P. Hitchcock, D. A. Bazylinski, P. Denes, J. Joseph, U. Lins, S. Marchesini, H.-W. Shiu, T. Tyliszczak, and D. A. Shapiro, “Measuring spectroscopy and magnetism of extracted and intracellular magnetosomes using soft X-ray ptychography,” Proc. Natl. Acad. Sci. U.S.A. 113(51), E8219–E8227 (2016).

19. J. M. Rodenburg, “Ptychography and Related Diffractive Imaging Methods,” in Advances in Imaging and Electron Physics (2008), pp. 87–184.

20. J. Miao, T. Ishikawa, I. K. Robinson, and M. M. Murnane, “Beyond crystallography: diffractive imaging using coherent x-ray light sources,” Science 348(6234), 530–535 (2015).

21. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004).

22. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109(10), 1256–1262 (2009).

23. X. Pan, C. Liu, and J. Zhu, “Single shot ptychographical iterative engine based on multi-beam illumination,” Appl. Phys. Lett. 103(17), 171105 (2013).

24. P. Sidorenko and O. Cohen, “Single-shot ptychography,” Optica 3(1), 9 (2016).

25. W. Xu, H. Xu, Y. Luo, T. Li, and Y. Shi, “Optical watermarking based on single-shot-ptychography encoding,” Opt. Express 24(24), 27922–27936 (2016).

26. P. Thibault and A. Menzel, “Reconstructing state mixtures from diffraction measurements,” Nature 494(7435), 68–71 (2013).

27. S. Cao, P. Kok, P. Li, A. M. Maiden, and J. M. Rodenburg, “Modal decomposition of a propagating matter wave via electron ptychography,” Phys. Rev. A 94(6), 063621 (2016).

28. D. J. Batey, D. Claus, and J. M. Rodenburg, “Information multiplexing in ptychography,” Ultramicroscopy 138, 13–21 (2014).

29. B. Zhang, D. F. Gardner, M. H. Seaberg, E. R. Shanblatt, C. L. Porter, R. Karl, C. A. Mancuso, H. C. Kapteyn, M. M. Murnane, and D. E. Adams, “Ptychographic hyperspectral spectromicroscopy with an extreme ultraviolet high harmonic comb,” Opt. Express 24(16), 18745–18754 (2016).

30. J. Miao, D. Sayre, and H. N. Chapman, “Phase retrieval from the magnitude of the Fourier transforms of nonperiodic objects,” J. Opt. Soc. Am. A 15(6), 1662 (1998).

31. P. Li, T. Edo, D. Batey, J. Rodenburg, and A. Maiden, “Breaking ambiguities in mixed state ptychography,” Opt. Express 24(8), 9038–9052 (2016).

32. S. Gazit, A. Szameit, Y. C. Eldar, and M. Segev, “Super-resolution and reconstruction of sparse sub-wavelength images,” Opt. Express 17(26), 23920–23946 (2009).

33. A. Szameit, Y. Shechtman, E. Osherovich, E. Bullkich, P. Sidorenko, H. Dana, S. Steiner, E. B. Kley, S. Gazit, T. Cohen-Hyams, S. Shoham, M. Zibulevsky, I. Yavneh, Y. C. Eldar, O. Cohen, and M. Segev, “Sparsity-based single-shot subwavelength coherent diffractive imaging,” Nat. Mater. 11(5), 455–459 (2012).

34. P. Sidorenko, O. Kfir, Y. Shechtman, A. Fleischer, Y. C. Eldar, M. Segev, and O. Cohen, “Sparsity-based super-resolved coherent diffraction imaging of one-dimensional objects,” Nat. Commun. 6, 8209 (2015).

35. Y. Shechtman, Y. C. Eldar, O. Cohen, and M. Segev, “Efficient coherent diffractive imaging for sparsely varying objects,” Opt. Express 21(5), 6327–6338 (2013).

36. T. M. Godden, R. Suman, M. J. Humphry, J. M. Rodenburg, and A. M. Maiden, “Ptychographic microscope for three-dimensional imaging,” Opt. Express 22(10), 12513–12523 (2014).

37. P. Hessing, B. Pfau, E. Guehrs, M. Schneider, L. Shemilt, J. Geilhufe, and S. Eisebitt, “Holography-guided ptychography with soft X-rays,” Opt. Express 24(2), 1840–1851 (2016).

38. Y.-S. Shi, Y.-L. Wang, and S.-G. Zhang, “Generalized ptychography with diverse probes,” Chin. Phys. Lett. 30(5), 054203 (2013).

39. X. Huang, H. Yan, R. Harder, Y. Hwu, I. K. Robinson, and Y. S. Chu, “Optimization of overlap uniformness for ptychography,” Opt. Express 22(10), 12634–12644 (2014).

40. T. Colomb, P. Dahlgren, D. Beghuin, E. Cuche, P. Marquet, and C. Depeursinge, “Polarization imaging by use of digital holography,” Appl. Opt. 41(1), 27–37 (2002).

41. J. Kühn, T. Colomb, F. Montfort, F. Charrière, Y. Emery, E. Cuche, P. Marquet, and C. Depeursinge, “Real-time dual-wavelength digital holographic microscopy with a single hologram acquisition,” Opt. Express 15(12), 7231–7242 (2007).

42. S. Dong, K. Guo, S. Jiang, and G. Zheng, “Recovering higher dimensional image data using multiplexed structured illumination,” Opt. Express 23(23), 30393–30398 (2015).

43. L. Tian, X. Li, K. Ramchandran, and L. Waller, “Multiplexed coded illumination for Fourier Ptychography with an LED array microscope,” Biomed. Opt. Express 5(7), 2376–2389 (2014).

44. S. Dong, R. Shiradkar, P. Nanda, and G. Zheng, “Spectral multiplexing and coherent-state decomposition in Fourier ptychographic imaging,” Biomed. Opt. Express 5(6), 1757–1767 (2014).



Figures (5)

Fig. 1. (a) Schematic diagram of the single-shot ptychographic microscope with ray tracing. An array of pinholes is located at the input plane of a 4f system. Lens L1 focuses the beams that diffract from the array onto the object, which is located a distance d before the back focal plane of L1. Lens L2 focuses the light diffracted from the object onto the CCD, located at the output plane of the 4f system, resulting in blocks of diffraction patterns, where each block corresponds to a region of the object illuminated by the beam originating from one of the pinholes. (b) Schematic diagram of TIMP based on the single-shot ptychographic microscope. The 4f SSP microscope is illuminated by a burst of several pulses. A single camera snapshot records a pattern corresponding to the sum of the diffraction intensity patterns, where each pattern results from a different pulse. Multi-frame images, each corresponding to a temporal snapshot of the sample illuminated by a different pulse, are reconstructed from the single recorded pattern.
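The block structure described in Fig. 1(a) can be sketched numerically: each pinhole beam acts as a probe P shifted to a position R_m, and the corresponding camera block is the far-field intensity |F[P(r − R_m)O(r)]|². A minimal NumPy sketch under Fraunhofer-style assumptions (a flat circular probe, a phase-only object); the helper name `ssp_forward` and all parameter values are illustrative, not from the paper:

```python
import numpy as np

def ssp_forward(obj, probe, shifts):
    """One far-field intensity block per probe position R_m = (dy, dx)."""
    blocks = []
    for dy, dx in shifts:
        # shift the probe to position R_m
        p_m = np.roll(probe, shift=(dy, dx), axis=(0, 1))
        exit_wave = p_m * obj
        far_field = np.fft.fftshift(np.fft.fft2(exit_wave))
        blocks.append(np.abs(far_field) ** 2)  # camera measures intensity only
    return np.stack(blocks)

# Usage: a 64x64 phase-only object probed at 3 overlapping positions.
rng = np.random.default_rng(0)
obj = np.exp(1j * rng.uniform(0, np.pi, (64, 64)))
yy, xx = np.mgrid[:64, :64]
probe = (((yy - 32) ** 2 + (xx - 32) ** 2) < 12 ** 2).astype(complex)
patterns = ssp_forward(obj, probe, shifts=[(0, 0), (0, 8), (8, 0)])
print(patterns.shape)  # (3, 64, 64)
```

The overlap between the illuminated regions (here, 8-pixel shifts of a 24-pixel-wide probe) is what makes the ptychographic reconstruction well posed.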
Fig. 2. Numerical demonstration of TIMP with a burst of spatially identical pulses for imaging the dynamics of a phase-only object. (a) Mean NMSE between the original and recovered frames as a function of the number of frames (equal to the number of pulses) for different SNR values. (b) NMSE between the diffraction patterns fed to the algorithm and the recovered diffraction patterns at each iteration (denoted NMSEF). The number of frames is encoded by line color and the SNR value by dash type. (c) Exemplary reconstruction results: the first horizontal panel presents the original set of 6 phase-only frames; the second panel presents the reconstruction of the set from a noiseless diffraction pattern (green circle mark on the black curve in Fig. 2(a)); the third and bottom panels show reconstructions from data with SNR = 45 dB (green triangle mark on the purple curve in Fig. 2(a)) and SNR = 25 dB (green square mark on the blue curve in Fig. 2(a)), respectively. Note that in this case the order of the frames was not recovered algorithmically.
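The captions quantify reconstruction quality by a normalized mean-squared error (NMSE) between original and recovered frames. The paper does not reproduce its exact formula in this excerpt, so the following is a plausible sketch of such a metric; it also removes the global-phase ambiguity inherent to phase retrieval before comparing complex fields (the helper name `nmse` is ours):

```python
import numpy as np

def nmse(recovered, original):
    """Normalized MSE, invariant to a global phase factor exp(i*phi)."""
    # optimal global phase aligning `recovered` to `original`
    phi = np.angle(np.vdot(recovered, original))
    aligned = recovered * np.exp(1j * phi)
    return np.sum(np.abs(aligned - original) ** 2) / np.sum(np.abs(original) ** 2)

rng = np.random.default_rng(1)
frame = np.exp(1j * rng.uniform(0, np.pi, (32, 32)))
print(nmse(frame * np.exp(1j * 0.7), frame))  # ~0: global phase is removed
```

Without the phase alignment, a perfectly reconstructed frame carrying an arbitrary overall phase would register a spuriously large error.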
Fig. 3. Numerical demonstration of TIMP for imaging the ultrafast dynamics of a complex-valued object. (a) Mean NMSE between the original and recovered frames as a function of the number of frames (equal to the number of pulses) for different SNR values. (b) NMSE between the diffraction patterns fed to the algorithm and the recovered diffraction patterns at each iteration (denoted NMSEF). The number of frames is encoded by line color and the SNR value by dash type. (c) Exemplary reconstruction results: the first horizontal panel presents the original set of 6 complex-valued frames (amplitude and phase); the second panel presents the reconstruction from the noiseless data set (green circle mark on the black curve in Fig. 3(a)); the third and bottom panels show reconstructions from data with SNR = 45 dB (green triangle mark on the purple curve in Fig. 3(a)) and SNR = 25 dB (green square mark on the blue curve in Fig. 3(a)), respectively.
Fig. 4. Experimental demonstration of ultrafast single-pulse SSP. (a) Scheme of the setup. A single 150 ps, 800 nm pulse from a Ti:sapphire laser amplifier is converted in a BBO crystal to a 400 nm pulse and illuminates a square array of 21 pinholes. The array is located at the input plane of a symmetric 4f system with f = 75 mm. The object (a 1951 USAF resolution target) is located d = 15 mm before the Fourier plane of the 4f system. The CCD is located at the output plane of the 4f system. Measured diffraction patterns without (b) and with (c) the object. (d) and (e) display the reconstructed intensity and phase of the object, respectively, while (f) and (g) show the reconstructed probe-beam intensity and phase, respectively.
Fig. 5. Schematic diagram of BsSSP and TIMP. (a) The incoming coherent beam, which is reflected, transmitted, or emitted from the object, is split into the two arms of the BsSSP setup. In each arm, the beam passes through a grating array described by Eq. (4). The diffracted intensity patterns are captured by cameras in the far field. The different phase factors of the gratings in the array induce a sufficiently different propagation angle for each grating, resulting in a block structure of the intensity pattern on the camera. The intensity pattern in each block corresponds to the magnitude of the Fourier transform of the near field after each grating (Fraunhofer approximation). Thus, the patterns captured by the cameras can be described by Eq. (6), which resembles the form of ptychographic measurements. Partial overlap between the sections contributing to different diffraction patterns is obtained by introducing a transverse shift between the grating arrays. (b) Schematic diagram of TIMP based on BsSSP.
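The grating array of Eq. (4) — shifted copies of a base grating Ĝ at positions R_m, each multiplied by a linear phase ramp exp(i k_m · r) that steers its diffraction into a separate camera block — can be constructed in a few lines. A NumPy sketch with an illustrative (trivially uniform) base grating; the function name `grating_array` and all values are ours, not the paper's:

```python
import numpy as np

def grating_array(g_hat, centers, tilts, shape):
    """G(r) = sum_m Ĝ(r - R_m) exp(i k_m · r) on a discrete grid."""
    yy, xx = np.mgrid[:shape[0], :shape[1]].astype(float)
    G = np.zeros(shape, dtype=complex)
    h, w = g_hat.shape
    for (cy, cx), (ky, kx) in zip(centers, tilts):
        ramp = np.exp(1j * (ky * yy + kx * xx))   # linear tilt exp(i k_m · r)
        patch = np.zeros(shape, dtype=complex)
        patch[cy:cy + h, cx:cx + w] = g_hat       # Ĝ placed at R_m
        G += patch * ramp
    return G

g_hat = np.ones((4, 4), dtype=complex)            # trivial base grating
G = grating_array(g_hat, centers=[(0, 0), (0, 10)],
                  tilts=[(0.0, 0.0), (0.5, 0.5)], shape=(16, 16))
print(G.shape)  # (16, 16)
```

Choosing the tilts k_m far enough apart is what keeps the blocks on the camera from overlapping, as the caption notes.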

Equations (6)


$$I_m(\nu) = \left| \mathcal{F}\!\left[ P(\mathbf{r} - \mathbf{R}_m)\, O(\mathbf{r}) \right] \right|^2 \tag{1}$$

$$I_m(\nu) = \sum_{k=1}^{K} \left| \mathcal{F}\!\left[ P(\mathbf{r} - \mathbf{R}_m)\, O_k(\mathbf{r}) \right] \right|^2 \tag{2}$$

$$I_m(\nu) = \sum_{k=1}^{K} \left| \mathcal{F}\!\left[ P_k(\mathbf{r} - \mathbf{R}_m)\, O_k(\mathbf{r}) \right] \right|^2 \tag{3}$$

$$G(\mathbf{r}) = \sum_{m=1}^{M} \hat{G}(\mathbf{r} - \mathbf{R}_m) \exp(i \mathbf{k}_m \cdot \mathbf{r}) \tag{4}$$

$$I(\nu) = \left| \mathcal{F}\!\left[ P(\mathbf{r}) \sum_{m=1}^{M} \hat{G}(\mathbf{r} - \mathbf{R}_m) \exp(i \mathbf{k}_m \cdot \mathbf{r}) \right] \right|^2 \tag{5}$$

$$I_m(\nu) = \left| \mathcal{F}\!\left[ P(\mathbf{r})\, \hat{G}(\mathbf{r} - \mathbf{R}_m) \right] \right|^2 \tag{6}$$
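The multiplexed measurement underlying TIMP — each diffraction block is the incoherent sum over the K object states O_k, one per pulse: I_m = Σ_k |F[P(r − R_m) O_k(r)]|² — can be simulated directly. A NumPy sketch; the name `timp_pattern` and all parameter values are illustrative, not the paper's:

```python
import numpy as np

def timp_pattern(frames, probe, shifts):
    """Incoherent sum of per-pulse diffraction intensities (Eq. (2) form)."""
    patterns = np.zeros((len(shifts),) + frames[0].shape)
    for obj_k in frames:                        # one object state per pulse
        for m, (dy, dx) in enumerate(shifts):   # one block per probe position
            p_m = np.roll(probe, (dy, dx), axis=(0, 1))
            patterns[m] += np.abs(np.fft.fft2(p_m * obj_k)) ** 2
    return patterns

rng = np.random.default_rng(2)
frames = [np.exp(1j * rng.uniform(0, np.pi, (32, 32))) for _ in range(3)]
probe = np.zeros((32, 32), dtype=complex)
probe[8:24, 8:24] = 1.0
I = timp_pattern(frames, probe, shifts=[(0, 0), (0, 6)])
print(I.shape)  # (2, 32, 32)
```

Because the pulses arrive within a single camera exposure, the intensities add with no cross terms; the reconstruction algorithm must then unmix the K states from this sum, which is the multiplexing problem TIMP solves.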