Experimental time-resolved imaging by multiplexed ptychography

Open Access

Abstract

A recently proposed technique adds a time-resolved imaging capability for fast, non-repetitive transient events to ptychographic microscopy. This technique, termed time-resolved imaging by multiplexed ptychography (TIMP), is based on algorithmic reconstruction of multiple frames from data recorded in a single camera acquisition of a single-shot ptychographic microscope. We demonstrate TIMP experimentally, reconstructing thirty-six frames of a dynamical complex-valued object from ptychographic data recorded in a single camera snapshot.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Ptychography [1] is a powerful coherent diffractive imaging (CDI) [2] technique, yielding label-free, high-contrast, quantitative amplitude and phase information without requiring prior information (e.g. a support) on the object or the probe beam. In a conventional ptychographic microscope, a complex-valued object is scanned stepwise through a localized beam. At each step, the far-field intensity diffraction pattern from the illuminated region of the object is recorded. Critically, the illumination spot at each step overlaps substantially with neighboring spots, resulting in significant redundancy in the measured data, which enables robust amplitude and phase reconstruction of the object using an iterative phase retrieval algorithm [3,4].
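
For readers less familiar with the algorithmic side, the sketch below illustrates one sweep of an ePIE-style update in the spirit of [4]. It is our own illustrative NumPy code, not the reconstruction software used in this work; the array names, scan-position format, and step sizes are assumptions.

```python
import numpy as np

def epie_sweep(obj, probe, positions, patterns, alpha=1.0, beta=1.0):
    """One sweep of an ePIE-style ptychographic update (illustrative sketch).

    obj       : complex 2D array, current object estimate
    probe     : complex 2D array (smaller than obj), current probe estimate
    positions : list of (row, col) top-left corners of the illuminated regions
    patterns  : list of measured far-field intensity patterns, one per position
    """
    ny, nx = probe.shape
    for (r, c), I_meas in zip(positions, patterns):
        O_patch = obj[r:r + ny, c:c + nx]
        psi = probe * O_patch                                   # exit wave
        Psi = np.fft.fft2(psi)                                  # propagate to the detector
        Psi = np.sqrt(I_meas) * np.exp(1j * np.angle(Psi))      # enforce the measured modulus
        diff = np.fft.ifft2(Psi) - psi
        # standard ePIE object and probe updates
        obj[r:r + ny, c:c + nx] += alpha * np.conj(probe) * diff / (np.abs(probe).max() ** 2)
        probe += beta * np.conj(O_patch) * diff / (np.abs(O_patch).max() ** 2)
    return obj, probe
```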

A limitation of conventional ptychography is the long acquisition time (> 1ms) imposed by the scanning, precluding the application of ptychography to imaging of fast dynamics. To overcome this restriction, single-shot ptychography (SSP) schemes, in which ptychographic data (multiple intensity diffraction patterns from overlapping regions) are recorded in a single camera exposure, were proposed and demonstrated [5–9]. The recorded data are divided into zones, each containing approximately one diffraction pattern that is associated with a known localized region of the object. Notably, ultrafast SSP (a single frame of a static object) was demonstrated using illumination by a single 150 psec pulse [7].

Recently, time-resolved imaging by multiplexed ptychography (TIMP) was proposed as a promising approach to ultrahigh-speed, high-resolution imaging of complex-valued objects [7]. In TIMP, an SSP system is illuminated by a burst of pulses that is much faster than the integration time of the sensor, so the diffraction patterns from all the pulses are summed and recorded in a single camera snapshot. In order to produce a movie of the event from the recorded multiplexed ptychographic data, i) the burst should consist of different (preferably mutually orthogonal) probe pulses and ii) the ordinary ptychographic reconstruction algorithm should be replaced by a multi-state ptychographic algorithm (MsPA) [10,11]. TIMP offers exciting possibilities for single-shot ultrahigh-speed imaging [12]. First, due to its relative simplicity, it should be applicable across the electromagnetic spectrum, including the extreme UV and x-ray spectral regions. Second, in TIMP, the spatial resolution and frame rate can be largely uncoupled from the number of frames (the cost of increasing the number of frames can instead be paid by reducing the field of view or by increasing the complexity of the microscope [7]). Thus, TIMP may allow ultrahigh-speed microscopy of complex-valued objects with submicrometer and picosecond resolution scales. While multiplexed single-shot ptychography was demonstrated for static polarization-sensitive objects [13], TIMP has not yet been demonstrated experimentally.

Here we demonstrate TIMP experimentally, reconstructing up to thirty-six complex-valued images from data recorded in a single camera snapshot. We explore two different schemes for producing a burst of pulses that are mutually orthogonal. The first one achieves high mutual orthogonality by spatially modulating the probes with different vortex-like phases. The second method modulates the probe beams with different linear phases. We first explore both methods separately and then combine them in order to increase the sequence depth (i.e., the number of frames captured in one acquisition). This work constitutes an important step in the development of an ultrahigh-speed high-resolution ptychographic microscope.

2. Experimental setup

Our TIMP setup is based on SSP through a 4f system (4fSSP) [6,7,13,14] (Fig. 1(a)). In 4fSSP, an array of pinholes is located at the input plane of a 4f system. Lens L1 focuses the beams that diffract from the array onto the object, which is located at a distance d before the back focal plane of lens L1. The small displacement from the Fourier plane, d, creates a partial overlap between the beams, which is essential in ptychography. Lens L2 collects the light diffracted from the object and transfers it to a camera at the output plane of the 4f system. Assuming that the spatial power spectrum of the object is largely confined to a low-frequency region, the camera measures an intensity pattern consisting of clearly distinguishable blocks. Each block contains a diffraction pattern associated with a beam originating from a single pinhole, and carries spectral information about a specific region of the object plane.

Fig. 1 (a) Schematic diagram of the 4fSSP microscope with ray tracing. (b) Schematic diagram of our TIMP microscope, which is based on 4fSSP. A phase-only reflective SLM (SLMP) replaces the static pinhole array in (a). The dynamic object is produced by a reflective amplitude-only SLM (SLMO). (c) The fixed phase component induced by SLMP, which mimics a micro-lens array (MLA).

In our experiment (Fig. 1(b)), the optical setup comprises a 520nm diode laser that is electrically modulated in time, emitting a burst of pulses, each with a duration of τ = 5msec and a pulse separation of Δt = 16.66msec (60Hz repetition rate). The beam is coupled into a single-mode fiber for spatial filtering, then spatially magnified by a telescope (not shown in Fig. 1), and enters a modified 4fSSP setup with f1 = f2 = 100mm.

In order to vary the probe beams in time, we replaced the static pinhole array by a reflective HOLOEYE PLUTO-2 phase-only spatial light modulator (SLM) that generates a tunable mask-like beam structure on the input plane of the 4f system. The induced phase mask consists of a fixed component and an additional component that varies between pulses.

Specifically, we set the fixed phase component such that the SLM acts as a micro-lens array (MLA), producing an effective pinhole array at a focal distance, fMLA, downstream from the SLM. Hence, the SLM is located fMLA before the input plane of the 4f system. The programmed fixed-component phase mask is shown in Fig. 1(c). It is an array of 20 phase masks of the form Φ0(r) = exp(iπr²/(λfMLA)), where r is the distance from the center of a single micro-lens, defined locally within each block. We determined fMLA according to fMLA = πbD/(4λ), where b = 1.4mm is the distance between consecutive lenses/pinholes, D = 120μm is the required effective spot/pinhole size at the input plane of the 4f system, and λ = 520nm is the illumination wavelength.
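
As an illustration, the following sketch (our own code, with the grid size and tiling assumed rather than taken from the setup) builds the fixed MLA-like phase for a 4 × 5 lenslet array from the quadratic phase Φ0 and the fMLA = πbD/(4λ) relation quoted above.

```python
import numpy as np

# Parameters quoted in the text
wavelength = 520e-9        # illumination wavelength (m)
b = 1.4e-3                 # lenslet / pinhole pitch (m)
D = 120e-6                 # required effective spot size at the 4f input plane (m)
pixel = 8e-6               # SLM pixel pitch (m)
f_mla = np.pi * b * D / (4 * wavelength)        # focal length of the programmed lenslets

n_pix = int(round(b / pixel))                   # SLM pixels across one lenslet
x = (np.arange(n_pix) - n_pix / 2) * pixel
X, Y = np.meshgrid(x, x)
single_lens = np.exp(1j * np.pi * (X**2 + Y**2) / (wavelength * f_mla))   # Phi_0(r) of one lenslet

mla_phase = np.tile(single_lens, (5, 4))        # tile into the 4 x 5 (Nx x Ny) lenslet array
print(mla_phase.shape, f"f_MLA = {f_mla * 1e3:.0f} mm")
```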

In our experiment, the dynamical object is an amplitude-only SLM (HOLOEYE HED 6001 monochrome LCOS microdisplay), placed d = 16mm before the Fourier plane of the 4f system, yielding ∼75% overlap on the object plane between beams originating from neighboring pinholes. Notably, we used reflective SLMs as they typically have a smaller pixel size than transmissive SLMs (because the electronics can be placed behind the pixels), along with a considerably larger pixel fill factor. In order to further increase the contrast of the SLM-generated MLA, the SLM was slightly tilted so that the non-diffracted light is deflected outside the experiment’s numerical aperture, yielding an SNR of 40 dB. Both SLMs have 1920 × 1080 pixels, 8μm pixel pitch, 8 bit dynamic range, and 60Hz input frame rate (synchronized with the laser source).

Lens L2 transforms the exit-wave of the object plane to the spatial-frequency domain at the exit plane of the 4f system, where we placed the camera. The transformation from real-space to spatial-frequency coordinates is given by ν = r/(λf2) (the fact that the object is located a distance d + f2 before lens L2 merely adds a parabolic phase, which is not detectable by the camera), where r is a spatial coordinate vector on the camera plane. The resulting images were captured with a Basler acA2440-35um camera that has 2448 × 2048 pixels and a pixel size of 3.45 × 3.45μm2.

In order to make the most efficient use of the rectangular detection area of the camera, we sampled the object with a rectangular Nx × Ny = 4 × 5 pinhole array, yielding a cutoff frequency νmax = b/(2λf1) = 14 × 10⁻³/(2λ), i.e. ∼ 1% of the diffraction limit, and a field of view of FOV = (Nx × Ny)(bd/f1)² ≈ 1mm² [6].
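
The quoted cutoff frequency and field of view follow directly from the setup parameters; a short back-of-the-envelope check (our own, not from the paper) is:

```python
wavelength, b, f1, d = 520e-9, 1.4e-3, 100e-3, 16e-3
Nx, Ny = 4, 5

nu_max = b / (2 * wavelength * f1)         # cutoff spatial frequency, ~1.35e4 1/m
fraction = nu_max * wavelength             # relative to the 1/lambda scale, ~0.7%
fov = Nx * Ny * (b * d / f1) ** 2          # field of view, ~1e-6 m^2

print(f"nu_max = {nu_max:.3g} 1/m  (~{fraction:.1%} of 1/lambda)")
print(f"FOV    = {fov * 1e6:.2f} mm^2")
```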

In order to measure K frames with TIMP, we illuminate the system with a burst of K ≤ 36 pulses [7]. The acquisition time of the camera was set to 4sec (which is longer than KΔt), hence the recorded intensity pattern corresponds to an incoherent sum of the diffraction patterns from all the K pulses in the burst:

$$ I_m(\nu) = \sum_{k=1}^{K} \left| \mathcal{F}\!\left[ P_k(\mathbf{r} - \mathbf{R}_m)\, O_k(\mathbf{r}) \right] \right|^2 \tag{1} $$

where I, P, and O stand for the intensity, probe, and object distributions, respectively, m = 1, 2, 3, ..., Nx × Ny is the block/pinhole index, k is the frame/pulse index, and $\mathcal{F}$ stands for the 2D spatial Fourier operator.
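
Conceptually, Eq. (1) sums the single-pulse far-field intensities of each block incoherently. A minimal NumPy sketch of this forward model for one block m (with hypothetical probe and object arrays, and an integer pixel shift standing in for Rm) could look as follows:

```python
import numpy as np

def timp_block_intensity(probes, objects, block_shift):
    """Simulate the multiplexed measurement of Eq. (1) for one detector block m (sketch).

    probes      : list of K complex probe fields P_k on a common grid
    objects     : list of K complex object frames O_k on the same grid
    block_shift : (dy, dx) integer pixel shift standing in for R_m
    """
    dy, dx = block_shift
    I_m = 0.0
    for P_k, O_k in zip(probes, objects):
        P_shifted = np.roll(P_k, shift=(dy, dx), axis=(0, 1))   # P_k(r - R_m)
        far_field = np.fft.fft2(P_shifted * O_k)                # 2D Fourier operator
        I_m = I_m + np.abs(far_field) ** 2                      # incoherent sum over pulses k
    return I_m
```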

3. Experimental results

Generally, multiple frames of a dynamical complex-valued object cannot be recovered uniquely from the measurements described in Eq. (1). However, an orthogonal set of probes can lead to a unique reconstruction of both the probes and the objects [11]. Thus, in our experiment we employ the SLM to produce a burst of pulses with orthogonal complex-valued spatial profiles. Notably, according to Plancherel’s theorem [15], the orthogonality is preserved after propagation to the object’s plane.

We use two different encoding approaches to acquire mutually orthogonal pulses – an orbital angular momentum encoding (OAME), which is based on the orthogonality of Laguerre-Gaussian beams [16], and a phase gradient encoding (PGE), which transversely shifts the pattern induced by the MLA.

3.1. Orbital angular momentum encoding

First, we explore TIMP using orbital angular momentum encoding of the probe pulses. That is, the phase mask induced by SLMP is given by:

$$ \Phi_l(r, \theta) = \exp\!\left(\frac{i\pi r^2}{\lambda f_{\mathrm{MLA}}}\right) \exp(i l \theta) \tag{2} $$
where l is the angular mode index number, r and θ are the radial and angular coordinates, respectively. The orthogonality condition for the OAME is l ∈ ℤ [16].
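
A short sketch of Eq. (2) and of the mutual orthogonality of two integer-charge vortex masks is given below (our own code; the grid size, pixel pitch, and fMLA value are illustrative assumptions). The lens term cancels in the inner product, so the overlap reduces to the angular integral of exp(iΔlθ), which is small for distinct integer charges.

```python
import numpy as np

def oam_probe_mask(n_pix, pixel, wavelength, f_mla, l):
    """Lenslet phase mask of Eq. (2): quadratic lens phase times a vortex of charge l (sketch)."""
    x = (np.arange(n_pix) - n_pix / 2) * pixel
    X, Y = np.meshgrid(x, x)
    r2, theta = X**2 + Y**2, np.arctan2(Y, X)
    return np.exp(1j * np.pi * r2 / (wavelength * f_mla)) * np.exp(1j * l * theta)

m1 = oam_probe_mask(256, 8e-6, 520e-9, 0.25, l=-3)
m2 = oam_probe_mask(256, 8e-6, 520e-9, 0.25, l=2)
overlap = np.abs(np.vdot(m1, m2)) / (np.linalg.norm(m1) * np.linalg.norm(m2))
print(f"normalized overlap: {overlap:.2e}")   # small for distinct integer l (zero on a circular aperture)
```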

In each experiment, we first measured a set of 9 such probes using the digit 0 as a known single-frame reference object, and later used them as initial guesses for the probe beams in MsPA [10] reconstructions of multi-frame measurements.

Next, we used a burst of 9 such probes to illuminate the dynamical object – SLMO displaying the digits 1–9 – and recorded the integrated diffraction patterns in a single camera snapshot. Specifically, we used the following zipper-like OAME set l = {−9, −7, −5, −3, −1, 2, 4, 6, 8}. The captured snapshot is shown in Fig. 2(a). Using MsPA [10], we successfully reconstructed 9 complex-valued digits and 9 complex-valued probes (Fig. 2(b)).
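
For illustration, a minimal mixed-state update in the spirit of MsPA [10] is sketched below for a single detector block: each pulse k carries its own probe and object frame, the modulus constraint is shared among all states, and each state is then refined with an ePIE-like step. This is our own simplified sketch, not the reconstruction code used for Fig. 2.

```python
import numpy as np

def mspa_block_update(objs, probes, I_meas, alpha=1.0, beta=1.0):
    """One mixed-state update for a single block (illustrative sketch of an MsPA-like step).

    objs, probes : lists of K complex 2D arrays (one object frame and one probe per pulse)
    I_meas       : measured multiplexed intensity of this block, as in Eq. (1)
    """
    psis = [P * O for P, O in zip(probes, objs)]
    Psis = [np.fft.fft2(psi) for psi in psis]
    total = sum(np.abs(Psi) ** 2 for Psi in Psis)
    scale = np.sqrt(I_meas / (total + 1e-12))          # modulus constraint shared by all states
    for k in range(len(objs)):
        psi_new = np.fft.ifft2(scale * Psis[k])
        diff = psi_new - psis[k]
        P, O = probes[k], objs[k]
        objs[k]   = O + alpha * np.conj(P) * diff / (np.abs(P).max() ** 2 + 1e-12)
        probes[k] = P + beta * np.conj(O) * diff / (np.abs(O).max() ** 2 + 1e-12)
    return objs, probes
```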

Fig. 2 Reconstruction of 9 complex-valued objects and probes using OAME. (a) The intensity pattern recorded in a single camera snapshot. (b) Reconstructed frames – complex-valued objects and probes. Each frame is divided into 4 quadrants (as marked on the first frame): top-left is the object amplitude, top-right is the object phase, bottom-left is the probe amplitude, and bottom-right is the probe phase. The amplitudes are normalized, and the phases are in the [−π, π] range.

Two features are worth mentioning regarding OAME. First, as higher modes contain higher spatial frequencies, they broaden the diffraction patterns, limiting the resolution of the system. As shown in Fig. 2, OAME transfers the spatial spectral information to ring-like shapes with different radii. In order to minimize the overlap level between different frames, and thus improve the reconstruction quality, we chose the mode orders in the zipper-like way described above. Second, the probe mode order is also limited by the SLM pixel size. Because of the circular profiles of vortices, the required resolution increases towards the center of the vortex. Therefore, the pixel resolution of the SLM limits the number of measurable frames in a single snapshot.

3.2. Phase gradient encoding

Next, we explore TIMP with phase gradient (PG) induced masks:

$$ \Phi_{\mathbf{k}}(\mathbf{r}) = \exp\!\left(\frac{i\pi r^2}{\lambda f_{\mathrm{MLA}}}\right) \exp(i \mathbf{k} \cdot \mathbf{r}) \tag{3} $$

where k = (kx, ky) is a phase gradient mode vector and r is the spatial coordinate vector. The overlap between PG modes in our system is given by:

$$ \int_{-b/2}^{b/2} \int_{-b/2}^{b/2} \Phi_{\mathbf{k}}(\mathbf{r} - \mathbf{R}_m)\, \Phi_{\tilde{\mathbf{k}}}^{*}(\mathbf{r} - \mathbf{R}_m)\, dx\, dy = 4\, e^{i \Delta\mathbf{k} \cdot \mathbf{R}_m}\, \frac{\sin\!\left(\tfrac{b}{2} \Delta k_x\right)}{\Delta k_x}\, \frac{\sin\!\left(\tfrac{b}{2} \Delta k_y\right)}{\Delta k_y} \tag{4} $$

where k = (kx, ky) and k̃ = (k̃x, k̃y) are phase gradient mode vectors, Δk = k − k̃, and * stands for complex conjugation. According to Eq. (4), in order to form an orthogonal set (i.e., zero overlap), every pair of different PGE modes must satisfy:
$$ \Delta\mathbf{k} \in \left\{ \frac{2\pi}{b}(i, j) \;\middle|\; (i, j) \in \mathbb{Z}^2,\; (i, j) \neq (0, 0) \right\} \tag{5} $$
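
The discrete counterpart of Eqs. (4) and (5) is easy to verify numerically: on a uniform grid covering one b × b cell, PG modes whose indices differ by integers are orthogonal to machine precision. A short check (our own sketch, with an assumed sampling grid) is:

```python
import numpy as np

b, n_pix = 1.4e-3, 256
x = np.linspace(-b / 2, b / 2, n_pix, endpoint=False)
X, Y = np.meshgrid(x, x)

def pg_mode(i, j):
    """Linear-phase factor exp(i k.r) of Eq. (3) with k = (2*pi/b) * (i, j) (lens term omitted)."""
    kx, ky = 2 * np.pi * i / b, 2 * np.pi * j / b
    return np.exp(1j * (kx * X + ky * Y))

# Discrete version of the overlap integral in Eq. (4) over one b x b cell
overlap = np.abs(np.sum(np.conj(pg_mode(0, 0)) * pg_mode(1, 2))) / n_pix**2
print(f"|overlap| = {overlap:.2e}")   # ~0 when the index difference is a nonzero integer pair, as Eq. (5) requires
```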

Notably, a PG increment Δk = 2π/b displaces the output intensity diffraction patterns on the camera plane by Δr = λfMLA/b (for magnification M = f2/f1 = 1), thus limiting the achievable spatial spectral content of the original system. According to Eq. (5), for objects with a typical bandwidth of Δkobj = nνmax with 0 ≤ n ≤ 1, the highest PG mode contained in the available spectral width is given by:

$$ \Delta k_{\max} < \nu_{\max} - \Delta k_{\mathrm{obj}} = (1 - n)\frac{b}{2\lambda f_1} \;\Rightarrow\; i_{\max} < (1 - n)\frac{b^2}{4\pi\lambda f_1} \tag{6} $$
Substituting the optical parameters into Eq. (6), we get that the highest measurable mode is imax = ⌊3 × (1 − n)⌋. Therefore, the system is capable of capturing up to 25 frames if the captured objects’ bandwidths are up to νmax/3, and up to 9 frames if the captured objects’ bandwidths are up to νmax × 2/3.
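
The frame counts quoted above follow from Eq. (6) with the setup parameters; a short check (our own, reproducing the ⌊3 × (1 − n)⌋ rule) is:

```python
import numpy as np

b, f1, wavelength = 1.4e-3, 100e-3, 520e-9
coeff = b**2 / (4 * np.pi * wavelength * f1)        # ~3.0 for these parameters
for n in (1 / 3, 2 / 3):
    i_max = int(np.floor(round(coeff) * (1 - n)))   # i_max = floor(3 * (1 - n)), as in the text
    frames = (2 * i_max + 1) ** 2                   # PG modes (i, j) with |i|, |j| <= i_max
    print(f"object bandwidth {n:.2f}*nu_max -> i_max = {i_max}, up to {frames} PGE frames")
```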

Figure 3 presents experimental results of TIMP using phase gradient encoding. As in the experiment presented in section 3.1, we reconstructed the probes in advance and used the results as initial guesses for the probes.

Fig. 3 Reconstruction of 9 complex-valued objects and probes using PGE. (a) The intensity pattern recorded in a single camera snapshot. (b) Reconstructed frames – complex-valued objects and probes. Each frame is divided into 4 quadrants (as marked on the first frame): top-left is the object amplitude, top-right is the object phase, bottom-left is the probe amplitude, and bottom-right is the probe phase. The amplitudes are normalized, and the phases are in the [−π, π] range.

3.3. Orbital angular momentum & phase gradient encoding

In both probe-encoding methods described above, the number of measurable frames is limited by the technical constraints discussed in the previous sections.

However, these limitations can be overcome by combining the two encoding methods, e.g. by equipping every PGE-displaced probe with multiple OAME modes. Thus, by using 4 OAME modes together with 9 PGE modes, we increase the sequence depth of the system to 36 frames per snapshot. Figure 4 shows the captured snapshot as well as the 36 reconstructed frames, consisting of complex-valued objects and probes.
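
A sketch of how such combined masks could be generated is shown below (our own illustration; the particular OAM charges, PG indices, and grid parameters are assumptions, since the exact combined set used in the experiment is not listed here).

```python
import numpy as np

def combined_mask(n_pix, pixel, wavelength, f_mla, b, l, i, j):
    """One lenslet phase mask combining a lens term, an OAM vortex of charge l,
    and a PG mode (i, j) as in Eqs. (2)-(3). Illustrative sketch only."""
    x = (np.arange(n_pix) - n_pix / 2) * pixel
    X, Y = np.meshgrid(x, x)
    r2, theta = X**2 + Y**2, np.arctan2(Y, X)
    lens   = np.exp(1j * np.pi * r2 / (wavelength * f_mla))
    vortex = np.exp(1j * l * theta)
    grad   = np.exp(1j * 2 * np.pi * (i * X + j * Y) / b)
    return lens * vortex * grad

oam_set = [-3, -1, 2, 4]                                    # assumed example charges
pg_set  = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1)]  # 9 PG modes
masks = [combined_mask(175, 8e-6, 520e-9, 0.25, 1.4e-3, l, i, j)
         for l in oam_set for (i, j) in pg_set]
print(len(masks))   # 36 distinct probe encodings
```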

Fig. 4 Reconstruction of 36 complex-valued objects and probes using the combined OAM and PG encodings. (a) The intensity pattern recorded in a single camera snapshot. (b) Reconstructed frames – complex-valued objects and probes. Each frame is divided into 4 quadrants (as marked on the first frame): top-left is the object amplitude, top-right is the object phase, bottom-left is the probe amplitude, and bottom-right is the probe phase. The amplitudes are normalized, and the phases are in the [−π, π] range.

3.4. Image quality comparison

Above, we demonstrated TIMP reconstruction of multiple frames using three probe-encoding methods. Next, we evaluate the performance of these methods. We implemented three comparison criteria: radially averaged spectral width, peak signal-to-noise ratio (PSNR), and structural similarity (SSIM) [17]. The spectral width indicates the resolution obtained by each method, while PSNR and SSIM are commonly used image quality metrics that indicate the visual similarity (in real space) between the retrieved and original objects. For the evaluation, we used the digit 8, as it is relatively symmetrical (important for the radial averaging to be indicative) but still more geometrically complex than a simple circle or ellipse.
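
For reference, both metrics are available in scikit-image; the sketch below shows how such a comparison could be computed between a reconstructed amplitude image and the ground truth (the function and array names are ours, not the authors' evaluation code).

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def compare_amplitudes(ground_truth, reconstruction):
    """PSNR and SSIM of a reconstructed amplitude against the ground truth (sketch)."""
    gt = ground_truth / ground_truth.max()               # normalize both images to [0, 1]
    rec = np.abs(reconstruction)
    rec = rec / rec.max()
    psnr = peak_signal_noise_ratio(gt, rec, data_range=1.0)
    ssim = structural_similarity(gt, rec, data_range=1.0)
    return psnr, ssim
```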

Figures 5(a)–5(e) show the frames that we used in this comparison. The original object (ground truth) that we used was a real-valued image with no noise (Fig. 5(a)). Figure 5(b) shows the SSP reconstruction of the same object – its performance corresponds to the upper limit of our current setup before introducing multi-framing. Figures 5(c) and 5(d) display the TIMP reconstruction of the 8th frame (out of 9 frames) using OAME and PGE, respectively. The 9th reconstructed frame (out of 36 frames) using the combined OAME and PGE encoding is shown in Fig. 5(e).

Fig. 5 The amplitude images and radially averaged spatial spectra that were used for comparison in Table 1. (a) the original (ground truth) image, (b) SSP reconstruction, (c) OAME TIMP reconstruction, (d) PGE TIMP reconstruction, (e) OAME+PGE (combined) TIMP reconstruction. (f) Radial average of the spatial spectra of (a)–(e). The dashed red line marks half of the maximum for FWHM comparison.

The radially averaged spatial spectra are shown in Fig. 5(f). The spectral width (FWHM) as well as the PSNR and SSIM values for each method are listed in Table 1. The SSP line shows that our SSP setup does not give rise to high-resolution microscopy. Importantly, however, comparing the SSP and TIMP lines indicates that the multiplexing component in TIMP does not significantly reduce the resolution of the system further. Strikingly, OAME largely maintained the reconstruction quality, even though it contains eight more frames than SSP. Also noticeable, PGE exhibits poorer performance than SSP and OAME. This is likely because the PGE orthogonality condition in Eq. (5) is more sensitive than OAME to pixelization and to inaccuracies in the phase masks induced by the SLM, which has limited resolution and dynamic range. The performance of the reconstruction with the combined encoding is, naturally, the poorest.
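
The radially averaged spectrum used for the FWHM comparison can be computed along the following lines (our own sketch; the binning and normalization choices are assumptions).

```python
import numpy as np

def radial_average_spectrum(image, n_bins=64):
    """Radially averaged magnitude of the 2D spatial spectrum, normalized to its peak (sketch)."""
    F = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(y - ny / 2, x - nx / 2)
    bins = np.linspace(0, r.max(), n_bins + 1)
    idx = np.digitize(r.ravel(), bins) - 1
    profile = np.array([F.ravel()[idx == k].mean() if np.any(idx == k) else 0.0
                        for k in range(n_bins)])
    return profile / profile.max()

def width_above_half_max(profile):
    """Rough FWHM proxy: number of radial bins where the profile exceeds half its maximum."""
    return int(np.count_nonzero(profile >= 0.5 * profile.max()))
```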

Table 1. Reconstructions Comparison

3.5. Measuring complex-valued spatial profiles of pulses in a burst

Mathematically, the object and the probe beam play symmetric roles in ptychography (the measured signal is a function of their product [1]), hence they are interchangeable. Indeed, ptychography is sometimes used for characterizing the probe beams using a known object [18,19]. Applying this exchange of roles between the objects and probe beams to TIMP offers a method for characterizing the (different) spatial profiles of the pulses in a pulse train. We show below a proof-of-concept demonstration of this possibility. To this end, we apply our TIMP reconstruction algorithm on the data presented in Fig. 2(a), where the reconstructed frames presented in Fig. 2(b) are used as initial guesses for the objects and an SSP reconstruction of a TEM00 profile was used as an initial guess for all the probe beams. The reconstruction results are presented in Fig. 6, showing very good correspondence of the reconstructed probe beam functions to those presented in Fig. 2. Hence, we conclude that by using a known dynamic object, TIMP is indeed capable of measuring the spatial complex-valued profiles of pulses in a burst.

Fig. 6 Reconstruction of 9 complex-valued spatial profiles of pulses in a burst from the measured data shown in Fig. 2(a). Each frame is divided into 2 halves (as marked on the first frame): amplitude on the left and phase on the right. The amplitudes are normalized, and the phases are in the [−π, π] range.

4. Conclusion

In summary, time-resolved imaging by multiplexed ptychography (TIMP) is a new scheme for multi-frame imaging from data recorded in a single camera snapshot [7]. In this work we demonstrated TIMP experimentally, reconstructing up to thirty-six frames that were recorded in a single camera exposure. In this experiment, the spatial resolution was low (∼300 microns) and the frame rate was 60Hz (corresponding to the frame rate of the SLMs). In order to increase the frame rate to the THz scale, one needs to generate a train of femtosecond pulses separated by picosecond-scale intervals, where each pulse is modulated (coded) uniquely such that the pulses are mutually orthogonal. A possible approach to producing such a train of pulses is to launch a single pulse into a multi-mode fiber. Due to modal dispersion, the single pulse splits into a train of pulses, each carrying a different mode of the fiber [20]. Approaches to enhancing the spatial resolution of TIMP include the use of high-NA lenses in 4fSSP, utilizing SSP based on coded apertures [21], and using short-wavelength radiation.

Funding

European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (819440-TIMP).

References

1. J. Rodenburg, “Ptychography and related diffractive imaging methods,” in Advances in Imaging and Electron Physics, Vol. 150, P. W. Hawkes, ed. (Elsevier, 2008), pp. 87–184.

2. J. Miao, T. Ishikawa, I. K. Robinson, and M. M. Murnane, “Beyond crystallography: Diffractive imaging using coherent x-ray light sources,” Science 348, 530–535 (2015).

3. J. M. Rodenburg and H. M. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85, 4795–4797 (2004).

4. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109, 1256–1262 (2009).

5. X. Pan, C. Liu, and J. Zhu, “Single shot ptychographical iterative engine based on multi-beam illumination,” Appl. Phys. Lett. 103, 171105 (2013).

6. P. Sidorenko and O. Cohen, “Single-shot ptychography,” Optica 3, 9–14 (2016).

7. P. Sidorenko, O. Lahav, and O. Cohen, “Ptychographic ultrahigh-speed imaging,” Opt. Express 25, 10997–11008 (2017).

8. X. He, C. Liu, and J. Zhu, “Single-shot Fourier ptychography based on diffractive beam splitting,” Opt. Lett. 43, 214–217 (2018).

9. X. He, C. Liu, and J. Zhu, “Single-shot aperture-scanning Fourier ptychography,” Opt. Express 26, 28187–28196 (2018).

10. P. Thibault and A. Menzel, “Reconstructing state mixtures from diffraction measurements,” Nature 494, 68 (2013).

11. P. Li, T. Edo, D. Batey, J. Rodenburg, and A. Maiden, “Breaking ambiguities in mixed state ptychography,” Opt. Express 24, 9038–9052 (2016).

12. J. Liang and L. V. Wang, “Single-shot ultrafast optical imaging,” Optica 5, 1113–1127 (2018).

13. B. K. Chen, P. Sidorenko, O. Lahav, O. Peleg, and O. Cohen, “Multiplexed single-shot ptychography,” Opt. Lett. 43, 5379–5382 (2018).

14. W. Xu, H. Xu, Y. Luo, T. Li, and Y. Shi, “Optical watermarking based on single-shot-ptychography encoding,” Opt. Express 24, 27922–27936 (2016).

15. D. J. Brady, Optical Imaging and Spectroscopy (John Wiley & Sons, 2009).

16. J. D. Jackson, Classical Electrodynamics (John Wiley & Sons, 2007).

17. A. Hore and D. Ziou, “Image quality metrics: PSNR vs. SSIM,” in 2010 20th International Conference on Pattern Recognition (IEEE, 2010), pp. 2366–2369.

18. A. Schropp, R. Hoppe, V. Meier, J. Patommel, F. Seiboth, H. J. Lee, B. Nagler, E. C. Galtier, B. Arnold, U. Zastrau, J. B. Hastings, D. Nilsson, F. Uhlén, U. Vogt, H. M. Hertz, and C. G. Schroer, “Full spatial characterization of a nanofocused x-ray free-electron laser beam by ptychographic imaging,” Sci. Rep. 3, 1633 (2013).

19. M. Guizar-Sicairos, S. Narayanan, A. Stein, M. Metzler, A. R. Sandy, J. R. Fienup, and K. Evans-Lutterodt, “Measurement of hard x-ray lens wavefront aberrations using phase retrieval,” Appl. Phys. Lett. 98, 111108 (2011).

20. R. Rokitski, P.-C. Sun, and Y. Fainman, “Study of spatial–temporal characteristics of optical fiber based on ultrashort-pulse interferometry,” Opt. Lett. 26, 1125–1127 (2001).

21. G. I. Haham, O. Peleg, P. Sidorenko, and O. Cohen, “High-resolution (diffraction limit) single-shot ptychography for ultrahigh-speed microscopy,” in Imaging and Applied Optics 2018 (DH) (Optical Society of America, 2018), paper JTh3A.4.
