Optica Publishing Group

Probe position correction based on overlapped object wavefront cross-correlation for continuous-wave terahertz ptychography

Open Access

Abstract

Continuous-wave terahertz ptychography is a promising large field-of-view lensless terahertz phase imaging method. Compared with other spectral bands, inaccurate probe positions degrade the reconstruction quality more severely. In this paper, we propose a probe position correction method for terahertz ptychography based on cross-correlation registration of the overlapped regions of the object wavefront. The translation errors can be reduced to the order of 0.01 pixels. The simulation results suggest good computational efficiency, correction accuracy, and reconstruction accuracy. We perform continuous-wave terahertz ptychography on a cicada's forewing. The subcosta and the first radius vein are distinguished after position correction. The probe position distribution reveals that the tilt angle between the object plane and the recording plane is 0.26°.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Terahertz (THz) waves, located between the microwave and infrared regions of the electromagnetic spectrum, have unique propagation and penetration properties for biomedical imaging [1,2]. The absorption of THz radiation indicates differences in the water content and density of a tissue, while the phase images reveal the refractive index and thickness; THz phase imaging is therefore more suitable for weakly absorbing biological samples [3]. THz time-domain spectroscopy (THz-TDS) retrieves both the amplitude and phase information of the pulsed THz signal through a femtosecond laser and a time delay line [4]. However, the resolution and data acquisition speed are restrained by the illumination spot size and the raster scanning mode, respectively. Continuous-wave (CW) THz imaging approaches are flourishing with the emergence of inexpensive, high-power THz sources and next-generation THz imaging instruments and components. Considering the difficulty of acquiring the phase distribution directly, either monitoring interference fringes with single-cell detectors [5–7] or recording holograms with array detectors [3,8–11] is chosen as the recording architecture for CW THz phase imaging. Quantitative amplitude and phase information of the object wavefront is retrieved through diffraction propagation algorithms. However, the limited sample size or field-of-view (FOV) of these methods hinders THz imaging applications for large biological samples. Ptychography is widely used as a lensless coherent diffraction imaging approach, in which the complex amplitude distribution of the sample is retrieved robustly from a set of diffraction patterns originating from overlapping sample illumination areas [12,13]. Both the probe function and the complex transmittance function of the specimen can be retrieved by the ePIE (extended ptychographical iterative engine) algorithm without a priori knowledge of the probe spatial distribution [14].
This imaging approach has been applied successfully in visible light, X-ray, and electron microscopy [15–18]. It has recently been demonstrated in the THz domain with a 3.1 THz far-infrared CW gas laser [19].

In ptychography, inaccuracies in the illumination probe's parameters, i.e., defocus, spherical aberration, and position errors, increase the computational cost and degrade the reconstruction quality [20]. Because the sample is usually driven by a two-axis motorized linear translation stage during data acquisition, the accuracy of the stage and the tilt between the object plane and the recording plane are the two main sources of probe position errors [21]. Other minor negative factors come from system instability, including hysteresis in the probe scan coils, thermal drift of the illuminating electron beam, and instability of X-ray sources [22,23]. Therefore, translation position correction has become important in ptychography research [21,24–34]. The nonlinear optimization approach [24], the conjugate gradient descent method [25], and the diffraction pattern gradient algorithm [26] struggle to balance sub-pixel accuracy of the final position refinement against large initial probe position errors. The annealing algorithm (also known as the position-correcting ptychographical iterative engine, pc-PIE) [27] is based on a trial-and-select strategy that tests a number of trial positions for each probe position. This method, as well as the genetic algorithm [28] and the structural-similarity-index-based machine learning method [29], can achieve both goals simultaneously at the cost of being computationally expensive. One way to reduce the computational load is to insert the cross-correlation operation into these iterative position registration algorithms. Zhang et al. accelerated the refinement process by performing cross-correlation on successive iterations of the transmittance function at every individual probe position [21]. Dwivedi et al. used the object and probe distributions reconstructed by the ptychographical iterative engine (PIE) [13] as the initial guess for the subsequent hybrid input-output (HIO) algorithm [30]. Cross-correlation was thereafter implemented on the retrieved transmittance functions of adjacent probe positions [31]. The combination of these two algorithms speeds up the update of the object and probe spatial distributions as well as the probe positions [31].

The methods described above usually start from the nominal translation positions and insert probe-correction-related constraints in every iteration, which increases the calculation time per iteration. Another strategy is to estimate the positions approximately through other coherent imaging approaches. Hurst et al. indicated that the probe positions in electron ptychography can be determined by cross-correlation of adjacent object wavefronts, which are reconstructed from in-line holograms derived from the ptychographic diffraction patterns with a priori probe function [32]. This method relies on human intervention and is sensitive to noise. Takahashi et al. plotted a Gaussian fit of the intensity profile based on the dark-field knife-edge scan method [33] and defined the peak of the curve as the reference for estimating probe positions in hard X-ray ptychography [22]. This approach requires a priori sample distribution and additional high-precision knife-edge scanning. Hessing et al. adopted a Fourier transform holographic scheme in soft X-ray ptychography by which the translation positions were estimated by aligning the reconstructed object wavefronts [34]. Because the point source of the reference beam is close to the sample at the object plane, the sample size is restricted. The position map in soft X-ray ptychography can also be approximately registered with a priori model of the positional drift [23].

In this paper, we propose a probe position correction method for continuous-wave terahertz ptychography. The adjacent regions of the object wavefront, retrieved from successive THz far-field diffraction patterns based on angular spectrum propagation [35], include the information of both the overlapping complex transmittance function of the specimen and the identical probe function. The peak of the cross-correlation of the complex amplitudes is adopted as the offset between neighbouring probe positions. All probe positions are estimated once the initial scanning position is determined. The position errors can be further refined to within a few hundredths of a pixel by combining iterative algorithms or with a priori information of the probe function. The effectiveness of the proposed method is verified via simulation and experimental results. The structure of this paper is as follows: Section 2 describes the principle of the proposed method; Section 3 verifies the effectiveness of the method through simulation; the experimental setup as well as the reconstruction and correction results are shown in Section 4; the conclusions are given in the final section.

2. Probe position estimation algorithm

Figure 1 shows a flowchart of the proposed algorithm. Let $\mathbf{r}=(x,y)$ be the coordinate vector at the object plane, where the complex transmittance function of the specimen and the probe function are expressed as $O(\mathbf{r})$ and $P(\mathbf{r})$, respectively. The probe function refers to the complex distribution of the illumination beam at the object plane. Therefore, the complex distribution at this plane is $O(\mathbf{r})P(\mathbf{r})$. We define $d$ as the distance from the object plane to the recording plane and denote $\mathbf{u}=(\xi,\eta)$ as the coordinate vector at the recording plane. Then the captured diffraction patterns can be expressed as:

$$I_j(\mathbf{u})=\left|\mathcal{G}_{d}\{O(\mathbf{r})P(\mathbf{r}-\mathbf{R}_j)\}\right|^{2},\tag{1}$$
where $\mathbf{R}_j=(x_j,y_j)$ is the $j$th translation vector between the specimen and the probe, $j = 1, 2, 3,\ldots, J$, with $J$ the total number of diffraction patterns, and $\mathcal{G}_{d}\{\cdot\}$ represents the diffraction propagation operator over the distance $d$.
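To make the forward model concrete, the following sketch simulates one diffraction pattern, assuming the angular spectrum method as the propagator; the function names, grid parameters, and the simple discarding of evanescent components are our own illustrative choices, not the paper's implementation.

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, d):
    """Propagate a sampled complex field over distance d by the angular
    spectrum method (d < 0 gives backward propagation)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, pitch)                  # spatial frequencies, 1/m
    f2 = fx[:, None]**2 + fx[None, :]**2
    arg = 1.0 / wavelength**2 - f2
    # unit-modulus transfer function for propagating waves; evanescent
    # components are simply discarded in this sketch
    H = np.where(arg > 0, np.exp(2j * np.pi * d * np.sqrt(np.maximum(arg, 0.0))), 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def diffraction_pattern(obj, probe, shift, wavelength, pitch, d):
    """|G_d{O(r) P(r - R_j)}|^2 for an integer-pixel probe shift R_j."""
    exit_wave = obj * np.roll(probe, shift, axis=(0, 1))
    return np.abs(angular_spectrum(exit_wave, wavelength, pitch, d))**2
```

For the sampling used here (100 μm pitch at λ = 118.83 μm) all sampled frequencies propagate, so the transfer function is unit-modulus and energy is conserved.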


Fig. 1 A schematic representation of the proposed algorithm. If the initial probe position is determined, the following probe positions are successively estimated by Eqs. (2) – (5).


The diffraction pattern at the jth probe position is backward propagated to the object plane. The estimated object wavefront corresponding to the illuminated region can be expressed as:

$$\psi_j(\mathbf{r})=\mathcal{G}_{-d}\left\{\sqrt{I_j(\mathbf{u})}\right\},\tag{2}$$
which contains the transmittance function at the $j$th probe position. Similarly, $\psi_{j+1}(\mathbf{r})$ includes the transmittance function at the adjacent $(j+1)$th probe position.

The relative translation offset between neighbouring probe positions is estimated by performing cross-correlation on the adjacent regions of the object wavefront based on the sub-pixel image registration algorithm [36]:

$$A=\psi_j(\mathbf{r})\star\psi_{j+1}(\mathbf{r})=\mathcal{F}^{-1}\left\{\mathcal{F}\{\psi_j(\mathbf{r})\}\cdot\mathcal{F}\{\psi_{j+1}^{*}(\mathbf{r})\}\right\},\tag{3}$$
where "$\star$" indicates the cross-correlation operation, $\mathcal{F}$ and $\mathcal{F}^{-1}$ represent the two-dimensional Fourier transform and inverse Fourier transform, and $\psi_{j+1}^{*}(\mathbf{r})$ denotes the complex conjugate of $\psi_{j+1}(\mathbf{r})$.

The result is embedded in the center of a zero-valued array whose size is $\kappa$ times that of the original matrix. An upsampled cross-correlation distribution is thereafter obtained by computing the inverse Fourier transform of the larger array [37]. The accuracy of the cross-correlation calculation is determined by $\kappa$.
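The registration step can be sketched as below, assuming the zero-padding upsampling just described; `subpixel_offset` and its argument names are our own, and for large κ the matrix-multiplication DFT of [37] would be far more memory-efficient than this brute-force padded inverse FFT.

```python
import numpy as np

def subpixel_offset(psi_a, psi_b, kappa=10):
    """Estimate the shift s (in pixels, to 1/kappa precision) such that
    psi_b is approximately psi_a translated by s, by upsampling the
    cross-correlation through zero-padding in the Fourier domain."""
    n = psi_a.shape[0]
    big_n = kappa * n
    # cross spectrum F{psi_b} * conj(F{psi_a}); its inverse FFT peaks at s
    spec = np.fft.fft2(psi_b) * np.conj(np.fft.fft2(psi_a))
    big = np.zeros((big_n, big_n), dtype=complex)
    c = big_n // 2 - n // 2
    big[c:c + n, c:c + n] = np.fft.fftshift(spec)   # embed in zero array
    corr = np.abs(np.fft.ifft2(np.fft.ifftshift(big)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # unwrap the peak index into a signed shift in original-pixel units
    return tuple((p if p < big_n // 2 else p - big_n) / kappa for p in peak)
```

Because the zero-padded inverse FFT is an exact band-limited interpolation of the periodic cross-correlation, shifts that lie on the 1/κ grid are recovered exactly.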

The offset $(\Delta x_j,\Delta y_j)$, consisting of the nominal step and the shift error, between the $(j+1)$th scanning probe position and the previous exposure is estimated by locating the peak of the cross-correlation distribution. All relative position offsets are written as:

$$\Delta x=(\Delta x_1,\Delta x_2,\ldots,\Delta x_{J-1}),\qquad \Delta y=(\Delta y_1,\Delta y_2,\ldots,\Delta y_{J-1}).\tag{4}$$

If the initial probe position is determined, the following probe positions are successively estimated according to the following recursive relationship:

$$x_{j+1}=x_j+\Delta x_j,\qquad y_{j+1}=y_j+\Delta y_j\qquad(j=1,2,\ldots,J-1).\tag{5}$$
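Given the first position, the recursion above is simply a cumulative sum of the pairwise offsets; a minimal sketch, with our own function name:

```python
import numpy as np

def accumulate_positions(x0, y0, dx, dy):
    """Turn the first probe position and the J-1 pairwise offsets into the
    full list of J absolute positions via a cumulative sum."""
    x = np.concatenate(([x0], x0 + np.cumsum(dx)))
    y = np.concatenate(([y0], y0 + np.cumsum(dy)))
    return x, y
```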

The acquired probe coordinates are substituted into the iterative ePIE algorithm [14] to update the distributions of the transmittance and probe functions.
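For context, the ePIE update at a single probe position has the well-known form below; this is a sketch of the published update rule [14] with our own function and variable names, and it omits the scan loop and the detector-plane modulus projection that produces the corrected exit wave.

```python
import numpy as np

def epie_update(obj_patch, probe, exit_corrected, alpha=1.0, beta=1.0):
    """One ePIE update of the illuminated object patch and the probe from
    the modulus-corrected exit wave (Maiden-Rodenburg form)."""
    diff = exit_corrected - obj_patch * probe
    new_obj = obj_patch + alpha * np.conj(probe) / np.max(np.abs(probe))**2 * diff
    new_probe = probe + beta * np.conj(obj_patch) / np.max(np.abs(obj_patch))**2 * diff
    return new_obj, new_probe
```

When the corrected exit wave already equals the current one, the update leaves both estimates unchanged, which is the fixed-point property the iteration relies on.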

This method differs from other published cross-correlation based probe position correction methods [21,31]. In the proposed method, cross-correlation is calculated by aligning the object wavefronts, retrieved from the corresponding diffraction patterns by backward propagation, at two adjacent scan positions. In the other iterative algorithms, cross-correlation is performed on successive iterations of the transmittance function at the same probe position. It is also noted that the probe positions estimated by the proposed method can serve as the starting positions for pc-PIE [27] or other iterative position recovery algorithms to further refine the position accuracy and the reconstruction quality.

3. Simulation

The effectiveness of the proposed algorithm is evaluated by simulating CW THz ptychography based on the experimental equipment available to us. A normally incident plane wave at λ = 118.83 μm (corresponding to 2.52 THz) is cropped by a circular aperture with a diameter of 15 mm before it illuminates the object plane located 5 mm downstream. The two images in Figs. 2(a1) and 2(a2), consisting of 512 × 512 pixels, are used as the intensity and phase distributions of the transmittance function. Its intensity ranges from 0 to 1 and its phase ranges from 0 to π/2. The intensity and phase of the simulated probe are shown in the respective insets of the figure. The detector, located 10 mm away from the object, has 124 × 124 pixels with a pixel pitch of 100 μm × 100 μm. 36 diffraction patterns are simulated with a probe lateral overlap of 75% relative to the probe diameter. Probe position errors are artificially introduced by offsetting each position in the 6 × 6 grid by a vector with random x- and y-components in the range of ± 8 pixels, as shown by the blue circles in Fig. 3. The mean error of the initial positions is 6.47 pixels.
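A perturbed scan grid of this kind can be generated as follows; the RNG seed is an arbitrary assumption made only for reproducibility.

```python
import numpy as np

rng = np.random.default_rng(0)      # arbitrary seed, for reproducibility only
pitch_px, max_err = 37, 8           # grid pitch and maximum offset, in pixels
gy, gx = np.mgrid[0:6, 0:6]
nominal = np.stack([gx.ravel(), gy.ravel()], axis=1) * pitch_px
# offset every nominal position by a uniform random vector in [-8, 8] px
true_pos = nominal + rng.uniform(-max_err, max_err, size=nominal.shape)
mean_err = np.mean(np.linalg.norm(true_pos - nominal, axis=1))
```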


Fig. 2 Comparison of simulation results with or without probe position refinement. (a1) and (a2) are the intensity [0, 1] and phase [0, π/2] of the test object transmittance function. Insets show the intensity and phase of the simulated probe. The highlighted areas indicate the effective field-of-view. (b1) and (b2) are the reconstructed distributions with probe function update only, using 20 loops of ePIE. (c1) and (c2) are the reconstructed images after the probe position estimation using the proposed method (PM) and 20 subsequent loops of ePIE. (d1) and (d2) are reconstruction results after 100 loops of pc-PIE on the basis of (c1) and (c2). (e1) and (e2) are the reconstruction results using 20 loops of ePIE and 100 loops of pc-PIE.



Fig. 3 Probe position maps in the simulation. The blue circles indicate the erroneous probe positions used in generating the diffraction patterns. The red crosses illustrate the starting probe positions before correction (6 × 6 regular grid), with a mean position error of 6.74 pixels. The green stars show the corrected positions of Fig. 2(c), with a mean position error of 1.86 pixels.


In order to quantify visual perceptual quality, the correlation coefficient (CC) [38] and the structural similarity index (SSIM) [39], both ranging from 0 to 1, are adopted as metrics.

$$\mathrm{CC}(m,n)=\frac{\sum\left(m-\mu_m\right)\left(n-\mu_n\right)}{\sqrt{\sum\left(m-\mu_m\right)^{2}\times\sum\left(n-\mu_n\right)^{2}}},\tag{6}$$
$$\mathrm{SSIM}(m,n)=\frac{\left(2\mu_m\mu_n+C_1\right)\left(2\sigma_{mn}+C_2\right)}{\left(\mu_m^{2}+\mu_n^{2}+C_1\right)\left(\sigma_m^{2}+\sigma_n^{2}+C_2\right)},\tag{7}$$
where $\mu_m$, $\mu_n$, $\sigma_m$, $\sigma_n$, and $\sigma_{mn}$ are the local means, standard deviations, and cross-covariance of images $m$ and $n$. In our case, $m$ is the input intensity or phase distribution in the effective field-of-view indicated by the highlighted areas in Figs. 2(a1) and 2(a2), while $n$ is the corresponding reconstructed distribution. $C_1=1\times10^{-4}L^{2}$ and $C_2=9\times10^{-4}L^{2}$ are two variables that stabilize the division with a weak denominator, in which $L$ is the dynamic range of the images. Larger CC and SSIM values represent higher similarity between the reconstructed images and the original ones, demonstrating better reconstruction quality.
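A minimal sketch of the two metrics follows; note that the SSIM of [39] averages the expression above over local windows, so this single-window version only approximates the paper's metric.

```python
import numpy as np

def cc(m, n):
    """Correlation coefficient between two images (global form)."""
    dm, dn = m - m.mean(), n - n.mean()
    return np.sum(dm * dn) / np.sqrt(np.sum(dm**2) * np.sum(dn**2))

def ssim_global(m, n, L=1.0):
    """Single-window SSIM with the usual C1 = (0.01 L)^2, C2 = (0.03 L)^2."""
    c1, c2 = 1e-4 * L**2, 9e-4 * L**2
    mu_m, mu_n = m.mean(), n.mean()
    cov = np.mean((m - mu_m) * (n - mu_n))
    num = (2 * mu_m * mu_n + c1) * (2 * cov + c2)
    den = (mu_m**2 + mu_n**2 + c1) * (m.var() + n.var() + c2)
    return num / den
```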

The simulations are carried out for three cases: (1) updating both the transmittance and probe functions in iterations by ePIE without correcting the probe positions, (2) estimating the probe positions by the proposed method and subsequently updating the object plane distribution, (3) further refining both the probe positions and the object plane distribution by pc-PIE on the basis of (2). The probe positions start as a regular grid with a pitch of 37 pixels, as shown by the red crosses in Fig. 3. The reconstructed distributions obtained with 20 loops of ePIE without position correction are shown in Figs. 2(b1) and 2(b2). The CC values with respect to the initial images are 0.7910 and 0.4668, while the SSIM values are 0.3802 and 0.1893. It is clear that the reconstructed object and probe functions are severely distorted. In the second case, the probe position map estimated by the proposed method is indicated by the green stars in Fig. 3. The mean error of the retrieved positions is calculated as 1.86 pixels. It takes 80 loops of the pc-PIE algorithm (estimation number M = 5) to achieve similar precision [27]. The computation time, on a desktop PC with a 2.7 GHz i5-6400 CPU, 8 GB RAM, and MATLAB R2016b, is 1064 s for pc-PIE and 11 s for the proposed method. After 20 loops of ePIE, the reconstruction quality of the object plane, as shown in Figs. 2(c1) and 2(c2), is significantly improved because of the smaller translation errors. For the intensity image, the CC and SSIM values are 0.9379 and 0.6803, respectively. For the phase image, the corresponding values are 0.5369 and 0.3730. It is possible to decrease the magnitude of the shift errors to 0.02 pixels by using the reconstruction results above, in lieu of the nominal shifts, as the initial guess for the iterative pc-PIE algorithm and conducting 100 loops of iterations. The CC values of the intensity and phase distributions in Figs. 2(d1) and 2(d2) are 0.9758 and 0.8032, respectively. The corresponding SSIM values are 0.8816 and 0.7903. As shown in Figs. 2(e1) and 2(e2), the values of these two metrics decrease when using 20 loops of ePIE and 100 loops of pc-PIE without inserting our method to estimate the probe positions.

Compared with pc-PIE, the proposed method has a more pronounced advantage in terms of computational load when larger or more numerous ptychographic diffraction patterns are processed. The other recording parameters, i.e., arbitrary initial shift errors, recording distance, overlap ratio, and Poisson noise level of the diffraction patterns, have also been investigated. The results suggest that these parameters have little influence on the position estimation accuracy and the reconstruction quality of the proposed algorithm, and are therefore not presented here.
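As an illustration of the Poisson-noise robustness test, shot noise can be applied to a simulated pattern as below; the total photon budget is an assumed, purely illustrative value.

```python
import numpy as np

def add_shot_noise(pattern, total_photons=1e6, rng=None):
    """Return a Poisson-noised copy of a non-negative diffraction pattern,
    assuming an illustrative photon budget for the whole frame."""
    rng = rng or np.random.default_rng(0)
    scale = total_photons / pattern.sum()      # counts per intensity unit
    return rng.poisson(pattern * scale) / scale
```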

On the other hand, the complex amplitude at the object plane is the product of the transmittance function and the probe function, and the latter has a negative influence on locating the peak of the cross-correlation function and on the retrieval accuracy of the probe positions. If the probe function is known as a priori information, ePIE can be replaced by the PIE (ptychographical iterative engine) algorithm for ptychographic reconstruction [13]. The transmittance function can then be estimated as the quotient of the complex amplitude $\psi_j(\mathbf{r})$ in Eq. (2) and the probe function $P(\mathbf{r})$. Cross-correlation is performed by aligning the transmittance functions at two adjacent scan positions. The simulation results with a priori probe function are illustrated in Fig. 4. The reconstructed images without updating the probe positions are shown in Figs. 4(a1) and 4(a2). The CC and SSIM values are better than those of Figs. 2(b1) and 2(b2). It is shown in Figs. 4(b1) and 4(b2) that the translation positions are retrieved to within 0.02 pixel accuracy by the proposed method when a priori probe function is applied. The quality of the reconstructed images is better than that obtained using 20 loops of pc-PIE, shown in Figs. 4(c1) and 4(c2). Therefore, it is unnecessary to combine additional time-consuming iterative algorithms, i.e., pc-PIE, to further refine the position accuracy. The calculation time of pc-PIE is 5320 s to retrieve the scanning positions to within a few hundredths of a pixel. In other words, the probe positions can be directly determined by the proposed method with a priori probe function, reducing the calculation time by 99.8% compared with pc-PIE. Figure 2 and Fig. 4 indicate that the reconstruction quality improves with the refinement of the translation positions. The proposed method dramatically decreases the computational complexity compared with the algorithms starting from the nominal translation shifts.
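When the probe is known a priori, the quotient can be regularised so that pixels with weak probe amplitude do not amplify noise; a minimal sketch, with the function name and the `eps` constant being our own assumptions:

```python
import numpy as np

def transmittance_estimate(psi, probe, eps=1e-3):
    """Estimate O(r) ~ psi(r) / P(r) with a Tikhonov-style regularised
    division: multiply by conj(P) and damp where |P|^2 is small."""
    return psi * np.conj(probe) / (np.abs(probe)**2 + eps)
```

For a well-conditioned probe the result reduces to the plain quotient; `eps` only matters where the probe amplitude approaches zero.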


Fig. 4 Comparison of simulation results with a priori probe function. (a1) and (a2) are the reconstructed distributions using 20 loops of PIE without probe position correction. (b1) and (b2) are the reconstruction results using the proposed method and 20 loops of PIE. (c1) and (c2) are the results using 20 loops of pc-PIE.


4. Experimental results

An experimental setup for continuous-wave terahertz ptychography is depicted in Fig. 5(a). A far-infrared carbon dioxide gas laser (FIRL 295, Edinburgh Instruments) pumped methanol vapor to emit a CW THz beam at a wavelength of 118.83 μm (2.52 THz) with a maximum output power of 500 mW. The beam was expanded and collimated via two gold-coated parabolic mirrors (PM1 and PM2). The sample was mounted on a three-axis motorized translation stage (MT3-Z8, Thorlabs; accuracy ± 0.1 μm, travel range 12 mm in each direction) located 5 mm away from a 3.3 mm pinhole. A pyroelectric array detector (Pyrocam III, Ophir Spiricon), which has 124 × 124 pixels with a 100 μm × 100 μm pixel pitch and a 48 Hz chopping frequency, was placed downstream of the sample. The sample was scanned over a 10 × 10 rectangular grid with an interval of 800 μm (8 pixels) along a zig-zag route. The overlap ratio was calculated as 75%. The effective field of view (FOV) of the setup, limited by the travel range of the translation stage, was approximately 19.6 mm × 19.6 mm.
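The nominal zig-zag (serpentine) scan coordinates described above can be generated as follows; the function name is our own.

```python
import numpy as np

def zigzag_positions(nx=10, ny=10, step=800e-6):
    """Nominal serpentine (zig-zag) scan positions in metres for an
    nx-by-ny grid: even rows scan left-to-right, odd rows reverse."""
    pos = []
    for row in range(ny):
        cols = range(nx) if row % 2 == 0 else range(nx - 1, -1, -1)
        pos.extend((col * step, row * step) for col in cols)
    return np.array(pos)
```

A convenient property of the serpentine route is that every pair of consecutive exposures is exactly one step apart, which keeps the stage travel (and hysteresis) minimal.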


Fig. 5 (a) Schematic layout of the CW THz ptychography setup; PMs: parabolic mirrors. The sample was scanned by a motorized translation stage along two axes during data acquisition. For details see the main text. (b) Photograph of the cicada forewing; Sc: subcosta, Rs: radius veins (R1, R2, R3, R4, R5), Ms: media veins (M1, M2), and "+" indicates a convex vein.


The sample was a forewing of a cicada (Cryptotympana atrata). It is a membranous wing in which the tracheated parts of the surface are thickened into veins. The veins are longitudinal stripes that reinforce adjacent layers of the membrane. The venation pattern is an important basis for insect classification. The evolution of insects can also be traced through the comparison of venation patterns. The hydrophobicity and antibacterial properties of cicada wings have become a focus of research [40]. As illustrated in Fig. 5(b), the costa of the forewing is degraded. The subcosta (Sc) and the radius veins (R) are connected side by side, marked by "Sc+R", where "+" indicates a convex vein. The combination of Sc and both the first and second radius veins R1+2 is regarded as "Sc+R1+2". The total width and the interval of these two veins are 473 μm and 161 μm, respectively. The fifth radius vein R5 is combined with the first media vein M1, tagged "R5+M1", whose width is approximately 99 μm. CW THz ptychography can retrieve the optical thickness of the membrane and veins through the reconstructed complex-valued transmittance function. The venation is feature-rich at the THz scale, so the reconstruction results are sensitive to probe translation errors. This type of sample is suitable for verifying the effectiveness of probe position correction algorithms [37,41].

Due to the low sensitivity of the pyroelectric detector, 500 frames of the diffraction pattern were recorded at each scanning position. LabVIEW control programs were used to move the sample along the x and y directions through the motorized translation stage and to control the detector exposure. The total data acquisition time was approximately 28 min; the fluctuation of the THz laser intensity was neglected in the subsequent data processing. Gaussian fitting [8] was conducted to improve the signal-to-noise ratio of the diffraction patterns. The diffraction patterns of the 3rd and 4th probe positions are displayed in logarithmic scale in Figs. 6(a1) and 6(a2).


Fig. 6 THz ptychographic far-field diffraction patterns. (a1) and (a2) are the diffraction patterns of the 3rd and 4th probe positions after Gaussian fitting of 500 frames. They are displayed in logarithmic scale. (b1) is the normalized in-line hologram at the 1st scanning position and (b2) is the reconstructed intensity distribution at z = 21.0 mm from (b1), in which the scalebar represents 1 mm. (c1) and (c2) are the intensity distributions at the object plane using angular spectrum propagation from (a1) and (a2).


Because the chip is mounted inside the pyroelectric detector, it is difficult to measure the exact distance from the object plane to the recording plane. After data acquisition, we removed the pinhole and reset the sample to the first scanning position. 500 frames of the THz in-line digital hologram and the corresponding background image, with the sample removed, were recorded. The hologram after Gaussian fitting and normalization [8] is displayed in Fig. 6(b1). The recording distance was calculated as 21.0 mm using the autofocus algorithm [42]. The retrieved intensity distribution at z = 21.0 mm from the normalized hologram is illustrated in Fig. 6(b2). The scalebar in the figure represents 1 mm. The FOV was approximately 12.4 mm × 12.4 mm. The reconstructed veins are superimposed with the residual twin image after 100 iterations of phase retrieval [43], which is an inherent drawback of in-line digital holography. It is not possible to distinguish the subcosta and the radius veins; R3 and R4 are more pronounced than the other veins. According to the reconstruction distance and the other experimental parameters, this setup can be categorized as far-field diffraction ptychography [35], whose theoretical resolution is approximately $0.61\lambda/\sin\theta=245.7\ \mu\mathrm{m}$. Figures 6(c1) and 6(c2) illustrate the object plane distributions in logarithmic scale retrieved by the angular spectrum integral from the diffraction patterns shown in Figs. 6(a1) and 6(a2).

Figure 7 shows the ptychographic reconstruction results of the cicada forewing. Compared with THz in-line digital holography, the FOV was expanded from 12.4 mm × 12.4 mm to 19.6 mm × 19.6 mm. Figures 7(a1) and 7(a2) show the complex-valued reconstruction of the transmittance function without probe position correction, assuming a regular grid with equally spaced intervals. The reconstruction results are free of the twin image found in holographic reconstruction, but suffer from the position errors and the previously reported "raster grid pathology" artifact due to the symmetry of the probe positions [21,41,44]. It is possible to trace R4 and "R5+M1" in the reconstructed intensity distribution. However, we still cannot distinguish between the subcosta and the radius veins "Sc+R1+2". Figures 7(b1) and 7(b2) represent much improved reconstruction results, with less noise and fewer artifacts, obtained by using the proposed probe position estimation algorithm. The subcosta and the radius veins can be recognized in the upper-right corner; their total width occupies approximately 3 pixels. Moreover, the contrast of the veins at the lower-right edge of the venation pattern is also significantly improved, and the width of "R5+M1" is about one pixel. From the phase image, the optical thicknesses of the radius veins R3 and R4 are 8.70 μm and 10.22 μm, respectively. The reconstruction quality, shown in Figs. 7(c1) and 7(c2), benefits from further position refinement after 20 loops of pc-PIE. The optical thicknesses of the same regions of R3 and R4 are 8.82 μm and 9.30 μm, which are more consistent and better fit the actual distribution. Figures 7(d1) and 7(d2) are the intensity and phase distributions of the probe function corresponding to Figs. 7(c1) and 7(c2). The CC and SSIM values of the experimental results are listed in Table 1. The difference between the reconstructed phase images without and with probe position refinement is even larger than that between the reconstructed intensity images. The results demonstrate the effectiveness of the proposed method.


Fig. 7 Comparison of reconstructed images of the cicada forewing with or without probe position refinement. The red scalebar represents 1 mm. (a1) and (a2) are the reconstructed distributions without probe position correction after 30 loops of ePIE. (b1) and (b2) are the reconstructed images after probe position estimation and 20 subsequent loops of ePIE. (c1) and (c2) are the reconstruction results after 20 loops of pc-PIE on the basis of (b1) and (b2). (d1) and (d2) are the reconstructed probe intensity and phase with position refinement.



Table 1. Comparison of reconstruction results in Fig. 7

Figure 8(a) shows the retrieved position errors along the x-axis and y-axis. The error distribution, with a periodicity equal to the number of scanning lines, indicates that the tilt between the object plane and the recording plane is the main source of translation errors in this experiment [21]. Figure 8(b) shows the initial (red crosses) and the retrieved (green circles) probe position maps. The average offset between the nominal steps and the actual positions is 0.73 pixels. The arrow indicates the scanning route corresponding to Fig. 8(a). The angle between the normal of the object plane and the normal of the recording plane is approximately 0.26°, as determined by the least-squares method. Further in-depth analysis of the susceptibility to the unevenness of the THz illumination spot, room temperature fluctuations, and the decreasing laser power due to CO2 consumption will be conducted in the future.
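One plausible way to estimate such a tilt angle is a linear least-squares fit of the retrieved positions against the nominal ones, assuming the tilt projects the nominal steps along one axis by cos θ; this is a hypothetical sketch, not necessarily the paper's exact fitting procedure.

```python
import numpy as np

def tilt_angle_deg(nominal, retrieved):
    """Tilt angle (degrees) from the least-squares slope of retrieved vs
    nominal coordinates, under the assumed model retrieved = nominal*cos(t)."""
    slope = np.polyfit(nominal, retrieved, 1)[0]
    return np.degrees(np.arccos(np.clip(slope, -1.0, 1.0)))
```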


Fig. 8 Maps of translation position errors. (a) Position error distributions along x-axis and y-axis. (b) Probe position distributions before (red crosses) and after (green circles) correction. The arrow indicates initial scanning direction.


5. Discussion and conclusion

We have proposed a probe position correction method for continuous-wave terahertz ptychography. The probe position map can be plotted by calculating the offsets between neighbouring probe positions through cross-correlation registration of the overlapped regions of the object wavefront, which are retrieved from the corresponding diffraction patterns by the angular spectrum integral. The major difference between the proposed method and other cross-correlation methods is that the former is based on cross-correlation registration of the object wavefronts from adjacent probe positions, while the latter perform cross-correlation on successive iterations of the transmittance function at the same probe position. Compared with the pc-PIE algorithm, the proposed method does not adopt the nominal translation positions as the initial positions for correction, which reduces the computational complexity dramatically. Simulation results suggest that the accuracy of the estimated positions is equivalent to that reached after 80 loops of pc-PIE. The magnitude of the translation errors can be reduced to the order of 0.01 pixels with a priori probe function.

A continuous-wave terahertz ptychographic setup was built on a 2.52 THz optically pumped laser and a pyroelectric array detector. We performed continuous-wave terahertz ptychography on a biological specimen for the first time: a cicada's forewing. After correcting the position errors by the proposed method, the contrast of the venation is obviously improved, and the subcosta and the first radius vein are distinguished. The average offset between the nominal steps and the actual positions is 0.73 pixels. The periodic error distribution suggests that the position errors arise mainly from the tilt between the object plane and the recording plane. Compared with CW THz in-line holography, THz ptychography is immune to the twin image and has an unlimited FOV, but its resolution is restrained by the diffraction angle. We shall investigate how to apply, in the regime of THz ptychography, the resolution enhancement approaches that have been demonstrated in THz holography and in ptychography in other spectral bands.

Funding

National Natural Science Foundation of China (61675010, 61475011), Science Foundation of Education Commission of Beijing (KZ201610005008), Beijing Nova Program (XX2018072).

References

1. X. Yin, B. Ng, and D. Abbott, Terahertz Imaging for Biomedical Applications: Pattern Recognition and Tomographic Reconstruction (SPC, 2014).

2. J. Son, Terahertz Biomedical Science and Technology (CRC, 2014).

3. L. Rong, T. Latychevskaia, C. Chen, D. Wang, Z. Yu, X. Zhou, Z. Li, H. Huang, Y. Wang, and Z. Zhou, “Terahertz in-line digital holography of human hepatocellular carcinoma tissue,” Sci. Rep. 5(1), 8445 (2015). [CrossRef]   [PubMed]  

4. K. Peiponen, J. A. Zeitler, and M. Kuwata-Gonokami, Terahertz Spectroscopy and Imaging (Springer-Verlag Heidelberg, 2013).

5. M. Suga, Y. Sasaki, T. Sasahara, T. Yuasa, and C. Otani, “THz phase-contrast computed tomography based on Mach-Zehnder interferometer using continuous wave source: proof of the concept,” Opt. Express 21(21), 25389–25402 (2013). [CrossRef]   [PubMed]  

6. W. Sun, X. Wang, and Y. Zhang, “Continuous wave terahertz phase imaging with three-step phase-shifting,” Optik (Stuttg.) 124(22), 5533–5536 (2013). [CrossRef]  

7. L. Li, X. Wang, and H. Zhai, “Single-shot diagnostic for the three-dimensional field distribution of a terahertz pulse based on pulsed digital holography,” Opt. Lett. 36(14), 2737–2739 (2011). [CrossRef]   [PubMed]  

8. L. Rong, T. Latychevskaia, D. Wang, X. Zhou, H. Huang, Z. Li, and Y. Wang, “Terahertz in-line digital holography of dragonfly hindwing: amplitude and phase reconstruction at enhanced resolution by extrapolation,” Opt. Express 22(14), 17236–17245 (2014). [CrossRef]   [PubMed]  

9. S. H. Ding, Q. Li, Y. D. Li, and Q. Wang, “Continuous-wave terahertz digital holography by use of a pyroelectric array camera,” Opt. Lett. 36(11), 1993–1995 (2011). [CrossRef]   [PubMed]  

10. P. Zolliker and E. Hack, “THz holography in reflection using a high resolution microbolometer array,” Opt. Express 23(9), 10957–10967 (2015). [CrossRef]   [PubMed]  

11. M. Locatelli, M. Ravaro, S. Bartalini, L. Consolino, M. S. Vitiello, R. Cicchi, F. Pavone, and P. De Natale, “Real-time terahertz digital holography with a quantum cascade laser,” Sci. Rep. 5(1), 13566 (2015). [CrossRef]   [PubMed]  

12. H. M. L. Faulkner and J. M. Rodenburg, “Movable aperture lensless transmission microscopy: a novel phase retrieval algorithm,” Phys. Rev. Lett. 93(2), 023903 (2004). [CrossRef]   [PubMed]  

13. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004). [CrossRef]  

14. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109(10), 1256–1262 (2009). [CrossRef]   [PubMed]  

15. J. M. Rodenburg, A. C. Hurst, and A. G. Cullis, “Transmission microscopy without lenses for objects of unlimited size,” Ultramicroscopy 107(2-3), 227–231 (2007). [CrossRef]   [PubMed]  

16. J. M. Rodenburg, A. C. Hurst, A. G. Cullis, B. R. Dobson, F. Pfeiffer, O. Bunk, C. David, K. Jefimovs, and I. Johnson, “Hard-x-ray lensless imaging of extended objects,” Phys. Rev. Lett. 98(3), 034801 (2007). [CrossRef]   [PubMed]  

17. K. Giewekemeyer, M. Beckers, T. Gorniak, M. Grunze, T. Salditt, and A. Rosenhahn, “Ptychographic coherent x-ray diffractive imaging in the water window,” Opt. Express 19(2), 1037–1050 (2011). [CrossRef]   [PubMed]  

18. F. Hüe, J. M. Rodenburg, A. M. Maiden, F. Sweeney, and P. A. Midgley, “Wave-front phase retrieval in transmission electron microscopy via ptychography,” Phys. Rev. B 82(12), 121415 (2010).

19. L. Valzania, T. Feurer, P. Zolliker, and E. Hack, “Terahertz ptychography,” Opt. Lett. 43(3), 543–546 (2018). [CrossRef]   [PubMed]  

20. H. M. L. Faulkner and J. M. Rodenburg, “Error tolerance of an iterative phase retrieval algorithm for moveable illumination microscopy,” Ultramicroscopy 103(2), 153–164 (2005). [CrossRef]   [PubMed]  

21. F. Zhang, I. Peterson, J. Vila-Comamala, A. Diaz, F. Berenguer, R. Bean, B. Chen, A. Menzel, I. K. Robinson, and J. M. Rodenburg, “Translation position determination in ptychographic coherent diffraction imaging,” Opt. Express 21(11), 13592–13606 (2013). [CrossRef]   [PubMed]  

22. Y. Takahashi, A. Suzuki, N. Zettsu, Y. Kohmura, Y. Senba, H. Ohashi, K. Yamauchi, and T. Ishikawa, “Towards high-resolution ptychographic x-ray diffraction microscopy,” Phys. Rev. B Condens. Matter Mater. Phys. 83(21), 214109 (2011). [CrossRef]  

23. M. Beckers, T. Senkbeil, T. Gorniak, K. Giewekemeyer, T. Salditt, and A. Rosenhahn, “Drift correction in ptychographic diffractive imaging,” Ultramicroscopy 126, 44–47 (2013). [CrossRef]   [PubMed]  

24. M. Guizar-Sicairos and J. R. Fienup, “Phase retrieval with transverse translation diversity: a nonlinear optimization approach,” Opt. Express 16(10), 7264–7278 (2008). [CrossRef]   [PubMed]  

25. A. Tripathi, I. McNulty, and O. G. Shpyrko, “Ptychographic overlap constraint errors and the limits of their numerical recovery using conjugate gradient descent methods,” Opt. Express 22(2), 1452–1466 (2014). [CrossRef]   [PubMed]  

26. P. Dwivedi, A. P. Konijnenberg, S. F. Pereira, and H. P. Urbach, “Lateral position correction in ptychography using the gradient of intensity patterns,” Ultramicroscopy 192, 29–36 (2018). [CrossRef]   [PubMed]  

27. A. M. Maiden, M. J. Humphry, M. C. Sarahan, B. Kraus, and J. M. Rodenburg, “An annealing algorithm to correct positioning errors in ptychography,” Ultramicroscopy 120(5), 64–72 (2012). [CrossRef]   [PubMed]  

28. A. Shenfield and J. M. Rodenburg, “Evolutionary determination of experimental parameters for ptychographical imaging,” J. Appl. Phys. 109(12), 124510 (2011). [CrossRef]  

29. F. Guzzi, G. Kourousias, F. Billè, R. Pugliese, C. Reis, A. Gianoncelli, and S. Carrato, “Refining scan positions in Ptychography through error minimisation and potential application of Machine Learning,” J. Instrum. 13(6), C06002 (2018). [CrossRef]  

30. A. P. Konijnenberg, W. M. J. Coene, S. F. Pereira, and H. P. Urbach, “Combining ptychographical algorithms with the Hybrid Input-Output (HIO) algorithm,” Ultramicroscopy 171, 43–54 (2016). [CrossRef]   [PubMed]  

31. P. Dwivedi, A. P. Konijnenberg, S. F. Pereira, and H. P. Urbach, “An alternative method to correct translation positions in ptychography,” Proc. SPIE 10677, 106772A (2018).

32. A. C. Hurst, T. B. Edo, T. Walther, F. Sweeney, and J. M. Rodenburg, “Probe position recovery for ptychographical imaging,” J. Phys. Conf. Ser. 241, 012004 (2010). [CrossRef]  

33. Y. Suzuki, A. Takeuchi, H. Takano, and H. Takenaka, “Performance test of Fresnel zone plate with 50 nm outermost zone width in hard X-ray region,” Jpn. J. Appl. Phys. 44(4A), 1994–1998 (2005). [CrossRef]  

34. P. Hessing, B. Pfau, E. Guehrs, M. Schneider, L. Shemilt, J. Geilhufe, and S. Eisebitt, “Holography-guided ptychography with soft X-rays,” Opt. Express 24(2), 1840–1851 (2016). [CrossRef]   [PubMed]  

35. J. W. Goodman, Introduction to Fourier Optics 3rd Edition (Roberts and company publishers, 2005).

36. M. Guizar-Sicairos, S. T. Thurman, and J. R. Fienup, “Efficient subpixel image registration algorithms,” Opt. Lett. 33(2), 156–158 (2008). [CrossRef]   [PubMed]  

37. P. Thibault, M. Dierolf, O. Bunk, A. Menzel, and F. Pfeiffer, “Probe retrieval in ptychographic coherent diffractive imaging,” Ultramicroscopy 109(4), 338–343 (2009). [CrossRef]   [PubMed]  

38. R. C. Gonzalez and R. E. Woods, Digital Image Processing 3rd Edition (Pearson, 2007).

39. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. Image Process. 13(4), 600–612 (2004). [CrossRef]   [PubMed]  

40. J. Oh, C. E. Dana, S. Hong, J. K. Román, K. D. Jo, J. W. Hong, J. Nguyen, D. M. Cropek, M. Alleyne, and N. Miljkovic, “Exploring the role of habitat on the wettability of cicada wings,” ACS Appl. Mater. Interfaces 9(32), 27173–27184 (2017). [CrossRef]   [PubMed]  

41. O. Bunk, M. Dierolf, S. Kynde, I. Johnson, O. Marti, and F. Pfeiffer, “Influence of the overlap parameter on the convergence of the ptychographical iterative engine,” Ultramicroscopy 108(5), 481–487 (2008). [CrossRef]   [PubMed]  

42. H. Huang, D. Wang, L. Rong, X. Zhou, Z. Li, and Y. Wang, “Application of autofocusing methods in continuous-wave terahertz in-line digital holography,” Opt. Commun. 346, 93–98 (2015). [CrossRef]  

43. T. Latychevskaia and H.-W. Fink, “Solution to the twin image problem in holography,” Phys. Rev. Lett. 98(23), 233901 (2007). [CrossRef]   [PubMed]  

44. M. Dierolf, P. Thibault, A. Menzel, C. M. Kewish, K. Jefimovs, I. Schlichting, K. V. König, O. Bunk, and F. Pfeiffer, “Ptychographic coherent diffractive imaging of weakly scattering specimens,” New J. Phys. 12(3), 35017 (2010). [CrossRef]  



Figures (8)

Fig. 1
Fig. 1 A schematic representation of the proposed algorithm. If the initial probe position is determined, the following probe positions are successively estimated by Eqs. (2) – (5).
Fig. 2
Fig. 2 Comparison of simulation results with or without probe position refinement. (a1) and (a2) are the intensity [0, 1] and phase [0, π/2] of the test object transmittance function. Insets show the intensity and phase of the simulated probe. The highlighted areas indicate the effective field-of-view. (b1) and (b2) are the reconstructed distributions with probe function update only using 20 loops of ePIE. (c1) and (c2) are the reconstructed images after the probe position estimation using the proposed method (PM) and afterward 20 loops of ePIE. (d1) and (d2) are reconstruction results after 100 loops of pc-PIE on the basis of (c1) and (c2). (e1) and (e2) are the reconstruction results using 20 loops of ePIE and 100 loops of pc-PIE.
Fig. 3
Fig. 3 Probe position maps in the simulation. The blue circles indicate the erroneous probe positions used in generating the diffraction patterns. The red crosses illustrate the start probe positions before correction (6 × 6 regular grid) with mean position error of 6.74 pixels. The green stars show the corrected positions of Fig. 2(c) with mean position error of 1.86 pixels.
Fig. 4
Fig. 4 Comparison of simulation results with a priori probe function. (a1) and (a2) are the reconstructed distributions using 20 loops of PIE without probe position correction. (b1) and (b2) are the reconstruction results using the proposed method and 20 loops of PIE. (c1) and (c2) are the results using 20 loops of pc-PIE.
Fig. 5
Fig. 5 (a) Schematic layout of the setup of CW THz ptychography, PMs: parabolic mirrors. The sample was translated by a motorized 2-axis stage during data acquisition. For details see the main text. (b) Photo of a cicada's forewing, Sc: subcosta, Rs: radius (R1, R2, R3, R4, R5), Ms: media (M1, M2), and “+” indicates the convex vein.
Fig. 6
Fig. 6 THz ptychographic far-field diffraction patterns. (a1) and (a2) are the diffraction patterns of the 3rd and 4th probe positions after Gaussian fitting of 500 frames. They are displayed in logarithmic scale. (b1) is the normalized in-line hologram at the 1st scanning position and (b2) is the reconstructed intensity distribution at z = 21.0 mm from (b1), in which the scalebar represents 1 mm. (c1) and (c2) are the intensity distributions at the object plane using angular spectrum propagation from (a1) and (a2).
Fig. 7
Fig. 7 Comparison of reconstructed images of a cicada's forewing with or without probe position refinement. The red scalebar represents 1 mm. (a1) and (a2) are the reconstructed distributions without probe position correction after 30 loops of ePIE. (b1) and (b2) are the reconstructed images after probe position estimation and afterward 20 loops of ePIE. (c1) and (c2) are the reconstruction results after 20 loops of pc-PIE on the basis of (b1) and (b2). (d1) and (d2) are the reconstructed probe intensity and phase with position refinement.
Fig. 8
Fig. 8 Maps of translation position errors. (a) Position error distributions along x-axis and y-axis. (b) Probe position distributions before (red crosses) and after (green circles) correction. The arrow indicates initial scanning direction.

Tables (1)

Table 1 Comparison of reconstruction results in Fig. 7

Equations (9)


$$I_j(u) = \left| \mathcal{G}_d\{ O(r)\, P(r - R_j) \} \right|^2, \tag{1}$$
$$\psi_j(r) = \mathcal{G}_{-d}\left\{ \sqrt{I_j(u)} \right\}, \tag{2}$$
$$A = \psi_j(r) \star \psi_{j+1}(r) = \mathcal{F}^{-1}\left\{ \mathcal{F}\{\psi_j(r)\} \cdot \mathcal{F}^{*}\{\psi_{j+1}(r)\} \right\}, \tag{3}$$
$$\Delta x = (\Delta x_1, \Delta x_2, \ldots, \Delta x_{J-1}), \tag{4}$$
$$\Delta y = (\Delta y_1, \Delta y_2, \ldots, \Delta y_{J-1}). \tag{5}$$
$$x_{j+1} = x_j + \Delta x_j \quad (j = 1, 2, \ldots, J-1), \tag{6}$$
$$y_{j+1} = y_j + \Delta y_j \quad (j = 1, 2, \ldots, J-1). \tag{7}$$
$$CC(m,n) = \frac{\sum (m - \mu_m)(n - \mu_n)}{\sqrt{\sum (m - \mu_m)^2 \times \sum (n - \mu_n)^2}}, \tag{8}$$
$$SSIM(m,n) = \frac{(2\mu_m \mu_n + C_1)(2\sigma_{mn} + C_2)}{(\mu_m^2 + \mu_n^2 + C_1)(\sigma_m^2 + \sigma_n^2 + C_2)}. \tag{9}$$
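The two image-quality metrics of Eqs. (8) and (9) can be computed globally as in the following sketch, assuming NumPy. The constants C1 = (k1·L)² and C2 = (k2·L)² with k1 = 0.01, k2 = 0.03 follow the common convention of Wang et al. [39]; the paper does not state its constants, so these defaults are assumptions.

```python
import numpy as np

def correlation_coefficient(m, n):
    """Pearson correlation coefficient between images m and n (Eq. 8)."""
    dm, dn = m - m.mean(), n - n.mean()
    return np.sum(dm * dn) / np.sqrt(np.sum(dm**2) * np.sum(dn**2))

def global_ssim(m, n, L=1.0, k1=0.01, k2=0.03):
    """Single-window (global) SSIM between images m and n (Eq. 9).

    L is the dynamic range of the images; C1 and C2 are the usual
    stabilizing constants. A windowed SSIM would average this quantity
    over local patches instead."""
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    mu_m, mu_n = m.mean(), n.mean()
    cov = np.mean((m - mu_m) * (n - mu_n))          # sigma_mn
    num = (2 * mu_m * mu_n + c1) * (2 * cov + c2)
    den = (mu_m**2 + mu_n**2 + c1) * (m.var() + n.var() + c2)
    return num / den
```

Both metrics equal 1 when the reconstruction matches the reference exactly, and decrease as the images diverge.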