Abstract

A movable lens is used to determine the amplitude and phase on the object plane. The extended fractional Fourier transform is introduced to describe single-lens imaging, and a fast algorithm for the transform is put forward based on convolution. Combined with a parallel iterative phase retrieval algorithm, it is applied to reconstruct the complex amplitude of the object. Compared with inline holography, the implementation of our method is simple and easy. Because no oversampling operation is needed, the computational load is reduced. The proposed method also offers better accuracy than the direct focusing measurement for imaging small objects.

© 2016 Optical Society of America

1. Introduction

Wavefront reconstruction plays a vital role in many technical and scientific applications, such as optical metrology and imaging. The various ways of wavefront reconstruction can be categorized into two groups: (1) methods with a reference beam (interferometry) and (2) methods without a reference beam (phase retrieval) [1, 2]. Phase retrieval addresses the inverse problem of determining the phase distribution of a complex-valued function from modulus data and additional a priori information about the optical system, such as the wavelength and path length. It has been applied successfully in X-ray crystallography [3], astronomy [4] and coherent light microscopy [2, 5].

The iterative phase retrieval algorithm was created by Gerchberg and Saxton [6]. It later evolved into the Yang-Gu and Fienup algorithms [7–9], which optimize the convergence process; for instance, a feedback control [8] is introduced into the Gerchberg-Saxton phase retrieval algorithm [6]. This type of method is constrained by the condition that the test object is either a phase-only or an amplitude-only object. As a synthesis, multiple-image phase retrieval techniques [10–15] were developed to obtain more accurate convergence in the iterative computing procedure. Such a technique is composed of several units, for each of which the computation of light propagation is the same, and the outcomes of these units are integrated to accelerate the convergence of the iteration [10]. A multi-stage algorithm has been proposed by Rodrigo et al. [11], with serial computing applied to phase retrieval in gyrator transform domains and in Fresnel diffraction [12]. As an effective class of strategies, these retrieval procedures [11, 12] can alleviate the convergence stagnation of the Gerchberg-Saxton method [6]. The scheme of [11] alters the rotation angles of cylindrical lenses to generate multiple intensity patterns corresponding to different transform parameters. Alternatively, the illumination wavelength can be tuned to realize the serial calculation [13]. Another serial computing structure, named the single-beam multiple-intensity reconstruction (SBMIR) algorithm, has been reported in [14]. More recently, an implementation based on a focus-tunable lens has been proposed in [15].

As the other integration type of multiple-image phase retrieval, a parallel phase retrieval algorithm has been presented [16] as the amplitude-phase retrieval (APR) technique, which has been discussed in gyrator transform domains [16] and in fractional Fourier transform domains [17]. For the phase retrieval schemes in gyrator transform domains [11, 16], recording the output images is experimentally inconvenient because of the manipulation of thin cylindrical lenses.

In this paper, we put forward an easy measurement scheme for amplitude and phase obtained by moving a single lens along the optical axis of the system. The extended fractional Fourier transform (eFrFT) [18] is employed to describe the optical process. The recorded images are fed into an iterative phase retrieval algorithm to determine the intensity and phase distributions at the object plane. A convolution equation is derived for the computation of the discrete eFrFT. As another possible choice, a two-step Fresnel diffraction with the phase modulation of the lens is also calculated and contrasted with the eFrFT in phase retrieval to verify the convolution scheme. Finally, the advantages of our scheme over the focusing measurement are demonstrated and analyzed. Compared with the coherent diffractive imaging (CDI) discussed by Shechtman et al. [19], our work operates in the visible wavelength regime and avoids the problem of unavailable lenses for hard X-rays. The oversampling process (support constraint) is not required, which also differs from the APR in Fresnel domains [20]. Thus the computational load can be lessened.

2. Measurement scheme

Figure 1 shows a measurement system representing an eFrFT, which has a pair of unequal arm lengths in contrast to the standard optical fractional Fourier transform configuration [18, 21]. Its 1D mathematical expression is

a_n(u) = E_{d_1,d_2,f}\{a_0(x)\}(u) = \int_{-\infty}^{+\infty} a_0(x)\,\exp\!\left\{\frac{i\pi\left[(f-d_2)x^2+(f-d_1)u^2-2fxu\right]}{\lambda\left[(d_1+d_2)f-d_1d_2\right]}\right\}\mathrm{d}x,  (1)
where a_n(u) is the output of the transform and the intensity pattern A_n(u) is equal to |a_n(u)|^2. The parameters d1 and d2 are the distances from the object to the lens and from the lens to the CCD, respectively. The function a_0(x) is the input field determined by the SLM. The symbols f and λ denote the focal length and the wavelength, respectively. Equation (1) can be converted into the convolution formula
a_n(u) = m_3(u)\left\{\left[a_0(x)\,m_1(x)\right]\otimes m_2(x)\right\}(u),  (2)

where \otimes denotes convolution and

 

Fig. 1 An optical system representing eFrFT for wavefront reconstruction. SLM: spatial light modulator; BE: beam expander; CCD: charge-coupled device.


m_1(x) = \exp\!\left[\frac{-i\pi d_2 x^2}{\lambda\left[(d_1+d_2)f-d_1d_2\right]}\right],
m_2(x) = \exp\!\left[\frac{i\pi f x^2}{\lambda\left[(d_1+d_2)f-d_1d_2\right]}\right],
m_3(u) = \exp\!\left[\frac{-i\pi d_1 u^2}{\lambda\left[(d_1+d_2)f-d_1d_2\right]}\right].  (3)

The functions m1, m2 and m3 are phase-only. Mathematically, the inverse transform of the eFrFT E_{d_1,d_2,f} is E_{d_2,d_1,f} according to Eq. (2). A fast algorithm for the eFrFT can be designed by applying the FFT to Eq. (2). Here the Fourier transform of the function m2 is calculated as

F\{m_2(x)\}(u) = \exp\!\left(\frac{i\pi}{4}\right)\exp\!\left[\frac{-i\pi\lambda\left[(d_1+d_2)f-d_1d_2\right]u^2}{f}\right],  (4)
where F denotes the Fourier transform operator. The 2D expression of the eFrFT can be deduced from Eqs. (1)-(4).
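For reference, a minimal numerical sketch of this convolution scheme is given below in Python/NumPy (1D for brevity). The function name efrft_1d, the grid size and the sampling interval are illustrative assumptions; chirp aliasing, fftshift handling and zero padding, which a careful implementation must address, are ignored here.

import numpy as np

def efrft_1d(a0, d1, d2, f, wl, dx):
    # Extended FrFT of a 1D field via Eq. (2): a_n = m3 * [(a0*m1) conv m2].
    N = a0.size
    x = (np.arange(N) - N // 2) * dx            # symmetric spatial grid
    D = (d1 + d2) * f - d1 * d2                 # common denominator of Eqs. (1)-(3)
    m1 = np.exp(-1j * np.pi * d2 * x**2 / (wl * D))
    m2 = np.exp(1j * np.pi * f * x**2 / (wl * D))
    m3 = np.exp(-1j * np.pi * d1 * x**2 / (wl * D))
    conv = np.fft.ifft(np.fft.fft(a0 * m1) * np.fft.fft(m2))   # circular convolution by FFT
    return m3 * conv * dx                       # dx approximates the integral measure

# illustrative call (units: m): 632.8 nm, f = 200 mm, d1 = 50 mm, d2 = 2f - d1
# field = efrft_1d(obj_1d, 0.05, 0.35, 0.2, 632.8e-9, 10e-6); intensity = np.abs(field)**2

As stated above, an inverse transform can be approximated in the same way by exchanging d1 and d2.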

In Fig. 1, when coherent light illuminates the SLM, the beam carries the object-plane information, is refracted by the lens, and forms a diffraction field behind the lens. We keep the object plane and the image plane fixed and move the lens along the optical axis. The corresponding intensities of the diffraction patterns are then recorded by the CCD and stored in the computer for further data processing. During the initialization, the phase at the object plane can be taken as a constant, considering the uniform illumination of the laser. In the APR algorithm, the amplitude and phase at the object plane are updated for the next loop by averaging the computed data from all measurement branches [16]. Inline holography requires a beam splitter and a recording medium, and a precise reconstruction system is also needed to determine the object information. Compared with it, the proposed measurement system is simple and easy to operate.
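A schematic sketch of the parallel APR loop described above is given below. It assumes 2D forward and inverse eFrFT routines, passed in as the callables efrft and iefrft (hypothetical counterparts of the 1D routine sketched in Section 2), and follows the averaging update of [16]; the function and variable names are our own.

import numpy as np

def apr_reconstruct(intensities, d1_list, d2_list, efrft, iefrft, n_iter=500):
    # Parallel amplitude-phase retrieval from K defocused intensity images.
    # efrft(g, d1, d2): forward eFrFT to a measurement plane;
    # iefrft(u, d1, d2): the corresponding inverse transform.
    amps = [np.sqrt(I) for I in intensities]            # measured moduli
    g = np.ones_like(amps[0], dtype=complex)            # constant-phase initial guess
    for _ in range(n_iter):
        estimates = []
        for a_meas, d1, d2 in zip(amps, d1_list, d2_list):
            u = efrft(g, d1, d2)                        # propagate to the CCD plane
            u = a_meas * np.exp(1j * np.angle(u))       # impose the measured modulus
            estimates.append(iefrft(u, d1, d2))         # transform back to the object plane
        g = sum(estimates) / len(estimates)             # parallel update: average of all branches
    return g                                            # complex amplitude at the object plane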

3. Results and discussion

In our study, the working wavelength of the laser is chosen as 632.8 nm and the focal length of the lens is 200 mm. The physical side length of the square test images is fixed at 5 mm; it is changed only in sub-section 3.5.

3.1 Effects of support constraint size

The oversampling method, which is an effective way of implementing various phasing algorithms, adapts the magnitude data through zero padding, i.e., A_n = 0 for all pixels n in the padding region, which mathematically acts as a support constraint on the original image [20, 22].

The 256 × 256 binary resolution chart and the 256 × 256 Elaine image [23] (see Fig. 2) serve as the object function a_0(x). Both images are surrounded by a dark (zero-valued) border to simulate images with a support constraint. The width of the dark margin, denoted w and measured in pixels, is proportional to the size of the padding region (Fig. 2(a)). The resulting images are employed as input to the above algorithm. Here the object distance d1 is sequentially taken as 50 mm, 70 mm, 90 mm, 110 mm and 130 mm. Accordingly, the image distance d2 is calculated by the relation d2 = 2f - d1, so that the object and CCD planes remain fixed while the lens moves.
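As a side note, the dark border of width w can be generated, for instance, as follows (a sketch; the function name is our own and np.pad simply surrounds the image with zeros).

import numpy as np

def add_support_border(img, w):
    # Surround an image with a zero-valued (dark) border of width w pixels (Fig. 2).
    img = img.astype(float)
    return img if w == 0 else np.pad(img, pad_width=w, mode='constant', constant_values=0.0)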

 

Fig. 2 Test images with support constraint: (a) binary resolution chart, (b) Elaine, where the dark border represents support constraint.


At the top of Fig. 3, the five intensity images used for the Elaine reconstruction are displayed, followed by the retrieved results after 1500 iterations for different support constraint sizes, where w is equal to 0 or 20. It can be directly perceived that the reconstructions in Fig. 3(b) are quite close to the original resolution chart. Because some pixel values in the resolution chart are zero and thus themselves act as a support constraint, the Elaine picture, whose pixel values range from 28 to 241, is also used as input; a similar result can be seen in Fig. 3(c). Quantitatively, the normalized correlation coefficient (NCC) is calculated to evaluate the similarity between the test image f(x,y) and the reconstructed complex amplitude g(x,y). First, their covariance is defined as

C_{f,g} = \frac{1}{MN}\sum_{x=1}^{M}\sum_{y=1}^{N}\left[f(x,y)-\bar{f}(x,y)\right]\left[g(x,y)-\bar{g}(x,y)\right],  (5)
which is a measure of how much two images change together. Then the NCC is formulated as
\mathrm{NCC} = \frac{C_{f,g}}{\sqrt{C_{f,f}\,C_{g,g}}}.  (6)
Its value ranges from 0 to 1; the larger the NCC, the better the two images match. The resulting coefficients are labeled in red under the reconstructed images in Fig. 3. They all approach 1, indicating close correspondence between model and reconstruction.
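A compact way to evaluate Eqs. (5) and (6) numerically is sketched below; the function name ncc and the convention of comparing amplitude images are our own choices (the 1/MN factors cancel in the ratio).

import numpy as np

def ncc(f_img, g_img):
    # Normalized correlation coefficient of Eqs. (5)-(6).
    fc = f_img - f_img.mean()
    gc = g_img - g_img.mean()
    return np.sum(fc * gc) / np.sqrt(np.sum(fc**2) * np.sum(gc**2))

# e.g. ncc(model, np.abs(reconstruction)) for an amplitude comparison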

 

Fig. 3 Intensity images: (a) for Elaine reconstruction. Reconstructions with the increasing size of support constraint: (b) binary resolution chart, (c) Elaine.


In terms of the convergence speed, the mean square error (MSE) between the model and the reconstruction of the Elaine image is plotted as a function of the number of iterations in Fig. 4. Here the MSE is defined as

\mathrm{MSE} = \frac{1}{MN}\sum_{x=1}^{M}\sum_{y=1}^{N}\left[f(x,y)-g(x,y)\right]^2.  (7)
In general, the curve converges more rapidly as the support constraint size decreases. Without the support constraint, the MSE drops quickly at first and then decreases at a low steady rate, whereas for w = 20 the behavior is reversed. When a 40-pixel-wide dark border is added, the MSE falls off slowly and ends up at a high value.
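Equation (7) translates directly into code, e.g. as follows (a sketch; evaluating it against the current amplitude estimate after every iteration yields curves such as those in Fig. 4).

import numpy as np

def mse(f_img, g_img):
    # Mean square error of Eq. (7) between model f and reconstruction g.
    return np.mean((f_img - g_img)**2)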

 

Fig. 4 Comparison of the convergence speed for different support constraint sizes.


Therefore, it can be concluded that the reconstruction accuracy is unaffected by the support constraint size, but an increasing constraint size slows down the convergence. Thus, when our scheme is applied, adding a support constraint is not recommended. This is a notable advantage over the existing retrieval method [20]. Usually, images with a loose support are more challenging to reconstruct. Our scheme does not require pre-processing of the measured data by adding an artificial support constraint, which simplifies the operation and lessens the computational load. In light of this, the oversampling setting is dropped in the following and only the binary resolution chart is tested.

3.2 Effects of sequential measurement distance

The distance by which the lens is displaced each time also has an impact on the reconstruction quality, because the improvement during the iterations depends on the variation of intensity, which in turn depends on this distance [1]. If the distance Δd is very small, the intensity distributions do not change substantially, which means the frequency-domain constraints will not work well. The discussion in this part determines the minimum sequential measurement distance that still results in satisfactory reconstructions. Here, the median value of d1 is kept at 100 mm and the object distances of the 5 measurements are distributed symmetrically on both sides of this median according to the corresponding interval Δd. The distance d2 is calculated by the relation d2 = 2f - d1.
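The measurement geometry of this sub-section can be generated, for instance, as below (a sketch; the function name and the assumption that d1 + d2 = 2f, i.e. fixed object and CCD planes as in Section 3.1, are ours).

import numpy as np

def measurement_distances(d1_mid, delta_d, n_meas, f):
    # Object distances spaced by delta_d symmetrically about d1_mid; image
    # distances keep d1 + d2 = 2f fixed (object and CCD planes stay put).
    offsets = (np.arange(n_meas) - (n_meas - 1) / 2) * delta_d
    d1 = d1_mid + offsets
    return d1, 2 * f - d1

# e.g. five planes at a 21.7 mm interval (units: mm)
# d1_list, d2_list = measurement_distances(100.0, 21.7, 5, 200.0)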

Figure 5 displays the plot of the NCC as a function of the measurement distance after 500 iterations. It can be observed that the quality of the retrieved image improves as the distance grows. If the correlation threshold is set to 0.99, a measurement distance of 21.7 mm can be taken as the minimum value for a satisfactory reconstruction of this particular object wavefront.

 

Fig. 5 Reconstructions of a binary resolution chart for increasing sequential measurement distance.


With regard to the convergence speed, Fig. 6 suggests that the larger the measurement distance, the more rapidly the APR algorithm converges. The curves with distances less than 21.7 mm stagnate at a high MSE value, indicating that when the measurement distance is too small, additional iterations cannot improve the reconstruction quality and only entail longer processing time. The other two curves keep decreasing. Nevertheless, the measurement distance should not be too large either, considering the limited detector sensing area in practice: unacceptable reconstructions result if the main pattern of the measured data exceeds the CCD sensing area.

 

Fig. 6 Comparison of the convergence speed for different sequential measurement distances.


3.3 Effects of number of intensity measurements

Due to various experimental factors and phase randomization, the appropriate number of intensity measurements for a satisfactory wavefront reconstruction is not easy to pin down. If the number is small, the reconstruction will be poor; if it is large, the computing time will surge. Hence the intention of this discussion is to estimate the appropriate number of intensity measurements. Here, the object distance d1 is taken according to Table 1. The corresponding image distance d2 is calculated by the relation d2 = 2f - d1.

Table 1. The assigned values of the object distance d1. (Unit: mm)

From Fig. 7(a), it can be observed that using more measured intensity images indeed speeds up the convergence. The curves for 6 and 8 measurements keep decreasing, while the other two curves stagnate at an early stage. Evidently, the slope of the 8-measurement curve is much steeper than that of the 6-measurement curve. The final MSE value of each curve also indicates that retrieval from 8 measurements is the best.

 

Fig. 7 Simulations of the effects of number of intensity measurements on the reconstruction quality.


Figure 7(b) shows a matrix of representative reconstructions to reveal the effect of the number of intensity measurements on the reconstruction quality as the iteration proceeds. Moving from top to bottom, the number of intensity measurements rises from 2 to 8 with an interval of 2. Moving from left to right, the number of iterations increases from 100 to 500 with an interval of 100. It is notable that the top two rows, which use fewer than 5 measurements, do not yield acceptable retrieved results even after 500 iterations; the reconstructed images are blurry and full of noise. For quantitative analysis, the reconstructions are correlated with the original object and the results are labeled in red above each reconstruction. In conclusion, 8 measurements is a proper choice for satisfactory retrieval in this discussion, taking the computation time into account.

3.4 Comparison of eFrFT and two-step Fresnel transform

As demonstrated in Section 2, single-lens imaging can be seen as an eFrFT from the viewpoint of information optics. In terms of wave optics, this process consists of two Fresnel transforms separated by the phase modulation of the single lens [17]. Naturally, we want to compare these two descriptions. Here the object distance d1 is sequentially taken as 90 mm, 120 mm, 150 mm, 180 mm and 210 mm, and the image distance d2 is determined by d2 = 2f - d1.
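A sketch of the two-step Fresnel model used for this comparison is given below (transfer-function Fresnel propagation with the thin-lens phase applied between the two steps; the function names, the square-grid assumption and the sampling parameters are illustrative choices, not part of the original analysis).

import numpy as np

def fresnel_prop(u, z, wl, dx):
    # Fresnel propagation over distance z by the transfer-function method.
    N = u.shape[0]
    fx = np.fft.fftfreq(N, d=dx)
    FX, FY = np.meshgrid(fx, fx, indexing='ij')
    H = np.exp(1j * 2 * np.pi * z / wl - 1j * np.pi * wl * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

def two_step_fresnel(a0, d1, d2, f, wl, dx):
    # Object plane -> free space d1 -> thin-lens phase -> free space d2 -> CCD plane.
    N = a0.shape[0]
    x = (np.arange(N) - N // 2) * dx
    X, Y = np.meshgrid(x, x, indexing='ij')
    lens = np.exp(-1j * np.pi * (X**2 + Y**2) / (wl * f))
    return fresnel_prop(fresnel_prop(a0, d1, wl, dx) * lens, d2, wl, dx)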

Figure 8 gives a comparison of the reconstructions of the two methods after 500 iterations. It is hard to compare their reconstruction quality with the naked eye, so the MSE is calculated and labeled under the two reconstructions, which shows that the image retrieved by the eFrFT is better than the two-step Fresnel transform result. Besides, it can be observed that the two convergence curves are both similar to an inclined, distorted letter 'w'. They descend slowly during the initial iterations, plunge after an inflection point, and end up close to a slanted line. For the eFrFT curve the inflection point comes earlier, so its final converged MSE value is smaller. This shows the superiority of the eFrFT over the two-step Fresnel transform for the same number of iterations. The finding may be attributed to the systematic difference: the latter involves two propagation steps whereas the former is a single transform, introducing less accumulated error. Moreover, the computational load of the eFrFT convolution scheme is lower than that of the two-step Fresnel transform.

 

Fig. 8 Comparison of reconstructions by eFrFT and two-step Fresnel Transform.


3.5 Comparison between focusing and defocusing

Traditionally, a scaled intensity distribution can be detected on the plane conjugate to the object plane when the Gaussian lens formula

\frac{1}{d_1}+\frac{1}{d_2}=\frac{1}{f}  (8)

is applied to single-lens imaging. Thus, we need to demonstrate the advantages of our scheme over the direct focusing measurement (DFM). In the comparison, the object distance d1 is kept at 200 mm for both methods. In the DFM, the image distance d2 is calculated from the Gaussian lens formula and the result is obtained by applying the eFrFT once. For the multiple-image defocusing measurement (MDM), d2 is taken as 300 mm, 400 mm, 500 mm, 700 mm and 800 mm sequentially; these planes lie on both sides of the conjugate plane. The number of iterations is 500 for each test.

Figure 9(a) illustrates the measured amplitude distribution error curve as the physical side length of the observed object increases from 1 mm to 6 mm. It is shown that the MDM curve always lies below the DFM curve. The smaller MSE of the MDM curve indicates that the image reconstructed by our scheme is closer to the ground truth than the direct intensity detection result. For example, when the side length is 3 mm, the error distribution between the reconstructed amplitude data from the MDM and the ground truth is displayed in Fig. 9(c); its average is on the order of 10^-15, much smaller than the counterpart from the DFM in Fig. 9(b). The results suggest that our scheme outperforms the DFM in intensity reconstruction when the observed object is of micro-size. Combined with the intrinsic phase-retrieval capability, our scheme thus shows great potential for microscopic imaging.

 

Fig. 9 Comparison of reconstructions by direct focusing measurement (DFM) and multiple-image defocusing measurement (MDM): (a) LMSE curve, (b) and (c) are error distribution from DFM and MDM, respectively.


4. Conclusions

We have presented a reconstruction method using the APR algorithm in eFrFT domains, together with a fast convolution-based eFrFT algorithm. The effects of different parameters on the reconstruction quality have been investigated and the results can be summarized as follows: (1) the reconstruction accuracy of our scheme does not rely on the support constraint size, which is different from the APR performance in Fresnel domains. (2) The quality of the reconstructed images improves significantly as the measurement interval increases within a reasonable range, considering the practical CCD size. (3) The reconstruction quality is enhanced with an increasing number of intensity measurements. The convolution scheme of the eFrFT outperforms the two-step Fresnel transform analysis in reconstruction accuracy and computational load. The APR measurement with multiple defocused images is superior to the traditional direct focusing measurement, in terms of phase retrieval and intensity reconstruction, for small objects.

The oversampling process can be dropped, which simplifies the calculation. Besides, the optical structure has several other advantages, such as simple experimental implementation, easy operation and controllable diffraction patterns. These features make our method quite promising. We anticipate that it will find applications in coherent diffraction imaging, especially for micro-sized objects.

Acknowledgment

This work was supported by the National Natural Science Foundation of China (NSFC) (Nos. 61377016, 61575055 and 61575053), the Fundamental Research Funds for the Central Universities (No. HIT.BRETIII.201406), the Program for New Century Excellent Talents in University (No. NCET-12-0148), the China Postdoctoral Science Foundation (Nos. 2013M540278 and 2015T80340), and the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China. The authors are indebted to the reviewers for their helpful suggestions and comments.

References and links

1. P. Almoro, G. Pedrini, and W. Osten, "Complete wavefront reconstruction using sequential intensity measurements of a volume speckle field," Appl. Opt. 45(34), 8596–8605 (2006).

2. G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photonics 7(9), 739–745 (2013).

3. R. P. Millane, "Phase retrieval in crystallography and optics," J. Opt. Soc. Am. A 7(3), 394–411 (1990).

4. J. C. Dainty and J. R. Fienup, "Phase retrieval and image reconstruction for astronomy," in Image Recovery: Theory and Application, H. Stark, ed. (Academic, 1987), 231–275.

5. J. Miao, P. Charalambous, J. Kirz, and D. Sayre, "Extending the methodology of X-ray crystallography to allow imaging of micrometre-sized non-crystalline specimens," Nature 400(6742), 342–344 (1999).

6. R. W. Gerchberg and W. O. Saxton, "A practical algorithm for the determination of phase from image and diffraction plane pictures," Optik (Jena) 35, 237–246 (1972).

7. V. Soifer, V. Kotlyar, and L. Doskolovich, Iterative Methods for Diffractive Optical Elements Computation (Taylor & Francis, 1997).

8. G. Z. Yang, B. Z. Dong, B. Y. Gu, J. Y. Zhuang, and O. K. Ersoy, "Gerchberg-Saxton and Yang-Gu algorithms for phase retrieval in a nonunitary transform system: a comparison," Appl. Opt. 33(2), 209–218 (1994).

9. J. R. Fienup, "Phase retrieval algorithms: a comparison," Appl. Opt. 21(15), 2758–2769 (1982).

10. G. Cheng, C. Wei, J. Tan, K. Chen, S. Liu, Q. Wu, and Z. Liu, "A review of iterative phase retrieval for measurement and encryption," Opt. Lasers Eng., doi:.

11. J. A. Rodrigo, H. Duadi, T. Alieva, and Z. Zalevsky, "Multi-stage phase retrieval algorithm based upon the gyrator transform," Opt. Express 18(2), 1510–1520 (2010).

12. J. A. Rodrigo, T. Alieva, G. Cristóbal, and M. L. Calvo, "Wavefield imaging via iterative retrieval based on phase modulation diversity," Opt. Express 19(19), 18621–18635 (2011).

13. P. Bao, F. Zhang, G. Pedrini, and W. Osten, "Phase retrieval using multiple illumination wavelengths," Opt. Lett. 33(4), 309–311 (2008).

14. G. Pedrini, W. Osten, and Y. Zhang, "Wave-front reconstruction from a sequence of interferograms recorded at different planes," Opt. Lett. 30(8), 833–835 (2005).

15. F. Mosso, E. Peters, and D. G. Pérez, "Complex wavefront reconstruction from multiple-image planes produced by a focus tunable lens," Opt. Lett. 40(20), 4623–4626 (2015).

16. Z. Liu, C. Guo, J. Tan, Q. Wu, L. Pan, and S. Liu, "Iterative phase amplitude retrieval from multiple images in gyrator domains," J. Opt. 17, 025701 (2015).

17. C. Guo, J. Tan, and Z. Liu, "Precision influence of a phase retrieval algorithm in fractional Fourier domains from position measurement error," Appl. Opt. 54(22), 6940–6947 (2015).

18. J. Hua, L. Liu, and G. Li, "Extended fractional Fourier transforms," J. Opt. Soc. Am. A 14(12), 3316–3322 (1997).

19. Y. Shechtman, Y. C. Eldar, O. Cohen, H. N. Chapman, J. Miao, and M. Segev, "Phase retrieval with application to optical imaging," IEEE Signal Process. Mag. 32(3), 87–109 (2015).

20. Z. Liu, C. Shen, J. Tan, and S. Liu, "A recovery method of double random phase encoding system with a parallel phase retrieval," IEEE Photonics J. 8(1), 7801807 (2016).

21. A. W. Lohmann, "Image rotation, Wigner rotation, and the fractional Fourier transform," J. Opt. Soc. Am. A 10(10), 2181–2186 (1993).

22. J. Miao and D. Sayre, "On possible extensions of X-ray crystallography through diffraction-pattern oversampling," Acta Crystallogr. A 56(6), 596–605 (2000).

23. http://sipi.usc.edu/database/
