## Abstract

Lensless imaging based on multi-wavelength phase retrieval has become a promising and widely used technology thanks to its simple acquisition, miniaturized size and low-cost setup. However, measuring the sample-to-sensor distance with high accuracy, which is the key to high-resolution reconstruction, remains a challenge. In this work, we propose a multi-wavelength criterion for autofocusing, i.e., determining the sample-to-sensor distance with much higher accuracy than conventional methods. Three beams of different spectra illuminate the sample, and the resulting holograms are recorded by a CCD camera. The patterns calculated by back propagating the recorded holograms, with the sample-to-sensor distance searched exhaustively, are used to evaluate the criterion. Image sharpness can thereby be assessed, and the optimal sample-to-sensor distance can be finely determined by locating the valley of the curve given by the criterion. With this multi-wavelength autofocusing strategy followed by a further phase retrieval process, high-resolution images can finally be retrieved. The applicability and robustness of our method are validated both in simulations and in experiments. Our technique provides a useful tool for multi-wavelength lensless imaging under limited experimental conditions.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

## 1. Introduction

Lensless in-line holographic microscopy (LIHM) is a promising noninvasive imaging technique [1] with several attractive attributes. First, it records the integrated wavefront information without any lens, and the sample can be reconstructed algorithmically, so the system can be assembled in a smaller and less costly implementation. Second, the field of view (FOV) of the acquired holograms is equal to the active area of the sensor chip, so the trade-off between FOV and resolution is decoupled. As a result, the technique has been widely used in many fields, such as point-of-care diagnostic devices [2], 3D motion tracking [3] and optofluidic microscopy [4]. The simplest implementation of lensless imaging is in-line holography, in which multiple holograms are recorded by moving the camera mechanically [5–7], adjusting the wavelength of the light source [8,9], changing the angle of the light source [10–12] or using a spatial light modulator [13]. Meanwhile, multi-wavelength lensless imaging has attracted wide attention in recent years [14–19]. As in conventional methods, the digital refocusing and reconstruction procedures rely heavily on precise knowledge of the focus distance, and inaccurate estimation of the sample-to-sensor distance degrades the imaging resolution. Various autofocusing methods have been proposed for lensless imaging applications, such as self-entropy [20], the dual-wavelength criterion [21,22], the edge sparsity criterion [23], structure tensor measurement [24], local intensity variance [25,26], spectral norms [27], the phase criterion [28] and data analysis [29–41]. Recently, deep learning has also been applied to autofocusing [42–44]. Accordingly, an effective autofocusing method needs to be unimodal over a wide range of defocus distances and precise for various samples [45].

In this paper, we propose a robust multi-wavelength autofocusing (MWAF) method for high-resolution multi-wavelength lensless imaging. Light sources with three wavelengths, i.e., 488 nm, 532 nm and 632.8 nm, are employed to record the diffraction holograms. Reconstructed patterns are then obtained by inverse propagation of the holograms over an exhaustively searched range of sample-to-sensor distances, and the variation of the MWAF criterion function is calculated for each candidate distance. The minimum of the MWAF variation corresponds to the accurate sample-to-sensor distance. After this MWAF-based localization, high-resolution images can be retrieved from the accurately determined distance, combined with a robust multi-wavelength phase retrieval algorithm. Examining the simulation and experimental results, our method exhibits better anti-noise performance than conventional methods. Moreover, we further apply our method to multi-wavelength single-shot recording, in which the three major channels (red, green and blue) of a color sensor work simultaneously to record the patterns generated by illuminating the sample with a white laser. The three wavelength channels are then extracted, after which our autofocusing method is performed. The corresponding results show that the MWAF model can retrieve a clear image at the optimal distance in single-shot observation.

## 2. System setup and theory

#### 2.1. Experimental platform

The experimental schematic of our system is illustrated in Fig. 1. A white laser with a bandwidth ranging from 350 nm to 2000 nm (YSL Photonics) is used as the light source, and three wavelengths (488 nm, 532 nm and 632.8 nm) are selected by three color filters with 10 nm bandwidth (FL488-10, FL532-10, FL632.8-10, Thorlabs). These color filters are mounted on a motorized filter wheel (FW102C, 6-position filter wheel, Thorlabs) for automatic switchover. A color CCD camera (Shanghai Optical Instrument Factory, 6.5 μm × 6.5 μm pixel pitch) is used to record the interference patterns. The distance between the wheel and the CCD camera is around 10 cm. To achieve a higher numerical aperture, the CCD, with its objective lens removed, is placed close to the sample for on-chip measurement. Two types of samples, a USAF resolution target (R3L3S1P, Thorlabs) and a biological specimen, are used for experimental validation.

#### 2.2. MWAF algorithm

In the paraxial regime, the propagation of a light wave from *E*(*x*′, *y*′, 0) over a distance *z*, expressed as *E*(*x*, *y*, *z*), can be described by the Fresnel diffraction integral [46]:

$$E\left(x,y,z\right)=\frac{\mathrm{exp}\left(j2\pi z/\lambda \right)}{j\lambda z}\iint E\left({x}^{\prime},{y}^{\prime},0\right)\mathrm{exp}\left\{\frac{j\pi}{\lambda z}\left[{\left(x-{x}^{\prime}\right)}^{2}+{\left(y-{y}^{\prime}\right)}^{2}\right]\right\}d{x}^{\prime}d{y}^{\prime},$$

where *λ* is the wavelength of the light source and *z* is the distance from the sample plane to the camera sensor. Significantly, Eq. (1) indicates that Fresnel diffraction depends on distance and wavelength equivalently, through their product, which means different combinations of distance and wavelength values can be utilized when performing phase retrieval algorithms [47–49], as explored by previous works.
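The distance–wavelength equivalence noted above can be checked numerically: in the paraxial transfer function, the quadratic phase depends on *λ* and *z* only through the product *λz*. The following NumPy sketch is our own illustration (the function name and the assumed 6.5 μm pixel pitch are ours):

```python
import numpy as np

def fresnel_tf(nx, ny, dx, wavelength, z):
    """Paraxial (Fresnel) transfer function exp[-j*pi*lambda*z*(fx^2 + fy^2)],
    up to a constant global phase factor."""
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy, indexing="ij")
    return np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))

# Two (wavelength, distance) pairs with the same product lambda*z
# yield the same Fresnel kernel:
H1 = fresnel_tf(256, 256, 6.5e-6, 532e-9, 3e-3)
H2 = fresnel_tf(256, 256, 6.5e-6, 632.8e-9, 3e-3 * 532.0 / 632.8)
print(np.allclose(H1, H2))  # True
```

This is why, in multi-wavelength phase retrieval, switching the wavelength plays a role analogous to scanning the sample-to-sensor distance.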

As shown in Fig. 1, the experimental scheme requires no sample or camera movement and relies only on the diffraction patterns measured at the three incident wavelengths. The commonly used numerical reconstruction algorithms based on the Fresnel–Kirchhoff diffraction formula can be categorized into the Fresnel transform method, the angular spectrum method and the convolution method. Since the angular spectrum method is robust against under-sampling [50] and is not restricted to small angles, it is utilized in our work to realize the backward propagation and the multi-wavelength phase retrieval between the sample and camera planes at the different wavelengths. The propagating kernel is expressed as

$$O={F}^{-1}\left\{F\left\{\sqrt{I}\right\}\cdot {H}_{z}^{\ast}\left({f}_{x},{f}_{y}\right)\right\},\qquad {H}_{z}\left({f}_{x},{f}_{y}\right)=\mathrm{exp}\left[j\frac{2\pi z}{{\lambda}_{n}}\sqrt{1-{\left({\lambda}_{n}{f}_{x}\right)}^{2}-{\left({\lambda}_{n}{f}_{y}\right)}^{2}}\right],$$

where *f _{x}* and *f _{y}* represent the spatial frequencies along the horizontal and vertical coordinates, respectively, and *λ _{n}* denotes the three chosen wavelengths. *F* and *F*^{−1} represent the forward and inverse Fourier transforms, respectively, the symbol “*” represents the complex conjugate operator, and *I* and *O* represent the intensity pattern acquired under a single measurement and the recovered image after inverse propagation by the angular spectrum method. As illustrated in Eq. (2), for a given *λ _{n}*, knowledge of the distance *z* is usually a prerequisite for the subsequent phase retrieval step. To determine the accurate distance value, the object images related to the three wavelengths, i.e., O _{1}, O _{2} and O _{3}, are numerically recovered by inverse Fresnel propagation of the recorded holograms, with the distance values searched step by step in an estimated range, i.e., [0 mm, 10 mm]. The three reconstructed images are then used in the proposed three-wavelength criterion, called the MWAF method, to determine the focus plane. The MWAF method is expressed as a minimum-variation criterion over O _{1}, O _{2} and O _{3}, the recovered images from the raw holograms at the three chosen wavelengths, where *N _{x}* and *N _{y}* denote the number of spatial pixels. In this analysis the searching range [0 mm, 10 mm] from the camera plane to the sample plane, i.e., the range of the distance *z*, is larger than the estimated initial distance. The value of MWAF decreases as the estimated position approaches the focus distance, and the calculated distance corresponds to the minimum of the generated curve. Once the autofocusing distance is obtained, the holograms at the three different wavelengths are used to perform the multi-wavelength phase retrieval for the reconstruction of high-resolution objects.
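As an illustration of the search procedure, a minimal NumPy sketch of the angular spectrum back propagation and the distance scan is given below. The exact functional form of the paper's MWAF criterion is not reproduced here; the `mwaf` function assumes a pixel-wise variance among the amplitudes of the three back-propagated holograms, averaged over all pixels, consistent with the minimum-variation description. All function and variable names are ours:

```python
import numpy as np

def angular_spectrum(field, wavelength, z, dx):
    """Propagate a complex field over distance z (z < 0: back propagation)
    using the angular spectrum transfer function."""
    nx, ny = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy, indexing="ij")
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # evanescent cut-off
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def mwaf(holograms, wavelengths, z, dx):
    """Assumed MWAF criterion: pixel-wise variance among the amplitudes of the
    three back-propagated holograms, averaged over the image (minimal at focus)."""
    recon = [np.abs(angular_spectrum(np.sqrt(I), lam, -z, dx))
             for I, lam in zip(holograms, wavelengths)]
    return np.mean(np.var(np.stack(recon), axis=0))

def search_focus(holograms, wavelengths, dx, z_range):
    """Exhaustive scan: return the candidate distance minimizing the criterion."""
    scores = [mwaf(holograms, wavelengths, z, dx) for z in z_range]
    return z_range[int(np.argmin(scores))]
```

Scanning `z_range` and taking the argmin reproduces the valley-shaped curve described above; at the true focus the three reconstructions agree most closely, so their pixel-wise variance is smallest.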

As proved in [51,52], multi-wavelength phase recovery leads to reconstructions free of twin-image artifacts in lensless imaging. As shown in Fig. 2, the flowchart of the serial-mode multi-wavelength phase retrieval algorithm can be given as in [8,10,18].

- (i) Assume ${\mu}_{O}^{k}={A}^{k}\mathrm{exp}\left[j{\phi}^{k}\right]$ as the *k*-th complex-valued guess of the object plane (the index *k* runs in groups of three, one per wavelength), where *A*^{k} and *φ*^{k} are the amplitude and phase values. When *k* = 1, the values of *A*^{1} and *φ*^{1} are given randomly, which only affects the number of iterations taken for convergence.
- (ii) The complex-valued guess is then propagated to the image space with the red wavelength, and the *k*-th guess of the image plane is expressed as
$${\mu}_{O\to R}^{k}={V}_{r}^{k}\mathrm{exp}\left[j{\widehat{\phi}}_{r}^{k}\right]=FS{T}_{z}\left\{{A}_{r}^{k}\mathrm{exp}\left[j{\phi}_{r}^{k}\right]\right\}\left(k=1+3i\right),$$
where *FST _{z}* is the forward free-space propagation operator based on the angular spectrum method, and the subscript *O*→*R* represents the propagation from the object plane to the image plane. ${A}_{r}^{k}$ and ${\phi}_{r}^{k}$ are the amplitude and phase of the object with the red wavelength. ${\mu}_{O\to R}^{k}$ is the complex-valued image in the camera space, in which ${V}_{r}^{k}$ and ${\widehat{\phi}}_{r}^{k}$ are the amplitude and phase values.
- (iii) Inverse propagation is carried out by replacing the calculated amplitude with the square root of the measured intensity on the camera, i.e., ${V}_{r}^{k}\text{=}\sqrt{{I}_{r}}$, while retaining the calculated phase. The backward propagation can be written as
$${\mu}_{R\to O}^{k}=FS{T}_{z}^{-1}\left\{\sqrt{{I}_{r}}\mathrm{exp}\left[j{\widehat{\phi}}_{r}^{k}\right]\right\}\left(k=1+3i\right),$$
where $FS{T}_{z}^{-1}$ is the backward free-space propagation operator based on the angular spectrum method. ${\mu}_{R\to O}^{k}$ represents the updated complex-valued object, and the subscript denotes propagation from the image plane to the object plane. The updated ${\mu}_{R\to O}^{k}$ is used as the input ${\mu}_{O}^{k+1}$ for the (*k* + 1)-th iteration. The green/blue wavelengths are processed in the same way as the red one, with the forward and backward propagation changed to
$${\mu}_{O\to R}^{k}={V}_{g}^{k}\mathrm{exp}\left[j{\widehat{\phi}}_{g}^{k}\right]=FS{T}_{z}\left\{{A}_{g}^{k}\mathrm{exp}\left[j{\phi}_{g}^{k}\right]\right\},$$
$${\mu}_{R\to O}^{k}=FS{T}_{z}^{-1}\left\{\sqrt{{I}_{g}}\mathrm{exp}\left[j{\widehat{\phi}}_{g}^{k}\right]\right\}\left(k=2+3i\right),$$
$${\mu}_{O\to R}^{k}={V}_{b}^{k}\mathrm{exp}\left[j{\widehat{\phi}}_{b}^{k}\right]=FS{T}_{z}\left\{{A}_{b}^{k}\mathrm{exp}\left[j{\phi}_{b}^{k}\right]\right\},$$
$${\mu}_{R\to O}^{k}=FS{T}_{z}^{-1}\left\{\sqrt{{I}_{b}}\mathrm{exp}\left[j{\widehat{\phi}}_{b}^{k}\right]\right\}\left(k=3+3i\right).$$
The phase retrieval process performed in the loop is shown in Fig. 2.
- (iv) Repeat steps (ii)–(iii) until *K* iterations are reached or convergence is achieved, at which point the high-resolution amplitude and phase are retrieved.
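The steps above can be sketched as a short NumPy loop. This is a minimal illustration of the serial scheme (amplitude substitution at the camera plane, cycling through the wavelengths); the propagator and names are ours, and the wavelength-dependent object-phase update discussed in the Appendix is not modeled:

```python
import numpy as np

def angular_spectrum(field, wavelength, z, dx):
    """Free-space propagation (FST_z) by the angular spectrum method; z < 0 back propagates."""
    nx, ny = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy, indexing="ij")
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # evanescent cut-off
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def serial_phase_retrieval(holograms, wavelengths, z, dx, n_iter=20):
    """Serial multi-wavelength phase retrieval: cycle through the wavelengths,
    enforcing the measured amplitude at the camera plane each time."""
    guess = np.ones_like(holograms[0], dtype=complex)  # step (i): initial object guess
    for _ in range(n_iter):
        for I, lam in zip(holograms, wavelengths):
            cam = angular_spectrum(guess, lam, z, dx)      # step (ii): forward FST_z
            cam = np.sqrt(I) * np.exp(1j * np.angle(cam))  # step (iii): amplitude substitution
            guess = angular_spectrum(cam, lam, -z, dx)     # step (iii): backward FST_z^-1
    return guess
```

Each pass through the inner loop corresponds to one value of *k*; one outer iteration visits all three wavelengths, matching the *k* = 1 + 3*i*, 2 + 3*i*, 3 + 3*i* indexing above.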

In practice, the algorithm converges to a satisfactory solution within 10 to 20 iterations, and the mean square error (MSE) criterion is used to judge the iterative convergence throughout this paper. Our MWAF method is implemented with three wavelengths, and it achieves better results when more wavelengths are involved. The method transfers information between the object plane and the camera (image) plane. The flow chart of the algorithm is shown in Fig. 3. The main figure contains two parts: the dotted box in the upper left corner shows the multi-wavelength autofocusing, and the rest shows the phase retrieval algorithm. To visualize the procedure better, a computational simulation using the USAF resolution target is presented in detail.

In general, lensless autofocusing can be regarded as an optimization problem designed to trace the maximum (or minimum, depending on the selected criterion) of a chosen criterion. To estimate the precise value for various samples, it is necessary to find a suitable unimodal function. In this work, we compare MWAF against several autofocusing criteria summarized by Langehanenberg et al. [28–30,40], including the variance (Var), gradient (Grad), Laplacian (Lap), phase (PC), CS, Tamura and MC methods, to validate its applicability to different objects. Equations (11)–(13) are some classic functions, and the expressions are presented as

where *N _{x}* and *N _{y}* denote the number of spatial pixels and *O*(*x*, *y*, *z*) represents the pixel values of a single grayscale image in the coordinate system. In the following model validations, we will demonstrate that these traditional methods cannot provide a stable and robust autofocusing search in the case of multi-wavelength lensless imaging.
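For reference, commonly used forms of several of these criteria can be written as below. The normalizations may differ from the paper's Eqs. (11)–(13), so this is an indicative sketch rather than the exact definitions used in the comparison:

```python
import numpy as np

def criterion_var(img):
    """Variance (Var) focus measure."""
    return np.var(img)

def criterion_grad(img):
    """Squared-gradient (Grad) focus measure: sum of squared finite differences."""
    gx = np.diff(img, axis=0)
    gy = np.diff(img, axis=1)
    return np.sum(gx**2) + np.sum(gy**2)

def criterion_lap(img):
    """Laplacian-energy (Lap) focus measure using a 5-point discrete Laplacian."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.sum(lap**2)

def criterion_tamura(img):
    """Tamura coefficient: sqrt(standard deviation / mean) of the intensity."""
    return np.sqrt(np.std(img) / np.mean(img))
```

All four measures increase with image sharpness, which is why defocus blur lowers their values; their behavior over a scan of *z* gives the comparison curves discussed below.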

It is worth noting that the phase shifts generated by the different wavelengths, in both the object and imaging spaces, are not accounted for in the phase retrieval step of MWAF described above [8]. Numerical simulations and experimental results addressing these cases are discussed in the Appendix.

## 3. Simulations and experiments

We first demonstrate the effectiveness of MWAF by numerical simulations. The initial conditions used in the simulations are the same as those in the actual lensless experiments. The images *cameraman* (amplitude) and *testpat1* (phase) provided in MATLAB (256 × 256 pixels) are selected as the complex-valued sample. The sample-to-sensor distance (focusing distance) is set to *Z _{0}* = 3 mm. When MWAF is applied, the focusing distance is searched in the range of [0.1, 10] mm with a step size of 0.1 mm. For comparison, the criterion value is normalized to [0, 1] and is unitless. The determined focusing distance *z* is then fed into the phase retrieval algorithm, which is run for 50 iterations to obtain the reconstructed amplitude and phase. All of the following simulations and experiments are performed on a personal computer (PC) (Intel Xeon E3-1230-v3 CPU at 3.30 GHz, 3.25 GB RAM, Windows 10 64-bit, MATLAB R2014b) with no GPU acceleration. To verify the feasibility and robustness of MWAF, a series of simulations under different conditions were carried out, and the results were compared to conventional autofocusing criteria. The 532 nm wavelength is used to calculate the distance for the Var, Grad, Lap, PC, CS, Tamura and MC methods.

Figures 4(a)–4(c) present the raw diffraction patterns (holograms) at the blue, green and red wavelengths, respectively. In Fig. 4(d), the black curve presents the searched distance value vs. the normalized metric value of the MWAF method. Examining the results, it can be seen that the generated curve is unimodal and decreases significantly near the optimal distance position, reaching its minimum value at the focusing distance of *z* = 3 mm, consistent with the assumed value *Z _{0}*. Figures 4(e) and 4(f) show the reconstructed amplitude and phase, respectively. Figure 4(g) shows the interplay between the autofocusing method and the phase retrieval algorithm.

*l _{1}* is the reconstructed image at the focus position; *l _{2}*, *l _{3}*, *l _{4}* and *l _{2′}*, *l _{3′}*, *l _{4′}* are defocused images on the two sides of the focal plane, respectively, with a spacing of *d* = 0.1 mm. It is evident that an accurate focusing distance can be obtained with the MWAF method, and high-resolution amplitude and phase images are obtained with the multi-wavelength phase retrieval algorithm.

In Fig. 5, the USAF resolution target, the cameraman test and the complex test are used as the samples in the object plane. The normalized curves of the MWAF, Var, Grad, Lap, PC, CS, Tamura and MC criteria are demonstrated, respectively. Five conditions are considered: the noise-free case (NF), Gaussian noise (GN, variance = 0.01), speckle noise (SN), wide bandwidth (WB) and compound noise (CN). As shown in the left three columns of Fig. 5, the initial distance is 1 mm, searched between 0.1 and 2 mm in *z* increments of 0.1 mm (horizontal coordinate), and the criterion value (vertical coordinate) is normalized to [0, 1] and is unitless. The colored curves represent the different methods. The reconstructed amplitudes of the complex test under the different conditions are presented in the right column.

For the noise-free case, in Figs. 5(a)–5(c), the MWAF and Lap methods are overall unimodal, each with a strong global peak/valley at the correct *z* position, as expected. The Grad method exhibits significant oscillations, which make it difficult to search for a peak or valley, especially in the pure-amplitude simulation. Although the Var method gives a clear peak, its curve decreases monotonically across the propagation interval, so it is hard to separate the peak as the distance *z* approaches the camera plane. Tamura and MC are improved methods based on Var, and they share the same problems as the Var method. The PC and CS methods have many local optima; searching for the local minima or maxima within a small range can mitigate this problem.

When zero-mean Gaussian noise with a standard deviation of 1% and speckle noise are added, as shown in Figs. 5(e)–5(g) and Figs. 5(i)–5(k), the Lap method is the most seriously affected among the criteria: its curve becomes fuzzy and the peak disappears. Var, Tamura, MC and Grad have almost the same defects as before. Evidently, our method still has a steep valley. The PC and CS methods cannot find the focus distance in some cases. Next, a change in the monochromaticity of the incident light is introduced: the bandwidth of the illumination is increased from 3 nm to 10 nm. As shown in Figs. 5(m)–5(o), our MWAF method is affected and the curve becomes slightly flat; nevertheless, the minimum can still be calculated. Finally, both influences are mixed, as described in Figs. 5(q)–5(s). In general, the MWAF method can find the optimal focusing distance, while the other methods give suboptimal results in some cases. The robustness of the Lap method against noise is insufficient. The Var, Tamura and MC methods often find a precise distance, but they are fragile for lensless on-chip imaging owing to their monotonicity. The Grad and CS methods have frequent local maxima and minima around the peak, and inaccurate estimation on such choppy curves makes it difficult to find the true distance. The PC method increases cyclically throughout the interval, so it is not shown here. In preliminary summary, MWAF is more robust under noisy conditions. Figures 5(d), 5(h), 5(l), 5(p) and 5(t) show the effects of the several noise types on the reconstructed images.

To clarify the performance of MWAF in a practical configuration, the proposed system is tested with experimentally captured data sets. In the first group of experiments, a positive USAF resolution target is used with the experimental setup shown in Fig. 1. Three diffraction patterns related to the three wavelengths are captured, and the hologram at 488 nm is shown in Fig. 6(a). As shown in Fig. 6(c), the curves obtained by Var and Grad both have two extreme values near 1 mm. The Lap and CS methods can both determine the optimal position, each with a single peak in its curve. The Tamura and MC methods are monotonic over the entire propagation interval, although a small bump can be observed on the focus curve.

The PC method is periodic, with a maximum in each cycle; within an appropriate search range it gives the same result as the MWAF algorithm. The MWAF method shows a clear peak at the focus position, given as the black curve. Among the eight methods, the comparison is conducted among MWAF, Grad and Var, since their results are closest to the MWAF result on either side. To confirm the accuracy of the searched distance values, the reconstructed images corresponding to the three methods are exhibited. Figures 6(d)–6(f) show the reconstructed amplitudes obtained with the multi-wavelength phase retrieval procedure (10 iterations), using the distance values estimated by the MWAF, Grad and Var methods, respectively. As shown in the zoom-in view of the selected area, 228 lines/mm (4.39 µm) can be resolved, and the image recovered based on MWAF is visually superior to the others. Only the MWAF method detects the optimal position close to the true focus distance; the other methods fail or perform suboptimally for the USAF resolution target. According to the line-width calibration of the USAF resolution target, the resolution is nearly 4 µm, which proves that the refocusing distance found by MWAF is accurate.

In Figs. 6(g)–6(i), the intensities along the corresponding colored short lines in Figs. 6(b)–6(f) are plotted, respectively; the scale bar in Fig. 6(a) represents a length of 100 μm. Computational efficiency is another vital characteristic of an autofocusing method. In this set of experiments, the run times of MWAF, Var, Grad, Lap, PC, CS, Tamura and MC are 83 s, 28 s, 39 s, 31 s, 115 s, 50 s, 31 s and 28 s, respectively. In general, the FFTs are the most time-consuming step: MWAF requires three propagations to calculate each data point on the curve, whereas the others require just one (except the PC method). This makes the MWAF method roughly three times slower than the others, which is acceptable in terms of overall performance.

The results above present a qualitative comparison among the eight algorithms. To better compare their performance, criteria such as accuracy, resolution, range and unimodality are used [24,53,54]. The detailed definitions and parameter settings are given in [53]. We use the accuracy metric (AM) and the resolution metric (RM) to quantitatively evaluate accuracy and resolution. The results are given in Table 1, where *z _{f}* denotes the detected focal distance in mm.

It is worth noting that the *z _{f}* values of the PC, Tamura and MC methods are obtained in a local search. In this table, the MWAF method gives the smallest AM value among the eight methods. It also gives a larger RM value, as expected, which reflects the focus-distance standard deviation across the whole propagation interval. Most importantly, in lensless on-chip imaging it is often impossible to estimate the exact initial position, so a relatively large search interval (at the level of several millimeters) is necessary, and unimodality must be taken into consideration first. Only the MWAF method gives a unimodal and smooth curve over the whole range. PC, Tamura and MC can obtain accurate values under small-range search conditions, while the Var, Grad, Lap and CS methods give wrong results.

To validate the effectiveness of MWAF under the wide-bandwidth condition, and to demonstrate that the MWAF algorithm can be applied to biological samples (nontransparent, mixed amplitude and phase-contrast objects), an ant specimen is adopted to analyze the impact of laser bandwidth on the MWAF method. Narrow-band filters at 488 nm, 532 nm and 632.8 nm with 3 nm bandwidth (Edmund) are additionally selected. Keeping the distance from the specimen to the camera plane unchanged, we recorded six diffraction patterns under the two groups of bandwidth values and then calculated the two curves shown in Fig. 7.

Figures 7(a) and 7(b) show that the optimal focus distances found by MWAF under the two bandwidths are consistent. The curves remain smooth and unimodal, each with a steep peak, verifying that the proposed algorithm is robust to the laser monochromaticity. Figures 7(c) and 7(d) show the reconstructed amplitude and phase of the ant specimen when the 10 nm bandwidth filters are adopted. It can be clearly observed that the ant has sharp edges and a clean background, which is the mark of the focus plane. Unlike that of the binary object, the curve is smoother owing to the thickness of the biological sample, which is equivalent to a multi-layer object. Refocusing on all layers of the thick specimen is not feasible, so the curve is rounded at the peak.

To characterize the retrieved USAF resolution target under a broadband laser, in the third experiment a white laser is employed to directly illuminate the negative USAF resolution target. As shown in Fig. 8, a single-shot diffraction pattern is recorded. The red, green and blue holograms are extracted from the color hologram by virtue of the RGB color filters within the CCD sensor. The wavelengths of the three primary colors, 700 nm, 546.1 nm and 435.8 nm, are set as the propagation wavelengths for the Fresnel diffraction [55]. The optimal focus distance found by MWAF is given in Fig. 8(b); MWAF can still calculate the optimal distance, and the curve keeps a steep valley. For better visual judgment, the zoom-in areas in Figs. 8(a) and 8(c) show that finer features can be resolved compared with the acquired raw pattern: the second element of the sixth group can be distinguished in the reconstruction, while it cannot be made out in the unprocessed hologram. Through the three distinct configurations in our experiments, we validate that the proposed MWAF method outperforms existing autofocusing algorithms, providing superior accuracy and robustness for various samples. The results of the USAF resolution target and the biological specimen provide strong evidence for the reliability of the method. This demonstrates that MWAF is feasible even under noisy conditions and illustrates its practical usefulness for holographic microscopy.

## 4. Conclusion

In summary, we demonstrate a precise and robust autofocusing method (MWAF) based on a minimum-variation criterion. The main idea of this technique is to extract the minimum difference among the inversely propagated patterns. The performance of the proposed MWAF method is demonstrated both theoretically and experimentally. The retrieved results across different experimental environments show that MWAF finds the optimal distance accurately and robustly, even under noisy conditions. Our work paves a new way to measure the focusing distance and reconstruct high-resolution images for lensless imaging systems.

Much work remains to be done. Firstly, it is worth studying the cases where the existing illumination unit is replaced with laser diodes or color LEDs. The MWAF algorithm will become more flexible when this constraint is relaxed: the algorithm could then be applied to partially coherent or even incoherent sources, and the experimental device could be made more compact and cost-efficient. Secondly, relative motion of the sample and camera during the sampling process may be caused by external environmental vibration, generating position errors. As a result, MWAF may fail when the position error exceeds a limit, which is a present deficiency of our method. In future work, we aim to refine MWAF by adding position-error correction. Thirdly, although the proposed algorithm shows promising results compared to the conventional methods, it is computationally time-consuming; the computational complexity is mostly determined by the numbers of raw holograms and pixels. This issue could be addressed with parallel operation or GPU acceleration.

## Appendix

In the above simulations, a simplified input phase, generated by normalizing the phase image (denoted by *B*) and then forming exp(*jπB*/2), denoted by ‘phase (0,1)’, is adopted to validate the feasibility of our method. To generalize the applications to a wider range, a wavelength-dependent input phase is also considered and simulated in our work. When the wavelength is taken into account, the updated phase is given as exp(*jπB*′*/λ _{k}*) (*k* = 1, 2, 3), denoted by ‘phase (lambda)’, where *B*′ = 2*B* × 10^{−7} and *λ _{k}* relates to the three chosen wavelengths, i.e., 488 nm, 532 nm and 632.8 nm.

When the effect of wavelength on the phase values is also considered, the difference in phase shift caused by switching wavelengths, both in the object space and in the imaging space, is accounted for during the iterations, and the retrieved object images are shown in Fig. 9. In Fig. 9, ‘Old’ refers to the current phase retrieval method that does not consider the wavelength-dependent difference in phase modulation, while ‘New’ represents the phase retrieval method that adjusts the phase values when switching the wavelength [8]. From the figure, we can see that the ‘Old’ phase retrieval shows slightly better performance on the retrieved amplitude and phase images.

The comparison of the simplified input phase vs. the wavelength-dependent input phase, and of the ‘Old’ vs. ‘New’ phase retrieval, is shown in Fig. 9. From the two groups of figures, it can be seen that with the more general phase model (right figures) the performance is slightly worse than in the simplified case, but the difference is negligible in most cases. It can also be concluded from the figures that our method works well when the wavelength-dependent phase shift is considered. Compared to the ‘New’ phase retrieval method, our current algorithm works slightly better, as observed from the recovered amplitude and phase images in both cases. The MSE between the two phases recovered by the two methods, defined as $MSE=\frac{1}{{N}_{x}{N}_{y}}{\displaystyle {\sum}_{x=1}^{{N}_{x}}{\displaystyle {\sum}_{y=1}^{{N}_{y}}{\left(Phas{e}_{1}-Phas{e}_{2}\right)}^{2}}}$, is 3 × 10^{−15}, indicating no significant difference between the two phase retrieval methods. The experimental results (not presented here) also show no significant difference between the two phase retrieval methods.
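The two phase encodings and the MSE used above can be sketched as follows (a minimal illustration; the input image *B* is assumed normalized to [0, 1], and the function names are ours):

```python
import numpy as np

def phase_simple(B):
    """'phase (0,1)': normalized phase image B mapped to exp(j*pi*B/2)."""
    return np.exp(1j * np.pi * B / 2)

def phase_lambda(B, wavelength):
    """'phase (lambda)': wavelength-dependent phase exp(j*pi*B'/lambda), B' = 2B x 1e-7."""
    return np.exp(1j * np.pi * (2 * B * 1e-7) / wavelength)

def mse(phase1, phase2):
    """Mean square error between two retrieved phase maps (Appendix definition)."""
    return np.mean((phase1 - phase2) ** 2)
```

With *B*′ = 2*B* × 10^{−7} m, the phase excursion π*B*′/*λ _{k}* stays on the order of one radian for the visible wavelengths used here, so the two encodings are of comparable magnitude but wavelength dependent only in the second case.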

## Funding

National Natural Science Foundation of China (61805057); Young Elite Scientists Sponsorship Program (2018QNRC001); Equipment Pre-research Filed Fund (6140923020102).

## Acknowledgments

The authors acknowledge the support of Advanced Microscopy and Instrumentation Research Center.

## Disclosures

The authors declare that there are no conflicts of interest related to this article.

## References

**1. **O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip **10**(11), 1417–1428 (2010).

**2. **M. Lee, O. Yaglidere, and A. Ozcan, “Field-portable reflection and transmission microscopy based on lensless holography,” Biomed. Opt. Express **2**(9), 2721–2730 (2011).

**3. **J. F. Restrepo and J. Garcia-Sucerquia, “Automatic three-dimensional tracking of particles with high-numerical-aperture digital lensless holographic microscopy,” Opt. Lett. **37**(4), 752–754 (2012).

**4. **X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. **105**(31), 10670–10675 (2008).

**5. **L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. **24**(11), 1511–1515 (2004).

**6. **C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express **25**(14), 16235–16249 (2017).

**7. **G. Pedrini, W. Osten, and Y. Zhang, “Wave-front reconstruction from a sequence of interferograms recorded at different planes,” Opt. Lett. **30**(8), 833–835 (2005).

**8. **D. W. E. Noom, K. S. E. Eikema, and S. Witte, “Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction,” Opt. Lett. **39**(2), 193–196 (2014).

**9. **N. Warnasooriya and M. K. Kim, “LED-based multi-wavelength phase imaging interference microscopy,” Opt. Express **15**(15), 9239–9247 (2007).

**10. **C. Zuo, J. Sun, J. Zhang, Y. Hu, and Q. Chen, “Lensless phase microscopy and diffraction tomography with multi-angle and multi-wavelength illuminations using a LED matrix,” Opt. Express **23**(11), 14314–14328 (2015).

**11. **C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. **7**(1), 7562 (2017).

**12. **C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. **106**, 17–23 (2018).

**13. **T. Meeser, C. von Kopylow, and C. Falldorf, “Advanced Digital Lensless Fourier Holography by means of a Spatial Light Modulator,” in *2010 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video* (IEEE, 2010), 1–4.

**14. **A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods **9**(9), 889–895 (2012).

**15. **P. Bao, G. Situ, G. Pedrini, and W. Osten, “Lensless phase microscopy using phase retrieval with multiple illumination wavelengths,” Appl. Opt. **51**(22), 5486–5494 (2012).

**16. **W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express **18**(11), 11181–11191 (2010). [CrossRef] [PubMed]

**17. **B. Yu, “Simulation study of phase retrieval for hard X-ray in-line phase contrast imaging,” Sci. China Ser. G **48**(4), 450 (2005). [CrossRef]

**18. **M. Sanz, J. A. Picazo-Bueno, J. García, and V. Micó, “Improved quantitative phase imaging in lensless microscopy by single-shot multi-wavelength illumination using a fast convergence algorithm,” Opt. Express **23**(16), 21352–21365 (2015). [CrossRef] [PubMed]

**19. **Y. Wu and A. Ozcan, “Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring,” Methods **136**, 4–16 (2018). [CrossRef] [PubMed]

**20. **J. Gillespie and R. A. King, “The use of self-entropy as a focus measure in digital holography,” Pattern Recognit. Lett. **9**(1), 19–25 (1989). [CrossRef]

**21. **P. Gao, B. Yao, R. Rupp, J. Min, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, and T. Ye, “Autofocusing based on wavelength dependence of diffraction in two-wavelength digital holographic microscopy,” Opt. Lett. **37**(7), 1172–1174 (2012). [CrossRef] [PubMed]

**22. **Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. **58**(4), 1 (2019). [CrossRef]

**23. **Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. **42**(19), 3824–3827 (2017). [CrossRef] [PubMed]

**24. **Z. Ren, N. Chen, and E. Y. Lam, “Automatic focusing for multisectional objects in digital holography using the structure tensor,” Opt. Lett. **42**(9), 1720–1723 (2017). [CrossRef] [PubMed]

**25. **L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. **6**(4), 396–400 (2004). [CrossRef]

**26. **M. Lyu, C. Yuan, D. Li, and G. Situ, “Fast autofocusing in digital holography using the magnitude differential,” Appl. Opt. **56**(13), F152–F157 (2017). [CrossRef] [PubMed]

**27. **W. Li, N. C. Loomis, Q. Hu, and C. S. Davis, “Focus detection from digital in-line holograms based on spectral l1 norms,” J. Opt. Soc. Am. A **24**(10), 3054–3062 (2007). [CrossRef] [PubMed]

**28. **J. Dohet-Eraly, C. Yourassowsky, and F. Dubois, “Fast numerical autofocus of multispectral complex fields in digital holographic microscopy with a criterion based on the phase in the Fourier domain,” Opt. Lett. **41**(17), 4071–4074 (2016). [CrossRef] [PubMed]

**29. **P. Memmolo, M. Iannone, M. Ventre, P. A. Netti, A. Finizio, M. Paturzo, and P. Ferraro, “On the holographic 3D tracking of in vitro cells characterized by a highly-morphological change,” Opt. Express **20**(27), 28485–28493 (2012). [CrossRef] [PubMed]

**30. **A. He, W. Xiao, and F. Pan, “Automatic focus determination through cosine and modified cosine score in digital holography,” Opt. Eng. **56**(3), 034103 (2017). [CrossRef]

**31. **P. Picart, S. Montresor, O. Sakharuk, and L. Muravsky, “Refocus criterion based on maximization of the coherence factor in digital three-wavelength holographic interferometry,” Opt. Lett. **42**(2), 275–278 (2017). [CrossRef] [PubMed]

**32. **T. Seyler, M. Fratz, T. Beckmann, A. Schiller, A. Bertz, and D. Carl, “Extending the Depth of Field beyond Geometrical Imaging Limitations Using Phase Noise as a Focus Measure in Multiwavelength Digital Holography,” Appl. Sci. (Basel) **8**(7), 1042 (2018). [CrossRef]

**33. **F. Dubois, C. Schockaert, N. Callens, and C. Yourassowsky, “Focus plane detection criteria in digital holography microscopy by amplitude analysis,” Opt. Express **14**(13), 5895–5908 (2006). [CrossRef] [PubMed]

**34. **M. Antkowiak, N. Callens, C. Yourassowsky, and F. Dubois, “Extended focused imaging of a microparticle field with digital holographic microscopy,” Opt. Lett. **33**(14), 1626–1628 (2008). [CrossRef] [PubMed]

**35. **L. Yu and L. Cai, “Iterative algorithm with a constraint condition for numerical reconstruction of a three-dimensional object from its hologram,” J. Opt. Soc. Am. A **18**(5), 1033–1045 (2001). [CrossRef] [PubMed]

**36. **M. Liebling and M. Unser, “Autofocus for digital Fresnel holograms by use of a Fresnelet-sparsity criterion,” J. Opt. Soc. Am. A **21**(12), 2424–2430 (2004). [CrossRef] [PubMed]

**37. **L. Xu, M. Mater, and J. Ni, “Focus detection criterion for refocusing in multi-wavelength digital holography,” Opt. Express **19**(16), 14779–14793 (2011). [CrossRef] [PubMed]

**38. **P. Ferraro, S. Grilli, D. Alfieri, S. De Nicola, A. Finizio, G. Pierattini, B. Javidi, G. Coppola, and V. Striano, “Extended focused image in microscopy by digital Holography,” Opt. Express **13**(18), 6738–6749 (2005). [CrossRef] [PubMed]

**39. **C. Guo, Y. Zhao, J. Tan, S. Liu, and Z. Liu, “Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination,” Opt. Express **26**(11), 14407–14420 (2018). [CrossRef] [PubMed]

**40. **P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, “Autofocusing in digital holographic phase contrast microscopy on pure phase objects for live cell imaging,” Appl. Opt. **47**(19), D176–D182 (2008). [CrossRef] [PubMed]

**41. **P. Ferraro, G. Coppola, S. De Nicola, A. Finizio, and G. Pierattini, “Digital holographic microscope with automatic focus tracking by detecting sample displacement in real time,” Opt. Lett. **28**(14), 1257–1259 (2003). [CrossRef] [PubMed]

**42. **Z. Ren, Z. Xu, and E. Y. Lam, “Learning-based nonparametric autofocusing for digital holography,” Optica **5**(4), 337 (2018). [CrossRef]

**43. **T. Pitkäaho, A. Manninen, and T. J. Naughton, “Focus prediction in digital holographic microscopy using deep convolutional neural networks,” Appl. Opt. **58**(5), A202–A208 (2019). [CrossRef] [PubMed]

**44. **S. Jiang, J. Liao, Z. Bian, K. Guo, Y. Zhang, and G. Zheng, “Transform- and multi-domain deep learning for single-frame rapid autofocusing in whole slide imaging,” Biomed. Opt. Express **9**(4), 1601–1612 (2018). [CrossRef] [PubMed]

**45. **F. C. A. Groen, I. T. Young, and G. Ligthart, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry **6**(2), 81–91 (1985). [CrossRef] [PubMed]

**46. **J. W. Goodman, *Introduction to Fourier Optics*, 2nd ed. (McGraw-Hill, 1996).

**47. **C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. Part II: Attacking optical encryption systems,” Appl. Opt. **54**(15), 4709–4719 (2015). [CrossRef] [PubMed]

**48. **C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. **105**, 54–59 (2018). [CrossRef]

**49. **C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. I: optimization,” Appl. Opt. **54**(15), 4698–4708 (2015). [CrossRef] [PubMed]

**50. **A. J. Jerri, “The Shannon sampling theorem—Its various extensions and applications: A tutorial review,” Proc. IEEE **65**(11), 1565–1596 (1977). [CrossRef]

**51. **Y. S. Kim, T. Kim, S. S. Woo, H. Kang, T. C. Poon, and C. Zhou, “Speckle-free digital holographic recording of a diffusely reflecting object,” Opt. Express **21**(7), 8183–8189 (2013). [CrossRef] [PubMed]

**52. **J. J. Barton, “Removing multiple scattering and twin images from holographic images,” Phys. Rev. Lett. **67**(22), 3106–3109 (1991). [CrossRef] [PubMed]

**53. **E. S. R. Fonseca, P. T. Fiadeiro, M. Pereira, and A. Pinheiro, “Comparative analysis of autofocus functions in digital in-line phase-shifting holography,” Appl. Opt. **55**(27), 7663–7674 (2016). [CrossRef] [PubMed]

**54. **S. K. Mohammed, L. Bouamama, D. Bahloul, and P. Picart, “Quality assessment of refocus criteria for particle imaging in digital off-axis holography,” Appl. Opt. **56**(13), F158–F166 (2017). [CrossRef] [PubMed]

**55. **Y. Sun, C. Lou, Z. Jiang, and H. Zhou, “Experimental research of representative wavelengths of tricolor for color CCD camera,” Opt. Lett. **42**(2), 275–278 (2017). [PubMed]