Abstract

Lensless imaging based on multi-wavelength phase retrieval has become a promising technology thanks to its simple acquisition, miniaturized size and low-cost setup. However, measuring the sample-to-sensor distance with high accuracy, which is the key to high-resolution reconstruction, remains a challenge. In this work, we propose a multi-wavelength criterion for autofocusing that determines the sample-to-sensor distance with much higher accuracy than conventional methods. Three beams in different spectral bands illuminate the sample, and the resulting holograms are recorded by a CCD camera. Patterns obtained by back-propagating the recorded holograms over an exhaustively searched range of sample-to-sensor distances are used to evaluate the criterion. Image sharpness is thereby assessed, and the optimal sample-to-sensor distance is finely determined by locating the valley of the criterion curve. With this multi-wavelength autofocusing strategy followed by a phase retrieval process, high-resolution images are finally retrieved. The applicability and robustness of our method are validated in both simulations and experiments. Our technique provides a useful tool for multi-wavelength lensless imaging under limited experimental conditions.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Lensless in-line holographic microscopy (LIHM) is a promising noninvasive imaging technique [1] that is attractive for several reasons. First, it records the integrated wavefront information without any lens, and the sample can be reconstructed algorithmically, so the system can be assembled in a smaller and less costly implementation. Second, the field of view (FOV) of the acquired holograms equals the active area of the sensor chip, so the trade-off between FOV and resolution is decoupled. As a result, the technique has been widely used in many fields, such as point-of-care diagnostic devices [2], 3D motion tracking [3] and optofluidic microscopy [4]. The simplest implementation of lensless imaging is in-line holography, in which multiple holograms are recorded by moving the camera mechanically [5–7], adjusting the wavelength of the light source [8,9], changing the angle of the light source [10–12] or using a spatial light modulator [13]. Meanwhile, multi-wavelength lensless imaging has attracted wide attention in recent years [14–19]. As with conventional methods, such digital refocusing and reconstruction procedures rely heavily on precise knowledge of the focus distance: inaccurate estimation of the sample-to-sensor distance degrades the imaging resolution. Various autofocusing methods have been proposed for lensless imaging applications, such as self-entropy [20], the dual-wavelength criterion [21,22], the edge sparsity criterion [23], structure tensor measurement [24], local intensity variance [25,26], spectral norms [27], the phase criterion [28] and data analysis [29–41]. Recently, deep learning has also been applied to autofocusing [42–44]. Accordingly, an effective autofocusing method should produce a curve that is unimodal over a wide range of defocus distances and should be accurate for various samples [45].

In this paper, we propose a robust multi-wavelength autofocusing (MWAF) method for high-resolution multi-wavelength lensless imaging. In this method, light sources at three wavelengths, i.e., 488 nm, 532 nm and 632.8 nm, are employed to record the diffraction holograms. The recorded holograms are then back-propagated with exhaustively searched sample-to-sensor distances, and the reconstructed patterns are used to evaluate the MWAF criterion function. The minimum of the MWAF curve corresponds to the accurate sample-to-sensor distance. After this MWAF-based localization, high-resolution images are retrieved at the determined distance using a robust multi-wavelength phase retrieval algorithm. In both simulations and experiments, our method exhibits better noise robustness than conventional methods. Moreover, we extend our method to multi-wavelength single-shot recording, in which the three channels (red, green and blue) of a color sensor simultaneously record the patterns generated by illuminating the sample with a white laser. The three wavelength channels are then extracted, and our autofocusing method is applied. The corresponding results show that the MWAF model can retrieve a clear image at the optimal distance in single-shot observation.

2. System setup and theory

2.1. Experimental platform

The experimental schematic of our system is illustrated in Fig. 1. A white laser with a bandwidth ranging from 350 nm to 2000 nm (YSL Photonics) is used as the light source, and three wavelengths (488 nm, 532 nm and 632.8 nm) are selected by three color filters with 10 nm bandwidth (FL488-10, FL532-10, FL632.8-10, Thorlabs). These color filters are mounted on a motorized filter wheel (FW102C, 6-position filter wheel, Thorlabs) for automatic switchover. A color CCD camera (Shanghai Optical Instrument Factory, 6.5 μm × 6.5 μm pixel size) records the diffraction patterns. The distance between the wheel and the CCD camera is around 10 cm. To achieve a higher numerical aperture, the CCD, with its objective lens removed, is placed close to the sample for on-chip measurement. Two types of samples, a USAF resolution target (R3L3S1P, Thorlabs) and a biological specimen, are used for experimental validation.

 

Fig. 1 Experimental schematic of the multi-wavelength lensless imaging system. A white (broadband) laser beam passes through the filter wheel (three of its positions are used in the experiments), and the filtered light illuminates the sample. The diffraction patterns are recorded by a color CCD camera sensor.


2.2. MWAF algorithm

In the paraxial regime, the propagation of a light wave E(x', y', 0) over a distance z, giving E(x, y, z), can be described by the diffraction integral [46]:

$$E(x,y,z)=\frac{e^{i2\pi z/\lambda}}{i\lambda z}\iint E(x',y',0)\,e^{(i\pi/\lambda z)\left[(x-x')^{2}+(y-y')^{2}\right]}\,\mathrm{d}x'\,\mathrm{d}y',\tag{1}$$

where λ is the wavelength of the light source and z is the distance from the sample plane to the camera sensor. Significantly, Eq. (1) indicates that Fresnel diffraction propagation depends equivalently on distance and wavelength, which means that either different distances or different wavelengths can be exploited when performing phase retrieval algorithms [47–49], as explored in previous works.

As shown in Fig. 1, the experimental scheme requires no sample or camera movement and relies only on the diffraction patterns measured at the three incident wavelengths. The commonly used numerical reconstruction algorithms based on the Fresnel–Kirchhoff diffraction formula can be categorized as the Fresnel transform method, the angular spectrum method and the convolution method. Since the angular spectrum method has excellent resistance to under-sampling [50] and is not restricted to small angles, it is utilized here to realize the backward propagation and the multi-wavelength phase retrieval between the sample and camera planes at the different wavelengths adopted in our work. The propagation kernel is expressed as

$$H_n(f_x,f_y)=\begin{cases}\exp\left(i\dfrac{2\pi z}{\lambda_n}\sqrt{1-(\lambda_n f_x)^{2}-(\lambda_n f_y)^{2}}\right), & \text{if } (\lambda_n f_x)^{2}+(\lambda_n f_y)^{2}<1,\\[4pt] 0, & \text{otherwise,}\end{cases}\tag{2}$$
$$O=\mathcal{F}^{-1}\left[\mathcal{F}(I)\,H_n^{*}\right],\tag{3}$$
where fx and fy represent the spatial frequencies along the horizontal and vertical coordinates, respectively, and λn denotes the three chosen wavelengths. $\mathcal{F}$ and $\mathcal{F}^{-1}$ represent the forward and inverse Fourier transforms, respectively; the symbol "*" denotes the complex conjugate; I and O represent the intensity pattern acquired in a single measurement and the image recovered by inverse propagation with the angular spectrum method. As Eq. (2) illustrates, for each λn, knowledge of the distance z is a prerequisite for the subsequent phase retrieval step. To determine the distance accurately, the object images at the three wavelengths, i.e., O1, O2 and O3, are numerically recovered by inverse propagation of the recorded holograms at distance values searched step by step over an estimated range, e.g., [0 mm, 10 mm]. The three reconstructed images are then used in the proposed three-wavelength criterion, termed the MWAF method, to determine the focal plane. The MWAF criterion is expressed as
$$\mathrm{MWAF}(z)=\frac{1}{3}\sum_{x=1}^{N_x}\sum_{y=1}^{N_y}\Big(\big||O_1|-|O_2|\big|^{2}+\big||O_1|-|O_3|\big|^{2}+\big||O_2|-|O_3|\big|^{2}\Big)^{2},\tag{4}$$
where O1, O2 and O3 are the images recovered from the raw holograms at the three chosen wavelengths, and Nx and Ny denote the numbers of pixels. The search range [0 mm, 10 mm] for the camera-to-sample distance z is chosen to be larger than the estimated initial distance. The value of MWAF decreases as the estimated position approaches the focal distance, and the calculated distance corresponds to the minimum of the resulting curve. Once the autofocusing distance is obtained, the holograms at the three wavelengths are used in the multi-wavelength phase retrieval to reconstruct high-resolution objects.
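The back propagation and the criterion above can be condensed into a short NumPy sketch (a minimal illustration, not the authors' code; the pixel pitch `dx`, the grid size and the function names are our own assumptions):

```python
import numpy as np

def back_propagate(I, wavelength, z, dx):
    """Inverse angular-spectrum propagation of a recorded intensity I,
    i.e. O = F^-1[F(I) Hn*], with evanescent frequencies set to zero."""
    ny, nx = I.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, dx), np.fft.fftfreq(ny, dx))
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(I) * np.conj(H))

def mwaf(I1, I2, I3, wavelengths, z, dx):
    """MWAF criterion at a candidate distance z: pairwise differences of
    the amplitudes of the three back-propagated holograms."""
    O1, O2, O3 = (np.abs(back_propagate(I, lam, z, dx))
                  for I, lam in zip((I1, I2, I3), wavelengths))
    d = (O1 - O2) ** 2 + (O1 - O3) ** 2 + (O2 - O3) ** 2
    return np.sum(d ** 2) / 3.0
```

The focal distance is then the minimizer of `mwaf` over the searched grid, e.g. `z_hat = min(z_grid, key=lambda z: mwaf(I1, I2, I3, lams, z, dx))`.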

As proved in [51,52], multi-wavelength phase recovery yields a reconstruction free of twin-image artifacts in lensless imaging. The serial-mode multi-wavelength phase retrieval algorithm [8,10,18], whose flowchart is shown in Fig. 2, proceeds as follows.

 

Fig. 2 A single loop of the serial alternating iteration. The colored arrows represent propagation at the corresponding wavelength; k denotes the k-th iteration.


  • (i) Assume $\mu_O^{k}=A^{k}\exp[j\varphi^{k}]$ is the k-th complex-valued guess of the object plane, where $A^{k}$ and $\varphi^{k}$ are its amplitude and phase. When k = 1, the values of A¹ and φ¹ are chosen randomly, which only affects the number of iterations needed for convergence.
  • (ii) The complex-valued guess is then propagated to the image space with the red wavelength, and the k-th guess of the image plane is expressed as
    $$\mu_{OR}^{k}=V_r^{k}\exp[j\hat{\varphi}_r^{k}]=\mathrm{FST}_z\left\{A_r^{k}\exp[j\varphi_r^{k}]\right\}\quad(k=1+3i),\tag{5}$$

    where $\mathrm{FST}_z$ is the forward free-space propagation operator based on the angular spectrum method, and the subscript OR denotes propagation from the object plane to the image plane. $A_r^{k}$ and $\varphi_r^{k}$ are the amplitude and phase of the object at the red wavelength. $\mu_{OR}^{k}$ is the complex-valued field in the camera space, with amplitude $V_r^{k}$ and phase $\hat{\varphi}_r^{k}$.

  • (iii) Inverse propagation starts by replacing the calculated amplitude with the square root of the measured intensity on the camera, i.e., $V_r^{k}=\sqrt{I_r}$, while retaining the calculated phase. The backward propagation can be written as
    $$\mu_{RO}^{k}=\mathrm{FST}_z^{-1}\left\{\sqrt{I_r}\exp[j\hat{\varphi}_r^{k}]\right\}\quad(k=1+3i),\tag{6}$$

    where $\mathrm{FST}_z^{-1}$ is the backward free-space propagation operator based on the angular spectrum method, and $\mu_{RO}^{k}$ represents the updated complex-valued object, the subscript denoting propagation from the image plane to the object plane. This updated $\mu_{RO}^{k}$ becomes the input $\mu_O^{k+1}$ for the (k + 1)-th iteration. The green and blue wavelengths are processed in the same way as the red one, with the forward and backward propagations changed to

    $$\mu_{OR}^{k}=V_g^{k}\exp[j\hat{\varphi}_g^{k}]=\mathrm{FST}_z\left\{A_g^{k}\exp[j\varphi_g^{k}]\right\},\tag{7}$$
    $$\mu_{RO}^{k}=\mathrm{FST}_z^{-1}\left\{\sqrt{I_g}\exp[j\hat{\varphi}_g^{k}]\right\}\quad(k=2+3i),\tag{8}$$
    $$\mu_{OR}^{k}=V_b^{k}\exp[j\hat{\varphi}_b^{k}]=\mathrm{FST}_z\left\{A_b^{k}\exp[j\varphi_b^{k}]\right\},\tag{9}$$
    $$\mu_{RO}^{k}=\mathrm{FST}_z^{-1}\left\{\sqrt{I_b}\exp[j\hat{\varphi}_b^{k}]\right\}\quad(k=3+3i).\tag{10}$$

    The phase retrieval process performed in each loop is shown in Fig. 2.

  • (iv) Repeat steps (ii)–(iii) until K iterations are reached or convergence is achieved, at which point the high-resolution amplitude and phase are retrieved.

The model predictions show satisfactory solution quality; convergence typically takes only 10 to 20 iterations, and the mean square error (MSE) criterion is used to judge convergence throughout this paper. Our MWAF method is implemented with three wavelengths and yields better results when more wavelengths are involved. The method transfers information between the object plane and the camera (image) plane. The flowchart of the algorithm is shown in Fig. 3: the dotted box in the upper left corner shows the multi-wavelength autofocusing, and the rest shows the phase retrieval algorithm. To visualize the procedure, a detailed computational simulation is conducted using a USAF resolution target.

 

Fig. 3 The flowchart of MWAF and the multi-wavelength phase retrieval. (a) MWAF algorithm; the combination of colored curved arrows and ㊀ indicates subtraction of the recovered low-resolution images at the corresponding wavelengths. (b) Object plane; the colored arrows represent the Fresnel diffraction propagation. (c) Camera plane. (d) Recovered amplitude and phase images.


In general, lensless autofocusing can be regarded as an optimization problem that traces the maximum (or minimum, depending on the selected criterion) of a focus metric. To estimate a precise distance for various samples, a suitable unimodal function is needed. In this work, we compare MWAF against several autofocusing criteria summarized by Langehanenberg et al. [28–30,40], including variance (Var), gradient (Grad), Laplacian (Lap), phase (PC), CS, Tamura and MC, to validate its applicability to various objects. Equations (11)–(13) give three classic metrics:

$$\mathrm{Var}(z)=\frac{1}{N_xN_y}\sum_{x,y}\left[|O(x,y,z)|-\overline{|O(x,y,z)|}\right]^{2},\tag{11}$$
$$\mathrm{Grad}(z)=\sum_{x=1}^{N_x-1}\sum_{y=1}^{N_y-1}\left[|O(x,y,z)|-|O(x-1,y,z)|\right]^{2}+\left[|O(x,y,z)|-|O(x,y-1,z)|\right]^{2},\tag{12}$$
$$\mathrm{Lap}(z)=\sum_{x=1}^{N_x-1}\sum_{y=1}^{N_y-1}\left[|O(x+1,y,z)|+|O(x-1,y,z)|+|O(x,y+1,z)|+|O(x,y-1,z)|-4|O(x,y,z)|\right]^{2},\tag{13}$$
where Nx and Ny denote the numbers of pixels and O(x, y, z) represents a single grayscale image reconstructed at distance z. In the following validations, we demonstrate that these traditional methods cannot achieve a stable and robust autofocusing search in the case of multi-wavelength lensless imaging.
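For reference, Equations (11)–(13) translate directly into NumPy (a sketch; the function names and array slicing conventions are ours):

```python
import numpy as np

def var_metric(amp):
    """Eq. (11): variance of the reconstructed amplitude."""
    return np.mean((amp - amp.mean()) ** 2)

def grad_metric(amp):
    """Eq. (12): sum of squared first differences along x and y."""
    return np.sum((amp[1:, 1:] - amp[:-1, 1:]) ** 2 +
                  (amp[1:, 1:] - amp[1:, :-1]) ** 2)

def lap_metric(amp):
    """Eq. (13): sum of the squared 5-point discrete Laplacian."""
    lap = (amp[2:, 1:-1] + amp[:-2, 1:-1] + amp[1:-1, 2:] + amp[1:-1, :-2]
           - 4.0 * amp[1:-1, 1:-1])
    return np.sum(lap ** 2)
```

All three metrics are large when the amplitude image contains sharp structure, which is why Var, Grad and Lap are used as sharpness-based focus criteria.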

It is worth noting that the phase shift in both the object and image spaces generated by the different wavelengths, which arises when phase retrieval is performed in MWAF, is not discussed above [8]. Numerical simulations and experimental results addressing these cases are given in the Appendix.

3. Simulations and experiments

We first demonstrate the effectiveness of MWAF by numerical simulations. The initial conditions used in the simulations match those of the actual lensless experiments. The cameraman image (amplitude) and the testpat1 image (phase), both 256 × 256 and provided in MATLAB, are combined into a complex-valued sample. The sample-to-sensor (focusing) distance is set to Z0 = 3 mm. When MWAF is applied, the focusing distance is searched in the range [0.1, 10] mm with a step size of 0.1 mm. For comparison, the metric values are normalized to [0, 1] (dimensionless). The determined focusing distance z is then fed into the phase retrieval algorithm, and 50 iterations are executed to obtain the reconstructed amplitude and phase. All the following simulations and experiments were run on a personal computer (Intel Xeon E3-1230-v3 CPU at 3.30 GHz, 3.25 GB RAM, Windows 10 64-bit, MATLAB R2014b) without GPU acceleration. To verify the feasibility and robustness of MWAF, a series of simulations under different conditions was carried out and compared to conventional autofocusing criteria. The 532 nm wavelength is used to calculate the distance for the Var, Grad, Lap, PC, CS, Tamura and MC methods.
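The grid search and normalization just described amount to a simple scan (a sketch; `criterion` is a hypothetical placeholder metric with a minimum at 3 mm, standing in for the MWAF value computed from the holograms):

```python
import numpy as np

# Candidate sample-to-sensor distances: 0.1 mm to 10 mm in 0.1 mm steps.
z_grid = np.arange(1, 101) * 0.1e-3

def criterion(z):
    # Placeholder focus metric (our own toy function) with a minimum at
    # Z0 = 3 mm; in practice this would be the MWAF criterion value.
    return (z - 3e-3) ** 2

scores = np.array([criterion(z) for z in z_grid])
# Normalize to [0, 1] (dimensionless), as done for the plotted curves.
scores = (scores - scores.min()) / (scores.max() - scores.min())
z_hat = z_grid[np.argmin(scores)]  # estimated focusing distance
```

The normalization does not move the minimum, so `z_hat` equals the raw minimizer of the metric.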

Figures 4(a)–4(c) present the raw diffraction patterns (holograms) at the blue, green and red wavelengths, respectively. In Fig. 4(d), the black curve plots the normalized MWAF metric against the searched distance. The curve is unimodal and drops sharply near the optimal position, reaching its minimum at the focusing distance z = 3 mm, consistent with the assumed value Z0. Figures 4(e) and 4(f) show the reconstructed amplitude and phase, respectively. Figure 4(g) shows the interplay between the autofocusing method and the phase retrieval algorithm: l1 is the image reconstructed at the focal position, while l2, l3, l4, l2', l3' and l4' are defocused images on either side of the focal plane with a spacing of d = 0.1 mm. Clearly, the MWAF method yields an accurate focusing distance, and high-resolution amplitude and phase images are obtained with the multi-wavelength phase retrieval algorithm.

 

Fig. 4 The results of MWAF algorithm. (a-c) The Fresnel diffraction patterns at different wavelengths. (d) MWAF normalized metric value curve. (e-f) The reconstructed amplitude and phase images using 50 iterations phase retrieval. (g) The interplay between the autofocusing method and the phase retrieval algorithm.


In Fig. 5, a USAF resolution target, the cameraman test image and a complex test object are used as samples in the object plane. The normalized curves of the MWAF, Var, Grad, Lap, PC, CS, Tamura and MC criteria are shown for five conditions: noise-free (NF), Gaussian noise (GN, variance = 0.01), speckle noise (SN), wide bandwidth (WB) and compound noise (CN). In the left three columns of Fig. 5, the initial distance is 1 mm and the search runs from 0.1 to 2 mm with z increments of 0.1 mm (horizontal coordinate); the metric values (vertical coordinate) are normalized to [0, 1] (dimensionless). The colored curves represent the different methods. The reconstructed amplitudes of the complex test object under the different conditions are shown in the right column.

 

Fig. 5 The normalized metric curves of the eight algorithms under different noise conditions. The first three columns represent different simulated samples, and the fourth column is the recovered amplitude of the complex test corresponding to the third column. (a)-(c) The noise-free case (NF). (e)-(g) The metric curves in the case of adding Gaussian noise (GN, variance = 0.01). (i)-(k) The metric curves in the case of adding speckle noise (SN). (m)-(o) The metric curves in the case of wide bandwidth (WB). (q)-(s) The curves of compound noise (CN).


For the noise-free case, in Figs. 5(a)–5(c), the MWAF and Lap methods are overall unimodal, each with a strong global peak or valley at the correct z position, as expected. The Grad method exhibits significant oscillations, which makes it difficult to locate a peak or valley, especially in the pure-amplitude simulation. Although the Var method gives an obvious peak, its curve decreases monotonically across the propagation interval, so it is hard to isolate the peak as the distance z approaches the camera plane. Tamura and MC are improved variants of Var and share its problems. The PC and CS methods have many local optima; searching for the local minima or maxima within a small range can mitigate this problem.

When zero-mean Gaussian noise with a standard deviation of 1% and speckle noise are added, as shown in Figs. 5(e)–5(g) and Figs. 5(i)–5(k), the Lap method is the most seriously affected of the criteria: its curve becomes fuzzy and the peak disappears. Var, Tamura, MC and Grad show almost the same defects as before. Evidently, our method still has a steep valley, while the PC and CS methods fail to find the focal distance in some cases. Next, a change in the monochromaticity of the incident light is considered, with the illumination bandwidth increased from 3 nm to 10 nm. As shown in Figs. 5(m)–5(o), our MWAF method is noticeably affected: the curve becomes slightly flat, but the minimum can still be found. Finally, both disturbances are combined, as shown in Figs. 5(q)–5(s). In general, the MWAF method finds the optimal focusing distance, whereas the other methods give suboptimal results in some cases. The noise robustness of the Lap method is insufficient. The Var, Tamura and MC methods always yield a precise distance, but their monotonicity makes them fragile for lensless on-chip imaging. The Grad and CS methods have frequent local maxima and minima around the peak, and the resulting choppy curves make accurate estimation difficult. The PC method increases cyclically throughout the interval and is therefore not shown here. In summary, MWAF is the most robust under noisy conditions. Figures 5(d), 5(h), 5(l), 5(p) and 5(t) show the effects of the various noises on the reconstructed images.

To clarify the performance of MWAF in a practical configuration, the proposed system is tested with experimentally captured data sets. In the first group of experiments, a positive USAF resolution target is used with the setup shown in Fig. 1. Three diffraction patterns corresponding to the three wavelengths are captured; the 488 nm hologram is shown in Fig. 6(a). As shown in Fig. 6(c), the curves obtained by Var and Grad both have two extrema near 1 mm. The Lap and CS methods can both determine the optimal position, each with a single peak in its curve. The Tamura and MC methods are monotonic over the entire propagation interval, although a small bump can be observed on the focus curve.

 

Fig. 6 The retrieved results of USAF resolution target. (a) The intensity captured by the CCD camera. (b, d) The reconstructed phase and amplitude images based on MWAF method. (c) The normalized metric curve obtained by the eight methods. (e-f) The reconstruction of the amplitude based on the Grad and Var method using 10 iterations. (g-i) The horizontal-line intensity profile of the marked lines in (d-f), respectively. The scale bar corresponds to 100 μm and is applicable to all images.


The PC method is periodic, with a maximum in each cycle; within an appropriate search range it gives the same result as the MWAF algorithm. The MWAF method shows a clear extremum at the correct position, given by the black curve. Among the eight methods, the comparison focuses on MWAF, Grad and Var, since their results are closest to the MWAF result from both sides. To confirm the accuracy of the searched and determined distance values, the images reconstructed with the three methods are exhibited. Figures 6(d)–6(f) show the amplitudes reconstructed by the multi-wavelength phase retrieval procedure (10 iterations), using the distances determined by the MWAF, Grad and Var methods, respectively. As shown in the zoomed-in area, 228 line pairs/mm (4.39 µm) can be resolved, and the image recovered with MWAF is visually superior to the others. Only the MWAF method detects the optimal position, close to the true focal distance; the other methods fail or perform suboptimally for the USAF resolution target. According to the line-width calibration of the USAF resolution target, the resolution is nearly 4 µm, confirming that the refocusing distance found by MWAF is accurate.

In Figs. 6(g)–6(i), the intensity profiles along the correspondingly colored short lines in Figs. 6(d)–6(f) are plotted; the scale bar in Fig. 6(a) represents a length of 100 μm. Computational efficiency is another vital characteristic of an autofocusing method. In this set of experiments, the run times of MWAF, Var, Grad, Lap, PC, CS, Tamura and MC are 83 s, 28 s, 39 s, 31 s, 115 s, 50 s, 31 s and 28 s, respectively. FFTs are typically the most time-consuming step; MWAF requires three propagations to compute each data point on the curve, whereas the others (except PC) need only one, which makes MWAF roughly three times slower. Given its overall performance, this cost is acceptable.

The results above present a qualitative comparison of the eight algorithms. To compare their performance more rigorously, criteria such as accuracy, resolution, range and unimodality are used [24,53,54]; the detailed definitions and parameter settings are given in [53]. We use the accuracy metric (AM) and the resolution metric (RM) to quantitatively evaluate accuracy and resolution. The results are given in Table 1, where zf denotes the detected focal distance in mm.


Table 1. Performance of focus metrics for eight autofocusing methods

It is worth noting that the zf values of the PC, Tamura and MC methods are obtained by a local search. In this table, the MWAF method gives the smallest AM value of the eight methods. It also gives a larger RM value, as expected, which reflects the standard deviation of the focal distance across the whole propagation interval. Most importantly, in lensless on-chip imaging it is often impossible to estimate the exact initial position, so a relatively large search interval (several millimeters) is necessary, and unimodality must therefore be considered first. Only the MWAF method gives a unimodal, smooth curve over the whole range. PC, Tamura and MC can obtain accurate values under small-range search conditions, while the Var, Grad, Lap and CS methods give wrong results.

To validate the effectiveness of MWAF under wide-bandwidth conditions, and to show that the MWAF algorithm applies to biological samples (nontransparent, mixed amplitude and phase-contrast objects), an ant specimen is used to analyze the impact of laser bandwidth on the MWAF method. Narrow-band filters at 488 nm, 532 nm and 632.8 nm (3 nm bandwidth, Edmund) are additionally selected. Keeping the distance from the specimen to the camera plane unchanged, we recorded six diffraction patterns under the two bandwidths and calculated the two curves shown in Fig. 7.

 

Fig. 7 (a) The normalized metric curves obtained by MWAF at 10 nm and 3 nm bandwidth. (c-d) The recovered amplitude and phase of the biological specimen. The white bar in the lower-left corner corresponds to 300 μm.


Figures 7(a) and 7(b) show that the optimal focal distances found by MWAF under the two bandwidths agree. The curves remain smooth and unimodal, each with a fairly steep extremum, which verifies that the proposed algorithm is robust to the laser monochromaticity. Figures 7(c) and 7(d) show the reconstructed amplitude and phase of the ant specimen with the 10 nm bandwidth filter. The ant shows sharp edges against a clear background, the mark of the focal plane. Unlike for a binary object, the curve is smoother owing to the thickness of the biological sample, which is equivalent to a multi-layer object; refocusing on all layers of a thick specimen is not feasible, so the curve is rounded at its extremum.

To characterize the retrieved USAF resolution target under a broadband laser, in the third experiment a white laser directly illuminates a negative USAF resolution target, and a single-shot diffraction pattern is recorded, as shown in Fig. 8. The red, green and blue holograms are extracted from the color hologram by virtue of the RGB color filters on the CCD sensor. The wavelengths of the three primaries, 700 nm, 546.1 nm and 435.8 nm, are set as the propagation wavelengths of the Fresnel diffraction [55]. The optimal focal distance found by MWAF is given in Fig. 8(b): MWAF still finds the optimal distance, and the curve keeps a steep valley. For better visual judgment, the zoomed-in areas in Figs. 8(a) and 8(c) show that finer features are resolved compared with the acquired pattern; the second element of the sixth group can be distinguished in the reconstruction but not in the unprocessed pattern. Through the three distinct configurations of our experiments, we validate that the proposed MWAF method outperforms the existing autofocusing algorithms, providing superior accuracy and robustness for various samples. The results for the USAF resolution target and the biological specimen provide strong evidence of the method's reliability, demonstrating that MWAF is feasible even under noisy conditions and illustrating its practical usefulness for holographic microscopy.
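The single-shot channel extraction can be sketched as follows (our own helper; the RGB channel order and dtype handling are assumptions about the camera driver, and the primaries are the wavelengths quoted above):

```python
import numpy as np

# Propagation wavelengths assigned to the sensor's three primaries (metres).
PRIMARIES = {"r": 700e-9, "g": 546.1e-9, "b": 435.8e-9}

def split_color_hologram(frame):
    """Split an (H, W, 3) RGB frame into three single-channel holograms,
    each normalized to [0, 1], paired with its propagation wavelength."""
    frame = np.asarray(frame, dtype=float)
    out = {}
    for k, name in enumerate("rgb"):
        ch = frame[..., k]
        span = ch.max() - ch.min()
        # Leave a constant (degenerate) channel unscaled to avoid dividing by zero.
        out[name] = ((ch - ch.min()) / span if span > 0 else ch, PRIMARIES[name])
    return out
```

Each extracted channel would then be back-propagated at its assigned wavelength and fed to the MWAF search, exactly as in the three-filter experiments.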

 

Fig. 8 The retrieved result of USAF resolution target under broadband laser. (a) The diffraction pattern recorded by a color CCD camera. (b) The normalized metric curve. (c-d) The reconstructed amplitude and phase using 10 iterations. The curves show the intensity at the position of the colored line. The white bar corresponds to 100 μm.


4. Conclusion

In summary, we demonstrate a precise and robust autofocusing method (MWAF) based on a minimum-variation criterion. The key idea of this technique is to find the distance minimizing the difference among the back-propagated patterns. The performance of the proposed MWAF method is demonstrated both theoretically and experimentally. The retrieved results under different experimental conditions show that MWAF accurately finds the optimal distance and remains feasible even under noisy conditions. Our work paves a new way to measure the focusing distance and reconstruct high-resolution images for lensless imaging systems.

Much work remains to be done. First, it is worth studying the cases in which the existing illumination unit is replaced with laser diodes or color LEDs. Relaxing this constraint would make the MWAF algorithm more flexible: it could then be applied to partially coherent or even incoherent sources, and the experimental device could be made more compact and cost-efficient. Second, relative motion between the sample and camera during sampling, caused by external environmental vibration, generates position errors; MWAF may fail when the position error exceeds a limit, which is a deficiency of our method. In future work, we aim to refine MWAF by adding position-error correction. Third, although the proposed algorithm gives promising results compared to conventional methods, it is computationally expensive; the complexity is mostly determined by the numbers of raw holograms and pixels. This issue could be addressed with parallel operation or GPU acceleration.

Appendix

In the simulations above, a simplified input phase, generated by normalizing the phase image (denoted B) and setting the phase factor to exp(jπB/2), denoted 'phase (0,1)', is adopted to validate the feasibility of our method. To generalize to a wider range of applications, a wavelength-dependent input phase is also considered and simulated in our work. When the wavelength is taken into account, the phase factor becomes exp(jπB'/λk) (k = 1, 2, 3), denoted 'phase (lambda)', where B' = 2B × 10−7 and λk denotes the three chosen wavelengths, i.e., 488 nm, 532 nm and 632.8 nm.

When the effect of wavelength on the phase values is also considered, the difference in phase shift caused by switching wavelengths, both in the object space and in the image space, is taken into account during the iterations; the retrieved object images are shown in Fig. 9. In Fig. 9, 'Old' refers to the current phase retrieval method, which ignores the difference in phase modulation caused by wavelength, while 'New' denotes the phase retrieval method that adjusts the phase values when switching wavelengths [8]. The figure shows that the 'Old' phase retrieval performs slightly better on the retrieved amplitude and phase images.

 

Fig. 9 The reconstructed complex object (Cameraman + Testpat1) and pure phase object (Testpat1) using 50 iterations of the phase retrieval methods. Phase (0, 1) refers to the input phase without considering the modulation by different wavelengths, while phase (lambda) is the input phase that considers this modulation. 'Old' refers to the phase retrieval process without considering the phase shift caused by different wavelengths, while 'New' is the phase retrieval that takes this phase shift into consideration.


The comparison results of the simplified input phase vs. the wavelength-dependent input phase, and of the 'Old' vs. 'New' phase retrieval, are shown in Fig. 9. From the two groups of figures, it can be seen that with the more general phase model (right figures) the performance is slightly worse than in the simplified case, but the difference is negligible in some cases. It can also be concluded from the figures that our method works well when the wavelength-dependent phase shift is considered. Compared to the 'New' phase retrieval method, our current algorithm works slightly better, as observed from the recovered amplitude and phase images in both cases. From the calculation of MSE = 3 × 10−15 between the two phases recovered by the two methods, where $\mathrm{MSE}=\frac{1}{N_x N_y}\sum_{x=1}^{N_x}\sum_{y=1}^{N_y}\left(\mathrm{Phase}_1-\mathrm{Phase}_2\right)^2$, there is no significant difference between the two phase retrieval methods. The experimental results, which are not presented in this paper, likewise show no significant difference between the two methods.
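The MSE metric above can be sketched as follows; the arrays p1 and p2 are hypothetical placeholders standing in for the two recovered phase maps, not data from the paper.

```python
import numpy as np

def phase_mse(phase1: np.ndarray, phase2: np.ndarray) -> float:
    """MSE = (1 / (Nx * Ny)) * sum over all pixels of (Phase1 - Phase2)^2."""
    return float(np.mean((phase1 - phase2) ** 2))

# Placeholder phase maps: a constant offset of 0.5 rad yields an exact MSE.
p1 = np.zeros((64, 64))
p2 = np.full((64, 64), 0.5)
print(phase_mse(p1, p2))  # → 0.25
```

A value on the order of 10−15, as reported above, indicates the two reconstructions are numerically indistinguishable.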

Funding

National Natural Science Foundation of China (61805057); Young Elite Scientists Sponsorship Program (2018QNRC001); Equipment Pre-research Field Fund (6140923020102).

Acknowledgments

The authors acknowledge the support of Advanced Microscopy and Instrumentation Research Center.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References

1. O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010). [CrossRef]   [PubMed]  

2. M. Lee, O. Yaglidere, and A. Ozcan, “Field-portable reflection and transmission microscopy based on lensless holography,” Biomed. Opt. Express 2(9), 2721–2730 (2011). [CrossRef]   [PubMed]  

3. J. F. Restrepo and J. Garcia-Sucerquia, “Automatic three-dimensional tracking of particles with high-numerical-aperture digital lensless holographic microscopy,” Opt. Lett. 37(4), 752–754 (2012). [CrossRef]   [PubMed]  

4. X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008). [CrossRef]   [PubMed]  

5. L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. 24(11), 1511–1515 (2004).

6. C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017). [CrossRef]   [PubMed]  

7. G. Pedrini, W. Osten, and Y. Zhang, “Wave-front reconstruction from a sequence of interferograms recorded at different planes,” Opt. Lett. 30(8), 833–835 (2005). [CrossRef]   [PubMed]  

8. D. W. E. Noom, K. S. E. Eikema, and S. Witte, “Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction,” Opt. Lett. 39(2), 193–196 (2014). [CrossRef]   [PubMed]  

9. N. Warnasooriya and M. K. Kim, “LED-based multi-wavelength phase imaging interference microscopy,” Opt. Express 15(15), 9239–9247 (2007). [CrossRef]   [PubMed]  

10. C. Zuo, J. Sun, J. Zhang, Y. Hu, and Q. Chen, “Lensless phase microscopy and diffraction tomography with multi-angle and multi-wavelength illuminations using a LED matrix,” Opt. Express 23(11), 14314–14328 (2015). [CrossRef]   [PubMed]  

11. C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017). [CrossRef]   [PubMed]  

12. C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. 106, 17–23 (2018). [CrossRef]  

13. T. Meeser, C. von Kopylow, and C. Falldorf, “Advanced Digital Lensless Fourier Holography by means of a Spatial Light Modulator,” in 2010 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (IEEE, 2010), 1–4. [CrossRef]  

14. A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012). [CrossRef]   [PubMed]  

15. P. Bao, G. Situ, G. Pedrini, and W. Osten, “Lensless phase microscopy using phase retrieval with multiple illumination wavelengths,” Appl. Opt. 51(22), 5486–5494 (2012). [CrossRef]   [PubMed]  

16. W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18(11), 11181–11191 (2010). [CrossRef]   [PubMed]  

17. B. Yu, “Simulation study of phase retrieval for hard X-ray in-line phase contrast imaging,” Sci. China Ser. G 48(4), 450 (2005). [CrossRef]  

18. M. Sanz, J. A. Picazo-Bueno, J. García, and V. Micó, “Improved quantitative phase imaging in lensless microscopy by single-shot multi-wavelength illumination using a fast convergence algorithm,” Opt. Express 23(16), 21352–21365 (2015). [CrossRef]   [PubMed]  

19. Y. Wu and A. Ozcan, “Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring,” Methods 136, 4–16 (2018). [CrossRef]   [PubMed]  

20. J. Gillespie and R. A. King, “The use of self-entropy as a focus measure in digital holography,” Pattern Recognit. Lett. 9(1), 19–25 (1989). [CrossRef]  

21. P. Gao, B. Yao, R. Rupp, J. Min, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, and T. Ye, “Autofocusing based on wavelength dependence of diffraction in two-wavelength digital holographic microscopy,” Opt. Lett. 37(7), 1172–1174 (2012). [CrossRef]   [PubMed]  

22. Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019). [CrossRef]  

23. Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. 42(19), 3824–3827 (2017). [CrossRef]   [PubMed]  

24. Z. Ren, N. Chen, and E. Y. Lam, “Automatic focusing for multisectional objects in digital holography using the structure tensor,” Opt. Lett. 42(9), 1720–1723 (2017). [CrossRef]   [PubMed]  

25. L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. 6(4), 396–400 (2004). [CrossRef]  

26. M. Lyu, C. Yuan, D. Li, and G. Situ, “Fast autofocusing in digital holography using the magnitude differential,” Appl. Opt. 56(13), F152–F157 (2017). [CrossRef]   [PubMed]  

27. W. Li, N. C. Loomis, Q. Hu, and C. S. Davis, “Focus detection from digital in-line holograms based on spectral l_1 norms,” J. Opt. Soc. Am. A 24(10), 3054–3062 (2007). [CrossRef]   [PubMed]  

28. J. Dohet-Eraly, C. Yourassowsky, and F. Dubois, “Fast numerical autofocus of multispectral complex fields in digital holographic microscopy with a criterion based on the phase in the Fourier domain,” Opt. Lett. 41(17), 4071–4074 (2016). [CrossRef]   [PubMed]  

29. P. Memmolo, M. Iannone, M. Ventre, P. A. Netti, A. Finizio, M. Paturzo, and P. Ferraro, “On the holographic 3D tracking of in vitro cells characterized by a highly-morphological change,” Opt. Express 20(27), 28485–28493 (2012). [CrossRef]   [PubMed]  

30. A. He, W. Xiao, and F. Pan, “Automatic focus determination through cosine and modified cosine score in digital holography,” Opt. Eng. 56(3), 034103 (2017). [CrossRef]  

31. P. Picart, S. Montresor, O. Sakharuk, and L. Muravsky, “Refocus criterion based on maximization of the coherence factor in digital three-wavelength holographic interferometry,” Opt. Lett. 42(2), 275–278 (2017). [CrossRef]   [PubMed]  

32. T. Seyler, M. Fratz, T. Beckmann, A. Schiller, A. Bertz, and D. Carl, “Extending the Depth of Field beyond Geometrical Imaging Limitations Using Phase Noise as a Focus Measure in Multiwavelength Digital Holography,” Appl. Sci. (Basel) 8(7), 1042 (2018). [CrossRef]  

33. F. Dubois, C. Schockaert, N. Callens, and C. Yourassowsky, “Focus plane detection criteria in digital holography microscopy by amplitude analysis,” Opt. Express 14(13), 5895–5908 (2006). [CrossRef]   [PubMed]  

34. M. Antkowiak, N. Callens, C. Yourassowsky, and F. Dubois, “Extended focused imaging of a microparticle field with digital holographic microscopy,” Opt. Lett. 33(14), 1626–1628 (2008). [CrossRef]   [PubMed]  

35. L. Yu and L. Cai, “Iterative algorithm with a constraint condition for numerical reconstruction of a three-dimensional object from its hologram,” J. Opt. Soc. Am. A 18(5), 1033–1045 (2001). [CrossRef]   [PubMed]  

36. M. Liebling and M. Unser, “Autofocus for digital Fresnel holograms by use of a Fresnelet-sparsity criterion,” J. Opt. Soc. Am. A 21(12), 2424–2430 (2004). [CrossRef]   [PubMed]  

37. L. Xu, M. Mater, and J. Ni, “Focus detection criterion for refocusing in multi-wavelength digital holography,” Opt. Express 19(16), 14779–14793 (2011). [CrossRef]   [PubMed]  

38. P. Ferraro, S. Grilli, D. Alfieri, S. De Nicola, A. Finizio, G. Pierattini, B. Javidi, G. Coppola, and V. Striano, “Extended focused image in microscopy by digital Holography,” Opt. Express 13(18), 6738–6749 (2005). [CrossRef]   [PubMed]  

39. C. Guo, Y. Zhao, J. Tan, S. Liu, and Z. Liu, “Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination,” Opt. Express 26(11), 14407–14420 (2018). [CrossRef]   [PubMed]  

40. P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, “Autofocusing in digital holographic phase contrast microscopy on pure phase objects for live cell imaging,” Appl. Opt. 47(19), D176–D182 (2008). [CrossRef]   [PubMed]  

41. P. Ferraro, G. Coppola, S. De Nicola, A. Finizio, and G. Pierattini, “Digital holographic microscope with automatic focus tracking by detecting sample displacement in real time,” Opt. Lett. 28(14), 1257–1259 (2003). [CrossRef]   [PubMed]  

42. Z. Ren, Z. Xu, and E. Y. Lam, “Learning-based nonparametric autofocusing for digital holography,” Optica 5(4), 337 (2018). [CrossRef]  

43. T. Pitkäaho, A. Manninen, and T. J. Naughton, “Focus prediction in digital holographic microscopy using deep convolutional neural networks,” Appl. Opt. 58(5), A202–A208 (2019). [CrossRef]   [PubMed]  

44. S. Jiang, J. Liao, Z. Bian, K. Guo, Y. Zhang, and G. Zheng, “Transform- and multi-domain deep learning for single-frame rapid autofocusing in whole slide imaging,” Biomed. Opt. Express 9(4), 1601–1612 (2018). [CrossRef]   [PubMed]  

45. F. C. A. Groen, I. T. Young, and G. Ligthart, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry 6(2), 81–91 (1985). [CrossRef]   [PubMed]  

46. J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, 1996).

47. C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. Part II: Attacking optical encryption systems,” Appl. Opt. 54(15), 4709–4719 (2015). [CrossRef]   [PubMed]  

48. C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. 105, 54–59 (2018). [CrossRef]  

49. C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. I: optimization,” Appl. Opt. 54(15), 4698–4708 (2015). [CrossRef]   [PubMed]  

50. A. J. Jerri, “The Shannon sampling theorem—Its various extensions and applications: A tutorial review,” Proc. IEEE 65(11), 1565–1596 (1977). [CrossRef]  

51. Y. S. Kim, T. Kim, S. S. Woo, H. Kang, T. C. Poon, and C. Zhou, “Speckle-free digital holographic recording of a diffusely reflecting object,” Opt. Express 21(7), 8183–8189 (2013). [CrossRef]   [PubMed]  

52. J. J. Barton, “Removing multiple scattering and twin images from holographic images,” Phys. Rev. Lett. 67(22), 3106–3109 (1991). [CrossRef]   [PubMed]  

53. E. S. R. Fonseca, P. T. Fiadeiro, M. Pereira, and A. Pinheiro, “Comparative analysis of autofocus functions in digital in-line phase-shifting holography,” Appl. Opt. 55(27), 7663–7674 (2016). [CrossRef]   [PubMed]  

54. S. K. Mohammed, L. Bouamama, D. Bahloul, and P. Picart, “Quality assessment of refocus criteria for particle imaging in digital off-axis holography,” Appl. Opt. 56(13), F158–F166 (2017). [CrossRef]   [PubMed]  

55. Y. Sun, C. Lou, Z. Jiang, and H. Zhou, “Experimental research of representative wavelengths of tricolor for color CCD camera,” Opt. Lett. 42(2), 275–278 (2017). [PubMed]  


F. C. A. Groen, I. T. Young, and G. Ligthart, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry 6(2), 81–91 (1985).
[Crossref] [PubMed]

Guo, C.

C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. 105, 54–59 (2018).
[Crossref]

C. Guo, Y. Zhao, J. Tan, S. Liu, and Z. Liu, “Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination,” Opt. Express 26(11), 14407–14420 (2018).
[Crossref] [PubMed]

C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. 106, 17–23 (2018).
[Crossref]

C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017).
[Crossref] [PubMed]

C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. Part II: Attacking optical encryption systems,” Appl. Opt. 54(15), 4709–4719 (2015).
[Crossref] [PubMed]

C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. I: optimization,” Appl. Opt. 54(15), 4698–4708 (2015).
[Crossref] [PubMed]

Guo, K.

Guo, R.

He, A.

A. He, W. Xiao, and F. Pan, “Automatic focus determination through cosine and modified cosine score in digital holography,” Opt. Eng. 56(3), 034103 (2017).
[Crossref]

Heng, X.

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Hu, Q.

Hu, Y.

Iannone, M.

Isikman, S. O.

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Javidi, B.

Jerri, A. J.

A. J. Jerri, “The Shannon sampling theorem—Its various extensions and applications: A tutorial review,” Proc. IEEE 65(11), 1565–1596 (1977).
[Crossref]

Jiang, S.

Jiang, Z.

Jin, H.

L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. 6(4), 396–400 (2004).
[Crossref]

Kang, H.

Kemper, B.

Khademhosseini, B.

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Kim, M. K.

Kim, T.

Kim, Y. S.

King, R. A.

J. Gillespie and R. A. King, “The use of self-entropy as a focus measure in digital holography,” Pattern Recognit. Lett. 9(1), 19–25 (1989).
[Crossref]

Lam, E. Y.

Langehanenberg, P.

Lee, L. M.

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Lee, M.

Lei, M.

Li, D.

Li, Q.

C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. 106, 17–23 (2018).
[Crossref]

C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017).
[Crossref] [PubMed]

Li, W.

Li, Y.

L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. 6(4), 396–400 (2004).
[Crossref]

Liao, J.

Liebling, M.

Ligthart, G.

F. C. A. Groen, I. T. Young, and G. Ligthart, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry 6(2), 81–91 (1985).
[Crossref] [PubMed]

Liu, L.

Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019).
[Crossref]

Liu, S.

Liu, Z.

C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. 105, 54–59 (2018).
[Crossref]

C. Guo, Y. Zhao, J. Tan, S. Liu, and Z. Liu, “Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination,” Opt. Express 26(11), 14407–14420 (2018).
[Crossref] [PubMed]

C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. 106, 17–23 (2018).
[Crossref]

C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017).
[Crossref] [PubMed]

C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017).
[Crossref] [PubMed]

Liyun, Z.

L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. 24(11), 1511–1515 (2004).

Loomis, N. C.

Lou, C.

Luo, W.

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

Lyu, M.

Ma, B.

Ma, L.

L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. 6(4), 396–400 (2004).
[Crossref]

Manninen, A.

Mater, M.

Memmolo, P.

Micó, V.

Min, J.

Mohammed, S. K.

Montresor, S.

Mudanyali, O.

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Muravsky, L.

Naughton, T. J.

Netti, P. A.

Ni, J.

Noom, D. W. E.

Oh, C.

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Osten, W.

Ozcan, A.

Y. Wu and A. Ozcan, “Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring,” Methods 136, 4–16 (2018).
[Crossref] [PubMed]

Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. 42(19), 3824–3827 (2017).
[Crossref] [PubMed]

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

M. Lee, O. Yaglidere, and A. Ozcan, “Field-portable reflection and transmission microscopy based on lensless holography,” Biomed. Opt. Express 2(9), 2721–2730 (2011).
[Crossref] [PubMed]

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18(11), 11181–11191 (2010).
[Crossref] [PubMed]

Oztoprak, C.

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Pan, F.

A. He, W. Xiao, and F. Pan, “Automatic focus determination through cosine and modified cosine score in digital holography,” Opt. Eng. 56(3), 034103 (2017).
[Crossref]

Paturzo, M.

Pedrini, G.

Pereira, M.

Picart, P.

Picazo-Bueno, J. A.

Pierattini, G.

Pinheiro, A.

Pitkäaho, T.

Poon, T. C.

Psaltis, D.

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Ren, Z.

Restrepo, J. F.

Rupp, R.

Sakharuk, O.

Sanz, M.

Schiller, A.

T. Seyler, M. Fratz, T. Beckmann, A. Schiller, A. Bertz, and D. Carl, “Extending the Depth of Field beyond Geometrical Imaging Limitations Using Phase Noise as a Focus Measure in Multiwavelength Digital Holography,” Appl. Sci. (Basel) 8(7), 1042 (2018).
[Crossref]

Schockaert, C.

Sencan, I.

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Seo, S.

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Seyler, T.

T. Seyler, M. Fratz, T. Beckmann, A. Schiller, A. Bertz, and D. Carl, “Extending the Depth of Field beyond Geometrical Imaging Limitations Using Phase Noise as a Focus Measure in Multiwavelength Digital Holography,” Appl. Sci. (Basel) 8(7), 1042 (2018).
[Crossref]

Shan, M.

Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019).
[Crossref]

Shen, C.

C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. 105, 54–59 (2018).
[Crossref]

C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017).
[Crossref] [PubMed]

Sheridan, J. T.

Situ, G.

Sternberg, P. W.

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Striano, V.

Su, T.-W.

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18(11), 11181–11191 (2010).
[Crossref] [PubMed]

Sun, J.

Sun, Y.

Tamamitsu, M.

Tan, J.

C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. 106, 17–23 (2018).
[Crossref]

C. Guo, Y. Zhao, J. Tan, S. Liu, and Z. Liu, “Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination,” Opt. Express 26(11), 14407–14420 (2018).
[Crossref] [PubMed]

C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. 105, 54–59 (2018).
[Crossref]

C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017).
[Crossref] [PubMed]

C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017).
[Crossref] [PubMed]

Tseng, D.

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Unser, M.

Ventre, M.

von Bally, G.

Wang, C.

Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019).
[Crossref]

Wang, H.

Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. 42(19), 3824–3827 (2017).
[Crossref] [PubMed]

L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. 6(4), 396–400 (2004).
[Crossref]

Warnasooriya, N.

Wei, C.

C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017).
[Crossref] [PubMed]

Witte, S.

Woo, S. S.

Wu, Y.

Y. Wu and A. Ozcan, “Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring,” Methods 136, 4–16 (2018).
[Crossref] [PubMed]

Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. 42(19), 3824–3827 (2017).
[Crossref] [PubMed]

Xiao, W.

A. He, W. Xiao, and F. Pan, “Automatic focus determination through cosine and modified cosine score in digital holography,” Opt. Eng. 56(3), 034103 (2017).
[Crossref]

Xiaoxu, L.

L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. 24(11), 1511–1515 (2004).

Xie, X.

Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019).
[Crossref]

Xu, L.

Xu, Z.

Xue, L.

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

Yaglidere, O.

Yan, S.

Yang, C.

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Yao, B.

Ye, T.

Yimo, Z.

L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. 24(11), 1511–1515 (2004).

Yinlong, L.

L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. 24(11), 1511–1515 (2004).

Young, I. T.

F. C. A. Groen, I. T. Young, and G. Ligthart, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry 6(2), 81–91 (1985).
[Crossref] [PubMed]

Yourassowsky, C.

Yu, B.

B. Yu, “Simulation study of phase retrieval for hard X-ray in-line phase contrast imaging,” Sci. China Ser. G 48(4), 450 (2005).
[Crossref]

Yu, L.

Yuan, C.

Zhang, J.

Zhang, Y.

Zhao, Y.

Zheng, G.

Zheng, J.

Zhong, W.

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Zhong, Z.

Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019).
[Crossref]

Zhou, C.

Zhou, H.

Zuo, C.

Acta Opt. Sin. (1)

L. Xiaoxu, Z. Yimo, Z. Liyun, L. Yinlong, and S. Canlin, “Analysis and experiment of phase-shifting coaxial lensless Fourier digital holography,” Acta Opt. Sin. 24(11), 1511–1515 (2004).

Appl. Opt. (8)

P. Bao, G. Situ, G. Pedrini, and W. Osten, “Lensless phase microscopy using phase retrieval with multiple illumination wavelengths,” Appl. Opt. 51(22), 5486–5494 (2012).
[Crossref] [PubMed]

M. Lyu, C. Yuan, D. Li, and G. Situ, “Fast autofocusing in digital holography using the magnitude differential,” Appl. Opt. 56(13), F152–F157 (2017).
[Crossref] [PubMed]

T. Pitkäaho, A. Manninen, and T. J. Naughton, “Focus prediction in digital holographic microscopy using deep convolutional neural networks,” Appl. Opt. 58(5), A202–A208 (2019).
[Crossref] [PubMed]

P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, “Autofocusing in digital holographic phase contrast microscopy on pure phase objects for live cell imaging,” Appl. Opt. 47(19), D176–D182 (2008).
[Crossref] [PubMed]

C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. Part II: Attacking optical encryption systems,” Appl. Opt. 54(15), 4709–4719 (2015).
[Crossref] [PubMed]

C. Guo, S. Liu, and J. T. Sheridan, “Iterative phase retrieval algorithms. I: optimization,” Appl. Opt. 54(15), 4698–4708 (2015).
[Crossref] [PubMed]

E. S. R. Fonseca, P. T. Fiadeiro, M. Pereira, and A. Pinheiro, “Comparative analysis of autofocus functions in digital in-line phase-shifting holography,” Appl. Opt. 55(27), 7663–7674 (2016).
[Crossref] [PubMed]

S. K. Mohammed, L. Bouamama, D. Bahloul, and P. Picart, “Quality assessment of refocus criteria for particle imaging in digital off-axis holography,” Appl. Opt. 56(13), F158–F166 (2017).
[Crossref] [PubMed]

Appl. Sci. (Basel) (1)

T. Seyler, M. Fratz, T. Beckmann, A. Schiller, A. Bertz, and D. Carl, “Extending the Depth of Field beyond Geometrical Imaging Limitations Using Phase Noise as a Focus Measure in Multiwavelength Digital Holography,” Appl. Sci. (Basel) 8(7), 1042 (2018).
[Crossref]

Biomed. Opt. Express (2)

Cytometry (1)

F. C. A. Groen, I. T. Young, and G. Ligthart, “A comparison of different focus functions for use in autofocus algorithms,” Cytometry 6(2), 81–91 (1985).
[Crossref] [PubMed]

J. Opt. A, Pure Appl. Opt. (1)

L. Ma, H. Wang, Y. Li, and H. Jin, “Numerical reconstruction of digital holograms for three-dimensional shape measurement,” J. Opt. A, Pure Appl. Opt. 6(4), 396–400 (2004).
[Crossref]

J. Opt. Soc. Am. A (3)

Lab Chip (1)

O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10(11), 1417–1428 (2010).
[Crossref] [PubMed]

Methods (1)

Y. Wu and A. Ozcan, “Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring,” Methods 136, 4–16 (2018).
[Crossref] [PubMed]

Nat. Methods (1)

A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9(9), 889–895 (2012).
[Crossref] [PubMed]

Opt. Eng. (2)

A. He, W. Xiao, and F. Pan, “Automatic focus determination through cosine and modified cosine score in digital holography,” Opt. Eng. 56(3), 034103 (2017).
[Crossref]

Z. Zhong, X. Xie, L. Liu, C. Wang, and M. Shan, “Autofocusing in dual-wavelength digital holography using correlation coefficient,” Opt. Eng. 58(04), 1 (2019).
[Crossref]

Opt. Express (11)

M. Sanz, J. A. Picazo-Bueno, J. García, and V. Micó, “Improved quantitative phase imaging in lensless microscopy by single-shot multi-wavelength illumination using a fast convergence algorithm,” Opt. Express 23(16), 21352–21365 (2015).
[Crossref] [PubMed]

P. Memmolo, M. Iannone, M. Ventre, P. A. Netti, A. Finizio, M. Paturzo, and P. Ferraro, “On the holographic 3D tracking of in vitro cells characterized by a highly-morphological change,” Opt. Express 20(27), 28485–28493 (2012).
[Crossref] [PubMed]

F. Dubois, C. Schockaert, N. Callens, and C. Yourassowsky, “Focus plane detection criteria in digital holography microscopy by amplitude analysis,” Opt. Express 14(13), 5895–5908 (2006).
[Crossref] [PubMed]

L. Xu, M. Mater, and J. Ni, “Focus detection criterion for refocusing in multi-wavelength digital holography,” Opt. Express 19(16), 14779–14793 (2011).
[Crossref] [PubMed]

P. Ferraro, S. Grilli, D. Alfieri, S. De Nicola, A. Finizio, G. Pierattini, B. Javidi, G. Coppola, and V. Striano, “Extended focused image in microscopy by digital Holography,” Opt. Express 13(18), 6738–6749 (2005).
[Crossref] [PubMed]

C. Guo, Y. Zhao, J. Tan, S. Liu, and Z. Liu, “Adaptive lens-free computational coherent imaging using autofocusing quantification with speckle illumination,” Opt. Express 26(11), 14407–14420 (2018).
[Crossref] [PubMed]

W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18(11), 11181–11191 (2010).
[Crossref] [PubMed]

N. Warnasooriya and M. K. Kim, “LED-based multi-wavelength phase imaging interference microscopy,” Opt. Express 15(15), 9239–9247 (2007).
[Crossref] [PubMed]

C. Zuo, J. Sun, J. Zhang, Y. Hu, and Q. Chen, “Lensless phase microscopy and diffraction tomography with multi-angle and multi-wavelength illuminations using a LED matrix,” Opt. Express 23(11), 14314–14328 (2015).
[Crossref] [PubMed]

C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017).
[Crossref] [PubMed]

Y. S. Kim, T. Kim, S. S. Woo, H. Kang, T. C. Poon, and C. Zhou, “Speckle-free digital holographic recording of a diffusely reflecting object,” Opt. Express 21(7), 8183–8189 (2013).
[Crossref] [PubMed]

Opt. Lasers Eng. (2)

C. Shen, C. Guo, J. Tan, S. Liu, and Z. Liu, “Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference,” Opt. Lasers Eng. 105, 54–59 (2018).
[Crossref]

C. Guo, Q. Li, J. Tan, S. Liu, and Z. Liu, “A method of solving tilt illumination for multiple distance phase retrieval,” Opt. Lasers Eng. 106, 17–23 (2018).
[Crossref]

Opt. Lett. (11)

G. Pedrini, W. Osten, and Y. Zhang, “Wave-front reconstruction from a sequence of interferograms recorded at different planes,” Opt. Lett. 30(8), 833–835 (2005).
[Crossref] [PubMed]

D. W. E. Noom, K. S. E. Eikema, and S. Witte, “Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction,” Opt. Lett. 39(2), 193–196 (2014).
[Crossref] [PubMed]

J. F. Restrepo and J. Garcia-Sucerquia, “Automatic three-dimensional tracking of particles with high-numerical-aperture digital lensless holographic microscopy,” Opt. Lett. 37(4), 752–754 (2012).
[Crossref] [PubMed]

M. Antkowiak, N. Callens, C. Yourassowsky, and F. Dubois, “Extended focused imaging of a microparticle field with digital holographic microscopy,” Opt. Lett. 33(14), 1626–1628 (2008).
[Crossref] [PubMed]

P. Picart, S. Montresor, O. Sakharuk, and L. Muravsky, “Refocus criterion based on maximization of the coherence factor in digital three-wavelength holographic interferometry,” Opt. Lett. 42(2), 275–278 (2017).
[Crossref] [PubMed]

J. Dohet-Eraly, C. Yourassowsky, and F. Dubois, “Fast numerical autofocus of multispectral complex fields in digital holographic microscopy with a criterion based on the phase in the Fourier domain,” Opt. Lett. 41(17), 4071–4074 (2016).
[Crossref] [PubMed]

P. Gao, B. Yao, R. Rupp, J. Min, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, and T. Ye, “Autofocusing based on wavelength dependence of diffraction in two-wavelength digital holographic microscopy,” Opt. Lett. 37(7), 1172–1174 (2012).
[Crossref] [PubMed]

Y. Zhang, H. Wang, Y. Wu, M. Tamamitsu, and A. Ozcan, “Edge sparsity criterion for robust holographic autofocusing,” Opt. Lett. 42(19), 3824–3827 (2017).
[Crossref] [PubMed]

Z. Ren, N. Chen, and E. Y. Lam, “Automatic focusing for multisectional objects in digital holography using the structure tensor,” Opt. Lett. 42(9), 1720–1723 (2017).
[Crossref] [PubMed]

P. Ferraro, G. Coppola, S. De Nicola, A. Finizio, and G. Pierattini, “Digital holographic microscope with automatic focus tracking by detecting sample displacement in real time,” Opt. Lett. 28(14), 1257–1259 (2003).
[Crossref] [PubMed]

Y. Sun, C. Lou, Z. Jiang, and H. Zhou, “Experimental research of representative wavelengths of tricolor for color CCD camera,” Opt. Lett. 42(2), 275–278 (2017).
[PubMed]

Optica (1)

Pattern Recognit. Lett. (1)

J. Gillespie and R. A. King, “The use of self-entropy as a focus measure in digital holography,” Pattern Recognit. Lett. 9(1), 19–25 (1989).
[Crossref]

Phys. Rev. Lett. (1)

J. J. Barton, “Removing multiple scattering and twin images from holographic images,” Phys. Rev. Lett. 67(22), 3106–3109 (1991).
[Crossref] [PubMed]

Proc. IEEE (1)

A. J. Jerri, “The Shannon sampling theorem—Its various extensions and applications: A tutorial review,” Proc. IEEE 65(11), 1565–1596 (1977).
[Crossref]

Proc. Natl. Acad. Sci. U.S.A. (1)

X. Cui, L. M. Lee, X. Heng, W. Zhong, P. W. Sternberg, D. Psaltis, and C. Yang, “Lensless high-resolution on-chip optofluidic microscopes for Caenorhabditis elegans and cell imaging,” Proc. Natl. Acad. Sci. U.S.A. 105(31), 10670–10675 (2008).
[Crossref] [PubMed]

Sci. China Ser. G (1)

B. Yu, “Simulation study of phase retrieval for hard X-ray in-line phase contrast imaging,” Sci. China Ser. G 48(4), 450 (2005).
[Crossref]

Sci. Rep. (1)

C. Guo, Q. Li, C. Wei, J. Tan, S. Liu, and Z. Liu, “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7(1), 7562 (2017).
[Crossref] [PubMed]

Other (2)

T. Meeser, C. von Kopylow, and C. Falldorf, “Advanced Digital Lensless Fourier Holography by means of a Spatial Light Modulator,” in 2010 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (IEEE, 2010), 1–4.
[Crossref]

J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, 1996).



Figures (9)

Fig. 1
Fig. 1 Experimental schematic of multi-wavelength lensless imaging system. A white light (broadband laser) passes the filter wheel (three modes used in the experiments), and the filtered light illuminates the sample. The diffraction patterns are recorded by a color CCD camera sensor.
Fig. 2
Fig. 2 Single loop in Serial alternating iteration. The colored arrows represent the propagation at the corresponding color wavelength. k is the kth iteration.
Fig. 3
Fig. 3 The flowchart of MWAF and the multi-wavelength phase retrievals. (a) MWAF algorithm, the combination of colored curved arrows and ㊀ indicates the subtraction of the recovered low-resolution images at the corresponding wavelength. (b) Object plane. The colored arrows represent the Fresnel diffraction propagation (c) Camera plane. (d) Recovered amplitude and phase images.
Fig. 4
Fig. 4 The results of MWAF algorithm. (a-c) The Fresnel diffraction patterns at different wavelengths. (d) MWAF normalized metric value curve. (e-f) The reconstructed amplitude and phase images using 50 iterations phase retrieval. (g) The interplay between the autofocusing method and the phase retrieval algorithm.
Fig. 5
Fig. 5 The normalized metric curves of the eight algorithms under different noise conditions. The first three columns represent different simulated samples, and the fourth column is the recovered amplitude of the complex test corresponding to the third column. (a)-(c) The noise-free case (NF). (e)-(g) The metric curves in the case of adding Gaussian noise (GN, variance = 0.01). (i)-(k) The metric curves in the case of adding speckle noise (SN). (m)-(o) The metric curves in the case of wide bandwidth (WB). (q)-(s) The curves of compound noise (CN).
Fig. 6
Fig. 6 The retrieved results of USAF resolution target. (a) The intensity captured by the CCD camera. (b, d) The reconstructed phase and amplitude images based on MWAF method. (c) The normalized metric curve obtained by the eight methods. (e-f) The reconstruction of the amplitude based on the Grad and Var method using 10 iterations. (g-i) The horizontal-line intensity profile of the marked lines in (d-f), respectively. The scale bar corresponds to 100 μm and is applicable to all images.
Fig. 7
Fig. 7 (a) The normalized metric curve obtained by MWAF at 10 nm bandwidth and 3nm bandwidth. (c-d) The recovered amplitude and phase of the biological specimen. The white bar at the left-down corner corresponds to 300 μm.
Fig. 8 The retrieved result of the USAF resolution target under a broadband laser. (a) The diffraction pattern recorded by a color CCD camera. (b) The normalized metric curve. (c-d) The reconstructed amplitude and phase using 10 iterations. The inset curves show the intensity along the colored lines. The white bar corresponds to 100 μm.
Fig. 9 The reconstructed complex object (Cameraman + Testpat1) and pure-phase object (Testpat1) using 50 iterations of phase retrieval. Phase (0, 1) refers to the input phase without considering modulation by the different wavelengths, while phase (lambda) is the input phase considering this modulation. 'Old' refers to phase retrieval without considering the phase shift caused by the different wavelengths, while 'New' is the phase retrieval taking this phase shift into account.

Tables (1)

Table 1 Performance of focus metrics for eight autofocusing methods

Equations (13)


E(x,y,z)=\frac{e^{i2\pi z/\lambda}}{i\lambda z}\iint E(x',y',0)\,e^{(i\pi/\lambda z)\left[(x-x')^2+(y-y')^2\right]}\,dx'dy',
H_n(f_x,f_y)=\begin{cases}\exp\!\left(\dfrac{i2\pi z}{\lambda_n}\sqrt{1-(\lambda_n f_x)^2-(\lambda_n f_y)^2}\right), & \text{if } (\lambda_n f_x)^2+(\lambda_n f_y)^2<1,\\[4pt] 0, & \text{otherwise},\end{cases}
O=\mathcal{F}^{-1}\left[\mathcal{F}(I)\,H_n^{*}\right],
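As a concrete illustration of Eqs. (2)-(3), the band-limited angular-spectrum propagator and the back-propagation step can be sketched in NumPy; the function names, pixel pitch, and grid parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def angular_spectrum_tf(shape, dx, wavelength, z):
    """Band-limited angular-spectrum transfer function H_n of Eq. (2)."""
    ny, nx = shape
    fx = np.fft.fftfreq(nx, d=dx)   # spatial frequencies f_x
    fy = np.fft.fftfreq(ny, d=dx)   # spatial frequencies f_y
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.zeros(shape, dtype=complex)
    prop = arg > 0                  # evanescent components are zeroed
    H[prop] = np.exp(1j * 2 * np.pi * z / wavelength * np.sqrt(arg[prop]))
    return H

def back_propagate(hologram, dx, wavelength, z):
    """Refocus a recorded hologram to the object plane, Eq. (3):
    O = IFFT[ FFT(I) * conj(H_n) ]."""
    H = angular_spectrum_tf(hologram.shape, dx, wavelength, z)
    return np.fft.ifft2(np.fft.fft2(hologram) * np.conj(H))
```

Since conj(H) inverts H exactly within the propagating band, a field forward-propagated with H and then refocused with `back_propagate` is recovered unchanged for band-limited inputs.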
\mathrm{MWAF}(z)=\frac{1}{3}\sum_{x=1}^{N_x}\sum_{y=1}^{N_y}\left(\big||O_1|-|O_2|\big|^2+\big||O_1|-|O_3|\big|^2+\big||O_2|-|O_3|\big|^2\right)^2,
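For one trial distance z, the criterion of Eq. (4) compares the amplitudes of the three refocused fields pixel by pixel. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def mwaf_value(o1, o2, o3):
    """MWAF criterion of Eq. (4) for one trial distance z.
    o1, o2, o3 are the complex fields obtained by back-propagating the
    three holograms with that z. The per-pixel sum s is non-negative,
    so the outer power does not move the location of the valley."""
    a1, a2, a3 = np.abs(o1), np.abs(o2), np.abs(o3)
    s = (a1 - a2) ** 2 + (a1 - a3) ** 2 + (a2 - a3) ** 2
    return np.sum(s ** 2) / 3.0
```

Exhaustively evaluating `mwaf_value` over the candidate distances and taking the minimum of the resulting curve yields the in-focus sample-to-sensor distance.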
\mu_{OR}^{k}=V_r^{k}\exp[j\hat{\varphi}_r^{k}]=\mathrm{FST}_z\{A_r^{k}\exp[j\varphi_r^{k}]\}\quad(k=1+3i),
\mu_{RO}^{k}=\mathrm{FST}_z^{-1}\{\sqrt{I_r}\exp[j\hat{\varphi}_r^{k}]\}\quad(k=1+3i),
\mu_{OR}^{k}=V_g^{k}\exp[j\hat{\varphi}_g^{k}]=\mathrm{FST}_z\{A_g^{k}\exp[j\varphi_g^{k}]\},
\mu_{RO}^{k}=\mathrm{FST}_z^{-1}\{\sqrt{I_g}\exp[j\hat{\varphi}_g^{k}]\}\quad(k=2+3i),
\mu_{OR}^{k}=V_b^{k}\exp[j\hat{\varphi}_b^{k}]=\mathrm{FST}_z\{A_b^{k}\exp[j\varphi_b^{k}]\},
\mu_{RO}^{k}=\mathrm{FST}_z^{-1}\{\sqrt{I_b}\exp[j\hat{\varphi}_b^{k}]\}\quad(k=3+3i).
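The update pair (mu_OR, mu_RO) of Eqs. (5)-(10) amounts to a wavelength-cycling Gerchberg-Saxton loop: propagate the current object estimate to the camera plane, enforce the measured amplitude, and propagate back. The sketch below assumes a pure-amplitude object and an angular-spectrum realization of FST_z; all names and parameters are illustrative, and the paper's exact update (e.g., the wavelength-dependent phase rescaling discussed with Fig. 9) may differ.

```python
import numpy as np

def fst(field, dx, wavelength, z):
    """Angular-spectrum realization of FST_z; a negative z gives FST_z^{-1}."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=dx), np.fft.fftfreq(ny, d=dx))
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(1j * 2 * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def mw_phase_retrieval(intensities, wavelengths, dx, z, iters=50):
    """Cycle over the wavelengths: propagate the object estimate to the
    camera plane (mu_OR), replace its amplitude with the measured sqrt(I),
    and propagate back to the object plane (mu_RO)."""
    obj = np.sqrt(intensities[0]).astype(complex)        # initial guess
    for _ in range(iters):
        for I, lam in zip(intensities, wavelengths):
            cam = fst(obj, dx, lam, z)                    # mu_OR
            cam = np.sqrt(I) * np.exp(1j * np.angle(cam)) # amplitude constraint
            obj = fst(cam, dx, lam, -z)                   # mu_RO
    return obj
```

The true object is a fixed point of this loop, since its forward propagation already satisfies every amplitude constraint.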
\mathrm{Var}(z)=\frac{1}{N_xN_y}\sum_{x,y}\left[|O(x,y,z)|-\overline{|O(x,y,z)|}\right]^2,
\mathrm{Grad}(z)=\sum_{x=1}^{N_x-1}\sum_{y=1}^{N_y-1}\left[|O(x,y,z)|-|O(x-1,y,z)|\right]^2+\left[|O(x,y,z)|-|O(x,y-1,z)|\right]^2,
\mathrm{Lap}(z)=\sum_{x=1}^{N_x-1}\sum_{y=1}^{N_y-1}\left[|O(x+1,y,z)|+|O(x-1,y,z)|+|O(x,y+1,z)|+|O(x,y-1,z)|-4|O(x,y,z)|\right]^2.
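The three reference metrics of Eqs. (11)-(13), all evaluated on the refocused amplitude |O(x,y,z)|, can be sketched as follows; the boundary handling via array slicing is an implementation choice, not specified by the paper.

```python
import numpy as np

def var_metric(O):
    """Variance focus metric, Eq. (11)."""
    a = np.abs(O)
    return np.mean((a - a.mean()) ** 2)

def grad_metric(O):
    """Squared-gradient metric, Eq. (12): summed squared first differences."""
    a = np.abs(O)
    gx = a[1:, :-1] - a[:-1, :-1]   # difference along x
    gy = a[:-1, 1:] - a[:-1, :-1]   # difference along y
    return np.sum(gx ** 2 + gy ** 2)

def lap_metric(O):
    """Laplacian metric, Eq. (13): summed squared discrete Laplacian."""
    a = np.abs(O)
    lap = (a[2:, 1:-1] + a[:-2, 1:-1] + a[1:-1, 2:] + a[1:-1, :-2]
           - 4 * a[1:-1, 1:-1])
    return np.sum(lap ** 2)
```

Unlike MWAF, these metrics peak (rather than dip) at focus, so the optimal distance is taken at their maximum when sweeping z.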