## Abstract

Multi-wavelength diffraction imaging is a promising phase imaging technology owing to its lack of mechanical movement and low complexity. In a multi-wavelength focused system, spectral bandwidth and dispersion correction are critical for high-resolution reconstruction. Here, an optical setup for multi-wavelength lensless diffraction imaging with adaptive dispersion correction is proposed. Three beams of different wavelengths illuminate the test object in turn, and the resulting diffraction patterns are recorded by an image sensor. Chromatic correction is realized by a robust refocusing technique, and high-resolution images are then retrieved through an iterative phase retrieval algorithm. The effectiveness and reliability of our method are demonstrated in numerical simulations and experiments. The proposed method has the potential to become an alternative technology for quantitative biological imaging.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

## 1. Introduction

Optical phase imaging is an important tool for exploring the microscopic world, with applications in many fields such as imaging of biological specimens, material testing, and surface shape measurement. However, most optical equipment detects only the intensity of the scattered signal, and the phase information is lost. How to recover the phase from the measured intensity patterns is therefore an essential issue in optical measurement. To date, the complete wavefront can be recovered using two different strategies: interferometry [1,2] and coherent diffraction imaging (CDI) [3–5]. Interferometry adds a reference beam, which makes the system complicated and very sensitive to environmental vibration. CDI is a lensless phase retrieval technique that reconstructs the complex-valued object from multiple diffraction patterns by iterative phase retrieval [6–8]. Compared with interferometry, CDI has a simple system structure, requires no reference beam, and can in theory reach the diffraction resolution limit. The convergence of the iterative algorithm is the core problem of CDI; early phase retrieval algorithms such as Gerchberg-Saxton (GS) [9] and Hybrid Input-Output (HIO) [10] suffer from weak convergence and a limited imaging range.

To overcome the slow convergence of traditional CDI, many researchers have in recent years improved CDI by increasing data redundancy through multiple measurements. Oversampling of the diffraction information ensures the convergence and accuracy of the iterative algorithm during object reconstruction. In 2004, Rodenburg et al. proposed a phase-retrieval method based on scanning an illumination probe across the specimen [11,12]. Subsequently, the ptychographical iterative engine (PIE) developed rapidly. For example, Maiden et al. proposed the ePIE algorithm, which simultaneously reconstructs the illumination and the complex amplitude distribution of the sample [13]; a single-exposure PIE imaging setup based on grating splitting [14] was demonstrated by Cheng Liu; and Ref. [15] reported super-resolution Fourier ptychographic microscopy (FPM) based on an LED array, which achieves multi-angle illumination for frequency-domain scanning. Different from the above PIE methods, Pedrini and Osten proposed the single-beam multiple-intensity reconstruction technique (SBMIR) [16], in which the phase is recovered by moving the camera or the sample axially to obtain intensity patterns at different distances. On this basis, Ref. [17] proposed using an SLM as an adjustable lens to simplify the system configuration, and Zhengjun Liu et al. proposed the APR algorithm to speed up convergence in the iterative process [18]. The inevitable problem with these approaches is that the sample or CCD has to be moved continuously during the measurement, which not only makes the data acquisition time-consuming but also introduces scanning errors.
Many methods have been proposed to solve this problem, such as coherent modulation imaging (CMI), which introduces a phase plate with a known phase distribution [19,20], single-exposure schemes based on a hole array [21], and multi-wavelength illumination [22]. Among them, the multi-wavelength setup is a fast coherent diffraction imaging method with a simple system structure that avoids mechanical scanning. However, previous multi-wavelength imaging systems [23,24] usually use a point source for illumination. One problem is that the angular spectrum of the diffracted light diverges too fast, which limits the detection range of the system and increases the complexity of the reconstruction. We therefore use a parallel (collimated) beam as the illumination, which both enlarges the detection range and simplifies the reconstruction. In a multi-wavelength focused system, wavelength dispersion correction is the key to high-resolution reconstruction. Here, an adaptive dispersion correction algorithm is proposed that retrieves the accurate wavelength curvature, so the convergence speed and the accuracy of the image reconstruction are greatly improved.

In this paper, we describe a system for fast lensless phase imaging that requires only a three-wavelength source, a CCD sensor, and a computer, yet is still able to recover images robustly and at high resolution. Spectral bandwidth is an important parameter in a multi-wavelength imaging system: under certain conditions, the larger the spectral bandwidth, the better the reconstruction. Taking into account both the recovery resolution and the data acquisition time, sources at three wavelengths, i.e., 532nm, 633nm, and 850nm, are employed to record the diffraction patterns in the experiment. In the digital reconstruction, a robust refocusing criterion combined with an iterative search function is proposed to correct the wavelength dispersion. After compensation of the wavelength curvature, high-resolution images can be recovered. We show that the proposed method outperforms existing phase retrieval algorithms, providing superior accuracy and robustness for various applications.

## 2. Method

#### 2.1 Fresnel diffraction imaging

The schematic of the multi-wavelength imaging system is illustrated in Fig. 1(a). The sample is sequentially illuminated at different wavelengths $\{ \lambda _1,\lambda _2,\ldots , \lambda _m\}$. After diffraction over a distance $Z_0$, the corresponding diffraction intensities $\{ I_1,I_2,\ldots , I_m\}$ are recorded by the CCD.

This imaging system setup ensures the stability of the measurement process and reduces system errors. In the iterative phase retrieval process of a multi-wavelength imaging system, the accuracy of the simulated light transmission function is a basic factor affecting the resolution of the recovered image. Within scalar diffraction theory, the angular spectrum method is a rigorous and accurate solution of the Helmholtz equation. Therefore, the angular spectrum transmission equation is selected for the diffraction calculation in the iterative process, which yields a reliable and accurate recovery result [25]. The transformation that describes the formation of a diffraction pattern is given by:

$$U(x,y;Z_0)=\mathscr{F}^{-1}\left\{ \mathscr{F}\left\{ U(x,y;0)\right\} H(f_x,f_y)\right\},\qquad H(f_x,f_y)=\exp \left[ i\frac{2\pi Z_0}{\lambda }\sqrt{1-(\lambda f_x)^2-(\lambda f_y)^2}\right] \tag{1}$$

where $\mathscr{F}$ denotes the Fourier transform and $(f_x,f_y)$ are the spatial frequencies.
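As a concrete sketch, the angular spectrum propagation can be written in a few lines of NumPy. The function below is our minimal illustration (band-limiting and sampling checks are omitted), not the authors' implementation:

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, z, dx):
    """Propagate the complex field u0 over a distance z using the
    angular spectrum transfer function H. All lengths share one unit
    (metres here); dx is the sampling pitch of the grid."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                     # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2.0 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)              # evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(u0) * H)
```

Because |H| = 1 on the propagating band, the transform is unitary there: the field at z = 0 is returned unchanged and the total energy is conserved after propagation.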

#### 2.2 Dispersion equations for optical materials

In our system the illumination is a parallel beam, so a focusing element is necessary; it is realized by a collimating lens as shown in Fig. 1(a). Next, we discuss the influence of chromatic aberration on collimation in a focused system, which can be understood as a system error caused by switching the wavelength. Most collimating lenses are made of glass, so when the wavelength changes, the refractive index of the glass also varies; the collimated beams at different wavelengths after the lens are shown in Fig. 1(b). It is therefore necessary to analyze this influence in our system. A dispersion formula is usually used to relate the refractive index of optical glass to the wavelength. According to Refs. [26–28], the Sellmeier formula is the most accurate of the commonly used formulas. It is given by [29]:

$$n^2(\lambda )=1+\sum _{j}\frac{A_j\,\lambda ^2}{\lambda ^2-B_j} \tag{2}$$

where n is the index of refraction at wavelength $\lambda$, and $A_j$ and $B_j$ are the Sellmeier coefficients. The refractive index of the collimating lens at each wavelength can be obtained from Eq. (2). The equivalent focal length of the fiber collimator can then be calculated by the lensmaker formula:

$$\frac{1}{f}=(n-1)\left( \frac{1}{r_1}-\frac{1}{r_2}\right) \tag{3}$$

where f is the focal length of the fiber collimator at wavelength $\lambda$, and $r_1$ and $r_2$ are the surface radii of the fiber collimator. As shown in Fig. 1(b), once the focal length of the collimating lens is known, the wavelength modulation function during recording of the diffraction patterns can be obtained from the Gaussian imaging formula, $\frac {1}{f} = \frac {1}{u} + \frac {1}{v}$, where u is the object distance and v is the image distance. This provides the essential prior information for the subsequent adaptive wavelength dispersion correction.

#### 2.3 Reconstruction algorithm in a dispersing system and simulations

Using the above formulas, we demonstrate the feasibility of the wavelength-error-corrected reconstruction algorithm. The following improvements have been made in our reconstruction method. Firstly, the wavelength ratio $\lambda _1/\lambda _m$ is introduced to account for the difference in the phase shift that the object introduces at different wavelengths. Secondly, different wavelengths have different curvature functions M($\lambda$) in our measurement system; the wavelength curvature function M($\lambda$) is equivalent to a spherical wave with a wavelength-dependent radius, all spherical centers being coaxial. To accurately find the convergence factor at each wavelength, we propose an adaptive method that combines prior system information with iterative search refocusing, which contributes greatly to the high-resolution reconstruction. Lastly, compared with conventional iterative methods [30,31], which adopt serial iteration, the parallel iteration in our method further boosts the convergence speed.
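The dispersion bookkeeping of Section 2.2 can be sketched as follows. The Sellmeier coefficients below are those of Schott N-BK7, used purely as an illustrative glass, and the surface radii are hypothetical; the actual collimator glass would substitute its own values:

```python
import numpy as np

# Sellmeier coefficients of Schott N-BK7 (lambda in micrometres).
# Illustrative only -- the experiment's glass has its own coefficients.
A = (1.03961212, 0.231792344, 1.01046945)      # A_j
B = (0.00600069867, 0.0200179144, 103.560653)  # B_j

def sellmeier_index(lam_um):
    """n(lambda) from n^2 = 1 + sum_j A_j lam^2 / (lam^2 - B_j)."""
    l2 = lam_um ** 2
    return np.sqrt(1.0 + sum(a * l2 / (l2 - b) for a, b in zip(A, B)))

def focal_length(lam_um, r1, r2):
    """Thin-lens focal length from the lensmaker equation
    1/f = (n - 1)(1/r1 - 1/r2); r1, r2 are surface radii (hypothetical)."""
    n = sellmeier_index(lam_um)
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

def image_distance(f, u):
    """Solve the Gaussian imaging formula 1/f = 1/u + 1/v for v."""
    return 1.0 / (1.0 / f - 1.0 / u)
```

In the normal-dispersion region the index decreases with wavelength, so the focal length (and hence the image distance v and the curvature M($\lambda$) = v − Z0) grows with wavelength, which is exactly the chromatic error the algorithm must compensate.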

The flow diagram of the modified algorithm is shown in Fig. 2. The phase iteration starts with a random definition of the initial complex amplitude in the object plane; for example, $U_0$ is the initial complex amplitude distribution in the object plane, as shown in the left box of Fig. 2. The diffraction intensity I($\lambda$), the angular spectrum transfer function H($\lambda$), and the wavelength curvature function M($\lambda$) at each wavelength are used as prior information to constrain the iterative algorithm towards convergence. As described in the iterative recovery process of Fig. 2, the angular spectrum transmission equation carries out the forward and backward propagation between the object plane and the recording plane in the spatial and frequency domains. The square root of I($\lambda _m$) is multiplied by a given phase, and the resulting complex-valued function is propagated to the next plane I($\lambda _{m+1}$). There, the calculated phase is retained and the amplitude is replaced by the square root of the measured intensity at the diffraction plane. It is worth noting that our algorithm contains a double loop. In each iteration, $U_{{\lambda \textrm{m}}}^k$ denotes the reconstructed complex amplitude at wavelength $\lambda _m$ after k iterations. Parallel iteration means that one complete loop of the algorithm runs from $\lambda _1$ to $\lambda _m$ and then from $\lambda _m$ back to $\lambda _1$. For example, if three diffraction patterns ($\lambda _1$, $\lambda _2$, $\lambda _3$) are used, one loop runs from $\lambda _1$ to $\lambda _2$ to $\lambda _3$, and then from $\lambda _3$ to $\lambda _2$ to $\lambda _1$. The process is repeated until the evaluation function of the reconstructed image at the object plane falls below a threshold. Through these steps, the complex wavefront of the sample at each wavelength is obtained.
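A minimal sketch of this parallel multi-wavelength loop is given below, assuming a plane-wave angular spectrum propagator and the $\lambda _1/\lambda _m$ phase rescaling described above; the function names, the random start, and the fixed loop count are our simplifications (the curvature term M($\lambda$) is omitted for brevity):

```python
import numpy as np

def propagate(u, lam, z, dx):
    """Angular spectrum propagation between object and detector planes."""
    n = u.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (lam * FX) ** 2 - (lam * FY) ** 2
    H = np.exp(2j * np.pi * z / lam * np.sqrt(np.maximum(arg, 0.0))) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(u) * H)

def retrieve(intensities, wavelengths, z, dx, n_loops=20):
    """Parallel multi-wavelength phase retrieval sketch: each loop sweeps
    lambda_1 -> lambda_m and back, enforcing the measured amplitude at the
    detector plane; the object phase is rescaled by lambda_1/lambda_m
    between wavelengths."""
    lam0 = wavelengths[0]
    u = np.exp(2j * np.pi * np.random.rand(*intensities[0].shape))  # random start
    order = list(range(len(wavelengths)))
    for _ in range(n_loops):
        for m in order + order[::-1]:        # forward, then backward sweep
            lam = wavelengths[m]
            # rescale object phase from lambda_1 to lambda_m
            u_m = np.abs(u) * np.exp(1j * np.angle(u) * lam0 / lam)
            d = propagate(u_m, lam, z, dx)
            # amplitude constraint: keep phase, impose measured amplitude
            d = np.sqrt(intensities[m]) * np.exp(1j * np.angle(d))
            u_m = propagate(d, lam, -z, dx)
            u = np.abs(u_m) * np.exp(1j * np.angle(u_m) * lam / lam0)
    return u
```

By construction, the last step of every loop enforces the measured amplitude at $\lambda _1$, so forward-propagating the returned estimate reproduces the first measured pattern exactly.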

The effectiveness of correcting the wavelength dispersion is first verified by numerical simulations, with initial conditions identical to those of the actual experiments. An image of the USAF1951 target is used in MATLAB as the complex-valued object. The number of sampling points N is set to 800*800 with a pixel size of 5.5$\mu$m. The object-to-sensor distance is set to Z0=30mm ($Z0 \geq D \times \Delta x/\lambda$, with D the diameter of the object and $\lambda \geq$ 600nm), which satisfies the sampling conditions of the Fresnel diffraction regime. To match the actual wavelength dispersion, spherical waves with radii f1=800mm, f2=1000mm, and f3=1300mm are applied at the wavelengths 532nm, 633nm, and 850nm, respectively. To quantify the reconstruction and convergence performance of this method, the sum-squared error (SSE) serves as the error evaluation function in the iteration process, as described in Eq. (4), where Ii is the recovered image and Io is the measured image. At the same time, we calculate the correlation coefficient R between the recovered and actual images using Eq. (5), where a and b are the reconstructed and measured images and $\bar {a}$ and $\bar {b}$ are their means. The correlation coefficient is computed for all reconstructed images as the number of iterations increases; the higher the R value, the better the reconstructed image quality.
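In NumPy, these two metrics take the following common form, matching the verbal definitions above (the exact normalization used in the paper's Eq. (4) and Eq. (5) may differ):

```python
import numpy as np

def sse(recovered, measured):
    """Sum-squared error between recovered and measured images."""
    return np.sum((recovered - measured) ** 2)

def corr_coeff(a, b):
    """Correlation coefficient R between images a and b, using their
    mean-subtracted values as in the definition above."""
    da, db = a - a.mean(), b - b.mean()
    return np.sum(da * db) / np.sqrt(np.sum(da ** 2) * np.sum(db ** 2))
```

Identical images give SSE = 0 and R = 1, while a contrast-inverted copy gives R = −1, so R directly measures structural agreement independent of overall brightness.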

The reconstructed images with and without chromatic aberration are shown in Fig. 3(a) and (b), respectively, and the evolution of SSE and R during reconstruction is shown in Fig. 3(c) and (d). It can be clearly seen that, compared with leaving the wavelength dispersion uncorrected, compensating it yields a better recovered image quality, consistent with the theoretical analysis.

## 3. Experimental results

To demonstrate the effectiveness of the proposed approach, we designed a compact multi-wavelength imaging system. The dispersion of the sample itself is not considered, because the dispersion of a plate sample does not change the curvature of the light beam. The experimental setup is illustrated in Fig. 4(a). The different wavelengths are coupled into the same single-mode fiber, and the light from the fiber is collimated and expanded by a fiber collimator (FC) before illuminating the sample. This arrangement eliminates operating errors and avoids mechanical movement. A CCD camera with 1768*2352 pixels, a pixel size of 5.5$\mu$m, and a spectral response of 400-1100nm records the diffraction patterns: we only need to switch the lasers on and off sequentially to obtain the diffraction pattern at each of the three wavelengths.

To quantitatively demonstrate the effectiveness of our method, a standard negative resolution target (USAF1951) is used as the test object. The diffraction distance from the sample to the camera plane is 40mm, and the raw diffraction patterns recorded at the three wavelengths are shown in Fig. 4(c-e). It should be noted that the prior information of the measurement system is very important in the reconstruction. For example, the glass of the fiber collimator used in our system is D-ZK3, whose dispersion formula is ${n^2} = 2.48968705 - 0.0121700502{\lambda ^2} + 0.02125749866{\lambda ^{ - 2}} + 0.000532456649{\lambda ^{ - 4}} + \cdots$. The corresponding parameter values in the experiment are listed in Table 1. We can therefore use this prior information to roughly estimate the radii of the spherical waves at the different wavelengths (M($\lambda$) = v − Z0). To find the wavelength dispersion function accurately at each wavelength, we use a search-refocus method: if the radius of the trial spherical wave equals the wavelength curvature function, the criterion function presents an extremum (maximum or minimum) on the best reconstruction plane. An adaptive refocusing criterion therefore has to be chosen. Over the past few decades, several refocus criteria have been proposed in digital holography [32–34]. Here, two classical gradient functions, EOG (Energy of Gradient) and the Brenner function, are used as criteria in the iterative search refocusing process; their metrics are given by Eq. (6) and Eq. (7), respectively, where f(x, y) is the pixel value of the reconstructed image.
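The two criteria can be sketched directly from their definitions; the discrete-difference form below is the standard one, though the paper's exact normalization in Eq. (6) and Eq. (7) may differ:

```python
import numpy as np

def eog(f):
    """Energy of Gradient: sum of squared first differences of the
    image f along both axes."""
    return np.sum(np.diff(f, axis=0) ** 2) + np.sum(np.diff(f, axis=1) ** 2)

def brenner(f):
    """Brenner function: squared difference between pixels two apart
    along one axis."""
    return np.sum((f[:, 2:] - f[:, :-2]) ** 2)
```

Both metrics reward concentrated edge energy: a sharp step scores higher than the same step smeared into a ramp, which is why their extremum marks the in-focus reconstruction.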

The normalized values of EOG and Brenner at different radii of the spherical wave are shown in Fig. 5(a). Both are overall unimodal, each with a strong global peak at the correct radius of curvature. From this we obtain the accurate wavelength curvature function M($\lambda$), which is then fully used in the iterative phase retrieval of Fig. 2 to recover the object wavefront. The reconstructed images with the wavelength dispersion and after eliminating it are shown in Fig. 5(c-d) and Fig. 5(e-f), respectively, and the SSE versus the number of iterations is shown in Fig. 5(b). With increasing iterations, our method yields better recovered image quality and smaller iteration errors, consistent with the above theoretical analysis. These comparative experiments show that eliminating the wavelength dispersion is effective and greatly improves the reconstruction accuracy. To show the spatial resolution quantitatively, the yellow dashed boxes in Fig. 5(d) and Fig. 5(f) give an intuitive comparison between the traditional method and ours. The resolution reached by our method is 57 lp/mm (Element 6, Group 5), i.e., about 8 $\mu$m, which essentially reaches the CCD sampling resolution limit, apart from the influence of intensity saturation and background noise. Analysis of the imaging system parameters suggests that the CCD pixel size is the limiting factor for the resolution; using a CCD with smaller pixels or magnifying the object would in theory improve it.

To demonstrate the capability of our multi-wavelength lensless system for cell imaging, we performed imaging experiments on female worms. Part of the transected tissue was placed on a coverslip. Figures 6(c-d) show the reconstructed amplitude and phase of the sample using our retrieval algorithm. Compared with the reconstruction by the conventional method, shown in Fig. 6(a) and Fig. 6(b), the thick specimen has sharper edges and a clearer recovered image with our current algorithm, as shown in Fig. 6(c) and Fig. 6(d), revealing finer details of the specimen. It is worth noting that the conventional method appears to show less noise in the central part. Possible reasons are as follows. Firstly, the biological specimen can be considered a phase object owing to its transparency; our algorithm recovers the high-frequency information of the sample, whereas the image restored by the conventional method contains mostly low-frequency information. Noise has a greater impact on high-frequency information, which may make the low-resolution image appear less noisy. Secondly, experimental factors such as intensity saturation and background noise might randomly degrade the quality of the results. We also validated the proposed method on various biological samples; the retrieved results provide superior accuracy and robustness.

## 4. Further discussion

In a multi-wavelength imaging system, as described in Ref. [22], the larger the number of wavelengths, the better the reconstruction, because more wavelengths means more diffraction information of the object can be recorded. However, increasing the number of wavelengths not only increases the data acquisition time but also the cost of the system. Besides the number of wavelengths, the wavelength interval also plays a very important role in the iterative phase retrieval. Here, the influence of the wavelength interval on the reconstruction is quantitatively analyzed through numerical simulation, providing theoretical support for designing a simple, low-cost, high-resolution multi-wavelength imaging system. The parameters used in the simulations are the same as those in the actual multi-wavelength experiments. We chose 5 groups of different wavelength intervals in the range 500-1000nm. Specifically, when the wavelength interval is 10nm, the corresponding wavelengths are 500nm, 510nm, 520nm, 530nm, 540nm and 550nm; the other interval groups are defined similarly. The reconstructions below use five diffraction patterns. The SSE, R, and the restoration quality under the different wavelength intervals are shown in Fig. 7(a-c), respectively. Clearly, as the wavelength spacing increases, better recovered image quality and smaller iteration errors are reached within the same number of iterations; in other words, better resolution comes from a larger wavelength interval. We also find that the correlation coefficient and SSE improve only slightly once the wavelength spacing exceeds 50nm. This means the wavelength interval must be considered when designing a multi-wavelength diffraction imaging system for high resolution and fast data acquisition.
However, owing to the bandwidth limits of lasers and image sensors, the wavelength interval cannot be increased without limit. Considering the available tunable wavelength ranges with large bandwidth and good coherence, three solid-state laser sources at 532nm, 633nm and 850nm are a cost-effective choice: the system is simple and the wavelength interval is relatively large, which achieves good reconstruction accuracy.

## 5. Conclusion

In summary, we have demonstrated a high-resolution and robust multi-wavelength imaging system with adaptive dispersion correction. Three wavelengths (532nm, 633nm, 850nm) are used. To accurately find the optimal curvature of the wavelength dispersion function, a method combining prior information from the Sellmeier formula with an iterative refocusing function is proposed. We validated the performance of the proposed method both theoretically and experimentally. The reconstructed results for the USAF1951 resolution target and a biological specimen provide strong evidence of the effectiveness and reliability of our method: three frames of diffraction patterns and dozens of iterations are enough to reconstruct the object under test. Our system has favorable robustness, reliability and cost-effectiveness compared with other similar methods, and may become a useful alternative in applications where high-contrast phase imaging is required, such as living-cell imaging and surface profiling.

## Funding

Strategic Priority Research Program of Chinese Academy of Sciences (XDA25020104); National Natural Science Foundation of China (61775222, 61875121); National Key R&D Plan of China (2018YFC1503703).

## Acknowledgments

The National Key R&D Plan of China; National Natural Science Foundation of China (NSFC); and the Strategic Priority Research Program of Chinese Academy of Sciences.

## Disclosures

The authors declare no conflicts of interest.

## References

**1. **D. Gabor, “A new microscopic principle,” Nature **161**(4098), 777–778 (1948). [CrossRef]

**2. **U. Schnars and W. Jüptner, “Direct recording of holograms by a ccd target and numerical reconstruction,” Appl. Opt. **33**(2), 179–181 (1994). [CrossRef]

**3. **J. Miao, P. Charalambous, J. Kirz, and D. Sayre, “Extending the methodology of x-ray crystallography to allow imaging of micrometre-sized non-crystalline specimens,” Nature **400**(6742), 342–344 (1999). [CrossRef]

**4. **J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. **21**(15), 2758–2769 (1982). [CrossRef]

**5. **M. H. Maleki and A. J. Devaney, “Phase-retrieval and intensity-only reconstruction algorithms for optical diffraction tomography,” J. Opt. Soc. Am. A **10**(5), 1086–1092 (1993). [CrossRef]

**6. **A. Greenbaum, W. Luo, B. Khademhosseinieh, T. W. Su, A. F. Coskun, and A. Ozcan, “Increased space-bandwidth product in pixel super-resolved lensfree on-chip microscopy,” Sci. Rep. **3**(1), 1717 (2013). [CrossRef]

**7. **A. Greenbaum, N. Akbari, A. Feizi, W. Luo, and A. Ozcan, “Field-portable pixel super-resolution colour microscope,” PLoS One **8**(9), e76475 (2013). [CrossRef]

**8. **W. Luo, F. Shabbir, C. Gong, C. Gulec, J. Pigeon, J. Shaw, A. Greenbaum, S. Tochitsky, C. Joshi, and A. Ozcan, “High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging,” Appl. Phys. Lett. **106**(15), 151107 (2015). [CrossRef]

**9. **R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of phase from image and diffraction plane pictures,” Optik **35**, 237–250 (1972).

**10. **J. R. Fienup, “Reconstruction of an object from modulus of its fourier transform,” Opt. Lett. **3**(1), 27–29 (1978). [CrossRef]

**11. **H. M. L. Faulkner and J. M. Rodenburg, “Movable aperture lensless transmission microscopy: A novel phase retrieval algorithm,” Phys. Rev. Lett. **93**(2), 023903 (2004). [CrossRef]

**12. **J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. **85**(20), 4795–4797 (2004). [CrossRef]

**13. **A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy **109**(10), 1256–1262 (2009). [CrossRef]

**14. **X. Pan, C. Liu, and J. Zhu, “Single shot ptychographical iterative engine based on multi-beam illumination,” Appl. Phys. Lett. **103**(17), 171105 (2013). [CrossRef]

**15. **G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution fourier ptychographic microscopy,” Nat. Photonics **7**(9), 739–745 (2013). [CrossRef]

**16. **G. Pedrini, W. Osten, and Y. Zhang, “Wave-front reconstruction from a sequence of interferograms recorded at different planes,” Opt. Lett. **30**(8), 833–835 (2005). [CrossRef]

**17. **P. F. Almoro, J. Glückstad, and S. G. Hanson, “Single-plane multiple speckle pattern phase retrieval using a deformable mirror,” Opt. Express **18**(18), 19304–19313 (2010). [CrossRef]

**18. **Z. Liu, C. Guo, J. Tan, Q. Wu, L. Pan, and S. Liu, “Iterative phase-amplitude retrieval with multiple intensity images at output plane of gyrator transforms,” J. Opt. **17**(2), 025701 (2015). [CrossRef]

**19. **F. Zhang, G. Pedrini, and W. Osten, “Phase retrieval of arbitrary complex-valued fields through aperture-plane modulation,” Phys. Rev. A **75**(4), 043805 (2007). [CrossRef]

**20. **F. Zhang, B. Chen, G. R. Morrison, J. Vila-Comamala, and I. K. Robinson, “Phase retrieval by coherent modulation imaging,” Nat. Commun. **7**(1), 13367 (2016). [CrossRef]

**21. **P. Sidorenko and O. Cohen, “Single-shot ptychography,” Optica **3**(1), 9–14 (2016). [CrossRef]

**22. **P. Bao, F. Zhang, G. Pedrini, and W. Osten, “Phase retrieval using multiple illumination wavelengths,” Opt. Lett. **33**(4), 309–311 (2008). [CrossRef]

**23. **D. W. E. Noom, D. E. Boonzajer Flaes, E. Labordus, K. S. E. Eikema, and S. Witte, “High-speed multi-wavelength fresnel diffraction imaging,” Opt. Express **22**(25), 30504–30511 (2014). [CrossRef]

**24. **D. W. E. Noom, K. S. E. Eikema, and S. Witte, “Lensless phase contrast microscopy based on multiwavelength fresnel diffraction,” Opt. Lett. **39**(2), 193–196 (2014). [CrossRef]

**25. **J. W. Goodman, *Introduction to Fourier Optics* (McGraw-Hill, 1968).

**26. **B. Tatian, “Fitting refractive-index data with the sellmeier dispersion formula,” Appl. Opt. **23**(24), 4477 (1984). [CrossRef]

**27. **G. Ghosh, “Sellmeier coefficients and dispersion of thermo-optic coefficients for some optical glasses,” Appl. Opt. **36**(7), 1540 (1997). [CrossRef]

**28. **L. Chia-Ling and J. Sasián, “Adaptive dispersion formula for index interpolation and chromatic aberration correction,” Opt. Express **22**(1), 1193–1202 (2014). [CrossRef]

**29. **L. E. Sutton and O. N. Stavroudis, “Fitting refractive index data by least squares,” J. Opt. Soc. Am. **51**(8), 901–905 (1961). [CrossRef]

**30. **P. Bao, G. Situ, G. Pedrini, and W. Osten, “Lensless phase microscopy using phase retrieval with multiple illumination wavelengths,” Appl. Opt. **51**(22), 5486 (2012). [CrossRef]

**31. **L. Jian, Z. Yixuan, G. Cheng, Y. Weisong, and C. Zhang, “Robust autofocusing method for multi-wavelength lensless imaging,” Opt. Express **27**(17), 23814–23829 (2019). [CrossRef]

**32. **F. Dubois, C. Schockaert, N. Callens, and C. Yourassowsky, “Focus plane detection criteria in digital holography microscopy by amplitude analysis,” Opt. Express **14**(13), 5895–5898 (2006). [CrossRef]

**33. **M. S. Kara, B. Larbi, B. Derradji, and P. Pascal, “Quality assessment of refocus criteria for particle imaging in digital off-axis holography,” Appl. Opt. **56**(13), F158 (2017). [CrossRef]

**34. **P. Memmolo, M. Paturzo, B. Javidi, P. A. Netti, and P. Ferraro, “Refocusing criterion via sparsity measurements in digital holography,” Opt. Lett. **39**(16), 4719–4722 (2014). [CrossRef]