The space–bandwidth product (SBP) of modern objective lenses is often significantly larger than the pixel count of opto-electronic image sensor chips; therefore, much of the information transmitted by the optical system cannot be adequately sampled or digitized. To resolve this mismatch, microscopes are generally designed to maintain the resolution of the optical system while sacrificing much of the field of view (FOV) and SBP of the objective lens. We introduce a wide-field, high-resolution coherent imaging method that uses a stack of out-of-focus images to make much better use of the SBP of an objective lens. We demonstrate our approach on a benchtop microscope by using a demagnification camera adapter to match the active area of the image sensor chip to the FOV of the objective lens. We show that the resulting spatial undersampling caused by capturing a large FOV can be mitigated through an iterative pixel super-resolution algorithm that uses, e.g., ∼three to five slightly out-of-focus images, yielding a several-fold increase in the SBP of the microscope. Furthermore, the same pixel super-resolution algorithm also achieves phase retrieval, revealing the optical phase information of the specimen. We compared our method against traditional off-axis and phase-shifting digital holographic microscopy modalities and demonstrated at least a 3-fold reduction in the number of images required to achieve the same SBP. This technique could be used to maximize the throughput and SBP of lens-based coherent imaging and holography systems and inspire new microscopy designs that benefit from the inherent autofocusing steps of a scanning microscope to increase its SBP.
© 2016 Optical Society of America
Although modern microscope objective lenses can achieve high-resolution imaging with relatively large fields of view (FOVs), they are inherently designed to match the human eye rather than charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS)-based cameras, which appeared in recent decades as common microscope accessories. The space–bandwidth product (SBP) of an optical system is defined by the FOV of the imaging platform divided by the area of a resolvable spot, which is determined by the spatial resolution of the imager, and, in this sense, it is fundamentally tied to the signal-to-noise ratio (SNR) of the optical imaging system. When the spatial resolution exhibits significant variations across the claimed FOV of the imaging system, e.g., due to aberrations, the SBP can be estimated by defining sub-regions of the FOV, each with a uniform resolution. For a coherent imaging system, the phase and amplitude channels each contribute independently to the SBP; for example, an objective lens with a numerical aperture (NA) of 0.3 and a field number (FN) of 26.5 mm can achieve, if corrected for aberrations, a total SBP of approximately 14.8 million at an illumination wavelength of 532 nm. However, due to the signal readout mechanism and imaging speed requirements, most cameras used in optical microscopes are designed with a limited number of pixels, e.g., 1–4 megapixels, which sets a practical limit on the overall SBP of the microscopic imaging system (see Fig. 1). This gap between objective lenses and opto-electronic sensor chips is in general bridged by matching the optical resolution to the effective pixel size of the imaging configuration, which results in a major sacrifice of the FOV.
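As a concrete instance of this definition, the quoted SBP can be approximated with a few lines of Python. This is only a sketch: the 10× magnification, the λ/(2NA) half-pitch convention, and the factor of 2 for the amplitude and phase channels are our assumptions, and the result lands near, though not exactly at, the ∼14.8 million figure above.

```python
# Rough SBP estimate: FOV area divided by the area of a resolvable spot,
# times 2 for the independent amplitude and phase channels of coherent imaging.
import math

wavelength_um = 0.532          # illumination wavelength (532 nm)
na = 0.3                       # numerical aperture of the objective
field_number_mm = 26.5         # field number (FN) of the objective
magnification = 10.0           # assumed objective magnification

# FOV at the sample plane: a disk of diameter FN / magnification
fov_diameter_um = field_number_mm * 1e3 / magnification
fov_area_um2 = math.pi * (fov_diameter_um / 2) ** 2

# Coherent half-pitch resolution limit and the corresponding spot area
half_pitch_um = wavelength_um / (2 * na)
spot_area_um2 = half_pitch_um ** 2

sbp = 2 * fov_area_um2 / spot_area_um2
print(f"estimated SBP ~ {sbp / 1e6:.1f} million")
```

Under these conventions the estimate comes out at roughly 14 million, i.e., on the order of the value quoted in the text.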
For example, pairing the same objective lens with a commonly used 1.45-megapixel CCD imager (QIClick Monochrome, QImaging, Surrey, BC, Canada) would necessitate a magnifying camera adapter to effectively reduce the pixel size by 10-fold and match the resolution of the objective lens to the CCD chip. This strategy would unfortunately waste a large portion of the objective lens FOV and, therefore, result in sub-optimal use of the SBP of the microscopic imaging system.
In fact, a survey of coherent imaging and digital holographic microscopy related publications from the past decade clearly illustrates this mismatch between the pixel counts of the utilized image sensor chips and the SBPs of conventional microscope objectives, as summarized in Fig. 1 [2–26]. We refer to this practical mismatch as the “SBP gap” in coherent microscopy systems. To address this gap, here we introduce a new wide-field and high-resolution computational imaging method that best utilizes the SBP of a microscope objective by bridging the gap between digital cameras and objective lenses. For this goal, unlike traditional microscope designs, we first add a demagnification camera adapter (e.g., 0.35×) to match the CCD/CMOS image sensor area to the FOV of the objective lens. This demagnification operation, although it increases the sample FOV, reduces the image resolution due to inadequate sampling and results in spatial aliasing and pixelation. To mitigate this limitation, we employ a pixel super-resolution algorithm that uses a few out-of-focus images of the sample to recover a high-resolution complex image of the specimen and significantly increase the overall SBP of the microscope. Conventional pixel super-resolution (PSR) methods restore high-frequency signals from a stack of undersampled images, each with a sub-pixel lateral displacement. Such PSR methods are implemented by either laterally shifting the sample or shifting the sensor chip inside a camera. The former approach needs high-precision motorized stages and may exhibit anisotropic resolution due to uneven sub-pixel movements/shifts. The latter, on the other hand, requires a specialized camera (e.g., DP80, Olympus [28]) with a built-in pixel-shifting mechanism and a Peltier cooling device. Both of these PSR implementations inevitably complicate the mechanical design of the microscope and increase hardware costs.
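The forward model behind conventional shift-based PSR can be illustrated with a short simulation: the same scene is sampled several times with sub-pixel lateral shifts, and each frame is undersampled by pixel binning. The array sizes, the 4× binning factor, and the shift values below are illustrative choices, not taken from the paper.

```python
# Sketch of the shift-based PSR forward model: sub-pixel-shifted,
# pixel-binned low-resolution frames of one high-resolution scene.
import numpy as np

rng = np.random.default_rng(0)
hires = rng.random((64, 64))          # "true" high-resolution scene
bin_factor = 4                        # undersampling (binning) factor

def low_res_frame(img, dx, dy, b):
    """Shift by (dx, dy) high-res pixels, then average over b x b blocks,
    mimicking a large detector pixel integrating the shifted scene."""
    shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    h, w = shifted.shape
    return shifted.reshape(h // b, b, w // b, b).mean(axis=(1, 3))

# Four frames with distinct sub-detector-pixel shifts of the scene
frames = [low_res_frame(hires, dx, dy, bin_factor)
          for dx, dy in [(0, 0), (1, 0), (0, 1), (2, 2)]]
print([f.shape for f in frames])      # each frame is 16 x 16
```

Each shifted frame samples the scene on a slightly different grid, which is exactly the measurement diversity that a PSR algorithm exploits; OFI-PSR replaces these lateral shifts with defocus diversity.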
Using a stack of out-of-focus images of the sample, we developed a pixel super-resolution framework to create high-resolution and wide-field microscopic images of a specimen, both amplitude and phase, with minimal changes to a conventional bright-field microscope, providing much better utilization of the large SBP of a microscope objective lens. The feasibility of our approach, which is termed out-of-focus-imaging-based pixel super-resolution (OFI-PSR), is demonstrated by reconstructing a resolution test target as well as various biological samples, including blood samples and Papanicolaou smears. The same imaging technique can also be extended to 3D objects, assuming that shadowing artifacts due to object thickness and optical density do not create major limitations. To achieve the same SBP that is inherently limited by the objective lens, our approach requires several-fold fewer images when compared to traditional off-axis and phase-shifting digital holographic microscopy techniques, respectively (Table 1). This unique technique would be useful to optimize the throughput and SBP of lens-based coherent imaging platforms and might inspire new microscopy systems that benefit from the built-in autofocusing process of an automated scanning microscope to further increase its SBP.
A. Experimental Setup
Our OFI-PSR method is demonstrated experimentally using a conventional bright-field microscope (IX73, Olympus Corporation, Tokyo, Japan). Figure 2 depicts our objective-lens-based out-of-focus coherent imaging setup. A fiber-coupled wavelength-tunable light source (WhiteLase-Micro, model VIS, Fianium Ltd, Southampton, UK) is used to provide the illumination. This tunable light source is set to a center wavelength of 532 nm with a narrow spectral bandwidth. The partially coherent characteristic of our light source allows us to treat each out-of-focus image as an in-line transmission hologram of the sample, while also avoiding any interference from objects outside of the sample plane. A CCD-based image sensor (QIClick Monochrome, QImaging, Surrey, BC, Canada) with a pixel count of 1.45 million and a pixel size of 6.45 μm is used to capture the out-of-focus transmission images. In this microscopic imaging system, we also introduced a demagnification factor of 0.35× by adding a camera adapter (Olympus Part #U-TV0.35xC-2) to increase our FOV by more than 8-fold, getting close to the FOV limit of the objective lens. With this demagnification, the sample FOV is correspondingly enlarged for each of the objective lenses used in our experiments. Note that either the sample or the objective lens can be scanned vertically to capture the required out-of-focus images. In our experimental implementation, the objective lens was scanned vertically, and to investigate the optimum number of out-of-focus images, we used an exploratory depth imaging range spanning both sides of the sample plane with a fine axial step size [see Fig. 2(b)]. As will be demonstrated in Section 3, out-of-focus measurements separated by 30 μm are sufficient to reconstruct high-quality images.
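The sampling arithmetic of this configuration can be checked with a short calculation. The 0.35× adapter and 6.45 μm pixel pitch are from the text; the 10× objective magnification, the NA of 0.3, and the 1392 × 1040 pixel format of the sensor are our assumptions for illustration.

```python
# Effective pixel size, sample FOV, and undersampling factor for the
# demagnified configuration described above (some parameters assumed).
pixel_pitch_um = 6.45
objective_mag = 10.0           # assumed objective magnification
adapter_mag = 0.35             # demagnification camera adapter
total_mag = objective_mag * adapter_mag          # 3.5x overall

# Effective pixel size projected onto the sample plane
eff_pixel_um = pixel_pitch_um / total_mag
print(f"effective pixel: {eff_pixel_um:.2f} um")  # vs ~0.65 um without adapter

# Sample-plane FOV for an assumed 1392 x 1040 sensor format
fov_mm2 = (1392 * eff_pixel_um * 1e-3) * (1040 * eff_pixel_um * 1e-3)
print(f"sample FOV: {fov_mm2:.2f} mm^2")

# Coherent half-pitch limit for an assumed NA of 0.3 at 532 nm (~0.89 um):
# the ~1.84 um sampling undersamples the optical resolution roughly 2-fold.
half_pitch_um = 0.532 / (2 * 0.3)
print(f"undersampling factor: {eff_pixel_um / half_pitch_um:.1f}x")
```

This roughly 2-fold undersampling is the aliasing that the OFI-PSR reconstruction is designed to undo.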
B. Sample Preparation
We validated OFI-PSR by imaging a standard 1951 USAF resolution test target as well as unstained Papanicolaou (Pap) smears and blood samples. Pap smears were prepared using the ThinPrep method (Hologic, Massachusetts). The human blood smear was acquired from Carolina (item no. 31-7374). Since we used existing and anonymous specimens, where no subject-related information is linked or can be retrieved, these experiments were exempt from human-subject-research-related regulations.
C. OFI-PSR Algorithm
First, we assume that the quasi-monochromatic light field right after the object plane can be expressed as $o(x, y) = 1 + s(x, y)$, where $s(x, y)$ is the scattered object field. The spatial Fourier transform of the sampled intensity for each out-of-focus measurement plane ($n$) can then be written as [29–31]

$\tilde{G}_n(f_x, f_y) = \tilde{P}(f_x, f_y)\,[\,\delta(f_x, f_y) + \tilde{H}_{z_n}(f_x, f_y)\,\tilde{S}(f_x, f_y) + \tilde{H}^{*}_{z_n}(-f_x, -f_y)\,\tilde{S}^{*}(-f_x, -f_y) + \tilde{D}_n(f_x, f_y)\,]$, (1)

$\tilde{I}_n(f_x, f_y) = \sum_{k_x, k_y} \tilde{G}_n(f_x - k_x f_s,\; f_y - k_y f_s)$, (2)

where $z_n$ is the defocus distance of the $n$th measurement [see Fig. 2(b)], $\tilde{H}_{z_n}$ is the free-space angular spectrum transfer function for that defocus, and $f_s$ is the spatial sampling frequency of the sensor. Each term with the subscript “$k$” in Eq. (2) represents spatial aliasing related replicas, i.e., copies of the spectrum shifted by integer multiples of $f_s$ ($k_x, k_y \neq 0$). In Eq. (1), $\tilde{P}$ refers to the 2D Fourier transform of the “effective pixel function” of the image sensor chip that is projected onto the sample plane that is in focus, and this 2D effective pixel function represents the intensity responsivity distribution of a single pixel at the in-focus sample plane of the microscope. $\tilde{S}$ refers to the spatial frequency spectrum of the object scattering field, and is the target of our reconstruction algorithm. We should note that both $\tilde{P}$ and $\tilde{S}$ in Eq. (1) are independent of the separation between different out-of-focus planes since the illumination wavelength remains unchanged. The last term in Eq. (1), $\tilde{D}_n$, represents the self-interference resulting from out-of-focus-imaging-related diffraction, which can be written as $\tilde{D}_n = \tilde{S}_n \star \tilde{S}_n$, where $\star$ refers to the 2D autocorrelation operation and $\tilde{S}_n = \tilde{H}_{z_n}\tilde{S}$ [29,32].
To recover a high-resolution image of the complex object field based on out-of-focus imaging, as depicted in Fig. 3 and detailed below, our OFI-PSR algorithm consists of two stages: (I) generation of an initial object guess, and (II) iterative refinement and reconstruction of the complex object in the frequency spectrum.
1. Stage I: Generation of the Initial Guess
An initial guess of the frequency spectrum of the object is generated through a three-step procedure. First, each out-of-focus intensity image is upsampled (e.g., 4- to 6-fold). This procedure does not introduce any new information, but extends the frequency domain window. In the second step, each upsampled out-of-focus image is digitally backpropagated to the in-focus sample plane of the objective lens. In our terminology, the wave propagation from the in-focus sample plane to an out-of-focus sample plane is denoted as forward propagation, and the inverse process from an out-of-focus sample plane to the in-focus sample plane is denoted as backward propagation. In the final step of Stage I of OFI-PSR, we sum up all the backpropagated complex fields calculated from the different out-of-focus images and generate an initial guess of the object’s spatial frequency spectrum.
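Stage I can be sketched in a few lines of Python. This is a simplified sketch that omits the upsampling step: each out-of-focus intensity is treated as an amplitude (its square root, with zero phase) and backpropagated with the angular spectrum method; the function names, grid size, and distances are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def angular_spectrum(field, dz_um, wavelength_um, pixel_um):
    """Propagate a complex field by dz_um using the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_um)
    fx2, fy2 = np.meshgrid(fx**2, fx**2)
    kz2 = 1.0 / wavelength_um**2 - fx2 - fy2
    kz = np.sqrt(np.maximum(kz2, 0.0))
    h = np.exp(2j * np.pi * dz_um * kz) * (kz2 > 0)  # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * h)

def initial_guess(intensity_stack, dz_list, wavelength_um, pixel_um):
    """Stage I (without upsampling): backpropagate each out-of-focus
    intensity (as an amplitude with zero phase) and average the fields."""
    fields = [angular_spectrum(np.sqrt(i).astype(complex), -dz,
                               wavelength_um, pixel_um)
              for i, dz in zip(intensity_stack, dz_list)]
    return np.mean(fields, axis=0)

# Sanity check: forward then backward propagation inverts itself when no
# spatial frequencies on the grid are evanescent.
f0 = np.ones((64, 64), complex)
f0[24:40, 24:40] = 0.2
f1 = angular_spectrum(f0, 30.0, 0.532, 0.5)
f2 = angular_spectrum(f1, -30.0, 0.532, 0.5)
print(np.allclose(f0, f2, atol=1e-6))  # True
guess = initial_guess([np.abs(f1) ** 2], [30.0], 0.532, 0.5)
```

The forward/backward terminology in the text maps directly onto the sign of `dz_um` here.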
2. Stage II: Iterative Frequency Spectrum Refinement
In the second stage, we use an iterative algorithm to refine our object reconstruction and eliminate aliasing-related spatial artifacts. As depicted in Fig. 3, the current object estimate at iteration i is first forward-propagated to each out-of-focus object plane using the angular spectrum method, yielding an estimated out-of-focus image for the ith iteration at the nth out-of-focus measurement plane. At the next step of the algorithm, the low-resolution raw measurement at the nth out-of-focus measurement plane is convolved with the 2D effective pixel function of the sensor array, which we assumed to be a Gaussian with a FWHM that is a quarter of the pixel pitch, and the result is used to update the amplitude of the forward-propagated field, with a relaxation factor of, e.g., 0.5, while keeping the phase unchanged. This updated field is then backpropagated to the in-focus sample plane, and the result is used to update the object estimate in the spatial frequency domain, also using a relaxation factor. Before this update, the backpropagated spectrum is also filtered by a spatial frequency mask defined by the passband of the coherent imaging system, based on the NA of the objective lens, to avoid amplification of high-frequency noise during each iteration cycle. After every out-of-focus measurement has been utilized in a given iteration, the object field estimate is updated from iteration i to i + 1, and typically we use ∼100 iterations as part of Stage II.
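Stripping away the pixel-binning/super-resolution model, the Stage II loop reduces to a multi-plane, Gerchberg–Saxton-type amplitude-update iteration, which can be sketched as follows. The synthetic phase object, grid, defocus distances, and iteration count are illustrative, and the NA mask is omitted for brevity; only the relaxation factor of 0.5 is taken from the text.

```python
import numpy as np

def prop(field, dz, wl=0.532, px=0.5):
    """Angular spectrum propagation by dz (micrometers)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=px)
    fx2, fy2 = np.meshgrid(fx**2, fx**2)
    kz = np.sqrt(np.maximum(1.0 / wl**2 - fx2 - fy2, 0.0))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * dz * kz))

# Synthetic phase-only object and its out-of-focus intensity measurements
n = 64
y, x = np.mgrid[:n, :n]
phase = 0.8 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 60.0)
obj = np.exp(1j * phase)
dz_list = [30.0, 60.0, 90.0]
meas = [np.abs(prop(obj, dz)) for dz in dz_list]

def residual(est):
    """Mean squared amplitude mismatch against all measurements."""
    return np.mean([(np.abs(prop(est, dz)) - m) ** 2
                    for dz, m in zip(dz_list, meas)])

est = np.ones((n, n), complex)            # flat initial guess
err0 = residual(est)
alpha = 0.5                               # relaxation factor from the text
for _ in range(20):
    for dz, m in zip(dz_list, meas):
        g = prop(est, dz)                 # forward-propagate estimate
        amp = (1 - alpha) * np.abs(g) + alpha * m
        g = amp * g / (np.abs(g) + 1e-12)  # nudge amplitude, keep phase
        est = prop(g, -dz)                # backpropagate to the focal plane
print(err0, residual(est))                # residual drops markedly
```

With consistent synthetic data, the amplitude residual decreases over the sweeps, mirroring the convergence behavior described for Stage II.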
D. Estimation of Relative Axial Positions of Out-of-Focus Images
The axial position of each out-of-focus image can be determined by digital autofocusing algorithms [27,33]. However, such algorithms tend to perform poorly on severely undersampled images and are sensitive to noise caused by the interference fringes arising from unwanted objects (dust, etc.) residing on the optical elements within the beam path. To address this problem, we used an iterative refinement process after obtaining the initial out-of-focus heights through standard autofocusing algorithms [27,34,35]. Using these initial height estimates, the object field is first digitally propagated to each out-of-focus sample plane. Around each estimated height, we then propagate the field to a set of candidate axial positions, searching for the position where the correlation between the propagated intensity and the corresponding raw measurement is the largest. The initial height is then replaced with this refined value, updating all the height values corresponding to our out-of-focus measurements. This algorithm converges rapidly, typically requiring about five iterations before a preset correlation-based termination criterion is met.
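The correlation-based height search can be sketched as a toy example on synthetic data. The search range, step size, object, and function names are illustrative assumptions.

```python
import numpy as np

def prop(field, dz, wl=0.532, px=0.5):
    """Angular spectrum propagation by dz (micrometers)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=px)
    fx2, fy2 = np.meshgrid(fx**2, fx**2)
    kz = np.sqrt(np.maximum(1.0 / wl**2 - fx2 - fy2, 0.0))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * dz * kz))

def refine_z(obj_est, meas, z_init, half_range=5.0, step=0.5):
    """Scan candidate defocus values around z_init and return the one whose
    propagated intensity correlates best with the raw measurement."""
    best_z, best_c = z_init, -np.inf
    for z in np.arange(z_init - half_range, z_init + half_range + step, step):
        a = np.abs(prop(obj_est, z)).ravel()
        c = np.corrcoef(a, meas.ravel())[0, 1]
        if c > best_c:
            best_z, best_c = z, c
    return best_z

# Synthetic check: true defocus is 30 um, initial guess is off by 3 um
n = 64
y, x = np.mgrid[:n, :n]
obj = np.exp(0.8j * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 60.0))
meas = np.abs(prop(obj, 30.0))
print(refine_z(obj, meas, z_init=27.0))   # recovers ~30.0
```

In practice, the object estimate itself is refined jointly with the heights, so this search is repeated over a few iterations.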
Through our experiments, we found that varying the axial step size between successive out-of-focus images within the explored range does not lead to noticeable differences in our OFI-PSR reconstruction results.
E. Computation Platform for the Implementation of OFI-PSR Algorithm
Our OFI-PSR algorithm is implemented in MATLAB (Version R2016a, MathWorks, Natick, Massachusetts) on a desktop computer equipped with a 3.60 GHz CPU (Intel Core i7-4790) and 16 GB of random-access memory. For a stack of out-of-focus images covering the full sample FOV, one iteration takes approximately 5.8 s with an upsampling factor of 6, such that the total OFI-PSR reconstruction routine finishes within 10 min. In our proof-of-concept implementation, the OFI-PSR algorithm was executed sequentially on a CPU, and one can expect a significant reduction in our computation time (e.g., 10–20-fold) with the help of GPUs and parallel computing.
3. RESULTS AND DISCUSSION
The physical basis of our technique relies on the relative changes of out-of-focus images with respect to the image sampling grid as a function of the sample-to-focus distance. There are two main factors affecting the resolution of the reconstructed images using OFI-PSR: the SNR of each out-of-focus image and the spatial sampling rate. Poor SNR limits the resolution by hindering the detection of high-order, lower-energy interference patterns in each out-of-focus image and reduces the contrast. As for the effective pixel pitch for spatial sampling at the focal plane of the objective lens, after taking into account the overall magnification of our coherent optical system, we have a relatively large sampling period, which causes severe undersampling in each out-of-focus measurement, in return for a significantly increased sample FOV. As will be detailed next, OFI-PSR not only recovers the phase information of the sample by using a set of intensity-only out-of-focus images, but also performs anti-aliasing by utilizing the strong sensitivity of the coherent transfer function (CTF) to the sample-to-focus distance, reconstructing a pixel super-resolved image of the complex object field and significantly increasing the SBP of a coherent microscopy system.
A. Resolution Improvement and Phase Retrieval
We quantified the performance of the OFI-PSR algorithm by reconstructing a resolution test target. Figure 4(a) shows the full FOV of our OFI-PSR reconstruction. The sample FOV is significantly enlarged using the 0.35× demagnification camera adapter. As a result, the effective pixel size at the focal plane of the objective lens is enlarged from 0.65 to 1.84 μm, which significantly downgrades the lateral resolution: an in-focus amplitude image of the sample is shown in Fig. 4(c), where the half-pitch resolution is limited by this coarse sampling. Using the OFI-PSR algorithm with a few out-of-focus measurements, we show in Fig. 4(d) that the half-pitch resolution is improved to 1.1 μm, which also permits retrieval of the phase information of the sample, as will be detailed below, markedly increasing the overall SBP (including both the amplitude and phase channels that are super-resolved). Note also that, although the FOV with the camera adapter could in principle be larger, the geometrical mismatch between the circular output of the objective lens and the rectangular sensor chip area causes a minor FOV loss at the corners, slightly reducing the effective FOV, as illustrated in Fig. 5(a)—also see Table 1.
After demonstrating the pixel super-resolution capabilities of computational out-of-focus imaging, we next imaged human blood cells and Pap smear samples (Figs. 5 and 6). For these experiments, we also used the same demagnification camera adapter along with an objective lens, which resulted in a correspondingly enlarged sample FOV. Using five out-of-focus intensity-only images, the full-FOV reconstruction of a blood smear sample is shown in Fig. 5(a). A comparison of Figs. 5(b) and 5(c) illustrates the significant improvement in image quality achieved with our OFI-PSR algorithm, restoring fine features of the sample from severely undersampled and out-of-focus image measurements.
In addition to pixel super-resolution, OFI-PSR also retrieves the object’s phase information, and, when combined with the amplitude channel, this further increases the effective SBP compared to an in-focus image of the object that shares the same FOV. Figures 6(a)–6(c) show in-focus and slightly out-of-focus intensity images of an unstained Pap smear, which can be considered a phase-only object since it is composed of a very thin layer of unstained cells taken from the cervix of a patient. That is why the in-focus image in Fig. 6(a) cannot reveal much information, even if spatial undersampling were to be eliminated. However, as shown in Fig. 6(d), OFI-PSR recovers a high-resolution phase image of the sample, clearly revealing the structure and sub-cellular morphology of the cells. In Fig. 6(e), we also demonstrate a digital phase-contrast image of the sample [calculated from Fig. 6(d)], which shows very good agreement with a phase-contrast image obtained using the same microscope with a standard camera adapter, i.e., an over 8-fold smaller sample FOV compared to OFI-PSR.
B. Increased SBP and Data Efficiency of OFI-PSR
The SBP of a coherent computational imaging system is proportional to the number of effective pixels reconstructed in a complex object image, i.e., the reconstructed FOV divided by the area of one resolvable spot, counting the amplitude and phase channels independently [38].
Based on these definitions, we compared the effective number of pixels and the data efficiency of our OFI-PSR method against conventional lateral-shift-based FOV enhancement techniques for commonly used coherent imaging modalities (see Table 1). During these comparisons, to be fair across different coherent imaging modalities, we utilized the same microscope objective lens employed in our experiments (i.e., an objective lens with a FN of 26.5 mm). Using five out-of-focus intensity images, OFI-PSR can reconstruct the full sample FOV with an effective pixel count of several million. In our comparison, we first considered a single-exposure off-axis holographic imaging configuration [39], which keeps only the real-image quarter of the Fourier domain during the object reconstruction. Therefore, as summarized in Table 1, the lateral resolution is sacrificed compared to OFI-PSR, and the effective pixel count of a single reconstructed complex object image is limited to 0.37 million. For off-axis holography-based coherent microscopy to achieve the same SBP as our method through lateral scanning, a correspondingly larger area needs to be scanned. Note that to digitally stitch together several different FOVs, some spatial overlap is required among images, typically on each side of the image. This suggests that 25–32 scanning positions and digital images are required, which is significantly larger than the number of out-of-focus images that OFI-PSR utilizes, i.e., 5.
Next, we considered an alternative coherent imaging modality, i.e., the two-step phase-shifting digital holography (PSDH) configuration [41,42], which is expected to reach the diffraction limit of the imaging system using the smallest number of measurements among in-line holographic imaging configurations. Based on the use of the same objective lens, a two-step PSDH configuration can reconstruct the image of a complex object with two measurements, achieving an effective pixel count of 1.48 million, which is 4 times larger than the off-axis holography configuration. To achieve the same SBP as our OFI-PSR method, two-step PSDH with lateral scanning would need to cover a several-fold larger FOV. This means seven to eight scanning positions are needed in each phase-shifting-based imaging step, resulting in 14–16 measurements in total, which is ∼3-fold more than what OFI-PSR requires to achieve the same SBP, as also summarized in Table 1.
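This comparison can be reproduced approximately with back-of-the-envelope arithmetic. Our assumptions here: a 1.45 MP sensor, one Fourier quadrant kept in the off-axis reconstruction, 10% stitching overlap per image side, and a target effective pixel count of 7.4 million for OFI-PSR; the exact values in Table 1 may differ.

```python
# Rough image-count comparison across coherent imaging modalities.
import math

sensor_pixels = 1.45e6
n_offaxis = sensor_pixels / 4          # one Fourier quadrant kept per shot
n_psdh = sensor_pixels                 # PSDH uses the full sampling grid

n_target = 7.4e6                       # assumed OFI-PSR effective pixel count
overlap = 0.10                         # stitching overlap per image side

def scans_needed(n_tile):
    """Scan positions to reach n_target, discounting the stitching overlaps."""
    useful = n_tile * (1 - 2 * overlap) ** 2
    return math.ceil(n_target / useful)

print("off-axis scans:", scans_needed(n_offaxis))        # ~32 positions
print("PSDH scans:", scans_needed(n_psdh), "x 2 exposures")  # ~8 x 2
```

Under these assumptions, off-axis holography needs on the order of 30 scans and two-step PSDH roughly 16 exposures, consistent with the ranges quoted above, versus five out-of-focus images for OFI-PSR.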
C. Dependency of OFI-PSR Reconstruction Quality on the Number of Out-of-Focus Measurements
The quality of the reconstructed images using our OFI-PSR method is affected by the number of out-of-focus measurements, N. However, the required time for image acquisition and digital reconstruction increases linearly with the number of measurements, as also illustrated in Fig. 7, where OFI-PSR-based reconstructions of human blood cells and a Pap smear sample are compared using three, five, eight, and 15 different out-of-focus measurements, each with 100 iterations. These results illustrate that, compared to the undersampled in-focus images [Figs. 7(a) and 7(f)], OFI-PSR reconstructions with N = 3 measurements [Figs. 7(b) and 7(g)] already show significantly improved features, although some aliasing signal remains. Increasing N to 5 further improves the reconstructed image quality, and the high-frequency features are restored with good visibility. Since the data acquisition and digital computation times both increase linearly with the number of measurements, we conclude that five out-of-focus measurements provide a good balance between imaging time and reconstruction quality.
D. Simplicity of Implementation with Minimal Changes to a Standard Microscope Setup
Conveniently, our OFI-PSR configuration requires minimal changes to a standard microscope: (i) providing a partially coherent light source for illumination and (ii) adjusting the camera lens adapter to recover the lost FOV of the objective lens and increase the overall SBP of the microscope. As a proof-of-concept demonstration, in this paper we used a partially coherent laser source for illumination; however, we could alternatively have used spectrally filtered LEDs or even the microscope’s original white-light illumination source by adding a color filter and a pinhole aperture to the condenser assembly to increase the spatial and temporal coherence of the illumination. The required diversity of out-of-focus distances can simply be achieved by turning the focus knob of the objective, which is a standard component of modern microscopes. As we detailed previously, the axial defocus amount of each out-of-focus image is accurately determined algorithmically (i.e., after the image capture); therefore, there is no need for a high-precision focusing stage, and the presented method would work well for microscope stages of varying quality. Furthermore, this computational imaging framework could also be applied to improve the performance of automated scanning microscopy systems, where the inherent autofocusing steps of such scanning systems could be used to increase the effective SBP of the final digital image and/or reduce the number of scanning positions.
We introduced a computational out-of-focus imaging method, termed OFI-PSR, which helps to close the SBP gap between microscope objective lenses and opto-electronic image sensor chips and thereby increase the SBP of coherent microscopy. We demonstrated the proof-of-concept of this wide-field imaging method using a conventional lens-based microscope, and successfully imaged resolution test targets and biological samples. Our OFI-PSR approach first extends the FOV of a single measurement using a demagnification camera adapter, and then reconstructs a high-resolution complex image of the sample using an iterative algorithm. This super-resolution technique does not require lateral displacements between the specimen and the objective lens, and it also retrieves the phase information of the sample. To demonstrate the proof-of-concept of this approach, we used a 1.45-megapixel CCD camera and a 0.35× camera adapter to achieve a significantly enlarged FOV, and we mitigated undersampling-related artifacts using five out-of-focus intensity images, substantially improving the SBP of the microscopic imaging system. Furthermore, the OFI-PSR technique showed at least a 3-fold reduction in the number of images required to achieve the same SBP compared to traditional in-line holography approaches. We believe this technique will broadly benefit the coherent imaging and holography fields and inspire new microscope designs with improved throughput and SBPs.
National Science Foundation (NSF); Howard Hughes Medical Institute (HHMI); Army Research Office (ARO) (W911NF-13-1-0419, W911NF-13-1-0197); Office of Naval Research (ONR); National Institutes of Health (NIH); U.S. Department of Defense (DOD).
Y. R. is supported by the European Union under grant agreement No. H2020-MSCA-IF-2014-659595 (MCMQCT).
1. J. W. Goodman, Introduction to Fourier Optics (Roberts, 2005).
2. P. Ferraro, S. Grilli, D. Alfieri, S. De Nicola, A. Finizio, G. Pierattini, B. Javidi, G. Coppola, and V. Striano, “Extended focused image in microscopy by digital holography,” Opt. Express 13, 6738–6749 (2005). [CrossRef]
3. F. Charrière, A. Marian, F. Montfort, J. Kuehn, T. Colomb, E. Cuche, P. Marquet, and C. Depeursinge, “Cell refractive index tomography by digital holographic microscopy,” Opt. Lett. 31, 178–180 (2006). [CrossRef]
4. F. Charrière, N. Pavillon, T. Colomb, C. Depeursinge, T. J. Heger, E. A. D. Mitchell, P. Marquet, and B. Rappaz, “Living specimen tomography by digital holographic microscopy: morphometry of testate amoeba,” Opt. Express 14, 7005–7013 (2006). [CrossRef]
5. L. Miccio, D. Alfieri, S. Grilli, P. Ferraro, A. Finizio, L. D. Petrocellis, and S. D. Nicola, “Direct full compensation of the aberrations in quantitative phase microscopy of thin objects by a single digital hologram,” Appl. Phys. Lett. 90, 041104 (2007). [CrossRef]
6. B. Kemper and G. von Bally, “Digital holographic microscopy for live cell applications and technical inspection,” Appl. Opt. 47, A52–A61 (2008). [CrossRef]
7. V. Micó, Z. Zalevsky, C. Ferreira, and J. García, “Superresolution digital holographic microscopy for three-dimensional samples,” Opt. Express 16, 19260–19270 (2008). [CrossRef]
8. Y.-S. Choi and S.-J. Lee, “Three-dimensional volumetric measurement of red blood cell motion using digital holographic microscopy,” Appl. Opt. 48, 2983–2990 (2009). [CrossRef]
9. W. M. Ash III, L. Krzewina, and M. K. Kim, “Quantitative imaging of cellular adhesion by total internal reflection holographic microscopy,” Appl. Opt. 48, H144–H152 (2009). [CrossRef]
10. T. Tahara, K. Ito, T. Kakue, M. Fujii, Y. Shimozato, Y. Awatsuji, K. Nishio, S. Ura, T. Kubota, and O. Matoba, “Parallel phase-shifting digital holographic microscopy,” Biomed. Opt. Express 1, 610–616 (2010). [CrossRef]
11. C. Fang-Yen, W. Choi, Y. Sung, C. J. Holbrow, R. R. Dasari, and M. S. Feld, “Video-rate tomographic phase microscopy,” J. Biomed. Opt. 16, 011005 (2011). [CrossRef]
12. P. Memmolo, G. Di Caprio, C. Distante, M. Paturzo, R. Puglisi, D. Balduzzi, A. Galli, G. Coppola, and P. Ferraro, “Identification of bovine sperm head for morphometry analysis in quantitative phase-contrast holographic microscopy,” Opt. Express 19, 23215–23226 (2011). [CrossRef]
13. J. Min, B. Yao, P. Gao, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, T. Duan, Y. Yang, and T. Ye, “Dual-wavelength slightly off-axis digital holographic microscopy,” Appl. Opt. 51, 191–196 (2012). [CrossRef]
14. A. Anand, V. K. Chhaniwal, N. R. Patel, and B. Javidi, “Automatic identification of malaria-infected RBC with digital holographic microscopy using correlation algorithms,” IEEE Photon. J. 4, 1456–1464 (2012). [CrossRef]
15. P. Gao, B. Yao, J. Min, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, and T. Ye, “Autofocusing of digital holographic microscopy based on off-axis illuminations,” Opt. Lett. 37, 3630–3632 (2012). [CrossRef]
16. P. Petruck, R. Riesenberg, and R. Kowarschik, “Optimized coherence parameters for high-resolution holographic microscopy,” Appl. Phys. B 106, 339–348 (2012). [CrossRef]
17. A. El Mallahi, C. Minetti, and F. Dubois, “Automated three-dimensional detection and classification of living organisms using digital holographic microscopy with partial spatial coherent source: application to the monitoring of drinking water resources,” Appl. Opt. 52, A68–A80 (2013). [CrossRef]
18. P. Gao, G. Pedrini, and W. Osten, “Structured illumination for resolution enhancement and autofocusing in digital holographic microscopy,” Opt. Lett. 38, 1328–1330 (2013). [CrossRef]
19. J. Kostencka, T. Kozacki, and K. Liżewski, “Autofocusing method for tilted image plane detection in digital holographic microscopy,” Opt. Commun. 297, 20–26 (2013). [CrossRef]
20. A. Anand, A. Faridian, V. K. Chhaniwal, S. Mahajan, V. Trivedi, S. K. Dubey, G. Pedrini, W. Osten, and B. Javidi, “Single beam Fourier transform digital holographic quantitative phase microscopy,” Appl. Phys. Lett. 104, 103705 (2014). [CrossRef]
21. X. Yu, J. Hong, C. Liu, M. Cross, D. T. Haynie, and M. K. Kim, “Four-dimensional motility tracking of biological cells by digital holographic microscopy,” J. Biomed. Opt. 19, 045001 (2014). [CrossRef]
22. Y. Zhang, W. Jiang, L. Tian, L. Waller, and Q. Dai, “Self-learning based Fourier ptychographic microscopy,” Opt. Express 23, 18471–18486 (2015). [CrossRef]
23. B. Mandracchia, V. Pagliarulo, M. Paturzo, and P. Ferraro, “Surface plasmon resonance imaging by holographic enhanced mapping,” Anal. Chem. 87, 4124–4128 (2015). [CrossRef]
24. S. Mahajan, V. Trivedi, P. Vora, V. Chhaniwal, B. Javidi, and A. Anand, “Highly stable digital holographic microscope using Sagnac interferometer,” Opt. Lett. 40, 3743–3746 (2015). [CrossRef]
25. N. Verrier, C. Fournier, and T. Fournel, “3D tracking the Brownian motion of colloidal particles using digital holographic microscopy and joint reconstruction,” Appl. Opt. 54, 4996–5002 (2015). [CrossRef]
26. F. Yi, I. Moon, and B. Javidi, “Cell morphology-based classification of red blood cells using holographic imaging informatics,” Biomed. Opt. Express 7, 2385–2399 (2016). [CrossRef]
27. A. Greenbaum, Y. Zhang, A. Feizi, P.-L. Chung, W. Luo, S. R. Kandukuri, and A. Ozcan, “Wide-field computational imaging of pathology slides using lens-free on-chip microscopy,” Sci. Transl. Med. 6, 267ra175 (2014). [CrossRef]
28. “Microscope digital camera DP80,” http://www.olympus-lifescience.com/en/camera/color/dp80/#!cms[tab]=%2Fcamera%2Fcolor%2Fdp80%2Fresources.
29. W. Luo, Y. Zhang, Z. Göröcs, A. Feizi, and A. Ozcan, “Propagation phasor approach for holographic image reconstruction,” Sci. Rep. 6, 22738 (2016). [CrossRef]
30. A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S. O. Isikman, A. F. Coskun, O. Mudanyali, and A. Ozcan, “Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy,” Nat. Methods 9, 889–895 (2012). [CrossRef]
31. W. Bishara, T.-W. Su, A. F. Coskun, and A. Ozcan, “Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution,” Opt. Express 18, 11181–11191 (2010). [CrossRef]
32. O. Mudanyali, D. Tseng, C. Oh, S. O. Isikman, I. Sencan, W. Bishara, C. Oztoprak, S. Seo, B. Khademhosseini, and A. Ozcan, “Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications,” Lab Chip 10, 1417–1428 (2010). [CrossRef]
33. Y. Zhang, S. Y. C. Lee, Y. Zhang, D. Furst, J. Fitzgerald, and A. Ozcan, “Wide-field imaging of birefringent synovial fluid crystals using lens-free polarized microscopy for gout diagnosis,” Sci. Rep. 6, 28793 (2016). [CrossRef]
34. E. McLeod and A. Ozcan, “Unconventional methods of imaging: computational microscopy and compact implementations,” Rep. Prog. Phys. 79, 076001 (2016). [CrossRef]
35. A. Ozcan and E. McLeod, “Lensless imaging and sensing,” Annu. Rev. Biomed. Eng. 18, 77–102 (2016). [CrossRef]
36. S. O. Isikman, W. Bishara, S. Mavandadi, F. W. Yu, S. Feng, R. Lau, and A. Ozcan, “Lens-free optical tomographic microscope with a large imaging volume on a chip,” Proc. Natl. Acad. Sci. USA 108, 7296–7301 (2011). [CrossRef]
37. W. Luo, Y. Zhang, A. Feizi, Z. Göröcs, and A. Ozcan, “Pixel super-resolution using wavelength scanning,” Light Sci. Appl. 5, e16060 (2015). [CrossRef]
38. A. Greenbaum, W. Luo, B. Khademhosseinieh, T.-W. Su, A. F. Coskun, and A. Ozcan, “Increased space-bandwidth product in pixel super-resolved lensfree on-chip microscopy,” Sci. Rep. 3, 1717 (2013). [CrossRef]
39. E. Cuche, P. Marquet, and C. Depeursinge, “Simultaneous amplitude-contrast and quantitative phase-contrast microscopy by numerical reconstruction of Fresnel off-axis holograms,” Appl. Opt. 38, 6994–7001 (1999). [CrossRef]
40. “Olympus IX83 inverted microscope,” http://www.olympus-lifescience.com/en/microscopes/inverted/ix83/#!cms[tab]=%2Fmicroscopes%2Finverted%2Fix83%2Ffeaturesca50677bd9a5846f8deb5d96a828969.
41. C.-S. Guo, L. Zhang, H.-T. Wang, J. Liao, and Y. Y. Zhu, “Phase-shifting error and its elimination in phase-shifting digital holography,” Opt. Lett. 27, 1687–1689 (2002). [CrossRef]
42. P. Guo and A. J. Devaney, “Digital microscopy using phase-shifting digital holography with two reference waves,” Opt. Lett. 29, 857–859 (2004). [CrossRef]