Abstract

Due to the chromatic dispersion inherent in all optical materials, even the best-designed multispectral objective will exhibit residual chromatic aberration. Here, we demonstrate a multispectral microscope with a computational scheme based on Fourier ptychographic microscopy (FPM) that corrects these effects to render undistorted, in-focus images. The microscope consists of 4 spectral channels ranging from 405 nm to 1552 nm. After the computational aberration correction, it achieves isotropic resolution enhancement, as verified with a Siemens star sample. We image a flip-chip to show the promise of our system for fault detection on silicon chips. This computational approach provides a cost-efficient strategy for high-quality multispectral imaging over a broad spectral range.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Multispectral microscopy, as a powerful method to obtain wide spectral images of a sample, has been applied to various research and industrial fields, from ore mineral inspection [1] and pharmaceutical composition determination [2] to biological tissue analysis [3]. In particular, with the wavelength range extended to the near-infrared (NIR) regime, a multispectral imager gains even more capabilities, including food quality inspection and wafer and solar cell production monitoring [4,5]. Thus, a multispectral microscope reaching into the NIR regime can be a promising instrument for the aforementioned scenarios.

A multispectral microscope’s performance depends strongly on the optical system design, as aberrations are critical barriers that keep an optical imaging system from achieving the ideal diffraction-limited resolution. Particularly when the illumination wavelength spans a wide spectrum, such as from ultraviolet to NIR light, the chromatic aberration can become quite severe. Conventionally, to achieve an optical system whose aberrations are well corrected over a wide spectrum, optical engineers must combine multiple lenses of different surface profiles or materials to compensate for the aberrations with the aid of optical design software. Even if the final design meets the requirement, the manufacturing process can be difficult and costly, and it often entails a bulky setup [6]. Furthermore, due to the finite number of optical surfaces, the system will still suffer from residual aberrations to some extent [7].

The imperfect phase retardance of lenses that generates aberrations can be represented by a phase-only function, called the pupil function. Instead of minimizing the aberration through optical design, it is also viable to compensate for the phase error afterwards, either through additional hardware components, for example a spatial light modulator, or by numerical post-processing. The key to this approach is to obtain the accurate wavefront. The difficulty lies in the fact that optical frequencies far exceed the response rate of even state-of-the-art detectors, unlike other waves such as acoustic waves whose wavefronts can be directly measured in time by sufficiently fast detectors. Therefore, the phase information of an optical wavefront can only be inferred from a detector’s intensity measurements. There have been intensive research efforts around optical wavefront sensing techniques. A simple non-interferometric method is to utilize a Shack-Hartmann wavefront sensor [8,9], which comprises a set of micro-lenses placed in close proximity to an array detector. From the local focal spot shifts, the phase distortion can be computed. In spite of its simple principle, a considerable setup modification is inevitable. Its spatial resolution is limited by the micro-lens density, and thus the phase estimate is relatively coarse.

Alternatively, the wavefront can be reconstructed from direct intensity measurements by the image sensor [10–36]. The reconstruction is performed by computational algorithms, which can be categorized into two groups. Non-iterative methods are based on the transport-of-intensity equation (TIE) [10–14]. It is a second-order elliptic partial differential equation that quantitatively relates the longitudinal intensity variation to the transverse phase distribution of the wavefront. However, this method cannot cope well with significant modulations in the amplitude, as they can cause singularity problems during the numerical calculation. The other group of computational wavefront measurement algorithms is categorized as iterative methods, which utilize the degrees of freedom in the optical system to introduce phase diversity [15–17]. A set of seemingly distorted intensity patterns generated by phase diversity is first recorded by a detector and then combined by an iterative algorithm into the original wavefront’s complex function. There are a number of different ways to introduce the phase diversity. One simple way is by defocusing (i.e., axial scanning) [18–23]. Various algorithms for phase retrieval from defocus diversity have been reported [18–21]. This approach is suitable for lens-free on-chip setups to achieve high-throughput imaging [22,23]. However, it is prone to pixelation artifacts, as the optical resolution can easily exceed the detector’s pixel resolution. Another iterative wavefront reconstruction method operates by laterally scanning the wavefront over a thin sample. This method is termed ptychography [24–26], and it works by shifting an illumination probe over the sample with the coverage at each scanning position overlapping one another, so that a highly redundant diffraction pattern dataset is captured. The overlapping region functions much like an interferogram during the iterative reconstruction of the complex functions of both the sample and the probe, which guarantees the convergence of the non-convex phase retrieval computation. Later, its Fourier-domain counterpart, Fourier ptychographic microscopy (FPM), was proposed [27].

FPM is based on the principle that the objective lens’s pupil function can act as the probe that scans across the Fourier spectrum of the sample under plane-wave illuminations of varied angles. Combined with a simple LED array to realize the spectrum scanning, it can access a wider spatial-frequency band of the sample and thus enhance the imaging resolution by aperture synthesis with phase retrieval. Due to its simple setup and robust performance, it has attracted great interest from researchers, spurring theoretical and experimental achievements ranging from better optimization algorithms [28,29], system error calibration [30], and noise suppression [31,32] to three-dimensional imaging [33,34]. Furthermore, a variant of FPM, termed aperture-scanning FPM, was demonstrated to directly scan a physical aperture over the Fourier plane [35,36], similar to the coded aperture method in computational photography [37]. It can be easily realized for both transmissive and reflective imaging, and it is well suited to applications where the sample-objective distance is not adjustable, such as wafer inspection [36] and retinal imaging [7]. In addition, without the varied-angle illumination requirement, the thin-sample assumption of the original FPM is circumvented [35].

Applying aperture-scanning FPM to optical aberration removal was previously explored in [38], which demonstrated simultaneous optimization of the pupil function estimate and the sample spectrum estimate via a simulated annealing algorithm. Unlike the simultaneous optimization algorithm proposed for angular-diversity FPM [28], this method was shown to be quite sensitive to the input parameters and the sample, as it does not have the advantage of redundant pupil function information scanning across the sample spectrum. Moreover, its computational workload is significant due to the highly non-convex nature of its formulation, which is unacceptable when calibrating the spatially varying aberration over a wide field of view (FOV), let alone for multiple wavelengths.

In this paper, we report a multispectral microscope based on aperture-scanning FPM that is applicable to general imaging scenarios. With its ability to reconstruct complex wavefront functions, it removes aberrations present in the optical system in post-processing to recover sharp, in-focus images. To overcome the limited resolution of the array detectors for both visible and NIR wavelengths, we first use aperture-scanning FPM to synthesize pixel-super-resolution complex fields of a sample [36] over 4 illumination wavelengths (405 nm, 532 nm, 638 nm, and 1552 nm). Pixel super-resolution is especially critical for NIR wavelengths, as existing short-wave infrared (SWIR) cameras are all based on indium gallium arsenide (InGaAs), whose noise is much harder to control than that of mainstream silicon-based visible-light (VIS) cameras. Their pixel size is therefore larger (around 10–15 µm) to achieve a sufficient signal-to-noise ratio (SNR), which can easily violate the Nyquist sampling requirement of the objective lens in the microscope. Then, with the pixel-super-resolved complex fields at hand, multiple defocused complex field slices can be generated and fed into the defocus-diversity-based aberration reconstruction method proposed in [39] to account for the spatially varying aberrations.

This paper is outlined as follows. In Section 2, we introduce our multispectral microscope system based on aperture-scanning FPM and describe the aberration calibration and correction scheme. In Section 3, we present the multi-wavelength calibration results of our system and verify its fidelity using the Siemens star sample under both transmissive and reflective imaging geometry. In Section 4, we show the application of our system to imaging the frontside and backside of a flip-chip to demonstrate it is a viable method for conducting fault detection on silicon-wafer chips.

2. Methods

2.1 Experimental system

Our multispectral microscope system consists of the illumination path and the detection path, as shown in Fig. 1(a). To obtain the multispectral information, 4 laser diodes with different wavelengths ranging from 405 nm to 1552 nm are coupled into a multi-mode fiber (Thorlabs FT200EMT-CUSTOM, 0.39 NA, Ø200 µm, FC/PC, 5 m), which is jittered by a vibrating motor to wash out the speckle in the output. The diodes are sequentially powered by a programmable microcontroller (Arduino UNO) during the image capturing process. Depending on the sample to be imaged, transmission or reflection illumination mode is selected. In the detection part, the VIS and NIR imaging paths share the same objective (10X Mitutoyo Plan Apo NIR Infinity Corrected Objective, 0.26 NA), 4f relay system, and mechanical scanning aperture. The paths are separated by a 50:50 beam splitter. Each path has a tube lens and a camera designed for the respective wavelengths. This dual-detection-path design is necessary because it is hard to find a single camera with such a wide spectral response. The VIS camera (PROSILICA GX 6600) used for wavelengths ranging from 405 nm to 638 nm has 6576×4394 pixels with a pixel size of 5.5 µm. The magnification of the VIS imaging path is 10×, so the effective FOV in the object space is around 3.62 mm×2.42 mm. The NIR camera (WIDY SWIR 640U-S) used for 1552 nm has 640×512 pixels with a pixel size of 15 µm. The magnification of the NIR imaging path is intentionally set to 5×, so the effective FOV in the object space is approximately 1.92 mm×1.54 mm. These two cameras are first manually aligned to match their centers in the experimental setup, as shown in Fig. 1(c), and then digitally registered in the signal processing step to rectify the slight tilt and rotation between the two camera sensors. The different magnification factor selected for the NIR imaging path is the result of a tradeoff between FOV and the effective pixel sampling rate, which will be discussed together with the scanning aperture size in the following.

 

Fig. 1. Multispectral, aperture-scanning Fourier ptychographic microscope system design. (a) The experimental setup schematic, switchable between transmission and reflection illumination mode. (b) FPM scanning strategy, where the red dots and thick red lines represent the scanning trajectory. The blue dotted circles represent the NA coverage of each scanning aperture. (c) Spatial position of VIS and NIR camera FOVs on the sample plane. For visualization, the VIS camera FOV is displayed in green and the NIR camera FOV in transparent gray.


The key to the aperture-scanning FPM in our system is a physical iris with an adjustable aperture mounted on a 2D motorized translation stage. It is located at the relayed Fourier plane and scanned in a spiral pattern, as shown in Fig. 1(b), for each laser diode. A set of sub-aperture intensity images is thus acquired for each wavelength. The overlap rate between two adjacent aperture coverages is about 85%, satisfying the redundancy demanded by the FPM algorithm. The total number of images required to fully exploit the objective NA is 47. The image sets are fed into the reconstruction algorithm described in the following section. In our experiment, the aperture radius is chosen to be 3 mm, corresponding to 0.15 NA in the system. Thus, the effective pixel size in the object space should be less than 1.35 µm (calculated at 405 nm) for the VIS imaging path and less than 5.17 µm (calculated at 1552 nm) for the NIR imaging path in order to meet the Nyquist sampling requirement. According to the setup, the effective pixel size of the VIS camera is 0.55 µm under 10× magnification, meeting the sampling requirement. For the NIR camera, a large FOV is desired without violating the sampling limit, so 5× magnification is a balanced choice, giving an effective pixel size of 3 µm.
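The sampling bookkeeping above can be summarized in a few lines. The following is a minimal sketch (not part of the original analysis) that reproduces the quoted numbers, assuming the λ/(2NA) criterion applied in the text to the reconstructed field and the camera parameters listed above:

```python
# Nyquist check for the aperture-scanning geometry described above.
# All numbers are the ones quoted in the text; the lambda / (2 * NA)
# criterion is the one applied to the reconstructed complex field.
aperture_na = 0.15  # NA of the 3 mm scanning aperture

channels = {                 # wavelength (m): (camera pixel (m), magnification)
    405e-9:  (5.5e-6, 10),   # VIS camera, 10x path
    1552e-9: (15e-6,  5),    # NIR camera, 5x path
}

for wl, (pixel, mag) in channels.items():
    nyquist_limit = wl / (2 * aperture_na)   # largest allowed object-space pixel
    effective_pixel = pixel / mag            # actual object-space pixel
    print(f"{wl * 1e9:.0f} nm: limit {nyquist_limit * 1e6:.2f} um, "
          f"effective {effective_pixel * 1e6:.2f} um, "
          f"satisfied = {effective_pixel <= nyquist_limit}")
```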

2.2 Aberration calibration and correction scheme

Aberration correction for the multispectral optics is a difficult task. Here, we first utilize FPM to reconstruct the complex amplitude field and overcome the pixel sampling issue. Then, we assume the FOV’s center region to be free from aberration and use it as the reference for determining the spatially variant pupil functions across the FOV, as has been done in [39]. Finally, the pupil functions are decomposed into the eight Zernike modes to be used for subsequent aberration-compensation procedures.

The dataflow pipeline of the aberration calibration and correction scheme is shown in Fig. 2. Here, polystyrene microspheres (Polybead, Polysciences) are used as the standard calibration sample for the following reasons. First, they have an isotropic shape and a consistent diameter, with a variety of sizes available. We choose microbeads of different sizes for different wavelengths so that they can be treated as quasi-point objects for our optical system and used for the following calibration. Second, they can be homogeneously distributed across the full FOV, which avoids mechanical movement of the sample when calibrating the spatially varying aberration. As can be observed from Fig. 2(a), a microsphere located at the peripheral region of the FOV (denoted as the off-axis bead) is distorted into an elliptical shape by the residual aberration, while one in the central FOV (denoted as the on-axis bead) remains circular.

 

Fig. 2. Dataflow pipeline of the aberration calibration and correction scheme. (a) Aberration calibration algorithm pipeline based on the microbead sample. It consists of three steps: FPM reconstruction, digital defocusing, and Zernike mode fitting. (b) Aberration correction algorithm pipeline. The pupil function at a specific spatial location is compensated by applying its conjugate in the Fourier domain.


As shown in Fig. 2(a), our aberration calibration algorithm pipeline can be divided into three steps.

(1) FPM reconstruction

The imaging process can be modeled as follows. The spatial coordinates on the sample plane, the Fourier plane and the image plane are respectively denoted as $(x,y)$, $(u,v)$ and $({x_1},{y_1})$. The exit wave out of the sample plane is described by a complex-valued function $o(x,y)$. Then, for a single wavelength λ0, the set of low-resolution raw images can be written as

$$I_k(x_1, y_1) = \left| \mathbf{F}^{-1}\{ \mathbf{F}\{ o(x, y) \} \cdot P_k(u, v) \} \right|^2, \quad k = 1, 2, \ldots, 47,$$
where each k corresponds to a scanning position and $P_k(u,v)$ is the shifted aperture function at that location. The goal of the FPM algorithm is to reconstruct the sample's complex field (including its phase) from this highly redundant dataset. First, a high-resolution spatial-frequency spectrum guess $O'(u,v)$ is generated. Then the computational task is formulated as an optimization problem to minimize the following cost function:
$$\arg\min \sum_{k = 1}^{47} \left| \left| \mathbf{F}^{-1}\{ O'(u, v) \cdot P_k(u, v) \} \right|^2 - I_k(x_1, y_1) \right|^2 .$$
Here, we adopt the sequential Gauss-Newton method to optimize it, taking convergence speed, noise robustness, and computational cost into account [29]. Since the FPM algorithm has been discussed in detail in [29], it will not be elaborated on here.
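To make the structure of this step concrete, the sketch below implements a simplified sequential FPM update. It is not the full sequential Gauss-Newton solver of [29]; instead, it uses a basic amplitude-replacement update within each scanned sub-aperture and assumes, for brevity, that the raw images have been resampled onto the same grid as the high-resolution spectrum:

```python
import numpy as np

F, iF = np.fft.fft2, np.fft.ifft2   # Fourier-plane transform pair

def fpm_reconstruct(raw_images, apertures, n_iters=20, alpha=1.0):
    """Simplified sequential FPM update (not the Gauss-Newton solver of [29]).

    raw_images : list of measured intensity images I_k, assumed here (for
                 brevity) to be resampled onto the high-resolution grid.
    apertures  : list of binary masks P_k(u, v), already shifted to the k-th
                 scanning position in the Fourier plane.
    Returns the estimated high-resolution sample spectrum O'(u, v).
    """
    O = F(np.sqrt(raw_images[0]).astype(complex))      # initial spectrum guess
    for _ in range(n_iters):
        for I_k, P_k in zip(raw_images, apertures):
            psi = iF(O * P_k)                           # low-resolution field estimate
            # enforce the measured amplitude, keep the estimated phase
            psi_new = np.sqrt(I_k) * np.exp(1j * np.angle(psi))
            # write the correction back into the covered sub-aperture
            O = O + alpha * P_k * (F(psi_new) - O * P_k)
        # (a pupil-function update, as in embedded pupil recovery [28], is omitted)
    return O
```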

(2) Digital defocusing

After the complex amplitude functions of on-axis and off-axis beads are reconstructed, they are digitally defocused to 10 different z-axis positions by the angular spectrum method [40]. If the reconstructed complex amplitude field is denoted as $B(x,y)$, the set of 10 z-stack images can be written as

$$I_t(x, y) = \left| \mathbf{F}^{-1}\{ \mathbf{F}\{ B(x, y) \} \cdot H_t(u, v) \} \right|^2, \quad t = 1, 2, \ldots, 10.$$
Here, $H_t(u,v)$ is the free-space propagation kernel:
$$H_t(u, v) = \exp\left[ i \frac{2\pi}{\lambda_0} z_t \sqrt{1 - (\lambda_0 u)^2 - (\lambda_0 v)^2} \right], \quad t = 1, 2, \ldots, 10.$$
The defocusing distance interval $z_t - z_{t-1}$ is set as a constant linearly proportional to the wavelength in order to produce enough spatial variation among the intensity images $I_t(x,y)$. The set of z-stack intensity images $I_{off,t}(x,y)$ generated from the off-axis bead function $B_{off}(x,y)$ will be used as the constraints in the following fitting step.
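As an illustration, a minimal sketch of this digital defocusing step is given below. It assumes a square pixel pitch dx in the object plane and suppresses evanescent components; the propagation kernel is the one defined above:

```python
import numpy as np

def angular_spectrum_propagate(B, wavelength, dx, z):
    """Digitally defocus a complex field B(x, y) by a distance z using the
    angular spectrum method [40]; wavelength, dx and z share the same unit."""
    n_rows, n_cols = B.shape
    u = np.fft.fftfreq(n_cols, d=dx)          # spatial frequencies along x
    v = np.fft.fftfreq(n_rows, d=dx)          # spatial frequencies along y
    U, V = np.meshgrid(u, v)
    arg = 1.0 - (wavelength * U) ** 2 - (wavelength * V) ** 2
    arg = np.clip(arg, 0.0, None)             # suppress evanescent components
    H = np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(arg))
    return np.fft.ifft2(np.fft.fft2(B) * H)

# Example: a z-stack of 10 defocused intensity images from an off-axis bead
# field B_off, with a step dz proportional to the wavelength (dz here is an
# illustrative choice, not the value used in the experiment).
# I_off_stack = [np.abs(angular_spectrum_propagate(B_off, 532e-9, 0.55e-6, t * dz)) ** 2
#                for t in range(1, 11)]
```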

(3) Zernike mode fitting

Based on the assumption that the FOV’s center is minimally distorted and well corrected by the objective lens’s manufacturer, it can be taken as the aberration-free region with diffraction-limited resolution performance. Therefore, the reconstructed on-axis bead $B_{on}(x,y)$ can be taken as the aberration-free image of the bead. The differences in the Fourier domain between this reference image and the aberrated bead images at other FOV locations are attributed to the residual aberrations. Since minimization in the Fourier domain is sensitive to noise, we formulate the optimization problem in the spatial domain; this is why step (2) is needed to obtain the z-stacks in the spatial domain.

The aberration function can be decomposed into orthogonal Zernike modes $Z_n^m(u,v)$, each of which corresponds to a principal aberration basis [41]. In this paper, we focus on the eight most common and dominant Zernike modes, which are $Z_1^{-1}$ (tilt $x$), $Z_1^1$ (tilt $y$), $Z_2^{-2}$ (astigmatism $x$), $Z_2^2$ (astigmatism $y$), $Z_2^0$ (defocus), $Z_3^{-1}$ (coma $x$), $Z_3^1$ (coma $y$), and $Z_4^0$ (spherical aberration). For simplicity, we relabel them as $Z_j(u,v), j = 1,2,\ldots,8$. Consequently, a pupil function with a set of guessed Zernike coefficients is

$$P(u, v; w_1, \ldots, w_8) = \exp\left[ i 2\pi \sum_{j = 1}^{8} w_j Z_j(u, v) \right],$$
where $w_j$ is the coefficient of each Zernike mode. We then find the pupil function whose Zernike mode coefficients minimize the following cost function:
$$\arg\min \sum_{t = 1}^{10} \left| \left| \mathbf{F}^{-1}\{ \mathbf{F}\{ B_{on}(x, y) \} \cdot P(u, v; w_1, \ldots, w_8) \cdot H_t(u, v) \} \right|^2 - I_{off,t}(x, y) \right|^2 .$$
Once the optimal solution is found, the pupil function at the FOV location $(x,y)$ can be subsequently used to computationally compensate for the aberrations, as shown in Fig. 2(b). Here, since the aberration is spatially varying and wavelength-dependent, the process above will be repeated for different locations in the whole FOV and for all four wavelengths.
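A compact sketch of this fitting step is shown below. The stack of Zernike modes Z (shape 8 × N × N) is assumed to be precomputed with an external routine, and a generic derivative-free optimizer stands in for whatever solver is actually used; both are illustrative assumptions rather than the exact implementation:

```python
import numpy as np
from scipy.optimize import minimize

def fit_pupil(B_on, I_off_stack, Z, H_stack):
    """Fit the eight Zernike coefficients w_1..w_8 of the local pupil by
    matching defocused images of the on-axis (reference) bead against the
    off-axis z-stack, following the cost function above.

    B_on        : reconstructed complex field of the on-axis bead.
    I_off_stack : list of 10 defocused intensity images of the off-axis bead.
    Z           : precomputed Zernike mode stack, shape (8, N, N).
    H_stack     : list of 10 propagation kernels H_t(u, v).
    """
    S_on = np.fft.fft2(B_on)                  # spectrum of the reference bead

    def cost(w):
        P = np.exp(1j * 2 * np.pi * np.tensordot(w, Z, axes=1))   # pupil guess
        err = 0.0
        for I_off, H in zip(I_off_stack, H_stack):
            I_model = np.abs(np.fft.ifft2(S_on * P * H)) ** 2
            err += np.sum((I_model - I_off) ** 2)
        return err

    result = minimize(cost, x0=np.zeros(8), method="Nelder-Mead")
    return result.x                           # fitted Zernike coefficients
```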

3. Results and discussion

3.1 Spatially varying aberration

As implied in the account of Section 2.2, the aberration is spatially varying. This is more pronounced for a low-NA objective with a large FOV, such as the one used in our multispectral microscope system. Thus, the Zernike mode coefficient $w_j$ is a function of the spatial location $(x,y)$ on the sample plane and the wavelength λ, which can be written as $w_j(x,y;\lambda)$.

For a single fixed wavelength λ0, a series of discrete data points can be acquired for $w_j(x,y;\lambda_0)$ by repeating the calibration procedures in Section 2.2, as displayed by the blue dots in Fig. 3 (λ0 = 532 nm). It has been reported that the spatial dependence of different Zernike mode coefficients can be fitted with polynomials of different orders [41]. Thus, these discrete data points are fitted to 2D surfaces in Fig. 3 so that the local Zernike coefficients at any spatial location in the FOV can be queried. From the magnitudes of the eight Zernike coefficient distributions, it can be seen that astigmatism is the dominant aberration of the objective lens in our study.
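The surface fitting itself reduces to an ordinary least-squares problem. The sketch below fits one coefficient map to a 2D polynomial of a chosen order; the quadratic default and the function names are illustrative assumptions, since [41] suggests different polynomial orders for different Zernike modes:

```python
import numpy as np

def fit_coefficient_surface(x, y, w, order=2):
    """Least-squares fit of one Zernike coefficient w_j measured at scattered
    bead locations (x, y) to a 2D polynomial surface, so the local aberration
    can be queried at any FOV position."""
    def monomials(xq, yq):
        # all monomials x^p * y^q with p + q <= order, stacked as columns
        return np.stack([(xq ** p) * (yq ** q)
                         for p in range(order + 1)
                         for q in range(order + 1 - p)], axis=-1)

    A = monomials(np.asarray(x, float), np.asarray(y, float))
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(w, float), rcond=None)
    return lambda xq, yq: monomials(np.asarray(xq, float),
                                    np.asarray(yq, float)) @ coeffs

# Example: surfaces = [fit_coefficient_surface(xb, yb, w_j) for w_j in w_all]
# then surfaces[j](x0, y0) returns the j-th coefficient at location (x0, y0).
```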

 

Fig. 3. Spatially varying aberration of our multispectral microscope system at a wavelength of 532 nm. (a)-(h) Zernike mode coefficients as functions of the spatial coordinates on the sample plane. Each blue-dot data point represents the Zernike coefficient weight calculated from one off-axis bead. ∼70 beads are identified over the entire FOV. These data points are fitted to a 2D surface for each type of aberration.


The same aberration characterization procedures are repeated for each wavelength, and the spatial variation of their Zernike coefficients is summarized in Fig. 4 and Table 1. Since the tilt and defocus aberration modes are sample-dependent and can be compensated by digital refocusing of local tiles, they are not shown here. In the remainder of this paper, we only apply the three reported Seidel aberrations (astigmatism, coma, and spherical aberration) to correct residual aberrations; as shown in the next section, these Zernike terms are sufficient because of their dominance in the aberration function.

 

Fig. 4. Spatially varying aberration of our multispectral microscope system for multiple wavelengths. The NIR fitted surface is based on the beads located in the NIR camera’s FOV, shown by the dashed-line box in (a). For comparison, it is extrapolated to the same size as the VIS results. (a)-(d) Astigmatism and coma coefficients as functions of the spatial coordinates on the sample plane. Each color represents a single wavelength.



Table 1. Fitted spherical aberration

For explicit visualization and comparison among the four wavelengths, the aberration characterization results computed from the NIR camera’s FOV and the VIS camera’s FOV are registered and plotted together. The result at the NIR wavelength is further extrapolated to the same size as the VIS results to observe their varying trends. Since the spherical aberration coefficient is constant for each wavelength, it is listed in Table 1 rather than plotted as contours in Fig. 4. From the contour plots, it can be seen that the spatial dependence of the different Zernike mode coefficients follows a similar distribution for each wavelength. However, the aberration is more severe in the NIR regime than in the VIS regime, as indicated by the absolute values of all aberration coefficients; they increase fastest at 1552 nm as the location deviates from the FOV center.

3.2 Calibration demonstration

After the spatially varying aberration for each wavelength is calibrated, a simple and direct way to verify its correctness is to apply it to imaging a standard sample, such as the Siemens star suggested in [42]. In the experiment, the Siemens star target is randomly offset from the optical axis for each wavelength, and the transmissive and reflective image datasets are acquired at the same location.

Following the scheme described in Section 2.2, reconstructions before and after correction are presented in Figs. 5(a)–5(b). To observe the resolution enhancement, the Siemens star’s center is enlarged and shown in the inset boxes. It can be clearly seen that the aberrations at these arbitrary locations are well corrected under both transmission and reflection illumination modes across all wavelengths.

 

Fig. 5. Demonstration of aberration calibration and correction. The Siemens star target was randomly offset from the optical center for different wavelengths. Transmissive (a) and reflective (b) reconstructions before and after correction are compared, and the inset boxes zoom in on the sample center to show the resolution enhancement. (c1)-(c4) Line profiles along a circular segment in the reflective reconstructions before and after correction.


To further quantify the resolution enhancement, the line profile along a circular section of the reflective reconstructions under the four wavelengths is drawn [43] and compared before and after aberration correction in Figs. 5(c1)–5(c4). Before correction, the spacing between the peaks and troughs of the Siemens star is uneven; some troughs are even brighter than the peaks, and several peaks and troughs are invisible altogether. After compensation, the even spacing between peaks and troughs shows that the aberration-corrected contrast is uniform at all orientations, which indicates an isotropic resolution improvement. This verifies the efficacy of our aberration compensation method. Notably, the overall reconstruction quality under 1552 nm is not comparable to the VIS results, as the NIR camera has higher thermal noise.

4. Silicon chip imaging

A useful application of our multispectral microscope is silicon chip inspection. Here, a flip-chip is utilized as our imaging sample. Flip-chip bonding is a prevalent method to interconnect semiconductor devices to external circuitry with solder bumps deposited onto the chip pads, as shown in the schematic of Fig. 6(a). The flip-chip we used has nine metallic layers stacked above a silicon die approximately 305 µm thick. Eleven micro-ball bumps are bonded on top of the metallic layers.

 

Fig. 6. Schematic of the flip-chip and its frontside imaging. (a) Schematic of the flip-chip and its micro-ball bump. (b) Multispectral frontside images of the flip-chip. (c) Comparison between the visible and NIR images in three regions of interest (ROI) marked by the yellow boxes in (b). (d) Enlargements of the same area from the three visible-light channel images, comparing the results before and after aberration correction, together with the local pupil function for each wavelength.


In failure analysis of silicon chips, nondestructive imaging is important to avoid disturbing the functionality of the integrated circuits (IC). High-resolution imaging techniques such as transmission electron microscopy (TEM) or scanning electron microscopy (SEM) require the transistors to be exposed destructively. Optical microscopy techniques may be used for frontside imaging of the top metallic layers. However, for the bottom layers or in situations where the flip-chip is bonded, optical imaging is not practical because silicon is opaque in the visible spectrum [6]. Since silicon's bandgap at 300 K is around 1.12 eV, radiation at wavelengths longer than about 1.1 µm does not have enough photon energy to excite valence-band electrons into the conduction band, and silicon is thus nearly transparent in the NIR regime. NIR backside imaging is therefore common in semiconductor failure analysis. Much effort has been put into improving its resolution, for example by using solid immersion lenses [44] and by using a deformable mirror to compensate for aberrations [45]. We expect that NIR imaging of silicon chips can be further enhanced with our cost-efficient digital aberration correction method.
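As a quick check of this cutoff (our own estimate rather than a result from this work), the bandgap translates into a cutoff wavelength through the photon-energy relation

$$\lambda_c = \frac{hc}{E_g} \approx \frac{1240\ \mathrm{nm \cdot eV}}{1.12\ \mathrm{eV}} \approx 1.11\ \mu\mathrm{m},$$

so the 1552 nm channel lies well beyond the band-to-band absorption edge, whereas all three visible channels are strongly absorbed.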

First, the frontside multispectral images are displayed in Fig. 6(b). It can be seen that different information is obtained under different wavelengths. Remaining adhesive and scratches are clearest under 405 nm, while the solder mask and the copper wires provide contrast under 532 nm and 638 nm, respectively. The image under 1552 nm is similar to the one under 638 nm. The differences are more explicit in Fig. 6(c), which zooms in on the three regions marked by yellow boxes in Fig. 6(b). Considering the resolution limit imposed by the wavelength, it is preferable to image the frontside with visible light. Furthermore, the aberrations of the three visible channels are corrected. As shown in Fig. 6(d), the width of the metal wires is not uniform before correction due to aberrations. After the aberration correction is applied with the calibrated local pupil function, the wire width becomes consistent, indicating that the resolution is more isotropic. Another interesting observation from the local pupil functions is that their patterns are similar to each other apart from being rescaled by the wavelength, which confirms that the spatial dependence of the aberration follows a similar pattern for each wavelength, as shown in Fig. 4.

As shown in Fig. 7(a), the 1552 nm wavelength can penetrate the 305-µm-thick silicon substrate while the 532 nm wavelength is totally blocked at the surface. Figure 7(c1) displays the bond pad array, and (c2) is the backside image of the inductor corresponding to ROI 2 in Fig. 6(c). It is clear in Fig. 7(c2) that the side-by-side copper wires of the inductor cannot be resolved in the directly up-sampled raw data due to pixel aliasing. Although the FPM reconstruction overcomes the pixel sampling issue, the aberration still distorts the wire shapes. Only after the aberration correction can the wires be well resolved, as shown by the line profiles in Fig. 7(b). This could be useful when diagnosing faults of the IC, for example a short circuit; without the aberration correction, it is difficult to assess the integrity of the wires.

 

Fig. 7. Backside imaging of the flip-chip. (a) Comparison between the backside raw image under the wavelength of 532 nm and 1552 nm. (b) Line profiles through the periodic structure in (c2) to highlight the aberration correction performance. (c1)-(c3) Comparison between up-sampled raw image, FPM-reconstructed image before aberration compensation and the image after compensation along with their local pupil function.


To sum up, our multispectral microscope can provide an informative inspection of the flip-chip. In frontside imaging, the visible light provides high-resolution multi-channel images of the top layers. In backside imaging, the NIR light penetrates the optically opaque silicon wafer and reveals the structures beneath. For all spectral regimes, our aberration correction scheme overcomes the aberration barrier, which is important because the microscale details of the chip can be easily distorted by residual aberrations.

5. Conclusions

In this paper, a scheme to calibrate and correct the aberrations of a multispectral microscope system is reported. It is based on aperture-scanning FPM, which can be easily integrated into a traditional microscope system. With the pixel super-resolution ability of FPM [36], it relaxes the sampling requirement on the detector. Thus, it is especially suitable for NIR imaging, considering the current limitations in manufacturing detectors for the NIR regime. Moreover, our multispectral microscope relies on computational aberration correction, which is much simpler in design and more cost-effective than other optical correction efforts. Once calibrated, it can be applied for subsequent imaging without further calibration.

To demonstrate the effectiveness of our scheme, we have built and tested a multispectral microscope system with operating wavelengths ranging from 405 nm to 1552 nm. Due to the large FOV of the objective, the aberration in the peripheral region is severe. We first calibrate the system with standard polystyrene microspheres, and then demonstrate the isotropic resolution enhancement with the Siemens star. Finally, we show a promising application: imaging a silicon-wafer flip-chip, where some IC details are resolved only after the aberration correction. Our scheme could find applications in nondestructive fault diagnosis of samples such as microelectromechanical systems (MEMS) devices, heavily doped silicon samples, wafer bonding, and 3D chip stacks.

Funding

California Institute of Technology (Caltech Innovation Initiative (CII): 25570015).

Acknowledgments

We thank Hangwen Lu and Xiaoyu Liu for their initial efforts on this project, Ruizhi Cao for his constructive discussions, and Michelle Cua and Craig Ives for their generous help.

References

1. E. Pirard, “Multispectral imaging of ore minerals in optical microscopy,” Mineral. Mag. 68(2), 323–333 (2004). [CrossRef]  

2. Y. Roggo, A. Edmond, P. Chalus, and M. Ulmschneider, “Infrared hyperspectral imaging for qualitative analysis of pharmaceutical solid forms,” Anal. Chim. Acta 535(1-2), 79–87 (2005). [CrossRef]  

3. Y. W. Chu, F. Chen, Y. Tang, T. Chen, Y. X. Yu, H. L. Jin, L. B. Guo, Y. F. Lu, and X. Y. Zeng, “Diagnosis of nasopharyngeal carcinoma from serum samples using hyperspectral imaging combined with a chemometric method,” Opt. Express 26(22), 28661–28671 (2018). [CrossRef]  

4. C. D. Tran, “Principles, instrumentation, and applications of infrared multispectral imaging, an overview,” Anal. Lett. 38(5), 735–752 (2005). [CrossRef]  

5. D. L. Barton, K. Bernhard-Höfer, and E. I. Cole Jr, “FLIP-chip and “backside” techniques,” Microelectron. Reliab. 39(6-7), 721–730 (1999). [CrossRef]  

6. K. Agarwal, R. Chen, L. S. Koh, C. J. R. Sheppard, and X. Chen, “Crossing the resolution limit in near-infrared imaging of silicon chips: targeting 10-nm node technology,” Phys. Rev. X 5(2), 021014 (2015). [CrossRef]  

7. J. Chung, G. W. Martinez, K. C. Lencioni, S. R. Sadda, and C. Yang, “Computational aberration compensation by coded-aperture-based correction of aberration obtained from optical Fourier coding and blur estimation,” Optica 6(5), 647–661 (2019). [CrossRef]  

8. B. C. Platt and R. Shack, “History and principles of Shack-Hartmann wavefront sensing,” J. Refract. Surg. 17(5), S573–S577 (2001). [CrossRef]  

9. J. L. Beverage, R. V. Shack, and M. R. Descour, “Measurement of the three - dimensional microscope point spread function using a Shack-Hartmann wavefront sensor,” J. Microsc. 205(1), 61–75 (2002). [CrossRef]  

10. M. R. Teague, “Deterministic phase retrieval: a Green's function solution,” J. Opt. Soc. Am. A 73(11), 1434–1441 (1983). [CrossRef]  

11. L. Waller, L. Tian, and G. Barbastathis, “Transport of Intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18(12), 12552–12561 (2010). [CrossRef]  

12. L. Tian, J. C. Petruccelli, and G. Barbastathis, “Nonlinear diffusion regularization for transport of intensity phase imaging,” Opt. Lett. 37(19), 4131–4133 (2012). [CrossRef]  

13. C. Zuo, Q. Chen, Y. Yu, and A. Asundi, “Transport-of-intensity phase imaging using Savitzky-Golay differentiation filter–theory and applications,” Opt. Express 21(5), 5346–5362 (2013). [CrossRef]  

14. C. Zuo, Q. Chen, W. Qu, and A. Asundi, “High-speed transport-of-intensity phase microscopy with an electrically tunable lens,” Opt. Express 21(20), 24060–24075 (2013). [CrossRef]  

15. R. A. Gonsalves and R. Chidlaw, “Wavefront sensing by phase retrieval,” Proc. SPIE 207, 32–39 (1979). [CrossRef]  

16. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982). [CrossRef]  

17. R. G. Paxman, T. J. Schulz, and J. R. Fienup, “Joint estimation of object and aberrations by using phase diversity,” J. Opt. Soc. Am. A 9(7), 1072–1085 (1992). [CrossRef]  

18. L. Waller, M. Tsang, S. Ponda, S. Y. Yang, and G. Barbastathis, “Phase and amplitude imaging from noisy images by Kalman filtering,” Opt. Express 19(3), 2805–2814 (2011). [CrossRef]  

19. J. Zhong, L. Tian, P. Varma, and L. Waller, “Nonlinear optimization algorithm for partially coherent phase retrieval and source recovery,” IEEE Trans. Comput. Imaging 2(3), 310–322 (2016). [CrossRef]  

20. C. Shen, J. Tan, C. Wei, and Z. Liu, “Coherent diffraction imaging by moving a lens,” Opt. Express 24(15), 16520–16529 (2016). [CrossRef]  

21. C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017). [CrossRef]  

22. A. Greenbaum, A. Feizi, N. Akbari, and A. Ozcan, “Wide-field computational color imaging using pixel super-resolved on-chip microscopy,” Opt. Express 21(10), 12469–12483 (2013). [CrossRef]  

23. A. Greenbaum, Y. Zhang, A. Feizi, P. L. Chung, W. Luo, S. R. Kandukuri, and A. Ozcan, “Wide-field computational imaging of pathology slides using lens-free on-chip microscopy,” Sci. Transl. Med. 6(267), 267ra175 (2014). [CrossRef]  

24. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004). [CrossRef]  

25. A. Pan and B. Yao, “Three-dimensional space optimization for near-field ptychography,” Opt. Express 27(4), 5433–5446 (2019). [CrossRef]  

26. A. Pan, M. Zhou, Y. Zhang, J. Min, M. Lei, and B. Yao, “Adaptive-window angular spectrum algorithm for near-field ptychography,” Opt. Commun. 430, 73–82 (2019). [CrossRef]  

27. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013). [CrossRef]  

28. X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22(5), 4960–4972 (2014). [CrossRef]  

29. L. H. Yeh, J. Dong, J. Zhong, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, and L. Waller, “Experimental robustness of Fourier ptychography phase retrieval algorithms,” Opt. Express 23(26), 33214–33240 (2015). [CrossRef]  

30. A. Pan, Y. Zhang, T. Zhao, Z. Wang, D. Dan, M. Lei, and B. Yao, “System calibration method for Fourier ptychographic microscopy,” J. Biomed. Opt. 22(09), 1 (2017). [CrossRef]  

31. C. Zuo, J. Sun, and Q. Chen, “Adaptive step-size strategy for noise-robust Fourier ptychographic microscopy,” Opt. Express 24(18), 20724–20744 (2016). [CrossRef]  

32. J. Chung, H. Lu, X. Ou, H. Zhou, and C. Yang, “Wide-field Fourier ptychographic microscopy using laser illumination source,” Biomed. Opt. Express 7(11), 4787–4802 (2016). [CrossRef]  

33. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2(2), 104–111 (2015). [CrossRef]  

34. R. Horstmeyer, J. Chung, X. Ou, G. Zheng, and C. Yang, “Diffraction tomography with Fourier ptychography,” Optica 3(8), 827–835 (2016). [CrossRef]  

35. S. Dong, R. Horstmeyer, R. Shiradkar, K. Guo, X. Ou, Z. Bian, H. Xin, and G. Zheng, “Aperture-scanning Fourier ptychography for 3D refocusing and super-resolution macroscopic imaging,” Opt. Express 22(11), 13586–13599 (2014). [CrossRef]  

36. X. Ou, J. Chung, R. Horstmeyer, and C. Yang, “Aperture scanning Fourier ptychographic microscopy,” Biomed. Opt. Express 7(8), 3140–3150 (2016). [CrossRef]  

37. A. Levin, R. Fergus, F. Durand, and W. T. Freeman, “Image and depth from a conventional camera with a coded aperture,” ACM T. Graphic. 26(3), 70 (2007). [CrossRef]  

38. R. Horstmeyer, X. Ou, J. Chung, G. Zheng, and C. Yang, “Overlapped Fourier coding for optical aberration removal,” Opt. Express 22(20), 24062–24080 (2014). [CrossRef]  

39. G. Zheng, X. Ou, R. Horstmeyer, and C. Yang, “Characterization of spatially varying aberrations for wide field-of-view microscopy,” Opt. Express 21(13), 15131–15143 (2013). [CrossRef]  

40. J. W. Goodman, Introduction to Fourier optics (Roberts and Company Publishers, 2005).

41. H. Gross, H. Zügge, M. Peschka, and F. Blechinger, Handbook of Optical Systems: Vol. 3. Aberration Theory and Correction of Optical Systems (Wiley-VCH, 2007), Chap. 29.

42. R. Horstmeyer, R. Heintzmann, G. Popescu, L. Waller, and C. Yang, “Standardizing the resolution claims for coherent microscopy,” Nat. Photonics 10(2), 68–71 (2016). [CrossRef]  

43. C. T. Rueden, J. Schindelin, M. C. Hiner, B. E. DeZonia, A. E. Walter, E. T. Arena, and K. W. Eliceiri, “ImageJ2: ImageJ for the next generation of scientific image data,” BMC Bioinf. 18(1), 529 (2017). [CrossRef]  

44. K. Vigil, Y. Lu, A. Yurt, T. B. Cilingiroglu, T. G. Bifano, M. S. Ünlü, and B. B. Goldberg, “Integrated circuit super-resolution failure analysis with solid immersion lenses,” Electronic Device Failure Analysis 16, 26–32 (2014).

45. Y. Lu, T. Bifano, S. Ünlü, and B. Goldberg, “Aberration compensation in aplanatic solid immersion lens microscopy,” Opt. Express 21(23), 28189–28197 (2013). [CrossRef]  

References

  • View by:
  • |
  • |
  • |

  1. E. Pirard, “Multispectral imaging of ore minerals in optical microscopy,” Mineral. Mag. 68(2), 323–333 (2004).
    [Crossref]
  2. Y. Roggo, A. Edmond, P. Chalus, and M. Ulmschneider, “Infrared hyperspectral imaging for qualitative analysis of pharmaceutical solid forms,” Anal. Chim. Acta 535(1-2), 79–87 (2005).
    [Crossref]
  3. Y. W. Chu, F. Chen, Y. Tang, T. Chen, Y. X. Yu, H. L. Jin, L. B. Guo, Y. F. Lu, and X. Y. Zeng, “Diagnosis of nasopharyngeal carcinoma from serum samples using hyperspectral imaging combined with a chemometric method,” Opt. Express 26(22), 28661–28671 (2018).
    [Crossref]
  4. C. D. Tran, “Principles, instrumentation, and applications of infrared multispectral imaging, an overview,” Anal. Lett. 38(5), 735–752 (2005).
    [Crossref]
  5. D. L. Barton, K. Bernhard-Höfer, and E. I. Cole, “FLIP-chip and “backside” techniques,” Microelectron. Reliab. 39(6-7), 721–730 (1999).
    [Crossref]
  6. K. Agarwal, R. Chen, L. S. Koh, C. J. R. Sheppard, and X. Chen, “Crossing the resolution limit in near-infrared imaging of silicon chips: targeting 10-nm node technology,” Phys. Rev. X 5(2), 021014 (2015).
    [Crossref]
  7. J. Chung, G. W. Martinez, K. C. Lencioni, S. R. Sadda, and C. Yang, “Computational aberration compensation by coded-aperture-based correction of aberration obtained from optical Fourier coding and blur estimation,” Optica 6(5), 647–661 (2019).
    [Crossref]
  8. B. C. Platt and R. Shack, “History and principles of Shack-Hartmann wavefront sensing,” J. Refract. Surg. 17(5), S573–S577 (2001).
    [Crossref]
  9. J. L. Beverage, R. V. Shack, and M. R. Descour, “Measurement of the three - dimensional microscope point spread function using a Shack-Hartmann wavefront sensor,” J. Microsc. 205(1), 61–75 (2002).
    [Crossref]
  10. M. R. Teague, “Deterministic phase retrieval: a Green's function solution,” J. Opt. Soc. Am. A 73(11), 1434–1441 (1983).
    [Crossref]
  11. L. Waller, L. Tian, and G. Barbastathis, “Transport of Intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18(12), 12552–12561 (2010).
    [Crossref]
  12. L. Tian, J. C. Petruccelli, and G. Barbastathis, “Nonlinear diffusion regularization for transport of intensity phase imaging,” Opt. Lett. 37(19), 4131–4133 (2012).
    [Crossref]
  13. C. Zuo, Q. Chen, Y. Yu, and A. Asundi, “Transport-of-intensity phase imaging using Savitzky-Golay differentiation filter–theory and applications,” Opt. Express 21(5), 5346–5362 (2013).
    [Crossref]
  14. C. Zuo, Q. Chen, W. Qu, and A. Asundi, “High-speed transport-of-intensity phase microscopy with an electrically tunable lens,” Opt. Express 21(20), 24060–24075 (2013).
    [Crossref]
  15. R. A. Gonsalves and R. Chidlaw, “Wavefront sensing by phase retrieval,” Proc. SPIE 207, 32–39 (1979).
    [Crossref]
  16. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982).
    [Crossref]
  17. R. G. Paxman, T. J. Schulz, and J. R. Fienup, “Joint estimation of object and aberrations by using phase diversity,” J. Opt. Soc. Am. A 9(7), 1072–1085 (1992).
    [Crossref]
  18. L. Waller, M. Tsang, S. Ponda, S. Y. Yang, and G. Barbastathis, “Phase and amplitude imaging from noisy images by Kalman filtering,” Opt. Express 19(3), 2805–2814 (2011).
    [Crossref]
  19. J. Zhong, L. Tian, P. Varma, and L. Waller, “Nonlinear optimization algorithm for partially coherent phase retrieval and source recovery,” IEEE Trans. Comput. Imaging 2(3), 310–322 (2016).
    [Crossref]
  20. C. Shen, J. Tan, C. Wei, and Z. Liu, “Coherent diffraction imaging by moving a lens,” Opt. Express 24(15), 16520–16529 (2016).
    [Crossref]
  21. C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017).
    [Crossref]
  22. A. Greenbaum, A. Feizi, N. Akbari, and A. Ozcan, “Wide-field computational color imaging using pixel super-resolved on-chip microscopy,” Opt. Express 21(10), 12469–12483 (2013).
    [Crossref]
  23. A. Greenbaum, Y. Zhang, A. Feizi, P. L. Chung, W. Luo, S. R. Kandukuri, and A. Ozcan, “Wide-field computational imaging of pathology slides using lens-free on-chip microscopy,” Sci. Transl. Med. 6(267), 267ra175 (2014).
    [Crossref]
  24. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004).
    [Crossref]
  25. A. Pan and B. Yao, “Three-dimensional space optimization for near-field ptychography,” Opt. Express 27(4), 5433–5446 (2019).
    [Crossref]
  26. A. Pan, M. Zhou, Y. Zhang, J. Min, M. Lei, and B. Yao, “Adaptive-window angular spectrum algorithm for near-field ptychography,” Opt. Commun. 430, 73–82 (2019).
    [Crossref]
  27. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013).
    [Crossref]
  28. X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22(5), 4960–4972 (2014).
    [Crossref]
  29. L. H. Yeh, J. Dong, J. Zhong, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, and L. Waller, “Experimental robustness of Fourier ptychography phase retrieval algorithms,” Opt. Express 23(26), 33214–33240 (2015).
    [Crossref]
  30. A. Pan, Y. Zhang, T. Zhao, Z. Wang, D. Dan, M. Lei, and B. Yao, “System calibration method for Fourier ptychographic microscopy,” J. Biomed. Opt. 22(09), 1 (2017).
    [Crossref]
  31. C. Zuo, J. Sun, and Q. Chen, “Adaptive step-size strategy for noise-robust Fourier ptychographic microscopy,” Opt. Express 24(18), 20724–20744 (2016).
    [Crossref]
  32. J. Chung, H. Lu, X. Ou, H. Zhou, and C. Yang, “Wide-field Fourier ptychographic microscopy using laser illumination source,” Biomed. Opt. Express 7(11), 4787–4802 (2016).
    [Crossref]
  33. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2(2), 104–111 (2015).
    [Crossref]
  34. R. Horstmeyer, J. Chung, X. Ou, G. Zheng, and C. Yang, “Diffraction tomography with Fourier ptychography,” Optica 3(8), 827–835 (2016).
    [Crossref]
  35. S. Dong, R. Horstmeyer, R. Shiradkar, K. Guo, X. Ou, Z. Bian, H. Xin, and G. Zheng, “Aperture-scanning Fourier ptychography for 3D refocusing and super-resolution macroscopic imaging,” Opt. Express 22(11), 13586–13599 (2014).
    [Crossref]
  36. X. Ou, J. Chung, R. Horstmeyer, and C. Yang, “Aperture scanning Fourier ptychographic microscopy,” Biomed. Opt. Express 7(8), 3140–3150 (2016).
    [Crossref]
  37. A. Levin, R. Fergus, F. Durand, and W. T. Freeman, “Image and depth from a conventional camera with a coded aperture,” ACM T. Graphic. 26(3), 70 (2007).
    [Crossref]
  38. R. Horstmeyer, X. Ou, J. Chung, G. Zheng, and C. Yang, “Overlapped Fourier coding for optical aberration removal,” Opt. Express 22(20), 24062–24080 (2014).
    [Crossref]
  39. G. Zheng, X. Ou, R. Horstmeyer, and C. Yang, “Characterization of spatially varying aberrations for wide field-of-view microscopy,” Opt. Express 21(13), 15131–15143 (2013).
    [Crossref]
  40. J. W. Goodman, Introduction to Fourier optics (Roberts and Company Publishers, 2005).
  41. H. Gross, H. Zügge, M. Peschka, and F. Blechinger, Handbook of Optical Systems: Vol. 3. Aberration Theory and Correction of Optical Systems (Wiley-Vch, (2007), Chap. 29.
  42. R. Horstmeyer, R. Heintzmann, G. Popescu, L. Waller, and C. Yang, “Standardizing the resolution claims for coherent microscopy,” Nat. Photonics 10(2), 68–71 (2016).
    [Crossref]
  43. C. T. Rueden, J. Schindelin, M. C. Hiner, B. E. DeZonia, A. E. Walter, E. T. Arena, and K. W. Eliceiri, “ImageJ2: ImageJ for the next generation of scientific image data,” BMC Bioinf. 18(1), 529 (2017).
    [Crossref]
  44. K. Vigil, Y. Lu, A. Yurt, T. B. Cilingiroglu, T. G. Bifano, M. S. Ünlü, and B. B. Goldberg, “Integrated circuit super-resolution failure analysis with solid immersion lenses,” Electronic Device Failure Analysis 16, 26–32 (2014).
  45. Y. Lu, T. Bifano, S. Ünlü, and B. Goldberg, “Aberration compensation in aplanatic solid immersion lens microscopy,” Opt. Express 21(23), 28189–28197 (2013).
    [Crossref]

2019 (3)

2018 (1)

2017 (3)

C. Shen, X. Bao, J. Tan, S. Liu, and Z. Liu, “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017).
[Crossref]

A. Pan, Y. Zhang, T. Zhao, Z. Wang, D. Dan, M. Lei, and B. Yao, “System calibration method for Fourier ptychographic microscopy,” J. Biomed. Opt. 22(09), 1 (2017).
[Crossref]

C. T. Rueden, J. Schindelin, M. C. Hiner, B. E. DeZonia, A. E. Walter, E. T. Arena, and K. W. Eliceiri, “ImageJ2: ImageJ for the next generation of scientific image data,” BMC Bioinf. 18(1), 529 (2017).
[Crossref]

2016 (7)

2015 (3)

2014 (5)

X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22(5), 4960–4972 (2014).
[Crossref]

S. Dong, R. Horstmeyer, R. Shiradkar, K. Guo, X. Ou, Z. Bian, H. Xin, and G. Zheng, “Aperture-scanning Fourier ptychography for 3D refocusing and super-resolution macroscopic imaging,” Opt. Express 22(11), 13586–13599 (2014).
[Crossref]

A. Greenbaum, Y. Zhang, A. Feizi, P. L. Chung, W. Luo, S. R. Kandukuri, and A. Ozcan, “Wide-field computational imaging of pathology slides using lens-free on-chip microscopy,” Sci. Transl. Med. 6(267), 267ra175 (2014).
[Crossref]

R. Horstmeyer, X. Ou, J. Chung, G. Zheng, and C. Yang, “Overlapped Fourier coding for optical aberration removal,” Opt. Express 22(20), 24062–24080 (2014).
[Crossref]

K. Vigil, Y. Lu, A. Yurt, T. B. Cilingiroglu, T. G. Bifano, M. S. Ünlü, and B. B. Goldberg, “Integrated circuit super-resolution failure analysis with solid immersion lenses,” Electronic Device Failure Analysis 16, 26–32 (2014).

2013 (6)

2012 (1)

2011 (1)

2010 (1)

2007 (1)

A. Levin, R. Fergus, F. Durand, and W. T. Freeman, “Image and depth from a conventional camera with a coded aperture,” ACM T. Graphic. 26(3), 70 (2007).
[Crossref]

2005 (2)

C. D. Tran, “Principles, instrumentation, and applications of infrared multispectral imaging, an overview,” Anal. Lett. 38(5), 735–752 (2005).
[Crossref]

Y. Roggo, A. Edmond, P. Chalus, and M. Ulmschneider, “Infrared hyperspectral imaging for qualitative analysis of pharmaceutical solid forms,” Anal. Chim. Acta 535(1-2), 79–87 (2005).
[Crossref]

2004 (2)

E. Pirard, “Multispectral imaging of ore minerals in optical microscopy,” Mineral. Mag. 68(2), 323–333 (2004).
[Crossref]

J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004).
[Crossref]

2002 (1)

J. L. Beverage, R. V. Shack, and M. R. Descour, “Measurement of the three - dimensional microscope point spread function using a Shack-Hartmann wavefront sensor,” J. Microsc. 205(1), 61–75 (2002).
[Crossref]

2001 (1)

B. C. Platt and R. Shack, “History and principles of Shack-Hartmann wavefront sensing,” J. Refract. Surg. 17(5), S573–S577 (2001).
[Crossref]

1999 (1)

D. L. Barton, K. Bernhard-Höfer, and E. I. Cole, “FLIP-chip and “backside” techniques,” Microelectron. Reliab. 39(6-7), 721–730 (1999).
[Crossref]

1992 (1)

1983 (1)

M. R. Teague, “Deterministic phase retrieval: a Green's function solution,” J. Opt. Soc. Am. A 73(11), 1434–1441 (1983).
[Crossref]

1982 (1)

1979 (1)

R. A. Gonsalves and R. Chidlaw, “Wavefront sensing by phase retrieval,” Proc. SPIE 207, 32–39 (1979).
[Crossref]

Figures (7)

Fig. 1. Multispectral, aperture-scanning Fourier ptychographic microscope system design. (a) Schematic of the experimental setup, switchable between transmission and reflection illumination modes. (b) FPM scanning strategy, where the red dots and thick red lines represent the scanning trajectory. The blue dotted circles represent the NA coverage of each scanning aperture. (c) Spatial positions of the VIS and NIR camera FOVs on the sample plane. For visualization, the VIS camera FOV is displayed in green and the NIR camera FOV in transparent gray.

Fig. 2. Dataflow pipeline of the aberration calibration and correction scheme. (a) Aberration calibration algorithm pipeline based on the microbead sample. It consists of three steps: FPM reconstruction, digital defocusing, and Zernike mode fitting. (b) Aberration correction algorithm pipeline. The pupil function at a given spatial location is compensated by applying its conjugate in the Fourier domain (a minimal code sketch of this step follows the figure list).

Fig. 3. Spatially varying aberration of our multispectral microscope system at a wavelength of 532 nm. (a)-(h) Zernike mode coefficients as a function of the spatial coordinate on the sample plane. Each blue-dot data point represents the Zernike coefficient weight calculated from one off-axis bead; ∼70 beads are identified over the entire FOV. These data points are fitted to a 2D surface for each type of aberration.

Fig. 4. Spatially varying aberration of our multispectral microscope system for multiple wavelengths. The NIR fitted surface is based on the beads located in the NIR camera's FOV, marked by the dashed box in (a); for comparison, it is extrapolated to the same size as the VIS results. (a)-(d) Astigmatism and coma coefficients as a function of the spatial coordinate on the sample plane. Each color represents a single wavelength.

Fig. 5. Demonstration of aberration calibration and correction. The Siemens star target was randomly offset from the optical center for different wavelengths. Transmissive (a) and reflective (b) reconstructions before and after correction are compared; the inset boxes zoom in on the sample center to show the resolution enhancement. (c1)-(c4) Line profiles along a circular segment of the reflective reconstructions before and after correction.

Fig. 6. Schematic of the flip-chip and its frontside imaging. (a) Schematic of the flip-chip and its micro-ball bump. (b) Multispectral frontside images of the flip-chip. (c) Comparison between the visible and NIR images in three regions of interest (ROI) marked by the yellow box in (b). (d) Enlargements of the same area from the three visible-light channel images, comparing the results before and after aberration correction, together with the local pupil function for each wavelength.

Fig. 7. Backside imaging of the flip-chip. (a) Comparison between the backside raw images at wavelengths of 532 nm and 1552 nm. (b) Line profiles through the periodic structure in (c2) to highlight the aberration correction performance. (c1)-(c3) Comparison between the up-sampled raw image, the FPM-reconstructed image before aberration compensation, and the image after compensation, along with their local pupil functions.
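
To make the correction step of Fig. 2(b) concrete, the following minimal sketch applies the conjugate of a calibrated pupil phase in the Fourier domain of a reconstructed tile. It is only an illustration under assumed conventions (a numpy-based workflow); the function name correct_tile, its argument layout, and the toy aberration in the usage lines are our placeholders, not the paper's implementation.

import numpy as np

def correct_tile(field, pupil_phase, na, wavelength, pixel_size):
    """Compensate a reconstructed complex tile with the conjugate of its
    calibrated pupil phase (the correction step sketched in Fig. 2(b)).

    field       : 2D complex array, reconstructed tile (amplitude * exp(i*phase))
    pupil_phase : 2D real array, calibrated aberration phase [rad] on the same grid
    na          : numerical aperture defining the pupil support
    wavelength  : illumination wavelength (same length unit as pixel_size)
    pixel_size  : sample-plane pixel size of the tile
    """
    n = field.shape[0]
    # Spatial-frequency grid and circular pupil support of radius NA / wavelength
    f = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_size))
    fx, fy = np.meshgrid(f, f)
    support = (fx ** 2 + fy ** 2) <= (na / wavelength) ** 2

    # Multiply the spectrum by exp(-i * phase) inside the support, i.e. by the
    # conjugate of the phase-only pupil; frequencies outside are left untouched.
    correction = np.where(support, np.exp(-1j * pupil_phase), 1.0 + 0.0j)
    spectrum = np.fft.fftshift(np.fft.fft2(field))
    return np.fft.ifft2(np.fft.ifftshift(spectrum * correction))

# Toy usage: remove an arbitrary smooth phase error from a random complex tile.
if __name__ == "__main__":
    n, pix, wl, na = 256, 0.5e-6, 532e-9, 0.1
    rng = np.random.default_rng(0)
    tile = rng.random((n, n)) * np.exp(1j * rng.random((n, n)))
    f = np.fft.fftshift(np.fft.fftfreq(n, d=pix))
    fx, fy = np.meshgrid(f, f)
    phase_err = 2e-12 * (fx ** 2 + fy ** 2)  # illustrative defocus-like aberration [rad]
    corrected = correct_tile(tile, phase_err, na, wl, pix)
    print(corrected.shape, corrected.dtype)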

Tables (1)

Table 1. Fitted spherical aberration

Equations (6)

$$I_k(x_1, y_1) = \left| \mathcal{F}^{-1}\!\left\{ \mathcal{F}\{o(x, y)\}\, P_k(u, v) \right\} \right|^2, \qquad k = 1, 2, \ldots, 47,$$

$$\arg\min \sum_{k=1}^{47} \left|\, \left| \mathcal{F}^{-1}\!\left\{ O(u, v)\, P_k(u, v) \right\} \right|^2 - I_k(x_1, y_1) \,\right|^2 .$$

$$I_t(x, y) = \left| \mathcal{F}^{-1}\!\left\{ \mathcal{F}\{B(x, y)\}\, H_t(u, v) \right\} \right|^2, \qquad t = 1, 2, \ldots, 10.$$

$$H_t(u, v) = \exp\!\left[ i\, \frac{2\pi}{\lambda_0}\, z_t \sqrt{1 - (\lambda_0 u)^2 - (\lambda_0 v)^2} \right], \qquad t = 1, 2, \ldots, 10.$$

$$P(u, v; w_1, \ldots, w_8) = \exp\!\left[ i\, 2\pi \sum_{j=1}^{8} w_j Z_j(u, v) \right],$$

$$\arg\min \sum_{t=1}^{10} \left|\, \left| \mathcal{F}^{-1}\!\left\{ \mathcal{F}\{B_{\mathrm{on}}(x, y)\}\, P(u, v; w_1, \ldots, w_8)\, H_t(u, v) \right\} \right|^2 - I_{\mathrm{off},\,t}(x, y) \,\right|^2 .$$
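
As we read Eqs. (3)-(6), the calibration forward model propagates the reconstructed on-axis bead field B_on through a candidate aberrated pupil P and a set of angular-spectrum defocus kernels H_t, and matches the predicted intensities to the defocused intensity stack I_off,t of an off-axis bead. The sketch below implements that model with numpy under assumed conventions; the mode functions, variable names, and the toy call at the end are illustrative placeholders (the paper's exact Zernike ordering and normalization may differ), and in practice the weights w_j would be found with a generic optimizer such as scipy.optimize.minimize.

import numpy as np

def defocus_kernels(fx, fy, wavelength, z_list):
    """Angular-spectrum defocus kernels H_t(u, v) of Eq. (4), one per distance z_t.
    Evanescent components (negative argument under the square root) are suppressed."""
    arg = 1.0 - (wavelength * fx) ** 2 - (wavelength * fy) ** 2
    kz = np.sqrt(np.clip(arg, 0.0, None))
    return [np.exp(1j * 2.0 * np.pi / wavelength * z * kz) * (arg > 0) for z in z_list]

def aberration_pupil(fx, fy, na, wavelength, weights, modes):
    """Phase-only pupil P(u, v; w_1..w_J) of Eq. (5): a weighted sum of aberration
    modes evaluated on normalized pupil coordinates, inside a circular support."""
    cutoff = na / wavelength
    rho_x, rho_y = fx / cutoff, fy / cutoff
    support = rho_x ** 2 + rho_y ** 2 <= 1.0
    phase = sum(w * m(rho_x, rho_y) for w, m in zip(weights, modes))
    return support * np.exp(1j * 2.0 * np.pi * phase)

def predicted_stack(bead_field, pupil, kernels):
    """Forward model of Eqs. (3) and (6): filter the reconstructed bead spectrum by
    the candidate pupil and each defocus kernel, then take the intensity."""
    spectrum = np.fft.fftshift(np.fft.fft2(bead_field))
    return [np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * pupil * H))) ** 2
            for H in kernels]

def calibration_cost(weights, bead_field, measured_stack, fx, fy, na, wavelength,
                     z_list, modes):
    """Scalar data-fidelity term of Eq. (6), to be minimized over the mode weights."""
    pupil = aberration_pupil(fx, fy, na, wavelength, weights, modes)
    kernels = defocus_kernels(fx, fy, wavelength, z_list)
    preds = predicted_stack(bead_field, pupil, kernels)
    return sum(np.sum((p - m) ** 2) for p, m in zip(preds, measured_stack))

# Illustrative low-order modes on normalized pupil coordinates (a hypothetical
# choice; the paper's Zernike ordering and normalization may differ).
modes = [
    lambda x, y: 2.0 * (x ** 2 + y ** 2) - 1.0,       # defocus
    lambda x, y: x ** 2 - y ** 2,                      # astigmatism 0/90
    lambda x, y: 2.0 * x * y,                          # astigmatism 45
    lambda x, y: (3.0 * (x ** 2 + y ** 2) - 2.0) * x,  # coma along x
]

if __name__ == "__main__":
    n, pix, wl, na = 128, 0.5e-6, 532e-9, 0.1
    f = np.fft.fftshift(np.fft.fftfreq(n, d=pix))
    fx, fy = np.meshgrid(f, f)
    yy, xx = np.mgrid[:n, :n]
    bead = np.exp(-((xx - n / 2) ** 2 + (yy - n / 2) ** 2) / (2.0 * 3.0 ** 2))
    z_list = np.linspace(-10e-6, 10e-6, 10)
    # Synthesize a "measured" stack with a known defocus weight, then evaluate the
    # cost at zero weights; a real run would minimize calibration_cost over the
    # weights, e.g. with scipy.optimize.minimize.
    truth = aberration_pupil(fx, fy, na, wl, [0.1, 0.0, 0.0, 0.0], modes)
    measured = predicted_stack(bead, truth, defocus_kernels(fx, fy, wl, z_list))
    print(calibration_cost([0.0, 0.0, 0.0, 0.0], bead, measured, fx, fy, na, wl,
                           z_list, modes))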
