## Abstract

Optical phase imaging enables visualization of transparent samples, numerical refocusing, and other computational processing. Typically phase is measured quantitatively using interferometric techniques such as digital holography. Researchers have demonstrated image enhancement by synthetic aperture imaging based on digital holography. In this work we introduce a novel imaging technique that implements synthetic aperture imaging using phase retrieval, a non-interferometric technique. Unlike digital holography, phase retrieval obviates the need for a reference arm and provides a more compact, less expensive, and more stable experimental setup. We call this technique synthetic aperture phase retrieval.

© 2014 Optical Society of America

## 1. Introduction

#### 1.1. Measuring phase

Optical phase imaging finds important applications in biomedical imaging where samples are often transparent and weakly scattering. Phase contains valuable information such as refractive index variations or sample thickness that intensity alone cannot provide [1, 2]. Quantitative knowledge of phase, combined with intensity measurements, yields the complex field. The field is a powerful computational tool because it allows the image to be post-processed computationally after the measurements are taken. For example, label-free cell imaging, numerical refocusing, and differential interference contrast can be performed [2–7].

Phase is commonly measured using interferometric techniques such as digital holography. For example, in off-axis interferometry, the reference beam is angularly tilted with respect to the sample beam with wavevector difference Δ*k* [8, 9]. The measured intensity takes the form
$I(x,y)={I}_{r}+{I}_{s}+2\sqrt{{I}_{r}{I}_{s}}\text{cos}(\mathrm{\Delta}k\cdot x+\varphi (x,y))$, from which phase can be extracted. However, the camera pixel size constrains the highest spatial frequency of the interference pattern and therefore limits the maximum off-axis tilt.

Another interferometric technique, called phase-shifting interferometry, helps to solve this problem. Rather than tilting the reference beam, a phase shift *δϕ* is added to the reference arm [10, 11]. The resulting measured intensity becomes
$I(\delta \varphi )={I}_{r}+{I}_{s}+2\sqrt{{I}_{r}{I}_{s}}\text{cos}(\varphi +\delta \varphi )$. Typically, the reference beam is upshifted in frequency by acousto-optic modulators (AOMs). In [6], a high frame rate camera records images at 5000 fps, which is four times the frequency shift of the reference beam. Each image differs in phase by *π*/2. From four consecutive images, phase can be extracted.
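As a minimal illustration of this four-step extraction (our own sketch, not the setup in [6]), the phase follows directly from four frames stepped by *π*/2:

```python
import numpy as np

# Four-step phase-shifting sketch (illustrative only). With reference phase
# steps delta_phi = 0, pi/2, pi, 3*pi/2, the four frames are
#   I_m = I_r + I_s + 2*sqrt(I_r*I_s)*cos(phi + m*pi/2),
# and the sample phase follows from phi = atan2(I4 - I2, I1 - I3).
def four_step_phase(I1, I2, I3, I4):
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic check: build four frames from a known phase map and recover it.
phi_true = np.linspace(-np.pi / 2, np.pi / 2, 64)    # known phase ramp
I_r, I_s = 1.0, 0.25                                 # arbitrary beam intensities
frames = [I_r + I_s + 2 * np.sqrt(I_r * I_s) * np.cos(phi_true + m * np.pi / 2)
          for m in range(4)]
phi_est = four_step_phase(*frames)
print(np.allclose(phi_est, phi_true, atol=1e-10))    # True: phase recovered
```
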

Although digital holography is widely used, it has disadvantages. Interferometry generally requires two beams that must be kept stable: the interference pattern can be very sensitive to table vibrations and temperature fluctuations. The reference arm adds parts, cost, and complexity. For example, the phase-shifting interferometry setup in [6] requires AOMs to upshift the reference beam and a high frame rate camera to capture the phase-shifted images.

Phase retrieval presents a viable alternative to digital holography for measuring phase. A separate reference beam is not required, which aids stability. The measurements are based on defocused intensity images, so an expensive high frame rate camera is not needed, and the smaller number of components reduces cost.

#### 1.2. Enhancing resolution with synthetic aperture imaging

Regardless of measurement technique, an important concern in any imaging system is resolution, the ability to resolve fine detail. The objective lens limits the highest spatial frequency that can be captured because it has a finite aperture; the resolution is roughly *λ*/NA. In the frequency domain, the spatial frequencies are filtered by a passband which can be described as

$$P(u,v)=\text{circ}\left(\frac{\sqrt{u^{2}+v^{2}}}{\mathrm{NA}/\lambda}\right),\qquad (1)$$

where (*u*, *v*) = (*x*/*λf*, *y*/*λf*) are the spatial frequency coordinates at the back focal plane and *f* is the focal length of the objective lens.
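A minimal numerical sketch of this low-pass action (our own illustration; the grid size `n` and pixel pitch `dx` are arbitrary assumptions): the spectrum is multiplied by a circ passband of radius NA/*λ*.

```python
import numpy as np

# Illustrative pupil filtering (arbitrary grid; not the paper's code).
# The objective passes only spatial frequencies inside a circle of radius
# NA/lambda, which sets the resolution limit discussed in the text.
wavelength = 633e-9          # HeNe wavelength [m]
NA = 0.75                    # objective numerical aperture
n = 256                      # grid size (assumption)
dx = 0.2e-6                  # sample-plane pixel pitch [m] (assumption)

u = np.fft.fftfreq(n, d=dx)                    # spatial frequencies [1/m]
U, V = np.meshgrid(u, u, indexing="ij")
passband = np.hypot(U, V) <= NA / wavelength   # circ passband of Eq. (1)

field = np.ones((n, n), dtype=complex)         # stand-in sample field
filtered = np.fft.ifft2(np.fft.fft2(field) * passband)
```
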

We could improve resolution if we could capture the spatial frequencies cut off by the lens aperture. To this end, consider a plane wave at oblique incidence to a sample,

$${U}_{i}^{k}(x,y,z)=\text{exp}\left[j2\pi \left({v}_{x}^{k}x+{v}_{y}^{k}y+{v}_{z}^{k}z\right)\right],\qquad (2)$$

where we let *θ_{k}* denote the angle of illumination corresponding to wavevector $2\pi \left({v}_{x}^{k},{v}_{y}^{k},{v}_{z}^{k}\right)$. The field at the back focal plane becomes [7]

$${U}_{f}^{k}(u,v)=P(u,v)\,T\left(u-{v}_{x}^{k},\,v-{v}_{y}^{k}\right),\qquad (3)$$

where *T*(*u*, *v*) is the Fourier transform of the sample transmittance *t*(*x*, *y*), and we use Eqs. (1) and (2). This expression describes the Fourier transform of the field located at the back focal plane (BFP) of the objective. Figure 2 depicts the BFP inside the objective, as the plane where light comes to a focus. The obliquely incident plane wave causes the passband to cover a different portion of frequency space. Summing the fields covered by these shifted passbands yields the synthesized field with an enlarged passband:

$${U}_{f}^{s}(u,v)=\sum _{k}{W}^{k}(u,v)\,{U}_{f}^{k}(u,v),\qquad (4)$$

where *W^{k}*(*u*, *v*) is an angle dependent window function that selects the portion of the spectrum to be added; Section 2.2 describes *W^{k}*(*u*, *v*) in more detail. We note that taking a direct sum by omitting *W^{k}*(*u*, *v*) is not entirely accurate, since a direct sum would place too much emphasis on the low frequencies and introduce phase aberrations [12]. In effect we are synthetically increasing the numerical aperture of the lens; for this reason researchers refer to this technique as *synthetic aperture imaging*. As a consequence, the imaging system rejects out-of-focus diffraction noise: the resulting image looks cleaner, and techniques like numerical refocusing or differential interference contrast can be implemented digitally [6]. We note that another application, tomographic phase microscopy, also uses the concept of illuminating the sample at oblique angles [13, 14], and it is a possible future extension of this work.
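The windowed sum over angles can be written compactly in code (a generic sketch of our own; the window functions themselves are defined in Section 2.2):

```python
import numpy as np

# Illustrative sketch of the synthesized-spectrum sum: the synthesized
# spectrum is a windowed sum of the spectra retrieved at each angle k.
def synthesize_spectrum(spectra, windows):
    """spectra, windows: equal-length lists of same-shaped 2-D arrays."""
    U_s = np.zeros_like(spectra[0], dtype=complex)
    for U_k, W_k in zip(spectra, windows):
        U_s += W_k * U_k
    return U_s
```
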

## 2. Phase retrieval algorithm

#### 2.1. General steps

Phase retrieval is a non-interferometric way of measuring phase. The basic idea is to measure a sequence of defocused intensity images *I*_{1},...,*I*_{N} at *N* planes and then process these images to extract the phase encoded in the defocus. Different algorithms can process the defocused images to produce a phase image. Deterministic phase retrieval uses a closed form relation between intensity and phase: solving a Poisson-type equation yields phase [15–19]. However, this equation relies on the paraxial approximation [15]. In our experiment, we illuminate the sample at oblique angles with plane waves of the form in Eq. (2), which in general do not satisfy the paraxial approximation.

Another class of algorithms is iterative. We can view the intensity measurements as constraints and seek a complex field satisfying those constraints [20, 21]. We apply an iterative algorithm similar to the single beam multiple intensity reconstruction (SBMIR) technique [22–26]. The general steps are:

1. In the first plane, let the complex amplitude ${U}_{1}=\sqrt{{I}_{1}}$ with a phase of *ϕ*_{1}(*x*, *y*) = 0. Set *n* = 1.
2. Numerically propagate the complex amplitude *U*_{n} at the previous plane to the next plane. Extract the phase *ϕ*_{n+1}(*x*, *y*).
3. Take the updated complex amplitude as ${U}_{n+1}=\sqrt{{I}_{n+1}}\text{exp}[j{\varphi}_{n+1}(x,y)]$. Increment *n* by 1.
4. Go to step 2. Iterate until the last plane is reached.

The above steps describe how to retrieve phase for one angle of illumination. The idea behind synthetic aperture imaging is to measure phase at multiple angles of illumination. Our proposal is to implement synthetic aperture imaging using phase retrieval. For each angle of illumination, we apply the above procedure to calculate phase. The combination of intensity and phase completely describes the complex field. Then the synthesized field is calculated by summing the complex fields at each angle with the background phases set equal, as described by Eq. (4). The *synthesized phase* is the phase of the synthesized field.

By implementing synthetic aperture imaging using phase retrieval, we hope to obtain a more compact experimental setup that has fewer parts, smaller expenses, and more stability. To concisely describe this idea, we refer to it as *synthetic aperture phase retrieval*.

We summarize the synthetic aperture phase retrieval procedure as follows:

1. Select an angle *θ*_{k}.
2. Apply the iterative phase retrieval algorithm given above for a single angle. Retrieve the spectrum ${U}_{f}^{k}(u,v)$ (Eq. (3)).
3. Add the windowed spectrum into the synthesized spectrum ${U}_{f}^{s}(u,v)$ according to Eq. (4).
4. Go to step 1. Repeat until all angles are measured.
5. Take the inverse Fourier transform of ${U}_{f}^{s}(u,v)$ to obtain the synthesized field.
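This procedure translates into a simple driver loop (schematic only; `measure_stack`, `sbmir_retrieve`, and `window` are hypothetical stand-ins for the acquisition, the Section 2.1 iteration, and the Section 2.2 windows):

```python
import numpy as np

# Schematic driver for synthetic aperture phase retrieval. The callables
# passed in are hypothetical placeholders, not functions from the paper.
def synthetic_aperture_phase_retrieval(angles, measure_stack, sbmir_retrieve,
                                       window, dz, wavelength, dx):
    U_s = None
    for k, theta in enumerate(angles):                    # step 1: select angle
        stack = measure_stack(theta)                      # defocused images
        U_k = sbmir_retrieve(stack, dz, wavelength, dx)   # step 2: retrieve field
        spectrum = np.fft.fft2(U_k)                       # U_f^k(u, v), Eq. (3)
        term = window(k, spectrum.shape) * spectrum       # windowed, Eq. (4)
        U_s = term if U_s is None else U_s + term
    return np.fft.ifft2(U_s)                              # synthesized field
```
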

#### 2.2. Stitching of the synthesized spectrum

Here we describe the angle dependent window function *W^{k}*(*u*, *v*) from Eq. (4). A simple sum of the retrieved spectra from each angle of illumination would place too much weight on the low frequencies and introduce phase aberrations [12]. To avoid these effects, we filter the retrieved spectra with window functions *W^{k}*(*u*, *v*). The intuition is that at each angle, the passband shifts to cover a different part of the spectrum, as Eq. (3) describes, and we would like to capture the general part of the spectrum being measured at each angle. To implement this idea, we partition the synthesized spectrum into angular regions, following the example presented in [12]. Figure 1 shows schematics of how the spectrum can be partitioned. The basic idea is then to stitch together the synthesized spectrum from the retrieved spectra at each angle of illumination. We illustrate the procedure in the following examples.

Consider the example of measured spectra at 5 total angles. In our experiment we scan the back focal plane of the condenser lens (i.e., the focal plane to the left of the condenser lens, as depicted in Fig. 2) to illuminate the sample at different angles; more details are given in Section 3. For convenience, we number the beam positions at the back focal plane in Fig. 1(c). When the beam is at position 0, we retrieve the DC or zero degree spectrum ${U}_{f}^{0}(u,v)$, using the notation of Eq. (3). At position 1, we retrieve the spectrum ${U}_{f}^{1}(u,v)$ of an obliquely illuminated sample; the main lobe of ${U}_{f}^{1}(u,v)$ lies in quadrant 1 of Fig. 1(a). More generally, at position *k*, we retrieve the spectrum ${U}_{f}^{k}(u,v)$ of the sample illuminated at angle *θ*_{k}, and the main lobe of ${U}_{f}^{k}(u,v)$ lies in quadrant *k*. The synthesized spectrum is composed of selected parts of each spectrum ${U}_{f}^{k}(u,v)$: the central part (of radius approximately NA/*λ*) consists of ${U}_{f}^{0}(u,v)$, and outside the central part, quadrant *k* of the synthesized spectrum consists of ${U}_{f}^{k}(u,v)$. The inverse Fourier transform of the synthesized spectrum yields the synthesized field.

From the above description, we can write down the factor *W^{k}*(*u*, *v*) from Eq. (4). For *k* = 0, *W*^{0}(*u*, *v*) is a circ function of radius approximately NA/*λ*, or

$${W}^{0}(u,v)=\text{circ}\left(\frac{\sqrt{u^{2}+v^{2}}}{\mathrm{NA}/\lambda}\right).\qquad (5)$$

For *k* > 0, *W^{k}*(*u*, *v*) is zero everywhere except for a weight of 1 in quadrant *k* outside a radius of approximately NA/*λ*, or

$${W}^{k}(u,v)=\begin{cases}1, & (u,v)\in \text{quadrant }k\ \text{and}\ \sqrt{u^{2}+v^{2}}>\mathrm{NA}/\lambda,\\ 0, & \text{otherwise.}\end{cases}\qquad (6)$$
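On a discrete frequency grid, the 5-angle windows can be realized as follows (our own reconstruction; `n` and `dx` are illustrative grid parameters, and the half-open quadrant boundaries are a convention we chose so that the windows tile the plane without overlap):

```python
import numpy as np

# Window functions for the 5-angle case (illustrative reconstruction):
# W^0 is the central disc of radius NA/lambda; W^1..W^4 each cover one
# quadrant outside that disc.
def windows_5(n, dx, wavelength=633e-9, NA=0.75):
    f = np.fft.fftfreq(n, d=dx)
    U, V = np.meshgrid(f, f, indexing="ij")
    inside = np.hypot(U, V) <= NA / wavelength        # central disc
    W = [inside.astype(float)]                        # W^0
    quads = [(U > 0) & (V >= 0), (U <= 0) & (V > 0),  # quadrants 1..4,
             (U < 0) & (V <= 0), (U >= 0) & (V < 0)]  # half-open boundaries
    for q in quads:                                   # W^k, k = 1..4
        W.append((q & ~inside).astype(float))
    return W
```

With these conventions the five windows sum to one everywhere, so every spatial frequency is taken from exactly one retrieved spectrum.
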

In the case of measured spectra at 9 total angles, we number the beam positions at the back focal plane of the condenser lens in Fig. 1(d). The octants in Fig. 1(b) are numbered similarly; octant 1 contains the positive *u*-axis, and octant 3 contains the positive *v*-axis (numbers are not shown in the figure because of space). When the beam is at position *k*, we retrieve ${U}_{f}^{k}(u,v)$ for the sample illuminated at angle *θ*_{k}, and the main lobe of ${U}_{f}^{k}(u,v)$ lies in octant *k*. The central part of the synthesized spectrum (of radius approximately NA/*λ*) consists of ${U}_{f}^{0}(u,v)$. Outside the central part, octant *k* of the synthesized spectrum consists of the weighted sum $\frac{1}{2}{U}_{f}^{k}(u,v)+\frac{1}{4}{U}_{f}^{k-1}(u,v)+\frac{1}{4}{U}_{f}^{k+1}(u,v)$, where the indices *k*, *k* − 1, and *k* + 1 fall in the range 1,...,8. Here we choose the spectrum retrieved at angle *θ*_{k} to have the largest weighting of 1/2, while the spectra retrieved from the neighboring angles have equal weightings of 1/4; other weightings can be chosen. Finally, the inverse Fourier transform of the synthesized spectrum yields the synthesized field.

We can also describe *W^{k}*(*u*, *v*) for this case of 9 total angles. For *k* = 0, Eq. (5) describes *W*^{0}(*u*, *v*). For *k* > 0, *W^{k}*(*u*, *v*) is zero everywhere except for weights of 1/2 in octant *k* and 1/4 in octants *k* − 1 and *k* + 1, all outside a radius of approximately NA/*λ*, or

$${W}^{k}(u,v)=\begin{cases}1/2, & (u,v)\in \text{octant }k\ \text{and}\ \sqrt{u^{2}+v^{2}}>\mathrm{NA}/\lambda,\\ 1/4, & (u,v)\in \text{octant }k-1\ \text{or}\ k+1\ \text{and}\ \sqrt{u^{2}+v^{2}}>\mathrm{NA}/\lambda,\\ 0, & \text{otherwise,}\end{cases}\qquad (7)$$

where the indices *k*, *k* − 1, and *k* + 1 fall in the range 1,...,8, as noted previously. To be fully rigorous, the index *k* − 1 should be replaced with (*k* − 2) mod 8 + 1, so that when *k* = 1, the previous index is 8. Similarly, the index *k* + 1 should be replaced with *k* mod 8 + 1, so that when *k* = 8, the next index is 1. However, for simplicity, we use the notation in Eq. (7).
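The octant windows, including the wraparound index arithmetic just described, can be sketched as follows (our own reconstruction; `n` and `dx` are illustrative grid parameters, and the octant-boundary convention is ours):

```python
import numpy as np

# Octant windows for the 9-angle case (illustrative reconstruction).
# Octant 1 contains the positive u-axis and octants increase counter-clockwise,
# so octant 3 contains the positive v-axis, as in the text.
def window_octants(n, dx, k, wavelength=633e-9, NA=0.75):
    f = np.fft.fftfreq(n, d=dx)
    U, V = np.meshgrid(f, f, indexing="ij")
    outside = np.hypot(U, V) > NA / wavelength
    # octant index 1..8 from the polar angle of (u, v)
    octant = (np.floor(np.arctan2(V, U) / (np.pi / 4)) % 8 + 1).astype(int)
    prev_k = (k - 2) % 8 + 1                         # wraps so k = 1 -> 8
    next_k = k % 8 + 1                               # wraps so k = 8 -> 1
    W = np.zeros((n, n))
    W[outside & (octant == k)] = 0.5
    W[outside & ((octant == prev_k) | (octant == next_k))] = 0.25
    return W
```

With the 1/2 + 1/4 + 1/4 weighting, the eight windows again sum to one everywhere outside the central disc.
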

## 3. Experimental setup

Figure 2 illustrates the experimental setup for synthetic aperture phase retrieval. A helium-neon (HeNe, *λ* = 633 nm) laser serves as the illumination source. Mirror M1 is a motorized gimbal mount, which operates under computer control. It steers the beam at different angles to provide oblique illumination at the sample. After traveling through the condenser lens (Abbe condenser, 1.25 NA) and objective lens (50X, 0.75 NA), the beam is directed through a tube lens and onto a CCD camera, which is mounted on a computer controlled translation stage. We measure a sequence of intensity images by translating the camera along the axial direction. As the stage moves, the images become defocused. From these defocused images, we apply the iterative phase retrieval algorithm to recover phase.

We note that our experiment uses a 0.75 NA objective; it can be extended to use the 1.4 NA in [6]. We also note that very high NA (1.65) objectives exist, but they require special high index immersion oil that evaporates within a few hours and leaves a crystalline residue, as well as special expensive and fragile coverslips [27]. In this work we aim to demonstrate the principle of using phase retrieval to implement synthetic aperture imaging. Our approach enables resolution enhancement without requiring an expensive high NA objective.

The experimental procedure is to first select an angle of illumination by tilting mirror M1. Then a sequence of defocused intensity images is measured by translating the camera. In our experiments, for each angle we measure 15 intensity images at planes separated by 2.1 *μm*, where 2.1 *μm* refers to the sample space, and the planes are symmetric about the focal plane at *z* = 0. We determine the focal plane to be the plane at which the samples look most transparent. More details on how to choose these parameters can be found in [22]. In Fig. 3, we show 5 images for one of our samples, 10 *μm* polystyrene beads (*n* = 1.587) immersed in oil (*n* = 1.515). For a given angle of illumination, it takes about 7 seconds to record 15 intensity images. We note that it should be possible to process these images in real time using a recently developed Kalman filtering algorithm [28]. We also note that electronically controllable, variable focus lenses can defocus the image in place of translating the camera, potentially speeding acquisition time [29].

To provide oblique illumination at the sample, the beam scans the back focal plane of the condenser lens, as illustrated in Fig. 4. To clarify, Fig. 2 depicts the back focal plane (BFP) of the condenser lens to the left of the condenser lens, as the plane where light comes to a focus. Thus Fig. 4 portrays focal spots at the BFP. Note at the sample, the beam stays centered and collimated at all angles.

For nonzero degree illumination, the camera translation is no longer parallel to the beam direction. As a result, the images move transversely as the camera moves axially. Consequently the images need to be registered. We perform this registration by creating an interference pattern at the camera plane and measuring the beam angle from the fringes. The procedure is described as follows. A beamsplitter taps off a reference beam before the sample. The sample beam is the beam exiting the tube lens. Another beamsplitter combines the reference beam and the sample beam to create an interference pattern at the camera. From the resulting fringes, the angle of the beam can be measured, and we use this angle to register our images. Note that we use the interferometer only for calibration purposes. Once we measure the illumination angle for each control signal sent to tilt mirror M1, the interferometer can be removed from the setup. In Fig. 5 we show 5 images taken at 12.3 degrees illumination for one of our samples, 10 *μm* polystyrene beads, and these images are shown after being registered.
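A back-of-envelope estimate shows why registration is needed (our own calculation, not the paper's fringe calibration): in sample space the image shifts transversely by roughly Δ*z* tan *θ* per defocus plane.

```python
import math

# Rough transverse image shift per defocus plane at oblique illumination
# (our own estimate; the paper measures the actual angle from fringes).
theta = math.radians(12.3)     # largest illumination angle in the experiment
dz = 2.1e-6                    # plane separation in sample space [m]
shift = dz * math.tan(theta)   # transverse shift per plane [m]
print(f"{shift * 1e6:.2f} um")   # ~0.46 um per plane, a noticeable drift
```
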

As an example of the angular scanning procedure, Fig. 4(b) depicts 4 angles measured around the periphery of the back focal plane, in addition to zero degree illumination (the center dot). In our current configuration, we can illuminate the sample at angles up to 12.3 degrees, measured by the fringe analysis described earlier. We scan the periphery of the back focal plane in an approximate circle so that the largest angle of illumination is 12.3 degrees.

## 4. Phase measurements

In this section we present two examples of phase measurements to highlight the reduction in diffraction noise and the enhancement of resolution enabled by synthetic aperture imaging. The first experiment uses 10 *μm* polystyrene beads, both to illustrate our experimental procedure from Section 3 and to demonstrate reduction in diffraction noise. These beads have a clearly defined circular shape, and we also show that aperture synthesis enhances this circular profile. In the second experiment, we image small (< 1 *μm*) dust particles on a glass slide. Due to the smaller feature sizes, we obtain clearer evidence of resolution enhancement.

#### 4.1. Experiment 1: Polystyrene beads

In our first experiment, we image 10 *μm* polystyrene beads (*n* = 1.587) immersed in oil (*n* = 1.515). The simplest case occurs when we only measure a phase image at zero degree illumination; in other words, we do not perform synthetic aperture imaging. We perform phase retrieval at zero degrees according to the procedure described in Section 3. Figure 6 shows the resulting phase image. We can roughly calculate the expected peak phase shift as Δ*ϕ* = (2*π/λ*)Δ*z*(*n*_{bead} − *n*_{bkg}) = 6.97 rad, where Δ*z* = 9.75*μm* according to the manufacturer’s nominal specifications, *n*_{bead} = 1.587, and *n*_{bkg} = 1.515. We estimate the measured peak phase shift by averaging the phase within the black circle in Fig. 6(b); the estimated shift is Δ*ϕ* = 6.42 rad, which is reasonably close to the refractive index calculations. The phase retrieval algorithm outputs wrapped phase, and to aid visualization, we also unwrap the results. For phase unwrapping we implement the algorithm presented in [30].
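The expected peak phase shift quoted above follows directly from the stated parameters:

```python
import math

# Reproducing the expected peak phase shift from the quoted parameters.
wavelength = 633e-9            # HeNe wavelength [m]
dz = 9.75e-6                   # nominal bead height [m]
n_bead, n_bkg = 1.587, 1.515   # bead and immersion oil refractive indices
dphi = (2 * math.pi / wavelength) * dz * (n_bead - n_bkg)
print(f"{dphi:.2f} rad")       # 6.97 rad, as quoted in the text
```
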

For reference, we also use off-axis interferometry to measure a phase image at zero degree illumination. We tap off a portion of the input beam before the sample as a reference beam, as described in Section 3. Figure 7 illustrates the off-axis results. We estimate the peak phase shift by averaging the phase within the black circle in Fig. 7(b), obtaining Δ*ϕ* = 7.57 rad. However, the image is very noisy, so the phase values are not entirely accurate. More importantly, this figure provides a comparison with the phase retrieval result in Fig. 6. In both figures, the circular profiles of the beads look jagged. In addition, the spaces between the beads, highlighted by dashed black circles in Figs. 6(b) and 7(b), appear partially blended with the beads. We hope to enhance the circular profile of the beads and the resolution between the beads by using synthetic aperture imaging.

We choose this sample of clumped beads because it exhibits clear diffraction noise. For example, from the defocused intensity images in Fig. 3, we can see that the diffraction pattern from each individual bead interferes with those of the other beads. Diffraction noise appears in Fig. 6 as diffraction rings and speckle-like patterns inside the beads, seen most clearly in the wrapped phase image. Similarly, we can discern some speckle-like patterns inside the beads in Fig. 7. We hope to reduce this diffraction noise with synthetic aperture imaging.

Next we perform synthetic aperture imaging by illuminating the sample at multiple angles. We examine the synthesized phase images for the different numbers of angles listed in Table 1. Figure 8 shows a phase image captured at 11.0 degrees illumination. To aid visualization, we also unwrap the phase. Although there are some unwrapping errors, the figure illustrates the basic idea. We note that synthetic aperture imaging does not require phase unwrapping; the spectra are stitched together as described in Section 2.2. Since the beam is oblique to the camera, the beads appear slightly elongated [6, 13]. As in the case of zero degree illumination, speckle-like patterns inside the beads and diffraction rings appear. To the best of our knowledge, this work is the first experimental demonstration of phase retrieval on an obliquely illuminated sample.

For test case 2, we measure a total of 5 angles, as noted in Table 1. We perform synthetic aperture imaging by adding the angular spectra according to Eq. (4), together with the window functions from Eqs. (5) and (6). The resulting synthesized phase is illustrated in Fig. 9. We see that diffraction noise features, such as the diffraction rings and the speckle-like patterns inside the beads, have been reduced compared to test case 1, which can be seen most clearly from the wrapped phase images.

For test case 3, we measure a total of 9 angles, as noted in Table 1. The angular spectra are added according to Eq. (4), together with the window functions from Eqs. (5) and (7). Figure 10 shows the synthesized phase. The diffraction noise features (diffraction rings and speckle-like patterns inside the beads) are reduced further than in test case 2. As more angles are measured, the passband in Eq. (1) shifts to cover different portions of frequency space, which yields the different angular fields in Eq. (3); when these angular fields are summed in Eq. (4), the synthesized field contains an enlarged passband. Hence, we would expect this improvement in diffraction noise with more angles. In addition, the circular shape of the beads becomes progressively less jagged as more angles are measured, as Figs. 6(a), 9(a), and 10(a) show. We also notice the spaces between the beads become increasingly distinguishable as more angles are added; these spaces are highlighted by dashed black circles in Figs. 6(b), 9(b), and 10(b).

Next we examine a quantitative comparison of the DC case (test case 1) and the synthetic aperture cases (test cases 2 and 3). We measure the phase speckle noise ${\sigma}_{s}^{2}$ inside the bead (dashed circle) for Fig. 6(a) (${\sigma}_{s}^{2}$ = 8.30 × 10^{−2} rad^{2}) and for Figs. 9(a) and 10(a) (${\sigma}_{s}^{2}$ = 3.36 × 10^{−2} and 1.49 × 10^{−2} rad^{2}, respectively); the phase speckle noise is reduced by 59% and 82%, respectively, compared to the DC case. We also measure the background noise ${\sigma}_{b}^{2}$ inside the dashed rectangle for Fig. 6(a) (${\sigma}_{b}^{2}$ = 1.75 × 10^{−2} rad^{2}) and for Figs. 9(a) and 10(a) (${\sigma}_{b}^{2}$ = 1.51 × 10^{−2} and 7.68 × 10^{−3} rad^{2}, respectively); the background noise is reduced by 14% and 56%, respectively, compared to the DC case. Table 2 summarizes these results. We see that adding more angles reduces the diffraction noise.
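The quoted reduction percentages follow from the variance values by simple arithmetic:

```python
# Arithmetic check of the noise reductions quoted above (Table 2 values).
speckle = {"dc": 8.30e-2, "5 angles": 3.36e-2, "9 angles": 1.49e-2}     # rad^2
background = {"dc": 1.75e-2, "5 angles": 1.51e-2, "9 angles": 7.68e-3}  # rad^2

def reduction_pct(dc, val):
    """Percentage reduction of val relative to the DC value."""
    return 100.0 * (1.0 - val / dc)

for name, table in (("speckle", speckle), ("background", background)):
    for case in ("5 angles", "9 angles"):
        print(name, case, f"{reduction_pct(table['dc'], table[case]):.1f}%")
# speckle: 59.5% and 82.0%; background: 13.7% and 56.1%
# (59%, 82%, 14%, 56% after rounding, as quoted in the text)
```
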

#### 4.2. Experiment 2: Particles on a glass slide

In the second experiment, we image small (< 1 *μm*) dust particles on a glass slide. Similar to the demonstration of resolution enhancement in [12], we examine particles that are close to the resolution limit of the imaging system, with the goal of showing better differentiation between particles with aperture synthesis. A total of 9 angles are measured, and the retrieved spectra from each angle are added according to Eq. (4), together with the window functions from Eqs. (5) and (7).

To demonstrate resolution enhancement, we compare the DC or zero degree phase in Fig. 11(a) with the synthesized phase in Fig. 11(b). In general, the synthesized phase looks sharper. We highlight two pairs of particles that are hard to distinguish with dashed white circles. To facilitate comparison, we plot line profiles along the lines shown in insets A and B. As the profiles in Figs. 11(c) and 11(d) indicate, the synthesized phase shows increased contrast between the two particles, allowing them to be better differentiated.

We can estimate the resolution of the setup from the line profiles in Fig. 11. In general, the spatial resolution is determined by the NA and wavelength as

$$\delta =\kappa \frac{\lambda}{\mathrm{NA}},\qquad (8)$$

where the factor *κ* depends on experimental parameters such as the signal to noise ratio of the detector [12]. For our experiment, *λ* = 633 nm and NA = 0.75. We estimate the resolution as *δ* = 1.59 *μm* from Fig. 11(c), where two particles look close to the resolution limit for the DC phase. Using aperture synthesis, we expect this resolution to improve to

$$\delta =\kappa \frac{\lambda}{\mathrm{NA}+\text{sin}\,{\theta}_{\text{illum}}},\qquad (9)$$

where *θ*_{illum} is the angle of illumination used for the synthetic aperture. We take *θ*_{illum} = 12.3 degrees, which is the largest angle of illumination for our experiment. From Eq. (9), we expect the resolution to improve to *δ* = 1.24 *μm*. Indeed, Fig. 11(d) shows two particles separated by 1.36 *μm* which become distinguishable after aperture synthesis.
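The numbers above are mutually consistent; solving Eq. (8) for *κ* from the measured DC resolution and substituting into Eq. (9) reproduces the predicted value:

```python
import math

# Reproducing the resolution estimate from the quoted numbers:
# kappa is solved from the measured DC resolution via Eq. (8).
wavelength = 633e-9
NA = 0.75
delta_dc = 1.59e-6                          # DC resolution from Fig. 11(c) [m]
kappa = delta_dc * NA / wavelength          # Eq. (8) solved for kappa (~1.88)
theta = math.radians(12.3)                  # largest illumination angle
delta_syn = kappa * wavelength / (NA + math.sin(theta))   # Eq. (9)
print(f"{delta_syn * 1e6:.2f} um")          # 1.24 um, matching the text
```
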

We also measure the reduction in diffraction noise, as we did for the polystyrene beads in Table 2. Inside the dashed white rectangles in Figs. 11(a) and 11(b), we calculate the background noise for the DC phase (${\sigma}_{b}^{2}$ = 2.05 × 10^{−2} rad^{2}) and the synthesized phase (${\sigma}_{b}^{2}$ = 9.12 × 10^{−3} rad^{2}); the background noise is reduced by 55%. As a consequence, the diffraction rings near the bottom of Fig. 11(a) disappear, enabling other particles to be better distinguished.

## 5. Conclusion

We have demonstrated the principle of using phase retrieval to implement synthetic aperture imaging. Synthetic aperture imaging reduces diffraction noise and enhances resolution by effectively increasing the numerical aperture. Previously this technique has relied on digital holography for phase measurements. By obviating the need for a reference arm, this approach provides a more compact, less expensive, and more stable setup. Our demonstration paves the way for other applications such as tomographic phase microscopy to be enabled by phase retrieval.

## Acknowledgments

We thank Dr. Daniel E. Leaird for providing technical feedback and for machining parts to build the experiment. We are grateful to Jingjing Liu and Delong Zhang for help in acquiring and preparing the test samples. We acknowledge the anonymous reviewers for their constructive criticisms which improved the manuscript. In particular, we thank reviewer 2 for suggesting the idea of stitching selected parts of the spectrum for aperture synthesis. We thank Professor James Krogmeier, Dr. Alexei Lagoutchev, Joseph Lukens, A.J. Metcalf, Jian Wang, and Dr. Ping Wang for helpful discussions and suggestions. This work was supported in part by the National Science Foundation under grant ECCS-1102110.

## References and links

**1. **H. G. Davies and M. H. F. Wilkins, “Interference microscopy and mass determination,” Nature **169**, 541 (1952). [CrossRef] [PubMed]

**2. **B. Rappaz, P. Marquet, E. Cuche, Y. Emery, C. Depeursinge, and P. Magistretti, “Measurement of the integral refractive index and dynamic cell morphometry of living cells with digital holographic microscopy,” Opt. Express **13**, 9361–9373 (2005). [CrossRef] [PubMed]

**3. **P. Marquet, B. Rappaz, P. J. Magistretti, E. Cuche, Y. Emery, T. Colomb, and C. Depeursinge, “Digital holographic microscopy: a noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy,” Opt. Lett. **30**, 468–470 (2005). [CrossRef] [PubMed]

**4. **G. Popescu, T. Ikeda, R. R. Dasari, and M. S. Feld, “Diffraction phase microscopy for quantifying cell structure and dynamics,” Opt. Lett. **31**, 775–777(2006). [CrossRef] [PubMed]

**5. **A. Barty, K. A. Nugent, D. Paganin, and A. Roberts, “Quantitative optical phase microscopy,” Opt. Lett. **23**, 817–819 (1998). [CrossRef]

**6. **M. Kim, Y. Choi, C. Fang-Yen, Y. Sung, K. Kim, R. R. Dasari, M. S. Feld, and W. Choi, “Three-dimensional differential interference contrast microscopy using synthetic aperture imaging,” J. Biomed. Opt. **17**(2), 026003 (2012). [CrossRef] [PubMed]

**7. **J. W. Goodman, *Introduction to Fourier Optics* (McGraw-Hill, 1996).

**8. **U. Schnars and W. Juptner, “Direct recording of holograms by a CCD target and numerical reconstruction,” Appl. Opt. **33**(2), 179–181 (1994). [CrossRef] [PubMed]

**9. **E. Cuche, P. Marquet, and C. Depeursinge, “Simultaneous amplitude-contrast and quantitative phase-contrast microscopy by numerical reconstruction of Fresnel off-axis holograms,” Appl. Opt. **38**, 6994–7001 (1999). [CrossRef]

**10. **I. Yamaguchi and T. Zhang, “Phase-shifting digital holography,” Opt. Lett. **22**, 1268–1270 (1997). [CrossRef] [PubMed]

**11. **I. Yamaguchi, J. Kato, S. Ohta, and J. Mizuno, “Image formation in phase-shifting digital holography and applications to microscopy,” Appl. Opt. **40**(34) 6177–6186 (2001). [CrossRef]

**12. **P. Gao, G. Pedrini, and W. Osten, “Structured illumination for resolution enhancement and autofocusing in digital holographic microscopy,” Opt. Lett. **38**, 1328–1330 (2013). [CrossRef] [PubMed]

**13. **W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, R. R. Dasari, and M. S. Feld, “Tomographic phase microscopy,” Nat. Methods **4**(9), 717–719 (2007). [CrossRef] [PubMed]

**14. **Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. Dasari, and M. Feld, “Optical diffraction tomography for high resolution live cell imaging,” Opt. Express **17**, 266–277 (2009). [CrossRef] [PubMed]

**15. **M. R. Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. A **73**(11), 1434–1441 (1983). [CrossRef]

**16. **N. Streibl, “Phase imaging by the transport equation of intensity,” Opt. Commun. **49**(1), 6–10 (1984). [CrossRef]

**17. **L. Allen and M. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun. **199**, 65–75 (2001). [CrossRef]

**18. **L. Waller, L. Tian, and G. Barbastathis, “Transport of intensity phase-amplitude imaging with higher-order intensity derivatives,” Opt. Express **18**(12), 12552–12561 (2010). [CrossRef] [PubMed]

**19. **L. Waller, S. S. Kou, C. J. R. Sheppard, and G. Barbastathis, “Phase from chromatic aberrations,” Opt. Express **18**(22), 22817–22825 (2010). [CrossRef] [PubMed]

**20. **R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of phase from image and diffraction plane pictures,” Optik **35**, 227–246 (1972).

**21. **J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. **21**, 2758–2769 (1982). [CrossRef] [PubMed]

**22. **G. Pedrini, W. Osten, and H. Tiziani, “Wave-front reconstruction from a sequence of interferograms recorded at different planes,” Opt. Lett. **30**(8), 833–835(2005). [CrossRef] [PubMed]

**23. **P. Almoro, G. Pedrini, and W. Osten, “Complete wavefront reconstruction using sequential intensity measurements of a volume speckle field,” Appl. Opt. **45**(34), 8596–8605 (2006). [CrossRef] [PubMed]

**24. **Y. Zhang, G. Pedrini, W. Osten, and H. Tiziani, “Whole optical wave field reconstruction from double or multi in-line holograms by phase retrieval algorithm,” Opt. Express **11**, 3234–3241 (2003). [CrossRef] [PubMed]

**25. **P. Almoro and S. Hanson, “Wavefront sensing using speckles with fringe compensation,” Opt. Express **16**, 7608–7618 (2008). [CrossRef] [PubMed]

**26. **A. Anand, V. K. Chhaniwal, P. Almoro, G. Pedrini, and W. Osten, “Shape and deformation measurements of 3D objects using volume speckle field and phase retrieval,” Opt. Lett. **34**(10), 1522–1524 (2009). [CrossRef] [PubMed]

**27. **D. Axelrod, “Selective imaging of surface fluorescence with very high aperture microscope objectives,” J. Biomed. Opt. **6**, 6–13 (2001). [CrossRef] [PubMed]

**28. **Z. Jingshan, J. Dauwels, M. Vazquez, and L. Waller, “Sparse ACEKF for phase reconstruction,” Opt. Express **21**, 18125–18137 (2013). [CrossRef] [PubMed]

**29. **R. D. Niederriter, A. M. Watson, R. N. Zahreddine, C. J. Cogswell, R. H. Cormack, V. M. Bright, and J. T. Gopinath, “Electrowetting lenses for compensating phase and curvature distortion in arrayed laser systems,” Appl. Opt. **52**, 3172 (2013). [CrossRef] [PubMed]

**30. **M. Herraez, D. Burton, M. Lalor, and M. Gdeisat, “Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path,” Appl. Opt. **41**, 7437–7444 (2002). [CrossRef] [PubMed]