## Abstract

We show that phase objects may be computed accurately from a single color image in a brightfield microscope, with no hardware modification. Our technique uses the chromatic aberration that is inherent to every lens-based imaging system as a phase contrast mechanism. This leads to a simple and inexpensive way of achieving single-shot quantitative phase recovery by a modified Transport of Intensity Equation (TIE) solution, allowing real-time phase imaging in a traditional microscope.

© 2010 OSA

## 1. Introduction

Light is a wave, having both an amplitude and a phase; however, optical phase is too fast to be detected directly. Thus, phase differences must be computed from intensity measurements. Phase changes carry important information about an object’s shape and refractive index and are important in surface profiling, adaptive optics and particularly biomedical visualization, where samples are often transparent. Phase objects have been observed since microscopists in the early 1900s recognized that defocusing an image of a transparent object, such as a cell, renders visible previously invisible structure variations [1]. Indeed, in-focus imaging systems (with no aberrations) have a purely real transfer function and thus no phase contrast. Defocus introduces an imaginary component [2], as do most optical aberrations [3], converting some phase information into intensity variations. However, the phase information through such means is neither in-focus nor quantitative, yielding only qualitative descriptions. Phase contrast microscopy [4] solved the problem of providing in-focus phase contrast, but in the resulting images, phase still cannot be separated from absorption and so the method is not quantitative. The invention of the laser enabled many of today’s traditional coherent interference techniques for measuring phase quantitatively with high accuracy and at fast speeds [5–7]. However, interferometric systems are often bulky, suffer unwrapping problems, and are resolution-limited by the coherent diffraction limit. For applications with partially coherent illumination, Shack-Hartmann sensors [8] are popular where spatial resolution is not crucial, and a recent method allows higher resolution [9]. Here, we propose a way to use the chromatic dispersion of an imaging system as a mechanism for obtaining phase directly [10].

When white light interacts with a dielectric medium, different wavelengths propagate at different speeds, giving rise to chromatic dispersion [11]. For years, scientists have worked to reduce the resulting ‘chromatic aberration’ in imaging systems by using long focal lengths, reflective optics or achromatic lenses [12].

Our technique is a variation of Transport of Intensity [13] equation (TIE) imaging, which is able to produce accurate quantitative phase reconstructions with partially coherent light [14], right out to the diffraction limit [15] and without the need for unwrapping. The TIE states that the derivative of intensity with respect to the optical axis, *z*, is related to the phase. In the partially coherent case, the term “phase” refers to a ‘generalized phase’ [14] which is the average phase delay incurred by the spectrally-weighted mean wavelength and direction, in the case of a uniform, circular source [16], such as that of a brightfield microscope. By taking two images at different *z* positions, the generalized phase of a partially coherent incident beam may be recovered [14]. TIE imaging has been applied in many different applications [16–18] and extended to multiple images [19]. Limitations include the requirement for either a beamsplitter with two cameras, or motion between sequential capture of multiple images, slowing down the process. Registration of the images requires careful alignment, as well as ensuring that no tilt is introduced between the images [20].

## 2. Theory

The TIE is derived from the paraxial wave equation after defocus, which is described by Fresnel propagation [21]. Wavelength λ and distance *z* are interchangeable in the propagation equation [22]; in fact, defocus is the product of the two, ξ = λ*z*. Thus, we find (see Appendix):

$$-2\pi \frac{\partial I(x,y)}{\partial \xi }={\nabla}_{\perp}\cdot \left[I(x,y){\nabla}_{\perp}\phi (x,y)\right], \tag{1}$$

where *I(x,y)* is intensity, $\phi (x,y)$ is phase, and ${\nabla}_{\perp}$ is the two-dimensional gradient operator in the lateral dimensions. One implementation of this equation allows a defocused phase to be recovered by taking images with different wavelengths in a single plane of constant *z*, similar to that proposed for X-ray imaging with quantified dispersion [23,24] and diffraction tomography [25]. As in the traditional TIE, strict temporal coherence is not required, and λ may be replaced by the spectrally-weighted mean wavelength when $\phi (x,y)$ is the generalized phase of a partially coherent beam [14]. Thus, the broad color filters that comprise the RGB channels of any color camera may be used to capture the differentially defocused intensity images simultaneously, allowing single-shot phase imaging free of lateral or tilt registration problems. Furthermore, we describe below a method for in-focus phase imaging with optimal contrast by introducing a wavelength-dependent *z*. We assume here a thin phase object with negligible object dispersion.

Given a measurement of ∂*I*/∂ξ, Eq. (1) is solved directly for $\phi (x,y)$ using standard Poisson solvers [26]. Here, we choose a Fourier-domain Poisson solver for its speed (order ${N}^{2}\mathrm{log}N$) and tolerance to boundary condition errors [27]. When amplitude variations or non-uniform illumination are present, obtaining $\phi (x,y)$ from Eq. (1) requires two Poisson solutions, according to the method of Teague [13]. Since the measurement is related to the second derivative of the phase distribution, steep phase gradients will not suffer aliasing problems as easily as in interferometry, where phase ‘wraps’ every 2π. By using parallel FFT processing on a Graphics Processing Unit (GPU), the phase reconstructions can be computed and displayed in real time at camera-limited frame rates [28].
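As a concrete sketch of the reconstruction step, the following minimal Fourier-domain Poisson solver (our illustrative Python, not the authors' GPU code) solves Eq. (1) under the simplifying assumption of uniform in-focus intensity $I_0$, for which the equation reduces to a single Poisson equation $-2\pi \,\partial I/\partial \xi = I_0 {\nabla}_{\perp}^{2}\phi$:

```python
import numpy as np

def tie_phase_uniform(dIdxi, I0, dx):
    """FFT Poisson solve of -2*pi * dI/dxi = I0 * laplacian(phi), i.e. Eq. (1)
    under the simplifying assumption of uniform in-focus intensity I0
    (illustrative sketch, not the authors' code).

    dIdxi : 2-D array, measured intensity derivative w.r.t. xi = lambda*z
    I0    : scalar in-focus intensity
    dx    : pixel pitch (same length units as sqrt(xi))
    """
    ny, nx = dIdxi.shape
    u = np.fft.fftfreq(nx, d=dx)
    v = np.fft.fftfreq(ny, d=dx)
    U, V = np.meshgrid(u, v)
    lap = -4 * np.pi**2 * (U**2 + V**2)   # Fourier symbol of the 2-D Laplacian
    rhs_hat = np.fft.fft2(-2 * np.pi * dIdxi / I0)
    phi_hat = np.zeros_like(rhs_hat)
    nz = lap != 0                         # DC term (mean phase) is unrecoverable
    phi_hat[nz] = rhs_hat[nz] / lap[nz]
    return np.fft.ifft2(phi_hat).real    # zero-mean phase estimate
```

With non-uniform intensity, Teague's two sequential Poisson solves would replace the single solve above; boundary-condition errors are mitigated in practice by windowing or mirror-padding the measurement.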

The phase recovered will be that at the camera plane, yielding a defocused result for constant *z*. We desire an in-focus phase image, where two colors are over and under focused by the same amount, making the result accurate to second order. Below we outline the method for designing such an imaging system with tuneable defocus.

We use either a 4*f* system (see Fig. 1a) or a brightfield microscope with an infinity-corrected objective lens. Refractive imaging systems usually attempt to correct the effects of chromatic aberration with compound achromatic lenses [12]. Here, we refer to the phenomenon as ‘chromatic defocus’ and explicitly utilize it for obtaining phase information. Chromatic defocus has been used previously for depth sectioning in a microscope [29,30].

For a single thin lens, using the lensmaker's formula [11], the shift in focal length *f* at wavelength λ from the design wavelength λ_{0} is

$$\Delta f(\lambda )=f({\lambda}_{0})\frac{n({\lambda}_{0})-n(\lambda )}{n(\lambda )-1}, \tag{2}$$

where *n*(λ_{0}) and *n*(λ) are the lens refractive indices at λ_{0} and λ, respectively. In a 4*f* system designed for λ_{0}, the wavelength-dependent axial focal shift is

$$\Delta f\text{'}(\lambda )={\left(\frac{{f}_{2}({\lambda}_{0})}{{f}_{1}({\lambda}_{0})}\right)}^{2}\Delta {f}_{1}(\lambda )+\Delta {f}_{2}(\lambda ). \tag{3}$$

The defocus of the 4*f* system is thus $\xi (\lambda )=\lambda \cdot \Delta f\text{'}(\lambda )$ and is plotted in Fig. 1b. Note that defocus is nearly linear with wavelength, allowing for a well-centered measurement of the intensity derivative, with green in focus and red and blue defocused by equal and opposite amounts; this centered derivative measurement is accurate to second order. Furthermore, we can choose *f*_{1}(λ_{0}) and *f*_{2}(λ_{0}) (or the dispersion of the lens material) to tune the slope of this curve such that the optimal defocus exists for a given object. It should be noted that the colors also incur slightly different spherical aberration, which should be minimized by using a field of view smaller than the focal length, or specialty gradient-index or diffractive lens systems.
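To give a feel for the magnitude of chromatic defocus, here is an illustrative calculation (not from the paper) of the single-lens focal shift from the lensmaker's-formula result above, using published Sellmeier coefficients for Schott N-BK7 and a hypothetical 200 mm lens designed for 550 nm:

```python
import numpy as np

# Sellmeier coefficients for Schott N-BK7 (wavelength in micrometres)
B = (1.03961212, 0.231792344, 1.01046945)
C = (0.00600069867, 0.0200179144, 103.560653)

def n_bk7(lam_um):
    """Refractive index of N-BK7 from the Sellmeier dispersion formula."""
    lam2 = lam_um**2
    return np.sqrt(1 + sum(b * lam2 / (lam2 - c) for b, c in zip(B, C)))

def delta_f(lam_um, lam0_um=0.55, f0_mm=200.0):
    """Single thin-lens focal shift:
    delta_f = f(lam0) * (n(lam0) - n(lam)) / (n(lam) - 1)."""
    return f0_mm * (n_bk7(lam0_um) - n_bk7(lam_um)) / (n_bk7(lam_um) - 1)

# blue focuses short of, and red beyond, the green (design) focus
for lam, name in [(0.45, "blue"), (0.55, "green"), (0.65, "red")]:
    print(f"{name:5s} {lam*1000:.0f} nm: delta_f = {delta_f(lam):+.3f} mm")
```

The shifts come out on the order of a few millimetres for this hypothetical lens, illustrating why even modest dispersion yields usable defocus between color channels.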

Since we anticipate a key application for this technique to be live cell microscopy of fast dynamics, we look next at a microscope imaging system, where an infinity-corrected objective lens is used in place of a 4*f* system. Our technique will be more accurate with higher spatial coherence [14], achieved by stopping down the condenser aperture; however, opening the condenser aperture will improve diffraction-limited resolution [15].

## 3. Experimental results

A MEMS deformable mirror (DM) array [31] developed for adaptive optics provides a well-characterized dynamic phase object. Using the setup shown in Fig. 2a, the generalized phase of a DM with 16 actuators addressed was imaged in reflection mode (see Fig. 2b-d). The 4*f* system parameters were *f*_{1} = 200 mm and *f*_{2} = 75 mm, and a standard Bayer color camera (Edmund Optics 3112c) was used. We find that the results are not degraded by the shifted sampling in the Bayer pattern, provided that the resolution of the camera exceeds the maximum resolution desired. More expensive 3CCD cameras and Foveon sensors would provide perfect lateral sampling; however, the spectral width of the Foveon color channels is much larger than that of the Bayer filter, giving a more noise-sensitive result. The green color channel is in focus (no phase contrast), while the red and blue channels are under and over focused, respectively (Fig. 2b). From this single color image (Fig. 2c), the solution of Eq. (1) provides a quantitative map of the generalized phase, which is then reinterpreted as height ($height=\phi \lambda /2\pi \Delta n$). The mirror was reconfigured dynamically and the height variations were captured in real-time (see Media 1), demonstrating the capability for fast high-resolution wavefront sensing.
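For reference, the phase-to-height conversion used above is simple enough to state as a one-line helper (our illustrative Python; for reflection off a mirror the double pass gives Δn = 2):

```python
import numpy as np

def phase_to_height(phi, lam, delta_n):
    """Convert recovered phase (radians) to physical height:
    height = phi * lam / (2 * pi * delta_n).
    Reflection off a mirror: delta_n = 2 (double pass in air).
    Transmission: delta_n is the object-to-medium index difference."""
    return phi * lam / (2 * np.pi * delta_n)

# e.g. a pi/2 phase shift at 550 nm, measured in reflection
print(phase_to_height(np.pi / 2, 550.0, 2.0))  # height in nm, approx 68.75
```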

Transmission brightfield microscope images and video are shown in Fig. 3 (Media 2), taken with a Nikon TE2000-U transmission microscope and an achromatic objective (20x, NA = 0.4). The top row shows the captured color images and the bottom row shows the resulting phase maps. The PMMA test object demonstrates the capabilities of our system for reconstruction of sharp surface profile gradients with nanometer-scale accuracy. The live cell samples are adult human dermal microvascular endothelial cells (HMVEC) in EGM-2MV growth medium and live HeLa cells, where results will depend on both refractive index and shape. As expected, the color images show little contrast, except for some color splitting at the sharp edges; however, the small differences between the color channels are enough to recover a robust quantitative phase map.

## 4. Discussion

#### 4.1 Accuracy of phase retrieval

Equation (1) is well-posed and invokes only the paraxial approximation; however, the measurement of the intensity derivative is necessarily a finite difference approximation:

$$\frac{\partial I}{\partial \xi }\approx \frac{I({\xi}_{R})-I({\xi}_{B})+{N}_{R}-{N}_{B}}{\Delta \xi }, \tag{4}$$

where ${N}_{R}$ and ${N}_{B}$ are the noise values in the red and blue color channels, respectively, ${\xi}_{R}$ and ${\xi}_{B}$ are the corresponding defocus parameters, and $\Delta \xi ={\xi}_{R}-{\xi}_{B}$. The derivative measurement becomes unstable at small defocus due to amplification of noise by ${(\Delta \xi )}^{-1}$. Increasing Δξ provides a better signal-to-noise ratio (SNR) in the derivative estimate; however, the linearity assumption inherent to the finite difference approximation is compromised when defocus is large [19], causing nonlinearity error. An example plot of the root mean squared (RMS) error in the recovered phase for a pure-phase test object is shown in Fig. 4, where the algorithm used intensity images having simulated defocus due to varying values of wavelength and distance. Since the error is object-dependent, we have attempted to create a general test object by using uniformly-distributed random phase values within a square boundary (see Fig. 4, inset). The error is due to the nonlinearity in the finite derivative approximation. As expected, error increases with increasing defocus $\Delta \xi =\Delta \lambda \Delta z$, and approaches zero for small defocus in this noise-free case. Thus, the noise floor will limit the accuracy of the system in a nonlinear way. To minimize nonlinearity error, we require $\Delta \xi \le {x}^{2}$, where *x* is the characteristic size of a feature to be reconstructed.
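The ${(\Delta \xi )}^{-1}$ noise amplification can be illustrated with a toy Monte Carlo sketch (hypothetical noise level and slope, not the paper's simulation): two noisy intensity samples straddling focus are differenced, and the standard deviation of the derivative estimate grows as the defocus separation shrinks.

```python
import numpy as np

def derivative_noise_std(delta_xi, sigma=0.01, true_slope=0.5,
                         n=100_000, seed=0):
    """Empirical std of the finite-difference estimate of dI/dxi from two
    noisy intensity samples straddling focus by +/- delta_xi / 2.
    sigma and true_slope are hypothetical per-pixel values for illustration."""
    rng = np.random.default_rng(seed)
    I_R = true_slope * (+delta_xi / 2) + sigma * rng.standard_normal(n)
    I_B = true_slope * (-delta_xi / 2) + sigma * rng.standard_normal(n)
    return ((I_R - I_B) / delta_xi).std()

# the noise in the estimate scales as 1/delta_xi as defocus shrinks
for dxi in (10.0, 1.0, 0.1):
    print(f"delta_xi = {dxi:5.1f}: noise std = {derivative_noise_std(dxi):.4f}")
```

This is only the noise half of the trade-off; the nonlinearity error at large Δξ requires full Fresnel-propagation simulation, as in Fig. 4.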

As verification of the accuracy of our technique, we compare our results to those obtained by a commercial interferometer in Fig. 5. The results are very similar, and we find that with very steep phase gradients the commercial interferometer tends to acquire phase unwrapping artifacts that our technique does not.

#### 4.2 Imaging with achromats

Most microscope objectives are achromatic or apochromatic, meaning that they correct for chromatic defocus at two or more wavelengths. Achromatic objectives, which focus red and blue (but not green) wavelengths to the same position, are suitable when ∂*I*/∂ξ is measured as follows:

$$\frac{\partial I}{\partial \xi }\approx \frac{\frac{1}{2}\left[I({\xi}_{R})+I({\xi}_{B})\right]-I({\xi}_{G})}{\frac{1}{2}({\xi}_{R}+{\xi}_{B})-{\xi}_{G}}, \tag{5}$$

where ${\xi}_{G}$ is the defocus parameter of the green channel, so that the averaged red and blue channels play the role of one defocused image and green the other.

#### 4.3 Limitations

A limitation of our technique is that material dispersion within the object causes wavelength-dependent phase shifts through the object, which could introduce artifacts into the phase result. Reflective objects such as the DM are immune to this effect, and in practice we have not observed it in any of our samples. For example, for the MIT test object we can roughly calculate the expected erroneous phase shift due to material dispersion in PMMA, which has refractive indices at the mean color channel wavelengths of n_{R} = 1.48858, n_{G} = 1.49423 and n_{B} = 1.50019. Thus, the phase shift errors will be approximately 0.4% of the total phase shift, or less than 1 nm for the object in Fig. 3.

For the biological samples, we verify negligible material dispersion by comparing our result with traditional TIE imaging (Fig. 7). The traditional TIE result was obtained from two oppositely defocused images, using data from the green color channel only, taken by moving the sample between captures by a distance giving approximately the same defocus as that which occurs between the color channels. The difference map for the two phase solutions (Fig. 7c) shows no obvious systematic differences where the cells are and is mainly a result of the differential noise between the images, giving a cloudy effect. Some small differences at the edges of the cells could result from imperfect registration between the two intensity images in the traditional TIE reconstruction, or from movement of the cells between the two captures. Thus, material dispersion is not a problem in these results.

Finally, we have assumed throughout this work that the wavefronts associated with the three colors can be approximated as coming from a single point source, so as not to induce significant errors due to multiple coherent modes [32].

## 5. Conclusion

We have proposed and demonstrated a new method for high-resolution quantitative phase imaging that is inexpensive, fast and accurate. Through a mathematical manipulation of the original TIE equation, the mechanical axial defocus is transformed into chromatic aberration. In this way, no physical movement is needed and only a standard color camera is required. Since chromatic aberration is inherent in any refractive imaging system with a broadband source, our technique turns a negative element of the image formation process into a positive and direct means of phase retrieval. Furthermore, quantitative phase can be captured and computed at camera-limited speeds in a conventional brightfield microscope, which has great potential applications in imaging of dynamic and interactive biological processes, as well as for fast, high-resolution wavefront sensing in the field of adaptive optics and large-scale density measurements. In developing this technique, we have opened the door to a new perspective on chromatic dispersion in imaging systems, and described how to use it to gain quantitative information. The proposed treatment turns any conventional optical microscope into a real-time phase imaging device with low cost and no hardware modification.

## Appendix: Derivation of Eq.1

Start with a complex object $\psi (x,y)=A(x,y){e}^{i\phi (x,y)}$, where $A(x,y)$ is amplitude, $\phi (x,y)$ is phase and $x,y$ denote the lateral dimensions. Assuming plane wave illumination, the field after the object is $\psi (x,y)$ and propagates along the optical axis *z* according to the Fresnel propagation kernel [21],

$$\Psi (u,v;\xi )=\Psi (u,v;0)\,{e}^{-i\pi \xi ({u}^{2}+{v}^{2})},$$

where $\Psi (u,v;\xi )$ is the two-dimensional Fourier transform of the field at defocus $\xi =\lambda z$ and $(u,v)$ are the spatial frequency variables conjugate to $(x,y)$. At a defocus of *ξ*, the recorded intensity is

$$I(x,y;\xi )={\left|\iint \Psi (u,v;0)\,{e}^{-i\pi \xi ({u}^{2}+{v}^{2})}{e}^{i2\pi (ux+vy)}\,du\,dv\right|}^{2}.$$

Then, expanding $I(x,y;\xi )$ to first order in ξ and taking into account that multiplication by $i2\pi (u,v)$ in the Fourier domain corresponds to ${\nabla}_{\perp}$ in the spatial domain, we follow a derivation parallel to that of Beleggia [20] and find after some algebra,

$$-2\pi \frac{\partial I(x,y)}{\partial \xi }={\nabla}_{\perp}\cdot \left[I(x,y){\nabla}_{\perp}\phi (x,y)\right]. \tag{1}$$

## Acknowledgements

The authors thank Boston Micromachines, Inc. for use of the DM array, S. Yang for fabrication of the test object, N. Loomis for GPU code, Prof. Roger D. Kamm’s Lab and the Spectroscopy Lab at MIT for cell samples, the Boston University Photonics Centre for use of the commercial interferometer, and the anonymous reviewers for constructive criticism on the original manuscript. We also gratefully acknowledge the Singapore-MIT Alliance for Research and Technology (SMART) Centre for financial support.

## References and links

**1. **F. Zernike, “How I discovered phase contrast,” Science **121**(3141), 345–349 (1955). [CrossRef] [PubMed]

**2. **C. J. R. Sheppard, “Defocused transfer function for a partially coherent microscope and application to phase retrieval,” J. Opt. Soc. Am. A **21**(5), 828–831 (2004). [CrossRef]

**3. **H. Hopkins, “The frequency response of a defocused optical system,” Proc. Royal Soc. London, Ser. A **231**(1184), 91–103 (1955). [CrossRef]

**4. **F. Zernike, “Phase contrast, a new method for the microscopic observation of transparent objects,” Physica **9**(7), 686–698 (1942). [CrossRef]

**5. **P. Marquet, B. Rappaz, P. J. Magistretti, E. Cuche, Y. Emery, T. Colomb, and C. Depeursinge, “Digital holographic microscopy: a noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy,” Opt. Lett. **30**(5), 468–470 (2005). [CrossRef] [PubMed]

**6. **G. Popescu, L. P. Deflores, J. C. Vaughan, K. Badizadegan, H. Iwai, R. R. Dasari, and M. S. Feld, “Fourier phase microscopy for investigation of biological structures and dynamics,” Opt. Lett. **29**(21), 2503–2505 (2004). [CrossRef] [PubMed]

**7. **J. Millerd, N. Brock, J. Hayes, M. North-Morris, B. Kimbrough, and J. Wyant, “Pixelated phase-mask dynamic interferometers,” in *Fringe 2005* (Springer, 2006).

**8. **B. C. Platt and R. Shack, “History and principles of Shack–Hartmann wavefront sensing,” J. Refract. Surg. **17**(5), S573–S577 (2001). [PubMed]

**9. **X. Cui, J. Ren, G. J. Tearney, and C. Yang, “Wavefront image sensor chip,” Opt. Express **18**(16), 16685–16701 (2010). [CrossRef] [PubMed]

**10. **L. Waller, and G. Barbastathis, “Phase from Defocused Color Images,” in Frontiers in Optics, OSA Technical Digest (CD) (Optical Society of America, 2009), paper FThR3.

**11. **M. Born, and E. Wolf, *Principles of Optics* (Cambridge Univ. Press, 1999).

**12. **H. King, *The History of the Telescope* (Dover, 2003).

**13. **M. Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. **73**, 1434–1441 (1983). [CrossRef]

**14. **D. Paganin and K. Nugent, “Noninterferometric phase imaging with partially coherent light,” Phys. Rev. Lett. **80**(12), 2586–2589 (1998). [CrossRef]

**15. **E. D. Barone-Nugent, A. Barty, and K. A. Nugent, “Quantitative phase-amplitude microscopy I: optical microscopy,” J. Microsc. **206**(Pt 3), 194–203 (2002). [CrossRef] [PubMed]

**16. **N. Streibl, “Phase imaging by the transport equation of intensity,” Opt. Commun. **49**(1), 6–10 (1984). [CrossRef]

**17. **L. Waller, Y. Luo, S.-Y. Yang, and G. Barbastathis, “Transport of intensity phase imaging in a volume holographic microscope,” Opt. Lett. **35**(17), 2961–2963 (2010). [CrossRef] [PubMed]

**18. **S. S. Kou, L. Waller, G. Barbastathis, and C. J. Sheppard, “Transport-of-intensity approach to differential interference contrast (TI-DIC) microscopy for quantitative phase imaging,” Opt. Lett. **35**(3), 447–449 (2010). [CrossRef] [PubMed]

**19. **L. Waller, L. Tian, and G. Barbastathis, “Transport of Intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express **18**(12), 12552–12561 (2010). [CrossRef] [PubMed]

**20. **M. Beleggia, M. Schofield, V. Volkov, and Y. Zhu, “On the transport of intensity technique for phase retrieval,” Ultramicroscopy **102**, 37–49 (2004). [CrossRef] [PubMed]

**21. **J. Goodman, *Introduction to Fourier Optics*, (McGraw-Hill, 1996).

**22. **B. Saleh, and M. Teich, *Fundamentals of Photonics* (John Wiley & Sons, 2010).

**23. **T. Gureyev and S. Wilkins, “On X-ray phase retrieval from polychromatic images,” Opt. Commun. **147**, 229–232 (1998) (Erratum: Opt. Commun. **154**, 391). [CrossRef]

**24. **T. E. Gureyev, S. Mayo, S. W. Wilkins, D. Paganin, and A. W. Stevenson, “Quantitative in-line phase-contrast imaging with multienergy X rays,” Phys. Rev. Lett. **86**(25), 5827–5830 (2001). [CrossRef] [PubMed]

**25. **M. A. Anastasio, Q. Xu, and D. Shi, “Multispectral intensity diffraction tomography: single material objects with variable densities,” J. Opt. Soc. Am. A **26**(2), 403–412 (2009). [CrossRef]

**26. **G. Strang, *Computational Science and Engineering* (Wellesley-Cambridge Press, 2010).

**27. **L. Allen and M. Oxley, “Phase retrieval from series of images obtained by defocus variation,” Opt. Commun. **199**(1-4), 65–75 (2001). [CrossRef]

**28. **N. Loomis, L. Waller, G. Barbastathis, “High-speed phase recovery using chromatic transport of intensity computation in graphics processing units,” Proc. Digital Holography meeting of the OSA: JMA7 (2010).

**29. **G. Molesini and F. Quercioli, “Pseudocolor effects of longitudinal chromatic aberration,” J. Opt. (Paris) **17**, 279–282 (1986).

**30. **J. S. Courtney-Pratt and R. L. Gregory, “Microscope with enhanced depth of field and 3-D capability,” Appl. Opt. **12**(10), 2509–2519 (1973). [CrossRef] [PubMed]

**31. **T. Bifano, R. K. Mali, J. K. Dorton, J. A. Perreault, N. Vandelli, M. N. Horenstein, and D. A. Castanon, “Continuous-membrane silicon deformable mirror,” Opt. Eng. **36**, 1354–1360 (1997). [CrossRef]

**32. **A. M. Zysk, R. W. Schoonover, P. S. Carney, and M. A. Anastasio, “Transport of intensity and spectrum for partially coherent fields,” Opt. Lett. **35**(13), 2239–2241 (2010). [CrossRef] [PubMed]