## Abstract

This paper proposes a non-iterative, two-dimensional numerical method to alleviate the compromise between lateral resolution and depth measurement range in optical coherence tomography (OCT). A two-dimensional scalar diffraction model was developed to simulate the wave propagation from out-of-focus scatterers within the short coherence gate of the OCT system. High-resolution details can be recovered from outside the depth-of-field region with minimal loss of lateral resolution. Experiments were performed to demonstrate the effectiveness of the proposed method.

©2007 Optical Society of America

## 1. Introduction

Optical coherence tomography (OCT) [1] has recently been developed for diverse areas of medical imaging, including in vivo imaging of the human retina, skin, and internal body tissues. In standard time-domain OCT (TDOCT), a Michelson-type interferometer is illuminated by a femtosecond laser or a superluminescent diode, and an optical delay line is used for depth scanning. However, this embodiment of OCT needs specific optical and mechanical designs for scanning and is mainly limited by its relatively slow imaging speed. An alternative approach records the interferometric signal between light from a reference and the back-scattered light from the sample in the frequency domain rather than the time domain. This is referred to as Fourier domain OCT (FDOCT) [2, 3] or spectral domain OCT. The spectral discrimination in FDOCT is accomplished either by using a dispersive spectrometer [2–6] in the detection arm or by rapidly scanning a swept laser source [7–11]. FDOCT has attracted more attention recently because of its higher sensitivity and imaging speed compared to TDOCT [12–14].

As a coherent cross-sectional imaging technique, OCT is capable of penetrating ∼3 mm into highly scattering biological tissues, and axial resolution of a few μm is provided by the low-coherence nature of the light source. However, in OCT, high lateral resolution and a wide depth of field (DOF) are mutually exclusive. Although a higher effective numerical aperture enhances lateral resolution, it narrows the DOF, which is inversely proportional to the square of the effective numerical aperture (NA^{2}) of the optical system. Only a small range around the focal plane exhibits the desired lateral resolution of the system, and the OCT image in the out-of-focus range is blurred laterally. A typical DOF for a small-NA system is several hundred microns, roughly an order of magnitude smaller than the scanning depth range of an OCT system. Adaptive optics [15], axicon lenses [16], and dynamic focusing or focus tracking [17–18] have been used to maintain high lateral resolution over a large imaging depth. However, each technique requires special hardware in the system design and may limit the scanning speed and real-time application. Inverse scattering [19] and deconvolution algorithms [20–24] have also been designed to improve lateral resolution. Of these methods, most are nonlinear and do not use the phase information of the OCT image. Others [19, 24] have taken account of the phase information; however, simulations or experimental studies were mainly limited to one lateral dimension. Inverse scattering has recently been extended to two-dimensional studies [25, 26].

In this paper, a novel non-iterative, two-dimensional numerical method for lateral resolution improvement is proposed based on a two-dimensional scalar diffraction model, alleviating the compromise between lateral resolution and depth measurement range in OCT. Numerical diffraction algorithms can be used to recover the original detail of the sample without moving the focal plane, thus numerically canceling the lateral defocus. High-resolution details are recovered from outside the depth-of-field region with minimal loss of lateral resolution. The diffraction model is relatively simple to implement because it only involves wave propagation between different planes.

## 2. Principle

We start by briefly reviewing the illumination and detection of an OCT system. After passing through an objective of the OCT system, a Gaussian probe beam is focused onto a sample, as shown in Fig. 1(a). In an ideal situation, the sample is placed within the DOF of the probe beam. However, in some cases, if the sample is located outside the DOF, the probe beam expands to illuminate a larger area of the sample. Because of the confocal detection scheme of the OCT system, we can consider that the detector is placed in the focal plane of the probe beam and OCT detects the backscattered light, as shown in Fig. 1(b). Note that scatterers at different depths can be differentiated by the short coherence gate of the OCT system.

Since the expanded probe beam illuminates a larger area of the sample, multiple out-of-focus scatterers from the same depth will contribute to a single detection (along one A-line). The same scatterer will contribute to different detections as the probe beam scans different A-lines across the sample. Interestingly, this is exactly the physical model of diffraction. Optically, it can be considered that the detector (at different lateral positions of the focal plane) records a two-dimensional diffraction pattern from the sample plane, which is located within the short coherence gate. Hence, it is reasonable to use the scalar diffraction model to simulate the wave propagation process for direct scattering [27] from these out-of-focus scatterers. This is based on the assumption that only ballistic and quasi-ballistic backscattered photons contribute to OCT imaging because of coherence gating; contributions from multiple scattering tend to be delayed and thus may fall outside the coherence gate.

From Fourier optics [28, 29], if *E*(*x,y*;0) is the en-face wave field distribution at plane *z* = 0, the corresponding angular spectrum of the field at this plane can be obtained by taking the Fourier transform:

$$S(k_x,k_y;0)=\iint E(x,y;0)\exp[-i(k_x x+k_y y)]\,\mathrm{d}x\,\mathrm{d}y, \tag{1}$$

where *k_{x}* and *k_{y}* are the corresponding spatial frequencies of *x* and *y*. The field *E*(*x,y*;0) can be rewritten as the inverse Fourier transform of its angular spectrum,

$$E(x,y;0)=\frac{1}{4\pi^{2}}\iint S(k_x,k_y;0)\exp[i(k_x x+k_y y)]\,\mathrm{d}k_x\,\mathrm{d}k_y. \tag{2}$$

The complex-exponential function exp[*i*(*k_{x}x* + *k_{y}y*)] may be regarded as a projection, onto the plane *z* = 0, of a wave propagating with a wave vector (*k_{x}*, *k_{y}*, *k_{z}*), where *k_{z}* = [*k*^{2} − *k_{x}*^{2} − *k_{y}*^{2}]^{1/2} and *k* = 2*π/λ*. Thus, the field *E*(*x,y*;0) can be viewed as a superposition of many wave components propagating in different directions in space, with the complex amplitude of each component equal to *S*(*k_{x}*, *k_{y}*;0). After propagating along the *z* axis to a new plane, the new angular spectrum *S*(*k_{x}*, *k_{y}*;*z*) at plane *z* can be calculated from *S*(*k_{x}*, *k_{y}*;0) as *S*(*k_{x}*, *k_{y}*;*z*) = *S*(*k_{x}*, *k_{y}*;0)exp[*ik_{z}z*]. Thus, the complex field distribution at any plane perpendicular to the propagating *z* axis can be calculated from Fourier theory as

$$E(x,y;z)=\frac{1}{4\pi^{2}}\iint S(k_x,k_y;0)\exp[i(k_x x+k_y y+k_z z)]\,\mathrm{d}k_x\,\mathrm{d}k_y. \tag{3}$$

The lateral resolution of the system is determined by the effective numerical aperture of the objective, as in Fig. 1. The actual lateral sampling step (or scanning step) is required to be smaller than the theoretical lateral resolution of the system. The scanning step functions as the pixel size of the focal plane, and the pixel size of the field reconstructed by the angular spectrum method is always the same as that of the focal plane. Thus, the proposed method can provide spatially invariant lateral resolution.
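This angular-spectrum propagation maps directly onto a pair of FFTs. A minimal NumPy sketch (illustrative only, not the authors' implementation; the function name, grid pitch, and wavelength values used below are our assumptions):

```python
import numpy as np

def angular_spectrum_propagate(field, dx, wavelength, z):
    """Propagate a complex en-face field E(x, y; 0) by a distance z
    using the angular spectrum method.

    field: 2-D complex array sampled with pixel pitch dx (meters).
    Returns E(x, y; z) on the same grid; the pixel size of the
    reconstructed field equals that of the input, as noted in the text.
    """
    ny, nx = field.shape
    k = 2 * np.pi / wavelength
    # Angular spatial frequencies k_x, k_y of the sampling grid
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    # k_z = [k^2 - k_x^2 - k_y^2]^(1/2); clip the evanescent region
    kz_sq = k**2 - KX**2 - KY**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    S0 = np.fft.fft2(field)          # angular spectrum S(k_x, k_y; 0)
    Sz = S0 * np.exp(1j * kz * z)    # multiply by exp(i k_z z)
    Sz[kz_sq < 0] = 0.0              # discard evanescent components
    return np.fft.ifft2(Sz)          # field at the new plane
```

Propagating forward and then backward by the same distance recovers the original field (for the propagating components), which is a convenient self-check of such an implementation.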

The schematic of the FDOCT system is shown in Fig. 2. Low-coherence light with a 1310 nm center wavelength and a full width at half maximum of 95 nm was coupled into the source arm of a fiber-based Michelson interferometer. Back-reflected light from the reference and sample arms was guided into a spectrometer. The spectrum, dispersed by a diffraction grating (500 g/mm), was sampled by a 1×1024 InGaAs detector array (SU1024-1.7T, Sensors Unlimited) at 7.7 kHz. The wavelength range on the array was 130 nm, corresponding to a spectral resolution of 0.13 nm and an imaging depth of 3.6 mm in air.

Figure 2 also shows the flow diagram of the whole process for lateral resolution improvement based on an FDOCT system. After the captured spectral interferogram from the line-scan camera is rescaled to be evenly k-spaced, a fast Fourier transform yields the complex A-line information along the z axis. The whole three-dimensional (3-D) complex volume of the sample is obtained by two-dimensional scanning with a galvo system. Since the sample is located outside the DOF range of the probe beam, the OCT image suffers greatly from lateral resolution degradation. Note that the 3-D out-of-focus complex volume needs to be re-sampled in the z direction to obtain a sequence of en-face (x-y) images *I*(*x, y*;*z_{i}*). Then the angular spectrum *S*(*k_{x}*, *k_{y}*;*z_{i}*) of each en-face image is calculated by Eq. (1), and the new angular spectrum after propagation over a distance *z* is calculated by multiplying *S*(*k_{x}*, *k_{y}*;*z_{i}*) with a z-dependent exponential term exp[*ik_{z}z*], with *k_{z}* = [*k*^{2} − *k_{x}*^{2} − *k_{y}*^{2}]^{1/2}. Finally, the en-face field distribution at the plane (*z_{i}* + *z*) is calculated from Eq. (3). Thus, by selecting a correct reconstruction distance *z*, the proposed method can be used to numerically cancel the lateral defocus and improve the lateral resolution.
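The first two steps of this flow (resampling the camera spectrum onto an evenly spaced k grid, then Fourier transforming to obtain the complex A-line) can be sketched as follows. This is a simplified illustration using linear interpolation; the actual system's interpolation scheme and the function name are assumptions:

```python
import numpy as np

def spectrum_to_aline(spectrum, wavelengths):
    """Resample a spectral interferogram (sampled evenly in wavelength)
    onto an evenly spaced k grid, then FFT to obtain the complex A-line.

    spectrum: real-valued interferogram from the line-scan camera.
    wavelengths: corresponding wavelengths (same length), in meters.
    """
    k = 2 * np.pi / wavelengths                # wavenumber at each camera pixel
    k_even = np.linspace(k.min(), k.max(), k.size)
    # np.interp requires increasing sample points; k decreases with wavelength
    order = np.argsort(k)
    resampled = np.interp(k_even, k[order], spectrum[order])
    # Fourier transform along k gives the complex depth profile (A-line)
    return np.fft.ifft(resampled)
```

For a single reflector at depth z0, the interferogram is proportional to cos(2*k*z0); after resampling, the Fourier transform produces a sharp peak at the depth bin corresponding to z0.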

To better understand the process, Fig. 3(a) shows one evenly k-spaced spectral interferogram after interpolation when the galvo system is scanning a point (*x_{m}*, *y_{n}*) on the sample surface. It is then inverse Fourier transformed and filtered to extract a specific layer *I*(*x_{m}*, *y_{n}*;*z_{j}*) of the object, as indicated in Figs. 3(b) and 3(c). Information from all the other layers is discarded when analyzing this specific layer centered at *z_{j}*. A one-dimensional Fourier transform of Fig. 3(c) results in a spectral distribution *I*(*x_{m}*, *y_{n}*;*k*) whose amplitude is proportional to the spectral density of the light source, as shown in Fig. 3(d). Here, we have assumed that the spectral interferogram is interpolated to several times 1024 points, so that each layer may span several pixels in Fig. 3(c). By two-dimensionally scanning the galvo system and repeating the process, both the full-field layer information *I*(*x,y*;*z_{j}*) and the spectral distribution *I*(*x,y*;*k*) are obtained. Of course, if the spectral interferogram is interpolated to 1024 points, each layer is averaged into only one pixel in Fig. 3(c), and its Fourier transform results in the dashed straight line in Fig. 3(d). It is noticed that the amplitude of any *I*(*x,y*;*k_{i}*) at a single *k_{i}* is proportional to the average amplitude of *I*(*x,y*;*z_{j}*), and the phase of *I*(*x,y*;*k_{i}*) is the summation of the average phase of *I*(*x,y*;*z_{j}*) with a constant *k_{i}*-related phase shift throughout the whole (*x, y*) plane. In this case, it is theoretically equivalent to use either *I*(*x,y*;*z_{j}*) or any single *I*(*x,y*;*k_{i}*) for digital focusing, and the use of any wave number *k_{i}* gives the same result. In the above, we have not considered the broadening of the spectrum; a chromatic algorithm can be designed to take account of each *I*(*x,y*;*k_{i}*) and combine the wavefields reconstructed from each *I*(*x,y*;*k_{i}*).

Note that the reconstruction distance *z* represents the double-pass delay due to the reflection geometry of the OCT system and is thus given as *z* = 2*n*Δ*d*, where *n* is the refractive index of the tissue and Δ*d* is the actual deviation from the focal plane. The propagation distance *z* is determined either by prior knowledge or by an automatic maximum-sharpness searching algorithm.
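The automatic maximum-sharpness search can be sketched as a grid search over candidate propagation distances. This is a sketch under assumptions: the intensity-kurtosis sharpness metric, the helper names, and the search range are our choices, not specified by the paper:

```python
import numpy as np

def propagate(field, dx, wavelength, z):
    """Angular spectrum propagation of a complex en-face field by z."""
    k = 2 * np.pi / wavelength
    kx = 2 * np.pi * np.fft.fftfreq(field.shape[1], d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(field.shape[0], d=dx)
    KX, KY = np.meshgrid(kx, ky)
    kz = np.sqrt(np.maximum(k**2 - KX**2 - KY**2, 0.0))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def sharpness(field):
    """Normalized intensity kurtosis: largest when energy is concentrated
    into sharp points (one common focus metric; others work as well)."""
    I = np.abs(field) ** 2
    return np.sum(I ** 2) / np.sum(I) ** 2

def autofocus(field, dx, wavelength, z_candidates):
    """Return the candidate distance maximizing sharpness, plus the
    correspondingly refocused field."""
    scores = [sharpness(propagate(field, dx, wavelength, z))
              for z in z_candidates]
    best = int(np.argmax(scores))
    return z_candidates[best], propagate(field, dx, wavelength, z_candidates[best])
```

In practice, prior knowledge of the defocus (e.g. from the system geometry) can narrow the candidate range before such a search.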

## 3. Experiments

Experiments were performed to verify the effectiveness of the proposed idea. The DOF of the system, determined by the diameter (4.8 mm) of the probe beam and the focal length (40 mm) of the objective, was ∼230 μm in air, and the lateral resolution was ∼13.8 μm. For comparison, Figs. 4(a) and 4(b) first show two x-z B-scan images and one x-y en-face image of an in-focus onion, respectively. Then an onion sample placed about 1.2 mm away from the DOF region was studied. Figure 4(c) shows two x-z B-scan images of the defocused onion, and Fig. 4(d) shows the en-face amplitude images of a layer ∼300 μm underneath the onion surface. Since the onion was placed outside the DOF, the probe Gaussian beam expanded to illuminate a larger area, and multiple scatterers in the lateral directions contributed to a single A-line detection. This resulted in a severe degradation of the lateral resolution, as is clearly shown in Figs. 4(c) and 4(d). Figure 4(e) shows the digitally focused B-scan images at the same cross-sectional positions as in Fig. 4(c), and the corrected en-face image of Fig. 4(d) is shown in Fig. 4(f), where the scatterers are now greatly sharpened to small bright points. Note that only those scatterers whose centers are located exactly on the observed B-scan image shrink to bright points, while scatterers centered on neighboring B-scan images tend to be suppressed. This clearly demonstrates the advantage of a two-dimensional algorithm over one-dimensional algorithms.

In order to laterally differentiate more details of onion cells, an objective of 10 mm focal length (20×, Nachet) was used to improve the lateral resolution of the system to ∼3.5 μm, but the DOF was narrowed to 14.4 μm. Figure 5(a) shows onion cells under a microscope. Figure 5(b) shows the defocused en-face OCT image of a layer ∼240 μm away from the focal plane; Fig. 5(c) shows the reconstructed image when the layer is partially focused, and the digitally focused en-face image is finally shown in Fig. 5(d). The cells are now well in focus and the boundaries in both lateral directions are greatly sharpened. The above experiments clearly demonstrate the improvement of the lateral resolution in both directions.
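The quoted resolution and DOF figures for both objectives are consistent with standard Gaussian beam relations (focused waist radius w0 = 2λf/(πD), spot diameter 2w0 as the lateral resolution, and confocal parameter b = 2πw0²/λ as the DOF). A short check using the stated system parameters:

```python
import numpy as np

wavelength = 1310e-9   # center wavelength (m)
D = 4.8e-3             # probe beam diameter (m)

for f in (40e-3, 10e-3):                     # the two objectives used above
    w0 = 2 * wavelength * f / (np.pi * D)    # focused Gaussian waist radius
    spot = 2 * w0                            # lateral resolution ~ spot diameter
    b = 2 * np.pi * w0**2 / wavelength       # confocal parameter ~ DOF
    print(f"f = {f*1e3:.0f} mm: spot = {spot*1e6:.1f} um, DOF = {b*1e6:.0f} um")
```

The 40 mm objective gives a ∼13.9 μm spot and ∼232 μm confocal parameter, matching the quoted ∼13.8 μm and ∼230 μm; the 10 mm objective gives ∼3.5 μm and ∼14.5 μm, matching ∼3.5 μm and 14.4 μm.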

In our experiment, both the amplitude and phase information were used to calculate the wave propagation between different planes. A microscope cover glass was placed on top of the sample, and the top surface of the glass was used as a reference to eliminate the effect of phase fluctuations. For in-vivo imaging applications, it is challenging to maintain phase stability over a whole two-transverse-dimensional scan. The problem of phase stability might be minimized or eventually solved with the development of ultrahigh-speed swept laser sources [11]. It is also noteworthy from the above examples that outside the DOF, not only does the lateral resolution decrease, but the illumination amplitude also attenuates, which affects the signal-to-noise ratio. Although the proposed method improves the lateral resolution and concentrates the energy within a specific out-of-focus layer, it does not correct the attenuation of the illumination amplitude outside the DOF along the probe beam. Some fringe artifacts exist in the reconstructed images, which may be due to residual phase instability, to multiple scattering, or to the limited axial resolution of the system causing structures of adjacent layers to overlap. Multiple scattering is a problem for all OCT systems and will also affect the performance of the proposed method. However, as long as the full-field phase information of the sample can be correctly extracted, the proposed method should work for lateral resolution improvement. Speckle and the diffraction of the Gaussian beam illuminating the out-of-focus sample plane are currently not taken into account in the model. If Gaussian beam propagation were considered, the digital focusing outside the DOF might be further improved.

## 4. Conclusion

In conclusion, we have shown that the wave scattering process from out-of-focus scatterers in OCT can be considered as a two-dimensional scalar diffraction model. The lateral resolution in either *x* or *y* direction can be improved by use of a non-iterative numerical diffraction algorithm, and high-resolution details can be reconstructed from outside the depth-of-field region without any special hardware in system design. Although a spectrometer-based system is considered in the paper, the proposed method is also applicable to swept-source or full-field OCT systems.

This work was supported by research grants from the National Institutes of Health (EB-00293, NCI-91717, and RR-01192), Air Force Office of Scientific Research (FA9550-04-1-0101), and the Beckman Laser Institute Endowment.

## References and links

**1. **D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science **254**, 1178–1181 (1991). [CrossRef] [PubMed]

**2. **A. F. Fercher, C. K. Hitzenberger, G. Kamp, and S. Y. Elzaiat, “Measurement of intraocular distances by backscattering spectral interferometry,” Opt. Commun. **117**, 43–48 (1995). [CrossRef]

**3. **G. Hausler and M. W. Lindner, “Coherence radar and spectral radar - new tools for dermatological diagnosis,” J. Biomed. Opt. **3**, 21–31 (1998). [CrossRef]

**4. **M. Wojtkowski, V. J. Srinivasan, T. H. Ko, J. G. Fujimoto, A. Kowalczyk, and J. S. Duker, “Ultrahigh-resolution, high-speed, Fourier domain optical coherence tomography and methods for dispersion compensation,” Opt. Express **12**, 2404–2422 (2004). [CrossRef] [PubMed]

**5. **B. Cense, N. A. Nassif, T. C. Chen, M. C. Pierce, S. Yun, B. H. Park, B. E. Bouma, G. J. Tearney, and J. F. de Boer, “Ultrahigh-resolution high-speed retinal imaging using spectral-domain optical coherence tomography,” Opt. Express **12**, 2435–2447 (2004). [CrossRef] [PubMed]

**6. **S. Jiao, R. Knighton, X. Huang, G. Gregori, and C. A. Puliafito, “Simultaneous acquisition of sectional and fundus ophthalmic images with spectral-domain optical coherence tomography,” Opt. Express **13**, 444–452 (2005). [CrossRef] [PubMed]

**7. **S. H. Yun, G. J. Tearney, J. F. de Boer, N. Iftimia, and B. E. Bouma, “High-speed optical frequency domain imaging,” Opt. Express **11**, 2953–2963 (2003). [CrossRef] [PubMed]

**8. **S. R. Chinn, E. A. Swanson, and J. G. Fujimoto, “Optical coherence tomography using a frequency-tunable optical source,” Opt. Lett. **22**, 340–342 (1997). [CrossRef] [PubMed]

**9. **S. H. Yun, C. Boudoux, G. J. Tearney, and B. E. Bouma, “High-speed wavelength-swept semiconductor laser with a polygon-scanner-based wavelength filter,” Opt. Lett. **28**, 1981–1983 (2003). [CrossRef] [PubMed]

**10. **R. Huber, M. Wojtkowski, K. Taira, J. G. Fujimoto, and K. Hsu, “Amplified, frequency swept lasers for frequency domain reflectometry and OCT imaging: design and scaling principles,” Opt. Express **13**, 3513–3528 (2005). [CrossRef] [PubMed]

**11. **R. Huber, D. C. Adler, and J. G. Fujimoto, “Buffered Fourier domain mode locking: unidirectional swept laser sources for optical coherence tomography imaging at 370,000 lines/s,” Opt. Lett. **31**, 2975–2977 (2006). [CrossRef] [PubMed]

**12. **R. A. Leitgeb, C. K. Hitzenberger, and A. F. Fercher, “Performance of fourier domain vs. time domain optical coherence tomography,” Opt. Express **11**, 889–894 (2003). [CrossRef] [PubMed]

**13. **J. F. de Boer, B. Cense, B. H. Park, M. C. Pierce, G. J. Tearney, and B. E. Bouma, “Improved signal-to-noise ratio in spectral-domain compared with time-domain optical coherence tomography,” Opt. Lett. **28**, 2067–2069 (2003). [CrossRef] [PubMed]

**14. **M. A. Choma, M. V. Sarunic, C. H. Yang, and J. A. Izatt, “Sensitivity advantage of swept source and Fourier domain optical coherence tomography,” Opt. Express **11**, 2183–2189 (2003). [CrossRef] [PubMed]

**15. **B. Hermann, E. J. Fernandez, A. Unterhuber, H. Sattmann, A. F. Fercher, W. Drexler, P. M. Prieto, and P. Artal, “Adaptive-optics ultrahigh-resolution optical coherence tomography,” Opt. Lett. **29**, 2142–2144 (2004). [CrossRef] [PubMed]

**16. **Z. Ding, H. Ren, Y. Zhao, J. S. Nelson, and Z. Chen, “High-resolution optical coherence tomography over a large depth range with an axicon lens,” Opt. Lett. **27**, 243–245 (2002). [CrossRef]

**17. **Y. Wang, Y. Zhao, J. S. Nelson, and Z. Chen, “Ultrahigh-resolution optical coherence tomography by broadband continuum generation from a photonic crystal fiber,” Opt. Lett. **28**, 182–184 (2003). [CrossRef] [PubMed]

**18. **M. J. Cobb, X. Liu, and X. Li, “Continuous focus tracking for real-time optical coherence tomography,” Opt. Lett. **30**, 1680–1682 (2005). [CrossRef] [PubMed]

**19. **T. S. Ralston, D. L. Marks, P. S. Carney, and S. A. Boppart, “Inverse scattering for optical coherence tomography,” J. Opt. Soc. Am. A **23**, 1027–1037 (2006). [CrossRef]

**20. **M. D. Kulkarni, C. W. Thomas, and J. A. Izatt, “Image enhancement in optical coherence tomography using deconvolution,” Electron. Lett. **33**, 1365–1367 (1997). [CrossRef]

**21. **J. M. Schmitt, “Restoration of optical coherence images of living tissue using the clean algorithm,” J. Biomed. Opt. **3**, 66–75 (1998). [CrossRef]

**22. **D. Piao, Q. Zhu, N. Dutta, S. Yan, and L. Otis, “Cancellation of coherent artifacts in optical coherence tomography imaging,” Appl. Opt. **40**, 5124–5131 (2001). [CrossRef]

**23. **J. Hsu, C.W. Sun, C.W. Lu, C. C. Yang, C. P. Chiang, and C.W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. **42**, 227–234 (2003). [CrossRef] [PubMed]

**24. **Y. Yasuno, J. -i. Sugisaka, Y. Sando, Y. Nakamura, S. Makita, M. Itoh, and T. Yatagai, “Non-iterative numerical method for laterally superresolving Fourier domain optical coherence tomography,” Opt. Express **14**, 1006–1020 (2006). [CrossRef] [PubMed]

**25. **T. S. Ralston, D. L. Marks, S. A. Boppart, and P. S. Carney, “Inverse scattering for high-resolution interferometric microscopy,” Opt. Lett. **31**, 3585–3587 (2006). [CrossRef] [PubMed]

**26. **T. S. Ralston, D. L. Marks, P. S. Carney, and S. A. Boppart, “Interferometric synthetic aperture microscopy,” Nature Physics **3**, 129–134 (2007). [CrossRef]

**27. **R. M. Lewis, “Physical optics inverse diffraction,” IEEE Trans. Antennas Propag. **AP-17**, 308–314 (1969). [CrossRef]

**28. **J. W. Goodman, *Introduction to Fourier Optics* (McGraw-Hill, 1996).

**29. **L. Yu and M. K. Kim, “Wavelength-scanning digital interference holography for tomographic three-dimensional imaging by use of the angular spectrum method,” Opt. Lett. **30**, 2092–2094 (2005). [CrossRef] [PubMed]