Real-time fiber-based multi-functional spectral-domain optical coherence tomography at 1.3 µm


Abstract

We demonstrate a high-speed multi-functional spectral-domain optical coherence tomography system, using a broadband light source centered at 1.3 µm and two InGaAs line scan cameras capable of acquiring individual axial scans in 24.4 µs, at a rate of 18,500 axial scans per second. Fundamental limitations on the accuracy of phase determination as functions of signal-to-noise ratio and lateral scan speed are presented and their relative contributions are compared. The consequences of phase accuracy are discussed for both Doppler and polarization-sensitive OCT measurements. A birefringence artifact and a calibration procedure to remove this artifact are explained. Images of a chicken breast tissue sample acquired with the system were compared to those taken with a time-domain OCT system for birefringence measurement verification. The ability of the system to image pulsatile flow in the dermis and to perform functional imaging of large volumes demonstrates the clinical potential of multi-functional spectral-domain OCT.

©2005 Optical Society of America

1. Introduction

Optical coherence tomography (OCT) is an interferometric technique capable of noninvasive high-resolution cross-sectional imaging by measuring the intensity of light reflected from within tissue [1]. Since the technique is non-contact and provides images similar in scale and geometry to histology, it holds great clinical promise for applications such as retinal [2], gastro-intestinal [3], and cardiac imaging [4]. In time-domain OCT (TD-OCT), a depth profile is obtained by scanning the length of the reference arm of an interferometer. Alternatively, instead of using reference arm scanning, depth information can be retrieved by detecting the interference signal as a function of wavelength. In spectral-domain OCT (SD-OCT), this is achieved with a broadband source and a spectrometer in the detector arm [5–7]. Optical frequency domain imaging (OFDI) uses a swept-source and point detector to acquire the same information [8]. The main advantage of these schemes over TD-OCT is a marked increase in sensitivity [9–13] that can be translated into increased imaging speed. SD-OCT systems capable of imaging 29,300 depth scans per second [14, 15] with Si-based line scan cameras and 18,940 depth scans per second [16] with InGaAs-based line scan cameras have been reported. The speed of these systems facilitates comprehensive clinical screening and surveillance.

Functional extensions to OCT add to the clinical potential of OCT. Polarization-sensitive OCT (PS-OCT) has been used for applications including correlating burn depth with a decrease in birefringence [17, 18], measuring the birefringence of the retinal nerve fiber layer [19], and monitoring the onset and progression of caries lesions [20] by analyzing depth-dependent changes in the polarization state of detected light. Implementation of polarization-sensitivity into SD-OCT has been previously demonstrated [21, 22]. Phase-resolved optical Doppler tomography (ODT) [23–27] enables depth-resolved imaging of flow by observing differences in phase between successive depth scans. TD- and SD-ODT have allowed for imaging blood flow patterns in port-wine stains [28] and studying the dynamics of retinal blood flow in vivo [29, 30].

In this paper, we present a real-time fiber-based multi-functional SD-OCT system using a broadband source centered at 1.3 µm, capable of acquiring and recording depth scans from two InGaAs line scan cameras at 18.5 kHz. The measurement accuracy of the multi-functional components, Doppler and PS-OCT, is fundamentally limited by phase noise. We studied phase noise as a function of signal-to-noise ratio (SNR) and lateral scan speed, and compared experimental results with theoretical predictions. This paper is organized as follows. The multi-functional SD-OCT system is described in Section 2. Section 3 presents the basic results: system characterization (Section 3.1), phase noise analysis as a function of SNR and lateral scan speed and its effects on Doppler and PS-OCT measurement accuracy (Sections 3.2 and 3.3), removal of birefringence artifacts (Section 3.3), and applications (Section 3.4), including demonstration of blood flow pulsatility in a fingertip and multi-functional imaging of large volumes, using images composed of 1024 depth profiles acquired at 18 frames per second (fps).

2. System description

Figure 1 shows a schematic of the multi-functional SD-OCT system. The output of a PM-fiber-pigtailed broadband source centered at 1320 nm with a 68 nm FWHM was aligned with a polarizing beam splitter and an electro-optic polarization modulator such that the transmitted light could be toggled between polarization states that are perpendicular in a Poincaré sphere representation. This light was sent into an interferometer composed of a fiber circulator and a 90/10 fiber splitter for efficient sample arm illumination and collection. Galvanometer-mounted mirrors in the sample arm hand-piece provided transverse scanning of a 22.4 µm diameter focused spot. The reference arm was composed of a lens, a variable neutral density filter, and a stationary mirror. Light returning from both arms passed back through the fiber splitter and circulator and was directed to a polarization-sensitive spectrometer. In the spectrometer, the optical spectrum was dispersed by a transmission grating (1100 lines per mm), focused with a plano-convex lens (focal length 150 mm), and split with a polarizing beam splitter cube onto two 512-element InGaAs line scan cameras with a 50 µm pitch and 12-bit output resolution (Sensors Inc., SU 512LX). A polarizer was introduced between the polarizing cube and one camera to improve rejection of horizontally polarized light from the vertical polarization channel. Synchronized camera readout was triggered by an external TTL signal. The output of each of the orthogonally aligned cameras was split into four separate channels and digitized using a four-channel data acquisition board with 12-bit resolution at 5 MS/s per channel (NI-6110).

Fig. 1. Diagram of the multi-functional SD-OCT system (pbs: polarizing beam splitter, pm: polarization modulator, oc: optical circulator, 90/10: fiber splitter, pc: static polarization controller, ndf: neutral density filter, g: transmission grating, pol: polarizer, lsc: line scan camera).

Data acquisition was managed through a multi-threaded software scheme, written in Visual C++ and running on a dual-processor computer, similar to that used in a previously published TD-OCT system [31]. A screen capture of the user interface is shown in Fig. 2. Software initialization consisted of defining variables, such as the number of depth scans per frame and lateral scanning parameters, and creating waveforms to synchronize the two cameras and to control the polarization modulator and scanning galvanometers. A data acquisition thread started asynchronous data streaming from the two A/D boards into separate circular buffers, each with a capacity of ten frames. Once enough data had been acquired to fill one frame, the thread copied the appropriate data from both buffers into a data node in a linked list, and a separate thread saved the data to the hard disk as desired. These processes were given priority so that data could be saved at 18.5 kHz, the highest camera line readout rate at which the two cameras could be synchronized. Four displays, seen in Fig. 2, are updated in real time: spectrum, intensity, flow, and polarization. The intensity, polarization, and flow windows are each composed of a central image surrounded by horizontal and vertical profiles at user-selectable positions within the image, allowing immediate quantification of SNR, phase retardation, and flow; unprocessed spectra corresponding to the vertical profile are also displayed. Depending on which displays are active, updates occur at 2–5 fps for frames composed of 1024 depth profiles, while acquisition and saving of the raw spectral output from both cameras continue uninterrupted and independently at 18 fps. Processing is performed on the most recently acquired frame.
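
For illustration, a minimal sketch of this producer/consumer pattern is given below in Python, with a bounded queue standing in for the circular buffers and linked list of frames; the frame dimensions, file naming, and random data are hypothetical and do not reproduce the authors' Visual C++ implementation.

```python
# Hypothetical sketch of the frame-based producer/consumer scheme described
# above: an acquisition thread fills a bounded buffer (standing in for the
# ten-frame circular buffers) while a saving thread drains completed frames.
import threading
import queue
import numpy as np

N_DEPTH_PROFILES = 1024     # depth profiles per frame (as in the text)
N_PIXELS = 512              # camera pixels per depth profile
FRAME_BUFFER = queue.Queue(maxsize=10)   # roughly ten frames of headroom

def acquire_frames(n_frames: int) -> None:
    """Producer: stand-in for streaming spectra from the two cameras."""
    for _ in range(n_frames):
        # In the real system these arrays would come from the A/D boards.
        h = np.random.randint(0, 4096, (N_DEPTH_PROFILES, N_PIXELS), np.uint16)
        v = np.random.randint(0, 4096, (N_DEPTH_PROFILES, N_PIXELS), np.uint16)
        FRAME_BUFFER.put((h, v))     # blocks if the buffer is full
    FRAME_BUFFER.put(None)           # sentinel: acquisition finished

def save_frames(path_prefix: str) -> None:
    """Consumer: write each completed frame to disk as it becomes available."""
    index = 0
    while True:
        frame = FRAME_BUFFER.get()
        if frame is None:
            break
        h, v = frame
        np.savez(f"{path_prefix}_{index:04d}.npz", h=h, v=v)
        index += 1

producer = threading.Thread(target=acquire_frames, args=(5,))
consumer = threading.Thread(target=save_frames, args=("frame",))
producer.start(); consumer.start()
producer.join(); consumer.join()
```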

Fig. 2. A screen capture of the real-time data acquisition software, taken while imaging a weak intralipid solution flowing through small tubes. There are four main displays: unprocessed spectral information is shown in the upper right, a standard OCT image (reflected intensity) is in the middle left, polarization information is shown in the middle right, and flow information is displayed in the lower left corner.

2.1. Intensity

In principle, the spectrally encoded output from a line scan camera contains the information necessary to obtain a phase-resolved depth profile. Let the output arrays from the two cameras (horizontally and vertically aligned) be represented by hn(λj) and vn(λj), respectively, where n∈[0,N) represents the depth profile number with N depth profiles per frame, even and odd n correspond to the two different incident polarization states, and j∈[0,512) represents the pixel number. Four reference spectra, heven(λj), veven(λj), hodd(λj), and vodd(λj), computed by averaging detected arrays with the sample arm blocked, are subtracted from the appropriate arrays to reduce individual pixel offsets of the cameras. Fluctuations in the relative ground levels of the four A/D channels are reduced by selective filtering, and one set of camera arrays is flipped so that all arrays share the same wavelength ordering (short to long). Figure 3 shows a more detailed schematic of part of the spectrometer. Based on the grating equation and geometrical optics, the wavelength λj incident on a given pixel of the camera is given by

$$\lambda_j = g\left(\sin\theta_i + \sin\!\left(\sin^{-1}\!\left(\frac{\lambda_c}{g} - \sin\theta_i\right) + \tan^{-1}\!\left(\frac{w \cdot j - x_c}{F}\right)\right)\right), \tag{1}$$

where θi is the incident angle on the grating with spacing g, λc is the wavelength of light transmitted unperturbed through the lens of focal length F, and xc is the distance from the first pixel to the position of λc on the line scan camera array, which has a pixel width of w. The remapping parameters used for the two cameras should be the same, with the exception of xc, and these are determined by optimization; the process by which xc was determined for each camera is described in Section 3.3. Incorrect determination of these parameters degrades the depth profile. The detected arrays are interpolated onto a grid evenly spaced in wave number, and the quality of this linear interpolation can be improved with zero padding prior to remapping. Fourier transforms of the interpolated arrays yielded complex depth profiles represented by Hn(zm) and Vn(zm), where m∈[0, 256). Intensity was then computed according to In(zm) = Hn(zm)·H̃n(zm) + Vn(zm)·Ṽn(zm), where ~ represents the complex conjugate, and displayed using a logarithmic gray scale.
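
A minimal numerical sketch of this processing chain, assuming Python/NumPy, is given below: Eq. (1) is evaluated to map pixel index to wavelength, the spectrum is interpolated to evenly spaced wave numbers with zero padding, and a Fourier transform yields a complex depth profile from which intensity follows. The incidence angle, center-pixel position, and single-reflector test spectrum are illustrative placeholders, not the calibrated parameters of the instrument.

```python
# Hypothetical numerical sketch of Eq. (1) and the intensity computation.
# All optical parameters below are illustrative placeholders.
import numpy as np

j = np.arange(512)               # pixel index
g = 1e-3 / 1100                  # grating period for 1100 lines/mm [m]
theta_i = np.deg2rad(46.0)       # assumed incidence angle on the grating
lam_c = 1320e-9                  # center wavelength [m]
F = 0.150                        # lens focal length [m]
w = 50e-6                        # camera pixel pitch [m]
x_c = 256 * w                    # assumed position of lam_c on the array [m]

# Eq. (1): wavelength incident on pixel j
lam_j = g * (np.sin(theta_i)
             + np.sin(np.arcsin(lam_c / g - np.sin(theta_i))
                      + np.arctan((w * j - x_c) / F)))

def to_depth_profile(spectrum, lam, zero_pad=4):
    """Interpolate to evenly spaced wave number, zero-pad, and Fourier transform."""
    k = 2 * np.pi / lam
    k_even = np.linspace(k.min(), k.max(), zero_pad * k.size)
    order = np.argsort(k)                     # np.interp needs increasing abscissae
    resampled = np.interp(k_even, k[order], spectrum[order])
    return np.fft.fft(resampled)[: resampled.size // 2]

# Toy interference spectrum from a single reflector near 0.5 mm
spectrum = np.cos(2 * (2 * np.pi / lam_j) * 0.5e-3)
H = to_depth_profile(spectrum, lam_j)
V = np.zeros_like(H)                          # second polarization channel (empty here)
intensity_db = 10 * np.log10(np.abs(H) ** 2 + np.abs(V) ** 2 + 1e-12)
```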

Fig. 3. Ray diagram of light propagation to one camera in the spectrometer depicting the mapping between wavelength and pixel number. θi is the incident angle on the grating. θc is the transmitted angle for wavelength λc that travels through the center of the lens of focal length F. xc is the distance from the first pixel to the position of λc on the line scan camera array, which has a pixel width of w.

2.2. Flow

Phase-resolved flow processing was implemented, in which the phase difference between successive depth scans is used to determine flow [26]. Two different implementations can be applied depending on user preference: bi-directional flow and phase variance imaging [32]. Since we probe with two different polarization states, the phases, ϕHn and ϕVn, at a particular depth, zm, in one depth profile are compared to those in the previous depth profile with the same polarization state to yield ΔϕHn,Vn(zm) = ϕHn,Vn(zm) − ϕHn−2,Vn−2(zm). Phase differences for the first two depth profiles (n=0,1) were assumed to be zero. A correction algorithm was applied to remove the overall phase shift between depth profiles. The flow thread computed either bi-directional flow (ω) or phase variance (σ²) according to

$$\omega_n(z_m) = \frac{1}{2T}\,\frac{\left|H_n(z_m)\right|^2 \Delta\phi_{H_n}(z_m) + \left|V_n(z_m)\right|^2 \Delta\phi_{V_n}(z_m)}{\left|H_n(z_m)\right|^2 + \left|V_n(z_m)\right|^2}, \qquad \sigma_n^2(z_m) = \frac{\left|H_n(z_m)\right|^2 \Delta\phi_{H_n}^2(z_m) + \left|V_n(z_m)\right|^2 \Delta\phi_{V_n}^2(z_m)}{\left|H_n(z_m)\right|^2 + \left|V_n(z_m)\right|^2},$$

where 2T is the time separating successive depth profiles acquired with the same polarization state.

In both cases, the resulting data was averaged over a user-selectable area, typically chosen to correspond to the lateral beam width in units of depth profiles, nl, and coherence length, zc, in units of pixels.
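
As an illustration, a compact sketch of this flow computation (assuming Python/NumPy and complex depth profiles H and V already arranged as arrays of depth profiles by depth pixels) is given below; the helper name and synthetic input are invented for the example, and the overall phase-shift correction and area averaging described above are omitted.

```python
# Hypothetical sketch of the phase-resolved flow processing described above.
# H and V are complex depth-profile arrays of shape (N profiles, M pixels);
# successive profiles with the same polarization state are two apart.
import numpy as np

def flow_maps(H: np.ndarray, V: np.ndarray, T: float):
    """Return bi-directional flow (omega) and phase variance (sigma^2) maps."""
    N, M = H.shape
    dphi_H = np.zeros((N, M))
    dphi_V = np.zeros((N, M))
    # Phase difference with the previous profile of the SAME polarization state
    dphi_H[2:] = np.angle(H[2:] * np.conj(H[:-2]))
    dphi_V[2:] = np.angle(V[2:] * np.conj(V[:-2]))

    wH, wV = np.abs(H) ** 2, np.abs(V) ** 2        # intensity weights
    denom = wH + wV + 1e-12
    omega = (wH * dphi_H + wV * dphi_V) / (2 * T * denom)
    sigma2 = (wH * dphi_H ** 2 + wV * dphi_V ** 2) / denom
    return omega, sigma2

# Example with synthetic data: 1024 depth profiles of 256 pixels at 18.5 kHz
rng = np.random.default_rng(0)
H = rng.standard_normal((1024, 256)) + 1j * rng.standard_normal((1024, 256))
V = rng.standard_normal((1024, 256)) + 1j * rng.standard_normal((1024, 256))
omega, sigma2 = flow_maps(H, V, T=1 / 18500.0)
```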

2.3. Polarization

A generalized Jones-matrix based analysis for fiber-based PS-OCT [33] can be applied to determine sample birefringence, diattenuation, and optic axis orientation, and was used for post-processing of data. However, in biological tissue, the angular deviation of a Stokes vector in a Poincaré sphere representation due to diattenuation is negligible compared to that due to birefringence [33]. In other words, birefringence can be accurately estimated without considering diattenuation in such samples. Thus, a computationally less intensive Stokes-vector based analysis [17, 31, 34] that compares the polarization states of light reflected from the sample surface to that backscattered from some depth for two incident polarization states perpendicular in a Poincaré sphere representation can be applied to determine sample phase retardation and optic axis orientation. This approach was implemented in the acquisition software. Stokes vectors, Sn(zm), were calculated from the complex depth profiles, Hn(zm) and Vn(zm), such that

$$S_n(z_m) = \begin{bmatrix} Q_n(z_m) \\ U_n(z_m) \\ V_n(z_m) \end{bmatrix} = \begin{bmatrix} H_n(z_m)\,\tilde{H}_n(z_m) - V_n(z_m)\,\tilde{V}_n(z_m) \\ H_n(z_m)\,\tilde{V}_n(z_m) + \tilde{H}_n(z_m)\,V_n(z_m) \\ i\left(H_n(z_m)\,\tilde{V}_n(z_m) - \tilde{H}_n(z_m)\,V_n(z_m)\right) \end{bmatrix}.$$

The Stokes vectors were then averaged over a user-selectable area, typically chosen to correspond to the lateral beam diameter, nl, in width and the coherence length, zc, in depth, and normalized to yield Nn(zm) = Σnl,zc Sn(zm) / |Σnl,zc Sn(zm)|, where the sums run over the averaging area. The position of the sample surface, sn, was determined by median filtering and thresholding. A simplified form of the cumulative sample optic axis, Ap, that simultaneously rotates a pair of surface polarization states to states at a particular depth is given by

$$A_p(z_m) \propto \left(N_{2p}(s_{2p}) - N_{2p}(z_m)\right) \times \left(N_{2p+1}(s_{2p+1}) - N_{2p+1}(z_m)\right).$$

The phase retardation angle, θ2p, necessary to rotate N2p(s2p) to N2p(zm) about Ap(zm) can be calculated by

$$\cos\!\left(\theta_{2p}(z_m)\right) = \frac{\left(A_p(z_m) \times N_{2p}(s_{2p})\right) \cdot \left(A_p(z_m) \times N_{2p}(z_m)\right)}{\left|A_p(z_m) \times N_{2p}(s_{2p})\right|\,\left|A_p(z_m) \times N_{2p}(z_m)\right|}.$$

A corresponding expression yields the phase retardation angle for the other incident polarization state, θ2p+1. A weighted average was used to determine the overall phase retardation angle [31].
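
A self-contained sketch of this Stokes-vector retardation calculation, for a single depth and a single pair of incident states, is given below in Python; the optic axis, rotation angle, and helper names are invented for the example, and the surface detection, area averaging, and weighted averaging of the two retardation angles are not shown.

```python
# Hypothetical sketch of the Stokes-vector retardation calculation described
# above, for a single depth z_m and one pair (2p, 2p+1) of incident states.
import numpy as np

def stokes(h: complex, v: complex) -> np.ndarray:
    """Q, U, V components from the two complex channels (unnormalized)."""
    return np.array([
        (h * np.conj(h) - v * np.conj(v)).real,
        (h * np.conj(v) + np.conj(h) * v).real,
        (1j * (h * np.conj(v) - np.conj(h) * v)).real,
    ])

def normalize(s: np.ndarray) -> np.ndarray:
    return s / np.linalg.norm(s)

def retardation(N_surf_a, N_depth_a, N_surf_b, N_depth_b) -> float:
    """Rotation angle about the axis carrying both surface states to depth."""
    A = np.cross(N_surf_a - N_depth_a, N_surf_b - N_depth_b)   # optic axis (unnormalized)
    u = np.cross(A, N_surf_a)
    w = np.cross(A, N_depth_a)
    cos_theta = np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def rotate(v, axis, angle):
    """Rodrigues rotation, used here only to fabricate test data."""
    axis = normalize(axis)
    return (v * np.cos(angle) + np.cross(axis, v) * np.sin(angle)
            + axis * np.dot(axis, v) * (1 - np.cos(angle)))

# Toy example: two surface states (perpendicular on the Poincare sphere) rotated
# by 0.4 rad about an assumed optic axis, mimicking birefringent propagation.
axis_true = normalize(np.array([1.0, 0.0, 1.0]))
Na_surf = normalize(stokes(1.0 + 0j, 0.0 + 0j))    # even-profile incident state
Nb_surf = normalize(stokes(1.0 + 0j, 1.0 + 0j))    # odd-profile incident state
Na_depth = rotate(Na_surf, axis_true, 0.4)
Nb_depth = rotate(Nb_surf, axis_true, 0.4)
print(retardation(Na_surf, Na_depth, Nb_surf, Nb_depth))   # ~0.4 rad
```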

3. Results

3.1. System characterization

An analysis similar to that presented in Yun et al. [16] resulted in the following system characterization. The photon-to-electron conversion efficiency of the spectrometer was found to be 0.70 for all polarization states. The imaging depth was found to be 2.0 mm in air, using a neutral density filter and a mirror in the sample arm and altering the relative length between the sample and reference arms of the interferometer; this measurement revealed an 8 dB sensitivity drop-off over that range. The FWHM of the coherence peak was less than 11 µm up to a depth of 1.0 mm, in good agreement with calculation based on a source with a 68 nm bandwidth centered at 1320 nm, and increased to 14 µm at a depth of 1.8 mm. A polarized source power of 11 mW resulted in 6 mW incident on the sample surface. The cameras were operated at the highest possible synchronized rate of 18.5 kHz with an exposure time of 24.4 µs per depth profile, resulting in an imaging speed of 18 fps for frames composed of 1024 depth profiles.

Fig. 4. (2.13 MB) Movie depicting a large volume scan of the fingerprint region. A logarithmic gray-scale was used to display the entire intensity dynamic range. The volume is composed of 104 frames covering a length of 6.5 mm, where individual image frames are composed of 1024 depth profiles covering a width of 8 mm, and was acquired in 5.75 seconds.

The high speed and sensitivity of the system allows for rapid imaging of large tissue volumes. Figure 4 shows intensity images acquired in vivo from a volume of the fingerprint region. The volume is composed of 104 frames covering a length of 6.5 mm, with individual frames composed of 1024 depth profiles spanning 8 mm, and was acquired at 18 fps in 5.75 seconds. Images are displayed on a logarithmic scale, with an SNR of approximately 50 dB at the sample surface. The surface of the window of the scanning hand-piece, in light contact with the finger, is visible in the intensity image. The stratum corneum and stratum granulosum are evident, and a number of sweat ducts are visible throughout the imaged volume. Also, variations in the epidermal ridges and dermal papillae reflective of fingerprint patterns can be observed.

3.2. Phase stability and flow

The range of both bi-directional flow and phase variance measurements is governed by the minimum and maximum detectable phase difference. Phase wrapping limits the maximum detectable phase difference to ±π. One fundamental limitation on the minimum detectable phase difference arises from the signal-to-noise ratio (SNR) of a measurement. To quantify this effect, 1024 consecutive depth profiles at a single point on a glass slide were obtained at different SNRs using a variable neutral density filter in the sample arm. The phase difference between the front and back surfaces of the glass slide was determined, and the standard deviation of these phase differences was calculated. This standard deviation directly yields the minimum detectable phase difference. The resulting standard deviations, plotted as a function of SNR in Fig. 5, demonstrate a decreasing minimum detectable phase difference, Δϕ, with increasing SNR. The theoretical curve was generated based on a simple noise model. In the shot noise limit, noise can be modeled as a random vector, n⃗, added to the complex electric field vector, E⃗, describing light returning from the sample, to yield a measured complex electric field, E⃗′. The SNR of such a measurement is given by SNR = (|E⃗|/|n⃗|)². Assuming a good SNR (|n⃗| ≪ |E⃗|), the variance of the phase ϕ′ of E⃗′ is given by σϕ′² = (1/2)(|n⃗|/|E⃗|)² = (1/2)(SNR)⁻¹. The minimum detectable phase difference σΔϕ between two measured phases ϕ′ can now be determined as

$$\sigma_{\Delta\phi} = \sqrt{2\,\sigma_{\phi'}^2} = \left(\mathrm{SNR}\right)^{-1/2}. \tag{2}$$

This is similar to an expression derived by Yazdanfar et al. [35]. The experimental results show good agreement with predicted theory and underscore the importance of SNR for accurate flow measurements.
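
A simple Monte-Carlo check of Eq. (2) under the additive complex Gaussian noise assumption used above is sketched below; the trial count and random seed are arbitrary.

```python
# Hypothetical Monte-Carlo check of Eq. (2): the standard deviation of the
# phase difference between two measurements approaches SNR^(-1/2) at high SNR.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200_000

for snr_db in (20, 30, 40):
    snr = 10 ** (snr_db / 10)
    E = 1.0 + 0j                             # noise-free field with phase 0
    sigma = np.sqrt(1.0 / (2 * snr))         # per-quadrature noise so that <|n|^2> = 1/SNR
    n1 = sigma * (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials))
    n2 = sigma * (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials))
    dphi = np.angle(E + n1) - np.angle(E + n2)
    print(f"{snr_db} dB: measured {dphi.std():.5f} rad, predicted {snr ** -0.5:.5f} rad")
```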

Fig. 5. Standard deviation, σΔϕ, of the phase difference between the front and back surfaces of a glass slide as a function of signal-to-noise ratio (SNR) on a log-log scale. Squares: standard deviation over 1024 measurements. Line: theoretical curve (Eq. 2 in text).

Using phase differences to quantify flow inherently assumes that the depth profiles from which the phases are extracted represent the same physical structure. The previously described fundamental limit based on SNR is achieved in the limit of no lateral scanning across the sample. In the opposite limit, where two depth profiles image completely different volumes of tissue, the phases cannot be reliably compared. To quantify this effect, 1024 depth profiles from a uniformly scattering sample were acquired with different lateral scan widths. The standard deviation of the phase differences over the imaged area (SNR ≈ 15 dB) was calculated and plotted in Fig. 6 as a function of the lateral distance moved between depth profiles divided by the focused beam width, demonstrating that the phase error increases as the fraction of the beam width moved between depth profiles increases. The theoretical curve was based on the following model [36]: the measured phase at a particular point is the ensemble average of random phases, ϕ(x,y), from within the imaged area, weighted by a two-dimensional Gaussian amplitude cross section defined by the 1/e² beam width at the focus, d, such that A(x,y) = (4/πd²) exp(−(4/d²)(x² + y²)). The random phases are assumed to be completely uncorrelated, such that ⟨ϕ(x,y)ϕ(x′,y′)⟩ = (π²/3) δ(x−x′) δ(y−y′), where δ denotes the Dirac delta function. The expected standard deviation, σΔx, due to movement of the beam to A(x−Δx, y), where Δx is the lateral distance moved, is then given by

$$\sigma_{\Delta x} = \sqrt{\frac{4\pi}{3}\left(1 - \exp\!\left(-2\left(\frac{\Delta x}{d}\right)^{2}\right)\right)}. \tag{3}$$

The overall phase error, σphase, can be estimated as the quadrature combination of these two fundamental limits, σphase = √(σΔϕ² + σΔx²). The theoretical curve fits well with the experimental data. These results represent the limit where phases are uncorrelated; longer-range phase correlations may be present in structured samples.
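
The two error terms and their quadrature sum can be evaluated directly, as in the short sketch below; the 15 dB SNR matches the measurement described above, while the sampling of Δx/d is arbitrary.

```python
# Hypothetical evaluation of the lateral-scanning phase-error model (Eq. 3)
# and its quadrature combination with the SNR-limited term (Eq. 2), as in Fig. 6.
import numpy as np

def sigma_dx(dx_over_d):
    """Phase error from moving a fraction dx/d of the beam width between scans."""
    return np.sqrt(4 * np.pi / 3 * (1 - np.exp(-2 * np.asarray(dx_over_d) ** 2)))

def sigma_dphi(snr_db):
    """SNR-limited phase error."""
    return (10 ** (snr_db / 10)) ** -0.5

frac = np.linspace(0.0, 1.0, 11)         # lateral step as a fraction of beam width
total = np.sqrt(sigma_dphi(15.0) ** 2 + sigma_dx(frac) ** 2)
for f, s in zip(frac, total):
    print(f"dx/d = {f:.1f}: sigma_phase = {s:.3f} rad")
```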

Fig. 6. Standard deviation of the phase differences measured within a uniformly scattering medium as a function of the ratio of the lateral distance moved between depth scans to the focused beam width in the sample. Squares: standard deviation over 1024 depth profiles. Lines: theoretical curves corresponding to the overall phase error (σphase) and the components due to lateral scanning (σΔx, Eq. 3) and SNR (σΔϕ, Eq. 2). Inset: lateral scan speed relative to the beam width corresponding to a phase error equal to that for a particular SNR (σΔϕ = σΔx).

The relative magnitude of these two effects can be studied by examining conditions for which σΔϕ = σΔx. The inset of Fig. 6 displays the lateral scan speed (lateral distance moved between depth scans normalized to the beam diameter) as a function of SNR in the limit σΔϕ = σΔx. From the graph, SNRs of 30 and 40 dB correspond to beam-width fractions of approximately 0.010 and 0.003, respectively. For the system described here (1024 depth profiles alternating between two polarization states and a 22.4 µm beam width), this would correspond to full image widths of 115 and 34.4 µm, respectively. As typical imaging parameters call for image widths in excess of 1 mm, the phase error due to lateral scanning dominates that due to SNR and is the limiting factor in phase determination.
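
The image widths quoted above follow from simple arithmetic, reproduced in the sketch below under the stated assumptions (22.4 µm beam width and 1024 depth profiles alternating between two polarization states, so 512 lateral steps between profiles of the same state).

```python
# Hypothetical arithmetic behind the image widths quoted above.
beam_width_um = 22.4
steps_per_frame = 1024 // 2       # same-state profiles are two apart

for frac in (0.010, 0.003):       # beam-width fractions quoted in the text
    width_um = frac * beam_width_um * steps_per_frame
    print(f"fraction {frac}: full image width = {width_um:.1f} um")
# fraction 0.010: full image width = 114.7 um  (~115 um)
# fraction 0.003: full image width = 34.4 um
```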

The beating heart of a Xenopus laevis tadpole has been used to demonstrate, in vivo, the flow imaging capability of previous OCT systems [37–39]. Figure 7 is a time sequence of images acquired at 18 fps from a single imaging plane of a Xenopus laevis tadpole. Individual frames are composed of 1024 depth profiles spanning a width of 0.8 mm. The movement of three distinct cardiac chambers, a large ventricle and two smaller atria, is evident in the intensity images. The dynamics of the very rapid cardiac cycle become more evident in the bi-directional flow images, where movement of the ventricle and one of the atria is especially apparent. The time sequence reveals the ability of the system to image flow dynamics at high speed and with excellent dynamic range in the calculated flow.

Fig. 7. (2.09 MB) Beating heart of a Xenopus laevis tadpole (1024 depth profiles spanning 0.8 mm in width, 3.87 seconds). Intensity images are displayed on the left, with the ventricle and atria of the heart clearly visible in the upper portions of the imaged region. Unwrapped bi-directional flow is displayed in the middle (gray scale from −3π to 3π) and right (image width and depth on the XY-plane, phase shift indicated by the Z-axis), where flow in the ventricle and one atrium is especially apparent.

3.3. Birefringence

The polarization state of light returning from a sample is determined by the relative amplitude and phase of the interferograms registered by the two orthogonally aligned detectors. The accuracy of polarization state determination is limited by the accuracy of phase determination and is therefore also fundamentally limited by SNR. This accuracy can be characterized by the angular standard deviation, in a Poincaré sphere representation, of a measured polarization state. The phase error contributions from the two orthogonal electric fields can be translated into an angular standard deviation, Δθ, in a Poincaré sphere representation by the relation [40]:

$$\Delta\theta = \sqrt{\frac{2}{\mathrm{SNR}}}.$$
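
As a short numerical illustration, assuming the relation holds as reconstructed above:

```python
# Hypothetical numerical illustration of the angular uncertainty vs. SNR.
import numpy as np

for snr_db in (20, 30, 40):
    snr = 10 ** (snr_db / 10)
    dtheta = np.sqrt(2 / snr)
    print(f"SNR {snr_db} dB: Delta-theta = {dtheta:.4f} rad ({np.degrees(dtheta):.2f} deg)")
```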

Lateral movement should also be expected to degrade polarization state measurement accuracy, since the measurement assumes that the orthogonally polarized depth scans sense the same sample structure.

Alignment of the line scan cameras with respect to one another is another important consideration for accurate determination of sample polarization properties. As mentioned earlier, interpolation of hn(λ) and vn(λ) to hn(k) and vn(k) depends on knowledge of the wavelength incident on the different pixels of the line scan cameras. An error, Δλ, in the assumed incident wavelength, λ, yields a deviation in wave number given by Δk = 2πΔλ/λ². If the two line scan cameras have even slightly different errors, the relative deviation in wave number can give rise to an artificial appearance of birefringence. As an example, for a center wavelength λ = 1320 nm and a relative alignment error of Δλ = 1 nm between the cameras, we can expect a phase difference Δϕ = 3.606 radians to accumulate over a depth of 1 mm. The cumulative effect of these phase differences across the line scan cameras can lead to an overall phase retardation that cannot be distinguished from that due to sample birefringence. The most common cause of this artifact is a difference in the positioning of the line scan cameras within the focal line, equivalent to a difference in the xc parameter of Eq. 1. A two-step calibration process was used to remove this artifact. First, imaging of a non-birefringent sample was used to roughly calibrate the two cameras. This was achieved by translating the cameras with respect to one another along the focal line until the observed birefringence was negligible, a step facilitated by the real-time display, which allowed immediate quantification of the birefringence artifact. Second, a more precise calibration was performed in post-processing by optimizing the remapping parameters, including xc, for the two cameras. Strictly speaking, the rough calibration step is unnecessary, since proper calibration of the remapping parameters can completely eliminate the birefringence artifact; in practice, however, without the rough calibration step the range over which xc must be searched can be quite large. The two-step process simplifies the overall optimization and allows for accurate measurement of sample polarization properties.
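
The wave-number error quoted above can be checked in a few lines, using the same assumed 1 nm relative calibration error at a 1320 nm center wavelength:

```python
# Hypothetical check of the calibration-artifact estimate given above.
import numpy as np

lam = 1320e-9      # center wavelength [m]
dlam = 1e-9        # assumed relative calibration error between cameras [m]
depth = 1e-3       # depth within the profile [m]

dk = 2 * np.pi * dlam / lam ** 2    # wave-number error [rad/m]
dphi = dk * depth                   # accumulated phase difference [rad]
print(f"dk = {dk:.1f} rad/m, dphi over 1 mm = {dphi:.3f} rad")   # ~3.606 rad
```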

Fig. 8. Representative TD- and SD-OCT images of the same chicken breast muscle sample. The width of the images was 4.0 mm, and the depth was 1.2 and 1.4 mm for the TD- and SD-OCT images, respectively. Each set of images (TD, SD) is composed of an intensity image (a, c) and a phase retardation image (b, d). The unwrapped phase retardation profiles were averaged over the full width of the image (e). Intensity images are gray-scale encoded over the dynamic range of the image, and phase retardation images are gray-scaled from black to white, representing phase retardations from −π to π radians.

Since TD-OCT does not suffer from any such artifact, direct comparison of TD- and SD-OCT results can be used to verify the accuracy of the remapping procedure and the removal of birefringence artifacts. PS-OCT images were taken of a chicken muscle sample with its surface oriented orthogonal to the axis of the incident beam; the sample was rotated about this axis in 20° increments, spanning a full 360°. Data from orientations with signal saturation were ignored. The same sample was then imaged with the same geometry using a TD-OCT system capable of imaging at 2048 depth scans per second [34]. A more computationally intensive Jones matrix based analysis capable of providing unwrappable phase retardations was used to analyze both sets of data [33], and the data from this Jones matrix based approach are displayed. Representative data acquired with both systems are shown in Fig. 8. Average double-pass phase retardation slopes of 0.919±0.031°/µm and 0.885±0.069°/µm for the time-domain and spectral-domain systems, respectively, are in excellent agreement. The resulting optic axes, along with a plane determined by least-squares fitting, as well as their corresponding orientation angles on that plane, are shown in Fig. 9. The calculated orientation shows good agreement with the set orientation angle of the tissue sample. These results demonstrate the ability of the system to accurately determine sample polarization properties, such as phase retardation and optic axis orientation.

Fig. 9. A. Optic axis orientation in a Poincaré sphere representation of the calculated optic axes (arrows) for various set orientations of the tissue sample optic axis. The plane (dotted circle) in which these optic axes lie was determined by least-squares fitting. B. Calculated optic axis orientation as a function of set orientation relative to 0°. Squares: Measured orientation. Line: Linear fit to the data.

3.4. Applications

Figure 10 shows intensity and phase variance images acquired from a single imaging plane of a human fingertip. The surface of the window of the scanning handpiece, in light contact with the finger, is visible in the intensity image. The epidermal and dermal regions, as well as two sweat ducts, are evident in the intensity image. A capillary loop of the superficial dermal plexus can be observed in the phase variance image by its lighter color. Locating small capillaries such as this can be problematic, as slight pressure, such as the weight of the handpiece, can easily compress the vessels, stopping flow. The pulsatility of flow in the vessel can be observed through time-varying changes in phase variance, demonstrating the ability to study blood flow dynamics in vivo.

Fig. 10. (1.03 MB) A time-sequence of human fingertip images (1024 depth profiles spanning 3 mm in width, 2.65 seconds). Intensity (upper) images of the same imaging plane. Blood flow pulsatility is visible in the phase variance images (lower).

Figure 11 shows multi-functional images acquired from a volume of a fingertip. The volume is divided into 64 frames covering 2 mm, with each frame composed of 1024 depth profiles spanning 4 mm in width. The stratum corneum and granulosum are visible in the intensity images, as are epidermal ridges reflective of the fingerprint pattern and a number of sweat ducts. The structure of the superficial dermal plexus can be observed within the imaged volume in both the phase variance and bi-directional flow images. Thinner capillary loops can be observed in both the phase variance and bi-directional flow images, and a larger venule is evident in the phase variance image. The phase retardation image reveals the dermo-epidermal junction as well as the dermal papillae, which also reflect the fingerprint pattern.

Fig. 11. (2.30 MB) A volume scan of a fingertip. Intensity (upper left), phase retardation (lower left), phase variance (upper right), and bi-directional flow (lower right). The volume is composed of 64 frames covering a length of 4 mm, where individual image frames are composed of 1024 depth profiles covering a width of 4 mm, and was acquired in 3.54 seconds.

4. Conclusion

In conclusion, we have demonstrated a fiber-based multi-functional SD-OCT system capable of acquiring 18,500 depth profiles per second and of real-time display of intensity, birefringence, and flow images composed of 1024 depth profiles. We demonstrated fundamental limits on the accuracy of phase determination as functions of SNR and of lateral scan speed relative to the beam width, and showed that the effect due to lateral scanning dominates for typical imaging parameters. The effect of phase accuracy on Doppler and PS-OCT measurements was discussed. Birefringence artifacts were discussed and a strategy for their removal was demonstrated. The accuracy of sample birefringence determination was verified by comparing data acquired with the system to that obtained using a TD-OCT system. The ability of the system to study blood flow dynamics as well as to perform rapid, large-volume multi-functional imaging was demonstrated and indicates the potential clinical utility of this technology.

Acknowledgments

Research grants from the National Institutes of Health (R01-RR019768 and R01-EY014975) and the U.S. Department of Defense (F49620-01-10014) are gratefully acknowledged. B.H. Park’s email address is hylepark@helix.mgh.harvard.edu.

References and links

1. D. Huang, E.A. Swanson, C.P. Lin, J.S. Schuman, W.G. Stinson, W. Chang, M.R. Hee, T. Flotte, K. Gregory, C.A. Puliafito, and J.G. Fujimoto, “Optical Coherence Tomography,” Science 254, 1178 (1991).

2. E.A. Swanson, J.A. Izatt, M.R. Hee, D. Huang, C.P. Lin, J.S. Schuman, C.A. Puliafito, and J.G. Fujimoto, “In-vivo retinal imaging by optical coherence tomography,” Opt. Lett. 18, 1864 (1993).

3. B.E. Bouma, G.J. Tearney, C.C. Compton, and N.S. Nishioka, “High-resolution imaging of the human esophagus and stomach in vivo using optical coherence tomography,” Gastrointest. Endosc. 51, 467 (2000).

4. I.K. Jang, G.J. Tearney, D.H. Kang, Y.C. Moon, S.J. Park, S.W. Park, K.B. Seung, S.L. Houser, M. Shishkov, E. Pomerantsev, H.T. Aretz, and B.E. Bouma, “Comparison of optical coherence tomography and intravascular ultrasound for detection of coronary plaques with large lipid-core in living patients,” Circulation 102, 509 (2000).

5. A.F. Fercher, C.K. Hitzenberger, G. Kamp, and S.Y. Elzaiat, “Measurement of Intraocular Distances by Backscattering Spectral Interferometry,” Opt. Commun. 117, 43 (1995).

6. G. Hausler and M.W. Lindner, “Coherence Radar and Spectral Radar - New Tools for Dermatological Diagnosis,” J. Biomed. Opt. 3, 21 (1998).

7. M. Wojtkowski, R. Leitgeb, A. Kowalczyk, T. Bajraszewski, and A.F. Fercher, “In vivo human retinal imaging by Fourier domain optical coherence tomography,” J. Biomed. Opt. 7, 457 (2002).

8. S.H. Yun, G.J. Tearney, J.F. de Boer, N. Iftimia, and B.E. Bouma, “High-speed optical frequency-domain imaging,” Opt. Express 11, 2953 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-22-2953.

9. P. Andretzky, M.W. Lindner, J.M. Hermann, A. Schultz, M. Konzog, F. Kiesewetter, and G. Hausler, “Optical coherence tomography by spectral radar: dynamic range estimation and in vivo measurements of skin,” Proc. SPIE 3567, 78 (1998).

10. T. Mitsui, “Dynamic range of optical reflectometry with spectral interferometry,” Jpn. J. Appl. Phys., Part 1, 38, 6133 (1999).

11. R. Leitgeb, C.K. Hitzenberger, and A.F. Fercher, “Performance of fourier domain vs. time domain optical coherence tomography,” Opt. Express 11, 889 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-8-889.

12. J.F. de Boer, B. Cense, B.H. Park, M.C. Pierce, G.J. Tearney, and B.E. Bouma, “Improved signal-to-noise ratio in spectral-domain compared with time-domain optical coherence tomography,” Opt. Lett. 28, 2067 (2003).

13. M.A. Choma, M.V. Sarunic, C.H. Yang, and J.A. Izatt, “Sensitivity advantage of swept source and Fourier domain optical coherence tomography,” Opt. Express 11, 2183 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-18-2183.

14. N. Nassif, B. Cense, B.H. Park, S.H. Yun, T.C. Chen, B.E. Bouma, G.J. Tearney, and J.F. de Boer, “In vivo human retinal imaging by ultrahigh-speed spectral domain optical coherence tomography,” Opt. Lett. 29, 480 (2004).

15. N.A. Nassif, B. Cense, B.H. Park, M.C. Pierce, S.H. Yun, B.E. Bouma, G.J. Tearney, T.C. Chen, and J.F. de Boer, “In vivo high-resolution video-rate spectral-domain optical coherence tomography of the human retina and optic nerve,” Opt. Express 12, 367 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-3-367.

16. S.H. Yun, G.J. Tearney, B.E. Bouma, B.H. Park, and J.F. de Boer, “High-speed spectral-domain optical coherence tomography at 1.3 µm wavelength,” Opt. Express 11, 3598 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-26-3598.

17. B.H. Park, C. Saxer, S.M. Srinivas, J.S. Nelson, and J.F. de Boer, “In vivo burn depth determination by high-speed fiber-based polarization sensitive optical coherence tomography,” J. Biomed. Opt. 6, 474 (2001).

18. M.C. Pierce, R.L. Sheridan, B.H. Park, B. Cense, and J.F. de Boer, “Collagen denaturation can be quantified in burned human skin using polarization-sensitive optical coherence tomography,” Burns 30, 511 (2004).

19. B. Cense, T.C. Chen, B.H. Park, M.C. Pierce, and J.F. de Boer, “In vivo depth-resolved birefringence measurements of the human retinal nerve fiber layer by polarization-sensitive optical coherence tomography,” Opt. Lett. 27, 1610 (2002).

20. D. Fried, J. Xie, S. Shafi, J.D.B. Featherstone, T.M. Breunig, and C. Le, “Imaging caries lesions and lesion progression with polarization sensitive optical coherence tomography,” J. Biomed. Opt. 7, 618 (2002).

21. Y. Yasuno, S. Makita, Y. Sutoh, M. Itoh, and T. Yatagai, “Birefringence imaging of human skin by polarization-sensitive spectral interferometric optical coherence tomography,” Opt. Lett. 27, 1803 (2002).

22. Y. Yasuno, S. Makita, T. Endo, M. Itoh, and T. Yatagai, “Polarization-sensitive complex Fourier domain optical coherence tomography for Jones matrix imaging of biological samples,” Appl. Phys. Lett. 85, 3023 (2004).

23. X.J. Wang, T.E. Milner, and J.S. Nelson, “Characterization of Fluid-Flow Velocity by Optical Doppler Tomography,” Opt. Lett. 20, 1337 (1995).

24. Z.P. Chen, T.E. Milner, D. Dave, and J.S. Nelson, “Optical Doppler tomographic imaging of fluid flow velocity in highly scattering media,” Opt. Lett. 22, 64 (1997).

25. J.A. Izatt, M.D. Kulkarni, S. Yazdanfar, J.K. Barton, and A.J. Welch, “In vivo bidirectional color Doppler flow imaging of picoliter blood volumes using optical coherence tomography,” Opt. Lett. 22, 1439 (1997).

26. Y.H. Zhao, Z.P. Chen, C. Saxer, S.H. Xiang, J.F. de Boer, and J.S. Nelson, “Phase-resolved optical coherence tomography and optical Doppler tomography for imaging blood flow in human skin with fast scanning speed and high velocity sensitivity,” Opt. Lett. 25, 114 (2000).

27. R.A. Leitgeb, L. Schmetterer, C.K. Hitzenberger, A.F. Fercher, F. Berisha, M. Wojtkowski, and T. Bajraszewski, “Real-time measurement of in vitro flow by Fourier-domain color Doppler optical coherence tomography,” Opt. Lett. 29, 171 (2004).

28. J.S. Nelson, K.M. Kelly, Y.H. Zhao, and Z.P. Chen, “Imaging blood flow in human Port-wine stain in situ and in real time using optical Doppler tomography,” Arch. Dermatol. 137, 741 (2001).

29. R.A. Leitgeb, L. Schmetterer, W. Drexler, A.F. Fercher, R.J. Zawadzki, and T. Bajraszewski, “Real-time assessment of retinal blood flow with ultrafast acquisition by color Doppler Fourier domain optical coherence tomography,” Opt. Express 11, 3116 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-23-3116.

30. B.R. White, M.C. Pierce, N. Nassif, B. Cense, B.H. Park, G.J. Tearney, B.E. Bouma, T.C. Chen, and J.F. de Boer, “In vivo dynamic human retinal blood flow imaging using ultra-high-speed spectral domain optical Doppler tomography,” Opt. Express 11, 3490 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-25-3490.

31. B.H. Park, M.C. Pierce, B. Cense, and J.F. de Boer, “Real-time multi-functional optical coherence tomography,” Opt. Express 11, 782 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-7-782.

32. Y.H. Zhao, Z.P. Chen, C. Saxer, Q.M. Shen, S.H. Xiang, J.F. de Boer, and J.S. Nelson, “Doppler standard deviation imaging for clinical monitoring of in vivo human skin blood flow,” Opt. Lett. 25, 1358 (2000).

33. B.H. Park, M.C. Pierce, B. Cense, and J.F. de Boer, “Jones matrix analysis for a polarization-sensitive optical coherence tomography system using fiber-optic components,” Opt. Lett. 29, 2512 (2004).

34. M.C. Pierce, B.H. Park, B. Cense, and J.F. de Boer, “Simultaneous intensity, birefringence, and flow measurements with high speed fiber-based optical coherence tomography,” Opt. Lett. 27, 1534 (2002).

35. S. Yazdanfar, C.H. Yang, M.V. Sarunic, and J.A. Izatt, “Frequency estimation precision in Doppler optical coherence tomography using the Cramer-Rao lower bound,” Opt. Express 13, 410 (2005), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-13-2-410.

36. S.H. Yun, G.J. Tearney, J.F. de Boer, and B.E. Bouma, “Motion artifacts in optical coherence tomography with frequency-domain ranging,” Opt. Express 12, 2977 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-13-2977.

37. A.M. Rollins, M.D. Kulkarni, S. Yazdanfar, R. Ung-arunyawee, and J.A. Izatt, “In vivo video rate optical coherence tomography,” Opt. Express 3, 219 (1998), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-3-6-219.

38. V. Westphal, S. Yazdanfar, A.M. Rollins, and J.A. Izatt, “Real-time high velocity-resolution color Doppler optical coherence tomography,” Opt. Lett. 27, 34 (2002).

39. V.X.D. Yang, M.L. Gordon, E. Seng-Yue, S. Lo, B. Qi, J. Pekar, M. A., B.C. Wilson, and I.A. Vitkin, “High speed, wide velocity dynamic range Doppler optical coherence tomography (Part II): imaging in vivo cardiac dynamics of Xenopus laevis,” Opt. Express 11, 1650 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-14-1650.

40. B.H. Park, M.C. Pierce, B. Cense, and J.F. de Boer, “Optic axis determination for fiber-based polarization-sensitive optical coherence tomography,” Opt. Lett., submitted for review (2005).

Supplementary Material (4)

Media 1: MOV (2186 KB)     
Media 2: MOV (2150 KB)     
Media 3: MOV (1057 KB)     
Media 4: MOV (2360 KB)     
