Abstract

Optical coherence tomography (OCT) has been gaining acceptance in image-guided microsurgery as a noninvasive imaging technique. However, when using B-mode OCT imaging, it is difficult to continuously keep the surgical tool in the imaging field, and the image of the tissue beneath the tool is corrupted by shadow effects. The alternative, C-mode OCT imaging, is either too slow when operating in a high-resolution mode or provides poor image resolution in a high-speed mode when the sweep rate is below one megahertz. Moreover, the 3-dimensional rendering of the C-mode OCT image makes it difficult to visualize the tissue structure and track the surgical tool beneath the tissue surface. To solve these problems, we propose a BC-mode OCT image visualization method. This method uses a sparse C-scanning scheme, which provides a set of high-resolution B-mode OCT images at sparsely spaced cross sections. The final BC-mode OCT image is obtained by averaging the image set, with inter-frame variance processing to enhance the signal of the surgical tool and tissue layers. The performance of BC-mode OCT images, including image resolution, signal-to-noise ratio (SNR), imaging speed, and surgical tool tracking accuracy, is analyzed theoretically and verified experimentally. The feasibility of the proposed method is evaluated by guiding the freehand insertion of a 30-gauge needle into the cornea of an ex-vivo human eye. The results show that the proposed method provides better visualization of both the surgical tool and the tissue structure than the conventional B- or C-mode OCT image.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Microsurgery, such as ophthalmic surgery and neurosurgery, involves manipulation of delicate tissues and requires precise control of the surgical tool [1]. Therefore, providing a clear visualization of both the surgical tool and the tissue structure is of great importance to microsurgery guidance. Conventionally, surgeons use surgical microscopes to magnify the field of view, but the view is restricted to an en face perspective with limited depth information [2]. Alternatively, tomographic imaging modalities such as ultrasound, computed tomography (CT) and magnetic resonance imaging (MRI) provide structural information inside the tissue but suffer from limited resolution [3], radiation exposure [4] and incompatibility with surgical tools [5].

Optical coherence tomography (OCT) [6], a high-speed imaging modality with micrometer resolution, has been gaining acceptance in image-guided microsurgery [7–9]. Different visualization modes for OCT images have been explored to better guide the surgical tool. Kang et al. used the A-mode common-path OCT image to guide a microinjector [10]. Shin et al. used the M-mode OCT image to guide a surgical needle for lamellar keratoplasty [11]. Yu et al. used the B-mode OCT image for robotic microsurgery [12]. Kang et al. [7], Viehland et al. [13] and Keller et al. [14] used the C-mode OCT image for surgical tool tracking. These works show that A-mode and M-mode OCT images only provide depth information, and thus their visualization can be difficult to interpret. C-mode OCT images are easier to interpret. However, with an A-scan rate below one megahertz, C-mode OCT imaging is either slow, with an imaging speed below 5 Hz, or poor in image resolution at imaging speeds above 5 Hz, which are necessary to provide a decent field of view [15–19]. Although video-rate C-mode OCT imaging [20] and densely sampled wide-field snapshot C-mode OCT imaging [21] have been demonstrated, they come at the cost of an ultra-high-speed swept source on the order of one megahertz. A direct comparison is not fair when the sweep rates differ; a better criterion is therefore the number of scanning spots within each frame. Compared with B-mode OCT images, C-mode OCT images usually need far more scanning spots per frame, so B-mode OCT imaging is usually much faster than C-mode OCT imaging at the same sweep rate. Moreover, without projection onto a 2-dimensional plane, the 3-dimensional rendering of the C-mode OCT image highlights the tissue surface, making it difficult to visualize the tissue structure and track the surgical tool beneath the surface.
Compared with the C-mode OCT image, the B-mode OCT image provides a clearer cross-sectional view of the tissue structure and the surgical tool underneath the tissue surface, and it is faster in imaging speed. Therefore, the B-mode OCT image is a better intraoperative imaging mode than the C-mode OCT image. However, in conventional B-mode OCT imaging, it is difficult to continuously keep the surgical tool in the imaging field since it provides only one cross-sectional image of a single plane. Moreover, when the tool is in the imaging plane, the tissue signal is blocked by the surgical tool, which results in shadow effects in the B-mode OCT image.

In this paper, we propose a BC-mode OCT image visualization method to solve these problems, which compresses the sparsely scanned OCT volume data into a compounded B-mode OCT image. First, we introduce the BC-mode OCT system setup, scanning configuration, and calculation method. Then, we analyze the performance of BC-mode OCT imaging theoretically and verify the theoretical results with experimental data. Next, we describe in detail further image processing techniques that we use to enhance the BC-mode OCT image quality. Finally, we evaluate the BC-mode OCT image performance by guiding a 30-gauge needle into the ex-vivo human cornea freehand. The results show that the proposed method provides better visualization than the conventional B- or C-mode OCT image, which validates its effectiveness in guiding the surgical tool during microsurgery.

2. Method

2.1. System setup

The system setup is shown in Fig. 1(a). The system uses a 100 kHz swept-source OCT centered at 1060 nm with a 110 nm tuning range. The output of the swept laser is split into two arms by a 75:25 fiber coupler, with 25% of the light coupled to the sample arm and 75% to the reference arm. The output optical power in the sample arm is 2.5 mW. Polarization controllers are used to match the light polarization states between the two arms. Achromatic collimators are used to collimate the light from the fiber. Galvanometer scanners, which contain two scanning mirrors (shown in Fig. 1(b)), are used for scanning. The objective lens focuses the collimated light onto the sample. The numerical aperture (NA) of the objective lens is 0.05. The dispersion compensation lens compensates for the dispersion caused by the light double-passing through the objective lens. The signals from the sample arm and the reference arm are combined by a 50:50 fiber coupler and detected by a balanced detector. The interferometric data is digitized by a data acquisition board (12-bit, 500 MSPS) and then transferred to the workstation by a frame grabber for further data processing. Graphics processing units (GPU) are used for high-speed data processing [16,22]. The system is controlled by our customized software programmed in C++ and C#.

 

Fig. 1. (a) The system setup. FC, fiber coupler. PC, polarization controller. AC, achromatic collimator. GVS, galvanometer scanners which contain two scanning mirrors. OL, objective lens. DCL, dispersion compensation lens. BD+, BD-, balanced detector. DAQ, data acquisition board. (b) The scanning scheme.


We use a standard OCT C-scanning scheme, which is shown in Fig. 1(b). Two scanning mirrors are used for scanning: the first scans densely along the X axis, while the second scans sparsely along the Z axis. The orientation and the movement of the surgical tool are both set to be roughly parallel to the dense scanning axis X. The yellow lines shown in Fig. 1(b) indicate the multiple scanning cross sections in the XY plane. After one scanning period, the system obtains a set of OCT volume data, which is an image set containing multiple B-mode OCT images from different cross sections.

The BC-mode OCT image is obtained by averaging these multiple B-mode OCT images from different cross sections, which is calculated by Eq. (1),

$${I_{BC}}({i,j} )= \frac{1}{N}\sum\limits_{k = 1}^N {{S_k}({i,j} )} ,$$
where ${I_{BC}}$ is the BC-mode OCT image, ${S_k}$ is the k-th image in the image set, N is the total number of images in the image set and $({i,j} )$ indicates the pixel index in the image. Although averaging techniques have been widely used in OCT imaging, they are usually performed by averaging multiple scans at the same cross section [23,24], whereas the BC-mode averaging scheme takes the average across different cross sections. Similar averaged B-scan images from different cross sections were previously demonstrated by Tao et al. [25] and Ehlers et al. [26], but without theoretical analysis or further discussion of their performance. Moreover, those results still suffer from tool shadow effects due to the narrow scanning range. To solve this problem, we make the scanning along the Z axis sparse enough to cover a larger scanning range. The specific scanning configuration and the rationale are discussed in detail in the following paragraphs.
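As a minimal numerical sketch of Eq. (1), assuming the acquired volume is already arranged as an (N, H, W) array (the helper name `bc_mode_image` is ours, not from the paper), the compounding reduces to a mean along the frame axis:

```python
import numpy as np

def bc_mode_image(volume):
    """Compound a sparsely scanned OCT volume into a BC-mode image, Eq. (1).

    volume: array of shape (N, H, W), the N B-mode frames S_k acquired
    at adjacent cross sections. Returns I_BC of shape (H, W).
    """
    volume = np.asarray(volume, dtype=np.float64)
    return volume.mean(axis=0)

# Illustrative volume matching the paper's configuration: N = 8 frames.
rng = np.random.default_rng(0)
vol = rng.random((8, 1024, 1000))
i_bc = bc_mode_image(vol)
```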

The scanning configuration is shown in Table 1. The illustration of the scanning region and each labeled axis XYZ is shown in Fig. 1(b). Along the X axis, the scanning length is 6000 µm with a 6 µm dense scanning step. Along the Z axis, the scanning length is 2000 µm with a 250 µm sparse scanning step. Therefore, a set of OCT volume data contains 8 B-mode OCT images in total. The B-scan rate is 100 Hz and the C-scan rate is 12.5 Hz. Since one BC-mode OCT image is calculated from one set of OCT volume data, the final BC-mode OCT imaging speed is 12.5 Hz. The size of the OCT volume data is 1000 × 1024 × 8 (X × Y × Z). This scanning configuration is specifically customized for the deep anterior lamellar keratoplasty (DALK) procedure, where we use our OCT system to guide a 30-gauge needle to Descemet’s membrane of the cornea of an ex-vivo human eye. All data in this paper were collected under this configuration.


Table 1. System scanning configuration
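As a quick consistency check of the numbers above (simple arithmetic from the 100 kHz A-scan rate and the stated scan lengths and steps; the variable names are ours):

```python
a_scan_rate_hz = 100_000          # swept-source A-scan rate
x_len_um, x_step_um = 6000, 6     # dense axis X
z_len_um, z_step_um = 2000, 250   # sparse axis Z

a_lines_per_frame = x_len_um // x_step_um            # 1000 A-lines per B-scan
b_scan_rate_hz = a_scan_rate_hz / a_lines_per_frame  # 100 Hz B-scan rate
frames_per_volume = z_len_um // z_step_um            # 8 B-mode frames per volume
bc_rate_hz = b_scan_rate_hz / frames_per_volume      # 12.5 Hz BC-mode imaging speed
```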

To further explain the scanning configuration of our method, we show two adjacent B-mode OCT images from the image set that contain the needle signal in Fig. 2(a) and 2(b). As we can see in both images, the needle blocks the signal from the tissue beneath it, creating a shadow in the image. Moreover, due to the limit of a single imaging plane, parts of the needle are missing, but then appear in the adjacent B-mode OCT image. Although each image separately shows the tissue shadowing effect and the difficulty of keeping the needle in the imaging field, together they provide more complete information about the needle and the cornea structure. The scanning step along the Z axis is set to be a little less than the outer diameter of the 30-gauge needle (311 µm) so that the whole needle structure can be captured by the OCT volume data, as shown in Fig. 2(a) and 2(b). Moreover, the 6000 µm by 2000 µm total scanning range provides a wide field of view, making it easier to keep the needle position inside the scanning range during the freehand operation.

 

Fig. 2. B-mode OCT images showing the 30-gauge needle inside the cornea of the ex-vivo human eye. (a) One slice from the image set. (b) The adjacent slice from the image set.


The main idea of BC-mode OCT image visualization is to compress the sparsely scanned OCT volume data into one compounded B-mode OCT image. To understand its advantages and limitations, it is important to compare its performance against conventional B-mode OCT image visualization. In the following section, we derive a theoretical model for BC-mode OCT image visualization. Based on this model, we analyze its performance in terms of resolution, signal-to-noise ratio (SNR), imaging speed and needle tracking accuracy.

2.2. Theoretical analysis

2.2.1. Resolution

OCT image resolution includes both lateral resolution and axial resolution. For conventional B-mode OCT image visualization, the axial and lateral resolutions are independent of each other and determined entirely by the system parameters. Specifically, when the transverse sampling rate is sufficient, the lateral resolution depends on the diffraction limit of the OCT system [27]. At the focal plane, the lateral resolution can be calculated by Eq. (2),

$${\sigma _x} = 0.61 \cdot \frac{{{\lambda _0}}}{{NA}},$$
where ${\sigma _x}$ is the lateral resolution, ${\lambda _0}$ is the center wavelength of the laser, and $NA$ is the numerical aperture of the objective lens (shown in Fig. 1(a)). Although the lateral resolution depends on the imaging depth, this does not affect the following analysis of image resolution, since no specific value is assumed. The axial resolution depends on the coherence length of the laser. A common definition of the axial resolution is the full-width half-maximum (FWHM) of the point spread function (PSF) in the time domain [27,28]. The axial resolution can be calculated by Eq. (3) if we assume a Gaussian laser spectrum,
$${\sigma _y} = 0.44 \cdot \frac{{\lambda _0^2}}{{\Delta \lambda }},$$
where ${\sigma _y}$ is the axial resolution, ${\lambda _0}$ is the center wavelength of the laser, and $\Delta \lambda $ is the FWHM of the laser spectrum in $\lambda $ space. In our OCT system, however, the swept source produces a nearly rectangular spectrum, so the constant factor on the right-hand side of Eq. (3) is slightly different. Fourier transformation is used to obtain the time-domain PSF. The coherence length is defined as the FWHM of the PSF, while the axial resolution is half the coherence length since the light travels through the sample arm twice [29]. Therefore, the constant factor should be modified to 0.6 for a rectangular source spectrum. With both ${\sigma _x}$ and ${\sigma _y}$, the point spread function for the B-mode OCT image (B-PSF) can be described by Eq. (4),
$${h_B}({x,y} )= \exp \left( { - 4\ln 2 \cdot \frac{{{x^2}}}{{\sigma_x^2}}} \right) \cdot \exp \left( { - 4\ln 2 \cdot \frac{{{y^2}}}{{\sigma_y^2}}} \right),$$
where ${h_B}$ is the B-PSF, and ${\sigma _x}$ and ${\sigma _y}$ are the lateral and axial resolutions, respectively. For BC-mode images, on the other hand, the image resolution depends not only on the system parameters but also on the sample characteristics. The idea of the BC-mode OCT image is to average laterally shifted B-mode OCT images. However, these laterally shifted B-mode OCT images are not guaranteed to share exactly the same structure; only if they did would the resolution remain unchanged. Qualitatively, if the sample structure is highly flat, its influence on the image resolution can be neglected. On the contrary, if the sample structure has a large curvature, it will affect the image resolution significantly.
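With our system parameters ($\lambda_0 = 1060$ nm, $\Delta\lambda = 110$ nm, NA = 0.05), Eqs. (2)–(4) can be evaluated directly. The sketch below computes both widths and checks the FWHM property of the B-PSF; the 0.6 factor is the rectangular-spectrum correction discussed above:

```python
import numpy as np

lam0 = 1.060   # center wavelength, um
dlam = 0.110   # spectral FWHM (tuning range), um
NA = 0.05      # numerical aperture of the objective lens

sigma_x = 0.61 * lam0 / NA       # Eq. (2): lateral FWHM, ~12.9 um
sigma_y = 0.6 * lam0**2 / dlam   # Eq. (3), 0.6 for a rectangular spectrum, ~6.1 um

def h_B(x, y):
    """B-mode PSF of Eq. (4); sigma_x and sigma_y are FWHM widths in um."""
    return (np.exp(-4*np.log(2) * x**2 / sigma_x**2) *
            np.exp(-4*np.log(2) * y**2 / sigma_y**2))

# FWHM sanity check: the PSF falls to one half at x = sigma_x / 2.
assert np.isclose(h_B(sigma_x/2, 0.0), 0.5)
```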

To quantitatively analyze how the sample structure affects the image resolution, the strategy we use here is to convert the discrete problem into a continuous function space problem. We first define a continuous OCT image function space $\Omega = \{{f\textrm{|}f:{R^3} \to R} \}$, where its element f maps the 3-dimensional continuous position space to the 1-dimensional continuous image intensity space. Then, BC-mode OCT images can be calculated by Eq. (5),

$${I_{BC}}({x,y} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;f({x,y,{z_0} + z} ),$$
where ${I_{BC}}$ is the BC-mode OCT image, t is the sparse scanning length along the Z axis, ${z_0}$ is the center position of the scanning range along the Z axis, and $({x,y,{z_0} + z} )$ indicates the position in the 3-dimensional continuous position space. This equation is the continuous version of Eq. (1). In particular, the B-mode OCT image at the $z = {z_0}$ cross section can be recovered by taking the limit $t \to 0$ if we neglect the laser beam width. To proceed, we approximate the function within a small integration range by its first-order Taylor expansion, which is shown in Eq. (6),
$${I_{BC}}({x,y} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;\left( {f({x,y,{z_0}} )+ \frac{{\partial f}}{{\partial z}}z} \right).$$
The next step is to find the point spread function for the BC-mode OCT image (BC-PSF), which is a function of x and y. To do so, we need to convert the integration in Eq. (6) along the Z axis into one over the XY plane. This can be done by finding the contour line with the same image intensity, i.e. $x(z ),\; y(z )$ as a function of z, so that Eq. (7) is satisfied,
$$f({x(z ),y(z ),z} )= const.$$
By taking the derivative with respect to z on both side of Eq. (7) and using the chain rule, we can get a differential geometric relationship of the contour line shown in Eq. (8),
$$\frac{{\partial f}}{{\partial x}}\frac{{dx}}{{dz}} + \frac{{\partial f}}{{\partial y}}\frac{{dy}}{{dz}} + \frac{{\partial f}}{{\partial z}} = 0.$$
By plugging Eq. (8) into Eq. (6) and applying the Taylor expansion in reverse, we obtain Eq. (9),
$${I_{BC}}({x,y} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;f\left( {x - \frac{{dx}}{{dz}}z,y - \frac{{dy}}{{dz}}z,{z_0}} \right).$$
To analyze the contour line quantitatively, we define two angles $\alpha $ and $\beta $, which are shown in Fig. 3. Note that we rotate the axes in order to better illustrate the angle relationship. $\vec{n}$ is the unit tangential vector of the contour line. Angle $\beta $ is the polar angle between $\vec{n}$ and the Z axis. Angle $\alpha $ is the azimuth angle between the orthogonal projection of $\vec{n}$ on the XY plane and the X axis. Therefore, the unit tangential vector $\vec{n}$ can be described by Eq. (10),
$$\vec{n} = ({\sin \beta \cos \alpha ,\sin \beta \sin \alpha ,\cos \beta } ).$$

 

Fig. 3. The angle relationship. $\vec{n}$, unit tangential vector. $\alpha $, azimuth angle. $\beta $, polar angle.


Then, we can get the differential geometric relationship in Eq. (11) and Eq. (12),

$$\frac{{dx}}{{dz}} = \tan \beta \cos \alpha ,$$
$$\frac{{dy}}{{dz}} = \tan \beta \sin \alpha .$$
Plugging Eq. (11) and Eq. (12) into Eq. (9), the BC-mode OCT image term becomes Eq. (13),
$${I_{BC}}({x,y} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;f({x - \tan \beta \cdot z\cos \alpha ,y - \tan \beta \cdot z\sin \alpha ,{z_0}} ).$$
Substituting f in Eq. (13) with B-PSF ${h_B}$ in Eq. (4), we can obtain BC-PSF in Eq. (14),
$${h_{BC}}({x,y} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;{h_B}({x - \tan \beta \cdot z\cos \alpha ,y - \tan \beta \cdot z\sin \alpha } ),$$
where we do not write out the third argument z since it is a constant. The geometric meaning of the polar angle $\beta $ is the curvature angle of the sample structure along the sparsely scanning axis Z. A smaller $\beta $ will lead to a smaller effect on the resolution of the BC-mode OCT image. As we can see from Eq. (14), if $\beta = 0$, BC-PSF is the same as B-PSF and the image resolution will not be affected. The angle $\alpha $ denotes the direction of the orthogonal projection of the tangential vector on the XY plane. Two extreme situations happen when $\alpha = 0$ and $\alpha = \pi /2$. As we can see from Eq. (14), if $\alpha = 0$, then $\sin \alpha = 0$ and the axial resolution will not be affected. If $\alpha = \pi /2$, then $\cos \alpha = 0$ and the lateral resolution will not be affected. These results are reasonable and expected.
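The behavior predicted by Eq. (14) can be checked numerically by averaging shifted copies of the B-PSF. This is a sketch with illustrative FWHM values; `h_BC` below is simply a quadrature of Eq. (14), not part of the imaging pipeline:

```python
import numpy as np

log2 = np.log(2.0)
sigma_x, sigma_y = 12.9, 6.1   # illustrative FWHM resolutions, um

def h_B(x, y):
    """B-mode PSF, Eq. (4)."""
    return np.exp(-4*log2*(x/sigma_x)**2 - 4*log2*(y/sigma_y)**2)

def h_BC(x, y, beta_deg, alpha_deg, t, n=2001):
    """Eq. (14): average h_B along the sparse scan length t, with the
    shift set by the surface polar angle beta and azimuth angle alpha."""
    beta, alpha = np.radians(beta_deg), np.radians(alpha_deg)
    z = np.linspace(-t/2, t/2, n)
    return np.mean(h_B(x - np.tan(beta)*z*np.cos(alpha),
                       y - np.tan(beta)*z*np.sin(alpha)))

# beta = 0 (flat sample): BC-PSF equals B-PSF, resolution unaffected.
assert np.isclose(h_BC(5.0, 2.0, 0.0, 0.0, t=2000.0), h_B(5.0, 2.0))
# beta > 0 (curved sample): the PSF spreads out, so its peak drops.
assert h_BC(0.0, 0.0, 4.2, 0.0, t=2000.0) < 0.5
```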

For a fixed point in the OCT volume data, the sample structure is fixed, and so are the angles $\alpha $ and $\beta $. For fixed angles $\alpha $ and $\beta $, Eq. (14) is a line integration. The integration can be rewritten as Eq. (15),

$$\begin{aligned} {h_{BC}}({x,y} ) &= \int_{ - \infty }^{ + \infty } {dz} \;\exp \left( { - 4\ln 2 \cdot \frac{{{{({x - \tan \beta \cdot z\cos \alpha } )}^2}}}{{\sigma_x^2}}} \right)\\ &\cdot \exp \left( { - 4\ln 2 \cdot \frac{{{{({y - \tan \beta \cdot z\sin \alpha } )}^2}}}{{\sigma_y^2}}} \right) \cdot \frac{1}{t}Rect\left( {\frac{{2z}}{t}} \right), \end{aligned}$$
where $Rect({2z/t} )$ is a rectangular function, i.e. $Rect({2z/t} )= 1$ for $- t/2 < z < t/2$, otherwise $Rect({2z/t} )= 0$. To make Eq. (15) integrable, we approximate the rectangular function as a Gaussian function with the same variance. The reason for this approximation is that the FWHM of a PSF, which directly determines the image resolution, matters more than its exact shape. Therefore, instead of numerically obtaining the exact shape of the BC-PSF, we find the broadened width when the rectangular function convolves with the PSF, based on a statistical approach. By treating the rectangular function as a probability distribution $p(x )$ for a random variable X, we get the variance of X as $V(X )= E({{X^2}} )- E{(X )^2} = {t^2}/12$. Then, we approximate the rectangular function as in Eq. (16),
$$\frac{1}{t}{\mathop{\textit{Re}}\nolimits} ct\left( {\frac{{2z}}{t}} \right) \approx \frac{1}{{\sqrt {2\pi } {\sigma _t}}}\exp \left( { - \frac{{{z^2}}}{{2\sigma_t^2}}} \right),$$
where $\sigma _t^2 = {t^2}/12$. The integration result for BC-PSF in Eq. (15) is approximated in Eq. (17),
$${h_{BC}}({x,y} )= \frac{1}{{\sqrt {1 + \frac{1}{{12}}{{\left( {\frac{t}{{{\sigma_0}}}} \right)}^2}} }} \cdot \exp \left( {\frac{{ - 4\ln 2 \cdot \left( {\begin{array}{{cc}} x&y \end{array}} \right) \cdot {\textbf A} \cdot {{\left( {\begin{array}{{cc}} x&y \end{array}} \right)}^T}}}{{\sigma_x^2\sigma_y^2 + \frac{{2\ln 2}}{3}{{\tan }^2}\beta ({\sigma_x^2{{\sin }^2}\alpha + \sigma_y^2{{\cos }^2}\alpha } ){t^2}}}} \right),$$
where $\sigma _0^2 = \sigma _x^2\sigma _y^2/({8\ln 2 \cdot {{\tan }^2}\beta ({\sigma_x^2{{\sin }^2}\alpha + \sigma_y^2{{\cos }^2}\alpha } )} )$, and the matrix ${\boldsymbol A}$ is shown in Eq. (18),
$${\textbf A} = \left( {\begin{array}{{cc}} {\sigma_y^2 + \frac{{2\ln 2}}{3}{{\tan }^2}\beta {{\sin }^2}\alpha \cdot {t^2}}&{ - \frac{{\ln 2}}{3}{{\tan }^2}\beta \sin 2\alpha \cdot {t^2}}\\ { - \frac{{\ln 2}}{3}{{\tan }^2}\beta \sin 2\alpha \cdot {t^2}}&{\sigma_x^2 + \frac{{2\ln 2}}{3}{{\tan }^2}\beta {{\cos }^2}\alpha \cdot {t^2}} \end{array}} \right).$$
Therefore, finding the image resolution is equivalent to finding the eigenvalues of the matrix A. Note that ${n_x} = ({1,0} )$ and ${n_y} = ({0,1} )$ are no longer eigenvectors of the matrix A if $\beta \ne 0$ and $\alpha \ne 0$ or $\pi /2$. This is expected since the line integration is not along the X axis or the Y axis in this case. However, the eigenvectors are not simply parallel or perpendicular to the integration direction either, due to the asymmetry between the X axis and the Y axis, i.e. ${\sigma _x} \ne {\sigma _y}$. Despite this complexity, the eigenvalues of the matrix A can still be found, but it is not necessary to show them here. Instead, we make an approximation: for a large scanning length t, the difference between ${\sigma _x}$ and ${\sigma _y}$ can be neglected, so we approximate $\sigma _x^2 \approx \sigma _y^2 \approx ({\sigma_x^2 + \sigma_y^2} )/2$. With this symmetry between the X axis and the Y axis, the eigenvalue problem becomes much easier to solve. The eigenvalues and the eigenvectors are shown in Eq. (19) and Eq. (20),
$${\lambda _1} = \frac{{\sigma _x^2 + \sigma _y^2}}{2},\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\textrm{where}\;{\vec{n}_1} = ({\cos \alpha ,\sin \alpha } ),$$
$${\lambda _2} = \frac{{\sigma _x^2 + \sigma _y^2}}{2} + \frac{{2\ln 2}}{3}{\tan ^2}\beta \cdot {t^2},\;\;\;\textrm{where}\;{\vec{n}_2} = ({ - \sin \alpha ,\cos \alpha } ),$$
where ${\lambda _1}$ and ${\lambda _2}$ are the eigenvalues, ${\vec{n}_1}$ and ${\vec{n}_2}$ are the corresponding eigenvectors, which are parallel or perpendicular to the integration direction. Therefore, along the direction ${\vec{n}_1} = ({\cos \alpha ,\sin \alpha } )$, the resolution term can be found as in Eq. (21),
$${\sigma _{\max }} = \sqrt {\frac{{\sigma _x^2 + \sigma _y^2}}{2} + \frac{{2\ln 2}}{3}{{\tan }^2}\beta \cdot {t^2}} .$$
Since ${\vec{n}_1}$ is parallel to the integration direction, the resolution should be the worst, and thus it is denoted by ${\sigma _{max}}$. Along the direction ${\vec{n}_2} = ({ - \sin \alpha ,\cos \alpha } )$, the resolution term can be found as in Eq. (22),
$${\sigma _{\min }} = \sqrt {\frac{{\sigma _x^2 + \sigma _y^2}}{2}} .$$
Since ${\vec{n}_2}$ is perpendicular to the integration direction, the resolution should be the best, and thus it is denoted by ${\sigma _{min}}$. For any other direction, the image resolution lies between ${\sigma _{min}}$ and ${\sigma _{max}}$, i.e. ${\sigma _{min}} < \sigma < {\sigma _{max}}$. This result is important since it approximates the upper bound and lower bound for the image resolution along any direction.
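Under the symmetric approximation $\sigma_x^2 \approx \sigma_y^2 \approx (\sigma_x^2 + \sigma_y^2)/2$, the eigenstructure of the matrix A in Eq. (18) and the bounds in Eqs. (21) and (22) can be verified numerically. This is a sketch; the angles and widths below are illustrative:

```python
import numpy as np

log2 = np.log(2.0)
sx, sy = 12.9, 6.1                 # illustrative FWHM resolutions, um
s2 = (sx**2 + sy**2) / 2           # symmetric approximation of sigma^2
beta = np.radians(4.2)             # polar angle
alpha = np.radians(30.0)           # azimuth angle
t = 2000.0                         # sparse scanning length, um

c = (log2/3) * np.tan(beta)**2 * t**2
A = np.array([[s2 + 2*c*np.sin(alpha)**2, -c*np.sin(2*alpha)],
              [-c*np.sin(2*alpha),        s2 + 2*c*np.cos(alpha)**2]])

# Eigenvalues match Eqs. (19)-(20): lam1 = s2, lam2 = s2 + (2 ln2/3) tan^2(beta) t^2.
evals = np.sort(np.linalg.eigvalsh(A))
assert np.allclose(evals, [s2, s2 + 2*c])

# Widths of Eqs. (21)-(22): the squared width along an eigenvector is
# D / lambda, where D is the denominator of Eq. (17) under the same
# symmetric approximation, so the smaller eigenvalue gives sigma_max.
D = s2 * (s2 + 2*c)
sigma_max = np.sqrt(D / s2)            # worst, along the integration direction
sigma_min = np.sqrt(D / (s2 + 2*c))    # best, perpendicular to it
assert np.isclose(sigma_min, np.sqrt(s2))
assert sigma_max > sigma_min
```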

However, there are many points inside the OCT volume data, and each point has its own sample structure, i.e. its own azimuth angle $\alpha $ and polar angle $\beta $. It is therefore also important to analyze the average effect of the sample structure on the image resolution. To do that, we make the statistical assumption that, for a fixed polar angle $\beta $, the azimuth angle $\alpha $ is distributed uniformly between $0$ and $\pi $. Then, for a fixed angle $\beta $, we can average the BC-PSF in Eq. (14) over the angle $\alpha $ as in Eq. (23),

$${h_{BC}}({x,y} )= \frac{4}{{\pi {t^2}}}\int_0^{\frac{t}{2}} {dz} \;\int_0^{2\pi } {zd\alpha } \;{h_B}({x - \tan \beta \cdot z\cos \alpha ,y - \tan \beta \cdot z\sin \alpha } ),$$
where we use the area measure $zd\alpha dz$ and $4/\pi {t^2}$ is the normalization factor. The integration range for $\alpha $ changes from $[{0,\pi } ]$ to $[{0,2\pi } ]$ because the integration range for z changes from $[{ - t/2,t/2} ]$ to $[{0,t/2} ]$. After transforming from polar coordinates to Cartesian coordinates, where $\left( {\begin{array}{{cc}} {{x_0}}&{{y_0}} \end{array}} \right) = \left( {\begin{array}{{cc}} {z\cos \alpha }&{z\sin \alpha } \end{array}} \right)$, the integration can be written as Eq. (24),
$${h_{BC}}({x,y} )= \int_{ - \infty }^{ + \infty } {d{x_0}} \;\int_{ - \infty }^{ + \infty } {d{y_0}} \;{h_B}({x - \tan \beta \cdot {x_0},y - \tan \beta \cdot {y_0}} )\cdot \frac{4}{{\pi {t^2}}}Circle\left( {\frac{{x_0^2 + y_0^2}}{{{{\left( {\frac{t}{2}} \right)}^2}}}} \right),$$
where $Circle({({x_0^2 + y_0^2} )/{{({t/2} )}^2}} )$ is a circular function, i.e. $Circle({({x_0^2 + y_0^2} )/{{({t/2} )}^2}} )= 1$ for $x_0^2 + y_0^2 < {({t/2} )^2}$, otherwise $Circle({({x_0^2 + y_0^2} )/{{({t/2} )}^2}} )= 0$. To make Eq. (24) integrable, we again approximate the circular function as a 2-dimensional Gaussian function. This time, we regard the circular function as a joint probability distribution $p({x,y} )$ for random variables X and Y. To find the variance of the random variable X, we calculate the marginal probability distribution ${p_X}(x )= \mathop \smallint \nolimits_{ - \infty }^{ + \infty } dyp({x,y} ).$ Intuitively, the 2-dimensional circular function confines X closer to the center than the 1-dimensional rectangular function, so it should have a smaller variance. We calculate the variance of X as $V(X )= E({{X^2}} )- E{(X )^2} = {t^2}/64$, which is smaller than ${t^2}/12$ as expected. Therefore, we approximate the circular function as in Eq. (25),
$$\frac{4}{{\pi {t^2}}}Circle\left( {\frac{{x_0^2 + y_0^2}}{{{{\left( {\frac{t}{2}} \right)}^2}}}} \right) \approx \frac{1}{{2\pi \sigma _t^2}}\exp \left( { - \frac{{x_0^2 + y_0^2}}{{2\sigma_t^2}}} \right),$$
where $\sigma _t^2 = {t^2}/64$. The integration result for BC-PSF in Eq. (24) is approximated in Eq. (26),
$$\begin{array}{c} {h_{BC}}({x,y} )= \frac{1}{{\sqrt {1 + \frac{{\ln 2}}{8}{{\tan }^2}\beta {{\left( {\frac{t}{{{\sigma_x}}}} \right)}^2}} }}\exp \left( { - 4\ln 2 \cdot \frac{{{x^2}}}{{\sigma_x^2 + \frac{{\ln 2}}{8}{{\tan }^2}\beta \cdot {t^2}}}} \right)\\ \cdot \frac{1}{{\sqrt {1 + \frac{{\ln 2}}{8}{{\tan }^2}\beta {{\left( {\frac{t}{{{\sigma_y}}}} \right)}^2}} }}\exp \left( { - 4\ln 2 \cdot \frac{{{y^2}}}{{\sigma_y^2 + \frac{{\ln 2}}{8}{{\tan }^2}\beta \cdot {t^2}}}} \right). \end{array}$$
Therefore, the average effects of the sample structure on the lateral image resolution and axial image resolution are separated and can be described by Eq. (27) and Eq. (28),
$${\bar{\sigma }_x} = \sqrt {\sigma _x^2 + \frac{{\ln 2}}{8}{{\tan }^2}\beta \cdot {t^2}} ,$$
$${\bar{\sigma }_y} = \sqrt {\sigma _y^2 + \frac{{\ln 2}}{8}{{\tan }^2}\beta \cdot {t^2}} ,$$
where ${\bar{\sigma }_x}$ and ${\bar{\sigma }_y}$ lie within ${\sigma _{min}}$ and ${\sigma _{max}}$ for large t as expected. Note that although the lateral resolution ${\sigma _x}$ depends on the imaging depth, Eq. (21), Eq. (22), Eq. (27) and Eq. (28) still show the correct trend for the BC-mode OCT image resolution compared with the conventional B-mode OCT image at the same imaging depth.

To validate our theory, we used the surface of the cornea as a landmark to approximate the BC-mode OCT image resolution. The human cornea surface usually has a spherical shape with a 6.75 mm radius. The sparse scanning length along the Z axis is 2 mm in our scanning configuration. Figure 4 shows the model of the human cornea, where r is the radius of the cornea and t is the scanning length along the Z axis. Since we image the center part of the cornea, the center of the cornea is aligned with the center of the scanning region along the Z axis. The minimal polar angle of the cornea structure is zero at the center of the scanning region, while the maximal polar angle is ${\tan ^{ - 1}}({t/2r} )$ at the edge of the scanning region. Statistically, we want an average polar angle, which is denoted by the blue dashed line and the angle $\beta $ in Fig. 4. Since this polar angle is small for the cornea structure, we can approximate the average polar angle $\beta $ by Eq. (29),

$$\beta = {\tan ^{ - 1}}\frac{t}{{4r}}.$$
Therefore, we get the approximate average polar angle $\beta = {4.2^{\circ}}$. For the OCT system parameters, we use the center wavelength ${\lambda _0} = 1060\,nm$, the wavelength tuning range $\Delta \lambda = 110\,nm$, and the numerical aperture of the objective lens $NA = 0.05$. We can approximate the lateral resolution by Eq. (2) since the imaging plane is close to the focal plane of the laser beam. We tested three different regions of interest (ROIs) around the center part of the cornea, which are shown in Fig. 5(a)–5(c). Figure 5(a)–5(c) were rescaled for figure layout. The theoretical curves for the average lateral image resolution, average axial image resolution, image resolution lower bound and image resolution upper bound are plotted as solid curves with different colors in Fig. 5(d)–5(f). The black dots with error bars in Fig. 5(d)–5(f) show the measured BC-mode OCT image resolutions from the different ROIs, which are explained in the following paragraph.
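A short numerical sketch of Eq. (29) and the resulting average broadening of Eqs. (27) and (28), using the cornea radius r = 6.75 mm and resolutions computed from Eqs. (2) and (3) (the resolution values are illustrative inputs, not fitted results):

```python
import math

r = 6750.0                      # cornea radius, um
sigma_x, sigma_y = 12.9, 6.1    # B-mode FWHM resolutions, um

beta = math.atan(2000.0 / (4*r))                             # Eq. (29) at t = 2 mm
print(f"average polar angle: {math.degrees(beta):.1f} deg")  # ~4.2 deg

for t in (500.0, 1000.0, 2000.0):     # sparse scanning lengths, um
    b = math.atan(t / (4*r))          # Eq. (29)
    widen = (math.log(2)/8) * math.tan(b)**2 * t**2
    sx_bar = math.sqrt(sigma_x**2 + widen)   # Eq. (27): average lateral
    sy_bar = math.sqrt(sigma_y**2 + widen)   # Eq. (28): average axial
    print(f"t = {t:6.0f} um: lateral {sx_bar:5.1f} um, axial {sy_bar:5.1f} um")
```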

 

Fig. 4. Human cornea model.


 

Fig. 5. (a) ROI-1. (b) ROI-2. (c) ROI-3. (d) ∼ (f) Theoretical curves and experimental measurements of BC-mode OCT image resolution in ROI-1, ROI-2, ROI-3, respectively.


To calculate the resolution of the BC-mode OCT image, we segmented the surface of the cornea in the B-mode OCT image. The segmentation method we use is based on graph theory and dynamic programming, with an accuracy of around $\pm 1$ pixel [14,30]. The resolution for each A-line is approximated by the largest difference between the segmented surface positions. To obtain the dependence of the resolution on the sparse scanning length, we acquire BC-mode OCT images for different N from 2 to 8, so that the sparse scanning length ranges from 0.5 mm to 2 mm. In Fig. 5(d)–5(f), the black circles show the mean value of the resolution, while the black bars show its variance. We can see that most resolution values lie within the lower and upper bounds. Moreover, for each tested ROI, the resolution has a nearly linear relationship with the sparse scanning length. The difference in slope comes from the other structure parameter, the azimuth angle $\alpha $. As discussed before, when $\alpha $ is close to 0, the resolution is close to the lower bound; when $\alpha $ is close to $\pi /2$, the resolution is close to the upper bound. Therefore, we can say that ROI-1 has a lower value of $\alpha $, while ROI-2 has a larger value of $\alpha $.

2.2.2. SNR

One benefit of BC-mode averaging is to reduce the speckle noise. Since the signal to noise ratio (SNR) is a good quantification for the level of speckle noise, we need to compare the SNR between BC-mode OCT images and conventional B-mode OCT images. First, we define the SNR as shown in Eq. (30),

$$SNR = 10{\log _{10}}\frac{{\mu _s^2}}{{\sigma _n^2}},$$
where ${\mu _s}$ is the mean intensity of the signal, ${\sigma _n}$ is the standard deviation of the noise. Since the BC-mode OCT image is averaged from many laterally shifted B-mode OCT images, the mean intensity ${\mu _{BC}}$ of the signal in the BC-mode image should remain the same as the mean intensity ${\mu _B}$ of B-mode OCT image, i.e. ${\mu _{BC}} = {\mu _B}$ no matter what N is. To prove this, we can use the continuous OCT image function space $\Omega $ that we defined in the resolution section. The mean intensity ${\mu _B}$ of B-mode OCT image can be calculated in Eq. (31),
$${\mu _B}({{z_0}} )= \frac{1}{S}\int\!\!\!\int_S {dxdy} \;f({x,y,{z_0}} ),$$
where we integrate over the signal range and S is the total integration area, $f \in \Omega $, and the plane $z = {z_0}$ is the scanning cross section of the B-mode OCT image. By using the continuous BC-mode OCT integration scheme in Eq. (5), the mean intensity ${\mu _{BC}}$ of the signal in the BC-mode image can be calculated in Eq. (32),
$${\mu _{BC}}({{z_0}} )= \frac{1}{S}\int\!\!\!\int_S {dxdy} \;\frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;f({x,y,{z_0} + z} ),$$
where $z = {z_0}$ is the center plane of the scanning range along the Z axis, and t is the sparsely scanning length along the Z axis. This is true for BC-mode OCT signals that are not blocked by the surgical tool. However, for BC-mode OCT signals that are blocked by the surgical tool, we need to modify Eq. (32), which we will discuss explicitly in the later part of this section. By exchanging the integration order and plugging Eq. (31), we can get Eq. (33),
$${\mu _{BC}}({{z_0}} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;\frac{1}{S}\int\!\!\!\int_S {dxdy} \;f({x,y,{z_0} + z} )= \frac{1}{t}\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;{\mu _B}({{z_0} + z} ).$$
To proceed, we assume that, over a small range, the mean intensity of the B-mode signal remains the same, i.e. ${\mu _B}({{z_0} + z} )= {\mu _B}({{z_0}} )$ for any small z. Under this assumption, we easily get Eq. (34),
$${\mu _{BC}} = {\mu _B},$$
which proves that the mean intensity ${\mu _{BC}}$ remains the same as ${\mu _B}$ and does not depend on the averaging frame number N.

However, the case for the standard deviation of the noise ${\sigma _n}$ is different: it does not remain the same for different N. In fact, OCT speckle noise is spatially independent, and this independence is widely used in speckle reduction of OCT images [31–33]. Therefore, we assume that the noises in the different B-mode OCT images ${X_1},{X_2} \cdots {X_N}$ are independent and identically distributed (i.i.d.) random variables with a standard deviation ${\sigma _B}$, i.e. $V({{X_i}} )= \sigma _B^2$ for $i = 1,2, \cdots ,N$, where N is the number of averaging frames. Then the noise in a BC-mode OCT image can be modelled as Eq. (35),

$$\bar{X} = \frac{1}{N}\sum\limits_{i = 1}^N {{X_i}} .$$
The variance of the noise $\bar{X}$ in BC-mode OCT image can be calculated in Eq. (36),
$$\sigma _{BC}^2 = V({\bar{X}} )= V\left( {\frac{1}{N}\sum\limits_{i = 1}^N {{X_i}} } \right) = \frac{1}{{{N^2}}}\sum\limits_{i = 1}^N V ({{X_i}} )= \frac{{\sigma _B^2}}{N}.$$
We label the image area where OCT signals are not blocked by the surgical tool the unshaded area. By using Eq. (30), Eq. (34) and Eq. (36), the SNR of the unshaded area can be modeled as Eq. (37),
$$SN{R_{BC - unshaded}} = 10{\log _{10}}\frac{{\mu _{BC}^2}}{{\sigma _{BC}^2}} = 10{\log _{10}}\left( {\frac{{\mu_B^2}}{{\sigma_B^2}} \cdot N} \right) = SN{R_B} + 10{\log _{10}}N,$$
where $SN{R_B} = 10{log _{10}}({\mu_B^2/\sigma_B^2} )$ is the SNR of the conventional B-mode OCT image.
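The N-fold variance reduction of Eq. (36) and the resulting SNR gain of Eq. (37) can be verified numerically. The following is a minimal Monte Carlo sketch, assuming Gaussian i.i.d. noise as a simplification of real OCT speckle statistics; all names and parameter values are illustrative:

```python
import numpy as np

def snr_db(signal_mean, noise_std):
    """SNR definition of Eq. (30): 10 * log10(mu_s^2 / sigma_n^2)."""
    return 10.0 * np.log10(signal_mean**2 / noise_std**2)

rng = np.random.default_rng(0)
mu_B, sigma_B, N = 100.0, 10.0, 8

# Simulate N i.i.d. noisy B-mode frames and average them (Eq. (35)).
frames = mu_B + sigma_B * rng.standard_normal((N, 200_000))
bc_frame = frames.mean(axis=0)

# Eq. (37) predicts an SNR gain of 10 * log10(N) dB for the unshaded area.
gain = snr_db(mu_B, bc_frame.std()) - snr_db(mu_B, sigma_B)
print(round(gain, 1), round(10 * np.log10(N), 1))
```

The averaged frame keeps the mean intensity of Eq. (34) while its noise standard deviation shrinks by $\sqrt N$, so the measured gain matches the $10{\log _{10}}N$ prediction.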

However, for the image area where OCT signals are blocked by the surgical tool, which we label the shaded area, the SNR is different and should be lower than that of the unshaded area. Since the shaded area is filled with noise instead of the OCT signal, Eq. (33) and Eq. (34) do not work. To model it, we first define ${t_0}$ as the thickness of the surgical tool along the sparsely scanning axis Z. Therefore, Eq. (33) can be easily modified as Eq. (38),

$${\mu _{BC}}({{z_0}} )= \frac{1}{t}\left( {\int_{ - \frac{t}{2}}^{\frac{t}{2}} {dz} \;{\mu_B}({{z_0} + z} )+ \int_{a - \frac{{{t_0}}}{2}}^{a + \frac{{{t_0}}}{2}} {dz} \;({{\mu_n}({{z_0} + z} )- {\mu_B}({{z_0} + z} )} )} \right),$$
where ${\mu _n}$ is the mean intensity of the noise in the shaded area, and $- ({t - {t_0}} )/2 < a < ({t - {t_0}} )/2$ since the surgical tool should be kept within the sparsely scanning range. We can drop the argument ${z_0} + z$ based on the assumption that both ${\mu _B}$ and ${\mu _n}$ remain constant over a small range. Therefore, the mean intensity ${\mu _{BC}}$ of the signal in the BC-mode OCT image can be written as Eq. (39),
$${\mu _{BC}} = {\mu _B} \cdot \frac{{t - {t_0}}}{t} + {\mu _n} \cdot \frac{{{t_0}}}{t}.$$
We define another parameter s as the scanning step so that $t = Ns$. Therefore, the SNR of the shaded area can be modeled as Eq. (40),
$$SN{R_{BC - shaded}} = 10{\log _{10}}\frac{{\mu _{BC}^2}}{{\sigma _{BC}^2}} = 10{\log _{10}}\left( {\frac{{\mu_B^2}}{{\sigma_B^2}} \cdot N{{\left( {1 - \frac{{{\mu_B} - {\mu_n}}}{{{\mu_B}}}\frac{{{t_0}}}{{Ns}}} \right)}^2}} \right).$$
It is generally true that in an OCT image, the signal level is higher than the noise level, i.e. ${\mu _B} > {\mu _n}$. Therefore, $({{\mu_B} - {\mu_n}} )/{\mu _B} > 0$ and we can see that the SNR increases when the scanning step s increases. However, in practice, the scanning step s should be set no larger than the thickness of the surgical tool ${t_0}$, i.e. $s \le {t_0}$. Therefore, when $s = {t_0}$, the best SNR for a shaded area is obtained, as shown in Eq. (41),
$$SN{R_{BC - shaded}} = SN{R_B} + 10{\log _{10}}\frac{1}{N}{\left( {N - 1 + \frac{{{\mu_n}}}{{{\mu_B}}}} \right)^2},$$
where we can see that the noise in the shaded area ${\mu _n}$ is not detrimental, since it actually improves the SNR. If the mean intensity of the noise equals the mean intensity of the signal, i.e. ${\mu _n} = {\mu _B}$, then Eq. (41) reduces to Eq. (37), as expected. Usually the mean intensity of the noise ${\mu _n}$ changes with the axial position and is smaller at deeper axial positions. However, no matter how ${\mu _n}$ changes, it is bounded by 0 and ${\mu _B}$, i.e. $0 < {\mu _n} < {\mu _B}$. Therefore, the SNR of the shaded area is also bounded. The upper bound of the SNR in the shaded area is given by Eq. (42) when ${\mu _n} = {\mu _B}$,
$$SN{R_{BC - shaded - max}} = SN{R_B} + 10{\log _{10}}N,$$
which is the same as the SNR of the unshaded area. The lower bound of the SNR is given by Eq. (43) when ${\mu _n} = 0$,
$$SN{R_{BC - shaded - min}} = SN{R_B} + 10{\log _{10}}\left( {N + \frac{1}{N} - 2} \right),$$
which we are more interested in, because it defines the worst case of how the shading effect of the surgical tool impacts the SNR of BC-mode OCT images.
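To make the bounds concrete, the sketch below evaluates Eq. (41) together with its two limits, Eq. (42) and Eq. (43). The B-mode SNR value used here is only an illustrative placeholder:

```python
import math

def snr_shaded_db(snr_b_db, N, mu_n_over_mu_b):
    """Eq. (41): shaded-area SNR at s = t0 as a function of mu_n / mu_B."""
    return snr_b_db + 10.0 * math.log10((N - 1 + mu_n_over_mu_b) ** 2 / N)

snr_b = 9.43  # illustrative B-mode SNR in dB (an assumed value)
for N in (2, 4, 8):
    lower = snr_shaded_db(snr_b, N, 0.0)   # Eq. (43): mu_n = 0
    upper = snr_shaded_db(snr_b, N, 1.0)   # Eq. (42): mu_n = mu_B
    print(N, round(lower, 2), round(upper, 2))
```

For small N the worst-case shaded SNR can fall below the B-mode SNR, but both bounds grow with N, consistent with the discussion above.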

To validate our theory on SNR, we compared theoretical SNR predictions with experimental SNR values calculated from BC-mode images. To obtain the dependence of the SNR on the averaging frame number N, we acquire BC-mode OCT images for different N from 1 to 8. As shown in Fig. 6(a), we choose three different ROIs on the BC-mode image. ROI-1 is used to calculate the standard deviation of the noise. ROI-2 is used to calculate the mean intensity of the unshaded area, while ROI-3 is used to calculate the mean intensity of the shaded area. For better comparison, we chose ROI-2 and ROI-3 at the same axial depth, as can be seen in Fig. 6(a). Figure 6(a) was rescaled for better figure arrangement. To explore the lower bound of the SNR for the shaded area, we set the noise intensity of the shaded area to zero. The results are plotted in Fig. 6(b). The blue curve and the blue circles show the theoretical and experimental values for the SNR of the unshaded area, respectively. The red curve and the red circles show the theoretical lower bound and the experimental values for the SNR of the shaded area, respectively. The constant $SN{R_B}$ was found to be 9.43 by least-squares fitting our theoretical models to the experimental data. Note that the absolute value of the SNR is not as important as its relative trend, since the absolute value can easily be adjusted by changing the reference arm power or by background subtraction, while the trend remains the same. As we can see, the theoretical and experimental values match well. The small offset between them arises because the mean intensity ${\mu _B}$ of the OCT signal does not remain exactly the same within a small range around the ROIs.

 

Fig. 6. (a) Three ROIs in the BC-mode OCT images. (b) The comparison between the theoretically predicted SNR and the experimentally calculated SNR for both the shaded and the unshaded area.


From the plot in Fig. 6(b), we can see that when we increase the averaging frame number N for the BC-mode OCT image, the SNR for both the unshaded and the shaded area increases, and the difference between the two SNRs decreases. However, there are other aspects to consider when choosing a proper value of N, such as the imaging resolution discussed before and the imaging speed discussed in the following section.

2.2.3. Imaging speed

The imaging speed for BC-mode OCT image visualization is easy to analyze. Nevertheless, the analysis is necessary because it gives an upper bound on the number of averaging frames N if we want a decent video frame rate. The frame rate is determined by the sweep rate of the laser source and the number of scanning spots within one frame, as described by Eq. (44),

$$frame\;rate = \frac{{{R_A}}}{{{N_{BC}}}}.$$
where ${R_A}$ is the available A-scan sweep rate of the swept source, and ${N_{BC}}$ is the number of scanning spots within one BC-mode OCT image frame. In our OCT system, the swept source has a 100 kHz sweep rate, i.e. ${R_A} = 100kHz$. Since each B-mode OCT image contains 1000 A-scan lines and each BC-mode OCT image contains N B-mode OCT images, we have ${N_{BC}} = 1000 \cdot N$. Therefore, by using Eq. (44), the frame rate of our system is $100/N$ Hz. The frame rate should be at least 10 Hz to guide the surgical tool at video rate. Therefore, the number of averaging frames for each BC-mode OCT image should be no more than 10, i.e. $N \le 10$.
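The frame-rate budget of Eq. (44) can be captured in a few lines. The sketch below assumes the system parameters quoted above (100 kHz sweep rate, 1000 A-lines per B-frame); the function name is ours:

```python
def bc_frame_rate(sweep_rate_hz, a_lines_per_b_frame, n_averaging_frames):
    """Eq. (44): frame rate = R_A / N_BC, with N_BC = A-lines per B-frame * N."""
    return sweep_rate_hz / (a_lines_per_b_frame * n_averaging_frames)

# For these parameters the frame rate is 100 / N Hz, so N <= 10 keeps the
# rate at or above the 10 Hz video-rate requirement.
for n in (1, 4, 8, 10):
    print(n, bc_frame_rate(100_000, 1000, n), "Hz")
```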

2.2.4. Tracking accuracy

Although BC-mode visualization improves the visual experience by continuously keeping the needle position in the imaging field, it is unknown with what accuracy the needle can be tracked using BC-mode OCT images. For procedures involving needle guidance, the most important target requiring precise tracking is the needle tip, so that the distance between the needle tip and the target surface, such as Descemet’s membrane in the case of DALK, can be accurately calculated. To quantitatively analyze the needle tip tracking accuracy, we need to consider the needle shape and the BC-mode scanning scheme.

Figure 7 shows the geometric model for the needle tracking accuracy analysis. In Fig. 7(a), the blue part represents the needle and the red lines indicate the sparse BC-mode scanning cross sections. The needle tip is modeled as an isosceles triangle as a first-order approximation. The distance between adjacent scanning lines is the scanning step size s, which was defined previously. The scanning step size s should satisfy $0 < s < {t_0}$, where ${t_0}$ is the thickness of the surgical tool along the sparsely scanning axis Z. The angle $\theta $ is the angle of the needle tip seen from the top view. Figure 7(b) shows the 3-dimensional diagram of the needle, where the top needle is the real needle and the bottom needle is the orthogonal projection of the real needle onto the horizontal plane XZ, which is also shown in the top view in Fig. 7(a). The angle $\alpha $ is the real angle of the needle tip, which is fixed for a given type of needle. The angle $\beta $ is the insertion angle between the needle and the horizontal plane. The angle $\theta $ is the angle of the needle tip after its projection onto the horizontal plane, which is the same as the one shown in Fig. 7(a). Based on the geometric relationship in Fig. 7(b), the angle $\theta $ can be calculated as in Eq. (45),

$$\theta = 2{\tan ^{ - 1}}\left( {\frac{{\tan \frac{\alpha }{2}}}{{\cos \beta }}} \right).$$
During the surgical procedure, the needle can be placed arbitrarily, while the positions of the BC-mode scanning lines are fixed. In practice, we try to make both the needle orientation and the needle movement parallel to the BC-mode scanning axis X during the experiment, so we neglect the small angle that might occur between them in the theoretical analysis. We define x as the distance between the needle tip and the closest BC-mode scanning line. Therefore, x should satisfy $0 \le x \le s/2$. Moreover, we assume that x is uniformly distributed in this range. The next step is to calculate the measurement error of the needle tip position when using the BC-mode OCT image. The vertical blue dashed line on the left side of Fig. 7(a) shows the real position of the needle tip, while the one on the right side shows the measured position of the needle tip in the BC-mode OCT image. The error e is the distance between the two blue dashed lines. Since we care more about the vertical distance between the needle tip and the target surface, we are more interested in the vertical component of the error, ${e_v} = e\tan \beta $. Using the geometric angle relation in Eq. (45), we can get the vertical error ${e_v}$ in Eq. (46),
$${e_v} = \frac{x}{{\tan \frac{\theta }{2}}} \cdot \tan \beta = \frac{{\sin \beta }}{{\tan \frac{\alpha }{2}}} \cdot x.$$
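Equations (45) and (46) translate directly into code. The sketch below evaluates the projected tip angle and the vertical error for a 30-gauge needle ($\alpha \approx 60^{\circ}$, as measured later in the text) and an illustrative insertion angle of our choosing:

```python
import math

def projected_tip_angle(alpha, beta):
    """Eq. (45): needle tip angle after projection onto the XZ plane."""
    return 2.0 * math.atan(math.tan(alpha / 2.0) / math.cos(beta))

def vertical_error(x, alpha, beta):
    """Eq. (46): vertical tip-position error for tip-to-scan-line distance x."""
    return math.sin(beta) / math.tan(alpha / 2.0) * x

alpha = math.radians(60)   # tip angle of a 30-gauge needle
beta = math.radians(15)    # illustrative insertion angle (an assumption)
theta = projected_tip_angle(alpha, beta)
print(round(math.degrees(theta), 1))               # projection widens the tip angle
print(round(vertical_error(0.1, alpha, beta), 4))  # in mm if x is in mm
```

Note that $\theta > \alpha$ whenever $\beta > 0$, since the projection shortens the needle but not its width.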

 

Fig. 7. The model for needle tracking accuracy calculations. (a) The top view. (b) The 3-dimensional diagram.


Since the distance from the needle tip to the closest BC-mode scanning line is bounded, i.e. $0 \le x \le s/2$, the vertical error ${e_v}$ is also bounded. This is the advantage of the BC-mode OCT scanning scheme: it is much easier to keep the needle within the scanning field than with the B-mode OCT scanning scheme. As long as the needle is kept inside the scanning field, the measurement error of the needle tip is always bounded. For the B-mode OCT scanning scheme, however, since there is only one scanning line, the measurement error of the needle tip is not bounded and can be comparatively large if the needle does not lie exactly in the B-mode scanning plane.

To proceed, based on our assumption that x is a random variable that is uniformly distributed in the range $[{0,s/2} ]$, we can calculate the mean of the vertical error in Eq. (47),

$${\mu _{{e_v}}} = \frac{2}{s}\int_0^{\frac{s}{2}} {dx} \;{e_v} = \frac{{\sin \beta }}{{\tan \frac{\alpha }{2}}} \cdot \frac{s}{4}.$$
We can also calculate the standard deviation of the vertical error in Eq. (48),
$${\sigma _{{e_v}}} = \sqrt {\frac{2}{s}\int_0^{\frac{s}{2}} {dx} \;{{({{e_v} - {\mu_{{e_v}}}} )}^2}} = \frac{{\sin \beta }}{{\tan \frac{\alpha }{2}}} \cdot \frac{s}{{4\sqrt 3 }}.$$
The results show that both the mean and the standard deviation of the vertical error increase linearly with the scanning step size s, and they also increase with the insertion angle $\beta $. Therefore, with a smaller insertion angle, we get better tracking accuracy. Furthermore, we can see that the error is always positive, which means that the real needle tip position is always ahead of the measured needle position. Thus, we should compensate for this error by adding a small vertical distance d to the measured needle tip position in the BC-mode OCT image. To find the appropriate value of d, we use minimum variance estimation (MVE). The variance of the error after adding d is given by Eq. (49),
$$\sigma _d^2 = \frac{2}{s}\int_0^{\frac{s}{2}} {dx} \;{({{e_v} - d} )^2} = \frac{2}{s}\int_0^{\frac{s}{2}} {dx} \;{({({{e_v} - {\mu_{{e_v}}}} )- ({d - {\mu_{{e_v}}}} )} )^2} = \sigma _{{e_v}}^2 + {({d - {\mu_{{e_v}}}} )^2}.$$
Therefore, to minimize the variance $\sigma _d^2$, we should choose $d = {\mu _{{e_v}}}$. In this case, the minimum variance of the measurement error will be the same as the variance of the vertical error, i.e. $\sigma _d^2 = \sigma _{{e_v}}^2$.
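The compensation rule can be checked with a short Monte Carlo experiment. The sketch below draws $x \sim U[0, s/2]$, compensates by $d = {\mu _{{e_v}}}$, and confirms that the residual variance equals $\sigma _{{e_v}}^2$ per Eq. (49); the step size and insertion angle are arbitrary illustrative values:

```python
import math
import random

def error_stats(s, alpha, beta):
    """Eqs. (47)-(48): mean and standard deviation of the vertical error."""
    c = math.sin(beta) / math.tan(alpha / 2.0)
    return c * s / 4.0, c * s / (4.0 * math.sqrt(3.0))

random.seed(0)
alpha, beta, s = math.radians(60), math.radians(15), 0.25  # s in mm (assumed)
mu_ev, sigma_ev = error_stats(s, alpha, beta)

# Compensate each simulated error e_v = c * x by d = mu_ev and measure
# the residual variance; Eq. (49) says it is minimized and equals sigma_ev^2.
c = math.sin(beta) / math.tan(alpha / 2.0)
residuals = [c * random.uniform(0.0, s / 2.0) - mu_ev for _ in range(100_000)]
emp_var = sum(r * r for r in residuals) / len(residuals)
print(abs(emp_var - sigma_ev**2) < 1e-5)
```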

To better understand how the scanning step size s and the insertion angle $\beta $ affect the tracking accuracy, the measurement error after the distance compensation is plotted versus the scanning step size s for different insertion angles $\beta $ in Fig. 8. The angle of the 30-gauge needle tip $\alpha $ was measured to be about ${60^ \circ }$. As we can see in Fig. 8, the smaller the insertion angle, the better the needle tracking accuracy. Moreover, the tracking error of the needle tip in the BC-mode OCT image increases linearly with the scanning step size s. Although we discussed before that a larger scanning step size s gives better SNR, we also need to consider the tracking accuracy when choosing a proper value of s.

 

Fig. 8. The vertical tracking accuracy.


In conclusion, the BC-mode OCT image has better tracking accuracy than the B-mode OCT image. To further improve the tracking accuracy of the BC-mode OCT image, we should compensate for the error by adding ${\mu _{{e_v}}}$ to the measured needle tip vertical position, which yields the minimal vertical measurement standard deviation ${\sigma _{{e_v}}}$. Moreover, one can obtain better tracking accuracy by lowering the insertion angle $\beta $ or decreasing the scanning step size s.

2.3. Image processing

In the previous theoretical section, we performed a complete analysis of the theoretical performance of the BC-mode OCT image. Briefly, compared with conventional B-mode OCT images, BC-mode OCT images have better SNR and better tracking accuracy. Although the image resolution deteriorates, the effect is small for samples with a flat structure. Moreover, the image resolution is of the same order of magnitude as the tracking accuracy, so it is still good enough for the surgical tool tracking task. Furthermore, the imaging speed of BC-mode OCT image visualization is fast enough for video-rate surgical tool guidance.

To further improve the BC-mode OCT image quality, we describe the image processing techniques that we use, and their rationale, in the following paragraphs. From the previous analysis, we know that over the whole signal area the BC-mode OCT image has better SNR than the B-mode OCT image. However, signals from the surgical tool or the tissue layers, which are strong in the B-mode OCT image due to their high reflectivity, become weak in the BC-mode OCT image due to the averaging effect. To solve this problem, an inter-frame variance processing technique is proposed to extract and enhance the signals of both the surgical tool and the tissue layers.

Before proceeding, we first eliminate saturation in all the images of the OCT volume data set. Otherwise, the undesirable saturation lines would also be extracted and enhanced. To suppress the saturation, we first decreased the optical coupling efficiency in the reference arm by adjusting the optical alignment shown in Fig. 1(a). Then we apply a Hanning window [34] to the spectrum before the Fourier transform. The Hanning window is a standard procedure for side mode suppression [35], and the process is described by Eq. (50),

$$\tilde{I}(k )= I(k )\cdot \frac{1}{2}\left( {1 - \cos \left( {\frac{{2\pi k}}{{K - 1}}} \right)} \right),$$
where I is the original spectrum, $\tilde{I}$ is the new spectrum after applying the Hanning window, k refers to the k-th index in the spectrum, and K is the total length of the spectrum. Next, we apply the Fourier transform to $\tilde{I}$ to get the time-domain signal. The result after decreasing the optical coupling in the reference arm and applying the Hanning window to the spectral data is shown in Fig. 9(a). We can see that both the number of saturation lines and the saturated intensity are greatly reduced. However, there are still some saturation lines, indicated by the yellow triangle in Fig. 9(a), which need to be eliminated.
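As a minimal sketch of the windowing step, Eq. (50) is equivalent to multiplying the raw spectrum by the standard Hanning taper (NumPy's built-in `np.hanning` uses the same definition); the synthetic fringe below is illustrative:

```python
import numpy as np

def apply_hanning(spectrum):
    """Eq. (50): taper the raw spectrum with a Hanning window before the
    Fourier transform to suppress side lobes / side modes."""
    K = len(spectrum)
    k = np.arange(K)
    return spectrum * 0.5 * (1.0 - np.cos(2.0 * np.pi * k / (K - 1)))

# Synthetic fringe from a single reflector: windowing lowers the side lobes
# of its A-line peak at the cost of a slightly wider main lobe.
K = 1024
fringe = np.cos(2 * np.pi * 50 * np.arange(K) / K)
a_line = np.abs(np.fft.fft(apply_hanning(fringe)))
```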

 

Fig. 9. (a) The saturation reduced B-mode OCT image after decreasing the optical coupling in the reference arm and applying the Hanning window to the spectrum data. (b) The saturation free B-mode OCT image after the saturation elimination.


To detect the saturation lines, we first calculated the average intensity of each A-scan line. Then, we calculated the mean value $\mu $ and the standard deviation $\sigma $ of all the average intensities. The saturation lines are identified as the lines whose average intensities are greater than a threshold $T = \mu + \beta \cdot \sigma $, where $\beta $ is a constant that can be adjusted according to how much saturation needs to be removed. Without loss of generality, we set $\beta = 2$ in our system. After finding the saturation lines, we replace them with the lines at the same locations from the previous adjacent image slice, which is guaranteed to be free of saturation since it has already been processed by the same procedure. The result after the saturation elimination is shown in Fig. 9(b), where no obvious saturation lines remain.
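The detection-and-replacement rule can be sketched as follows, assuming each frame is stored with depth along the rows and A-lines along the columns; the function name and the synthetic demo data are ours:

```python
import numpy as np

def remove_saturation_lines(frame, prev_clean_frame, beta=2.0):
    """Detect A-lines whose average intensity exceeds T = mu + beta * sigma
    and replace them with the same columns of the previous, already
    saturation-free frame."""
    line_means = frame.mean(axis=0)          # average intensity per A-line
    threshold = line_means.mean() + beta * line_means.std()
    saturated = line_means > threshold
    cleaned = frame.copy()
    cleaned[:, saturated] = prev_clean_frame[:, saturated]
    return cleaned

# Demo: one deliberately saturated A-line (column 5) gets replaced.
rng = np.random.default_rng(1)
frame = rng.uniform(0.0, 10.0, (64, 32))
frame[:, 5] = 255.0
prev_clean = rng.uniform(0.0, 10.0, (64, 32))
cleaned = remove_saturation_lines(frame, prev_clean)
```

With $\beta = 2$, as in our system, only A-lines more than two standard deviations above the mean intensity are treated as saturated.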

After the saturation elimination, the main image processing scheme is shown in Fig. 10. The image set contains multiple saturation-free B-mode OCT images. The BC-mode OCT image ${I_{BC}}$ is calculated by averaging the image set, which is defined in Eq. (1). The variance image is calculated in Eq. (51),

$$V({i,j} )= \frac{1}{N}\sum\limits_{k = 1}^N {{{({{S_k}({i,j} )- {I_{BC}}({i,j} )} )}^2}} ,$$
where V is the variance image, ${S_k}$ is the k-th image in the image set, ${I_{BC}}$ is the BC-mode OCT image, N is the total number of images in one image set and $({i,j} )$ indicates the pixel index in the image. For convenience, we rescale the variance image to [0, 255].
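A compact sketch of the averaging and inter-frame variance steps of Eq. (51), including the rescaling to [0, 255], might look like this (the function name and demo data are ours):

```python
import numpy as np

def bc_and_variance(image_set):
    """Average the sparse B-frame set into the BC-mode image I_BC and
    compute the inter-frame variance image V of Eq. (51), rescaled to
    [0, 255]."""
    stack = np.asarray(image_set, dtype=np.float64)  # shape (N, rows, cols)
    i_bc = stack.mean(axis=0)
    v = ((stack - i_bc) ** 2).mean(axis=0)
    v = 255.0 * (v - v.min()) / (v.max() - v.min() + 1e-12)
    return i_bc, v

# Demo: a feature present in only one frame yields the maximum variance.
frames = np.zeros((4, 8, 8))
frames[0, 3, 3] = 40.0
i_bc, v = bc_and_variance(frames)
```

Pixels that change strongly from frame to frame along Z, such as the tool edge and the tissue layers, map to high values in V, which is exactly what the enhancement step exploits.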

 

Fig. 10. The image processing scheme.


As we mentioned before, although the BC-mode OCT image ${I_{BC}}$ contains all the information about both the surgical tool and the tissue, the signals of the surgical tool and the tissue layers are weakened due to the averaging effect. The variance image V, on the other hand, highlights the areas where features change quickly along the Z axis, such as the surgical tool and the tissue layers. Therefore, the two images complement each other well. To enhance the BC-mode OCT image, the idea is to add the two images together. However, in order to fully utilize the useful information and suppress the useless information in both images, we need to apply some image processing, such as filtering, auto-thresholding, and auto-rescaling. ${\bar{I}_{BC}}$ and $\bar{V}$ are the images after processing. Among them, ${\bar{I}_{BC}}$ contains the coarse information about the surgical tool and the tissue with good contrast, while $\bar{V}$ contains the clean detailed information, such as the surgical tool and the tissue layers. The final enhanced BC-mode OCT image is obtained by adding them together, as shown in Eq. (52),

$${\tilde{I}_{BC}}({i,j} )= {\bar{I}_{BC}}({i,j} )+ \bar{V}({i,j} ),$$
where ${\bar{I}_{BC}}$ is the processed BC-mode OCT image, $\bar{V}$ is the processed variance image, ${\tilde{I}_{BC}}$ is the final enhanced BC-mode OCT image, and $({i,j} )$ indicates the pixel index in the image.

The main image processing methods and results are shown in Fig. 11. For the BC-mode OCT image, in order to further reduce the variance of the speckle noise, we first apply a 3 × 3 averaging filter window. This step is necessary since any speckle noise will be magnified in the following contrast enhancement process, and we want as little speckle noise as possible. Moreover, the 3 × 3 averaging filter does not affect the image resolution much compared with the BC-mode averaging scheme. The black solid line in Fig. 11(a) shows the histogram of the BC-mode OCT image after the filtering. From the histogram, we can see that the contrast of the image is not good enough, since the intensities are mostly distributed from 0 to 50. This is also visible in the original BC-mode OCT image in Fig. 11(b). To further reduce the noise and enhance the image contrast, two critical intensities are found from the histogram. The first critical intensity is found at the peak of the histogram. We regard this intensity as the boundary between the noise and the signal. The second critical intensity is found through the auto triangle threshold [36]. The triangle threshold intensity is the point on the histogram with the largest distance to the line connecting the peak and the end of the histogram, which is indicated by the blue dotted lines in Fig. 11(a). We regard this intensity as the boundary between the coarse structure information and the detailed structure information. The two critical intensities are shown as two vertical red dashed lines in Fig. 11(a), which divide the histogram into three parts. From left to right, these three parts are regarded as the noise, the coarse structure information, and the detailed structure information, respectively. Since the coarse structure information is the most important part of the BC-mode OCT image, we perform a histogram mapping as indicated by the red dashed line in Fig. 11(a).
The processed BC-mode OCT image ${\bar{I}_{BC}}$ is shown in Fig. 11(c). Compared with the original BC-mode OCT image in Fig. 11(b), it shows the coarse structure of the cornea well, with a low noise level and high contrast. However, as discussed before, the signals of the needle and the cornea layers are weakened due to the averaging effect. This can be solved by using the variance image, which is discussed in detail in the next paragraph.

 

Fig. 11. (a) The histogram of the BC-mode OCT image. (b) The original BC-mode OCT image ${I_{BC}}$. (c) The processed BC-mode OCT image ${\bar{I}_{BC}}$. (d) The histogram of the variance image. (e) The processed variance image $\bar{V}$. (f) The final enhanced BC-mode OCT image ${\tilde{I}_{BC}}$. In (a) and (d), the black solid line is the histogram; the blue dotted line indicates the auto triangle threshold method; the red dashed line shows the auto rescale method.


While the BC-mode OCT image is used to obtain the coarse structure information, the variance image is used to obtain the clean detailed structure information, such as the needle and the cornea layers. We do not use the BC-mode OCT image for that purpose since its detailed information is not clean enough. To reduce the noise, we first apply a 3 × 3 averaging filter window to the variance image. The histogram of the variance image after filtering is shown as the black solid line in Fig. 11(d). To further extract the detailed information, two critical intensities are again found from the histogram. The first critical intensity is found through the auto triangle threshold [36], as indicated by the blue dotted lines in Fig. 11(d) and explained before. We regard this intensity as the boundary between the coarse structure information and the detailed structure information. The second critical intensity is found at a certain distance to the right of the first critical intensity. For convenience, we set this distance equal to that between the peak intensity and the first critical intensity. In Fig. 11(d), the two critical intensities are shown as two vertical red dashed lines, and the histogram mapping is indicated by the red dashed line. The processed variance image $\bar{V}$ is shown in Fig. 11(e). We can see that the processed variance image clearly shows the needle and the cornea layers.
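For reference, the auto triangle threshold [36] selects the histogram bin with the largest perpendicular distance to the straight line joining the histogram peak and the far end. The sketch below is our reconstruction of that rule, assuming the tail lies to the right of the peak; the demo histogram is synthetic:

```python
import numpy as np

def triangle_threshold(hist):
    """Triangle method: return the bin between the histogram peak and the
    right end with the largest perpendicular distance to the peak-to-end
    line."""
    peak = int(np.argmax(hist))
    end = len(hist) - 1
    x1, y1, x2, y2 = float(peak), float(hist[peak]), float(end), float(hist[end])
    norm = np.hypot(x2 - x1, y2 - y1)
    best_bin, best_dist = peak, -1.0
    for x in range(peak, end + 1):
        # perpendicular distance from (x, hist[x]) to the peak-end line
        d = abs((y2 - y1) * x - (x2 - x1) * hist[x] + x2 * y1 - y2 * x1) / norm
        if d > best_dist:
            best_bin, best_dist = x, d
    return best_bin

# Demo: a histogram with a peak at bin 10 followed by an exponential tail.
hist = np.concatenate([np.linspace(0.0, 1000.0, 11),
                       1000.0 * np.exp(-np.arange(1, 246) / 40.0)])
t = triangle_threshold(hist)
```

The threshold lands at the "knee" of the decaying tail, which is why it works well for separating the dominant low-intensity background from the sparser structure information.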

The final BC-mode OCT image ${\tilde{I}_{BC}}$ is obtained by adding the processed BC-mode OCT image ${\bar{I}_{BC}}$ and the processed variance image $\bar{V}$, as shown in Fig. 11(f). Intensities over 255 are clipped to 255 to prevent overflow. Comparing Fig. 11(c) with Fig. 11(f), we can see that the details of the needle and the cornea layers are greatly enhanced in Fig. 11(f), while the coarse structures remain the same. All the algorithms were implemented on the GPU for real-time data processing. The image and video results for the surgical tool guidance using BC-mode OCT image visualization during microsurgery are shown in the following section.

3. Results

We tested the feasibility of the proposed method by guiding the insertion of a 30-gauge needle into the cornea of an ex-vivo human eye. The results are shown in Fig. 12. The three images on the left are conventional B-mode OCT images, and the three on the right are BC-mode OCT images; the left and right columns show the same three stages during the needle insertion. The top two images show the needle just before it was inserted into the cornea surface. The middle two images show the needle when it had just entered the cornea. The bottom two images show the needle when it was deep inside the cornea. We can see that in the conventional B-mode OCT images, parts of the needle are missing and there are shadows underneath the needle, whereas in the BC-mode OCT images, the whole needle and the whole structure of the cornea are shown clearly. Moreover, the details of the needle, the cornea surface, and the cornea bottom are well enhanced. Furthermore, the BC-mode images have a lower noise level, with no saturation corruption. The video results of guiding a 30-gauge needle into the ex-vivo human cornea, as well as pulling the needle out of the cornea, using the freehand procedure can be viewed in Visualization 1. The fading of the surgical tool in Visualization 1 is mainly due to the high-frequency tremor during the freehand needle insertion. If the surgical tool moves with a relatively stable motion, its image does not fade, since the scanning will always capture the surgical tool.

 

Fig. 12. (a) The conventional B-mode OCT image just before the needle was inserted into the cornea. (b) The conventional B-mode OCT image when the needle just entered the cornea. (c) The conventional B-mode OCT image when the needle was deep inside the cornea. (d) The BC-mode OCT image just before the needle was inserted into the cornea. (e) The BC-mode OCT image when the needle just entered the cornea. (f) The BC-mode OCT image when the needle was deep inside the cornea.


From both the image results and the video results, we can see that the BC-mode OCT image is beneficial to surgeons when guiding the surgical tool. During surgical procedures, surgeons can place the surgical tool roughly parallel to the densely scanning axis X at the very beginning of an operation to make it easier for the surgical tool to stay within the scanning range during the insertion. Moreover, surgeons can infer the location and orientation of the surgical tool from the changes in the BC-mode OCT image as they adjust it. For example, by translating the surgical tool along the sparsely scanning axis Z and observing when the signal disappears, they can roughly tell whether the surgical tool is in the middle or at the edge of the scanning area. The orientation of the surgical tool can be roughly estimated from the continuity of the surgical tool signal: if the surgical tool makes a large angle with the densely scanning axis X, each B-mode OCT image captures only a part of the surgical tool, which causes a certain discontinuity in the final surgical tool signal.

4. Conclusions

In this paper, we propose a BC-mode OCT image visualization method for intraoperative OCT image-guided microsurgery. Compared with the B-mode OCT image, this visualization method solves both the problem of shadow effects in the tissue and the difficulty of keeping the surgical tool inside the imaging field. Compared with the C-mode OCT image, it provides a faster imaging speed and a larger field of view. Moreover, it provides significantly more information about the tissue beneath the surface because it shows a clearer cross-sectional view.

A complete theoretical analysis is developed for the performance of BC-mode OCT image visualization, including the resolution, SNR, imaging speed and needle tracking accuracy. It is quantitatively shown that BC-mode OCT images have a higher SNR and better tracking accuracy than conventional B-mode OCT images, together with a reasonable image resolution and an imaging speed fast enough for surgical tool guidance. However, the signals from the surgical tool and the tissue layers deteriorate due to the averaging effect in BC-mode OCT images. An inter-frame variance processing technique is therefore proposed to extract and enhance both signals; the enhanced BC-mode OCT image shows stronger signals from the surgical tool and the tissue layers.
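The averaging-plus-variance processing described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the `alpha` weight and the normalization are assumptions, and the input is taken to be an N × depth × width stack of registered, linear-intensity B-scans from one sparse C-scan.

```python
import numpy as np

def bc_mode_composite(frames, alpha=1.0):
    """Composite a BC-mode image from N sparsely spaced B-scans.

    frames: array of shape (N, depth, width), linear-intensity B-scans.
    alpha:  hypothetical weight for the variance-enhancement term.
    """
    mean_img = frames.mean(axis=0)   # averaging across frames suppresses speckle
    var_img = frames.var(axis=0)     # inter-frame variance is high where the tool
                                     # and tissue layers differ between frames
    var_norm = var_img / (var_img.max() + 1e-12)
    # Re-inject the variance map so signals diluted by averaging are enhanced.
    return mean_img + alpha * var_norm * frames.max(axis=0)
```

A bright reflector present in only one of the N frames is attenuated to 1/N by plain averaging, but its high inter-frame variance restores it in the composite.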

An experimental evaluation was conducted by guiding a 30-gauge needle into the cornea of an ex-vivo human eye using BC-mode OCT image visualization. The results demonstrate that the BC-mode OCT image shows both the surgical tool and the tissue structure clearly, without shadow effects. Moreover, with BC-mode OCT image visualization it is easier to keep the surgical tool inside the imaging field when operating freehand.

Furthermore, by changing the scanning configuration, the BC-mode OCT image visualization method can be applied to many other intraoperative OCT-guided microsurgery procedures. The scanning step s can be adjusted according to the tool thickness ${t_0}$, and the averaging frame number N can be set based on the required scanning range t and the scanning step s. The method does have a limitation: when the tissue has a large curvature and the tool is thick, the image resolution deteriorates, and BC-mode OCT image visualization does not work well. However, for most microsurgery procedures, where the tissue structure is flat and the tool is thin, BC-mode OCT image visualization will benefit microsurgery guidance.
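The parameter heuristic above can be sketched as simple arithmetic. The helper below is hypothetical (the `frames_per_tool` criterion is an assumption, not from the paper); it only illustrates deriving the scanning step s from the tool thickness ${t_0}$ and the frame number N from the scanning range t.

```python
import math

def bc_mode_parameters(tool_thickness_t0, scan_range_t, frames_per_tool=2):
    """Pick scanning step s and frame count N for a sparse C-scan.

    Assumption (not from the paper): s is chosen so that at least
    `frames_per_tool` B-scans intersect a tool of thickness t0, and N is
    the number of frames needed to cover the sparse scanning range t.
    """
    s = tool_thickness_t0 / frames_per_tool  # step small enough to hit the tool
    n = math.ceil(scan_range_t / s)          # frames needed to span range t
    return s, n
```

For example, for a 30-gauge needle (outer diameter roughly 0.31 mm) and a 2 mm sparse scanning range, this heuristic gives s = 0.155 mm and N = 13 frames.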

Funding

Wallace H. Coulter Foundation; Johns Hopkins University (Discovery Grant).

Acknowledgements

The authors would like to thank the Wallace H. Coulter Foundation and the JHU Discovery Grant for their support.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References

1. R. K. Daniel, “Microsurgery: through the looking glass,” N. Engl. J. Med. 300(22), 1251–1257 (1979). [CrossRef]  

2. M. Singh and A. Saxena, “Microsurgery: A useful and versatile tool in surgical field,” Surg. Curr. Res 4(4), 9–11 (2014). [CrossRef]  

3. H. Neshat, D. W. Cool, K. Barker, L. Gardi, N. Kakani, and A. Fenster, “A 3D ultrasound scanning system for image guided liver interventions,” Med. Phys. 40(11), 112903 (2013). [CrossRef]  

4. D. J. Brenner and E. J. Hall, “Computed tomography—an increasing source of radiation exposure,” N. Engl. J. Med. 357(22), 2277–2284 (2007). [CrossRef]  

5. V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997). [CrossRef]  

6. D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991). [CrossRef]  

7. J. U. Kang, Y. Huang, J. Cha, K. Zhang, Z. Ibrahim, W. A. Lee, G. Brandacher, and P. Gehlbach, “Real-time three-dimensional Fourier-domain optical coherence tomography video image guided microsurgeries,” J. Biomed. Opt. 17(8), 081403 (2012). [CrossRef]  

8. O. M. Carrasco-Zevallos, C. Viehland, B. Keller, M. Draelos, A. N. Kuo, C. A. Toth, and J. A. Izatt, “Review of intraoperative optical coherence tomography: technology and applications,” Biomed. Opt. Express 8(3), 1607–1637 (2017). [CrossRef]  

9. Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013). [CrossRef]  

10. J. U. Kang and G. Cheon, “Demonstration of Subretinal Injection Using Common-Path Swept Source OCT Guided Microinjector,” Appl. Sci. 8(8), 1287 (2018). [CrossRef]  

11. S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017). [CrossRef]  

12. H. Yu, J.-H. Shen, K. M. Joos, and N. Simaan, “Calibration and integration of b-mode optical coherence tomography for assistive control in robotic microsurgery,” IEEE/ASME Trans. Mechatron. 21(6), 2613–2623 (2016). [CrossRef]  

13. C. Viehland, B. Keller, O. M. Carrasco-Zevallos, D. Nankivil, L. Shen, S. Mangalesh, A. N. Kuo, C. A. Toth, and J. A. Izatt, “Enhanced volumetric visualization for real time 4D intraoperative ophthalmic swept-source OCT,” Biomed. Opt. Express 7(5), 1815–1829 (2016). [CrossRef]  

14. B. Keller, M. Draelos, G. Tang, S. Farsiu, A. N. Kuo, K. Hauser, and J. A. Izatt, “Real-time corneal segmentation and 3D needle tracking in intrasurgical OCT,” Biomed. Opt. Express 9(6), 2716–2732 (2018). [CrossRef]  

15. K. Zhang and J. U. Kang, “Real-time 4D signal processing and visualization using graphics processing unit on a regular nonlinear-k Fourier-domain OCT system,” Opt. Express 18(11), 11772–11784 (2010). [CrossRef]  

16. K. Zhang and J. U. Kang, “Real-time intraoperative 4D full-range FD-OCT based on the dual graphics processing units architecture for microsurgery guidance,” Biomed. Opt. Express 2(4), 764–770 (2011). [CrossRef]  

17. M. Draelos, B. Keller, C. Viehland, O. M. Carrasco-Zevallos, A. Kuo, and J. Izatt, “Real-time visualization and interaction with static and live optical coherence tomography volumes in immersive virtual reality,” Biomed. Opt. Express 9(6), 2825–2843 (2018). [CrossRef]  

18. O. M. Carrasco-Zevallos, C. Viehland, B. Keller, R. P. McNabb, A. N. Kuo, and J. A. Izatt, “Constant linear velocity spiral scanning for near video rate 4D OCT ophthalmic and surgical imaging with isotropic transverse sampling,” Biomed. Opt. Express 9(10), 5052–5070 (2018). [CrossRef]  

19. I. D. Bleicher, M. Jackson-Atogi, C. Viehland, H. Gabr, J. A. Izatt, and C. A. Toth, “Depth-Based, Motion-Stabilized Colorization of Microscope-Integrated Optical Coherence Tomography Volumes for Microscope-Independent Microsurgery,” Trans. Vis. Sci. Tech. 7(6), 1 (2018). [CrossRef]  

20. W. Wieser, W. Draxinger, T. Klein, S. Karpf, T. Pfeiffer, and R. Huber, “High definition live 3D-OCT in vivo: design and evaluation of a 4D OCT engine with 1 GVoxel/s,” Biomed. Opt. Express 5(9), 2963–2977 (2014). [CrossRef]  

21. T. Klein, W. Wieser, R. André, T. Pfeiffer, C. M. Eigenwillig, and R. Huber, “Multi-MHz FDML OCT: snapshot retinal imaging at 6.7 million axial-scans per second,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XVI, (International Society for Optics and Photonics, 2012), 82131E.

22. Y. Huang, X. Liu, and J. U. Kang, “Real-time 3D and 4D Fourier domain Doppler optical coherence tomography based on dual graphics processing units,” Biomed. Opt. Express 3(9), 2162–2174 (2012). [CrossRef]  

23. M. Szkulmowski, I. Gorczynska, D. Szlag, M. Sylwestrzak, A. Kowalczyk, and M. Wojtkowski, “Efficient reduction of speckle noise in Optical Coherence Tomography,” Opt. Express 20(2), 1337–1359 (2012). [CrossRef]  

24. M. Szkulmowski and M. Wojtkowski, “Averaging techniques for OCT imaging,” Opt. Express 21(8), 9757–9773 (2013). [CrossRef]  

25. Y. K. Tao, J. P. Ehlers, C. A. Toth, and J. A. Izatt, “Visualization of vitreoretinal surgical manipulations using intraoperative spectral domain optical coherence tomography,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XV, (International Society for Optics and Photonics, 2011), 78890F.

26. J. P. Ehlers, Y. K. Tao, S. Farsiu, R. Maldonado, J. A. Izatt, and C. A. Toth, “Visualization of real-time intraoperative maneuvers with a microscope-mounted spectral domain optical coherence tomography system,” Retina 33(1), 232–236 (2013). [CrossRef]  

27. A. F. Fercher, W. Drexler, C. K. Hitzenberger, and T. Lasser, “Optical coherence tomography-principles and applications,” Rep. Prog. Phys. 66(2), 239–303 (2003). [CrossRef]  

28. A. F. Fercher, “Optical coherence tomography,” J. Biomed. Opt. 1(2), 157–174 (1996). [CrossRef]  

29. J. G. Fujimoto and W. Drexler, “Introduction to OCT,” Optical Coherence Tomography: Technology and Applications, 3–64 (2015).

30. S. J. Chiu, X. T. Li, P. Nicholas, C. A. Toth, J. A. Izatt, and S. Farsiu, “Automatic segmentation of seven retinal layers in SDOCT images congruent with expert manual segmentation,” Opt. Express 18(18), 19413–19428 (2010). [CrossRef]  

31. D. P. Popescu, M. D. Hewko, and M. G. Sowa, “Speckle noise attenuation in optical coherence tomography by compounding images acquired at different positions of the sample,” Opt. Commun. 269(1), 247–251 (2007). [CrossRef]  

32. G. Gong, H. Zhang, and M. Yao, “Speckle noise reduction algorithm with total variation regularization in optical coherence tomography,” Opt. Express 23(19), 24699–24712 (2015). [CrossRef]  

33. M. Li, R. Idoughi, B. Choudhury, and W. Heidrich, “Statistical model for OCT image denoising,” Biomed. Opt. Express 8(9), 3903–3917 (2017). [CrossRef]  

34. F. J. Harris, “On the use of windows for harmonic analysis with the discrete Fourier transform,” Proc. IEEE 66(1), 51–83 (1978). [CrossRef]  

35. B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

36. G. Zack, W. Rogers, and S. Latt, “Automatic measurement of sister chromatid exchange frequency,” J. Histochem. Cytochem. 25(7), 741–753 (1977). [CrossRef]  

[Crossref]

J. P. Ehlers, Y. K. Tao, S. Farsiu, R. Maldonado, J. A. Izatt, and C. A. Toth, “Visualization of real-time intraoperative maneuvers with a microscope-mounted spectral domain optical coherence tomography system,” Retina 33(1), 232–236 (2013).
[Crossref]

S. J. Chiu, X. T. Li, P. Nicholas, C. A. Toth, J. A. Izatt, and S. Farsiu, “Automatic segmentation of seven retinal layers in SDOCT images congruent with expert manual segmentation,” Opt. Express 18(18), 19413–19428 (2010).
[Crossref]

Y. K. Tao, J. P. Ehlers, C. A. Toth, and J. A. Izatt, “Visualization of vitreoretinal surgical manipulations using intraoperative spectral domain optical coherence tomography,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XV, (International Society for Optics and Photonics, 2011), 78890F.

Jackson-Atogi, M.

I. D. Bleicher, M. Jackson-Atogi, C. Viehland, H. Gabr, J. A. Izatt, and C. A. Toth, “Depth-Based, Motion-Stabilized Colorization of Microscope-Integrated Optical Coherence Tomography Volumes for Microscope-Independent Microsurgery,” Trans. Vis. Sci. Tech. 7(6), 1 (2018).
[Crossref]

Johnson, B.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Joo, C.-K.

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

Joos, K. M.

H. Yu, J.-H. Shen, K. M. Joos, and N. Simaan, “Calibration and integration of b-mode optical coherence tomography for assistive control in robotic microsurgery,” Transactions on Mechatronics 21(6), 2613–2623 (2016).
[Crossref]

Jung, W.

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

Kakani, N.

H. Neshat, D. W. Cool, K. Barker, L. Gardi, N. Kakani, and A. Fenster, “A 3D ultrasound scanning system for image guided liver interventions,” Med. Phys. 40(11), 112903 (2013).
[Crossref]

Kang, J. U.

J. U. Kang and G. Cheon, “Demonstration of Subretinal Injection Using Common-Path Swept Source OCT Guided Microinjector,” Appl. Sci. 8(8), 1287 (2018).
[Crossref]

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

J. U. Kang, Y. Huang, J. Cha, K. Zhang, Z. Ibrahim, W. A. Lee, G. Brandacher, and P. Gehlbach, “Real-time three-dimensional Fourier-domain optical coherence tomography video image guided microsurgeries,” J. Biomed. Opt. 17(8), 081403 (2012).
[Crossref]

Y. Huang, X. Liu, and J. U. Kang, “Real-time 3D and 4D Fourier domain Doppler optical coherence tomography based on dual graphics processing units,” Biomed. Opt. Express 3(9), 2162–2174 (2012).
[Crossref]

K. Zhang and J. U. Kang, “Real-time intraoperative 4D full-range FD-OCT based on the dual graphics processing units architecture for microsurgery guidance,” Biomed. Opt. Express 2(4), 764–770 (2011).
[Crossref]

K. Zhang and J. U. Kang, “Real-time 4D signal processing and visualization using graphics processing unit on a regular nonlinear-k Fourier-domain OCT system,” Opt. Express 18(11), 11772–11784 (2010).
[Crossref]

Karpf, S.

Keller, B.

Kim, H.

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

Klein, T.

W. Wieser, W. Draxinger, T. Klein, S. Karpf, T. Pfeiffer, and R. Huber, “High definition live 3D-OCT in vivo: design and evaluation of a 4D OCT engine with 1 GVoxel/s,” Biomed. Opt. Express 5(9), 2963–2977 (2014).
[Crossref]

T. Klein, W. Wieser, R. André, T. Pfeiffer, C. M. Eigenwillig, and R. Huber, “Multi-MHz FDML OCT: snapshot retinal imaging at 6.7 million axial-scans per second,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XVI, (International Society for Optics and Photonics, 2012), 82131E.

Knauth, M.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Kowalczyk, A.

Kuo, A.

Kuo, A. N.

Kuth, R.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Kuznetsov, M.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Larson, N.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Lasser, T.

A. F. Fercher, W. Drexler, C. K. Hitzenberger, and T. Lasser, “Optical coherence tomography-principles and applications,” Rep. Prog. Phys. 66(2), 239–303 (2003).
[Crossref]

Latt, S.

G. Zack, W. Rogers, and S. Latt, “Automatic measurement of sister chromatid exchange frequency,” J. Histochem. Cytochem. 25(7), 741–753 (1977).
[Crossref]

Lee, W. A.

J. U. Kang, Y. Huang, J. Cha, K. Zhang, Z. Ibrahim, W. A. Lee, G. Brandacher, and P. Gehlbach, “Real-time three-dimensional Fourier-domain optical coherence tomography video image guided microsurgeries,” J. Biomed. Opt. 17(8), 081403 (2012).
[Crossref]

Lee, W. P. A.

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

Lenz, G.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Li, M.

Li, X. T.

Lin, C. P.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Liu, X.

Maldonado, R.

J. P. Ehlers, Y. K. Tao, S. Farsiu, R. Maldonado, J. A. Izatt, and C. A. Toth, “Visualization of real-time intraoperative maneuvers with a microscope-mounted spectral domain optical coherence tomography system,” Retina 33(1), 232–236 (2013).
[Crossref]

Mallon, E.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Mangalesh, S.

Mao, Q.

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

McKenzie, E.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

McNabb, R. P.

Melendez, C.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Moon, S.

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

Nankivil, D.

Neshat, H.

H. Neshat, D. W. Cool, K. Barker, L. Gardi, N. Kakani, and A. Fenster, “A 3D ultrasound scanning system for image guided liver interventions,” Med. Phys. 40(11), 112903 (2013).
[Crossref]

Nicholas, P.

Pang, J.

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

Pastyr, O.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Pfeiffer, T.

W. Wieser, W. Draxinger, T. Klein, S. Karpf, T. Pfeiffer, and R. Huber, “High definition live 3D-OCT in vivo: design and evaluation of a 4D OCT engine with 1 GVoxel/s,” Biomed. Opt. Express 5(9), 2963–2977 (2014).
[Crossref]

T. Klein, W. Wieser, R. André, T. Pfeiffer, C. M. Eigenwillig, and R. Huber, “Multi-MHz FDML OCT: snapshot retinal imaging at 6.7 million axial-scans per second,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XVI, (International Society for Optics and Photonics, 2012), 82131E.

Popescu, D. P.

D. P. Popescu, M. D. Hewko, and M. G. Sowa, “Speckle noise attenuation in optical coherence tomography by compounding images acquired at different positions of the sample,” Opt. Commun. 269(1), 247–251 (2007).
[Crossref]

Puliafito, C. A.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Rogers, W.

G. Zack, W. Rogers, and S. Latt, “Automatic measurement of sister chromatid exchange frequency,” J. Histochem. Cytochem. 25(7), 741–753 (1977).
[Crossref]

Saxena, A.

M. Singh and A. Saxena, “Microsurgery: A useful and versatile tool in surgical field,” Surg. Curr. Res 4(4), 9–11 (2014).
[Crossref]

Schlegel, W.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Schuman, J. S.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Shen, J.-H.

H. Yu, J.-H. Shen, K. M. Joos, and N. Simaan, “Calibration and integration of b-mode optical coherence tomography for assistive control in robotic microsurgery,” Transactions on Mechatronics 21(6), 2613–2623 (2016).
[Crossref]

Shen, L.

Shin, S.

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

Simaan, N.

H. Yu, J.-H. Shen, K. M. Joos, and N. Simaan, “Calibration and integration of b-mode optical coherence tomography for assistive control in robotic microsurgery,” Transactions on Mechatronics 21(6), 2613–2623 (2016).
[Crossref]

Singh, M.

M. Singh and A. Saxena, “Microsurgery: A useful and versatile tool in surgical field,” Surg. Curr. Res 4(4), 9–11 (2014).
[Crossref]

Sowa, M. G.

D. P. Popescu, M. D. Hewko, and M. G. Sowa, “Speckle noise attenuation in optical coherence tomography by compounding images acquired at different positions of the sample,” Opt. Commun. 269(1), 247–251 (2007).
[Crossref]

Staubert, A.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Stinson, W. G.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Swanson, E. A.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Sylwestrzak, M.

Szkulmowski, M.

Szlag, D.

Tang, G.

Tao, Y. K.

J. P. Ehlers, Y. K. Tao, S. Farsiu, R. Maldonado, J. A. Izatt, and C. A. Toth, “Visualization of real-time intraoperative maneuvers with a microscope-mounted spectral domain optical coherence tomography system,” Retina 33(1), 232–236 (2013).
[Crossref]

Y. K. Tao, J. P. Ehlers, C. A. Toth, and J. A. Izatt, “Visualization of vitreoretinal surgical manipulations using intraoperative spectral domain optical coherence tomography,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XV, (International Society for Optics and Photonics, 2011), 78890F.

Tong, D.

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

Toth, C. A.

I. D. Bleicher, M. Jackson-Atogi, C. Viehland, H. Gabr, J. A. Izatt, and C. A. Toth, “Depth-Based, Motion-Stabilized Colorization of Microscope-Integrated Optical Coherence Tomography Volumes for Microscope-Independent Microsurgery,” Trans. Vis. Sci. Tech. 7(6), 1 (2018).
[Crossref]

O. M. Carrasco-Zevallos, C. Viehland, B. Keller, M. Draelos, A. N. Kuo, C. A. Toth, and J. A. Izatt, “Review of intraoperative optical coherence tomography: technology and applications,” Biomed. Opt. Express 8(3), 1607–1637 (2017).
[Crossref]

C. Viehland, B. Keller, O. M. Carrasco-Zevallos, D. Nankivil, L. Shen, S. Mangalesh, A. N. Kuo, C. A. Toth, and J. A. Izatt, “Enhanced volumetric visualization for real time 4D intraoperative ophthalmic swept-source OCT,” Biomed. Opt. Express 7(5), 1815–1829 (2016).
[Crossref]

J. P. Ehlers, Y. K. Tao, S. Farsiu, R. Maldonado, J. A. Izatt, and C. A. Toth, “Visualization of real-time intraoperative maneuvers with a microscope-mounted spectral domain optical coherence tomography system,” Retina 33(1), 232–236 (2013).
[Crossref]

S. J. Chiu, X. T. Li, P. Nicholas, C. A. Toth, J. A. Izatt, and S. Farsiu, “Automatic segmentation of seven retinal layers in SDOCT images congruent with expert manual segmentation,” Opt. Express 18(18), 19413–19428 (2010).
[Crossref]

Y. K. Tao, J. P. Ehlers, C. A. Toth, and J. A. Izatt, “Visualization of vitreoretinal surgical manipulations using intraoperative spectral domain optical coherence tomography,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XV, (International Society for Optics and Photonics, 2011), 78890F.

Tronnier, V. M.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Viehland, C.

Wells, B.

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Wieser, W.

W. Wieser, W. Draxinger, T. Klein, S. Karpf, T. Pfeiffer, and R. Huber, “High definition live 3D-OCT in vivo: design and evaluation of a 4D OCT engine with 1 GVoxel/s,” Biomed. Opt. Express 5(9), 2963–2977 (2014).
[Crossref]

T. Klein, W. Wieser, R. André, T. Pfeiffer, C. M. Eigenwillig, and R. Huber, “Multi-MHz FDML OCT: snapshot retinal imaging at 6.7 million axial-scans per second,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XVI, (International Society for Optics and Photonics, 2012), 82131E.

Wirtz, C. R.

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Wojtkowski, M.

Yao, M.

Yoo, Y.-S.

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

Yu, H.

H. Yu, J.-H. Shen, K. M. Joos, and N. Simaan, “Calibration and integration of b-mode optical coherence tomography for assistive control in robotic microsurgery,” Transactions on Mechatronics 21(6), 2613–2623 (2016).
[Crossref]

Zack, G.

G. Zack, W. Rogers, and S. Latt, “Automatic measurement of sister chromatid exchange frequency,” J. Histochem. Cytochem. 25(7), 741–753 (1977).
[Crossref]

Zhang, H.

Zhang, K.

Zhu, S.

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

Appl. Sci. (1)

J. U. Kang and G. Cheon, “Demonstration of Subretinal Injection Using Common-Path Swept Source OCT Guided Microinjector,” Appl. Sci. 8(8), 1287 (2018).
[Crossref]

Biomed. Opt. Express (9)

C. Viehland, B. Keller, O. M. Carrasco-Zevallos, D. Nankivil, L. Shen, S. Mangalesh, A. N. Kuo, C. A. Toth, and J. A. Izatt, “Enhanced volumetric visualization for real time 4D intraoperative ophthalmic swept-source OCT,” Biomed. Opt. Express 7(5), 1815–1829 (2016).
[Crossref]

B. Keller, M. Draelos, G. Tang, S. Farsiu, A. N. Kuo, K. Hauser, and J. A. Izatt, “Real-time corneal segmentation and 3D needle tracking in intrasurgical OCT,” Biomed. Opt. Express 9(6), 2716–2732 (2018).
[Crossref]

K. Zhang and J. U. Kang, “Real-time intraoperative 4D full-range FD-OCT based on the dual graphics processing units architecture for microsurgery guidance,” Biomed. Opt. Express 2(4), 764–770 (2011).
[Crossref]

M. Draelos, B. Keller, C. Viehland, O. M. Carrasco-Zevallos, A. Kuo, and J. Izatt, “Real-time visualization and interaction with static and live optical coherence tomography volumes in immersive virtual reality,” Biomed. Opt. Express 9(6), 2825–2843 (2018).
[Crossref]

O. M. Carrasco-Zevallos, C. Viehland, B. Keller, R. P. McNabb, A. N. Kuo, and J. A. Izatt, “Constant linear velocity spiral scanning for near video rate 4D OCT ophthalmic and surgical imaging with isotropic transverse sampling,” Biomed. Opt. Express 9(10), 5052–5070 (2018).
[Crossref]

O. M. Carrasco-Zevallos, C. Viehland, B. Keller, M. Draelos, A. N. Kuo, C. A. Toth, and J. A. Izatt, “Review of intraoperative optical coherence tomography: technology and applications,” Biomed. Opt. Express 8(3), 1607–1637 (2017).
[Crossref]

Y. Huang, X. Liu, and J. U. Kang, “Real-time 3D and 4D Fourier domain Doppler optical coherence tomography based on dual graphics processing units,” Biomed. Opt. Express 3(9), 2162–2174 (2012).
[Crossref]

W. Wieser, W. Draxinger, T. Klein, S. Karpf, T. Pfeiffer, and R. Huber, “High definition live 3D-OCT in vivo: design and evaluation of a 4D OCT engine with 1 GVoxel/s,” Biomed. Opt. Express 5(9), 2963–2977 (2014).
[Crossref]

M. Li, R. Idoughi, B. Choudhury, and W. Heidrich, “Statistical model for OCT image denoising,” Biomed. Opt. Express 8(9), 3903–3917 (2017).
[Crossref]

J. Biomed. Opt. (4)

A. F. Fercher, “Optical coherence tomography,” J. Biomed. Opt. 1(2), 157–174 (1996).
[Crossref]

J. U. Kang, Y. Huang, J. Cha, K. Zhang, Z. Ibrahim, W. A. Lee, G. Brandacher, and P. Gehlbach, “Real-time three-dimensional Fourier-domain optical coherence tomography video image guided microsurgeries,” J. Biomed. Opt. 17(8), 081403 (2012).
[Crossref]

Y. Huang, Z. Ibrahim, D. Tong, S. Zhu, Q. Mao, J. Pang, W. P. A. Lee, G. Brandacher, and J. U. Kang, “Microvascular anastomosis guidance and evaluation using real-time three-dimensional Fourier-domain Doppler optical coherence tomography,” J. Biomed. Opt. 18(11), 111404 (2013).
[Crossref]

S. Shin, J. K. Bae, Y. Ahn, H. Kim, G. Choi, Y.-S. Yoo, C.-K. Joo, S. Moon, and W. Jung, “Lamellar keratoplasty using position-guided surgical needle and M-mode optical coherence tomography,” J. Biomed. Opt. 22(12), 125005 (2017).
[Crossref]

J. Histochem. Cytochem. (1)

G. Zack, W. Rogers, and S. Latt, “Automatic measurement of sister chromatid exchange frequency,” J. Histochem. Cytochem. 25(7), 741–753 (1977).
[Crossref]

Med. Phys. (1)

H. Neshat, D. W. Cool, K. Barker, L. Gardi, N. Kakani, and A. Fenster, “A 3D ultrasound scanning system for image guided liver interventions,” Med. Phys. 40(11), 112903 (2013).
[Crossref]

N. Engl. J. Med. (2)

D. J. Brenner and E. J. Hall, “Computed tomography—an increasing source of radiation exposure,” N. Engl. J. Med. 357(22), 2277–2284 (2007).
[Crossref]

R. K. Daniel, “Microsurgery: through the looking glass,” N. Engl. J. Med. 300(22), 1251–1257 (1979).
[Crossref]

Neurosurgery (1)

V. M. Tronnier, C. R. Wirtz, M. Knauth, G. Lenz, O. Pastyr, M. M. Bonsanto, F. K. Albert, R. Kuth, A. Staubert, and W. Schlegel, “Intraoperative diagnostic and interventional magnetic resonance imaging in neurosurgery,” Neurosurgery 40(5), 891–902 (1997).
[Crossref]

Opt. Commun. (1)

D. P. Popescu, M. D. Hewko, and M. G. Sowa, “Speckle noise attenuation in optical coherence tomography by compounding images acquired at different positions of the sample,” Opt. Commun. 269(1), 247–251 (2007).
[Crossref]

Opt. Express (5)

S. J. Chiu, X. T. Li, P. Nicholas, C. A. Toth, J. A. Izatt, and S. Farsiu, “Automatic segmentation of seven retinal layers in SDOCT images congruent with expert manual segmentation,” Opt. Express 18(18), 19413–19428 (2010).
[Crossref]

K. Zhang and J. U. Kang, “Real-time 4D signal processing and visualization using graphics processing unit on a regular nonlinear-k Fourier-domain OCT system,” Opt. Express 18(11), 11772–11784 (2010).
[Crossref]

Proc. IEEE (1)

F. J. Harris, “On the use of windows for harmonic analysis with the discrete Fourier transform,” Proc. IEEE 66(1), 51–83 (1978).
[Crossref]

Rep. Prog. Phys. (1)

A. F. Fercher, W. Drexler, C. K. Hitzenberger, and T. Lasser, “Optical coherence tomography-principles and applications,” Rep. Prog. Phys. 66(2), 239–303 (2003).
[Crossref]

Retina (1)

J. P. Ehlers, Y. K. Tao, S. Farsiu, R. Maldonado, J. A. Izatt, and C. A. Toth, “Visualization of real-time intraoperative maneuvers with a microscope-mounted spectral domain optical coherence tomography system,” Retina 33(1), 232–236 (2013).
[Crossref]

Science (1)

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Surg. Curr. Res (1)

M. Singh and A. Saxena, “Microsurgery: A useful and versatile tool in surgical field,” Surg. Curr. Res 4(4), 9–11 (2014).
[Crossref]

Trans. Vis. Sci. Tech. (1)

I. D. Bleicher, M. Jackson-Atogi, C. Viehland, H. Gabr, J. A. Izatt, and C. A. Toth, “Depth-Based, Motion-Stabilized Colorization of Microscope-Integrated Optical Coherence Tomography Volumes for Microscope-Independent Microsurgery,” Trans. Vis. Sci. Tech. 7(6), 1 (2018).
[Crossref]

Transactions on Mechatronics (1)

H. Yu, J.-H. Shen, K. M. Joos, and N. Simaan, “Calibration and integration of b-mode optical coherence tomography for assistive control in robotic microsurgery,” Transactions on Mechatronics 21(6), 2613–2623 (2016).
[Crossref]

Other (4)

T. Klein, W. Wieser, R. André, T. Pfeiffer, C. M. Eigenwillig, and R. Huber, “Multi-MHz FDML OCT: snapshot retinal imaging at 6.7 million axial-scans per second,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XVI, (International Society for Optics and Photonics, 2012), 82131E.

Y. K. Tao, J. P. Ehlers, C. A. Toth, and J. A. Izatt, “Visualization of vitreoretinal surgical manipulations using intraoperative spectral domain optical coherence tomography,” in Optical Coherence Tomography and Coherence Domain Optical Methods in Biomedicine XV, (International Society for Optics and Photonics, 2011), 78890F.

J. G. Fujimoto and W. Drexler, “Introduction to OCT,” Optical Coherence Tomography: Technology and Applications, 3–64 (2015).

B. Johnson, W. Atia, M. Kuznetsov, C. Cook, B. Goldberg, B. Wells, N. Larson, E. McKenzie, C. Melendez, and E. Mallon, “Swept light sources,” Optical Coherence Tomography: Technology and Applications, 639–658 (2015).

Supplementary Material (1)

Visualization 1: Video of the freehand insertion of a 30-gauge needle into the cornea of an ex-vivo human eye, and its withdrawal, guided by BC-mode OCT image visualization.


Figures (12)

Fig. 1. (a) The system setup. FC, fiber coupler; PC, polarization controller; AC, achromatic collimator; GVS, galvanometer scanner containing two scanning mirrors; OL, objective lens; DCL, dispersion compensation lens; BD+, BD−, balanced detector; DAQ, data acquisition board. (b) The scanning scheme.
Fig. 2. B-mode OCT images showing the 30-gauge needle inside the cornea of the ex-vivo human eye. (a) One slice from the image set. (b) The adjacent slice from the image set.
Fig. 3. The angle relationship. $\vec{n}$, unit tangential vector; $\alpha$, azimuth angle; $\beta$, polar angle.
Fig. 4. Human cornea model.
Fig. 5. (a) ROI-1. (b) ROI-2. (c) ROI-3. (d)–(f) Theoretical curves and experimental measurements of the BC-mode OCT image resolution in ROI-1, ROI-2, and ROI-3, respectively.
Fig. 6. (a) Three ROIs in the BC-mode OCT images. (b) Comparison between the theoretically predicted SNR and the experimentally calculated SNR for both the shaded and the unshaded areas.
Fig. 7. The model for needle tracking accuracy calculations. (a) Top view. (b) 3-dimensional diagram.
Fig. 8. The vertical tracking accuracy.
Fig. 9. (a) The saturation-reduced B-mode OCT image after decreasing the optical coupling in the reference arm and applying the Hanning window to the spectral data. (b) The saturation-free B-mode OCT image after the saturation elimination.
Fig. 10. The image processing scheme.
Fig. 11. (a) The histogram of the BC-mode OCT image. (b) The original BC-mode OCT image ${I_{BC}}$. (c) The processed BC-mode OCT image ${\bar{I}_{BC}}$. (d) The histogram of the variance image. (e) The processed variance image $\bar{V}$. (f) The final enhanced BC-mode OCT image ${\tilde{I}_{BC}}$. In (a) and (d), the black solid line is the histogram, the blue dotted line indicates the auto triangle threshold method, and the red dashed line shows the auto rescale method.
Fig. 12. (a) The conventional B-mode OCT image just before the needle was inserted into the cornea. (b) The conventional B-mode OCT image when the needle had just entered the cornea. (c) The conventional B-mode OCT image when the needle was deep inside the cornea. (d)–(f) The corresponding BC-mode OCT images at the same three stages.

Tables (1)

Table 1. System scanning configuration

Equations (52)

Equations on this page are rendered with MathJax.

$$ I_{BC}(i,j) = \frac{1}{N}\sum_{k=1}^{N} S_k(i,j), $$
$$ \sigma_x = \frac{0.61\,\lambda_0}{NA}, $$
$$ \sigma_y = \frac{0.44\,\lambda_0^2}{\Delta\lambda}, $$
$$ h_B(x,y) = \exp\!\left(-4\ln 2\,\frac{x^2}{\sigma_x^2}\right)\exp\!\left(-4\ln 2\,\frac{y^2}{\sigma_y^2}\right), $$
$$ I_{BC}(x,y) = \frac{1}{t}\int_{-t/2}^{t/2} dz\; f(x,y,z_0+z), $$
$$ I_{BC}(x,y) = \frac{1}{t}\int_{-t/2}^{t/2} dz\left( f(x,y,z_0) + \frac{\partial f}{\partial z}\,z \right). $$
$$ f\bigl(x(z),\, y(z),\, z\bigr) = \mathrm{const}. $$
$$ \frac{\partial f}{\partial x}\frac{dx}{dz} + \frac{\partial f}{\partial y}\frac{dy}{dz} + \frac{\partial f}{\partial z} = 0. $$
$$ I_{BC}(x,y) = \frac{1}{t}\int_{-t/2}^{t/2} dz\; f\!\left(x - \frac{dx}{dz}z,\; y - \frac{dy}{dz}z,\; z_0\right). $$
$$ \vec{n} = (\sin\beta\cos\alpha,\ \sin\beta\sin\alpha,\ \cos\beta). $$
$$ \frac{dx}{dz} = \tan\beta\cos\alpha, $$
$$ \frac{dy}{dz} = \tan\beta\sin\alpha. $$
$$ I_{BC}(x,y) = \frac{1}{t}\int_{-t/2}^{t/2} dz\; f(x - \tan\beta\, z\cos\alpha,\ y - \tan\beta\, z\sin\alpha,\ z_0). $$
$$ h_{BC}(x,y) = \frac{1}{t}\int_{-t/2}^{t/2} dz\; h_B(x - \tan\beta\, z\cos\alpha,\ y - \tan\beta\, z\sin\alpha), $$
$$ h_{BC}(x,y) = \int_{-\infty}^{+\infty} dz\, \exp\!\left(-4\ln 2\,\frac{(x - \tan\beta\, z\cos\alpha)^2}{\sigma_x^2}\right) \exp\!\left(-4\ln 2\,\frac{(y - \tan\beta\, z\sin\alpha)^2}{\sigma_y^2}\right) \frac{1}{t}\,\mathrm{Rect}\!\left(\frac{2z}{t}\right), $$
$$ \frac{1}{t}\,\mathrm{Rect}\!\left(\frac{2z}{t}\right) \approx \frac{1}{\sqrt{2\pi}\,\sigma_t}\exp\!\left(-\frac{z^2}{2\sigma_t^2}\right), $$
$$ h_{BC}(x,y) = \frac{1}{\sqrt{1 + \frac{1}{12}\left(\frac{t}{\sigma_0}\right)^2}}\, \exp\!\left(-4\ln 2\,\frac{\begin{pmatrix} x & y \end{pmatrix} A \begin{pmatrix} x & y \end{pmatrix}^T}{\sigma_x^2\sigma_y^2 + \frac{2\ln 2}{3}\tan^2\beta\,(\sigma_x^2\sin^2\alpha + \sigma_y^2\cos^2\alpha)\,t^2}\right), $$
$$ A = \begin{pmatrix} \sigma_y^2 + \frac{2\ln 2}{3}\tan^2\beta\,\sin^2\alpha\; t^2 & -\frac{\ln 2}{3}\tan^2\beta\,\sin 2\alpha\; t^2 \\ -\frac{\ln 2}{3}\tan^2\beta\,\sin 2\alpha\; t^2 & \sigma_x^2 + \frac{2\ln 2}{3}\tan^2\beta\,\cos^2\alpha\; t^2 \end{pmatrix}. $$
$$ \lambda_1 = \frac{\sigma_x^2 + \sigma_y^2}{2}, \quad \text{where } \vec{n}_1 = (\cos\alpha,\ \sin\alpha), $$
$$ \lambda_2 = \frac{\sigma_x^2 + \sigma_y^2}{2} + \frac{2\ln 2}{3}\tan^2\beta\, t^2, \quad \text{where } \vec{n}_2 = (-\sin\alpha,\ \cos\alpha), $$
$$ \sigma_{\max} = \sqrt{\frac{\sigma_x^2 + \sigma_y^2}{2} + \frac{2\ln 2}{3}\tan^2\beta\, t^2}. $$
$$ \sigma_{\min} = \sqrt{\frac{\sigma_x^2 + \sigma_y^2}{2}}. $$
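The closed-form widths above are straightforward to evaluate numerically. The following is a minimal Python sketch; the parameter values are illustrative assumptions, not the system specification from this paper:

```python
import math

def bc_resolutions(sigma_x, sigma_y, beta_deg, t):
    """Worst/best-case BC-mode transverse resolution (FWHM) from the
    eigenvalues derived above. All lengths share one unit (e.g. um)."""
    beta = math.radians(beta_deg)
    base = (sigma_x**2 + sigma_y**2) / 2.0                        # lambda_1 term
    broad = (2.0 * math.log(2) / 3.0) * math.tan(beta)**2 * t**2  # tilt broadening
    return math.sqrt(base + broad), math.sqrt(base)               # (sigma_max, sigma_min)

# Illustrative values only: 10 um lateral / 8 um axial FWHM,
# 5 deg tissue-layer tilt, 200 um averaging thickness t.
smax, smin = bc_resolutions(10.0, 8.0, 5.0, 200.0)
```

For $\beta = 0$ (a layer perpendicular to the scanning beam) the broadening term vanishes and the two widths coincide, matching the intuition that only tilted structures are blurred by the averaging.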
$$ h_{BC}(x,y) = \frac{4}{\pi t^2}\int_0^{t/2} dz \int_0^{2\pi} z\, d\alpha\; h_B(x - \tan\beta\, z\cos\alpha,\ y - \tan\beta\, z\sin\alpha), $$
$$ h_{BC}(x,y) = \int_{-\infty}^{+\infty} dx_0 \int_{-\infty}^{+\infty} dy_0\; h_B(x - \tan\beta\, x_0,\ y - \tan\beta\, y_0)\, \frac{4}{\pi t^2}\,\mathrm{Circle}\!\left(x_0^2 + y_0^2 \le \left(\tfrac{t}{2}\right)^2\right), $$
$$ \frac{4}{\pi t^2}\,\mathrm{Circle}\!\left(x_0^2 + y_0^2 \le \left(\tfrac{t}{2}\right)^2\right) \approx \frac{1}{2\pi\sigma_t^2}\exp\!\left(-\frac{x_0^2 + y_0^2}{2\sigma_t^2}\right), $$
$$ h_{BC}(x,y) = \frac{1}{\sqrt{1 + \frac{\ln 2}{8}\tan^2\beta\left(\frac{t}{\sigma_x}\right)^2}}\, \exp\!\left(-\frac{4\ln 2\, x^2}{\sigma_x^2 + \frac{\ln 2}{8}\tan^2\beta\, t^2}\right) \cdot \frac{1}{\sqrt{1 + \frac{\ln 2}{8}\tan^2\beta\left(\frac{t}{\sigma_y}\right)^2}}\, \exp\!\left(-\frac{4\ln 2\, y^2}{\sigma_y^2 + \frac{\ln 2}{8}\tan^2\beta\, t^2}\right). $$
$$ \bar{\sigma}_x = \sqrt{\sigma_x^2 + \frac{\ln 2}{8}\tan^2\beta\, t^2}, $$
$$ \bar{\sigma}_y = \sqrt{\sigma_y^2 + \frac{\ln 2}{8}\tan^2\beta\, t^2}, $$
$$ \beta = \tan^{-1}\frac{t}{4r}. $$
$$ SNR = 10\log_{10}\frac{\mu_s^2}{\sigma_n^2}, $$
$$ \mu_B(z_0) = \frac{1}{S}\int_S dx\,dy\; f(x,y,z_0), $$
$$ \mu_{BC}(z_0) = \frac{1}{S}\int_S dx\,dy\; \frac{1}{t}\int_{-t/2}^{t/2} dz\, f(x,y,z_0+z), $$
$$ \mu_{BC}(z_0) = \frac{1}{t}\int_{-t/2}^{t/2} dz\,\frac{1}{S}\int_S dx\,dy\; f(x,y,z_0+z) = \frac{1}{t}\int_{-t/2}^{t/2} dz\,\mu_B(z_0+z). $$
$$ \mu_{BC} = \mu_B, $$
$$ \bar{X} = \frac{1}{N}\sum_{i=1}^{N} X_i. $$
$$ \sigma_{BC}^2 = V(\bar{X}) = V\!\left(\frac{1}{N}\sum_{i=1}^{N} X_i\right) = \frac{1}{N^2}\sum_{i=1}^{N} V(X_i) = \frac{\sigma_B^2}{N}. $$
$$ SNR_{BC}^{unshaded} = 10\log_{10}\frac{\mu_{BC}^2}{\sigma_{BC}^2} = 10\log_{10}\!\left(\frac{\mu_B^2}{\sigma_B^2}\,N\right) = SNR_B + 10\log_{10} N, $$
$$ \mu_{BC}(z_0) = \frac{1}{t}\left(\int_{-t/2}^{t/2} dz\,\mu_B(z_0+z) + \int_{a - t_0/2}^{a + t_0/2} dz\,\bigl(\mu_n(z_0+z) - \mu_B(z_0+z)\bigr)\right), $$
$$ \mu_{BC} = \mu_B\,\frac{t - t_0}{t} + \mu_n\,\frac{t_0}{t}. $$
$$ SNR_{BC}^{shaded} = 10\log_{10}\frac{\mu_{BC}^2}{\sigma_{BC}^2} = 10\log_{10}\!\left(\frac{\mu_B^2}{\sigma_B^2}\,N\left(1 - \frac{\mu_B - \mu_n}{\mu_B}\,\frac{t_0}{N s}\right)^2\right). $$
$$ SNR_{BC}^{shaded} = SNR_B + 10\log_{10}\frac{1}{N}\left(N - 1 + \frac{\mu_n}{\mu_B}\right)^2, $$
$$ SNR_{BC}^{shaded\;max} = SNR_B + 10\log_{10} N, $$
$$ SNR_{BC}^{shaded\;min} = SNR_B + 10\log_{10}\!\left(N + \frac{1}{N} - 2\right), $$
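A quick numeric sketch of the SNR expressions above (assuming the $t_0 = s$ shaded case from the derivation; `mu_ratio` denotes $\mu_n/\mu_B$):

```python
import math

def snr_gain_unshaded(N):
    # dB gain from averaging N B-scans: SNR_BC = SNR_B + 10*log10(N)
    return 10.0 * math.log10(N)

def snr_gain_shaded(N, mu_ratio):
    # mu_ratio = mu_n / mu_B in [0, 1]; the t0 = s case derived above
    return 10.0 * math.log10((N - 1 + mu_ratio) ** 2 / N)
```

With `mu_ratio = 1` (shadow as bright as the surrounding tissue) the shaded gain reduces to the unshaded $10\log_{10}N$; with `mu_ratio = 0` (fully dark shadow) it falls to the minimum $10\log_{10}(N + 1/N - 2)$.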
$$ \text{frame rate} = \frac{R_A}{N_{BC}}. $$
$$ \theta = 2\tan^{-1}\!\left(\tan\frac{\alpha}{2}\cos\beta\right). $$
$$ e_v = x\,\tan\frac{\theta}{2}\,\tan\beta = \sin\beta\,\tan\frac{\alpha}{2}\; x. $$
$$ \mu_{e_v} = \frac{2}{s}\int_0^{s/2} dx\; e_v = \sin\beta\,\tan\frac{\alpha}{2}\;\frac{s}{4}. $$
$$ \sigma_{e_v} = \sqrt{\frac{2}{s}\int_0^{s/2} dx\,(e_v - \mu_{e_v})^2} = \sin\beta\,\tan\frac{\alpha}{2}\;\frac{s}{4\sqrt{3}}. $$
$$ \sigma_d^2 = \frac{2}{s}\int_0^{s/2} dx\,(e_v - d)^2 = \frac{2}{s}\int_0^{s/2} dx\,\bigl((e_v - \mu_{e_v}) - (d - \mu_{e_v})\bigr)^2 = \sigma_{e_v}^2 + (d - \mu_{e_v})^2. $$
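The mean and standard deviation of the vertical tracking error derived above can be checked with a short Monte-Carlo simulation; the angles and B-scan spacing below are assumed for illustration only:

```python
import math
import numpy as np

# e_v = sin(beta) * tan(alpha/2) * x, with the lateral offset x
# uniformly distributed on [0, s/2] as in the derivation above.
beta = math.radians(10.0)   # assumed needle polar angle
alpha = math.radians(4.0)   # assumed apex angle
s = 50.0                    # assumed B-scan spacing (um)

k = math.sin(beta) * math.tan(alpha / 2.0)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, s / 2.0, size=200_000)
e_v = k * x

mu_pred = k * s / 4.0                        # predicted mean error
sigma_pred = k * s / (4.0 * math.sqrt(3.0))  # predicted standard deviation
```

The sample mean and standard deviation of `e_v` converge to `mu_pred` and `sigma_pred`, confirming the uniform-offset model.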
$$ \tilde{I}(k) = I(k)\cdot\frac{1}{2}\left(1 - \cos\!\left(\frac{2\pi k}{K-1}\right)\right), $$
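Applied to a spectral A-line before the inverse Fourier transform, the Hanning window above can be sketched in a few lines of Python (the input array here is synthetic, not OCT data):

```python
import numpy as np

def hann_apodize(spectrum):
    """Multiply one spectral A-line by the Hann window of the equation
    above, tapering the spectrum ends to suppress FFT side lobes."""
    K = spectrum.shape[-1]
    k = np.arange(K)
    window = 0.5 * (1.0 - np.cos(2.0 * np.pi * k / (K - 1)))
    return spectrum * window

aline = np.random.default_rng(1).normal(size=2048)  # synthetic spectrum
apodized = hann_apodize(aline)
```

The window is zero at both ends of the spectrum and peaks at the center, which is what removes the hard truncation edges responsible for side lobes.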
$$ V(i,j) = \frac{1}{N}\sum_{k=1}^{N}\bigl(S_k(i,j) - I_{BC}(i,j)\bigr)^2, $$
$$ \tilde{I}_{BC}(i,j) = \bar{I}_{BC}(i,j) + \bar{V}(i,j), $$
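Putting the compounding, variance, and enhancement equations together, the BC-mode pipeline can be sketched as follows. This is a minimal numpy version, not the paper's implementation: the auto triangle threshold and auto rescale steps are replaced by a plain min-max normalization:

```python
import numpy as np

def bc_mode(frames):
    """frames: (N, rows, cols) stack of log-scaled B-scans from one
    sparse C-scan. Returns the variance-enhanced BC-mode image."""
    I_bc = frames.mean(axis=0)                    # averaged BC-mode image
    V = ((frames - I_bc) ** 2).mean(axis=0)       # inter-frame variance

    def rescale(img):                             # stand-in for auto rescale
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

    return rescale(I_bc) + rescale(V)             # enhanced image

stack = np.random.default_rng(2).random((8, 64, 64))  # synthetic stack
enhanced = bc_mode(stack)
```

The variance term is what highlights the surgical tool and tissue boundaries: pixels whose intensity changes across the sparse B-scans (tool edges, tilted layers) get boosted relative to homogeneous regions.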
