Optica Publishing Group

RGB-color forward-viewing spectrally encoded endoscope using three orders of diffraction

Open Access

Abstract

Spectrally encoded endoscopy (SEE) is an ultra-miniature endoscopy technology that encodes each spatial location on the sample with a different wavelength. One challenge in SEE is achieving color imaging with a small probe. We present a novel SEE probe capable of real-time RGB imaging using three diffraction orders (6th-order diffraction of the blue spectrum, 5th of green, and 4th of red). The probe comprised rotating 0.5 mm-diameter illumination optics inside a static, 1.2 mm-diameter flexible sheath with a rigid distal length of 5 mm containing the detection fibers. A color chart, a resolution target, and swine tissue were imaged. The device achieved 44k/59k/23k effective pixels for the R/G/B channels over a 58° angular field and differentiated a wide gamut of colors.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Miniature endoscopy technologies have the potential to provide safer and less traumatic endoscopic imaging and treatment. Several technologies have been demonstrated to conduct endoscopic imaging through small probes [1,2,3]. Spectrally encoded endoscopy (SEE) is a miniature endoscopy technology that uses a grating to image a line of the tissue without having to use beam scanning devices [4]. Previously, a 350 µm diameter SEE device was utilized for laparoscopic imaging of a mouse in vivo [5]. SEE has been used for three-dimensional imaging [5], subsurface imaging [6], and Doppler imaging [7]. While the majority of these SEE devices were designed with large side-viewing angles, forward-viewing SEE devices using two separate channels for illumination and detection have also been demonstrated [8,9,10].

Most SEE devices, however, expend spectral information for spatial encoding and therefore do not provide color images of the tissue. The monochromatic images obtained by SEE may not be sufficient in cases where color imaging is required for disease diagnosis. A bench top color SEE method that used three spatially offset light beams was previously demonstrated [11]. However, implementing the three-beam color SEE method in a miniature probe could be challenging, as it mandates the use of three separate fibers in the device, making free-form probe rotation difficult. Another color SEE method translated the specimen along the axis of the spectrally encoded line [12]. The translation-based method may be difficult to use for endoscopic imaging in vivo due to the movement and deformation of living tissue in the endoscopic environment. Previously, a bench top color SEE setup was demonstrated using high-order grating diffraction, in which different diffracted orders overlapped with each other on the tissue [13]. This configuration allowed each point on the tissue to be illuminated by three distinct wavelengths. Advantages of this method for in vivo imaging include the use of a single light beam and one-dimensional scanning to obtain a two-dimensional color image.

In this paper, we present a color forward-viewing SEE device based on the same high-order grating diffraction concept, with 0.5-mm diameter spectrally encoded illumination optics contained in a 1.2-mm diameter SEE probe. To demonstrate the feasibility of the color SEE device, several samples were imaged at video rate, including color and resolution charts and a swine joint ex vivo.

2. Methods

2.1 Color SEE system

Figure 1 shows a schematic diagram of the SEE device. The device consisted of a system console, a motor with cables, and a probe. The system console included a broadband source (EXU-6, NKT Photonics; spectral bandwidth = 400–2400 nm; spectral power density = 1 mW nm⁻¹), a custom-made fiber optic rotary joint, a spectrometer, and a computer. Visible light from the broadband source was filtered by a low pass filter (SuperK SPLIT, NKT Photonics; cut-off wavelength = 850 nm) and coupled into a single-mode fiber (SMF) (630-HP, Nufern). This fiber, SMF#1, led to the stationary side of the rotary joint. Light from SMF#1 was coupled into a rotating single-mode fiber (630-HP, Nufern, length = 2.5 m), SMF#2, across the rotary joint. SMF#2 was connected to the probe’s illumination fiber (630-HP, Nufern), SMF#3, at its distal end by LC/APC connectors. The optical throughput in the system console (after the low-pass filter) was about 20%; the loss was mainly caused by the coupling optics in the rotary joint and the single-mode fiber connections. The probe’s illumination fiber (SMF#3) delivered light to the distal illumination optics, which focused and diffracted the broadband light toward the sample. The diffracted light formed three overlapping color bands of spectrally encoded illumination lines, positioned such that each band’s lowest wavelength fell along the probe’s rotational axis. The total illumination power on the sample was about 30 mW. The probe’s illumination optics are discussed in more detail in the next subsection.

Fig. 1. Schematic of the color SEE device.

To obtain two-dimensional illumination, the probe’s illumination optics and SMF#3 were rotated at 30 rps by the following mechanism. Figure 2 shows the probe, the motor (283833, Maxon), and the fiber connectors. The probe can be disconnected from the motor with cables connected to the console, which allows exchange of probes. The probe optics were flexible and were inserted into a rigid guide (304 stainless steel tubing, McMaster-Carr, 14.5 gauge) in Fig. 2. The illumination optics and SMF#3 were contained within a drive cable (DC-b) (Asahi Intecc; 2 layers, inner diameter = 500 µm, outer diameter = 700 µm). This drive cable (DC-b) was mechanically connected to another drive cable (DC-a) (Asahi Intecc; 3 layers, inner diameter = 560 µm, outer diameter = 2.00 mm, length = 2.5 m) that contained SMF#2. The drive cable (DC-a) was held at its distal end by the hollow shaft of the motor. The motor rotated DC-a, as well as DC-b through the mechanical connection. The proximal end of SMF#2 was rotated by DC-a inside the rotary joint. The benefit of having the motor near the probe, rather than at the rotary joint in the system console, is that non-uniform rotation distortion (NURD) was reduced owing to the short distance between the motor and the probe optics. A motor encoder sent one trigger per rotation to the spectrometer for image synchronization.

Fig. 2. The color SEE probe (inside of a straight rigid guide) and the motor with cables. The probe was disconnected mechanically and optically from the fibers at the motor side.

Light reflected from the sample was collected by the probe’s detection fiber bundle, which consisted of 45 multi-mode fibers (MMF) (GOF85, Schott; core diameter = 70 µm, NA = 0.64) that surrounded the probe’s illumination optics and the drive cable in a ring configuration (Fig. 1 and Fig. 3). The detection fibers were arranged as a circular bundle (core area diameter = 560 µm) at the probe’s proximal end. This circular bundle was connected (at the probe’s proximal end) to another circular bundle (core area diameter = 560 µm) consisting of 45 multi-mode fibers (GOF85, Schott; core diameter = 70 µm, NA = 0.64, length = 3.5 m) that was connected to the spectrometer in the system console. The spectrometer comprised a collimation lens (DLB-50-120PM, Sigma Koki; focal length = 120 mm, NA = 0.20), a transmission diffraction grating (Wasatch Photonics; groove density = 1,260 mm⁻¹), a custom-designed focusing lens (Canon Inc.; focal length = 55 mm, NA = 0.43), and a camera with a CCD line sensor (S14291, Hamamatsu Photonics; 1 × 2048 pixels, pixel size = 500 µm x 12 µm). The multi-mode fibers at the spectrometer entrance were configured as a rectangular array (3 columns of 15 fibers) with a width of 194 µm and a height of 1.11 mm, which corresponds to a slit-shaped entrance of the spectrometer (Fig. 1). Two constraints determined the array configuration: (i) the width should be minimized, since the resolution of the spectrometer is proportional to the width of the array; and (ii) light from the fibers in the array should fall within the sensor’s height (500 µm) divided by the spectrometer’s magnification (0.458), so that it can be coupled to the sensor. The rectangular array’s shape was chosen to be 194 µm x 1.11 mm, balancing resolution and total coupling efficiency. The spectrometer optics were designed to have the highest spectral resolution for green light.
The measured resolutions, using a Mercury-Argon light source (HG-1, Ocean Optics) and a 50-µm pinhole entrance, were 0.36, 0.21, and 0.30 nm for 435.8, 546.1, and 672.2 nm light, respectively. With the 194-µm-wide fiber bundle entrance, the spectrometer resolution was estimated to be about 1 nm across the 3 color bands. A total of 1,440 lines were acquired from the line sensor per frame, following each trigger from the encoder, at a rate of 30 Hz.

Fig. 3. Schematic of the probe distal end.

2.2 Color SEE probe

Figure 4(A) shows a ray tracing of the probe’s illumination optics (Zemax): the probe’s illumination fiber, a silica GRIN lens (Toyo Seikan; diameter = 250 µm, length = 1.55 mm, NA = 0.188/0.192/0.199 for 635/532/450 nm), and an angle-polished glass spacer (Tempax, diameter = 500 µm, length = 1.0 mm). The distal tip of the probe contained a glass window (diameter = 0.98 mm, thickness = 200 µm), which was attached to the distal end of an inner can (stainless steel, outer diameter = 0.98 mm, length = 5.0 mm) and surrounded by the detection fiber bundle, to prevent ingress of water and dust into the probe (Fig. 3). The GRIN lens length was adjusted so that the beam of 635 nm light was nearly collimated (beam divergence less than 7 mrad). The fiber and the GRIN lens were fusion-spliced. The spacer was attached to the GRIN lens with epoxy adhesive resin (UD1355, Epoxy Technology). The glass spacer had two facets at its tip: a mirror surface and a grating surface, as shown in Fig. 4(B). The GRIN lens was off-center relative to the spacer so that light from the probe illumination fiber was reflected at the spacer’s mirror surface by total internal reflection and diffracted by the grating. The grating surface on the spacer was a thin grating layer made of approximately 1 nl of UV-curable resin (OG603, Epoxy Technology). The grating was formed by stamping a reversed-pattern master grating (fused silica, Canon Inc.) onto a UV-cured thin resin layer [14]. UV exposure was performed in a chamber purged with nitrogen to prevent imperfect resin curing. Optical microscopic images of the grating surface are shown in Figs. 4(C) and 4(D). The grating pattern was resolved, with a faint moiré pattern, in Fig. 4(D). Figure 4(E) is a cross-sectional scanning electron microscope (SEM) image of the resin grating. The resin grating had a pitch (p) of 1.54 µm, a groove opening width (w) of 0.383 µm, and a groove depth (d) of 1.82 µm. The grating’s groove wall angle was 87.5 degrees.

Fig. 4. (A) Ray tracing of SEE probe illumination optics in Zemax; (B) microscopic image at side view of the probe illumination optics; (C) optical microscopic image of the resin grating on the spacer’s surface from the direction of the yellow arrow in (B); (D) enlarged image in the yellow box in (C); (E) cross-sectional SEM image of the resin grating.

The fabrication process for the illumination optics was as follows: (i) a glass rod was ground and polished into a spacer with 3 polished surfaces (mirror surface, grating surface, and surface attached to GRIN lens), (ii) a resin grating was formed on the grating surface, (iii) a GRIN lens was fusion-spliced with an illumination fiber, (iv) the GRIN lens and the spacer were attached by an epoxy adhesive, and (v) the illumination optical assembly was inserted into a drive cable and the fiber’s proximal end was connectorized.

The light reflected by the mirror was diffracted at a distinct angle toward the sample according to:

$${n_s}\sin {\theta _i} + \sin {\theta _d} = -mg\lambda,$$
where ${n_s}$ is the refractive index of the spacer, ${\theta _i}$ is the incident angle on the grating inside the spacer, ${\theta _d}$ is the diffraction angle in air, $m$ is the diffraction order, $g$ is the groove density of the grating ($= 1/p$), and $\lambda$ is the wavelength. The diffraction angle is related to the view angle, ${\theta _v}$, by:
$${\theta _d} = {\theta _{d0}} + {\theta _v},$$
where ${\theta _{d0}}$ is the diffraction angle at which the diffracted light is parallel to the probe’s rotational axis, which coincides with the center of the forward view. The optics were designed so that ${\theta _i}$ was 38.1° and ${\theta _{d0}}$ was 47.3°. The refractive index of the spacer was 1.471 at 589.3 nm. The three spectral lines of blue ($\lambda $ = 423-483 nm), green (506-578 nm), and red light (631-722 nm), diffracted in the $m$ = -6, -5, and -4 orders, respectively, covered the sample over the range of ${\theta _v}$ (${\theta _d}$) from 0° (47.3°) to 29° (76.3°) [Fig. 5(A)]. Rotating the probe’s illumination optics thus produced a full angular field of view of 58°. The authors previously reported a bench top color SEE scheme using 3 diffraction orders as blue, green, and red: $m$ = -5, -4, and -3, with wavelengths $\lambda $ = 408-480, 510-600, and 680-800 nm, respectively [13]. Here, higher diffraction orders were chosen so that the high and low orders contained mostly blue and red light rather than violet and near-infrared. The diffraction efficiency of the grating was calculated by the rigorous coupled-wave analysis (RCWA) method [15] [Fig. 5(B)].
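As a numerical check, the grating equation can be evaluated for the three band edges. The sketch below uses the stated design values (spacer index 1.471 with dispersion ignored, incidence angle 38.1°, pitch 1.54 µm); it is an illustration of the relation, not the authors’ design code.

```python
import math

N_S = 1.471                    # spacer refractive index (at 589.3 nm; dispersion ignored)
THETA_I = math.radians(38.1)   # incidence angle on the grating inside the spacer
G = 1.0 / 1.54e-6              # groove density g = 1/p, grooves per metre
THETA_D0 = 47.3                # diffraction angle (deg) along the rotational axis

def diffraction_angle_deg(wavelength_m, m):
    """Solve n_s*sin(theta_i) + sin(theta_d) = -m*g*lambda for theta_d (degrees)."""
    s = -m * G * wavelength_m - N_S * math.sin(THETA_I)
    return math.degrees(math.asin(s))

def view_angle_deg(wavelength_m, m):
    """View angle theta_v = theta_d - theta_d0."""
    return diffraction_angle_deg(wavelength_m, m) - THETA_D0

# Band edges and orders from the text
bands = {"blue": (-6, 423e-9, 483e-9),
         "green": (-5, 506e-9, 578e-9),
         "red": (-4, 631e-9, 722e-9)}
for name, (m, lo, hi) in bands.items():
    print(f"{name}: theta_v = {view_angle_deg(lo, m):.1f} to {view_angle_deg(hi, m):.1f} deg")
```

Running this reproduces the stated overlap: each band’s lowest wavelength diffracts near 47.3° (view angle ≈ 0°) and its highest near 76° (view angle ≈ 29°).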

Fig. 5. (A) Schematic of the custom grating’s diffraction orders and (B) Calculated diffraction efficiencies of the custom grating. The color bands of blue, green, and red in (B) correspond to the wavelength bands used in color SEE imaging for blue, green and red channels, respectively.

The probe’s illumination optics in the drive cable were rotated in an inner sheath (polyimide tubing, Microlumen; inner diameter = 0.75 mm, outer diameter = 0.85 mm) attached inside the inner can (Fig. 1 and Fig. 3). The detection fiber ring bundle (outer diameter = 1.14 mm) surrounded the inner sheath, the inner can, and the window. The window and the detection fibers were polished together at the distal end so that light reflected at the window surfaces was not coupled directly into the detection fibers. An outer can (outer diameter = 1.24 mm, length = 3.0 mm) and an outer tubing (silicone) protected the detection fibers. Figure 6(A) shows the detection sheath assembly and the illumination optics inserted in its inner sheath. The detection fibers were bundled in a circular shape at the proximal end. The probe was flexible except for its distal part (5 mm long), and its bending radius was 5 mm. To check its bending capability, the probe was tested in a rigid bent guide [Fig. 6(B)]. It showed imaging performance similar to that in the straight guide, without a significant illumination power drop (less than 5%).

Fig. 6. (A) Flexible probe detection sheath including multi-mode fiber bundle. The drive cable of the illumination optics was inserted into the inner sheath. (B) Rigid bent guide (304 Stainless steel tubing, McMaster-Carr, 14.5 Gauge) containing the SEE probe inside. The guide was bent near the distal end (probe bending radius = 5.4 mm and bending angle = 110°).

The probe optical throughput (power toward the sample over power delivered to SMF#3) was 40%. The main optical loss was caused by the grating diffraction. The probe tip’s temperature was measured using a thermocouple temperature sensor and a forward-looking infrared camera in air at room temperature (25 °C). When the illumination power toward the sample was 30 mW, the temperature remained constant after 5 minutes: 34.9 °C at the rigid guide tip and 37.5 °C at the window.

2.3 Image processing

The camera generated a two-dimensional 12-bit monochromatic raw target image (2,048 by 1,440 pixels/image), which was digitally transferred to the computer after each frame acquisition. The computer (CPU: Intel Core i7-7700, memory: DDR 16 GB, OS: Windows 10) processed each raw image to create a final RGB-color target image [13] within 33 ms, before the next raw image was transferred. The first step in the process was subtracting the background noise spectrum from each spectral line of the raw image of the target. The background noise spectrum was prepared before imaging by (i) taking a raw background noise image with the same system and probe with the laser off, and (ii) averaging the 1,440 lines of the raw background image into a background noise spectrum. The second step was normalization of the target image: each line of the target image (2,048 by 1,440) was divided by a reference spectrum (2,048 by 1). The reference spectrum was generated in advance with the same system and probe by (i) obtaining raw data by imaging a white target area (X-Rite ColorChecker 3-Value Grayscale, Edmund), and (ii) averaging the multiple spectral lines (n = 1,440) into one spectrum and subtracting the background spectrum from it. The third step involved cropping 3 sets of data (red, green, and blue) from the normalized data, based on the spectrometer’s pixel indices corresponding to the edge wavelengths of the red, green, and blue bands. In the fourth step, each of the 3 sets of cropped data (spectrometer pixel index – time index) was converted to same-size data (256 by 1,440 pixels per red, green, and blue image) in polar coordinates (radial coordinate – angular coordinate), based on the probe’s illumination optics configuration. The relation between time index and angular coordinate was assumed to be linear. In the fifth step, the data in the polar coordinate system were converted to the Cartesian coordinate system (512 by 512 pixels per R, G, and B image).
Finally, the three images were merged into one RGB color image with brightness and contrast adjustment (the user can change these parameters through the software GUI any time during imaging), followed by gamma correction.
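The steps above can be sketched roughly in NumPy. The array sizes (2,048 by 1,440 raw, 256 by 1,440 polar, 512 by 512 Cartesian) come from the text; the band pixel ranges, nearest-neighbor resampling, and fixed 2.2 gamma below are illustrative assumptions, not the actual software’s choices.

```python
import numpy as np

N_PIX, N_LINES, OUT = 2048, 1440, 512   # spectrometer pixels, lines/frame, output size

def polar_to_cartesian(polar, out=OUT):
    """Nearest-neighbor remap of (radius, angle) data onto a square image."""
    n_r, n_theta = polar.shape
    y, x = np.mgrid[0:out, 0:out] - (out - 1) / 2.0
    r = np.sqrt(x ** 2 + y ** 2) * (n_r - 1) / ((out - 1) / 2.0)
    theta = (np.arctan2(y, x) % (2 * np.pi)) * n_theta / (2 * np.pi)
    ri = np.clip(r.astype(int), 0, n_r - 1)
    ti = np.clip(theta.astype(int), 0, n_theta - 1)
    img = polar[ri, ti]
    img[r > n_r - 1] = 0.0   # outside the circular field of view
    return img

def process_frame(raw, background, reference, bands):
    """raw: (N_LINES, N_PIX); background/reference: (N_PIX,) spectra;
    bands: {'r': (lo, hi), 'g': ..., 'b': ...} pixel ranges (hypothetical values).
    Steps: background subtraction, normalization, band cropping,
    polar resampling, Cartesian conversion, merge with gamma correction."""
    norm = (raw.astype(np.float64) - background) / np.maximum(reference, 1e-6)
    channels = []
    for lo, hi in bands.values():          # bands must be ordered R, G, B
        band = norm[:, lo:hi]
        # resample the radial (spectral) axis onto a common 256-sample grid
        r_idx = np.linspace(0, band.shape[1] - 1, 256).astype(int)
        polar = band[:, r_idx].T           # (256 radii, N_LINES angles)
        channels.append(polar_to_cartesian(polar))
    rgb = np.stack(channels, axis=-1)      # (OUT, OUT, 3)
    return np.clip(rgb, 0.0, 1.0) ** (1 / 2.2)   # simple gamma correction
```

The per-frame brightness/contrast adjustment in the GUI would be applied before the gamma step; it is omitted here for brevity.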

3. Results

3.1 Color chart imaging

In order to demonstrate color imaging, a video of a color chart (ColorGauge Nano Matte, Edmund) was taken and displayed with the system and probe at 30 Hz while the relative position between the chart and probe was changed. Figures 7(A), 7(B), and 7(C) show color SEE screenshot images of the color chart around its green, white, and black panel, respectively. The color SEE images had a qualitatively similar color appearance to the photograph [Fig. 7(D)] of the color chart taken by a digital still camera (Powershot S120, Canon). The L*a*b* values of the red, green, and blue squares of the color chart were measured in a single SEE frame [Fig. 7(A), from the pixels inside the dashed circles] and in the digital still camera image [Fig. 7(D)]. The squares were chosen in the middle region of the image height, where all three colors were present, to avoid the artifacts discussed later (false color at the center region and a low-contrast band at the outer edge). Their differences from the reference CIE L*a*b* values (provided by the color chart manufacturer; D5000, 2° observer) were calculated as $\Delta {E^\ast }$ (Table 1). The color difference between $({L_1^\ast ,a_1^\ast ,b_1^\ast } )$ and $({L_2^\ast ,a_2^\ast ,b_2^\ast } )$ was calculated [11,16] by

$$\Delta {E^\ast } = \sqrt {{{({L_1^\ast{-} L_2^\ast } )}^2} + {{({a_1^\ast{-} a_2^\ast } )}^2} + {{({b_1^\ast{-} b_2^\ast } )}^2}} .$$
The color differences for the color SEE image were larger than the just-noticeable difference (JND) of 2.3 [17]. This difference may be due to the white balancing method, in which the RGB values at each pixel were normalized against the reflection signal from the white target using only 3 wavelengths. The color differences for the digital still camera image could be explained by differences in the illumination condition and the white balancing algorithm embedded in the digital still camera. The magnitude of the color differences from the reference values was similar between the color SEE system and the digital still camera, which indicates that the color SEE probe system has a capability to represent color similar to that of the digital still camera, at least for the red, green, and blue color chart squares in the middle region.
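The CIE76 color difference used above is straightforward to compute; a minimal helper, with hypothetical L*a*b* values for illustration:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference Delta E* between two (L*, a*, b*) triples."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical example: a difference of 5 is well above the JND of 2.3.
print(delta_e((50.0, 0.0, 0.0), (50.0, 3.0, 4.0)))  # 5.0
```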

Fig. 7. (A)-(C): Color chart images obtained using the color SEE device; screenshots of the video taken at 30 Hz. The arrows denote the false image artifact due to lower order illumination. (D): An image of the same color chart obtained using a digital still camera. The dashed circles correspond to the (A), (B), and (C) fields of view.

Table 1. Reference CIE L*a*b* values (D5000, 2° observer) of the red, green, and blue color squares provided by the manufacturer, and the average CIE L*a*b* values measured by the color SEE system and the digital still camera, compared against the manufacturer reference.

Several imaging artifacts were observed in the SEE imaging in Figs. 7(A)–7(C):

  • The seam line artifact [at 8 o’clock, pink arrow in Fig. 7(B)]: The spectrometer obtained spectral lines in one frame at equal pre-set intervals, while the system obtained each frame of data from the spectrometer on a frame trigger from the motor. The temporal gap between the last spectral line in one frame and the first spectral line in the next frame could be longer than the pre-set interval. Also, the relative position of the chart could change during the gap. These factors caused a spatial discontinuity in the image.
  • The distortion artifact: This artifact was caused by a mapping error between spectral encoding and decoding. The color chart squares in the SEE images were distorted in a way similar to barrel distortion [red arrow in Fig. 7(B)]. There is also a screw-type distortion at the center [blue arrow in Fig. 7(B)] when the lowest wavelength in each order’s illumination spectrum does not perfectly align with the probe’s rotational axis (center of image). This situation can occur in some cases, including when (i) the illumination optics axis is not parallel to the drive cable axis, or (ii) the probe grating pattern is tilted and conical diffraction occurs.
  • False color (blue) at the center region [yellow arrow in Fig. 7(C)]: This artifact was caused by low S/N due to the relatively weak power of the laser and the low optical throughput of the overall system, including the probe optics, at wavelengths below about 430 nm.
  • Ring pattern artifact [orange arrow in Fig. 7(B)]: The ring pattern artifact occurs at the spectral normalization step when the spectral shape and/or system optical throughput changes between reference spectral normalization and imaging.
  • Low contrast band at the outer edge: Because the detection optics’ throughput was low at the outer edge region of the field of view, pixel values were low in both the reference spectrum and the raw image at the corresponding pixels. These factors caused the effective dynamic range to be low at the outer edges, which resulted in low image contrast.
  • False image artifact due to lower diffraction order illumination [white arrow in Fig. 7(B)]: This artifact occurs when other orders [Fig. 5(A)] illuminate the sample. Although the lower-order beams ($m$ = -5, -4, and -3 for the blue ($\lambda $ = 423-483 nm), green (506-578 nm), and red light (631-722 nm), respectively) were partially blocked by the drive cable, some portion of the undesired (false) diffracted order beams reached the target opposite to the (true) illumination beam in the field of view. Upon image reconstruction, the detected light reflected from the target illuminated by the beams of other orders was converted to a false image superposed on the true image, generating green artifacts as shown in Fig. 7(B) (white arrow). The ratio of the false signal to the true signal can be represented as
    $$\frac{{{S_{\textrm{False}}}}}{{{S_{\textrm{True}}}}} = \left( {\frac{{{\eta_{\textrm{Lower}}}}}{{{\eta_{\textrm{Imaging}}}}} \cdot \frac{{{\alpha_{\textrm{Lower}}}}}{{{\alpha_{\textrm{Imaging}}}}} \cdot \frac{{{\tau_{\textrm{Lower}}}}}{{{\tau_{\textrm{Imaging}}}}}} \right) \cdot \frac{{{\rho _{\textrm{Lower}}}}}{{{\rho _{\textrm{Imaging}}}}}$$
    where $S$ is the signal amount, $\eta $ is the probe grating diffraction efficiency, $\alpha $ is the transmittance from the grating to the target (including vignetting loss by the drive cable), $\tau $ is the detection optics throughput (from coupling into the detection fibers to the sensor in the spectrometer), and $\rho $ is the sample reflectivity collected by the probe’s detection fibers for each order. For example, the wavelengths corresponding to the green artifacts above were 578 nm and below. Although the ratio of ${\eta _{\textrm{Lower}}}$ (-4th order) to ${\eta _{\textrm{Imaging}}}$ (-5th order) was 0.043 or lower below 578 nm [Fig. 5(B)] and the ratio of ${\alpha _{\textrm{Lower}}}$ to ${\alpha _{\textrm{Imaging}}}$ was smaller than 1 by design, ${S_{\textrm{False}}}$ was not negligible because of the large ratio of ${\tau _{\textrm{Lower}}}$ to ${\tau _{\textrm{Imaging}}}$. For these wavelengths, the illumination beam diffracted into the -5th order reached the target at a higher view angle than the beam in the -4th order (the view angle at 578 nm is about 29° for the -5th order and about -11° for the -4th order), so after reflection at the sample, the former beam was coupled into the detection fibers in higher-order modes than the latter. Furthermore, since the detection fiber NA was higher than the spectrometer’s entrance NA, the ratio of ${\tau _{\textrm{Lower}}}$ to ${\tau _{\textrm{Imaging}}}$ was considered to be larger than 1.
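Equation (4) is a product of four order ratios, which makes the trade-off easy to see numerically. In the sketch below, only the 0.043 efficiency ratio comes from Fig. 5(B); the other values are hypothetical, chosen to illustrate how a throughput ratio above 1 keeps the false signal non-negligible.

```python
def false_to_true_ratio(eta_ratio, alpha_ratio, tau_ratio, rho_ratio):
    """Eq. (4): each argument is a (lower order / imaging order) ratio.
    eta: grating diffraction efficiency, alpha: grating-to-target transmittance,
    tau: detection optics throughput, rho: collected sample reflectivity."""
    return eta_ratio * alpha_ratio * tau_ratio * rho_ratio

# Hypothetical illustration: eta ratio 0.043 and alpha ratio < 1 suppress the
# false signal, but a tau ratio > 1 can bring it back to a visible level.
print(false_to_true_ratio(0.043, 0.8, 3.0, 1.0))
```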

3.2 Resolution

The device resolution was evaluated by imaging an NBS 1963A resolution target (Edmund; positive target, opal). The target was positioned 10 mm away from the probe window. The diameter of the field of view (FOV) was 11.6 mm. The SEE device resolved target patterns of 11.0 lp/mm [Fig. 8(A)], which corresponds to 128 resolvable points across the FOV diameter. In Fig. 8(A), a rainbow-like circular pattern was observed. This artifact was caused by spectral shape changes that occurred between reference spectral normalization and imaging, similar to the ring pattern artifact discussed in 3.1. For comparison, the same target was imaged at the same working distance (10 mm) with a commercially available chip-on-tip endoscope (IK-CT2, Toshiba; 1.2 mm outer diameter, BSI CMOS sensor with 220 by 220 pixels) with its LED illumination unit (IK-CTLED, Toshiba). The FOV was 15.9 mm by 15.9 mm. The chip-on-tip endoscope did not resolve the target patterns of 7.1 lp/mm (corresponding to 113 resolvable points per side of the FOV), showing aliasing due to its small sampling number [Fig. 8(B)].

Fig. 8. Magnified views of NBS 1963A resolution target images obtained using the SEE device (A, 11.0 lp/mm) and a commercial chip-on-tip endoscope (B, 7.1 lp/mm). Both working distances were 10 mm.

For a more quantitative evaluation, the lateral resolution was measured by calculating the full width at half-maximum (FWHM) of the line spread function along both the scanning and spectrally encoding directions in the middle region (${\theta _v}$ = 16°) of the FOV (diameter = 11.6 mm) at a working distance of 10 mm. The lateral resolutions in the scanning direction ($\Delta {x_{\textrm{Scan}}}$) were measured to be 92 µm (red channel), 54 µm (green), and 79 µm (blue). The lateral resolutions in the spectrally encoding direction ($\Delta {x_{\textrm{SE}}}$) were 104 µm (red), 133 µm (green), and 235 µm (blue). The Nyquist-limited sampling resolutions in the scanning direction (N = 1,440 lines) were 26.3 µm at the middle region (${\theta _v}$ = 16°) and 50.6 µm at the FOV outer edge (${\theta _v}$ = 29°).

The measured resolution values were compared with theoretical values. The illumination beam sizes (FWHM) in the scanning direction, $\Delta {X_{\textrm{BS, SC}}}$, were simulated to be 75 µm (red), 47 µm (green), and 61 µm (blue); those in the spectrally encoding direction, $\Delta {X_{\textrm{BS, SE}}}$, were 73 µm (red), 82 µm (green), and 155 µm (blue). The illumination beam was assumed to be a Gaussian beam in the simulation. The differences in the simulated values between the red, green, and blue channels were caused mainly by dispersion of the GRIN lens. The differences in the simulated values between the scanning and spectrally encoding directions were caused by astigmatism due to the grating [18]. The measured lateral resolution values in the scanning direction ($\Delta {x_{\textrm{Scan}}}$) were in good agreement with the simulated illumination beam sizes $\Delta {X_{\textrm{BS, SC}}}$, while the values in the spectrally encoding direction ($\Delta {x_{\textrm{SE}}}$) deviated from $\Delta {X_{\textrm{BS, SE}}}$. This discrepancy arises because the lateral resolution in the spectrally encoding direction is affected not only by the illumination beam size (spectral encoding) but also by the finite spectrometer resolution (spectral decoding). The total number of imaging elements was approximately calculated as $4A/({\Delta {x_{\textrm{Scan}}} \cdot \Delta {x_{\textrm{SE}}}} )$, where $A$ was the area of imaging, and $\Delta {x_{\textrm{Scan}}}$ and $\Delta {x_{\textrm{SE}}}$ were the measured lateral resolutions in the angular and radial directions, respectively. With the measured values above substituted, the total numbers of imaging elements were estimated to be about 44,000 (red), 59,000 (green), and 23,000 (blue), or 125,000 in total.
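The element-count estimate can be reproduced directly from the quoted FOV diameter (11.6 mm) and the measured resolutions; a short check:

```python
import math

FOV_DIAMETER_MM = 11.6
A = math.pi * (FOV_DIAMETER_MM / 2) ** 2   # imaging area A, mm^2

def imaging_elements(dx_scan_um, dx_se_um):
    """N = 4A / (dx_scan * dx_se), resolutions converted from um to mm."""
    return 4 * A / ((dx_scan_um / 1000) * (dx_se_um / 1000))

# Measured FWHM resolutions per channel (scanning, spectrally encoding), in um
for name, dx_scan, dx_se in [("red", 92, 104), ("green", 54, 133), ("blue", 79, 235)]:
    print(f"{name}: ~{imaging_elements(dx_scan, dx_se):,.0f} elements")
```

The printed values round to the ~44,000 / ~59,000 / ~23,000 elements quoted in the text.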

Motion jitter of the motor, as well as friction between the rotating illumination optics and the static detection fibers’ sheath, can cause image rotational vibration from frame to frame. To evaluate this effect, a static target was imaged at 30 Hz, and the location of the same point of its pattern was tracked in images over 30 consecutive frames. The absolute value of the location error between adjacent frames was calculated. The maximum value and the standard deviation of the error in angular coordinates were 0.86° and 0.19°, respectively.
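The jitter metric (maximum and standard deviation of the absolute frame-to-frame angular error) can be sketched as follows; the input angles here are hypothetical tracking data, not the measured values.

```python
import numpy as np

def rotational_jitter(angles_deg):
    """Frame-to-frame rotational jitter of one tracked feature.

    angles_deg: angular coordinate (degrees) of the same target point in
    consecutive frames. Returns (max absolute error, std of absolute error).
    """
    err = np.abs(np.diff(np.asarray(angles_deg, dtype=float)))
    return err.max(), err.std()

# Hypothetical tracked positions over four frames
mx, sd = rotational_jitter([0.0, 0.2, -0.1, 0.5])
print(mx, sd)
```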

3.3 Swine joint tissue imaging

To demonstrate the capacity of the color SEE device to image clinically relevant anatomic structures, the system and the probe (in the rigid straight guide held by the user’s hand) were used to image a swine knee joint ex vivo. The target joint was the intertarsal joint that joins the distal part of the calcaneus bone and the 4th tarsal bone. The swine skin and tissue were dissected to enable access to the intertarsal joint. Figures 9(A)–9(B) are single frames of a color SEE video taken at a frame rate of 30 Hz. The SEE probe was placed about 25-30 mm from the joint when the images were acquired. The same joint was imaged by the chip-on-tip endoscope (IK-CT2, Toshiba; 220 by 220 pixels) with its LED illumination unit (IK-CTLED, Toshiba, illumination power was 6 mW) [Figs. 9(C)–9(D)], and photographed by a digital still camera (Powershot S120, Canon) [Fig. 9(E)]. The field of view of the chip-on-tip endoscope was 77° (horizontal and vertical) and 120° (diagonal). The arrows in Fig. 9(E) correspond to the locations where the color SEE and the chip-on-tip endoscope images in Figs. 9(A)–9(D) were obtained. Although artifacts could be observed such as the seam line at 7 o’clock, the color SEE images revealed the joint structure including the calcaneus bone, gap between the bones, and the surrounding tissues with vessels. In comparison with the chip-on-tip endoscope images, the SEE images showed fine features more sharply [arrows in Figs. 9(A)–9(D)], which was consistent with the resolution chart imaging results. The color appearance of the joint in the SEE images was similar to that observed in Fig. 9(E).


Fig. 9. Ex vivo imaging of swine intertarsal joint, using (A-B) color SEE device, (C-D) chip-on-tip endoscope, and (E) digital still camera. Arrows in the image (E) indicate where the images (A)-(D) were obtained.


4. Discussion

In this paper we have demonstrated an endoscope for color imaging that uses one spectrally encoded illumination fiber and three diffraction orders corresponding to the red, green, and blue channels. The SEE illumination and detection optics were contained in a flexible probe with an outer diameter of 1.2 mm and a rigid tip length of 5 mm. Images of a color chart and swine joint tissue demonstrated good color correspondence between images acquired by the color SEE endoscope in a straight rigid guide and images taken by digital still cameras.

The next step in the development of this technology is to improve image quality by overcoming the artifacts observed in color imaging. The radial line artifact caused by the discontinuity between one frame and the next can be removed by acquiring spectra based on a line trigger instead of a frame trigger. The distortion artifact at the center region can be mitigated by a better probe-optics alignment process during assembly. The false blue color at the center region can be resolved by enhancing the S/N at the shorter wavelengths within the blue channel, which can be accomplished by optimizing the laser spectrum and the optical throughput at those wavelengths. The ring pattern artifact can potentially be reduced by additional image processing as well as by stabilizing the spectral shape.

The false image artifact due to lower-diffraction-order illumination can be reduced by optimizing the grating design to make ${\eta _{\textrm{Lower}}}/{\eta _{\textrm{Imaging}}}$ smaller. It would also be effective to modify the detection optics to make ${\tau _{\textrm{Lower}}}/{\tau _{\textrm{Imaging}}}$ smaller; such modifications could include angle polishing the distal end surface of the detection fibers [10].

The FOV of the current device was determined by the difference between the lower and upper diffraction-angle limits. The lower limit (${\theta _{d0}}$) was chosen so that ${\eta _{\textrm{Imaging}}}$ was sufficiently high; the tilt angle of the grating surface relative to the probe axis was set to this lower limit. The upper limit was determined by the corresponding view angle (${\theta _v}$) at which ${S_{\textrm{False}}}/{S_{\textrm{True}}}$ [Eq. (4)] remained acceptable. Methods to widen the FOV include increasing the illumination power and making ${\eta _{\textrm{Lower}}}/{\eta _{\textrm{Imaging}}}$ and ${\tau _{\textrm{Lower}}}/{\tau _{\textrm{Imaging}}}$ smaller. Another approach would be to place optics, such as a lens, in front of the grating; in that case, additional development would be required to overcome fabrication challenges such as alignment.
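As a rough illustration of how Eq. (4) constrains the usable view angle, the sketch below evaluates the false-to-true signal ratio from its component ratios. All numeric values, including the acceptability threshold, are assumptions for demonstration, not measured device parameters:

```python
# Illustrative evaluation of Eq. (4): the strength of the false image from
# lower-order diffraction relative to the true (imaging-order) signal.
def false_to_true_ratio(eta_ratio, alpha_ratio, tau_ratio, rho_ratio):
    """S_False/S_True = (eta * alpha * tau ratios) * reflectance ratio."""
    return eta_ratio * alpha_ratio * tau_ratio * rho_ratio

# Assumed component ratios (eta: diffraction efficiency, alpha: illumination,
# tau: detection throughput, rho: sample reflectance); threshold also assumed.
ratio = false_to_true_ratio(eta_ratio=0.3, alpha_ratio=0.8,
                            tau_ratio=0.5, rho_ratio=0.4)
print(f"S_False/S_True = {ratio:.3f}")  # 0.048
print("acceptable" if ratio < 0.05 else "too high")
```

Shrinking any one component ratio (e.g., the grating's ${\eta _{\textrm{Lower}}}/{\eta _{\textrm{Imaging}}}$) lowers the product directly, which is why the grating and detection-optics modifications discussed above widen the usable FOV.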

The resolution of the current device was limited mainly by the illumination beam size, but it was also degraded by the finite resolution of the spectrometer. Resolution can be improved in several ways in both the illumination optics and the spectrometer. For example, the GRIN lens parameters can be optimized so that the beam waist is located at the working distance of interest. The GRIN lens in the probe’s illumination optics had large chromatic aberration, which moved the beam waist closer and lowered resolution in the blue channel compared with the other two channels. Replacing the GRIN lens with an achromatic focusing element could mitigate this focus shift and improve resolution in the blue channel. One way to improve the resolution of the spectrometer is to narrow the width of the fiber bundle at its entrance. A line sensor with taller pixels would allow reshaping the bundle into a longer array, helping to resolve the spectrometer trade-off between high resolution and high optical throughput.

In an SEE device, the color calibration profile differs at each view angle (image height), so calibration is required per angle. A software algorithm that calibrates a range of colors from a single frame would require a custom color chart, such as a color pie chart.
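One hypothetical implementation of view-angle-dependent calibration (a sketch, not the authors' method) stores a 3×3 RGB correction matrix at a few calibrated view angles and interpolates element-wise for intermediate angles. All angles and matrices below are placeholders:

```python
import numpy as np

# Assumed calibration view angles (deg) and placeholder 3x3 RGB matrices;
# in practice these would be measured from a custom color chart.
cal_angles = np.array([0.0, 14.5, 29.0])
cal_mats = np.stack([np.eye(3),
                     np.eye(3) * 0.9,
                     np.eye(3) * 0.8])

def correction_matrix(angle_deg):
    """Linearly interpolate each matrix element at the requested view angle."""
    flat = np.array([np.interp(angle_deg, cal_angles, cal_mats[:, i, j])
                     for i in range(3) for j in range(3)])
    return flat.reshape(3, 3)

# Apply the angle-appropriate correction to a raw RGB sample.
rgb_raw = np.array([0.5, 0.4, 0.3])
rgb_corrected = correction_matrix(20.0) @ rgb_raw
```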

One advantage of SEE over chip-on-tip endoscopes is that the number of sampling pixels (i.e., the product of the number of spectrometer sensor pixels used in SEE image processing and the line rate) can be increased regardless of the small diameter of the probe. In addition, every pixel is sampled in each of the R, G, and B channels simultaneously in color SEE, so the interpolation used in chip-on-tip endoscopes with a Bayer filter is not required. Another advantage is that the SEE probe does not require electrical packaging at the probe tip, which allows further miniaturization. One inherent disadvantage of SEE is that, for any given color channel, the same wavelength of illumination cannot be directed to different locations on the sample, making spectroscopic imaging impossible.

5. Conclusion

In summary, we have developed a probe-based color forward-viewing SEE system and demonstrated color imaging at video rate. The use of three diffraction orders enabled RGB color imaging with a single illumination fiber in a 1.2-mm-diameter probe. By imaging a standard color chart, the device demonstrated the capability to differentiate a wide gamut of colors. The color SEE also enabled visualization of anatomical structures with color cues in swine tissue ex vivo. Although several imaging artifacts were present, their causes and methods to overcome them are known, and they are likely correctable by choosing appropriate parameters and materials. These results suggest that this unique form of miniature endoscopy technology could be useful for various medical applications, especially where a small outer diameter and small bending radius are required, such as transnasal sinus endoscopy and transbronchial pulmonary endoscopy.

Funding

Canon U.S.A., Inc.

Acknowledgments

We thank Donald I. Gilbody, Justin H. Palermo, Bin Wu, Tokuyuki Honda, Naoki Kohara, Osamu Koyama, and Dongkyun Kang for their assistance and advice. Also, we thank Canon Inc. for manufacturing the probe spacer with grating, and manufacturing the imaging lens of the spectrometer.

Disclosures

This research was sponsored by Canon U.S.A., Inc. MI, TW, ATM, AA, XY, JHH, ST(Takeuchi): Canon U.S.A., Inc. (E, P). AY, ST(Tatsumi), KI: Canon Inc. (E, P). JR, AZ, GJT: Canon U.S.A., Inc. (F, P, R).

References

1. C. M. Lee, C. J. Engelbrecht, T. D. Soper, F. Helmchen, and E. J. Seibel, “Scanning fiber endoscopy with highly flexible, 1 mm catheterscopes for wide-field, full-color imaging,” J. Biophotonics 3(5-6), 385–407 (2010). [CrossRef]  

2. E. J. Seibel, C. M. Brown, J. A. Dominitz, and M. B. Kimmey, “Scanning single fiber endoscopy: a new platform technology for integrated laser imaging, diagnosis, and future therapies,” Gastrointest. Endosc. Clin. N. Am. 18(3), 467–478 (2008). [CrossRef]  

3. B. Gorissen, M. De Volder, and D. Reynaerts, “Chip-on-tip endoscope incorporating a soft robotic pneumatic bending microactuator,” Biomed. Microdevices 20(3), 73 (2018). [CrossRef]  

4. G. J. Tearney, M. Shishkov, and B. E. Bouma, “Spectrally encoded miniature endoscopy,” Opt. Lett. 27(6), 412–414 (2002). [CrossRef]  

5. D. Yelin, I. Rizvi, W. White, J. T. Motz, T. Hasan, B. E. Bouma, and G. J. Tearney, “Three-dimensional miniature endoscopy,” Nature 443(7113), 765 (2006). [CrossRef]  

6. D. Yelin, B. E. Bouma, and G. J. Tearney, “Volumetric sub-surface imaging using spectrally encoded endoscopy,” Opt. Express 16(3), 1748–1757 (2008). [CrossRef]  

7. D. Yelin, B. E. Bouma, J. J. Rosowsky, and G. J. Tearney, “Doppler imaging using spectrally-encoded endoscopy,” Opt. Express 16(19), 14836–14844 (2008). [CrossRef]  

8. A. Zeidan and D. Yelin, “Miniature forward-viewing spectrally encoded endoscopic probe,” Opt. Lett. 39(16), 4871–4874 (2014). [CrossRef]  

9. Z. Wang, T. Wu, M. A. Hamm, A. Altshuler, A. T. Mach, D. I. Gilbody, B. Wu, S. N. Ganesan, J. P. Chung, M. Ikuta, J. S. Brauer, S. Takeuchi, and T. Honda, “Ultraminiature video-rate forward-view spectrally encoded endoscopy with straight axis configuration,” Proc. SPIE 10040, 100400M (2017). [CrossRef]  

10. A. Zeidan, D. Do, D. Kang, M. Ikuta, J. Ryu, and G. J. Tearney, “High-resolution, wide-field, forward-viewing spectrally encoded endoscope,” Lasers Surg. Med. 51(9), 808–814 (2019). [CrossRef]  

11. D. Kang, D. Yelin, B. E. Bouma, and G. J. Tearney, “Spectrally-encoded color imaging,” Opt. Express 17(17), 15239–15247 (2009). [CrossRef]  

12. A. Zeidan and D. Yelin, “Spectral imaging using forward-viewing spectrally encoded endoscopy,” Biomed. Opt. Express 7(2), 392–398 (2016). [CrossRef]  

13. M. Ikuta, D. Kang, D. Do, A. Zeidan, and G. J. Tearney, “Single-beam spectrally encoded color imaging,” Opt. Lett. 43(10), 2229–2232 (2018). [CrossRef]  

14. D. Kang, R. V. Martinez, G. M. Whitesides, and G. J. Tearney, “Miniature grating for spectrally-encoded endoscopy,” Lab Chip 13(9), 1810–1816 (2013). [CrossRef]  

15. M. G. Moharam, E. B. Grann, D. A. Pommet, and T. K. Gaylord, “Formulation for stable and efficient implementation of the rigorous coupled-wave analysis of binary gratings,” J. Opt. Soc. Am. A 12(5), 1068–1076 (1995). [CrossRef]  

16. M. P. Tjoa, S. M. Krishnan, C. Kugean, P. Wang, and R. Doraiswami, “Segmentation of clinical endoscopic image based on homogeneity and hue,” Proc. Conf. 3, 2665–2668 (2001). [CrossRef]

17. M. Mahy, L. Van Eycken, and A. Oosterlinck, “Evaluation of uniform color spaces developed after the adoption of CIELAB and CIELUV,” Color Res. Appl. 19(2), 105–121 (1994). [CrossRef]  

18. M. Merman, A. Abramov, and D. Yelin, “Theoretical analysis of spectrally encoded endoscopy,” Opt. Express 17(26), 24045–24059 (2009). [CrossRef]  



Figures (9)

Fig. 1. Schematic of the color SEE device.
Fig. 2. The color SEE probe (inside of a straight rigid guide) and the motor with cables. The probe was disconnected mechanically and optically from the fibers at the motor side.
Fig. 3. Schematic of the probe distal end.
Fig. 4. (A) Ray tracing of SEE probe illumination optics in Zemax; (B) microscopic image at side view of the probe illumination optics; (C) optical microscopic image of the resin grating on the spacer’s surface from the direction of the yellow arrow in (B); (D) enlarged image in the yellow box in (C); (E) cross-sectional SEM image of the resin grating.
Fig. 5. (A) Schematic of the custom grating’s diffraction orders and (B) calculated diffraction efficiencies of the custom grating. The color bands of blue, green, and red in (B) correspond to the wavelength bands used in color SEE imaging for the blue, green, and red channels, respectively.
Fig. 6. (A) Flexible probe detection sheath including multi-mode fiber bundle. The drive cable of the illumination optics was inserted into the inner sheath. (B) Rigid bent guide (304 stainless steel tubing, McMaster-Carr, 14.5 gauge) containing the SEE probe inside. The guide was bent near the distal end (probe bending radius = 5.4 mm and bending angle = 110°).
Fig. 7. (A)-(C): Color chart images obtained using the color SEE device; screenshots of the video taken at 30 Hz. The arrows denote the false image artifact due to lower-order illumination. (D): An image of the same color chart obtained using a digital still camera. The dashed circles correspond to the (A), (B), and (C) fields of view.
Fig. 8. Magnified view of NBS 1963A resolution target imaging obtained using the SEE device (A, 11.0 lp/mm) and a commercial chip-on-tip endoscope (B, 7.1 lp/mm). Both working distances were 10 mm.
Fig. 9. Ex vivo imaging of swine intertarsal joint, using (A-B) color SEE device, (C-D) chip-on-tip endoscope, and (E) digital still camera. Arrows in the image (E) indicate where the images (A)-(D) were obtained.

Tables (1)


Table 1. Reference CIE L*a*b* (D5000, 2° observer) values of the red, green, and blue color squares provided by the manufacturer, and average CIE L*a*b* values measured by the color SEE system and by the digital still camera, compared against the manufacturer reference.
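Differences between the measured and reference L*a*b* values in Table 1 can be quantified with the CIE76 color difference of Eq. (3). A minimal sketch with illustrative L*a*b* triplets (not the paper's measured values):

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triplets, per Eq. (3)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Illustrative reference and measured values only.
reference = (40.0, 55.0, 30.0)
measured = (42.0, 52.0, 28.0)
print(f"Delta-E = {delta_e(reference, measured):.2f}")  # 4.12
```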

Equations (4)


$n_s \sin{\theta_i} + \sin{\theta_d} = mg\lambda,$  (1)
$\theta_d = \theta_{d0} + \theta_v,$  (2)
$\Delta E = \sqrt{(L_1 - L_2)^2 + (a_1 - a_2)^2 + (b_1 - b_2)^2}.$  (3)
$\frac{S_{\textrm{False}}}{S_{\textrm{True}}} = \left(\frac{\eta_{\textrm{Lower}}}{\eta_{\textrm{Imaging}}} \cdot \frac{\alpha_{\textrm{Lower}}}{\alpha_{\textrm{Imaging}}} \cdot \frac{\tau_{\textrm{Lower}}}{\tau_{\textrm{Imaging}}}\right) \frac{\rho_{\textrm{Lower}}}{\rho_{\textrm{Imaging}}}$  (4)
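A consequence of the grating equation is that, for fixed incidence and diffraction angles, the product $m\lambda$ is constant, so the 6th-, 5th-, and 4th-order wavelengths overlap at the same point on the sample when $6\lambda_B = 5\lambda_G = 4\lambda_R$. A minimal numeric sketch (the blue wavelength below is an assumed illustrative value, not a device parameter):

```python
# For fixed theta_i and theta_d, the grating equation
# n_s*sin(theta_i) + sin(theta_d) = m*g*lambda gives m*lambda = const,
# so the three color channels overlap when 6*lam_B = 5*lam_G = 4*lam_R.
lam_blue = 415.0           # nm, assumed 6th-order blue wavelength
const = 6 * lam_blue       # the shared m*lambda product
lam_green = const / 5      # 5th-order green wavelength at the same angle
lam_red = const / 4        # 4th-order red wavelength at the same angle
print(lam_green, lam_red)  # 498.0 622.5
```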