
Development of a visible to 1600 nm hyperspectral imaging rigid-scope system using supercontinuum light and an acousto-optic tunable filter

Open Access

Abstract

In this study, we developed a rigid-scope system that can perform hyperspectral imaging (HSI) from visible to 1600 nm wavelengths using a supercontinuum light source and an acousto-optic tunable filter to emit specific wavelengths. The optical performance of the system was verified, and its classification ability was investigated. The results demonstrated that HSI (490–1600 nm) could be performed. In addition, seven different targets could be classified by a neural network with an accuracy of 99.6%, recall of 93.7%, and specificity of 99.1% when the wavelength range over 1000 nm (OTN) was extracted from the HSI data as training data.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Near-infrared hyperspectral imaging (NIR-HSI), which acquires spectral images in the NIR region (850–2500 nm), has attracted attention in the food and industrial fields. This imaging technique can nondestructively analyze the components of objects that cannot be easily distinguished with visible light, such as fat in meat, the composition of similarly colored resins, and contaminants from foreign matter [1–6]. One notable aspect of NIR absorption spectroscopy, particularly over 1000 nm (OTN), is that it carries information from the overtones and combination tones of molecular vibrations. The spectrum derived at each pixel is stored; consequently, NIR-HSI enables identifying organic substances, estimating their concentrations, and creating two-dimensional maps [7]. In addition, compared with infrared light, NIR light suffers lower losses from water absorption and light scattering in the living body. Moreover, NIR light exhibits higher transparency through the living body than ultraviolet and visible light [8]. Because spectral information can be acquired from deep parts of the body, NIR-HSI is expected to be applied in the medical field for the visualization of lesions hidden in normal tissue [9,10].

Therefore, many types of devices that can perform HSI according to the imaging target and situation have been developed. For example, methods that (a) perform HSI by moving an x-axis stage or galvanometer mirror with a line camera attached to a spectrometer or (b) perform HSI with a camera or light source equipped with a wavelength-tunable filter have been reported [11–14]. The former approach, built from a spectroscopic camera and transport equipment, is widely used; moreover, it enables uniform illumination and is suitable for line scanning and imaging under a microscope [15]. The latter approach can transmit illumination through light guides and transmit images through relay optics and bundle fibers; thus, it is suitable for situations requiring portability and imaging in confined spaces, such as when rigid scopes and fiberscopes are used [16,17].

However, at OTN wavelengths, ordinary visible cameras lose sensitivity, and few commercially available lenses achieve chromatic aberration correction. Therefore, dedicated cameras, optical systems, and illumination systems must be constructed to build portable NIR-HSI devices. Thus far, no device has been reported that acquires NIR-HSI through a rigid scope, which requires the design of a complex relay optical system. We therefore previously developed a unique rigid scope capable of imaging across the visible to 1600 nm wavelengths, equipped with custom-made optics [18]. NIR multispectral imaging can be achieved by manually switching bandpass filters at 14 wavelengths while transmitting NIR light at each wavelength through a light guide [19]. However, this method has limitations in exploratory spectral analysis studies because the acquired NIR spectra are sparse. Thus, we investigated the feasibility of rigid-scope NIR-HSI acquisition using a light source that can continuously extract specific wavelengths.

Therefore, a supercontinuum (SC) light source and an acousto-optic tunable filter (AOTF) were used to realize rigid-scope NIR-HSI. An SC light source outputs intense, coherent white light, and an AOTF can extract light of specific wavelengths when high-frequency electrical signals are applied [20]. This approach offers the advantages of easy light transmission to the light guide and electrical wavelength switching over a broad range at speeds ≤ 1 ms [21]. The combination of an SC light source and an AOTF can also produce visible wavelengths [22]. Therefore, using a camera sensitive from visible to NIR wavelengths, visible (VIS) HSI can also be obtained. When realized, such a system will enable colorimetric analysis and the analysis of organic substances from HSI data under a rigid scope.

In this study, a rigid-scope system capable of performing VIS-NIR HSI was developed, and various optical characteristics were measured. Subsequently, the HSI data of six types of resin pieces were acquired and investigated to determine whether resins with similar colors could be classified using a neural network.

2. Materials and methods

2.1 Optics of rigid scope and light source

The optical system of the rigid scope was designed as shown in Figs. 1(a) and 1(b). The device consisted of a custom-made VIS-NIR rigid scope (3MX-1760, Machida Endoscope Co., Tokyo, Japan) [23], a variable-focus liquid lens (EL-16-40-NIR-5D-C, Optotune, Dietikon, Switzerland), a 60 mm focal-length lens (AC254-060-C, Thorlabs, USA), and a 1280 × 1024 pixel InGaAs camera (ARTCAM-990SWIR-TEC, ARTRAY Co., Japan). Images were captured using the InGaAs camera, which has a sensitivity range of 450–1700 nm, with a field of view of 40°.


Fig. 1. (a) Developed near-infrared hyperspectral imaging (NIR-HSI) rigid scope. (b) Optical system of the rigid scope. (c) Developed light source for HSI in the range of 490–1600 nm.


The light source was designed as shown in Fig. 1(c). An SC light source (SC-OEM, Wuhan Yangtze Soton Laser Co., Ltd., Wuhan, China) was used, and a specific wavelength (490–1600 nm) was extracted from the light using an AOTF (AOTF-PRO, Wuhan Yangtze Soton Laser Co., Ltd., Wuhan, China). The extracted light was passed through a protected silver mirror (PF10-03-P01 - Φ1, Thorlabs, Newton, NJ, USA) and diffuser (EDC-30, VIAVI Solutions, Scottsdale, AZ, USA), introduced to the light guide (Φ6 mm) of the rigid scope, and irradiated from the tip of the rigid scope to the sample at a lighting angle of approximately 40°.

During image acquisition, the chromatic aberration can be corrected by changing the focal length for each wavelength using a variable-focus liquid lens [19]. The relationship between optical power and wavelength is shown in Fig. 2.


Fig. 2. Plot of the optical power setting of a variable-focus liquid lens as a function of wavelength.


2.2 Camera calibration

Because the light intensity varies as a function of wavelength, the optimal exposure time was set each time the wavelength was changed. To avoid saturation, the exposure time was set using Eq. (1) so that the average of the top 50 pixel intensities reached 90% of the dynamic range when the standard white (STR-99-050, Labsphere, North Sutton, NH, USA) was captured.

$${T_\lambda } = {T_{initial}} \times \frac{{65535}}{{{I_{max50}}}} \times 0.9$$
where ${T_\lambda }$ is the exposure time at each wavelength, ${T_{initial}}$ is 100 ms, and ${I_{max50}}$ is the average of the top 50 pixel intensity values in the spectral image at each wavelength. Because the pixel data have a 12-bit resolution shifted left by four bits (16–65535), the dynamic range was set to 65535. The gain was set to zero. For the white calibration process, HSI data for the white and dark-current images were acquired at each ${T_\lambda }$ setting.
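As a concrete illustration, the per-wavelength exposure update of Eq. (1) can be sketched as follows; this is a minimal sketch, and the function and variable names are ours, not taken from the authors' acquisition software:

```python
import numpy as np

def exposure_time_ms(white_image, t_initial_ms=100.0, full_scale=65535,
                     margin=0.9, top_n=50):
    """Eq. (1): scale the exposure time so the brightest region of a
    standard-white image sits at ~90% of the sensor's dynamic range."""
    # I_max50: average of the 50 highest pixel intensities in the white image.
    i_max50 = np.sort(white_image.ravel())[-top_n:].mean()
    return t_initial_ms * (full_scale / i_max50) * margin
```

For example, a white image whose brightest pixels sit at half of full scale would yield an exposure of roughly 1.8 times the initial 100 ms.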

2.3 Characterization of light source and optics

To investigate the light intensity from the rigid-scope tip at each wavelength extracted by the AOTF, the power was measured in 10 nm increments using a power meter (S120C/S132C, Thorlabs, Newton, NJ, USA), as shown in Fig. 3(a). In addition, to confirm whether the developed device could perform VIS-NIR HSI, a color chart (CASMATCH, Funakoshi Co., Ltd., Japan) and a standard reflector (WCS-MC-020, Labsphere, North Sutton, NH, USA) were used to acquire spectra from 490 to 1600 nm in 6 nm increments, as shown in Fig. 3(b). The calibrated spectral data of the standard reflector have a much finer wavelength resolution than the developed system, which precludes a direct comparison. In the developed system, the wavelength resolution depends on the spectral half-width, and because the light extracted by the AOTF has a different half-width at each wavelength, this factor had to be considered when evaluating the spectral characteristics. The line shape at each wavelength was therefore measured using a monochromator (CM110 1/8 Meter Monochromator, Spectral Products, Putnam, CT, USA), visible (450–1000 nm) or NIR (1000–1700 nm) photosensors (PDA36A2/PDA20CS2, Thorlabs, Newton, NJ, USA), and an oscilloscope (DCS-1054B, TEXIO, Kanagawa, Japan), as shown in Fig. 3(c). Each AOTF-extracted spectral dataset was normalized, and the theoretical spectrum was calculated by multiplying the normalized values with the calibrated data of the standard reflector, as shown in Eq. (2). The simulated spectra were then compared with the acquired spectral data.

$$\mathrm{Simulation}_n = \sum\limits_{m = n - range}^{n + range} (\mathrm{Datasheet}_m \times \mathrm{Monochromator}_m)$$
where $\mathrm{Simulation}_n$ is the theoretical reflectance of the standard reflector when spectroscopy is performed using the AOTF, $\mathrm{Datasheet}_m$ is the reflectance from the calibration certificate at each wavelength, and $\mathrm{Monochromator}_m$ is the normalized value from actual measurements with the monochromator.
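Under the stated normalization (the measured AOTF line shape summing to one over the window), Eq. (2) reduces to a weighted sum over neighboring bands. A minimal sketch with hypothetical array names:

```python
import numpy as np

def simulated_reflectance(datasheet, line_shape, n, half_range):
    """Eq. (2): theoretical reflectance at band n, obtained by weighting
    the certificate reflectance by the normalized AOTF line shape
    measured with the monochromator."""
    m = np.arange(n - half_range, n + half_range + 1)
    return float(np.sum(datasheet[m] * line_shape[m]))
```

If the line shape is uniform over the window, this simply averages the certificate reflectance across the AOTF passband.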


Fig. 3. (a) Power measurements using the HSI light source. S120C and S132C were used for measurements from 490 to 960 nm and from 970 to 1600 nm, respectively. (b) Measurements of reflectors (color chart and standard reflector). (c) Spectroscopy setup of the HSI light source. Photosensors of PDA36A2 and PDA20CS2 were used for measurements from 490 to 964 nm and from 970 to 1600 nm, respectively. (d) Resolution evaluation setup at different wavelengths.


To investigate the resolution at each wavelength, a 1951 USAF resolution chart (R3L3S1N, Thorlabs, Newton, NJ, USA) was placed 50 mm from the tip of the rigid scope. The light extracted by the AOTF was output from the light guide, passed through a lens with a focal length of 250 mm (LA1301, Thorlabs, Newton, NJ, USA) and a 100 × 100 mm glass diffuser (DG100X100-220, Thorlabs, Newton, NJ, USA), and then illuminated the resolution chart. Images were captured at 24 wavelengths from 490 to 1600 nm, as shown in Fig. 3(d). The contrast values were calculated using the Michelson contrast equation [24]:

$$\textrm{Contrast} = \frac{{{I_{max}} - {I_{min}}}}{{{I_{max}} + {I_{min}}}}$$
where ${I_{max}}$ and ${I_{min}}$ represent the highest and lowest pixel intensity values of the adjacent bars within an element, respectively.
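The Michelson contrast of Eq. (3) is a one-line computation; a small sketch (function name is ours):

```python
def michelson_contrast(i_max, i_min):
    """Eq. (3): contrast of adjacent bright/dark bars in a chart element,
    ranging from 0 (no modulation) to 1 (full modulation)."""
    return (i_max - i_min) / (i_max + i_min)
```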

2.4 Short wavelength infrared hyperspectral imaging of resin

For spectral acquisition and validation of the identification accuracy, six types of resins were prepared: fiber-reinforced plastic (FRP), acrylic, epoxy-glass, Nylon 6, polycarbonate, and rigid polyvinyl chloride (PVC) (SPLATEAJAA02, SPLATEAJAA07, SPLATEAJAA080001, SPLATEAJA-A09, SPLATEAJAA11, and SPLATEAJAA14, respectively; Standard Testpiece Co., Ltd., Japan). All resins were approximately 2 mm thick and were cut into quasi-square pieces approximately 5 mm per side. In this study, 186 spectral images were acquired from 490 to 1600 nm in 6 nm increments, with a distance of 50 mm between the tip of the rigid scope and the surface of the resin pieces. To acquire training data, three resin pieces of the same material were placed on a white plate, and the HSI data were captured. To classify multiple resins in the same field of view, six different resin pieces were placed on a white plate, and the HSI data were captured as the test dataset.

2.5 Data processing

The image data were processed according to the flowchart shown in Fig. 4(a). For each spectral image, a white calibration process was performed on all pixels $({i,j} )$ using white and dark image data to correct the brightness.

$$\textrm{W}({i,j} )= \frac{{{I_r}({i,j} )- {I_d}({i,j} )}}{{{I_w}({i,j} )- {I_d}({i,j} )}}$$
where $\textrm{W}({i,j} )$ is the row vector of the data after white calibration processing and ${I_r}({i,j} )$, ${I_w}({i,j} )$, and ${I_d}({i,j} )$ are the row vectors of the raw, white, and dark data, respectively.


Fig. 4. Analysis algorithm. (a) Processing flow. (b) Neural network structure.


To correct the magnification and location changes due to chromatic aberration correction by the variable-focus liquid lens, an affine transformation was performed on the images obtained at each wavelength [19]. Finally, the absorbance was calculated from the intensity values of each pixel.

$$\textrm{A}({i,j}) = -\log_{10}(\textrm{W}({i,j}))$$
where $\textrm{A}({i,j} )$ is the row vector of the absorbance spectrum for the obtained image.
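The processing chain of Eqs. (4) and (5), preceded by the per-wavelength affine registration, can be sketched as follows. This is a minimal NumPy sketch; the function names and the scale/shift parameterization are our illustrative assumptions, not the authors' calibrated transform:

```python
import numpy as np

def register_band(img, scale, shift):
    """Approximate the affine correction for one spectral band: undo a
    magnification `scale` about the image centre and a (dy, dx)
    translation `shift` (nearest-neighbour resampling)."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Map each output pixel back to its source coordinate in the raw band.
    src_y = np.clip(np.round((yy - cy) / scale + cy - shift[0]), 0, h - 1).astype(int)
    src_x = np.clip(np.round((xx - cx) / scale + cx - shift[1]), 0, w - 1).astype(int)
    return img[src_y, src_x]

def white_calibrate(raw, white, dark):
    """Eq. (4): per-pixel reflectance after white/dark correction."""
    return (raw - dark) / (white - dark)

def absorbance(raw, white, dark):
    """Eq. (5): pseudo-absorbance from the calibrated reflectance."""
    return -np.log10(white_calibrate(raw, white, dark))
```

A pixel whose corrected reflectance is 0.5 yields a pseudo-absorbance of log10(2) ≈ 0.301.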

2.6 Machine-learning algorithm

The spectra obtained for each pixel were used to train a machine-learning model for identifying the resin type. Training was performed using a three-layer neural network with a hierarchical structure, as shown in Fig. 4(b). In the neural network, the input spectrum, that is, the absorbance at each wavelength, is multiplied by weights, and the values are propagated to the next layer. After multiple layers, the network identifies whether the input spectrum belongs to one of the six resins or to the standard white background. For a spectrum $x$ consisting of $m$ wavelengths at input pixel $j$, the output value ${z_j}$ for the next layer can be expressed as follows [25,26]:

$${z_j} = \mathop \sum \limits_{i = 1}^m {w_{ij}}{x_i} + {b_j}$$
where ${z_j}$ is the input value to the activation function, whose response is the output of the unit, and ${w_{ij}}$, ${x_i}$, and ${b_j}$ are the corresponding weight, unit inputs, and bias, respectively. The sigmoid, rectified linear unit (ReLU), and softmax functions were used as activation functions; these are expressed by the following equations:
$$\sigma ({{z_j}} )= \frac{1}{{1 + {e^{ - {z_j}}}}}$$
$$\textrm{R}({{z_j}} )= \max ({0,{z_j}} )$$
$$\textrm{S}{({{z_j}} )_k} = \frac{{{e^{{{({z_j})}_k}}}}}{{\mathop \sum \nolimits_{k = 1}^K {e^{{{({z_j})}_k}}}}}$$
where K is the class number, which is seven in this study. The following backpropagation method was used to update the weights:
$$\Delta w_{i,j}^{l - 1,l} = -\eta \frac{\partial E}{\partial w_{i,j}^{l - 1,l}}$$
where $w_{i,j}^{l - 1,l}$ is the weight between the $i$th unit in the ($l - 1)$th layer and the $j$th unit in the $l$th layer, E is the error function, and $\eta $ is the learning rate.

Dropout was incorporated into each layer to prevent overfitting, and the Adam optimizer was used [27]. The neural network output the probability that the spectrum of each pixel belonged to each of the seven classes (six types of resin and the white plate), and the class with the highest probability was taken as the prediction.
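A forward pass through a network of this form, combining Eqs. (6)–(9), can be sketched as follows; the layer sizes and weights here are placeholders, not the trained model:

```python
import numpy as np

def sigmoid(z):                 # Eq. (7)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):                    # Eq. (8)
    return np.maximum(0.0, z)

def softmax(z):                 # Eq. (9), with max-subtraction for stability
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(x, layers):
    """Eq. (6) applied layer by layer: z_j = sum_i w_ij * x_i + b_j,
    ReLU on hidden layers, 7-way softmax on the output (six resins
    plus the white background)."""
    for w, b in layers[:-1]:
        x = relu(w.T @ x + b)
    w, b = layers[-1]
    return softmax(w.T @ x + b)
```

The returned vector sums to one, and the predicted class is its argmax.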

In this study, spectra obtained from 1000 pixels of each of the seven classes were used for training. The wavelength ranges used for training were (i) the VIS region: 490–652 nm; (ii) the NIR sensitivity band of the silicon sensor: 652–1000 nm; (iii) the full region: 490–1600 nm; and (iv) the over-1000 nm (OTN) region: 1000–1600 nm. The training data were kept separate from the data used for identification to avoid overfitting.

The prediction accuracy was evaluated by classifying the pixels into four groups: pixels of a targeted resin predicted as the target (true positive: TP), pixels of a targeted resin predicted as another class (false negative: FN), pixels of a non-targeted resin predicted as the target (false positive: FP), and pixels of a non-targeted resin predicted as non-target (true negative: TN). Based on the classified pixels, the accuracy, recall, and specificity were calculated as follows:

$$Accuracy = \frac{{TP + TN}}{{TP + TN + FP + FN}} \times 100$$
$$Recall = \frac{{TP}}{{TP + FN}} \times 100$$
$$Specificity = \frac{{TN}}{{TN + FP}} \times 100$$
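The three metrics of Eqs. (11)–(13) follow directly from the pixel counts; a small sketch (function name is ours):

```python
def classification_metrics(tp, tn, fp, fn):
    """Eqs. (11)-(13): per-class accuracy, recall, and specificity,
    expressed as percentages of the classified pixels."""
    accuracy = (tp + tn) / (tp + tn + fp + fn) * 100
    recall = tp / (tp + fn) * 100
    specificity = tn / (tn + fp) * 100
    return accuracy, recall, specificity
```

For example, 90 true positives with 10 false negatives yields a recall of 90%.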

3. Results

Figure 5(a) shows the measured power at each wavelength of the light emitted from the tip of the rigid scope through the light guide. Because a YAG laser is injected into a nonlinear optical medium to create the white light in the SC source, a peak of approximately 18 mW was observed near 1064 nm, and the spectrum extended from the visible to the NIR around that wavelength. Light output was observed in the wavelength range of 490–1600 nm. Figure 5(b) shows the pseudocolor image of the color chart and the average spectrum within the squares of the red, green, blue, white, and black patches. The average red, green, and blue spectra lay between the black and white spectra at all wavelengths. Although the output below 490 nm (the wavelengths corresponding to blue) was below the minimum detection limit, and the standard deviation was large where the power was less than 1 mW (up to approximately 650 nm), spectral differences were observed for each color of the chart; e.g., the green patch showed absorption in the 514–580 nm region, and the red patch showed absorption in the 610–750 nm region.


Fig. 5. Properties of the light source. (a) Plot of the power output irradiated from the tip of the rigid scope as a function of wavelength. (b) Color chart imaging results: (i) Visual image of color chart. (ii) Acquired images using the developed device at 640, 550, and 508 nm, denoted as R, G, and B, respectively. (iii) Average spectra of red, green, blue, black, and white colors. (c) Spectral performance. The blue line shows the acquired spectrum of a standard reflector, and the red line shows the spectral simulation results obtained by multiplying the datasheet (black line) and the spectroscopic results. (d) Plot of exposure time as a function of wavelength.


Figure 5(c) shows the comparison between the acquired absorption spectrum of the standard reflector and the simulation based on the calibration certificate. The blue line represents the acquired spectrum of the reflector, and the shaded band represents the standard deviation. The black line is the reference value from the calibration certificate, and the red line is the simulation obtained by multiplying the datasheet values with the spectroscopic results. Although the obtained spectra show an overall upward shift relative to the simulation, they capture the absorption characteristics of the standard reflector. The exposure times used for imaging at each wavelength are shown in Fig. 5(d). The time required for VIS-NIR HSI (186 wavelengths between 490 and 1600 nm) was approximately 390 s.

The 1951 USAF resolution chart images for each wavelength are shown in Fig. 6(a). The relationship between resolution and contrast is shown in Fig. 6(b). In this VIS-NIR HSI system, the contrast for a given line pair tended to decrease as the wavelength increased.


Fig. 6. Resolution outcomes. (a) Results of the 1951 USAF resolution chart raw images. (b) Contrast plots as a function of resolution at different wavelengths.


The results of the VIS-NIR HSI acquisition for the six types of resins are shown in Fig. 7. Figure 7(a) shows the samples (FRP, acrylic, epoxy-glass, Nylon 6, polycarbonate, and rigid PVC) captured by a visible-color camera under white light. Figure 7(b) shows pseudocolor images acquired by the VIS-NIR HSI rigid-scope system using R = 640 nm, G = 550 nm, and B = 508 nm for the visible image and R = 1150 nm, G = 1354 nm, and B = 1438 nm for the NIR pseudocolor image. The average spectra of the six resins acquired using this system are shown in Fig. 7(c). Each spectrum shows the absorption characteristics of molecular vibrations in the OTN band; for example, the second overtone of CH2 (1200 nm) was observed in the acrylic, rigid PVC, and polycarbonate samples [28]. Furthermore, the combination tones of CH3 (1340 nm) and aromatic CH (1440 nm) were observed in acrylic and polycarbonate, respectively [29].


Fig. 7. (a) Images of six types of resin pieces acquired using a visible camera. (b) Merged color images; the left and right images were obtained in the visible (R: 640 nm, G: 550 nm, B: 508 nm) and NIR ranges (R: 1150 nm, G: 1354 nm, B: 1438 nm), respectively. (c) Acquired spectra of six types of resins in the range of 490–1600 nm. (d) Neural network classification outcomes. The wavelength ranges used for (i), (ii), (iii), and (iv) were 490–652 nm, 652–1000 nm, 490–1600 nm, and 1000–1600 nm, respectively.


The pseudocolor maps produced by the 7-class neural-network classification are shown in Fig. 7(d), and the accuracy metrics under each condition are listed in Table 1. The classifier using the spectral data in the visible range (i) resulted in many misclassifications, with an average recall of 22.1%. In particular, many pixels of the transparent acrylic, polycarbonate, and rigid PVC materials were misidentified as the background, and FRP was misidentified as epoxy-glass. The classifier using the spectral data of the NIR sensitivity band of the silicon sensor (ii) also led to numerous misclassifications, with an average recall of 37.4%; many background pixels were misclassified as acrylic, polycarbonate, and rigid PVC. Conversely, the classifiers using the spectral data of (iii) the full region (490–1600 nm) and (iv) the OTN region resulted in fewer misclassifications, with averages of 99.3% and 99.6% for accuracy, 90.1% and 93.7% for recall, and 98.5% and 99.1% for specificity, respectively.


Table 1. Classification Results of Hyperspectral Imaging Analysis for Six Types of Resin Pieces and Standard White

4. Discussion

In this study, VIS-NIR HSI was successfully performed under a rigid scope using an SC light source with an AOTF; the method can extract wavelengths in the range of 490–1600 nm, irradiate the target through a light guide, and capture images at each wavelength while focusing with a variable-focus liquid lens. This system has several advantages: (i) The light power of the extracted wavelength is sufficiently low (<18 mW) to enable nondestructive imaging of a variety of objects. (ii) Because the spectral light is guided by a light guide, the device can be downsized compared with the push-broom method, which houses a spectrometer and scanning mechanism inside the camera. Therefore, HSI of narrow places and of three-dimensional objects from multiple angles can be performed. (iii) The system can acquire more continuous NIR spectra than the conventional MSI rigid-scope device; therefore, it is possible to analyze which wavelengths are important for identification.

In addition, even for resins whose visible colors are too similar to distinguish easily, neural network analysis of the VIS-NIR HSI data classified the seven classes with high accuracy. Conversely, this investigation revealed several points for future improvement of the device and points to note when acquiring HSI data with this method.

In the wavelength range of 490–650 nm, the noise in the acquired images was large, as shown in Fig. 5(b). This is because the light intensity in this region is low, as indicated by the power measurements presented in Fig. 5(a). In the future, to improve the quality of images in the visible region, an intense light source for the visible region should be added, or a dichroic mirror should be used to extract light from the visible wavelength and image it onto a low-noise, high-sensitivity camera sensor.

Furthermore, Eq. (5) assumes the Lambert–Beer law, which does not strictly apply to objects that produce diffuse reflection, so the absorbance equation may not always hold. Ideally, the Kubelka–Munk formula should be used; however, its strict applicability conditions limit the ability to obtain accurate spectra. Nevertheless, the present method yields spectral trends similar to those of transmission absorption measurements; therefore, it should still be possible to classify objects through machine learning.

As shown in Fig. 6, the contrast decreased as the wavelength increased. This is because the optical power of the liquid lens had to be increased at longer wavelengths to compensate for the chromatic focus shift, which in turn enlarged the angle of view. Therefore, to maintain contrast at each wavelength, the optical design of the rigid-scope lens must correct chromatic aberration over a broad wavelength range. With such a correction, images could be acquired at the same angle of view even when the wavelength is switched.

The classification outcomes indicate that low recall resulted when only the visible region (490–652 nm) or the NIR sensitivity band of the silicon sensor (652–1000 nm) was used for training. The characteristic absorption features are considered to be faint in these ranges [30]. However, when training used the full range (490–1600 nm) or the OTN range (1000–1600 nm), the classification accuracy improved. A characteristic spectrum derived from molecular vibrations is believed to have been obtained for each resin in the OTN region, which aided the classification. Additionally, the OTN region yielded higher average accuracy, recall, and specificity than the full region. This suggests that the spectral data of the visible region were featureless and unnecessary as training data.

In examining the OTN-region classification for misidentified pixels, the recall of polycarbonate was confirmed to be low. As shown in Fig. 7(d), the boundaries around the acrylic piece were misidentified as polycarbonate. In this setup, the illumination was delivered through a light guide positioned parallel to the optical axis of image capture; consequently, shadows were created around the object, which may have led to misidentification. The influence of such lighting-induced misidentification could be reduced by capturing images from multiple angles. Addressing the aforementioned shortcomings would allow the VIS-NIR HSI rigid-scope system to provide more accurate data acquisition.

Although the AOTF and SC light sources enable faster wavelength switching than conventional MSI devices, the imaging time in this study was approximately 390 s. The imaging time was long because the camera settings, optical design, and number of wavelengths were not optimized to prioritize imaging speed. The imaging time can be reduced using the following methods and settings. The exposure time was set for each wavelength such that white light reached approximately 90% of the dynamic range; however, because these conditions provided ample brightness for obtaining spectral information, the exposure time could be shortened. In addition, because a lag of approximately 100 ms occurred whenever the exposure time changed, this delay could be removed by fixing the exposure time and adopting a variable neutral-density filter. Moreover, the images in this study were captured with a gain of zero; increasing the gain would also allow a shorter exposure time. As a common optical design problem, a delay of approximately 25 ms was caused by the operation of the variable-focus liquid lens when the extracted wavelength was changed. Therefore, in the future, a chromatic aberration-correcting lens should be developed for the full wavelength range of camera sensitivity (450–1700 nm) to eliminate the variable-focus liquid lens. A total of 186 wavelengths were used in this study. Akimoto et al. reported that a gastrointestinal stromal tumor area could be classified by a neural network with almost no loss of accuracy by training on only four effective wavelengths extracted from 196-band NIR-HSI data [31]. Therefore, if the wavelengths can be limited to those optimized for target identification, the time required for image acquisition can be reduced considerably. In addition, wavelength selection has the potential to enable real-time classification because it reduces the machine-learning calculation costs. If the classification can be displayed at approximately 15–30 frames per second (a video-like frame rate), the number of possible applications will increase; for example, on-site measurement in industrial fields and laparoscopic surgery can be considered [32].

5. Conclusion

In this study, a rigid-scope system capable of HSI at VIS-NIR wavelengths was developed by designing an optical system that uses an AOTF and an SC light source. The system was confirmed to be capable of imaging in the 490–1600 nm range for absorption spectroscopy. Using this system, the spectra of six types of resins were acquired and classified pixel by pixel with a neural network over multiple wavelength ranges. Training in the 490–652 and 652–1000 nm regions, which fall within the sensitivity band of a silicon (visible) camera, resulted in many misclassifications, with an average recall below 40%. Conversely, when training was performed in the 490–1600 nm region, which includes the NIR region, the average recall was 90.1%; when training was limited to the 1000–1600 nm region, the average recall was 93.7%, and the average values of the other evaluation indices also improved compared with the other conditions. Therefore, the system can acquire information on the molecular vibrations of each resin at each pixel. In addition, the portability of the system is anticipated to expand the range of potential targets for VIS-NIR HSI measurements and to enhance its utility in the industrial and medical fields.

Funding

Agencia Canaria de Investigación, Innovación y Sociedad de la Información (POC 2014-2020, Eje 3 Tema Prioritario 74 (85%)); Japan Society for the Promotion of Science (21H038440); Japan Agency for Medical Research and Development (23he0422027j0001, 23ym0126806j0002; Seed Number: A417TS); National Cancer Center Japan (2023-A-9, 31-A-11).

Disclosures

The authors have no relevant financial interests or any other potential conflicts of interest to disclose.

Data availability

Data presented in this study are available upon request from the corresponding author.

References

1. W. Jia, S. van Ruth, N. Scollan, et al., “Hyperspectral imaging (HSI) for meat quality evaluation across the supply chain: current and future trends,” Curr. Res. Food Sci. 5, 1017–1027 (2022).

2. J. Ma, D.-W. Sun, H. Pu, et al., “Advanced techniques for hyperspectral imaging in the food industry: principles and recent applications,” Annu. Rev. Food Sci. Technol. 10(1), 197–220 (2019).

3. F. Dong, Y. Bi, J. Hao, et al., “A combination of near-infrared hyperspectral imaging with two-dimensional correlation analysis for monitoring the content of alanine in beef,” Biosensors 12(11), 1043 (2022).

4. A. Faltynkova and M. Wagner, “Developing and testing a workflow to identify microplastics using near infrared hyperspectral imaging,” Chemosphere 336, 139186 (2023).

5. W. Ye, W. Xu, T. Yan, et al., “Application of near-infrared spectroscopy and hyperspectral imaging combined with machine learning algorithms for quality inspection of grape: a review,” Foods 12(1), 132 (2022).

6. M. J. Khan, H. S. Khan, A. Yousaf, et al., “Modern trends in hyperspectral image analysis: a review,” IEEE Access 6, 14118–14129 (2018).

7. K. Okubo, Y. Kitagawa, N. Hosokawa, et al., “Visualization of quantitative lipid distribution in mouse liver through near-infrared hyperspectral imaging,” Biomed. Opt. Express 12(2), 823–835 (2021).

8. A. M. Smith, M. C. Mancini, and S. Nie, “Second window for in vivo imaging,” Nat. Nanotechnol. 4(11), 710–711 (2009).

9. D. Sato, T. Takamatsu, M. Umezawa, et al., “Distinction of surgically resected gastrointestinal stromal tumor by near-infrared hyperspectral imaging,” Sci. Rep. 10(1), 21852 (2020).

10. T. Mitsui, A. Mori, T. Takamatsu, et al., “Evaluating the identification of the extent of gastric cancer by over-1000 nm near-infrared hyperspectral imaging using surgical specimens,” J. Biomed. Opt. 28(08), 086001 (2023).

11. A. Yahata, H. Takemura, T. Takamatsu, et al., “Wavelength selection of near-infrared hyperspectral imaging for gastric cancer detection,” in 2021 6th International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS) (2021), pp. 219–223.

12. A. Yahata, H. Takemura, R. Iwanami, et al., “Gastric cancer detection by two-step learning in near-infrared hyperspectral imaging,” Journal of Information and Communication Engineering 7(2), 467–473 (2021).

13. H. Akbari, K. Uto, Y. Kosugi, et al., “Cancer detection using infrared hyperspectral imaging,” Cancer Sci. 102(4), 852–857 (2011).

14. A. Mori, M. Umezawa, K. Okubo, et al., “Visualization of hydrocarbon chain length and degree of saturation of fatty acids in mouse livers by combining near-infrared hyperspectral imaging and machine learning,” Sci. Rep. 13(1), 20555 (2023).

15. N. T. Clancy, G. Jones, L. Maier-Hein, et al., “Surgical spectral imaging,” Med. Image Anal. 63, 101699 (2020).

16. J. Yoon, J. Joseph, D. J. Waterhouse, et al., “A clinically translatable hyperspectral endoscopy (HySE) system for imaging the gastrointestinal tract,” Nat. Commun. 10(1), 1902 (2019).

17. K. J. Zuzak, S. C. Naik, G. Alexandrakis, et al., “Characterization of a near-infrared laparoscopic hyperspectral imaging system for minimally invasive surgery,” Anal. Chem. 79(12), 4709–4715 (2007).

18. T. Zako, M. Yoshimoto, H. Hyodo, et al., “Cancer-targeted near infrared imaging using rare earth ion-doped ceramic nanoparticles,” Biomater. Sci. 3(1), 59–64 (2015).

19. T. Takamatsu, Y. Kitagawa, K. Akimoto, et al., “Over 1000 nm near-infrared multispectral imaging system for laparoscopic in vivo imaging,” Sensors 21(8), 2649 (2021).

20. A. Bellincontro, A. Taticchi, M. Servili, et al., “Feasible application of a portable NIR-AOTF tool for on-field prediction of phenolic compounds during the ripening of olives for oil production,” J. Agric. Food Chem. 60(10), 2665–2673 (2012).

21. M. Misono, N. Henmi, T. Hosoi, et al., “High-speed wavelength switching and stabilization of an acoustooptic tunable filter for WDM network in broadcasting stations,” IEEE Photonics Technol. Lett. 8(4), 572–574 (1996).

22. H. Shao, Y. Chen, Z. Yang, et al., “Feasibility study on hyperspectral LiDAR for ancient Huizhou-style architecture preservation,” Remote Sens. 12(1), 88 (2019).

23. T. Zako, M. Ito, H. Hyodo, et al., “Extra-luminal detection of assumed colonic tumor site by near-infrared laparoscopy,” Surg. Endosc. 30(9), 4153–4159 (2016).

24. H. G. Kang, S. H. Song, Y. B. Han, et al., “Proof-of-concept of a multimodal laparoscope for simultaneous NIR/gamma/visible imaging using wavelength division multiplexing,” Opt. Express 26(7), 8325–8339 (2018).

25. D. Svozil, V. Kvasnicka, and J. Pospichal, “Introduction to multi-layer feed-forward neural networks,” Chemom. Intell. Lab. Syst. 39(1), 43–62 (1997).

26. T. Udelhoven and B. Schütt, “Capability of feed-forward neural networks for a chemical evaluation of sediments with diffuse reflectance spectroscopy,” Chemom. Intell. Lab. Syst. 51(1), 9–22 (2000).

27. D. P. Kingma and J. Ba, “Adam: a method for stochastic optimization,” arXiv:1412.6980 (2014).

28. M. Shimoyama, S. Hayano, K. Matsukawa, et al., “Discrimination of ethylene/vinyl acetate copolymers with different composition and prediction of the content of vinyl acetate in the copolymers and their melting points by near-infrared spectroscopy and chemometrics,” J. Polym. Sci. B Polym. Phys. 36(9), 1529–1537 (1998).

29. M. Kumagai, H. Suyama, T. Sato, et al., “Chemical meaning of near infrared spectra from a portable near infrared spectrometer for various plastic wastes,” Int. J. Soc. Mater. Eng. Resour. 11(1), 5–9 (2003).

30. R. Leon, H. Fabelo, S. Ortega, et al., “VNIR–NIR hyperspectral imaging fusion targeting intraoperative brain cancer detection,” Sci. Rep. 11(1), 19696 (2021).

31. K. Akimoto, R. Ike, K. Maeda, et al., “Wavelength bands reduction method in near-infrared hyperspectral image based on deep neural network for tumor lesion classification,” European Journal of Applied Sciences 9(1), 273–281 (2021).

32. L. Ayala, T. J. Adler, S. Seidlitz, et al., “Spectral imaging enables contrast agent-free real-time ischemia monitoring in laparoscopic surgery,” Sci. Adv. 9(10), eadd6778 (2023).



Figures (7)

Fig. 1. (a) Developed near-infrared hyperspectral imaging (NIR-HSI) rigid scope. (b) Optical system of the rigid scope. (c) Developed light source for HSI in the range of 490–1600 nm.
Fig. 2. Plot of the optical power setting of a variable-focus liquid lens as a function of wavelength.
Fig. 3. (a) Power measurements using the HSI light source. S120C and S132C sensors were used for measurements from 490 to 960 nm and from 970 to 1600 nm, respectively. (b) Measurements of reflectors (color chart and standard reflector). (c) Spectroscopy setup of the HSI light source. PDA36A2 and PDA20CS2 photosensors were used for measurements from 490 to 964 nm and from 970 to 1600 nm, respectively. (d) Resolution evaluation setup at different wavelengths.
Fig. 4. Analysis algorithm. (a) Processing flow. (b) Neural network structure.
Fig. 5. Properties of the light source. (a) Plot of the power output irradiated from the tip of the rigid scope as a function of wavelength. (b) Color chart imaging results: (i) Visual image of the color chart. (ii) Images acquired using the developed device at 640, 550, and 508 nm, denoted as R, G, and B, respectively. (iii) Average spectra of red, green, blue, black, and white colors. (c) Spectral performance. The blue line shows the acquired spectrum of a standard reflector, and the red line shows the spectral simulation results obtained by multiplying the datasheet (black line) by the spectroscopic results. (d) Plot of exposure time as a function of wavelength.
Fig. 6. Resolution outcomes. (a) Raw images of the 1951 USAF resolution chart. (b) Contrast plots as a function of resolution at different wavelengths.
Fig. 7. (a) Images of six types of resin pieces acquired using a visible camera. (b) Merged color images; the left and right images were obtained in the visible (R: 640 nm, G: 550 nm, B: 508 nm) and NIR ranges (R: 1150 nm, G: 1354 nm, B: 1438 nm), respectively. (c) Acquired spectra of the six types of resins in the range of 490–1600 nm. (d) Neural network classification outcomes. The wavelength ranges used for (i), (ii), (iii), and (iv) were 490–652 nm, 652–1000 nm, 490–1600 nm, and 1000–1600 nm, respectively.

Tables (1)

Table 1. Classification Results of Hyperspectral Imaging Analysis for Six Types of Resin Pieces and Standard White

Equations (13)

$$T_\lambda = T_{\mathrm{initial}} \times \frac{65535}{I_{\max} - 50} \times 0.9$$
$$\mathrm{Simulation}_n = \sum_{m=n-\mathrm{range}}^{n+\mathrm{range}} \left(\mathrm{Datasheet}_m \times \mathrm{Monochromator}_m\right)$$
$$\mathrm{Contrast} = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}}$$
$$W(i,j) = \frac{I_r(i,j) - I_d(i,j)}{I_w(i,j) - I_d(i,j)}$$
$$A(i,j) = -\log_{10}\bigl(W(i,j)\bigr)$$
$$z_j = \sum_{i=1}^{m} w_{ij} x_i + b_j$$
$$\sigma(z_j) = \frac{1}{1 + e^{-z_j}}$$
$$R(z_j) = \max(0, z_j)$$
$$S(z_j)_k = \frac{e^{(z_j)_k}}{\sum_{k'=1}^{K} e^{(z_j)_{k'}}}$$
$$\Delta w_{i,j}^{l-1,l} = -\eta \frac{\partial E}{\partial w_{i,j}^{l-1,l}}$$
$$\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN} \times 100$$
$$\mathrm{Recall} = \frac{TP}{TP + FN} \times 100$$
$$\mathrm{Specificity} = \frac{TN}{TN + FP} \times 100$$
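The reflectance-correction, absorbance, and evaluation-metric formulas listed above can be condensed into a short NumPy sketch. This is an illustrative reimplementation, not the authors' code; the clip guard against taking log10 of zero is an addition of mine.

```python
import numpy as np

def absorbance(raw, white, dark):
    """Flat-field correction W(i,j), then absorbance A(i,j) = -log10(W)."""
    w = (raw - dark) / (white - dark)
    return -np.log10(np.clip(w, 1e-6, None))  # clip guards log10(0)

def metrics(tp, tn, fp, fn):
    """Accuracy, recall, and specificity in percent, as defined above."""
    acc = (tp + tn) / (tp + tn + fp + fn) * 100
    rec = tp / (tp + fn) * 100
    spe = tn / (tn + fp) * 100
    return acc, rec, spe

# Toy per-pixel values: raw 55, white reference 105, dark 5.
raw, white, dark = np.array([55.0]), np.array([105.0]), np.array([5.0])
print(absorbance(raw, white, dark))  # W = 0.5, so A ≈ 0.301
print(metrics(90, 950, 10, 6))       # recall = 90/96 * 100 = 93.75
```

In practice `absorbance` would be applied element-wise to the whole corrected hyperspectral cube before the absorbance spectra are fed to the neural network.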