Image-guided surgery can enhance cancer treatment by decreasing, and ideally eliminating, positive tumor margins and iatrogenic damage to healthy tissue. Current state-of-the-art near-infrared fluorescence imaging systems are bulky and costly, lack sensitivity under surgical illumination, and lack co-registration accuracy between multimodal images. As a result, an overwhelming majority of physicians still rely on their unaided eyes and palpation as the primary sensing modalities for distinguishing cancerous from healthy tissue. Here we introduce an innovative design, comprising an artificial multispectral sensor inspired by the Morpho butterfly’s compound eye, which can significantly improve image-guided surgery. By monolithically integrating spectral tapetal filters with photodetectors, we have realized a single-chip multispectral imager with higher sensitivity and better spatial co-registration accuracy compared to clinical imaging systems in current use. Preclinical and clinical data demonstrate that this technology seamlessly integrates into the surgical workflow while providing surgeons with real-time information on the location of cancerous tissue and sentinel lymph nodes. Due to its low manufacturing cost, our bio-inspired sensor will provide resource-limited hospitals with much-needed technology to enable more accurate value-based health care.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
Surgery is the primary curative option for patients with cancer, with the overall objective of complete resection of all cancerous tissue while avoiding iatrogenic damage to healthy tissue. In addition, sentinel lymph node (SLN) mapping and resection is an essential step in staging and managing the disease. Even with the latest advancements in imaging technology, incomplete tumor resection in patients with breast cancer occurs at an alarming rate of 20 to 25 percent, with recurrence rates of up to 27 percent. The clinical need for imaging instruments that provide real-time feedback in the operating room remains unmet, largely because current imaging systems follow contemporary design conventions in the semiconductor and optical fields, yielding bulky and costly instruments with suboptimal sensitivity and poor co-registration accuracy between multimodal images [3–7].
Here, we demonstrate that image-guided surgery can be dramatically improved by shifting the design paradigm away from conventional advancements in the semiconductor and optical technology fields and instead adapting the elegant 500-million-year-old design of the Morpho butterfly’s compound eye [8,9]: a condensed biological system optimized for high-acuity detection of multispectral information. Nature has served as the inspiration for many engineering sensory designs, with performance exceeding state-of-the-art sensory technology and enabling new engineering paradigms such as achromatic circular polarization sensors, artificial vision sensors [11–15], silicon cochleae [16,17], and silicon neurons. Our artificial compound eye, inspired by the Morpho butterfly’s photonic crystals, monolithically integrates pixelated spectral filters with an array of silicon-based photodetectors. Our bio-inspired image sensor has the prominent advantages of (1) capturing both color and near-infrared fluorescence (NIRF) with high co-registration accuracy and sensitivity under surgical light illumination, allowing simultaneous identification of anatomical features and tumor-targeted molecular markers; (2) streamlined design—at 20 g including optics, our bio-inspired image sensor does not impede surgical workflow; and (3) low manufacturing cost, which will provide resource-limited hospitals with much-needed technology to enable more accurate value-based health care.
A. Nature-Inspired Design
Light has imposed significant selection pressure for perfecting, optimizing, and miniaturizing animal visual systems since the Cambrian period some 500 million years ago. Sophisticated visual systems emerged in a tight race with prey coloration during that time, resulting in a proliferation of photonic crystals in the animal kingdom used for both signaling and sensing [20,21]. For example, not only are the tree-shaped photonic crystals of the Morpho butterfly the source of its wings’ magnificent iridescent colors [Fig. 1(a)], which can be sensed by conspecifics a mile away, but these crystals have inspired the design of photonic structures that can sense vapors and infrared photons with sensitivity that surpasses state-of-the-art manmade sensors.
Similar photonic crystals are also present in the compound eye of the Morpho butterfly. These photonic crystals, known as tapetal filters, are realized by stacks of alternating layers of air and cytoplasm, which act as interference filters at the proximal end of the rhabdom within each ommatidium [Fig. 1(d)]. The light that enters an individual ommatidium and is not absorbed by the visual and screening pigments in the rhabdom will be selectively reflected by the tapetal filters and will have another chance of being absorbed before exiting the eye. The spectral responses of the tapetal filters, together with screening and visual pigments in the rhabdom, determine the eye shine of the ommatidia and the inherent multispectral sensitivity of the butterfly’s visual system [Fig. 1(b)]. Individual ommatidia have different combinations of visual pigments and tapetal filter stacks, enabling selective spectral sensitivity across the ultraviolet, visible, and near-infrared (NIR) spectra.
By imitating the compound eye of the Morpho butterfly using dielectric materials and silicon-based photosensitive elements, we developed a multispectral imaging sensor that operates radically differently from current state-of-the-art multispectral imaging technology [Fig. 1(c); Table 1]. The tapetal spectral filters are constructed from alternating nanometric dielectric layers, which are pixelated with a 7.8 μm pitch and deposited onto the surface of a custom-designed silicon-based complementary metal-oxide semiconductor (CMOS) imaging array (see Methods). The alternating stack of dielectrics acts as an interference filter, allowing certain light spectra to be transmitted while reflecting others [Fig. 1(e)]. Four distinct pixelated spectral filters are replicated throughout the imaging sensor in a two-by-two pattern by modulating the thickness and periodicity of the dielectric layers in individual pixels. Three of the four pixels are designed to sense the red, green, and blue (RGB) spectra, respectively, and the fourth pixel captures NIR photons with wavelengths greater than 780 nm. An additional stack of interference filters is deposited across all pixels to block fluorescence excitation light between 770 and 800 nm. The proximity of the imaging array’s four base pixels inherently co-registers the captured multispectral information, similar to its biological counterpart.
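Because the four spectral filters repeat in a fixed two-by-two pattern, each channel can be recovered from the raw frame by simple strided sampling. A minimal sketch in Python; the assignment of R, G, B, and NIR to particular positions within the two-by-two unit is our illustrative assumption, not the sensor's documented layout:

```python
import numpy as np

def split_mosaic(raw):
    """Split a raw frame from a 2x2-patterned sensor into its four
    spectral channels. Assumes (illustratively) the repeating unit
    is [[R, G], [B, NIR]]."""
    r   = raw[0::2, 0::2]
    g   = raw[0::2, 1::2]
    b   = raw[1::2, 0::2]
    nir = raw[1::2, 1::2]
    return r, g, b, nir

# Example: a synthetic 4x4 raw frame
raw = np.arange(16).reshape(4, 4)
r, g, b, nir = split_mosaic(raw)
```

Each channel comes out at half the resolution of the raw array; full-resolution color would require demosaicing (interpolation), as in a conventional Bayer pipeline.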
B. Optoelectronic Performance of the Bio-Inspired Sensor
The optical density and transmission spectra of the four base pixels of the artificial compound eye are presented in Figs. 2(a) and 2(b), respectively. The sensor was evaluated with uniform monochromatic light impinging normal to the surface of the imaging plane. The individual tapetal filters are optimized to achieve transmission of 60% in the visible spectrum and 80% in the NIR spectrum. The high optical density of the excitation-blocking filter stack ensures effective suppression of fluorescence excitation light between 770 and 800 nm. The spatial uniformity, or fixed pattern noise (FPN), before calibration for the RGB and NIR filters is 6.5%, 1.9%, 4.5%, and 5%, respectively [Fig. 2(c)]. After first-order gain and offset calibration, the FPN is around 0.1% across different illumination intensities [Fig. 2(d)]. The fixed pattern noise for all four channels is evaluated under 30 ms exposure and different illumination intensities, ranging from dark conditions to intensities that almost saturate the pixel’s output signal. Hence, the axes in Figs. 2(c) and 2(d) represent the mean output signal from a pixel as a percentage of the dynamic range. The spatial variations in the optical response of the tapetal filters are primarily due to variations in the underlying transistors and photodiodes within individual pixels, which can be mitigated via calibration, improving spatial uniformity under various illumination conditions. The peak quantum efficiencies for the RGB and NIR pixels are 28%, 35%, 38%, and 28%, respectively [Fig. 2(e); Table 2].
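The first-order gain and offset calibration mentioned above can be sketched as a standard two-point correction derived from a dark frame and a flat field; the per-pixel model below is our illustrative reconstruction under that assumption, not the authors' exact procedure:

```python
import numpy as np

def calibrate_gain_offset(dark, flat, flat_level):
    """First-order (per-pixel gain + offset) calibration from a dark
    frame and a flat field acquired under uniform illumination."""
    gain = flat_level / (flat - dark)
    offset = -gain * dark
    return gain, offset

def apply_calibration(frame, gain, offset):
    return gain * frame + offset

# Synthetic sensor with per-pixel nonuniformity (fixed pattern noise)
rng = np.random.default_rng(0)
true_gain = 1.0 + 0.05 * rng.standard_normal((4, 4))
true_offset = 10.0 * rng.random((4, 4))
dark = true_offset                         # response at zero illumination
flat = true_gain * 100.0 + true_offset     # response at uniform level 100
gain, offset = calibrate_gain_offset(dark, flat, 100.0)
corrected = apply_calibration(true_gain * 50.0 + true_offset, gain, offset)
# corrected is now spatially uniform: FPN (std/mean) drops to ~0
```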
C. Acquiring NIR Fluorescence and Color Under Surgical Light Illumination
Simultaneous and real-time imaging of both NIR fluorescence and RGB information is essential in surgical settings, as it enables the surgeon to locate the tumor relative to the correct anatomical features. U.S. Food and Drug Administration (FDA) regulations require the intensity of visible-spectrum surgical illumination to be between 40 and 160 kLux. The optical power of NIR laser-based excitation sources is typically orders of magnitude lower. Hence, the intensity of the NIRF molecular probe, which may be emitted from tumors several centimeters deep in the tissue, is at least 5 orders of magnitude weaker than the intensity of the reflected visible-spectrum light [24,25]. To enable simultaneous color and NIR imaging in the operating room, most FDA-approved instruments operate with dimmed surgical illumination, which significantly impedes the surgical workflow: physicians stop the resection, dim the surgical lights, evaluate the surgical margins with NIRF instrumentation, and then continue the surgery under either dim or normal illumination but without NIRF guidance. This significant drawback undermines the intrinsic benefits of NIRF, preventing wide acceptance of this technology in the operating room and leading to positive margins and iatrogenic damage.
When imaging weak NIR fluorescence together with the high visible-spectrum photon flux during intraoperative procedures, the dynamic range of the scene can be extremely wide [24,25]. The imaged tissue is illuminated with bright visible light (40 to 160 kLux) to highlight anatomical features in the wound site, as well as with NIR laser light to excite fluorophores. The NIR fluorescent photon flux depends on the concentration and depth of the fluorescent dye, which change during the surgical procedure as the tumor is located and resected. At the beginning of the surgery, the tumor might be located several millimeters beneath the surface of the tissue. As NIR photons travel between the skin and the tumor, the NIR light is significantly attenuated and the dynamic range of the imaged scene (visible and NIR photons) is at its widest. The NIR photons are less attenuated as the surrounding tissue is resected and the tumor is directly exposed and imaged. Depending on the fluorophore concentration, the dynamic range of the imaged scene can be between 40 and 80 dB when the tumor is at the surface. To address the wide-dynamic-range imaging demands of the operating room, our custom CMOS imager has programmable readout circuitry that enables independent exposure control and programmable gains for both visible and NIR pixels within a single frame (see Methods). Although a single exposure time per frame could serve both modalities by collecting two consecutive images with different exposure times (i.e., one frame with a long and one with a short exposure time), the resulting reduction in frame rate would be a major shortcoming for intraoperative applications where real-time imaging is critical.
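The dynamic-range figures above can be related to intensity ratios using the image-sensor convention DR = 20·log10(I_max/I_min) (our assumption about the convention used); under it, the five-order-of-magnitude gap between reflected surgical light and deep-tumor fluorescence corresponds to 100 dB:

```python
import math

def dynamic_range_db(i_max, i_min):
    """Scene dynamic range using the image-sensor convention
    DR = 20 * log10(I_max / I_min)."""
    return 20 * math.log10(i_max / i_min)

# Reflected surgical light >= 5 orders of magnitude above deep-tumor NIRF
deep = dynamic_range_db(1e5, 1.0)      # 100 dB
# Tumor at the surface: intensity ratios of 1e2 to 1e4 give 40-80 dB
surface = dynamic_range_db(1e4, 1.0)   # 80 dB
```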
In our intraoperative experiments, due to the high visible-spectrum illumination in the operating room, the exposure time for the RGB pixels is typically set to 0.1 ms or lower to ensure that non-saturated, high-contrast color images are recorded. Since the NIRF emission is much weaker than the visible light reflected from tissue, the exposure time for the NIR pixels is set to 40 ms to ensure imaging rates of 25 frames/s and acquisition of high-contrast NIR images. This contrasts with current state-of-the-art pixelated NIRF systems that either utilize polymer-based or Fabry–Perot absorptive pixelated spectral filters with low optical density coupled with CMOS sensors (see Tables 1 and 3), allowing only a single exposure time for all pixels, or utilize a multicamera approach, which lacks co-registration accuracy between multimodal images. The multi-exposure capability of our CMOS imager, coupled with the high optical density and quantum efficiency of the NIR pixels, enables detection of 100 pM concentrations of indocyanine green (ICG) under 60 kLux surgical light illumination, a thousand-fold improvement over current state-of-the-art single-exposure pixelated sensors [Fig. 2(e)].
D. Multi-Exposure Imaging Under Surgical Light Illumination
We demonstrate the preclinical relevance of the multi-exposure capabilities of our bio-inspired sensor by imaging a 4T1 breast cancer model under 60 kLux surgical light illumination and laser excitation light at 785 nm. The results are compared with a single-exposure pixelated CMOS camera (Fig. 3). Using the tumor-specific NIRF marker LS301 (a cypate-based contrast agent that typically accumulates in the periphery of tumors), we obtained high target-to-background contrast images due to the tissue’s low auto-fluorescence, scattering, and absorption in the 700 to 950 nm spectral bands. The images in Figs. 3(a) and 3(b) were obtained with a single-exposure CMOS camera with exposure times of 0.1 and 40 ms, respectively. When the animal was imaged with an exposure time of 0.1 ms, the color image was well illuminated, while the NIR image had very low contrast. The animal was then imaged with 40 ms exposure time, resulting in a well-illuminated NIR image but a saturated color image. This is due to the large difference between the visible and NIR photon flux in the operating room. Utilizing a single exposure time in a pixelated camera enables only one of the two imaging modalities to have satisfactory contrast and high signal-to-noise ratio, rendering this technology incompatible with the demands of intraoperative imaging applications.
Figure 3(c) presents data collected with our bio-inspired imaging sensor. The exposure times for the color and NIR pixels were set to provide optimal contrast in both color and NIR channels: 0.1 ms for the color pixels and 40 ms for the NIR pixels. The combined images contain high signal-to-noise ratios and non-saturated information from both imaging modalities. Hence, the operator can clearly identify the anatomical features of the patient while accurately determining the location of the tumor as tagged by the molecular probe.
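Combining the two modalities for display amounts to compositing the thresholded NIRF channel onto the color image as false color. A minimal sketch, assuming a simple alpha-blended green overlay (the clinically used display pipeline may differ):

```python
import numpy as np

def nirf_overlay(rgb, nir, threshold, alpha=0.6):
    """Overlay thresholded NIRF signal onto a color image as green
    false color (illustrative compositing scheme).
    rgb: HxWx3 floats in [0, 1]; nir: HxW floats in [0, 1]."""
    out = rgb.astype(float).copy()
    mask = nir > threshold                 # pixels with NIRF above threshold
    green = np.array([0.0, 1.0, 0.0])
    out[mask] = (1.0 - alpha) * out[mask] + alpha * green
    return out

# Usage: black scene, one fluorescent pixel
rgb = np.zeros((2, 2, 3))
nir = np.array([[0.0, 1.0], [0.0, 0.0]])
composite = nirf_overlay(rgb, nir, threshold=0.5)
```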
E. Multispectral Co-Registration Accuracy
Co-registration accuracy between color and NIRF images is one of the most important attributes for an instrument to be clinically relevant. However, state-of-the-art NIRF instrumentation comprising a beam splitter and dichroic mirrors suffers from temperature-dependent co-registration inaccuracy due to thermal expansion and thermal shifts of individual optical components. These FDA-approved instruments are rated to function between 10°C and 35°C, though they fail to maintain co-registration accuracy in this range. In contrast, our bio-inspired sensor monolithically integrates filtering and imaging elements on the same substrate and is inherently immune to temperature-dependent co-registration errors.
We evaluated co-registration accuracy as a function of temperature for both our bio-inspired sensor and a state-of-the-art NIRF imaging system composed of a single lens, a beam splitter, and two imaging sensors (Fig. 4). The sensors were placed 60 cm away from a calibrated checkerboard target to emulate the distance at which the sensor is placed during preclinical and clinical trials. At the starting operating point, the beam-splitter NIRF system achieves subpixel co-registration accuracy using standard calibration methods. However, the disparity between the two images grows as the instrument’s operating temperature increases, leading to large co-registration errors. Moreover, as the instrument is cooled, the trajectory of the co-registration error differs from that observed during heating; hence, placing a temperature sensor on the instrument cannot sufficiently correct for thermal expansion of the individual optical elements. In contrast, in our bio-inspired sensor, the worst-case co-registration error at the sensor’s plane is limited to a single pixel due to the pixelated filter arrangement. Compared to the beam-splitter NIRF system, our bio-inspired sensor exhibits a sevenfold improvement in spatial co-registration accuracy at the imaging plane when the sensors operate at 35°C.
F. Implications on Co-Registration Accuracy in Murine Cancer Model
The implications of the temperature-dependent co-registration error between the NIR and RGB images in state-of-the-art NIRF systems are demonstrated in a murine model in which 4T1 cancer cells are implanted next to a sciatic nerve. Once the tumor reaches 5–7 mm in size (7–10 days post-implantation), the animal is imaged with the tumor-targeted agent LS301. The animal is imaged with a beam-splitter NIRF imager placed inside a thermal chamber, which has a viewing port that allows imaging of the animal without perturbing the temperature of the instrument. The animal is kept on a heated thermal pad to maintain a constant body temperature.
Figure 5(a) is a composite image taken with the NIRF system at 15°C operating temperature. The green false color indicates the NIRF signal from the tumor-targeted agent LS301. The fluorescence signal from the tumor tissue underneath the sciatic nerve is much weaker than the fluorescence signal from the surrounding tumor tissue. After thresholding the fluorescence signal, the location of the sciatic nerve is observed due to the absence of fluorescence signal [Fig. 5(a), arrow]. Since the image sensor is calibrated at 15°C operating temperature, the NIR image (i.e., location of the tumor) is correctly co-registered on the color image (i.e., anatomical features).
Figure 5(b) is another set of images recorded with the NIRF sensor at 32°C. Because of the thermally induced shift in the optical elements of the NIRF instrument, the fluorescence image is shifted with respect to the color image. The NIRF image incorrectly marks the sciatic nerve as cancerous tissue, while the cancerous tissue immediately next to the sciatic nerve has low fluorescence signal. This incorrect labeling of cancerous and nerve tissue can lead to iatrogenic damage to healthy tissue, which might not be visible to surgeons, while leaving behind cancerous tissue in the patient. In contrast, our bio-inspired sensor suffers no thermally induced co-registration error and accurately depicts the location of the tumor and sciatic nerve at both temperatures due to the monolithic integration of pixelated spectral filters and imaging elements.
G. Imaging Spontaneous Tumors Under Surgical Light Illumination
We used our bio-inspired sensor to identify spontaneous tumor development in a transgenic PyMT murine model of breast cancer. All animals developed multifocal tumors throughout the mammary tissues by 5–6 weeks, and some of the small tumors blended in with surrounding healthy tissue due to their color and were difficult to differentiate with the unaided eye. However, because our bio-inspired sensor has high co-registration accuracy and NIRF sensitivity, we could easily locate the tumors, resect them, and ensure that the tumor margins were negative [Figs. 6(a)–6(c)]. When we compared results obtained with our bio-inspired imaging sensor against histology, we found that our sensor together with the tumor-targeted probe LS301 achieved a sensitivity of 80%, a specificity of 75%, and an area under the receiver operating characteristic curve of 73.4% using parametric analysis. In addition, while visible-spectrum imaging captures only surface information, fluorescence imaging in the NIR spectrum enables deep-tissue imaging, which helps identify the location of tumors before surgery [Fig. 6(d)]. Compared to the state-of-the-art, non-real-time, bulky Pearl imaging system, with an area under the curve of 77.9% and a standard error of 6.3%, our bio-inspired sensor provides similar accuracy in real time under surgical light illumination.
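Sensitivity and specificity follow directly from histology-confirmed confusion counts. The counts below are hypothetical, chosen only to reproduce the reported 80%/75% operating point; they are not the study's raw data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts, for illustration only
sensitivity, specificity = sens_spec(tp=8, fn=2, tn=6, fp=2)
```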
H. Clinical Translation of Our Bio-Inspired Technology
The current standard of care for tracking SLNs is to inject the patient with both a visible dye, such as ICG or methylene blue, and a radioactive 99mTc sulfur-colloid tracer. The SLNs are generally first identified by the unaided eye, owing to the coloration of the accumulated dye at the tissue’s surface, followed by a gamma probe check for radioactivity. In a pilot clinical trial, we investigated the utility of our bio-inspired imaging sensor for locating SLNs in human patients with breast cancer using the ICG lymphatic tracer. ICG naturally exhibits a green color due to its absorption spectrum, as well as NIRF at 800 nm. ICG passively accumulates in the SLNs and is cleared through the liver and bile ducts within 24–36 h post injection (Fig. 7).
Our bio-inspired imaging system provided the surgeon with real-time intraoperative imaging of the tissue in color that was enhanced with NIRF information from the ICG marker under surgical light illumination (Fig. 7). The surgeon could identify 100% of the SLNs when using information registered with our bio-inspired imaging system alone. In contrast, the surgeon identified 90% of SLNs when using the green color from ICG with the unaided eye, and 87% when using information from the radioactive tracer.
Notably, during one of the procedures the surgeon identified and resected two uninvolved SLNs, using information from the green color of the ICG probe, which the gamma probe did not detect. With the assistance of the gamma probe, another two involved SLNs that did not exhibit visible green ICG accumulation were identified and resected. The bio-inspired imaging system, however, correctly identified all four SLNs during this operation. We attribute this to the fact that the green color of the ICG dye can be visually identified only at the tissue’s surface, whereas our bio-inspired sensor detects NIRF signals several centimeters deep in the tissue. The gamma probe failed to detect the radioactive tracer in two of the four SLNs because the limited space in the surgical cavity restricted insertion of the relatively large radioactivity detection probe. In contrast, since both surgical and excitation light could clearly illuminate the surgical cavity, the bio-inspired sensor provided accurate visualization of both anatomical features and the location of the ICG dye in the SLNs.
3. MATERIALS AND METHODS
A. Animal Study
Animal study protocols were reviewed and approved by the Animal Studies Committee of Washington University in St. Louis. Female PyMT mice were obtained from the Washington University Medical Center Breeding Core. Mice developed multiple mammary tumors as early as 5–6 weeks of age and were injected with 100 μl of 60 μM LS301, a tumor-targeted NIRF contrast agent, via the lateral tail vein. Images were taken 24 h post injection for best contrast.
During the in vivo study, the bio-inspired multispectral imaging sensor was set up at a 1 m working distance, and the illumination module was placed at a 1 m distance. The animals were imaged under simultaneous surgical light illumination (60 kLux) and laser excitation light at 785 nm. Collimating lenses and diffusers were used to create a uniform circular excitation area with a 15 cm diameter. The RGB pixels’ exposure time was set to 0.1 ms, and the NIR pixels’ exposure time was set to 40 ms to ensure imaging rates of 25 frames/s.
A double-blind protocol was used for tissue specimen collection. Randomization was not used in this study, and all animals were included in the data analysis. During the imaging experiments, animals remained anesthetized through inhalation of isoflurane (2%–3% v/v). The surgeon used fluorescence information detected by our bio-inspired sensor to locate and resect all tumor tissues. After the removal of each tumor, the surgeon resected two additional samples: a tissue sample next to the tumor identified as the tumor margin and a fluorescence-negative muscle tissue. Six to eight tumors per mouse were resected, and a total of 102 samples were collected from all five mice. All harvested tissue samples were also imaged using a LI-COR Pearl small animal imaging system. All tissue samples were then preserved for histology evaluation. Each sample was sliced at 8 μm thickness, stained with hematoxylin and eosin, and examined by a clinical pathologist. The pathologist was blind to the fluorescence results.
For the 4T1 studies, six-week-old Balb/C female mice were obtained from Jackson Laboratory and injected with 4T1 murine cancer cells. In the first study, the 4T1 cells were implanted into either the left or right inguinal mammary pad. In the second study, the 4T1 cells were implanted next to the sciatic nerve. At 5–7 mm tumor size (7–10 days post-implantation), these mice were injected with 100 μl of 60 μM LS301 agent via the lateral tail vein. The animals were imaged 24 h post injection.
B. Human Study
Human study protocols were approved by the Institutional Review Board of Washington University in St. Louis. The human procedure was carried out in accordance with approved guidelines. The inclusion criteria for patients in this study were newly diagnosed clinically node-negative breast cancer, negative nodal basin clinical exam, and at least 18 years of age. The exclusion criteria from this study were contraindication to surgery; receiving any investigational agents; history of allergic reaction to iodine, seafood, or ICG; presence of uncontrolled intercurrent illness; or pregnant or breastfeeding. All patients gave informed consent for this HIPAA-compliant study. The study was registered on the clinicaltrials.gov website (trial ID no. NCT02316795).
The age and body mass index of all patients were recorded. Before the surgical procedure, 99mTc sulfur colloid (834 μCi) and ICG (500 μmol, 1.6 mL) were injected into the patient’s tumor area, followed by massage of the site for approximately 5 min. At 10–15 min post injection, surgeons proceeded with the surgery per the standard of care. Once the surgeon had identified the SLNs using the visible properties of ICG (i.e., green color) and radioactivity using the gamma probe, the surgeon used our bio-inspired imaging system to locate the SLNs. The patients were imaged under simultaneous surgical light illumination (60 kLux) and laser excitation light at 785 nm. The surgeon then proceeded with the resection of the SLNs. The imaging system was set up at a 1 m working distance, and the illumination module was placed at a 1 m distance. The RGB pixels’ exposure time was set to 0.1 ms to ensure non-saturated color images were recorded, and the NIR pixels’ exposure time was set to 40 ms to ensure imaging rates of 25 frames/s. The average imaging time with our bio-inspired sensor was on the order of minutes.
C. Cell Culture
4T1 breast cancer cells were used for in vivo tumor models. This cell line was obtained from American Type Culture Collection in Manassas, Virginia, USA. Mycoplasma Detection Kit from Thermo Fisher Scientific was used to verify negative status for mycoplasma contamination in the cell line. Cells were cultured in Dulbecco’s modified eagle medium from Thermo Fisher Scientific, with 10% fetal bovine serum and antibiotics.
D. Fluorescence Concentration Detection Limits Under Surgical Light Sources
Ten different ICG concentrations in plastic vials were imaged under surgical light illumination (60 kLux) and excited with laser light at 785 nm. Six vials at each concentration, as well as a control vial with deionized water, were imaged with two different pixelated imaging sensors. The first sensor was our bio-inspired imager, which supports two separate exposure times per frame: the RGB pixels’ exposure time was set to 0.1 ms to ensure non-saturated color images were recorded, and the NIR pixels’ exposure time was set to 40 ms to ensure imaging rates of 25 frames/s. The second sensor was a pixelated CMOS imaging sensor with a single exposure time for both RGB and NIR pixels in the array. The exposure time for the second sensor was set to 0.1 ms because longer exposure times would saturate the color image due to the high photon flux from the surgical light source.
A region of interest within each vial was selected to avoid edge artifacts. An average intensity value and standard deviation of the NIR pixels were computed on the region of interest, excluding 5% of the pixels’ outliers. The detection threshold was determined as the average NIR signal plus three standard deviations of the control vial.
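The threshold rule above can be sketched directly; whether the 5% outlier exclusion was symmetric around the distribution is not stated, so the symmetric trim below is our assumption:

```python
import numpy as np

def detection_threshold(control_roi, trim=0.05):
    """Detection threshold = mean + 3*std of the control (water) ROI,
    after excluding a `trim` fraction of pixel outliers (assumed here
    to be split evenly between the low and high tails)."""
    v = np.sort(np.asarray(control_roi, dtype=float).ravel())
    k = int(len(v) * trim / 2)          # per-tail trim count
    if k:
        v = v[k:len(v) - k]
    return float(v.mean() + 3 * v.std())

def is_detected(sample_roi, threshold):
    """A vial counts as detected if its mean NIR signal exceeds threshold."""
    return float(np.mean(sample_roi)) > threshold

# Usage with synthetic ROIs
thr = detection_threshold(np.ones(100))      # mean 1, std 0 -> thr = 1.0
```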
E. Temperature-Dependent Co-Registration Accuracy Measurement
The imaging instruments were individually placed in a custom-built thermal chamber with a transparent viewing port, where the operating temperature was accurately controlled using a proportional-integral-derivative controller. Each imaging instrument resided at 15°C for 24 h to reach thermal equilibrium. The disparity matrix between the NIR and color images for the beam-splitter instrument was computed at 15°C. The temperature of the thermal chamber was then increased in fixed increments, and the instrument was held at each new temperature for 15 min before an image of a calibrated checkerboard target was captured with both the NIR and color sensors. These new images were co-registered using the disparity matrix computed at 15°C, and the co-registration error across the entire image was evaluated. The operating temperature of the instrument was increased to 35°C and then reduced back to 15°C, and the co-registration error was computed at each temperature point.
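The error metric in this procedure can be sketched as the residual misalignment of matched checkerboard corners after applying the 15°C calibration. For simplicity the sketch below models the disparity as a pure translation (a full disparity matrix would also capture rotation and scale); all coordinates are hypothetical:

```python
import numpy as np

def coregistration_error(pts_color, pts_nir, disparity):
    """Mean residual co-registration error (in pixels) after applying a
    translation-only disparity calibrated at the reference temperature.
    pts_*: (N, 2) arrays of matched checkerboard corner coordinates."""
    aligned = pts_nir + disparity
    return float(np.mean(np.linalg.norm(aligned - pts_color, axis=1)))

# Matched corners at the 15 C calibration point (hypothetical, in pixels)
pts_color = np.array([[10.0, 10.0], [50.0, 10.0], [10.0, 50.0]])
pts_nir_15c = pts_color + np.array([2.0, -1.0])        # fixed offset at 15 C
disparity = np.mean(pts_color - pts_nir_15c, axis=0)    # calibration
# After heating, the optics shift by an extra 0.5 px in x:
pts_nir_hot = pts_nir_15c + np.array([0.5, 0.0])
err_hot = coregistration_error(pts_color, pts_nir_hot, disparity)  # 0.5 px
err_cal = coregistration_error(pts_color, pts_nir_15c, disparity)  # ~0 px
```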
F. CMOS Imager with Pixel-Level Multi-Exposure Capabilities
The CMOS imager that serves as the substrate for our NIRF sensor was custom designed and fabricated in a 180 nm CMOS image sensor technology. The imager is composed of an array of 1280 by 720 pixels, programmable scanning registers, and analog readout circuits comprising switched capacitors, amplifiers, bandpass filters, voltage reference circuits, and analog-to-digital converters (Fig. S1 of Supplement 1). An individual pixel comprises a pinned photodiode and four transistors that control the pixel’s access to the readout circuitry and its exposure time.
The digital scanning registers interface with the individual pixels and control the transistors’ gates within each pixel. The scanning registers are designed to be programmable by inserting different digital patterns and altering the clocking sequence via an external field programmable gate array. This enables pixel-level control of the exposure time for individual photodiodes and specialized readout sequence of individual pixels. Hence, individual groups of pixels (color or NIR) can be read out at different times, and the exposure time for both types of pixels is optimized to ensure acquisition of high signal-to-noise (SNR) color and NIR images, respectively.
During a single frame readout, NIR and color pixels can have different exposure times, and the frame rate is limited by the longer of the two integration times. For example, during intraoperative procedures, the light intensity from the surgical light sources reflected from the tissue is much higher than the NIRF signal from the molecular dye. In this scenario, to ensure high-SNR, non-saturated images, the integration time for the color pixels is kept short (0.1 ms in our experiments) and the exposure time for the NIR pixels is set to 40 ms, ensuring imaging rates of 25 frames/s.
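Because both pixel groups integrate within the same frame, the achievable frame rate follows directly from the longest exposure. A minimal sketch (readout overhead is assumed negligible here):

```python
def frame_rate(exposures_ms, readout_ms=0.0):
    """Frame rate when all pixel groups integrate concurrently within
    one frame: limited by the longest exposure plus any readout
    overhead (assumed negligible by default)."""
    return 1000.0 / (max(exposures_ms) + readout_ms)

# 0.1 ms (color) and 40 ms (NIR) exposures in the same frame
fps = frame_rate([0.1, 40.0])   # 25.0 frames/s
```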
The timing sequence for two neighboring pixels with different spectral filters is shown in Fig. S2 of Supplement 1. Both pixels are initially reset, and then they start to collect photons on the photodiodes' intrinsic capacitance. Since the photon flux for the visible-spectrum photons in the operating room is typically much higher than that of the NIRF photons, the photodiode voltage in the color pixels drops faster than in the NIR pixels over time. At the end of the exposure period, the photodiode voltages from the color pixels are sampled first on the column-parallel readout capacitors. A readout control register scans through the column-parallel capacitors and digitizes the analog information after it has been amplified with a programmable-gain amplifier. The same readout sequence is repeated for the NIR pixels at the end of the 40 ms exposure time.
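The voltage-drop behavior described above follows the linear discharge model V(t) = V_reset − I·t/C. The reset voltage, photocurrents, and capacitance below are hypothetical round numbers chosen only to illustrate why a bright color pixel discharges faster than a dim NIR pixel.

```python
def photodiode_voltage(v_reset, photocurrent_fa, t_ms, cap_ff):
    """Linear discharge of the pinned photodiode's intrinsic capacitance:
    V(t) = V_reset - I*t/C. A higher photon flux (larger photocurrent)
    makes the voltage drop faster."""
    dv = (photocurrent_fa * 1e-15) * (t_ms * 1e-3) / (cap_ff * 1e-15)
    return v_reset - dv

# Hypothetical numbers: bright visible reflections vs. a weak NIRF signal
v_color = photodiode_voltage(3.3, 200.0, 5.0, 5.0)   # drops 0.2 V in 5 ms
v_nir = photodiode_voltage(3.3, 10.0, 40.0, 5.0)     # drops 0.08 V in 40 ms
```

Even with an eight-times-longer integration, the NIR pixel accumulates less signal charge than the color pixel, which is why the two groups need independent exposure control.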
G. Fabrication of Pixelated Spectral Filters
The pixelated spectral filters were fabricated via the following set of optimized microfabrication steps:
- 1. The carrier wafer was soaked for 30 min in isopropyl alcohol and rinsed with deionized (DI) water.
- 2. The wafer was coated with 20 nm of chromium, which was used to block stray light between pixels [Fig. S3(a) of Supplement 1].
- 3. SU8 2000 photoresist was spin coated at 500 rpm for 10 s and then at 3000 rpm for 50 s with 500 rpm per second acceleration.
- 4. The sample was baked at 65°C for 1 min and then at 95°C for 2 min on a hot plate [Fig. S3(b)].
- 5. The photoresist was exposed at a 375 nm wavelength for 22 s at a fixed intensity using a Karl Suss mask aligner.
- 6. The sample was post-baked at 65°C for 1 min and then at 95°C for 3 min. The sample was cooled down to 65°C for 1 min to gradually decrease the temperature and minimize stress and cracking on the photoresist. The photoresist was developed for 3 min in an SU-8 developer using an ultrasound bath, and gently rinsed with isopropyl alcohol at the end of the procedure [Fig. S3(c)].
- 7. The exposed chromium was etched in an Oxford inductively coupled plasma reactive ion etching instrument [Fig. S3(d)].
- 8. Omnicoat was spin coated at 4000 rpm and baked at 150°C for 1 min.
- 9. SU8 2000 photoresist was spin coated at 500 rpm for 10 s and then at 3000 rpm for 50 s with 500 rpm per second acceleration.
- 10. The sample was baked at 65°C for 1 min and then at 95°C for 2 min on a hot plate.
- 11. The photoresist was exposed at a 375 nm wavelength for 22 s at a fixed intensity using a Karl Suss mask aligner.
- 12. The sample was post-baked at 65°C for 1 min and then at 95°C for 3 min. The photoresist was developed for 3 min in an SU-8 developer using an ultrasound bath, and gently rinsed with isopropyl alcohol at the end of the procedure.
- 13. The exposed Omnicoat was etched using chlorine in an Oxford inductively coupled plasma reactive ion etching instrument [Fig. S3(e)].
- 14. Alternating layers of silicon dioxide and silicon nitride were deposited across the entire sample using physical vapor deposition. The thicknesses of the dielectric layers were optimized for transmitting NIR light with a high transmission ratio [Fig. S3(f)].
- 15. The sample was immersed in Remover PG in an ultrasound bath for 30 min to lift off the unwanted structures [Fig. S3(g)].
- 16. Steps 2 through 13 were repeated three times to fabricate red, green, and blue pixels [Figs. S3(h) and S3(i)].
- 17. The final sample had all four different types of pixels: NIR and visible spectrum pixels [Fig. S3(j)].
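The spectral behavior targeted in step 14 can be illustrated with the standard characteristic-matrix (transfer-matrix) method for a dielectric stack at normal incidence. The refractive indices, layer count, quarter-wave design wavelength, and substrate index below are generic textbook assumptions, not the optimized thicknesses used for the actual filters.

```python
import numpy as np

def stack_transmittance(wavelength_nm, layers, n_in=1.0, n_out=1.5):
    """Normal-incidence transmittance of a dielectric stack computed with
    the characteristic-matrix method. `layers` is a list of
    (refractive_index, thickness_nm) tuples, listed from the incident side."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        phi = 2 * np.pi * n * d / wavelength_nm      # phase thickness
        M = M @ np.array([[np.cos(phi), 1j * np.sin(phi) / n],
                          [1j * n * np.sin(phi), np.cos(phi)]])
    t = 2 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                    + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

# Hypothetical quarter-wave SiO2 (n=1.45) / Si3N4 (n=2.0) pairs centered at
# 550 nm: the stack reflects green light while passing NIR wavelengths.
qw = lambda n: (n, 550.0 / (4 * n))
stack = [qw(2.0), qw(1.45)] * 5
T_vis = stack_transmittance(550.0, stack)   # suppressed (inside stopband)
T_nir = stack_transmittance(800.0, stack)   # transmitted (outside stopband)
```

Optimizing the individual layer thicknesses, as done in step 14, shifts and flattens the passband so that the NIR pixels receive a high transmission ratio while visible light is rejected.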
H. Statistical Analysis
The sample size in our animal study of PyMT breast cancer was determined to ensure adequate power (at a significance level of 0.05) to detect predicted effects, which were estimated based on either preliminary data or previous experience with similar experiments. The confidence interval for the fluorescence detection accuracy was calculated using the exact method (one-sided). Differences between the ICG concentrations and the control vial were assessed using one-sided Student's t-tests for the calculation of p values. Matlab was used for data analysis.
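The one-sided exact confidence interval mentioned above can be computed with a short stdlib-only sketch, assuming the "exact method" refers to the Clopper-Pearson construction for a binomial proportion; the example counts are hypothetical.

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def exact_lower_bound(k, n, alpha=0.05):
    """One-sided exact (Clopper-Pearson) lower confidence bound for a
    binomial proportion: the p solving P(X >= k | p) = alpha, found by
    bisection since the survival function is monotone in p."""
    lo, hi = 0.0, 1.0
    for _ in range(60):                 # bisect to well below 1e-15
        mid = (lo + hi) / 2
        if binom_sf(k, n, mid) < alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical example: 20 of 20 lesions detected -> 95% lower bound on accuracy
bound = exact_lower_bound(20, 20, alpha=0.05)
```

With all 20 of 20 detections, the bound reduces to solving p²⁰ = 0.05, i.e. p = 0.05^(1/20) ≈ 0.861, so even a perfect score on a small sample yields a modest lower confidence bound.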
We have designed, fabricated, tested, and translated into a clinical setting a bio-inspired multispectral imaging system that provides critical information to health care providers in a space-, time-, and illumination-constrained operating room. Because of its compact size and excellent multispectral sensitivity (see Table 2), this paradigm-shifting sensor can assist physicians without impacting normal surgical workflow. The high co-registration accuracy between NIR and visible-spectrum images and the high NIR imaging sensitivity under surgical light illumination are key advantages over current FDA-approved instruments used in the operating room. The initial clinical results corroborate results from previous studies and indicate the benefits of using fluorescence properties of ICG for mapping SLNs in patients with cancer.
Air Force Office of Scientific Research (AFOSR) (FA9550-12-1-0321); National Institutes of Health (NIH) (NCI R01 CA171651); National Science Foundation (NSF) (1724615, 1740737).
V. G. conceived the sensor idea and oversaw the entire project. M. G. and V. G. designed the imager and designed the animal studies. Optical evaluation of the imager was performed by M. G., C. E., T. Y., R. M., and V. G. Animal studies were performed by M. G., S. M., G. S., W. A., S. A., M. Y. P., and V. G. The surgical light source was designed by N. Z. and R. L. The human study was performed by J. M. The co-registration study was performed by M. G., V. G., and M. Z. The paper was written by V. G., M. Y. P., and M. G. All authors proofread the paper. The authors would like to thank James Hutchinson and Patricia J. Watson for paper editing. The authors declare no competing financial interests.
Data Availability. All data supporting the findings of this study can be located on the Dryad website at http://datadryad.org/resource/doi:10.5061/dryad.q6188pm.
See Supplement 1 for supporting content.
1. D. L. Morton, J. F. Thompson, A. J. Cochran, N. Mozzillo, O. E. Nieweg, D. F. Roses, H. J. Hoekstra, C. P. Karakousis, C. A. Puleo, B. J. Coventry, M. Kashani-Sabet, B. M. Smithers, E. Paul, W. G. Kraybill, J. G. McKinnon, H.-J. Wang, R. Elashoff, and M. B. Faries, “Final trial report of sentinel-node biopsy versus nodal observation in melanoma,” N. Engl. J. Med. 370, 599–609 (2014). [CrossRef]
2. S. A. McLaughlin, “Surgical management of the breast: breast conservation therapy and mastectomy,” Surg. Clin. N. Am. 93, 411–428 (2013). [CrossRef]
3. L. Gao and L. V. Wang, “A review of snapshot multidimensional optical imaging: measuring photon tags in parallel,” Phys. Rep. 616, 1–37 (2016). [CrossRef]
4. S. H. Yun and S. J. J. Kwok, “Light in diagnosis, therapy and surgery,” Nat. Biomed. Eng. 1, 0008 (2017). [CrossRef]
5. G. M. van Dam, G. Themelis, L. M. A. Crane, N. J. Harlaar, R. G. Pleijhuis, W. Kelder, A. Sarantopoulos, J. S. de Jong, H. J. G. Arts, A. G. J. van der Zee, J. Bart, P. S. Low, and V. Ntziachristos, “Intraoperative tumor-specific fluorescence imaging in ovarian cancer by folate receptor-[alpha] targeting: first in-human results,” Nat. Med. 17, 1315–1319 (2011). [CrossRef]
6. A. L. Vahrmeijer, M. Hutteman, J. R. van der Vorst, C. J. H. van de Velde, and J. V. Frangioni, “Image-guided cancer surgery using near-infrared fluorescence,” Nat. Rev. Clin. Oncol. 10, 507–518 (2013). [CrossRef]
7. G. Hong, A. L. Antaris, and H. Dai, “Near-infrared fluorophores for biomedical imaging,” Nat. Biomed. Eng. 1, 0010 (2017). [CrossRef]
8. W. H. Miller and G. D. Bernard, “Butterfly glow,” J. Ultrastruct. Res. 24, 286–294 (1968). [CrossRef]
9. D. G. Stavenga, “Visual adaptation in butterflies,” Nature 254, 435–437 (1975). [CrossRef]
10. Y. Zhao, M. A. Belkin, and A. Alù, “Twisted optical metamaterials for planarized ultrathin broadband circular polarizers,” Nat. Commun. 3, 870 (2012). [CrossRef]
11. Y. M. Song, Y. Xie, V. Malyarchuk, J. Xiao, I. Jung, K.-J. Choi, Z. Liu, H. Park, C. Lu, R.-H. Kim, R. Li, K. B. Crozier, Y. Huang, and J. A. Rogers, “Digital cameras with designs inspired by the arthropod eye,” Nature 497, 95–99 (2013). [CrossRef]
12. H. Liu, Y. Huang, and H. Jiang, “Artificial eye for scotopic vision with bioinspired all-optical photosensitivity enhancer,” Proc. Natl. Acad. Sci. USA 113, 3982–3985 (2016). [CrossRef]
13. C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbruck, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014). [CrossRef]
14. T. York, S. B. Powell, S. Gao, L. Kahan, T. Charanya, D. Saha, N. W. Roberts, T. W. Cronin, J. Marshall, S. Achilefu, S. P. Lake, B. Raman, and V. Gruev, “Bioinspired polarization imaging sensors: from circuits and optics to signal processing algorithms and biomedical applications,” Proc. IEEE 102, 1450–1469 (2014). [CrossRef]
15. M. Garcia, C. Edmiston, R. Marinov, A. Vail, and V. Gruev, “Bio-inspired color-polarization imager for real-time in situ imaging,” Optica 4, 1263–1271 (2017). [CrossRef]
16. S.-C. Liu and T. Delbruck, “Neuromorphic sensory systems,” Curr. Opin. Neurobiol. 20, 288–295 (2010). [CrossRef]
17. B. Wen and K. Boahen, “A silicon cochlea with active coupling,” IEEE Trans. Biomed. Circuits Syst. 3, 444–455 (2009). [CrossRef]
18. G. Indiveri, B. Linares-Barranco, T. Hamilton, A. van Schaik, R. Etienne-Cummings, T. Delbruck, S.-C. Liu, P. Dudek, P. Häfliger, S. Renaud, J. Schemmel, G. Cauwenberghs, J. Arthur, K. Hynna, F. Folowosele, S. Saïghi, T. Serrano-Gotarredona, J. Wijekoon, Y. Wang, and K. Boahen, “Neuromorphic silicon neuron circuits,” Front. Neurosci. 5, 73 (2011). [CrossRef]
19. M. F. Land and D.-E. Nilsson, Animal Eyes (Oxford University, 2012).
20. A. R. Parker, R. C. McPhedran, D. R. McKenzie, L. C. Botten, and N. Nicorovici, “Photonic engineering. Aphrodite’s iridescence,” Nature 409, 36–37 (2001). [CrossRef]
21. P. Vukusic and J. R. Sambles, “Photonic structures in biology,” Nature 424, 852–855 (2003). [CrossRef]
22. R. A. Potyrailo, H. Ghiradella, A. Vertiatchikh, K. Dovidenko, J. R. Cournoyer, and E. Olson, “Morpho butterfly wing scales demonstrate highly selective vapour response,” Nat. Photonics 1, 123–128 (2007). [CrossRef]
23. A. D. Pris, Y. Utturkar, C. Surman, W. G. Morris, A. Vert, S. Zalyubovskiy, T. Deng, H. T. Ghiradella, and R. A. Potyrailo, “Towards high-speed imaging of infrared photons with bio-inspired nanoarchitectures,” Nat. Photonics 6, 195–200 (2012). [CrossRef]
24. L. V. Wang and H.-I. Wu, Biomedical Optics: Principles and Imaging (Wiley, 2009).
25. L. J. Steven, “Optical properties of biological tissues: a review,” Phys. Med. Biol. 58, R37–R61 (2013). [CrossRef]
26. A. V. Dsouza, H. Lin, E. R. Henderson, K. S. Samkoe, and B. W. Pogue, “Review of fluorescence guided surgery systems: identification of key performance capabilities beyond indocyanine green imaging,” J. Biomed. Opt. 21, 080901 (2016). [CrossRef]
27. Z. Chen, N. Zhu, S. Pacheco, X. Wang, and R. Liang, “Single camera imaging system for color and near-infrared fluorescence image guided surgery,” Biomed. Opt. Express 5, 2791–2797 (2014). [CrossRef]
28. S. Mondal, S. Gao, N. Zhu, G. Sudlow, A. Som, W. Akers, R. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015). [CrossRef]
29. J. A. Hanley and B. J. McNeil, “A method of comparing the areas under receiver operating characteristic curves derived from the same cases,” Radiology 148, 839–843 (1983). [CrossRef]
30. M. G. Niebling, R. G. Pleijhuis, E. Bastiaannet, A. H. Brouwers, G. M. van Dam, and H. J. Hoekstra, “A systematic review and meta-analyses of sentinel lymph node identification in breast cancer and melanoma, a plea for tracer mapping,” Eur. J. Surg. Oncol. 42, 466–473 (2016). [CrossRef]