Nature has a large repertoire of animals that take advantage of naturally abundant polarization phenomena. Among them, the mantis shrimp possesses one of the most advanced and elegant visual systems nature has developed, capable of high polarization sensitivity and hyperspectral imaging. Here, we demonstrate that by shifting the design paradigm away from the conventional paths adopted in the imaging and vision sensor fields and instead functionally mimicking the visual system of the mantis shrimp, we have developed a single-chip, low-power, high-resolution color-polarization imaging system. Our bio-inspired imager captures co-registered color and polarization information in real time with high resolution by monolithically integrating nanowire polarization filters with vertically stacked photodetectors. These photodetectors capture three different spectral channels per pixel by exploiting the wavelength-dependent depth absorption of photons. Our bio-inspired imager comprises 1280 by 720 pixels with a dynamic range of 62 dB and a maximum signal-to-noise ratio of 48 dB. The quantum efficiency is above 30% over the entire visible spectrum, while high polarization extinction ratios are achieved on each spectral channel. This technology enables underwater imaging studies of marine species that exploit both color and polarization information, as well as applications in biomedical fields.
© 2017 Optical Society of America
The mantis shrimp visual system has evolved to be arguably one of the most sophisticated sensory systems in the animal kingdom [1,2]. With its two apposition compound eyes, the mantis shrimp perceives the world by sensing 16 different spectral channels, 4 equally spaced linear polarization orientations, and 2 circularly polarized states, making the animal one of the best-adapted predators in shallow waters. This sophisticated and compact visual system combines three fundamental principles within each individual ommatidium: spectral tuning, enabled by crystalline cones [3,4]; multispectral sensitivity, enabled by vertically stacked rhabdomeres [5]; and polarization detection, enabled by structurally organized microvilli [6]. In this paper, we demonstrate that by mimicking the space- and energy-efficient implementation of the mantis shrimp visual system, we have designed a compact, single-chip, low-power color- and polarization-sensitive imager (Fig. 1). Our bio-inspired imager is realized by combining an array of vertically stacked photodetectors capable of discerning three different broadband spectral channels with pixelated nanowire filters for polarization sensitivity. Testing results reveal that this sensor captures co-registered color and polarization information with high accuracy, sensitivity, and resolution and can enable a wide range of applications, including remote sensing, cancer imaging, label-free neural imaging, and the study of new underwater phenomena and marine life behavior [7–9].
Polarization of light is caused by the scattering of light in air or water and by reflection or refraction from objects or living organisms. Polarization contains valuable information about the imaged environment, such as material and tissue properties, surface roughness, structural composition, and three-dimensional shape [11–14]. This information is orthogonal to that captured by the other two fundamental properties of light: intensity and color. Many animals utilize polarization for both sensing and signaling purposes [1,15–18]. Animals have evolved extraordinary visual systems that can detect subtle differences in polarization states, as well as photonic structures along their bodies that enable both color and polarization camouflage [16–19]. Polarization can also serve as a covert communication channel between conspecifics, one that in highly scattering environments is often visible over longer distances than color or intensity.
Capturing both color and polarization properties of light has been of great interest for many biomedical applications. Among its many applications, polarization information has been used to study tissue biomechanics, detect early cancer formation, and record label-free neural activity [7,9,21,22]. One of the key challenges in these and other applications is the correct and temperature-invariant superposition of the color and polarization information. For example, to utilize polarization in distinguishing cancerous from healthy tissue, polarization information must be accurately superimposed on the correct anatomical features captured by color cameras. Unfortunately, most state-of-the-art multispectral and polarization-sensitive imaging devices suffer from temperature-dependent co-registration errors, on the order of tens of pixels, due to the use of multiple optical elements, such as beam splitters, relay lenses, and polarization filters [23–25].
Today’s state-of-the-art color-polarization imaging sensory technology is realized either by computational approaches, such as computed tomography imaging spectrometry, Fourier transform hyperspectral spectrometry, and compressive sensing, or by direct measurements through the combination of traditional panchromatic imaging arrays with spectral-polarization optics that are modulated in either time, light amplitude, or focal plane [26–32]. These technologies are based on advances in conventional signal processing, 2D planar imaging arrays, and optics, and have yielded complex, bulky, and expensive systems with intolerably low polarization performance and limited translation to field experiments for the study of naturally occurring phenomena. For example, although ocean water covers of Earth’s surface and is home to more than of all animal life, we have very little knowledge of this hidden world; less than 10% of marine animal species are catalogued, due in part to shortcomings of the dominant imaging technology—the digital color camera, which is an incomplete imaging sensor. The visual systems of various animals have provided the blueprint for designing artificial vision imagers whose performance exceeds that of current state-of-the-art imaging technology [7,34–40]. In a similar fashion, our bio-inspired color-polarization imaging sensor, with its low-power, compact design and high sensitivity, enables a glimpse of the underwater world where animals actively exploit color and polarization information for both camouflage and covert signaling.
A block diagram of our bio-inspired imaging sensor and its functional similarities to the mantis shrimp visual system is shown in Fig. 1. The mantis shrimp’s compound eye is divided into three morphological parts: two hemispheres and a midband section. The rhabdoms in the peripheral hemispheres are sensitive to two orthogonal orientations of linearly polarized light by alternating stacks of bidirectional microvilli. The rhabdoms are rotated 45° between hemispheres, giving the mantis shrimp the capability to fully reconstruct partially linearly polarized light. The midband section in the mantis shrimp’s eye is where most of the spectral discrimination takes place. Unlike traditional color sensors in which color is sensed over four spatially distributed pixels, the mantis shrimp’s visual system is capable of sensing multiple spectra within a single pixel location or ommatidium. This is accomplished by vertically stacked photosensitive cells and spectral filters in a single ommatidium. As light enters the ommatidium, its spectrum is initially filtered by the crystalline cone on top of the rhabdom, followed by wavelength-dependent absorption in the vertically stacked rhabdomeres. Within the ommatidium, deeper and larger photosensitive cells are most sensitive to longer wavelengths, while the shallower and shorter photosensitive cells are sensitive predominantly to shorter wavelengths. Neighboring ommatidia are separated by black pigments, which absorb scattered light and prevent optical cross-talk between neighbors.
Our bio-inspired imaging sensor mimics the visual system of the mantis shrimp by utilizing pixelated linear polarization filters deposited on an array of silicon-based vertically stacked photodetectors. Similar to its biological counterpart, silicon’s absorption coefficient is wavelength dependent and monotonically decreases from blue to red wavelengths. The absorption coefficient for red wavelengths (650 nm) is about 30-fold lower than that for blue wavelengths (400 nm), leading to 99% of blue and red photons being absorbed within the first and , respectively. The initial concept of color imaging with vertically stacked photodetectors was proposed and patented by Kodak in the early 1980s. However, it took more than a decade to overcome the technological challenges of fabricating vertically stacked photodetectors in CMOS technology; this was first achieved by researchers at Foveon. Today’s advanced CMOS processes allow for implementation of vertically stacked photodiodes, although their spectral responsivity is poor due to an unoptimized fabrication process for the photodiodes [45–47].
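The wavelength-dependent absorption depth described above follows the Beer–Lambert law. The sketch below illustrates the relationship; the absorption coefficients are rough, illustrative figures for silicon, not values taken from this work.

```python
import numpy as np

# Illustrative absorption coefficients for silicon (1/um); rough
# textbook-order-of-magnitude values, not figures from the paper.
alpha = {"blue_400nm": 10.0, "green_550nm": 0.8, "red_650nm": 0.3}

def absorbed_fraction(alpha_per_um, depth_um):
    """Fraction of photons absorbed within depth_um (Beer-Lambert law)."""
    return 1.0 - np.exp(-alpha_per_um * depth_um)

def depth_for_fraction(alpha_per_um, fraction=0.99):
    """Depth at which the given fraction of photons has been absorbed."""
    return -np.log(1.0 - fraction) / alpha_per_um
```

Because the 99%-absorption depth scales as 1/alpha, the roughly 30-fold difference in absorption coefficient between blue and red translates directly into a roughly 30-fold difference in collection depth, which is what the stacked junction depths exploit.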
Our bio-inspired imager is fabricated in Foveon’s 180 nm vertically stacked process. Using epitaxial growth of three separate layers on a positively doped silicon wafer and alternating positive and negative doping of six junctions, three broadband vertically stacked photodiodes are realized within each individual pixel (see Methods). The thicknesses of the doped junctions are optimized for peak quantum efficiencies in the blue, green, and red spectra for the three vertically stacked photodiodes. Figure 2 depicts the cross-sectional profile of the pixel. The top photodiode is beneath the silicon surface, the middle photodiode extends up to , and the bottom photodiode extends up to .
The full bio-inspired sensor comprises 1280 by 720 pixels, where each pixel contains three vertically stacked photodiodes (see Methods). The pitch of each pixel is 7.8 μm, with 48% fill factor and well capacity of . Negatively doped isolation regions are implanted between neighboring pixels to minimize electrical cross talk. These regions, resembling the black pigments between ommatidia, prevent photon-generated electron–hole pairs in one pixel from diffusing to neighboring pixels and corrupting information stored in these pixels. The integrated photodiode voltages from the three photodiodes are accessed simultaneously in a column-parallel fashion and stored in a bank of capacitors placed at the periphery of the imaging array (see Methods).
When designing the imaging sensor’s pixel, tradeoffs between pixel pitch, number of transistors per pixel, fill factor, and readout noise performance were carefully evaluated. Today’s state-of-the-art imagers utilize four transistors per pixel with a pinned photodiode to achieve sub-electron readout noise by utilizing correlated double sampling (CDS) techniques [48,49]. However, the complex layout required to implement pinned photodiodes, floating diffusion nodes, and individual transfer transistors for each of the three vertically stacked photodiodes would result in a pixel with prohibitively large pitch and small fill factor. Therefore, three transistors per individual photodiode are implemented to reduce the pixel pitch of this imager (Fig. 2). Each pixel contains a total of nine transistors; the gates of the three reset transistors are tied together, as are those of the three access transistors, to reduce the number of metal lines utilized within each pixel.
Difference double sampling (DDS) is implemented on chip by first sampling the integrated photodiode charges, followed by sampling of the reset value, on two banks of column-parallel capacitors, respectively. The difference between the two banks of capacitors is computed as individual elements are sequentially read out via a programmable gain differential amplifier. The final output is digitized by a 14-bit analog-to-digital converter. The dynamic range of the imager is 62 dB, with a root-mean-square readout noise of and power consumption of .
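The subtraction at the heart of DDS can be sketched as follows; the offsets and voltages are hypothetical numbers chosen only to illustrate how a per-pixel source-follower threshold mismatch cancels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized 4x4 pixel patch. Each pixel's source follower adds a fixed,
# unknown threshold offset to everything it buffers (values hypothetical).
offsets = rng.normal(0.0, 0.05, size=(4, 4))   # per-pixel fixed offsets (V)
true_signal = np.full((4, 4), 0.8)             # integrated photodiode voltage (V)

signal_sample = true_signal + offsets          # first sample: signal + offset
reset_sample = 0.0 + offsets                   # second sample: reset level + offset

dds_output = signal_sample - reset_sample      # per-pixel offsets cancel exactly
```

The raw samples vary from pixel to pixel, but the difference is spatially uniform, which is why DDS suppresses this component of fixed-pattern noise. The cost, noted below, is that two uncorrelated samples double the temporal readout noise power.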
The DDS operation reduces the voltage threshold variations between the individual pixels’ source followers and improves the spatial uniformity across the image. However, the DDS operation doubles the readout noise ( in our implementation) and limits operation of the sensor in low-light settings. Therefore, small pixel pitch is achieved at the expense of higher readout noise. As this imaging technology matures and feature sizes of transistors continue to decrease with advanced fabrication techniques, four transistors per pixel together with pinned photodiodes and the CDS technique will become feasible and will reduce the readout noise in imaging sensors with vertically stacked photodiodes.
Figure 3(b) shows that the measured peak quantum efficiency (QE) for the top, middle, and bottom layers of photodiodes are 33.27% at 430 nm, 25.46% at 550 nm, and 26.23% at 620 nm, respectively. These three photodiode layers represent the three-color channels, named by their peak QE as blue, green, and red channels. The aggregated response of the color channels yields a maximum QE of 63.46% at 570 nm.
Polarization sensitivity is added to the imager with vertically stacked photodiodes by spatially modulating two 45°-shifted pairs of orthogonal pixelated polarization filters arranged in a checkerboard pattern that repeats across the imaging array. The 2 by 2 pattern has four pixelated polarization filters oriented nominally at 0°, 45°, 90°, and 135°. Figure 3 shows the combined polarization filters and photodiode raw response to Malus’s law, fixed-pattern noise histograms, QE plots for the color channels, and scanning electron micrographs of the filter features and orientations. The polarization filters are realized by depositing aluminum nanowires after CMOS fabrication using optimized interference lithography and reactive ion etching (see Methods). The aluminum nanowires are 75 nm wide and 250 nm high [Fig. 3(c)] and have a 50% duty cycle. This high aspect ratio of the aluminum nanowires is crucial for achieving high diattenuation ratios.
The plot in Fig. 3(a) shows that the pixels in the imaging system exhibit a sinusoidal response in accordance with Malus’s law. More specifically, the means and standard deviations of the filters’ orientations across the imaging array are , , , and , with diattenuation ratios of , which correspond to extinction ratios of . These high extinction ratios are comparable to those of state-of-the-art panchromatic or monochromatic polarimeters [29,32,50,51], which range between and . Furthermore, it has been shown that extinction ratios as low as 3 are sufficient for reliable polarimetry, although extinction ratios above 10 are preferred for accurate polarization reconstruction. Figure 4(d) shows that the high extinction ratio is maintained across the visible spectrum of interest for each color channel. The normalized fixed-pattern noises [Fig. 3(a) histograms] for the 0°, 45°, 90°, and 135° filters with signals close to the full dynamic range of the pixels are 1.90%, 2.10%, 2.11%, and 1.80%, respectively. State-of-the-art polarization imaging sensors have fixed-pattern noise of 5% or higher [29,32,51], mainly due to large variations in the fabricated pixelated polarization filters. Our highly optimized nanofabrication process for constructing pixelated polarization filters, coupled with the DDS circuitry, enables the lowest fixed-pattern noise reported to date for pixelated polarization cameras.
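The extinction and diattenuation ratios quoted above follow directly from the maximum and minimum of the measured Malus-law response. A minimal sketch, with synthetic transmission values rather than the sensor's actual measurements:

```python
import numpy as np

def malus_response(theta_deg, t_max, t_min, phi_deg):
    """Response of an imperfect polarizer to fully polarized light
    rotated to theta_deg, for a filter oriented at phi_deg."""
    theta = np.radians(theta_deg - phi_deg)
    return t_min + (t_max - t_min) * np.cos(theta) ** 2

def extinction_and_diattenuation(response):
    """Extinction ratio Tmax/Tmin and diattenuation (Tmax-Tmin)/(Tmax+Tmin)."""
    t_max, t_min = response.max(), response.min()
    return t_max / t_min, (t_max - t_min) / (t_max + t_min)

# Synthetic sweep of the input AoP for one hypothetical 45-degree filter.
angles = np.arange(0.0, 180.0, 1.0)
resp = malus_response(angles, t_max=1.0, t_min=0.02, phi_deg=45.0)
er, d = extinction_and_diattenuation(resp)
```

With these synthetic numbers the extinction ratio is 50 and the diattenuation about 0.96; in practice the sweep is fit per pixel, which is also how the per-filter orientation means and standard deviations are obtained.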
Two figures of merit are typically used to describe the polarization responsivity and sensitivity of polarization-sensitive imaging sensors: the degree of linear polarization (DoLP) and the angle of polarization (AoP). These metrics can be derived from the Stokes parameters, which themselves can be calculated from the filter-modulated light-intensity measurements. The optoelectronic characteristics of an imaging system, such as extinction ratio, QE, and spatial noise, ultimately determine how accurately and precisely the system can measure these polarization figures of merit across its imaging array.
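These relations can be made concrete for a 2 by 2 super-pixel of ideal 0°, 45°, 90°, and 135° filters. This is the textbook reconstruction, shown as a minimal sketch; the actual system applies the calibrated analysis matrices described in Methods rather than these ideal formulas.

```python
import numpy as np

def stokes_from_superpixel(i0, i45, i90, i135):
    """Linear Stokes parameters from a 2x2 super-pixel of ideal filters."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0/90 degree preference
    s2 = i45 - i135                      # 45/135 degree preference
    return s0, s1, s2

def dolp_aop(s0, s1, s2):
    """Degree and angle of linear polarization from the Stokes parameters."""
    dolp = np.hypot(s1, s2) / s0
    aop_deg = 0.5 * np.degrees(np.arctan2(s2, s1))
    return dolp, aop_deg
```

For fully polarized light at 0°, the four readings (1, 0.5, 0, 0.5) yield DoLP = 1 and AoP = 0°, matching the expected Malus-law intensities.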
Since our polarimeter is a trichromatic imaging system, the polarization metrics are computed for each color channel. To improve polarization sensitivity, a calibration scheme is utilized (see Methods). Figures 4(a) and 4(b) show the AoP and DoLP errors, respectively, when the imaging system is exposed to fully linearly polarized light with the input light’s AoP ranging from 0° to 180°. The DoLP and AoP errors are less than 1.2% and 0.18%, respectively, across the whole AoP domain for all color channels. Similarly, Fig. 4(c) shows the DoLP error when the imaging system is exposed to partially polarized light, with the input light DoLP ranging from 0 to 1, where 0 represents unpolarized and 1 represents linearly polarized light. As the input DoLP decreases—hence, as the signal-to-noise ratio decreases—the DoLP error increases but remains below 5% even for weak polarization signatures. These results show that our color-polarization imaging system can effectively measure polarization signatures with low errors across the visible spectrum. This high polarization sensitivity of our bio-inspired sensor is essential for applications that require detecting slight changes of polarization states in biological tissue, such as label-free neural recording or cancer detection.
Figure 5 shows a sample image of the co-registered color and polarization information acquired by our bio-inspired imaging system. Figure 5(b) shows the DoLP of the scene in a linear false-color map, where red and blue indicate fully polarized and unpolarized light, respectively. Figure 5(c) shows the AoP of the scene in a circular false-color map, where red and light blue indicate horizontally (0° or 180°) and vertically (90°) polarized light, respectively. The objects included in the scene are a Macbeth color-calibration target, an orange toy race car, a silicon conical ingot, a polarization target composed of six polarization filters offset by 60°, a black plastic horse, and a few beach rocks. The Macbeth color chart shows the color reconstruction accuracy of our imaging system [Fig. 5(a)], yielding a root-mean-square error of less than 4%. The polarization filters show high DoLP and very homogeneous AoP, as expected, due to their intrinsic properties for each of the orientations in the target. The ingot, the toys, and the rocks demonstrate the AoP dependence on shape and the DoLP dependence on both shape and material properties of the imaged targets.
Most neuroethology and sensory ecology studies of polarization in marine animals must be conducted outside the animal’s natural environment: the animal of interest has to be captured and brought into a fish tank in a laboratory setting. The lack of a color-polarization camera that yields meaningful real-time, high-resolution data and is sufficiently compact to be integrated with an underwater system prohibits the study of polarization-sensitive marine animals in their natural habitat. Here, we show that our color-polarization imaging system is sufficiently compact, robust, and low power to be placed inside an underwater camera housing to capture high-frame-rate color-polarization videos of marine life. Figure 6 depicts—for the first time reported in the literature—still frames captured from video of four marine animals imaged in their natural habitat. These animals exhibit polarization signatures along their bodies, and in some cases both color and polarization information is actively controlled by the animal.
Depicted are the color and polarization signatures on the antenna scales of the mantis shrimp Odontodactylus scyllarus [Fig. 6(a)], the blue maxillipeds of the mantis shrimp Haptosquilla trispinosa [Fig. 6(b)], the polarized stripes of the cuttlefish Sepia latimanus in two distinct color states [Figs. 6(c) and 6(d)], and the polarized stripes of the squid Sepioteuthis lessoniana [Fig. 6(e)] (see also Visualization 1). Observations during experiments in a laboratory setting indicate that the polarization signatures displayed by marine animals are wavelength and orientation dependent [53–55]. This phenomenon is not detectable by panchromatic or monochromatic polarimeters used in field experiments. We recorded wavelength-dependent polarization signatures from several marine animals in their natural habitat with our bio-inspired sensor. The difference between the DoLPs of the blue and red channels is for the H. trispinosa and for the S. latimanus, and these real-time measurements agree with data taken in the laboratory setting (see Visualization 2). Our bio-inspired sensor can be used in the natural habitat, which will enable future studies of the hidden color-polarization marine world.
Here we have shown that by mimicking the mantis shrimp visual system, we have designed a compact, low-power, and highly sensitive imaging system capable of capturing co-registered color and polarization information in real time. Because of its size, low power, and ease of use, our bio-inspired sensor enables many versatile and challenging applications, such as underwater imaging, remote sensing, and various biomedical applications, where polarization yields a body of information orthogonal to that of the color channels.
A. Fabrication of Nanowire Polarization Filters
The nanofabrication of the pixelated nanowire polarization filter is achieved using a series of optimized nanofabrication steps:
- 1. A 250-nm-thick aluminum layer is deposited on a substrate via e-beam deposition.
- 2. A 75-nm-thick layer is deposited on top of the aluminum layer via chemical vapor deposition. This layer will act as a hard mask for etching the underlying aluminum.
- 3. A 140-nm-thick S-1805 photoresist layer is spin-coated at 3000 rpm.
- 4. The sample is baked at 120°C for 90 s, followed by 30 s cooling at 70°C to avoid cracks in the photoresist.
- 5. A quartz mask is brought into contact with the sample. The mask is composed of a layer of chromium that blocks light except at selected pixels where the chromium is removed. Every even-numbered pixel in every even-numbered row has the chromium removed and will be exposed to an interference pattern.
- 6. An interference pattern is generated using a 532 nm continuous-wave neodymium-doped yttrium aluminum garnet (Nd:YAG) laser coupled with a frequency doubler. Two laser beams are aligned to intersect at and generate a 140 nm periodic interference pattern on the surface of the exposed photoresist. The photoresist is exposed for 40 s.
- 7. The chromium mask is shifted one pixel over in the horizontal direction and brought in contact with the substrate. Both mask and substrate are rotated 45° with respect to the interference pattern. The photoresist is exposed for 40 s.
- 8. Step 7 is repeated two more times to generate 90° and 135° pixelated nanowire filters.
- 9. After the four consecutive exposures, the photoresist is developed for 60 s while stirring the sample.
- 10. Inductively coupled plasma reactive-ion etching (ICP RIE) is used to first etch the . A standard recipe is used to etch the . At the end of this step, the 75 nm nanowire pattern is transferred from the photoresist to the underlying layer.
- 11. The aluminum layer is etched next using a standard ICP RIE recipe. The layer acts as a hard mask for etching aluminum and enables forming deep trenches, i.e., 250-nm-tall and 75-nm-wide aluminum nanowires. (One cannot achieve high-aspect-ratio aluminum structures using photoresist because the higher etching rate of photoresist compared to aluminum would necessitate forming photoresist structures with an aspect ratio of 100 or higher, which is impossible.) The fabrication steps were performed on individual dies in the cleanroom facility at Washington University.
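The 140 nm fringe period in step 6 follows from the two-beam interference relation Λ = λ/(2 sin θ), where θ is each beam's angle from the sample normal. The intersection angle is elided in the text above; the sketch below solves for it under the assumption that the frequency doubler halves the 532 nm wavelength to 266 nm.

```python
import math

def interference_period(wavelength_nm, half_angle_deg):
    """Two-beam interference fringe period: L = wavelength / (2 sin(theta))."""
    return wavelength_nm / (2.0 * math.sin(math.radians(half_angle_deg)))

def required_half_angle(wavelength_nm, period_nm):
    """Beam angle from the sample normal needed for a given fringe period."""
    return math.degrees(math.asin(wavelength_nm / (2.0 * period_nm)))
```

With λ = 266 nm and Λ = 140 nm, the required angle comes out near 72°; note that the undoubled 532 nm beam could not produce a 140 nm period at all, since λ/(2Λ) would exceed 1.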
B. CMOS Imager with Vertically Stacked Photodiodes
Our bio-inspired imaging sensor is fabricated in a 180 nm feature process with one polysilicon and three metal layers. The substrate for fabricating the sensor is a custom wafer with three epitaxial layers. The imager is fabricated on a positively doped silicon wafer ( boron atoms per ). The first epitaxial layer is grown on top of the silicon wafer with thickness. This epitaxial layer is positively doped with boron atoms per . Next, the negative terminal of the red photodiode is implemented by doping a selective region in the epitaxial layer with phosphorus atoms per at 75 keV energy followed by rapid thermal annealing. A negatively doped isolation region between neighboring pixels is created next to prevent lateral flow of photoinduced electron–hole pairs. This isolation region is formed by using three different doping concentrations at three different energy levels. The first doping is at 1500 keV with phosphorus atoms per , the second at 1000 keV with phosphorus atoms per , and the third at 750 keV with phosphorus atoms per . Rapid thermal annealing is performed next. The impedance of this isolation region is not critical, as it only serves to remove optical cross-talk between neighboring pixels.
The second, positively doped epitaxial layer ( boron atoms per ) is grown on top of the first epitaxial layer with a thickness of . The next step is to form (1) the negative terminal of the green photodiode, (2) connection to the negative terminal of the red photodiode, and (3) connection to the negatively doped isolation region between pixels. The latter two connections are implemented first by using three different doping concentrations at three different energy levels. The first doping is at 1500 keV with phosphorus atoms per , the second at 1000 keV with phosphorus atoms per , and the third at 750 keV with phosphorus atoms per . The last doping step is to form the negative terminal of the green photodiode with phosphorus atoms per at 75 keV energy followed by rapid thermal annealing.
The third, positively doped epitaxial layer ( boron atoms per ) is grown on top of the second epitaxial layer with a thickness of . The next step is to form (1) the negative terminal for the blue photodiode, (2) connections to the negative terminals of the green and red photodiodes, and (3) connection to the negatively doped isolation region between pixels. The latter two connections are implemented first by using two different doping concentrations at two different energy levels. The first doping is at 1000 keV with phosphorus atoms per , and the second is at 750 keV with phosphorus atoms per . The last step is to form the negative terminal of the blue photodiode with phosphorus atoms per at 75 keV energy followed by rapid thermal annealing. The positive terminals of all photodiodes are connected to ground potential. The negative terminals of the individual photodiodes are connected to individual readout circuits in the pixel.
The readout circuit comprises three transistors: reset transistor, source follower, and access transistor. The reset transistor controls the integration (or exposure) time of the photodiode such that when the gate voltage is low the photon-generated electron–hole pairs are integrated on the photodiode intrinsic capacitance. The source follower buffers the integrated photodiode voltage before outputting it on the column bus. The access transistor controls access to the readout bus, such that all pixels in a column share the same readout bus. The gates of the reset and access transistors for the red, green, and blue photodiodes are connected together; hence, all three photodiodes have the same exposure control and are accessed in parallel at the same time. This minimizes both the number of metallic lines per pixel and the pixel pitch. DDS is performed on the readout, eliminating mismatches between pixels’ voltage thresholds.
C. Experimental Optical Setups
A variety of optical setups were employed to obtain the optoelectrical characterization measurements of the color-polarization imaging system. For the response to Malus’s law, fixed-pattern noise histograms, and estimation error for DoLP and AoP as a function of the input light’s AoP, an optical setup that produced broadband, collimated, and fully linearly polarized light was constructed. Three current-controlled and narrowband LED sources at 460 nm, 515 nm, and 625 nm were connected to the input ports of an integrating sphere (819D-SF-4, Newport). The output port of the integrating sphere was aligned to an adjustable iris (SM2D25, Thorlabs), an aspheric collimating lens (ACL7560, Thorlabs), a linear polarizer (20LP-VIS-B, Newport), and the imaging sensor under test, in that order. The linear polarizer was mounted on a nanorotator stage (NR360S, Newport) to modulate the input light’s AoP.
To evaluate the estimation accuracy of the DoLP, a monochromator (Acton SP2150, Princeton Instruments), outputting light at 532 nm, was instead connected to the integrating sphere’s input. A zero-order quarter-wave retarder at 532 nm (20RP34-532, Newport) was added to the optical path between the linear polarizer and the sensor. The retarder was mounted on a nanorotator stage to modulate the input light’s DoLP. A calibrated photodiode (S130C, Thorlabs) driven by a power meter (PM100D, Thorlabs), in conjunction with a rotating linear polarizer, was utilized to calculate the true DoLP. For the diattenuation ratio as a function of the input light’s wavelength measurement, the previously described optical setup was utilized without the quarter-wave retarder, and the monochromator produced narrowband light from 400 nm to 650 nm in steps of 10 nm. For the QE measurement, the previously described optical setup was utilized without any polarization optics. A calibrated photodiode driven by a power meter was utilized to calculate the total photon flux.
D. Data Interpolation
A spatial interpolation algorithm was used to recover full-resolution polarization frames and to minimize instantaneous field-of-view artifacts. To reconstruct polarization information at each pixel location, the full-frame bicubic spline interpolation method was utilized for each color channel. This method has the advantage of yielding a higher modulation transfer function gain and wider validation frequency bandwidth than bilinear-based interpolation methods.
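A per-orientation bicubic-spline upsampling step of this kind can be sketched with SciPy. The function name and the assumption that one filter orientation occupies every other pixel of the 2 by 2 mosaic are illustrative, not the authors' exact implementation.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

def upsample_channel(mosaic, offset_row, offset_col):
    """Bicubic-spline upsampling of one filter orientation of a 2x2 mosaic.

    mosaic: raw frame; pixels of the chosen orientation sit at
    (offset_row::2, offset_col::2). Returns a full-resolution estimate.
    """
    sub = mosaic[offset_row::2, offset_col::2]
    rows = np.arange(offset_row, mosaic.shape[0], 2)
    cols = np.arange(offset_col, mosaic.shape[1], 2)
    spline = RectBivariateSpline(rows, cols, sub, kx=3, ky=3)  # bicubic
    return spline(np.arange(mosaic.shape[0]), np.arange(mosaic.shape[1]))
```

Running this once per orientation (and per color channel) produces four co-sited intensity frames, from which the Stokes parameters can be computed at every pixel rather than per super-pixel.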
E. Color and Polarization Calibration
To color-correct the trichromatic intensity images produced by the imaging system and to closely replicate the color perception of the human eye, a linear regression algorithm was used. A Macbeth color-calibration target was used as the training data to produce a 4 by 3 color-mixing matrix. This matrix can then be used to color-calibrate images taken under similar illumination conditions.
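The 4 by 3 matrix fit can be sketched as an ordinary least-squares problem over the 24 Macbeth patches, where the fourth row of the matrix handles a constant offset. Function names are illustrative.

```python
import numpy as np

def fit_color_matrix(measured_rgb, reference_rgb):
    """Least-squares fit of a 4x3 color-mixing matrix.

    measured_rgb, reference_rgb: (N, 3) arrays of sensor readings and
    target colors (e.g. the 24 Macbeth patches); an appended column of
    ones lets the fit absorb a constant offset per output channel."""
    n = measured_rgb.shape[0]
    a = np.hstack([measured_rgb, np.ones((n, 1))])        # (N, 4)
    m, *_ = np.linalg.lstsq(a, reference_rgb, rcond=None)
    return m                                              # (4, 3)

def apply_color_matrix(rgb, m):
    """Color-correct (N, 3) sensor colors with a fitted 4x3 matrix."""
    return np.hstack([rgb, np.ones((rgb.shape[0], 1))]) @ m
```

With 24 patches and only four unknowns per output channel the system is well overdetermined, so the fit generalizes to other images captured under similar illumination.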
To correct for imperfections in the nanofabrication process of the nanowire polarization filters, a division-of-focal-plane polarimeter calibration method is employed. The nanowires can have spatial variations across the filter array, which can cause fixed-pattern noise and deviations from the filters’ desired transmission ratios and AoP. To minimize the errors caused by these filter artifacts, the calibration scheme characterizes the polarization filters by computing an array of analysis matrices and dark offsets using a linear regression algorithm and Mueller theory. These parameters are used in real time to calibrate newly acquired polarization data.
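This kind of super-pixel calibration can be sketched as fitting, for each super-pixel, an analysis matrix A and dark offset d in the model i = A s + d from training measurements with known Stokes inputs, then recovering s by pseudo-inverse. This is an illustrative sketch of the approach, not the authors' exact implementation.

```python
import numpy as np

def fit_analysis_matrix(training_stokes, training_intensities):
    """Fit a 4x3 analysis matrix A and dark offset d for one super-pixel.

    training_stokes: (N, 3) known input Stokes vectors [s0, s1, s2].
    training_intensities: (N, 4) corresponding raw filter readings,
    modeled as i = A @ s + d."""
    n = training_stokes.shape[0]
    x = np.hstack([training_stokes, np.ones((n, 1))])          # (N, 4)
    coef, *_ = np.linalg.lstsq(x, training_intensities, rcond=None)
    return coef[:3].T, coef[3]      # A: (4, 3), d: (4,)

def reconstruct_stokes(intensities, a, d):
    """Recover [s0, s1, s2] from four raw readings via the pseudo-inverse."""
    return np.linalg.pinv(a) @ (intensities - d)
```

Because each super-pixel gets its own A and d, fabrication-induced deviations in filter orientation and transmission are absorbed into the model instead of propagating into the DoLP and AoP estimates.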
F. Underwater Imaging
To acquire the underwater videos of animals, an underwater imaging setup was utilized. The bio-inspired color-polarization imaging system was enclosed in an underwater housing (Bluefin VX2000, Light and Motion). A single-board computer (QM67PC-2715QE, ADL) was placed inside the underwater housing to control the imaging system and record the data to a solid-state drive. A microcontroller board (Teensy 3.2 ARM, PJRC) was programmed to control a Canon EF lens and process user commands from the underwater housing’s integrated control buttons. The system included an external HDMI monitor for the user to view the data in real time. The underwater system was powered by a lithium-ion polymer battery (Li-Ion 18650 14.8 V 6600 mAh, Tenergy). The data for the marine animals were acquired in bays in the proximity of Lizard Island, Queensland, Australia, with the help of the Lizard Island Research Station facilities.
Air Force Office of Scientific Research (AFOSR) (FA9550-12-1-0321); National Science Foundation (NSF) (1724615, 1740737).
The authors thank James Hutchinson and Patricia J. Watson for manuscript editing and Michael Bok for the stomatopod photographs. The authors thank Sam Powell for his help in designing the underwater imaging system. V. G. conceived the idea and directed the project. M. G. and V. G. designed the instruments. M. G., C. E., R. M., and V. G. designed and fabricated the polarization filters. M. G. designed the image sensor and performed optoelectronic testing of the sensor. M. G. developed real-time data acquisition and image processing. M. G., A. V., and V. G. designed the underwater camera and performed underwater imaging. M. G. and V. G. wrote the manuscript with contributions from all authors. The authors declare no competing financial interests.
The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.
1. H. H. Thoen, M. J. How, T.-H. Chiou, and J. Marshall, “A different form of color vision in mantis shrimp,” Science 343, 411–413 (2014). [CrossRef]
2. J. Marshall and J. Oberwinkler, “Ultraviolet vision: the colourful world of the mantis shrimp,” Nature 401, 873–874 (1999). [CrossRef]
3. M. J. Bok, M. L. Porter, A. R. Place, and T. W. Cronin, “Biological sunscreens tune polychromatic ultraviolet vision in mantis shrimp,” Curr. Biol. 24, 1636–1642 (2014). [CrossRef]
4. M. J. Bok, M. L. Porter, and T. W. Cronin, “Ultraviolet filters in stomatopod crustaceans: diversity, ecology and evolution,” J. Exp. Biol. 218, 2055–2066 (2015). [CrossRef]
5. T. W. Cronin and N. J. Marshall, “A retina with at least ten spectral types of photoreceptors in a mantis shrimp,” Nature 339, 137–140 (1989). [CrossRef]
6. I. M. Daly, M. J. How, J. C. Partridge, S. E. Temple, N. J. Marshall, T. W. Cronin, and N. W. Roberts, “Dynamic polarization vision in mantis shrimps,” Nat. Commun. 7, 12140 (2016). [CrossRef]
7. T. York, S. B. Powell, S. Gao, L. Kahan, T. Charanya, D. Saha, N. W. Roberts, T. W. Cronin, J. Marshall, S. Achilefu, S. P. Lake, B. Raman, and V. Gruev, “Bioinspired polarization imaging sensors: from circuits and optics to signal processing algorithms and biomedical applications,” Proc. IEEE 102, 1450–1469 (2014). [CrossRef]
8. R. Marinov, N. Cui, M. Garcia, S. B. Powell, and V. Gruev, “A 4-megapixel cooled CCD division of focal plane polarimeter for celestial imaging,” IEEE Sens. J. 17, 2725–2733 (2017). [CrossRef]
9. T. Charanya, T. York, S. Bloch, G. Sudlow, K. Liang, M. Garcia, W. J. Akers, D. Rubin, V. Gruev, and S. Achilefu, “Trimodal color-fluorescence-polarization endoscopy aided by a tumor selective molecular probe accurately detects flat lesions in colitis-associated cancer,” J. Biomed. Opt. 19, 126002 (2014). [CrossRef]
10. D. H. Goldstein, Polarized Light (CRC Press, 2016).
11. N. M. Garcia, I. de Erausquin, C. Edmiston, and V. Gruev, “Surface normal reconstruction using circularly polarized light,” Opt. Express 23, 14391–14406 (2015). [CrossRef]
12. S. Tominaga and A. Kimachi, “Polarization imaging for material classification,” Opt. Eng. 47, 123201 (2008). [CrossRef]
13. E. Salomatina-Motts, V. Neel, and A. Yaroslavskaya, “Multimodal polarization system for imaging skin cancer,” Opt. Spectrosc. 107, 884–890 (2009). [CrossRef]
14. C. P. Huynh, A. Robles-Kelly, and E. Hancock, “Shape and refractive index recovery from single-view polarisation images,” in IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (IEEE, 2010), pp. 1229–1236.
15. T. W. Cronin, S. Johnsen, N. J. Marshall, and E. J. Warrant, Visual Ecology (Princeton University, 2014).
16. G. M. Calabrese, P. C. Brady, V. Gruev, and M. E. Cummings, “Polarization signaling in swordtails alters female mate preference,” Proc. Natl. Acad. Sci. USA 111, 13397–13402 (2014). [CrossRef]
17. P. C. Brady, A. A. Gilerson, G. W. Kattawar, J. M. Sullivan, M. S. Twardowski, H. M. Dierssen, M. Gao, K. Travis, R. I. Etheredge, A. Tonizzo, A. Ibrahim, C. Carrizo, Y. Gu, B. Russell, K. Mislinski, S. Zhao, and M. Cummings, “Open-ocean fish reveal an omnidirectional solution to camouflage in polarized environments,” Science 350, 965–969 (2015). [CrossRef]
18. C.-C. Chiao, J. K. Wickiser, J. J. Allen, B. Genter, and R. T. Hanlon, “Hyperspectral imaging of cuttlefish camouflage indicates good color match in the eyes of fish predators,” Proc. Natl. Acad. Sci. USA 108, 9148–9153 (2011). [CrossRef]
19. R. Hanlon, “Cephalopod dynamic camouflage,” Curr. Biol. 17, R400–R404 (2007). [CrossRef]
20. N. Shashar, P. Rutledge, and T. Cronin, “Polarization vision in cuttlefish in a concealed communication channel?” J. Exp. Biol. 199, 2077–2084 (1996).
21. B. Kunnen, C. Macdonald, A. Doronin, S. Jacques, M. Eccles, and I. Meglinski, “Application of circularly polarized light for non-invasive diagnosis of cancerous tissues and turbid tissue-like scattering media,” J. Biophoton. 8, 317–323 (2015). [CrossRef]
22. T. York, L. Kahan, S. P. Lake, and V. Gruev, “Real-time high-resolution measurement of collagen alignment in dynamically loaded soft tissue,” J. Biomed. Opt. 19, 066011 (2014). [CrossRef]
23. L. Gao and L. V. Wang, “A review of snapshot multidimensional optical imaging: measuring photon tags in parallel,” Phys. Rep. 616, 1–37 (2016). [CrossRef]
24. J. S. Tyo, D. L. Goldstein, D. B. Chenault, and J. A. Shaw, “Review of passive imaging polarimetry for remote sensing applications,” Appl. Opt. 45, 5453–5469 (2006). [CrossRef]
25. S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, and R. Liang, “Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015). [CrossRef]
26. J. F. De Boer and T. E. Milner, “Review of polarization sensitive optical coherence tomography and Stokes vector determination,” J. Biomed. Opt. 7, 359–371 (2002). [CrossRef]
27. J. Li, J. Zhu, and H. Wu, “Compact static Fourier transform imaging spectropolarimeter based on channeled polarimetry,” Opt. Lett. 35, 3784–3786 (2010). [CrossRef]
28. G. Biener, A. Niv, V. Kleiner, and E. Hasman, “Near-field Fourier transform polarimetry by use of a discrete space-variant subwavelength grating,” J. Opt. Soc. Am. A 20, 1940–1948 (2003). [CrossRef]
29. W.-L. Hsu, G. Myhre, K. Balakrishnan, N. Brock, M. Ibn-Elhaj, and S. Pau, “Full-Stokes imaging polarimeter using an array of elliptical polarizer,” Opt. Express 22, 3063–3074 (2014). [CrossRef]
30. T.-H. Tsai, X. Yuan, and D. J. Brady, “Spatial light modulator based color polarization imaging,” Opt. Express 23, 11912–11926 (2015). [CrossRef]
31. C. Fu, H. Arguello, B. M. Sadler, and G. R. Arce, “Compressive spectral polarization imaging by a pixelized polarizer and colored patterned detector,” J. Opt. Soc. Am. A 32, 2178–2188 (2015). [CrossRef]
32. V. Gruev, R. Perkins, and T. York, “CCD polarization imaging sensor with aluminum nanowire optical filters,” Opt. Express 18, 19087–19094 (2010). [CrossRef]
33. C. Mora, D. P. Tittensor, S. Adl, A. G. Simpson, and B. Worm, “How many species are there on Earth and in the ocean?” PLoS Biol. 9, e1001127 (2011). [CrossRef]
34. Y. Zhao, M. A. Belkin, and A. Alù, “Twisted optical metamaterials for planarized ultrathin broadband circular polarizers,” Nat. Commun. 3, 870 (2012). [CrossRef]
35. Y. M. Song, Y. Xie, V. Malyarchuk, J. Xiao, I. Jung, K.-J. Choi, Z. Liu, H. Park, C. Lu, R.-H. Kim, R. Li, K. B. Crozier, Y. Huang, and J. A. Rogers, “Digital cameras with designs inspired by the arthropod eye,” Nature 497, 95–99 (2013). [CrossRef]
36. H. Liu, Y. Huang, and H. Jiang, “Artificial eye for scotopic vision with bioinspired all-optical photosensitivity enhancer,” Proc. Natl. Acad. Sci. USA 113, 3982–3985 (2016). [CrossRef]
37. C. Posch, T. Serrano-Gotarredona, B. Linares-Barranco, and T. Delbruck, “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output,” Proc. IEEE 102, 1470–1484 (2014). [CrossRef]
38. Y.-J. Jen, A. Lakhtakia, C.-W. Yu, C.-F. Lin, M.-J. Lin, S.-H. Wang, and J.-R. Lai, “Biologically inspired achromatic waveplates for visible light,” Nat. Commun. 2, 363 (2011). [CrossRef]
39. G. England, M. Kolle, P. Kim, M. Khan, P. Muñoz, E. Mazur, and J. Aizenberg, “Bioinspired micrograting arrays mimicking the reverse color diffraction elements evolved by the butterfly Pierella luna,” Proc. Natl. Acad. Sci. USA 111, 15630–15634 (2014). [CrossRef]
40. C.-C. Huang, X. Wu, H. Liu, B. Aldalali, J. A. Rogers, and H. Jiang, “Large-field-of-view wide-spectrum artificial reflecting superposition compound eyes,” Small 10, 3050–3057 (2014). [CrossRef]
41. N. Marshall, M. Land, C. King, and T. Cronin, “The compound eyes of mantis shrimps (Crustacea, Hoplocarida, Stomatopoda). I. Compound eye structure: the detection of polarized light,” Philos. Trans. R. Soc. London B 334, 33–56 (1991). [CrossRef]
42. M. A. Green and M. J. Keevers, “Optical properties of intrinsic silicon at 300 K,” Prog. Photovoltaics 3, 189–192 (1995). [CrossRef]
43. B. C. Burkey, R. S. VanHeyningen, R. A. Spaulding, and E. L. Wolf, “Color responsive imaging device employing wavelength dependent semiconductor optical absorption,” U.S. patent 4,613,895 A (September 23, 1986).
44. R. B. Merrill, “Vertical-color-filter detector group with trench isolation,” U.S. patent 6,930,336 B1 (August 16, 2005).
45. R. Berner and T. Delbruck, “Event-based pixel sensitive to changes of color and brightness,” IEEE Trans. Circuits Syst. I 58, 1581–1590 (2011). [CrossRef]
46. J. A. Lenero-Bardallo, D. H. Bryn, and P. Hafliger, “Bio-inspired asynchronous pixel event tricolor vision sensor,” IEEE Trans. Biomed. Circuits Syst. 8, 345–357 (2014). [CrossRef]
47. J. A. Leñero-Bardallo, M. Delgado-Restituto, R. Carmona-Galán, and Á. Rodríguez-Vázquez, “Enhanced sensitivity of CMOS image sensors by stacked diodes,” IEEE Sens. J. 16, 8448–8455 (2016). [CrossRef]
48. M.-W. Seo, T. Wang, S.-W. Jun, T. Akahori, and S. Kawahito, “A 0.44 e-rms read-noise 32 fps 0.5 Mpixel high-sensitivity RG-less-pixel CMOS image sensor using bootstrapping reset,” in IEEE International Solid-State Circuits Conference (ISSCC) (IEEE, 2017), pp. 80–81.
49. J. Ma and E. R. Fossum, “Quanta image sensor jot with sub 0.3 e-rms read noise and photon counting capability,” IEEE Electron Device Lett. 36, 926–928 (2015). [CrossRef]
50. Z. Wang, J. Chu, Q. Wang, and R. Zhang, “Single-layer nanowire polarizer integrated with photodetector and its application for polarization navigation,” IEEE Sens. J. 16, 6579–6585 (2016). [CrossRef]
51. T. York and V. Gruev, “Characterization of a visible spectrum division-of-focal-plane polarimeter,” Appl. Opt. 51, 5392–5400 (2012). [CrossRef]
52. J. S. Tyo and H. Wei, “Optimizing imaging polarimeters constructed with imperfect optics,” Appl. Opt. 45, 5497–5503 (2006). [CrossRef]
53. T.-H. Chiou, L. M. Mäthger, R. T. Hanlon, and T. W. Cronin, “Spectral and spatial properties of polarized light reflections from the arms of squid (Loligo pealeii) and cuttlefish (Sepia officinalis L.),” J. Exp. Biol. 210, 3624–3635 (2007). [CrossRef]
54. T.-H. Chiou, T. W. Cronin, R. L. Caldwell, and J. Marshall, “Biological polarized light reflectors in stomatopod crustaceans,” Proc. SPIE 5888, 380–388 (2005). [CrossRef]
55. T. W. Cronin, N. Shashar, R. L. Caldwell, J. Marshall, A. G. Cheroske, and T.-H. Chiou, “Polarization signals in the marine environment,” Proc. SPIE 5158, 85–92 (2003). [CrossRef]
56. S. Gao and V. Gruev, “Bilinear and bicubic interpolation methods for division of focal plane polarimeters,” Opt. Express 19, 26161–26173 (2011). [CrossRef]
57. S. B. Powell and V. Gruev, “Calibration methods for division-of-focal-plane polarimeters,” Opt. Express 21, 21039–21055 (2013). [CrossRef]