Polarization is one of the three fundamental properties of light, along with color and intensity, yet most vertebrate species, including humans, are blind to this light modality. In contrast, many invertebrates, including insects, spiders, cephalopods, and stomatopods, have evolved to detect polarization information with high-dynamic-range photosensitive cells and to utilize this information in visually guided behavior. In this paper, we present a high-dynamic-range polarization imaging sensor inspired by the visual system of the mantis shrimp. Our bioinspired imager achieves a 140 dB dynamic range and a 61 dB maximum signal-to-noise ratio across pixels equipped with logarithmic photodiodes. In contrast to state-of-the-art active pixel sensors, where photodiodes in individual pixels operate in reverse bias mode and yield a limited dynamic range, our pixel achieves a logarithmic response by operating individual photodiodes in forward bias mode. This novel pixel circuitry is monolithically integrated with pixelated polarization filters composed of 250-nm-tall × 75-nm-wide aluminum nanowires to enable snapshot polarization imaging at 30 frames per second. This sensor can enable many automotive and remote sensing applications, where high-dynamic-range imaging augmented with polarization information can provide critical information during hazy or rainy conditions.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
Polarization imaging can provide rich information about our surrounding world: the polarization state of light can act like memory foam by “remembering” the intrinsic properties of the media or objects that light has encountered in previous optical interactions. This modality of light encodes information such as three-dimensional (3D) shape, surface roughness, and material or tissue structural composition. Although nature did not favor the human eye with the ability to distinguish the polarization of light, a vast number of animals have evolved to discriminate and utilize it in visually guided behaviors. For example, desert ants utilize the polarization properties of the sky to navigate straight home after a long, random foraging walk. Birds also utilize the polarization properties of the sky to calibrate their magnetic compass, while the polarization patterns on the bodies of the mantis shrimp and the swordtail fish provide conspecific channels of communication [2,3].
Among the various visual systems in the animal kingdom, that of the mantis shrimp has evolved to be one of the most complex, capable of detecting 16 spectral channels and 4 linear and 2 circular polarization channels. Furthermore, individual photoreceptors have logarithmic responses to incident light intensity, which equips these creatures with high-dynamic-range imaging capabilities. With these complex imaging capabilities, it is no surprise that the mantis shrimp is considered one of the top predators in shallow waters. Its exemplary visual system has been the motivation for several bioinspired color and polarization imaging systems [4–6]. Although these bioinspired imaging sensors have enabled many biomedical and remote sensing applications, their imaging dynamic range is limited due to the use of conventional active pixel sensors with reverse biased photodiodes.
A high dynamic range is necessary in various polarization applications due to the nature of polarized optical phenomena, such as specular reflections, celestial imaging, and polarization microscopy. For example, in the automotive industry, high-dynamic-range cameras are desirable for capturing scenes where the illumination can easily vary by several orders of magnitude, such as when exiting a garage or a tunnel [7–9]. Additionally, polarization information can provide key cues for autonomous navigation [10–12]. The addition of an inexpensive high-dynamic-range polarimeter can benefit the automotive industry by providing the automobile computer with polarization information about the road environment during the navigation decision-making process. In a similar fashion, the biomedical arena has strict illumination guidelines, particularly during operative procedures. For a polarimeter to be integrated into the surgical workflow and exploit the benefits of polarization, it must be able to acquire meaningful polarization data frames under highly dynamic lighting conditions.
Current state-of-the-art polarization imaging sensors are realized by combining polarization optics with arrays of polarization-blind photodetectors; depending on the imaging architecture chosen, the light intensity is polarization modulated in the time, amplitude, or focal-plane domain. Among these architectures, bioinspired and division-of-focal-plane polarimeters have proliferated due to their robustness, compactness, and single-chip integration, enabling simultaneous acquisition of all pertinent data planes in a single snapshot [4,6,14–26]. As a result, these state-of-the-art polarimeters have been used in an extensive number of applications, including 3D shape reconstruction [27,28], contrast enhancement in hazy conditions [29,30], material detection [31,32], cancer detection [33–35], identification of stress in ligaments, and underwater geolocalization.
Although polarization sensitivity has been added to a variety of imaging sensors, these polarimeters still lag behind their color cousins, digital color cameras, on non-polarization-related metrics such as frame rate, resolution, noise, and dynamic range. These optoelectronic limitations hamper the efficacy of polarization-based applications and ultimately decelerate the industrial integration of polarization technology. Chief among these shortcomings is the limited instantaneous dynamic range of current state-of-the-art polarimeters.
To address these shortcomings in current state-of-the-art polarization imagers, we have designed, fabricated, and tested a high-dynamic-range polarization imager based on the visual system of the mantis shrimp (Fig. 1). In contrast to state-of-the-art active pixel sensors, where photodiodes in individual pixels operate in reverse bias mode and yield a limited dynamic range, our pixel achieves a logarithmic response by operating the photodiode in forward bias mode. Our imager, fabricated in a 180-nm CMOS technology, achieves a 140 dB dynamic range and a 61 dB signal-to-noise ratio (SNR), and it operates at 30 frames per second (fps). Single-chip polarization measurement is realized by monolithic integration of an array of aluminum nanowire polarization filters with a custom array of logarithmic CMOS photodetectors. This sensor can enable many automotive and remote sensing applications, where high-dynamic-range imaging augmented with polarization information can provide critical information during hazy or rainy conditions.
A block diagram of our logarithmic polarization imaging sensor, a schematic of its logarithmic pixel architecture, and a scanning electron micrograph of a nanowire polarization filter are shown in Fig. 1. The sensor consists of a pixel array with a pitch of 30 μm and is fabricated in a 180 nm CMOS process. Once the imager is fabricated in a traditional semiconductor foundry, polarization filters are monolithically integrated via an optimized nanofabrication procedure at the University of Illinois at Urbana-Champaign. At the end of the integration process, a pattern of pixelated polarization filters offset by 45° is repeated across the imaging array. Individual pixels contain polarization filters composed of 250-nm-tall and 75-nm-wide aluminum nanowires, with a 50% duty cycle (Fig. 1). The monolithic integration of CMOS pixels and aluminum nanowires enables the realization of a single-chip polarization imager capable of capturing the polarization properties of the imaged environment in a single snapshot.
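The repeating filter pattern can be modeled as a 2 × 2 superpixel mosaic. The sketch below is illustrative only: the text specifies that the 45°-offset pattern repeats across the array, but the particular placement in `MOSAIC_DEG` is an assumption, not the chip's documented layout.

```python
import numpy as np

# Illustrative model of the pixelated-filter mosaic: a 2 x 2 superpixel
# whose orientations are offset by 45 degrees. The specific placement is
# an assumption for illustration.
MOSAIC_DEG = np.array([[0, 45],
                       [135, 90]])

def filter_angle(row, col):
    """Nanowire orientation (degrees) over the pixel at (row, col)."""
    return int(MOSAIC_DEG[row % 2, col % 2])

def superpixel_channels(frame, r, c):
    """Gather the four filter channels of the superpixel whose top-left
    pixel sits at even coordinates (r, c)."""
    return {filter_angle(r + i, c + j): frame[r + i, c + j]
            for i in (0, 1) for j in (0, 1)}
```

A demosaicing pipeline would walk such superpixels across the array before interpolating each channel back to full resolution.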
The CMOS pixel is composed of three transistors [1 P-type metal oxide semiconductor (PMOS) and 2 N-type metal oxide semiconductor (NMOS) transistors] and a photodiode [Fig. 1(b)]. The reset transistor T1 switches the photodiode between the exposure and reset modes; the source-follower transistor T2 buffers the photodiode voltage for the output bus; and the select transistor T3 controls the pixel’s readout to provide the photodiode voltage to a column-parallel output bus. Digital scanning registers control the gates of the reset and select transistors to enable individual pixel readouts across the imaging array. There are three types of PN junction in the pixel: the main n+/psub photodiode, the p+/nwell junctions, and the nwell/psub junction (Fig. 2). Since the psub is grounded and the nwell is biased to VDD, the photoinduced electron–hole pairs generated in the space charge regions of the nwell/psub and nwell/p+ junctions are collected by the power rail instead of the photodiode. Thus, no shielding is necessary for the nwell. The potentials of the psub and the nwell are always fixed to GND and VDD, respectively, making the body effect of both PMOS and NMOS transistors negligible.
To achieve high-dynamic-range imaging capabilities, the pixel circuitry is designed to operate the photodiode in forward bias mode, unlike traditional active pixel sensors, which operate it in reverse bias mode; this translates to a major difference in readout architecture. The photodiode voltage is no longer linearly proportional to the photocurrent or photon flux and instead follows a logarithmic response [Fig. 3(a)]. This logarithmic response means that the photocurrent measurement is compressed in the voltage domain as the photon flux increases, allowing our logarithmic pixel to accommodate much larger photon fluxes than a traditional linear active pixel and thus yielding a high-dynamic-range imager. T1 sets the negative node of the photodiode to its reset level during the reset operation. Since the positive node of the photodiode is set to 0 V, the photodiode is biased in the forward mode. When T1 is turned off, the current across the forward biased photodiode is equal to the photon-induced current. Hence, the voltage across the forward biased photodiode is set by the photocurrent, as described by the ideal diode equation. The measured dynamic range of our imager is 140 dB while operating at 30 fps. Figures 3(b) and 3(c) show the noise measurements of our imager; the average noise is around nine digital values, yielding a maximum SNR of 61.2 dB. The noise of this imager architecture has been previously characterized and follows Johnson noise, which is constant over the entire logarithmic operating range, and the reset noise of the pixel is 0.12 mV.
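The logarithmic compression described above can be sketched numerically with the ideal diode model. The thermal voltage is physical, but the ideality factor `n` and saturation current `I_s` below are illustrative stand-ins, not measured device parameters; the point is that seven decades of photocurrent (a 140 dB input range) map into well under a volt of signal swing, with a nearly constant voltage step per decade.

```python
import numpy as np

# Forward-biased photodiode, ideal diode model (illustrative parameters).
V_T = 0.02585   # thermal voltage kT/q at ~300 K, in volts
n = 1.5         # diode ideality factor (assumed)
I_s = 1e-14     # diode saturation current in amperes (assumed)

def photodiode_voltage(i_ph):
    """Forward-bias voltage when photocurrent i_ph flows through the
    diode: V = n * V_T * ln(1 + i_ph / I_s)."""
    return n * V_T * np.log1p(i_ph / I_s)

# Photocurrents spanning seven decades (1 pA .. 10 uA), i.e.
# 20*log10(1e7) = 140 dB of input dynamic range:
i_ph = np.logspace(-12, -5, 8)
v = photodiode_voltage(i_ph)

swing = v[-1] - v[0]   # total output voltage swing
steps = np.diff(v)     # voltage added per decade of photocurrent
```

Each decade adds an almost identical voltage step (≈ n·V_T·ln 10), which is the hallmark of logarithmic high-dynamic-range compression.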
To estimate the first three Stokes vector elements, the light intensity captured by the imaging sensor’s photodiodes is modulated through the four different types of polarization filters; the relationship between intensity, Stokes parameters, and polarization filters is described by Stokes’s intensity formula. This imaging architecture is known as a division-of-focal-plane polarimeter, and just like its Bayer counterpart, the digital color camera, it has the prominent advantages of simultaneously capturing all the polarization states (i.e., no motion blur), compact and robust single-chip integration, and diminished coregistration errors between data planes by avoiding moving or temperature-dependent optics.
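As a concrete sketch of this modulation, assuming ideal filters at the four orientations used here (0°, 45°, 90°, 135°), each channel measures I(θ) = (S0 + S1·cos 2θ + S2·sin 2θ)/2, from which the first three Stokes parameters follow directly:

```python
import numpy as np

# Stokes estimation from one superpixel, assuming ideal filters at
# 0, 45, 90, and 135 degrees (a sketch of Stokes's intensity formula).
def stokes_from_superpixel(i0, i45, i90, i135):
    """Each channel sees I(theta) = (S0 + S1*cos2t + S2*sin2t)/2."""
    s0 = (i0 + i45 + i90 + i135) / 2.0   # total intensity
    s1 = i0 - i90                        # 0 deg vs 90 deg preference
    s2 = i45 - i135                      # 45 deg vs 135 deg preference
    return s0, s1, s2

# Fully polarized horizontal light, S = (1, 1, 0), produces channel
# intensities (1, 0.5, 0, 0.5) and is recovered exactly:
s0, s1, s2 = stokes_from_superpixel(1.0, 0.5, 0.0, 0.5)
```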
A series of optoelectronic measurements was performed on our logarithmic polarimeter to evaluate its polarization sensitivity (see Section 3 and Fig. 4). When characterizing the polarization sensitivity of a polarimeter, two figures of merit are commonly used: (1) the degree of linear polarization (DoLP), which ranges from 0 to 1 and describes the proportion of the captured light that is linearly polarized, and (2) the angle of polarization (AoP), a circular metric ranging from 0° to 180° that describes the orientation of the polarized light. Figure 4(a) shows the uncalibrated sinusoidal response of the nanowire polarization filters to Malus’s law and fixed-pattern noise (FPN) histograms for data points with the AoP of the input light matching the filters’ orientation (i.e., maximum filter transmittance). The imager’s response to Malus’s law demonstrates its polarization sensitivity: the polarization filters closely match their nominal orientations and exhibit high diattenuation ratios, which correspond to high extinction ratios (ERs), with low FPN across the imaging array. The ER was measured over the entire system, which includes the polarization filters and the photodiodes. Crosstalk between pixels, whether optical or electronic in origin, is known to lower the achievable ER because of signal contamination from one pixel to another. Accordingly, the reported imager’s ER is smaller than the ER of the nanowire filters alone because our imager suffers from crosstalk in the readout circuitry. The pixelated polarization filters fabricated on a glass substrate and evaluated under a microscope have a higher measured ER; when the filters are deposited on the imaging array, the ERs drop, which indicates optical and electrical crosstalk between pixels. About 15% of the charge from the neighboring pixel to the left is transferred during readout due to imperfections in the switched-capacitor circuits.
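The two figures of merit above can be computed directly from the linear Stokes parameters; a minimal sketch, using the 0°–180° AoP convention from the text:

```python
import numpy as np

# DoLP and AoP from the linear Stokes parameters (S0, S1, S2).
def dolp(s0, s1, s2):
    """Degree of linear polarization, in [0, 1]."""
    return np.hypot(s1, s2) / s0

def aop_deg(s1, s2):
    """Angle of polarization, a circular quantity in [0, 180) degrees."""
    return np.degrees(0.5 * np.arctan2(s2, s1)) % 180.0

# Fully polarized light at 45 degrees has S = (1, 0, 1):
d = dolp(1.0, 0.0, 1.0)   # -> 1.0
a = aop_deg(0.0, 1.0)     # -> 45.0
```

Note the factor of 1/2 in the AoP: the Stokes parameters live on a double-angle (2θ) circle, so `arctan2` must be halved to recover the physical filter orientation.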
To diminish the effect of crosstalk on the ER, we have arranged our polarizing filters in a pattern offset by 45° in the horizontal direction so that imagers with a high modulation transfer function are not necessary for good polarization reconstruction. These measurements demonstrate that our polarimeter can be used for the most demanding polarization applications, since it has been shown that ERs as low as 3 and 5 can be tolerated in a four-channel polarimeter, with only a 3.5 and 2 dB decrease, respectively, in the SNR of the reconstructed Stokes vector compared with an ideal polarization filter (i.e., infinite ER). To validate this statement, the accuracy of the figures of merit, DoLP and AoP, is shown in Figs. 4(b)–4(d).
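The tolerance of Stokes reconstruction to finite ERs can be sketched with a toy equally-weighted-variance (EWV) calculation in the spirit of the cited imperfect-optics analysis; this is our own illustrative version, not the paper's figures. A filter with extinction ratio ER has diattenuation D = (ER − 1)/(ER + 1), which scales the S1/S2 columns of the analysis matrix and inflates the reconstruction variance:

```python
import numpy as np

# Toy equally-weighted-variance comparison for finite vs. ideal ER.
def analysis_matrix(er, angles_deg=(0, 45, 90, 135)):
    """Row per filter: I = 0.5*(S0 + D*(S1*cos2t + S2*sin2t)),
    with diattenuation D = (ER - 1)/(ER + 1)."""
    d = (er - 1.0) / (er + 1.0)
    t = np.radians(2.0 * np.asarray(angles_deg, dtype=float))
    return 0.5 * np.stack(
        [np.ones_like(t), d * np.cos(t), d * np.sin(t)], axis=1)

def ewv(w):
    """Summed reconstruction variance for unit-variance pixel noise."""
    return np.trace(np.linalg.inv(w.T @ w))

# Variance penalty of ER = 5 relative to (numerically) ideal filters:
penalty_db = 10.0 * np.log10(ewv(analysis_matrix(5.0))
                             / ewv(analysis_matrix(1e12)))
```

For this four-channel geometry the penalty works out to only a few dB of added variance, consistent with the tolerance argued above.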
The imager was exposed to collimated narrowband light whose DoLP and AoP were modulated through rotating polarization filters (see Section 3). The AoP and DoLP errors as a function of the input light’s AoP for fully polarized light are shown in Figs. 4(b) and 4(c), respectively. The errors remain small for both the AoP and the DoLP across the input light’s AoP, with very low standard deviation across the imager. The DoLP error as a function of the input light’s DoLP (i.e., partially polarized light) is shown in Fig. 4(d). The error is less than 2.4% even for almost unpolarized light, where the SNR decreases significantly.
A sample image of the coregistered high-dynamic-range scene intensity and polarization information captured by our logarithmic polarization camera is shown in Fig. 5. The scene includes a polarization target composed of three polarization filters offset by 60°, a conical silicon ingot, a black plastic horse, and a high-power LED flashlight placed behind the polarization targets. Figure 5(a) shows the scene’s intensity image captured by our sensor before linearization (i.e., raw logarithmic data) to demonstrate the detail preserved, in a single print, in both the blacks and the highlights. This scene has a dynamic range of 94.3 dB, produced mostly by the difference in illumination between the black plastic horse and the LED flashlight. Figure 5(b) shows the scene’s DoLP in a linear false-color map, where red and blue areas indicate fully polarized and unpolarized light, respectively. Similarly, Fig. 5(c) shows the scene’s AoP in a circular false-color map, where red and blue areas indicate horizontally (0° or 180°) and vertically (90°) polarized light, respectively. Each polarization filter within the target shows a homogeneous, high DoLP and an AoP whose false color matches the filter’s orientation, as expected from its intrinsic material properties. The silicon ingot, with its high refractive index, shows polarization properties in agreement with its shape [27,28]: the ingot is highly polarized on high-zenith-angle surfaces, and it shows a continuous AoP change around its conical center due to the change in surface azimuth angle. In a similar fashion, the black horse, despite its low intensity values, displays DoLP and AoP signatures appropriate for its shape.
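The exact colormaps of Fig. 5 are not specified in the text, so the sketch below is illustrative: a full hue cycle per 180° for the circular AoP (so 0° and 180° receive the same color, as a circular quantity requires) and a blue-to-red ramp for the linear DoLP.

```python
import colorsys

# Illustrative false-color conventions in the spirit of Fig. 5.
def aop_to_rgb(aop_deg):
    """Map the circular AoP onto the hue wheel: one full hue cycle per
    180 degrees, so the map closes on itself."""
    hue = (aop_deg % 180.0) / 180.0
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

def dolp_to_rgb(dolp):
    """Linear ramp: blue for unpolarized (0), red for fully polarized (1)."""
    return (dolp, 0.0, 1.0 - dolp)
```

A non-circular colormap applied to AoP would put a false seam at the 0°/180° wraparound, which is why the hue-wheel mapping matters for this quantity.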
These results demonstrate the high polarization sensitivity of our compact, low-power, single-chip logarithmic polarization imaging system under high-dynamic-range scenes. Our polarization imager can capture coregistered high-dynamic-range polarization data frames in real time. Optoelectronic characterization reveals that, by operating the photodiode in forward bias mode, our imager achieves an instantaneous dynamic range of 140 dB with an SNR of 61 dB, both substantially greater than the highest figures reported in the literature. A comparison between our technology and state-of-the-art polarimeters is shown in Table 1. The addition of pixelated nanowire polarization filters yields high polarization sensitivity for demanding applications ranging from remote sensing to biomedical imaging.
A. Optoelectronic Characterization
To evaluate the sensor’s response to Malus’s law and its ability to accurately estimate the AoP and DoLP of fully polarized light [Figs. 4(a)–4(c)], an optical setup that produced narrowband fully polarized collimated light was utilized. The optical setup consisted of a modified current-controlled fiber light source (OSL1, Thorlabs) capable of producing high-intensity light, directly connected to a power supply (N5746A, Agilent) that was computer controlled via USB 2.0; an integrating sphere (819D-SF-4, Newport); an adjustable iris (SM2D25, Thorlabs); an aspheric condensing lens (ACL50832U, Thorlabs); a bandpass filter and a narrowband spectral filter (NENIR560B and FB450-10, Thorlabs); a high-precision polymer polarizer (20LP-VIS-B, Newport) mounted inside a motorized nanorotational stage (NR360S, Thorlabs); and our sensor under test. The light source fiber was fed into the integrating sphere, and the sphere’s output was aligned to the optical train: iris, condensing lens, spectral filters, polarizer, and sensor. The light source was swept from 1 A to 7 A in steps of 100 mA, and for each current in the sweep, the polarizer was rotated from 0° to 175° in steps of 5°. The result was a high-dynamic-range data cloud of fully polarized light (i.e., DoLP ≈ 1) with different AoPs.
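From such a polarizer sweep, a pixel's filter orientation and diattenuation can be recovered by fitting Malus's law, I(θ) = a + b·cos(2(θ − φ)), as a linear least-squares problem in (1, cos 2θ, sin 2θ). The sketch below does this on synthetic data; the filter angle, diattenuation, and noise level are illustrative values, not measurements from the paper:

```python
import numpy as np

# Fit Malus's law to a 0..175 degree polarizer sweep (5 degree steps),
# as in the characterization above, on synthetic data.
theta = np.radians(np.arange(0, 180, 5))

# Synthetic pixel: filter at 30 degrees, diattenuation 0.9, small noise
# (illustrative numbers).
rng = np.random.default_rng(0)
i_meas = 1.0 + 0.9 * np.cos(2 * (theta - np.radians(30.0)))
i_meas += rng.normal(0.0, 0.005, theta.size)

# Linear least squares in the basis (1, cos2t, sin2t):
A = np.stack([np.ones_like(theta), np.cos(2 * theta), np.sin(2 * theta)],
             axis=1)
a, c, s = np.linalg.lstsq(A, i_meas, rcond=None)[0]

phi_deg = np.degrees(0.5 * np.arctan2(s, c)) % 180.0  # filter orientation
diatten = np.hypot(c, s) / a                          # diattenuation b/a
```

The same fit, applied per pixel, yields the orientation and diattenuation maps from which FPN histograms like those in Fig. 4(a) can be built.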
To evaluate the sensor’s ability to accurately estimate the DoLP of partially polarized light [Fig. 4(d)], an optical setup that produced narrowband partially polarized collimated light was utilized. The setup was similar to the one utilized for Figs. 4(a)–4(c), but the fiber light source was replaced with narrowband LEDs at 532 nm. The narrowband spectral filter was replaced with a 532 nm narrowband filter (FF01-532/18-25, Semrock), and a quarter-wave retarder (20RP34-532, Newport) mounted inside another motorized nanorotational stage (NR360S, Thorlabs) was added between the polarizer and the sensor to modulate the input light’s DoLP. The LEDs were swept from 0 A to 2 A in steps of 100 mA, and for each current in the sweep, the retarder was rotated from 0° to 90° in steps of 5°. The result was a high-dynamic-range data cloud of partially polarized light (i.e., different DoLPs ranging from 0 to 1). To obtain the ground-truth DoLP and estimate the errors, a polarization analyzer was built utilizing an additional polarizer-and-rotational-stage set. The polarizer in the analyzer was swept from 0° to 160° in steps of 20° for each point in the data cloud to yield an overdetermined system and reduce the error in the ground-truth estimate.
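Why rotating the quarter-wave retarder modulates the DoLP can be seen with a toy Mueller-calculus model (an idealized sketch, not the measured system): for horizontally polarized input, the linear DoLP after an ideal quarter-wave plate at fast-axis angle θ is |cos 2θ|, so sweeping θ from 0° to 45° takes the DoLP from 1 down to 0.

```python
import numpy as np

# Ideal quarter-wave retarder in Mueller calculus.
def qwp_mueller(theta_deg):
    """Mueller matrix of an ideal quarter-wave retarder with its fast
    axis at theta_deg."""
    c = np.cos(np.radians(2 * theta_deg))
    s = np.sin(np.radians(2 * theta_deg))
    return np.array([
        [1, 0,     0,     0],
        [0, c * c, c * s, -s],
        [0, c * s, s * s,  c],
        [0, s,    -c,      0],
    ])

def dolp_after_qwp(theta_deg):
    """Linear DoLP of horizontally polarized input after the retarder."""
    s_in = np.array([1.0, 1.0, 0.0, 0.0])   # fully polarized, 0 degrees
    s0, s1, s2, _ = qwp_mueller(theta_deg) @ s_in
    return np.hypot(s1, s2) / s0
```

The "lost" degree of linear polarization is converted into circular polarization (S3), which a linear-only polarimeter reads as partially polarized light.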
To evaluate the sensor’s dynamic range, SNR, and noise (Fig. 3), an optical setup that produced unpolarized collimated light was utilized. The setup was similar to the one utilized for Figs. 4(a)–4(c), with the exception that the polarizer was removed from the optical train. The result was a high-dynamic-range data cloud of unpolarized light (i.e., DoLP ≈ 0). The noise was calculated by estimating the standard deviation over 100 frames.
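The dynamic-range and SNR figures quoted in this paper follow from the standard definitions, sketched below; the illumination span and noise level used here are illustrative stand-ins, not the measured data set.

```python
import numpy as np

# Standard dynamic-range and temporal-SNR definitions (illustrative data).
def dynamic_range_db(i_max, i_min):
    """Ratio of brightest to darkest measurable signal, in dB."""
    return 20.0 * np.log10(i_max / i_min)

def snr_db(frames):
    """Temporal SNR of one pixel: mean over standard deviation across
    repeated frames, in dB."""
    return 20.0 * np.log10(np.mean(frames) / np.std(frames))

# Seven decades of usable illumination correspond to 140 dB:
dr = dynamic_range_db(1.0e7, 1.0)

# 100 simulated frames of a pixel near a full scale of ~1e4 digital
# values, with ~9 digital values of noise, land near 61 dB of SNR:
rng = np.random.default_rng(1)
frames = 1.0e4 + rng.normal(0.0, 9.0, 100)
snr = snr_db(frames)
```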
B. Data Processing
To compute the polarization metrics, the raw data from the sensor must be linearized: the data were converted from the logarithmic domain to the linear domain by removing the ADC offsets and applying an exponential function. Next, a demosaicing and spatial interpolation algorithm was used to recover the full-resolution polarization frames and minimize instantaneous field-of-view artifacts. A full-frame bicubic spline interpolation was used on the array. Generally, this algorithm outperforms bilinear interpolation methods, giving higher modulation transfer function gains and a wider validation frequency bandwidth. Last, a polarization calibration algorithm was employed to correct for imperfections in the nanofabrication process of the nanowire polarization filters and for pixel response variances. Each polarization nanowire structure is unique, with its own transmission ratios and spatial distribution. The result of this variation is FPN and spatial nonuniformity in the filter diattenuations. The per-pixel “dark” response matrix and the analysis matrices are used in tandem with a linear regression algorithm to minimize any variance or nonuniformity introduced by the uniqueness of the nanostructures. All the calibration computations are based on Mueller matrix theory and are used to calibrate our imaging sensor’s data in real time.
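The first and last steps of this chain can be sketched as follows. This is a minimal toy version: the offset and gain in `linearize` are hypothetical placeholders for the per-sensor ADC constants, the analysis matrix shown is the ideal one rather than a calibrated per-superpixel matrix, and the least-squares fit stands in for the paper's linear-regression calibration.

```python
import numpy as np

# Step 1 (sketch): undo the logarithmic compression.
def linearize(raw_dn, offset_dn, gain_dn_per_decade):
    """Map digital numbers back to relative linear intensity by removing
    the offset and inverting the log response (illustrative constants)."""
    return 10.0 ** ((raw_dn - offset_dn) / gain_dn_per_decade)

# Step 3 (sketch): least-squares Stokes estimate against an analysis
# matrix, with a dark level subtracted.
def calibrated_stokes(intensities, analysis_w, dark):
    i = np.asarray(intensities, dtype=float) - dark
    return np.linalg.lstsq(analysis_w, i, rcond=None)[0]

# Ideal analysis matrix for filters at 0, 45, 90, 135 degrees:
angles = np.radians([0.0, 45.0, 90.0, 135.0])
W = 0.5 * np.stack([np.ones(4), np.cos(2 * angles), np.sin(2 * angles)],
                   axis=1)

# Horizontal fully polarized light, S = (1, 1, 0), round-trips exactly:
meas = W @ np.array([1.0, 1.0, 0.0])
s_hat = calibrated_stokes(meas, W, dark=0.0)
```

In the real pipeline, `W` and the dark level are measured per superpixel during calibration, which is what absorbs the filter-to-filter diattenuation nonuniformity described above.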
In this paper, we presented a bioinspired, high-dynamic-range polarization imaging sensor. The sensor mimics the visual system of the mantis shrimp in two ways: (1) it utilizes four different pixelated polarization filters that are offset by 45° and integrated with photosensitive elements, and (2) the underlying photodiodes operate in forward bias mode, yielding a logarithmic response with respect to incident photons. By monolithically combining these two advances, we have created a snapshot polarimeter that operates at 30 fps and has a dynamic range of 140 dB. The departure from traditional CMOS imaging sensors, which provide a linear relationship between incident photon flux and output digital value by operating individual pixels’ photodiodes in reverse bias mode, has enabled us to create an imager with a logarithmic response and exceptional SNR of 61 dB. Due to its compact size and potential low fabrication cost of less than $10, this image sensor can become an integral part of many automotive or remote sensing applications.
National Science Foundation (NSF) (1724615, 1740737); Air Force Office of Scientific Research (AFOSR) (FA9550-18-1-0278).
The authors would like to thank James Hutchinson and Patricia J. Watson for paper editing. The authors declare no competing financial interests.
1. D. Goldstein, Polarized Light, 3rd ed. (CRC Press, 2010).
2. T. W. Cronin, S. Johnsen, J. Marshall, and E. Warrant, Visual Ecology (Princeton University, 2014).
3. G. M. Calabrese, P. C. Brady, V. Gruev, and M. E. Cummings, “Polarization signaling in swordtails alters female mate preference,” Proc. Natl. Acad. Sci. 111, 13397–13402 (2014). [CrossRef]
4. V. Gruev, R. Perkins, and T. York, “CCD polarization imaging sensor with aluminum nanowire optical filters,” Opt. Express 18, 19087–19094 (2010). [CrossRef]
5. T. York, S. B. Powell, S. Gao, L. Kahan, T. Charanya, D. Saha, N. W. Roberts, T. W. Cronin, J. Marshall, S. Achilefu, and V. Gruev, “Bioinspired polarization imaging sensors: from circuits and optics to signal processing algorithms and biomedical applications,” Proc. IEEE 102, 1450–1469 (2014). [CrossRef]
6. M. Garcia, C. Edmiston, R. Marinov, A. Vail, and V. Gruev, “Bio-inspired color-polarization imager for real-time in situ imaging,” Optica 4, 1263–1271 (2017). [CrossRef]
7. D. Stoppa, A. Simoni, L. Gonzo, M. Gottardi, and G.-F. Dalla Betta, “Novel CMOS image sensor with a 132-dB dynamic range,” IEEE J. Solid-State Circuits 37, 1846–1852 (2002). [CrossRef]
8. M. Schanz, C. Nitta, A. Bußmann, B. J. Hosticka, and R. K. Wertheimer, “A high-dynamic-range CMOS image sensor for automotive applications,” IEEE J. Solid-State Circuits 35, 932–938 (2000). [CrossRef]
9. R. Etienne-Cummings, V. Gruev, and M. A. Ghani, “VLSI implementation of motion centroid localization for autonomous navigation,” in Advances in Neural Information Processing Systems (1999), pp. 685–691.
10. D. Wang, H. Liang, H. Zhu, and S. Zhang, “A bionic camera-based polarization navigation sensor,” Sensors 14, 13006–13023 (2014). [CrossRef]
11. D. Lambrinos, R. Möller, T. Labhart, R. Pfeifer, and R. Wehner, “A mobile robot employing insect strategies for navigation,” Robot. Auton. Syst. 30, 39–64 (2000). [CrossRef]
12. J. Chu, H. Wang, W. Chen, and R. Li, “Application of a novel polarization sensor to mobile robot navigation,” in International Conference on Mechatronics and Automation (IEEE, 2009), pp. 3763–3768.
13. M. Garcia, C. Edmiston, T. York, R. Marinov, S. Mondal, N. Zhu, G. P. Sudlow, W. J. Akers, J. Margenthaler, S. Achilefu, R. Liang, M. A. Zayed, M. Y. Pepino, and V. Gruev, “Bio-inspired imager improves sensitivity in near-infrared fluorescence image-guided surgery,” Optica 5, 413–422 (2018). [CrossRef]
14. J. S. Tyo, D. L. Goldstein, D. B. Chenault, and J. A. Shaw, “Review of passive imaging polarimetry for remote sensing applications,” Appl. Opt. 45, 5453–5469 (2006). [CrossRef]
15. G. Myhre, W.-L. Hsu, A. Peinado, C. LaCasse, N. Brock, R. A. Chipman, and S. Pau, “Liquid crystal polymer full-Stokes division of focal plane polarimeter,” Opt. Express 20, 27393–27409 (2012). [CrossRef]
16. J. Chang, H. He, Y. Wang, Y. Huang, X. Li, C. He, R. Liao, N. Zeng, S. Liu, and H. Ma, “Division of focal plane polarimeter-based 3 × 4 Mueller matrix microscope: a potential tool for quick diagnosis of human carcinoma tissues,” J. Biomed. Opt. 21, 056002 (2016). [CrossRef]
17. M. Zhang, X. Wu, N. Cui, N. Engheta, and J. Van der Spiegel, “Bioinspired focal-plane polarization image sensor design: from application to implementation,” Proc. IEEE 102, 1435–1449 (2014). [CrossRef]
18. X. Zhao, F. Boussaid, A. Bermak, and V. G. Chigrinov, “High-resolution thin “guest-host” micropolarizer arrays for visible imaging polarimetry,” Opt. Express 19, 5565–5573 (2011). [CrossRef]
19. K. Sasagawa, S. Shishido, K. Ando, H. Matsuoka, T. Noda, T. Tokuda, K. Kakiuchi, and J. Ohta, “Image sensor pixel with on-chip high extinction ratio polarizer based on 65-nm standard CMOS technology,” Opt. Express 21, 11132–11140 (2013). [CrossRef]
20. M. Sarkar, D. S. S. Bello, C. van Hoof, and A. J. Theuwissen, “Biologically inspired CMOS image sensor for fast motion and polarization detection,” IEEE Sens. J. 13, 1065–1073 (2013). [CrossRef]
21. Y. Maruyama, T. Terada, T. Yamazaki, Y. Uesaka, M. Nakamura, Y. Matoba, K. Komori, Y. Ohba, S. Arakawa, and Y. Hirasawa, “3.2-MP back-illuminated polarization image sensor with four-directional air-gap wire grid and 2.5-μm pixels,” IEEE Trans. Electron Dev. 65, 2544–2551 (2018). [CrossRef]
22. T. Ohfuchi, M. Sakakura, Y. Yamada, N. Fukuda, T. Takiya, Y. Shimotsuma, and K. Miura, “Polarization imaging camera with a waveplate array fabricated with a femtosecond laser inside silica glass,” Opt. Express 25, 23738–23754 (2017). [CrossRef]
23. G. Han, X. Hu, J. Lian, X. He, L. Zhang, Y. Wang, and F. Dong, “Design and calibration of a novel bio-inspired pixelated polarized light compass,” Sensors 17, 2623 (2017). [CrossRef]
24. X. Zhao, F. Boussaid, A. Bermak, and V. G. Chigrinov, “Thin photo-patterned micropolarizer array for CMOS image sensors,” IEEE Photon. Technol. Lett. 21, 805–807 (2009). [CrossRef]
25. T. Tokuda, S. Sato, H. Yamada, K. Sasagawa, and J. Ohta, “Polarisation-analysing CMOS photosensor with monolithically embedded wire grid polariser,” Electron. Lett. 45, 228–230 (2009). [CrossRef]
26. H. Park and K. B. Crozier, “Elliptical silicon nanowire photodetectors for polarization-resolved imaging,” Opt. Express 23, 7209–7216 (2015). [CrossRef]
27. N. M. Garcia, I. de Erausquin, C. Edmiston, and V. Gruev, “Surface normal reconstruction using circularly polarized light,” Opt. Express 23, 14391–14406 (2015). [CrossRef]
28. O. Morel, C. Stolz, F. Meriaudeau, and P. Gorria, “Active lighting applied to three-dimensional reconstruction of specular metallic surfaces by polarization imaging,” Appl. Opt. 45, 4062–4068 (2006). [CrossRef]
29. S. Shwartz, E. Namer, and Y. Y. Schechner, “Blind haze separation,” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 2006), pp. 1984–1991.
30. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, “Polarization-based vision through haze,” Appl. Opt. 42, 511–525 (2003). [CrossRef]
31. J. Tyo, M. Rowe, E. Pugh, and N. Engheta, “Target detection in optically scattering media by polarization-difference imaging,” Appl. Opt. 35, 1855–1870 (1996). [CrossRef]
32. S. Tominaga and A. Kimachi, “Polarization imaging for material classification,” Opt. Eng. 47, 123201 (2008). [CrossRef]
33. E. Salomatina-Motts, V. Neel, and A. Yaroslavskaya, “Multimodal polarization system for imaging skin cancer,” Opt. Spectrosc. 107, 884–890 (2009). [CrossRef]
34. T. Charanya, T. York, S. Bloch, G. Sudlow, K. Liang, M. Garcia, W. J. Akers, D. Rubin, V. Gruev, and S. Achilefu, “Trimodal color-fluorescence-polarization endoscopy aided by a tumor selective molecular probe accurately detects flat lesions in colitis-associated cancer,” J. Biomed. Opt. 19, 126002 (2014). [CrossRef]
35. B. Kunnen, C. Macdonald, A. Doronin, S. Jacques, M. Eccles, and I. Meglinski, “Application of circularly polarized light for non-invasive diagnosis of cancerous tissues and turbid tissue-like scattering media,” J. Biophoton. 8, 317–323 (2015). [CrossRef]
36. S. B. Powell, R. Garnett, J. Marshall, C. Rizk, and V. Gruev, “Bioinspired polarization vision enables underwater geolocalization,” Sci. Adv. 4, eaao6841 (2018). [CrossRef]
37. V. Gruev, “Fabrication of a dual-layer aluminum nanowires polarization filter array,” Opt. Express 19, 24361–24369 (2011). [CrossRef]
38. Y. Ni, Y. Zhu, and B. Arion, “A 768 × 576 logarithmic image sensor with photodiode in solar cell mode,” in International Image Sensor Workshop (2011).
39. T. York and V. Gruev, “Characterization of a visible spectrum division-of-focal-plane polarimeter,” Appl. Opt. 51, 5392–5400 (2012). [CrossRef]
40. J. S. Tyo and H. Wei, “Optimizing imaging polarimeters constructed with imperfect optics,” Appl. Opt. 45, 5497–5503 (2006). [CrossRef]
41. “PolarCam,” 4D Technology, 2018, https://www.4dtechnology.com/products/polarimeters/polarcam/#.
42. D. V. Vorobiev, Z. Ninkov, and N. Brock, “Astronomical polarimetry with the RIT polarization imaging camera,” Publ. Astron. Soc. Pac. 130, 064501 (2018). [CrossRef]
43. S. Gao and V. Gruev, “Bilinear and bicubic interpolation methods for division of focal plane polarimeters,” Opt. Express 19, 26161–26173 (2011). [CrossRef]
44. S. B. Powell and V. Gruev, “Calibration methods for division-of-focal-plane polarimeters,” Opt. Express 21, 21039–21055 (2013). [CrossRef]