Quantifying the coloration and luminous transmittance of eyewear filters is essential to understand their impact on human visual performance. Accurate measurements require the use of a spectrophotometer, with such equipment often being prohibitively expensive for wide-scale use. This paper details a new technique to characterize eyewear filters using only a digital camera and a sheet of white paper. Images of the paper are captured with and without the filter in front of the camera lens, and then subsequent analysis of pixel intensities allows the filter coloration and luminous transmittance to be determined. The technique has been applied to six different eyewear filters using three different camera and illumination configurations, demonstrating a reasonable match to the spectrophotometer data. This technique is suited to implementation in a smartphone app, in order to provide a low-cost and widely deployable solution to monitor the ageing of eyewear inventory.
1. INTRODUCTION

Colored eyewear filters have a variety of applications, including sun protection [1,2], enhancing visual contrast [3], laser protection [4], and looking cool [5]. Such filters intentionally block portions of the visible light spectrum in order to provide a benefit to the wearer, but in doing so they reduce the overall light transmission and impart a coloration to the visual scene [6]. These factors may negatively impact human visual performance and therefore need to be quantified.
Accurate measurement of coloration and luminous transmittance requires the use of a spectrophotometer to measure the intensity of light transmitted through a material across the visible waveband [7]. Such equipment may be available in an optical laboratory, and portable devices are also available for use in the field. However, costs are significant (thousands to tens of thousands of British pounds per unit), meaning that wide-scale deployment is prohibitively expensive. This is an important consideration for applications such as monitoring laser protection eyewear products used by personnel across thousands of locations worldwide. These products are subject to degradations in protection associated with ageing and solarization [8].
Digital cameras (either standalone or smartphone based) have been shown to be effective for a variety of applications requiring discrimination of different colors, such as determining water pH levels [9]; quantifying colorimetric urine tests of pH, protein, and glucose [10]; monitoring food quality [11]; and detecting explosives [12]. In such applications, they offer advantages of simplicity and cost saving versus traditional, complex laboratory equipment.
To realize these advantages for the present application, this paper documents an experimental technique using a digital camera and a white paper target to quantify the estimated coloration and luminous transmittance of eyewear filters. Section 2 outlines the recommended measurement and calculation procedure, before Section 3 presents example results with a comparison to data obtained from a spectrophotometer-based approach. Section 4 provides discussion points covering accuracy and optimization.
2. MEASUREMENT AND CALCULATION PROCEDURE

A. Reference White Image
First, a digital camera should be used to take a JPEG (sRGB color space) image of an evenly lit white sheet of paper, ensuring that there is an even color balance and that the white is close to, but not at, the point of saturation (Fig. 1). An even color balance can be achieved through optimizing the camera’s color balance setting to ensure that white is represented by an approximately equal contribution of red (R), green (G), and blue (B) values (to be assessed by interrogating the images through an image editing application). Avoiding saturation means that the highest RGB values in an 8-bit image should be less than 255, and this can be optimized by adjusting the shutter speed and/or lens aperture settings, again interrogating the image to find the optimum settings. The camera should be used in a fully manual mode with all other options set to fixed values.
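The two image checks described above (even color balance, and brightness close to but below saturation) can be sketched in code. The paper's own analysis was implemented in MATLAB; the following Python sketch is illustrative only, assuming the image has been loaded as a list of 8-bit (R, G, B) pixel tuples, with hypothetical function and parameter names.

```python
# Illustrative checks for a candidate reference white image (names and
# tolerances are assumptions, not taken from the paper).

def check_reference_white(pixels, balance_tol=0.05, headroom=1):
    """Return (balanced, unsaturated) for a list of 8-bit (R, G, B) tuples."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    peak = max(max(p) for p in pixels)
    avg = sum(means) / 3
    # Even color balance: each channel mean within a tolerance of the average.
    balanced = all(abs(m - avg) / avg <= balance_tol for m in means)
    # Close to, but not at, saturation: highest 8-bit value below 255.
    unsaturated = peak <= 255 - headroom
    return balanced, unsaturated

# Example: a bright, nearly neutral patch passes both checks.
patch = [(240, 242, 239), (244, 243, 241), (238, 241, 240)]
print(check_reference_white(patch))  # (True, True)
```

Shutter speed, aperture, and white balance would then be adjusted until both checks pass.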
B. Image Through Filter
Using the exact same camera settings as for the reference white image, the filter should then be placed in front of the camera lens, and another JPEG image of the white sheet of paper should be taken (Fig. 2). Care should be taken that no automatic exposure controls of any kind are able to artificially boost the intensity of the image or adjust its coloration. This image should appear darker than the reference image (as some light is being removed by the filter), and the color of the image should approximate the color as perceived by looking through the filter with the naked eye.
C. Coloration Calculation
The coloration of the filter can be calculated by calibrating the reference white image with the ideal representation of white, and applying that calibration to the through-filter image. The calculations are simple enough to be implemented in a smartphone application, but for this proof of principle, they have been implemented in MATLAB.
First, the average white RGB values from the white reference image should be determined by averaging pixel values across several areas of the paper in the image (to account for any uneven illumination). With an image captured using the sRGB color space [13], each pixel will have three associated 8-bit values from 0 to 255 (R, G, B). These are first normalized to the range 0 to 1 (R′, G′, B′) and then converted to linear space (r, g, b) according to [14]:

r = R′/12.92, for R′ ≤ 0.04045,
r = [(R′ + 0.055)/1.055]^2.4, for R′ > 0.04045,

with g and b calculated analogously from G′ and B′.
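The sRGB-to-linear conversion is a standard piecewise function defined in IEC 61966-2-1 [14]. A minimal Python sketch (the paper's implementation was in MATLAB):

```python
def srgb_to_linear(v8):
    """Convert one 8-bit sRGB channel value to linear space (IEC 61966-2-1)."""
    c = v8 / 255.0                       # normalize to [0, 1]
    if c <= 0.04045:
        return c / 12.92                 # linear segment near black
    return ((c + 0.055) / 1.055) ** 2.4  # power-law segment elsewhere

# A mid-gray sRGB value of 128 corresponds to roughly 21.6% linear intensity.
r, g, b = (srgb_to_linear(v) for v in (128, 128, 128))
print(round(r, 3))  # 0.216
```

Each channel of both the reference white image and the through-filter image would be linearized this way before the calibration is applied.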
D. Luminous Transmittance Calculation
When assessing the transmission of eyewear filters, the spectral response of the human eye needs to be taken into consideration; as the eye is more sensitive to green wavelengths than to red or blue, weightings must be applied to derive a luminous transmittance. The optimum conversion found for this application, and for the cameras used in this study, is the following calculation of luminance [15]:

Y = 0.299r + 0.587g + 0.114b.

The luminous transmittance is then estimated as the ratio of the luminance of the through-filter image to that of the reference white image.
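The weighted-luminance approach described above can be sketched as follows. This is an illustrative Python version using the ITU-R BT.601 channel weightings [15]; treating transmittance as the through-filter to reference luminance ratio is an assumption consistent with the text, and the linear-space values below are hypothetical.

```python
def luminance(r, g, b):
    """Weighted luminance of linear-space channel values (BT.601 weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def luminous_transmittance(filtered_rgb, reference_rgb):
    """Estimate transmittance as the ratio of through-filter to reference luminance."""
    return luminance(*filtered_rgb) / luminance(*reference_rgb)

# Illustrative linear-space values: a neutral gray filter passing ~40% of the light.
ref = (0.90, 0.92, 0.88)
filt = (0.36, 0.368, 0.352)
print(round(luminous_transmittance(filt, ref), 2))  # 0.4
```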
E. Spectrophotometer Measurements
To provide “accurate” reference data for comparison to the present technique, a spectrophotometer setup was also used to measure the coloration and transmission of each filter. This setup consisted of a Thorlabs compact spectrometer (CCS200/M) and its associated control software, used in conjunction with a Thorlabs high-intensity fiber light source (OSL1-EC). This setup generated transmission data in 1 nm increments across the visible waveband. These data were subsequently converted to tristimulus color values, XYZ, and then to the final sRGB coloration values using standard color space conversion equations [13,16]. Transmission was calculated according to the standard equation for luminous transmittance [17] [also known as photopic luminous transmittance (PLT) or integrated visual photopic transmission (IVPT)]. This calculation is weighted by the spectral response of the day-adapted human eye to approximate the human perception of transmission through a filter.
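The spectrophotometer-based luminous transmittance is a V(λ)-weighted average of the measured transmission spectrum, τ_v = Σ T(λ)V(λ)S(λ)Δλ / Σ V(λ)S(λ)Δλ. The sketch below is a coarse numerical illustration only: the 50 nm V(λ) samples are approximate values from the CIE photopic curve, the filter transmission values are hypothetical, and the source spectrum S(λ) is taken as flat for simplicity (the real calculation uses 1 nm data and the measured source).

```python
# Coarse numerical illustration of the luminous transmittance weighting.
wavelengths = [450, 500, 550, 600, 650]  # nm (illustrative 50 nm grid)
V = [0.038, 0.323, 0.995, 0.631, 0.107]  # approx. CIE photopic sensitivity
T = [0.10, 0.20, 0.50, 0.60, 0.60]       # hypothetical filter transmission
S = [1.0] * len(V)                       # flat source spectrum (assumption)

num = sum(t * v * s for t, v, s in zip(T, V, S))
den = sum(v * s for v, s in zip(V, S))
tau_v = num / den
print(round(tau_v, 3))  # 0.482
```

Note how the green samples dominate the result: transmission near 550 nm contributes far more to τ_v than equal transmission at the spectrum's edges.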
3. EXAMPLE RESULTS
A. Eyewear Filters
To assess the effectiveness of this technique, six different filters were evaluated. Three of the filters are ballistic protection goggle filters: a clear filter for ballistic protection only; a gray filter for sun protection; and a yellow filter for contrast enhancement. Three of the filters are laser eye protection filters: a blue filter that blocks red wavelengths; an orange filter that blocks blue and green wavelengths; and a brown filter that provides stronger blocking of blue and green wavelengths together with some blocking of the longer-wavelength reds.
B. Camera and Illumination Configurations
Three different camera and illumination configurations were used in order to assess the applicability to high- and low-end setups: a digital single-lens reflex (SLR) camera with a calibrated light cabinet (advanced setup), a smartphone camera with a desk lamp and bulb designed to mimic the sun’s spectral output (smartphone setup A), and a second smartphone camera with a desk lamp and simple halogen bulb (smartphone setup B). For smartphone setup B, three sets of images were collected and analyzed to provide a measure of intra-setup variability. Full details of the three configurations are given in Table 1, including the optimum camera settings that were fixed for the capture of all images. Figures 1 and 2 are example images from smartphone setup A.
C. Coloration Results
Table 2 summarizes the sRGB coloration values obtained for the advanced and smartphone setups, together with color difference values (CIE76 calculation, based on the CIE L*a*b* color space with a D65 adopted white point) to quantify the differences from the spectrophotometer data. All three configurations provide reasonable matches for the clear and gray ballistic goggle filters, with a highest overall ΔE of 12 and the advanced setup providing marginally better results than either smartphone setup. For the yellow ballistic filter, the advanced setup and smartphone setup A provide reasonable matches. However, smartphone setup B gives a large ΔE of 48, with a color that appears less saturated than the spectrophotometer data. This could be due to the warm color temperature of the halogen light source being close to the filter’s own color appearance. For the laser eye protection filters (blue, orange, and brown), the color matches were generally worse than for the ballistic filters, with a highest ΔE of 22. Subjective assessment of the colors produced by the sRGB values (shown as colored cell shading in Table 2, with the displayed colors being valid only when viewed on a calibrated display) indicates a good general match to the spectrophotometer results for the advanced setup and smartphone setup A, apart from the brown filter, which showed as a deep red with both configurations. As with the yellow filter, the smartphone setup B results for the laser eye protection filters appear less saturated than the spectrophotometer values, although the ΔE values are similar to those for the advanced setup.
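The CIE76 color difference used in Table 2 is simply the Euclidean distance between two colors in CIELAB coordinates. A minimal Python sketch (the paper's analysis was implemented in MATLAB; the Lab triples below are hypothetical, and the sRGB-to-Lab conversion via XYZ with a D65 white point is omitted for brevity):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical Lab coordinates for a camera estimate vs. a spectrophotometer value.
camera = (62.0, 10.0, 35.0)
spectro = (60.0, 8.0, 30.0)
print(round(delta_e_76(camera, spectro), 1))  # 5.7
```

A ΔE around 2 is commonly cited as roughly the threshold of a just-noticeable difference, which gives a sense of scale for the values of 12, 22, and 48 reported above.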
D. Luminous Transmittance Results
Luminous transmittance data are listed in Table 2 and illustrated as a bar chart in Fig. 3. Smartphone setup B gave a better match to the spectrophotometer data than the advanced setup for five of the six filters, with differences ranging from 0% to 7% (absolute) for all filters. Smartphone setup A gave differences from the spectrophotometer values ranging from 2% to 9% (absolute). The advanced setup gave differences from the spectrophotometer values ranging from 4% to 11% (absolute). The worst relative comparison was for smartphone setup A assessing the blue filter, where an estimated luminous transmittance of 5% gave a relative difference of 44% compared to the spectrophotometer value of 9%.
4. DISCUSSION

This paper has demonstrated that a digital camera can be used to give a reasonable approximation of the coloration and luminous transmittance of eyewear filters. However, there is a loss of accuracy versus use of a spectrophotometer, and so the determined values should be taken only as indicative estimates.
Coloration estimates gave ΔE values ranging from 1 to 48, with less overall accuracy for the laser protection filters, possibly due to the more complex nature of their spectral profiles. Luminous transmittance discrepancies varied from 0% to 11% (absolute), with similar accuracy across the ballistic and laser protection filters. With a few exceptions, the smartphone camera setups were found to give results comparable to those from an advanced setup comprising a digital SLR and professional light cabinet.
It should be noted that the authors assessed a variety of other variations before arriving at this proposed optimum technique. The analysis of solid and gradient colored targets was performed, as was the imaging of targets on computer displays. Alternative equations to calculate luminous transmittance were also assessed. Overall, the technique featured in this paper was found to provide the most consistent results across the widest range of conditions.
Not considered here is the more fundamental question of the level of accuracy required in either coloration or transmittance measurements. Degradation tolerances and performance limits are application specific, and would need to be addressed separately from the present study.
There remains scope to improve this technique, particularly with refinements to camera control and calculation methodology. A bespoke smartphone application could control the image capture settings more tightly to ensure consistency across the reference white and through-filter images. This could also involve capturing RAW images, instead of JPEG, to limit image processing internal to the camera. Greater consistency with the illumination source would minimize the need for extremes of white balancing that may negatively affect color difference values. Finally, the use of multiple reference images could improve the overall calibration stage, and coloration estimates could be improved by application of a color appearance model (e.g., CIECAM02 [18]) that would also account for chromatic adaptation.
Significantly, the presented technique is simple enough to be automated in a smartphone application, which could then provide a low-cost and widely deployable capability to monitor eyewear inventory.
Defence Science and Technology Laboratory (Dstl).
1. F. Behar-Cohen, G. Baillet, T. de Ayguavives, P. Ortega Garcia, J. Krutmann, P. Pena-Garcia, C. Reme, and J. S. Wolffsohn, “Ultraviolet damage to the eye revisited: eye-sun protection factor (E-SPF), a new ultraviolet protection label for eyewear,” Clin. Ophthalmol. 8, 87–104 (2014). [CrossRef]
2. J. C. S. Yam and A. K. H. Kwok, “Ultraviolet light and ocular diseases,” Int. Ophthalmol. 34, 383–400 (2014). [CrossRef]
3. J. S. Wolffsohn, A. L. Cochrane, H. Khoo, Y. Yoshimitsu, and S. Wu, “Contrast is enhanced by yellow lenses because of selective reduction of short-wavelength light,” Optom. Vision Sci. 77, 73–81 (2000). [CrossRef]
4. British Standards Institution, “Personal eye-protection equipment–filters and eye-protectors against laser radiation (laser eye-protectors),” BS EN 207:2009 (2009).
5. V. Brown, Cool Shades: The History and Meaning of Sunglasses (Bloomsbury Academic, 2015).
6. C. A. Williamson and M. Y. Boontanrart, “Simulating the impact of laser eye protection on color vision,” J. Laser Appl. 28, 012010 (2016). [CrossRef]
7. F. Manoochehri and E. Ikonen, “High-accuracy spectrometer for measurement of regular spectral transmittance,” Appl. Opt. 34, 3686–3692 (1995). [CrossRef]
8. L. A. Levasseur, J. F. Roach, and K. B. Brubaker, “Evaluation of laser-protective eyewear dyes in UVEX lenses,” Natick/TR-95/006 (US Army Natick Research Development and Engineering Center, 1994).
9. G. S. Luka, E. Nowak, J. Kawchuk, M. Hoorfar, and H. Najjaran, “Portable device for the detection of colorimetric assays,” Royal Soc. Open Sci. 4, 171025 (2017). [CrossRef]
10. A. K. Yetisen, J. L. Martinez-Hurtado, A. Garcia-Melendrez, F. da Cruz Vasoncellos, and C. R. Lowe, “A smartphone algorithm with inter-phone repeatability for the analysis of colorimetric tests,” Sens. Actuators B: Chem. 196, 156–160 (2014). [CrossRef]
11. P. B. Pathare, U. L. Opara, and F. A. J. Al-Said, “Colour measurement and analysis in fresh and processed foods: a review,” Food Bioprocess Technol. 6, 36–60 (2013). [CrossRef]
12. M. J. Kangas, R. M. Burks, J. Atwater, R. M. Lukowicz, P. Williams, and A. E. Holmes, “Colorimetric sensor arrays for the detection and identification of chemical weapons and explosives,” Crit. Rev. Anal. Chem. 47, 138–153 (2017). [CrossRef]
13. M. Stokes, M. Anderson, S. Chandrasekar, and R. Motta, “A standard default color space for the Internet: sRGB,” International Color Consortium, 1996, http://color.org/sRGB.xalter.
14. British Standards Institution, “Multimedia system and equipment—Colour measurement and management—Part 2-1: Colour management—Default RGB colour space—sRGB,” BS EN 61966-2-1:2000 (2000).
15. International Telecommunication Union, “Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios,” Recommendation ITU-R BT.601-7 (03/2011) (2011).
16. G. Wyszecki and W. S. Stiles, Color Science, 2nd ed. (Wiley, 1982).
17. British Standards Institution, “Personal protective equipment—Test methods for sunglasses and related eyewear,” BS EN ISO 12311:2013 (2013).
18. International Commission on Illumination, “A colour appearance model for colour management systems: CIECAM02,” Technical Report No. CIE 159:2004 (2004).