Optica Publishing Group

Optical filter for highlighting spectral features Part I: design and development of the filter for discrimination of human skin with and without an application of cosmetic foundation

Open Access

Abstract

Light reflected from an object’s surface contains much information about its physical and chemical properties, and subtle changes in those properties appear as changes in the reflected spectra. Conventional trichromatic systems, however, cannot detect most spectral features because spectral information is compressively represented as trichromatic signals forming a three-dimensional subspace. We propose a method for designing a filter that optically modulates a camera’s spectral sensitivity to find an alternative subspace highlighting an object’s spectral features more effectively than the original trichromatic space. We designed and developed a filter that detects cosmetic foundation on the human face. The results confirmed that the filter can visualize and nondestructively inspect the foundation distribution.

©2011 Optical Society of America

1. Introduction

Light reflected from the surface of an object contains a range of information about the object’s physical and chemical properties. In human color vision, the transformation of reflected light from spectra to trichromatic signals results in the loss of most spectral features (metamerism). Recently, spectral imaging has received much attention for highlighting and detecting such spectral features and visualizing an object’s properties. However, nondestructive inspection based on spectral imaging often faces obstacles in industrial applications, such as cost and the skill required for measurement. In our previous work [1–3], we proposed two methods for designing the spectral transmittance of a filter that allows two colors from predefined target spectra to be easily and accurately distinguished: color discrimination enhancement [1, 2] and the michromatic scope [3]. The first study attempted color-difference enhancement for two pairs of targets: facial skin with and without cosmetic foundation, and human skin over and around a vein. The designed filters showed good color-difference enhancement in both cases. We also designed an artificial camera system, the “michromatic scope,” to enhance the spectral difference of these targets. This system included three spectral filters designed on the basis of a non-negative tensor factorization of spectral data sets of the visualization targets. This method achieved a large enhancement of the spectral difference. Although it required some software processing, its optical implementation was achieved using three optical filters and a monochromatic camera.

This study proposes a method to highlight the spectral features, particularly the spectral difference between two target colors by using only one filter, which is mounted in front of the lens of an RGB digital camera. The filter is designed to modulate the camera’s spectral sensitivity in order to highlight and detect the spectral differences for metameric or nearly-metameric colors with higher enhancement than a color difference enhancement filter [1,2] in a simpler, more portable, and less expensive way than the michromatic scope [3]. This study also focuses on human skin color as a target, especially on the differences in facial colors in the presence and absence of cosmetic foundations.

Human skin color is attracting increasing attention in several fields of research because it is a rich source of information about health and mental conditions. Skin color is associated with human skin pigments, including melanin, carotene, and hemoglobin [4]. These pigments have characteristic spectral absorption properties in the visible wavelength region. Several studies have presented methods for measuring and visualizing human skin pigmentation by using spectral absorption [5–7]. Preece et al. demonstrated a method to design optimal band-pass filters for recovering the optical parameters of human skin [8]; the estimation accuracy obtained using the three optimal band-pass filters was higher than that of typical RGB filters. In the medical field, pigmentation-measuring techniques based on skin spectra have been applied to evaluate erythema and melanoma [9–12].

On the other hand, cosmetic foundation has also been an active research topic because it is important for cosmetic products to hide skin flaws and present beautiful skin. Recently, along with the development of nanoparticle control technology, cosmetic foundations have become increasingly sophisticated, offering both high covering power and a natural appearance [13]. Optical absorption by hemoglobin affects the reflectance of human skin, particularly at wavelengths in the range of 545–575 nm, and this absorption property is considered to be altered by the presence or absence of cosmetic foundation. However, it is difficult, if not impossible, for the naked eye to discriminate these spectral differences, precisely because hiding them is the function of cosmetic foundation. Therefore, a nondestructive inspection method for applied cosmetic foundation is required. Doi et al. proposed a method to estimate the spectral reflectance of made-up skin, and the method yielded effective estimation accuracy [14]. This technique can also be used for estimating the thickness of the foundation layer; however, it requires an ordinary spectral imaging system.

In this study, instead of a spectral imaging system, we used an optical filter to visualize the spatial distribution of foundation by highlighting the spectral differences of facial skin with and without cosmetic foundation. The next section presents the research problem and our general solution. The results of the experiments and the principal evaluation of the developed optical filter are presented in the third section. The discussion is given in the fourth section. Finally, conclusions are presented in the fifth section.

2. Methods

2.1. Problem settings

The problem addressed in this study is how to define a filter that highlights the spectral difference between the observed spectra of two targets, I1(λ) and I2(λ) (in practice, representing the spectral data sets for categories 1 and 2). The objective function is therefore as follows.

Minimize the misclassification rate:

Pe(I1(λ), I2(λ)) = p1 P(category = 2 | I1(λ)) + p2 P(category = 1 | I2(λ)),   (1)
where p1 and p2 are the prior probabilities of I1(λ) and I2(λ), respectively, and P(category = j | Ii(λ)) is the conditional probability that the color signal Ii(λ) is assigned to the wrong category j.

Classification of the input spectra was accomplished using the output signal of the camera rather than the input spectra directly. Here, we define the trichromatic signal (output of the RGB camera) as follows:

Cik = Σλ T(λ) Ii(λ) Sk(λ),   i ∈ {1, 2},   k ∈ {R, G, B},   (2)
where T(λ) is the transmittance function of the filter, Sk(λ) is the spectral sensitivity function of the camera, i indicates the category number, and k the color channel of the camera. Hereafter, the color signal to be discriminated is represented as a chromaticity vector:

Ci = (ri, gi) = (CiR / (CiR + CiG + CiB), CiG / (CiR + CiG + CiB)).   (3)

The effect of using chromaticity signals instead of the original trichromatic signals is discussed in Section 4.
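The computation of Eqs. (2)–(3) can be sketched as follows. This is an illustrative Python sketch, not the authors' code: the flat filter, flat spectrum, and block-shaped sensitivities are toy stand-ins for measured data.

```python
import numpy as np

def color_signal(T, I, S):
    """Eq. (2): C_ik = sum over lambda of T(lambda) * I_i(lambda) * S_k(lambda).

    T : (L,) filter transmittance, I : (L,) observed spectrum,
    S : (3, L) camera sensitivities for k in {R, G, B}.
    Returns the (3,) vector (C_R, C_G, C_B).
    """
    return (S * (T * I)).sum(axis=1)

def chromaticity(C):
    """Eq. (3): (r, g) = (C_R, C_G) / (C_R + C_G + C_B)."""
    return C[:2] / C.sum()

# Toy example: flat filter, flat spectrum, crude block-shaped R/G/B bands.
L = 61                                   # e.g. 420-720 nm in 5 nm steps
T = np.ones(L)                           # fully transmitting "filter"
I = np.full(L, 0.5)                      # flat stand-in spectrum
S = np.zeros((3, L))
S[0, 40:], S[1, 20:40], S[2, :20] = 1.0, 1.0, 1.0

C = color_signal(T, I, S)
r, g = chromaticity(C)
```

Only the ratios (r, g) enter the discriminant analysis, which is why overall intensity scaling cancels out.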

2.2. Method for designing the spectral filter

We determined the spectral transmittance function T(λ) by minimizing the misclassification rate between the two predefined spectral sets I1(λ) and I2(λ); that is, we formulated the design as a nonlinear optimization problem whose objective function is Eq. (1). The flowchart of the filter design process is shown in Fig. 1. The details of each step are described below.


Fig. 1 Filter design process flowchart. The solution of the transmittance function is formulated as a nonlinear optimization problem whose objective function is Eq. (1). The optimization consists of five steps.


  • Step 1: Definition of the targets and observation conditions

In the first step, we defined the spectral data sets, i.e., I1(λ) and I2(λ), and the spectral sensitivity functions Sk(λ). The targets to be discriminated were human facial skin in the presence and absence of cosmetic foundation. In this study, spectral reflectances obtained under these conditions were used as the data sets; the spectral measurement is described in the next section. The color space was defined by the spectral sensitivities of a Nikon D70 digital camera as Sk(λ), which were measured using a monochromator (Fig. 2). From a practical viewpoint, tristimulus signals are affected by the spectral sensitivities of cameras and the spectral distributions of illuminants; the effects of camera and light-source differences are discussed in Section 4.


Fig. 2 Spectral sensitivities of the RGB color sensor for a Nikon D70 digital camera measured using a monochromator (Shimadzu, SPG-120) in the range of 380–780 nm in steps of 5 nm.


  • Step 2: Computation of the transmittance function

In the second step, the transmittance function T(λ) is described by a model with a small number of independent variables Dk (k = 1, 2, …, N) weighting spline functions Bk(λ) [15]:

T(λ) = Σ(k=1 to N) Dk Bk(λ),   (4)

Bk(λ) = {ω³ + 3ω²(ω − |λ − λk|) + 3ω(ω − |λ − λk|)² − 3(ω − |λ − λk|)³} / 6ω³   for |λ − λk| ≤ ω,
Bk(λ) = (2ω − |λ − λk|)³ / 6ω³   for ω ≤ |λ − λk| ≤ 2ω,
Bk(λ) = 0   for 2ω ≤ |λ − λk|,
where λk is the center of the function, Dk is the weight of the spline function Bk(λ), and ω is a parameter describing the width of the function. The weights Dk (k = 1, 2, …, N) serve as the independent variables for optimizing the transmittance function T(λ). Here, ω was 15 nm and λk was set in the range of 405–735 nm in steps of 15 nm; thus, N = 23 independent variables (D1, D2, …, D23) were optimized.
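The spline model of Eq. (4) can be sketched in Python as follows. The knot spacing (ω = 15 nm, centers at 405–735 nm) follows the text; the weights Dk and the clipping of T(λ) to the physical range [0, 1] are illustrative assumptions, not the paper's values.

```python
import numpy as np

def bspline(lam, lam_k, w):
    """Cubic B-spline basis B_k(lambda) of Eq. (4), center lam_k, width w."""
    d = np.abs(lam - lam_k)
    out = np.zeros_like(np.asarray(lam, dtype=float))
    inner = d <= w
    outer = (d > w) & (d <= 2 * w)
    out[inner] = (w**3 + 3*w**2*(w - d[inner])
                  + 3*w*(w - d[inner])**2 - 3*(w - d[inner])**3) / (6 * w**3)
    out[outer] = (2*w - d[outer])**3 / (6 * w**3)
    return out

def transmittance(lam, D, centers, w=15.0):
    """T(lambda) = sum_k D_k B_k(lambda), clipped to a physical [0, 1]."""
    T = sum(Dk * bspline(lam, ck, w) for Dk, ck in zip(D, centers))
    return np.clip(T, 0.0, 1.0)

lam = np.arange(380.0, 781.0, 5.0)
centers = np.arange(405.0, 736.0, 15.0)      # 23 knot centers, as in the text
D = np.full(len(centers), 0.9)               # illustrative weights only
T = transmittance(lam, D, centers)
```

Because each Bk has compact support (4ω wide), each weight Dk shapes the transmittance only locally around λk, which keeps the optimization well behaved.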

Other models for describing the transmittance can be used instead of Eq. (4). For example, a rectangular transmittance function like that of a band-pass filter can be described by independent variables that set the cut-off wavelengths and the transmittance values of the pass and stop bands.

  • Step 3: Color signal computation

All parameters for the color signal computation are defined in the above steps. Thus, filtered color signals of the two targets can be calculated by evaluating Eq. (2).

  • Step 4: Discriminant analysis

We performed discriminant analysis to classify the data sets C1 and C2 and computed the misclassification rate by using Eq. (1).

This discrimination should be carried out with the lowest possible computational effort. Therefore, in this study, linear discriminant analysis (LDA) was used for classifying color signals. The discriminant score obtained using the LDA formula for two classes is as follows:

fd(C) = (Σ⁻¹(μ1 − μ2))ᵗ (C − (μ1 + μ2)/2) − log(p2/p1),   (5)
where C is a color signal transformed from an observed spectrum; Σ is the pooled variance-covariance matrix of the color-signal sets C1 and C2; and μ1 and μ2 are the means of C1 and C2, respectively. When fd(C) > 0, an observed color signal is classified as category 1; otherwise, it is classified as category 2. The prior probabilities p1 and p2 for categories 1 and 2 were assumed equal (p1 = p2 = 0.5).
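The two-class LDA score of Eq. (5) can be sketched as below. The Gaussian toy clusters are stand-ins for the measured chromaticity data, and equal priors p1 = p2 = 0.5 are used as in the text.

```python
import numpy as np

def lda_score(C, C1, C2, p1=0.5, p2=0.5):
    """Eq. (5): f_d(C) = (Sigma^-1 (mu1-mu2))^t (C - (mu1+mu2)/2) - log(p2/p1)."""
    mu1, mu2 = C1.mean(axis=0), C2.mean(axis=0)
    n1, n2 = len(C1), len(C2)
    # pooled variance-covariance matrix of the two color-signal sets
    pooled = ((n1 - 1) * np.cov(C1, rowvar=False)
              + (n2 - 1) * np.cov(C2, rowvar=False)) / (n1 + n2 - 2)
    w = np.linalg.solve(pooled, mu1 - mu2)      # Sigma^-1 (mu1 - mu2)
    return w @ (C - (mu1 + mu2) / 2) - np.log(p2 / p1)

# Toy (r, g) clusters standing in for the two skin categories.
rng = np.random.default_rng(0)
C1 = rng.normal([0.40, 0.33], 0.01, size=(100, 2))   # e.g. skin with foundation
C2 = rng.normal([0.36, 0.31], 0.01, size=(100, 2))   # e.g. bare skin
score1 = lda_score(C1.mean(axis=0), C1, C2)          # positive -> category 1
score2 = lda_score(C2.mean(axis=0), C1, C2)          # negative -> category 2
```

With equal priors the log term vanishes, and the decision boundary is the hyperplane midway between the two class means under the pooled metric.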

  • Step 5: Optimization process for determining the transmittance function

We modified the independent variables Dk, which define the transmittance function described in Step 2, to minimize the misclassification rate. We used the simulated annealing method for optimizing the transmittance function [16]. However, the filter-design process is not restricted to this optimization method; other methods such as genetic algorithms or the Nelder–Mead method also work well.
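A minimal simulated-annealing loop for Step 5 might look like the following sketch. The true objective is the misclassification rate of Eq. (1); here a simple quadratic stand-in cost is used so the example is self-contained, and the schedule, step size, and clipping of the weights to [0, 1] are illustrative assumptions, not the paper's settings.

```python
import math
import random

def anneal(cost, x0, n_iter=5000, step=0.05, t0=1.0, cooling=0.999, seed=0):
    """Minimize cost(x) over weight vectors x by simulated annealing."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(n_iter):
        # perturb one randomly chosen weight D_k, clipped to [0, 1]
        cand = list(x)
        k = rng.randrange(len(cand))
        cand[k] = min(1.0, max(0.0, cand[k] + rng.uniform(-step, step)))
        fc = cost(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc <= fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling                     # geometric cooling schedule
    return best, fbest

# Quadratic stand-in for the Eq. (1) misclassification rate.
target = [0.2, 0.8, 0.5]
cost = lambda D: sum((d - t) ** 2 for d, t in zip(D, target))
D0 = [0.5, 0.5, 0.5]
D_opt, f_opt = anneal(cost, D0)
```

The stochastic acceptance of uphill moves is what lets the search escape the local minima that a greedy descent over the nonconvex Eq. (1) objective would get stuck in.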

2.3. Spectral measurement for designing the filter

To design the filter, experimental data consisting of the two target spectral data sets were measured using a multispectral imaging system (Nuance™ Multispectral Imaging System, CRi, VIS). The first set I1(λ) contains the spectral reflectances obtained from skin to which a cosmetic foundation had been applied; the second set I2(λ) corresponds to the spectral reflectances of bare skin. The spectral range of the measurements was 420–720 nm in steps of 5 nm, and the resolution of the spectral images was 696 × 512 pixels. Three halogen lamps (300 W, 2547.7 K) were used as illuminants. A color patch (OSA UCS, −1, 0, 0) was placed on the left side of the subjects, and its reflectance spectrum was used as the reference spectrum. We used cosmetic foundations of two types, liquid and powder, with six colors of each type; the most suitable foundation (the subject’s favorite) was chosen for each subject. The subjects were Japanese women aged 20–60 years. In all, spectral images from 30 subjects were measured.

2.4. Measured spectral data sets of the targets

From the spectral images obtained, we automatically extracted the data sets for designing the filter. The facial skin area was defined by its spectral similarity (mean square error) to a typical human skin spectrum, followed by open/close morphological processing [17–19]. The facial skin area was then split into nine (3 × 3) regions, and the average spectrum of each region was extracted for the spectral data set. In total, each of the data sets consisted of 540 spectra (nine spectra for each of the 60 images). Figure 3 shows the average spectra of each data set: the solid lines show the reflectance spectra of bare skin, and the broken lines show the spectra of skin with cosmetic foundation applied. The two spectra mostly overlap (Fig. 3); however, there is a slight difference between the average spectra of bare skin and skin with foundation, caused by the spectral absorption characteristics of hemoglobin (see Section 1). Hence, the absorption properties of the human skin pigments were hidden by the cosmetic foundation, an artificial material that imitates human skin color.
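The extraction procedure above (spectral-similarity masking followed by a 3 × 3 split with per-region averaging) can be sketched as follows. The random spectral cube and the MSE threshold are hypothetical stand-ins for the measured images, and the morphological open/close step is omitted for brevity.

```python
import numpy as np

def skin_mask(cube, ref, thresh):
    """Pixels whose MSE against the reference skin spectrum is below thresh."""
    return ((cube - ref) ** 2).mean(axis=-1) < thresh

def region_averages(cube, mask, n=3):
    """Average spectrum of each cell in an n x n split of the mask bounding box."""
    ys, xs = np.nonzero(mask)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    yb = np.linspace(y0, y1, n + 1).astype(int)
    xb = np.linspace(x0, x1, n + 1).astype(int)
    out = []
    for i in range(n):
        for j in range(n):
            cell = mask[yb[i]:yb[i+1], xb[j]:xb[j+1]]
            spectra = cube[yb[i]:yb[i+1], xb[j]:xb[j+1]][cell]
            out.append(spectra.mean(axis=0))
    return np.array(out)                      # shape (n*n, bands)

# Toy spectral image: a "typical skin" spectrum plus pixel noise.
rng = np.random.default_rng(2)
ref = np.linspace(0.2, 0.6, 61)               # hypothetical reference spectrum
cube = ref + rng.normal(0, 0.01, size=(64, 64, 61))
mask = skin_mask(cube, ref, thresh=0.001)
sets = region_averages(cube, mask)            # nine average spectra
```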


Fig. 3 Average spectra of each data set. A slight spectral difference at wavelengths in the range 545–575 nm was caused by the spectral absorption characteristics of hemoglobin.


3. Results

3.1. Design and development of the filter

The spectral transmittance function T(λ) was determined by the optimization process using the spectral data sets of bare skin and skin with foundation, and the theoretically derived transmittance was experimentally realized as a multilayer thin-film filter made by vacuum deposition. Figure 4(a) depicts the spectral transmittance obtained by optimization and that of the optically realized filter, and Fig. 4(b) shows a photograph of the realized filter, which was composed of 31 layers of SiO2 and TiO2. Twelve such filters were fabricated, and the best one among them was selected. The optically realized filter had almost the same discrimination accuracy as the theoretically designed one.


Fig. 4 (a) Spectral transmittance of the theoretically designed and the optically realized filter. Transmittance function was theoretically designed by optimization process. The optical filter was realized by vacuum deposition technology. (b) Developed optical filter. The optical filter was made of a multilayer thin film, which was composed of 31 layers of SiO2 and TiO2.


The designed spectral transmittance had sharp pass bands in the wavelength ranges 410–460 nm, 500–550 nm, and 560–610 nm (Fig. 4(a)). The third pass band (560–610 nm) covers the absorption band of hemoglobin, and longer wavelengths were eliminated. Therefore, the sensitivity function for the red component (the long-wavelength sensitivity function in Fig. 2) was modified to detect only the spectral absorption of hemoglobin, which is strongly affected by the application of cosmetic foundation.

3.2. Evaluation of the designed optical filter

To evaluate the real optical filter and to recalibrate the discriminant function fd for the realized filter, we measured facial images of 20 subjects, with and without cosmetic foundation, using an RGB digital camera fitted with the developed filter. The measurement device was a commercially available camera (Nikon D70), and the illumination source was a fluorescent light (Diva-Lite, 6300 K), which has a broad emission spectrum across the visible wavelengths and emits little heat. Polarizing films were installed on both the camera and the illumination source to eliminate specular reflection. The measurement geometry is shown in Fig. 5.


Fig. 5 Measurement geometry. The measurement device was a commercially available camera (Nikon D70) and a fluorescent light (Diva-Lite, 6300K) was selected as the illumination light source. Polarizing films were installed on both the camera and the illumination light source to eliminate specular light.


Figures 6(a) and 6(b) show the color (chromaticity) distributions of the two targets (bare skin and skin with cosmetic foundation) without and with filtering. The ellipses in Fig. 6 are equiprobability ellipses whose axes are defined by the standard deviations. With filtering, the color distributions of the two targets were effectively separated: the misclassification rates without and with filtering were 33.0% and 5.97%, respectively (Figs. 6(a) and 6(b)).


Fig. 6 Color (chromaticity) distributions extracted from an RGB image taken with the digital camera. (a) Color distributions without filtering and (b) color distributions with filtering. The ellipses are equiprobability ellipses whose axes are defined by the standard deviations. Misclassification rates: (a) 33.0% and (b) 5.97%.


3.3. Detection of the skin area with cosmetic foundation applied

Figures 7(a)–7(c) show the filtered-image results. The foundation was applied only on the left side of the face. Figure 7(b) shows the enhancement of the spectral difference as an enhancement of the color difference (the made-up area is shifted to be reddish). Figure 7(c) shows the discriminant score fd(C), computed according to Eq. (5) from the r-g chromaticity of each pixel. Colored pixels in Fig. 7(c) were classified as the foundation-applied area according to the classification rule; the figure shows that the detection was accurate.


Fig. 7 Results of filtered images captured using a digital camera. Foundation was applied only on the left side of the face. (a) RGB color image without the filter. (b) Filtered color image. The image in the figure shows obvious enhancement. (c) Discriminant score computed according to Eq. (5).


4. Discussion

4.1. Discrimination accuracy with and without luminance information

Throughout the above sections, we used r-g chromaticity coordinates as the color signals and derived all results from them. The transformation from trichromatic signals to chromaticity coordinates in Eq. (3) removes the luminance information from the trichromatic signal of Eq. (2). From a practical viewpoint, luminance information should not be included in the discriminant analysis because it is particularly susceptible to disturbances such as illuminant nonuniformity or the shape of the face; indeed, the intensity of each pixel in Fig. 7 is not constant. For that reason, we designed the optical filter to obtain high discrimination accuracy unaffected by luminance modulations.

To evaluate the contribution of luminance information to the discrimination of skin with and without foundation, discrimination accuracies were computed both with the luminance component Yi = CiR + CiG + CiB included and with it excluded. We used the trichromatic signals measured in Section 3.2 for this comparison and computed the discrimination accuracies with leave-one-out cross-validation. Figure 8 shows the results: the left two bars show the accuracies without the optical filter, and the right two bars show the accuracies with it. With filtering, the discrimination accuracy did not improve even when luminance information was included. Moreover, the improvement without filtering was very modest, implying that luminance information did not contribute to the discrimination because the trichromatic data set contained much luminance variation irrelevant to the cosmetic foundation. Therefore, using r-g chromaticity coordinates was a suitable choice for designing the filter.
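The leave-one-out evaluation can be sketched as below, refitting an Eq. (5)-style LDA classifier with each sample held out in turn. The synthetic clusters are stand-ins for the measured trichromatic signals; only the evaluation scheme is shown.

```python
import numpy as np

def lda_classify(x, C1, C2):
    """Return 1 or 2 via the sign of the Eq. (5) score (equal priors)."""
    mu1, mu2 = C1.mean(axis=0), C2.mean(axis=0)
    n1, n2 = len(C1), len(C2)
    pooled = ((n1 - 1) * np.cov(C1, rowvar=False)
              + (n2 - 1) * np.cov(C2, rowvar=False)) / (n1 + n2 - 2)
    w = np.linalg.solve(pooled, mu1 - mu2)
    return 1 if w @ (x - (mu1 + mu2) / 2) > 0 else 2

def loo_accuracy(C1, C2):
    """Leave-one-out accuracy: refit with each sample held out in turn."""
    hits, total = 0, len(C1) + len(C2)
    for i in range(len(C1)):
        hits += lda_classify(C1[i], np.delete(C1, i, axis=0), C2) == 1
    for i in range(len(C2)):
        hits += lda_classify(C2[i], C1, np.delete(C2, i, axis=0)) == 2
    return hits / total

# Well-separated toy clusters standing in for the filtered chromaticities.
rng = np.random.default_rng(1)
C1 = rng.normal([0.40, 0.33], 0.005, size=(60, 2))
C2 = rng.normal([0.36, 0.31], 0.005, size=(60, 2))
acc = loo_accuracy(C1, C2)
```

Running the same scheme on 2-D chromaticity vectors versus 3-D (chromaticity plus Y) inputs is all that is needed to reproduce the comparison of Fig. 8.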


Fig. 8 Discrimination accuracy with and without luminance information. Trichromatic signals measured in Section 3.2 were used for this comparison and the discrimination accuracies were computed with leave-one-out cross validation. The left two bars show the discrimination accuracies when the optical filter was not equipped and the right two bars show the accuracies with the filter.


4.2. Devices and material dependencies

This filter was designed for the RGB color sensor of one digital camera (Nikon D70), and universality across other RGB digital cameras was not considered; however, almost all RGB cameras have quite similar spectral sensitivities. We examined the device dependency for three digital cameras (Nikon D70, D100, and D40X) using color signals extracted from color patches (X-rite Digital ColorChecker Semi Gloss) and several facial images with and without cosmetic foundation, and the RGB color spaces were successfully calibrated by a simple color-calibration process using these color signals [20]. This calibration process also works effectively for compensating errors introduced by different illumination light sources: the discrimination accuracy under a different fluorescent light (Vita-Lite, 5500 K) was 95.9% for three subjects. However, the filter would not work if the illumination source had no spectral radiant energy in the pass bands of the filter (410–460 nm, 500–550 nm, and 560–610 nm).
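A simple linear color calibration of the kind mentioned above can be sketched as a 3 × 3 matrix fitted by least squares between patch RGBs of two cameras. The patch values and the true mixing matrix below are synthetic placeholders, and this is only one plausible form of the calibration described in [20].

```python
import numpy as np

# Synthetic patch RGBs for a reference camera (24 patches, ColorChecker-like).
rng = np.random.default_rng(3)
ref_rgb = rng.uniform(0.1, 0.9, size=(24, 3))

# Hypothetical linear response of a second camera (near-identity mixing).
M_true = np.array([[1.05, 0.02, -0.01],
                   [0.03, 0.97,  0.02],
                   [-0.02, 0.01, 1.01]])
other_rgb = ref_rgb @ M_true.T

# Least-squares fit of M such that other_rgb @ M ~ ref_rgb
# (maps camera 2's color space back to the reference camera's).
M, *_ = np.linalg.lstsq(other_rgb, ref_rgb, rcond=None)
calibrated = other_rgb @ M
```

After calibration, the discriminant function trained on the reference camera can be applied to the second camera's (calibrated) color signals.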

According to an earlier study [14], the reflectance spectrum of human skin in the presence of cosmetic foundation depends on the reflectance spectrum of the bare skin and the optical properties of the material. Thus, in principle, the developed filter and discriminant function would work only for the cosmetic foundation products used in the spectral measurement (Section 2.3), because the trichromatic signals of the RGB sensor would differ for different compositions. However, in a further experiment to examine the material dependency of the discriminant score, about 30 types of foundation products were applied to forearms, and most of them were detected by the developed filter and discriminant function. The filter worked because it modulates the red sensitivity function of the digital camera to focus on the absorption property of hemoglobin, as mentioned in Section 2.4. Thus, our filter works well for various materials and skin colors because it targets the spectral difference caused by the absorption property of hemoglobin.

5. Conclusion

This study proposed a method to highlight the spectral differences between two target colors by using a single filter mounted in front of the lens of an RGB digital camera, and applied it to visualize the distribution of cosmetic foundation.

The spectral transmittance function of the filter was designed to minimize the LDA misclassification rate for two predefined spectral data sets transformed to color signals. The filter design process in Fig. 1 is formulated in the second section. We optically realized the designed transmittance as a multilayer thin-film filter (Fig. 4(b)). As shown in Figs. 6(a) and 6(b), the color-signal distributions of RGB images taken with a digital camera equipped with the filter clearly highlighted the spectral differences between the two spectral sets, which were invisible to the human eye. Finally, the presence of cosmetic foundation on the human face was accurately detected and visualized, as shown in Fig. 7(c). The filter can also function with different RGB cameras because commercially available RGB cameras are designed to approximately satisfy the Luther condition and the simple color-calibration process works well. Therefore, our filter can serve as an effective nondestructive inspection tool in the cosmetics field.

In this study, we focused on highlighting the spectral features for the discrimination of the target’s condition. However, it is possible to apply the method for other purposes, i.e., highlighting in order to quantify an object’s properties. We could obtain more results by modifying the objective function (Eq. (1)) for quantification.

Our method is not restricted to the cosmetics field but could be applied to other applications: food inspection, medical imaging, or other targets that require nondestructive inspection. The proposed system could be implemented as part of an online measuring system in a compact and an inexpensive way.

Acknowledgments

The authors appreciate the assistance of Itoh Optical Industrial Co. Ltd. for providing the optical filter and thereby helping in conducting the experiments. This work was also supported in part by the Global COE Program “Frontiers of Intelligent Sensing” from the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan.

References and links

1. K. Nishino, A. Kaarna, K. Miyazawa, and S. Nakauchi, “Spectral filtering for color discrimination enhancement,” in Proceedings of the 15th Color Imaging Conference (Albuquerque, New Mexico, USA), pp. 195–200 (2007).

2. K. Nishino, A. Kaarna, K. Miyazawa, H. Oda, and S. Nakauchi, “Optical implementation of spectral filtering for the enhancement of skin color discrimination,” Color Res. Appl. (to be published).

3. A. Kaarna, K. Nishino, K. Miyazawa, and S. Nakauchi, “Michromatic scope for enhancement of color difference,” Color Res. Appl. 35(2), 101–109 (2010).

4. E. Angelopoulou, The reflectance spectrum of human skin, (Technical Report MS-CIS-99–29, GRASP Laboratory, Department of Computer and Information Science, University of Pennsylvania, USA, 1999).

5. N. Tsumura, M. Kawabuchi, H. Haneishi, and Y. Miyake, “Mapping pigmentation in human skin by multi-visible-spectral imaging by inverse optical scattering technique,” J. Imag. Sci. Tech. 45(5), 444–450 (2001).

6. I. V. Meglinski and S. J. Matcher, “Quantitative assessment of skin layers absorption and skin reflectance spectra simulation in the visible and near-infrared spectral regions,” Physiol. Meas. 23(4), 741–753 (2002). [CrossRef]   [PubMed]  

7. G. N. Stamatas, B. Z. Zmudzka, N. Kollias, and J. Z. Beer, “Non-invasive measurements of skin pigmentation in situ,” Pigment Cell Res. 17(6), 618–626 (2004). [CrossRef]   [PubMed]  

8. S. J. Preece and E. Claridge, “Spectral filter optimization for the recovery of parameters which describe human skin,” IEEE Trans. Pattern Anal. Mach. Intell. 26(7), 913–922 (2004). [CrossRef]  

9. M. Moncrieff, S. Cotton, E. Claridge, and P. Hall, “Spectrophotometric intracutaneous analysis: a new technique for imaging pigmented skin lesions,” Br. J. Dermatol. 146(3), 448–457 (2002). [CrossRef]   [PubMed]  

10. J. K. Wagner, C. Jovel, H. L. Norton, E. J. Parra, and M. D. Shriver, “Comparing quantitative measures of erythema, pigmentation and skin response using reflectometry,” Pigment Cell Res. 15(5), 379–384 (2002). [CrossRef]   [PubMed]  

11. G. N. Stamatas, M. Southall, and N. Kollias, “In vivo monitoring of cutaneous edema using spectral imaging in the visible and near infrared,” J. Invest. Dermatol. 126(8), 1753–1760 (2006). [CrossRef]   [PubMed]  

12. G. N. Stamatas and N. Kollias, “In vivo documentation of cutaneous inflammation using spectral imaging,” J. Biomed. Opt. 12(5), 051603 (2007). [CrossRef]   [PubMed]  

13. J. Shiozawa, K. Nishikata, and N. Nakamura, “Applications of optically-arranged metal-acrylic films for super-covering makeups,” J. SCCJ 27, 326–326 (1993).

14. M. Doi, R. Ohtsuki, and S. Tominaga, “Spectral estimation of made-up skin color under various conditions,” in Proc. SPIE (San Jose, California, USA), pp. 606204 (2006).

15. N. Matsushiro and N. Ohta, “Theoretical analysis of subtractive color mixture characteristics III-realistic colorants and single tristimulus dimension,” Color Res. Appl. 30(5), 354–362 (2005). [CrossRef]  

16. S. Kirkpatrick, C. D. Gelatt Jr, and M. P. Vecchi, “Optimization by simulated annealing,” Science 220(4598), 671–680 (1983). [CrossRef]   [PubMed]  

17. J. Serra, Image Analysis and Mathematical Morphology (Academic Press, London, UK, 1982).

18. J. Serra, Image Analysis and Mathematical Morphology, Part II: Theoretical Advances (Academic Press, London, UK, 1988).

19. J. Serra and L. Vincent, “An overview of morphological filtering,” Circ. Sys. Sig. Proc. 11(1), 47–108 (1992). [CrossRef]  

20. R. S. Berns, Billmeyer and Saltzman’s Principles of Color Technology, 3rd ed. (Wiley-Interscience, 2000), Chap. 6.
