Abstract

Materials such as cosmetics applied to the face can severely inhibit biometric face-recognition systems operating in the visible spectrum. These products are typically composed of materials with differing spectral properties and color pigmentation that distort the perceived shape of the face. The surface of the face emits thermal radiation because of the living tissue beneath the skin. The emissivity of skin is approximately 0.99; in comparison, the oil- and plastic-based materials commonly found in cosmetics and face paints have emissivities of roughly 0.9–0.95 in the long-wavelength infrared (LWIR) part of the spectrum. Consequently, skin and both classes of material are good thermal emitters, and such coatings have little impact on the heat transferred from the face. Polarimetric-thermal imaging provides additional details of the face and likewise depends on the thermal radiation the face emits. In this paper, we use a metallic sphere to provide a theoretical analysis of the thermal conductivity of various materials commonly applied to the face. Additionally, we observe the impact of environmental conditions on the strength of the polarimetric signature and on the ability to recover geometric details. Finally, we show how these materials degrade the performance of traditional face-recognition methods and present an approach to mitigating this effect using polarimetric-thermal imaging.
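As a rough illustration of the emissivity comparison above (this sketch is not from the paper; the skin temperature and the specific emissivity values are assumptions chosen within the stated ranges), a graybody Stefan–Boltzmann estimate shows that a cosmetic coating changes the total radiated thermal power by only a few percent:

```python
# Illustrative graybody comparison (values assumed, not from the paper).
SIGMA = 5.670374419e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
T_SKIN = 307.0              # assumed facial skin temperature (~34 degC), K

def radiant_exitance(emissivity, temperature=T_SKIN):
    """Total radiant exitance of a graybody surface, W/m^2."""
    return emissivity * SIGMA * temperature**4

m_skin = radiant_exitance(0.99)   # bare skin
m_paint = radiant_exitance(0.92)  # oil/plastic-based coating, within 0.9-0.95
print(f"skin:  {m_skin:.1f} W/m^2")
print(f"paint: {m_paint:.1f} W/m^2")
print(f"relative difference: {100 * (m_skin - m_paint) / m_skin:.1f}%")
```

With these assumed values the exitance differs by roughly 7%, consistent with the statement that such coatings have little impact on the thermal signature.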

© 2016 Optical Society of America

References

  1. Entry/Exit Transformation Office, "U.S. customs and border protection," 2014, http://www.cbp.gov/sites/default/files/documents/Entry%20Exit%20Fact%20Sheet.PDF.
  2. NiDirect Government Services, "Using ePassport gates at airport border control," NiDirect, accessed 15 January 2016, http://www.nidirect.gov.uk/using-epassport-gates-at-airport-border-control.
  3. FBI Criminal Justice Information Service Division, "FBI announces full operational capability of the next generation identification system," National Press Releases, 15 September 2014, https://www.fbi.gov/news/pressrel/press-releases/fbi-announces-full-operational-capability-of-the-next-generation-identification-system.
  4. X. Zhu and D. Ramanan, "Face detection, pose estimation, and landmark localization in the wild," in IEEE Conference on Computer Vision and Pattern Recognition, Providence, Rhode Island, 2012.
  5. A. Wagner, J. Wright, A. Ganesh, Z. Zhou, H. Mobahi, and Y. Ma, "Toward a practical face recognition system: robust alignment and illumination by sparse representation," IEEE Trans. Pattern Anal. Mach. Intell. 34, 372–386 (2012).
  6. Y. Peng, A. Ganesh, J. Wright, W. Xu, and Y. Ma, "RASL: robust alignment by sparse and low-rank decomposition for linearly correlated images," IEEE Trans. Pattern Anal. Mach. Intell. 34, 2233–2246 (2012).
  7. Q. Qiu and R. Chellappa, "Compositional dictionaries for domain adaptive face recognition," IEEE Trans. Image Process. 24, 5152–5165 (2015).
  8. K. P. Gurton, A. J. Yuffa, and G. W. Videen, "Enhanced facial recognition for thermal imagery using polarimetric imaging," Opt. Lett. 39, 3857–3859 (2014).
  9. N. J. Short, S. Hu, P. K. Gurram, K. P. Gurton, and A. L. Chan, "Improving cross-modal face recognition using polarimetric imaging," Opt. Lett. 40, 882–885 (2015).
  10. B. Riggan, N. Short, and S. Hu, "Optimal feature learning and discriminative framework for polarimetric thermal to visible face recognition," in IEEE Winter Conference on Applications of Computer Vision, Lake Placid, New York, 2016.
  11. J. S. Tyo, B. M. Ratliff, J. K. Boger, W. T. Black, D. L. Bowers, and M. P. Fetrow, "The effects of thermal equilibrium and contrast in LWIR polarimetric images," Opt. Express 15, 15161–15167 (2007).
  12. C.-D. Wen and I. Mudawar, "Modeling the effects of surface roughness on the emissivity of aluminum alloys," Int. J. Heat Mass Transfer 49, 4279–4289 (2006).
  13. D. L. Jordan, G. D. Lewis, and E. Jakeman, "Emission polarization of roughened glass and aluminum surfaces," Appl. Opt. 35, 3583–3590 (1996).
  14. J. Steketee, "The influence of cosmetics and ointments on the spectral emissivity of skin," Phys. Med. Biol. 21, 920–930 (1976).
  15. T. Togawa, "Non-contact skin emissivity: measurement from reflectance using step change in ambient radiation temperature," Clin. Phys. Physiol. Meas. 10, 39–48 (1989).
  16. F. J. Sanchez-Marin, S. Calixto-Carrera, and C. Villasenor-Mora, "Novel approach to assess the emissivity of the human skin," J. Biomed. Opt. 14, 024006 (2009).
  17. M. L. Eckert, N. Kose, and J. L. Dugelay, "Facial cosmetics database and impact analysis on automatic face recognition," in IEEE International Workshop on Multimedia Signal Processing, Pula, Italy, 2013.
  18. C. Chen, A. Dantcheva, and A. Ross, "Automatic facial makeup detection with application in face recognition," in IAPR International Conference on Biometrics (ICB), Madrid, Spain, 2013.
  19. A. Dantcheva, C. Chen, and A. Ross, "Can facial cosmetics affect the matching accuracy of face recognition systems?" in IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), Washington, D.C., 2012.
  20. C. Chen, A. Dantcheva, and A. Ross, "An ensemble of patch-based subspaces for makeup-robust face recognition," J. Inf. Fusion 32, 80–92 (2016).
  21. G. Guo, L. Wen, and S. Yan, "Face authentication with makeup changes," IEEE Trans. Circuits Syst. Video Technol. 24, 814–825 (2014).
  22. A. Moeini, K. Faez, and H. Moeini, "Face recognition across makeup and plastic surgery from real-world images," J. Electron. Imaging 24, 053028 (2015).
  23. W. K. Widger and M. P. Woodall, "Integration of the Planck blackbody radiation function," Bull. Am. Meteorol. Soc. 57(10), 1217–1219 (1976).
  24. A. J. Yuffa, K. P. Gurton, and G. Videen, "Three-dimensional facial recognition using passive long-wavelength infrared polarimetric imaging," Appl. Opt. 53, 8514–8521 (2014).
  25. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "From error visibility to structural similarity," IEEE Trans. Image Process. 13, 600–612 (2004).
  26. B. Klare and A. K. Jain, "Heterogeneous face recognition: matching NIR to visible light images," in International Conference on Pattern Recognition, Istanbul, Turkey, 2010.
  27. J. D. Gibbons, Nonparametric Statistical Inference, 2nd ed. (M. Dekker, 1985).

Supplementary Material (2)

Visualization 1 (AVI, 40620 KB): Media file illustrating data collection for Scenario 1.
Visualization 2 (AVI, 46320 KB): Media file illustrating data collection for Scenario 2.

Figures (7)

Fig. 1. (a) Visible images of a subject without oil-based paint (left), with partial coverage (middle), and with full coverage (right). Simultaneously acquired (b) thermal and (c) polarimetric-thermal images.
Fig. 2. Typical imaging scene. Waves produced by the optical background are denoted by a dashed blue line, waves emitted by the target by a solid green line, and waves emitted by the image background by a dotted red line.
Fig. 3. Metal ball painted with Krylon flat black paint and three different makeup materials: the upper-left quadrant is painted with gel makeup, the lower left with black makeup paint, the lower right with green makeup paint, and the upper right is makeup free.
Fig. 4. (a) DoLP and (b) retrieved mapping of surface normals θ for a ball heated to (top) 45°C, (middle) 35°C, and (bottom) 30°C.
Fig. 5. Face signature differences in the visible spectrum before (left) and after (right) the application of makeup for two scenarios. Annotations indicate the type of material applied to each portion of the face. Visualization 1 illustrates the signature differences of a dynamic face with cosmetics acquired by visible, thermal, and polarimetric-thermal imagers; Visualization 2 illustrates the same for the exaggerated makeup example.
Fig. 6. Cropped face signatures used for recognition before (top) and after (middle) the application of cosmetic material in (a), (d) the visible; (b), (e) polarimetric LWIR; and (c), (f) conventional LWIR. The bottom row shows local structural similarity measures of pixel neighborhoods in the collected imagery.
Fig. 7. (Top left) Average match score from comparing probe images with target and nontarget data for the no-makeup/makeup case using visible and polarimetric-thermal images. (Top right) Difference in the match scores of target and nontarget subjects resulting from the application of makeup. (Bottom) The same charts for the exaggerated cosmetics scenario.

Equations (4)


$$B(\nu) = \int_{\nu}^{\infty} L_{\tilde{\nu}}\, d\tilde{\nu} = \int_{\nu}^{\infty} \frac{2\times 10^{8}\, h c^{2}\, \tilde{\nu}^{3}}{e^{100 h c \tilde{\nu}/(k_{B} T)} - 1}\, d\tilde{\nu}, \tag{1}$$

$$B(\nu) = \frac{2 k_{B}^{4} T^{4}}{h^{3} c^{2}} \sum_{n=1}^{\infty} \left( \frac{x^{3}}{n} + \frac{3x^{2}}{n^{2}} + \frac{6x}{n^{3}} + \frac{6}{n^{4}} \right) e^{-n x} \quad \mathrm{W\,m^{-2}\,sr^{-1}}, \tag{2}$$

$$\int_{\nu_{1}}^{\nu_{2}} L_{\tilde{\nu}}\, d\tilde{\nu} = B(\nu_{1}) - B(\nu_{2}), \tag{3}$$
$$\mathrm{SSIM}(I_{w}, J_{w}) = \frac{\bigl(2\mu_{I_{w}}\mu_{J_{w}} + (k_{1}L)^{2}\bigr)\bigl(2\sigma_{I_{w}J_{w}} + (k_{2}L)^{2}\bigr)}{\bigl(\mu_{I_{w}}^{2} + \mu_{J_{w}}^{2} + (k_{1}L)^{2}\bigr)\bigl(\sigma_{I_{w}}^{2} + \sigma_{J_{w}}^{2} + (k_{2}L)^{2}\bigr)}, \tag{4}$$
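The following minimal Python sketch is ours rather than code from the paper: it sums the Widger–Woodall series of Eq. (2), with x = 100 h c ν / (k_B T) for a wavenumber ν in cm⁻¹ as implied by Eq. (1), forms the in-band radiance of Eq. (3), and evaluates the windowed SSIM of Eq. (4) with the conventional defaults k₁ = 0.01 and k₂ = 0.03 from Wang et al. [25]. The 100-term truncation and the LWIR band limits in the example are assumptions.

```python
import numpy as np

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m s^-1
KB = 1.380649e-23     # Boltzmann constant, J K^-1

def planck_cumulative(nu_cm, temperature, n_terms=100):
    """B(nu): blackbody radiance integrated from wavenumber nu (cm^-1) to
    infinity via the Widger-Woodall series of Eq. (2).  Units: W m^-2 sr^-1."""
    x = 100.0 * H * C * nu_cm / (KB * temperature)   # dimensionless argument
    n = np.arange(1, n_terms + 1, dtype=float)
    series = np.sum(np.exp(-n * x) *
                    (x**3 / n + 3.0 * x**2 / n**2 + 6.0 * x / n**3 + 6.0 / n**4))
    return 2.0 * KB**4 * temperature**4 / (H**3 * C**2) * series

def band_radiance(nu1_cm, nu2_cm, temperature):
    """In-band radiance between wavenumbers nu1 < nu2 (cm^-1), per Eq. (3)."""
    return planck_cumulative(nu1_cm, temperature) - planck_cumulative(nu2_cm, temperature)

def ssim_window(iw, jw, dynamic_range=255.0, k1=0.01, k2=0.03):
    """SSIM of two image windows iw, jw, per Eq. (4)."""
    iw = np.asarray(iw, dtype=float)
    jw = np.asarray(jw, dtype=float)
    c1, c2 = (k1 * dynamic_range)**2, (k2 * dynamic_range)**2
    mu_i, mu_j = iw.mean(), jw.mean()
    var_i, var_j = iw.var(), jw.var()
    cov_ij = ((iw - mu_i) * (jw - mu_j)).mean()
    return ((2.0 * mu_i * mu_j + c1) * (2.0 * cov_ij + c2)) / \
           ((mu_i**2 + mu_j**2 + c1) * (var_i + var_j + c2))

# Example (assumed band): radiance of a 305 K blackbody over 7.5-14 um,
# i.e., wavenumbers of roughly 714-1333 cm^-1.
print(band_radiance(714.0, 1333.0, 305.0))
```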
