Abstract

Current stereo eye-tracking methods model the cornea as a sphere with a single refractive surface. The human cornea, however, is slightly aspheric and has two refractive surfaces. Here we used ray tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. Even at best, the resulting errors range between ±1.0 degrees. This shows that stereo eye-tracking may be an option when reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea.

© 2017 Optical Society of America


References

1. A. T. Duchowski, "A breadth-first survey of eye-tracking applications," Behav. Res. Methods Instrum. Comput. 34(4), 455–470 (2002).
2. A. Poole and L. J. Ball, "Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects," in Encyclopedia of Human-Computer Interaction, C. Ghaoui, ed. (Idea Group Reference, 2005), pp. 211–219.
3. K. Rayner, "Eye movements in reading and information processing: 20 years of research," Psychol. Bull. 124(3), 372–422 (1998).
4. N. Wade and B. Tatler, The Moving Tablet of the Eye: The Origins of Modern Eye Movement Research (Oxford University Press, 2005).
5. K. Holmqvist, M. Nyström, R. Andersson, R. Dewhurst, H. Jarodzka, and J. Van De Weijer, Eye Tracking: A Comprehensive Guide to Methods and Measures (Oxford University Press, 2011).
6. D. A. Robinson, "A method of measuring eye movement using a scleral search coil in a magnetic field," IEEE Trans. Bio-med. Electron. 10(4), 137–145 (1963).
7. H. Collewijn, F. van der Mark, and T. C. Jansen, "Precise recording of human eye movements," Vision Res. 15(3), 447–450 (1975).
8. B. Shackel, "Pilot study in electro-oculography," Br. J. Ophthalmol. 44(2), 89–113 (1960).
9. E. D. Guestrin and M. Eizenman, "General theory of remote gaze estimation using the pupil center and corneal reflections," IEEE Trans. Biomed. Eng. 53(6), 1124–1133 (2006).
10. Z. Zhu and Q. Ji, "Eye gaze tracking under natural head movements," in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) (IEEE, 2005), pp. 918–923.
11. C. A. Hennessey and P. D. Lawrence, "Improving the accuracy and reliability of remote system-calibration-free eye-gaze tracking," IEEE Trans. Biomed. Eng. 56(7), 1891–1900 (2009).
12. T. N. Cornsweet and H. D. Crane, "Accurate two-dimensional eye tracker using first and fourth Purkinje images," J. Opt. Soc. Am. 63(8), 921–928 (1973).
13. D. W. Hansen and Q. Ji, "In the eye of the beholder: a survey of models for eyes and gaze," IEEE Trans. Pattern Anal. Mach. Intell. 32(3), 478–500 (2010).
14. S.-W. Shih, Y.-T. Wu, and J. Liu, "A calibration-free gaze tracking technique," in Proceedings of the 15th International Conference on Pattern Recognition (ICPR-2000) (IEEE, 2000), pp. 201–204.
15. J. Chen, Y. Tong, W. Gray, and Q. Ji, "A robust 3D eye gaze tracking system using noise reduction," in Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08) (ACM, 2008), pp. 189–196.
16. E. Trucco, T. Anderson, M. Razeto, and S. Ivekovic, "Robust correspondenceless 3-D iris location for immersive environments," in International Conference on Image Analysis and Processing, F. Roli and S. Vitulano, eds. (Springer-Verlag Berlin Heidelberg, 2005), pp. 123–130.
17. S. Kohlbecher, S. Bardins, K. Bartl, E. Schneider, T. Poitschke, and M. Ablassmeier, "Calibration-free eye tracking by reconstruction of the pupil ellipse in 3D space," in Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08) (ACM, 2008), pp. 135–138.
18. C. Lai, S. Shih, and Y. Hung, "Hybrid method for 3-D gaze tracking using glint and contour features," IEEE Trans. Circ. Syst. Video Tech. 25(1), 24–37 (2015).
19. R. Navarro, J. Santamaría, and J. Bescós, "Accommodation-dependent model of the human eye with aspherics," J. Opt. Soc. Am. A 2(8), 1273–1281 (1985).
20. C. Fedtke, F. Manns, and A. Ho, "The entrance pupil of the human eye: a three-dimensional model as a function of viewing angle," Opt. Express 18(21), 22364–22376 (2010).
21. H. J. Wyatt, "The human pupil and the use of video-based eyetrackers," Vision Res. 50(19), 1982–1988 (2010).
22. B. Gagl, S. Hawelka, and F. Hutzler, "Systematic influence of gaze position on pupil size measurement: analysis and correction," Behav. Res. Methods 43(4), 1171–1181 (2011).
23. J.-Y. Bouguet, "Camera Calibration Toolbox for Matlab," http://www.vision.caltech.edu/bouguetj/
24. A. Fitzgibbon, M. Pilu, and R. B. Fisher, "Direct least square fitting of ellipses," IEEE Trans. Pattern Anal. Mach. Intell. 21(5), 476–480 (1999).
25. S.-W. Shih and J. Liu, "A novel approach to 3-D gaze tracking using stereo cameras," IEEE Trans. Syst. Man Cybern. B Cybern. 34(1), 234–245 (2004).
26. T. Nagamatsu, J. Kamahara, T. Iko, and N. Tanaka, "One-point calibration gaze tracking based on eyeball kinematics using stereo cameras," in Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08) (ACM, 2008), pp. 95–98.



Figures (9)

Fig. 1

Top view of the simulated stereo eye-tracking set-up. Two virtual cameras were placed 120 mm apart, with their nodal points 60 mm to the left and right of the origin in a right-handed coordinate system. The optical axes of both cameras intersect at 400 mm from the origin in the horizontal plane. The simulated eye was placed at different spatial locations (X ∈ {0, 30, 60} mm, Y ∈ {0, 30, 60} mm, Z = 400 mm).
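
As a quick consistency check on this geometry, the camera toe-in angle follows directly from the stated dimensions. The short Python sketch below is our own illustration (not code from the study); it encodes the nodal-point positions and the 400 mm convergence distance.

```python
import numpy as np

# Nodal points of the two virtual cameras, 120 mm apart (Fig. 1),
# in a right-handed coordinate system with the origin midway between them.
left_cam = np.array([-60.0, 0.0, 0.0])    # mm
fixation = np.array([0.0, 0.0, 400.0])    # optical axes intersect here (mm)

# Angle over which each camera is rotated toward the midline so that its
# optical axis passes through the intersection point.
toe_in = np.degrees(np.arctan2(abs(left_cam[0] - fixation[0]),
                               fixation[2] - left_cam[2]))
print(f"each camera is toed in by about {toe_in:.2f} degrees")  # ~8.53 deg
```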

Fig. 2

3D reconstruction of the virtual pupil. For each point on the pupil boundary, the refracted ray that passes through the nodal point of the camera was determined. Subsequently, the refracted rays from both cameras were triangulated to obtain the 3D coordinates of each corresponding point of the virtual pupil. Parameters of the Navarro eye model are indicated on the left-hand side.
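
The triangulation step described here reduces to finding the point closest to two generally skew rays. The following Python sketch shows a generic midpoint triangulation; the example ray origins and directions are hypothetical placeholders, since the actual refracted rays come from the ray-tracing model.

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays o_i + t_i * d_i."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    b = o2 - o1
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a11 * a22 - a12 ** 2          # zero only for parallel rays
    t1 = (a22 * (d1 @ b) - a12 * (d2 @ b)) / denom
    t2 = (a12 * (d1 @ b) - a11 * (d2 @ b)) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2   # closest points on each ray
    return 0.5 * (p1 + p2)                # 3D estimate of the boundary point

# Example with hypothetical rays through the camera nodal points of Fig. 1:
point = triangulate_rays(np.array([-60.0, 0.0, 0.0]), np.array([0.14, 0.0, 0.99]),
                         np.array([60.0, 0.0, 0.0]), np.array([-0.16, 0.0, 0.99]))
```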

Fig. 3

Estimation of the center of corneal curvature. A. The ray-trace model used to estimate the center of corneal curvature based on a spherical anterior cornea; L1 and L2 are the light sources, and C1 and C2 are the cameras. The incident ray is reflected at the anterior corneal surface, with the angle of incidence θi equal to the angle of reflection θr. The normal vectors intersect at the center of corneal curvature. B. The intersecting normal vectors in the case of a spherical cornea. C. The normal vectors at the surface of an aspherical cornea.
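
When the cornea is aspherical (panel C), the surface normals no longer meet in a single point, so the center of corneal curvature can be estimated as the point with the smallest summed squared distance to all normal lines. The sketch below implements that generic least-squares step in Python; the example points and directions are hypothetical.

```python
import numpy as np

def nearest_point_to_lines(points, directions):
    """Least-squares intersection of lines p_i + t * n_i.

    Minimizes the summed squared distance of a single point to all lines,
    which gives an estimate of the center of corneal curvature from the
    reflection normals of several light-source/camera pairs (Fig. 3).
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, n in zip(points, directions):
        n = n / np.linalg.norm(n)
        M = np.eye(3) - np.outer(n, n)   # projects onto the plane normal to the line
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

# Hypothetical reflection points and normal directions for two glints (mm):
origins = [np.array([0.0, 0.0, 392.3]), np.array([1.5, 0.0, 392.4])]
normals = [np.array([0.05, 0.0, 1.0]), np.array([-0.12, 0.0, 1.0])]
cc_estimate = nearest_point_to_lines(origins, normals)
```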

Fig. 4

Gaze reconstruction using conic algebra. For each camera, a cone can be back-projected through the image of the virtual pupil in the image plane. The virtual pupil lies at the intersection of these two cones.
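
In homogeneous coordinates the back-projected cone has a compact form: with the pupil image written as a symmetric conic matrix C (fitted, for instance, with the direct least-squares ellipse method of ref. 24) and P the 3×4 camera projection matrix, the cone is the quadric Q = PᵀCP, since any world point X whose projection lies on the conic satisfies XᵀPᵀCPX = 0. A minimal Python sketch under these generic assumptions, not the authors' implementation:

```python
import numpy as np

def conic_matrix(a, b, c, d, e, f):
    """Conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 as a symmetric 3x3 matrix."""
    return np.array([[a, b / 2, d / 2],
                     [b / 2, c, e / 2],
                     [d / 2, e / 2, f]])

def backproject_cone(P, C):
    """Quadric cone Q = P^T C P with its apex at the camera center.

    A homogeneous world point X lies on the cone iff X^T Q X = 0, i.e. iff
    its image x = P X lies on the image conic (x^T C x = 0).
    """
    return P.T @ C @ P

# Hypothetical fitted ellipse coefficients and a placeholder projection matrix:
C = conic_matrix(1.0, 0.1, 1.4, -2.0, -1.5, 1.0)
P = np.hstack([np.eye(3), np.array([[60.0], [0.0], [0.0]])])
Q = backproject_cone(P, C)
```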

Fig. 5

Location, orientation and shape of the virtual pupil obtained through ray tracing compared to the actual pupil for two different cases. A. The actual pupil is rotated 15 degrees to the right. Top view of the simulation. B. The actual pupil is rotated 15 degrees up. Side view of the simulation. The optical axis (OA) always goes through the fixed center of corneal curvature (fCC) and the center of the actual pupil (AP). The 3D shape of the virtual pupil boundary varied between conditions. A plane was fit through the boundary points of the virtual pupil (VPplane). The normal vector of this plane (VPnorm) was used to describe the orientation of the virtual pupil. The VP-CC vector is defined by the estimated center of corneal curvature (CC) and the center of the virtual pupil (VP).
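
Fitting VPplane through the boundary points and taking VPnorm as its normal can be done with a singular value decomposition; a minimal, self-contained Python sketch (the boundary points below are hypothetical):

```python
import numpy as np

def fit_plane_normal(points):
    """Least-squares plane through 3D points; returns (centroid, unit normal).

    The normal is the right singular vector belonging to the smallest singular
    value of the centered point cloud, i.e. the direction of least variance.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Example with hypothetical virtual-pupil boundary points (mm):
boundary = np.array([[1.8, 0.0, 396.2], [0.0, 1.9, 396.4],
                     [-1.8, 0.1, 396.3], [0.1, -1.9, 396.1]])
vp_center, vp_norm = fit_plane_normal(boundary)
```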

Fig. 6

Reconstruction of gaze direction with stereo eye-tracking methods compared to the orientation of the actual and virtual pupil. A. The actual pupil was positioned at the central location between both cameras (X = 0 mm, Y = 0 mm, Z = 400 mm). B. The actual pupil was shifted 6 cm to the left and 6 cm downwards, simulating a translation of the head.

Fig. 7

Accuracy of the different gaze reconstruction methods for a fixed pupil diameter of 4 mm. A. The conic algebra method. B. The VP-fCC method. C. The VP-CC method. Each plot shows the horizontal/vertical gaze reconstruction errors as a function of the horizontal/vertical orientation of the actual pupil for each of the nine different pupil/head translations. Data are averaged either across the nine different vertical pupil orientations (left-hand plots) or the nine different horizontal pupil orientations (right-hand plots). Colors identify the magnitude of the horizontal (left-hand plots) or vertical (right-hand plots) translation. Note the scaling differences between plots A, B and C.

Fig. 8

The effect of pupil size on stereo eye-tracking methods. A. The conic algebra method. B. The VP-fCC method. C. The VP-CC method. The mean and range of the gaze errors are plotted for all pupil diameters at two head positions. The dashed lines indicate the results for a pupil diameter of 4 mm. For clarity, the actual pupil was only translated horizontally (left-hand plots) or vertically (right-hand plots). Note the scaling differences between the horizontal and vertical errors.

Fig. 9

The influence of size, orientation and position of the actual pupil on the distance between the estimation of the center of corneal curvature and the center of the virtual pupil. The color coding in each square represents the distance between VP and CC in mm for one simulated gaze orientation. The left panels show the results for a pupil size of 1 mm and the right panels the results for a pupil size of 6 mm. A. The actual pupil was located at the central position. B. The actual pupil was shifted 6 cm to the left, and 6 cm down.

Tables (1)


Table 1. Parameters of the Navarro schematic eye model. The surface of the anterior cornea is described by the formula $x^2 + y^2 + (1+Q)z^2 - 2Rz = 0$, where Q is the conic constant and R is the radius of curvature.
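
For ray tracing it is often convenient to use the explicit sag form of this conicoid, obtained by solving the quadratic for z and keeping the root with z(0) = 0. The anterior-cornea values quoted in the comment are the commonly cited Navarro parameters and are an assumption here, since the table itself is not reproduced.

```latex
% Sag form of the conicoid of Table 1, with r^2 = x^2 + y^2,
% obtained from x^2 + y^2 + (1+Q) z^2 - 2 R z = 0 by taking the root z(0) = 0:
z(r) = \frac{r^{2}}{R + \sqrt{R^{2} - (1+Q)\,r^{2}}}
% For the anterior cornea of the Navarro model, R = 7.72 mm and Q = -0.26
% (values quoted from the model description, not from the table shown here).
```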

Equations (3)


$$(\mathbf{L}-\mathbf{C})\times(\mathbf{U}-\mathbf{C})\cdot(\mathbf{C}_{CC}-\mathbf{C})=0. \tag{1}$$

$$\frac{\mathbf{p}-\mathbf{c}}{\lVert\mathbf{p}-\mathbf{c}\rVert}=\begin{bmatrix}\cos\varphi\,\sin\theta\\ \sin\varphi\\ \cos\varphi\,\cos\theta\end{bmatrix}. \tag{2}$$

$$\begin{bmatrix}x_n\\ y_n\\ z_n\end{bmatrix}=\begin{bmatrix}\cos\varphi\,\sin\theta\\ \sin\varphi\\ \cos\varphi\,\cos\theta\end{bmatrix}. \tag{3}$$
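
Equation (3) maps the horizontal and vertical gaze angles θ and φ to a unit gaze vector; inverting it recovers the angles from a reconstructed direction. A small Python sketch of this round trip (the function names are ours; the angle convention simply follows Eq. (3)):

```python
import numpy as np

def angles_to_vector(theta, phi):
    """Unit gaze vector from angles (radians), following Eq. (3)."""
    return np.array([np.cos(phi) * np.sin(theta),
                     np.sin(phi),
                     np.cos(phi) * np.cos(theta)])

def vector_to_angles(v):
    """Inverse mapping: recover (theta, phi) in radians from a gaze vector."""
    v = v / np.linalg.norm(v)
    phi = np.arcsin(v[1])
    theta = np.arctan2(v[0], v[2])
    return theta, phi

# Round trip for theta = 15 deg, phi = 10 deg:
t, p = np.radians(15.0), np.radians(10.0)
assert np.allclose(vector_to_angles(angles_to_vector(t, p)), (t, p))
```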
