Abstract

In most 3D visualization applications that use head-mounted displays (HMDs), head pose is used to approximate the user's line of sight for real-time image rendering and interaction. The eye, however, often reaches an object of interest before most head movements are completed, so integrating eye-tracking capability into HMDs is highly desirable for a wide range of applications. While the added complexity of an eye-tracked HMD (ET-HMD) imposes challenges on designing a compact, portable, and robust system, the integration also offers opportunities to improve eye-tracking accuracy and robustness. In this paper, based on a model of the eye imaging and tracking system, we examine the challenges and identify the parametric requirements of video-based pupil-glint tracking methods in an ET-HMD design, and we predict how these parameters may affect tracking accuracy, resolution, and robustness. We further present novel methods and associated algorithms that effectively improve eye-tracking accuracy and extend the tracking range.

© 2006 Optical Society of America
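As background to the video-based pupil-glint tracking discussed above, the following minimal Python sketch illustrates the standard calibration idea: the vector from a corneal glint to the pupil centroid is mapped to gaze angles through a low-order polynomial fitted on known calibration targets. Because the glint and the pupil move together under small shifts of the display, this difference vector is less sensitive to slippage than the pupil position alone. The second-order polynomial form, the function names, and the synthetic nine-point calibration data are illustrative assumptions, not the implementation described in this paper.

import numpy as np

def fit_gaze_map(pg_vectors, gaze_angles):
    """Fit a second-order polynomial map from pupil-glint vectors (pixels)
    to gaze angles (degrees) using least squares.
    pg_vectors: (N, 2) array of (dx, dy) = pupil centroid - glint position.
    gaze_angles: (N, 2) array of known (azimuth, elevation) calibration targets.
    """
    dx, dy = pg_vectors[:, 0], pg_vectors[:, 1]
    # Design matrix with constant, linear, and quadratic terms.
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
    # Solve A @ coeffs ~= gaze_angles for each gaze component.
    coeffs, *_ = np.linalg.lstsq(A, gaze_angles, rcond=None)
    return coeffs  # shape (6, 2)

def apply_gaze_map(coeffs, pg_vector):
    dx, dy = pg_vector
    a = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    return a @ coeffs  # estimated (azimuth, elevation) in degrees

# Hypothetical 9-point calibration: target angles and measured pupil-glint vectors.
targets = np.array([[x, y] for y in (-15, 0, 15) for x in (-20, 0, 20)], float)
measured = targets * 0.35 + np.random.default_rng(0).normal(0, 0.2, targets.shape)
coeffs = fit_gaze_map(measured, targets)
print(apply_gaze_map(coeffs, measured[4]))  # central target: should be near (0, 0)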






Figures (13)

Fig. 1. Schematic diagram of an ET-HMD integrated system.

Fig. 2. Simulation of an eye illumination and imaging system: (a) Schematic eye model; (b) Illustration of coordinate systems and parameters.

Fig. 3. Examples of simulated eye features: (a) (0, 0) degrees of eye rotation; (b) (30, 30) degrees of eye rotation.

Fig. 4. Movements of the pupil centroid and an on-axis glint as the eye rotates diagonally: (a) Absolute displacements from the sensor center; (b) Absolute velocities; (c) Relative velocity of the glint vs. the pupil centroid.

Fig. 5. Characterization of an off-axis glint movement as the eye rotates diagonally: (a) Horizontal glint location; (b) Vertical glint location; (c) Horizontal glint velocity; (d) Vertical glint velocity.

Fig. 6. The non-linearity of glint movement depends on the placement of the camera relative to the eye: (a) On-axis glint; (b) Off-axis glint.

Fig. 7. Examples of eye illumination schemes: (a) Single IR-LED illuminator; (b) Four IR-LED illuminators.

Fig. 8. Movements of four symmetrically arranged off-axis glints as the eye rotates diagonally: (a) Horizontal glint displacements from their zero-degree locations; (b) Vertical glint displacements from their zero-degree locations; (c) Horizontal glint velocity; (d) Vertical glint velocity.

Fig. 9. Comparison of the virtual glint and the glint centroid created by four symmetrically arranged off-axis LEDs: (a) The virtual glint, independent of the offset distance r of the LEDs, follows the same path as an on-axis glint, while the distance between two adjacent glints (e.g., g2 and g3) varies with the offset distance r; the glint centroid shows significant non-linearity and disparity from the path of the on-axis glint in both the (b) horizontal and (c) vertical directions.

Fig. 10. Illustrations of the new tracking methods: (a) The geometrical conditions for computing an on-axis virtual glint from multiple off-axis glints; (b) The geometrical conditions for the orthogonality approximation; (c) The geometrical conditions for the parallelism approximation; (d) Estimation of the virtual glint using the orthogonality and parallelism conditions when only two of the four glints are detected (see the sketch following the figure list).

Fig. 11. (a) An undetected glint causes tracking of the virtual glint to fail; (b) The orthogonality approximation preserves tracking of the virtual glint despite the undetected glint.

Fig. 12. Orthogonality approximation to estimate the virtual glint: (a) The angle θ remains constant at 90° when the camera axis is perpendicular to the LED plane, and varies within 88° to 92° when the LED plane is tilted by up to 40° away from its perpendicular orientation; (b) When the LED plane is tilted by up to 20°, the error of the virtual glint estimated from three of the four glints is less than 1 pixel with the orthogonality approximation.

Fig. 13. Transform estimation: (a) A reference frame with all four glints detected; (b) Estimation of the fourth, undetected glint and the virtual glint from the transform M1 estimated from three tracked glints; (c) Estimation of the two undetected glints and the virtual glint from the transform M2 estimated from two tracked glints.
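Figures 9 and 10 describe constructing an on-axis "virtual glint" from four symmetrically placed off-axis glints, and Figs. 11-13 describe recovering it when some glints are undetected. The Python sketch below shows one plausible reading of that construction, consistent with the line-intersection form of g_0 in the equation list: the virtual glint is taken as the intersection of the two diagonals g1-g3 and g2-g4, and a single missing glint is filled in with a parallelogram (parallelism) assumption. The exact geometric conditions and the orthogonality-based fallback used in the paper may differ; this is an illustration, not the authors' algorithm.

import numpy as np

def line_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (2D, via homogeneous cross products)."""
    h = lambda p: np.array([p[0], p[1], 1.0])
    l1 = np.cross(h(p1), h(p2))          # line through p1 and p2
    l2 = np.cross(h(p3), h(p4))          # line through p3 and p4
    x = np.cross(l1, l2)                 # intersection point in homogeneous coords
    return x[:2] / x[2]

def virtual_glint(glints):
    """glints: dict with keys 'g1'..'g4' (opposite corners are g1-g3 and g2-g4);
    values are (u, v) image coordinates, or None if the glint was not detected."""
    g = dict(glints)
    missing = [k for k, v in g.items() if v is None]
    if len(missing) == 1:
        # Parallelism assumption: the four glints form (approximately) a
        # parallelogram, so a missing corner equals the sum of its two
        # neighbours minus the opposite corner.
        k = missing[0]
        opposite = {'g1': 'g3', 'g2': 'g4', 'g3': 'g1', 'g4': 'g2'}[k]
        neighbours = [n for n in g if n not in (k, opposite)]
        g[k] = np.add(g[neighbours[0]], g[neighbours[1]]) - np.asarray(g[opposite])
    elif len(missing) > 1:
        raise ValueError("need at least three detected glints")
    # Virtual glint: intersection of the two diagonals.
    return line_intersection(g['g1'], g['g3'], g['g2'], g['g4'])

# Example: a slightly skewed quadrilateral of glints with g4 undetected.
glints = {'g1': (100.0, 120.0), 'g2': (140.0, 122.0),
          'g3': (141.0, 161.0), 'g4': None}
print(virtual_glint(glints))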

Equations (14)


\[ \tilde{q} = \tilde{M}_{\mathrm{proj}} \cdot \tilde{Q} = \begin{bmatrix} \frac{1}{s_x} & 0 & \frac{u_0}{f} & 0 \\ 0 & \frac{1}{s_y} & \frac{v_0}{f} & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & \frac{1}{f} & 0 \end{bmatrix} \cdot \tilde{Q}, \]
\[ q = \begin{bmatrix} \tilde{x}_q / \tilde{w}_q & \tilde{y}_q / \tilde{w}_q \end{bmatrix}^{T}. \]
\[ T_{OE} = T_z(t)\, R_y(\beta)\, R_x(\alpha), \]
\[ \tilde{g} = M_{\mathrm{proj}} \cdot T_{OE} \cdot M_{\mathrm{refl}} \cdot T_{OE}^{-1} \cdot \tilde{L}, \]
\[ \tilde{p} = M_{\mathrm{proj}} \cdot T_{OE} \cdot M_{\mathrm{refr}} \cdot \tilde{P}, \]
\[ \tilde{s} = M_{\mathrm{proj}} \cdot T_{OE} \cdot \tilde{S}. \]
\[ f = \max\left( \frac{z_0 H}{a}, \frac{z_0 W}{b} \right), \]
\[ g_0 = \begin{bmatrix} u_{00} \\ v_{00} \end{bmatrix} = \frac{1}{k_2 - k_1} \begin{bmatrix} b_2 - b_1 \\ b_2 k_1 - b_1 k_2 \end{bmatrix}, \]
\[ \Delta \tilde{g} = M \cdot \tilde{g}, \]
\[ \begin{cases} x_{Ge} = \dfrac{r_{c1}}{2(r_e - z_e) - r_{c1}}\, x_e \\ y_{Ge} = \dfrac{r_{c1}}{2(r_e - z_e) - r_{c1}}\, y_e \\ z_{Ge} = \dfrac{r_{c1}(r_e - z_e)}{2(r_e - z_e) - r_{c1}} + r_e. \end{cases} \]
\[ M_{\mathrm{refl}} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 - \frac{2 r_e}{r_{c1}} & \frac{2 r_e (r_e - r_{c1})}{r_{c1}} \\ 0 & 0 & -\frac{2}{r_{c1}} & \frac{2 r_e}{r_{c1}} - 1 \end{bmatrix}. \]
\[ \begin{cases} \dfrac{1}{f_c} = (n_c - n_a) \left[ \dfrac{n_c - 1}{(n_c - n_a)\, r_{c1}} - \dfrac{1}{r_{c2}} + \dfrac{t_c (n_c - 1)}{n_c r_{c1} r_{c2}} \right] \\ \mathrm{FFL} = f_c - \Delta t_1 \\ \mathrm{BFL} = f_c - \Delta t_2, \end{cases} \]
\[ \begin{cases} x_{pe} = \dfrac{n_a f_c}{n_a f_c + (z_P - z_1)}\, x_P \\ y_{pe} = \dfrac{n_a f_c}{n_a f_c + (z_P - z_1)}\, y_P \\ z_{pe} = \dfrac{f_c}{n_a f_c + (z_P - z_1)}\, (z_P - z_1) + z_2 \end{cases} \]
\[ M_{\mathrm{refr}} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & \frac{1}{n_a} + \frac{z_1}{n_a f_c} & z_2 - \frac{z_1}{n_a} - \frac{z_1 z_2}{n_a f_c} \\ 0 & 0 & \frac{1}{n_a f_c} & 1 - \frac{z_1}{n_a f_c} \end{bmatrix} \]
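The equations above model image formation as a chain of 4x4 homogeneous transforms: the perspective projection M_proj, the eye pose T_OE = T_z(t) R_y(beta) R_x(alpha), and the corneal reflection and refraction matrices M_refl and M_refr. The Python sketch below assembles M_proj and T_OE and projects an eye-fixed feature point through the s = M_proj * T_OE * S relation at a few eye rotations; the coordinate conventions (camera looking along +z with the eye rotation center on its axis) and all numerical values are assumptions for illustration only. The M_refl and M_refr matrices above can be inserted into the same chain to simulate glint and refracted-pupil image locations, as in Figs. 4-6.

import numpy as np

def m_proj(f, sx, sy, u0, v0):
    """Perspective projection matrix (first equation above): camera-space point
    to homogeneous image coordinates; sx, sy are pixel pitches (mm/pixel),
    (u0, v0) the principal point in pixels, f the focal length (mm)."""
    return np.array([[1/sx, 0.0,  u0/f, 0.0],
                     [0.0,  1/sy, v0/f, 0.0],
                     [0.0,  0.0,  1.0,  0.0],
                     [0.0,  0.0,  1/f,  0.0]])

def t_oe(t, alpha, beta):
    """Eye-to-camera pose T_OE = T_z(t) R_y(beta) R_x(alpha); angles in radians."""
    ca, sa, cb, sb = np.cos(alpha), np.sin(alpha), np.cos(beta), np.sin(beta)
    rx = np.array([[1, 0, 0, 0], [0, ca, -sa, 0], [0, sa, ca, 0], [0, 0, 0, 1]])
    ry = np.array([[cb, 0, sb, 0], [0, 1, 0, 0], [-sb, 0, cb, 0], [0, 0, 0, 1]])
    tz = np.eye(4)
    tz[2, 3] = t
    return tz @ ry @ rx

def to_pixels(q_h):
    """Divide by the homogeneous coordinate (second equation above)."""
    return q_h[:2] / q_h[3]

# Assumed parameters: 16 mm lens, 10 um pixels, 640x480 sensor, eye rotation
# center 40 mm from the camera, feature point (here the pupil center, ignoring
# corneal refraction) 9.5 mm from the rotation center toward the camera.
P = m_proj(f=16.0, sx=0.01, sy=0.01, u0=320.0, v0=240.0)
S = np.array([0.0, 0.0, -9.5, 1.0])       # feature point in the eye frame (mm)

for alpha, beta in [(0.0, 0.0), (10.0, 10.0), (20.0, 20.0)]:
    T = t_oe(t=40.0, alpha=np.radians(alpha), beta=np.radians(beta))
    s_h = P @ T @ S                        # s = M_proj . T_OE . S (equation above)
    print((alpha, beta), to_pixels(s_h))   # image location in pixels; (320, 240) at zero rotation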
