Abstract

Stereoscopic augmented reality (AR) displays have a fixed focal plane and therefore suffer from visual discomfort due to the vergence-accommodation conflict (VAC). In this study, we demonstrate a biocular (i.e., common optics for both eyes, with the same image shown to each eye) two-focal-plane AR system with a real-time gaze tracker, which provides a novel interactive experience. To mitigate VAC, we propose a see-through near-eye display mechanism that generates two separate virtual image planes at arm's-length depths (25 cm and 50 cm). Our optical system generates the virtual images by relaying two liquid crystal displays (LCDs) through a beam splitter and a Fresnel lens. While the system is limited to two depths and discontinuities occur in the virtual scene, it provides correct focus cues and a natural blur effect at the corresponding depths. This allows the user to distinguish virtual information through the accommodative response of the eye, even when virtual objects overlap and partially occlude each other along the axial direction. The system also provides correct motion parallax cues within the user's movement range without the need for sophisticated head trackers. As a convenient use case of the proposed display, we realized a road-scene simulation in which a large monitor creates a background scene and the content rendered on the LCDs is augmented onto it. The field of view (FOV) is 60 × 36 degrees and the eye-box is larger than 100 mm, which is comfortable enough for two-eye viewing. The system includes a single-camera pupil and gaze tracker that selects the correct depth plane based on the shift in measured interpupillary distance (IPD) as the user's convergence angle changes. The rendered content can be distributed to both depth planes and the background scene simultaneously, so the user can select and interact with content at the correct depth in a natural and comfortable way.
The prototype system can be used in tasks that demand a wide FOV and multiple focal planes, and as an AR and vision-research tool.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement




Supplementary Material (2)

» Visualization 1 — Experimental video capture of the virtual scene
» Visualization 2 — Experimental video capture of the virtual scene



Figures (12)

Fig. 1. Schematic illustration of the optical layout.
Fig. 2. 3-D optical layout in ZEMAX.
Fig. 3. a) Distortion values of different field points when the eye is at the center of the eye-box. b) Distortion values of different field points when the eyes are in their natural position (i.e., off-centered by IPD/2 = 32.5 mm).
Fig. 4. Spot radius sizes for 9 sample field points, selected as the center, edge, and corner points of the display to cover the entire FOV. In this ZEMAX simulation, the eye is shifted by IPD/2 and the wavelengths are chosen to span the visible spectrum (red at 656.3 nm, green at 587.6 nm, and blue at 486.1 nm). Positions of the field points in the FOV are shown at bottom right.
Fig. 5. Prototype hardware, including two LCDs for the two focal planes and a camera that simultaneously tracks both eyes and computes the gaze distance.
Fig. 6. a) Content appearing in the near display. b) Content appearing in the far display.
Fig. 7. Experimental captures of the virtual scene. a) Camera focused at 4 diopters (25 cm). b) Camera focused at 2 diopters (50 cm). FOV is 60.0 × 35.5 degrees and the image can be seen with both eyes simultaneously (see Visualization 1 and Visualization 2).
Fig. 8. a) Geometrical model developed to estimate the user's convergence depth. b) Pupil centers illustrated on the left and right eyes. The blue and red dots indicate the pupil positions when the user converges on the near and far displays, respectively.
Fig. 9. a) Calibration and tracking process. b) Details of the eye-center and IPD estimation algorithm.
Fig. 10. a) Region of interest for one eye. b) Red-channel image. c) Thresholded and segmented image. d) Polar-transformed image. e) Radial edge points, the best-fitting ellipse, and its center (blue dot). f) Final result on both eyes (computed pupil center shown as a green dot).
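The pipeline in the Fig. 10 caption ends with fitting an ellipse to the radial edge points and taking its center as the pupil center. The sketch below shows only that algebraic step, using a simple unconstrained least-squares conic fit rather than the constrained direct least-squares ellipse fit (Fitzgibbon et al.) that the paper uses; the synthetic boundary points are an illustrative assumption, not the authors' data.

```python
import numpy as np

def fit_ellipse_center(x, y):
    """Fit the conic A x^2 + B xy + C y^2 + D x + E y + 1 = 0 to edge
    points by linear least squares and return the conic's center.
    (Simplified algebraic fit; the paper uses a constrained direct
    least-squares ellipse fit in the style of Fitzgibbon et al.)"""
    M = np.column_stack([x * x, x * y, y * y, x, y])
    A, B, C, D, E = np.linalg.lstsq(M, -np.ones_like(x), rcond=None)[0]
    det = 4 * A * C - B * B            # positive for an ellipse
    cx = (B * E - 2 * C * D) / det     # from grad(conic) = 0
    cy = (B * D - 2 * A * E) / det
    return cx, cy

# Synthetic "pupil boundary": ellipse centered at (3, 4), semi-axes 5 and 3.
t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
cx, cy = fit_ellipse_center(3 + 5 * np.cos(t), 4 + 3 * np.sin(t))
```

On clean edge points the recovered center matches the true ellipse center; on noisy thresholded images the constrained fit is preferred because it is guaranteed to return an ellipse rather than another conic.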
Fig. 11. Interaction with gaze tracking. The user's gaze is at the a) near display and b) far display. The text color changes automatically based on the user's gaze.
Fig. 12. a) Measured IPD distributions of 4 users while looking at the near display and then the far display during the calibration process; data extracted from 180 frames in each interval. The red dashed line shows the threshold value selected for each user. b) Repeated gaze-tracking results for User-4 as the user alternately focuses on the near and far displays. Each box consists of IPD measurements extracted from 60 frames; the threshold value found during calibration is shown by the dashed line.
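Figure 12's per-user dashed line separates the IPD samples measured while fixating each plane. The paper does not spell out how that threshold is placed; the sketch below assumes the midpoint between the two per-plane mean IPDs, with the depth plane then selected per frame by comparing the measured IPD against the threshold. Both the rule and the sample values are illustrative assumptions.

```python
from statistics import mean

def calibrate_threshold(ipd_near_mm, ipd_far_mm):
    """Place the decision threshold midway between the mean IPD measured
    while fixating the near plane and while fixating the far plane.
    (Assumed rule; the paper only shows the resulting dashed line.)"""
    return (mean(ipd_near_mm) + mean(ipd_far_mm)) / 2

def select_plane(ipd_mm, threshold_mm):
    # Converging on the near plane rotates both pupils inward, so the
    # measured IPD is smaller than when fixating the far plane.
    return "near" if ipd_mm < threshold_mm else "far"

# Hypothetical calibration frames (mm), near fixation vs. far fixation:
threshold = calibrate_threshold([63.4, 63.5, 63.6], [64.9, 65.0, 65.1])
```

With the values above the threshold lands at 64.25 mm, and subsequent frames are classified by a single comparison, which is cheap enough to run at camera frame rate.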

Equations (3)

Equations on this page are rendered with MathJax.

$$\mathrm{Distortion}\ (\%) = 100 \times \frac{y_{\mathrm{chief}} - y_{\mathrm{ref}}}{y_{\mathrm{ref}}}$$

$$\Delta\theta = \theta_N - \theta_F = \tan^{-1}\!\left(\frac{\mathrm{IPD}/2}{\text{Near depth}}\right) - \tan^{-1}\!\left(\frac{\mathrm{IPD}/2}{\text{Far depth}}\right) = 3.7^{\circ} = 0.065\ \mathrm{rad}$$

$$\Delta\mathrm{IPD} = 2\Delta x = 2 \times r_{\mathrm{eye}} \times \Delta\theta = 1.56\ \mathrm{mm}$$
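The numbers in the last two equations can be reproduced directly from the stated geometry. The sketch below assumes an IPD of 65 mm and an eye-rotation radius of about 12 mm (values consistent with the quoted 3.7° and 1.56 mm results, though neither constant is stated explicitly in this excerpt):

```python
import math

IPD = 65.0                 # assumed interpupillary distance, mm
near, far = 250.0, 500.0   # the two virtual image planes, mm
r_eye = 12.0               # assumed eye rotation radius, mm

theta_n = math.atan((IPD / 2) / near)   # convergence half-angle, near plane
theta_f = math.atan((IPD / 2) / far)    # convergence half-angle, far plane
delta_theta = theta_n - theta_f         # ~0.064 rad (~3.7 degrees)
delta_ipd = 2 * r_eye * delta_theta     # ~1.55 mm shift in measured IPD
```

The ~1.5 mm IPD shift between the two convergence states is what the single-camera tracker must resolve to pick the correct depth plane.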