Abstract

We demonstrate a method for measuring the transverse chromatic aberration (TCA) in a virtual reality head-mounted display. The method relies on acquiring images of a digital bar pattern and measuring the displacement of different color bars. This procedure was used to characterize the TCAs in the Oculus Go, Oculus Rift, Samsung Gear, and HTC Vive. The results show noticeable TCAs for the Oculus devices for angles larger than 5° from the center of the field of view. TCA is less noticeable in the Vive in part due to off-axis monochromatic aberrations. Finally, user measurements were conducted, which were in excellent agreement with the laboratory results.

1. Introduction

Interest in creating the volumetric perception of three dimensional (3D) images can be traced back to the mid-19th century with the stereoscope work by Charles Wheatstone [1], which was foundational for the invention of virtual reality (VR) head-mounted displays (HMDs) [2]. The past few years have seen a resurgence in efforts to develop VR and augmented reality (AR) devices enabled by advancements in mobile sensing, graphical computing, display technology, and optical fabrication techniques. While the majority of these efforts have focused on consumer applications and entertainment, these technologies have considerable potential in medicine and are being explored for surgical [3–16], training [15, 17–19], and therapeutic applications [15, 20]. The continued advancements in 3D medical imaging technologies present an additional application for visualizing 3D data sets, which has been demonstrated for displaying magnetic resonance imaging (MRI) segmentation [21] and for surgical planning [22,23].

While AR and VR open new frontiers within medicine, they also raise unanswered questions concerning the performance, safety, and efficacy of the devices for these new applications. These concerns are more pronounced for HMDs, since form factor and weight limitations necessitate trade-offs in design and compromises in performance. As a result, increasing effort has been invested in developing methods to characterize the radiometric [24–26], color [26, 27], and general optical performance of these devices [28–33]. These studies, however, have primarily emphasized fundamental device properties, such as radiance and centering, rather than the impact of optical aberrations on image quality. This distinction matters, as understanding and characterizing the impact of optical aberrations in HMDs is relevant for medical applications such as surgery and diagnostics.

Generally, aberrations are corrected with complex multi-element optical systems. However, HMDs typically use a single-lens design to minimize the cost and form factor, which substantially restricts the options available for correcting aberrations. The optical design is further complicated by the preference for wide field-of-view (FOV) at the expense of focal length, which increases the monochromatic aberrations, such as distortion and spherical aberrations.

In addition to the monochromatic aberrations, optical aberrations also include two types of chromatic aberration: a shift in the focal length with wavelength, known as longitudinal chromatic aberration (LCA), and a wavelength-dependent magnification across the FOV, called transverse chromatic aberration (TCA) [34]. LCA introduces wavelength-dependent defocusing, whereas TCA changes the size of a rendered object as a function of wavelength. As a result, the relative position between two objects will vary across the FOV and as a function of color. Furthermore, the color at a given spatial location can also vary due to incorrect pixel mapping between the object and image, which results in unintended color mixing. While TCA presents an obvious challenge for accurately rendering 3D data in color, it also increases blurring in grayscale images away from the center of the FOV, due to the TCA-induced color mixing from the sub-pixels.
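The magnification picture of TCA can be sketched numerically. The following minimal model is illustrative only (it is not from the study, and the magnification value is an assumed example): a color magnified slightly differently from green is displaced by an angle proportional to its field angle.

```python
def tca_shift_arcmin(field_angle_deg, rel_magnification):
    """Angular shift (arcmin) of a point at field_angle_deg whose color
    is magnified by rel_magnification relative to the green channel
    (small-angle approximation)."""
    return field_angle_deg * (rel_magnification - 1.0) * 60.0

# e.g. an assumed 0.1% excess magnification for blue at 15 deg off-axis
# gives a shift comparable to the ~1 arcmin resolution of the human eye
shift = tca_shift_arcmin(15.0, 1.001)
```

Under this model the shift grows linearly with field angle, which is why TCA is negligible at the center of the FOV but noticeable at its edge.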

The TCA for a VR HMD is illustrated in Fig. 1(a), where a single off-axis white pixel is imaged onto the user’s eye. With no TCA, the red, green, and blue sub-pixels are imaged to the same spatial location on the retina. With TCA, the light from the sub-pixels is imaged to different locations on the retina, resulting in a color-dependent blurring of the image or in spatially separated objects of different colors. TCA is most commonly observed as color fringing in high-contrast areas of an image, such as white on a black background. To demonstrate this effect, a pattern of alternating blue and red bars was sent to the Oculus Rift, and an image of the resulting representation on the HMD is shown in Fig. 1(b). In the center of the FOV, the colored bars alternate as expected. However, as the pattern moves away from the center of the FOV, the relative positions of the red and blue bars shift, resulting in the two colors overlapping near the edge of the FOV. This provides a simple illustration of the impact of TCA on image quality across the FOV. In addition, TCA also leads to an apparent movement of the colored bars relative to each other as the user’s eye rotates to focus on objects near the edge of the FOV. In other words, the relative location of an object of interest varies with the rendering color and position in the FOV. This is particularly important for 3D visualization of medical data used for diagnostic or surgical planning purposes, where exact estimates of size and localization are critical. To fully understand these effects, methods need to be developed to characterize TCA across devices using bench testing.

 

Fig. 1 (a) Sketch of a VR HMD without (top panel) and with (bottom panel) TCA, where the different colors of white light focus to different locations on the retina. (b) Image of TCA on the Oculus Rift. Red and blue bars are separate in the center of the FOV. The bars overlap at the edge of the FOV due to TCA. The labeled white arrow is at a visual angle of 15°.


Here we demonstrate a methodology, including a digital test pattern and an optical setup, to characterize the TCA of VR HMDs. We tested our method on the Oculus Go, Oculus Rift, Samsung Gear, and HTC Vive and found noticeable TCA for angles above 5° for the Gear and Rift. TCA on the Go was significantly reduced compared to the Rift and Gear. The TCA on the Vive was well corrected in the horizontal direction, but the images have reduced sharpness away from the central region of the FOV. It is also important to note that the TCA was measured with a black background. The differences in horizontal TCA between red and blue for the Gear, Rift, Go, and Vive were found to be −0.74 arcmin/deg, 0.79 arcmin/deg, 0.25 arcmin/deg, and 0.02 arcmin/deg, respectively, where a positive sign indicates that red (blue) is shifted away from (towards) the center of the FOV relative to the expected position. This offset is relevant since the angular resolution of the human eye is ≈ 1 arcmin [35] and the change in TCA of the human eye is less than 1 arcmin in the central visual range [36]. Finally, user measurements were conducted using a modified version of the test pattern, and the results were in excellent agreement with the laboratory measurements. Despite TCA software corrections being built into many VR HMDs, these results demonstrate that TCAs still degrade image quality and color performance. This is particularly true for mobile devices with limited computing power, such as the Samsung Gear. Even for non-mobile devices, TCA corrections are often implemented on the HMD itself to decrease latency, and the HMD hardware has more limited performance than a graphics card. Finally, the sub-pixels have broadband emission and therefore cannot be fully corrected by applying software corrections to the red, green, and blue sub-pixels. As new rendering modalities are developed, such as foveated rendering, TCA will warrant continued development of testing methods.

2. Methods and results

Figures 2(a) and 2(b) show a sketch and photograph, respectively, of the experimental setup for measuring the TCA of VR HMDs. An achromatic doublet (f = 25 mm) imaged the pattern from the HMD onto a CMOS detector. The focal length of the lens was chosen to approximate the focal length of the human eye. Similarly, an iris was positioned in front of the lens and closed to 4 mm to mimic the human iris. While the human eye has TCA as well, an achromat was selected in order to measure the TCA of the HMDs with minimal additional TCA from the optical system. XYZ translation stages were used to position the camera in the center of the eyebox for the measurements. After centering, the stages remained fixed for the TCA measurements. A test pattern consisting of saturated red, green, and blue bars was generated in Unity for all devices except the Oculus Go, for which it was generated in MATLAB. A black background was used, since a non-black background will introduce color mixing between the test pattern and the background. The pattern was fixed to the coordinate system of the HMD to remove any alignment issues between the pattern on the headset and the optical setup. The input test pattern and resulting image acquired by the camera from the Oculus Rift are shown in Figs. 2(c) and 2(d), respectively. At the edges of the FOV, the blue bars appear shifted inward relative to the red and green bars, consistent with the behavior shown in Fig. 1(b).
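A bar pattern of this type can be sketched in a few lines. The image dimensions, bar width, and spacing below are illustrative assumptions rather than the values used in the experiment:

```python
import numpy as np

def make_bar_pattern(width=1080, height=1200, bar_px=8, gap_px=24):
    """Saturated red, green, and blue vertical bars on a black
    background, cycling R, G, B across the image."""
    img = np.zeros((height, width, 3), dtype=np.uint8)
    colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
    x, i = 0, 0
    while x + bar_px <= width:
        img[:, x:x + bar_px] = colors[i % 3]  # next colored bar
        x += bar_px + gap_px
        i += 1
    return img

pattern = make_bar_pattern()
```

The black background between the bars matters: any non-black background mixes with the displaced sub-pixel images and would confound the displacement measurement, as noted above.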

 

Fig. 2 (a) Sketch of the experimental setup to characterize a VR HMD. Light from the VR HMD passes through an iris and is focused onto an RGB camera using a 25 mm focal length lens. The XYZ translation stages were used to center the camera in the eyebox of the HMD. (b) Photograph of setup. (c) Test pattern for characterizing TCA. (d) Image of test pattern from c on the Oculus Rift. White lines designate placement of profile used for plots in Fig. 3(a).


To quantitatively characterize TCAs, profiles were taken along the white lines in Fig. 2(d), and the results are plotted in Fig. 3(a). At the edge of the FOV, the blue profile is shifted relative to the red and green, as expected from Fig. 2(d), whereas the red and green bars do not show a noticeable shift. To better illustrate the shift, the center positions of the bars from the profiles are plotted in Fig. 3(b), with the aberrations of the measurement system removed using measurements of the same test pattern on a conventional LCD display (Fig. 3(c)). The cubic curvature in each of the profiles is due to the distortion from the lens in the HMD, and the relative offsets between the three profiles are from the TCA. The curves were fit (solid lines in Fig. 3(b)) using the equation Δx = a1x3 + a2,λx, where Δx is the deviation from the expected x position, a1 is the coefficient for the distortion, and a2,λ is the wavelength (λ) dependent coefficient for TCA. For the fitting, a1 was kept constant across the three curves, since distortion is wavelength independent, while the linear term was a free fitting parameter for each color. Figure 3(b) shows a significant shift in the position of the blue bars across the FOV compared to the red and green bars.
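Because the model Δx = a1x3 + a2,λx is linear in its coefficients, the joint fit with a shared distortion term can be posed as a single linear least-squares problem. The sketch below illustrates this with synthetic data standing in for the measured bar positions (the coefficient values are invented for the example):

```python
import numpy as np

def fit_shared_cubic(x, dx_by_color):
    """Fit dx = a1*x**3 + a2*x jointly for all colors, with one shared
    cubic (distortion) coefficient a1 and a per-color linear TCA term."""
    colors = list(dx_by_color)
    n, k = len(x), len(colors)
    A = np.zeros((n * k, 1 + k))
    b = np.zeros(n * k)
    for i, c in enumerate(colors):
        rows = slice(i * n, (i + 1) * n)
        A[rows, 0] = x**3        # shared distortion column
        A[rows, 1 + i] = x       # this color's TCA column
        b[rows] = dx_by_color[c]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs[0], dict(zip(colors, coeffs[1:]))

# Synthetic stand-in for the measured bar offsets of each color
x = np.linspace(-1.0, 1.0, 41)
offsets = {"red": 0.05 * x**3 + 0.002 * x,
           "green": 0.05 * x**3,
           "blue": 0.05 * x**3 - 0.004 * x}
a1, a2 = fit_shared_cubic(x, offsets)
```

Sharing a1 across the colors encodes the physical assumption stated above: distortion is wavelength independent, so only the linear term differs between red, green, and blue.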

 

Fig. 3 (a) Plots of profile across the red, green, and blue bars along the white lines in Fig. 2(d). (b) Offset in the positions of the bars relative to the expected positions from the designed pattern, where Δx is the deviation from the expected position of the bars from the input test pattern. Solid lines are fits using Δx = a1x3 + a2,λx for the distortion and TCA of the HMD. (c) Offset in the positions of the bars relative to the expected positions from the designed pattern using a conventional LCD display.


While Fig. 3(b) shows the amount of TCA across an HMD, a more intuitive quantity for relating the measurements to human perception is the visual angle. The distortion of the HMD was removed by normalizing the red and blue curves by the green curve, and the resulting data were converted to an angular displacement using the equation θ = tan−1(x/d), where x is the position on the CMOS sensor and d is the distance from the lens to the CMOS sensor. The normalized and converted data from Fig. 3(b) are shown in Fig. 4(a). Given that the resolution of the human eye in the fovea is approximately 1 arcmin [35] and binocular visual acuity is higher than monocular [37], TCAs have a noticeable impact on the image quality for angles larger than approximately 5°. This is a relatively small deviation from the center of the FOV and is relevant for determining the angular region in the full FOV that is suitable for accurate 3D image rendering. The performance in the vertical direction is similar, as shown in Fig. 4(b). These measurements were repeated for the Oculus Go, Samsung Gear, and HTC Vive. The results for each device are plotted in Fig. 4 for the horizontal (left column) and vertical (right column) TCA. A positive (negative) value for TCA indicates a shift towards the nasal (temporal) visual field. While the Rift, Go, and Gear are all manufactured by or in partnership with Oculus, the performance varies significantly. There are two notable differences. First, the TCA of the Gear has the opposite sign compared to the Rift and Go, due to the chromatic aberration corrections implemented in the image processing for the Oculus Rift and Go; prior to the release of this correction, the TCA for the Go had the same sign as the Gear and was considerably larger. Second, the TCA for the Rift is significantly larger than for the Go, the latter of which is nearly corrected in the horizontal direction, likely due to changes in the fabrication process for the newer Go. Compared to the other devices, the TCA on the Vive is well corrected and difficult to see by eye. While this seems to suggest that the usable angular region for accurate 3D data rendering is larger on the Vive, image sharpness decreases away from the center of the FOV compared to the Go or Rift. Table 1 summarizes the slopes, mR/B,H/V, from Fig. 4 for the red (R) and blue (B) lines in the horizontal (H) and vertical (V) directions. To allow for more direct comparisons, the difference in the slopes, ΔmH/V = mred,H/V − mblue,H/V, is also listed. These results provide a comparison of the TCA for different commercially available VR HMDs and demonstrate that the proposed test pattern provides a suitable method for measuring the TCA across a wide variety of HMDs.
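The conversion and slope extraction described above can be sketched as follows. The lens-to-sensor distance and sample values are illustrative, not the experiment's numbers:

```python
import numpy as np

def to_visual_angle_deg(x_mm, d_mm=25.0):
    """theta = arctan(x/d): position on the CMOS sensor to visual angle."""
    return np.degrees(np.arctan(x_mm / d_mm))

def tca_slope_arcmin_per_deg(theta_deg, offset_deg, green_offset_deg):
    """Least-squares slope of the green-normalized offset (in arcmin)
    versus visual angle (in degrees), as tabulated in Table 1."""
    rel_arcmin = (offset_deg - green_offset_deg) * 60.0
    return np.polyfit(theta_deg, rel_arcmin, 1)[0]
```

Normalizing by the green channel removes the common distortion term, so the remaining linear trend is attributable to TCA alone.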

 

Fig. 4 TCA in the horizontal and vertical directions for the (a),(b) Oculus Rift (c),(d) Oculus Go (e),(f) Samsung Gear (g),(h) HTC Vive, respectively. A positive (negative) value for TCA indicates a shift towards the nasal (temporal) visual field. As a reference, the results of the test pattern using a conventional display are shown in (a) using red and blue dashed lines.



Table 1. TCA for VR HMDs

Since the user experience is important in VR applications, a modified version of the test pattern was implemented in Unity to measure the TCA experienced by users. In VR, the user re-scales the red and blue bars independently until they are aligned with the green bars, as illustrated in the inset of Fig. 5(a). The re-scaling factor gives the TCA percentage for red and blue as measured by the user. This task was conducted on the Rift with five users, and the results are plotted in Fig. 5(a) for a visual angle of 30°. To further explore the TCA experienced by users, the test was repeated at several visual angles, and the results are plotted in Fig. 5(b), where the dashed lines are the laboratory results and the squares (triangles) show the average user results for the red (blue) bars. These results are compared to the values from Table 1 for the horizontal direction of the Rift. As shown in Fig. 5, the results for the five users are in excellent agreement with the laboratory results for angles above 15°, demonstrating that the proposed test pattern is applicable for user tests in addition to laboratory measurements.
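The relation between the user's re-scaling factor and the bench units can be sketched as below. This mapping is our illustrative reading of the task, not code from the study:

```python
def tca_percent(scale_factor):
    """TCA as percent magnification relative to green (a re-scaling
    factor of 1.0 means the bars already align)."""
    return (scale_factor - 1.0) * 100.0

def percent_to_arcmin_per_deg(tca_pct):
    # a p% magnification error displaces a point at theta degrees by
    # theta*p/100 degrees, i.e. 60*p/100 arcmin per degree of visual angle
    return tca_pct * 60.0 / 100.0
```

Expressing both the user task and the laboratory fits in arcmin per degree is what allows the direct comparison shown in Fig. 5(b).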

 

Fig. 5 (a) Measured TCA percentages for five users on the Rift for the red and blue bars at a visual angle of 30°. The results from Table 1 for the horizontal direction on the Rift are plotted for comparison. Inset: Depiction of the user task to measure TCA. The blue and red bars are shifted by the user until alignment with the green bars as indicated by the white arrows. (b) Psychophysics tests for the five users at different visual angles with the individual results (dots), blue user averages (blue triangles), red user averages (red squares), and laboratory measurements (dashed lines). The error bars are ± 1 standard deviation.


3. Conclusion

We have demonstrated a methodology to measure the TCA of VR HMDs and applied it to four different devices. This technique measures the TCA of the entire HMD, which includes both the optical TCA and any corrections applied in the graphics pipeline, and the results show that most of the measured TCA can be compensated using a linear correction factor. The measured TCA is significant for image quality, particularly considering that visual acuity is higher for binocular vision than for monocular vision [37]. This measurement approach is suitable for measuring the TCA across a wide range of VR HMDs and can be easily extended to AR devices. While TCA can be corrected to some degree through image processing, it is not completely corrected for cases with a black background, such as the test pattern demonstrated in this article. In the case of white or colored backgrounds, TCA leads to color changes for the user due to color mixing of the sub-pixels, as opposed to the displacements observed for a black background. This is particularly relevant for 3D rendering of volumetric medical images. The TCAs in these devices also suggest that the usable FOV for displaying data should be carefully considered to avoid decreased image quality away from the center of the FOV. We are investigating a follow-up study that uses sensitivity to TCA as a method for determining the eyebox centering.

Acknowledgments

The authors acknowledge Dr. Wei-Chung Cheng (FDA) for valuable discussions and technical insights.

Disclosures

Andrea S. Kim is supported by an appointment to the Research Participation Program at the Center for Devices and Radiological Health administered by the Oak Ridge Institute for Science and Education through an interagency agreement between the U.S. Department of Energy and the U.S. Food and Drug Administration. The mention of commercial products, their sources, or their use in connection with material reported herein is not to be construed as either an actual or implied endorsement of such products by the Department of Health and Human Services.

References

1. C. Wheatstone, “XVIII. contributions to the physiology of vision. - part the first. on some remarkable, and hitherto unobserved, phenomena of binocular vision,” Philos. Trans. Roy. Soc. Lond. 128, 371–394 (1838). [CrossRef]  

2. I. E. Sutherland, “A head-mounted three dimensional display,” in Proceedings of the December 9–11, 1968, fall joint computer conference, part I, (ACM, 1968), pp. 757–764.

3. D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010). [CrossRef]  

4. D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015). [CrossRef]  

5. Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015). [CrossRef]  

6. S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015). [CrossRef]   [PubMed]  

7. H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015). [CrossRef]  

8. N. Cui, P. Kharel, and V. Gruev, “Augmented reality with Microsoft Hololens holograms for near infrared fluorescence based image guided surgery,” in Molecular-Guided Surgery: Molecules, Devices, and Applications III, (International Society for Optics and Photonics, 2017), p. 100490I.

9. M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017). [CrossRef]   [PubMed]  

10. R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018). [CrossRef]   [PubMed]  

11. X. Chen and J. Hu, “A review of haptic simulator for oral and maxillofacial surgery based on virtual reality,” Expert. Rev. Med. Devices 15, 435–444 (2018). [CrossRef]   [PubMed]  

12. D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018). [CrossRef]  

13. M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018). [CrossRef]  

14. J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018). [CrossRef]  

15. T. M. Peters, C. A. Linte, Z. Yaniv, and J. Williams, Mixed and augmented reality in medicine (CRC, 2018). [CrossRef]  

16. A. S. Pillai and P. S. Mathew, “Impact of virtual reality in healthcare: a review,” in Virtual and Augmented Reality in Mental Health Treatment, (IGI Global, 2019), pp. 17–31. [CrossRef]  

17. G. S. Ruthenbeck and K. J. Reynolds, “Virtual reality for medical training: the state-of-the-art,” J. Simul. 9, 16–26 (2015). [CrossRef]  

18. C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018). [CrossRef]  

19. F. G. Hamza-Lup, J. P. Rolland, and C. Hughes, “A distributed augmented reality system for medical training and simulation,” arXiv preprint arXiv:1811.12815 (2018).

20. M. M. North and S. M. North, “Virtual reality therapy,” in Computer-assisted and web-based innovations in psychology, special education, and health, (Elsevier, 2016), pp. 141–156. [CrossRef]  

21. D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019). [CrossRef]  

22. J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

23. A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Hear. J. 40, 1092 (2018). [CrossRef]  

24. R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960. [CrossRef]  

25. J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017). [CrossRef]  

26. J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953. [CrossRef]  

27. L. Zhang and M. J. Murdoch, “Color matching criteria in augmented reality,” in Color and Imaging Conference, (Society for Imaging Science and Technology, 2018), pp. 102–109. [CrossRef]  

28. A. Abileah, “30.1: Overview: testing 3d-stereo displays, techniques and challenges,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2013), pp. 368–371. [CrossRef]  

29. K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067. [CrossRef]  

30. K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957. [CrossRef]  

31. K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502. [CrossRef]  

32. R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964. [CrossRef]  

33. R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018). [CrossRef]  

34. W. J. Smith, Modern optical engineering (Tata McGraw-Hill Education, 1966).

35. M. Yanoff and J. S. Duker, Ophthalmology (MOSBY Elsevier, 2009).

36. S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016). [CrossRef]   [PubMed]  

37. R. Cagenello, A. Arditi, and D. L. Halpern, “Binocular enhancement of visual acuity,” J. Opt. Soc. Am. A 10, 1841–1848 (1993). [CrossRef]   [PubMed]  

    [Crossref]
  28. A. Abileah, “30.1: Overview: testing 3d-stereo displays, techniques and challenges,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2013), pp. 368–371.
    [Crossref]
  29. K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.
    [Crossref]
  30. K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.
    [Crossref]
  31. K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
    [Crossref]
  32. R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964.
    [Crossref]
  33. R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
    [Crossref]
  34. W. J. Smith, Modern optical engineering (Tata McGraw-Hill Education, 1966).
  35. M. Yanoff and J. S. Duker, Ophthalmology (MOSBY Elsevier, 2009).
  36. S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).
    [Crossref] [PubMed]
  37. R. Cagenello, A. Arditi, and D. L. Halpern, “Binocular enhancement of visual acuity,” J. Opt. Soc. Am. A 10, 1841–1848 (1993).
    [Crossref] [PubMed]

2019 (1)

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).
[Crossref]

2018 (9)

J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Hear. J. 40, 1092 (2018).
[Crossref]

C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018).
[Crossref]

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).
[Crossref] [PubMed]

X. Chen and J. Hu, “A review of haptic simulator for oral and maxillofacial surgery based on virtual reality,” Expert. Rev. Med. Devices 15, 435–444 (2018).
[Crossref] [PubMed]

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018).
[Crossref]

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).
[Crossref]

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

2017 (2)

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017).
[Crossref]

2016 (1)

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).
[Crossref] [PubMed]

2015 (5)

G. S. Ruthenbeck and K. J. Reynolds, “Virtual reality for medical training: the state-of-the-art,” J. Simul. 9, 16–26 (2015).
[Crossref]

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).
[Crossref]

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).
[Crossref]

2010 (1)

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).
[Crossref]

1993 (1)

1838 (1)

C. Wheatstone, “XVIII. contributions to the physiology of vision. - part the first. on some remarkable, and hitherto unobserved, phenomena of binocular vision,” Philos. Trans. Roy. Soc. Lond. 128, 371–394 (1838).
[Crossref]

Abileah, A.

A. Abileah, “30.1: Overview: testing 3d-stereo displays, techniques and challenges,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2013), pp. 368–371.
[Crossref]

Achilefu, S.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

Akers, W. J.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

Akinduro, O. O.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Ard, T.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).
[Crossref]

Arditi, A.

Austin, R. L.

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017).
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.
[Crossref]

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.
[Crossref]

Bae, J. H.

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).
[Crossref]

Bodenstedt, S.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Bosc, R.

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).
[Crossref] [PubMed]

Boynton, P. A.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017).
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.
[Crossref]

R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964.
[Crossref]

Brown, B. L.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Bydon, M.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Cagenello, R.

Calpito, R. C.

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

Castrillon-Oberndorfer, G.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Chai, G.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

Chai, Y. J.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018).
[Crossref]

Chen, R. E.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Chen, X.

X. Chen and J. Hu, “A review of haptic simulator for oral and maxillofacial surgery based on virtual reality,” Expert. Rev. Med. Devices 15, 435–444 (2018).
[Crossref] [PubMed]

Cho, M.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
[Crossref]

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.
[Crossref]

Chung, K.-H.

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).
[Crossref]

Cui, N.

N. Cui, P. Kharel, and V. Gruev, “Augmented reality with Microsoft Hololens holograms for near infrared fluorescence based image guided surgery,” in Molecular-Guided Surgery: Molecules, Devices, and Applications III, (International Society for Optics and Photonics, 2017), p. 100490I.

Cutkosky, M. R.

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).
[Crossref]

Daniel, B. L.

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).
[Crossref]

Dao, T.-H.

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).
[Crossref] [PubMed]

Denning, B.

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.
[Crossref]

Denning, B. S.

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

Diaz, R. J.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Dillmann, R.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Dohi, T.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).
[Crossref]

Draper, R. S.

R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964.
[Crossref]

Drews, B.

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.
[Crossref]

Drews, B. C.

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

Duker, J. S.

M. Yanoff and J. S. Duker, Ophthalmology (MOSBY Elsevier, 2009).

Duncan, D.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).
[Crossref]

Fabian, P.

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).
[Crossref]

Fedoriouk, V.

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.
[Crossref]

Fedoriouk, V. B.

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

Fields, R. C.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

Fitoussi, A.

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).
[Crossref] [PubMed]

Folguera, S.

C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018).
[Crossref]

Forner, L.

C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018).
[Crossref]

Freeman, W. D.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Gacy, L. W.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017).
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.
[Crossref]

Gao, S.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

Garner, R.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).
[Crossref]

Gruev, V.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

N. Cui, P. Kharel, and V. Gruev, “Augmented reality with Microsoft Hololens holograms for near infrared fluorescence based image guided surgery,” in Molecular-Guided Surgery: Molecules, Devices, and Applications III, (International Society for Optics and Photonics, 2017), p. 100490I.

Halpern, D. L.

Hamza-Lup, F. G.

F. G. Hamza-Lup, J. P. Rolland, and C. Hughes, “A distributed augmented reality system for medical training and simulation,” arXiv preprint arXiv:1811.12815 (2018).

Han, P. K.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Heft, E. L.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017).
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.
[Crossref]

Hersant, B.

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).
[Crossref] [PubMed]

Hoffmann, J.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Hoshi, K.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).
[Crossref]

Hosseinpour, A.-R.

A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Hear. J. 40, 1092 (2018).
[Crossref]

Hu, J.

X. Chen and J. Hu, “A review of haptic simulator for oral and maxillofacial surgery based on virtual reality,” Expert. Rev. Med. Devices 15, 435–444 (2018).
[Crossref] [PubMed]

Hughes, C.

F. G. Hamza-Lup, J. P. Rolland, and C. Hughes, “A distributed augmented reality system for medical training and simulation,” arXiv preprint arXiv:1811.12815 (2018).

Hussain, T.

A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Hear. J. 40, 1092 (2018).
[Crossref]

Hyodo, K.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
[Crossref]

Inoguchi, K.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
[Crossref]

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.
[Crossref]

Ishimoto, M.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
[Crossref]

Iwai, J.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.
[Crossref]

Jenkins, S. A.

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).
[Crossref]

Jiang, T.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

Katic, D.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Kerezoudis, P.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Kharel, P.

N. Cui, P. Kharel, and V. Gruev, “Augmented reality with Microsoft Hololens holograms for near infrared fluorescence based image guided surgery,” in Molecular-Guided Surgery: Molecules, Devices, and Applications III, (International Society for Optics and Photonics, 2017), p. 100490I.

Kim, D.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018).
[Crossref]

Kim, E. J.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Kim, H. C.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018).
[Crossref]

Komotar, R. J.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Kondo, R.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Kong, H.-J.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Ann. Surg. Treat. Res. 95, 297–302 (2018).

Kozakai, T.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

Kurashige, M.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Lee, D.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Ann. Surg. Treat. Res. 95, 297–302 (2018).

Lee, K. E.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Ann. Surg. Treat. Res. 95, 297–302 (2018).

Leibfried, L. V.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. Inf. Disp. 25, 215–221 (2017).

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.

Li, Q.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).

Liang, K.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).

Liang, R.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).

Lianza, T. A.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. Inf. Disp. 25, 215–221 (2017).

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.

Liao, H.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).

Lin, L.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).

Lin, M. A.

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).

Lin, Y.-K.

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).

Linte, C. A.

T. M. Peters, C. A. Linte, Z. Yaniv, and J. Williams, Mixed and augmented reality in medicine (CRC, 2018).

Liu, D.

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).

Liu, F.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).

Llena, C.

C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018).

Lundström, L.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).

Margenthaler, J.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).

Masamune, K.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).

Mathew, P. S.

A. S. Pillai and P. S. Mathew, “Impact of virtual reality in healthcare: a review,” in Virtual and Augmented Reality in Mental Health Treatment, (IGI Global, 2019), pp. 17–31.

Mendez, A.

A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Heart J. 40, 1092 (2018).

Meningaud, J.-P.

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).

Meyer, F. M.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. Inf. Disp. 25, 215–221 (2017).

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.

Mondal, S. B.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).

Murdoch, M. J.

L. Zhang and M. J. Murdoch, “Color matching criteria in augmented reality,” in Color and Imaging Conference, (Society for Imaging Science and Technology, 2018), pp. 102–109.

Naruse, K.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

Newman, B.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).

North, M. M.

M. M. North and S. M. North, “Virtual reality therapy,” in Computer-assisted and web-based innovations in psychology, special education, and health, (Elsevier, 2016), pp. 141–156.

North, S. M.

M. M. North and S. M. North, “Virtual reality therapy,” in Computer-assisted and web-based innovations in psychology, special education, and health, (Elsevier, 2016), pp. 141–156.

Oka, H.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

Oshima, K.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

Otsuka, K.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

Ouchi, S.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Pan, J. J.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).

Penczek, J.

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. Inf. Disp. 25, 215–221 (2017).

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.

R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964.

Peters, T. M.

T. M. Peters, C. A. Linte, Z. Yaniv, and J. Williams, Mixed and augmented reality in medicine (CRC, 2018).

Pillai, A. S.

A. S. Pillai and P. S. Mathew, “Impact of virtual reality in healthcare: a review,” in Virtual and Augmented Reality in Mental Health Treatment, (IGI Global, 2019), pp. 17–31.

Pirris, S. M.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).

Privitera, C.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).

Quinones-Hinojosa, A.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).

Raptis, C.

J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

Reynolds, K. J.

G. S. Ruthenbeck and K. J. Reynolds, “Virtual reality for medical training: the state-of-the-art,” J. Simul. 9, 16–26 (2015).

Rodríguez-Lozano, F.

C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018).

Rolland, J. P.

F. G. Hamza-Lup, J. P. Rolland, and C. Hughes, “A distributed augmented reality system for medical training and simulation,” arXiv preprint arXiv:1811.12815 (2018).

Roorda, A.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).

Russell, W. J.

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).

Ruthenbeck, G. S.

G. S. Ruthenbeck and K. J. Reynolds, “Virtual reality for medical training: the state-of-the-art,” J. Simul. 9, 16–26 (2015).

Sabesan, R.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).

Sanderson, P. M.

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).

Sano, H.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Saslow, A.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).

Sato, O.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

Sato, Y.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

Seeberger, R.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).

Shibahara, Y.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

Si, P.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).

Silva, J.

J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

Silva, J. N.

J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

Siu, A. F.

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).

Smith, W. J.

W. J. Smith, Modern optical engineering (Tata McGraw-Hill Education, 1966).

Som, A.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).

Southworth, M.

J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

Speidel, S.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).

Spengler, P.

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).

Sudlow, G. P.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).

Suenaga, H.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).

Sugawara, M.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Sutherland, I. E.

I. E. Sutherland, “A head-mounted three dimensional display,” in Proceedings of the December 9–11, 1968, fall joint computer conference, part I, (ACM, 1968), pp. 757–764.

Takato, T.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).

Takenaka, H.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

Tiruveedhula, P.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).

Toga, A. W.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).

Totani, T.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

Tran, H. H.

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).

Tsurutani, K.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Uehara, S.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

Ukai, R.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

Unsbo, P.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).

Vachlin, F.

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.

Valverde, I.

A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Heart J. 40, 1092 (2018).

Varshneya, R.

R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964.

Vogt, T.

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.

Wakemoto, H.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.

Wang, I.-C.

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).

Wang, M. Y.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).

Wanserski, E.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).

Wharen, R. E. J.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).

Wheatstone, C.

C. Wheatstone, “XVIII. contributions to the physiology of vision. - part the first. on some remarkable, and hitherto unobserved, phenomena of binocular vision,” Philos. Trans. Roy. Soc. Lond. 128, 371–394 (1838).
[Crossref]

Williams, J.

T. M. Peters, C. A. Linte, Z. Yaniv, and J. Williams, Mixed and augmented reality in medicine (CRC, 2018).
[Crossref]

Winter, S.

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).
[Crossref] [PubMed]

Xin, Y.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

Yamada, M.

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
[Crossref]

Yaniv, Z.

T. M. Peters, C. A. Linte, Z. Yaniv, and J. Williams, Mixed and augmented reality in medicine (CRC, 2018).
[Crossref]

Yanoff, M.

M. Yanoff and J. S. Duker, Ophthalmology (MOSBY Elsevier, 2009).

Yau, H.-T.

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).
[Crossref]

Yi, J. W.

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018).
[Crossref]

Yoon, J. W.

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Zhang, L.

L. Zhang and M. J. Murdoch, “Color matching criteria in augmented reality,” in Color and Imaging Conference, (Society for Imaging Science and Technology, 2018), pp. 102–109.
[Crossref]

Zhang, Y.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

Zheng, C.

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).
[Crossref]

Zhu, M.

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

Zhu, N.

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

Zrantchev, I.

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).
[Crossref]

Anesth. Analg. (1)

D. Liu, S. A. Jenkins, P. M. Sanderson, P. Fabian, and W. J. Russell, “Monitoring with head-mounted displays in general anesthesia: a clinical evaluation in the operating room,” Anesth. Analg. 110, 1032–1038 (2010).
[Crossref]

Annals Surg. Treat. Res. (1)

D. Lee, H.-J. Kong, D. Kim, J. W. Yi, Y. J. Chai, K. E. Lee, and H. C. Kim, “Preliminary study on application of augmented reality visualization in robotic thyroid surgery,” Annals Surg. Treat. Res. 95, 297–302 (2018).
[Crossref]

BMC Med. Imag. (1)

H. Suenaga, H. H. Tran, H. Liao, K. Masamune, T. Dohi, K. Hoshi, and T. Takato, “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study,” BMC Med. Imag. 15, 51 (2015).
[Crossref]

Clin. Implant. Dent. Relat. Res. (1)

Y.-K. Lin, H.-T. Yau, I.-C. Wang, C. Zheng, and K.-H. Chung, “A novel dental implant guided surgery based on integration of surgical template and augmented reality,” Clin. Implant. Dent. Relat. Res. 17, 543–553 (2015).
[Crossref]

Eur. Hear. J. (1)

A. Mendez, T. Hussain, A.-R. Hosseinpour, and I. Valverde, “Virtual reality for preoperative planning in large ventricular septal defects,” Eur. Hear. J. 40, 1092 (2018).
[Crossref]

Eur. J. Dental Educ. (1)

C. Llena, S. Folguera, L. Forner, and F. Rodríguez-Lozano, “Implementation of augmented reality in operative dentistry learning,” Eur. J. Dental Educ. 22, e122–e130 (2018).
[Crossref]

Expert. Rev. Med. Devices (1)

X. Chen and J. Hu, “A review of haptic simulator for oral and maxillofacial surgery based on virtual reality,” Expert. Rev. Med. Devices 15, 435–444 (2018).
[Crossref] [PubMed]

IEEE Robot. Autom. Lett. (1)

M. A. Lin, A. F. Siu, J. H. Bae, M. R. Cutkosky, and B. L. Daniel, “Holoneedle: augmented reality guidance system for needle placement investigating the advantages of three-dimensional needle shape reconstruction,” IEEE Robot. Autom. Lett. 3, 4156–4162 (2018).
[Crossref]

Int. J. Comput. Assist. Radiol. Surg. (1)

D. Katić, P. Spengler, S. Bodenstedt, G. Castrillon-Oberndorfer, R. Seeberger, J. Hoffmann, R. Dillmann, and S. Speidel, “A system for context-aware intraoperative augmented reality in dental implant surgery,” Int. J. Comput. Assist. Radiol. Surg. 10, 101–108 (2015).
[Crossref]

Int. J. Med. Robot. Comput. Assist. Surg. (1)

J. W. Yoon, R. E. Chen, E. J. Kim, O. O. Akinduro, P. Kerezoudis, P. K. Han, P. Si, W. D. Freeman, R. J. Diaz, R. J. Komotar, S. M. Pirris, B. L. Brown, M. Bydon, M. Y. Wang, R. E. J. Wharen, and A. Quinones-Hinojosa, “Augmented reality for the surgeon: systematic review,” Int. J. Med. Robot. Comput. Assist. Surg. 14, e1914 (2018).
[Crossref]

Int. J. Oral Maxillofac. Surg. (1)

R. Bosc, A. Fitoussi, B. Hersant, T.-H. Dao, and J.-P. Meningaud, “Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies,” Int. J. Oral Maxillofac. Surg. 48, 132–139 (2018).
[Crossref] [PubMed]

J. Digit. Imag. (1)

D. Duncan, R. Garner, I. Zrantchev, T. Ard, B. Newman, A. Saslow, E. Wanserski, and A. W. Toga, “Using virtual reality to improve performance and user experience in manual correction of MRI segmentation errors by non-experts,” J. Digit. Imag. 32, 97–104 (2019).
[Crossref]

J. Opt. Soc. Am. A (1)

J. Simul. (1)

G. S. Ruthenbeck and K. J. Reynolds, “Virtual reality for medical training: the state-of-the-art,” J. Simul. 9, 16–26 (2015).
[Crossref]

J. Soc. for Inf. Disp. (2)

R. L. Austin, B. S. Denning, B. C. Drews, V. B. Fedoriouk, and R. C. Calpito, “Qualified viewing space determination of near-eye and head-up displays,” J. Soc. for Inf. Disp. 26, 567–575 (2018).
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “Absolute radiometric and photometric measurements of near-eye displays,” J. Soc. for Inf. Disp. 25, 215–221 (2017).
[Crossref]

J. Vis. (1)

S. Winter, R. Sabesan, P. Tiruveedhula, C. Privitera, P. Unsbo, L. Lundström, and A. Roorda, “Transverse chromatic aberration across the visual field of the human eye,” J. Vis. 16(14), 9 (2016).
[Crossref] [PubMed]

JACC: Basic to Transl. Sci. (1)

J. N. Silva, M. Southworth, C. Raptis, and J. Silva, “Emerging applications of virtual reality in cardiovascular medicine,” JACC: Basic to Transl. Sci. 3, 420–430 (2018).

Philos. Trans. Roy. Soc. Lond. (1)

C. Wheatstone, “XVIII. contributions to the physiology of vision. - part the first. on some remarkable, and hitherto unobserved, phenomena of binocular vision,” Philos. Trans. Roy. Soc. Lond. 128, 371–394 (1838).
[Crossref]

Sci. Rep. (2)

S. B. Mondal, S. Gao, N. Zhu, G. P. Sudlow, K. Liang, A. Som, W. J. Akers, R. C. Fields, J. Margenthaler, R. Liang, V. Gruev, and S. Achilefu, “Binocular goggle augmented imaging and navigation system provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping,” Sci. Rep. 5, 12117 (2015).
[Crossref] [PubMed]

M. Zhu, F. Liu, G. Chai, J. J. Pan, T. Jiang, L. Lin, Y. Xin, Y. Zhang, and Q. Li, “A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery,” Sci. Rep. 7, 42365 (2017).
[Crossref] [PubMed]

Other (16)

T. M. Peters, C. A. Linte, Z. Yaniv, and J. Williams, Mixed and augmented reality in medicine (CRC, 2018).
[Crossref]

A. S. Pillai and P. S. Mathew, “Impact of virtual reality in healthcare: a review,” in Virtual and Augmented Reality in Mental Health Treatment, (IGI Global, 2019), pp. 17–31.
[Crossref]

N. Cui, P. Kharel, and V. Gruev, “Augmented reality with Microsoft Hololens holograms for near infrared fluorescence based image guided surgery,” in Molecular-Guided Surgery: Molecules, Devices, and Applications III, (International Society for Optics and Photonics, 2017), p. 100490I.

I. E. Sutherland, “A head-mounted three dimensional display,” in Proceedings of the December 9–11, 1968, fall joint computer conference, part I, (ACM, 1968), pp. 757–764.

F. G. Hamza-Lup, J. P. Rolland, and C. Hughes, “A distributed augmented reality system for medical training and simulation,” arXiv preprint arXiv:1811.12815 (2018).

M. M. North and S. M. North, “Virtual reality therapy,” in Computer-assisted and web-based innovations in psychology, special education, and health, (Elsevier, 2016), pp. 141–156.
[Crossref]

W. J. Smith, Modern optical engineering (Tata McGraw-Hill Education, 1966).

M. Yanoff and J. S. Duker, Ophthalmology (MOSBY Elsevier, 2009).

R. L. Austin, B. Drews, T. Vogt, V. Fedoriouk, B. Denning, and F. Vachlin, “65-3: Spectroradiometric measurements of near-eye and head up displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 958–960.
[Crossref]

J. Penczek, P. A. Boynton, F. M. Meyer, E. L. Heft, R. L. Austin, T. A. Lianza, L. V. Leibfried, and L. W. Gacy, “65-1: Distinguished paper: photometric and colorimetric measurements of near-eye displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 950–953.
[Crossref]

L. Zhang and M. J. Murdoch, “Color matching criteria in augmented reality,” in Color and Imaging Conference, (Society for Imaging Science and Technology, 2018), pp. 102–109.
[Crossref]

A. Abileah, “30.1: Overview: testing 3d-stereo displays, techniques and challenges,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2013), pp. 368–371.
[Crossref]

K. Oshima, K. Naruse, K. Tsurutani, J. Iwai, T. Totani, S. Uehara, S. Ouchi, Y. Shibahara, H. Takenaka, Y. Sato, T. Kozakai, M. Kurashige, and H. Wakemoto, “79-3: Eyewear display measurement method: entrance pupil size dependence in measurement equipment,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2016), pp. 1064–1067.
[Crossref]

K. Tsurutani, K. Naruse, K. Oshima, S. Uehara, Y. Sato, K. Inoguchi, K. Otsuka, H. Wakemoto, M. Kurashige, O. Sato, M. Cho, S. Ouchi, and H. Oka, “65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2017), pp. 954–957.
[Crossref]

K. Oshima, H. Sano, S. Uehara, H. Oka, M. Ishimoto, M. Sugawara, Y. Sato, M. Kurashige, K. Tsurutani, K. Hyodo, M. Yamada, R. Kondo, K. Inoguchi, H. Wakemoto, M. Cho, R. Ukai, and S. Ouchi, “P-83: Evaluation scheme for transparent properties of AR see-through eyewear display,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 1499–1502.
[Crossref]

R. S. Draper, J. Penczek, R. Varshneya, and P. A. Boynton, “72-2: Standardizing fundamental criteria for near eye display optical measurements: determining eye point position,” in SID Symposium Digest of Technical Papers, (Wiley Online Library, 2018), pp. 961–964.
[Crossref]



Figures (5)

Fig. 1 (a) Sketch of a VR HMD without (top panel) and with (bottom panel) TCA, where the different colors of white light focus to different locations on the retina. (b) Image of TCA on the Oculus Rift. The red and blue bars are separate in the center of the FOV but overlap at the edge of the FOV due to TCA. The labeled white arrow at the edge is at a visual angle of 15°.

Fig. 2 (a) Sketch of the experimental setup used to characterize a VR HMD. Light from the VR HMD passes through an iris and is focused onto an RGB camera using a 25 mm focal length lens. XYZ translation stages were used to center the camera in the eyebox of the HMD. (b) Photograph of the setup. (c) Test pattern for characterizing TCA. (d) Image of the test pattern from (c) on the Oculus Rift. The white lines designate the placement of the profile used for the plots in Fig. 3(a).

Fig. 3 (a) Profiles across the red, green, and blue bars along the white lines in Fig. 2(d). (b) Offset in the positions of the bars relative to the expected positions from the designed pattern, where Δx is the deviation of the bars from the input test pattern. Solid lines are fits using Δx = a₁x³ + a₂,λx for the distortion and TCA of the HMD. (c) Offset in the positions of the bars relative to the expected positions from the designed pattern using a conventional LCD display.

Fig. 4 TCA in the horizontal and vertical directions for the (a),(b) Oculus Rift, (c),(d) Oculus Go, (e),(f) Samsung Gear, and (g),(h) HTC Vive, respectively. A positive (negative) value for TCA indicates a shift towards the nasal (temporal) visual field. As a reference, the results of the test pattern using a conventional display are shown in (a) using red and blue dashed lines.

Fig. 5 (a) Measured TCA percentages for five users on the Rift for the red and blue bars at a visual angle of 30°. The results from Table 1 for the horizontal direction on the Rift are plotted for comparison. Inset: Depiction of the user task to measure TCA. The blue and red bars are shifted by the user until they align with the green bars, as indicated by the white arrows. (b) Psychophysics tests for the five users at different visual angles with the individual results (dots), blue user averages (blue triangles), red user averages (red squares), and laboratory measurements (dashed lines). The error bars are ±1 standard deviation.
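The fit model in the Fig. 3(b) caption, Δx = a₁x³ + a₂,λx, separates the color-independent cubic distortion (a₁) from the per-color linear TCA term (a₂,λ). A minimal least-squares sketch of that fit is shown below; the bar positions, coefficient values, and noise level are hypothetical stand-ins, not the paper's measured data.

```python
import numpy as np

# Fit model from Fig. 3(b): dx = a1 * x**3 + a2 * x, where a1 captures the
# cubic distortion and a2 the linear TCA term for one color channel.
# Synthetic offsets below are illustrative only (hypothetical coefficients).
x = np.linspace(-20, 20, 41)                  # bar positions in degrees (assumed)
rng = np.random.default_rng(0)
a1_true, a2_true = 1e-4, 0.01                 # hypothetical distortion/TCA values
dx = a1_true * x**3 + a2_true * x + rng.normal(0, 0.01, x.size)

# Linear least squares with basis functions [x**3, x]
A = np.column_stack([x**3, x])
(a1, a2), *_ = np.linalg.lstsq(A, dx, rcond=None)
print(f"a1 = {a1:.2e}, a2 = {a2:.4f}")
```

Because the model is linear in its coefficients, an ordinary least-squares solve suffices; fitting the red and blue channels separately and differencing their a₂ values would give the TCA relative to green.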

Tables (1)

Table 1 TCA for VR HMDs
