Abstract

The concept and instantiation of real-time tomographic holography (RTTH) for augmented reality are presented. RTTH enables natural hand–eye coordination to guide invasive medical procedures without requiring tracking or a head-mounted device. It places a real-time virtual image of an object's cross section at its actual location, without noticeable viewpoint dependence (e.g., parallax error). The virtual image is viewed through a flat, narrowband holographic optical element (HOE) whose optical power generates an in-situ virtual image (within 1 m of the HOE) from a small spatial light modulator display, without obscuring a direct view of the physical world. Rigidly fixed upon a medical ultrasound probe, an RTTH device could show the scan in its actual location inside the patient, even as the probe is moved relative to the patient.

© 2010 Optical Society of America
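
As first-order intuition (a paraxial sketch added here, not an analysis from the article), a narrowband HOE with optical power acts on its design wavelength much like a thin lens of effective focal length $f$. With the SLM placed a distance $s_o$ inside that focal length, the Gaussian lens relation

\[ \frac{1}{s_o} + \frac{1}{s_i} = \frac{1}{f}, \qquad |m| = \frac{|s_i|}{s_o}, \]

gives $s_i < 0$, i.e., an upright, magnified virtual image ($|m| > 1$ whenever $s_o < f$), which is how a small display can be imaged to a larger in-situ virtual image within about 1 m of the HOE. The actual system is off-axis and anamorphic (note the differing horizontal and vertical magnifications in Fig. 4), so this relation is only illustrative.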

Figures (5)

Fig. 1

An HOE can be used to project a nearly viewpoint-independent autostereoscopic 2D virtual image at the actual 3D location and orientation of the data in real time. The size, shape, and position of the virtual image are not fixed by those of the SLM image source.

Fig. 2

A construction optic in one of the construction beams allows the HOE to counterbalance aberrations.

Fig. 3

(a) Final playback layout of our RTTH system. (b), (c) Photos showing fetal ultrasound projected inside the mother. The images appear to have higher resolution when viewed in person, owing primarily to the larger dynamic range and smaller pupil size of the human visual system. The left photograph shows the ultrasound image floating inside the mother, as viewed through the HOE. The right image is “zoomed in,” showing part of the fetus, oriented facing left with the head down.

Fig. 4

Zemax analysis of the worst (left) and best (right) image points. The point spreads across the surface of the face plate can be multiplied by the optical system’s magnification factors (5.39 horizontally and 7.23 vertically) to predict the stability of these points in the virtual image. The long, sparse tails represent viewpoints for which the entire virtual image is not simultaneously in view.
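
To make that scaling concrete, here is a minimal sketch (the face-plate spot sizes below are hypothetical placeholders, not values from the paper); it simply multiplies a point spread measured at the face plate by the system's horizontal and vertical magnifications to estimate how far that point can wander in the virtual image across viewpoints:

    # Estimate virtual-image point stability from a face-plate point spread.
    # The magnifications are those quoted for this system; the example spot
    # sizes are hypothetical and serve only to illustrate the arithmetic.
    MAG_H = 5.39  # horizontal magnification, face plate -> virtual image
    MAG_V = 7.23  # vertical magnification, face plate -> virtual image

    def virtual_image_spread(spot_h_mm: float, spot_v_mm: float) -> tuple[float, float]:
        """Scale a face-plate point spread (mm) into the virtual image (mm)."""
        return spot_h_mm * MAG_H, spot_v_mm * MAG_V

    # A hypothetical 0.05 mm x 0.05 mm spread at the face plate maps to
    # roughly 0.27 mm x 0.36 mm of positional wander in the virtual image.
    print(virtual_image_spread(0.05, 0.05))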

Fig. 5

Virtual image that remained aligned with a physical target (a) to within 0.5 mm across viewpoints, e.g., (b) and (c). The maximum pixel width (d) was 0.97 mm.
