The display of dynamic holographic images is made possible by computing the hologram of objects in a three-dimensional scene and then transcribing the two-dimensional digital hologram onto a digital micromirror system illuminated with coherent light. Proof-of-principle instruments that reconstruct real and virtual images are described. The underlying process, its characteristics, limitations and utility are discussed.
©2003 Optical Society of America
Film-based media for the storage and projection of holographic images have existed for many years and have recently demonstrated very high quality color volumetric projections [1, 2]. There also exist several technologies that can project time-dependent non-holographic 3-D images or give the illusion of 3-D through stereoscopic effects [4–6]. True holographic projection that is dynamic and digitally driven could form the basis of future scientific and commercial visualization systems. To accomplish this, a high-resolution computer-controlled real-time phase modulator is required.
The Texas Instruments’ Digital Light Processing (DLP) micro-mirror chip was developed as the core amplitude modulation element for computer projectors and movie projection. We have now conceived of and constructed a device wherein the DLP is used to control the phase rather than the amplitude of the light directed upon it. This enables a new class of possibilities, from mega-channel phase controlled communications switches to 3-D color holographic television. We describe the prototype monochrome holographic display, the optical and computational process on which it is based, and show sample 3-D images obtained on the instrument.
2. Digital micro-mirror device
Spatial light modulators studied for altering the intensity and phase of light include transparent films [1, 2, 8], liquid crystal displays (LCD) [2, 9] and digital micromirror devices (DMD). Our DLP systems are constructed using the Texas Instruments’ 1024×768 (XGA) micro-mirror chip and associated electronics, which are driven by a standard computer video driver card (Inside Technology LCD555/PCI Video Card). The high-reflectance aluminum micro-mirrors are 16×16 microns with a one-micron gap, which gives a 17-micron row or column pitch.
The Texas Instruments DLP DMD has current applications in TV, video and movie projectors. Most white-light applications construct excellent-quality 2-D images from the direct reflection, or zero diffraction order, of light off the individual canting DMD mirrors. Coherent-light applications ranging from interferometric deformation and depth measurement [10, 11] to holographic storage have been demonstrated. Volumetric images with incoherent light have also been produced with the DMD by projecting a series of 2-D constructs to give a 3-D appearance [3, 13].
We have constructed a system that projects true dynamic 3-D holographic images from computer-generated holograms utilizing the lowest orders of diffracted light from a laser-illuminated DMD. The canting mirror structure of the DMD does not preclude its use as a reflective holographic medium for phase and amplitude control. Its highly reflective properties provide a much higher throughput than a transmissive LCD. One disadvantage of the DMD as a phase modulator is the multiple Fraunhofer diffraction orders that occur with coherent light due to the DMD grating pattern. However, it has been measured that over 88% of the diffraction energy can be coupled into a single Fraunhofer diffraction order. By using the reconstructed image of the brightest diffraction order, we estimate that our upper limit in intensity can likewise be 88% of that delivered to the DMD.
We have demonstrated the utility of the DMD as a 3-D image holographic medium by producing virtual and real 3-D images at finite distances, an essential condition for reconstruction with image depth. Our aim is to create a real-time, multi-color projection system for all digital holograms. This includes computer-generated holograms that have been studied for some time [16, 17]. Thus, our proof-of-principle demonstration is based on transcribing an existing digital hologram to the DMD and viewing the results.
3. Optical system
The optical system is composed of a 15 mW HeNe laser, spatial filter, 10 cm focal length collimating lens, DMD, 40 cm focal length converging lens and an image reconstructor for real image viewing [see Fig. 1(a)]. The real image reconstructor may be a frosted glass plate, fiber optic magnifier, or CCD/digital camera for visualization of a planar cross-section of a 3-D image; or it may be a translucent block such as a thick Agarose gel to create a suspension of micro-scatter bodies to simultaneously view the whole 3-D real image. The “original image” depicted in the upper left in Fig. 1(a) is a bitmap of a 2-D irregular perspective object. Its computed interferogram is represented on the computer monitor. The picture at the “image reconstructor” is a CCD camera photo of the actual image reconstructed on a frosted glass plate. This is an illustration of the DMD’s capability to reconstruct 2-D irregular perspective objects as well as full 3-D holographic scenes.
The 3-D holographic virtual image can be observed by looking directly into the DMD [see Fig. 1(b)]. The convergent lens and image reconstructor are removed from the optical system and the laser intensity is substantially reduced (by 80%) with neutral density filters for viewing directly by eye. The DMD functions as a reflective holographic medium in either projection mode.
4. Transform computation
The theory and algorithms for calculating the computer-generated holograms incorporate the effects of all the components of the system as in Fig. 1. The hologram may be thought of as having been produced by a coherent illumination beam that reflects off an object, then passes through, and produces an interference pattern at the back plane of the converging lens [Fig. 1(a)]. Setting the algorithm’s parameters to select the back plane of the lens as the location for the formation of the hologram offers flexibility in adding components on either side of the converging lens, such as placing physical filters at the Fourier planes of the lens to modify high-pass and low-pass components to improve reconstruction quality.
Equation (1) is the mathematical transform containing the wave physics of monochromatic light emanating from each object point, passing through the optical system and being superimposed at each point in the holographic plane. It represents the integration, over the object, of spherical-wave solutions of the Helmholtz form of the wave equation, with additional phase shifts due to a spherical converging lens in the light pathway.
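Written out explicitly, a form consistent with this description (and with the amplitude approximation discussed below) is

$$
U_s(x,y)=\int_{V'}\frac{U(x',y',z')}{i\lambda z'}\,
\exp\!\left[\,i k\, r(x,y;x',y',z')-\frac{i k\,(x^2+y^2)}{2f}\,\right]\,dV',
\qquad
r=\sqrt{(x-x')^2+(y-y')^2+z'^2},
\tag{1}
$$

where $\lambda = 2\pi/k$ is the wavelength; the remaining symbols are defined below.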
Us(x,y) represents the intensity amplitude at a point in the hologram plane and U(x’,y’,z’) is the intensity amplitude at a point on the objects in the 3-D scene of volume V’. The z’-axis is normal to the center of the hologram plane and extends through the center of the reconstructed 3-D scene volume. The wave number of the light is given by k and f is the focal length of the converging lens.
The second term in the exponential represents the phase change due to the differing lengths of material the light traverses in the converging lens in our system. Recall, a function of the converging lens is to produce the 3-D image reconstruction over a relatively near field. Operating in the near field spatially disperses the depth information, which would be lost if the reconstructed images converged at infinity. The phase information due to depth is conveyed via the first term in the exponential, which is the radial distance from an object point to a hologram point. We do not approximate this distance in our calculation, but we have incorporated the Fresnel approximation in the amplitude term in front of the exponential [16, 19]. Experimentally, we observed little change in image quality on reconstruction with this simplification.
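This asymmetry, keeping the exact distance in the phase while using a Fresnel 1/z’ factor in the amplitude, can be checked numerically with rough dimensions of our setup (the values below are illustrative assumptions):

```python
import numpy as np

Z = 0.3                        # nominal object-plane distance z' (m), assumed
WL = 633e-9                    # HeNe wavelength (m)
dx = 1024 * 17e-6 / 2          # max transverse offset across the DMD (m)

r = np.sqrt(dx ** 2 + Z ** 2)  # exact object-to-hologram distance

# Relative error of the Fresnel amplitude factor 1/z' versus the exact 1/r.
amp_err = abs(1 / r - 1 / Z) / (1 / Z)

# Phase error, in optical cycles, if r itself were replaced by z' in the phase.
phase_cycles = (r - Z) / WL
```

The amplitude factor changes by well under 0.1%, while replacing r by z’ in the phase would discard hundreds of optical cycles; this is why the radial distance is kept exact in the phase but not in the amplitude.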
Although the physical hologram illumination does not make any approximations in reconstructing the image, we have used the Fresnel approximations to estimate image properties, since they should be valid in the range of our 3-D reconstructions. Likewise, we may also apply aspects of the Fresnel rather than Fraunhofer approximations to calculate the computer-generated hologram when it reduces computation time without a noticeable degradation in display image quality.
Individual 2-D perspective images, 3-D volumes, or 3-D volumes assembled from 2-D slices can be calculated using Eq. (1). Our demonstration of the use of a DMD as a 3-D projector is based on assembling the hologram using the volume-slice approach. The 3-D volume is decomposed into planes orthogonal to the z’-axis. An interferogram for each plane with significant information content is calculated using Eq. (1). All calculated interferograms are then added, applying the superposition principle of light, to yield the final computer-generated hologram. The advantage of this approach is its potential to reduce the number of calculations necessary for real-time dynamic simulations. Subjects in 3-D scenes can be added, removed, or moved in x, y, or z by adding and/or subtracting their respective interferograms. A final hologram may also be computed in real time from a library of precomputed interferograms.
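A minimal sketch of this slice-superposition assembly, in hypothetical NumPy form, might look as follows; the mirror pitch, counts, and plane distances follow the figures quoted in this paper, but the tiny point-cluster objects, the function name, and the simplified Fresnel-style amplitude are illustrative assumptions rather than our production code:

```python
import numpy as np

WL = 633e-9                  # HeNe wavelength (m)
K = 2 * np.pi / WL           # wave number k
F = 0.25                     # converging-lens focal length (m)
PITCH = 17e-6                # DMD row/column pitch (m)
NX, NY = 1024, 768           # XGA mirror counts

# Hologram-plane coordinates, one sample per mirror, plus the lens phase term.
yg, xg = np.mgrid[0:NY, 0:NX]
XH, YH = xg * PITCH, yg * PITCH
LENS = np.exp(-1j * K * (XH ** 2 + YH ** 2) / (2 * F))

def interferogram(plane, z):
    """Interferogram of one 2-D slice located a distance z from the hologram.

    Every nonzero pixel is a point source whose strength is its gray-scale
    value; the exact radius enters the phase, a 1/z factor the amplitude.
    """
    field = np.zeros((NY, NX), dtype=complex)
    for py, px in zip(*np.nonzero(plane)):
        r = np.sqrt((XH - px * PITCH) ** 2 + (YH - py * PITCH) ** 2 + z ** 2)
        field += (plane[py, px] / z) * np.exp(1j * K * r)
    return field * LENS

# Two toy single-plane objects standing in for the rear and front slices.
rear = np.zeros((NY, NX));  rear[380:384, 500:504] = 255
front = np.zeros((NY, NX)); front[390:394, 520:524] = 255

# Superposition: the hologram is the sum of the per-slice interferograms.
hologram = interferogram(rear, 0.275) + interferogram(front, 0.30)
```

Moving an object then amounts to subtracting its old interferogram and adding a newly computed one, leaving the rest of the hologram untouched.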
As a simple example, consider the scene containing two objects shown in Fig. 2. A bitmap-image of a jet is placed at the rear plane (z’=27.5 cm), and a bitmap-image of a helicopter is placed at the front plane (z’=30 cm). For each interferogram calculation, every pixel in the image at each plane is treated as a point source of light with an intensity equal to its gray scale value.
The resulting transform calculations can be scaled in software to adjust the size of the displayed scene and to associate them with physical spatial coordinates. To reduce computation time, small bit-map images of the objects are magnified prior to transformation to fill the scene, at a trade-off in final image quality. Calculation of the transform of all the point sources of the 3-D volume results in a complex hologram array. It has been verified experimentally that either complex component is sufficient to produce the final hologram for image reconstruction, since the two are related by a simple phase shift. It was also experimentally verified that the magnitudes of the complex numbers do not produce a viewable image from the hologram, because the phase information is lost. All projections described herein use only the imaginary component of the complex numbers to form the digital hologram, such as shown in Fig. 3.
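The role of the imaginary component can be seen in a one-row toy case (an illustrative sketch with assumed parameters, not our display code): for a unit-amplitude spherical wave, the magnitude is identically one and carries no phase at all, while the imaginary part retains the fringe structure.

```python
import numpy as np

WL, Z = 633e-9, 0.3                 # wavelength and object distance (m)
K = 2 * np.pi / WL
x = np.arange(1024) * 17e-6         # one hologram row at 17-micron pitch

# Spherical wave from a single on-axis point source (unit amplitude).
phase = K * np.sqrt(x ** 2 + Z ** 2)
field = np.exp(1j * phase)

fringes = np.imag(field)            # sin(phase): a usable fringe pattern
flat = np.abs(field)                # identically 1: all phase discarded
```

With many object points the magnitude is no longer constant, but, consistent with the observation above, it still fails to encode the phase needed to refocus the scene.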
The hologram in Fig. 3 was calculated using a wavelength of 633 nm and a converging lens focal length of 25 cm with the bit-maps located as shown in Fig. 2. While the hologram in Fig. 3 is a computed approximation of a real object’s hologram, nature provides a reconstruction without any approximations. Our assumptions are validated when the image appears as and where it was encoded.
The approximations in the computer-generated hologram in Fig. 3 include a Fresnel approximation of the wave-front amplitudes and zero depth for each of the two individual bit-map objects, as noted above. The physical sizes of the objects were simulated by coding each pixel in their bit-maps as 51 by 51 microns. The helicopter bit-map was 80 pixels wide by 65 pixels high, while the jet was 120 pixels wide by 20 pixels high. The objects were also given an artificial offset of x = 50 pixels and y = 100 pixels to shift their centers from the bright centrally diffracted light, as an aid in viewing and photographing.
Lens position and alignment errors in our optical setup, which do not match the encoding parameters, can result in an image not properly forming. However, the physical reconstruction is quite tolerant of small position and alignment errors; the projected 3-D image can usually be found and the lenses adjusted to fine-tune it.
5. System demonstration
When the computer-generated hologram shown in Fig. 3 is transcribed to a HeNe-laser-illuminated DMD, a real and a virtual 3-D image of the jet and helicopter result (Fig. 4). For this demonstration, the 10 cm collimating lens was located 10 cm in front of the spatial filter, the DMD was 28 cm from the collimating lens, and the 40 cm converging lens was 11.5 cm from the DMD. The angle between the incident illumination axis and the image axis is about 20 degrees due to the cant angle of the mirrors [7, 15]. A frosted glass reconstructor was translated in front of the converging lens to image different slices of the real images produced at the two focal positions (z’=27.5 cm and z’=30.0 cm from the 40 cm lens), thus validating that the whole computer-generated scene was 3-dimensional. Photographs at each focal position were taken, yielding the original helicopter and jet, Figs. 4(a) and 4(b), respectively. The different diffraction orders result in multiple images separated by the appropriate diffraction angles relative to the brightest diffraction order. Likewise, for each diffraction order there is a corresponding inverted image. The normal and inverted images occur at equal distances, plus or minus, from the focal distance of the lens. These other diffraction-order images around the edges of the photos have been cropped for presentation. The haze at the top of the jet in Fig. 4(b) is the out-of-focus helicopter of the same diffraction order; the haze at the bottom of the jet is the out-of-focus inverted jet of the same diffraction order.
The more traditional way to visualize the reconstructed 3-D scene is to directly view the virtual image that appears in the DMD, Fig. 1(b). Shown in Fig. 4(c) is the virtual image of the reconstructed spatial volume. The helicopter appears in front of the fighter just as in the real image. The camera was positioned in front of the DMD to capture the virtual image and avoid bright diffraction spots (bright spot near lower right in figure). The camera was focused for a distance of 3 ft and placed 22 cm in front of the DMD for this picture. (The positions of the DMD, 10 cm lens and spatial filter remain as above.) The depth of field of the camera was maximized to simultaneously capture both objects in the virtual image volume. Due to the limited depth of field of the camera, the focus and overall quality is not representative of that viewed by eye.
6. System refinements
To be of true utility, a system must be high-resolution, real-time, dynamic, have multiple colors, and attain this with simple hardware and reasonable computational resources. One of the first stated needs for such 3-D displays is in aircraft cockpits. As an initial application of our DMD approach, we are constructing a prototype virtual 3-D heads-up display in which the display scenes are pre-computed interferograms to be superimposed to form the final hologram.
We are developing algorithms to move elements of a scene without recomputing the entire volume, thus reducing computing requirements. Already, we can compute and redisplay a random 3-D position in the scene volume every few seconds. We are also exploring algorithms and filters to minimize unwanted diffraction effects and to increase image quality.
Resolution of the holographic image depends directly on the DMD specifications, becoming finer as the number of mirrors increases and/or their individual dimensions decrease. Each point in a hologram samples every observable point on the 3-D object; thus, the loss of part of the hologram does not equate to the loss of a part of the reconstructed image. An entire row or column of mirrors on the DMD can malfunction and a readable display of data will still occur, with some loss of brightness and resolution. Thus, lost data due to damaged DMD mirrors or interference noise does not result in total loss of vital information. This resilience would be an advantage in a cockpit display.
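This delocalization can be illustrated with a Fourier-transform hologram standing in for the lens-based encoding (a hypothetical NumPy sketch; the damaged rows and columns and the toy scene are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.zeros((768, 1024))
scene[300:468, 400:624] = rng.random((168, 224))   # toy object

# Fourier "hologram": each hologram pixel samples every object point,
# so information about any one point is spread over the whole array.
holo = np.fft.fft2(scene)

damaged = holo.copy()
damaged[100:110, :] = 0          # lose ten entire rows of "mirrors"
damaged[:, 500:510] = 0          # and ten entire columns

recon = np.fft.ifft2(damaged).real
similarity = np.corrcoef(scene.ravel(), recon.ravel())[0, 1]
```

The reconstruction still correlates strongly with the original scene: the image survives with only a slight loss of brightness and resolution, the resilience described above.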
While we have displayed holographic images with several colors independently, multicolor images can be produced by dynamically synchronizing red, blue and green lasers with their respective holograms to form a composite image similar to methods used with DMD-based projectors now.
Additional examples, illustrations, and information are available at our Web site http://innovation.swmed.edu/HolographicTV/index.html. As in the virtual image viewing mode, all objects in the 3-D real image can be seen simultaneously by projecting into a volume of translucent scattering bodies. We have cast a 7.5 cm thick gel made from an Agarose mixture to view 3-D real images. Some pictures taken of the 3-D images in the gel tank are shown on the Web site. Illustrations of the robust nature of the hologram with respect to random loss of DMD mirrors or loss of entire rows/columns of the DMD mirrors are also presented on the Web site as well as movies of dynamic projection of the holographic images.
The DMD holographic system’s immediate advantage is its dynamic 3-D capability with current hardware technology. Figure 5 is a frame from a movie we made by sending precomputed bit-map holograms at 24 frames per second to the laser-illuminated DMD in our system. At present, we evaluate the 3-D projection of the holograms by observing their images in an Agarose gel tank, as noted above. However, to make the AVI (Audio-Video Interleaved) movie to be viewed in 2-D, the images were collected on a frosted glass plate. The video runs at 12 frames per second and shows two jets flying behind a helicopter, which then lifts off, and lands. The jets appear blurred since the glass plate was placed for the best focus of the helicopter. Although the video cannot convey the depth capability, it does show that the resolution is good enough to distinguish different aircraft types and movements, which are essential in a heads-up display.
This research has received financial support from the State of Texas Advanced Research Program (ARP), the National Cancer Institute, and the Center For Biomedical Inventions, UT Southwestern Medical Center.
References and links
1. R. Kunzig, “The Hologram Revolution,” Discover 23, 55–57, (February 2002).
2. Zebra Imaging, (2002), http://www.zebraimaging.com.
3. C. Lewis and G. Favalora, Actuality Systems, (2001), http://www.actuality-systems.com/.
4. S.A. Benton, “Real Image Holographic Stereograms,” U.S. Patent 4,834,476 (issued 30 May 1989). http://www.media.mit.edu/groups/spi.
5. Dimension Technology, Inc., (2001), http://www.dti3d.com/about.asp.
6. M. Lucente, IBM Corporation, (1996), http://www.research.ibm.com/imaging/vizspace.html.
7. L.J. Hornbeck, “Digital light processing for high-brightness, high-resolution applications,” presented at the Electronic Imaging, EI ’97, Projection Displays III co-sponsored by IS&T and SPIE, San Jose, CA, 10 February 1997.
8. H.A. Klein, Holography (Lippincott, New York, 1970).
9. T. Hast, M. Schonleber, and H.J. Tiziani, “Computer-generated holograms from 3D-objects written on twisted-nematic liquid crystal displays,” Opt. Commun. 140, 299 (1997). [CrossRef]
10. T. Kreis, P. Aswendt, and R. Hofling, “Hologram reconstruction using a digital micromirror device,” Opt. Eng. 40, 926 (2001). [CrossRef]
13. Christie Digital Systems, Mirage 10000, http://www.christiedigital.com/index10.asp
14. R.S. Nesbitt, et al., “Holographic recording using a digital micromirror device,” in Conference on Practical Holography XIII, Proc. SPIE 3637, 12–20 (1999). [CrossRef]
15. L. Yoder, et al., “DLP Technology: Applications in Optical Networking,” Texas Instruments White Pages, (2001), http://www.dlp.com/dlp_technology/dlp_technology_white_papers.asp.
16. J. P. Waters, “Three-dimensional Fourier-transform method for synthesizing binary holograms,” J. Opt. Soc. Am. 58, 368 (1968). [CrossRef]
17. A.W. Lohmann and D.P. Paris, “Binary Fraunhofer holograms, generated by computer,” App. Opt. 6, 1739 (1967) [CrossRef]
18. Note: The noun, hologram, will follow the convention that it is defined as containing the “whole” (phase and amplitude) information of a 3-D scene. Interferogram will be used throughout as containing all information (phase and amplitude) but for only a 2-D scene. Holographic or interferometric image will refer to the reconstruction due to diffraction from a hologram or interferogram, respectively.
19. B.E.A. Saleh and M.C. Teich, Fundamentals of Photonics, (Wiley, New York, 1991), pp 55–60.
20. J. R. Thayn, J. Ghrayeb, and D. G. Hooper, “3-D display design concept for cockpit and mission crewstations,” in Cockpit Displays VI: Display for Defense Applications, Proc. SPIE 3690, 180 (1999). [CrossRef]