
Volumetric 3D display in real space using a diffractive lens, fast projector, and polychromatic light source

Open Access

Abstract

We present a method for realizing a solid-state volumetric display based on the chromatic dispersion properties of a 150 mm diffractive lens that images a series of planar patterns presented by a digital micro-mirror device projector. The projector is driven by a narrowband polychromatic source, where the position of each image plane is defined by a distinct wavelength of light. The volumetric display system achieves 20 image planes at a volume refresh rate of 20 Hz, creating a volume of 17.4 cm³ with 13 mm of depth and a field of view of 10°, floating 145 mm above the lens in real space.

Published by The Optical Society under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

Volumetric displays constitute a class of 3D display that controls 3D volume elements (voxels) to produce a composite volume image, in a similar fashion to the way a conventional 2D display forms a picture by modulating many planar picture elements (pixels). The benefit of creating 3D imagery in this way is that, because the voxels occupy a physical volume of space, volumetric displays fulfil all the cues used by the human eye to perceive depth: the eyes of the viewer can both aim (vergence) and focus (accommodation) on the same point, and all the effects of perspective apply. The common exception to this is occlusion; as the voxels have no physical presence with which to block light, objects in the background are not obscured by the foreground. This can be advantageous, however, for applications such as reviewing medical scans or viewing point cloud data from LIDAR/RADAR. There are many methods of voxel formation; examples include sweeping an array of LEDs [1], projecting onto a moving screen [2] or a stack of liquid crystal panels [3], and using lasers to form points in free space [4] or inside a material [5–7]. Holographic solutions to volumetric displays have been proposed [8,9], from which this Letter follows. In this Letter, we present our work on using chromatic dispersion to produce an image volume in real space, without many of the concerns of other volumetric displays, such as mechanical reliability and high-power lasers.

The volumetric 3D display in this Letter is based on the chromatic dispersion properties of a diffractive optical element (DOE) lens, as seen in Fig. 1, and described by Fourier imaging theory [10]. A polychromatic light source and high-speed projector feed the lens with a rapid series of 2D images spanning a predetermined range of wavelengths. The lens focuses an input image to a plane in free space, with the focal distance depending on the illumination wavelength. The end result is similar to moving-screen displays, with the 2D images sweeping out a volume over time. Compared with physically swept displays, this approach produces an image within the user's reach and has no moving parts, and it can produce significantly larger images than laser plasma approaches. The drawbacks are a restricted field of view (FOV), which is set by the lens f-number, and an inability to display color images, since the core principle multiplexes image planes by wavelength. A similar approach has been proposed for near-eye systems [11,12], though it instead trades 2D display resolution for multiple image planes, using a DOE to map different focal lengths.


Fig. 1. Diagram of the optical setup for the volumetric display, including the LED array light source, high-speed projector, and DOE lens in a 4f configuration.


The lens is therefore a critical component of the display, as it must provide not only high chromatic dispersion, to achieve a large depth of field, but also a large radius, to allow for more sizeable images, while maintaining a manageable thickness. We chose to use a DOE for our lens, as it requires only a very thin profile, can be manufactured with relative ease and at low cost, and has strong chromatic dispersion. The difficulty comes in designing a diffractive profile that works for low f-numbers, where the paraxial approximation used in most of the literature no longer applies [13]. The surface profile z(r) of a lens designed without approximation, for a radius r, focal length f, design wavelength λ0, and refractive indices of the material and surrounding medium np and ne, is

z(r) = \frac{n_p}{n_p^2 - n_e^2}\left\{\frac{m\lambda_0}{n_e} \pm \sqrt{\left[\xi - \frac{m\lambda_0}{n_p}\right]^2 + \delta^2 r^2} + \xi\right\}, \quad (1)
where \xi = \frac{n_e}{n_p}(n_p - n_e)f and \delta = \frac{n_e}{n_p}\sqrt{n_p^2 - n_e^2} [14,15]. The boundary r_m of each diffractive zone m can be found using
r_m = \sqrt{2m\lambda_0 f + (m\lambda_0)^2}. \quad (2)
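Equation (2) is straightforward to evaluate; the short sketch below (our own illustration, not code from the Letter) compares the exact zone radii with the familiar paraxial approximation r_m ≈ √(2mλ0f) for the design values used below (λ0 = 500 nm, f = 150 mm), showing how the approximation drifts by roughly 2 mm at the edge of an f/1 aperture.

```python
import math

def zone_radius(m, wavelength, focal_length):
    """Exact boundary radius of diffractive zone m, Eq. (2) of the Letter."""
    return math.sqrt(2 * m * wavelength * focal_length + (m * wavelength) ** 2)

def zone_radius_paraxial(m, wavelength, focal_length):
    """Paraxial approximation, valid only for slow (high f-number) lenses."""
    return math.sqrt(2 * m * wavelength * focal_length)

lam0 = 500e-9   # design wavelength (m)
f0 = 150e-3     # design focal length (m)
for m in (1, 10, 100, 1000, 10000, 35000):
    exact = zone_radius(m, lam0, f0)
    approx = zone_radius_paraxial(m, lam0, f0)
    print(f"m = {m:>6d}: r_m = {exact*1e3:8.3f} mm "
          f"(paraxial {approx*1e3:8.3f} mm)")
```
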
A Ø 150 mm lens was designed in acrylic with a focal length of 150 mm (f/1) at a design wavelength of 500 nm. It was fabricated using a high-precision diamond turning machine, allowing the continuous kinoform profile, with sub-micron-scale features, to be cut directly into the bulk material. The surface profile is 1 μm deep, with features as small as 1 μm wide at the lens edge. The finished lens was characterized over a range of applicable wavelengths from 460 to 640 nm, produced by a Brimrose TLS 300 Tunable Light Source with a spectral resolution of 2 nm and collimated towards the DOE lens. The focal length was measured manually using a screen, visually determining the location of the sharpest focus and recording its distance from the lens. As seen in Fig. 2, the result was acceptably close to a linear variation in focal length across the visible range, spanning 47 mm over 180 nm or, equivalently, 0.26 mm/nm. A physical optics simulation of the lens profile was also performed in MATLAB, sampling the path difference across the lens profile and summing amplitudes. This produced a curve approximately given by
f = \frac{f_0 \lambda_0}{\lambda}. \quad (3)
The measured result is offset from this by approximately 5.9 mm across the full measurement range, which we attribute to a calibration error in the collimation of the tunable source. The diffraction efficiency of the lens is 13%, which is low but satisfactory, as the manufacturing process is still in development.
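As a consistency check (our own sketch, using only the numbers quoted above together with an assumed acrylic index of np ≈ 1.49 in air), the design equations reproduce the quoted feature sizes and dispersion of the fabricated lens:

```python
import math

# Design values quoted in the Letter
lam0 = 500e-9        # design wavelength (m)
f0 = 150e-3          # design focal length (m)
R = 75e-3            # lens radius (m), i.e. a 150 mm diameter
n_p, n_e = 1.49, 1.0 # assumed indices: acrylic (~1.49) in air

def zone_radius(m, lam, f):
    """Exact zone boundary radius from Eq. (2)."""
    return math.sqrt(2 * m * lam * f + (m * lam) ** 2)

def focal_length(lam, f0=f0, lam0=lam0):
    """First-order diffractive focal length, Eq. (3): f = f0 * lam0 / lam."""
    return f0 * lam0 / lam

# Number of zones that fit inside the aperture (invert Eq. (2) for r_m = R)
m_max = int((math.sqrt(f0 ** 2 + R ** 2) - f0) / lam0)
edge_width = zone_radius(m_max, lam0, f0) - zone_radius(m_max - 1, lam0, f0)
depth = lam0 / (n_p - n_e)   # nominal kinoform groove depth

print(f"zones across the aperture : {m_max}")
print(f"outermost zone width      : {edge_width*1e6:.2f} um")  # ~1 um, as stated
print(f"kinoform profile depth    : {depth*1e6:.2f} um")       # ~1 um, as stated

# Focal range over the characterized 460-640 nm band
f_blue, f_red = focal_length(460e-9), focal_length(640e-9)
span = f_blue - f_red
print(f"focal span 460-640 nm     : {span*1e3:.1f} mm "
      f"({span/180e-9*1e-6:.2f} mm/nm)")  # ~46 mm, ~0.25 mm/nm; measured 47 mm, 0.26 mm/nm
```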


Fig. 2. Graph of the DOE focal length, both measured and simulated as a function of input wavelength. The response is fairly linear over the investigated spectrum. There is an offset of 5.9 mm between the measured and simulated focal lengths.


The light source required to realize this architecture of volumetric display is subject to extreme specifications. It must cover as wide a wavelength range as possible in the visible spectrum to achieve maximum image depth, yet each emission line must be narrow so that the image planes are well resolved and do not overlap. The wavelength range must be sampled uniformly, avoiding gaps in the spectrum that would result in gaps in the image volume. Because each image plane is displayed individually, the light source must be able to output the desired number of wavelengths N in sequence, with the full sequence repeating at 20 Hz, in order to produce an image without visible flicker. This also leads to a requirement of high brightness, as each plane is illuminated for at most 1/N of each render pass. With such specific constraints, a new light source was developed externally for this project based on gallium nitride (GaN) resonant cavity light-emitting diodes (RCLEDs). An RCLED can be considered a classical LED structure inserted within an optical cavity, formed by distributed Bragg reflectors, that acts as a wavelength filter. This preserves the high speed and brightness of the LED while minimizing the linewidth and allowing the emission to be tuned between different LEDs. A 4×8 array of such RCLEDs was created, with the LEDs tuned between the natural blue output of GaN and green and the array spanning 13.55 mm × 10.10 mm; this array can therefore provide a maximum of 32 different image planes. Once received, the LED array was characterized using an Ocean Optics FLAME-T-VIS-NIR spectrometer. The peak wavelengths spanned 467.1–538.7 nm, a range of 71.6 nm. However, four LEDs were deemed unsuitable for this application due to wide linewidths or multiple output peaks. With these excluded, the average full width at half maximum (FWHM) of the LEDs was 6.9 nm, greater than the planned specification of 4 nm. As the fabrication of these RCLEDs is experimental and proprietary, we are unsure of the mechanism behind these flaws; future versions of the RCLEDs are expected to meet the specification as the fabrication methods are refined. The distribution of wavelengths also included overlapping peaks and gaps in the spectrum, making a further four LEDs redundant. The individual powers of the LEDs were measured using a Thorlabs PM100D with an S302C sensor. As this instrument is thermal, each measurement was allowed to settle before being read. The measurement was performed three times on separate days to account for any variation in ambient temperature, and the results were averaged. The LED powers varied from 3.44 to 9.97 mW, a significant difference, as seen in Fig. 3, which is visibly noticeable when looking at the array.
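To put these figures in context, a back-of-envelope estimate (our own, not the Letter's analysis) converts the measured peak range and linewidths into a mean peak spacing and an approximate axial extent per image plane, using the roughly 0.26 mm/nm dispersion measured above:

```python
# Rough, illustrative figures: mean peak spacing available for N planes, and
# the axial smear of a single plane implied by the source linewidth combined
# with the measured dispersion of the DOE lens.
dispersion_mm_per_nm = 0.26      # focal shift of the DOE lens (Fig. 2)
span_nm = 538.7 - 467.1          # range of usable RCLED peak wavelengths
n_planes = 20

print(f"mean peak spacing for {n_planes} planes: {span_nm/(n_planes-1):.1f} nm")
for fwhm_nm in (4.0, 6.9):       # target specification vs. measured average
    print(f"FWHM {fwhm_nm:.1f} nm -> axial smear per plane "
          f"~ {fwhm_nm*dispersion_mm_per_nm:.1f} mm")
```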


Fig. 3. Power spectrum of the LED array, representing each LED by its peak wavelength for clarity, with the FWHM represented by the central error bars. All blue LEDs (<485 nm) exhibit powers exceeding 6 mW, while LEDs >485 nm have powers less than 6 mW. LEDs overlap with one or more adjacent peaks within their FWHM. The four lighter bars represent LEDs with insufficient spectral purity.


The final key component was a Texas Instruments DLinnovations CEL 5500 high-speed projector based on digital micro-mirror device (DMD) technology. It has a resolution of 1024 × 768 (XGA) and a frame rate of 500 Hz for 8-bit images. It was chosen for its high frame rate, which can match the requirements of the light source, needing to project N different images, each at 20 Hz; the 500 Hz rate gives a maximum of 25 planes. The frame rate of the projector was tested independently of the LED array, using a standard white light source and observing the image on a screen. The projector was able to display 20 different images in sequence at 20 Hz without noticeable flicker, i.e., an image rate of 400 Hz producing a volume refresh rate of 20 Hz.
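The timing budget follows directly from these numbers; a minimal sketch (variable names are ours):

```python
# Simple timing budget for the plane sequence, using the figures quoted above.
frame_rate_hz = 500      # DMD frame rate for 8-bit images
volume_rate_hz = 20      # target volume refresh rate
n_planes = 20            # planes actually used

max_planes = frame_rate_hz // volume_rate_hz
image_rate_hz = n_planes * volume_rate_hz
per_plane_ms = 1000 / image_rate_hz        # window in which one LED is on

print(f"maximum planes at {volume_rate_hz} Hz volume rate: {max_planes}")   # 25
print(f"image rate for {n_planes} planes: {image_rate_hz} Hz")              # 400 Hz
print(f"illumination window per plane: {per_plane_ms:.1f} ms")              # 2.5 ms
print(f"duty cycle per LED: {100/n_planes:.0f}%")                           # 5%
```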

The light source was coupled into the projector using a lens triplet, as seen in Fig. 1, concentrating and collimating the light. However, this solution was far from ideal, with the distributed nature of the LEDs giving rise to a different coupling efficiency for each LED on top of the power variation already measured. Once the light exits the projector as an inverted image, it is focused onto a Thorlabs N-BK7 ground-glass diffuser (120 grit) acting as a transmissive screen, which the following optics can then image. To achieve a wide FOV, an additional lens was used in conjunction with the DOE lens to produce a 4f system, focusing the light at the focal length of the DOE lens without magnification. To best match the diameter and f-number of the DOE lens, an f/1 Fresnel lens with a diameter and focal length of 152 mm was used, placed one focal length away from the diffuser, as seen in Fig. 1. The DOE lens was the last element in the sequence, placed 252 mm from the Fresnel lens and focusing the image, no longer inverted, in free space at its wavelength-dependent focal length. It is worth noting that the magnification of the image planes also varies with wavelength as a result. An Arduino board was used to drive the LEDs one at a time, addressing them in order of their peak wavelengths. The projector and LEDs needed to be synchronized, so the clock output of the projector was used to trigger the advance to the next LED each time the projected image updated. The projector was controlled from a PC, with 8-bit images uploaded to the projector's internal memory and then replayed in sequence on the DMD at the chosen frame rate.
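To illustrate why both the image position and its magnification track the wavelength, the following ray-transfer (ABCD) sketch models an idealized version of the relay (our own simplification: thin lenses, the diffuser exactly one focal length from the Fresnel lens, and the DOE focal length taken from Eq. (3)). In this idealized model the image forms exactly one DOE focal length beyond the lens, with magnification −f_DOE(λ)/f_Fresnel, the sign indicating that the inversion of the projected image is undone; the magnitudes come out somewhat smaller than the measured image sizes reported below imply, presumably because the real spacings and effective focal lengths deviate from these nominal values, so the numbers should be read only as a qualitative illustration of the wavelength dependence.

```python
import numpy as np

def prop(d):               # free-space propagation over distance d (mm)
    return np.array([[1.0, d], [0.0, 1.0]])

def lens(f):               # thin lens of focal length f (mm)
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def doe_f(lam_nm, f0=150.0, lam0=500.0):
    return f0 * lam0 / lam_nm            # Eq. (3), result in mm

f_fresnel = 152.0          # Fresnel lens focal length (mm)
d_diff_fresnel = 152.0     # diffuser to Fresnel lens spacing (mm)
d_fresnel_doe = 252.0      # Fresnel lens to DOE lens spacing (mm)

for lam in (538.68, 500.00, 469.82):
    f2 = doe_f(lam)
    # diffuser -> Fresnel lens -> DOE lens; the image-side propagation is
    # found from the imaging condition below.
    M = lens(f2) @ prop(d_fresnel_doe) @ lens(f_fresnel) @ prop(d_diff_fresnel)
    A, B, C, D = M.ravel()
    d_img = -B / D                       # imaging condition: B + d*D = 0
    mag = A + d_img * C                  # lateral magnification at that plane
    print(f"{lam:7.2f} nm: image {d_img:6.1f} mm past the DOE, "
          f"magnification {mag:+.2f}")
```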

The volumetric display was set up using 20 of the LEDs, choosing those with the best-spaced peaks across the available range, resulting in a span of 469.82–538.68 nm. A window pattern was displayed on the projector, and pictures of the image planes were taken for each LED with a camera positioned in front of the DOE lens. The exposure time was held constant so as to accurately capture the brightness differences between LEDs. The results can be seen in Fig. 4, with large variations in both brightness and uniformity between different LEDs. The blue end of the spectrum in particular suffers from low brightness, which is made worse by the human eye's lower sensitivity at these wavelengths. When the LEDs were modulated in sequence at 400 Hz, as shown in Fig. 5, the observed effect was a series of adjacent image planes forming a volume in space. The brightest wavelengths tend to dominate the image, however, making these planes more prominent. Without taking this into account, the depth of field over the range of wavelengths was 13 mm. For a representative image size on the diffuser of 37 × 27 mm, this produced images of 38 × 28 mm and 46 × 35 mm at the maximum- and minimum-wavelength planes, respectively, creating an overall volume of 17.4 cm³.
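One plausible way to arrive at the quoted volume (our own arithmetic, treating the stack of planes as a rectangular frustum between the two extreme plane sizes) is sketched below:

```python
import math

# Reproducing the quoted image volume from the plane sizes and depth given
# above, treating the stack of planes as a rectangular frustum.
w1, h1 = 3.8, 2.8    # cm, image at the longest-wavelength (nearest) plane
w2, h2 = 4.6, 3.5    # cm, image at the shortest-wavelength (furthest) plane
depth = 1.3          # cm, axial extent of the stack

a1, a2 = w1 * h1, w2 * h2
volume = depth / 3 * (a1 + a2 + math.sqrt(a1 * a2))   # frustum volume formula
print(f"frustum volume ~ {volume:.1f} cm^3")          # ~17.3 cm^3 vs. 17.4 quoted

# Lateral magnification relative to the 37 mm x 27 mm pattern on the diffuser
print(f"magnification: {w1/3.7:.2f}-{w2/3.7:.2f} (width), "
      f"{h1/2.7:.2f}-{h2/2.7:.2f} (height)")
```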


Fig. 4. Images of a window pattern with different LED illuminations and constant camera exposure time. There are strongly perceived variations in brightness between different sources, as well as non-uniform illumination across the pattern.



Fig. 5. Volumetric display with all 20 wavelengths used in sequence. Well-defined planes at different depths labelled with wavelengths: (a) 538.68, (b) 513.94, and (c) 476.60 nm.


The viewing angle of the display was assessed with respect to the image size on the diffuser using the furthest plane (shortest wavelength), as this is the first to distort or be clipped by the lens aperture. The size of the image projected onto the diffuser was varied, and the respective angle at which image distortion began was recorded. The results can be seen in Table 1. As the viewer moves away from the optical axis, the image is observed to gradually distort as it approaches the edge of the lens, before losing visibility completely. While subjective, the viewing angle was observed to decrease with increasing image size, from a maximum of 19° to a minimum of 4°, i.e., tending towards the bound set by the paraxial condition. This strongly indicates that while the diameter of the lens is 150 mm, the viable image area is only a fraction of this.


Table 1. Viewing Angle Versus Image Size on a Diffuser
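A simple vignetting bound (our own rough model, with assumed distances and image sizes, not the analysis behind Table 1) captures the observed trend: a voxel at the edge of a plane of half-width w, floating a distance d above a lens of radius R, remains unvignetted only within a half-angle of roughly arctan((R − w)/d), so larger images shrink the usable viewing cone.

```python
import math

# Rough vignetting bound on the viewing half-angle (illustrative model only;
# the distance and image widths below are assumptions, not the Table 1 data).
R = 75.0        # lens radius, mm
d = 158.0       # assumed distance from the lens to the furthest (blue) plane, mm

for image_width_mm in (30, 46, 60, 90, 120):
    w = image_width_mm / 2
    theta = math.degrees(math.atan((R - w) / d)) if w < R else 0.0
    print(f"{image_width_mm:3d} mm wide plane -> viewing half-angle ~ {theta:4.1f} deg")
```

With these assumed sizes the bound falls from about 21° to about 5°, the same order as the 19° to 4° reported, although the practical limit also depends on where distortion becomes objectionable, which this simple bound ignores.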

As a use case example, medical image slices from an MRI scanner were presented on the volumetric display in sequence. As before, only 2–3 planes were visible when operating at 20 Hz. Images of each plane were then taken individually, with the camera exposure adjusted for each to best resolve the image. Three of these images, in green, cyan, and blue, are presented in Fig. 6, along with their respective source images. The level of detail visible is enough to resolve the major features of each image, though smaller features lose clarity. More distortion is present at the blue end of the spectrum, as the images are magnified and focused further from the lens. This was investigated further by presenting a blue (469.82 nm) grid pattern on the diffuser and taking photos of the resulting image, as shown in Fig. 7. The smaller pattern in Fig. 7(a) has non-uniform brightness but no apparent distortion. The larger pattern in Fig. 7(b) appears clipped at the edges in the camera image. The off-axis distortion seen towards the edge of the lens in Fig. 7(c) appears to be spherical aberration, caused by the highly non-paraxial conditions.


Fig. 6. (a)–(c) Medical MRI image slices converted from DICOM format. (d)–(f) Images produced by the volumetric display using the above patterns.



Fig. 7. Square grid pattern, 469.82 nm (blue) illumination. The pattern sizes were measured on the diffuser. (a) 24 mm on-axis, (b) 32 mm on-axis, and (c) 24 mm, 5° off-axis.


A series of adjacent image planes was produced in free space, forming an image volume as intended. The significant differences in brightness between the wavelengths caused some planes to appear more prominent, with the human eye drawn towards these planes at the expense of the dimmer ones, resulting in the perception of a more fragmented volume. To address this issue, future work will be aimed at modifying the light source, either re-engineering the coupling solution or finding an alternative, to improve and homogenize the brightness of the different wavelengths. Improving the range, uniformity, and sharpness of the spectral peaks would also yield a better image. Beyond this, further work would go into designing and fabricating a larger DOE to increase the possible image volume (a volume of 1000 cm³ is a desirable target for such a system) and lowering the f-number of the lens to improve the FOV, or returning to investigating holographic optical elements as an alternative. With the current implementation, image processing could be applied to reduce the distortion seen, as well as to compensate for the varying image magnification by rescaling. Applying calibrated pulse-width modulation to the LEDs would also allow the intensity of the planes to be homogenized, at the expense of brightness.
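As a sketch of the proposed pulse-width-modulation correction (an illustration of the idea rather than an implemented calibration; only the two extreme measured powers are used here), the duty cycle of each LED could be scaled so that every plane delivers the same optical energy as the dimmest LED within its illumination window:

```python
# Sketch of the proposed PWM brightness equalization: scale each LED's duty
# cycle so its energy per illumination window matches the weakest LED.
# The two powers below are the extremes measured above; a real calibration
# would use all 20 measured values.
measured_mw = {"dimmest": 3.44, "brightest": 9.97}

p_min = min(measured_mw.values())
for name, p in measured_mw.items():
    duty = p_min / p
    print(f"{name:9s}: {p:5.2f} mW -> duty cycle {duty*100:5.1f}%")

print(f"effective power after equalization: {p_min:.2f} mW per plane")
```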

In conclusion, we propose and experimentally demonstrate a solid-state volumetric display based on the combination of a diffractive lens, a polychromatic light source, and a high-speed projector, which is able to present eye-safe volumetric images in free space. Images are presented to the lens over a range of wavelengths and focused to a series of image planes in free space in front of the lens, achieving a measured image volume of 17.4 cm³ and a viewing angle of up to 19°, starting from 145 mm in front of the lens with 13 mm of depth. The principle of the display was thus successfully demonstrated.

Funding

Engineering and Physical Sciences Research Council (EPSRC) (EP/L01596X/1); Horizon 2020 Framework Programme (H2020) (694328).

Acknowledgment

The authors thank X. Luo and F. Ding of the University of Strathclyde for the diamond turning of the DOE, as well as E. Feltin of Novagan for the fabrication of the GaN RCLED array.

REFERENCES

1. R. K. Dorval, M. Thomas, and J. L. Bareau, "Volumetric three-dimensional display system," U.S. patent 6,554,430 (7 September 2003).

2. G. E. Favalora, J. Napoli, D. M. Hall, R. K. Dorval, M. G. Giovinco, M. J. Richmond, and W. S. Chun, Proc. SPIE 4712, 300 (2002).

3. A. Sullivan, Proc. SPIE 5291, 279 (2004).

4. Y. Ochiai, K. Kumagai, T. Hoshi, J. Rekimoto, S. Hasegawa, and Y. Hayasaki, in ACM SIGGRAPH 2015 Emerging Technologies (SIGGRAPH '15) (2015), p. 1.

5. B. Zhu, B. Qian, Y. Liu, C. Xu, C. Liu, Q. Chen, J. Zhou, X. Liu, and J. Qiu, NPG Asia Mater. 9, e394 (2017).

6. K. Kumagai and Y. Hayasaki, in IEEE 14th International Conference on Industrial Informatics (INDIN) (IEEE, 2016), pp. 567–569.

7. K. Kumagai, S. Hasegawa, and Y. Hayasaki, Optica 4, 298 (2017).

8. J. Khan, C. Can, A. Greenaway, and I. Underwood, Proc. SPIE 8644, 86440M (2013).

9. J. Khan, C. Blackwell, C. Can, and I. Underwood, SID Symp. Dig. Tech. Pap. 49, 177 (2018).

10. J. W. Goodman, Introduction to Fourier Optics, 3rd ed. (Roberts and Company Publishers, 2005).

11. W. Cui and L. Gao, Opt. Lett. 42, 2475 (2017).

12. W. Cui and L. Gao, Sci. Rep. 9, 6064 (2019).

13. J. C. Marron, D. K. Angell, and A. M. Tai, Proc. SPIE 1211, 62 (1990).

14. V. Moreno, J. F. Román, and J. R. Salgueiro, Am. J. Phys. 65, 556 (1997).

15. D. A. Buralli, G. M. Morris, and J. R. Rogers, Appl. Opt. 28, 976 (1989).
