Abstract

We propose a compact head-worn 3D display which provides glasses-free full motion parallax. Two picoprojectors placed on the viewer’s head project images on a retro-reflective screen that reflects left and right images to the appropriate eyes of the viewer. The properties of different retro-reflective screen materials have been investigated, and the key parameters of the projection – brightness and cross-talk – have been calculated. A demonstration system comprising two projectors, a screen tracking system and a commercial retro-reflective screen has been developed to test the visual quality of the proposed approach.

© 2014 Optical Society of America

1. Introduction

The development of 3D display solutions attracts growing interest both in the scientific community and in the consumer electronics industry. Numerous methods are currently being investigated to achieve a better-quality 3D experience. The first, so-called stereoscopic, 3D systems show only one image pair, viewed from a fixed perspective, using e.g. color or polarization filtering [1, 2]. Better 3D perception is obtained by projecting multiple images to different viewing angles. Different techniques, such as parallax barriers [3], lenticular lenses [4] or projector arrays [5], are investigated to achieve multiple discrete views distributed in the horizontal plane. Our goal is to construct a high-fidelity 3D projector system that provides high-resolution horizontal and vertical parallax and viewing-distance-dependent perspective, and supports multi-viewer and glasses-free operation.

It is known that retro-reflective materials can be applied in head-mounted projection displays to enhance image brightness [6, 7]. These solutions place a beam splitter between the eyes and the screen to project the retro-reflected image of the output pupil of the projector onto the pupil of the eye. To avoid beam splitters, a rotating screen can be used [8], which is a cumbersome mechanical solution, or the near-retro-reflected light can be exploited by placing two projectors close to the eyes [9]. In the above systems, the 3D experience is usually achieved by head tracking, which requires an external tracking system and is difficult to adapt to multi-user displays.

We propose a head-mounted 3D projection display (HMPD) using a diffusing retro-reflective screen and an integrated screen tracking sensor. In the next section, we explain the operation of the system and show the important role the retro-reflective screen plays in the quality of 3D imaging. In the third section we describe the evaluation of commercially available retro-reflective screen materials by goniophotometric measurement. Next, we determine the expected screen brightness and cross-talk performance of the system based on the measurement results. Finally, the operation of a demonstration model based on the proposed system and the chosen screen is presented and discussed.

2. System operation

The system we propose comprises a head-mounted unit and a retro-reflective projection screen (see Fig. 1). The head-mounted unit consists of two picoprojectors close to the eyes that project the images onto the screen. The projectors are small and consume little power, so they can be placed on the viewer’s head without inconvenience. The screen tracking sensor, also placed on the head-mounted unit, emits invisible electromagnetic radiation (e.g. near-infrared light) toward the screen and detects the image of the retro-reflecting screen with a CCD camera. From the size and shape of the screen image it calculates the distance and viewing angle of the viewer with respect to the screen. The integrated data processing system calculates the appropriate image pair of the object as viewed from the actual position of the viewer. The screen tracking sensor signal can also be used for image stabilization.


Fig. 1 Schematic diagram of the mobile 3D setup where (1) and (2) are the picoprojectors, (3) is the projected image of the right eye projector, (4) is the projected image of the left eye projector, (5) is the retro-reflective screen and (6) is the screen tracking sensor.


The projection screen has a key role in realizing the proposed 3D projection. Its task is to reflect the left and right images back to the left and right eyes, respectively, without any bulky equipment. By definition, retro-reflective materials reflect light back toward its source. This feature allows multiple users to view different content on the same screen. Retro-reflective materials reflect light back with very good efficiency, which permits the use of low-power picoprojectors. The retro-reflected light is concentrated in a small area around the projector, an area that contains the eye. Ideally, each eye would see only the image of the projector placed just above it, and nothing of the image of the other projector. In reality, depending on the light-scattering properties of the screen, a finite amount of light is also seen emanating from the other eye's projector, resulting in cross-talk between the two images.

The quality of the 3D imaging can be characterized by two important physical parameters: cross-talk ratio and brightness. The cross-talk ratio is the light power sensed by one eye from the projector above the other eye, divided by the light power sensed from the projector just above that eye. Our intention is to find the screen with the lowest cross-talk and highest brightness.

3. Measurements

In order to evaluate the cross-talk and brightness values achievable with a given screen material, we have to measure the intensity scattered by the screen as a function of the scattering angle.

For this purpose we built a goniophotometric measurement setup [10], shown in Fig. 2(a). The sample is illuminated by a 532 nm frequency-doubled Nd:YAG laser at perpendicular incidence. We observed that the screen materials show no visible dispersion, so there would be no notable difference in the measurement results at different wavelengths. Hence, a monochromatic measurement setup was used, which was much easier to handle. The scattered power is measured as a function of angle by a Coherent FieldMate detector, which can detect power from as low as 10 nW up to the mW region. The measured values are transferred to a computer by an A/D converter and read by software written in LabVIEW. The scattered intensity is measured in the reflection region, so we essentially determine the Bidirectional Reflectance Distribution Function (BRDF) of the screen material at normal incidence.
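Converting a detected power into a BRDF value amounts to dividing the detected power fraction by the solid angle subtended by the detector. A minimal sketch of this conversion, with illustrative (not measured) numbers:

```python
def brdf_estimate(p_det_w, p_inc_w, detector_area_m2, detector_dist_m):
    """Estimate a BRDF value (1/sr) at one scattering angle.

    The detector of area T at distance R subtends a solid angle T / R**2;
    dividing the detected power fraction by that solid angle gives the
    BRDF at normal incidence (cosine factor ~ 1 near retro-reflection).
    All parameter values used below are illustrative assumptions.
    """
    solid_angle_sr = detector_area_m2 / detector_dist_m ** 2
    return (p_det_w / p_inc_w) / solid_angle_sr

# Hypothetical reading: 1 uW detected from 1 mW incident, 1 mm^2 detector at 37 cm
value = brdf_estimate(1e-6, 1e-3, 1e-6, 0.37)
```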


Fig. 2 Schematic diagram of measurement setup (a), and typical angular distribution of retro-reflected light (b) with and without spatial averaging.


The most interesting part of the scattering profile is at small scattering angles. In order to avoid the disturbing shadow of the detector around the zero order, we put a beam splitter at the end of the rotating arm, with the detector placed perpendicular to it. The distance between the beam splitter and the rotating axis is 25 cm, and the distance between the beam splitter and the detector is 12 cm, so r_det = 37 cm. In order to achieve an angular resolution of 0.15°, we place a slit of 0.97 mm width in front of the detector.
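The quoted resolution follows directly from the slit width and the total detector distance; a quick check under the small-angle approximation:

```python
import math

# Angular resolution set by a 0.97 mm slit at r_det = 25 cm + 12 cm = 37 cm
slit_width_mm = 0.97
r_det_mm = 370.0
resolution_deg = math.degrees(slit_width_mm / r_det_mm)  # small-angle approximation
```

This evaluates to about 0.15°, matching the stated resolution.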

To decrease the measurement noise due to laser speckle, a spatial average is measured. An x-y stepper is integrated, which can move the sample in the horizontal and vertical directions. To minimize the deviation of the data, the distance between two measurement points is 60 μm, which corresponds to the average size of the glass beads on the measured screens. A compromise between sampling number and measurement time is made by choosing 100 measurement points for each scattering angle. The measured scattering profile must also be corrected for the beam splitter, so that the data measured when the incident light passes through the beam splitter fit the data measured when it does not. Measurement results before and after this correction can be compared in Fig. 2(b).
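The spatial-averaging step above can be sketched as a simple grid scan and mean; the 10 × 10 grid layout is our assumption for arranging the 100 points, and `read_power` stands in for the actual detector readout:

```python
def grid_offsets(n_points=100, step_um=60):
    """Stepper offsets in micrometers: a 10 x 10 grid with 60 um pitch,
    matching the average glass-bead size (illustrative sketch)."""
    side = int(round(n_points ** 0.5))
    return [((i % side) * step_um, (i // side) * step_um) for i in range(n_points)]

def speckle_average(read_power, offsets):
    """Average the detector readings over the grid to suppress laser speckle."""
    return sum(read_power(x, y) for x, y in offsets) / len(offsets)
```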

We chose the best screen material for the demonstration system from among the available commercial retro-reflectors. Two types of retro-reflective structures were investigated: corner-cube retro-reflectors and glass-bead retro-reflectors. We first made a visual rating by projecting images onto the screens. We found that typical corner-cube retro-reflectors add artifacts to the image and thus reduce homogeneity. For further experiments we used the glass-bead-based retro-reflectors. According to the described measurement, we found that a material comprising 50 μm beads partly embedded in a metallic reflector layer is currently the best retro-reflective material. Its performance in the demonstration model is described in the next section.

4. Estimation of system performance

From the measured angular intensity distribution it is possible to calculate the cross-talk ratio and the screen brightness.

The viewing angle between one eye and the projector just above it (φ_closer) depends on the distance between the projector and the eye (d_closer) and on the distance from the screen (D):

φ_closer = arctan(d_closer / D)

The viewing angle between the eye and the other projector (φ_further) is given by the following expression:

φ_further = arccos( D² / √((d_eyes² + D²)(d_closer² + D²)) )
where d_closer is the distance between the projector and the eye just above it, d_further is the distance between the eye and the other projector, d_eyes is the distance between the two eyes and D is the distance of the eye from the screen. For the calculations, the average interpupillary distance of 65 mm is used [11]. The distance d_closer was chosen so that the projector neither blocks the projected image nor reduces the field of view. We found that a projector-eye distance of 2 cm is the shortest realizable distance without using a beam splitter.
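The two equations can be evaluated directly; a sketch using the stated 2 cm projector-eye distance and 65 mm interpupillary distance, at an example viewing distance of 2 m:

```python
import math

def viewing_angles(d_closer, d_eyes, D):
    """Viewing angles (degrees) from the two equations above; all
    distances in the same unit (here meters)."""
    phi_closer = math.atan(d_closer / D)
    phi_further = math.acos(
        D ** 2 / math.sqrt((d_eyes ** 2 + D ** 2) * (d_closer ** 2 + D ** 2))
    )
    return math.degrees(phi_closer), math.degrees(phi_further)

# d_closer = 2 cm, interpupillary distance 65 mm, viewer 2 m from the screen
closer, further = viewing_angles(0.02, 0.065, 2.0)
```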

Figure 3 illustrates the geometry of the above equations.


Fig. 3 (a) Scattering angles between the eyes and the projector, (b) the scattering angles of the two eyes at a given screen-viewer distance, marked on a schematic angular scattering profile.


Using these expressions, the cross-talk is obtained as a function of the screen-viewer distance, as shown in Fig. 4. The chosen screen provides a cross-talk ratio below 0.1 over most of the distance range, which gives satisfactory 3D perception, in agreement with the visual tests carried out by Kooi and Toet [12]. Our future goal, however, is to reduce the cross-talk ratio below 0.01.
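Given a measured angular profile m(φ), the cross-talk ratio at a given distance is simply m(φ_further) / m(φ_closer). A minimal sketch with linear interpolation; the sample profile is a made-up illustration, not our measured data:

```python
import bisect

def cross_talk_ratio(angles_deg, m, phi_further_deg, phi_closer_deg):
    """Cross-talk ratio m(phi_further) / m(phi_closer) read off a measured
    angular profile by linear interpolation between sample points."""
    def interp(phi):
        i = bisect.bisect_left(angles_deg, phi)
        if i == 0:
            return m[0]
        if i >= len(angles_deg):
            return m[-1]
        t = (phi - angles_deg[i - 1]) / (angles_deg[i] - angles_deg[i - 1])
        return m[i - 1] * (1 - t) + m[i] * t
    return interp(phi_further_deg) / interp(phi_closer_deg)

phis = [0.0, 0.5, 1.0, 1.5, 2.0]       # scattering angle (degrees)
profile = [1.0, 0.8, 0.3, 0.1, 0.05]   # illustrative relative intensity
ct = cross_talk_ratio(phis, profile, 1.95, 0.57)
```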


Fig. 4 Cross-talk with respect to the distance of the screen and the viewer.


Taking into consideration the solid angle of the detector, the efficiency and the area of the screen, the image brightness [13] as a function of the screen-projector distance (D) can be expressed as

B = (m(φ) R² / (T P)) · L / (γ D²)

where m(φ) is the measured intensity distribution function, T is the area of the detector, R is the detector-screen distance, L is the luminance of the projector, P is the power of the laser and A = γ·D² is the illuminated area. The screen brightness seen by the two eyes as a function of the screen-projector distance is shown in Fig. 5.
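The brightness expression can be evaluated directly once the measurement constants are fixed. A sketch with hypothetical values (the numbers below are assumptions for illustration, not taken from our measurements):

```python
def screen_brightness(m_phi, R, T, P, L, gamma, D):
    """Screen brightness following the expression above: m(phi)/P is the
    detected power fraction, R**2 / T converts it to a per-steradian value,
    and L / (gamma * D**2) spreads the projector flux over the illuminated
    area. Symbols as defined in the text."""
    return (m_phi * R ** 2) / (T * P) * L / (gamma * D ** 2)

# Hypothetical values: 1 uW detected, R = 0.37 m, T = 1 mm^2, P = 1 mW laser,
# L = 30 lm projector, gamma = 0.25, viewer at D = 2 m
b = screen_brightness(1e-6, 0.37, 1e-6, 1e-3, 30.0, 0.25, 2.0)
```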


Fig. 5 Screen brightness seen by the eyes as a function of the screen-viewer distance.


Compared to a typical indoor display luminance of about 200–300 cd/m², we achieve more than 1000 cd/m² when the viewer's distance varies between 1 m and 4 m, which is more than satisfactory. This leaves room to use smaller projectors with lower output power in the future.

5. Demonstration system

We have built a demonstration system (see Fig. 6) to prove our concept. It consists of two Samsung SP-H03 30-lumen picoprojectors and, for screen tracking, a light source with a web camera, all mounted on a helmet. In order to project images from an output pupil as close to the eyes as possible, we use folding mirrors just above the eyes. Much smaller projectors with suitable luminous power and resolution are currently under development [14] and could be placed closer to the eyes without mirrors. The applied screen is the one selected by the measurements described above.


Fig. 6 Demonstration system comprising two picoprojectors (1),(2), two mirrors (5), a screen tracking sensor made of a webcam and LEDs (6) on a helmet (7), where the windows for the viewer’s right and left eyes just below the mirrors are marked by (3) and (4), respectively.


The software of the demonstration system is composed of two modules. The screen localization module finds reference points in the image captured by the web camera and calculates the camera position and viewing direction. The second module is a 3D game engine (MOgre [15]) which receives the calculated parameters from the screen localization module and generates the image pair to be projected from the appropriate position and orientation. Because the system has one projector per eye, the virtual scene must be rendered from two horizontally shifted viewpoints.

The screen localization module must match the detected reference points to stored reference points with known 3D coordinates. The software can calculate the position and viewing direction of the camera if at least four points can be located in the picture. We use a rectangular screen, so the most efficiently detectable reference points are the corners of the screen. The system uses a light source near the camera, so the image of the retro-reflective screen has high brightness and contrast compared to the environment. To increase the reliability and accuracy of reference point detection, the software detects the edges of the screen image rather than the corners directly; this avoids false corner detection when the captured image is noisy or contains bright regions outside the screen. When the four edges of the screen are detected, the intersections of the lines fitted to these edges define the positions of the corners.
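The corner-from-edges step reduces to intersecting pairs of fitted lines. A minimal sketch in 2D image coordinates, assuming each edge is represented by two points on it (the actual module fits lines to many edge pixels):

```python
def edge_line(p, q):
    """Line a*x + b*y = c through two points on a detected screen edge."""
    a = q[1] - p[1]
    b = p[0] - q[0]
    return a, b, a * p[0] + b * p[1]

def corner(l1, l2):
    """Intersection of two edge lines (Cramer's rule); None if parallel."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

top = edge_line((0.1, 1.0), (1.9, 1.0))    # points sampled on the top edge
right = edge_line((2.0, 0.9), (2.0, 0.1))  # points sampled on the right edge
x, y = corner(top, right)                   # top-right corner of the screen image
```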

The most important challenge in the software of the demonstration system is achieving a fast response time. Naturally, decreasing the response time reduces accuracy, and increasing accuracy lengthens the response time, so the optimal balance between these parameters must be found. In the present demo system the processing is done by an external PC. The camera response time, the data transfer, the processing and the projection add up to a delay of 120 ms, which corresponds to a delay of about 7.2 frames of the projected image. This can be reduced to an imperceptibly short delay by further development of the algorithm and by using onboard electronics integrated into the HMPD.
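The 7.2-frame figure follows from the 120 ms latency assuming a 60 Hz projection rate (our inference from the quoted numbers, not stated explicitly in the text):

```python
# Total pipeline latency: camera + transfer + processing + projection
delay_s = 0.120
frame_rate_hz = 60.0   # assumed refresh rate implied by the 7.2-frame figure
delay_frames = delay_s * frame_rate_hz
```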

Our experiments confirm our calculations: we find the screen very bright, and we are able to see high-quality 3D images from a perspective consistent with our position and viewing angle.

6. Conclusion

In this paper we have presented a compact, full-parallax, multi-viewer and glasses-free 3D head-mounted projection display system. We have investigated the retro-reflective screen as a key part of the system. A goniophotometric measurement setup has been developed to acquire the scattering properties of retro-reflective surfaces. Based on our measurements and calculations, the best-fitting commercial screen material for our system has been found. Using the measurement results, we determined the key system performance parameters: brightness and cross-talk ratio. We found that the system has very high brightness and a moderate cross-talk ratio that merits further development.

A demonstration model has also been constructed for testing the approach. Good 3D perception and high screen brightness were observed, in agreement with our calculations. These results suggest that the approach has good potential, and further development of the screen can lead to a significant increase in system performance.

Acknowledgments

This research was supported by the European Union and the State of Hungary, co-financed by the European Social Fund in the framework of TAMOP 4.2.4. A/-11-1-2012-0001 “National Excellence Program”, and “Research and development in medical technology for the efficient treatment of cataract”, VKSZ_12-1-2013-0080.

References and links

1. C. Matt, “Real D 3D Theatrical System,” European Digital Cinema Forum, retrieved 2009-03-28. http://www.edcf.net/edcf_docs/real-d.pdf

2. K. E. Jachimowicz and R. S. Gold, “Stereoscopic (3D) projection display using polarized color multiplexing,” Opt. Eng. 29(32), 838–842 (1990).

3. Y. H. Tao, Q. H. Wang, J. Gu, W. X. Zhao, and D. H. Li, “Autostereoscopic three-dimensional projector based on two parallax barriers,” Opt. Lett. 34(20), 3220–3222 (2009).

4. Y. Takaki and N. Nago, “Multi-projection of lenticular displays to construct a 256-view super multi-view display,” Opt. Express 18(9), 8824–8835 (2010), http://www.opticsinfobase.org/oe/abstract.cfm?uri=oe-18-9-8824.

5. T. Balogh, P. Kovacs, and A. Barsi, “Holovizio 3D display system,” in 3DTV Conf., pp. 1–4 (2007).

6. H. Hua, A. Girardot, C. Gao, and J. P. Rolland, “Engineering of head-mounted projective displays,” Appl. Opt. 39(22), 3814–3824 (2000).

7. R. Martins, V. Shaoulov, Y. Ha, and J. P. Rolland, “A mobile head-worn projection display,” Opt. Express 15(22), 14530–14538 (2007), http://www.opticsinfobase.org/oe/fulltext.cfm?uri=oe-15-22-14530&id=144401.

8. O. Eldes, K. Akşit, and H. Urey, “Multi-view autostereoscopic projection display using rotating screen,” Opt. Express 21(23), 29043–29054 (2013).

9. D. A. Stanton, “Head-mounted projection display system,” US Patent App. 2003/0179157 (2003).

10. Á. Kerekes, W. Lőrincz, P. S. Ramanujam, and S. Hvilsted, “Light scattering of thin azobenzene side-chain polyester layer,” Opt. Commun. 206(1), 57–65 (2002).

11. N. A. Dodgson, “Variation and extrema of human interpupillary distance,” Proc. SPIE 5291, 36–46 (2004).

12. F. Kooi and A. Toet, “Visual comfort of binocular and 3D displays,” Displays 25(2–3), 99–108 (2004).

13. W. J. Smith, Modern Optical Engineering (McGraw-Hill, 2000), Chap. 8.

14. STMicroelectronics press release (2011), http://www.st.com/web/en/press/en/t3130.

15. Support and community documentation for Ogre3D, http://www.ogre3d.org/tikiwiki/MOGRE.
