Abstract

Real-time high-resolution terrain rendering has been an active research topic in computer graphics for many years and is widely used in electronic maps. However, a traditional two-dimensional display cannot present the occlusion relationships between buildings, which restricts the observer's judgment of spatial relations. With three projectors, a compound lenticular lens array and a holographic functional screen, a dynamic three-dimensional (3D) light-field display with a 90° viewing angle is demonstrated. The three projectors provide the views for the right, center and left 30° ranges, respectively. The holographic functional screen recomposes the light distribution, and the compound lenticular lens array is optimized to balance the aberrations and improve the display quality. In our experiment, the 3D light-field image with 96 perspectives provides correct geometric occlusion and smooth parallax within the viewing range. By rendering 3D images in real time and synchronizing the projectors, a dynamic light-field display is obtained.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

As 3D displays can present scenes in an intuitive and natural manner, methods to obtain high-quality 3D display have drawn increasing interest and various efforts have been made [1–12]. With the development of hardware and GPUs, real-time 3D terrain rendering is widely used in many applications, such as landscape editors, electronic maps and geographic information systems [13–15]. The 3D display is regarded as an ideal and promising technique to visualize terrain rendering with high accuracy. The auto-stereoscopic display based on a lenticular lens array or a parallax barrier is a mature, commercialized 3D display technology [1,2], which can distribute several parallax images in space. However, inherent drawbacks such as the narrow observing range, viewing jumps and the convergence-accommodation conflict limit the application of auto-stereoscopic 3D displays. Integral imaging (II) is an attractive method to realize full-color stereoscopic images with full parallax [3,4], but its limited resolution is its primary disadvantage. The holographic 3D display is considered an impressive method to present 3D images, but realizing a high-quality dynamic holographic 3D display remains a challenge for the near future [5,6].

Most existing 3D display systems have not met the expectations of current terrain rendering technology for electronic maps. A high-quality 3D display with a large viewing angle and acceptable resolution is expected, which can distribute the views in space uniformly, clearly and efficiently for the observers. The light-field display is considered a promising method to reconstruct the distribution of light rays of a real 3D scene [7–12]. Unlike traditional 3D display methods based on binocular disparity, a 3D light-field display provides all depth cues: light rays are distributed with varying brightness and color in different directions. By simulating the light rays emitted from each point of the objects, a natural and realistic 3D sense is obtained. However, many 3D light-field display methods suffer from low display quality. In practical applications, it is necessary to guarantee the viewing angle and the resolution simultaneously. Our previous research demonstrated a full-parallax light-field display with a viewing angle of 45° [16] and a 162-inch 3D light-field display with a 40° observing range [17]. To present more spatial information for a 3D urban terrain, the viewing angle must be further increased while ensuring the clarity of the image.

Here, a dynamic 3D light-field display with a 90° viewing angle and acceptable resolution is demonstrated. The proposed display system consists of three projectors, a compound lenticular lens array and a holographic functional screen [11,12]. The projectors, each with a resolution of 3840 × 2160, are used to display the coded images. The compound lenticular lens, with two aspheric surfaces and two different refractive indices, is optimized to balance the aberrations and improve the display quality. Light beams emitted from the projectors pass through the compound lenticular lens array and reach the holographic functional screen. Through the modulation of the holographic functional screen, the 3D light-field image is acquired. In the experiment, the displayed image of urban terrain provides correct 3D perception and occlusion within a 90° viewing angle along the horizontal direction. By rendering the coded images for the projectors in real time, the observer can interact with the displayed dynamic 3D scene.

2. Experimental configuration

Figure 1(a) illustrates the schematic of the demonstrated 3D light-field display. Three projectors are used to provide the 3D contents. The brightness of each projector is 2200 lumens and the throw ratio is 1.56. The compound lenticular lens array and the holographic functional screen are utilized to modulate the light distribution. The perspectives of the 90° viewing angle are divided into three groups: the right 30° views, the center 30° views and the left 30° views. For convenience, the projectors are marked A, B and C. Projector A is located at the left side and provides the right perspectives, Projector B is located at the center and provides the center perspectives, and Projector C is located at the right side and provides the left perspectives. Each group of perspectives is integrated into a coded image with a resolution of 3840 × 2160 for the corresponding projector. The coded 3D images are loaded and projected onto the compound lenticular lens array by the corresponding projectors. The size of the compound lenticular lens array is 1296 mm × 729 mm. To ensure that the three coded images overlap accurately on the compound lenticular lens array, which guarantees that the coded light rays are directed to the correct positions of the corresponding lenses, a homography-based correction is employed for the projectors. In our experimental setup, a continuous and clear 3D scene is obtained because the light wavefronts are recomposed by the holographic functional screen, which modulates the light distribution from the compound lenticular lens array.
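The homography-based correction mentioned above can be sketched with a direct linear transform (DLT) fit from point correspondences. The corner correspondences and coordinate values below are illustrative assumptions, not the authors' calibration data.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst via DLT (4+ pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (smallest singular vector)
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, x, y):
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical corner correspondences: projector pixels -> lens-array plane (mm)
src = [(0, 0), (3840, 0), (3840, 2160), (0, 2160)]
dst = [(12.0, 5.0), (1300.0, 2.0), (1290.0, 731.0), (-3.0, 728.0)]
H = fit_homography(src, dst)
print(warp_point(H, 0, 0))  # ~ (12.0, 5.0)
```

Pre-warping each projector's coded image with its fitted homography makes the three projections coincide on the lens array.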

 

Fig. 1 (a) The configuration of the proposed light-field display based on the compound lenticular lens array and multi-projectors. (b) The formation of the voxel from the lenses.


Figure 1(b) illustrates the formation of a voxel, which is the basic display unit of a light-field display system. The light beams emitted from the projectors fully cover the compound lenticular lens array. For convenience of description, the formation of a single voxel Vij is illustrated separately. The holographic functional screen is set at the focal plane of the compound lenticular lens array. As the pitch of the lenticular lens (2.7 mm) is very small relative to the projection distance, the light rays from one projector passing through one lenticular lens can be regarded as parallel beams. According to the principle of lens imaging, the light rays from different projectors arriving from different directions converge to a point. From this imaging relationship, the focal length of the compound lenticular lens array is 5.8 mm, as calculated by Eq. (1).

f = y / tan(θ)    (1)

where y is the image height of projector C, which is equal to the pitch of the lens, and θ is the incident angle of the light rays emitted from projector C.
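Eq. (1) can be checked numerically with the values stated in the text (pitch 2.7 mm, focal length 5.8 mm); the incident angle is back-computed here for illustration, since the text does not state θ explicitly.

```python
import math

pitch_mm = 2.7   # lens pitch = image height y (stated in the text)
focal_mm = 5.8   # stated focal length of the compound lenticular lens array

# Back-compute projector C's incident angle: tan(theta) = y / f
theta = math.degrees(math.atan(pitch_mm / focal_mm))
print(f"theta ≈ {theta:.1f}°")   # ≈ 25.0°

# Forward check of Eq. (1): f = y / tan(theta)
f = pitch_mm / math.tan(math.radians(theta))
print(f"f ≈ {f:.1f} mm")         # ≈ 5.8 mm
```

An incident angle of roughly 25° is consistent with the side projectors covering ±30° view groups.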

By the modulation of the holographic functional screen, the voxel is formed. The voxel is to the 3D light-field display what the pixel is to the 2D panel display: it emits light rays with varying brightness and color in different directions, realistically simulating a point of the 3D scene.

As shown in Fig. 2, three groups of light beams pass through the compound lenticular lens and converge to a point on the holographic functional screen with the designed spreading function. The holographic functional screen performs an optical transformation to distribute the incident light rays in a specifically arranged geometry. To present correct perspectives, the light beams must be spread with the right spatial angles. The output spatial angle of the voxel Vij is contributed by the light beams emitted from Projectors A, B and C, and the integrated output spatial angle Ωij is given in Eq. (2). Since the spatial angles formed by the three projectors contribute to the reconstructed 3D scene simultaneously, the proposed light-field 3D display provides a large viewing angle and continuous parallax in the horizontal direction.

 

Fig. 2 Schematic diagram of the holographic functional screen's modulation of the spatial-angle distribution.


Ωij = ∑(n=1 to N) ωAn + ∑(n=1 to N) ωBn + ∑(n=1 to N) ωCn    (2)

The diffusion characteristic of the holographic functional screen is crucial for recomposing the light beams emitted from the lenticular lens array. The method used to realize the holographic functional screen is called directional laser speckle [11]. When a diffusing plate of size a × b is illuminated by a laser, the speckle patterns are exposed on a photoresist plate behind the diffusing plate. We record the speckle pattern at a distance z0. The average speckle size is δx = λz0/a, δy = λz0/b. By ultraviolet curing and splicing, the repeated speckle patterns constitute the holographic functional screen. When the speckle pattern is illuminated by a light wave, it diffuses and confines the light within the solid diffusion angles ωhorizontal = λ/δx = a/z0 and ωvertical = λ/δy = b/z0. By controlling the speckle pattern of the holographic functional screen, the angular distributions of the light beams with the diffusion angles ωAn, ωBn and ωCn are achieved.
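The speckle relations above can be made concrete with a short numeric sketch. The wavelength and recording geometry below are assumed values chosen only to illustrate the formulas; the text does not state them.

```python
# Speckle size and diffusion angles of the holographic functional screen.
# All parameter values are assumptions for illustration, not from the paper.
wavelength_um = 0.532     # green laser, assumed
z0_mm = 100.0             # recording distance z0, assumed
a_mm, b_mm = 10.0, 1.0    # diffusing-plate aperture a x b, assumed

# Average speckle size: dx = lambda * z0 / a, dy = lambda * z0 / b
dx_um = wavelength_um * (z0_mm * 1e3) / (a_mm * 1e3)
dy_um = wavelength_um * (z0_mm * 1e3) / (b_mm * 1e3)

# Diffusion angles (radians): omega = lambda / delta = aperture / z0
omega_h = a_mm / z0_mm    # horizontal: a / z0
omega_v = b_mm / z0_mm    # vertical:   b / z0
print(dx_um, dy_um, omega_h, omega_v)
```

Note that an elongated aperture (a > b) gives a wider horizontal than vertical diffusion angle, matching a horizontal-parallax display.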

The aberrations of the lenticular lens array reduce the convergence accuracy of the light beams drawn in Fig. 1(b), which degrades the 3D imaging quality. To suppress the aberrations, a compound lens with two aspheric surfaces and two different refractive indices is designed. The aspheric surface formula is given in Eq. (3).

z = cr² / (1 + √(1 − (1 + k)c²r²)) + α₂r² + α₄r⁴ + α₆r⁶ + ⋯    (3)
where r is the radial coordinate, k is the conic constant, c is the vertex curvature, and α₂, α₄, α₆ are the aspheric coefficients. The damped least-squares method is used to optimize the primary and higher-order aberrations. The optimized structure and corresponding parameters obtained by balancing the aberrations are shown in Fig. 3(a). As shown in Figs. 3(b), 3(c) and 4(b), the chromatic aberration and distortion are suppressed to levels that do not degrade the image quality. Figure 4(a) illustrates the comparison between the optimized lens and the standard lens: the modulation transfer function of the compound lenticular lens is markedly improved. The aberrations of the designed optical system are acceptable at the 45° side viewing angle.
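Eq. (3) is the standard even-aspheric sag formula and can be written as a small function. The curvature, conic constant and coefficients below are placeholder values for illustration, not the designed lens parameters.

```python
import math

def aspheric_sag(r, c, k, alphas):
    """Sag z(r) of Eq. (3): conic base term plus even polynomial terms.

    c: vertex curvature, k: conic constant,
    alphas: {power: coefficient}, e.g. {2: a2, 4: a4, 6: a6}.
    """
    z = c * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    for power, a in alphas.items():
        z += a * r**power
    return z

# Placeholder parameters (illustrative only, not the authors' design values)
z = aspheric_sag(r=1.0, c=1 / 10.0, k=-0.5, alphas={2: 1e-4, 4: -1e-6})
print(z)  # ≈ 0.0502
```

Optimizers such as damped least squares adjust c, k and the α coefficients of both surfaces to minimize a weighted aberration merit function.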

 

Fig. 3 (a) The designed structure parameters of the compound lenticular lens. (b) Spot diagram of Projector A. (c) Spot diagram of Projector A and Projector C.


 

Fig. 4 (a) Comparison of modulation transfer function for the compound lens and the single lens. (b) Grid distortion of the compound lenticular lens.


Figure 5 compares the displayed images of the two optical structures with the same 3D content. The 3D image produced with the traditional lenticular lens array is blurry. With the compound lenticular lens, the image quality is significantly improved and the details of the sternum and the heart are much clearer.

 

Fig. 5 (a) The 3D image with the traditional lenticular lens array. (b) The 3D image with the optimized compound lenticular lens array.


3. Experimental results

In the proposed light-field display system, light beams emitted from the projectors are recomposed by the optimized compound lenticular lens array and the holographic functional screen. To simulate natural 3D vision, a large number of perspectives is necessary to achieve smooth motion parallax. The resolution of the projectors is 3840 × 2160 and the projection area is 1296 mm × 729 mm, which is the size of the compound lenticular lens array. Eight pixels are cast onto each compound lenticular lens with a pitch of 2.7 mm. To increase the number of views, the slant angle of the compound lenticular lens array is set to 14.036°, which is arctan(1/4). This slanted optical structure balances the horizontal and vertical resolutions and provides more perspectives. To explain the coding method, three lenticular lenses and their corresponding pixels from the different projectors are shown in Fig. 6. The numbers marked on the pixels are the serial numbers of the views (the view numbers increase from left to right). The left 32 views, the center 32 views and the right 32 views are coded for Projector C, Projector B and Projector A, respectively. According to the relative position between each pixel and the central axis of the lenticular lens, the coding maps of the 3D images for the three projectors are obtained [18], as shown in Fig. 6.
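The view-assignment rule for a slanted lenticular array can be sketched as below, using the standard interlacing formula with the paper's slant of arctan(1/4) and 8 pixels per lens. The exact phase/offset convention of the authors' coding map is not given in the text, so the zero-offset convention here is an assumption.

```python
import numpy as np

# View-index map for one projector under a slanted lenticular array.
W, H = 3840, 2160
PX_PER_LENS = 8          # 8 pixels covered by one compound lens
SLANT = 1.0 / 4.0        # tan(14.036°) = 1/4
VIEWS = 32               # views per projector (x3 projectors = 96 total)

x = np.arange(W)[None, :]
y = np.arange(H)[:, None]
# Fractional position of each pixel under its (slanted) lens, in [0, 1)
phase = ((x + y * SLANT) % PX_PER_LENS) / PX_PER_LENS
view_index = (phase * VIEWS).astype(int)   # 0..31, left to right
print(view_index.shape, view_index.min(), view_index.max())
```

Because consecutive rows shift the phase by a quarter pixel, four rows sample four distinct sub-positions under each lens, which is what trades vertical resolution for extra views.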

 

Fig. 6 Pixels’ arrangement of the coding images for the projectors.


By projecting the coded images onto the compound lenticular lens array and modulating them with the holographic functional screen, 96 views are presented within the 90° viewing angle. The demonstrated display system provides more than one viewpoint image per degree, which presents continuous 3D vision to the observers.

The 3D coded images and the corresponding 3D light-field display results for a 3D model of the sternum and heart are shown in Fig. 7. Continuous motion parallax can be seen in Visualization 1. In the proposed light-field display system, 8 pixels are covered by one compound lenticular lens, so the horizontal resolution of the 3D image is 480 (the projector's horizontal resolution of 3840 divided by 8). As the slanted lenticular lens is employed, the horizontal and vertical resolutions are balanced: as shown in Fig. 6, the vertical resolution of the 3D image is 540 (the projector's vertical resolution of 2160 divided by 4).
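The resolution bookkeeping above is simple arithmetic and can be checked directly:

```python
# Effective 3D image resolution with the slanted (tan = 1/4) lens array:
# horizontal loss factor 8 (pixels per lens), vertical loss factor 4.
proj_w, proj_h = 3840, 2160
px_per_lens = 8
slant_denominator = 4            # from the slant angle arctan(1/4)

horizontal_3d = proj_w // px_per_lens       # 480
vertical_3d = proj_h // slant_denominator   # 540
views_per_degree = 96 / 90                  # > 1 view per degree
print(horizontal_3d, vertical_3d, round(views_per_degree, 2))  # 480 540 1.07
```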

 

Fig. 7 (a) Coded images for the corresponding projectors. (b) 3D light-field display of human heart and sternum (see Visualization 1).


The electronic map is an important application of the 3D light-field display. Using the demonstrated display system, a static 3D image of urban terrain is shown in Fig. 8. The display depth is more than 30 cm and the occlusion relationships between buildings can be seen clearly.

 

Fig. 8 3D light-field display of static 3D image of urban terrain (see Visualization 2).


For convenient viewing of the 3D electronic map, a tabletop light-field display system with a large viewing angle is designed, as shown in Fig. 9. The diagonal of the viewing area is 31.5 inches, the pitch of the compound lenticular lens array is 1.4 mm, and the focal length is 3 mm. The display screen is slanted at a 20° angle to the ground, and three projectors provide the 3D contents. When the observer looks down at the tabletop 3D system, the 3D buildings appear to stand out of the screen. By rendering the 3D images with the backward ray-tracing technique [19] and synchronizing the projectors, real-time interactive operation is realized at a frame rate of 25 fps. As shown in Visualization 4, the buildings are zoomed in and out with the mouse wheel and the flight path of the aircraft is controlled by mouse clicks.
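As a rough illustration of the backward ray-tracing idea cited above [19], each coded pixel can be treated as one (voxel position, emission direction) pair, so the renderer traces a single ray per pixel instead of rendering 96 full perspective images. The geometry and the uniform angular-sampling convention below are assumptions, not the authors' exact scheme.

```python
import math

# Screen geometry and view count from the text; sampling rule assumed.
SCREEN_W_MM, SCREEN_H_MM = 1296.0, 729.0
FOV_DEG = 90.0
VIEWS = 96

def pixel_ray(voxel_x_mm, voxel_y_mm, view_index):
    """Return (origin, direction) of the backward ray for one coded pixel."""
    # Spread the VIEWS emission directions uniformly over the horizontal FOV
    theta = math.radians((view_index + 0.5) / VIEWS * FOV_DEG - FOV_DEG / 2)
    origin = (voxel_x_mm, voxel_y_mm, 0.0)
    direction = (math.sin(theta), 0.0, math.cos(theta))  # horizontal parallax only
    return origin, direction

o, d = pixel_ray(648.0, 364.5, 0)   # leftmost view, screen centre
print(o, d)
```

Tracing one ray per coded pixel keeps the per-frame cost proportional to a single 3840 × 2160 image, which is what makes 25 fps interactive rendering plausible.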

 

Fig. 9 (a) Frontal view of the tabletop 3D light field display system. (b) Side view of the tabletop 3D light field display system. (c) 3D light-field display of static 3D image (see Visualization 3) and interactive dynamic 3D light-field display with the controlled aircraft and the urban terrain (see Visualization 4).


4. Conclusion

A dynamic three-dimensional light-field display with a 90° viewing angle is demonstrated. It provides more than one viewpoint image per degree, which realizes smooth parallax and correct geometric occlusion within the viewing range. The 96 views are separated into three groups and coded, and the coded 3D images are projected from different directions by three projectors with a resolution of 3840 × 2160. To improve the image quality, the compound lenticular lens array is designed and fabricated to suppress the aberrations, and its slant angle is set to 14.036° to increase the number of views. Through the holographic functional screen and the compound lenticular lens array, the light beams are recomposed and a realistic 3D light-field display is demonstrated. By rendering and synchronizing the 3D images for the projectors according to the user's operations, an interactive dynamic light-field display is obtained. For convenient viewing of the 3D electronic map, a tabletop light-field display is presented.

Funding

National Natural Science Foundation of China (NSFC) (61705014); State Key Laboratory of Information Photonics and Optical Communications; National Key Research and Development Program (2017YFB1002900).

References

1. Y.-C. Chang, L.-C. Tang, and C.-Y. Yin, “Efficient simulation of intensity profile of light through subpixel-matched lenticular lens array for two- and four-view auto-stereoscopic liquid-crystal display,” Appl. Opt. 52(1), A356–A359 (2013).

2. G. J. Lv, Q. H. Wang, W. X. Zhao, and J. Wang, “3D display based on parallax barrier with multiview zones,” Appl. Opt. 53(7), 1339–1342 (2014).

3. N. Okaichi, M. Miura, J. Arai, M. Kawakita, and T. Mishina, “Integral 3D display using multiple LCD panels and multi-image combining optical system,” Opt. Express 25(3), 2805–2817 (2017).

4. S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017).

5. Y. Zhao, L. Cao, H. Zhang, D. Kong, and G. Jin, “Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method,” Opt. Express 23(20), 25440–25449 (2015).

6. J. Y. Son, C. H. Lee, O. O. Chernyshov, B. R. Lee, and S. K. Kim, “A floating type holographic display,” Opt. Express 21(17), 20441–20451 (2013).

7. X. Liu and H. Li, “The progress of light field 3-D displays,” Inf. Disp. 30(6), 6–14 (2014).

8. G. Wetzstein, D. Lanman, M. Hirsch, and R. Raskar, “Tensor displays,” ACM Trans. Graph. 31(4), 1–11 (2012).

9. W. Song, Q. Zhu, Y. Liu, and Y. Wang, “Volumetric display based on multiple mini-projectors and a rotating screen,” Opt. Eng. 54(1), 013103 (2015).

10. D. Chen, X. Sang, X. Yu, X. Zeng, S. Xie, and N. Guo, “Performance improvement of compressive light field display with the viewing-position-dependent weight distribution,” Opt. Express 24(26), 29781–29793 (2016).

11. X. Sang, F. C. Fan, C. C. Jiang, S. Choi, W. Dou, C. Yu, and D. Xu, “Demonstration of a large-size real-time full-color three-dimensional display,” Opt. Lett. 34(24), 3803–3805 (2009).

12. X. Sang, F. Fan, S. Choi, C. Jiang, C. Yu, B. Yan, and W. Dou, “Three-dimensional display based on the holographic functional screen,” Opt. Eng. 50(9), 091303 (2011).

13. K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).

14. R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).

15. G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution,” Graph. Models 97, 64–79 (2018).

16. X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018).

17. S. Yang, X. Sang, X. Yu, X. Gao, L. Liu, B. Liu, and L. Yang, “162-inch 3D light field display based on aspheric lens array and holographic functional screen,” Opt. Express 26(25), 33013–33021 (2018).

18. X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 060008 (2014).

19. S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017).

[Crossref] [PubMed]

D. Chen, X. Sang, X. Yu, X. Zeng, S. Xie, and N. Guo, “Performance improvement of compressive light field display with the viewing-position-dependent weight distribution,” Opt. Express 24(26), 29781–29793 (2016).
[Crossref] [PubMed]

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 060008 (2014).
[Crossref]

Yuan, J.

Zeng, X.

Zhai, R.

R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).
[Crossref]

Zhang, H.

Zhao, T.

Zhao, W. X.

Zhao, Y.

Zhu, Q.

W. Song, Q. Zhu, Y. Liu, and Y. Wang, “Volumetric display based on multiple mini-projectors and a rotating screen,” Opt. Eng. 54(1), 013103 (2015).
[Crossref]

ACM Trans. Graph. (1)

G. Wetzstein, D. Lanman, M. Hirsch, and R. Raskar, “Tensor displays,” ACM Trans. Graph. 31(4), 1–11 (2012).
[Crossref]

Appl. Opt. (2)

Chin. Opt. Lett. (1)

Comput. Geosci. (1)

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

Graph. Models (1)

G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution,” Graph. Models 97, 64–79 (2018).
[Crossref]

Inf. Disp. (1)

X. Liu and H. Li, “The progress of light field 3-D displays,” Inf. Disp. 30(6), 6–14 (2014).
[Crossref]

Neurocomputing (1)

R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).
[Crossref]

Opt. Eng. (2)

X. Sang, F. Fan, S. Choi, C. Jiang, C. Yu, B. Yan, and W. Dou, “Three-dimensional display based on the holographic functional screen,” Opt. Eng. 50(9), 091303 (2011).
[Crossref]

W. Song, Q. Zhu, Y. Liu, and Y. Wang, “Volumetric display based on multiple mini-projectors and a rotating screen,” Opt. Eng. 54(1), 013103 (2015).
[Crossref]

Opt. Express (8)

D. Chen, X. Sang, X. Yu, X. Zeng, S. Xie, and N. Guo, “Performance improvement of compressive light field display with the viewing-position-dependent weight distribution,” Opt. Express 24(26), 29781–29793 (2016).
[Crossref] [PubMed]

N. Okaichi, M. Miura, J. Arai, M. Kawakita, and T. Mishina, “Integral 3D display using multiple LCD panels and multi-image combining optical system,” Opt. Express 25(3), 2805–2817 (2017).
[Crossref] [PubMed]

S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017).
[Crossref] [PubMed]

Y. Zhao, L. Cao, H. Zhang, D. Kong, and G. Jin, “Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method,” Opt. Express 23(20), 25440–25449 (2015).
[Crossref] [PubMed]

J. Y. Son, C. H. Lee, O. O. Chernyshov, B. R. Lee, and S. K. Kim, “A floating type holographic display,” Opt. Express 21(17), 20441–20451 (2013).
[Crossref] [PubMed]

X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018).
[Crossref] [PubMed]

S. Yang, X. Sang, X. Yu, X. Gao, L. Liu, B. Liu, and L. Yang, “162-inch 3D light field display based on aspheric lens array and holographic functional screen,” Opt. Express 26(25), 33013–33021 (2018).
[Crossref] [PubMed]

S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017).
[Crossref] [PubMed]

Opt. Lett. (1)

Supplementary Material (4)

» Visualization 1       3D light-field display of human heart and sternum
» Visualization 2       3D light-field display of static 3D image of urban terrain
» Visualization 3       Static 3D light-field image of the tabletop display system
» Visualization 4       Interactive dynamic 3D light-field display with the controlled aircraft and the urban terrain



Figures (9)

Fig. 1 (a) The configuration of the proposed light-field display based on the compound lenticular lens array and multi-projectors. (b) The formation of the voxel from the lenses.

Fig. 2 Schematic diagram of the holographic functional screen's modulation of the spatial angular distribution.

Fig. 3 (a) The designed structure parameters of the compound lenticular lens. (b) Spot diagram of Projector A. (c) Spot diagrams of Projector A and Projector C.

Fig. 4 (a) Comparison of the modulation transfer functions of the compound lens and the single lens. (b) Grid distortion of the compound lenticular lens.

Fig. 5 (a) The 3D image with the traditional lenticular lens array. (b) The 3D image with the optimized compound lenticular lens array.

Fig. 6 Pixel arrangement of the coding images for the projectors.

Fig. 7 (a) Coded images for the corresponding projectors. (b) 3D light-field display of a human heart and sternum (see Visualization 1).

Fig. 8 3D light-field display of a static urban terrain image (see Visualization 2).

Fig. 9 (a) Frontal view of the tabletop 3D light-field display system. (b) Side view of the tabletop 3D light-field display system. (c) Static 3D light-field image (see Visualization 3) and interactive dynamic 3D light-field display with the controlled aircraft and the urban terrain (see Visualization 4).

Equations (3)

$$ f = \frac{y}{\tan\theta} $$

$$ \Omega_{ij} = \sum_{n=1}^{N} \omega_{An} + \sum_{n=1}^{N} \omega_{Bn} + \sum_{n=1}^{N} \omega_{Cn} $$

$$ z = \frac{c r^{2}}{1 + \sqrt{1 - (1+k)\,c^{2} r^{2}}} + \alpha_{2} r^{2} + \alpha_{4} r^{4} + \alpha_{6} r^{6} + \cdots $$
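The last equation is the standard even-asphere sag formula describing the compound lenticular lens profile: a conic base term plus polynomial correction terms. As a sanity check, the sketch below evaluates z(r) in Python; the coefficient values shown are hypothetical, not the paper's design values. For k = 0 and no higher-order terms the expression reduces to the sag of a circular arc, z = R - sqrt(R^2 - r^2) with R = 1/c.

```python
import math

def aspheric_sag(r, c, k, alphas):
    """Sag z(r) of an even aspheric surface:
    z = c*r^2 / (1 + sqrt(1 - (1+k)*c^2*r^2)) + sum_p alpha_p * r^p
    r: radial coordinate, c: vertex curvature (1/radius of curvature),
    k: conic constant, alphas: {power: coefficient} for the polynomial terms."""
    base = c * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    return base + sum(a * r**p for p, a in alphas.items())

# Hypothetical example: a spherical baseline (k = 0, no polynomial terms)
# with c = 0.1 mm^-1, i.e. R = 10 mm, evaluated 1 mm off axis.
z_sphere = aspheric_sag(r=1.0, c=0.1, k=0.0, alphas={})
# Adding small (made-up) alpha_4 and alpha_6 terms perturbs the profile.
z_asphere = aspheric_sag(r=1.0, c=0.1, k=0.0, alphas={4: 1e-4, 6: -1e-6})
```

In lens-design practice the curvature, conic constant, and alpha coefficients are what an optimizer adjusts to balance the aberrations across the three projection directions.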
