Abstract

When the light-field method with a standard lens array and a holographic functional screen (HFS) is employed to realize a tabletop three-dimensional (3D) display, the viewing area of the reconstructed 3D images is directly above the screen. Since the observers sit around the table, the viewpoints generated in the middle of the viewing area are wasted. Here, a 360-degree viewable light-field display system is demonstrated, which presents 3D images to multiple viewers in a ring-shaped viewing range. The proposed display system consists of the HFS, an aspheric conical lens array, a 27-inch LCD with a resolution of 3840×2160, an LED array and a Fresnel lens array. The aspheric conical lens is designed so that the light rays emitted from the elemental images form the viewpoints in a ring-type arrangement, and the corresponding coding method is given. Compared with the light-field display with a standard lens array, the viewpoint density is increased and the aliasing phenomenon is reduced. In the experiment, the tabletop light-field display based on the aspheric conical lens array presents high-quality 360-degree viewable 3D images with the correct perspective and occlusion.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

The three-dimensional (3D) display provides natural scenes in a realistic manner and has attracted much attention from researchers, and various efforts have been made to advance 3D technology [1]. Recent developments in real-time terrain rendering make it possible to acquire high-definition topographic maps with massive data, which are widely used in applications such as navigation, electronic maps and flight simulation [16–18]. However, the traditional two-dimensional tabletop display cannot provide the occlusion relationships between buildings, which restricts the observers' judgment of spatial accuracy. The tabletop 3D display is regarded as an ideal and promising technique for presenting rendered topographic maps in a realistic manner [2–15]. An ideal tabletop 3D display should provide the correct field of view and spatial occlusion effect, and multiple viewers should be able to share the 3D images simultaneously over a 360-degree viewing angle. The holographic 3D display is considered one of the most promising techniques in terms of real image reproduction, and this impressive method can be applied to the tabletop 3D display [2–4]. However, realizing high-quality holographic 3D images with large size and true color remains a challenge in the near future. By introducing a high-speed rotating motor, volumetric (volume-swept) displays produce volume-filling three-dimensional imagery [5–7]. Although each voxel in the reconstructed 3D scene emits light rays from the position in which it appears, the displayed image is transparent, which sacrifices the occlusion relationships. Recently, many researchers have utilized ring-shaped projector arrays to realize 360-degree tabletop 3D displays [8], but the complexity and size of such systems limit their application. To simplify the structure, display methods with one or a few high-refresh-rate projectors have been proposed [9–11]. However, the high-speed components trade color depth against refresh rate, so the generated 3D images are presented with low color and grayscale depth.

Integral imaging (II) is an attractive method of obtaining full-parallax, full-color 3D images and has been implemented in tabletop 3D display systems [12,13]. As the pattern of the lens array is noticeably observable, the primary disadvantage of II is its low spatial resolution. Our previous research overcame this problem by introducing the holographic functional screen (HFS), and two light-field display systems were demonstrated [14,15]. In these systems, the viewpoints are generated uniformly in front of the screen. In the tabletop application, the viewing range of the reconstructed 3D scene is turned from the front to the top. When people sit around the table, the viewpoints surrounding the tabletop 3D display are visible to the observers, while the viewpoints directly above the light-field display system are wasted, as shown in Fig. 1(a). The compound lens array [14] can suppress aberrations and improve the image quality of the center views within a 45-degree viewing range. With the designed novel triplet lens array [15], the views of the outer ring (from 21 degrees to 30 degrees) are clear while the views of the inner ring are degraded. The triplet lens array is therefore more suitable for the tabletop presentation than the compound lens array. However, the distribution of viewpoints cannot be changed, and the center pixels of the elemental images are wasted.

 

Fig. 1. (a) Configuration of the light-field display with the standard lens array. (b) Light controlling process of the standard lens. (c) Configuration of the proposed light-field display with the conical lens array. (d) Light controlling process of the conical lens.


In order to satisfy the special requirements of the tabletop display and improve the utilization efficiency of the elemental images, a 3D light-field display with a ring-shaped viewing range based on an aspheric conical lens array is proposed. The aspheric conical lens array is designed to distribute the light rays emitted from the elemental images in a ring-type arrangement instead of a uniform one. According to the parameters of the aspheric conical lens and the required viewpoint distribution, the elemental images are coded in polar coordinates. Through the modulation of the HFS, a tabletop 3D light-field image with a ring-shaped viewing range is acquired. Compared with a light-field display using a traditional lens (single-lens, compound-lens or triplet-lens) array, the proposed tabletop 3D display system improves the utilization efficiency of the elemental images by eliminating the wasted viewpoints, as shown in Fig. 1(b). Since the number of pixels in an elemental image is fixed, the density of the ring-shaped viewpoints is obviously higher, and by increasing the viewpoint density, the aliasing phenomenon [19] is avoided. In the experiment, the 360-degree viewable 3D light-field display based on the aspheric conical lens array presents high-quality 3D images with the correct perspective and occlusion, and dynamic 3D scenes are obtained by rendering the elemental images in real time.

2. Experimental configuration

2.1 Optical configuration

Figure 1(a) illustrates the light-field display with the standard lens array. To present 3D images to the observers sitting around the tabletop display system, the viewing range above the HFS must be at least 70 degrees. In the tabletop application, the viewpoints generated in the middle are wasted, because the observers never need these perspectives. As shown in Fig. 1(b), the pixels in the central range of the elemental image, marked in gray, are not observed. To increase the efficiency of pixel utilization, the aspheric conical lens is designed to uniformly distribute the viewpoints over a ring-shaped range by controlling the light from the elemental image. Figure 1(c) shows the configuration of the proposed tabletop light-field display. The demonstrated display system consists of the HFS, the aspheric conical lens array, an LCD with a resolution of 3840×2160, the LED array and the Fresnel lens array. The HFS is a diffusive screen that distributes light beams within a designed solid diffusing angle; this diffusing function eliminates the gaps between adjacent conical lenses. The HFS is realized by the directional laser speckle method [20]: a photoresist plate is exposed with the speckle patterns, and the repeated speckle patterns, produced by ultraviolet curing and splicing, constitute the HFS. The designed speckle patterns determine the diffusing angle.

Figure 1(d) presents the light-controlling process of an elemental image. An LED with a size of 2 mm and a corresponding Fresnel lens provide the collimated light, with the LED set at the focal plane of the Fresnel lens. The diameter and focal length of the Fresnel lens are 10 mm and 15 mm, respectively. The collimated light rays pass through the LCD and reach the lower surface of the designed aspheric conical lens. When the light rays carrying the information of the pixels emit from the upper surface of the aspheric conical lens, they are directed as shown in Fig. 1(d). The parameter ${\varphi }$ represents the angle between the view's direction and the vertical axis. The central pixels of the elemental image form the viewpoints of the inner ring with ${\varphi } = 20^\circ $, and the edge pixels form the viewpoints of the outer ring with ${\varphi } = 35^\circ $. Evidently, by introducing the aspheric conical lens and distributing the pixels of an elemental image over a ring-shaped range, the density of viewpoints is increased.
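The described behavior amounts to a linear mapping from the radial position of a pixel in the elemental image to the viewing angle ${\varphi }$ between the inner and outer rings. A minimal sketch of this assumed mapping (the function name and normalization are illustrative, not taken from the paper):

```python
PHI_INNER = 20.0  # degrees, viewing angle of the inner ring
PHI_OUTER = 35.0  # degrees, viewing angle of the outer ring

def viewing_angle(r_px: float, r_edge: float) -> float:
    """Linearly interpolate the viewing angle phi for a pixel at radial
    distance r_px inside an elemental image of edge radius r_edge."""
    if not 0.0 <= r_px <= r_edge:
        raise ValueError("pixel radius outside the elemental image")
    return PHI_INNER + (PHI_OUTER - PHI_INNER) * r_px / r_edge

# Center pixels form the inner ring, edge pixels the outer ring:
assert viewing_angle(0.0, 1.0) == 20.0
assert viewing_angle(1.0, 1.0) == 35.0
```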

To satisfy the requirements of the proposed tabletop light-field display with a ring-shaped viewing area as shown in Fig. 1(c), three conditions must be met. Firstly, the perspectives of the inner ring with ${\varphi } = 20^\circ $ are constructed by the central pixels of the elemental image, and the perspectives of the outer ring with ${\varphi } = 35^\circ $ are constructed by the edge pixels. Secondly, the formed viewpoints should be arranged uniformly over the ring-shaped range. Finally, the image displayed on the HFS should be clear enough to present the 3D scene. To meet these conditions, an aspheric conical lens with two aspheric surfaces is designed. The surface of the conical lens is rotationally symmetric. The aspheric surface formula is given in Eq. (1).

$$z = \frac{{c{{(r + {r_0})}^2}}}{{1 + \sqrt {1 - ({1 + k} ){c^2}{{(r + {r_0})}^2}} }} + {\alpha _2}{(r + {r_0})^2} + {\alpha _4}{(r + {r_0})^4} + {\alpha _6}{(r + {r_0})^6} + \cdots (r > 0)$$
where $r$ is the radial coordinate, $k$ is the conic constant, $c$ is the vertex curvature, and ${\alpha _2}$, ${\alpha _4}$, ${\alpha _6}$, … are the aspheric coefficients. The parameter ${\textrm{r}_0}$ is the off-center distance of the surface. The damped least-squares method is utilized to design the aspheric surfaces so that they meet the conditions mentioned above. The corresponding parameters and the optimized structure are shown in Fig. 2(a). The aspheric lens is made of FK1 glass (as defined by Schott Glaswerke AG), whose refractive index is 1.49 and Abbe number is 81.48. Figure 2(b) illustrates the modulation transfer function (MTF) of the aspheric conical lens. According to the MTF diagram of the designed lens, the image quality of the proposed tabletop light-field display system is improved compared with a display using a general single-lens array.
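The sag profile of Eq. (1) can be evaluated directly. The sketch below implements the formula as written; the coefficient values passed in are placeholders, since the optimized parameters are given only in Fig. 2(a):

```python
import math

def aspheric_sag(r, c, k, r0, alphas=()):
    """Surface sag z(r) of the off-centered aspheric profile in Eq. (1).
    c: vertex curvature, k: conic constant, r0: off-center distance,
    alphas: even-order aspheric coefficients (alpha_2, alpha_4, ...)."""
    s = r + r0  # shifted radial coordinate (r + r_0)
    conic = c * s**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * s**2))
    poly = sum(a * s**(2 * (i + 1)) for i, a in enumerate(alphas))
    return conic + poly

# Sanity check: with k = -1 (paraboloid) and no polynomial terms,
# the conic term reduces to c*s^2/2 exactly.
# aspheric_sag(1.0, 0.1, -1.0, 0.0) evaluates to 0.05
```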

 

Fig. 2. (a) The designed structure parameters of the aspheric conical lens. (b) Comparison of modulation transfer function for the aspheric conical lens and the general single lens.


2.2 Image coding

As mentioned above, the designed aspheric conical lens distributes the light rays from the elemental images in a ring-shaped arrangement. The corresponding coding method of the elemental image is described in Fig. 3. $\textrm{Ray}({\textrm{x},\textrm{y},{\theta },{\varphi },\textrm{R},\textrm{G},\textrm{B}} )$ represents one light ray from a voxel: x and y are the coordinates of the voxel on the HFS, ${\theta }$ and ${\varphi }$ are the direction angles of the outgoing light ray, and R, G and B contain the intensity and color information. These seven parameters precisely describe one light ray, and massive numbers of light rays with different parameters construct the vivid light-field 3D scene. $\textrm{Pixel}({\textrm{i},\textrm{j},\textrm{R},\textrm{G},\textrm{B}} )$ is a pixel located on the two-dimensional liquid crystal display panel, whose five parameters represent its coordinates, intensity and color information. Through the modulating functions of the aspheric conical lens and the HFS, each light ray corresponds to one pixel. The image coding process is to obtain the mapping relationship between the light rays of the light-field display and the pixels of the LCD.

 

Fig. 3. Coding process of the elemental image.


To simplify the calculation process, the polar coordinates $\textrm{r}$ and ${\theta ^{\prime}}$ of the pixel on the elemental image are introduced. According to the geometrical relationship, the formulas for the polar coordinates are as follows.

$$\theta ^{\prime} = \arcsin [(i\%N - N/2) \cdot {P_0}/r]$$
$$r = \sqrt {{{[N/2 - i\%N]}^2} + {{[N/2 - j\%N]}^2}} \cdot {P_0}$$
where N is the number of pixels of an elemental image at the diameter direction, ${\textrm{P}_0}$ is the width of a pixel. The symbol “%” is a modulo operator. As the formed viewpoints are arranged uniformly, the following equations are required.
$$r = \frac{{\varphi - {\varphi _{inner}}}}{{{\varphi _{outer}} - {\varphi _{inner}}}} \cdot N \cdot {P_0}$$
$$\theta ^{\prime} = \theta + 180^\circ$$
In the experiment, ${\varphi _{inner}}$ is equal to 20 degrees and ${\varphi _{outer}}$ is equal to 35 degrees. According to the above equations and geometric operations, the relationship between the parameters ($\textrm{x},\textrm{y},{\theta },{\varphi }$) of the light rays and the parameters $({\textrm{i},\textrm{j}} )$ of the pixels is given as follows.
$$x = i \cdot {P_0} - (i\%N - N/2) \cdot {P_0}/r \cdot L \cdot \tan \varphi$$
$$y = j \cdot {P_0} - (j\%N - N/2) \cdot {P_0}/r \cdot L \cdot \tan \varphi$$
$$\theta = \arcsin [(i\%N - N/2) \cdot {P_0}/r] - 180^\circ$$
$$\varphi = \frac{{r({\varphi _{outer}} - {\varphi _{inner}})}}{{N \cdot {P_0}}} + {\varphi _{inner}}$$
To construct a 3D light-field scene, all the parameters of every light ray are prepared. The coding process is to obtain the R, G and B values of every pixel on the LCD. By substituting the parameters $({\textrm{i},\textrm{j}} )$ of each pixel into Eqs. (6)–(9), the corresponding light ray and its R, G and B values are confirmed.
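Under two assumptions (that $L$ denotes the lens-array-to-HFS distance, which the text does not define explicitly, and that the direction offsets are normalized by the pixel pitch ${P_0}$ so the ratios are dimensionless), the pixel-to-ray coding can be sketched as:

```python
import math

PHI_INNER, PHI_OUTER = 20.0, 35.0  # degrees, from the experiment

def pixel_to_ray(i, j, N, P0, L):
    """Map LCD pixel indices (i, j) to ray parameters (x, y, theta, phi).
    N: pixels across one elemental image, P0: pixel pitch,
    L: assumed lens-array-to-HFS distance. Angles are in degrees."""
    di = i % N - N / 2              # offset from the elemental-image center
    dj = j % N - N / 2
    r = math.hypot(di, dj) * P0     # radial distance of the pixel
    if r == 0:
        raise ValueError("center pixel: ray direction undefined")
    # phi grows linearly with r between the inner and outer rings
    phi = r * (PHI_OUTER - PHI_INNER) / (N * P0) + PHI_INNER
    # azimuth of the outgoing ray, offset by 180 degrees
    theta = math.degrees(math.asin(di * P0 / r)) - 180.0
    # voxel position where the ray crosses the HFS plane
    tan_phi = math.tan(math.radians(phi))
    x = i * P0 - di * P0 / r * L * tan_phi
    y = j * P0 - dj * P0 / r * L * tan_phi
    return x, y, theta, phi
```

In a full coding pass, this mapping would be evaluated once per LCD pixel, and the R, G, B values of the light ray found at (x, y, θ, φ) would be written back to that pixel.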

As shown in Fig. 4, each pixel is transformed to an arc in the HFS plane, so each light ray has a finite width. In the coding process, each pixel corresponds to the center of its light ray, and the center lines represent the light rays of finite width. The central positions of the arc areas, marked with yellow dots in Fig. 4(c), are used to determine the center lines, whose parameters are employed to code the pixels.

 

Fig. 4. Top view of the mapping process of the pixels of the elemental images. (a) Pixels displayed on the LCD. (b) Pixels imaged on the HFS. (c) Center lines of the light rays with some width.


As the ray width differs between the central and edge-area pixels, the 3D image quality varies along the radial direction. According to the top view of the mapping process, the light-ray density increases with ${\varphi }$. In the outer viewing range, complex 3D scenes can be presented with high quality; in the inner viewing range, the display quality is seriously degraded. In particular, for the perspectives around ${\varphi } = 20^\circ $, only simple 3D images with low resolution can be displayed.

By utilizing the coding method described above, the coded elemental images are obtained. Figure 5 compares the elemental images of the light-field display with the standard lens array and those of the proposed tabletop 3D light-field display. The edge pixels of the two contrasting elemental images correspond to the light rays with ${\varphi } = 35^\circ $, while the central pixels correspond to the light rays with ${\varphi } = 0^\circ $ and ${\varphi } = 20^\circ $, respectively. Apparently, by employing the aspheric conical lens and the corresponding coding method, the viewpoint density is significantly increased.

 

Fig. 5. (a) Elemental images of the previous light field display. (b) Elemental images of the proposed tabletop 3D light-field display.


2.3 Comparison of the traditional and the proposed methods

The demonstrated tabletop 3D light-field display based on the aspheric conical lens array provides a ring-shaped viewing range to the observers. Compared with the implementation of the light-field display with a standard lens array [14], the proposed tabletop 3D display system significantly improves the utilization efficiency of the elemental images and the viewpoint density. As the viewpoint density increases, the inter-perspective aliasing phenomenon is reduced and the 3D image quality is improved. To evaluate the image quality of the tabletop 3D scene, a simulation method is adopted to obtain different perspectives for the two kinds of light-field display systems. In practice, the 3D image acquired by the human eye is composed of a series of sub-images observed through each single lens, so the simulation results are obtained by calculating the images displayed for the human eye [21]. The structural similarity (SSIM) indexes of the viewpoints at different angles are calculated. As shown in Fig. 6(d), the building image with large depth displayed by the light-field display with the standard lens array has low similarity. With the demonstrated tabletop 3D display system, the SSIM values are significantly improved, especially for the parts of the images with large depth. Figure 7 illustrates the displayed 3D images of the light-field display with the standard lens array and with the aspheric conical lens array; the scaling factors of the displayed images and the partial images are marked in Fig. 7. In the testing experiment, the maximum height of the buildings is 8 cm and the depth of the ground below the HFS plane is 5 cm.
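For reference, the SSIM index used in this evaluation follows the standard definition. The sketch below computes a single global SSIM statistic over whole images in pure Python; practical implementations (and presumably the one used for Fig. 6) average this statistic over local windows instead:

```python
def ssim_global(a, b, data_range=255.0):
    """Global SSIM between two equal-length grayscale pixel sequences,
    using the standard constants C1 = (0.01*L)^2 and C2 = (0.03*L)^2."""
    n = len(a)
    assert n == len(b) and n > 1
    mu_a = sum(a) / n
    mu_b = sum(b) / n
    var_a = sum((x - mu_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mu_b) ** 2 for x in b) / (n - 1)
    cov = sum((x - mu_a) * (y - mu_b) for x, y in zip(a, b)) / (n - 1)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a**2 + mu_b**2 + c1) * (var_a + var_b + c2))

# Identical images score 1.0; anticorrelated images score near or below 0.
```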

 

Fig. 6. Simulation of SSIM for different perspectives of a tabletop 3D scene. (a) Perspectives captured from different angles. (b) Depth maps of different perspectives. (c) Simulation results of the light field display with the general lens array. (d) SSIM of the light field display with the general lens array. (e) Simulation results of the light field display with the aspheric conical lens array. (f) SSIM of the light field display with the aspheric conical lens array.


 

Fig. 7. Comparison of the displayed 3D image of (a) the light field display with the general lens array (b) the light field display with the aspheric conical lens array.


3. Experimental results

In the demonstrated 360-degree tabletop light-field display system, the LEDs, Fresnel lenses, elemental images and aspheric conical lenses are equal in number, 46×26. The distance between the LED and the Fresnel lens is 15 mm, and the distance between the aspheric conical lens array and the HFS is 200 mm. A 27-inch LCD with a resolution of 3840×2160 displays the elemental images. To eliminate the gaps between adjacent conical lenses, an HFS with a 5-degree diffusing angle is adopted. According to the parameters of the aspheric conical lens, a standard computer-numerical-control (CNC) milling machine was used to make a rough mold in steel, whose surface was then smoothed by spray coating with photoresist. The aspheric conical lenses were obtained by grinding the chosen material (FK1) with the mold. An aluminium plate was punched with 46×26 holes of 10 mm radius, with a distance of 13 mm between the centers of adjacent holes. All the manufactured aspheric conical lenses were placed in the holes of the aluminium plate to form the lens array. The designed aspheric conical lens array distributes the light rays from the elemental images in a ring-shaped arrangement: the central pixels of the elemental images construct the inner light rays with ${\varphi } = 20^\circ $ and the edge pixels construct the outer light rays with ${\varphi } = 35^\circ $. By establishing the correspondence among pixels, light rays and viewpoints, the demonstrated light-field display presents perspectives over a ring-shaped range. Different perspectives along different directions of the proposed light-field display system are shown in Fig. 8 and Fig. 9. The designed system is suitable for the tabletop display.

 

Fig. 8. Different perspectives along the circular direction of the static tabletop light field 3D display (see Visualization 1).


 

Fig. 9. Different perspectives along the radius direction of the static tabletop light field 3D display (see Visualization 2). (a) Perspectives from left to right. (b) Perspectives from front to back.


The electronic map is an important application of the tabletop 3D display. By coding massive numbers of viewpoints for the demonstrated light-field display system, a 3D electronic map is reconstructed with a ring-shaped viewing range. Different views of the 3D electronic map are shown in Fig. 10. The 3D images clearly present the occlusion relationships from different angles.

 

Fig. 10. Different perspectives along the circular direction of the 3D electronic map (see Visualization 3).


When people sit around a table, the tabletop light-field display presents natural 3D scenes intuitively, and the observers can conveniently discuss the displayed 3D scene face to face. As shown in Fig. 11, by rendering and displaying the elemental images in real time on the LCD, a dynamic tabletop 3D display is realized.

 

Fig. 11. Dynamic tabletop 3D display (see Visualization 4).


4. Conclusion

A 360-degree tabletop 3D light-field display with a ring-shaped viewing range is demonstrated, which provides the correct perspective and occlusion relationships simultaneously. In the proposed tabletop 3D display system, the aspheric conical lens is designed to distribute the light rays emitted from the elemental images in a ring-type arrangement, and the mapping relationship between the light rays of the light-field display and the pixels of the LCD is deduced. Compared with the previous light-field display with a standard lens array, the demonstrated tabletop 3D display system improves the utilization efficiency of the elemental images and the viewpoint density. As the viewpoint density increases, the image quality at large depth is improved. In the experiment, the tabletop 3D light-field display based on the aspheric conical lens array presents different perspectives along different directions.

Funding

National Key Research and Development Program (2017YFB1002900); Fundamental Research Funds for the Central Universities (2019RC13, 2019PTB-018); National Natural Science Foundation of China (61705014); State Key Laboratory of Information Photonics and Optical Communications.

References

1. J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013). [CrossRef]  

2. E.-Y. Chang, J. Choi, S. Lee, S. Kwon, J. Yoo, M. Park, and J. Kim, “360-degree color hologram generation for real 3D objects,” Appl. Opt. 57(1), A91–A100 (2018). [CrossRef]  

3. Y. Lim, K. Hong, H. Kim, H.-E. Kim, E.-Y. Chang, S. Lee, T. Kim, J. Nam, H.-G. Choo, J. Kim, and J. Hahn, “360-degree tabletop electronic holographic display,” Opt. Express 24(22), 24999–25009 (2016). [CrossRef]  

4. Y. Sando, D. Barada, and T. Yatagal, “Optical rotation compensation for a holographic 3D display with a 360 degree horizontal viewing zone,” Appl. Opt. 55(30), 8589–8595 (2016). [CrossRef]  

5. G. E. Favalore, “Volumetric 3D displays and application infrastructure,” Computer 38(8), 37–44 (2005). [CrossRef]  

6. T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Comun. Image Res. 21(5–6), 586–594 (2010). [CrossRef]  

7. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “An interactive 360° light field display,” in Proceedings of ACM SIGGRAPH 2007 Emerging Technologies (2007), pp. 13.

8. S. Yoshida, “fVisiOn: 360-degree viewable glasses-free tabletop 3D display composed of conical screen and modular projector,” Opt. Express 24(12), 13194–13203 (2016). [CrossRef]  

9. Y. Takaki and J. Nakamura, “Generation of 360-degree color three dimensional images using a small array of high speed projectors to provide multiple vertical viewpoints,” Opt. Express 22(7), 8779–8789 (2014). [CrossRef]  

10. Y. Takaki and S. Uchida, “Table screen 360-degree three-dimensional display using a small array of high-speed projectors,” Opt. Express 20(8), 8848–8861 (2012). [CrossRef]  

11. X. Xia, X. Liu, H. Li, Z. Zheng, H. Wang, Y. Peng, and W. Shen, “A 360-degree floating 3D display based on light field regeneration,” Opt. Express 21(9), 11237–11247 (2013). [CrossRef]  

12. L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019). [CrossRef]  

13. M.-Y. He, H.-L. Zhang, H. Deng, X.-W. Li, D.-H. Li, and Q.-H. Wang, “Dual-view-zone tabletop 3D display system based on integral imaging,” Appl. Opt. 57(4), 952–958 (2018). [CrossRef]  

14. X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018). [CrossRef]  

15. X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lenses array and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017). [CrossRef]  

16. K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018). [CrossRef]  

17. R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016). [CrossRef]  

18. G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution,” Graph. Models 97, 64–79 (2018). [CrossRef]  

19. X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014). [CrossRef]  

20. C. Yu, J. Yuan, F. Fan, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express 18(26), 27820–27826 (2010). [CrossRef]  

21. S. Yang, X. Sang, X. Yu, X. Gao, and B. Yan, “Analysis of the depth of field for integral imaging with consideration of facet braiding,” Appl. Opt. 57(7), 1534–1540 (2018). [CrossRef]  


X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

Evangelidis, K.

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

Fan, F.

Favalore, G. E.

G. E. Favalore, “Volumetric 3D displays and application infrastructure,” Computer 38(8), 37–44 (2005).
[Crossref]

Fujii, T.

T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Comun. Image Res. 21(5–6), 586–594 (2010).
[Crossref]

Gao, X.

X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018).
[Crossref]

S. Yang, X. Sang, X. Yu, X. Gao, and B. Yan, “Analysis of the depth of field for integral imaging with consideration of facet braiding,” Appl. Opt. 57(7), 1534–1540 (2018).
[Crossref]

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

Geng, J.

J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013).
[Crossref]

Hahn, J.

Han, J.

G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution Graphical Models,” Graph. Models 97, 64–79 (2018).
[Crossref]

He, M.-Y.

Hilas, C.

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

Hong, K.

Jones, A.

A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “An interactive 360° light field display,” in Proceedings of ACM SIGGRAPH 2007 Emerging Technologies (2007), pp. 13.

Kang, G.

G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution Graphical Models,” Graph. Models 97, 64–79 (2018).
[Crossref]

Kim, H.

Kim, H.-E.

Kim, J.

Kim, T.

Kwon, S.

Lee, S.

Li, D.-H.

Li, H.

Li, S.

L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019).
[Crossref]

Li, X.-W.

Li, Y.

Lim, Y.

Lin, C.

Liu, X.

Lu, K.

R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).
[Crossref]

Luo, L.

L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019).
[Crossref]

Mastorokostas, P.

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

McDowall, I.

A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “An interactive 360° light field display,” in Proceedings of ACM SIGGRAPH 2007 Emerging Technologies (2007), pp. 13.

Nakamura, J.

Nam, J.

Pan, W.

R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).
[Crossref]

Papadopoulos, T.

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

Papatheodorou, K.

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

Park, M.

Peng, Y.

Ren, H.

L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019).
[Crossref]

Sando, Y.

Sang, X.

X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018).
[Crossref]

S. Yang, X. Sang, X. Yu, X. Gao, and B. Yan, “Analysis of the depth of field for integral imaging with consideration of facet braiding,” Appl. Opt. 57(7), 1534–1540 (2018).
[Crossref]

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

C. Yu, J. Yuan, F. Fan, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express 18(26), 27820–27826 (2010).
[Crossref]

Shen, W.

Sim, Y.

G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution Graphical Models,” Graph. Models 97, 64–79 (2018).
[Crossref]

Takaki, Y.

Tanimoto, M.

T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Comun. Image Res. 21(5–6), 586–594 (2010).
[Crossref]

Tehrani, M. P.

T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Comun. Image Res. 21(5–6), 586–594 (2010).
[Crossref]

Uchida, S.

Wang, H.

Wang, P.

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

Wang, Q.-H.

L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019).
[Crossref]

M.-Y. He, H.-L. Zhang, H. Deng, X.-W. Li, D.-H. Li, and Q.-H. Wang, “Dual-view-zone tabletop 3D display system based on integral imaging,” Appl. Opt. 57(4), 952–958 (2018).
[Crossref]

Wu, Y.

Xia, X.

Xin, Y.

L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019).
[Crossref]

Xing, S.

Xu, D.

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

C. Yu, J. Yuan, F. Fan, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express 18(26), 27820–27826 (2010).
[Crossref]

Yamada, H.

A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “An interactive 360° light field display,” in Proceedings of ACM SIGGRAPH 2007 Emerging Technologies (2007), pp. 13.

Yan, B.

S. Yang, X. Sang, X. Yu, X. Gao, and B. Yan, “Analysis of the depth of field for integral imaging with consideration of facet braiding,” Appl. Opt. 57(7), 1534–1540 (2018).
[Crossref]

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

Yang, S.

Yatagal, T.

Yendo, T.

T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Comun. Image Res. 21(5–6), 586–594 (2010).
[Crossref]

Yoo, J.

Yoshida, S.

Yu, C.

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

C. Yu, J. Yuan, F. Fan, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express 18(26), 27820–27826 (2010).
[Crossref]

Yu, X.

S. Yang, X. Sang, X. Yu, X. Gao, and B. Yan, “Analysis of the depth of field for integral imaging with consideration of facet braiding,” Appl. Opt. 57(7), 1534–1540 (2018).
[Crossref]

X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018).
[Crossref]

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

Yuan, J.

Zhai, R.

R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).
[Crossref]

Zhang, H.-L.

Zhang, W.

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

Zhao, T.

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

Zheng, Z.

Adv. Opt. Photonics (1)

J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013).
[Crossref]

Appl. Opt. (4)

Chin. Opt. Lett. (2)

X. Yu, X. Sang, D. Chen, P. Wang, X. Gao, T. Zhao, B. Yan, C. Yu, D. Xu, and W. Dou, “Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch,” Chin. Opt. Lett. 12(6), 60008–60011 (2014).
[Crossref]

X. Gao, X. Sang, X. Yu, W. Zhang, B. Yan, and C. Yu, “360° light field 3D display system based on a triplet lensesarray and holographic functional screen,” Chin. Opt. Lett. 15(12), 60008 (2017).
[Crossref]

Comput. Geosci. (1)

K. Evangelidis, T. Papadopoulos, K. Papatheodorou, P. Mastorokostas, and C. Hilas, “3D geospatial visualizations: Animation and motion effects on spatial objects,” Comput. Geosci. 111, 200–212 (2018).
[Crossref]

Computer (1)

G. E. Favalore, “Volumetric 3D displays and application infrastructure,” Computer 38(8), 37–44 (2005).
[Crossref]

Graph. Models (1)

G. Kang, Y. Sim, and J. Han, “Terrain rendering with unlimited detail and resolution Graphical Models,” Graph. Models 97, 64–79 (2018).
[Crossref]

J. Vis. Comun. Image Res. (1)

T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Comun. Image Res. 21(5–6), 586–594 (2010).
[Crossref]

Neurocomputing (1)

R. Zhai, K. Lu, W. Pan, and S. Dai, “GPU-based real-time terrain rendering: Design and implementation,” Neurocomputing 171, 1–8 (2016).
[Crossref]

Opt. Commun. (1)

L. Luo, Q.-H. Wang, Y. Xin, H. Deng, H. Ren, and S. Li, “360-degree viewable tabletop 3D display system based on integral imaging by using perspective-oriented layer,” Opt. Commun. 438, 54–60 (2019).
[Crossref]

Opt. Express (7)

Other (1)

A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “An interactive 360° light field display,” in Proceedings of ACM SIGGRAPH 2007 Emerging Technologies (2007), pp. 13.

Supplementary Material (4)

» Visualization 1: Different perspectives of the static tabletop light field 3D display
» Visualization 2: Different perspectives along the radius direction of the static tabletop light field 3D display
» Visualization 3: Different perspectives along the circular direction of the 3D electronic map
» Visualization 4: Dynamic tabletop 3D display



Figures (11)

Fig. 1. (a) Configuration of the light-field display with the standard lens array. (b) Light controlling process of the standard lens. (c) Configuration of the proposed light-field display with the conical lens array. (d) Light controlling process of the conical lens.

Fig. 2. (a) The designed structure parameters of the aspheric conical lens. (b) Comparison of the modulation transfer function for the aspheric conical lens and the general single lens.

Fig. 3. Coding process of the elemental image.

Fig. 4. Top view of the mapping process of the pixels of the elemental images. (a) Pixels displayed on the LCD. (b) Pixels imaged on the HFS. (c) Center lines of the light rays with some width.

Fig. 5. (a) Elemental images of the previous light field display. (b) Elemental images of the proposed tabletop 3D light-field display.

Fig. 6. Simulation of SSIM for different perspectives of a tabletop 3D scene. (a) Perspectives captured from different angles. (b) Depth maps of different perspectives. (c) Simulation results of the light field display with the general lens array. (d) SSIM of the light field display with the general lens array. (e) Simulation results of the light field display with the aspheric conical lens array. (f) SSIM of the light field display with the aspheric conical lens array.

Fig. 7. Comparison of the displayed 3D image of (a) the light field display with the general lens array and (b) the light field display with the aspheric conical lens array.

Fig. 8. Different perspectives along the circular direction of the static tabletop light field 3D display (see Visualization 1).

Fig. 9. Different perspectives along the radius direction of the static tabletop light field 3D display (see Visualization 2). (a) Perspectives from left to right. (b) Perspectives from front to back.

Fig. 10. Different perspectives along the circular direction of the 3D electronic map (see Visualization 3).

Fig. 11. Dynamic tabletop 3D display (see Visualization 4).

Equations (9)


z = \frac{c(r + r_0)^2}{1 + \sqrt{1 - (1 + k)c^2(r + r_0)^2}} + \alpha_2 (r + r_0)^2 + \alpha_4 (r + r_0)^4 + \alpha_6 (r + r_0)^6 + \cdots \quad (r > 0)

\theta = \arcsin\!\left[ \left( i\%N - N/2 \right) / r \right]

r = \sqrt{\left[ N/2 - i\%N \right]^2 + \left[ N/2 - j\%N \right]^2}\, P_0

r = \frac{\varphi - \varphi_{\mathrm{inner}}}{\varphi_{\mathrm{outer}} - \varphi_{\mathrm{inner}}}\, N P_0

\theta' = \theta + 180^{\circ}

x = i P_0 - \left( i\%N - N/2 \right) / r \cdot L \tan\varphi

y = j P_0 - \left( j\%N - N/2 \right) / r \cdot L \tan\varphi

\theta = \arcsin\!\left[ \left( i\%N - N/2 \right) / r \right] + 180^{\circ}

\varphi = \frac{r \left( \varphi_{\mathrm{outer}} - \varphi_{\mathrm{inner}} \right)}{N P_0} + \varphi_{\mathrm{inner}}
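The coding equations above can be sketched in code: each LCD pixel's offset from its elemental-image center gives an azimuth θ and a radius r, and r is mapped linearly onto the ring of tilt angles [φ_inner, φ_outer]. This is a minimal illustration, not the authors' implementation; the parameter values (N, P0, and the two tilt angles) are hypothetical placeholders, and atan2 is used in place of arcsin to resolve the azimuth quadrant.

```python
import math

# Hypothetical parameters (not taken from the paper): N pixels per
# elemental image, pixel pitch P0 in mm, and the inner/outer tilt
# angles of the ring-shaped viewing zone in radians.
N = 16
P0 = 0.1556            # approx. pitch of a 27-inch 3840x2160 panel
PHI_INNER = math.radians(20.0)
PHI_OUTER = math.radians(45.0)

def pixel_to_view(i, j):
    """Map LCD pixel (i, j) to a viewpoint direction (theta, phi).

    theta is the azimuth of the pixel around its elemental-image
    center; phi is the tilt angle obtained by mapping the pixel's
    radius linearly onto [PHI_INNER, PHI_OUTER].
    """
    di = (i % N) - N / 2
    dj = (j % N) - N / 2
    r_pix = math.hypot(di, dj)       # radius in pixel units
    if r_pix == 0:
        return 0.0, PHI_INNER        # degenerate center pixel
    theta = math.atan2(dj, di)       # azimuth around the ring
    r = r_pix * P0                   # radius in length units
    phi = r * (PHI_OUTER - PHI_INNER) / (N * P0) + PHI_INNER
    return theta, phi
```

Because the maximum in-lens radius is (N/2)·√2 < N pixels, the resulting φ always stays inside the designed ring [φ_inner, φ_outer].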
