Abstract

A new type of three-dimensional (3D) display system based on two different techniques, image floating and integral imaging, is proposed. Image floating is a classical 3D display technique in which a large convex lens or a concave mirror displays the image of a real object to the observer. An electro-floating system, which does not use a real object, requires a volumetric display part in order to present 3D moving pictures. Integral imaging is an autostereoscopic technique consisting of a lens array and a two-dimensional display device. The integral imaging method can be adapted for use in an electro-floating display system because the integrated image has volumetric characteristics within its viewing angle. The proposed system combines the merits of the two techniques, such as an impressive feel of depth and ease of assembly. In this paper, the viewing characteristics of the two techniques are defined and analyzed for the optimal design of the proposed system. The basic experiments for assembling the proposed system were performed and the results are presented. The proposed system can be successfully applied to many 3D applications such as 3D television.

©2005 Optical Society of America

1. Introduction

Three-dimensional (3D) display, the ultimate visual medium, can transmit image information to observers with maximum reality. Many researchers have made long-term efforts to propose and develop 3D display systems capable of expressing 3D images like real objects. There are several categories of 3D display, such as the stereoscopic display, the volumetric display, and the holographic display [1]. The stereoscopic display, one of the most popular types, adopts binocular disparity as the depth cue because a stereoscopic display system can be embodied more easily than other types of 3D display systems. However, most stereoscopic display systems only give a feeling of depth, whereas volumetric display systems can present 3D images with real volume.

Integral imaging is one of the most promising autostereoscopic display methods. It can provide a full-color image with both vertical and horizontal parallax without the need for any special aids for the observer [2-5]. The integral imaging display system consists of a lens array and an ordinary two-dimensional display device, and the 3D reconstructed images are integrated by the lens array from elemental images that are displayed on the display device.

Image floating is a classical 3D display method which can now be found in science museums. The floating display system uses a large convex lens or a concave mirror to display the image of a real object to the observer. Since the 3D image of the object can be located in close proximity to the observer, the floating display system provides an impressive feel of depth, in that the image appears to be located in free space near the observer. An electro-floating display system is a floating display system in which a volumetric display system is substituted for the real object [6].

In this paper we propose a new 3D display scheme which combines these two display methods. The proposed system is an electro-floating display system in which the object part is composed of an integral imaging display system. The integral imaging method can be adapted for use in an electro-floating display system because the integrated image has volumetric characteristics within a certain viewing region. The proposed system, which consists of simple and low-cost optical components such as a large-aperture convex lens and a lens array, can present volumetric 3D images with an impressive feel of depth. Experimental results verifying the proposed system are presented. Moreover, the viewing parameters of the proposed system, such as the viewing angle, the magnification factor, and the thickness of the floating image, are analyzed using the viewing characteristics of the parent techniques.

2. Concepts of adopted display techniques

2.1 Image floating

Image floating is a simple 3D display method in which the feel of depth is emphasized by a floating lens. Figure 1(a) shows the concept of the floating display. The floating image can be located either in front of or behind the floating lens; its position is determined by the lens equation from the focal length of the floating lens and the position of the object. A configuration in which the distance between the object and the floating lens is always greater than the focal length is suitable for the floating system. Consequently, only the floating system whose floating image is located in front of the floating lens is treated in this paper. As shown in Fig. 1(a), the floating image is rotated by 180° because of the imaging property of the floating lens.

The floating display system cannot make a 3D image from a plane image; it can only change the position of the image. This means that a 3D image cannot be generated by a floating display system in which an ordinary 2D display device replaces the real object. Therefore, an electro-floating display system which does not use a real object requires a 3D display component capable of providing volumetric 3D images in order to present 3D moving pictures. Figure 1(b) shows the concept of the electro-floating display. In this case, the inversion of the floating image can easily be corrected by rotating the object image, as shown in Fig. 1(b). The floating lens shown in Fig. 1 can also be replaced with a concave mirror, since a concave mirror is equivalent to a convex lens from the viewpoint of imaging optics.

The floating image is observed through the floating lens. As a result, the viewing area of the floating system is restricted by the aperture of the floating lens. This restriction becomes more severe as the floating distance (the distance between the floating lens and the floating image) increases; the floating distance is determined by the lens equation from the focal length of the floating lens and the object distance (the distance between the floating lens and the object). Figure 2 shows that the viewing area of the floating system is dependent on the aperture of the floating lens. The observer can watch the entire floating image in the blue region, while only partial images can be observed in the red regions, as shown in Fig. 2. The viewing area becomes wider as the aperture of the floating lens becomes larger and the floating distance becomes shorter. The viewing angle of the floating system for the whole view, Ω_fl_whole, is given by:

\Omega_{fl\_whole} = 2 \arctan\left( \frac{w - s_{fl}}{2 L_{fl}} \right), \quad (1)

where w is the aperture size of the floating lens, L_fl the floating distance, and s_fl the size of the floating image. The reference point of the viewing angle of the whole view, A_rf, changes with the floating distance and with the sizes of the floating lens and the floating image. This variation in the reference point makes comparisons between different systems difficult. A viewing angle with a fixed reference point can be defined approximately under the assumption that the size of the floating image can be ignored:

\Omega_{fl} = 2 \arctan\left( \frac{w}{2 L_{fl}} \right), \quad (2)

where Ω_fl is the approximate viewing angle of the floating system, with the reference point fixed on the floating image. In an actual system, the viewing angle obtained by Eq. (2) denotes the angular region in which the observer can see more than half of the floating image.
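
As a concrete illustration of the two definitions, the short Python sketch below evaluates Eqs. (1) and (2) as reconstructed above; the aperture, image size, and distance values are illustrative placeholders rather than parameters of the experimental setup.

```python
import math

def viewing_angle_whole(w, s_fl, L_fl):
    """Eq. (1): viewing angle of the whole view (reference point A_rf varies)."""
    return 2.0 * math.atan((w - s_fl) / (2.0 * L_fl))

def viewing_angle_approx(w, L_fl):
    """Eq. (2): approximate viewing angle, reference point fixed on the floating image."""
    return 2.0 * math.atan(w / (2.0 * L_fl))

# Illustrative values in mm: lens aperture, floating image size, floating distance.
w, s_fl, L_fl = 300.0, 50.0, 350.0
print(math.degrees(viewing_angle_whole(w, s_fl, L_fl)))  # narrower whole-view angle
print(math.degrees(viewing_angle_approx(w, L_fl)))       # wider approximate angle
```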

Fig. 1. Concept of image floating

The size of the floating image can be magnified or reduced according to the ratio of the floating distance to the object distance, as shown in Fig. 2. The magnification factor of the floating system is given by:

M_{fl} = \frac{s_{fl}}{s_{ob}} = \frac{L_{fl}}{L_{ob}} = \frac{f_{fl}}{L_{ob} - f_{fl}} = \frac{L_{fl} - f_{fl}}{f_{fl}}, \quad (3)

where s_ob is the size of the object, M_fl the magnification factor of the floating lens, f_fl the focal length of the floating lens, and L_ob the object distance. According to Eq. (3), the magnification factor equals 1 when the object distance and the floating distance are equal, i.e., both are twice the focal length. When there are multiple objects at different longitudinal locations, the magnification factor of each object is determined individually by Eq. (3). In the floating display system, the distance between the floating images can differ from the distance between the objects, but the longitudinal sequence of the floating images is identical to that of the objects.
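
The relation between the object distance, the floating distance, and the magnification factor can be checked numerically. The following sketch is a minimal illustration of Eq. (3) together with the thin-lens equation; the 175 mm focal length is taken from the experiment in Section 3, while the example object distances are arbitrary.

```python
def floating_distance(f_fl, L_ob):
    """Thin-lens relation behind Eq. (3): 1/L_ob + 1/L_fl = 1/f_fl (valid for L_ob > f_fl)."""
    return f_fl * L_ob / (L_ob - f_fl)

def magnification(f_fl, L_ob):
    """Eq. (3): M_fl = L_fl / L_ob = f_fl / (L_ob - f_fl)."""
    return f_fl / (L_ob - f_fl)

f_fl = 175.0                          # floating lens focal length (mm), as in Section 3
for L_ob in (262.5, 350.0, 525.0):    # example object distances of 1.5f, 2f, and 3f
    print(L_ob, floating_distance(f_fl, L_ob), magnification(f_fl, L_ob))
# At L_ob = 2*f_fl = 350 mm, the floating distance is also 350 mm and M_fl = 1.
```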

Fig. 2. Viewing area of floating system

2.2 Integral imaging

Integral imaging, also referred to as integral photography or integral videography, is an autostereoscopic 3D display method [2, 7]. Figure 3 shows the concept of an integral imaging system. As shown in Fig. 3, the lens array, which is composed of elemental lenses, produces elemental images of the object, and the elemental images presented on the 2D display device pass through the lens array, where they are integrated into a 3D image. When elemental images obtained by a real pickup process are used directly in the reconstruction, the longitudinal sequence of the image is inverted; this distortion is referred to as the pseudoscopic problem, and it occurs because the directions of pickup and display are opposite. Methods such as the two-step method and the graded-index lens method have been proposed to solve the pseudoscopic problem [3, 5]. Moreover, elemental images can be generated by computer calculation instead of a pickup process [8]. In this case, the pseudoscopic problem is resolved naturally.
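
The computer generation of elemental images mentioned above can be pictured with a simple pinhole (lens-center projection) model. The sketch below is only a schematic rendering of that idea under assumed parameters (array size, pitch, gap, and resolution); it is not the procedure of Ref. [8], which should be consulted for the actual method.

```python
import numpy as np

def elemental_images(points, n_lens, pitch, gap, res):
    """Illustrative pinhole-model generation of elemental images: each 3-D object
    point is projected through the center of every elemental lens onto the display
    plane located a distance `gap` behind the lens array. `points` is an (N, 3)
    array of (x, y, z) positions in mm with z > 0 measured toward the object."""
    px = pitch / res                                   # display pixel size (mm)
    images = np.zeros((n_lens, n_lens, res, res), dtype=np.uint8)
    for i in range(n_lens):
        for j in range(n_lens):
            # Center of elemental lens (i, j); the array is centered on the axis.
            cx = (i - (n_lens - 1) / 2) * pitch
            cy = (j - (n_lens - 1) / 2) * pitch
            for x, y, z in points:
                if z <= 0:
                    continue
                # Ray through the lens center, extended to the display plane.
                u = cx - (x - cx) * gap / z
                v = cy - (y - cy) * gap / z
                col = int(round((u - cx) / px + res / 2))
                row = int(round((v - cy) / px + res / 2))
                if 0 <= row < res and 0 <= col < res:
                    images[i, j, row, col] = 255
    return images

# Example: three points at different depths in front of a 13 x 13 lens array
# with 10 mm pitch and an assumed 18 mm gap, rendered at 67 x 67 pixels per lens.
pts = np.array([[0.0, 0.0, 135.0], [5.0, 5.0, 150.0], [-5.0, -5.0, 165.0]])
print(elemental_images(pts, n_lens=13, pitch=10.0, gap=18.0, res=67).shape)
```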

Fig. 3. Concept of an integral imaging system

The integral imaging display system is capable of offering a 3D image which not only gives a feeling of depth but also has its own perspective. Unlike the images from other stereoscopic display systems, the integrated image has thickness and volume elements within a certain range, which is a characteristic of a volumetric image. Figure 4 shows the volumetric properties of an integrated image. In Fig. 4, the cherries at the upper right, the center, and the lower left are located 135 mm, 150 mm, and 165 mm from the lens array, respectively. Figures 4(b), (c), and (d) show images captured on a diffuser, which prove that each cherry is actually located at its set position. Therefore, the integrated image has volumetric elements and can represent a real volume. The thickness of the integrated image is limited within a certain range, denoted as the image depth. The margin of the image depth, Δz_m, can be expressed as [9]:

\Delta z_m = \frac{2 l^2}{g P_L} P_X, \quad (4)

where l is the position of the central depth plane of the integrated image, g the gap between the lens array and the display device, P_L the elemental lens pitch, and P_X the pixel size of the display device.
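
A minimal numerical illustration of Eq. (4), as reconstructed above, is given below. The lens pitch and pixel size follow the experimental specifications quoted in Section 3; the gap of roughly 26 mm is an assumed value, chosen to be consistent with a 150 mm central depth plane for a 22 mm focal length.

```python
def image_depth_margin(l, g, p_lens, p_pixel):
    """Eq. (4) as reconstructed above: margin of the image depth around
    the central depth plane of the integrated image (all lengths in mm)."""
    return 2.0 * l**2 * p_pixel / (g * p_lens)

# Central depth plane 150 mm, assumed gap ~26 mm, lens pitch 10 mm, pixel 0.15 mm.
print(image_depth_margin(l=150.0, g=26.0, p_lens=10.0, p_pixel=0.15))  # ~26 mm
```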

Fig. 4. Image depth of integrated image (each gap between neighboring cherries is 15 mm)

The integrated image has volumetric characteristics only within a restricted angular area. The viewing angle represents this angular limitation; it is one of the viewing parameters of the integral imaging display system and is determined by system parameters such as the size of the elemental lens and the gap between the display device and the lens array. When the observer moves out of the viewing angle, flipped images appear. Image flipping occurs because the elemental images are seen through neighboring elemental lenses instead of the corresponding elemental lenses.

The exact viewing zone for a given observer location can be estimated by a somewhat complicated calculation [10]. However, if the size of the integrated image and the number of elemental lenses are ignored, the viewing angle can be approximated by [11]:

\Omega_{ii} = 2 \arctan\left( \frac{P_L}{2 g} \right), \quad (5)

where Ω_ii is the viewing angle of the integral imaging system. The behavior of the actual viewing angle follows the tendency given by Eq. (5).

The integral imaging display system has three display modes: real, virtual, and focused [9, 11-13]. The display mode is selected by the ratio of the gap to the focal length of the lens array. When the gap is longer than the focal length, the integrated image is displayed in the real mode and is located in front of the lens array. Conversely, when the gap is shorter than the focal length, the integrated image is displayed in the virtual mode and is located behind the lens array. If the gap is almost equal to the focal length, the integrated image can be displayed simultaneously in front of and behind the lens array. In this case, denoted as the focused mode, the resolution of the integrated image is degraded and fixed by the reciprocal of the elemental lens pitch [9]. According to Eq. (5), the viewing angle is inversely related to the gap, so the viewing angle of the virtual mode is wider than that of the other modes when the other conditions are similar. However, the integrated image in the virtual mode is located farther from the observer than that in the real mode. Therefore, the feel of depth in the virtual mode is inferior to that of the other modes.
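
The mode selection and its consequences can be summarized numerically, as in the following sketch. The focal length and lens pitch are those of the experimental setup; the gap values themselves, and the 2% tolerance used to detect the focused mode, are illustrative assumptions.

```python
import math

def display_mode(gap, f):
    """Choose the display mode from the gap-to-focal-length ratio."""
    if math.isclose(gap, f, rel_tol=0.02):
        return "focused"
    return "real" if gap > f else "virtual"

def viewing_angle_ii(p_lens, gap):
    """Eq. (5): approximate viewing angle of the integral imaging display."""
    return 2.0 * math.atan(p_lens / (2.0 * gap))

def central_depth_plane(gap, f):
    """Thin-lens image of the display plane: positive = real mode (in front of
    the lens array), negative = virtual mode (behind it)."""
    if math.isclose(gap, f, rel_tol=0.02):
        return math.inf            # focused mode: image pushed toward infinity
    return f * gap / (gap - f)

f, p_lens = 22.0, 10.0             # elemental lens focal length and pitch (mm)
for gap in (26.0, 22.0, 18.0):     # illustrative gap values (mm)
    print(display_mode(gap, f),
          round(math.degrees(viewing_angle_ii(p_lens, gap)), 1),
          round(central_depth_plane(gap, f), 1))
# With a gap near 18 mm the mode is virtual, the central depth plane lies about
# 100 mm behind the lens array, and Eq. (5) gives roughly 15 degrees per side,
# consistent with the values quoted in Section 3.
```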

Recent studies on integral imaging have focused on enhancing viewing qualities such as the viewing angle and the image depth. Methods using time, space, and polarization multiplexing have been proposed to enhance the viewing angle and the image depth of an integral imaging system [12, 14-19]. Methods using novel structures such as a stepped lens array, a composite lens array, a curved lens array, and an embossed screen have also been reported [20-24].

3. Electro-floating display system using integral imaging method

3.1 Experimental setup

The 3D display system proposed in this paper is an electro-floating display system in which the object part is composed of an integral imaging display system. The volumetric characteristics of integral imaging make this application possible. As mentioned above, the volumetric property of the integrated image is restricted to within the viewing angle, so an integral imaging system implemented in the virtual mode is more appropriate than one in the other modes. Accordingly, the proposed system can be described as a system in which the feel of depth of the integrated image in the virtual mode is emphasized while its superior viewing angle is preserved. Figure 5 shows the scheme of the proposed system and the experimental setup, and Table 1 shows the specifications of the experimental setup. The focal length and the size of the elemental lens are 22 mm and 10 mm, respectively. The display device consists of a diffuser screen and a projection system whose pixel size is easily changed [7]; the pixel size of the display device is set to 0.15 mm. The floating lens and the lens array consist of groove-out type Fresnel convex lenses, which have minimal aberration for imaging. The elemental images for the experiments were generated by computer processing, not by direct pickup. Elemental images obtained by real pickup can also be used in the proposed system after image processing in which each elemental image is rotated by 180°.
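
The image-processing step mentioned above, rotating each elemental image by 180°, can be sketched as follows. The grid layout and pixel counts in the example are assumptions for illustration, not the actual pickup data of the experiment.

```python
import numpy as np

def rotate_elemental_images(elemental_plane, pitch_px):
    """Rotate every elemental image by 180 degrees so that elemental images
    from a real pickup can be used in the proposed system. The input is assumed
    to be a tight grid of square elemental images, each pitch_px pixels wide."""
    out = elemental_plane.copy()
    rows, cols = elemental_plane.shape[:2]
    for r0 in range(0, rows, pitch_px):
        for c0 in range(0, cols, pitch_px):
            block = elemental_plane[r0:r0 + pitch_px, c0:c0 + pitch_px]
            out[r0:r0 + pitch_px, c0:c0 + pitch_px] = block[::-1, ::-1]
    return out

# Example: a 13 x 13 array of 67 x 67-pixel elemental images (sizes are illustrative).
plane = np.random.randint(0, 256, size=(13 * 67, 13 * 67), dtype=np.uint8)
print(rotate_elemental_images(plane, pitch_px=67).shape)
```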

3.2 Experimental results

Figure 6 shows the solids displayed in the assembled system and the integrated image used as the object image. The solids consist of a green cube, a blue cylinder, and a red pyramid, each with sides of 20 mm, arranged as shown in Fig. 6(a). The integrated image, the center of which is located 100 mm behind the lens array, shows different views at the right and left sides, as shown in Fig. 6(b). According to Eq. (5), the viewing angle of the integral imaging system is approximately 15° in each direction.

Fig. 5. Scheme for the proposed system and experimental setup

Table 1. Specifications of the experimental setup

Fig. 6. The solids to be displayed and the integrated images

Figure 7 shows the floating images observed from different points of view, for an object distance of 350 mm, twice the focal length of the floating lens; the floating distance is then equal to the object distance. In this case, the viewing angle of the floating system is nearly equal to that of the integral imaging system. The floating images are cut off as the viewpoint moves out of the viewing region, as shown in Fig. 7.

The solidity of the floating image can easily be verified by diffuser scanning, as shown in Fig. 8(a); the result is shown in Fig. 8(b). According to this result, it is clear that the floating image has a thickness of about 50 mm. Figure 8(c) shows still frames of the movie in Fig. 8(b), and the white circles in the photos indicate the focused part of the floating image. The results in Fig. 8 prove that the floating image of the proposed system has both volumetric elements and solidity.

Fig. 7. Floating image at different viewpoints

Fig. 8. Verification of solidity of floating image [Media 1]

In the proposed system, the solidity of the floating image is obtained from that of the integrated image. According to the imaging property of the floating lens, the thickness of the floating image is determined by the image depth of the integrated image and the object distance of the central depth plane. Figure 9 shows the geometry used to calculate the thickness of the floating image.

Fig. 9. Thickness of floating image

The thickness of the floating image, Δt_fl, can be obtained from:

\Delta t_{fl} = \frac{\Delta z_m}{\left( \frac{1}{M} \right)^2 - \left( \frac{\Delta z_m}{2 f_{fl}} \right)^2}, \quad (6)

where M is the magnification factor for the central depth plane. The variations in image depth differ for the images in front of and behind the central depth plane, as shown in Fig. 9: the distance between the front image and the center image, Δt_2, is greater than that between the center image and the rear image, Δt_1. In the experiment described in Fig. 8, the image depth of the integrated image is about 40 mm. As a result, the thickness of the floating image is about 40.5 mm according to Eq. (6). In this case, the thickness of the floating image does not differ greatly from the image depth of the integrated image because the magnification factor is 1.
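
The 40.5 mm figure quoted above can be reproduced directly from Eq. (6) as reconstructed, using the values stated in this section (image depth 40 mm, M = 1, and a 175 mm floating-lens focal length implied by the 350 mm object distance):

```python
def floating_image_thickness(dz_m, M, f_fl):
    """Eq. (6): thickness of the floating image from the image depth of the
    integrated image, the magnification factor for the central depth plane,
    and the focal length of the floating lens (all lengths in mm)."""
    return dz_m / ((1.0 / M) ** 2 - (dz_m / (2.0 * f_fl)) ** 2)

print(round(floating_image_thickness(dz_m=40.0, M=1.0, f_fl=175.0), 1))  # ~40.5 mm
```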

The viewing angle of the floating system is determined by the floating distance and the aperture size of the floating lens, as shown in Eq. (2). In the proposed system, the viewing angle of the integrated image limits the effective aperture of the floating lens: when the aperture of the floating lens is larger than the region covered by the viewing angle of the integrated image, the flipped image also floats and appears together with the floating image. In the virtual mode, the distance between the integrated image and the flipped image, denoted as the flipping distance, is determined by the f-number of the elemental lens and the central depth plane of the integral imaging system. The flipping distance d, defined as shown in Fig. 10, is obtained from:

d = \frac{P_L}{f_{ii}} l, \quad (7)

where f_ii is the focal length of the elemental lenses of the lens array. Because the flipping distance is not affected by the size of the integrated image, the location of the flipped image is also independent of the image size.
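
With the experimental values of this paper (10 mm lens pitch, 22 mm focal length, central depth plane 100 mm behind the lens array), Eq. (7) gives a flipping distance of about 45 mm, in line with the measurement reported below:

```python
def flipping_distance(p_lens, f_ii, l):
    """Eq. (7): flipping distance in the virtual mode, set by the inverse
    f-number of the elemental lens and the central depth plane position."""
    return (p_lens / f_ii) * l

print(round(flipping_distance(p_lens=10.0, f_ii=22.0, l=100.0), 1))  # ~45 mm
```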

Fig. 10. Geometry to define flipping distance

Figure 11 shows experimental results which verify the relation between the size of the image and the flipping distance. The size of the cherry shown in Fig. 11(a) is 20 mm, while that in Fig. 11(b) is 40 mm; the magnification factor in both cases is 1. Figure 11(c) shows the floating image of Fig. 11(a) with a diffuser located at the floating distance, and Fig. 11(d) shows the corresponding diffused image for the case of Fig. 11(b). As shown in Figs. 11(c) and (d), the flipping distance is 45 mm, which agrees with Eq. (7) and is not affected by the image size. The flipped images observed in Figs. 11(a) and (b) are only limited parts of those in Figs. 11(c) and (d), because only the part of the flipped image that lies outside the viewing angle of the integrated image is observed without the diffuser.

Fig. 11. Floating image for different sizes

The flipping distance is extended in proportion to the magnification factor. Figure 12 shows the floating image for a magnification factor of 2. The flipped image disappears in a certain region because the flipping distance is doubled. In this case, however, the viewing angle of the floating system is not enlarged, because the floating distance, to which the viewing angle is inversely related according to Eq. (2), is also lengthened.
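
A rough numerical comparison of the two magnification settings ties the earlier formulas together. The sketch below assumes the same 175 mm floating lens for both cases and an illustrative 300 mm aperture; the flipping-distance scaling follows the proportionality to M stated in the text.

```python
import math

def compare_magnifications(f_fl, w, p_lens, f_ii, l):
    """Compare M = 1 and M = 2 using Eqs. (2), (3), and (7): the flipping
    distance grows with M, but the floating distance also grows, so the
    floating viewing angle of Eq. (2) shrinks rather than widens."""
    for M in (1.0, 2.0):
        L_ob = f_fl * (1.0 + 1.0 / M)      # from Eq. (3): M = f_fl / (L_ob - f_fl)
        L_fl = M * L_ob                     # floating distance from Eq. (3)
        omega = 2.0 * math.atan(w / (2.0 * L_fl))        # Eq. (2)
        d_flip = M * (p_lens / f_ii) * l   # flipping distance scaled by M, per the text
        print(M, L_fl, round(math.degrees(omega), 1), round(d_flip, 1))

compare_magnifications(f_fl=175.0, w=300.0, p_lens=10.0, f_ii=22.0, l=100.0)
```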

Fig. 12. Floating image using a magnification factor of 2

4. Conclusions

A new type of 3D display system is proposed in which the merits of two different 3D display techniques are combined. The experimental results show that an integral imaging display system can be successfully adapted for use in an electro-floating system. Several subjects remain for future research on the proposed system, such as a detailed analysis of the relations between the system parameters.

The electro-floating system using the integral imaging method in the virtual mode can provide 3D images with a more impressive feel of depth than an integral imaging system in the real mode, within the relatively wide viewing area that originates from the advantage of integral imaging in the virtual mode. Therefore, the advantages of the parent techniques are successfully combined in the proposed system. Moreover, techniques for enhancing the viewing angle, such as methods using time and space multiplexing, a curved lens array, and an embossed screen, can be applied directly to the proposed system. The proposed system can be successfully applied to a variety of 3D applications such as 3D television.

Acknowledgments

This work was supported by the Ministry of Information and Communications, Korea, under the Information Technology Research Center (ITRC) Support Program.

References and links

1. T. Okoshi, “Three-dimensional displays,” Proc. IEEE 68, 548–564 (1980).

2. G. Lippmann, “La photographie integrale,” Comptes-Rendus Acad. Sci. 146, 446–451 (1908).

3. H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. 21, 171–176 (1931).

4. F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36, 1598–1603 (1997).

5. J. Arai, F. Okano, H. Hoshino, and I. Yuyama, “Gradient-index lens-array method based on real-time integral photography for three-dimensional images,” Appl. Opt. 37, 2034–2045 (1998).

6. S.-W. Min, J. Kim, and B. Lee, “Three-dimensional electro-floating display system based on integral imaging technique,” in Stereoscopic Displays and Applications XVI, Electronic Imaging, paper 5664A-37, San Jose, CA (2005).

7. H. Liao, M. Iwahara, N. Hata, and T. Dohi, “High-quality integral videography using a multiprojector,” Opt. Express 12, 1067–1076 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-6-1067.

8. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Computer-generated integral photography,” in Proceedings of the 6th International Workshop on 3-D Imaging Media Technology, Seoul, Korea, July 2000, pp. 21–28.

9. S.-W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44, L71–L74 (2005).

10. H. Choi, Y. Kim, J.-H. Park, S. Jung, and B. Lee, “Improved analysis on the viewing angle of integral imaging,” Appl. Opt. 44, 2311–2317 (2005).

11. J.-H. Park, S.-W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Appl. Opt. 40, 5217–5232 (2001).

12. B. Lee, S. Jung, S.-W. Min, and J.-H. Park, “Three-dimensional display by use of integral photography with dynamically variable image planes,” Opt. Lett. 26, 1481–1482 (2001).

13. J.-S. Jang, F. Jin, and B. Javidi, “Three-dimensional integral imaging with large depth of focus by use of real and virtual image fields,” Opt. Lett. 28, 1421–1423 (2003).

14. B. Lee, S. Jung, J.-H. Park, and S.-W. Min, “Viewing-angle-enhanced integral imaging using lens switching,” in Stereoscopic Displays and Virtual Reality Systems IX, A. J. Woods, J. O. Merritt, S. A. Benton, and M. T. Bolas, eds., Proc. SPIE 4660, 146–154 (2002).

15. J.-S. Jang and B. Javidi, “Improved viewing resolution of 3-D integral imaging with nonstationary micro-optics,” Opt. Lett. 27, 324–326 (2002).

16. B. Lee, S. Jung, and J.-H. Park, “Viewing-angle-enhanced integral imaging by lens switching,” Opt. Lett. 27, 818–820 (2002).

17. J.-S. Jang and B. Javidi, “Three-dimensional synthetic aperture integral imaging,” Opt. Lett. 27, 1144–1146 (2002).

18. H. Choi, S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays,” Opt. Express 11, 927–932 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-8-927.

19. J.-S. Jang, Y.-S. Oh, and B. Javidi, “Spatiotemporally multiplexed integral imaging projector for large-scale high-resolution three-dimensional display,” Opt. Express 12, 557–563 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-4-557.

20. H. Choi, J.-H. Park, J. Hong, and B. Lee, “Depth-enhanced integral imaging with a stepped lens array or a composite lens array for three-dimensional display,” in Technical Digest of the 16th Annual Meeting of the IEEE Lasers & Electro-Optics Society (LEOS 2003), Tucson, Arizona, USA, Oct. 2003, vol. 2, pp. 730–731.

21. J.-S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28, 1924–1926 (2003).

22. Y. Kim, J.-H. Park, H. Choi, S. Jung, S.-W. Min, and B. Lee, “Viewing-angle-enhanced integral imaging system using a curved lens array,” Opt. Express 12, 421–429 (2004), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-12-3-421.

23. S.-W. Min, J. Kim, and B. Lee, “Wide-viewing projection-type integral imaging system with an embossed screen,” Opt. Lett. 29, 2420–2422 (2004).

24. H. Choi, J.-H. Park, J. Hong, and B. Lee, “Depth-enhanced integral imaging with a stepped lens array or a composite lens array for three-dimensional display,” Jpn. J. Appl. Phys. 43, 5330–5336 (2004).


Supplementary Material (1)

Media 1: AVI (2260 KB)
