## Abstract

A computer-generation method of elemental images for the integral floating display, named floated image mapping, is proposed. The concept of the floated integral imaging system is introduced to overcome the problems arising from the conventional viewpoint of the integral floating display. Both analysis and experimental results that support and verify the contributions of floated image mapping are presented. The main contributions of the proposed algorithm are the expansion of the expressible depth range and the reduction of computation time for real-time processing.

© 2008 Optical Society of America

## 1. Introduction

Integral floating display is the combination of an integral imaging display and a floating display [1–6]. Figure 1 illustrates the concept of the integral floating display system. An integral imaging system is located in the object space of the floating device, which can be either a convex lens or a concave mirror. The object space is the region containing the objects to be floated by the floating device, while the image space is the region containing the floating images. The integral imaging system constructs a source image from which the floating device forms a floating image in the image space. For commercialization of the integral imaging system, it is essential to establish a computer-generation method for the elemental image. Computer-generation methods of elemental images for integral imaging systems and their applications have been actively researched for decades [7–11]. The computer generation of elemental images for the integral floating display may therefore seem a mere extension of that for integral imaging: in the integral floating system, we design a three-dimensional (3D) image that is to become the floating image, calculate its source image using the lens maker's formula, and finally generate the elemental image of the source image for the integral imaging system using algorithms that are already developed. Recently, various methods for real-time elemental image generation for integral imaging have been studied and compared [12, 13], and an elegant solution to the pseudoscopic problem that uses a computer graphics library for elemental image generation has been reported [14].

The algorithms described above cannot be directly applied to the integral floating system for two reasons. The first is the retro-floating range, defined as the depth range from the floating device to its focal plane. Because of the optical properties of convex lenses and concave mirrors, the floating image of a real object is restricted to locations farther than the focal plane of the floating device; the floating image must therefore be separated from the floating device by more than the focal length. The second is that the transformation from the floating image to the source image is not shift-invariant, because the magnification factor between the source image and the floating image differs according to their locations. The source image is therefore a deformed version of the floating image, and the transformation becomes a bottleneck step in the elemental image generation algorithm. Moreover, the computer graphics libraries that implement physical laws of the real world, such as illumination and free fall under gravity, cannot be directly applied to the source image because of the deformation introduced by this transformation.

The retro-floating range can be used by forming a source image intended to be located in the image space. Figure 2 illustrates floating image formation in the retro-floating range, where floating image 1 is constructed within it. Floating image 2 can easily be displayed by forming a source image in the object space, depicted as source image 2. Floating image 1, however, cannot be generated from any source image located in the object space. The only solution is to generate a source image in the image space, which would be integrated as a real image if the floating device were absent. Instead of source image 1 being integrated, floating image 1 is formed through the refraction or reflection of the floating device. The principle is similar to the virtual image formation of a convex lens or concave mirror, but the occlusion between floating image 1 and floating image 2 is reversed. As illustrated in Fig. 2, source image 1 occludes source image 2 from the viewpoint of the integral imaging system in the object space. Floating image 1 will therefore occlude floating image 2, although floating image 2 should occlude floating image 1 because it is closer to the observer. Adopting graphics libraries such as OpenGL is a good solution for fast elemental image generation in integral imaging [14], but the libraries would have to be modified to overcome this occlusion inversion when applied to elemental image generation for the integral floating display.

Although this may seem a minor problem, easily overcome by accounting for the occlusion relationship, there is another difficulty. As a floating image located in the floating range for real objects approaches the focal plane, the source image must be located farther and farther from the floating device and must grow larger and larger. When the floating image crosses the focal plane, the source image jumps from an infinite distance in the object space to an infinite distance in the image space; as the floating image then approaches the floating device, the source image approaches it as well. Consequently, if a floating image extends across the focal plane, its source image is hard to generate because the focal plane corresponds to infinite distances in both spaces. This problem is illustrated in Fig. 3.
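The jump of the source image across the focal plane follows directly from the thin-lens relation. The sketch below illustrates it numerically; the sign convention (positive source distances in the object space, negative in the image space) and the sample values are illustrative assumptions, not taken from the paper:

```python
# Sketch of the focal-plane jump: for a floating image at distance d_f
# (in the image space) from a floating device of focal length f, the
# thin-lens relation 1/d_s + 1/d_f = 1/f gives the required source-image
# distance d_s.  Positive d_s lies in the object space; negative d_s
# means the source image must be formed in the image space.

def source_distance(d_f: float, f: float) -> float:
    if abs(d_f - f) < 1e-12:
        raise ValueError("floating image on the focal plane: source at infinity")
    return f * d_f / (d_f - f)

# As d_f sweeps across the focal length (f = 175 here), d_s diverges to
# +infinity and reappears from -infinity on the other side.
for d_f in (250.0, 200.0, 176.0, 174.0, 150.0):
    print(d_f, source_distance(d_f, 175.0))
```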

The other problem is the shift-variant characteristic of the transformation. Computer graphics libraries can be adapted to the deformation introduced by the transformation by modifying the physical laws embedded in them, but the costly processing time of the transformation is unavoidable because it originates from its shift-variant nature. Rather than modifying a computer graphics library to obtain a merely partial solution, we propose a computer-generation method of elemental images for the integral floating display that cleanly solves all the problems discussed above.

## 2. Computer generation of elemental image of integral floating display

In subsection 2.1, we introduce the concept of the floated integral imaging system and explain the principle of the proposed algorithm. In subsection 2.2, the merits of the proposed algorithm will be explained by showing how it can solve the problems mentioned above.

#### 2.1. Principle of the method

The integral imaging system consists of a lens array and a two-dimensional (2D) display device. The central depth plane is the plane on which the 2D display device is actually imaged by the lens array. In the integral floating display, the lens array and the central depth plane in the object space are imaged into the image space, as shown in Fig. 4. The imaged lens array is referred to as the floated lens array in this paper. The source image is constructed around the central depth plane of the integral imaging system, while the floating image is constructed around the floating plane, which is the image of the central depth plane formed by the floating device. As discussed in the introduction, the transformation from the floating image to the source image is the main cause of the various problems. We therefore avoid the transformation process by using the floated lens array.

In the computer generation of elemental images for integral imaging, the elemental image is generated by treating the lens array as a pinhole array [12–14]. We adopt the same assumption for generating the elemental image for the integral floating display. In this case, we must determine the location and the pixel pitch of the floated 2D display device on which the information of the floating image is generated. The pinhole-array assumption is crucial for determining this location. The simplest and most intuitive approach is to use the image of the 2D display that would be formed by the floating device in the absence of the lens array. This is a reasonable choice because the light rays experience no refraction when the lens array is treated as a pinhole array. The location and the pixel pitch of the floated 2D display can then be calculated by applying the lens maker's formula without regard to the refraction effect of the lens array.
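Under the pinhole-array assumption, picking up an elemental image reduces to projecting each 3D point through each pinhole onto the display plane by similar triangles. The following minimal sketch is an illustrative assumption, not the paper's implementation; the coordinate convention (display plane at z = 0, pinholes at z = g, points at z > g) and all names are hypothetical:

```python
# Minimal pinhole-array pickup: project 3D points through each pinhole
# onto the 2D display plane.  z is measured from the display plane toward
# the viewer; the pinhole array sits at z = g (the display-to-array gap).
# A full implementation would also clip each hit to the elemental-image
# region belonging to its pinhole.

def pickup(points, pinholes, g):
    """points: list of (x, y, z) with z > g; pinholes: list of (px, py).
    Returns a list of (x_d, y_d, pinhole_index) hit positions at z = 0."""
    hits = []
    for x, y, z in points:
        for i, (px, py) in enumerate(pinholes):
            # The ray from (x, y, z) through (px, py, g) meets z = 0 at
            # (px*z - x*g)/(z - g); written via the back-projection scale s.
            s = g / (z - g)
            x_d = px - (x - px) * s
            y_d = py - (y - py) * s
            hits.append((x_d, y_d, i))
    return hits
```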

$$\frac{1}{d_{di}} + \frac{1}{d_{df}} = \frac{1}{f}, \qquad \varphi_{pf} = \frac{d_{df}}{d_{di}}\,\varphi_{pi},$$

where $d_{di}$ and $d_{df}$ are the distances from the 2D display of the integral imaging system to the floating device and from the 2D display of the floated integral imaging system to the floating device, respectively, $f$ is the focal length of the floating device, and $\varphi_{pi}$ and $\varphi_{pf}$ are the pixel pitches of the 2D display of the integral imaging system and of the floated integral imaging system, respectively. The image of the 2D display is named the floated 2D display and is located between the floating device and the floated lens array. The floated lens array and the floated 2D display together are named the floated integral imaging system. An elemental image of the floating image can be generated using this floated integral imaging system.

The floating location of the central depth plane of the integral imaging system in Fig. 4 coincides with the floating plane of the floated integral imaging system, which can be proved by simple geometric calculations. The elemental image generated with the floated integral imaging system geometrically resembles the elemental image of the source image generated with the actual integral imaging system. The actual elemental image to be displayed on the 2D display device of the integral imaging system in the object space is found by rotating the elemental image on the floated 2D display by 180° and scaling it by the magnification factor of the real 2D display relative to the floated 2D display. However, this magnification factor equals the ratio of the pixel pitch of the real 2D display to that of the floated 2D display. Therefore, one can directly use the elemental image acquired from the floated integral imaging system after merely rotating it by 180°, without applying the magnification factor.
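Because the scaling is absorbed by the ratio of the pixel pitches, mapping the floated elemental image onto the real 2D display reduces to a 180° rotation of the pixel grid. A minimal sketch, assuming the image is represented as a 2D list of pixel values (an illustrative representation, not the paper's):

```python
# Map the elemental image computed on the floated 2D display to the real
# 2D display.  Only a 180-degree rotation is needed: the magnification
# is absorbed by the ratio of the real to floated pixel pitches, so no
# resampling is required.

def to_real_display(floated_elemental):
    """floated_elemental: 2D list of pixel values (rows of pixels).
    Returns the image rotated by 180 degrees."""
    return [row[::-1] for row in floated_elemental[::-1]]
```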

#### 2.2. Merits of the proposed method

The introduction of the floated integral imaging system resolves the problems of elemental image generation for the integral floating system. Although partial solutions can be found without it, the floated integral imaging system resolves all of the problems without any modification of the graphics libraries.

In the sections above, we pointed out that the retro-floating range can be used without introducing the floated integral imaging system by forming a source image intended to be located in the image space of the floating device, but that the occlusion relation is reversed in this case. The reversed occlusion relation must therefore be taken into account when graphics libraries such as OpenGL are used for the computer generation of the elemental image. With the floated integral imaging system, however, the floating image shows the correct occlusion relation from the system's viewpoint, and there is no need to modify the algorithms of the computer graphics libraries. This is investigated further with experimental verification in the following section.

The costly transformation from the floating image to the source image is also avoided by using the floated integral imaging system. Figure 5 shows how the algorithm changes when the floated integral imaging system is introduced. During elemental image generation, one image-forming transformation must be performed because the elemental image is actually displayed on the 2D display in the object space while the final image, the floating image, lies in the image space. Since the computer generation of elemental images targets 3D moving pictures and 3D entertainment, the floating 3D data is expected to change frequently while the structure of the system remains fixed. It is therefore wiser to apply the transformation to the integral imaging system rather than to the floating image, whose data changes frequently. Each time the floating image changes, the processes enclosed in the dashed boundaries in Fig. 5 are repeated; the transformation process is avoided entirely when the floated integral imaging system is introduced.

The distortion of the physical laws in the graphics libraries caused by the deformation is also resolved by the method using the floated integral imaging system. Illumination is one example. Graphics libraries reproduce the effect of illumination by applying material coefficients and calculating the angle at which light from a source is incident on the target surface. The deformation of space during the transformation from the floating image to the source image changes these incident angles. However, the illumination effect is usually calculated while the elemental image is being picked up, which is the last step. Accurate illumination reproduction would therefore require modification of the graphics libraries, and this is avoided by introducing the floated integral imaging system.

## 3. Experimental results

First, we show the problem of the reversed occlusion relation when the unmodified algorithm is used to display in the retro-floating range. Figure 6 shows the experimental setup.

A convex lens with an aperture of 300 mm and a focal length of 175 mm is used as the floating device. A liquid crystal display (LCD) monitor with a high resolution of 3840×2400 is used as the 2D display, and the lens array is composed of 13×13 lenses with a pitch of 10 mm and a focal length of 22 mm. In this setup, the retro-floating range is the depth range from the floating lens to the focal plane, which is 175 mm from the lens. The integral floating system constructs three fish images at locations 150 mm, 175 mm and 200 mm from the floating lens. As shown in Fig. 6, the image of the yellow fish is in the retro-floating range, that of the blue fish is on the focal plane, and that of the red fish is in the floating range for real objects.

Two different algorithms, with and without the floated integral imaging system, are used to generate the elemental image for the integral floating display. The setup and the results are shown in Fig. 7. As shown in Fig. 6, source image 2 is supposed to be located at an infinite distance so that floating image 2 is formed on the focal plane. For the conventional algorithm, source image 2 is placed sufficiently far away, 22 m from the floating lens, considering the pixel pitch of the 2D display and the pitch of the lens array. Floating image 3 should occlude the other floating images, and floating image 2 should occlude floating image 1. However, from the viewpoint of the integral imaging system, source image 1, which should be occluded by the other images, occludes all of them. This leads to an incorrect occlusion relationship among the floating images when the floated integral imaging system is not used. As shown in Fig. 7, the occlusion relation is successfully resolved by the proposed method.

Figure 8 shows that the 3D image in the retro-floating range is actually integrated correctly, obtained by sweeping a diffuser across the depth range in which the 3D images are intended to be constructed by the integral floating display. The experimental setup depicted in Fig. 6 is used again. The *z*-value in the animation indicates the distance from the floating device. Note that the yellow fish is well integrated although it is located 150 mm from the floating lens, i.e., in the retro-floating range. Figure 9 shows still images with the diffuser placed at distances of 150 mm, 175 mm and 200 mm from the floating lens, confirming that all the fish are integrated at the desired locations.

In addition, we show the disparity of a volumetric 3D image constructed by the integral floating system. In this case, volumetric 3D images are used instead of plane images. Three geometric objects are rendered: a green cube 150 mm from the floating lens, a red pyramid 175 mm from the floating lens, and a blue cylinder 200 mm from the floating lens. The green cube is in the retro-floating range. The experimental results are shown in Fig. 10; each object shows a different 2D shape according to the viewing position, and the occlusion relations and relative 2D positions are correctly reproduced. The results verify that a graphics library can be readily adopted for the computer generation of the elemental image without any modification.

## 4. Conclusion

We have proposed the concept of the floated integral imaging system for the computer generation of elemental images for the integral floating display. The necessity of the floated integral imaging system is verified both through analysis and through experimental results. By means of the floated integral imaging system, the problems of elemental image generation for the integral floating system are successfully resolved, and computer graphics libraries can be adopted directly without any modification. Moreover, since the costly computation needed to resolve the distortions is avoided by the proposed method, real-time generation can be achieved.

## References and links

**1. **S.-W. Min, M. Hahn, J. Kim, and B. Lee, “Three-dimensional electro-floating display system using an integral imaging method,” Opt. Express **13**, 4358–4369 (2005). [CrossRef]

**2. **J. Kim, S.-W. Min, and B. Lee, “Viewing region maximization of an integral floating display through location adjustment of viewing window,” Opt. Express **15**, 13023–13034 (2007). [CrossRef]

**3. **G. Lippmann, “La photographie integrale,” C. -R. Acad. Sci. **146**, 446–451 (1908).

**4. **C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. **58**, 71–76 (1968). [CrossRef]

**5. **H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A **15**, 2059–2065 (1998). [CrossRef]

**6. **H. Liao, T. Dohi, and M. Iwahara, “Improved viewing resolution of integral videography by use of rotated prism sheets,” Opt. Express **15**, 4814–4822 (2007). [CrossRef]

**7. **Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. **17**, 1683–1684, (1978). [CrossRef]

**8. **S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Three-dimensional display system based on computer-generated integral photography,” in Stereoscopic Displays and Virtual Reality Systems VIII, San Jose, CA, USA, Proc. SPIE **4297**, 187–195 (2001). [CrossRef]

**9. **S. Nakajima, K. Nakamura, K. Masamune, I. Sakuma, and T. Dohi, “Three-dimensional medical imaging display with computer-generated integral photography,” Comput. Med. Image Graph. **25**, 235–241 (2001). [CrossRef]

**10. **D.-H. Shin, E.-S. Kim, and B. Lee, “Computational Reconstruction of Three-Dimensional Objects in Integral Imaging using Lenslet Array,” Jpn. J. Appl. Phys. **44**, 8016–8018 (2005). [CrossRef]

**11. **J.-B. Hyun, D.-C. Hwang, D.-H. Shin, and E.-S. Kim, “Curved computational integral imaging reconstruction technique for resolution-enhanced display of three-dimensional object images,” Appl. Opt. **46**, 7697–7708 (2007). [CrossRef]

**12. **S.-W. Min, K. S. Park, B. Lee, Y. Cho, and M. Hahn, “Enhanced image mapping algorithm for computer-generated integral imaging system,” Jpn. J. Appl. Phys. **45**, L744–L747 (2006). [CrossRef]

**13. **K. S. Park, S.-W. Min, and Y. Cho, “Viewpoint vector rendering for efficient elemental image generation,” IEICE Trans. Inf. & Syst. **E90-D**, 233–241 (2007). [CrossRef]

**14. **G. Park, S.-W. Cho, J. Kim, Y. Kim, H. Choi, J. Hahn, and B. Lee, “Pickup method from OpenGL by reverse projection matrix for real-orthoscopic integral imaging,” in Digital Holography and Three-Dimensional Imaging, Vancouver, Canada, paper DTuA8 (2007).