Optica Publishing Group

Tilted elemental image array generation method for moiré-reduced computer generated integral imaging display

Open Access

Abstract

In this paper, we propose a tilted elemental image array generation method for computer generated integral imaging display with reduced moiré patterns. The pixels of the tilted elemental image array are divided into border pixels and effective pixels. According to the optimal tilted angle, the effective pixels are arranged uniformly. A pixel mapping method is also proposed. Experiments show that the color moiré patterns are reduced remarkably and, at the same time, the resolution of the reconstructed 3D images is improved by the proposed method.

©2013 Optical Society of America

1. Introduction

Three-dimensional (3D) integral imaging is one of the most attractive 3D display techniques. It can produce full-parallax, full-color, auto-stereoscopic 3D images without special viewing devices, and it does not require any coherent light source [1–6]. Basically, integral imaging is composed of two processes: pick-up and display/reconstruction. Since capturing the elemental images (EIs) of real 3D objects is difficult, computer-generated integral imaging (CGII) has been used in many research fields [7–9]. The conventional integral imaging display sets a micro-lens array (MLA) in front of a display device such as a liquid crystal display (LCD), a plasma display panel, or another color flat-panel display (FPD). These display devices have a fine pixel structure, a flat surface, and high geometric positional accuracy, and they can offer bright, high-resolution two-dimensional (2D) images for integral imaging display. The LCD is the most widely used device in integral imaging display [10, 11]. An LCD usually has a highly periodic pixel structure composed of red (R), green (G), and blue (B) sub-pixels, which causes the visible color moiré patterns that often appear in 3D displays. The color moiré pattern seriously degrades the quality of reconstructed 3D images.

Moiré patterns have been analyzed, and many methods have been provided in previous research to reduce them in 3D displays [12–15]. For integral imaging, a fundamental solution is to use two types of optical low-pass filters: a diffuser and defocusing [16]. However, defocusing degrades the resolution of the 3D images. Another effective and convenient method is to tilt the MLA by a certain angle [17, 18]. This method reduces the color moiré pattern without adding any extra components. Much research focuses on finding the optimal moiré-reduced angle by performing calculations or simulations for moiré analysis. Among these approaches, visualizing the color moiré pattern in detail and analyzing the patterns with a spatial Fourier transform is a convenient method [19]. The optimal moiré-reduced angle obtained in this way is an integer angle such as 17°.

Several methods have been proposed for CGII, such as point retracing rendering [20], multiple viewpoint rendering [21], parallel group rendering [22, 23], and viewpoint vector rendering [24–26]. All of these methods generate elemental image arrays (EIAs) for the conventional integral imaging display. Unlike the conventional display, a tilted MLA requires a correspondingly tilted EIA on the LCD. Therefore, these methods cannot be used directly to generate a tilted EIA.

One way to generate tilted EIs is to modify the CGII algorithm and repeat the process to obtain a set of tilted EIs [18]. Multiple viewpoint rendering is used to render the perspective images, and the EIA for a slanted fly's eye lens is generated after a modified synthesis stage. However, this method ignores the pixel arrangement of the tilted EIs, and the full-color information of a pixel is substituted by sub-pixel information. In the ideal case, a set of tilted EIs is calculated by an algorithm similar to conventional CGII. However, the tilted EIs may have different pixel arrangements at different positions in the tilted EIA, in which case the method requires a large number of calculations to generate the different tilted EIs. Another simple approach to obtaining a tilted EIA is to rotate the conventional EIA directly by image rotation. Normally, image rotation is composed of two basic steps: rotation and interpolation. The rotation operator performs a geometric transform that maps source pixels in the input image onto target pixels in the output image by rotating the input image through a user-specified angle. The rotation operator may produce holes or missing pixels, which are filled by interpolation [27, 28]. However, interpolation causes problems such as interpolation error and computational load [29]. In addition, interpolation error degrades the resolution of the reconstructed 3D images.

In this paper, we propose a tilted EIA generation method for CGII that considers both the moiré-reduced angle and the pixel arrangement simultaneously. The proposed method transforms the conventional EIA into a tilted EIA according to the optimal tilted angle. The tilted EIA corresponds to a rectangular-type MLA, which is the type most widely used in integral imaging to reconstruct 3D images. In order to improve the resolution of the reconstructed 3D images, a pixel mapping method without interpolation is proposed. Even though the proposed pixel mapping may introduce distortions in the 3D image, it does not need any additional re-rendering approach [30]. Experimental verification of the usefulness of the proposed method is also provided.

2. Principle of the proposed method

2.1 Pixel classification in tilted EIA

In the conventional integral imaging display, the micro-lens pitch is assumed to be an integer multiple of the LCD pixel size, and the pixels of the EIA have a one-to-one correspondence with the LCD pixels, as shown in Fig. 1(a). However, when the micro-lens is rotated, its edge intersects some LCD pixels, as shown in Fig. 1(b). We call these border pixels. A border pixel is shared by two or more adjacent micro-lenses, but since an RGB pixel carries the full-color information of only one ray, it cannot represent the information of two EIs and therefore cannot provide correct information to the viewers. We define the LCD pixels that are totally covered by a single micro-lens as effective pixels. The LCD pixels are thus classified into border pixels and effective pixels. Similarly, the tilted EIA should be composed of border pixels and effective pixels to preserve the one-to-one correspondence between the EIA pixels and the LCD pixels.
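This classification can be illustrated with a short numerical sketch (our own illustration, not the authors' implementation; the lens pitch, pixel size, and tilt angle below are assumed example values): a pixel is labeled a border pixel when its four corners are not all covered by the same tilted micro-lens.

```python
import math

def lens_index(x, y, d, theta):
    """Index of the tilted micro-lens covering the point (x, y).
    The lens grid is tilted clockwise by theta, so we rotate the point
    back into the lens-aligned frame before taking the floor."""
    c, s = math.cos(theta), math.sin(theta)
    u = x * c - y * s
    v = x * s + y * c
    return (math.floor(u / d), math.floor(v / d))

def is_border_pixel(ix, iy, p, d, theta):
    """A pixel is a border pixel when its four corners do not all lie
    under one and the same micro-lens."""
    corners = [(ix * p, iy * p), ((ix + 1) * p, iy * p),
               (ix * p, (iy + 1) * p), ((ix + 1) * p, (iy + 1) * p)]
    return len({lens_index(x, y, d, theta) for x, y in corners}) > 1

# Assumed example values: 1.25 mm lens pitch, 0.096 mm pixel size,
# 22.62 degree tilt.
d, p, theta = 1.25, 0.096, math.radians(22.62)
labels = [is_border_pixel(ix, iy, p, d, theta)
          for ix in range(13) for iy in range(13)]
print(sum(labels), "border pixels out of", len(labels))
```

Over a lens-sized 13 × 13 pixel patch, both classes occur: pixels straddling a tilted lens edge come out as border pixels, while the interior pixels are effective.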

Fig. 1 Concept of the display process. (a) conventional EI display and (b) tilted EI display.

2.2 Generation method of tilted EIA

As we know, the EIs of the conventional EIA are arranged periodically and have a uniform pixel arrangement. For tilted EIs, however, when the pixel size and the micro-lens pitch are fixed, the number and arrangement of the effective pixels vary with the tilted angle. Figure 2 shows the effective pixel arrangements of four adjacent EIs at different tilted angles. Figure 2(a) shows that at a tilted angle φ each EI has a different effective pixel arrangement, whereas Fig. 2(b) shows that at a tilted angle θ every EI has the same, uniform effective pixel arrangement. We prefer the tilted angle θ because tilted EIs with a uniform arrangement can all be generated by the same algorithm. In other words, the tilted EIs can be generated more easily by choosing a suitable tilted angle.
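The advantage of an angle like θ can be checked numerically (a minimal sketch under assumed values, not the authors' code): when the tangent of the tilt angle is a ratio of two integer pixel counts (5/12 here) and the lens pitch equals the pixel size times √(12² + 5²), the edge of each tilted lens is an exact integer pixel vector (12, −5), so every lens covers a congruent set of effective pixels.

```python
import math

# Assumed values: 12 and 5 pixel steps between adjacent lens vertices,
# pixel size p = 0.096 mm; the matched lens pitch is then
# d = p * sqrt(12**2 + 5**2), and tan(theta) = 5 / 12.
n, m, p = 12, 5, 0.096
d = p * math.hypot(n, m)
theta = math.atan2(m, n)
c, s = math.cos(theta), math.sin(theta)
EPS = 1e-9

def effective_pattern(ox, oy):
    """Pixels (in units of the pixel pitch) whose four corners all lie
    under the square lens aperture whose lower-left vertex sits at
    (ox, oy); the aperture edges point along (cos t, -sin t) and
    (sin t, cos t). Returned relative to the rounded lens origin."""
    base_x, base_y = round(ox), round(oy)
    pattern = set()
    for ix in range(base_x - 20, base_x + 20):
        for iy in range(base_y - 20, base_y + 20):
            corners = [(ix, iy), (ix + 1, iy), (ix, iy + 1), (ix + 1, iy + 1)]
            if all(-EPS <= ((cx - ox) * c - (cy - oy) * s) * p <= d + EPS and
                   -EPS <= ((cx - ox) * s + (cy - oy) * c) * p <= d + EPS
                   for cx, cy in corners):
                pattern.add((ix - base_x, iy - base_y))
    return pattern

# The neighbouring lens sits one edge vector away: (d/p)*(cos t, -sin t)
# = (12, -5), an exact integer pixel offset, so its effective-pixel
# pattern is an exact copy of the first lens's pattern.
lens0 = effective_pattern(0.0, 0.0)
lens1 = effective_pattern(n, -m)
print(len(lens0), lens0 == lens1)
```

For an angle whose tangent is not such a ratio, the lens-to-lens offset is a fractional number of pixels and the pattern drifts from lens to lens, which is the non-uniform case of Fig. 2(a).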

Fig. 2 Effective pixel arrangements of four adjacent EIs. (a) tilted angle φ and (b) tilted angle θ.

Taking the LCD pixel size p and the micro-lens pitch d into consideration, we combine color moiré pattern reduction with effective pixel arrangement uniformity to find an optimal tilted angle. Assume that n and m represent the numbers of pixels between two vertices of the micro-lens in the horizontal and vertical directions, respectively, as shown in Fig. 3. Both n and m are integers ranging from 1 to $\lfloor d/p \rfloor$. The distance $r_{nm}$ between the two LCD pixels that lie under the two vertices can be expressed as:

$$r_{nm} = p\sqrt{n^2 + m^2} \quad \text{for} \quad n, m = 1, 2, \ldots, \lfloor d/p \rfloor. \tag{1}$$
We define the mismatch error ratio $f_{nm}$ as:
$$f_{nm} = \begin{cases} \left| \dfrac{d - r_{nm}}{d} \right|, & \theta_1 < \arctan\!\left(\dfrac{m}{n}\right) < \theta_2 \\ -1, & \text{elsewhere,} \end{cases} \tag{2}$$
where (θ1, θ2) is the angular range of moiré pattern reduction. The optimal tilted angle θ can be calculated from the following relationships:
$$A(n', m') = \operatorname{Findmin}\,(f_{nm})_{s \times u} = \operatorname{Findmin} \begin{pmatrix} f_{11} & f_{12} & \cdots & f_{1u} \\ f_{21} & f_{22} & \cdots & f_{2u} \\ \vdots & \vdots & \ddots & \vdots \\ f_{s1} & f_{s2} & \cdots & f_{su} \end{pmatrix} \tag{3}$$
and
$$\theta = \arctan\!\left(\frac{m'}{n'}\right), \tag{4}$$
where the Findmin function returns in $A(n', m')$ the subscripts of the minimum non-negative entry of a matrix. Once the values of $n'$ and $m'$ are given, θ is determined, and the effective pixel arrangement of a tilted EI is then also fixed.
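The search of Eqs. (1)–(4) can be sketched in a few lines (a brute-force illustration with assumed inputs, not the authors' implementation; the angular range must be supplied by a separate moiré analysis):

```python
import math

def optimal_tilt(d, p, theta1_deg, theta2_deg):
    """Brute-force the search of Eqs. (1)-(4): scan the integer pixel
    counts (n, m), discard angles outside the moire-reduced range
    (f_nm = -1 in Eq. (2)), and minimise the pitch mismatch ratio."""
    limit = math.floor(d / p)
    best = None
    for n in range(1, limit + 1):
        for m in range(1, limit + 1):
            angle = math.degrees(math.atan2(m, n))
            if not (theta1_deg < angle < theta2_deg):
                continue                       # excluded: f_nm = -1
            r_nm = p * math.hypot(n, m)        # Eq. (1)
            f_nm = abs((d - r_nm) / d)         # Eq. (2)
            if best is None or f_nm < best[0]:
                best = (f_nm, n, m, angle)
    return best

# Assumed example inputs matching the experimental section:
# d = 1.25 mm, p = 0.096 mm, moire-reduced range (15, 25) degrees.
f, n, m, theta = optimal_tilt(1.25, 0.096, 15, 25)
print(n, m, round(theta, 2))   # reproduces n' = 12, m' = 5, theta = 22.62
```

With these inputs the minimum mismatch is $f_{12,5} = |1.25 - 0.096\sqrt{169}|/1.25 \approx 0.0016$, which is why a 5-12-13 triple gives a nearly exact fit.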

Fig. 3 Parameters of the tilted micro-lens on a LCD.

Figure 4 shows the rectangular coordinate system of the proposed pixel mapping method. Red boxes $S_{i,j}$ and blue boxes $G_{h,v}$ represent source pixels and target pixels, respectively, where $i$, $j$, $h$, and $v$ are positive integers (1, 2, 3, ...). The centers of $S_{1,1}$ and $G_{1,1}$ both lie on the origin of the coordinate system. The input image is rotated by θ around the origin $o$. We take the pixel size as the unit length, so the coordinate of $G_{h,v}$'s center is $(h-1, v-1)$. Since the direction of rotation is clockwise, the transformation formula is given as:

Fig. 4 Conceptual diagram of the proposed pixel mapping method.

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \tag{5}$$

According to Eq. (5), the four vertices $A(x_1, y_1)$, $B(x_2, y_2)$, $C(x_3, y_3)$, and $D(x_4, y_4)$ of the rotated $S_{i,j}$ are calculated as:

$$\begin{cases} x_1 = (i - 3/2)\cos\theta + (j - 1/2)\sin\theta \\ y_1 = -(i - 3/2)\sin\theta + (j - 1/2)\cos\theta, \end{cases} \tag{6}$$
$$\begin{cases} x_2 = (i - 1/2)\cos\theta + (j - 1/2)\sin\theta \\ y_2 = -(i - 1/2)\sin\theta + (j - 1/2)\cos\theta, \end{cases} \tag{7}$$
$$\begin{cases} x_3 = (i - 1/2)\cos\theta + (j - 3/2)\sin\theta \\ y_3 = -(i - 1/2)\sin\theta + (j - 3/2)\cos\theta, \end{cases} \tag{8}$$
$$\begin{cases} x_4 = (i - 3/2)\cos\theta + (j - 3/2)\sin\theta \\ y_4 = -(i - 3/2)\sin\theta + (j - 3/2)\cos\theta. \end{cases} \tag{9}$$
If the center of $G_{h,v}$ lies inside the rotated $S_{i,j}$, we set up a one-to-one mapping correspondence between $G_{h,v}$ and $S_{i,j}$, represented as $G_{h,v} = S_{i,j}$. Using linear programming and substituting the coordinates of $G_{h,v}$'s center together with $A(x_1, y_1)$, $B(x_2, y_2)$, $C(x_3, y_3)$, and $D(x_4, y_4)$, the sufficient condition for $G_{h,v} = S_{i,j}$ is given by

$$\begin{cases} (h-1)\sin\theta + (v-1)\cos\theta < j - \dfrac{1}{2} \\ (h-1)\sin\theta + (v-1)\cos\theta > j - \dfrac{3}{2} \\ (h-1)\cos\theta - (v-1)\sin\theta < i - \dfrac{1}{2} \\ (h-1)\cos\theta - (v-1)\sin\theta > i - \dfrac{3}{2}. \end{cases} \tag{10}$$

In the proposed pixel mapping method, the target pixels are the effective pixels of the tilted EIA. We first select a target pixel according to the effective pixel arrangement, and then look for its source pixel in the conventional EIs. This is the main difference between general image rotation and our pixel mapping method.
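The four inequalities above can be solved in closed form (a sketch under our own conventions; the tie-breaking at pixel edges and the numeric angle are assumptions, not part of the paper):

```python
import math

def source_pixel(h, v, theta):
    """For target pixel G_{h,v}, find the unique source pixel S_{i,j}
    whose rotated unit square contains the target centre (h-1, v-1).
    From the inequalities: j - 3/2 < a < j - 1/2 gives j = floor(a + 3/2),
    and likewise for i. If a + 1/2 happens to be an integer the centre
    lies exactly on a pixel edge and no pixel satisfies the strict
    inequalities; this sketch then simply picks the upper neighbour."""
    a = (h - 1) * math.sin(theta) + (v - 1) * math.cos(theta)
    b = (h - 1) * math.cos(theta) - (v - 1) * math.sin(theta)
    return math.floor(b + 1.5), math.floor(a + 1.5)

theta = math.radians(22.62)  # assumed moire-reduced tilt angle
for h, v in [(1, 1), (10, 1), (7, 9)]:
    i, j = source_pixel(h, v, theta)
    a = (h - 1) * math.sin(theta) + (v - 1) * math.cos(theta)
    b = (h - 1) * math.cos(theta) - (v - 1) * math.sin(theta)
    # check the four strict inequalities of the mapping condition
    assert j - 1.5 < a < j - 0.5 and i - 1.5 < b < i - 0.5
print(source_pixel(1, 1, theta))  # the two origin pixels coincide: (1, 1)
```

Because the mapping runs from target to source, every effective pixel of the tilted EIA receives exactly one value and no interpolation step is needed.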

3. Experimental results and discussion

To show the usefulness of the proposed method, we carried out experiments comparing the quality of images reconstructed with the different tilted EIA generation methods.

In the experiments, we obtain the conventional EIs with 3DS MAX, a 3D modeling package commonly used to build and capture virtual 3D scenes in integral imaging [31, 32]. It provides a virtual camera array to simulate an MLA for capturing 3D objects, as shown in Fig. 5(a). In our experiments, the virtual camera array is composed of 157 × 118 cameras. Each camera's focal length is 97.356 mm and its field of view is 20.95°. Each EI rendered by the corresponding camera has 13 × 13 pixels. The 3D model consists of two "footballs", as shown in Fig. 5(b), where "T", "L", "F", and "P" represent the top view, left view, front view, and perspective user view, respectively.

Fig. 5 (a) Scheme of the virtual pick-up stage and (b) 3D model used in this experiment.

In our experiments, we use an iPad to display the EIAs. A rectangular-type MLA has been embedded in the iPad to reconstruct the 3D images, and a CCD camera is placed in front of the iPad to capture the reconstructed 3D images, as shown in Fig. 6. The parameters of our experimental set-up are given in Table 1, from which d = 1.25 mm and p = 0.096 mm. The moiré-reduced angle range, determined by software testing and naked-eye observation, is (15°, 25°). According to Eqs. (1)-(4) and this range, we obtain $n' = 12$, $m' = 5$, and θ = 22.62°, which also fixes the effective pixel arrangement of a tilted EI.

Fig. 6 Experimental set-up for 3D image reconstruction.

Table 1. Parameters of the Integral Imaging Display

Table 2 shows the characteristics of the different EIs. The notation '-' signifies that there are no border pixels: the tilted EI generated by general image rotation has no border pixels because that method does not classify the pixels. The EIs located at the edge of the EIA are not included in the total number of EIs.

Table 2. Comparison of EIs Generated by Different Methods

The computer-generated EIAs and the reconstructed 3D images of the different generation methods are shown in Fig. 7. Comparing Figs. 7(a), 7(b), and 7(c) leads to the following conclusions. The reconstructed 3D image of the conventional CGII has severe color distortion, as shown in Fig. 7(a). Figures 7(b) and 7(c) show that the color moiré patterns are reduced remarkably with a 22.62° rotation angle. In Fig. 7(b), however, the reconstructed blue-and-yellow "football" is discontinuous and the edge of the red-and-white "football" is blurred, whereas with our method the reconstructed 3D images have clear edges and no distortion, as shown in Fig. 7(c). The view images reconstructed from different directions by our method are shown in Fig. 8.

Fig. 7 Computer-generated EIAs (top) and reconstructed 3D images (bottom). (a) conventional CGII, (b) general image rotation, and (c) proposed method.

Fig. 8 Different views of the reconstructed 3D images.

4. Conclusion

A tilted EIA generation method for CGII has been proposed. We divide the pixels of the tilted EIA into border pixels and effective pixels. The uniform effective pixel arrangement makes the generation of the tilted EIs easier. A new pixel mapping method is proposed to improve the quality of the reconstructed 3D images. Experimental results show that the reconstructed 3D images using the proposed method have a higher resolution and less distortion than those of the general image rotation method.

Acknowledgments

This work is supported by the “973” Program under Grant No. 2013CB328802, the NSFC under Grant Nos. 61225022 and 61036008, and the “863” Program under Grant Nos. 2012AA011901 and 2012AA03A301.

References and links

1. B. Lee, S. Y. Jung, S. W. Min, and J. H. Park, “Three-dimensional display by use of integral photography with dynamically variable image planes,” Opt. Lett. 26(19), 1481–1482 (2001). [CrossRef]   [PubMed]  

2. A. Stern and B. Javidi, “Three dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006). [CrossRef]  

3. J. H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48(34), H77–H94 (2009). [CrossRef]   [PubMed]  

4. H. Yoo, “Artifact analysis and image enhancement in three-dimensional computational integral imaging using smooth windowing technique,” Opt. Lett. 36(11), 2107–2109 (2011). [CrossRef]   [PubMed]  

5. Y. Liu, H. Ren, S. Xu, Y. Chen, L. Rao, T. Ishinabe, and S. T. Wu, “Adaptive focus integral image system design based on fast-response liquid crystal microlens,” J. Disp. Technol. 7(12), 674–678 (2011). [CrossRef]  

6. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013). [CrossRef]   [PubMed]  

7. C. C. Ji, H. Deng, and Q. H. Wang, “Pixel extraction based integral imaging with controllable viewing direction,” J. Opt. 14(9), 095401 (2012). [CrossRef]  

8. K. C. Kwon, C. Park, M. U. Erdenebat, J. S. Jeong, J. H. Choi, N. Kim, J. H. Park, Y. T. Lim, and K. H. Yoo, “High speed image space parallel processing for computer-generated integral imaging system,” Opt. Express 20(2), 732–740 (2012). [CrossRef]   [PubMed]  

9. S. H. Jiao, X. G. Wang, M. C. Zhou, W. M. Li, T. Hong, D. Nam, J. H. Lee, E. H. Wu, H. T. Wang, and J. Y. Kim, “Multiple ray cluster rendering for interactive integral imaging system,” Opt. Express 21(8), 10070–10086 (2013). [CrossRef]   [PubMed]  

10. G. Baasantseren, J. H. Park, K. C. Kwon, and N. Kim, “Viewing angle enhanced integral imaging display using two elemental image masks,” Opt. Express 17(16), 14405–14417 (2009). [CrossRef]   [PubMed]  

11. J. H. Jung, S. G. Park, Y. Kim, and B. Lee, “Integral imaging using a color filter pinhole array on a display panel,” Opt. Express 20(17), 18744–18756 (2012). [CrossRef]   [PubMed]  

12. I. Amidror, R. D. Hersch, and V. Ostromoukhov, “Spectral analysis and minimization of moiré patterns in color separation,” J. Electron. Imaging 3(3), 295–317 (1994). [CrossRef]  

13. R. Börner, “Four autostereoscopic monitors on the level of industrial prototypes,” Displays 20(2), 57–64 (1999). [CrossRef]  

14. V. Saveljev, J. Y. Son, B. Javidi, S. K. Kim, and D. S. Kim, “Moiré minimization condition in three-dimensional image displays,” J. Disp. Technol. 1(2), 347–353 (2005). [CrossRef]  

15. V. Saveljev and S. K. Kim, “Simulation of moiré effect in 3D displays,” J. Opt. Soc. Korea 14(4), 310–315 (2010). [CrossRef]  

16. M. Okui, M. Kobayashi, J. Arai, and F. Okano, “Moire fringe reduction by optical filters in integral three-dimensional imaging on a color flat-panel display,” Appl. Opt. 44(21), 4475–4483 (2005). [CrossRef]   [PubMed]  

17. Y. Kim, G. Park, S. W. Cho, J. H. Jung, B. Lee, Y. Choi, and M. G. Lee, “Integral imaging with reduced color moiré pattern by using a slanted lens array,” Proc. SPIE 6803, 68030L (2008). [CrossRef]  

18. K. Yanaka and K. Uehira, “Extended fractional view integral imaging using slanted fly's eye lens,” in SID Symposium Digest of Technical Papers (2011), pp. 1124–1127. [CrossRef]

19. Y. Kim, G. Park, J. H. Jung, J. Kim, and B. Lee, “Color moiré pattern simulation and analysis in three-dimensional integral imaging for finding the moiré-reduced tilted angle of a lens array,” Appl. Opt. 48(11), 2178–2187 (2009). [CrossRef]   [PubMed]  

20. Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978). [CrossRef]  

21. M. Halle, “Multiple viewpoint rendering,” SIGGRAPH ’98, Proceedings of the 25th Annual conference on Computer Graphics and Interactive Techniques, 243–254 (1998).

22. R. Yang, X. Huang, and S. Chen, “Efficient rendering of integral images,” SIGGRAPH ’05, Proceedings of the 32nd Annual Conference on Computer Graphics and Interactive Techniques, 44 (2005).

23. S. W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), L71–L74 (2005). [CrossRef]  

24. S. W. Min, K. S. Park, B. Lee, Y. Cho, and M. Hahn, “Enhanced image mapping algorithm for computer-generated integral imaging system,” Jpn. J. Appl. Phys. 45(28), L744–L747 (2006). [CrossRef]  

25. B. N. R. Lee, Y. Cho, K. S. Park, S. W. Min, J. S. Lim, M. C. Whang, and K. R. Park, “Design and implementation of a fast integral image rendering method,” International Conference on Electronic Commerce 2006, 135–140 (2006). [CrossRef]  

26. K. S. Park, S. W. Min, and Y. Cho, “Viewpoint vector rendering for efficient elemental image generation,” IEICE Trans. Inf. Syst. E90-D, 233–241 (2007).

27. S. C. Kang, Z. Z. Stroll, and S. C. Miller, “Small angle image rotation using block transfers,” U.S. patent 4829452 (May 9, 1989).

28. D. H. Shin and H. Yoo, “Image quality enhancement in 3D computational integral imaging by use of interpolation methods,” Opt. Express 15(19), 12039–12049 (2007). [CrossRef]   [PubMed]  

29. H. Yoo, “Axially moving a lenslet array for high-resolution 3D images in computational integral imaging,” Opt. Express 21(7), 8873–8878 (2013). [CrossRef]   [PubMed]  

30. W. Li, H. Wang, M. Zhou, S. Wang, S. Jiao, X. Mei, T. Hong, H. Lee, and J. Kim, “Principal observation ray calibration for tiled-lens-array integral imaging display,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (Oregon Convention Center, Portland, Oregon, 2013), pp. 1019–1026.

31. H. Deng, Q. H. Wang, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. 7(5), 255–258 (2011). [CrossRef]  

32. Y. Xu, X. R. Wang, Y. Sun, and J. Q. Zhang, “Homogeneous light field model for interactive control of viewing parameters of integral imaging displays,” Opt. Express 20(13), 14137–14151 (2012). [CrossRef]   [PubMed]  
