Abstract

In this paper, we analyze the depth of field (DOF) of integral imaging displays based on wave optics. Taking the diffraction effect into consideration, we analyze the intensity distribution of light through multiple micro-lenses and derive a DOF calculation formula for the integral imaging display system. We study the variations of the DOF values with different system parameters. Experimental results are provided to verify the accuracy of the theoretical analysis. The analyses and experimental results presented in this paper could be beneficial for better understanding and designing integral imaging displays.

© 2013 Optical Society of America

1. Introduction

Integral imaging is a 3D technique that was proposed by Lippmann in 1908 [1]. It can provide full-parallax and continuous-viewing 3D images and does not require any special glasses or coherent light, such as those used in multi-view auto-stereoscopic 3D displays or holography [2–7]. Integral imaging has attracted extensive attention for its research and application in the area of 3D sensing and display [8–29].

However, integral imaging displays are affected by diffraction and aberration effects in both optical pickup and display processes. To eliminate the effects of diffraction and aberration in the optical pickup process, computer-generated integral imaging can be applied [30]. It can generate elemental images by using computer graphic techniques with the parameters of the virtual micro-lens array (MLA) without a real optical system. Because of its flexibility and diffraction-free feature, computer-generated integral imaging has been applied in many fields, such as entertainment and medical science. In such cases, the deterioration factor in the pickup process can be ignored and only the display process should be considered when analyzing the viewing performance of the system. In this paper, we analyze the depth of field (DOF) using computer-generated integral imaging.

The main parameters used to evaluate the performance of the integral imaging display are DOF, viewing resolution, and viewing angle [31–36]. In this paper, we focus on the analysis of the DOF of integral imaging displays using wave optics. First, we analyze the number of correspondence pixels which are responsible for reconstructing the same image point in 3D space. Then, by analyzing the wave propagation and light intensity distribution with multiple micro-lenses, we perform the DOF calculation based on hyperbola fitting and the minimum angular resolution of human eyes. By taking the diffraction effect into consideration and using the superposition of light intensity with multiple micro-lenses, we can obtain more accurate DOF calculation results than the previous methods for the integral imaging display. Experimental results are also given to verify the accuracy of the proposed method.

This paper is organized as follows. In section 2, we review two pre-existing DOF calculation methods based on geometrical optics and the Gaussian beam distribution model [37,38]. In section 3, we propose a DOF calculation method using wave optics [39,40]. In section 4, we calculate several groups of DOF values using our proposed method and analyze the variations of DOF with different system parameters. In section 5, the accuracy of the method is verified by comparing experimental results with the predictions of the proposed and previous methods. In section 6, we summarize the main achievements of this paper and conclude with future directions and outlook.

2. Previous DOF calculation methods

One of the most important parameters that set 3D displays apart from 2D displays is their DOF. To understand the DOF of integral imaging displays, Min et al. [37] performed an analysis using geometrical optics theory without considering the diffraction effect caused by MLA. Because the integrated image is out of focus when the image is located away from the central depth plane (CDP), DOF can be set as the distance between the rear and front marginal depth planes, where the ray-optical focusing error occurs due to the overlap of the image pixel:

\mathrm{DOF}_{\mathrm{Geom}}=\frac{2l^{2}}{g}\times\frac{p_{d}}{p},  (1)
where p and p_d represent the pitch of the MLA and the pixel size of the display, respectively. Parameter g represents the gap between the display and the MLA, and l represents the conjugate distance defined by the Gauss lens law [41].
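For concreteness, Eq. (1) can be evaluated directly. A minimal Python sketch (the numerical values are illustrative and not taken from the paper's tables; l follows from the Gauss lens law l = fg/(g − f)):

```python
def dof_geom(l, g, p, p_d):
    """Geometrical-optics DOF of Eq. (1): DOF = (2 l^2 / g) * (p_d / p).

    l   : conjugate (image) distance from the MLA [mm]
    g   : gap between display and MLA [mm]
    p   : micro-lens pitch [mm]
    p_d : display pixel size [mm]
    """
    return (2.0 * l**2 / g) * (p_d / p)

# Illustrative values: f = 4 mm, g = 5 mm -> l = f*g/(g - f) = 20 mm;
# the pixel size p_d = 0.1 mm is an assumed value.
print(dof_geom(l=20.0, g=5.0, p=0.8, p_d=0.1))  # -> 20.0
```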

A DOF calculation method based on Gaussian beam distribution model was also proposed [38]. In that model, the light emanating from a single pixel on the display was considered as a Gaussian beam whose waist is located at the CDP. Taking into account the minimum angular resolution of human eyes (αe) [42], the DOF was obtained by calculating the locations of the rear and front marginal depth planes based on the Gaussian beam function:

\mathrm{DOF}_{\mathrm{Gauss}}=\frac{2l\sqrt{4\tan^{2}(\alpha_{e}/2)\,d^{2}p^{2}g^{2}+4p_{d}^{2}\tan^{2}(\alpha_{e}/2)\,l^{4}-p^{2}l^{2}p_{d}^{2}}}{\left|4\tan^{2}(\alpha_{e}/2)\,l^{2}g-gp^{2}\right|},  (2)
where d refers to the viewing distance between the observer and the CDP.
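Eq. (2) can be sketched in the same way; here αe is passed in radians, and the pixel size p_d = 0.1 mm is an assumed illustrative value:

```python
import math

def dof_gauss(l, g, p, p_d, d, alpha_e):
    """Gaussian-beam DOF of Eq. (2); alpha_e in radians, lengths in mm.
    An illustrative transcription of the formula, not the authors' code."""
    t = math.tan(alpha_e / 2.0)
    num = 2.0 * l * math.sqrt(4 * t**2 * d**2 * p**2 * g**2
                              + 4 * p_d**2 * t**2 * l**4
                              - p**2 * l**2 * p_d**2)
    den = abs(4 * t**2 * l**2 * g - g * p**2)
    return num / den

# alpha_e = 1.662e-2 degrees is the value quoted later in the paper [42].
print(dof_gauss(l=20.0, g=5.0, p=0.8, p_d=0.1, d=2500.0,
                alpha_e=math.radians(1.662e-2)))
```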

These DOF calculation methods are easy to use but suffer from low accuracy because they both neglect the diffraction effect and take only a single micro-lens into consideration. Even though the second method tries to approximate the actual light intensity distribution in the image space, the Gaussian beam functions it uses are derived from the geometrical relationships of the integral imaging display, which limits the accuracy of the calculated DOF.

3. DOF calculation methods based on wave optics

Here, we propose another approach for DOF calculation using wave optics. First, we analyze the number of correspondence pixels that are responsible for reconstructing the same image point in 3D space. Then, by analyzing the wave propagation and light intensity distribution with multiple micro-lenses, we perform the DOF calculation based on hyperbola fitting and the minimum angular resolution of human eyes.

Note that for an integral imaging display the pixel size of the display device is a factor that strongly influences the DOF [37,38]. However, in our analysis, we regard the display pixels as ideal points for simplification. We only consider square micro-lenses in this paper, but it is not difficult to expand our proposed method to other micro-lens shapes. In addition, our studies are based on resolution priority integral imaging (RPII) displays [43] while the derivations of depth priority integral imaging (DPII) [43] displays are similar.

3.1 Number of correspondence pixels of the RPII display

For a RPII display, the correspondence pixels are located on different elemental images and imaged by their corresponding micro-lenses to reconstruct the same image point on the conjugate plane, as shown in Fig. 1. Suppose the MLA contains M × N micro-lenses, and the central micro-lens is denoted as the 0th micro-lens. To describe the light propagation in the image space, we set up a coordinate system xyz in which the z axis coincides with the optical axis of the 0th micro-lens and the origin is located at the center of that lens. A plane coordinate system x0y0 is also utilized to describe the MLA plane (z = 0).

 

Fig. 1 Number of correspondence pixels to reconstruct image point A in the RPII display.


We take image point A located on the z axis for example. Assuming the wth elemental image is the critical one that contains the correspondence pixel for image point A, according to the geometric relationships, w is given by

w=\left\lfloor\frac{f}{2\left|g-f\right|}\right\rfloor,  (3)
where the symbol ⌊x⌋ denotes the greatest integer less than or equal to x, and f and g denote the focal length of the micro-lenses and the gap between the display and the MLA, respectively.
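As a check of Eq. (3): with the parameters used later in Fig. 3 (f = 4.0 mm, g = 5.0 mm), w = 2, consistent with the quoted H × V = 5 × 5. A minimal sketch:

```python
import math

def critical_index_w(f, g):
    """Eq. (3): index of the outermost elemental image that holds a
    correspondence pixel for an on-axis point, w = floor(f / (2|g - f|))."""
    return math.floor(f / (2.0 * abs(g - f)))

print(critical_index_w(f=4.0, g=5.0))  # -> 2, so 2w + 1 = 5 lenses per axis
```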

The number of correspondence pixels (H × V) for image point A is given by

H\times V=\begin{cases}(2w+1)\times(2w+1), & (2w+1)\le\min(M,N)\\ M\times(2w+1), & M\le(2w+1)\le N\\ (2w+1)\times N, & N\le(2w+1)\le M\\ M\times N, & (2w+1)\ge\max(M,N),\end{cases}  (4)
where functions min(M, N) and max(M, N) give the smaller and the larger values of M and N, respectively.
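The four cases of Eq. (4) amount to clipping (2w + 1) independently against M and N, which a short sketch makes explicit (illustrative, not the authors' code):

```python
def correspondence_pixels(w, M, N):
    """Eq. (4): number of correspondence pixels H x V, with the count
    (2w + 1) per axis clipped by the M x N extent of the MLA."""
    H = min(2 * w + 1, M)
    V = min(2 * w + 1, N)
    return H, V

print(correspondence_pixels(w=2, M=110, N=110))   # -> (5, 5)
print(correspondence_pixels(w=60, M=110, N=110))  # -> (110, 110)
```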

Among the H × V correspondence pixels on the display panel (x_D, y_D), as shown in Fig. 1, the coordinates of an arbitrary correspondence pixel D(x_{mn}^{D}, y_{mn}^{D}) can be expressed with the system parameters as

(x_{mn}^{D},\,y_{mn}^{D})=\left(\frac{g}{f}mp,\ \frac{g}{f}np\right),  (5)
where m = -w, ..., -1, 0, 1, ..., w and n = -w, ..., -1, 0, 1, ..., w are the indices of the micro-lenses in the MLA corresponding to the correspondence pixel D(x_{mn}^{D}, y_{mn}^{D}).
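Eq. (5) in code form (the index and parameter values are illustrative):

```python
def correspondence_pixel_coords(m, n, g, f, p):
    """Eq. (5): display-plane coordinates of correspondence pixel D_mn
    for an on-axis image point."""
    return (g / f * m * p, g / f * n * p)

print(correspondence_pixel_coords(m=1, n=-2, g=5.0, f=4.0, p=0.8))  # -> (1.0, -2.0)
```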

3.2 Light intensity distribution of the RPII display

After obtaining the number of correspondence pixels, we will analyze the light intensity distribution in the image space of the RPII display.

For the RPII display shown in Fig. 2, the pupil function for the central micro-lens can be given as

 

Fig. 2 Light intensity distribution of the RPII display.


P_{00}(x_{0},y_{0})=\mathrm{rect}\!\left(\frac{x_{0}}{p},\frac{y_{0}}{p}\right)=\begin{cases}1, & -\frac{p}{2}\le x_{0}\le\frac{p}{2},\ -\frac{p}{2}\le y_{0}\le\frac{p}{2}\\ 0, & \mathrm{otherwise}.\end{cases}  (6)

The pupil function for the other micro-lenses can be expressed as P_{mn}(x_{0},y_{0})=P_{00}(x_{0}-mp,\,y_{0}-np).

Assuming that the correspondence pixel D(x_{mn}^{D}, y_{mn}^{D}) is a δ function (point source), the phase transformation of the corresponding micro-lens is given by

t_{mn}(x_{0},y_{0})=P_{00}(x_{0}-mp,\,y_{0}-np)\exp\left\{-\frac{jk}{2f}\left[(x_{0}-mp)^{2}+(y_{0}-np)^{2}\right]\right\}.  (7)

For a single correspondence pixel D(x_{mn}^{D}, y_{mn}^{D}), we can obtain the light intensity distribution at an arbitrary point C(x, y, z) on a certain depth plane by using the paraxial approximation and the Fresnel diffraction theory, as given below:

I_{mn}(x,y;z)=\left|\frac{1}{\bar{\lambda}^{2}gz}\iint_{-\infty}^{+\infty}\exp\left\{\frac{jk}{2g}\left[(x_{0}-x_{mn}^{D})^{2}+(y_{0}-y_{mn}^{D})^{2}\right]\right\}\times t_{mn}(x_{0},y_{0})\times\exp\left\{\frac{jk}{2z}\left[(x-x_{0})^{2}+(y-y_{0})^{2}\right]\right\}\mathrm{d}x_{0}\,\mathrm{d}y_{0}\right|^{2},  (8)
where \bar{\lambda} is the mean wavelength and k=2\pi/\bar{\lambda} is the wave number.

For multiple correspondence pixels, all the intensity distributions can be superimposed linearly. Therefore, the light intensity distribution of the RPII display can be expressed as

I(x,y;z)=\sum_{m=-w}^{w}\sum_{n=-w}^{w}I_{mn}(x,y;z).  (9)
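Equations (8) and (9) can be evaluated numerically. The sketch below is a 1D analogue (direct quadrature of the Fresnel integral; the grid sizes and normalization are our assumptions, not the authors' implementation), using the parameters quoted for Fig. 3:

```python
import numpy as np

# Parameters quoted for Fig. 3: f = 4 mm, g = 5 mm, p = 0.8 mm
f, g, p = 4.0, 5.0, 0.8
lam = 5.5e-4                      # mean wavelength [mm]
k = 2.0 * np.pi / lam
w = 2                             # from Eq. (3), so 2w + 1 = 5 lenses per axis

def intensity_1d(x, z):
    """1D analogue of Eqs. (8) and (9): incoherent sum over lenses
    m = -w..w, each fed by its correspondence pixel at (g/f) m p."""
    x0 = np.linspace(-(w + 0.5) * p, (w + 0.5) * p, 4001)  # MLA-plane grid
    dx0 = x0[1] - x0[0]
    I = np.zeros_like(x)
    for m in range(-w, w + 1):
        xD = g / f * m * p                                 # Eq. (5)
        aperture = (np.abs(x0 - m * p) <= p / 2).astype(float)
        u0 = (aperture
              * np.exp(1j * k / (2 * g) * (x0 - xD) ** 2)       # pixel -> lens
              * np.exp(-1j * k / (2 * f) * (x0 - m * p) ** 2))  # lens phase, Eq. (7)
        kernel = np.exp(1j * k / (2 * z) * (x[:, None] - x0[None, :]) ** 2)
        field = (kernel * u0[None, :]).sum(axis=1) * dx0 / (lam * np.sqrt(g * z))
        I += np.abs(field) ** 2                            # Eq. (9): intensities add
    return I

x = np.linspace(-0.05, 0.05, 201)
I = intensity_1d(x, z=20.0)       # conjugate plane: l = f g / (g - f) = 20 mm
```

On the conjugate plane the quadratic phases cancel and all five lenses contribute diffraction patterns peaked at x = 0, so the summed intensity has its maximum on the optical axis.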

3.3 DOF of the RPII display

In the following parts, we only take into account one dimension for simplicity, and the discussions for the other dimension are the same due to the square shapes of micro-lenses.

For a given RPII display the diffraction intensity patterns on different depth planes can be obtained by using Eqs. (8) and (9). Figures 3(a) and 3(b) show the diffraction intensity patterns at depth planes z = 20.0 mm (the conjugate plane) and z = 26.0 mm with parameters f = 4.0 mm, g = 5.0 mm, p = 0.8 mm, λ̄ = 5.5 × 10^-4 mm, and the number of correspondence pixels H × V = 5 × 5.

 

Fig. 3 Cross sections and meridian sections (y = 0) of the diffraction intensity patterns on different depth planes (a) z = 20.0 mm and (b) z = 26.0 mm. Here, f = 4.0 mm, g = 5.0 mm, p = 0.8 mm, λ̄ = 5.5 × 10^-4 mm, and H × V = 5 × 5.


For different depth planes, the corresponding diffraction intensity patterns can be calculated as shown in Fig. 3. For a given diffraction intensity pattern, we use a square light spot that contains a substantial percentage (84%) of the whole pattern energy to replace the entire pattern [44]. The diffraction intensity pattern can then be simplified as a square light spot with a half side length r(z). Therefore, r(z) can be calculated as

\int_{-r(z)}^{r(z)}I(x,0;z)\,\mathrm{d}x=0.84\int_{-\infty}^{+\infty}I(x,0;z)\,\mathrm{d}x.  (10)
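Given a sampled profile I(x, 0; z), the r(z) of Eq. (10) can be found by growing a symmetric window around x = 0 until it captures 84% of the energy. A sketch, validated here on an assumed Gaussian test profile rather than a computed diffraction pattern:

```python
import numpy as np

def half_side_length(x, I):
    """Eq. (10): half side length r(z) of the square spot containing 84%
    of the pattern energy, from a sampled symmetric profile I(x, 0; z)."""
    dx = x[1] - x[0]
    total = I.sum() * dx
    c = int(np.argmin(np.abs(x)))            # index of x = 0
    energy, half = I[c] * dx, 0
    # grow a symmetric window around x = 0 until it holds 84% of the energy
    while energy < 0.84 * total and half < c:
        half += 1
        energy += (I[c - half] + I[c + half]) * dx
    return half * dx

# Gaussian test profile with sigma = 0.1; the 84% half-width should be
# close to 1.405 * sigma.
x = np.linspace(-1.0, 1.0, 2001)
I = np.exp(-x**2 / (2 * 0.1**2))
print(half_side_length(x, I))
```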

The discrete values of the half side length r(z) are obtained and fitted to a hyperbola function whose waist is located on the conjugate plane, as denoted by the red lines in Fig. 4. Taking into consideration the minimum angular resolution of human eyes (αe), we can calculate the DOF using the intersection of the hyperbola and the margin lines of αe (blue lines shown in Fig. 4). Here, d represents the viewing distance between the eye and the conjugate plane.

 

Fig. 4 Analysis of DOF of the RPII display based on wave optics taking into account the minimum angular resolution of human eyes.


After obtaining the hyperbola functions, we can deduce the DOF as follows. The hyperbola fitting function of the light beam can be written as

\frac{x^{2}}{a^{2}}-\frac{(z-l)^{2}}{b^{2}}=1,  (11)
where a and b are given by the hyperbola fitting results of the light beam.
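Since r(z)² = a² + (a²/b²)(z − l)² is linear in (z − l)², the hyperbola parameters can be fitted by ordinary linear least squares. A sketch on synthetic data (the values a = 0.02 mm, b = 4 mm, l = 20 mm are assumed for illustration):

```python
import numpy as np

def fit_hyperbola(z, r, l):
    """Fit r(z)^2 = a^2 + (a^2 / b^2)(z - l)^2, the squared form of
    Eq. (11), by linear least squares; returns (a, b)."""
    A = np.vstack([np.ones_like(z), (z - l) ** 2]).T
    c0, c1 = np.linalg.lstsq(A, r ** 2, rcond=None)[0]
    a = np.sqrt(c0)
    b = np.sqrt(c0 / c1)
    return a, b

# Synthetic check: exact hyperbola with a = 0.02 mm, b = 4 mm, l = 20 mm
z = np.linspace(14.0, 26.0, 25)
r = 0.02 * np.sqrt(1 + (z - 20.0) ** 2 / 16.0)
a, b = fit_hyperbola(z, r, l=20.0)
print(round(a, 4), round(b, 2))  # -> 0.02 4.0
```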

The margin lines of the minimum angular resolution of human eyes (αe) are given by the following equation:

x=\pm\tan\!\left(\frac{\alpha_{e}}{2}\right)(z-l-d).  (12)

Combining Eqs. (11) and (12), we can confirm the locations of the rear and front marginal depth planes. Therefore, the DOF of the RPII display can be defined as the distance between the two marginal depth planes:

\mathrm{DOF}_{\mathrm{Wave}}=\frac{2\sqrt{a^{2}b^{4}\tan^{2}(\alpha_{e}/2)+a^{2}b^{2}d^{2}\tan^{2}(\alpha_{e}/2)-a^{4}b^{2}}}{\left|a^{2}-b^{2}\tan^{2}(\alpha_{e}/2)\right|}.  (13)
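Eq. (13) in code form (αe in radians; the fitted a, b and the viewing distance d are inputs, and the example values below are assumed for illustration):

```python
import math

def dof_wave(a, b, d, alpha_e):
    """Eq. (13): DOF from the fitted hyperbola parameters (a, b), the
    viewing distance d, and the minimum angular resolution alpha_e [rad]."""
    t2 = math.tan(alpha_e / 2.0) ** 2
    num = 2.0 * math.sqrt(a**2 * b**4 * t2 + a**2 * b**2 * d**2 * t2
                          - a**4 * b**2)
    den = abs(a**2 - b**2 * t2)
    return num / den

# Assumed fit results a = 0.02 mm, b = 4 mm; d = 2500 mm, alpha_e from [42].
print(dof_wave(a=0.02, b=4.0, d=2500.0, alpha_e=math.radians(1.662e-2)))
```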

4. DOF calculation results based on wave optics

After obtaining the DOF calculation formula, we can use it to analyze the variations of DOF with different parameters. In the calculation process we assume that the number of correspondence pixels (H × V) always meets the requirement of (2w + 1) ≤ min(M, N). Note that H × V is determined by the focal length f and the gap g according to Eqs. (3) and (4); once f and g are given, H × V is fixed. The relationship between the viewing distance d and DOFWave is given by Eq. (13), which shows that DOFWave increases monotonically as d increases. Therefore, we only discuss the variations of DOFWave with different focal length f, gap g, and MLA pitch p in this paper. Table 1 lists the parameters used in this section. The minimum angular resolution of human eyes is set to αe = 1.662 × 10^-2 ° [42], the mean wavelength is λ̄ = 5.5 × 10^-4 mm, and the viewing distance is d = 2500 mm [45–47].


Table 1. Parameters used in DOF calculation

DOFWave values have been calculated with different specifications according to Eq. (13). Tables 2–4 show the variations of DOFWave with different focal length f, gap g, and MLA pitch p, respectively.

Table 2. Variation of DOFWave with different focal length f

Table 3. Variation of DOFWave with different gap g

Table 4. Variation of DOFWave with different MLA pitch p

From the results shown above, we can see that the values of DOFWave highly depend on focal length f, gap g, and micro-lens pitch p. The variations of DOFWave are not monotonic as we change these parameters, and the optimal DOFWave values can be determined by multi-parameter optimization [48,49].

5. Experimental results

According to the above analyses, we designed and implemented an experiment to show the accuracy of the proposed method. To avoid light diffraction or aberration effects in the pickup process, we used Autodesk 3ds Max 2012 software to generate the elemental image array, and the obtained elemental image array was printed on a piece of high-quality photo paper with a Brother HL-4150CDN color printer to implement the optical display. The reconstructed 3D images were then captured by a Canon EOS 5D Mark II camera at a viewing distance of d = 2500mm. Table 5 shows the parameters used in the pickup and display experiments.


Table 5. Parameters used in the pickup and display experiments

To verify the proposed DOF calculation method, we compared the DOF calculation results with those calculated by the geometrical optics and Gaussian beam methods, as shown in Table 6. With given parameters, we calculated three groups of DOF values using the three different DOF calculation methods. Here, DOFGeom, DOFGauss, and DOFWave refer to the DOF derived by the geometrical optics method, the Gaussian beam method, and the wave optics method, respectively.


Table 6. DOF calculation results obtained from geometrical optics, Gaussian beam, and wave optics methods

We set up a 3D scene which consists of seven colorful “florets” located at seven different depth positions with Autodesk 3ds Max 2012 software, as shown in Fig. 5. According to the DOF calculation results shown in Table 6 and the locations of the seven different “florets” given in Fig. 5, “florets” number 1 to 7 are within the range of DOFGauss, “florets” number 2 to 6 are within the range of DOFGeom, and “florets” number 3 to 5 are within the range of DOFWave.

 

Fig. 5 Schematic, not to scale, of the pickup process of RPII conducted in Autodesk 3ds Max 2012. Here, M × N = 110 × 110, f = 3.3mm, and p = 1.0mm.


The pickup process was conducted in a computer without the influence of diffraction or aberration effects, as shown in Fig. 5. The parameters used in the pickup process are listed in Table 5 and the generated elemental image array contains 110 × 110 elemental images, each with a resolution of 40 × 40 pixels, as given in Fig. 6(a).

 

Fig. 6 (a) Obtained elemental image array, and (b) experimental setup of the RPII display setup. Here, M × N = 110 × 110, H × V = 9 × 9, f = 3.3mm, p = 1.0mm, g = 3.7mm, and the resolution of each elemental image is 40 × 40 pixels.


In the optical 3D display process, as shown in Fig. 6(b), the elemental image array was printed on a piece of paper (as our display) and placed behind the MLA, which has identical parameters with the previous one used in the pickup process, with a separation of g = 3.7mm. Other parameters are given in Table 5. Figure 7 shows the captured display results.

 

Fig. 7 (a) Different perspectives of the reconstructed 3D image, (b) enlarged view of the center viewpoint, and (c) 2D image of the original 3D “florets” number 4 for reference.


From the experimental results, we can see that the reconstructed 3D “florets” number 1, 2, 6, and 7, which are located within the theoretically predicted range of DOFGauss or DOFGeom but outside the range of DOFWave, appear blurry. However, “florets” number 3, 4, and 5, which are located within the theoretically predicted range of DOFWave and also within the ranges of both DOFGauss and DOFGeom, appear clearer and smoother. Moreover, by comparing Fig. 7(b) with Fig. 7(c), we can see that 3D “florets” number 3, 4, and 5 are reconstructed more correctly, with less distortion, than the remaining ones.

The results demonstrate that multiple micro-lenses and the diffraction effect strongly influence the DOF of an RPII display. Even though we have neglected the pixel size of the display, the proposed DOF calculation method based on wave optics is more accurate than both the geometrical optics and Gaussian beam methods, as the experimental results confirm.

6. Conclusion

In this paper, we analyzed the DOF of the integral imaging display with multiple micro-lenses using wave optics. We first derived the light intensity distribution with multiple micro-lenses, and then calculated the DOF by combining hyperbola fitting of the diffraction intensity patterns with the minimum angular resolution of human eyes. With given system parameters, we determined several groups of DOF values and analyzed their variations under different situations. Finally, optical experimental results confirmed the accuracy of the proposed method. In future work, other parameters such as viewing resolution and viewing angle will be studied, as well as the pixel size of the display panel. These analyses can be beneficial for researchers to better understand and develop integral imaging displays.

Acknowledgments

This work is supported by the “973” Program under Grant No. 2013CB328802, the NSFC under Grant Nos. 61225022 and 61036008, and the “863” Program under Grant Nos. 2012AA011901 and 2012AA03A301. M. Martinez-Corral acknowledges financial support from projects DPI2012-32994 and PROMETEO2009-077. This work is also supported in part by Samsung Electronics Inc. under the Samsung Advanced Institute of Technology (SAIT) Global Research Outreach (GRO) Program.

References and links

1. G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. 146, 446–451 (1908).

2. M. Cho, M. Daneshpanah, I. Moon, and B. Javidi, “Three-dimensional optical sensing and visualization using integral imaging,” Proc. IEEE 99(4), 556–575 (2011).

3. C. C. Ji, C. G. Luo, H. Deng, D. H. Li, and Q. H. Wang, “Tilted elemental image array generation method for moiré-reduced computer generated integral imaging display,” Opt. Express 21(17), 19816–19824 (2013).

4. X. Xiao, B. Javidi, M. Martínez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. 52(4), 546–560 (2013).

5. Y. Takaki, K. Tanaka, and J. Nakamura, “Super multi-view display with a lower resolution flat-panel display,” Opt. Express 19(5), 4129–4139 (2011).

6. J. Nakamura, T. Takahashi, C. W. Chen, Y. P. Huang, and Y. Takaki, “Analysis of longitudinal viewing freedom of reduced-view super multi-view display and increased longitudinal viewing freedom using eye-tracking technique,” J. Soc. Inf. Disp. 20(4), 228–234 (2012).

7. S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).

8. H. E. Ives, “Optical properties of a Lippman lenticulated sheet,” J. Opt. Soc. Am. A 21(3), 171–176 (1931).

9. C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. A 58(1), 71–74 (1968).

10. T. Okoshi, Three-Dimensional Imaging Techniques (Academic, 1976).

11. L. Yang, M. McCormick, and N. Davies, “Discussion of the optics of a new 3-D imaging system,” Appl. Opt. 27(21), 4529–4534 (1988).

12. F. Okano, J. Arai, K. Mitani, and M. Okui, “Real-time integral imaging based on extremely high resolution video system,” Proc. IEEE 94(3), 490–501 (2006).

13. J. Arai, F. Okano, H. Hoshino, and I. Yuyama, “Gradient-index lens-array method based on real-time integral photography for three-dimensional images,” Appl. Opt. 37(11), 2034–2045 (1998).

14. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A 15(8), 2059–2065 (1998).

15. T. Mishina, “3D television system based on integral photography,” in Picture Coding Symposium (PCS) (Nagoya, Japan, 2010), p. 20.

16. J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, “Integral three-dimensional television using a 33-megapixel imaging system,” J. Disp. Technol. 6(10), 422–430 (2010).

17. D. Aloni, A. Stern, and B. Javidi, “Three-dimensional photon counting integral imaging reconstruction using penalized maximum likelihood expectation maximization,” Opt. Express 19(20), 19681–19687 (2011).

18. M. Cho and B. Javidi, “Three-dimensional visualization of objects in turbid water using integral imaging,” J. Disp. Technol. 6(10), 544–547 (2010).

19. B. Javidi, I. Moon, and S. Yeom, “Three-dimensional identification of biological microorganism using integral imaging,” Opt. Express 14(25), 12096–12108 (2006).

20. J. Arai, H. Hoshino, M. Okui, and F. Okano, “Effects of focusing on the resolution characteristics of integral photography,” J. Opt. Soc. Am. A 20(6), 996–1004 (2003).

21. A. Ö. Yöntem and L. Onural, “Integral imaging using phase-only LCoS spatial light modulators as Fresnel lenslet arrays,” J. Opt. Soc. Am. A 28(11), 2359–2375 (2011).

22. F. Okano, J. Arai, H. Hoshino, and I. Yuyama, “Three-dimensional video system based on integral photography,” Opt. Eng. 38(6), 1072–1077 (1999).

23. J. H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48(34), H77–H94 (2009).

24. R. Martinez-Cuenca, G. Saavedra, M. Martinez-Corral, and B. Javidi, “Progress in 3-D multiperspective display by integral imaging,” Proc. IEEE 97(6), 1067–1077 (2009).

25. R. Yang, X. Huang, S. Li, and C. Jaynes, “Toward the light field display: Autostereoscopic rendering via a cluster of projectors,” IEEE Trans. Vis. Comput. Graph. 14(1), 84–96 (2008).

26. B. Javidi, S. H. Hong, and O. Matoba, “Multidimensional optical sensor and imaging system,” Appl. Opt. 45(13), 2986–2994 (2006).

27. H. Liao, N. Hata, S. Nakajima, M. Iwahara, I. Sakuma, and T. Dohi, “Surgical navigation by autostereoscopic image overlay of integral videography,” IEEE Trans. Inf. Technol. Biomed. 8(2), 114–121 (2004).

28. H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. 7(5), 255–258 (2011).

29. J. Y. Son, W. H. Son, S. K. Kim, K. H. Lee, and B. Javidi, “Three-dimensional imaging for creating real-world-like environments,” Proc. IEEE 101(1), 190–205 (2013).

30. Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978).

31. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006).

32. C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).

33. R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martinez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Opt. Express 15(24), 16255–16260 (2007).

34. C. W. Chen, Y. P. Huang, P. Y. Hsieh, T. H. Jen, Y. C. Chang, Y. R. Su, M. Cho, X. Xiao, and B. Javidi, “Enlarged viewing angle of integral imaging system by liquid crystal prism,” Proc. SID Symp. Dig. 44(1), 231–234 (2013).

35. H. Navarro, J. C. Barreiro, G. Saavedra, M. Martínez-Corral, and B. Javidi, “High-resolution far-field integral-imaging camera by double snapshot,” Opt. Express 20(2), 890–895 (2012).

36. R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “Extended depth-of-field 3-D display and visualization by combination of amplitude-modulated microlenses and deconvolution tools,” J. Disp. Technol. 1(2), 321–327 (2005).

37. S. W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), L71–L74 (2005).

38. C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, “Depth calculation method of integral imaging based on Gaussian beam distribution model,” J. Disp. Technol. 8(2), 112–116 (2012).

39. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005), Chap. 5.

40. F. Okano, J. Arai, and M. Kawakita, “Wave optical analysis of integral method for three-dimensional images,” Opt. Lett. 32(4), 364–366 (2007).

41. F. A. Jenkins and H. E. White, Fundamentals of Optics, 4th ed., S. Grall, ed. (McGraw-Hill, 2001), Part 1.

42. M. Born and E. Wolf, Principles of Optics, 7th ed. (Cambridge University, 1999), Chap. VIII.

43. J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28(20), 1924–1926 (2003).

44. J. E. Greivenkamp, “Airy Disk,” in Field Guide to Geometrical Optics (SPIE, 2004).

45. B. Javidi, F. Okano, and J. Y. Son, Three-Dimensional Imaging, Visualization, and Display (Springer, 2009).

46. G. Park, J. H. Jung, K. Hong, Y. Kim, Y. H. Kim, S. W. Min, and B. Lee, “Multi-viewer tracking integral imaging system and its viewing zone analysis,” Opt. Express 17(20), 17895–17908 (2009).

47. ITU-R Rec. BT.1438, Subjective assessment of stereoscopic television pictures (2000).

48. M. Cho and B. Javidi, “Optimization of 3D integral imaging system parameters,” J. Disp. Technol. 8(6), 357–360 (2012).

49. D. Shin, M. Daneshpanah, and B. Javidi, “Generalization of three-dimensional N-ocular imaging systems under fixed resource constraints,” Opt. Lett. 37(1), 19–21 (2012).

    [Crossref]
  32. C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).
    [Crossref]
  33. R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martinez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Opt. Express 15(24), 16255–16260 (2007).
    [Crossref] [PubMed]
  34. C. W. Chen, Y. P. Huang, P. Y. Hsieh, T. H. Jen, Y. C. Chang, Y. R. Su, M. Cho, X. Xiao, and B. Javidi, “Enlarged viewing angle of integral imaging system by liquid crystal prism,” Proc. SID Symp. Dig. 44(1), 231–234 (2013).
  35. H. Navarro, J. C. Barreiro, G. Saavedra, M. Martínez-Corral, and B. Javidi, “High-resolution far-field integral-imaging camera by double snapshot,” Opt. Express 20(2), 890–895 (2012).
    [Crossref] [PubMed]
  36. R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “Extended depth-of-field 3-D display and visualization by combination of amplitude-modulated microlenses and deconvolution tools,” J. Disp. Technol. 1(2), 321–327 (2005).
    [Crossref]
  37. S. W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), L71–L74 (2005).
    [Crossref]
  38. C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, “Depth calculation method of integral imaging based on Gaussian beam distribution model,” J. Disp. Technol. 8(2), 112–116 (2012).
    [Crossref]
  39. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005), Chap. 5.
  40. F. Okano, J. Arai, and M. Kawakita, “Wave optical analysis of integral method for three-dimensional images,” Opt. Lett. 32(4), 364–366 (2007).
    [Crossref] [PubMed]
  41. F. A. Jenkins and H. E. White, Fundamentals of Optics, 4th ed., S. Grall, ed. (McGraw-Hill, 2001), Part 1.
  42. M. Born and E. Wolf, Principle of Optic, 7th ed. (Cambridge University, 1999), Chap. VIII.
  43. J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28(20), 1924–1926 (2003).
    [Crossref] [PubMed]
  44. J. E. Greivenkamp, “Airy Disk,” in Field Guide to Geometrical Optics (SPIE, 2004).
  45. B. Javidi, F. Okano, and J. Y. Son, Three-Dimensional Imaging, Visualization, and Display (Springer, 2009).
  46. G. Park, J. H. Jung, K. Hong, Y. Kim, Y. H. Kim, S. W. Min, and B. Lee, “Multi-viewer tracking integral imaging system and its viewing zone analysis,” Opt. Express 17(20), 17895–17908 (2009).
    [Crossref] [PubMed]
  47. ITU-R Rec. BT.1438, Subjective assessment of stereoscopic television pictures (2000).
  48. M. Cho and B. Javidi, “Optimization of 3D integral imaging system parameters,” J. Disp. Technol. 8(6), 357–360 (2012).
    [Crossref]
  49. D. Shin, M. Daneshpanah, and B. Javidi, “Generalization of three-dimensional N-ocular imaging systems under fixed resource constraints,” Opt. Lett. 37(1), 19–21 (2012).
    [Crossref] [PubMed]


Y. Takaki, K. Tanaka, and J. Nakamura, “Super multi-view display with a lower resolution flat-panel display,” Opt. Express 19(5), 4129–4139 (2011).
[Crossref] [PubMed]

Tanaka, K.

Tao, Y. H.

H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. 7(5), 255–258 (2011).
[Crossref]

Tay, S.

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Thomas, J.

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Tunç, A. V.

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Ueda, M.

Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978).
[Crossref]

Voorakaranam, R.

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Wang, F. N.

C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).
[Crossref]

C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, “Depth calculation method of integral imaging based on Gaussian beam distribution model,” J. Disp. Technol. 8(2), 112–116 (2012).
[Crossref]

H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. 7(5), 255–258 (2011).
[Crossref]

Wang, P.

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Wang, Q. H.

C. C. Ji, C. G. Luo, H. Deng, D. H. Li, and Q. H. Wang, “Tilted elemental image array generation method for moiré-reduced computer generated integral imaging display,” Opt. Express 21(17), 19816–19824 (2013).
[Crossref] [PubMed]

C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).
[Crossref]

C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, “Depth calculation method of integral imaging based on Gaussian beam distribution model,” J. Disp. Technol. 8(2), 112–116 (2012).
[Crossref]

H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. 7(5), 255–258 (2011).
[Crossref]

Wang, Y. Z.

C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).
[Crossref]

Xiao, X.

C. W. Chen, Y. P. Huang, P. Y. Hsieh, T. H. Jen, Y. C. Chang, Y. R. Su, M. Cho, X. Xiao, and B. Javidi, “Enlarged viewing angle of integral imaging system by liquid crystal prism,” Proc. SID Symp. Dig. 44(1), 231–234 (2013).

X. Xiao, B. Javidi, M. Martínez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. 52(4), 546–560 (2013).
[Crossref] [PubMed]

Yamamoto, M.

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Yang, L.

Yang, R.

R. Yang, X. Huang, S. Li, and C. Jaynes, “Toward the light field display: Autostereoscopic rendering via a cluster of projectors,” IEEE Trans. Vis. Comput. Graph. 14(1), 84–96 (2008).
[Crossref] [PubMed]

Yeom, S.

Yöntem, A. Ö.

Yoshimura, M.

J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, “Integral three-dimensional television using a 33-megapixel imaging system,” J. Disp. Technol. 6(10), 422–430 (2010).
[Crossref]

Yuyama, I.

Appl. Opt. (5)

C. R. Acad. Sci. (1)

G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. 146, 446–451 (1908).

IEEE Trans. Inf. Technol. Biomed. (1)

H. Liao, N. Hata, S. Nakajima, M. Iwahara, I. Sakuma, and T. Dohi, “Surgical navigation by autostereoscopic image overlay of integral videography,” IEEE Trans. Inf. Technol. Biomed. 8(2), 114–121 (2004).
[Crossref] [PubMed]

IEEE Trans. Vis. Comput. Graph. (1)

R. Yang, X. Huang, S. Li, and C. Jaynes, “Toward the light field display: Autostereoscopic rendering via a cluster of projectors,” IEEE Trans. Vis. Comput. Graph. 14(1), 84–96 (2008).
[Crossref] [PubMed]

J. Disp. Technol. (7)

C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).
[Crossref]

R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “Extended depth-of-field 3-D display and visualization by combination of amplitude-modulated microlenses and deconvolution tools,” J. Disp. Technol. 1(2), 321–327 (2005).
[Crossref]

H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. 7(5), 255–258 (2011).
[Crossref]

J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, “Integral three-dimensional television using a 33-megapixel imaging system,” J. Disp. Technol. 6(10), 422–430 (2010).
[Crossref]

M. Cho and B. Javidi, “Three-dimensional visualization of objects in turbid water using integral imaging,” J. Disp. Technol. 6(10), 544–547 (2010).
[Crossref]

C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, “Depth calculation method of integral imaging based on Gaussian beam distribution model,” J. Disp. Technol. 8(2), 112–116 (2012).
[Crossref]

M. Cho and B. Javidi, “Optimization of 3D integral imaging system parameters,” J. Disp. Technol. 8(6), 357–360 (2012).
[Crossref]

J. Opt. Soc. Am. A (5)

J. Soc. Inf. Disp. (1)

J. Nakamura, T. Takahashi, C. W. Chen, Y. P. Huang, and Y. Takaki, “Analysis of longitudinal viewing freedom of reduced-view super multi-view display and increased longitudinal viewing freedom using eye-tracking technique,” J. Soc. Inf. Disp. 20(4), 228–234 (2012).
[Crossref]

Jpn. J. Appl. Phys. (2)

S. W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), L71–L74 (2005).
[Crossref]

Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978).
[Crossref]

Nature (1)

S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).
[Crossref] [PubMed]

Opt. Eng. (1)

F. Okano, J. Arai, H. Hoshino, and I. Yuyama, “Three-dimensional video system based on integral photography,” Opt. Eng. 38(6), 1072–1077 (1999).
[Crossref]

Opt. Express (7)

Opt. Lett. (3)

Proc. IEEE (5)

F. Okano, J. Arai, K. Mitani, and M. Okui, “Real-time integral imaging based on extremely high resolution video system,” Proc. IEEE 94(3), 490–501 (2006).
[Crossref]

M. Cho, M. Daneshpanah, I. Moon, and B. Javidi, “Three-dimensional optical sensing and visualization using integral imaging,” Proc. IEEE 99(4), 556–575 (2011).
[Crossref]

A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006).
[Crossref]

R. Martinez-Cuenca, G. Saavedra, M. Martinez-Corral, and B. Javidi, “Progress in 3-D multiperspective display by integral imaging,” Proc. IEEE 97(6), 1067–1077 (2009).
[Crossref]

J. Y. Son, W. H. Son, S. K. Kim, K. H. Lee, and B. Javidi, “Three-dimensional imaging for creating real-world-like environments,” Proc. IEEE 101(1), 190–205 (2013).
[Crossref]

Proc. SID Symp. Dig. (1)

C. W. Chen, Y. P. Huang, P. Y. Hsieh, T. H. Jen, Y. C. Chang, Y. R. Su, M. Cho, X. Xiao, and B. Javidi, “Enlarged viewing angle of integral imaging system by liquid crystal prism,” Proc. SID Symp. Dig. 44(1), 231–234 (2013).

Other (8)

T. Okoshi, Three-Dimensional Imaging Techniques (Academic, 1976).

T. Mishina, “3D television system based on integral photography,” in Picture Coding Symposium (PCS) (Nagoya, Japan, 2010), p. 20.
[Crossref]

F. A. Jenkins and H. E. White, Fundamentals of Optics, 4th ed., S. Grall, ed. (McGraw-Hill, 2001), Part 1.

M. Born and E. Wolf, Principle of Optic, 7th ed. (Cambridge University, 1999), Chap. VIII.

J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005), Chap. 5.

J. E. Greivenkamp, “Airy Disk,” in Field Guide to Geometrical Optics (SPIE, 2004).

B. Javidi, F. Okano, and J. Y. Son, Three-Dimensional Imaging, Visualization, and Display (Springer, 2009).

ITU-R Rec. BT.1438, Subjective assessment of stereoscopic television pictures (2000).


Figures (7)

Fig. 1. Number of correspondence pixels to reconstruct image point A in the RPII display.
Fig. 2. Light intensity distribution of the RPII display.
Fig. 3. Cross sections and meridian sections (y = 0) of the diffraction intensity patterns on different depth planes: (a) z = 20.0 mm and (b) z = 26.0 mm. Here, f = 4.0 mm, g = 5.0 mm, p = 0.8 mm, λ̄ = 5.5 × 10⁻⁴ mm, and H × V = 5 × 5.
Fig. 4. Analysis of the DOF of the RPII display based on wave optics, taking into account the minimum angular resolution of human eyes.
Fig. 5. Schematic (not to scale) of the pickup process of RPII conducted in Autodesk 3ds Max 2012. Here, M × N = 110 × 110, f = 3.3 mm, and p = 1.0 mm.
Fig. 6. (a) Obtained elemental image array, and (b) experimental setup of the RPII display. Here, M × N = 110 × 110, H × V = 9 × 9, f = 3.3 mm, p = 1.0 mm, g = 3.7 mm, and the resolution of each elemental image is 40 × 40 pixels.
Fig. 7. (a) Different perspectives of the reconstructed 3D image, (b) enlarged view of the center viewpoint, and (c) 2D image of the original 3D “florets” number 4 for reference.

Tables (6)

Table 1. Parameters used in DOF calculation
Table 2. Variation of DOF_Wave with different focal length f
Table 3. Variation of DOF_Wave with different gap g
Table 4. Variation of DOF_Wave with different MLA pitch p
Table 5. Parameters used in the pickup and display experiments
Table 6. DOF calculation results obtained from geometrical optics, Gaussian beam, and wave optics methods

Equations (13)


$$\mathrm{DOF}_{\mathrm{Geom}}=\frac{2l^{2}}{g\times p}\,p_{d},\tag{1}$$

$$\mathrm{DOF}_{\mathrm{Gauss}}=\frac{2l\sqrt{4\tan^{2}(\alpha_{e}/2)\,d^{2}p^{2}g^{2}+4p_{d}^{2}\tan^{2}(\alpha_{e}/2)\,l^{4}-p^{2}l^{2}p_{d}^{2}}}{\left|4\tan^{2}(\alpha_{e}/2)\,l^{2}g-gp^{2}\right|},\tag{2}$$

$$w=\frac{f}{2\left|g-f\right|},\tag{3}$$

$$H\times V=\begin{cases}(2w+1)\times(2w+1), & (2w+1)\le\min(M,N)\\ M\times(2w+1), & M\le(2w+1)\le N\\ (2w+1)\times N, & N\le(2w+1)\le M\\ M\times N, & (2w+1)\ge\max(M,N),\end{cases}\tag{4}$$

$$\left(x_{mn}^{D},\,y_{mn}^{D}\right)=\left(\frac{g}{f}\,mp,\;\frac{g}{f}\,np\right),\tag{5}$$

$$P_{00}(x_{0},y_{0})=\operatorname{rect}\!\left(\frac{x_{0}}{p},\frac{y_{0}}{p}\right)=\begin{cases}1, & -\dfrac{p}{2}\le x_{0}\le\dfrac{p}{2},\ -\dfrac{p}{2}\le y_{0}\le\dfrac{p}{2}\\ 0, & \text{otherwise},\end{cases}\tag{6}$$

$$t_{mn}(x_{0},y_{0})=P_{00}(x_{0}-mp,\,y_{0}-np)\exp\!\left\{-\frac{jk}{2f}\left[(x_{0}-mp)^{2}+(y_{0}-np)^{2}\right]\right\},\tag{7}$$

$$I_{mn}(x,y;z)=\left|\frac{1}{\bar{\lambda}^{2}gz}\iint_{-\infty}^{+\infty}\exp\!\left\{\frac{jk}{2g}\left[(x_{0}-x_{mn}^{D})^{2}+(y_{0}-y_{mn}^{D})^{2}\right]\right\}t_{mn}(x_{0},y_{0})\exp\!\left\{\frac{jk}{2z}\left[(x-x_{0})^{2}+(y-y_{0})^{2}\right]\right\}\mathrm{d}x_{0}\,\mathrm{d}y_{0}\right|^{2},\tag{8}$$

$$I(x,y;z)=\sum_{m=-w}^{w}\sum_{n=-w}^{w}I_{mn}(x,y;z),\tag{9}$$

$$\int_{-r(z)}^{r(z)}I(x,0;z)\,\mathrm{d}x=0.84\int_{-\infty}^{\infty}I(x,0;z)\,\mathrm{d}x.\tag{10}$$
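The intensity model above (a spherical wave from each display pixel, the micro-lens phase, Fresnel propagation to depth z, the incoherent sum over lenses, and the 84% encircled-energy criterion) can be checked numerically. Below is a one-dimensional Python sketch using the parameters quoted in the Fig. 3 caption; the 1D reduction, the grid sizes, and the normalization are our own simplifications, not part of the paper.

```python
import numpy as np

# Parameters from the Fig. 3 caption (all lengths in mm)
wl = 5.5e-4              # mean wavelength
f, g, p = 4.0, 5.0, 0.8  # focal length, gap, micro-lens pitch
k = 2 * np.pi / wl
w = 2                    # H x V = 5 x 5 lenses -> m = -2..2

def intensity(x, z):
    """Incoherent sum over lenses of |Fresnel diffraction integral|^2 (1D sketch)."""
    total = np.zeros_like(x)
    for m in range(-w, w + 1):
        x0 = np.linspace(m * p - p / 2, m * p + p / 2, 1601)  # lens aperture
        xD = (g / f) * m * p                  # corresponding display pixel
        field = (np.exp(1j * k / (2 * g) * (x0 - xD) ** 2)      # diverging wave
                 * np.exp(-1j * k / (2 * f) * (x0 - m * p) ** 2))  # lens phase
        kern = np.exp(1j * k / (2 * z) * (x[:, None] - x0[None, :]) ** 2)
        u = (field[None, :] * kern).sum(axis=1) * (x0[1] - x0[0]) / (wl * g * z)
        total += np.abs(u) ** 2               # intensities add across lenses
    return total

def r84(x, I):
    """Half-width r(z) containing 84% of the line-integrated intensity."""
    order = np.argsort(np.abs(x))             # accumulate energy outward from x = 0
    csum = np.cumsum(I[order])
    idx = np.searchsorted(csum, 0.84 * csum[-1])
    return abs(x[order[min(idx, len(x) - 1)]])

x = np.linspace(-0.8, 0.8, 641)
r_focus = r84(x, intensity(x, 20.0))    # central depth plane l = fg/(g-f) = 20 mm
r_defocus = r84(x, intensity(x, 26.0))  # defocused plane of Fig. 3(b)
```

On the central depth plane (z = 20 mm) each lens images its pixel to the same point, so r(z) is close to the diffraction limit; at z = 26 mm the beams have diverged and the spot is much wider, mirroring Fig. 3.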
$$\frac{x^{2}}{a^{2}}-\frac{(z-l)^{2}}{b^{2}}=1,\tag{11}$$

$$x=\pm\tan\!\left(\frac{\alpha_{e}}{2}\right)(z-l-d),\tag{12}$$

$$\mathrm{DOF}_{\mathrm{Wave}}=\frac{2\sqrt{a^{2}b^{4}\tan^{2}(\alpha_{e}/2)+a^{2}b^{2}d^{2}\tan^{2}(\alpha_{e}/2)-a^{4}b^{2}}}{\left|b^{2}\tan^{2}(\alpha_{e}/2)-a^{2}\right|}.\tag{13}$$
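The hyperbola-line intersection above yields the wave-optics DOF once a and b are fitted, but the simpler geometric quantities can be reproduced directly. The sketch below evaluates the correspondence-pixel count and the geometrical-optics DOF with the parameters quoted in the figure captions; the integer truncation of w and the display pixel pitch p_d = 1.0 mm / 40 pixels = 0.025 mm are our reading of the captions, not values stated in the text.

```python
from math import floor

def central_depth(f, g):
    """Central depth plane l from the lens imaging equation 1/g + 1/l = 1/f."""
    return f * g / (g - f)

def correspondence(f, g, M, N):
    """Number of micro-lenses H x V that reconstruct the same image point."""
    w = floor(f / (2 * abs(g - f)))   # half-width of contributing lenses
    h = 2 * w + 1
    if h <= min(M, N):
        return (h, h)
    if M <= h <= N:
        return (M, h)
    if N <= h <= M:
        return (h, N)
    return (M, N)

def dof_geom(f, g, p, p_d):
    """Geometrical-optics DOF: 2 l^2 p_d / (g p)."""
    l = central_depth(f, g)
    return 2 * l ** 2 / (g * p) * p_d

print(correspondence(4.0, 5.0, 110, 110))   # -> (5, 5), the H x V of Fig. 3
print(correspondence(3.3, 3.7, 110, 110))   # -> (9, 9), the H x V of Fig. 6
print(dof_geom(3.3, 3.7, 1.0, 0.025))       # geometric DOF in mm
```

With the Fig. 3 optics (f = 4.0 mm, g = 5.0 mm) this gives w = 2 and a 5 × 5 correspondence region, and with the experimental optics (f = 3.3 mm, g = 3.7 mm) it gives 9 × 9, matching the captions.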