## Abstract

In this paper, we analyze the depth of field (DOF) of integral imaging displays based on wave optics. Taking the diffraction effect into account, we analyze the intensity distribution of light through multiple micro-lenses and derive a DOF calculation formula for the integral imaging display system. We study the variation of DOF with different system parameters. Experimental results are provided to verify the accuracy of the theoretical analysis. The analyses and experimental results presented in this paper could be beneficial for better understanding and design of integral imaging displays.

© 2013 Optical Society of America

## 1. Introduction

Integral imaging is a 3D technique that was proposed by Lippmann in 1908 [1]. It can provide full-parallax and continuous-viewing 3D images and does not require any special glasses or coherent light, such as those used in multi-view auto-stereoscopic 3D display or holography [2–7]. Integral imaging has attracted extensive attention for its research and application in the area of 3D sensing and display [8–29].

However, integral imaging displays are affected by diffraction and aberration effects in both the optical pickup and display processes. To eliminate the effects of diffraction and aberration in the optical pickup process, computer-generated integral imaging can be applied [30]. It generates elemental images by using computer graphics techniques with the parameters of a virtual micro-lens array (MLA), without a real optical system. Because of its flexibility and diffraction-free feature, computer-generated integral imaging has been applied in many fields such as entertainment and medical science. In such cases, the deterioration factor in the pickup process can be ignored and only the display process needs to be considered when analyzing the viewing performance of the system. In this paper, we analyze the depth of field (DOF) using computer-generated integral imaging.

The main parameters used to evaluate the performance of an integral imaging display are DOF, viewing resolution, and viewing angle [31–36]. In this paper, we focus on the analysis of the DOF of integral imaging displays using wave optics. First, we analyze the number of correspondence pixels that are responsible for reconstructing the same image point in 3D space. Then, by analyzing the wave propagation and light intensity distribution with multiple micro-lenses, we perform the DOF calculation based on hyperbola fitting and the minimum angular resolution of human eyes. By taking the diffraction effect into consideration and superimposing the light intensities from multiple micro-lenses, we can obtain more accurate DOF calculation results than the previous methods for the integral imaging display. Experimental results are also given to verify the accuracy of the proposed method.

This paper is organized as follows. In section 2, we review two pre-existing DOF calculation methods based on geometrical optics and the Gaussian beam distribution model [37,38]. In section 3, we propose a DOF calculation method using wave optics [39,40]. In section 4, we calculate several groups of DOF values using our proposed method and analyze the variations of DOF with different system parameters. In section 5, the accuracy of the method is verified by comparing the experimental results of the proposed method with those of previous studies. In section 6, we summarize the main achievements of this paper and conclude with future directions and outlook.

## 2. Previous DOF calculation methods

One of the most important parameters that set 3D displays apart from 2D displays is the DOF. To understand the DOF of integral imaging displays, Min et al. [37] performed an analysis using geometrical optics without considering the diffraction effect caused by the MLA. Because the integrated image is out of focus when it is located away from the central depth plane (CDP), the DOF can be defined as the distance between the rear and front marginal depth planes, where ray-optical focusing error occurs due to the overlap of the image pixels:

where *p* and *p _{d}* represent the pitch of the MLA and the pixel size of the display, respectively. Parameter *g* represents the gap between the display and the MLA, and *l* represents the conjugate distance defined by the Gauss lens law [41].

A DOF calculation method based on the Gaussian beam distribution model was also proposed [38]. In that model, the light emanating from a single pixel on the display was considered as a Gaussian beam whose waist is located at the CDP. Taking into account the minimum angular resolution of human eyes (*α _{e}*) [42], the DOF was obtained by calculating the locations of the rear and front marginal depth planes based on the Gaussian beam function:

where *d* refers to the viewing distance between the observer and the CDP.

These DOF calculation methods are easy to use but suffer from low accuracy, because they both neglect the diffraction effect and take only a single micro-lens into consideration. Even though the second method tries to approximate the actual light intensity distribution in the image space, its Gaussian beam functions are derived from the geometrical relationships of the integral imaging display, which limits the accuracy of the resulting DOF values.

## 3. DOF calculation methods based on wave optics

Here, we propose another approach for DOF calculation using wave optics. First, we analyze the number of correspondence pixels that are responsible for reconstructing the same image point in 3D space. Then, by analyzing the wave propagation and light intensity distribution with multiple micro-lenses, we perform the DOF calculation based on hyperbola fitting and the minimum angular resolution of human eyes.

Note that for an integral imaging display, the pixel size of the display device is a factor that strongly influences the DOF [37,38]. However, in our analysis, we regard the display pixels as ideal points for simplicity. We only consider square micro-lenses in this paper, but it is not difficult to extend our proposed method to other micro-lens shapes. In addition, our studies are based on resolution priority integral imaging (RPII) displays [43], while the derivations for depth priority integral imaging (DPII) displays [43] are similar.

#### 3.1 Number of correspondence pixels of the RPII display

For a RPII display, the correspondence pixels are located on different elemental images and
imaged by their corresponding micro-lenses to reconstruct the same image point on the conjugate
plane, as shown in Fig. 1. Suppose the MLA contains *M* × *N* micro-lenses, and
the central micro-lens is denoted as the 0th micro-lens. To describe the light propagation in
the image space, we set up a coordinate system *xyz* in which the
*z* axis coincides with the optical axis of the 0th micro-lens and the origin is located at the center of that lens. Also, a plane coordinate system
*x*_{0}*y*_{0} is utilized to describe the MLA
plane (*z* = 0).

We take image point *A* located on the *z* axis as an example. Assuming the *w*^{th} elemental image is the critical one that contains the correspondence pixel for image point *A*, according to the geometric relationships, *w* is given by

where ⌊*x*⌋ denotes the greatest integer less than or equal to *x*, and *f* and *g* are defined as the focal length and the gap between the display and the MLA, respectively.

The number of correspondence pixels (*H* × *V*) for image point *A* is given by

where min(*M*, *N*) and max(*M*, *N*) give the smaller and the larger values of *M* and *N*, respectively.

Among the *H* × *V* correspondence pixels on the display panel $\left({x}^{D},{y}^{D}\right)$, as shown in Fig. 1, the coordinates of an arbitrary correspondence pixel $D\left({x}_{mn}^{D},{y}_{mn}^{D}\right)$ can be expressed with the system parameters as

where *m* = -*w*, …, -1, 0, 1, …, *w* and *n* = -*w*, …, -1, 0, 1, …, *w* are the indices of the micro-lenses in the MLA corresponding to the correspondence pixel $D\left({x}_{mn}^{D},{y}_{mn}^{D}\right)$.
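
The counting above can be sketched numerically. This is a minimal sketch under stated assumptions: since Eqs. (3)–(5) are not reproduced here, the relations used below, namely *l* = *fg*/(*g* − *f*) from the Gauss lens law, *w* = ⌊*l*/(2*g*)⌋, and the on-axis correspondence-pixel position *mp*(1 + *g*/*l*), are geometric assumptions chosen to be consistent with the 5 × 5 example quoted in Sec. 3.3 (*f* = 4.0 mm, *g* = 5.0 mm).

```python
import math

def correspondence_count(f, g, M, N):
    """Index w of the critical elemental image and the number of
    correspondence pixels H x V for an on-axis image point A.

    Assumes l = f*g/(g - f) (Gauss lens law) and w = floor(l/(2*g)),
    with the count clipped by the MLA size M x N."""
    l = f * g / (g - f)                # conjugate distance of the RPII display
    w = math.floor(l / (2 * g))        # critical elemental image index
    H = V = min(2 * w + 1, min(M, N))  # assuming (2w+1) <= min(M, N)
    return w, H, V

def correspondence_pixel(m, n, f, g, p):
    """Assumed position of the correspondence pixel D for lens indices (m, n)
    and an image point on the z axis (a geometric relation, hypothetical here,
    not the paper's general Eq. (5))."""
    l = f * g / (g - f)
    s = 1 + g / l                      # scale of the pixel grid on the display
    return (m * p * s, n * p * s)
```

For the display of Sec. 3.3 (*f* = 4.0 mm, *g* = 5.0 mm), this gives *l* = 20 mm, *w* = 2, and *H* × *V* = 5 × 5, matching the example there.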

#### 3.2 Light intensity distribution of the RPII display

After obtaining the number of correspondence pixels, we will analyze the light intensity distribution in the image space of the RPII display.

For the RPII display shown in Fig. 2, the pupil function for the central micro-lens can be given as

The pupil function for other micro-lenses can be expressed as ${P}_{mn}\left({x}_{0},{y}_{0}\right)=$ ${P}_{00}\left({x}_{0}-mp,{y}_{0}-np\right)$.

Assuming that the corresponding pixel $D\left({x}_{mn}^{D},{y}_{mn}^{D}\right)$ is a *δ* function (point source), the phase transformation of the corresponding micro-lens is given by

For a single correspondence pixel $D\left({x}_{mn}^{D},{y}_{mn}^{D}\right)$, we can obtain the light intensity distribution at an arbitrary point *C*(*x*, *y*, *z*) on a certain depth plane by using the paraxial approximation and the Fresnel diffraction theory, as given below:

where *k* is the wave number, given by $k=2\pi /\overline{\lambda}$.

For multiple correspondence pixels, all the intensity distributions can be superimposed linearly. Therefore, the light intensity distribution of the RPII display can be expressed as
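
The propagation and superposition described in this subsection can be sketched numerically in one dimension. This is a hedged illustration, not the paper's exact code: the correspondence-pixel positions *mp*(1 + *g*/*l*) and the relation *w* = ⌊*l*/(2*g*)⌋ are assumed geometric relations (consistent with the 5 × 5 example in Sec. 3.3), and the Fresnel integral is evaluated by direct numerical quadrature.

```python
import numpy as np

# Example parameters from Sec. 3.3 (all lengths in mm)
f, g, p = 4.0, 5.0, 0.8        # focal length, gap, micro-lens pitch
lam = 5.5e-4                   # mean wavelength
k = 2 * np.pi / lam            # wave number, k = 2*pi/lambda
l = f * g / (g - f)            # conjugate distance (Gauss lens law): 20.0 mm
w = int(l // (2 * g))          # assumed relation for the critical index w

def intensity(x, z):
    """Light intensity at point (x, z): incoherent superposition over the
    2w+1 correspondence pixels, each a point source behind its micro-lens."""
    I = 0.0
    for m in range(-w, w + 1):
        xD = m * p * (1 + g / l)                   # assumed pixel position
        x0 = np.linspace(m * p - p / 2, m * p + p / 2, 2001)  # lens aperture
        # spherical wave from D + thin-lens phase + Fresnel propagation to z
        phase = (k / (2 * g)) * (x0 - xD) ** 2 \
              - (k / (2 * f)) * (x0 - m * p) ** 2 \
              + (k / (2 * z)) * (x - x0) ** 2
        U = np.sum(np.exp(1j * phase)) * (x0[1] - x0[0])  # Fresnel integral
        I += np.abs(U) ** 2                               # intensities add linearly
    return I
```

On the conjugate plane (*z* = *l* = 20 mm) the quadratic phase terms cancel and the on-axis intensity peaks; at *z* = 26 mm the same on-axis point is much dimmer, consistent with the defocused pattern of Fig. 3(b).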

#### 3.3 DOF of the RPII display

In the following parts, we only take into account one dimension for simplicity, and the discussions for the other dimension are the same due to the square shapes of micro-lenses.

For a given RPII display, the diffraction intensity patterns on different depth planes can be obtained by using Eqs. (8) and (9). Figures 3(a) and 3(b) show the diffraction intensity patterns at the depth planes *z* = 20.0 mm (the conjugate plane) and *z* = 26.0 mm with parameters *f* = 4.0 mm, *g* = 5.0 mm, *p* = 0.8 mm, $\overline{\lambda}=5.5\times {10}^{-4}$ mm, and the number of correspondence pixels *H* × *V* = 5 × 5.

For different depth planes, the corresponding diffraction intensity patterns can be calculated as shown in Fig. 3. For a given diffraction intensity pattern, we replace the entire pattern with a square light spot that contains a substantial percentage (84%) of the whole pattern energy [44]. The diffraction intensity pattern can then be simplified as a square light spot with a half side length *r*(*z*), which can be calculated as
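
A sketch of this 84%-energy criterion for a sampled one-dimensional intensity profile (a hedged reading of Eq. (10), which is not reproduced above): grow a square window symmetrically about the intensity centroid until it contains the target energy fraction.

```python
import numpy as np

def half_side_length(x, I, fraction=0.84):
    """Half side length r(z) of the square spot containing `fraction` of the
    total energy of a sampled 1-D intensity pattern I(x) on a uniform grid."""
    x = np.asarray(x, dtype=float)
    I = np.asarray(I, dtype=float)
    c = np.sum(x * I) / np.sum(I)          # intensity centroid of the pattern
    order = np.argsort(np.abs(x - c))      # samples from the centre outwards
    cum = np.cumsum(I[order])              # energy inside a growing window
    idx = np.searchsorted(cum, fraction * cum[-1])
    return np.abs(x - c)[order][idx]       # smallest r reaching the fraction
```

For a Gaussian profile with standard deviation σ, for example, this returns r ≈ 1.4σ, since the fraction of a 1-D Gaussian within ±r is erf(r/√2).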

The discrete data of the half side length *r*(*z*) are obtained and fitted to a hyperbola function whose waist is located on the conjugate plane, as denoted by the red lines in Fig. 4. Taking into consideration the minimum angular resolution of human eyes (*α _{e}*), we can calculate the DOF using the intersections of the hyperbola with the margin lines of *α _{e}* (blue lines shown in Fig. 4). Here, *d* represents the viewing distance between the eye and the conjugate plane.

After obtaining the hyperbola functions, we can deduce the DOF as follows. The hyperbola fitting function of the light beam can be written as

where *a* and *b* are given by the fitting results of the hyperbola light beam.

The margin lines of the minimum angular resolution of human eyes (*α _{e}*) are given by the following equation:

Combining Eqs. (11) and (12), we can confirm the locations of the rear and front marginal depth planes. Therefore, the DOF of the RPII display can be defined as the distance between the two marginal depth planes:
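
Since Eqs. (11)–(13) themselves are not reproduced above, the following sketch uses an assumed but standard hyperbola parameterization, *r*(*z*) = *a*·sqrt(1 + ((*z* − *z_c*)/*b*)²), and takes the eye-resolution margin as a half side length *r_e* = *d*·tan(*α _{e}*/2); under these assumptions the distance between the two marginal depth planes follows in closed form.

```python
import math

def dof_wave(a, b, alpha_e, d):
    """DOF as the distance between the rear and front marginal depth planes,
    i.e. between the two intersections of the fitted hyperbola
    r(z) = a*sqrt(1 + ((z - z_c)/b)**2) with the margin r_e = d*tan(alpha_e/2).

    a, b    : hyperbola fitting results (waist half-width, depth scale)
    alpha_e : minimum angular resolution of the eye [rad]
    d       : viewing distance between the eye and the conjugate plane
    """
    r_e = d * math.tan(alpha_e / 2.0)  # half side length the eye can resolve
    if r_e <= a:                        # spot never falls below the margin:
        return 0.0                      # no depth range appears sharp
    # solve a*sqrt(1 + ((z - z_c)/b)**2) = r_e for the two z values
    return 2.0 * b * math.sqrt((r_e / a) ** 2 - 1.0)
```

Consistent with the behavior discussed in Sec. 4, the returned DOF grows monotonically with the viewing distance *d*.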

## 4. DOF calculation results based on wave optics

After obtaining the DOF calculation formula, we can use it to analyze the variations of DOF with different parameters. In the calculation process, we assume that the number of correspondence pixels (*H* × *V*) always meets the requirement $\left(2w+1\right)\le \mathrm{min}\left(M,N\right)$. Note that *H* × *V* is determined by the focal length *f* and the gap *g* according to Eqs. (3) and (4): once *f* and *g* are given, *H* × *V* is fixed. The relationship between the viewing distance
*d* and *DOF _{Wave}* is given by Eq. (13), which shows that *DOF _{Wave}* increases monotonically as *d* increases. Therefore, we will only discuss the variations of *DOF _{Wave}* with different focal length *f*, gap *g*, and MLA pitch *p* in this paper. Table 1 lists the parameters used in this section. The minimum angular resolution of human eyes is set to *α _{e}* = 1.662 × 10^{-2}° [42], the mean wavelength is $\overline{\lambda}=5.5\times {10}^{-4}$ mm, and the viewing distance is *d* = 2500 mm [45–47].

*DOF _{Wave}* values have been calculated with different specifications according to Eq. (13). Tables 2–4 show the variations of *DOF _{Wave}* with different focal length *f*, gap *g*, and MLA pitch *p*, respectively.

From the results shown above, we can see that the values of *DOF _{Wave}* depend strongly on the focal length *f*, gap *g*, and micro-lens pitch *p*. The variations of *DOF _{Wave}* are not monotonic as these parameters change, and the optimal *DOF _{Wave}* values can be determined by multi-parameter optimization [48,49].

## 5. Experimental results

According to the above analyses, we designed and implemented an experiment to show the accuracy
of the proposed method. To avoid light diffraction or aberration effects in the pickup process,
we used Autodesk 3ds Max 2012 software to generate the elemental image array, and the obtained
elemental image array was printed on a piece of high-quality photo paper with a Brother
HL-4150CDN color printer to implement the optical display. The reconstructed 3D images were then
captured by a Canon EOS 5D Mark II camera at a viewing distance of *d* = 2500 mm.
Table 5 shows the parameters used in the pickup and
display experiments.

To verify the proposed DOF calculation method, we compared the DOF calculation results with those calculated by the geometrical optics and Gaussian beam methods, as shown in Table 6. With the given parameters, we calculated three groups of DOF values using the three different DOF calculation methods. Here, *DOF _{Geom}*, *DOF _{Gauss}*, and *DOF _{Wave}* refer to the DOF derived by the geometrical optics method, the Gaussian beam method, and the wave optics method, respectively.

We set up a 3D scene which consists of seven colorful “florets” located at seven different depth positions with Autodesk 3ds Max 2012 software, as shown in Fig. 5. According to the DOF calculation results shown in Table 6 and the locations of the seven different “florets” given in Fig. 5, “florets” number 1 to 7 are within the range of *DOF _{Gauss}*, “florets” number 2 to 6 are within the range of *DOF _{Geom}*, and “florets” number 3 to 5 are within the range of *DOF _{Wave}*.

The pickup process was conducted in a computer without the influence of diffraction or aberration effects, as shown in Fig. 5. The parameters used in the pickup process are listed in Table 5, and the generated elemental image array contains 110 × 110 elemental images, each with a resolution of 40 × 40 pixels, as given in Fig. 6(a).

In the optical 3D display process, as shown in Fig. 6(b),
the elemental image array was printed on a piece of paper (as our display) and placed behind the MLA, which has parameters identical to those of the MLA used in the pickup process, with a separation of *g* = 3.7 mm. Other parameters are given in Table 5. Figure 7 shows the captured
display results.

From the experimental results, we can see that the reconstructed 3D “florets” number 1, 2, 6, and 7, which are located within the theoretically predicted range of *DOF _{Gauss}* or *DOF _{Geom}* but out of the range of *DOF _{Wave}*, appear blurry. However, “florets” number 3, 4, and 5, which are located within the theoretically predicted range of *DOF _{Wave}* and also within the ranges of both *DOF _{Gauss}* and *DOF _{Geom}*, appear clearer and smoother. What’s more, by comparing Fig. 7(b) with Fig. 7(c), we can see that 3D “florets” number 3, 4, and 5 are reconstructed more correctly and with less distortion than the others.

The results demonstrate that multiple micro-lenses and diffraction strongly affect the DOF of an RPII display. Even though we have neglected the pixel size of the display, the proposed DOF calculation method based on wave optics is more accurate than both the geometrical optics and Gaussian beam methods.

## 6. Conclusion

In this paper, we analyzed the DOF of the integral imaging display with multiple micro-lenses using wave optics. We first derived the light intensity distribution with multiple micro-lenses, and then calculated the DOF by combining hyperbola fitting of the diffraction intensity patterns with the minimum angular resolution of human eyes. With given system parameters, we determined several groups of DOF values and analyzed their variations under different situations. Finally, optical experimental results confirmed the accuracy of the proposed method. In future work, other parameters such as viewing resolution and viewing angle will be studied, as well as the pixel size of the display panel. These analyses can be beneficial for researchers to better understand and design integral imaging displays.

## Acknowledgments

This work is supported by the “973” Program under Grant No. 2013CB328802, the NSFC under Grant Nos. 61225022 and 61036008, and the “863” Program under Grant Nos. 2012AA011901 and 2012AA03A301. M. Martinez-Corral acknowledges financial support from projects DPI2012-32994 and PROMETEO2009-077. This work is also supported in part by Samsung Electronics Inc. under the Samsung Advanced Institute of Technology (SAIT) Global Research Outreach (GRO) Program.

## References and links

**1. **G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. **146**, 446–451 (1908).

**2. **M. Cho, M. Daneshpanah, I. Moon, and B. Javidi, “Three-dimensional optical sensing and visualization using integral imaging,” Proc. IEEE **99**(4), 556–575 (2011). [CrossRef]

**3. **C. C. Ji, C. G. Luo, H. Deng, D. H. Li, and Q. H. Wang, “Tilted elemental image array generation method for moiré-reduced computer generated integral imaging display,” Opt. Express **21**(17), 19816–19824 (2013). [CrossRef] [PubMed]

**4. **X. Xiao, B. Javidi, M. Martínez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. **52**(4), 546–560 (2013). [CrossRef] [PubMed]

**5. **Y. Takaki, K. Tanaka, and J. Nakamura, “Super multi-view display with a lower resolution flat-panel display,” Opt. Express **19**(5), 4129–4139 (2011). [CrossRef] [PubMed]

**6. **J. Nakamura, T. Takahashi, C. W. Chen, Y. P. Huang, and Y. Takaki, “Analysis of longitudinal viewing freedom of reduced-view super multi-view display and increased longitudinal viewing freedom using eye-tracking technique,” J. Soc. Inf. Disp. **20**(4), 228–234 (2012). [CrossRef]

**7. **S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature **451**(7179), 694–698 (2008). [CrossRef] [PubMed]

**8. **H. E. Ives, “Optical properties of a Lippman lenticulated sheet,” J. Opt. Soc. Am. A **21**(3), 171–176 (1931). [CrossRef]

**9. **C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. A **58**(1), 71–74 (1968). [CrossRef]

**10. **T. Okoshi, *Three-Dimensional Imaging Techniques* (Academic, 1976).

**11. **L. Yang, M. McCormick, and N. Davies, “Discussion of the optics of a new 3-D imaging system,” Appl. Opt. **27**(21), 4529–4534 (1988). [CrossRef] [PubMed]

**12. **F. Okano, J. Arai, K. Mitani, and M. Okui, “Real-time integral imaging based on extremely high resolution video system,” Proc. IEEE **94**(3), 490–501 (2006). [CrossRef]

**13. **J. Arai, F. Okano, H. Hoshino, and I. Yuyama, “Gradient-index lens-array method based on real-time integral photography for three-dimensional images,” Appl. Opt. **37**(11), 2034–2045 (1998). [CrossRef] [PubMed]

**14. **H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A **15**(8), 2059–2065 (1998). [CrossRef]

**15. **T. Mishina, “3D television system based on integral photography,” in Picture Coding Symposium (PCS) (Nagoya, Japan, 2010), p. 20. [CrossRef]

**16. **J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, “Integral three-dimensional television using a 33-megapixel imaging system,” J. Disp. Technol. **6**(10), 422–430 (2010). [CrossRef]

**17. **D. Aloni, A. Stern, and B. Javidi, “Three-dimensional photon counting integral imaging reconstruction using penalized maximum likelihood expectation maximization,” Opt. Express **19**(20), 19681–19687 (2011). [CrossRef] [PubMed]

**18. **M. Cho and B. Javidi, “Three-dimensional visualization of objects in turbid water using integral imaging,” J. Disp. Technol. **6**(10), 544–547 (2010). [CrossRef]

**19. **B. Javidi, I. Moon, and S. Yeom, “Three-dimensional identification of biological microorganism using integral imaging,” Opt. Express **14**(25), 12096–12108 (2006). [CrossRef] [PubMed]

**20. **J. Arai, H. Hoshino, M. Okui, and F. Okano, “Effects of focusing on the resolution characteristics of integral photography,” J. Opt. Soc. Am. A **20**(6), 996–1004 (2003). [CrossRef] [PubMed]

**21. **A. Ö. Yöntem and L. Onural, “Integral imaging using phase-only LCoS spatial light modulators as Fresnel lenslet arrays,” J. Opt. Soc. Am. A **28**(11), 2359–2375 (2011). [CrossRef] [PubMed]

**22. **F. Okano, J. Arai, H. Hoshino, and I. Yuyama, “Three-dimensional video system based on integral photography,” Opt. Eng. **38**(6), 1072–1077 (1999). [CrossRef]

**23. **J. H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. **48**(34), H77–H94 (2009). [CrossRef] [PubMed]

**24. **R. Martinez-Cuenca, G. Saavedra, M. Martinez-Corral, and B. Javidi, “Progress in 3-D multiperspective display by integral imaging,” Proc. IEEE **97**(6), 1067–1077 (2009). [CrossRef]

**25. **R. Yang, X. Huang, S. Li, and C. Jaynes, “Toward the light field display: Autostereoscopic rendering via a cluster of projectors,” IEEE Trans. Vis. Comput. Graph. **14**(1), 84–96 (2008). [CrossRef] [PubMed]

**26. **B. Javidi, S. H. Hong, and O. Matoba, “Multidimensional optical sensor and imaging system,” Appl. Opt. **45**(13), 2986–2994 (2006). [CrossRef] [PubMed]

**27. **H. Liao, N. Hata, S. Nakajima, M. Iwahara, I. Sakuma, and T. Dohi, “Surgical navigation by autostereoscopic image overlay of integral videography,” IEEE Trans. Inf. Technol. Biomed. **8**(2), 114–121 (2004). [CrossRef] [PubMed]

**28. **H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, “Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields,” J. Disp. Technol. **7**(5), 255–258 (2011). [CrossRef]

**29. **J. Y. Son, W. H. Son, S. K. Kim, K. H. Lee, and B. Javidi, “Three-dimensional imaging for creating real-world-like environments,” Proc. IEEE **101**(1), 190–205 (2013). [CrossRef]

**30. **Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. **17**(9), 1683–1684 (1978). [CrossRef]

**31. **A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE **94**(3), 591–607 (2006). [CrossRef]

**32. **C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. **8**(11), 634–638 (2012). [CrossRef]

**33. **R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martinez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Opt. Express **15**(24), 16255–16260 (2007). [CrossRef] [PubMed]

**34. **C. W. Chen, Y. P. Huang, P. Y. Hsieh, T. H. Jen, Y. C. Chang, Y. R. Su, M. Cho, X. Xiao, and B. Javidi, “Enlarged viewing angle of integral imaging system by liquid crystal prism,” Proc. SID Symp. Dig. **44**(1), 231–234 (2013).

**35. **H. Navarro, J. C. Barreiro, G. Saavedra, M. Martínez-Corral, and B. Javidi, “High-resolution far-field integral-imaging camera by double snapshot,” Opt. Express **20**(2), 890–895 (2012). [CrossRef] [PubMed]

**36. **R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “Extended depth-of-field 3-D display and visualization by combination of amplitude-modulated microlenses and deconvolution tools,” J. Disp. Technol. **1**(2), 321–327 (2005). [CrossRef]

**37. **S. W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. **44**(2), L71–L74 (2005). [CrossRef]

**38. **C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, “Depth calculation method of integral imaging based on Gaussian beam distribution model,” J. Disp. Technol. **8**(2), 112–116 (2012). [CrossRef]

**39. **J. W. Goodman, *Introduction to Fourier Optics* (Roberts and Company, 2005), Chap. 5.

**40. **F. Okano, J. Arai, and M. Kawakita, “Wave optical analysis of integral method for three-dimensional images,” Opt. Lett. **32**(4), 364–366 (2007). [CrossRef] [PubMed]

**41. **F. A. Jenkins and H. E. White, *Fundamentals of Optics*, 4th ed., S. Grall, ed. (McGraw-Hill, 2001), Part 1.

**42. **M. Born and E. Wolf, *Principles of Optics*, 7th ed. (Cambridge University, 1999), Chap. VIII.

**43. **J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. **28**(20), 1924–1926 (2003). [CrossRef] [PubMed]

**44. **J. E. Greivenkamp, “Airy Disk,” in *Field Guide to Geometrical Optics* (SPIE, 2004).

**45. **B. Javidi, F. Okano, and J. Y. Son, *Three-Dimensional Imaging, Visualization, and Display* (Springer, 2009).

**46. **G. Park, J. H. Jung, K. Hong, Y. Kim, Y. H. Kim, S. W. Min, and B. Lee, “Multi-viewer tracking integral imaging system and its viewing zone analysis,” Opt. Express **17**(20), 17895–17908 (2009). [CrossRef] [PubMed]

**47. **ITU-R Rec. BT.1438, Subjective assessment of stereoscopic television pictures (2000).

**48. **M. Cho and B. Javidi, “Optimization of 3D integral imaging system parameters,” J. Disp. Technol. **8**(6), 357–360 (2012). [CrossRef]

**49. **D. Shin, M. Daneshpanah, and B. Javidi, “Generalization of three-dimensional N-ocular imaging systems under fixed resource constraints,” Opt. Lett. **37**(1), 19–21 (2012). [CrossRef] [PubMed]