Abstract

We report on the development of a high-resolution see-through integral imaging system based on a resolution- and fill-factor-enhanced lens-array holographic optical element (HOE). We propose a procedure for fabricating a lens-array HOE with a controllable lens pitch. By controlling the recording plane and performing a repetitive recording process, the lens pitch of the lens-array HOE can be substantially reduced while maintaining a high fill factor and the same numerical aperture as the reference lens-array. We demonstrate the feasibility of the method by fabricating a lens-array HOE with a 500 micrometer pitch. Since the pixel pitch of the projected image can be easily controlled in projection-type integral imaging, the small lens pitch very effectively enhances the quality of the displayed 3D image. The enhanced visibility of the 3D images is verified experimentally.

© 2014 Optical Society of America

1. Introduction

Integral imaging is an attractive method for displaying autostereoscopic three-dimensional (3D) images [1–4]. It can be implemented using either a flat panel display or an image projector as the image source. A projection-type integral imaging system, which uses one or more image projectors, provides autostereoscopic 3D images to observers by refracting or reflecting light rays with a 3D screen [5–11]. A lens-array or mirror-array can be used as the 3D screen in projection-type integral imaging. The projection-type integral imaging system has a number of advantages. First, it is easy to control the size and pixel pitch of the projected image using relay optics [5, 9]. Second, a large-sized 3D screen can be achieved by spatially multiplexing multiple projectors [5, 8, 11, 12]. Furthermore, the resulting 3D images can be merged with a real-world scene, providing an immersive experience to the viewer, because the imaging plane can be floated in midair if a transparent 3D screen is utilized. Hong et al. recently proposed an optical see-through 3D display system in which a lens-array holographic optical element (HOE) was used [13].

However, previous research indicates some limitations of projection-type integral imaging. Most projection-type integral imaging systems adopt the focal mode integral imaging method for displaying 3D images [5–11]. In the focal mode, each lenslet in the lens-array acts as a pixel of the displayed 3D image, and consequently the spatial resolution of a reconstructed 3D image is determined not by the size of the pixels on the image projector but by the lens pitch of the lens-array. Although the pixel size of the projected image can be made much smaller than that of a typical flat panel device, the spatial resolution of the displayed 3D image is restricted by the specifications of the lens-array. A smaller lens pitch is therefore desirable for displaying 3D images with higher spatial resolution. However, fabricating a lens-array with a small lens pitch, a high numerical aperture, and a high fill factor is a difficult task, and customizing the specifications can be costly [14–17].

The image fill factor of the displayed 3D images is another important issue for projection-type integral imaging systems. In a projection-type integral imaging system without a diffusive screen, the visibility is greatly affected not only by the resolution but also by the fill factor, which is defined as the ratio between the effective image area and the entire area of the display device. In typical projection-type integral imaging, the fill factor is determined by the projection geometry and the specifications of the lens-array [18]. The reconstructed 3D image usually suffers from a low fill factor, which degrades visibility. In the case of a projection-type integral imaging system using a lens-array HOE, the image fill factor becomes particularly important because of the recording fill factor of the lens-array HOE, which is discussed below.

In this paper, we propose a method of controlling the recording plane when recording a lens-array HOE, in order to enhance the spatial resolution and fill factor of a projection-type integral imaging system using the lens-array HOE. By manipulating the recording distance between a reference lens-array and a photopolymer, it is possible to fabricate a micro lens-array HOE with a smaller lens pitch and a larger fill factor than the previous recording process provides. It has been reported that, even for the same image fill factor, visibility can be significantly enhanced if the lens pitch is smaller [18–20]. The proposed method therefore effectively enhances the visibility of 3D images in a projection-type integral imaging system, because the resolution and fill factor are enhanced at the same time. Additionally, an optical see-through property can also be achieved thanks to the angular selectivity of the volume HOE.

2. Principles

2.1. Fabrication of a lens-array HOE for a projection-type integral imaging system

The previous process used for fabricating a lens-array HOE is illustrated in Fig. 1 [13, 21, 22]. In this process, a plane wave passes through the reference lens-array to generate the signal wave for the HOE recording, giving the HOE the optical properties of a lens-array. Each lenslet of the reference lens-array refracts the incident plane wave into a spherical wave which converges at its focal point. A photopolymer is located on the backside of the lens-array, and another plane wave illuminates the photopolymer as the reference wave, as shown in Fig. 1(a). The interference pattern between the signal wave (modulated by the reference lens-array) and the reference wave is recorded on the photopolymer in the form of volume gratings [23]. After the recording process, the recorded signal wave can be regenerated by illuminating the HOE with a reference wave whose incident angle and wavelength are identical to those used in the recording process. According to the principle of holography, the reconstructed wave propagates as if a lens-array existed at the location of the reference lens-array in the recording process, as shown in Fig. 1(b).

Fig. 1 (a) Schematic diagram of previous lens-array HOE fabricating method and (b) reconstruction scheme of lens-array HOE.

The basic principle of fabricating a lens-array HOE is to duplicate the functions of the reference lens-array into the photopolymer by recording the wavefront generated by the reference lens-array. The lens pitch of the recorded lens-array HOE is therefore exactly identical to that of the reference lens-array. However, the size and focal length of the recorded lenslets are slightly smaller than those of the reference lens-array. Since both the reference lens-array and the photopolymer substrate have finite thickness, there is a gap between the plane where the plane wave starts to converge and the recording plane where the photopolymer is located, as shown in Fig. 1. The size of the effective recording area therefore varies along the propagation axis, and consequently the recorded area cannot cover the entire area of the photopolymer. This problem usually becomes more severe as the lens pitch and focal length become smaller. The ratio of the recorded area to the entire area of the photopolymer defines the recording fill factor. When the recording fill factor of the lens-array HOE is low, the fill factor of the images displayed by a projection-type integral imaging system using the lens-array HOE is also degraded. Furthermore, when a lens-array HOE with a low fill factor is used in the display procedure, the image information projected outside the recorded area is wasted.
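
The gap problem described above can be quantified with simple similar-triangles geometry: a plane wave converging over a focal length f illuminates, at a plane a distance g behind the lens, a square of side p(1 − g/f), so the recorded fraction of each lenslet cell scales as (1 − g/f)². A minimal sketch (the function name and the 0.5 mm gap value are illustrative choices of ours, not from the paper):

```python
def recording_fill_factor(p_mm, f_mm, gap_mm):
    """Fraction of a lenslet cell actually recorded on the photopolymer.

    A converging cone of focal length f_mm, launched from a square
    aperture of side p_mm, illuminates a square of side
    p_mm * (1 - gap_mm / f_mm) at a plane gap_mm behind the lens
    (similar triangles), so the recorded area fraction is that squared.
    """
    side = p_mm * (1.0 - gap_mm / f_mm)
    return (side / p_mm) ** 2

# Illustrative: 1 mm pitch, 3.3 mm focal length, 0.5 mm lens + substrate gap
print(round(recording_fill_factor(1.0, 3.3, 0.5), 3))  # 0.72
```

As the formula shows, the recording fill factor degrades faster for shorter focal lengths, consistent with the observation that the problem is more severe for small-pitch, short-focal-length arrays.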

2.2. Recording plane manipulation method for fabricating a resolution- and fill-factor-enhanced lens-array HOE

The protocol for the proposed method is as follows. First, the recording distance is set to (1 ± 1/N)f, where f denotes the focal length of the lenslet and N is an integer related to the resolution enhancing factor, and the photopolymer is located at this recording plane. Second, the photopolymer is exposed to the signal and reference waves in order to record the wavefront from the reference lens-array, as shown in Fig. 2(a). During the recording, the reference wave should be partly blocked by a mask so that the exposed area of the reference wave matches that of the signal wave. Finally, the photopolymer is shifted laterally by the recorded lens size, which is 1/N of the reference lenslet size, as shown in Fig. 2(b). The recording process is repeated N² times to fill the whole photopolymer area with lenslet HOEs during the lateral shifting process.
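
The steps above can be sketched as a short planning routine: pick the recording distance (1 ± 1/N)f and enumerate the N² lateral shift positions, each a multiple of the recorded lens size p/N. This is a sketch under our own naming, not the authors' stage-control code:

```python
def recording_plan(p, f, N, diverging=False):
    """Recording distance and lateral shift grid for the proposed method.

    p: pitch of the reference lenslet, f: its focal length,
    N: resolution enhancing factor (recorded pitch becomes p / N).
    diverging=True selects the (1 + 1/N) f recording distance, which
    records a concave lens-array instead of a convex one.
    """
    z = (1.0 + 1.0 / N) * f if diverging else (1.0 - 1.0 / N) * f
    step = p / N  # recorded lens size = 1/N of the reference pitch
    shifts = [(i * step, j * step) for i in range(N) for j in range(N)]
    return z, shifts

z, shifts = recording_plan(p=1.0, f=3.3, N=2)  # mm, as in Section 3
print(z, len(shifts))  # 1.65 mm recording distance, 4 exposures
```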

Fig. 2 Schematic diagram of recording plane manipulation method: (a) set the recording distance and (b) repeat the recording process after shifting the photopolymer laterally.

The specifications of the proposed lens-array HOE are determined by the recording distance. If we consider the recorded wavefront in the HOE fabricating process, the complex amplitude of the converging spherical wavefront introduced by a lenslet of the reference lens-array can be written as u(x,y) in Eq. (1) under the thin lens approximation, where λ denotes the wavelength and p the lens pitch of the reference lens-array. As the signal wave from each lenslet propagates along the optical axis by a distance z, the complex amplitude of the wavefront at distance z is given by u2(x,y) of Eq. (2) via the angular spectrum method [24]. In Eq. (2), FT2 denotes a two-dimensional Fourier transform and fX and fY denote spatial frequencies. Since the wavefront from the lens-array is focused after propagating a distance f, the wavefront at this plane will come to focus after a further distance of f − z. Therefore, if we locate the photopolymer at distance z, it is effectively equivalent to recording a lens-array with a focal length of f − z.

$u(x,y)=\mathrm{rect}\!\left[\frac{x}{p}\right]\mathrm{rect}\!\left[\frac{y}{p}\right]\exp\!\left[-\frac{j\pi}{\lambda f}\left(x^{2}+y^{2}\right)\right]$,  (1)
$u_{2}(x,y)=\mathrm{FT}_{2}^{-1}\!\left\{\mathrm{FT}_{2}\!\left\{\mathrm{rect}\!\left[\frac{x}{p}\right]\mathrm{rect}\!\left[\frac{y}{p}\right]\exp\!\left[-\frac{j\pi}{\lambda f}\left(x^{2}+y^{2}\right)\right]\right\}\exp\!\left[-j\pi\lambda z\left(f_{X}^{2}+f_{Y}^{2}\right)\right]\right\}$.  (2)
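
Equations (1) and (2) can be evaluated numerically with a pair of FFTs. The sketch below builds the lenslet wavefront of Eq. (1) and propagates it a distance z with the angular-spectrum (Fresnel) transfer function; the grid size and sampling pitch are our own choices, and the sign conventions follow Goodman [24]. Since the transfer function is pure phase, total power is conserved:

```python
import numpy as np

def propagate(u, dx, wavelength, z):
    """Angular-spectrum propagation of field u sampled at pitch dx (Eq. (2))."""
    fx = np.fft.fftfreq(u.shape[1], d=dx)
    fy = np.fft.fftfreq(u.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))  # Fresnel transfer fn
    return np.fft.ifft2(np.fft.fft2(u) * H)

# Lenslet of Eq. (1): p = 1 mm pitch, f = 3.3 mm focal length, 532 nm light
p, f, lam = 1.0e-3, 3.3e-3, 532e-9
n, dx = 512, 4e-6  # 512 x 512 samples at 4 um pitch (our sampling choice)
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
u = np.where((np.abs(X) < p / 2) & (np.abs(Y) < p / 2),
             np.exp(-1j * np.pi / (lam * f) * (X**2 + Y**2)), 0)

u2 = propagate(u, dx, lam, z=1.65e-3)  # recording plane for N = 2
# Pure-phase transfer function: power is conserved
print(np.allclose(np.sum(np.abs(u)**2), np.sum(np.abs(u2)**2)))  # True
```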
If the recording plane is located farther than the focal length f, the recorded wavefront is an array of diverging spherical waves and the sign of the effective focal length becomes negative; a concave lens-array is therefore recorded on the photopolymer in this case. After fabrication, the concave lens-array HOE generates diverging light when the reference wave illuminates it. However, if the illumination direction is reversed, the HOE generates a conjugated signal wave which is converging. The concave lens-array HOE with conjugated reconstruction can therefore play the same role as a convex lens-array HOE and can be utilized in focal mode integral imaging in the same way [22].

Figure 3 shows the cross-sectional amplitude of a recorded wavefront at the recording distance, simulated using Eq. (2). In this simulation, the pitch and focal length of the reference lens-array are set to 1 mm and 3.3 mm, respectively, and the recording distance is 1.65 mm, which corresponds to an enhancing factor N of 2. The blue line and the dashed green line show the amplitude of the recorded wavefront for successive recordings during the lateral shifting procedure. As shown in Fig. 3, a certain portion of the converging wave leaks outside the recording section assigned by the lateral shift of the photopolymer. Nevertheless, the power of this leakage term is very small (typically less than 2%). Since the photopolymer has an inhibition dosage, exposure by this leakage term does not initiate the polymerization of the photopolymer [22]. The leakage term of the wavefront is therefore not recorded on the other sections, and the recording fill factor of the photopolymer area is approximately 25% per single recording of the lateral shifting process. Generally, the recording fill factor (RFF) per single recording for the proposed method is given by Eq. (3):

$\mathrm{RFF}(\%)=100\times\left(1-\frac{z}{f}\right)^{2}$.  (3)
The fill factor of the recorded HOE varies with the recording distance z. As the signal wave of the reference lens-array propagates from the principal plane toward the focal plane, the recorded area decreases to 0% of the recordable area on the photopolymer; it then increases back to 100% as the wave propagates on to twice the focal length. The recording fill factor of the recorded wavefront therefore changes with propagation distance as shown in Fig. 4.
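
Equation (3) follows from the similar-triangles geometry of the converging cone: at distance z the illuminated square has side p|1 − z/f|. A quick sketch of how the per-recording fill factor varies with z, using the 3.3 mm focal length of the simulation above:

```python
def rff_percent(z, f):
    """Recording fill factor per single exposure, Eq. (3)."""
    return 100.0 * (1.0 - z / f) ** 2

f = 3.3  # mm, reference lenslet focal length
for z in (0.0, f / 2, f, 3 * f / 2, 2 * f):
    print(f"z = {z:4.2f} mm -> RFF = {rff_percent(z, f):5.1f} %")
# 100% at the principal plane, 0% at the focal plane, 100% again at z = 2f
```

The z = f/2 row reproduces the 25% figure quoted for the N = 2 case.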

Fig. 3 Simulated recorded wavefront at the recording distance.

Fig. 4 Complex wavefronts and recording fill factors, varying the recording distance: (a) at z = f/5, (b) z = f/2, (c) z = 2f/3 and (d) z = 3f/4.

In general, when the recording distance is selected as (1 ± 1/N)f, the recorded wavefront can be interpreted as a lens-array with 1/N the lenslet size and 1/N the focal length of the reference lens-array, with a recording fill factor of 100/N²% per exposure. By repeating the recording process N² times, a lens-array HOE with a high fill factor and an N-times smaller lens pitch than the reference lens-array can be fabricated simply with a conventional lens-array. Meanwhile, the numerical aperture of the lens-array HOE is the same as that of the reference lens-array, since the converging angle is maintained while changing the recording plane. When the lenslet size of the lens-array HOE is reduced by a factor of 1/N, the spatial resolution of the reconstructed 3D image is enhanced by a factor of N². Compared to a conventional lens-array HOE, the viewing angle of the displayed image is maintained since there is no change in numerical aperture.
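
The scaling claims in this paragraph are easy to check: recording at z = (1 − 1/N)f yields a pitch of p/N and a focal length of f/N, so the numerical aperture, here approximated paraxially as half-aperture over focal length (our choice of expression), is unchanged while the lenslet count per unit area grows by N². A sketch:

```python
def hoe_specs(p, f, N):
    """Recorded pitch, focal length and paraxial NA for enhancing factor N."""
    p_hoe, f_hoe = p / N, f / N
    na = p_hoe / (2.0 * f_hoe)  # paraxial NA ~ (half aperture) / (focal length)
    return p_hoe, f_hoe, na

p, f = 1.0, 3.3  # mm, reference lens-array of Section 3
p2, f2, na2 = hoe_specs(p, f, N=2)
print(p2, f2)  # 0.5 mm pitch, 1.65 mm focal length, as fabricated
print(abs(na2 - p / (2.0 * f)) < 1e-12)  # True: NA is preserved
```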

3. Experiments

We carried out lens-array HOE fabrication and 3D image display experiments in order to verify the proposed method. A lens-array with a 1 mm pitch and a focal length of 3.3 mm was used as the reference lens-array. We fabricated two lens-array HOEs for comparison: one using the previous method [13] and the other using the proposed method. To demonstrate the feasibility of the proposed method, we used an enhancing factor N of 2 to fabricate a lens-array HOE with a lens pitch of 500 μm and a focal length of 1.65 mm. In this case, four repetitions of the recording process are needed to fill the whole recording plane on the photopolymer, because the area recorded in a single recording process is one fourth of the recording plane area. We performed these sequential recordings by shifting the photopolymer parallel to the reference lens-array, as shown in Fig. 2(b).

The experimental setup for the proposed lens-array HOE recording is shown in Fig. 5. A diode-pumped solid state laser with a 532 nm wavelength was used as the coherent light source for the lens-array HOE recording process. The expanded plane wave is split by a beam splitter. The signal wave passes through the lens-array, and the photopolymer is located behind the lens-array. A linear stage is used for setting the recording distance and shifting the photopolymer laterally. The incident angle of the reference wave on the photopolymer is 45°. As discussed in the previous section, the reference wave is masked in order to match the recording area with that of the signal wave. The exposure time is set to 40 seconds for each exposure, corresponding to a dose of 60 mJ/cm². The total recording time is therefore 40 seconds for the previous method and about 3 minutes for the proposed method, including the time spent controlling the linear stage. In this experiment, only 532 nm monochromatic light was used in the fabrication of the lens-array HOE to demonstrate the feasibility of the proposed method. Under the experimental conditions, the theoretical estimate of the wavelength selectivity according to coupled wave theory [25] is approximately 5.3 nm full width at half maximum around 532 nm, and the wavelength selectivity measured with a spectrometer is approximately 7.2 nm, which agrees with the theoretical estimate. The integrated images therefore show only the green color corresponding to the recorded wavelength (532 nm) when the displaying beam is incoherent. Even though we implemented the proposed method with a single wavelength, its principles can be extended to full-color images using a wavelength multiplexing method [13, 21, 22]. In that case, an achromatic reference lens-array can be used to reduce the chromatic aberration of the lens-array HOE.

Fig. 5 The experimental setup for recording the proposed lens-array HOE.

Since the lens pitch of the proposed micro lens-array HOE is 500 μm, the number of included lenslets is four times that of the previous method. To verify the focusing property of the lens-array HOE, the reference wave illuminates the lens-array HOE and images at the focal plane are captured by a CCD camera at the distance of each focal length from the surface, as shown in Fig. 6. Figure 6(a) shows the image captured at the focal length of the lens-array HOE fabricated using the previous method [13] with a 1 mm reference lens-array, and Fig. 6(b) shows the image captured at the focal length of the micro lens-array fabricated using the proposed method.

Fig. 6 Captured images at the focused plane of the lens-array HOEs fabricated by (a) the previous method and (b) the proposed method, which have lens pitches of 1 mm and 500 μm, respectively.

The enhancement of the fill factor of the lens-array HOE is demonstrated in Fig. 7. Since the recorded area reacts to Bragg-matched light, there is a difference in transmittance between the recorded and unrecorded areas. When the HOE is illuminated with broadband light, a certain amount of Bragg-matched light is diffracted by the HOE and therefore cannot be transmitted through it, so the color of the transmitted light shifts slightly toward the complementary color of the Bragg-matched light. In Fig. 7, a diffuser is located behind the HOE and white light illuminates the HOE through the diffuser. The diffuser is used to avoid imaging of the light source by the lenslets of the lens-array HOE. Pictures are then taken from the side opposite the illumination. Since a 532 nm wavelength was used to record the HOE, the light transmitted through the recorded area has a reddish color. Figure 7(a) shows a captured image of the lens-array HOE fabricated with the previous method and Fig. 7(b) shows the captured image of the proposed lens-array HOE. The experimental results show that the fill factor is increased by a factor of two compared to the previous method. However, the edge of each lenslet is not fully recorded, due to experimental alignment error and diffraction of the reference wave caused by the reference-wave mask. The diffraction effect can be reduced by locating the mask closer to the photopolymer or by adopting a 4-f imaging system between the mask and the photopolymer.

Fig. 7 Captured images of HOEs fabricated by (a) previous method and (b) proposed method with diffuser located behind.

Finally, a display experiment was carried out using both fabricated lens-array HOEs in order to verify the visibility enhancement of the proposed method. Figure 8 shows the structure of the display experiment. A full high-definition (HD) beam projector was used as the imaging device for projecting the elemental image onto the lens-array HOE. Since a collimated reference wave was used in the recording process, the projected image is collimated by a telecentric lens to avoid Bragg mismatch. Furthermore, relay optics are placed in front of the beam projector to reduce the size of the projected image. The projected elemental image is 40 mm wide and 30 mm high. The lens-array HOE is used as a see-through 3D screen, and the elemental image is projected onto the screen at 45°.

Fig. 8 Captured image of display structure for display experiment.

Captured images from the display experiments with both lens-array HOEs are shown in Figs. 9(a)-9(b). The elemental images of the 3D-rendered objects were generated using computer graphics, as presented in Figs. 9(c)-9(d). Since focal mode integral imaging was adopted, the depth of the displayed image can be both real (in front of the HOE) and virtual (behind the HOE). A person is posing at the right side with arms outstretched and the characters “3” and “D” are located at the left side. The character “3” and the left arm of the person are located in front of the lens-array HOE, and the character “D” and the right arm are located behind it. A toy house was placed behind the lens-array HOE to show the see-through property of the system.

Fig. 9 Displayed 3D images with background object using a lens-array HOE made by (a) the previous method and (b) the proposed method. (c) Elemental image used in display experiment of part (a). (d) Elemental image used in display experiment of part (b).

As can be seen in Figs. 9(a)-9(b), the visibility with the proposed method is greatly enhanced compared to the previous method. The contours of the arms and characters are not very clear in Fig. 9(a); however, they can be seen clearly in Fig. 9(b). Reconstruction results at different perspective views are presented in Fig. 10 (see Media 1). Since the character “D” is closer to the toy house than the character “3”, there is a smaller disparity between the toy house and the character “D” than for the character “3”. A vertical disparity is also observed in the position of the left arm.

Fig. 10 Generated 3D images using the proposed lens-array HOE captured at different view positions (Media 1).

4. Conclusion

We have proposed a method for fabricating a micro lens-array HOE with a controllable lens pitch. By controlling the recording plane and performing a repetitive recording process, the lens pitch and fill factor of the lens-array HOE can be controlled through the recording distance. The lens pitch of the lens-array HOE can be reduced to 1/N that of the reference lens-array, with a high fill factor and the same numerical aperture. Compared to the conventional fabrication process for a small-pitch, high-fill-factor lens-array, which requires expensive equipment and a complicated process, the proposed method has remarkable advantages in both time and cost. We demonstrated the feasibility of the process by fabricating a 500 μm pitch micro lens-array HOE with a high fill factor from a 1 mm pitch reference lens-array. Since the pixel pitch of the projected image can be easily controlled in projection-type integral imaging, the small lens pitch is effective in enhancing the quality of the displayed 3D image. The enhanced visibility of the 3D images is clearly demonstrated in the experimental results, and we expect that the system can give observers an immersive experience with see-through properties and high-resolution 3D images.

Acknowledgments

This work was supported by “The Cross-Ministry Giga Korea Project” of the Ministry of Science, ICT and Future Planning, Korea [GK13D0200, Development of Super Multi-View (SMV) Display Providing Real-Time Interaction]. The authors acknowledge the support of Bayer Material Science AG for providing the photopolymer Bayfol HX film used for recording the lens-array HOE. The 3D object used to generate elemental images was modeled by Alexander Lee, and was used under Creative Commons Attribution, Share Alike 3.0.

References and links

1. G. Lippmann, “La photographie intégrale,” CR Acad. Sci. 146, 446–451 (1908).

2. B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013). [CrossRef]  

3. S.-G. Park, J. Yeom, Y. Jeong, N. Chen, J.-Y. Hong, and B. Lee, “Recent issues on integral imaging and its applications,” J. Inf. Disp. 15(1), 37–46 (2014). [CrossRef]  

4. J.-H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48(34), H77–H94 (2009). [CrossRef]   [PubMed]  

5. H. Liao, M. Iwahara, N. Hata, and T. Dohi, “High-quality integral videography using a multiprojector,” Opt. Express 12(6), 1067–1076 (2004). [CrossRef]   [PubMed]  

6. J. S. Jang and B. Javidi, “Depth and lateral size control of three-dimensional images in projection integral imaging,” Opt. Express 12(16), 3778–3790 (2004). [CrossRef]   [PubMed]  

7. M. A. Alam, M. L. Piao, T. Bang, and N. Kim, “Viewing-zone control of integral imaging display using a directional projection and elemental image resizing method,” Appl. Opt. 52(28), 6969–6978 (2013). [CrossRef]   [PubMed]  

8. J. S. Jang, Y.-S. Oh, and B. Javidi, “Spatiotemporally multiplexed integral imaging projector for large-scale high-resolution three-dimensional display,” Opt. Express 12(4), 557–563 (2004). [CrossRef]   [PubMed]  

9. Y. Kim, S.-G. Park, S.-W. Min, and B. Lee, “Projection-type integral imaging system using multiple elemental image layers,” Appl. Opt. 50(7), B18–B24 (2011). [CrossRef]   [PubMed]  

10. M. Okui, J. Arai, Y. Nojiri, and F. Okano, “Optical screen for direct projection of integral imaging,” Appl. Opt. 45(36), 9132–9139 (2006). [CrossRef]   [PubMed]  

11. H. Liao, M. Iwahara, T. Koike, N. Hata, I. Sakuma, and T. Dohi, “Scalable high-resolution integral videography autostereoscopic display with a seamless multiprojection system,” Appl. Opt. 44(3), 305–315 (2005). [CrossRef]   [PubMed]  

12. J. Kim, Y. Kim, H. Choi, S.-W. Cho, Y. Kim, J. Park, G. Park, S.-W. Min, and B. Lee, “Implementation of polarization-multiplexed tiled projection integral imaging system,” J. Soc. Inf. Disp. 17(5), 411–418 (2009). [CrossRef]  

13. K. Hong, J. Yeom, C. Jang, J. Hong, and B. Lee, “Full-color lens-array holographic optical element for three-dimensional optical see-through augmented reality,” Opt. Lett. 39(1), 127–130 (2014). [CrossRef]   [PubMed]  

14. M.-H. Wu, C. Park, and G. M. Whitesides, “Fabrication of arrays of microlenses with controlled profiles using gray-scale microlens projection photolithography,” Langmuir 18(24), 9312–9318 (2002). [CrossRef]  

15. A. Tripathi, T. V. Chokshi, and N. Chronis, “A high numerical aperture, polymer-based, planar microlens array,” Opt. Express 17(22), 19908–19918 (2009). [CrossRef]   [PubMed]  

16. Z. D. Popovic, R. A. Sprague, and G. A. N. Connell, “Technique for monolithic fabrication of microlens arrays,” Appl. Opt. 27(7), 1281–1284 (1988). [CrossRef]   [PubMed]  

17. J. Albero, L. Nieradko, C. Gorecki, H. Ottevaere, V. Gomez, H. Thienpont, J. Pietarinen, B. Päivänranta, and N. Passilly, “Fabrication of spherical microlenses by a combination of isotropic wet etching of silicon and molding techniques,” Opt. Express 17(8), 6283–6292 (2009). [CrossRef]   [PubMed]  

18. S.-G. Park, B.-S. Song, and S.-W. Min, “Analysis of image visibility in projection-type integral imaging system without diffuser,” J. Opt. Soc. Korea 14(2), 121–126 (2010). [CrossRef]  

19. I. Biederman, “Recognition-by-components: a theory of human image understanding,” Psychol. Rev. 94(2), 115–117 (1987). [CrossRef]   [PubMed]  

20. M. J. Tarr, P. Williams, W. G. Hayward, and I. Gauthier, “Three-dimensional object recognition is viewpoint dependent,” Nat. Neurosci. 1(4), 275–277 (1998). [CrossRef]   [PubMed]  

21. K. Hong, J. Yeom, C. Jang, G. Li, J. Hong, and B. Lee, “Two-dimensional and three-dimensional transparent screens based on lens-array holographic optical elements,” Opt. Express 22(12), 14363–14374 (2014). [CrossRef]   [PubMed]  

22. J. Yeom, K. Hong, Y. Jeong, C. Jang, and B. Lee, “Solution for pseudoscopic problem in integral imaging using phase-conjugated reconstruction of lens-array holographic optical elements,” Opt. Express 22(11), 13659–13670 (2014). [CrossRef]   [PubMed]  

23. F. K. Bruder, F. Deuber, T. Fäcke, R. Hagen, D. Hönel, D. Jurbergs, T. Rölle, and M. S. Weiser, “Reaction-diffusion model applied to high resolution Bayfol HX photopolymer,” Proc. SPIE 7619, 76190I (2010). [CrossRef]  

24. J. W. Goodman, Introduction to Fourier Optics, 3rd ed. (Roberts and Company Publishers, 2004).

25. H. Kogelnik, “Coupled wave theory for thick hologram gratings,” Bell Syst. Tech. J. 48(9), 2909–2947 (1969). [CrossRef]  

References

  • View by:
  • |
  • |
  • |

  1. G. Lippmann, “La photograhie integrale,” CR Acad. Sci. 146, 446–451 (1908).
  2. B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013).
    [Crossref]
  3. S.-G. Park, J. Yeom, Y. Jeong, N. Chen, J.-Y. Hong, and B. Lee, “Recent issues on integral imaging and its applications,” J. Inf. Disp. 15(1), 37–46 (2014).
    [Crossref]
  4. J.-H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48(34), H77–H94 (2009).
    [Crossref] [PubMed]
  5. H. Liao, M. Iwahara, N. Hata, and T. Dohi, “High-quality integral videography using a multiprojector,” Opt. Express 12(6), 1067–1076 (2004).
    [Crossref] [PubMed]
  6. J. S. Jang and B. Javidi, “Depth and lateral size control of three-dimensional images in projection integral imaging,” Opt. Express 12(16), 3778–3790 (2004).
    [Crossref] [PubMed]
  7. M. A. Alam, M. L. Piao, T. Bang, and N. Kim, “Viewing-zone control of integral imaging display using a directional projection and elemental image resizing method,” Appl. Opt. 52(28), 6969–6978 (2013).
    [Crossref] [PubMed]
  8. J. S. Jang, Y.-S. Oh, and B. Javidi, “Spatiotemporally multiplexed integral imaging projector for large-scale high-resolution three-dimensional display,” Opt. Express 12(4), 557–563 (2004).
  9. Y. Kim, S.-G. Park, S.-W. Min, and B. Lee, “Projection-type integral imaging system using multiple elemental image layers,” Appl. Opt. 50(7), B18–B24 (2011).
  10. M. Okui, J. Arai, Y. Nojiri, and F. Okano, “Optical screen for direct projection of integral imaging,” Appl. Opt. 45(36), 9132–9139 (2006).
  11. H. Liao, M. Iwahara, T. Koike, N. Hata, I. Sakuma, and T. Dohi, “Scalable high-resolution integral videography autostereoscopic display with a seamless multiprojection system,” Appl. Opt. 44(3), 305–315 (2005).
  12. J. Kim, Y. Kim, H. Choi, S.-W. Cho, Y. Kim, J. Park, G. Park, S.-W. Min, and B. Lee, “Implementation of polarization-multiplexed tiled projection integral imaging system,” J. Soc. Inf. Disp. 17(5), 411–418 (2009).
  13. K. Hong, J. Yeom, C. Jang, J. Hong, and B. Lee, “Full-color lens-array holographic optical element for three-dimensional optical see-through augmented reality,” Opt. Lett. 39(1), 127–130 (2014).
  14. M.-H. Wu, C. Park, and G. M. Whitesides, “Fabrication of arrays of microlenses with controlled profiles using gray-scale microlens projection photolithography,” Langmuir 18(24), 9312–9318 (2002).
  15. A. Tripathi, T. V. Chokshi, and N. Chronis, “A high numerical aperture, polymer-based, planar microlens array,” Opt. Express 17(22), 19908–19918 (2009).
  16. Z. D. Popovic, R. A. Sprague, and G. A. N. Connell, “Technique for monolithic fabrication of microlens arrays,” Appl. Opt. 27(7), 1281–1284 (1988).
  17. J. Albero, L. Nieradko, C. Gorecki, H. Ottevaere, V. Gomez, H. Thienpont, J. Pietarinen, B. Päivänranta, and N. Passilly, “Fabrication of spherical microlenses by a combination of isotropic wet etching of silicon and molding techniques,” Opt. Express 17(8), 6283–6292 (2009).
  18. S.-G. Park, B.-S. Song, and S.-W. Min, “Analysis of image visibility in projection-type integral imaging system without diffuser,” J. Opt. Soc. Korea 14(2), 121–126 (2010).
  19. I. Biederman, “Recognition-by-components: a theory of human image understanding,” Psychol. Rev. 94(2), 115–147 (1987).
  20. M. J. Tarr, P. Williams, W. G. Hayward, and I. Gauthier, “Three-dimensional object recognition is viewpoint dependent,” Nat. Neurosci. 1(4), 275–277 (1998).
  21. K. Hong, J. Yeom, C. Jang, G. Li, J. Hong, and B. Lee, “Two-dimensional and three-dimensional transparent screens based on lens-array holographic optical elements,” Opt. Express 22(12), 14363–14374 (2014).
  22. J. Yeom, K. Hong, Y. Jeong, C. Jang, and B. Lee, “Solution for pseudoscopic problem in integral imaging using phase-conjugated reconstruction of lens-array holographic optical elements,” Opt. Express 22(11), 13659–13670 (2014).
  23. F. K. Bruder, F. Deuber, T. Fäcke, R. Hagen, D. Hönel, D. Jurbergs, T. Rölle, and M. S. Weiser, “Reaction-diffusion model applied to high resolution Bayfol HX photopolymer,” Proc. SPIE 7619, 76190I (2010).
  24. J. W. Goodman, Introduction to Fourier Optics, 3rd ed. (Roberts and Company Publishers, 2004).
  25. H. Kogelnik, “Coupled wave theory for thick hologram gratings,” Bell Syst. Tech. J. 48(9), 2909–2947 (1969).


Supplementary Material (1)

Media 1: MOV (1671 KB)


Figures (10)

Fig. 1. (a) Schematic diagram of the previous lens-array HOE fabrication method and (b) reconstruction scheme of the lens-array HOE.
Fig. 2. Schematic diagram of the recording-plane manipulation method: (a) setting the recording distance and (b) repeating the recording process after shifting the photopolymer laterally.
Fig. 3. Simulated recorded wavefront at the recording distance.
Fig. 4. Complex wavefronts and recording fill factors for varying recording distances: (a) z = f/5, (b) z = f/2, (c) z = 2f/3, and (d) z = 3f/4.
Fig. 5. Experimental setup for recording the proposed lens-array HOE.
Fig. 6. Captured images at the focused plane of the lens-array HOEs fabricated by (a) the previous method and (b) the proposed method, which have focal lengths of 1 mm and 500 μm, respectively.
Fig. 7. Captured images of the HOEs fabricated by (a) the previous method and (b) the proposed method, with a diffuser located behind.
Fig. 8. Captured image of the structure used in the display experiment.
Fig. 9. Displayed 3D images with a background object using the lens-array HOE made by (a) the previous method and (b) the proposed method. (c) Elemental image used in the display experiment of part (a). (d) Elemental image used in the display experiment of part (b).
Fig. 10. Generated 3D images using the proposed lens-array HOE, captured at different viewing positions (Media 1).

Equations (3)

\[
u(x,y) = \mathrm{rect}\!\left[\frac{x}{p}\right]\mathrm{rect}\!\left[\frac{y}{p}\right]\exp\!\left[-\frac{j\pi}{\lambda f}\left(x^{2}+y^{2}\right)\right],
\]

\[
u_{2}(x,y) = \mathrm{FT}_{2}^{-1}\left\{\mathrm{FT}_{2}\left\{\mathrm{rect}\!\left[\frac{x}{p}\right]\mathrm{rect}\!\left[\frac{y}{p}\right]\exp\!\left[-\frac{j\pi}{\lambda f}\left(x^{2}+y^{2}\right)\right]\right\}\exp\!\left[-j\pi\lambda z\left(f_{X}^{2}+f_{Y}^{2}\right)\right]\right\},
\]

\[
\mathrm{RFF}(\%) = 100\times\left(1-\frac{z}{f}\right)^{2}.
\]
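The three equations can be checked with a short numerical sketch: Eq. (1) gives the converging-lens transmittance over one square lens cell, Eq. (2) propagates it to the recording plane with the Fresnel transfer function (2D FFT, following the sign conventions of Goodman [24]), and Eq. (3) predicts how the recording fill factor shrinks with recording distance z. All parameters below (pitch, focal length, wavelength, grid size) are illustrative assumptions, not the paper's actual recording setup.

```python
import numpy as np

# Illustrative parameters only -- not the paper's actual recording setup.
p = 1.0e-3           # lens pitch (m), hypothetical
f = 10.0e-3          # focal length (m), hypothetical
wavelength = 532e-9  # wavelength (m), hypothetical
N, L = 1024, 2.0e-3  # grid samples and simulation window width (m)

x = (np.arange(N) - N // 2) * (L / N)
X, Y = np.meshgrid(x, x)

# Eq. (1): converging-lens transmittance over a square cell of pitch p
aperture = (np.abs(X) <= p / 2) & (np.abs(Y) <= p / 2)
u0 = aperture * np.exp(-1j * np.pi * (X**2 + Y**2) / (wavelength * f))

def propagate(u, z):
    """Eq. (2): Fresnel propagation via the angular-spectrum transfer function."""
    fx = np.fft.fftfreq(N, d=L / N)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

fractions = []
for z in (f / 5, f / 2, 2 * f / 3, 3 * f / 4):  # the distances of Fig. 4
    I = np.abs(propagate(u0, z))**2
    # Crude footprint estimate: area where intensity exceeds half its peak,
    # as a fraction of the lens-cell area.
    fractions.append((I > 0.5 * I.max()).sum() / aperture.sum())
    print(f"z = {z / f:.2f} f: Eq. (3) RFF = {100 * (1 - z / f)**2:.0f}%, "
          f"simulated half-max footprint ~{100 * fractions[-1]:.0f}%")
```

The half-max footprint is only a rough proxy (Fresnel fringing distorts it), but it shows the same trend as Eq. (3): the recorded footprint per cell shrinks as (1 − z/f)² while the wavefront curvature, and hence the numerical aperture, is preserved.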
