
Multiplexing encoding method for full-color dynamic 3D holographic display

Open Access

Abstract

A multiplexing encoding method is proposed and demonstrated for accurately reconstructing colorful images with a single phase-only spatial light modulator (SLM). The method encodes the light waves at different wavelengths into one pure-phase hologram simultaneously, based on analytic formulas. Three-dimensional (3D) images can be reconstructed clearly when the light waves at the different wavelengths illuminate the encoded hologram. Numerical simulations and optical experiments for 2D and 3D colorful images are performed. The results show that colorful reconstructed images with high quality are achieved successfully. The proposed multiplexing method is a simple and fast encoding approach, and the resulting system is small and compact. It is expected to be used for realizing full-color 3D holographic display in the future.

© 2014 Optical Society of America

1. Introduction

Holographic display, which can provide the full parallax and depth information of a 3D scene without any special eyewear, is one of the research hotspots in the 3D display field. Normally, to realize full-color holographic display, computer-generated holograms (CGHs) at different wavelengths are calculated and loaded onto optoelectronic devices to reconstruct 3D images by a time-multiplexing or space-multiplexing method.

There are many problems in holographic display, such as low image quality, the long time needed to process the huge amount of data, the limited space bandwidth of the optoelectronic devices, and the complexity of the system. As is well known, coherent illumination causes speckle noise, which degrades the image quality. Furthermore, commercial optoelectronic devices can modulate the amplitude of the light wave, its phase, or both dependently, whereas they cannot modulate the amplitude and the phase independently and simultaneously. Phase-only optoelectronic devices, such as phase-only spatial light modulators (SLMs), which have a higher diffraction efficiency than amplitude-only SLMs, are the most commonly used. Nevertheless, the quality of the image reconstructed by a kinoform (phase-only CGH) is degraded because the amplitude information of the light is lost. Although a random initial phase can make the amplitude statistically uniform [1], the reconstructed image is then accompanied by speckle noise. In order to generate low-noise kinoforms, optimization algorithms such as the Gerchberg-Saxton (GS) algorithm [1, 2], the multi-plane iterative algorithm (MPIA) [3, 4], the simulated annealing algorithm (SAA) [5], and the dynamic-pseudorandom-phase (DPP) algorithm [6] have been proposed to encode part of the amplitude information into the phase. However, those methods are quite time-consuming.

The other problem is the complexity and size of the display system when a full-color 3D holographic image is reconstructed. Employing three SLMs to load the CGHs of the red, green, and blue components of a color image is one of the most frequently used methods [7, 8]; the utilization of luminous energy is highest, but at the expense of large volume and high cost. To reduce the system size and cost, several approaches that employ only one SLM for full-color holographic display have been proposed in recent years, such as the depth-division method (DDM) [3, 4], the time-division method (TDM) [9, 10], and the space-division method (SDM) [11, 12]. DDM needs an iterative algorithm, so it is time-consuming, and it is suited to 2D rather than 3D holographic display. TDM requires an SLM with a very high frame rate to produce an afterimage effect on human eyes. SDM can reconstruct a color image from one hologram that records the transversely distributed RGB components of the original color image, but it also requires time-consuming iterative calculation. In addition, some other similar methods using one SLM for color display have been proposed [1, 13, 14]; however, the resolution of the reconstructed image is greatly reduced since the SLM is divided into several regions side by side.

In this paper, a multiplexing encoding method is proposed to generate one pure-phase hologram, which minimizes the size of the system. Three computer-generated holograms corresponding to red, green, and blue (RGB) are synthesized into one pure-phase hologram by analytic formulas. The colorful 3D image is reconstructed when the light waves at the three original wavelengths are incident simultaneously. Numerical simulations and optical experiments for 3D colorful image reconstruction are performed, and they are in good agreement. The system is simple, the synthesis method is fast, and high image quality of the reconstructed 3D display is achieved.

2. The multiplexing encoding method

The color bird picture can be divided into RGB components, and each component can be considered as many self-illuminating point sources (pixels of the picture). Each of the RGB components is then propagated toward the hologram directly and separately. When calculating the CGH, we assume that the red reference beam is a plane wave illuminating the hologram at a tilt angle θ with respect to the z axis. The red component of the bird picture is interfered with the red reference beam, and the fringe pattern is recorded and encoded as the R-CGH. The blue reference beam is a plane wave with a tilt angle -θ with respect to the z axis, and the green reference beam is a plane wave parallel to the z axis. In the same way, the B-CGH and the G-CGH are encoded separately. Note that all the reference beams are perpendicular to the y axis. We then synthesize the R-, G-, and B-CGHs into one hologram, and the final phase-only hologram is obtained by Eq. (4). In the reconstruction process, the phase-only hologram is loaded on the SLM, three reference beams of the RGB colors at the different angles simultaneously illuminate the SLM as shown in Fig. 1(b), and the color image is reconstructed.

Fig. 1 Schematic view of the CGH generation and 3D image reconstruction. (a) CGH calculation process; (b) 3D image reconstruction process.

Since a full-color 3D object can be divided into multiple 2D slices for the RGB channels, the object wave distribution at the object plane is $o_i(\hat{x},\hat{y},\hat{z}_p)$, where i = 1, 2, or 3 corresponds to the red, green, or blue channel, and p is the index of the 2D slices of the 3D object. We use the angular spectrum method [15] to describe the propagation of the RGB components. The frequency spectrum of each color component of the object is given by the Fourier transform

$$O_i(u,v,\hat{z}_p)=\iint o_i(\hat{x},\hat{y},\hat{z}_p)\exp[-j2\pi(u\hat{x}+v\hat{y})]\,d\hat{x}\,d\hat{y}, \tag{1}$$
where u and v represent the spatial frequencies. The reference plane waves for the RGB channels at the hologram plane can be written as $R_i(x,y,z_h)=\exp(jk_ix\sin\theta_i)$, where $k_i=2\pi/\lambda_i$, $\lambda_i$ is the wavelength, and $\theta_i$ is the incident angle of the reference beam. On the hologram plane, the red component of the image is interfered with the red reference beam, the green component with the green reference beam, and the blue component with the blue reference beam, giving three different CGHs. The CGH of each color component of the 3D object at the hologram plane can be described as [15]
$$H_i(x,y,z_h)=\sum_p\left\{\iint O_i(u,v,\hat{z}_p)\exp\!\left[jk_i(z_h-\hat{z}_p)\sqrt{1-(\lambda_iu)^2-(\lambda_iv)^2}\right]\exp[j2\pi(ux+vy)]\,du\,dv\right\}\cdot R_i(x,y,z_h)=A_i(x,y)\exp\{j[\varphi_i(x,y)+k_ix\sin\theta_i]\}, \tag{2}$$
where (x, y) denote the coordinates in the space domain, the expression in the braces represents the complex amplitude of each color component of the 3D object after the spectrum propagation, $z_h-\hat{z}_p$ is the distance between the object plane and the hologram plane, $A_i(x,y)$ is the amplitude distribution of the CGH, and $\varphi_i(x,y)$ is its phase. The complex amplitudes of all the color components are then synthesized as
$$S(x,y)=\sum_{i=1}^{3}H_i(x,y,z_h)\exp(jk_iy\sin\psi)=\sum_{i=1}^{3}A_i(x,y)\exp\{j[\varphi_i(x,y)+k_ix\sin\theta_i+k_iy\sin\psi]\}=A(x,y)\exp[j\varphi(x,y)], \tag{3}$$
where $0\le A(x,y)\le 1$ and $\psi$ is the tilt angle along the y axis, which eliminates the influence of the zero-order diffraction beam of the SLM [16] in the reconstruction process. Note that the superposition of the complex amplitudes is only an encoding step; no physical process is involved. The different components are still contained independently in the hologram encoded by Eq. (3). The complex amplitude can be synthesized as a pure phase modulation $\exp[jA(x,y)\varphi(x,y)]$, and the final phase-only distribution (hologram) can be represented [17, 18] by a mixed Fourier-Taylor series as
$$H(x,y)=\exp[jA(x,y)\varphi(x,y)]=\sum_{n=-\infty}^{\infty}K_n(x,y)\exp[jn\varphi(x,y)]=K_0(x,y)+K_1(x,y)\exp[j\varphi(x,y)]+e(x,y), \tag{4}$$
where the coefficients $K_n(x,y)$ are given by
$$K_n(x,y)=\exp\{j[n-A(x,y)]\pi\}\,\frac{\sin\{\pi[n-A(x,y)]\}}{\pi[n-A(x,y)]}. \tag{5}$$
$K_0(x,y)$ is the zero order, $K_1(x,y)\exp[j\varphi(x,y)]$ is the first order, in which we are interested, and $e(x,y)$ is the noise from the higher orders, which can be ignored.
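To make the encoding procedure of Eqs. (1)-(4) concrete, the following is a minimal numerical sketch (illustrative only, not the authors' code): each RGB slice is propagated to the hologram plane with the angular spectrum method, tilted by its reference angle and by the y-axis angle ψ, the three complex sums are superposed, and the product A(x,y)φ(x,y) is kept as the phase loaded on the SLM. All function and variable names are hypothetical, and a random initial object phase is assumed (the paper does not specify one).

```python
import numpy as np

def angular_spectrum(field, wavelength, distance, pitch):
    """Propagate a complex field over `distance` using the angular spectrum method."""
    n = field.shape[0]
    f = np.fft.fftfreq(n, d=pitch)                       # spatial frequencies u, v
    u, v = np.meshgrid(f, f, indexing="ij")
    arg = 1.0 - (wavelength * u) ** 2 - (wavelength * v) ** 2
    kernel = np.exp(1j * (2 * np.pi / wavelength) * distance * np.sqrt(np.maximum(arg, 0.0)))
    kernel[arg < 0] = 0.0                                # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * kernel)

def encode_multiplex_hologram(slices_rgb, z_slices, z_h, wavelengths, thetas, psi, pitch):
    """slices_rgb[i][p]: amplitude image of slice p in channel i (R, G, B)."""
    n = slices_rgb[0][0].shape[0]
    coord = (np.arange(n) - n / 2) * pitch
    xx, yy = np.meshgrid(coord, coord, indexing="ij")
    S = np.zeros((n, n), dtype=complex)
    for i in range(3):
        k_i = 2 * np.pi / wavelengths[i]
        H_i = np.zeros((n, n), dtype=complex)
        for z_p in z_slices:
            # random initial phase assumed here (not specified in the paper)
            obj = slices_rgb[i][len(H_i.shape) and z_slices.index(z_p)] if False else None
            obj = slices_rgb[i][z_slices.index(z_p)] * np.exp(1j * 2 * np.pi * np.random.rand(n, n))
            H_i += angular_spectrum(obj, wavelengths[i], z_h - z_p, pitch)
        H_i *= np.exp(1j * k_i * xx * np.sin(thetas[i]))   # reference tilt along x, as in Eq. (2)
        S += H_i * np.exp(1j * k_i * yy * np.sin(psi))     # extra tilt along y, as in Eq. (3)
    A = np.abs(S) / np.abs(S).max()                        # normalized amplitude, 0 <= A <= 1
    phi = np.angle(S)
    return A * phi                                         # phase of H = exp[j A(x,y) phi(x,y)], Eq. (4)

# Toy usage with hypothetical 256 x 256 single-slice objects and the paper's wavelengths/angles:
n = 256
slices = [[np.random.rand(n, n)] for _ in range(3)]
hologram_phase = encode_multiplex_hologram(
    slices, z_slices=[0.0], z_h=0.5,
    wavelengths=[671e-9, 532e-9, 473e-9],
    thetas=[np.deg2rad(1.57), 0.0, np.deg2rad(-1.57)],
    psi=np.deg2rad(1.57), pitch=8.0e-6)
```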

In the reconstruction process, a color image is reconstructed by simultaneously illuminating the phase-only hologram with three reconstruction beams of the RGB colors at different angles, as shown in Fig. 1(b). The reconstruction beams of the RGB colors can be written as $R_i'(x,y,z_h)=\exp[jk_ix\sin(-\theta_i)]$. The reconstructed optical field distribution near the hologram is then given by

$$\begin{aligned}
O(x,y)=H(x,y)\cdot R'(x,y,z_h)
&=\underbrace{K_0(x,y)\sum_{i=1}^{3}\exp[jk_ix\sin(-\theta_i)]}_{\text{zero orders}}\\
&\quad+\underbrace{\frac{K_1(x,y)}{A(x,y)}\sum_{i=1}^{3}A_i(x,y)\exp\{j[\varphi_i(x,y)+k_iy\sin\psi]\}}_{\text{desired first orders}}\\
&\quad+\underbrace{\frac{K_1(x,y)}{A(x,y)}\sum_{i=2,3}A_i(x,y)\exp\{j[\varphi_i(x,y)+k_ix\sin\theta_i+k_iy\sin\psi+k_1x\sin(-\theta_1)]\}}_{\text{unwanted first orders of the red channel}}\\
&\quad+\underbrace{\frac{K_1(x,y)}{A(x,y)}\sum_{i=1,3}A_i(x,y)\exp\{j[\varphi_i(x,y)+k_ix\sin\theta_i+k_iy\sin\psi+k_2x\sin(-\theta_2)]\}}_{\text{unwanted first orders of the green channel}}\\
&\quad+\underbrace{\frac{K_1(x,y)}{A(x,y)}\sum_{i=1,2}A_i(x,y)\exp\{j[\varphi_i(x,y)+k_ix\sin\theta_i+k_iy\sin\psi+k_3x\sin(-\theta_3)]\}}_{\text{unwanted first orders of the blue channel}}\\
&\quad+\underbrace{E(x,y).}_{\text{higher orders}}
\end{aligned} \tag{6}$$
It is clearly seen that the desired first-order terms $[K_1(x,y)/A(x,y)]\sum_{i=1}^{3}A_i(x,y)\exp\{j[\varphi_i(x,y)+k_iy\sin\psi]\}$ are separated from the other diffraction terms both in space and in the frequency domain because of the tilted phase factors in Eq. (6). The reconstruction beams reproduce the desired complex amplitude when $K_1(x,y)=A(x,y)$. Next, we evaluate the error caused by $K_1/A(x,y)$, which can be written as $K(x,y)=\exp\{j[1-A(x,y)]\pi\}\,\mathrm{sinc}[1-A(x,y)]/A(x,y)$. Taking one dimension for simplicity, $K(x)=\exp\{j[1-A(x)]\pi\}\,\mathrm{sinc}[1-A(x)]/A(x)$. Assuming that A(x) increases from 0 to 1 with a fixed step, the evaluated values are shown in Fig. 2. From Fig. 2 we can see that $\mathrm{sinc}[1-A(x)]$ approximately equals A(x). In our proposed method, A(x,y) is the amplitude of the superposed complex amplitudes and is then normalized, which makes A(x,y) more uniform. Then $\mathrm{sinc}[1-A(x)]/A(x)$ is approximately a constant and $\exp\{j[1-A(x)]\pi\}$ is smoother. Therefore no optimization is required, which speeds up the calculation of the CGH.
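As a quick numerical check of this approximation (an illustrative sketch, not from the paper), the snippet below evaluates $K(x)$ on a grid of amplitudes; it shows that $|K(x)|$ varies slowly for moderate to large A(x) and that $\mathrm{sinc}[1-A(x)]$ roughly tracks A(x), consistent with the curves in Fig. 2.

```python
import numpy as np

A = np.linspace(0.05, 1.0, 20)                           # amplitude samples in (0, 1]
K = np.exp(1j * (1 - A) * np.pi) * np.sinc(1 - A) / A    # np.sinc(t) = sin(pi*t)/(pi*t)
print(np.round(np.abs(K), 2))          # |K| changes slowly once A is not too small
print(np.round(np.sinc(1 - A) - A, 2)) # sinc(1 - A) roughly tracks A (cf. Fig. 2)
```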

Fig. 2 The curves of |K(x)|, A(x) and the sinc function.

In the final reconstruction step, we employ a 4f system with a band-pass filter in the frequency plane [15] to filter out the unwanted terms in Eq. (6). When the unnecessary diffraction terms are carefully filtered, the 3D object image can be obtained with high quality. Our proposed method encodes one pure-phase hologram for a colorful 3D object, and the calculation needs no optimization. Therefore, the proposed method works for multiple wavelengths and can clearly reconstruct a 3D color image from one phase-only hologram.
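A numerical stand-in for this 4f filtering step can be sketched as follows (an illustrative sketch under assumed geometry, not the authors' code): the desired first orders sit on the carrier frequency sinψ/λ along the y direction introduced by the tilt factor, so a band-pass mask around that carrier keeps them while rejecting the zero orders and the cross-channel first orders, whose carriers lie along the x direction. The pass-band half-width used here is hypothetical.

```python
import numpy as np

def filter_first_order(recon_field, wavelength, psi, pitch, bandwidth=5e3):
    """Keep a rectangular spectral region centered on the carrier of the desired
    first order, v0 = sin(psi)/wavelength; `bandwidth` (cycles/m) is hypothetical."""
    n = recon_field.shape[0]
    f = np.fft.fftfreq(n, d=pitch)
    u, v = np.meshgrid(f, f, indexing="ij")      # u along x, v along y
    v0 = np.sin(psi) / wavelength                # carrier of the desired first order
    mask = (np.abs(v - v0) < bandwidth) & (np.abs(u) < bandwidth)
    return np.fft.ifft2(np.fft.fft2(recon_field) * mask)
```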

3. Numerical and optical experimental results

The full-color holographic display system is shown in Fig. 3. Three lasers with different wavelengths are collimated by collimators consisting of spatial filters and collimating lenses, and they illuminate a phase-only SLM (Holoeye Pluto VIS, fill factor 87%, pixel pitch 8.0 μm, resolution 1920 × 1080, 256 phase modulation levels) at different angles simultaneously. The SLM is placed at the front focal plane of L1. In the 4f system, the band-pass filter is placed at the back focal plane of L1, which is also the front focal plane of L2. In the reconstruction process, the desired first orders are picked up by the designed hole in the filter. The distance between the hole and the optic axis is given by $d=f_{L1}\sin\psi$, where $f_{L1}$ is the focal length of L1. The reconstructed color image is obtained on the output plane after the back focal plane of L2. A CCD (Lumenera INFINITY 4-11C) is used to record the experimental results. In addition, the scaling of the reconstructed image is determined by $f_{L2}/f_{L1}$, where $f_{L2}$ is the focal length of L2. In our experiment, the focal lengths of L1 and L2 are both 543 mm and their apertures are both 120 mm.
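For the parameters quoted here ($f_{L1}$ = 543 mm, ψ = 1.57°), a quick check of the filter-hole offset (our arithmetic, not stated explicitly in the text) gives
$$d=f_{L1}\sin\psi=543\ \text{mm}\times\sin 1.57^\circ\approx 14.9\ \text{mm},$$
so the hole sits roughly 15 mm off the optic axis in the frequency plane.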

Fig. 3 Setup of full-color holographic display system: BS is the beam splitter; SLM is the spatial light modulator; PC is the personal computer; L1 and L2 are the Fourier transform lenses; d is the distance between the small hole and the optic axis.

The selected ideal 2D color images with 1024 × 1024 pixels are shown in Figs. 4(a)-4(c). The main parameters used in the numerical simulations and the optical experiments are as follows: the sampling interval of the hologram is 8.0 μm × 8.0 μm; the wavelengths of the RGB reference beams are 671 nm (red), 532 nm (green), and 473 nm (blue); the distance between the object plane and the hologram plane is 500 mm; the incident angles θ of the red and blue reference beams are both 1.57°; the tilt angle ψ is 1.57°. Our program runs on a personal computer with a 2.6 GHz CPU under Matlab 2011b. The kinoform with a size of 1024 × 1024 pixels is numerically generated by the multiplexing encoding method. The time for calculating a hologram of a 2D color image with 1024 × 1024 pixels with our analytical method is about 1.32 seconds, and no iteration is needed.

Fig. 4 (a) Car image, (b) Parrot image and (c) Cup and teapot image are the ideal images; (d) Car image, (e) Parrot image and (f) Cup and teapot image are numerical results; (g) Car image, (h) Parrot image and (i) Cup and teapot image are optical experimental results.

We first evaluate the quality of the reconstructed color image by the peak signal-to-noise ratio (PSNR), defined as
$$\mathrm{PSNR\,(dB)}=10\log_{10}\!\left(255^2\Big/\frac{1}{MN}\sum_{M}\sum_{N}(I_o-I_r)^2\right),$$
where M and N are the horizontal and vertical numbers of pixels of the images, and $I_o$ and $I_r$ are the original image and the reconstructed image, respectively [11]. We calculate the PSNR for the R, G, and B components separately and then take their average as an overall evaluation of the RGB image. The numerically reconstructed color images are shown in Figs. 4(d)-4(f). The PSNRs of the numerically reconstructed images are 23.69 dB for the car image, 23.90 dB for the parrot image, and 21.18 dB for the cup and teapot image. It is clearly observed that the color images are reconstructed numerically with success. Note that the numerical results are a little darker than the original images because of light losses. The light loss is caused by the zero orders (labeled 0) and the unwanted first orders (labeled 2) shown in Fig. 7 and described by Eq. (6). If the numerically reconstructed images are normalized, they have the same brightness as the original images. The brightness of the images can also be improved by using high-power lasers in the optical experiment; we discuss this in detail in the discussion section.
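For reference, a short sketch of this per-channel PSNR evaluation (assuming 8-bit RGB images stored as numpy arrays of shape (M, N, 3); function names are illustrative, not the authors' code) is:

```python
import numpy as np

def psnr_channel(original, reconstructed):
    """PSNR of a single 8-bit channel."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

def psnr_rgb(original, reconstructed):
    """Average of the per-channel PSNRs, used as the overall figure of merit."""
    return np.mean([psnr_channel(original[..., c], reconstructed[..., c]) for c in range(3)])
```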

The phase-only hologram is then loaded on the phase-only SLM. In the optical experiment, a numerical zero-padding technique is used to eliminate the dispersion caused by the different wavelengths [6]. The relationship between the sampling numbers and the corresponding wavelengths in the RGB channels should satisfy $M_R:M_G:M_B=N_R:N_G:N_B=\lambda_R:\lambda_G:\lambda_B$, which indicates that a larger number of sampling points in the object plane is needed for a longer wavelength. We set the sampling number of the red channel to 1024 × 1024 pixels, and the sampling numbers of the green and blue channels are 812 × 812 pixels and 722 × 722 pixels, respectively. The redundant marginal regions in the green and blue channels are padded with zeros. The optical experimental results are shown in Figs. 4(g)-4(i). It is evident that full color is realized successfully, the image quality of the reconstructed scene is acceptable to the human eye, and the speckle noise is well suppressed. The optical experimental results are in good agreement with the computer simulations. In Fig. 4, the experimental results exhibit a slight color shift on close inspection, which does not exist in the numerical results. The color shift is caused by our optical experimental system and could be corrected by finely tuning the output power of the RGB lasers [19]. Moreover, the SLM is a dispersive device and has a different phase modulation depth for different wavelengths; this also causes a color shift, which can be minimized by a pre-compensated lookup table.
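A minimal sketch of this wavelength-scaled zero-padding (assuming the red channel is the 1024 × 1024 reference and that each channel image has already been resampled to its wavelength-scaled size; names are illustrative) is:

```python
import numpy as np

def pad_channel(channel_img, wavelength, ref_wavelength=671e-9, ref_size=1024):
    """channel_img is assumed to be already resampled to its wavelength-scaled size."""
    size = int(round(ref_size * wavelength / ref_wavelength))  # 812 px at 532 nm, 722 px at 473 nm
    assert channel_img.shape == (size, size)
    padded = np.zeros((ref_size, ref_size), dtype=channel_img.dtype)
    start = (ref_size - size) // 2
    padded[start:start + size, start:start + size] = channel_img  # zeros fill the margins
    return padded
```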

Now we consider a full-color 3D object, which can be divided into multiple 2D slices with RGB channels. Here, we take a full-color 3D object consisting of two slices as an example for simplicity. The CGH generating model is shown in Fig. 5. The distances between each object plane and the hologram plane are z1 = 650 mm and z2 = 500 mm. The hologram synthesized by the encoding method is then loaded onto the SLM. The numerically and optically reconstructed full-color 3D images at different distances are displayed in Figs. 6(a)-6(d). When the CCD (INFINITY 4-11C) focuses at 650 mm, the 'RMB' image is clear and the 'YGC' image is blurred, as shown in Figs. 6(a) and 6(c), and vice versa. It is easily seen that the 3D colorful image is reconstructed properly and high image quality is achieved. Note that the blue images in Figs. 6(c) and 6(d) are not aligned with the rest of the images, whereas the numerical results are fine. There are two reasons that can cause the shift of the blue in Figs. 6(c) and 6(d). The first is misalignment of the system or the reference beams, which can be corrected by adjusting the optical setup. The second is that the blue laser beam (shorter wavelength) may suffer more dispersion in our optical system, so the ideal blue plane wave could be distorted. This problem can be solved by employing achromatic optical elements in the optical system and shaping the blue laser beam into a standard plane wave.

Fig. 5 Simple model for RGB kinoform generation of full color 3D objects.

Fig. 6 The numerical and optical experimental reconstructed colorful 3D images: (a) and (b) are the numerical reconstructed results; (c) and (d) are the optical experimental results; (a) and (c) are focused on the 'RMB' image at 650 mm; (b) and (d) are focused on the 'YGC' image at 500 mm.

It is noted that the model can easily be extended to generate the kinoform of a complex full-color 3D object composed of more than two slices [6]. When more slices are used to decompose the 3D object, cross-talk may be introduced and the quality of the reconstruction will be slightly lower. However, it could be improved by considering occlusion culling [20].

4. Discussion

In our proposed method, the tilted phase factors synthesized into the hologram cause the higher diffraction orders shown in Fig. 7(a). Since unwanted images exist, the energy utilization efficiency must be considered. The energy efficiency is calculated numerically as $\eta_e=\left[\left(\sum_M\sum_N I_d\right)\big/\left(\sum_X\sum_Y I_t\right)\right]\times100\%$, where M and N are the horizontal and vertical numbers of pixels of the desired reconstructed image $I_d$, and X and Y are the horizontal and vertical numbers of pixels of the total reconstructed image $I_t$. The energy efficiency is 5.05% for the magic cube case. One can distinguish the details and colors of an object when the luminance exceeds 3 cd/m² [21]. In our system, the diffraction efficiency of the SLM is 60%. The required total luminous flux is given by $F\,(\mathrm{lm})=L\cdot S\cdot\Omega/(\eta_e\cdot\eta_{slm})$, where L (cd/m²) is the luminance, S (m²) is the area of the reconstructed image, Ω (sr) is the solid angle, and $\eta_{slm}$ is the diffraction efficiency of the SLM. In addition, the solid angle is approximately $\Omega\,(\mathrm{sr})\approx S/r^2$, where r is the distance of distinct vision (normally 0.25 m). Assuming that the luminance of the reconstructed image is 3 cd/m², the size of the reconstructed 2D image is 5 cm × 5 cm, and the energy efficiency is 5.05%, the required total luminous flux is 0.0099 lumen. According to the spectral luminous efficiency in photopic vision [21], one lumen corresponds to 45.75 milliwatts (mW) for red light at 671 nm, 1.66 mW for green light at 532 nm, and 97.48 mW for blue light at 473 nm. So the minimal powers for the RGB lasers are 0.453 mW, 0.016 mW, and 0.965 mW, respectively. In this calculation, the energy loss of the laser beams in propagation is neglected.
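The numbers above can be reproduced with the following short calculation (our illustrative sketch using only the values quoted in the text; propagation losses neglected as stated):

```python
L = 3.0                 # luminance, cd/m^2
S = 0.05 * 0.05         # reconstructed image area, m^2 (5 cm x 5 cm)
r = 0.25                # distance of distinct vision, m
omega = S / r ** 2      # solid angle, sr
eta_e, eta_slm = 0.0505, 0.60
F = L * S * omega / (eta_e * eta_slm)     # total luminous flux -> about 0.0099 lm
mw_per_lumen = {"red 671 nm": 45.75, "green 532 nm": 1.66, "blue 473 nm": 97.48}
for color, factor in mw_per_lumen.items():
    print(color, round(F * factor, 3), "mW")   # ~0.453, ~0.016, ~0.965 mW
```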

Fig. 7 Numerical reconstructed result of magic cube without filtering: (a) the items labeled 0 represent the zero orders, those labeled 1 the desired first orders, those labeled 2 the unwanted first orders, and the fuzzy patches the higher orders in Eq. (6); (b) the enlarged desired first order.

We can use higher-power lasers to improve the luminance of the image, especially for complex scenes. It can be seen from Figs. 7(a) and 7(b) that the color image can be reconstructed clearly because the unwanted images are separated from the desired image. The desired order can be picked up and the unnecessary diffraction light filtered out after the beam passes through the 4f filtering architecture shown in Fig. 3.

For real-time 3D holographic display, the calculation speed is important. Our proposed method is based on analytical formulas, so there is no iteration, and it does not take extra time to generate a hologram that reconstructs the 3D object in color with high image quality. A preview of an animated 2D projection is shown in Fig. 8 (Media 1). Dynamic 3D display can also be obtained easily.

Fig. 8 (Media 1) Experimental result of animated display.

5. Conclusion

A multiplexing encoding method is proposed in which the CGH is generated analytically. The numerical and experimental results both indicate that colorful 2D and 3D images can be reconstructed clearly. The optical system can be quite compact because a single phase-only SLM is used. The approach is simple and time-saving, and it is a promising way to realize dynamic full-color 3D holographic display in the future. It could also be applied to other complex-amplitude modulation tasks in multi-wavelength systems, such as color optical encryption and multi-wavelength diffractive optical elements.

Acknowledgment

This work was supported by the National Basic Research Program of China (973 Program Grant nos. 2013CB328801 and 2013CB328806), the National Natural Science Foundation of China (61235002).

References and links

1. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20(22), 25130–25136 (2012). [CrossRef]   [PubMed]  

2. M. Hacker, G. Stobrawa, and T. Feurer, “Iterative Fourier transform algorithm for phase-only pulse shaping,” Opt. Express 9(4), 191–199 (2001). [CrossRef]   [PubMed]  

3. M. Makowski, M. Sypek, and A. Kolodziejczyk, “Colorful reconstructions from a thin multi-plane phase hologram,” Opt. Express 16(15), 11618–11623 (2008). [PubMed]  

4. M. Makowski, M. Sypek, I. Ducin, A. Fajst, A. Siemion, J. Suszek, and A. Kolodziejczyk, “Experimental evaluation of a full-color compact lensless holographic display,” Opt. Express 17(23), 20840–20846 (2009). [CrossRef]   [PubMed]  

5. N. Yoshikawa and T. Yatagai, “Phase optimization of a kinoform by simulated annealing,” Appl. Opt. 33(5), 863–868 (1994). [CrossRef]   [PubMed]  

6. H. Zheng, T. Tao, L. Dai, and Y. Yu, “Holographic imaging of full-color real-existing three-dimensional objects with computer-generated sequential kinoforms,” Chin. Opt. Lett. 9(4), 040901 (2011). [CrossRef]  

7. A. Shiraki, N. Takada, M. Niwa, Y. Ichihashi, T. Shimobaba, N. Masuda, and T. Ito, “Simplified electroholographic color reconstruction system using graphics processing unit and liquid crystal display projector,” Opt. Express 17(18), 16038–16045 (2009). [CrossRef]   [PubMed]  

8. J. Jia, Y. Wang, J. Liu, X. Li, Y. Pan, Z. Sun, B. Zhang, Q. Zhao, and W. Jiang, “Reducing the memory usage for effective computer-generated hologram calculation using compressed look-up table in full-color holographic display,” Appl. Opt. 52(7), 1404–1412 (2013). [CrossRef]   [PubMed]  

9. X. Li, Y. Wang, J. Liu, J. Jia, Y. Pan, and J. Xie, “Color holographic display using a phase-only spatial light modulator,” presented at the 10th International Symposium on Display Holography, Hawaii, USA, 25–29 April 2013. [CrossRef]  

10. M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19(13), 12008–12013 (2011). [CrossRef]   [PubMed]  

11. T. Shimobaba, T. Takahashi, N. Masuda, and T. Ito, “Numerical study of color holographic projection using space-division method,” Opt. Express 19(11), 10287–10292 (2011). [CrossRef]   [PubMed]  

12. T. Ito and K. Okano, “Color electroholography by three colored reference lights simultaneously incident upon one hologram panel,” Opt. Express 12(18), 4320–4325 (2004). [CrossRef]   [PubMed]  

13. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, A. Kolodziejczyk, and M. Sypek, “Extremely simple holographic projection of color images,” Proc. SPIE 8280, 1–6 (2012). [CrossRef]  

14. M. Makowski, I. Ducin, M. Sypek, A. Siemion, A. Siemion, J. Suszek, and A. Kolodziejczyk, “Color image projection based on Fourier holograms,” Opt. Lett. 35(8), 1227–1229 (2010). [CrossRef]   [PubMed]  

15. J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, 1996), chap. 2.2.

16. H. Zhang, J. Xie, J. Liu, and Y. Wang, “Elimination of a zero-order beam induced by a pixelated spatial light modulator for holographic projection,” Appl. Opt. 48(30), 5834–5841 (2009). [CrossRef]   [PubMed]  

17. I. Moreno, J. Campos, C. Gorecki, and M. J. Yzuel, “Effects of amplitude and phase mismatching errors in the generation of a kinoform for pattern recognition,” Jpn. J. Appl. Phys. 34, 6423–6432 (1995). [CrossRef]  

18. J. A. Davis, D. M. Cottrell, J. Campos, M. J. Yzuel, and I. Moreno, “Encoding Amplitude Information onto Phase-Only Filters,” Appl. Opt. 38(23), 5004–5013 (1999). [CrossRef]   [PubMed]  

19. R. Shi, J. Liu, H. Zhao, Z. Wu, Y. Liu, Y. Hu, Y. Chen, J. Xie, and Y. Wang, “Chromatic dispersion correction in planar waveguide using one-layer volume holograms based on three-step exposure,” Appl. Opt. 51(20), 4703–4708 (2012). [CrossRef]   [PubMed]  

20. K. Wakunami, H. Yamashita, and M. Yamaguchi, “Occlusion culling for computer generated hologram based on ray-wavefront conversion,” Opt. Express 21(19), 21811–21822 (2013). [CrossRef]   [PubMed]  

21. N. Ohta and A. R. Robertson, Colorimetry: Fundamentals and Applications (John Wiley & Sons, 2005).

Supplementary Material (1)

Media 1: MOV (441 KB)     
