Abstract

In this manuscript, the astigmatism of a waveguide combiner with a pair of symmetric HOEs was analyzed. The light field can be predicted by a modified convolution formulation of Fresnel diffraction when the information light passes through the astigmatism-causing element, and the astigmatism can then be corrected. The theory was experimentally verified with a system comprising a phase-only SLM and a diffractive planar waveguide. Furthermore, the image quality of the astigmatism-corrected phase-type CGHs can be improved via an iteration process. Since a coherent light source was employed, a temporal averaging method was utilized to suppress speckle noise.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Display technology has been developed for decades. In order to provide information more completely and vividly, 3D displays have been attracting growing attention in recent years. The most direct way to achieve 3D display is to provide 2D images with different viewing angles to the two eyes, an approach known as autostereoscopic 3D display [1]. However, 3D displays without depth information are prone to cause the vergence–accommodation conflict (VAC) [2]. In order to avoid the VAC effect, a 3D display with depth information is necessary. The holographic technique is a fundamental method for achieving natural 3D image display because the depth information can be reconstructed via wavefront recording [3,4]. Traditionally, the wavefront of the information light was recorded on photosensitive materials to achieve static display.

In recent years, many teams have proposed research related to the computer-generated hologram (CGH) technique to achieve dynamic holographic display [5–11]. In this technique, the light field of the information light on the hologram plane is predicted by numerical calculation. The information will then be reconstructed well if the complex amplitude distribution is reproduced exactly in the experiment. Furthermore, 3D displays can be achieved by reproducing the light fields of dynamic objects sequentially. However, reproducing a complex amplitude is difficult with current technology. The general method is to reproduce the phase distribution of the light field via a phase-only spatial light modulator (SLM). The phase-only SLM can spatially modulate the phase of the reading beam and reconstruct the wavefront of the information light. However, eliminating the amplitude distribution degrades the image quality. In order to resolve this issue, several studies proposed achieving a complex amplitude distribution using two groups of phase distributions [12–14], but this requires occupying more pixels. Moreover, several methods were proposed to improve the image quality using a single phase distribution [15–19]. The quality degradation caused by amplitude elimination can be remedied by an iteration process. Among them, the temporal averaging method can improve image quality further [17–19]. However, an SLM with gray-scale 2π modulation is difficult to operate at a high frame rate because of its long response time. Therefore, a method based on a high-speed ferroelectric liquid crystal display (FLCD) was also proposed [19]. This method improved the image quality of binary phase distributions, in which the phase modulation takes only the values 0 and π. The storage requirement of a binary hologram is less than that of a gray-scale hologram, and it allows the FLCD to operate at a high frame rate.

The disadvantage of the CGH technique is that SLMs have difficulty achieving a large display area. The display area of present SLMs is too small to provide intact images with long eye relief. Therefore, a traditional tabletop display incorporating the CGH technique is difficult to achieve. Given the short eye-relief requirement and the diffraction angle of current SLMs, the CGH technique is suitable for incorporation into AR head-mounted displays (HMDs). Therefore, several studies combining AR HMDs and the CGH technique were proposed recently [20–22]. The current issue with AR HMDs is that additional elements such as half-mirrors are required to achieve the see-through function, which makes the HMD device bulky. In order to resolve this issue, waveguide combiners have attracted growing attention recently [23,24]. The information light is guided in a thin planar waveguide and coupled into the human eye via diffractive optical elements (DOEs) or holographic optical elements (HOEs). The see-through function can then be achieved via a lightweight element. Furthermore, another advantage of the diffractive waveguide combiner is that it can expand the exit pupil when the virtual image is at infinity [25,26].

The disadvantage of the diffractive waveguide is that images at near distances are blurred by astigmatism. Recently, the astigmatism of waveguides with two symmetric HOEs [27] and with a single HOE [28] was analyzed using the ray-tracing method. For the former, the astigmatism was independent of the image distance. For the latter, the total diffraction efficiency is higher, but an anamorphic effect occurs and the astigmatism depends on the image distance. In both cases, the variation of astigmatism with viewing angle could be ignored. The astigmatism in these two works was corrected by a modified point-based algorithm and an astigmatic lens phase, respectively. Finally, astigmatism-free images were obtained, but the image quality still had room for improvement.

In this manuscript, an iteration algorithm for image-quality improvement based on diffractive waveguide elements is proposed. The astigmatism of the waveguide elements can be analyzed by ray tracing. Different from the prior literature, the astigmatism was compensated by a modified convolution Fresnel diffraction method, which performs light propagation in an astigmatic space. Furthermore, the proposed propagation method was proved to improve the image quality via an iteration process. In the experiment, a phase-only SLM was utilized to reproduce the phase distribution. The reconstructed images were temporally averaged over 20 holograms to suppress speckle noise. Finally, the contribution of the iteration process based on the proposed propagation method was demonstrated for both 8-bit gray-scale phase distributions and binary phase distributions. We expect that this technique can operate well on both a traditional phase-only SLM and a high-speed FLCD.

2. Astigmatism correction algorithm

2.1 Astigmatism analysis

In this paper, a general waveguide HMD device, as shown in Fig. 1, was utilized to provide holographic images. The light source is a diode-pumped solid-state (DPSS) laser with linearly polarized light at 532 nm. The half-wave plate (HWP) was utilized to match the polarization to the SLM, and the spatial filter (SF) was employed to block the noise of the laser. The abbreviation BS denotes a beam splitter. A phase-only SLM (JD955B, Jasper Display) with 6.4 µm pixel pitch and a convergent reading beam were employed to provide the holographic information. The resolution of the CGHs, which is determined by the SLM, is 1920 by 1080 pixels. A glass waveguide with a pair of symmetric HOEs was used to guide the information to the human eye. The thickness of the waveguide is 8 mm and its refractive index is 1.52. The HOEs were recorded by two collimated beams on a photopolymer material (C-RT20, Litiholo) with 16 µm thickness. The included angles between the normal and the beams in the glass are 0 degrees and 50 degrees, respectively. The distance between HOE1 and HOE2 is about 3 cm. When the convergent beam probes the SLM, the reconstructed information and the DC noise are coupled into the waveguide by HOE1 simultaneously. Then the information light and the DC noise are both coupled out of the waveguide by HOE2. However, the DC noise can be blocked easily because it is focused on a barrier. The observer therefore obtains clear information without DC noise.

Fig. 1. The HMD system composed of a CGH system and a holographic waveguide with astigmatism.

According to previous studies, a linear grating causes the effective propagation distances in the x-z plane and the y-z plane to differ. This makes the image planes in the x-z plane and the y-z plane separate from each other, and the images are blurred by the resulting serious astigmatism. In this paper, the astigmatism was analyzed based on ray tracing. In the simulation, the light propagation direction was reversed relative to Fig. 1. An aberration-free holographic image was assumed to be located 350 mm behind HOE2. When the light of the holographic image is incident on HOE2 in the +z direction, the light exits from HOE1 in the +z direction and forms an astigmatism-induced intermediate image. Conversely, the human eye obtains an astigmatism-free virtual image when the light of this astigmatism-induced intermediate image, provided by the SLM, is incident on HOE1 in the -z direction. Finally, the distance between the final virtual image and HOE2 will be 350 mm if the system is arranged rigorously.

In the simulation, the intermediate image was composed of the imaging points of virtual image points at different field heights. The position of each intermediate imaging point was defined by the intersection of the two marginal rays, defined as the rays passing through the edge of an 8 mm diameter eye pupil. Figure 2 shows the schematic when the ray fan spreads in the x-z plane. The diffraction angle for each incident ray can be calculated by H. Kogelnik's coupled wave theory [27], from which the position of the intermediate image plane can be determined. Since the distance between the SLM and the waveguide in the experiment is known (about 6 cm), the distance between the intermediate image plane in the x-z plane and the SLM (${d_{x - z}}$) can be calculated. Similarly, the distance between the SLM and the intermediate image plane in the y-z plane (${d_{y - z}}$) can be calculated when the ray fan spreading in the y-z plane is considered. Notice that the components of the grating vector in the x-direction and the y-direction are different. Therefore, ${d_{x - z}}$ and ${d_{y - z}}$ differ for an arbitrary image field.
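As a minimal illustrative sketch (not the authors' code), the intermediate image point in one plane can be located by tilting the two marginal rays with the grating equation and intersecting them. The grating period and incidence angles below are hypothetical placeholders, and a full treatment would use Kogelnik's coupled wave theory as described above.

```python
import numpy as np

# Hypothetical parameters for illustration only (not the recorded HOE values).
WAVELENGTH = 532e-9       # vacuum wavelength, m
N_GLASS = 1.52            # waveguide refractive index
GRATING_PERIOD = 0.5e-6   # linear grating period, m (placeholder)

def diffracted_angle(theta_in, order=1):
    """First-order grating equation inside the glass (angles in radians)."""
    s = np.sin(theta_in) + order * WAVELENGTH / (N_GLASS * GRATING_PERIOD)
    return np.arcsin(s)

def intersect(p1, d1, p2, d2):
    """Intersection point of two 2-D rays p + t*d (marginal-ray crossing)."""
    p1, d1, p2, d2 = map(np.asarray, (p1, d1, p2, d2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1, t2.
    t = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t[0] * d1

# Two marginal rays through the edges of an 8 mm pupil acquire different
# diffracted angles at the grating; their crossing gives the image point.
a1 = diffracted_angle(np.radians(2.0))
a2 = diffracted_angle(np.radians(-2.0))
point = intersect([-0.004, 0.0], [np.sin(a1), np.cos(a1)],
                  [+0.004, 0.0], [np.sin(a2), np.cos(a2)])
```

Repeating this for ray fans spread in the x-z and y-z planes (with the corresponding grating-vector components) yields the two plane distances denoted ${d_{x - z}}$ and ${d_{y - z}}$.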

Fig. 2. The schematic diagram of the simulation configuration.

In order to enhance credibility, the above theory was implemented in the mathematics software Matlab and compared to the results of the ray-tracing software Zemax. The simulation results are shown in Fig. 3; the Zemax and Matlab results matched each other. Since the component of the grating vector in the y-direction is 0, ${d_{x - z}}$ and ${d_{y - z}}$ are independent of the field height in the y-direction. As shown in Fig. 3(a), ${d_{x - z}}$ and ${d_{y - z}}$ for arbitrary field heights in the y-direction were 227 mm and 262 mm, indicating that the intermediate imaging points in the x-z plane and the y-z plane are located 227 mm and 262 mm to the right of the SLM, respectively. On the other hand, when the incident angle of the information ray was changed in the x-z plane, the diffracted angle in the waveguide changed non-linearly. This caused the separation between the two intermediate image planes to change with the field height in the x-direction, as shown in Fig. 3(b). The separation at the central part is 35 mm, and the discrepancy of the separation is about ±4.6 mm. In this study, this discrepancy was ignored, and the separation of the two image planes is taken to be independent of the virtual image distance. Accordingly, the relationship between the two intermediate image planes can be described, for arbitrary field height and virtual image distance, as

$${d_{y - z}} = {d_{x - z}} + 35\,\textrm{mm}.$$

Fig. 3. The relationship between the relative position of the intermediate image and the object heights in the (a) y-direction; (b) x-direction.

2.2 Iteration ping-pong algorithm in astigmatic space

The CGH algorithm proposed in this paper is based on convolution Fresnel diffraction. In order to compensate the astigmatism, a Fresnel diffraction formula with astigmatism is necessary. Assume the light fields at the image plane and the hologram plane are denoted ${u_{i}}({x,y} )$ and ${u_{h}}({x,y} )$, respectively. The light field propagating from the hologram plane to the image plane through an astigmatic space can be described as

$${u_{i}}({x,y} )= {u_{h}}({x,y} )\otimes exp\left[ {\frac{{i\pi }}{\lambda }\left( {\frac{{{x^2}}}{{{d_{x - z}}}} + \frac{{{y^2}}}{{{d_{y - z}}}}} \right)} \right]. $$
Conversely, the light field inverse-Fresnel-propagating from the image plane to the hologram plane can be described as
$${u_{h}}({x,y} )= {u_{i}}({x,y} )\otimes exp\left[ {\frac{{ - i\pi }}{\lambda }\left( {\frac{{{x^2}}}{{{d_{x - z}}}} + \frac{{{y^2}}}{{{d_{y - z}}}}} \right)} \right]. $$
Then ${u_{h}}({x,y} )$ can be reformed as an amplitude, phase, or complex-amplitude type CGH. If the inverse propagation is matched to Eq. (1), the astigmatism caused by the waveguide shown in Fig. 1 can be compensated exactly. Accordingly, a clear image can be obtained if ${u_{i}}({x,y} )$ is an astigmatism-free amplitude distribution. The experimental results of the system in Fig. 1 with phase-type CGHs computed without and with astigmatism correction are shown in Figs. 4(a) and 4(b), respectively. Note that the phase distribution should be multiplied by the conjugate phase of the reading beam; this procedure is necessary to cancel the effect of the convergent spherical wavefront. The resulting images in Figs. 4(a) and 4(b) were captured at the same position, and the corresponding CGHs were computed with the same target object and the same ${d_{y - z}}$. Therefore, the spacing and the image quality of the horizontal lines in Figs. 4(a) and 4(b) are equal to each other. For traditional Fresnel propagation, ${d_{x - z}}$ was set equal to ${d_{y - z}}$; the vertical lines then became blurred because of astigmatism, as shown in Fig. 4(a). Conversely, a clear image was obtained, as shown in Fig. 4(b), when the Fresnel propagation matched Eq. (1). This proved that both the aberration analysis and the Fresnel propagation with astigmatism work well.
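The propagation of Eqs. (2) and (3) can be sketched numerically by evaluating the convolution in the Fourier domain with the analytic transfer function of the astigmatic Fresnel chirp. This is an illustrative implementation in NumPy, not the authors' code; the function name is ours.

```python
import numpy as np

def astig_fresnel(u, wavelength, pitch, d_xz, d_yz):
    """Convolution Fresnel propagation through an astigmatic space, Eq. (2),
    evaluated in the Fourier domain; negating both distances gives the
    inverse propagation of Eq. (3)."""
    ny, nx = u.shape
    fx = np.fft.fftfreq(nx, d=pitch)   # spatial frequencies along x
    fy = np.fft.fftfreq(ny, d=pitch)   # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    # Analytic Fourier transform of the astigmatic Fresnel kernel: note the
    # separate effective distances for the x and y chirps.
    H = np.exp(-1j * np.pi * wavelength * (d_xz * FX**2 + d_yz * FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)
```

Since the transfer function has unit modulus, propagating forward with (d_xz, d_yz) = (227 mm, 262 mm) and then backward with the negated distances recovers the input field exactly, which makes the pair suitable for the iteration process of Section 2.2.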

Fig. 4. The resulting images obtained (a) without astigmatism correction; (b) with astigmatism correction in the experiment.

The proposed method employs the iteration ping-pong algorithm to enhance image quality. The flow diagram of the proposed iteration process is shown in Fig. 5. The superscript k in the formulas stands for the iteration number. The subscripts i and h correspond to functions in the image plane and the hologram plane, respectively. Assume the amplitude of the astigmatism-free final virtual image is $|{o({x,y} )} |$. The initial light field in the image plane can be described as

$$u_{i}^{1}({x,y} )= |{o({x,y} )} |exp\{{ip_{i}^{1}({x,y} )} \}, $$

Fig. 5. The iteration process, propagating in the astigmatic space, was employed to improve image quality.

where $exp\{{ip_{i}^{1}({x,y} )} \}$ is the random phase distribution of a diffusing surface. When $u_{i}^{1}({x,y} )$ is inverse-propagated through the astigmatic space to the hologram plane, the field can be described as

$$u_{h}^{1}({x,y} )= |{a_{h}^{1}({x,y} )} |exp\{{ip_{h}^{1}({x,y} )} \}. $$
Since the SLM modulates only the phase distribution and not the amplitude, the amplitude in the hologram plane should be replaced by 1. The phase distribution should be quantized into gray levels according to the number of gray levels of the SLM. Then the light field can be described as
$$u_{h,g}^{1}({x,y} )= exp\{{ip_{h,g}^{1}({x,y} )} \}. $$
Furthermore, the reconstructed result can be predicted by propagating the light field back to the image plane. When $u_{h,g}^{k}({x,y} )$ propagates through the astigmatic space back to the image plane, the light field can be described as
$$u_{i}^{k}({x,y} )= |{a_{i}^{k}({x,y} )} |exp\{{ip_{i}^{k}({x,y} )} \}. $$
Then the amplitude distribution $|{a_{i}^{k}({x,y} )} |$ is replaced by the original amplitude $|{o({x,y} )} |$ before stepping into the next iteration. When the iteration process is completed, the phase-type CGH $u_{h,g}^{k}({x,y} )$ is obtained.
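The loop of Eqs. (4)–(7) can be sketched as follows. This is our illustrative reading of the algorithm, with function names, the Fourier-domain propagator, and default parameter values (taken from the system described above) chosen by us.

```python
import numpy as np

def astig_propagate(u, wavelength, pitch, d_xz, d_yz):
    # Astigmatic Fresnel propagation evaluated in the Fourier domain.
    ny, nx = u.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, pitch), np.fft.fftfreq(ny, pitch))
    H = np.exp(-1j * np.pi * wavelength * (d_xz * FX**2 + d_yz * FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

def ping_pong_cgh(target_amp, wavelength=532e-9, pitch=6.4e-6,
                  d_xz=0.227, d_yz=0.262, iterations=20, levels=256,
                  seed=0):
    """Iterative ping-pong CGH computed in the astigmatic space."""
    rng = np.random.default_rng(seed)
    # Eq. (4): target amplitude with a random diffuser phase.
    u_i = target_amp * np.exp(2j * np.pi * rng.random(target_amp.shape))
    for _ in range(iterations):
        # Eq. (5): inverse-propagate to the hologram plane.
        u_h = astig_propagate(u_i, wavelength, pitch, -d_xz, -d_yz)
        # Eq. (6): phase-only constraint with gray-level quantization.
        q = np.round((np.angle(u_h) % (2 * np.pi)) / (2 * np.pi) * levels)
        u_hg = np.exp(1j * (q % levels) / levels * 2 * np.pi)
        # Eq. (7): propagate back, then reimpose the target amplitude.
        u_i = astig_propagate(u_hg, wavelength, pitch, d_xz, d_yz)
        u_i = target_amp * np.exp(1j * np.angle(u_i))
    return u_hg  # final phase-type CGH
```

Each iteration alternates between the two planes, keeping the quantized phase in the hologram plane and the target amplitude in the image plane.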

Generally, the image will be reconstructed exactly when the phase distribution is reproduced well in the experiment. In the following experiment, the number of gray levels was 255 and the phase modulation ranged from 0 to 2π. The results were compared for 1 and 20 iterations. Since the reading beam in this system was not collimated, the spherical wavefront had to be compensated on the SLM; the compensation method is the same as in the former experiment and Ref. [28]. In order to reduce speckle noise, the following results were averaged over 20 holograms, with a different random phase distribution $exp\{{ip_{i}^{1}({x,y} )} \}$ substituted into each hologram.
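The benefit of averaging over 20 holograms follows from basic speckle statistics: for fully developed speckle the intensity contrast of a single frame is close to 1, and averaging N statistically independent frames lowers it to about 1/√N (≈0.22 for N = 20). A small self-contained check of this, using synthetic circular-Gaussian speckle rather than the experimental data:

```python
import numpy as np

def speckle_contrast(intensity):
    """Speckle contrast C = std(I) / mean(I)."""
    return intensity.std() / intensity.mean()

rng = np.random.default_rng(0)
# 20 independent fully developed speckle intensity frames (synthetic).
fields = rng.normal(size=(20, 256, 256)) + 1j * rng.normal(size=(20, 256, 256))
frames = np.abs(fields) ** 2
c_single = speckle_contrast(frames[0])          # ~1 for a single frame
c_avg = speckle_contrast(frames.mean(axis=0))   # ~1/sqrt(20) after averaging
```

In the experiment, the independence of the frames comes from the different random diffuser phases substituted into each of the 20 holograms.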

3. Experimental results

In the experiment, the results were recorded by a camera (Nikon, D90) with a lens kit (Nikon, Zoom-NIKKOR 35-70mm 1:3.5-4.8). The U.S. Air Force resolution test chart shown in Fig. 6 was utilized to test the quality of the proposed method. The finest linewidth is 140 µm, located in the 6th element of group 1. Since the pixel pitch of the SLM is 6.4 µm, the finest linewidth corresponds to 22 pixels in the algorithm. The amplitude distribution of the white pattern in Fig. 6 was substituted into Eq. (4). In the propagation and inverse propagation, ${d_{x - z}}$ and ${d_{y - z}}$ were 227 mm and 262 mm, respectively. The experimental results are shown in Fig. 7.

Fig. 6. The resolution test chart.

Fig. 7. The resulting image provided by the proposed method with 20 iterations.

The reconstructed image and a ruler located 350 mm behind HOE2 were clearly captured simultaneously. The resulting image proved that the final image is located at the specific position decided in the simulation. The finest part, with a linewidth of 140 µm, was captured clearly at a distance of 35 cm; this shows that the resolution can reach 3 arc minutes.
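As a quick sanity check on the quoted figure (our arithmetic, under the assumption that the angular resolution refers to one line pair, i.e. two 140 µm widths, viewed at 350 mm):

```python
import numpy as np

# Angular subtense of one line pair (2 x 140 um) at 350 mm, in arc minutes.
line_pair = 2 * 140e-6   # m
distance = 0.35          # m
arcmin = np.degrees(np.arctan(line_pair / distance)) * 60  # ~2.75 arcmin
```

This gives about 2.75 arc minutes, consistent with the roughly 3 arc minutes stated above.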

In order to figure out the contribution of the proposed algorithm, a comparison of different operating conditions is shown in Fig. 8. The resulting image without astigmatism correction, shown in Fig. 8(a), was blurred by astigmatism. The resulting images of the astigmatism correction algorithm with 1 and 20 iterations are shown in Figs. 8(b) and 8(c). The blurred area became clearer in both Figs. 8(b) and 8(c). The uniformity and image quality were enhanced by the iteration process; in particular, the brightness of the edge area was increased.

Fig. 8. The reconstructed images of the CGHs (a) without correction; (b) with astigmatism correction and 1 iteration; (c) with astigmatism correction and 20 iterations.

Figure 9 shows the normalized gray values of the horizontal lines in different regions. The normalized gray value was defined as the average gray value of the lines divided by the average value of the central square. The values of the lines in group 1, element 6 are shown in Fig. 9(a). The contrast ratio was improved obviously with astigmatism correction, and the iteration process improved the contrast ratio and brightness further. However, the spatial frequency approached the resolution limit of the CCD; therefore, the gray value of the dark region with 20 iterations was increased a bit. The improvement of the proposed method was more obvious in group 0, element 1, as shown in Fig. 9(b). The correction and the iteration increased the slope of the gray value at the edges of the lines and made the line shapes sharper.

Fig. 9. The normalized gray value distributions of the horizontal lines in (a) group 1, element 6; (b) group 0, element 1, compared between different experimental results.

Furthermore, the feasibility of binary CGHs based on the proposed method was demonstrated in this study. The algorithm was performed with only 2 gray levels; that is, the gray-scaling step was replaced by binarization. The phase modulation of the two gray values was 0 and π. The experimental results for the finest region with 1 iteration and 20 iterations are shown in Figs. 10(a) and 10(b), respectively. The resulting image with 20 iterations is obviously clearer than the former one.
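The binarization step can be sketched as replacing the gray-level quantization with a nearest-of-two choice between phase 0 and π, i.e. hologram field values +1 and -1. An illustrative helper, named by us:

```python
import numpy as np

def binarize_phase(u_h):
    """Quantize the hologram phase to {0, pi}: the field becomes +1 where
    the phase is nearer to 0 and -1 where it is nearer to pi."""
    return np.where(np.cos(np.angle(u_h)) >= 0, 1.0 + 0j, -1.0 + 0j)
```

Substituting this for the gray-scaling step in the iteration yields the binary CGHs, whose two-level format suits high-frame-rate FLCD operation as noted in the introduction.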

Fig. 10. The experimental results of the astigmatism-corrected binary CGHs with (a) 1 iteration; (b) 20 iterations.

Since the astigmatism is almost independent of the virtual image distance, the calculation of holograms with different propagation distances should also match Eq. (1). A gray-scale hologram recording 3 images at different distances was utilized to provide 3D images. The characters “3”, “D” and “NCUE” were recorded with ${d_{y - z}}$ equal to 262, 462 and 662 mm, respectively, so the images should be located 350, 550 and 750 mm behind HOE2, respectively. In order to verify the distances, a ruler, a blue plastic post and a cardboard with a drawing of a cartoon mosquito were placed at 350, 550 and 750 mm. The resulting images, shown in Fig. 11, were obtained clearly at the specific positions.

Fig. 11. The 3D information reconstructed via the gray-scale CGH at (a) 350 mm; (b) 550 mm; (c) 750 mm.

4. Conclusions

An optimization method of CGHs for diffractive waveguides was proposed in this paper. The astigmatism of the waveguide elements can be analyzed by the ray-tracing method. The modified convolution formula, which includes the astigmatism parameters, produced the phase distribution of an astigmatic object, and the astigmatism of the waveguide element was thereby compensated. Furthermore, the iteration process improved the image quality of the phase-only system. Considering the potential of binary CGHs in high-speed SLM systems, the proposed method was also validated with the binary operation.

Funding

Ministry of Science and Technology, Taiwan (MOST 107-2221-E-018-005, MOST 108-2221-E-018-018-MY3).

References

1. N. A. Dodgson, “Autostereoscopic 3D displays,” Computer 38(8), 31–36 (2005).

2. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8(3), 33 (2008).

3. D. Gabor, “A new microscopic principle,” Nature 161(4098), 777–778 (1948).

4. E. N. Leith and J. Upatnieks, “Reconstructed wavefronts and communication theory,” J. Opt. Soc. Am. 52(10), 1123–1130 (1962).

5. I. Ducin, T. Shimobaba, M. Makowski, K. Kakarenko, A. Kowalczyk, J. Suszek, M. Bieda, A. Kolodziejczyk, and M. Sypek, “Holographic projection of images with step-less zoom and noise suppression by pixel separation,” Opt. Commun. 340, 131–135 (2015).

6. T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24(3), 2189–2199 (2016).

7. S.-F. Lin and E.-S. Kim, “Single SLM full-color holographic 3-D display based on sampling and selective frequency-filtering methods,” Opt. Express 25(10), 11389–11404 (2017).

8. Y. Ogihara and Y. Sakamoto, “Fast calculation method of a CGH for a patch model using a point-based method,” Appl. Opt. 54(1), A76–A83 (2015).

9. T. Shimobaba, K. Matsushima, T. Kakue, N. Masuda, and T. Ito, “Scaled angular spectrum method,” Opt. Lett. 37(19), 4128–4130 (2012).

10. K. Matsushima, “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44(22), 4607–4614 (2005).

11. K. Matsushima, H. Schimmel, and F. Wyrowski, “Fast calculation method for optical diffraction on tilted planes by use of the angular spectrum of plane waves,” J. Opt. Soc. Am. A 20(9), 1755–1762 (2003).

12. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25(7), 8412–8424 (2017).

13. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21(18), 20577–20587 (2013).

14. X. Li, J. Liu, T. Zhao, and Y. Wang, “Color dynamic holographic display with wide viewing angle by improved complex amplitude modulation,” Opt. Express 26(3), 2349–2358 (2018).

15. R. G. Dorsch, A. W. Lohmann, and S. Sinzinger, “Fresnel ping-pong algorithm for two-plane computer-generated hologram display,” Appl. Opt. 33(5), 869–875 (1994).

16. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20(22), 25130–25136 (2012).

17. T. Shimobaba, M. Makowski, T. Kakue, M. Oikawa, N. Okada, Y. Endo, R. Hirayama, and T. Ito, “Lensless zoomable holographic projection using scaled Fresnel diffraction,” Opt. Express 21(21), 25285–25290 (2013).

18. M. Sypek, A. Kolodziejczyk, and G. Mikuła, “Three-plane phase-only computer hologram generated with iterative Fresnel algorithm,” Opt. Eng. 44(12), 125805 (2005).

19. K. Masuda, Y. Saita, R. Toritani, P. Xia, K. Nitta, and O. Matoba, “Improvement of image quality of 3D display by using optimized binary phase modulation and intensity accumulation,” J. Disp. Technol. 12(5), 472–477 (2016).

20. J.-S. Chen and D. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23(14), 18143–18155 (2015).

21. Z. Chen, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).

22. E. Murakami, Y. Oguro, and Y. Sakamoto, “Study on compact head-mounted display system using electro-holography for augmented reality,” IEICE Trans. Electron. 100(11), 965–971 (2017).

23. J.-A. Piao, G. Li, M.-L. Piao, and N. Kim, “Full color holographic optical element fabrication for waveguide-type head mounted display using photopolymer,” J. Opt. Soc. Korea 17(3), 242–248 (2013).

24. M.-L. Piao and N. Kim, “Achieving high levels of color uniformity and optical efficiency for a wedge-shaped waveguide head-mounted display using a photopolymer,” Appl. Opt. 53(10), 2180–2186 (2014).

25. L. Eisen, M. Meyklyar, M. Golub, A. A. Friesem, I. Gurwich, and V. Weiss, “Planar configuration for image projection,” Appl. Opt. 45(17), 4005–4011 (2006).

26. T. Levola, “Diffractive optics for virtual reality displays,” J. Soc. Inf. Disp. 14(5), 467–475 (2006).

27. H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23(25), 32025–32034 (2015).

28. W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism and deformation correction for a holographic head-mounted display with a wedge-shaped holographic waveguide,” Appl. Opt. 57(25), 7094–7101 (2018).

References

  • View by:
  • |
  • |
  • |

  1. N. A. Dodgson, “Autostereoscopic 3D displays,” Computer 38(8), 31–36 (2005).
    [Crossref]
  2. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8(3), 33 (2008).
    [Crossref]
  3. D. Gabor, “A new microscopic principle,” Nature 161(4098), 777–778 (1948).
    [Crossref]
  4. E. N. Leith and J. Upatnieks, “Reconstructed wavefronts and communication theory,” J. Opt. Soc. Am. 52(10), 1123–1130 (1962).
    [Crossref]
  5. I. Ducin, T. Shimobaba, M. Makowski, K. Kakarenko, A. Kowalczyk, J. Suszek, M. Bieda, A. Kolodziejczyk, and M. Sypek, “Holographic projection of images with step-less zoom and noise suppression by pixel separation,” Opt. Commun. 340, 131–135 (2015).
    [Crossref]
  6. T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24(3), 2189–2199 (2016).
    [Crossref]
  7. S.-F. Lin and E.-S. Kim, “Single SLM full-color holographic 3-D display based on sampling and selective frequency-filtering methods,” Opt. Express 25(10), 11389–11404 (2017).
    [Crossref]
  8. Y. Ogihara and Y. Sakamoto, “Fast calculation method of a CGH for a patch model using a point-based method,” Appl. Opt. 54(1), A76–A83 (2015).
    [Crossref]
  9. T. Shimobaba, K. Matsushima, T. Kakue, N. Masuda, and T. Ito, “Scaled angular spectrum method,” Opt. Lett. 37(19), 4128–4130 (2012).
    [Crossref]
  10. K. Matsushima, “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44(22), 4607–4614 (2005).
    [Crossref]
  11. K. Matsushima, H. Schimmel, and F. Wyrowski, “Fast calculation method for optical diffraction on tilted planes by use of the angular spectrum of plane waves,” J. Opt. Soc. Am. A 20(9), 1755–1762 (2003).
    [Crossref]
  12. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25(7), 8412–8424 (2017).
    [Crossref]
  13. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21(18), 20577–20587 (2013).
    [Crossref]
  14. X. Li, J. Liu, T. Zhao, and Y. Wang, “Color dynamic holographic display with wide viewing angle by improved complex amplitude modulation,” Opt. Express 26(3), 2349–2358 (2018).
  15. R. G. Dorsch, A. W. Lohmann, and S. Sinzinger, “Fresnel ping-pong algorithm for two-plane computer-generated hologram display,” Appl. Opt. 33(5), 869–875 (1994).
  16. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20(22), 25130–25136 (2012).
  17. T. Shimobaba, M. Makowski, T. Kakue, M. Oikawa, N. Okada, Y. Endo, R. Hirayama, and T. Ito, “Lensless zoomable holographic projection using scaled Fresnel diffraction,” Opt. Express 21(21), 25285–25290 (2013).
  18. M. Sypek, A. Kolodziejczyk, and G. Mikuła, “Three-plane phase-only computer hologram generated with iterative Fresnel algorithm,” Opt. Eng. 44(12), 125805 (2005).
  19. K. Masuda, Y. Saita, R. Toritani, P. Xia, K. Nitta, and O. Matoba, “Improvement of image quality of 3D display by using optimized binary phase modulation and intensity accumulation,” J. Disp. Technol. 12(5), 472–477 (2016).
  20. J.-S. Chen and D. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23(14), 18143–18155 (2015).
  21. Z. Chen, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).
  22. E. Murakami, Y. Oguro, and Y. Sakamoto, “Study on compact head-mounted display system using electro-holography for augmented reality,” IEICE Trans. Electron. 100(11), 965–971 (2017).
  23. J.-A. Piao, G. Li, M.-L. Piao, and N. Kim, “Full color holographic optical element fabrication for waveguide-type head mounted display using photopolymer,” J. Opt. Soc. Korea 17(3), 242–248 (2013).
  24. M.-L. Piao and N. Kim, “Achieving high levels of color uniformity and optical efficiency for a wedge-shaped waveguide head-mounted display using a photopolymer,” Appl. Opt. 53(10), 2180–2186 (2014).
  25. L. Eisen, M. Meyklyar, M. Golub, A. A. Friesem, I. Gurwich, and V. Weiss, “Planar configuration for image projection,” Appl. Opt. 45(17), 4005–4011 (2006).
  26. T. Levola, “Diffractive optics for virtual reality displays,” J. Soc. Inf. Disp. 14(5), 467–475 (2006).
  27. H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23(25), 32025–32034 (2015).
  28. W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism and deformation correction for a holographic head-mounted display with a wedge-shaped holographic waveguide,” Appl. Opt. 57(25), 7094–7101 (2018).

Figures (11)

Fig. 1. The HMD system, composed of a CGH system and a holographic waveguide with astigmatism.
Fig. 2. The schematic diagram of the simulation configuration.
Fig. 3. The relationship between the relative position of the intermediate image and the object height in the (a) y-direction; (b) x-direction.
Fig. 4. The resulting image obtained (a) without astigmatism correction; (b) with astigmatism correction in the experiment.
Fig. 5. The iteration process, propagating in the astigmatic space, employed to improve image quality.
Fig. 6. The resolution test chart.
Fig. 7. The resulting image provided by the proposed method with 20 iterations.
Fig. 8. The reconstructed images of the CGHs (a) without correction; (b) with astigmatism correction and 1 iteration; (c) with astigmatism correction and 20 iterations.
Fig. 9. The normalized gray-value distributions of the horizontal lines in (a) group 1, element 6; (b) group 0, element 1, compared between the different experimental results.
Fig. 10. The experimental results of the astigmatism-corrected binary CGHs with (a) 1 iteration; (b) 20 iterations.
Fig. 11. The 3D information reconstructed via the gray-scale CGH at (a) 350 mm; (b) 550 mm; (c) 750 mm.

Equations (7)


$$d_{yz} = d_{xz} + 35,$$
$$u_i(x, y) = u_h(x, y) \otimes \exp\!\left[\frac{i\pi}{\lambda}\left(\frac{x^2}{d_{xz}} + \frac{y^2}{d_{yz}}\right)\right],$$
$$u_h(x, y) = u_i(x, y) \otimes \exp\!\left[-\frac{i\pi}{\lambda}\left(\frac{x^2}{d_{xz}} + \frac{y^2}{d_{yz}}\right)\right],$$
$$u_i^1(x, y) = |o(x, y)| \exp\{i\, p_i^1(x, y)\},$$
$$u_h^1(x, y) = |a_h^1(x, y)| \exp\{i\, p_h^1(x, y)\},$$
$$u_{h,g}^1(x, y) = \exp\{i\, p_{h,g}^1(x, y)\},$$
$$u_i^k(x, y) = |a_i^k(x, y)| \exp\{i\, p_i^k(x, y)\}.$$
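As an illustrative sketch (not the authors' code), the astigmatic Fresnel propagation and the phase-only iteration described by these equations can be realized numerically with an FFT-based transfer function that uses separate effective distances along x and y. The function names, wavelength, pixel pitch, and distances below (e.g. λ = 532 nm, 8 µm pitch, d_xz = 350 mm, giving d_yz = 385 mm per the first equation, with units in metres) are hypothetical values chosen for the example only.

```python
import numpy as np

def astigmatic_fresnel(u, wavelength, pitch, d_xz, d_yz):
    """Fresnel propagation of field u with different effective distances
    along x (d_xz) and y (d_yz), via the convolution (transfer-function)
    formulation evaluated in the frequency domain. Negative distances
    perform the inverse (back) propagation."""
    ny, nx = u.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)  # shapes (ny, nx)
    # Fourier transform of the astigmatic Fresnel chirp kernel
    H = np.exp(-1j * np.pi * wavelength * (d_xz * FX**2 + d_yz * FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

def iterate_phase_hologram(target_amp, wavelength, pitch, d_xz, d_yz, n_iter=20):
    """Gerchberg-Saxton-style loop: enforce the target amplitude |o(x,y)|
    at the image plane and a unit-amplitude (phase-only) constraint at the
    hologram plane, propagating through the astigmatic space each pass."""
    rng = np.random.default_rng(0)
    u_i = target_amp * np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    for _ in range(n_iter):
        # back-propagate to the hologram plane and keep only the phase
        u_h = astigmatic_fresnel(u_i, wavelength, pitch, -d_xz, -d_yz)
        u_h = np.exp(1j * np.angle(u_h))
        # forward-propagate and re-impose the target amplitude
        u_i = astigmatic_fresnel(u_h, wavelength, pitch, d_xz, d_yz)
        u_i = target_amp * np.exp(1j * np.angle(u_i))
    return np.angle(u_h)
```

Because the transfer function has unit modulus, propagating with distances (d_xz, d_yz) and then (−d_xz, −d_yz) recovers the input field exactly up to floating-point error, which is what makes the back-and-forth iteration self-consistent.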
