
Using high-diffraction-efficiency holographic optical elements in a full-color augmented reality display system

Open Access

Abstract

Holographic optical elements (HOEs) play an important role in augmented reality (AR) systems. However, full-color HOEs are difficult to fabricate and their diffraction efficiency is low. In this paper, we use the time-scheduled iterative exposure method to fabricate full-color HOEs with high diffraction efficiency. With this method, a full-color HOE with an average diffraction efficiency of 73.4% was implemented in a single photopolymer, the highest value yet reported. In addition, the AR system is simulated by a geometric-optics method combining the k-vector circle with ray tracing, and is constructed by combining laser micro-projection with the high-diffraction-efficiency HOE. A good color blending effect was achieved in the full-color AR system by using a reconstruction wavelength consistent with the recording light, and the system presents clear holographic images in a full-color AR display.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Displays are a new intelligent human-computer interaction platform that has entered all domains of life. With the miniaturization of display devices, augmented reality (AR) systems are entering the public sphere. In an AR system, the light of virtual images and of real scenes is imaged simultaneously on the retina of the human eye, thereby implementing augmented reality. AR systems are mainly used in near-eye displays (NEDs) and head-up displays (HUDs) [1]. Such a system must transmit the light of real scenes and of virtual images from the micro-display to the human eye to achieve a more realistic human-computer interaction. Since the micro-display cannot block the human eye, an optical-beam-combining element is needed, which must also provide additional focal power to optically magnify the image. Recently, most combining elements have adopted the structure of an optical waveguide [2] or implement free-space beam combining [3,4]. At present, the most studied technical route is the near-eye display system based on optical waveguides, which confines light within a flat glass plate a few millimeters thick and transmits it to the human eye, making it easier to shape the system as eyeglasses [5]. However, this structure limits the field of view and the eye box range of the system. Generally, the eye box range is increased by expanding the exit pupil, which sacrifices image efficiency and also leads to the problem of uneven images [2].

Another technical route is to project the image directly into the human eye through free space. This method can yield a large field angle while ensuring uniformity and high efficiency [6]. The free-space combiner is an off-axis system and includes free-space catadioptric optical elements [7], metasurfaces [8-10], and holographic lenses (HLs) [11]. Holographic optical elements (HOEs) have the advantages of small size, light weight, and the ability to record an arbitrary wavefront, which make them promising transparent imaging elements [12]. The wavefront of the object light is recorded in the form of a sub-wavelength phase grating by interference in the photopolymer. The reconstruction process modulates the wavefront of the incident light to reconstruct the desired wavefront. Although the polymer surface is not curved, it can still focus light, which enables a more compact optical structure. In addition, the HOE is an optical element based on the diffraction principle: its wavelength and angle selectivity ensure high transmittance of real scenes and high diffraction of the virtual images from the micro-display. Stray light passes directly through without diffraction, achieving high real-scene transmittance while maintaining high image contrast [13].

Recently, off-axis HLs have been reported as free-space beam-combining elements in NEDs. Jang et al. [14] used HOEs to generate 3D images in an AR system. Later, Lee et al. [15] optimized the free-space system based on HOEs and proposed an astigmatism compensation method. However, for HL optical systems made of full-color HOEs, there remain considerable difficulties in both the fabrication process and the optical design [16]. Current research on full-color HOEs is mainly based on the holographic optical waveguide structure. Researchers at Southeast University of China have systematically studied the diffraction response of holographic gratings using rigorous coupled-wave theory and built a prototype full-color holographic optical waveguide [17,18]. Piao et al. [19,20] fabricated a full-color holographic coupling grating with an optical waveguide structure by sequential exposure and optimized its diffraction efficiency. An HL based on a full-color HOE can directly provide optical focusing power, further compressing the volume of the system, and can ensure high uniformity, efficiency, field of view, and eye box range for the virtual image. However, it places higher requirements on the design and fabrication of full-color HOEs and must be used with a narrow-linewidth display; there has been relatively little work in this field.

In this study, an experimental recording platform for full-color holographic optical elements was built. A high-diffraction-efficiency full-color HOE was fabricated by the time-scheduled iterative exposure method. The recorded full-color HOE had sufficient angular bandwidth, and the average peak diffraction efficiency reached 73.4%. Combining the full-color HOE with laser micro-projection, we realized an AR prototype. The system was evaluated from three aspects: astigmatism, distortion, and the modulation transfer function (MTF) curve. A good color blending effect was achieved in the full-color AR system by using a reconstruction wavelength consistent with the recording light.

2. Imaging principle of a full-color HOE

First, the imaging properties of an HL based on a full-color HOE are described by the imaging equations, the position of the image point is predicted by the k-vector circle and ray-tracing simulation, and the imaging process is analyzed.

2.1 Imaging process of full-color HLs

A full-color HL is an HOE obtained by the interference of two spherical wavefronts. This recording method enables the element to perform lens imaging. An off-axis recorded HL has two optical axes: the object and image axes are not collinear, so the imaging process differs from that of a coaxial optical element.

In actual HL imaging, only one plane (xz in Fig. 1) is tilted, which leads to different inclination angles of the meridian (xz) and sagittal (yz) planes and different distances to the two image planes, resulting in astigmatism. The lateral magnifications of the meridian and sagittal planes can be written as [21]:

$$M_x = \frac{\cos\theta_{ox}R_{ix}}{\cos\theta_{ix}R_{ox}}$$
$$M_y = \frac{\cos\theta_{oy}R_{iy}}{\cos\theta_{iy}R_{oy}}$$
where Rs and Rr are the radii of curvature of the signal light and reference light, respectively, Ro and Ri are the radii of curvature of the object point and image point, respectively, θs and θr are the tilt angles of the signal light and reference light in the recording process, respectively, and θo and θi are the tilt angles of the object point and image point, respectively. Unlike a traditional coaxial imaging element, the paraxial magnification of an HL depends on the angle of the reconstructed object point. When the field of view (FOV) is large, this introduces large distortion and noticeably deforms the image.
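To make the astigmatism argument concrete, the following short Python sketch evaluates Eqs. (1) and (2) for one plane with a tilt and one without. All numerical values here are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def lateral_magnification(theta_o, theta_i, R_o, R_i):
    """Lateral magnification in one plane of the off-axis HL, Eqs. (1)-(2):
    M = (cos(theta_o) * R_i) / (cos(theta_i) * R_o)."""
    return (np.cos(theta_o) * R_i) / (np.cos(theta_i) * R_o)

# Hypothetical example: tilted meridian (xz) plane, untilted sagittal (yz) plane.
R_o, R_i = 50.0, 500.0   # object / image radii of curvature in mm (illustrative)
M_x = lateral_magnification(np.deg2rad(45), np.deg2rad(40), R_o, R_i)  # meridian plane
M_y = lateral_magnification(0.0, 0.0, R_o, R_i)                        # sagittal plane
print(f"M_x = {M_x:.2f}, M_y = {M_y:.2f}")  # unequal values reflect the astigmatic imaging
```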

Fig. 1. Schematic diagram of the HL recording wavefront

According to the imaging process of the HL, the object-image relationship of the imaging system is shown in Fig. 2.

Fig. 2. Imaging principle diagram of a single HL

The object surface is a micro-display whose center lies on the object optical axis, as does the corresponding image-point center. When the object distance is approximately equal to one focal length, a virtual image at infinity is obtained. According to the imaging system in Fig. 2, the FOV can be calculated by [21]:

$$\mathrm{FOV} = 2\arctan\left(\frac{h}{2f\cos\theta}\right)$$
where h is the size of the object plane, f is the focal length of the HL, set by the radius of curvature of the signal light at the recording position, and θ is the tilt angle of the optical axis.
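As a quick numerical check of Eq. (3), the sketch below evaluates the FOV for an assumed object size, focal length, and tilt angle; the numbers are placeholders for illustration only and are not the parameters of the built system.

```python
import numpy as np

def fov_deg(h, f, theta_deg):
    """Full field of view of the off-axis HL, Eq. (3): FOV = 2*arctan(h / (2 f cos(theta)))."""
    theta = np.deg2rad(theta_deg)
    return np.rad2deg(2.0 * np.arctan(h / (2.0 * f * np.cos(theta))))

# Illustrative values: an 8 mm object plane, 25 mm focal length, 45 deg optical-axis tilt.
print(f"FOV = {fov_deg(h=8.0, f=25.0, theta_deg=45.0):.1f} deg")  # ~25.5 deg in this toy case
```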

2.2 Simulation of the imaging process of full-color HL

HL imaging is simulated by ray tracing. First, the effect of each local grating of the HL on light is determined by the k-vector circle; this effect is essentially a change in the direction of the incident light. The k-vector circle principle of recording and reconstruction is shown in Fig. 3. To simplify the numerical representation, mainly the two-dimensional recording case is discussed.

Fig. 3. K-vector circle of the HOE: (a) recording process and (b) reconstruction process.

The recording process forms a grating vector distribution through the interference of the object light and the reference light. Each local grating region can be defined by a grating vector $\overrightarrow {{k_g}} $ in Fig. 3(a), whose expression is given by:

$$\overrightarrow {{k_g}} = \overrightarrow {{k_s}} - \overrightarrow {{k_r}} $$
where $\overrightarrow {{k_s}} $ and $\overrightarrow {{k_r}} $ are the wavefront vectors of the signal light and reference light. In the actual reconstruction process, the wavelength and angle may differ from the recording conditions. To simplify the simulation, an equivalent Bragg condition is used to obtain the equivalent recording condition for the same grating vector at a different wavelength, as shown in Fig. 3(a). Here ${\theta _{r^{\prime}}}$ is the tilt angle of the reference light under the equivalent Bragg condition and $\overrightarrow {{k_{r^{\prime}}}} $ is the corresponding wavefront vector of the reference light; ${\theta _{s^{\prime}}}$ is the tilt angle of the signal light under the equivalent Bragg condition and $\overrightarrow {{k_{s^{\prime}}}} $ is the corresponding wavefront vector of the signal light. By shifting the grating vector from the recording wave-vector circle to the reconstruction wave-vector circle, the equivalent recording condition is obtained as if the recording and reconstruction wavelengths were consistent. Bragg-mismatched reconstruction can then be judged qualitatively from the angle differences.

When the wavelength or angle of the reconstruction light is inconsistent with the reference light, the diffracted light deviates from the original signal light direction, which is called Bragg-mismatched wavefront reconstruction in Fig. 3(b). According to Kogelnik's coupled-wave theory [22], the diffraction efficiency decreases, but the diffracted light still exists. Here the volume grating contributes an additional Bragg mismatch quantity $\overrightarrow {\varDelta k} $ acting on the reconstruction light, and the wavefront vector $\overrightarrow {{k_d}} $ of the diffracted light can be expressed as:

$$\overrightarrow {{k_d}} = \overrightarrow {{k_g}} + \overrightarrow {{k_p}} + \overrightarrow {\varDelta k} $$

The reconstruction wavefront equation can be written as [21]:

$$\sin\theta_d = \sin\theta_p + \frac{k_s}{k_p}\sin\theta_s - \frac{k_r}{k_p}\sin\theta_r$$
where $\overrightarrow {{k_d}} $ and $\overrightarrow {{k_p}} $ are the wavefront vectors of the diffracted light and reconstruction light, respectively. Thus, the effect of an HOE on light of any wavelength and any direction can be obtained, and the imaging process can be simulated by ray tracing. The numerical simulation of the HOE imaging system is carried out with MATLAB as the programming environment. The principle of ray tracing of holographic lens imaging is as follows. The wavefront vector $\overrightarrow {{k_p}} $ of the object ray determines its propagation direction. First, the intersection of the object ray and the HOE is calculated. The recorded wavefront vectors $\overrightarrow {{k_s}} $ and $\overrightarrow {{k_r}} $ of the HOE at this point are obtained, and the grating vector $\overrightarrow {{k_g}} $ at each position of the HOE is calculated by Eq. (4). Finally, the wavefront vector $\overrightarrow {{k_d}} $ of the diffracted ray corresponding to each $\overrightarrow {{k_p}} $ is predicted by Eq. (6). By tracing a large number of $\overrightarrow {{k_p}} $, the imaging behavior can be determined.
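The authors implemented this simulation in MATLAB; as an illustration of the per-ray step only, the following Python sketch evaluates Eq. (6) for a single local grating, first for Bragg-matched replay and then for a slightly detuned wavelength, which reproduces the direction shift behind the color-separation effect simulated in Fig. 4. The angles and the 550 nm detuned wavelength are assumed values.

```python
import numpy as np

def diffracted_angle(theta_p, lam_p, theta_s, theta_r, lam_rec):
    """Direction of the diffracted ray from a local grating, Eq. (6):
    sin(theta_d) = sin(theta_p) + (k_s/k_p) sin(theta_s) - (k_r/k_p) sin(theta_r),
    with k = 2*pi/lambda and k_s = k_r = 2*pi/lam_rec for a single-wavelength recording."""
    ratio = lam_p / lam_rec          # k_s/k_p = k_r/k_p = lambda_p / lambda_rec
    s = np.sin(theta_p) + ratio * (np.sin(theta_s) - np.sin(theta_r))
    return np.arcsin(s)              # assumes |s| <= 1, i.e. a propagating diffracted ray

# Bragg-matched replay: same wavelength and same angle as the reference beam
# reproduces the recorded signal direction (45 deg -> 0 deg here, illustrative geometry).
theta_d = diffracted_angle(theta_p=np.deg2rad(45), lam_p=532e-9,
                           theta_s=np.deg2rad(0), theta_r=np.deg2rad(45), lam_rec=532e-9)
print(np.rad2deg(theta_d))           # ~0 deg, the original signal direction

# Replaying the green-recorded grating at a slightly different wavelength shifts the
# diffracted direction, which is why mismatched RGB channels separate in space (Fig. 4).
theta_d_mis = diffracted_angle(np.deg2rad(45), 550e-9, np.deg2rad(0), np.deg2rad(45), 532e-9)
print(np.rad2deg(theta_d_mis))       # about -1.4 deg for this assumed detuning
```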

When the reconstruction wavelength is not consistent with the recording wavelength, the recording-reconstruction wavelength offsets of the three sub-grating channels are different, resulting in the RGB images being separated into different angles in space. This phenomenon can be shown intuitively by simulation. Red and green light are taken as an example in Fig. 4, where the red and green lines show their respective imaging processes, the light blue lines are the light emitted by object points at different FOV angles, and the black line is the surface of the HOE. It can be seen that, because their deflection angles are different, the images do not overlap in space and color matching cannot be achieved. Thus, a reconstruction wavelength consistent with the recording wavelength is used in this experiment. The eye box is the range of viewing positions from which a complete image can be seen; in Fig. 4, it is defined as the overlapping region of the image rays from different FOV angles.

Fig. 4. The imaging process of different sub-gratings.

The imaging results in the xz and yz planes are simulated by ray tracing, as shown in Fig. 5. Comparing the imaging distances in Fig. 5(a) and Fig. 5(b), it can be seen that the system has astigmatism, which is consistent with the theoretical prediction. The cyan lines represent the object rays, the green lines represent the diffracted rays from on-axis object points, and the red and blue lines represent the diffracted rays from off-axis object points.

Fig. 5. The imaging results simulated by ray tracing: (a) xz plane and (b) yz plane.

3. AR system with a full-color HOE

The AR system with an HL based on a full-color HOE is shown in Fig. 6. A narrow-spectrum laser source that satisfies the diffraction condition of the full-color HOE uniformly illuminates the spatial light modulator through a homogenizer, and the image is then formed on a screen through the collimating imaging lens. Finally, the virtual image is formed behind the full-color HOE, while the broad-spectrum light from the real scene passes directly through the full-color HOE without diffraction. The human eye sees the virtual image and the real scene at the same time within the eye box area, and the AR effect is perceived. Following this schematic optical path, we built a laser AR system based on a full-color HOE.

Fig. 6. Schematic diagram of the AR system with full-color HL (SLM: spatial light modulator).

3.1 Fabrication of full-color HOE with high diffraction efficiency

The fabrication device was constructed as shown in Fig. 7.

Fig. 7. Fabrication device of a full-color HOE (ES: electric shutter, HWP: half wave plate, M: mirror, DM: dichroic mirror, PBS: polarization beam splitter, L: lens, Medium: photopolymer).

To ensure sufficient coherence length, the red, green, and blue sources are single-longitudinal-mode lasers with high coherence. The recording wavelengths are 639 nm (MSL-FN-639-200 mW, produced by Changchun New Industries), 532 nm (RN-SLM-532nm-50 mW, produced by Beijing Ranbond Technology), and 473 nm (MSL-FN-473-100 mW, produced by Changchun New Industries), respectively. The three laser beams are combined by a mirror and two dichroic mirrors to produce white light. Amplitude-splitting interference is realized by a polarization beam splitter, and HWP4 is used to ensure that the two paths of the PBS have the same polarization state. A shutter is placed in front of each laser to control the exposure time of each path, and the HWP in each path is used to continuously adjust the splitting ratio of the PBS, so that the power densities of the two beams irradiating the photopolymer (PP-P-G, produced by Beijing Hope-rainbow Technology) are equal and the interference is strongest. To increase the effective aperture of the HL, the beams of the two interference arms are expanded through a telescope system. The signal light is modulated into a spherical wave by L5 to realize the fabrication of the full-color HOE.

Recently, two technical routes for the fabrication of full-color HOEs have been reported: simultaneous exposure and time-scheduled iterative exposure. In simultaneous exposure, the RGB laser light sources irradiate the holographic medium in the same time interval, so that the exposure dose of each wavelength is controlled by the exposure time and laser intensity. The power density of the simultaneous exposure method is large, which accelerates monomer polymerization, terminates the polymerization reaction quickly, and prevents the highest refractive index modulation (RIM) from being reached, limiting the peak diffraction efficiency of the full-color HOE. In addition, the simultaneous exposure method cannot adjust the exposure time of each color channel individually, and the power of a single-longitudinal-mode laser is generally not adjustable, which leads to different exposure powers for the different colors. The diffraction efficiencies of the three channels of the full-color HOE are therefore not uniform and cannot be adjusted. For these reasons, the time-scheduled iterative exposure method is used in this experiment.

Conversely, in time-scheduled iterative exposure, only one wavelength of light irradiates the photopolymer in a given time period, which keeps the monomer polymerization rate appropriate and allows the exposure time ratio of each color to be controlled separately, providing room to optimize the diffraction efficiency uniformity of the three RGB channels. The main drawback of time-scheduled iterative exposure is that the recording time is long, so it is strongly affected by system vibration. To obtain a full-color HOE with high diffraction efficiency and uniformity, the exposure recording process of the three colors must be further optimized. The overall optimization idea is to find an initial reference point and then iteratively change the exposure times around this reference point. Considering the photosensitive properties of the holographic medium, the exposure doses of red, green, and blue are 150 mJ/cm2, 30 mJ/cm2, and 30 mJ/cm2, respectively. The photosensitivity of the medium to blue and green light is similar, while its photosensitivity to red light is about one-fifth of that to blue and green light. Therefore, to ensure a sufficient red response, the relative power density of red light must be increased; this should be taken into account when selecting single-longitudinal-mode lasers for exposure. The power densities of the three colors can be fine-tuned by placing attenuators. The power densities of the red, green, and blue exposures are set to 4.1 mW/cm2, 0.9 mW/cm2, and 0.5 mW/cm2, respectively. Next, the initial single exposure time is set; each single exposure of a given color in the iterative sequence uses the same time. From the exposure dose values, the total exposure time of each light is determined, so that the relative proportions of the red, green, and blue exposure times can be calculated. Since the human eye is most sensitive to green light, whose wavelength lies between those of red and blue, the green exposure time is selected as the reference point. Based on an exposure experiment, the initial single exposure time of green light is set to 600 ms; one exposure each of red, green, and blue constitutes a cycle, and a total of 55 cycles of exposure is determined. The initial single exposure times of red and blue light were set to 600 ms and 750 ms, respectively.
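The scheduling arithmetic described above can be summarized in a few lines. The sketch below uses the doses and power densities quoted in the text and the 600 ms green reference; the per-cycle times it derives for red and blue are only first guesses that the experiment then refines (the paper's tuned values are 600 ms for red and 750 ms for blue).

```python
# Scheduling arithmetic behind the time-scheduled iterative exposure (values from the text).
doses = {"R": 150.0, "G": 30.0, "B": 30.0}   # exposure dose, mJ/cm^2
power = {"R": 4.1,   "G": 0.9,  "B": 0.5}    # power density, mW/cm^2

# Total exposure time per color in seconds, since mJ/cm^2 divided by mW/cm^2 gives seconds.
total_time = {c: doses[c] / power[c] for c in doses}
print(total_time)                            # R ~36.6 s, G ~33.3 s, B ~60.0 s

# Green is the reference channel: with a 600 ms single exposure, the recording runs for
# int(total_time["G"] / 0.6) = 55 cycles, matching the 55 cycles quoted in the text.
cycles = int(total_time["G"] / 0.6)
print(cycles)

# A first guess for the other channels spreads their total time over the same cycle count;
# the final recipe then tunes these values experimentally.
single = {c: total_time[c] / cycles for c in ("R", "B")}
print({c: f"{t * 1000:.0f} ms" for c, t in single.items()})
```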

According to Kogelnik's coupled-wave theory [22], the RIM value of the medium is the most important factor contributing to diffraction efficiency, and the exposure energy is one of the major factors affecting the RIM. In these experiments, the recording beam intensity remains unchanged; by changing the exposure time of the different wavelengths, the appropriate exposure energy of each was found to achieve the high RIM corresponding to high diffraction efficiency. The theoretical efficiency of a volume reflection hologram reconstructed under the Bragg condition can be written as:

$$\eta = \tanh^2\left(\frac{\pi d n}{\lambda\sqrt{c_r c_s}}\right)$$
where d is the thickness of the photopolymer, n is the RIM, λ is the reconstruction wavelength, cr is the direction cosine of the angle of incidence (cr = 1), and cs is the direction cosine of the reference-beam angle (cs = cos 45°).

The above time-sequential exposure process is realized by three shutters (GCI-7101 M, produced by Daheng Optics) and a precision electronic timer (GCI-73 M, produced by Daheng Optics). The response time of the shutters is within 1 ms, which can be ignored. The total exposure time of the full-color HL is about 2.5 minutes, and the total fabrication time of a full-color HOE is about 20 minutes, including laser stabilization and post-processing. The peak diffraction efficiency of the full-color HOE for different exposure times is shown in Table 1.

Table 1. Peak diffraction efficiency of the full-color HOE with different exposure times

In Experiment 1, the peak diffraction efficiency of blue light is close to 100%, while that of red light is almost zero. In Experiment 2, the peak diffraction efficiency of red light is improved by increasing the exposure time of red light. However, the peak diffraction efficiency of blue light does not change much, so the exposure time of blue light was reduced in Experiment 3. In Experiment 3, the peak diffraction efficiency of red light is close to 100%, while that of blue light is zero and that of green light remains at 30% to 40%. In Experiment 4, the exposure time of red light is reduced and that of blue light is increased, and a satisfactory diffraction efficiency is thereby obtained. The variation of the diffraction efficiency of the full-color HOE in Experiment 4 with Bragg-condition angle offset is shown in Fig. 8, and the corresponding measurement setup is shown in Fig. 9. The incident light passes through a small aperture diaphragm and then enters the HOE. The diffracted light and transmitted light are measured by OPM1 (Thorlabs S140C) and OPM2 (Thorlabs S142C), respectively. The intensities of the diffracted light ID and the transmitted light IT are recorded simultaneously to prevent errors caused by unstable light-source power. Neglecting absorption in the medium, the diffraction efficiency can be written as:

$$\eta (\%)= \frac{{{I_D}}}{{{I_D} + {I_T}}} \times 100$$

Fig. 8. The variation curve of diffraction efficiency of full-color HOE with Bragg condition angle offset.

Fig. 9. The diffraction efficiency measurement device diagram of HOE with Bragg condition angle offset (OPM: optical power meter).

As shown in Fig. 8, the diffraction efficiencies of the red, green, and blue light of a single full-color HOE were 84.4%, 69.4%, and 66.5%, respectively. The average peak diffraction efficiency reached 73.4%, the highest value yet reported, with a standard deviation of 9.6%; the uniformity may still be improved. The angular bandwidth is 12°, sufficient to ensure the size of the eye box of the system. Previous reports of full-color HOEs with high diffraction efficiency are summarized in Table 2.

Table 2. Reported parameters of full-color HOEs with high diffraction efficiency

The photopolymer thickness d provided by Beijing Hope-rainbow Technology is 18 µm. Substituting d into Eq. (7), the RIMs of the three wavelengths in Table 1 are calculated. The RIM ranges for the red, green, and blue wavelengths are 0.0006 to 0.0254, 0.0052 to 0.0094, and 0.0004 to 0.0218, respectively. In Experiment 4, the RIMs for the red, green, and blue wavelengths are 0.0150, 0.0094, and 0.0080, respectively, consistent with the measured diffraction efficiencies.
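As a consistency check, the RIM values quoted above can be recovered by inverting Eq. (7) with the measured peak efficiencies of Experiment 4; the sketch below assumes d = 18 µm, cr = 1, and cs = cos 45° as stated in the text.

```python
import numpy as np

def rim_from_efficiency(eta, lam, d=18e-6, c_r=1.0, c_s=np.cos(np.deg2rad(45))):
    """Invert Eq. (7), eta = tanh^2(pi*d*n / (lambda*sqrt(c_r*c_s))),
    for the refractive index modulation n of a Bragg-matched volume reflection hologram."""
    return lam * np.sqrt(c_r * c_s) * np.arctanh(np.sqrt(eta)) / (np.pi * d)

# Peak efficiencies of Experiment 4 (Fig. 8) at the recording wavelengths of Section 3.1.
for name, lam, eta in [("R", 639e-9, 0.844), ("G", 532e-9, 0.694), ("B", 473e-9, 0.665)]:
    print(name, f"RIM = {rim_from_efficiency(eta, lam):.4f}")
# -> roughly 0.0150, 0.0094, 0.0080, matching the values quoted in the text.
```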

The wavelength selectivity of the HOE was evaluated by measuring its transmission spectra under illumination at the Bragg angle with a UV-visible spectrophotometer (UV-2700, produced by SHIMADZU), as shown in Fig. 10.

Fig. 10. The transmission spectra of full-color HOE at the Bragg angle condition.

It can be seen that the HOE is wavelength selective and keeps the transmission of the real scene high, as required for AR. A slight shift of the peak wavelengths is observed in Fig. 10, with a relative displacement below 1%; this is attributed to shrinkage of the photopolymer. The full width at half maximum (FWHM) at the three wavelengths is about 10 nm.

3.2 Construction of a laser AR system based on a full-color HOE

According to the simulation in Section 2.2, using a reconstruction wavelength consistent with the recording wavelength allows the red, green, and blue images to be fully fused in space and gives the best color effect, so the reconstruction light sources are the same single-longitudinal-mode lasers used to fabricate the full-color HOE. In Fig. 7, a collimating lens is placed in front of the PBS to couple the laser output into a transmission fiber with a 400 µm core. Because the red laser is relatively intense, HWP4 is added behind HWP2 to adjust the intensity of the red laser and thus the color of the light coupled into the fiber, as shown in Fig. 11.

Fig. 11. Fiber coupling diagram of single longitudinal mode laser.

Following the layout in Fig. 6, a prototype laser AR system combining laser micro-projection and a full-color HOE was built. As shown in Fig. 12, the output end of the coupling fiber was connected to the input of a micro-projection engine to produce the color image output; the color image was relayed to a projection scattering screen through the collimating lens, and the color virtual image was then imaged to the human eye by the full-color HOE. To show the effect of the laser AR system, a real object was placed behind the full-color HOE as a reference.

Fig. 12. Prototype of projection laser AR system.

A color CCD camera was used in place of the human eye to capture images. The size of the eye box of the display system was obtained by translating the camera, and the FOV of the system was obtained by measuring the size of the image. The RGB images captured by the CCD and the color-matching effect of their combination are shown in Fig. 13.

Fig. 13. The image effect of full-color HOE imaging: (a) green, (b) red, (c) blue, (d) yellow, (e) cyan, (f) magenta, (g) white, and (h) original input image.

Here, Fig. 13(h) is the original image input from the computer, Figs. 13(a)-(c) are monochrome images, Figs. 13(d)-(f) are mixed images of two colors, and Fig. 13(g) is a white image mixing the three primary colors. From Fig. 13, it can be seen that the display effect is good when the recording wavelength is consistent with the reconstruction wavelength. The system can output different color images with high diffraction efficiency and meets the basic functions of a display system. The superposition of the virtual image and the real scene is shown in Fig. 14. The color mismatch at the edge of the image is caused by the uneven diffraction efficiency at the edge of the HOE.

Fig. 14. The superposition of the virtual image and real scene.

From Fig. 14, it can be seen that the color mixing of the system is good and that real and virtual images are fused, achieving the AR effect.

The sharpness of the HL imaging provided by the full-color HOE is evaluated next. The MTF was measured in the meridian and sagittal directions by the contrast method. Cosine grating images with different spatial frequencies were generated by computer, and white light fused from the three-color lasers was used as the incident light for the measurement. The measured MTF curves are shown in Fig. 15.

Fig. 15. MTF measurement curve of full-color HOE imaging system.

Limited by the resolution of the measurement projector (1080p), the maximum spatial frequency of the measurement cannot exceed 10 cy/mm. It can be seen that the imaging in the y direction of the system is better than in the x direction, because the meridian plane is recorded off-axis while the sagittal plane is recorded on-axis, so the off-axis aberration of the meridian plane is larger than that of the sagittal plane. Based on the criterion that the human eye can distinguish a contrast of 30% [27], the MTF values of both the sagittal and meridian planes are still greater than 30% at a spatial frequency of 10 cy/mm. The system thus meets the basic resolution requirements of the human eye.
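For reference, the contrast-method MTF measurement described above reduces to the following computation; the synthetic profiles below stand in for the displayed cosine grating and the CCD capture and are assumptions for illustration only.

```python
import numpy as np

def michelson_contrast(profile):
    """Michelson contrast (I_max - I_min) / (I_max + I_min) of a 1-D intensity profile."""
    i_max, i_min = np.max(profile), np.min(profile)
    return (i_max - i_min) / (i_max + i_min)

def mtf_contrast_method(captured, displayed):
    """MTF at one spatial frequency: ratio of captured contrast to displayed contrast."""
    return michelson_contrast(captured) / michelson_contrast(displayed)

# Synthetic illustration: a unit-contrast cosine grating and a blurred, lower-contrast capture.
x = np.linspace(0.0, 1.0, 1000)
displayed = 0.5 + 0.5 * np.cos(2 * np.pi * 10 * x)   # 10 cycles across the profile
captured  = 0.5 + 0.2 * np.cos(2 * np.pi * 10 * x)   # stand-in for the CCD measurement
print(f"MTF = {mtf_contrast_method(captured, displayed):.2f}")   # -> 0.40 in this toy case
```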

4. Analysis and discussion

This section analyzes the display quality of the full-color HOE imaging system and the diffraction efficiency results of the full-color HOE fabrication, and proposes optimization methods.

4.1 Effect of imaging distortion and astigmatism of HL based on full-color HOE

The experimentally observed astigmatism of the HL based on the full-color HOE is shown in Fig. 16.

Fig. 16. Astigmatism effect of HL based on a full-color HOE when the lens focuses on the (a) meridian image and (b) sagittal image.

By comparing the sharpness of the meridian and sagittal images, it can be seen that the imaging distances of the two planes are different, because the equivalent focal lengths of the two planes of the full-color HL imaging system are, respectively,

$$f_s = \frac{f_x}{\cos^2\theta_x}$$
$$f_m = \frac{f_y}{\cos^2\theta_y}$$

The different tilt angles of the recording beams in the meridian and sagittal directions lead to astigmatism. In addition, the system exhibits a lateral stretching distortion; the measured distortion is shown in Fig. 17. The distortion is small, and the overall display quality remains good because the FOV of the system is small. For a full-color imaging system with a large FOV, this distortion would need to be corrected.
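A minimal sketch of Eqs. (9) and (10): with an assumed common paraxial focal length and a tilt only in the off-axis plane, the two equivalent focal lengths differ, which is the astigmatism seen in Fig. 16. The numbers are illustrative assumptions, not system parameters.

```python
import numpy as np

def equivalent_focal_length(f, theta_deg):
    """Equivalent focal length of one plane of the off-axis HL, Eqs. (9)-(10): f / cos^2(theta)."""
    return f / np.cos(np.deg2rad(theta_deg)) ** 2

# Illustrative values: identical paraxial focal lengths, 45 deg tilt in the off-axis plane only.
f_tilted   = equivalent_focal_length(25.0, 45.0)   # ~50 mm
f_untilted = equivalent_focal_length(25.0, 0.0)    # 25 mm
print(f"astigmatic focal difference = {f_tilted - f_untilted:.1f} mm")
```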

Fig. 17. Distortion effect of full-color HL.

4.2 Analysis of efficiency of full-color HOE

First, the exposure times of the three colors of light in Table 1 are discussed. The single exposure time of red light is slightly higher than the initial value because the holographic medium used is less sensitive to red light and its response time to red light is longer than to blue and green light. The comparison between Experiment 2 and Experiment 3 shows that, owing to the large difference between the red and blue wavelengths, the spatial frequencies of the local holographic gratings differ considerably, resulting in mutual suppression and competition during recording. In theory, there is an optimal set of exposure times that gives the highest uniformity and, under that condition, the largest average peak diffraction efficiency. The average peak diffraction efficiency of RGB obtained in the experiment is high because the ratio of the single exposure times of the three colors balances the suppression between the spatial-frequency gratings, so that the interference fringe contrast of the three sub-gratings in the medium reaches its maximum and good results are obtained. In conclusion, the optimization method for diffraction efficiency is based on the properties of the photopolymer: when the total exposure dose reaches the saturation dose of the photopolymer, the average diffraction efficiency of the HOE is determined by the exposure time ratio of the three wavelengths in each cycle. The optimization process is as follows (a schematic sketch is given below): First, an initial ratio is set according to the saturation dose of the photopolymer and the recording beam intensities. From the diffraction efficiencies of the three wavelengths obtained with this initial ratio, it can be seen how many of the three channels reach the desired diffraction efficiency. Based on the channels with the highest and lowest efficiency, the ratio between those two wavelengths is adjusted to form a new ratio that balances the suppression between them. The ratio is then adjusted again according to the results of the previous step, and the procedure is repeated in turn until the three diffraction efficiencies reach the target.
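The iterative ratio adjustment described above can be written schematically as follows; `measure_peak_efficiencies` is a hypothetical placeholder for a full recording-and-measurement run (Figs. 8-9), and the fixed step size and 0.7 target are assumptions, not values from the paper.

```python
# Schematic sketch of the exposure-ratio iteration: shift per-cycle exposure time from the
# strongest channel to the weakest one until all three channels meet the target efficiency,
# mirroring the progression of Experiments 1-4 in Table 1.
def optimize_exposure_ratio(times_ms, measure_peak_efficiencies,
                            target=0.7, step_ms=50, max_iters=10):
    """times_ms: per-cycle exposure times, e.g. {"R": 600, "G": 600, "B": 750} (milliseconds).
    measure_peak_efficiencies: callable returning {"R": eta_R, "G": eta_G, "B": eta_B}."""
    for _ in range(max_iters):
        eta = measure_peak_efficiencies(times_ms)
        if min(eta.values()) >= target:
            return times_ms                              # all three channels acceptable
        weakest = min(eta, key=eta.get)                  # channel being suppressed
        strongest = max(eta, key=eta.get)                # channel dominating the recording
        times_ms[weakest] += step_ms                     # give the weak channel more dose
        times_ms[strongest] = max(step_ms, times_ms[strongest] - step_ms)
    return times_ms
```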

In Experiment 4, the uniformity could be further improved by further reducing the exposure time of red light, but challenges remain in the experimental process. In the design of the experiment, the power density of each single-longitudinal-mode laser is assumed to remain unchanged, and the uniformity is improved by changing the exposure time ratios of red, green, and blue. In practice, the power of a single-longitudinal-mode laser is temperature sensitive, so the power density irradiating the full-color HOE surface changes with time. To keep the recording optical path free from vibration, the heat-dissipation system of the laser is kept off, which also causes the power density to drift with time and affects the uniformity of the full-color HOE. The power density could be stabilized by adding a temperature-control module to the single-longitudinal-mode laser. System vibration, temperature-induced power drift, and the limited output power of single-longitudinal-mode lasers are problems that need to be resolved in future studies of large-area full-color HOE recording.

4.3 Imaging analysis of full-color HOE

Figure 4 simulates the color mismatch problem of a full-color AR system when the reconstruction wavelengths differ from the recording wavelengths. To address this imaging quality issue, wavefront compensation must be performed separately for the three RGB channels during the recording process, and the imaging beams of the three channels must be adjusted to the same direction, so as to realize a virtual image with uniform color and angle.

According to the simulation results in Fig. 5, the FOV of the built AR system is 18° and the eye box is 10 mm, which basically meet the needs of an AR display. There is still much room for improvement, and larger values place higher requirements on uniformity. Previously reported HLs with large FOV and eye box are listed in Table 3.

Table 3. Reported parameters of HLs in terms of FOV and eye box

5. Conclusion

In summary, the off-axis aberration properties of a full-color HL were analyzed by scalar diffraction theory, and the color separation effect of full-color reconstruction was simulated. We simulated the augmented reality system by the k-vector method and ray tracing and fabricated a full-color HOE with an average peak diffraction efficiency of 73.4%, the highest yet reported. A prototype AR system was built by combining laser micro-projection with the full-color HL. Three-color fusion was obtained by using a reconstruction wavelength consistent with the recording wavelength; consistent with the simulation, the color matching effect is good. The astigmatism and distortion of the system are small, and the measured MTF curve meets the resolution requirements of the human eye.

Funding

National Key Research and Development Program of China (2021YFF0307804).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. H. Xiong, E. L. Hsiang, Z. Q. He, T. Zhan, and S. T. Wu, “Augmented reality and virtual reality displays: emerging technologies and future perspectives,” Light: Sci. Appl. 10(1), 216 (2021). [CrossRef]  

2. J. Han, J. Liu, X. C. Yao, and Y. T. Wang, “Portable waveguide display system with a large field of view by integrating freeform elements and volume holograms,” Opt. Express 23(3), 3534–3549 (2015). [CrossRef]  

3. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41(11), 2486–2489 (2016). [CrossRef]  

4. M. Y. He, D. Wang, Y. Xing, Y. W. Zheng, H. L. Zhang, X. L. Ma, R. Y. Yuan, and Q. H. Wang, “Compact and lightweight optical see-through holographic near-eye display based on holographic lens,” Displays 70, 102104 (2021). [CrossRef]  

5. R. Wu, Y. T. Tian, D. F. Zhao, D. W. Li, N. Hua, and P. Shao, “Total internal reflection orders in transmission grating,” Acta Phys. Sin. 65(5), 054202 (2016). [CrossRef]  

6. H. C. Peng, D. W. Cheng, J. Han, C. Xu, W. T. Song, L. Z. Ha, J. Yang, Q. X. Hu, and Y. T. Wang, “Design and fabrication of a holographic head-up display with asymmetric field of view,” Appl. Opt. 53(29), H177–H185 (2014). [CrossRef]  

7. D. W. Cheng, Y. T. Wang, H. Hua, and M. M. Talha, “Design of an optical see-through head-mounted display with a low f-number and large field of view using a freeform prism,” Appl. Opt. 48(14), 2655–2668 (2009). [CrossRef]  

8. Z. Y. Liu, D. Y. Wang, H. Gao, M. X. Li, H. X. Zhou, and C. Zhang, “Metasurface-enabled augmented reality display: a review,” Adv. Photon. 5(03), 034001 (2023). [CrossRef]  

9. Z. Q. Yu, Q. B. Zhang, X. Tao, Y. Li, C. N. Tao, F. Wu, C. Wang, and Z. R. Zheng, “High-performance full-color imaging system based on end-to-end joint optimization of computer-generated holography and metalens,” Opt. Express 30(22), 40871–40883 (2022). [CrossRef]  

10. G. Y. Lee, J. Y. Hong, S. Hwang, S. Moon, H. Kang, S. Jeon, H. Kim, J. H. Jeong, and B. Lee, “Metasurface eyepiece for augmented reality,” Nat. Commun. 9(1), 4562 (2018). [CrossRef]  

11. G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23(26), 33170–33183 (2015). [CrossRef]  

12. C. G. Jang, O. Mercier, K. Bang, G. Li, Y. Zhao, and D. Lanman, “Design and Fabrication of Freeform Holographic Optical Elements,” ACM Trans. Graph. 39(6), 1–15 (2020). [CrossRef]  

13. A. Maimone and J. R. Wang, “Holographic Optics for Thin and Lightweight Virtual Reality,” ACM Trans. Graph. 39(4), 67 (2020). [CrossRef]  

14. C. Jang, K. Hong, J. Yeom, and B. Lee, “See-through integral imaging display using a resolution and fill factor-enhanced lens-array holographic optical element,” Opt. Express 22(23), 27958–27967 (2014). [CrossRef]  

15. S. Lee, B. Lee, J. Cho, C. Jang, J. Kim, and B. Lee, “Analysis and Implementation of Hologram Lenses for See-Through Head-Mounted Display,” IEEE Photon. Technol. Lett. 29(1), 82–85 (2017). [CrossRef]  

16. Y. N. Q. Li, Q. Yang, J. H. Xiong, K. Yin, and S. T. Wu, “3D displays in augmented and virtual realities with holographic optical elements [Invited],” Opt. Express 29(26), 42696–42712 (2021). [CrossRef]  

17. Y. Xie, M. W. Kang, and B. P. Wang, “Experimental method for testing diffraction properties of reflection waveguide holograms,” Appl. Opt. 53(19), 4206–4210 (2014). [CrossRef]  

18. Y. N. Zhang, X. L. Zhu, A. Liu, Y. S. Weng, Z. W. Shen, and B. P. Wang, “Modeling and optimizing the chromatic holographic waveguide display system,” Appl. Opt. 58(34), G84–G90 (2019). [CrossRef]  

19. M. L. Piao and N. Kim, “Achieving high levels of color uniformity and optical efficiency for a wedge-shaped waveguide headmounted display using a photopolymer,” Appl. Opt. 53(10), 2180–2186 (2014). [CrossRef]  

20. M. L. Piao, K. C. Kwon, H. J. Kang, K. Y. Lee, and N. Kim, “Full-color holographic diffuser using time-scheduled iterative exposure,” Appl. Opt. 54(16), 5252–5259 (2015). [CrossRef]  

21. Y. H. Yang, L. X. Deng, L. Q. Zhu, B. H. Yao, X. X. Ma, C. Gu, and L. X. Xu, “Characterization and design of a freeform holographic optical element,” Optik 281, 170788 (2023). [CrossRef]  

22. H. Kogelnik, “Coupled Wave Theory for Thick Hologram Gratings,” Bell Syst. Tech J. 48(9), 2909–2947 (1969). [CrossRef]  

23. I. Vazquez-Martin, M. Gomez-Climente, J. Marin-Saez, M. V. Collados, and J. Atencia, “True colour Denisyuk-type hologram recording in Bayfol HX self-developing photopolymer,” Holography: Advances and Modern Trends V 10233 (2017).

24. I. Vazquez-Martin, J. Marin-Saez, M. Gomez-Climente, D. Chemisana, M. V. Collados, and J. Atencia, “Full-color multiplexed reflection hologram of diffusing objects recorded by using simultaneous exposure with different times in photopolymer (R) HX,” Opt. Laser Technol. 143, 107303 (2021). [CrossRef]  

25. F. K. Bruder, S. Hansen, C. Manecke, E. Orselli, C. Rewitz, T. Rolle, and B. Wewer, “Wavelength Multiplexing Recording of vHOEs in Bayfol (R) HX Photopolymer Film,” Proc. SPIE 10676 (2018).

26. C. W. Shin, H. Y. Wu, K. C. Kwon, Y. L. Piao, K. Y. Lee, S. K. Gil, and N. Kim, “Diffraction efficiency enhancement and optimization in full-color HOE using the inhibition characteristics of the photopolymer,” Opt. Express 29(2), 1175–1187 (2021). [CrossRef]  

27. B. C. Kress, Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets, 5th ed. (SPIE, 2020).

28. J. S. Lee, Y. K. Kim, and Y. H. Won, “See-through display combined with holographic display and Maxwellian display using switchable holographic optical element based on liquid lens,” Opt. Express 26(15), 19341–19355 (2018). [CrossRef]  


