Eyebox expansion with accurate hologram generation for wide-angle holographic near-eye display

Open Access

Abstract

The small eyebox of a wide-angle holographic near-eye display is a severe limitation for the 3D visual immersion of the device. In this paper, an opto-numerical solution for extending the eyebox size in these types of devices is presented. The hardware part of our solution expands the eyebox by inserting a grating of frequency fg within a non-pupil-forming display configuration. The grating multiplies the eyebox, increasing the possible eye motion. The numerical part of our solution is an algorithm that properly codes wide-angle holographic information so that the correct object reconstruction is projected at an arbitrary eye position within the extended eyebox. The algorithm is developed through the phase-space representation, which facilitates the analysis of the holographic information and of the impact of the diffraction grating in the wide-angle display system. It is shown that accurate encoding of the wavefront information components for the eyebox replicas is possible. In this way, the problem of missing or incorrect views in wide-angle near-eye displays with multiplied eyeboxes is efficiently solved. Moreover, this study investigates the space-frequency relation between the object and the eyebox and how the hologram information is shared between eyebox replicas. The functionality of our solution is tested experimentally in an augmented reality holographic near-eye display with a maximum field of view of 25.89°. The obtained optical reconstructions demonstrate that the correct object view is obtained for an arbitrary eye position within the extended eyebox.

Published by Optica Publishing Group under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

A holographic near-eye display (HNED) is a unique virtual reality or augmented reality technology since it is the only one that can recreate the full optical wavefront, i.e., amplitude and phase [1,2]. This allows observing a 3D object with its natural depth and parallax, providing the user with realistic images. Recent advances in HNED solutions include color reconstruction [3–5], speckle reduction [6], minimization of aberrations [7,8], reduction of form factor [9,10], and others.

To achieve high 3D immersion, a holographic display should reconstruct a high-quality 3D image that is visible at a wide angle and should allow the user to move freely to see it from different perspectives. Optical reconstruction in an HNED is obtained with a Spatial Light Modulator (SLM) [11]. Thus, the display limitations are strictly related to the SLM parameters. With current 4K modulators, it is possible to directly obtain a high-quality reconstruction [1,12], but the Field of View (FOV) and the eyebox size are too small to ensure comfortable viewing. The first parameter depends on the pixel size of the SLM, while the second depends on the SLM size. The natural solution to this problem is to produce an SLM with a small pixel size and a large number of pixels [13,14]. However, this is not an efficient approach, not only from a hardware point of view but also from a signal-processing perspective. Developing small pixels is technically challenging and expensive, and further issues arise from processing huge amounts of data: transfer, generation, and numerical manipulation. Currently, there are methods that enlarge the FOV optically [15]. However, a large FOV leads to a small eyebox at the output of the display. When the eye is placed within this small eyebox, the viewer sees the object reconstruction, but as soon as the eye moves from that location, the reconstruction is lost. Hence, the problem of limited FOV and eyebox size can only be efficiently solved by optomechanical methods supported by numerical processing.
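For intuition on why the FOV is tied to the pixel size, the maximum diffraction angle of a pixelated SLM can be estimated with the grating equation. The sketch below is a standard textbook estimate, not taken from the paper; it uses the pixel pitch of the SLM employed later in Section 5:

```python
import numpy as np

lam = 633e-9   # red laser wavelength [m]
dx = 3.74e-6   # SLM pixel pitch [m] (HoloEye GAEA 2.0, Section 5)

# Grating-equation estimate: maximum diffraction half-angle asin(lam/(2*dx)).
fov = 2 * np.degrees(np.arcsin(lam / (2 * dx)))
print(f"Native SLM FOV: {fov:.2f} deg")   # ~9.7 deg, far below the 25.89 deg target
```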

In the state of the art, there are two main solutions addressing the issue of limited eyebox size. In the first, the size of the eyebox is fixed and its position in space is changed according to the viewer's eye position. The location of the eye is found with an eye-tracking system, and the wavefront reconstruction is redirected to the detected location. This can be realized with a scanning element [16]. In the second solution, the display setup is simplified and built without a scanning device; instead, an extended eyebox is generated at the output of the setup using a diffractive element. To obtain a 1D extension [17], a Holographic Optical Element (HOE) with three coded, horizontally separated concave mirrors is utilized. It creates multiple eyebox replicas forming an extended eyebox in one direction. Nevertheless, eyebox extension is not limited to the horizontal direction, and thus, it is also possible to extend the eyebox in two directions. Another method employs an HOE with two volume holographic gratings [18], which divide the beam of light carrying the image information into two horizontally shifted parallel beams. The separation between the beams provides an enlarged eyebox. In Ref. [17], a computer-generated hologram (CGH) of five lens functions, one central, two horizontally shifted, and two vertically shifted, is calculated and printed onto the HOE with a holographic printer. A different solution employs an HOE with multiple horizontal focal spots and the vertical high diffraction orders of the SLM [19]. However, in all solutions with multiplexed eyeboxes, observation of the correct reconstruction position is only possible when the eye is placed exactly in the central part of the extended eyebox. In other positions, the information received by the eye is incorrect: for each eye position, there is a different eyebox containing a different object perspective. Thus, to obtain comfortable observation, these eyebox extension methods must be supported by proper numerical tools.

A solution to the problem of incorrect information inside the extended eyebox was proposed in Ref. [19]. In that method, the correct 3D image is obtained by wrapping the angular spectrum of the CGH to provide the correct information after eyebox replication. The display employs an architecture where the hologram is reconstructed close to the eye and the object is reconstructed far from the observer. For this configuration, the hologram can code the correct information in the eyebox because there is a Fourier Transform (FT) relation between the object and the hologram/eyebox. However, this method supports only small-FOV displays. It cannot be applied to a non-paraxial display [20], in which there is no FT relation between the object and eyebox planes. Moreover, in the proposed method, the 3D input data is a combination of 2D images separated along the z-axis, while there is no non-paraxial propagation algorithm for such large-FOV data. It should also be noted that the eyebox expansion technique should be supported by a 3D CGH, e.g., one generated from a 3D point cloud.

This work presents a non-pupil wide-angle near-eye display that provides a larger eyebox by employing a diffraction grating. The diffraction grating generates eyebox replicas in which the reconstructed object can be observed. In the experiment, we show that, without correction, the holographic reconstruction does not match the desired position of the object and has the same perspective as for on-axis observation, i.e., no parallax is present. Additionally, observation between eyebox replicas suffers from the reconstruction of two spatially mismatched objects, and thus, the view is incorrect. To correct these problems, a novel technique that enables proper coding of wide-angle holographic information for projecting the correct object reconstruction between eyebox replicas is proposed. This technique is developed through the phase-space representation (PSR), which facilitates the analysis of the holographic information and of the impact of the diffraction grating in the wide-angle NED system. It is shown that this new technique is capable of accurately encoding the wavefront information components for the eyebox replicas. In this way, the problem of missing or incorrect views in a wide-angle near-eye display with replicated eyeboxes is efficiently solved. Moreover, this work presents a theoretical analysis of the eyebox and the extended eyebox for the analyzed NED configuration. By employing the PSR, it is demonstrated for the first time that the size of the eyebox, as well as of the extended eyebox at the observer's position, depends on the instantaneous frequencies of the point source in the object plane. To the best of our knowledge, this result has not been discussed in the current literature. Experimental optical reconstructions prove that our proposed technique provides correct viewpoints of the reconstructed object within the whole extended eyebox.

2. Phase-space analysis with eyebox multiplication

The non-pupil wide-angle HNED configuration without a grating was studied with the PSR in Ref. [20]. There, it was shown that there exists a finite area Ωxh in the hologram plane where the wavefront information from a point source of an object is defined. This information is used to calculate an accurate and efficient computer-generated hologram (CGH). The CGH allows a bright and sharp reconstruction of the source provided that the observer is in the on-axis position. When the eye is shifted from the on-axis position, the brightness of the observed image decreases because only part of the information, or none of it, reaches the eye. To solve this problem, this work employs a diffraction grating for expanding the eyebox size. The role of the grating is to generate replicas of the wavefront that reaches the eyebox plane, which are the ±1 diffraction orders of the grating. In this way, the eyebox is extended and the eye can make larger transversal movements. However, the mere presence of the grating does not provide a proper object reconstruction. Figure 1(a) depicts the effect of the grating when it is placed within the HNED system. When the eye moves slightly to an off-axis position, two point sources will be seen, one from order 0 and one from a high-order replica. The green area indicates the limits of the spherical wave that travels from the object plane to the eyebox plane for the conventional HNED, while the gray area depicts the wavefront replica due to a grating diffraction order. Note that this geometry shows the eyebox formed by the wavefront allowed by Ωxh, which is depicted by the red box in the hologram plane. However, the described reconstruction is incorrect. Figure 1(b) presents the desired wavefront reconstruction for an off-axis observer. The depicted object reconstruction is only possible when: 1) a new region Ωxs, the red square in Fig. 1(b), is defined over the hologram, which contains the new wavefront information; and 2) a numerical correction of the wavefront within Ωxs is applied. These two steps yield a single wavefront, indicated by the green area in Fig. 1(b), that travels between the object and the relative position of the observer, and thus, a single source is seen by the eye. Note that, under this geometry, the new wavefront information in Ωxs is a composite of the original and replica wavefronts. Therefore, this Section analyzes the process of eyebox expansion due to the grating at the hologram plane and the location of the area Ωxs.

Fig. 1. Reconstruction in the HNED with diffraction grating for eyebox expansion. a) Without numerical correction; b) with numerical correction. Here, xe, xh, and xo stand for the eyebox, hologram, and object planes, respectively. On the right, a zoomed region from the hologram plane is shown.

To address the described problem analytically, we employ the PSR. We start by considering a point source with coordinates rop = [xop, yop, zop] in the object plane. This point source propagates the corresponding optical field to the hologram plane. Hence, the wavefield arriving at this plane is expressed as

$${U_o}({\mathbf{r}_h}) = {e^{ - ik\left\| {{\mathbf{r}_h} - {\mathbf{r}_{op}}} \right\|}},$$
where rh = [xh, yh, zh] are the coordinates over the hologram plane, and ||r|| = (x² + y² + z²)^1/2. At the hologram plane, a grating defined by
$$g = 1 + \cos (2\pi {f_g}{x_h}),$$
where fg is the frequency of the grating, is placed. This grating splits the wavefront emerging from the hologram into three diffraction orders, which travel to the eyebox plane. This enables the observer to see the reconstructed point source at different positions in the eyebox plane, because the information in each diffraction order is identical but shifted in space. When the diffracted wavefronts are projected back onto the hologram plane, their information is contained in three regions, named Ω−1, Ω0, and Ω1. Together, these regions determine the total spatial extension of the support region at the hologram plane. Note that Ω0 = Ωxh.

The size and position of the areas Ω−1, Ω0, and Ω1 are found by first differentiating the argument of Eq. (1) with respect to xh, which yields

$${f_{xo}}({\mathbf{r}_h}) = \frac{1}{{2\pi }}\frac{\partial }{{\partial {x_h}}}[{\arg \{{{U_o}({r_h})} \}} ]= \frac{{{x_h} - {x_{op}}}}{{\lambda \|{{\mathbf{r}_h} - {\mathbf{r}_{op}}} \|}},$$
in the x-direction; the same can be obtained for the y-direction.

In Ref. [20], it is shown that the hologram H encodes the object wave given by Eq. (1). To reconstruct the corresponding wavefield Ur, it is necessary to multiply H by a spherical wave R, which is the conjugate of the reference:

$${U_r}({\mathbf{r}_h}) = H({\mathbf{r}_h})R({\mathbf{r}_h}),$$
where R(rh) = eik||rh||. The spherical wave R, together with the hologram bandwidth Bfh = 1/Δh, where Δh is the pixel pitch of the hologram along xh, defines the local bandwidth of the reconstructed signal Ur. Reference [20] applied the PSR to the argument of R and then added ±Bfh/2 to define the bandwidth of the hologram. However, for the case considered in this work, the upper and lower limits of the local bandwidth are extended by the presence of the grating and can be calculated as follows
$$f_{xh \pm }^{(m)}({\mathbf{r}_h}) = \frac{1}{{2\pi }}\frac{\partial }{{\partial {x_h}}}[{k\|{{\mathbf{r}_h}} \|} ]\pm \frac{{{B_{fh}}}}{2} + m{f_g} = \frac{1}{\lambda }\frac{{{x_h}}}{{\|{{\mathbf{r}_h}} \|}} \pm \frac{{{B_{fh}}}}{2} + m{f_g},$$
where m = −1, 0, 1 indicates the diffraction order.
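The following sketch illustrates how Eqs. (3) and (5) can be evaluated numerically. It is a minimal implementation assuming the parameters quoted later in this Section (λ = 633 nm, Δh = 57 µm); the effective grating frequency at the magnified hologram plane is an assumed value, since the paper specifies the grating only as 300 lines/mm in the real SLM image plane:

```python
import numpy as np

lam = 633e-9                 # wavelength [m]
Bfh = 1 / 57e-6              # hologram bandwidth Bfh = 1/dh [1/m]
fg = 1.8e4                   # assumed effective grating frequency [1/m]

def f_xo(xh, yh, zh, xop, yop, zop):
    """Eq. (3): instantaneous frequency of the point-source wave at (xh, yh, zh)."""
    r = np.sqrt((xh - xop)**2 + (yh - yop)**2 + (zh - zop)**2)
    return (xh - xop) / (lam * r)

def f_xh_limits(xh, yh, zh, m):
    """Eq. (5): lower/upper local bandwidth limits for diffraction order m."""
    r = np.sqrt(xh**2 + yh**2 + zh**2)
    center = xh / (lam * r) + m * fg
    return center - Bfh / 2, center + Bfh / 2
```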

To find the spatial size of each region Ωm at the hologram plane, the intersections between fxo and $f_{xh \pm }^{(m)}({\mathbf{r}_h})$ need to be found, as shown in Fig. 2. In Ref. [20], the intersection between fxo and $f_{xh \pm }^{(0)}({\mathbf{r}_h})$ was found by employing the local spatial curvature of the wavefront and the local frequency bandwidth. In that work, it was shown that Eqs. (3) and (5) can be linearized by employing the tangent line to the local spatial frequency of the corresponding spherical wavefront. This PSR approach can be employed here to find the intersections for the orders ±1. For a detailed description of the local spatial frequency curvature, see Appendix A. The local spatial curvatures are used to reduce Eqs. (3) and (5) to first-degree polynomials that can be equated as follows

$${f_x} - {f_{xo}}({\mathbf{r}_{hop}}) = \frac{{c_{xo}^{}({\mathbf{r}_{hop}})}}{\lambda }({x_h} - {x_{hop}}),$$
and
$${f_x} - {f_{xh \pm }}({\mathbf{r}_{hop}}) = \frac{{c_{xh}^{(m)}({\mathbf{r}_{hop}})}}{\lambda }({x_h} - {x_{hop}}) + m{f_g},$$
where rhop = [xhop, yhop, zhop] = [xopzh/zop, yopzh/zop, zh]. Hence, the intersections between the tangent lines can be written as
$${c_{xo}}({\mathbf{r}_{hop}}){D_m} = c_{xh}^{(m)}({\mathbf{r}_{hop}}){D_m} + \frac{{\lambda {B_{fh}}}}{2}.$$

Fig. 2. Illustration of the extended frequency potential and the expanded spatial support region due to the diffraction grating at the hologram plane.

Note that 2Dm (Eq. (A7)) is the size of the support region Ωm. Thus, the coordinates of the boundary points of Ωm are expressed by

$$x_{hop \pm }^{(m)} = {x_{hop}} \pm {D_m}.$$

To depict the support-region extension due to the grating, Fig. 2 shows the PSR in the hologram plane for a point source with coordinates xo = 0.45Bxo and zo = 1000 mm, where Bxo = 520.5 mm. The wavelength and pixel size are λ = 633 nm and Δh = 57 µm, which are the parameters of the display configuration employed in this work. Because of the grating, the PSR of the field Ur contains replicas along the frequency axis, which are depicted in Fig. 2 with brown lines for the ±1 orders and a blue line for the zero order. In the plot, the magenta lines and the green area represent the native PSR of the eyebox at the hologram plane. The cyan and green solid lines depict the frequency bands generated by the ±1 orders of the grating, respectively. The orange, violet, and yellow vertical lines mark the boundaries of the spatial regions extended by the grating. The red lines within the green area represent the segments of the original signal (blue line) that are wrapped back into the hologram region. It should be noted that the original signal is centered at the coordinates (rhop, fxo(rhop)). When reconstructing these signals simultaneously, three point sources at different positions are obtained in the eyebox plane. However, only the central source, with m = 0, is placed at the proper position.
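As a worked example of this geometry, the snippet below evaluates Eq. (A7) for the zero order and the Fig. 2 point source. The plane positions zh = 511.5 mm and zop = 1000 mm are assumptions inferred from Sections 5 and 6; they are not stated explicitly in this Section:

```python
import numpy as np

lam = 633e-9                       # wavelength [m]
Bfh = 1 / 57e-6                    # hologram bandwidth Bfh = 1/dh [1/m]
zh, zop = 0.5115, 1.0              # assumed plane positions [m]
Bxo = 0.5205
xop, yop = 0.45 * Bxo, 0.0         # corner point source of Fig. 2 [m]

xhop, yhop = xop * zh / zop, yop * zh / zop
r_hop = np.linalg.norm([xhop, yhop, zh])

# Eq. (A7) with m = 0: half-size of the zero-order support region.
D0 = Bfh * lam * (zop - zh) * r_hop**3 / (2 * (zh**2 - yhop**2) * zop)
print(f"region size 2*D0 = {2e3 * D0:.2f} mm")                       # ~3 mm
print(f"boundaries: {1e3*(xhop - D0):.1f} .. {1e3*(xhop + D0):.1f} mm")
```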

When considering an off-axis observer in the eyebox plane, it is necessary to account for the movement of the object point support region Ωxs in the hologram plane. This determines which information from the wavefield is coded in the hologram. The positioning of Ωxs is carried out by considering the off-axis spherical wave Re centered at the position xe ≠ 0:

$${R_e}({\mathbf{r}_h}) = \exp (ik\|{\mathbf{r}_h} - {x_e}\|).$$

For this off-axis spherical wavefront, the upper and lower limits of the frequency bandwidth are given by

$$f_{xh \pm }^{(oa)}({\mathbf{r}_h}) = \frac{1}{\lambda }\frac{{{x_h} - {x_e}}}{{\|{{\mathbf{r}_h} - {x_e}} \|}} \pm \frac{{{B_{fh}}}}{2}.$$

To find the new support region Ωxs, Eq. (11) needs to be equated with Eq. (3) and solved, which is done with the local spatial curvature. Consequently, the boundaries of the support region Ωxs are given by

$${x_\Omega } = x_{hop}^e \pm {D_0},$$
where
$$x_{hop}^e = ({x_{op}} - {x_e})\frac{{{z_h}}}{{{z_{op}}}},$$
and 2D0 is the size of this new support region.

The shift of the region Ωxs due to the off-axis positioning of the observer is presented in Fig. 3. In this figure, the observer is placed at xe = 3 mm in the eyebox plane. It is seen that the frequency range is shifted vertically, as depicted by the green area. The intersections between the blue line and the green area generate the region Ωxs in the hologram plane. It is worth noting that, for this example, the local bandwidth generated by Re within Ωxs is partially outside the native bandwidth of the hologram.
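A short numerical check of Eqs. (12) and (13), continuing the assumptions of the previous sketch (zh = 511.5 mm, zop = 1000 mm, D0 ≈ 1.5 mm), shows how the support region slides across the hologram as the observer moves:

```python
# Eqs. (12)-(13): support-region boundaries for an off-axis observer.
zh, zop = 0.5115, 1.0          # assumed plane positions [m]
xop = 0.45 * 0.5205            # corner point source [m]
D0 = 1.5e-3                    # half-size from Eq. (A7), previous sketch [m]

for xe in (0.0, 3e-3):                      # on-axis and Fig. 3 observer
    xhop_e = (xop - xe) * zh / zop          # Eq. (13): region center
    print(f"xe = {1e3*xe:3.0f} mm -> Omega_xs: "
          f"{1e3*(xhop_e - D0):.2f} .. {1e3*(xhop_e + D0):.2f} mm")  # Eq. (12)
```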

Fig. 3. Change in the frequency and hologram support region for observer motion to xe = 3 mm, at the hologram plane.

3. CGH calculation

The presence of the grating at the hologram plane effectively extends the spatial size of the eyebox, as shown in the previous Section. Moreover, as the observer moves in the extended eyebox plane, the region Ωxs moves through the extended support region in the hologram plane, as shown in Fig. 3. In that example, it is seen that Ωxs lies between Ω−1 and Ω0. Thus, the hologram information will be a composite of two different regions. Each of these regions captures a portion of the wavefront that is used to construct the CGH. In this Section, we name the sub-areas within Ω−1, Ω0, and Ω1 as Ω−1s, Ω0s, and Ω1s, respectively. The way to define the limits of these sub-areas is given in Appendix B. The first case to analyze is when the position of the observer lies between the −1 and 0 order eyeboxes, i.e., xe > 0. This means that the support region Ωxs lies between Ω−1 and Ω0. Thus, the field information at the hologram plane will be given by Uo(rh(Ω−1s)) and Uo(rh(Ω0s)). However, the term Uo(rh(Ω−1s)) needs to be multiplied by the correction term exp(−2πirh(Ω−1s)fg) since the wavefield in the region Ω−1s is outside the native hologram frequency band. Hence, the final information encoded in the hologram is given by

$$H({\mathbf{r}_h}) = {R^\ast }({\mathbf{r}_h})({{U_o}({\mathbf{r}_h}({\Omega _{ - 1s}}))\exp ( - 2\pi i{\mathbf{r}_h}({\Omega _{ - 1s}}){f_g}) + {U_o}({\mathbf{r}_h}({\Omega _{0s}}))} ).$$

When the grating is introduced into the experimental system, the first term in Eq. (14) is shifted to the high-frequency band. In this way, a continuous wavefront is formed by both orders and a correct hologram reconstruction is obtained. The PSR of this step is presented in Fig. 4(a). As shown in the plot, two spatial support regions are defined according to Eqs. (B1) and (B2), see Appendix B. For the support region Ω0, the encoded wavefield lies within the zero-frequency band. However, for the region Ω−1, the encoded field lies in the minus-one frequency band. In this form, the field cannot be registered because it causes frequency aliasing. Thus, the wavefield is re-shifted by a plane wave with the frequency of the grating. This operation properly transfers the wavefield to the zero-frequency band.
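A minimal 1D sketch of Eq. (14) is given below. The region boundaries are placeholders standing in for the Appendix B formulas, the fields are evaluated on a y = 0 slice, and the effective grating frequency is an assumed value; the point is only to show the masking of the two sub-regions and the carrier multiplication that re-shifts the −1 order part into the zero-frequency band:

```python
import numpy as np

lam, fg = 633e-9, 1.8e4            # wavelength [m], assumed effective fg [1/m]
zh, zop, xop = 0.5115, 1.0, 0.234  # assumed geometry [m]
k = 2 * np.pi / lam

xh = np.linspace(0.110, 0.130, 4096)                  # hologram coordinate [m]
Uo = np.exp(-1j * k * np.hypot(xh - xop, zh - zop))   # Eq. (1), y = 0 slice
R = np.exp(1j * k * np.hypot(xh, zh))                 # spherical reference

# Placeholder sub-region limits; in practice these come from Eqs. (B1)-(B2).
x_m1_l, x_m1_r, x_0_l, x_0_r = 0.112, 0.117, 0.117, 0.125
in_m1 = (xh >= x_m1_l) & (xh < x_m1_r)                # mask of Omega_-1s
in_0 = (xh >= x_0_l) & (xh <= x_0_r)                  # mask of Omega_0s

# Eq. (14): the -1 order part is re-shifted by the carrier exp(-2*pi*i*xh*fg).
H = np.conj(R) * (Uo * in_m1 * np.exp(-2j * np.pi * xh * fg) + Uo * in_0)
```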

Fig. 4. Coding of information at the hologram plane for a single point source: a) xe = 3 mm, b) xe = −3 mm.

The second case occurs when the position of the observer lies between the 0 and +1 order eyeboxes, i.e., xe < 0. The wavefield in this case will be a composite of orders 0 and +1. Thus, the wavefield information is given by the fields evaluated as Uo(rh(Ω0s)) and Uo(rh(Ω1s)), where the boundaries of the regions Ω0s and Ω1s are given by Eqs. (B3) and (B4), see Appendix B. As in the previous case, the spherical wavefield in the region Ω1s is outside the native frequency band. This problem is solved by multiplying this spherical wavefront by a plane wave with positive carrier frequency fg. Thus, the field encoded within the hologram can be written as

$$H({\mathbf{r}_h}) = {R^\ast }({\mathbf{r}_h})({{U_o}({\mathbf{r}_h}({\Omega _{ + 1s}}))\exp (2\pi i{\mathbf{r}_h}({\Omega _{ + 1s}}){f_g}) + {U_o}({\mathbf{r}_h}({\Omega _{0s}}))} ).$$

As in Eq. (14), the purpose of the plane wave in Eq. (15) is to shift the wavefield in the region Ω1s from the plus-one frequency band to the zero-frequency band to avoid frequency aliasing. The PSR of this case is depicted in Fig. 4(b).

The formulation outlined above can be generalized to an arbitrary set of point sources as

$$H({\mathbf{r}_h}) = {R^\ast }({\mathbf{r}_h})\sum\limits_{p = 1}^P {{U_p}({\mathbf{r}_h}({\Omega _{xs}}))} ,$$
where P is the number of point sources.
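In code, Eq. (16) is simply an accumulation over the point cloud. The sketch below assumes a hypothetical helper, field_in_support, that performs the per-point region selection and carrier correction of Eqs. (14) and (15):

```python
import numpy as np

def cgh(points, xh, R, xe, field_in_support):
    """Eq. (16): accumulate the fields of P point sources.

    field_in_support(p, xh, xe) is a hypothetical helper that returns the
    field U_p of point p = (xop, yop, zop) restricted to its region
    Omega_xs, with the carrier correction of Eqs. (14)-(15) applied.
    """
    H = np.zeros_like(xh, dtype=complex)
    for p in points:
        H += field_in_support(p, xh, xe)
    return np.conj(R) * H   # multiply by the conjugated reference R*
```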

4. View analysis

It is a common assumption in near-eye displays that the eyebox is a rectangular area in which the eye can move in the x and y directions at a certain position z [21–23]. This assumption of a rectangular shape is true for a wide-angle NED of pupil-forming configuration [24]. For a non-pupil one, it is only true in the paraxial case, where there is an FT relationship between the hologram and eyebox planes [25]. Hence, in this Section, we study the wide-angle non-pupil configuration, investigating the size and frequency support of the eyebox and the extended eyebox by employing the PSR.

In the wide-angle case, the hologram and the eyebox are not related through the FT. Nevertheless, we can relate the information at these planes by propagation. Propagating the information contained in the frequency bands generated by the grating back to the viewer position delivers the shape of the eyebox and the extended eyebox. However, this process is computationally expensive when using classical propagation tools. Eyebox formation can be visualized more easily with the PSR. In this approach, field propagation reduces to a shearing operation [26,27] applied to the frequency bands between the hologram and observer planes, which tilts them to the left in the illustrated case. In this way, the shape and size of the normal and extended eyeboxes can be found. In Fig. 5, the final eyebox is presented. Figure 5(a) presents the three eyebox replicas generated by the grating at the eyebox plane. The size of these replicas can be estimated roughly from the diffraction angle of the magnified SLM, which gives approximately 5 mm at the minimum waist. The separation of the replicas is determined by the grating frequency and the distance between the hologram and eyebox planes; in our configuration, this separation is approximately 6 mm. The black box depicts the eyebox for the on-axis position, while the blue and red boxes represent the ±1 orders, respectively. The three boxes together create the extended eyebox. Notably, the size of the eyeboxes has a frequency dependency. Thus, for a frequency equal to zero, the size of the extended eyebox is 17.6 mm, while for the frequency corresponding to the maximum FOV, it is 20 mm. Moreover, a gray box was added to Fig. 5(a); it represents the PSR of the eyebox for an off-axis observer. It is seen that this box has the same shape as the central box but is displaced in the x-direction. This means that the eyebox for the observer only shifts spatially according to xe, and its size has a frequency dependence as well. Figure 5(b) shows the PSR of two point sources, a central one and a corner one (xo = 0.45Bxo) of the object. The PSRs show that the wavefront information is well contained within the extended eyebox.
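The shear can be written out explicitly. In the sketch below (paraxial shear x → x + λz·fx, with zh, Bfh, and the hologram half-width taken as assumptions from Sections 2 and 5), propagating the frequency band centered on the ray converging to the eyebox center, from the hologram edge and from its center, reproduces an eyebox of roughly the 5 mm scale quoted above:

```python
import numpy as np

lam, zh, Bfh = 633e-9, 0.5115, 1 / 57e-6   # assumed parameters

def eyebox_extent(xh):
    """Footprint at the eyebox plane of the band anchored at hologram xh."""
    f_center = -xh / (lam * np.hypot(xh, zh))     # converging reference ray
    lo = xh + lam * zh * (f_center - Bfh / 2)     # shear the lower band edge
    hi = xh + lam * zh * (f_center + Bfh / 2)     # shear the upper band edge
    return lo, hi

for xh in (0.0, 0.118):                           # hologram center and edge
    lo, hi = eyebox_extent(xh)
    print(f"xh = {1e3*xh:5.1f} mm -> eyebox {1e3*lo:+.2f} .. {1e3*hi:+.2f} mm")
```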

Fig. 5. Eyebox shape at the eyebox plane: a) sizes of the extended eyebox and the observer eyebox; b) corner and central frequencies from the object point sources over the extended eyebox for zo = 1000 mm.

5. Display

The proposed HNED setup is based on a non-pupil-forming architecture providing a large FOV and eyebox extension with a diffraction grating. In this configuration, a 4F imaging system creates a complex hologram, which is observed through an eyepiece acting as a magnifying glass. In the hologram plane, a sinusoidal phase diffraction grating is placed, which multiplies the eyebox in the horizontal direction. This display configuration allows the observation of a large virtual object within the extended eyebox.

The scheme of the HNED with extended eyebox is illustrated in Fig. 6. In the setup, a phase-only SLM (HoloEye GAEA 2.0, pixel count 4160 × 2464, pixel pitch 3.74 µm) is illuminated by a coherent plane wave generated by a red laser, a microscope objective, a pinhole, and a collimator (fc = 400 mm). The polarization state of the beam is set with the polarizer P, placed behind the collimator, according to the main polarization axis of the SLM. The reconstruction beam modulated by the SLM propagates through a 4F imaging setup formed by two lenses L1 and L2 with the same focal length (f1 = f2 = 100 mm). At the back focal plane of lens L2, a real copy of the SLM is created with unitary magnification. This copy is observed through the eyepiece (fep = 33 mm), which acts as a magnifying glass and produces a virtual image of the SLM. The magnification depends on the axial location of the eyepiece relative to the real SLM image plane; it increases when the distance s between the eyepiece and the real image of the SLM approaches fep. In our implementation, s = 31 mm. Using the lens equation 1/s′ − 1/s = 1/fep, the distance s′ between the eyepiece and the final virtual image is found to be 511.5 mm. For these parameters, the magnification introduced by the eyepiece is mep = 16.48. As a result, a large hologram is formed with pixel size Δh equal to 61.63 µm and physical dimensions of 236.64 × 133.11 mm², giving a maximum FOV of 25.89°. In the SLM image plane, a transmissive sinusoidal phase diffraction grating is placed. The grating generates diffraction orders, which are focused by the eyepiece and form replicas of the eyebox in the horizontal direction. These three copies of the eyebox, corresponding to the orders +1, 0, and −1, create an extended eyebox. The separation between the replicas depends on the period of the diffraction grating. In our implementation, we utilize a grating with 300 lines/mm. As illustrated in Fig. 4(a), for this frequency value, there are small gaps between the copies of the eyeboxes. According to the PSR, the size of the extended eyebox equals 17.6 mm for a frequency equal to zero and 20 mm for the frequency corresponding to the maximum FOV. In the HNED system, for each eye position within the extended eyebox, the SLM is addressed with a different hologram that accurately encodes the wavefront information about the object. This enlarges the tolerance region of eye movement within which correct object views are provided. Behind the eyepiece, a beam splitter is utilized, enabling observation of the real 3D scene and the virtual 3D optical reconstruction at the same time. In this way, we can compare the extended-view reconstruction of the system with a real 3D object.
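The geometry numbers above can be checked with the thin-lens relation used in the text. The sketch below is an approximation (thin-lens model, sign convention with the SLM image inside the focal distance, FOV measured from the eyepiece, and an assumed 3840 used SLM columns implied by the quoted hologram width); it reproduces s′, the magnification, the hologram pixel, and an FOV close to the quoted 25.89°:

```python
import numpy as np

fep, s = 33.0, 31.0                   # eyepiece focal length, distance s [mm]
s_prime = 1 / (1 / fep - 1 / s)       # 1/s' - 1/s = 1/fep -> s' = -511.5 mm
m_ep = abs(s_prime) / s               # magnification, ~16.5
dh = 3.74e-3 * m_ep                   # virtual hologram pixel [mm]
width = 3840 * dh                     # assumed used SLM width -> ~237 mm
fov = 2 * np.degrees(np.arctan(width / 2 / abs(s_prime)))
# Small mismatch with the printed 25.89 deg comes from rounding of m_ep and
# the exact reference point for the FOV, which the paper does not specify.
print(f"s' = {s_prime:.1f} mm, m_ep = {m_ep:.2f}, "
      f"dh = {1e3 * dh:.2f} um, FOV = {fov:.2f} deg")
```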

Fig. 6. Scheme of the large-FOV near-eye holographic display setup with extended eyebox. Lc – collimating lens, P – polarizer, BS – beam splitter, L1 and L2 – lenses forming the 4F system, DG – diffraction grating, Lep – eyepiece.

6. Experimental results

In this Section, the proposed wide-angle HNED with extended eyebox is experimentally investigated. The proposed HNED operates in augmented reality mode, enabling 3D observation of a real scene and a virtual optical reconstruction at the same time. In the reconstruction volume, a real object, a 3D-printed arrow, is placed to mark the 3D location of the reconstructed holographic object. The virtual test object is a 3D model of an 18th-century British sailing ship consisting of 12 million points, with the following dimensions: 560 mm width, 460 mm height, and 236 mm depth. The object is located 488.5 mm from the large hologram plane. For the selected view, the occlusion frequency-based algorithm [28] was applied, which removes hidden points from the 3D cloud. After this operation, the 3D object consists of 7.5 million points. The resulting point cloud and the algorithm described in Section 3 are employed to generate the corresponding CGHs. The CGHs are loaded into the SLM, which enables optical reconstructions. The reconstructed holograms of this Section were captured by a smartphone's wide-angle camera, whose parameters allow it to imitate the human eye [29].

The first experiment is designed to illustrate the capability of the eyebox extension, which enlarges the viewing angle. Within this experiment, the smartphone captures reconstructed holograms at three different viewpoints of the extended eyebox. The selected camera locations are the central part of the eyebox replica generated by the 0 order of the diffraction grating, the central part of the eyebox replica for the +1 order, and the middle position between the replicas of the 0 and +1 orders. For the sake of discussion, let us call these positions "0", "+1", and "+1/2", respectively. Figure 7 shows images of the optical reconstructions at the selected viewpoints for two CGH generation approaches. The first approach does not account for the observation model of the extended eyebox (left column), while the second does (right column). For the first case, the CGH generation technique described in our previous paper is employed [20]. In this case, we assumed that the eye position is fixed, and the change of viewpoint was not considered. Thus, a single hologram is calculated and reconstructed by the SLM for all selected camera positions. For the case that considers the correct observation model, three holograms corresponding to the chosen viewpoints are calculated separately with the CGH method described in Section 3. For each camera location, the SLM is addressed with a different CGH. To better visualize the differences between the results, enlarged parts of the images are presented in Fig. 7. Images obtained for the eye in the "0" position (top row) look the same for both approaches. Results for the "+1" location (bottom row) show that, without our method, the reconstruction is horizontally shifted and a gap between the real arrow object and the virtual hologram reconstruction is observed. This is not visible in the reconstruction obtained from the CGH calculated with the method of this work. An interesting case is presented in the center row, which shows what happens when the camera is in the "+1/2" position. For the case that does not consider the observation position, the camera captures an image that is the result of the two replicas, "0" and "+1", which are observed simultaneously for the selected eye position. Thus, a blurry final image is obtained. When the CGH is calculated correctly, a high-quality image in the right 3D location is obtained because the position of the reconstructed object relative to the real arrow object is preserved.

Fig. 7. Optical reconstructions obtained without viewpoint correction (first column) and with the proposed viewpoint correction method (second column) for different eye positions within the extended eyebox plane: a) central part of the eyebox generated by the 0 order of the diffraction grating, b) "+1/2" position located between the 0 and +1 orders, and c) central part of the eyebox for the +1 order.

The second experiment employs the virtual reality mode to show that the proposed method enables the reconstruction of a 3D object with depth of focus. The camera is placed between the 0 and +1 replicas of the eyebox. For this experiment, a hologram was also generated for the boat object; however, the object was rotated to obtain a larger depth. Reconstruction results captured for two chosen focus planes are presented in Fig. 8. To illustrate the defocusing effect, enlarged regions of the images are included. In Fig. 8(a), the center top sail of the boat is in focus while the front part is out of focus. On the contrary, in Fig. 8(b), the front part is in focus while the center sail is out of focus. This illustrates that the display, supported by the developed coding technique, enables the reconstruction of 3D objects.

Fig. 8. Optical reconstructions of the boat object captured for two focus positions: a) top sails, b) wooden front, and their enlarged parts.

7. Conclusion

This work presents a complete holographic solution for enlarging the eyebox size in a wide-angle HNED, which includes a hardware part, the display configuration, and a numerical method for calculating the proper wavefront in the hologram plane. The extension of the eyebox is achieved by the hardware, which employs a diffraction grating that generates eyebox replicas. Thus, the tolerance region in which the eye can move while observing the reconstructed object is significantly enlarged. However, this enhancement can only be exploited through the correct coding of the holographic information. Therefore, the display is supported by a numerical algorithm for hologram generation. The developed algorithm encodes the wavefront information from the object plane that matches the observer position. It is shown that the problem of missing or incorrect views in a wide-angle near-eye display with replicated eyeboxes is efficiently solved by our hologram generator. An HNED with a human interface must be supported by an eye-tracking system for dynamic updating of the hologram.

For holographic wide-angle systems, like the one shown here, there is no simple relation between the object and eyebox planes, and thus, a valuable theoretical view analysis of the system is developed. This analysis is done with the help of the PSR. It has been found that there is a dependency between the size of the eyebox and the local instantaneous frequency from the object plane. In this way, it is shown that the size and shape of the eyebox and the extended eyebox are not uniform, as is traditionally thought. For the parameters described in this work, it is found that the minimum and maximum sizes of the eyebox are 5.3 mm and 6.1 mm for point sources at the positions (0,0) and (0.45Bxo, 0.45Bxo) in the object plane, respectively. Meanwhile, the minimum and maximum sizes of the extended eyebox are 17.6 mm and 20 mm for point sources at the positions (0,0) and (0.45Bxo, 0.45Bxo) in the object plane, respectively.

Experimental validation of our work was carried out in a near-eye holographic display with an FOV of 25.89°. The display is supported by our CGH method, which enables obtaining correct information for any eye position within the extended eyebox. The display is based on a non-pupil-forming architecture producing a large, defocused hologram in proximity to the reconstructed 3D object. The total magnification of the system equals 16.48, giving a hologram with physical dimensions of 236.64 × 133.11 mm². The eyebox extension is achieved with a sinusoidal phase diffraction grating placed in the hologram plane. The diffraction orders of the grating produce horizontally shifted replicas of the eyebox. The replicas obtained for the orders −1, 0, and +1 form the extended eyebox. This concept can be extended to 2D eyebox extension by implementing two perpendicular diffraction gratings in the hologram plane. Finally, the 2D expansion can also be realized by replacing the grating with an HOE, which would result in a system of smaller size.

Appendix A. Local curvature approximation with diffraction grating

As established in Section 2, to find the regions Ωm it is necessary to find the intersections of fxo (Eq. (3)) and $f_{xh \pm }^{(m)}$ (Eq. (5)). This is done by equating these expressions as follows

$$f_{xh \pm }^{(m)}({\mathbf{r}_h}) = {f_{xo}}({\mathbf{r}_h}).$$

However, the resulting expressions are nonlinear, and thus, an analytical solution is not guaranteed. Instead, numerical methods are needed to reach the desired result, but this comes at the price of using more computing resources. When using a point cloud that contains millions of point sources, the processing time of a CGH can become large.

The nonlinear problem of Eq. (A1) can be avoided if fxo and $f_{xh \pm }^{(m)}$ are converted into linear equations, as follows

$${f_x} - {f_{xo}}({\mathbf{r}_h}) = {\left. {\frac{\partial }{{\partial {x_h}}}f_{xo}^{}({\mathbf{r}_h})} \right|_{{x_h} = {x_{hop}}}}({x_h} - {x_{hop}}),$$
and
$${f_x} - {f_{xh \pm }}({\mathbf{r}_h}) = {\left. {\frac{\partial }{{\partial {x_h}}}f_{xh \pm }^{(m)}({\mathbf{r}_h})} \right|_{{x_h} = {x_{hop}}}}({x_h} - {x_{hop}}) + m{f_g},$$
where [xhop, yhop, zhop] = [xopzh/zop, yopzh/zop, zh]. The linearization of these equations allows solving for the intersections with analytical formulas. In Ref. [20], the derivatives of the local spatial frequencies are called local spatial curvatures and are expressed as
$${c_{xo}}({\mathbf{r}_h}) = \lambda \frac{{\partial {f_{xo}}}}{{\partial {x_h}}} ={-} \frac{{{{({{y_h} - {y_o}} )}^2} + {{({{z_h} - {z_o}} )}^2}}}{{{{\|{{\mathbf{r}_h} - {\mathbf{r}_o}} \|}^3}}},$$
while for limits of the bandwidth of hologram it is
$$c_{xh}^{(m)}({\mathbf{r}_h}) = \lambda \frac{{\partial f_{xh \pm }^{(m)}}}{{\partial {x_h}}} = \frac{{y_h^2 + z_h^2}}{{{{\|{{\mathbf{r}_h}} \|}^3}}}.$$

Note that Eqs. (A4) and (A5) are the second derivatives of the phase with respect to the coordinate xh. Equations (A2) and (A3) are equated, and thus, the coordinates of each region Ωm are found by solving the expression

$${c_{xo}}({\mathbf{r}_{hop}}){D_m} = c_{xh}^{(m)}({\mathbf{r}_{hop}}){D_m} + \frac{{\lambda {B_{fh}}}}{2},$$
where Dm is given by
$${D_m} = \frac{{({{B_{fh}} + m{f_g}} )\lambda ({{z_{op}} - {z_h}} ){{\|{{\mathbf{r}_{hop}}} \|}^3}}}{{2({z_h^2 - y_{hop}^2} ){z_{op}}}}.$$

This model is also used for finding the coordinates of the region Ωxs. In that case, Eq. (11) is linearized as follows

$${f_x} - {f_{xh \pm }}({\mathbf{r}_h}) = {\left. {\frac{\partial }{{\partial {x_h}}}f_{xh \pm }^{(oa)}({\mathbf{r}_h})} \right|_{{x_h} = x_{hop}^e}}({x_h} - x_{hop}^e).$$

Thus, the following equality can be established:

$${c_{xo}}({\mathbf{r}_{hop}}){D_0} = c_{xh}^{(oa)}({\mathbf{r}_{hop}}){D_0} + \frac{{\lambda {B_{fh}}}}{2},$$
where $c_{xh}^{(oa)}$ is the local curvature obtained from Eq. (11). Solving this expression yields the boundaries of the region Ωxs.

Appendix B. Area selection for calculation of CGH with diffraction grating

This Appendix describes the boundaries of the regions Ω−1s, Ω0s, and Ω1s for proper coding of the holographic information within the extended support region. The case xe > 0 is considered first. In this situation, the observer lies between the 0 and −1 order eyeboxes, and thus, the support region Ωxs lies between Ω−1 and Ω0.

To determine the wavefront information within Ωxs, it is necessary to find the parts of Ωxs belonging to Ω−1 and Ω0, respectively. This can be done by analyzing the boundaries of the regions determined by Eqs. (9) and (12). Hence, the boundaries of the part of Ωxs that lies within Ω−1 are defined by

$$\begin{array}{l} {x_{{\Omega _{ - 1s}}l}} = \max ({x_{hop - }^{( - 1)},{x_\Omega }_ - } ),\\ {x_{{\Omega _{ - 1s}}r}} = x_{hop + }^{( - 1)}, \end{array}$$
where the letters l and r stand for left and right, respectively, while the boundaries of the part of Ωxs that lies in Ω0 are given by
$$\begin{array}{l} {x_{{\Omega _{0s}}l}} = \max ({x_{hop + }^{( - 1)},x_{hop - }^{(0)}} ),\\ {x_{{\Omega _{0s}}r}} = {x_\Omega }_ + . \end{array}$$

Once these subregions are determined, the corresponding wavefront information can be encoded.

The following case is xe < 0, which means that the observer lies between the 0 and +1 orders, and thus, the support region Ωxs moves from Ω0 to Ω1. The limits of the sub-support area Ω0s are defined as

$$\begin{array}{l} {x_{{\Omega _{0s}}l}} = \min ({x_{hop + }^{(0)},{x_\Omega }_ - } ),\\ {x_{{\Omega _{0s}}r}} = x_{hop + }^{(0)}. \end{array}$$

and for the sub-area Ω+1s:

$$\begin{array}{l} {x_{{\Omega _{ + 1s}}l}} = \min ({{x_\Omega }_ - ,x_{hop - }^{( + 1)}} ),\\ {x_{{\Omega _{ + 1s}}r}} = \min ({x_{hop + }^{( + 1)},{x_\Omega }_ + } ). \end{array}$$
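Assembled into code, the boundary selection of Eqs. (B1)-(B4) reduces to a few min/max comparisons. The sketch below is a direct transcription under assumed inputs: x_hop_m[m] holds the (left, right) boundaries of order m from Eq. (9), and (xO_l, xO_r) are the observer-region boundaries from Eq. (12):

```python
def subregions(x_hop_m, xO_l, xO_r, xe):
    """Select sub-region boundaries per Eqs. (B1)-(B4).

    x_hop_m: dict mapping order m in {-1, 0, 1} to its (left, right)
    boundaries from Eq. (9); (xO_l, xO_r): observer region, Eq. (12).
    """
    if xe > 0:   # observer between the 0 and -1 order eyeboxes
        om_m1 = (max(x_hop_m[-1][0], xO_l), x_hop_m[-1][1])           # Eq. (B1)
        om_0 = (max(x_hop_m[-1][1], x_hop_m[0][0]), xO_r)             # Eq. (B2)
        return {-1: om_m1, 0: om_0}
    else:        # observer between the 0 and +1 order eyeboxes
        om_0 = (min(x_hop_m[0][1], xO_l), x_hop_m[0][1])              # Eq. (B3)
        om_p1 = (min(xO_l, x_hop_m[1][0]), min(x_hop_m[1][1], xO_r))  # Eq. (B4)
        return {0: om_0, 1: om_p1}
```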

Funding

Narodowe Centrum Nauki (UMO-2018/31/B/ST7/02980); Politechnika Warszawska.

Acknowledgments

We would like to acknowledge Dr. Marcin Adamczyk for providing the 3D printed arrow employed in this work.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. C. Chang, K. Bang, G. Wetzstein, B. Lee, and L. Gao, “Toward the next-generation VR/AR optics: a review of holographic near-eye displays from a human-centric perspective,” Optica 7(11), 1563–1578 (2020). [CrossRef]  

2. J.-H. Park and B. Lee, “Holographic techniques for augmented reality and virtual reality near-eye displays,” Light Adv. Manuf. 3(1), 1 (2022). [CrossRef]  

3. A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36(4), 1–16 (2017). [CrossRef]  

4. X. Yang, P. Song, H. Zhang, and Q.-H. Wang, “Full-color computer-generated holographic near-eye display based on white light illumination,” Opt. Express 27(26), 38236–38249 (2019). [CrossRef]  

5. X. Yang, S. Jiao, Q. Song, G.-B. Ma, and W. Cai, “Phase-only color rainbow holographic near-eye display,” Opt. Lett. 46(21), 5445–5448 (2021). [CrossRef]  

6. P. Sun, S. Chang, S. Liu, X. Tao, C. Wang, and Z. Zheng, “Holographic near-eye display system based on double-convergence light Gerchberg-Saxton algorithm,” Opt. Express 26(8), 10140–10151 (2018). [CrossRef]  

7. D. Kim, S.-W. Nam, K. Bang, B. Lee, S. Lee, Y. Jeong, J.-M. Seo, and B. Lee, “Vision-correcting holographic display: evaluation of aberration correcting hologram,” Biomed. Opt. Express 12(8), 5179 (2021). [CrossRef]  

8. S. W. Nam, S. Moon, C. K. Lee, H. S. Lee, and B. Lee, “Aberration-corrected full-color holographic augmented reality near-eye display using a Pancharatnam-Berry phase lens,” Opt. Express 28(21), 30836–30850 (2020). [CrossRef]  

9. M. Y. He, D. Wang, Y. Xing, Y. W. Zheng, H. Le Zhang, X. L. Ma, R. Y. Yuan, and Q. H. Wang, “Compact and lightweight optical see-through holographic near-eye display based on holographic lens,” Displays 70(102104), 102104 (2021). [CrossRef]  

10. B. C. Kress and M. Pace, “Holographic optics in planar optical systems for next generation small form factor mixed reality headsets,” Light Adv. Manuf. 3(4), 1 (2022). [CrossRef]  

11. J.-H. Park and S.-B. Kim, “Optical see-through holographic near-eye-display with eyebox steering and depth of field control,” Opt. Express 26(21), 27076 (2018). [CrossRef]  

12. D. Blinder, A. Ahar, S. Bettens, T. Birnbaum, A. Symeonidou, H. Ottevaere, C. Schretter, and P. Schelkens, “Signal processing challenges for digital holographic video display systems,” Signal Process. Image Commun. 70, 114–130 (2019). [CrossRef]  

13. Y. Isomae, Y. Shibata, T. Ishinabe, and H. Fujikake, “Design of 1-µm-pitch liquid crystal spatial light modulators having dielectric shield wall structure for holographic display with wide field of view,” Opt. Rev. 24(2), 165–176 (2017). [CrossRef]  

14. R. Higashida, N. Funabashi, K. Aoshima, M. Miura, and K. Machida, “Diffraction of light using high-density magneto-optical light modulator array,” Opt. Eng. 59(06), 1–13 (2020). [CrossRef]  

15. E. Buckley, A. Cable, N. Lawrence, and T. Wilkinson, “Viewing angle enhancement for two- And three-dimensional holographic displays with random superresolution phase masks,” Appl. Opt. 45(28), 7334–7341 (2006). [CrossRef]  

16. H. Hua, X. Hu, and C. Gao, “A high-resolution optical see-through head- mounted display with eyetracking capability,” Opt. Express 21(25), 30993–30998 (2013). [CrossRef]  

17. J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27(26), 38006 (2019). [CrossRef]  

18. X. Duan, J. Liu, X. Shi, Z. Zhang, and J. Xiao, “Full-color see-through near-eye holographic display with 80° field of view and an expanded eye-box,” Opt. Express 28(21), 31316–31329 (2020). [CrossRef]  

19. M.-H. Choi, Y.-G. Ju, and J.-H. Park, “Holographic near-eye display with continuously expanded eyebox using two-dimensional replication and angular spectrum wrapping,” Opt. Express 28(1), 533 (2020). [CrossRef]  

20. T. Kozacki, M. Chlipala, J. Martinez-Carranza, R. Kukołowicz, and M. S. Idicula, “LED near-eye holographic display with a large non-paraxial hologram generation,” Opt. Express 30(24), 43551 (2022). [CrossRef]  

21. K. Li and A. Lake, “Eyebox evaluation in AR/VR near-eye display testing,” Dig. Tech. Pap. - Soc. Inf. Disp. Int. Symp. 50(1), 434–437 (2019). [CrossRef]  

22. C. Chang, W. Cui, J. Park, and L. Gao, “Computational holographic Maxwellian near-eye display with an expanded eyebox,” Sci. Rep. 9(1), 18749–9 (2019). [CrossRef]  

23. S. A. Cholewiak, Z. Başgöze, O. Cakmakci, D. M. Hoffman, and E. A. Cooper, “A perceptual eyebox for near-eye displays,” Opt. Express 28(25), 38008 (2020). [CrossRef]  

24. T. Kozacki, M. Chlipala, and P. L. Makowski, “Color Fourier orthoscopic holography with laser capture and an LED display,” Opt. Express 26(9), 12144–12158 (2018). [CrossRef]  

25. J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16(16), 12372 (2008). [CrossRef]  

26. W. Zhang, H. Zhang, C. J. R. Sheppard, and G. Jin, “Analysis of numerical diffraction calculation methods: from the perspective of phase space optics and the sampling theorem,” J. Opt. Soc. Am. A 37(11), 1748 (2020). [CrossRef]  

27. L. Waller, G. Situ, and J. W. Fleischer, “Phase-space measurement and coherence synthesis of optical beams,” Nat. Photonics 6(7), 474–479 (2012). [CrossRef]  

28. J. Martinez-Carranza, T. Kozacki, R. Kukołowicz, M. Chlipala, and M. S. Idicula, “Occlusion culling for wide-angle computer-generated holograms using phase added stereogram technique,” Photonics 8(8), 298 (2021). [CrossRef]  

29. R. Cicala, “The Camera Versus the Human Eye,” https://petapixel.com/2012/11/17/the-camera-versus-the-human-eye/.
