Abstract

Wearable near-eye displays for virtual and augmented reality (VR/AR) have seen enormous growth in recent years. While researchers are exploiting a plethora of techniques to create life-like three-dimensional (3D) objects, there is a lack of awareness of the role of human perception in guiding the hardware development. An ultimate VR/AR headset must integrate the display, sensors, and processors in a compact enclosure that people can comfortably wear for a long time while allowing a superior immersion experience and user-friendly human–computer interaction. Compared with other 3D displays, the holographic display has unique advantages in providing natural depth cues and correcting eye aberrations. Therefore, it holds great promise to be the enabling technology for next-generation VR/AR devices. In this review, we survey the recent progress in holographic near-eye displays from the human-centric perspective.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. INTRODUCTION

The near-eye display is the enabling platform for virtual reality (VR) and augmented reality (AR) [1], holding great promise to revolutionize healthcare, communication, entertainment, education, manufacturing, and beyond. An ideal near-eye display must be able to provide a high-resolution image within a large field of view (FOV) while supporting accommodation cues and a large eyebox with a compact form factor. However, we are still a fair distance away from this goal because of various tradeoffs among the resolution, depth cues, FOV, eyebox, and form factor. Alleviating these tradeoffs, therefore, has opened an intriguing avenue for developing the next-generation near-eye displays.

The key requirement for a near-eye display system is to present a natural-looking three-dimensional (3D) image for a realistic and comfortable viewing experience. The early-stage techniques, such as binocular displays [2,3], provide 3D vision through stereoscopy. Despite being widely adopted in commercial products, such as Sony PlayStation VR and Oculus, this type of display suffers from the vergence-accommodation conflict (VAC)—the mismatch between the eyes’ vergence distance and focal distance [4], leading to visual fatigue and discomfort.

The quest for the correct focus cues is the driving force of current near-eye displays. Representative accommodation-supporting techniques encompass multi/varifocal displays, light field displays, and holographic displays. To reproduce a 3D image, the multi/varifocal display creates multiple focal planes using either spatial or temporal multiplexing [5–11], while the light field display [12–15] modulates the light ray angles using microlenses [12], multilayer liquid crystal displays (LCDs) [13], or mirror scanners [15]. Although both techniques have the advantage of using incoherent light, they are based on ray optics, a model that provides only coarse wavefront control, limiting their ability to produce accurate and natural focus cues.

In contrast, holographic displays encode and reproduce the wavefront by modulating the light through diffractive optical elements, allowing pixel-level focus control, aberration correction, and vision correction [16]. The holographic display encodes the wavefront emitted from a 3D object into a digital diffractive pattern—computer-generated hologram (CGH)—through numerical holographic superposition. The object can be optically reconstructed by displaying the CGH on a spatial light modulator (SLM), followed by illuminating it with a coherent light source. Based on diffraction, holographic displays provide more degrees of freedom to manipulate the wavefront than multi/varifocal and light field displays, thereby enabling more flexible control of accommodation and depth range.

From the viewer’s perspective, the ultimate experience is defined by two features: comfort and immersion. While comfort relates to wearability, vestibular sense, visual perception, and social ability, immersion involves the FOV, resolution, and gestures and haptic interaction. To provide such an experience, the hardware design must be centered around the human visual system. In this review, we survey holographic near-eye displays from this new perspective (Fig. 1). We first introduce the human vision system, providing the basis for visual perception, acuity, and sensitivity. We then categorize the recent advancement of holographic near-eye displays in Comfort and Immersion and review the representative techniques. Finally, we discuss unsolved problems and give perspectives.

 

Fig. 1. Categorization of holographic near-eye displays from the human-centric perspective.


2. HUMAN VISUAL SYSTEM

The human eye is an optical imaging system that has naturally evolved. Light rays enter the eye through an adjustable iris, which controls the light throughput. After being refracted by the cornea and crystalline lens, the light forms an image on the retina, where the photoreceptors convert the light signals to electrical signals. These electrical signals then undergo significant processing on the retina (e.g., inhibition by the horizontal cells) before reaching the brain. Finally, the brain interprets the images and perceives the object in 3D based on multiple cues.

With binocular vision, humans have a horizontal field of vision of almost 200°, of which 120° is binocular overlap. Figure 2 shows the horizontal extent of the angular regions of the human binocular vision system [17]. The vertical field of vision is approximately 130°. Adaptation to the eye’s natural field of vision is vital for an immersive experience. For example, there is a consensus that the minimal required FOVs for VR and AR displays are 100° and 20°, respectively [18]. Within the eye’s visual field, the photoreceptor density varies significantly between the central and peripheral areas [19]. The ability of the eye to resolve small features is referred to as visual acuity, which depends strongly on the photoreceptor density. The central area of the retina, known as the fovea, has the highest photoreceptor density and spans a FOV of 5.2°. Outside the fovea, the visual acuity drops dramatically. This non-uniform distribution of photoreceptors is a result of natural evolution, accommodating the on/off-axis resolution of the eye lens. Therefore, to efficiently utilize the display pixels, a near-eye display must be optimized to provide a varying resolution that matches the visual acuity across the retina.

 

Fig. 2. Visual fields of the human eye.


As a 3D vision system, the human visual system relies on depth cues, which can be categorized into physiological and psychological cues. Physiological depth cues include accommodation, vergence, and motion parallax. Vergence is the rotation of the two eyes in opposite directions so that their lines of sight converge on the object, while accommodation is the ability to adjust the optical power of the eye lens so the eye can focus at different distances. For near-eye displays, it is crucial to present images at locations where the vergence distance is equal to the accommodation distance. Otherwise, the conflict between the two will induce eye fatigue and discomfort [20]. Psychological depth cues arise from our experience of the world, such as the linear perspective, occlusion, shades, shadows, and texture gradients embedded in a 2D image [21].

Last, the human eye is constantly moving to help acquire, fixate, and track visual stimuli. To ensure the displayed image is always visible, the exit pupil of the display system—the eyebox—must be larger than the eye movement range, which is ${\sim}12\;{\rm mm}$. For a display system, the product of the eyebox and FOV is proportional to the space-bandwidth product of the device, which is finite. Therefore, increasing the eyebox will reduce the FOV and vice versa. Although the device’s space-bandwidth product can be improved through complicated optics, doing so often compromises the form factor. To alleviate this problem, common strategies include duplicating the exit pupil into an array [22–24] and employing eye-tracking devices [25].

The knowledge of the eye is crucial for human-centric optimization, providing the basis for the hardware design. To maximize the comfort and immersion experience, we must build the display that best accommodates the optical architecture of the human vision system.

3. HOLOGRAPHIC NEAR-EYE DISPLAYS: COMFORT

A. Wearability

The form factor of a device determines its wearability. An ideal near-eye display must be lightweight and compact like a regular eyeglass, allowing comfortable wearing for all-day use. Compared with other techniques, holographic near-eye displays have an advantage in enabling a small form factor because of their simple optics—the key components consist of only a coherent illumination source and an SLM. The SLM is essentially a pixelated display that can modulate the incident light’s amplitude or phase by superimposing a CGH onto the wavefront.

For AR near-eye displays, the device must integrate an optical combiner, and its form factor must be accounted for. Most early-stage holographic near-eye displays use a beam splitter as an optical combiner, which combines the reflected light from the SLM and the transmitted light from real-world objects [Fig. 3(a)]. The modulated light from the SLM can be guided directly to the eye pupil through free-space propagation [26] or pass through an additional $4f$ system for hologram relay [Fig. 3(b)] [27] or frequency filtering [28,29]. Ooi et al. also embedded multiple beam splitters into an element for see-through realization with a large eyebox in an electro-holography glass [30].

 

Fig. 3. Holographic see-through near-eye display with a beam splitter as an optical combiner. (a) Conceptual schematic setup. (b) The hologram is relayed and reflected in the eye by a beam splitter through a $4f$ system (reproduced with permission from [27], 2015, OSA).


 

Fig. 4. Holographic optical element (HOE) as an optical combiner. (a) Off-axis HOE geometry (included here by permission from [31], 2017, ACM). (b) Waveguide geometry (reproduced with permission from [32], 2015, OSA).


Despite being widely used in proof-of-concept demos, the beam splitter is not an ideal choice regarding the form factor because of its cubic shape. As an alternative, the holographic optical element (HOE) is thin and flat, and it functions like a beam splitter. As a volume hologram, the HOE modulates only the Bragg-matched incident light while leaving the Bragg-mismatched light as is. When implemented in a holographic near-eye display, the HOE redirects the Bragg-matched light from the SLM to the eye pupil and transmits the light from the real-world scene without adding additional optical power. Moreover, the ability to record diverse wavefront functions into the HOE allows it to replace many conventional optical components, such as lenses and gratings, further reducing the volume of the device.

The first implementation of the HOE in a near-eye display was performed by Ando et al. in 1999 [33]. In a holographic near-eye display, the HOE is usually deployed in an off-axis configuration [Fig. 4(a)], where the light from the SLM impinges on the HOE at an oblique incident angle. The HOE can be fabricated as a multi-functional device, integrating the functions of mirrors and lenses. For example, the HOE has been employed as an elliptically curved reflector [31,34] and a beam expander [35]. Curved/bent HOEs have also been reported to reduce the form factor and improve system performance in FOV and image quality [36]. To further flatten the optics and miniaturize the system, Martinez et al. used a combination of a HOE and a planar waveguide to replace the off-axis illumination module [37]. A typical setup is shown in Fig. 4(b), where the light is coupled into and out of the waveguide, both through HOEs [32,38,39]. Notably, conventional HOEs are fabricated by two-beam interference [40]. Limited by the small refractive index modulation induced in the holographic materials during the recording process, conventional HOEs work only for monochromatic light with a narrow spectral and angular bandwidth. The recent advances in metasurfaces [41,42] and liquid crystals [43,44] provide alternative solutions. For example, Huang et al. demonstrated a multicolor and polarization-selective all-dielectric near-eye display system by using a metasurface structure as the CGH to demultiplex light of different wavelengths and polarizations [41,42].

 

Fig. 5. Eyebox relay through (a) ${{4}}f$ system and (b) CGH. Eyebox expansion through (c) pupil tracking (included here by permission from [45], 2019, ACM) and replication (reproduced with permission from [46], 2020, OSA).


B. Visual Comfort

1. Eyebox

In lensless holographic near-eye displays, the active display area of the SLM determines the exit pupil of the system and thereby the eyebox. However, to add the eye relief, we must axially shift the exit pupil from the SLM plane. We can use a $4f$ relay for this purpose [Fig. 5(a)] and magnify the pupil. In this configuration, the CGH can be computed directly using numerical diffraction propagation algorithms, such as the Fresnel integral method or the angular spectrum method [47]. However, the use of the optical relay increases the device’s form factor. Alternatively, we can rely on only the CGH to shift the exit pupil [Fig. 5(b)]. In this case, we can use a double-step diffraction algorithm to compute the CGH, where the desired eyebox location serves as an intermediate plane to relay the calculated hologram.
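The angular spectrum method mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not code from any cited system; the grid size, pixel pitch, and wavelength are assumed values:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex field by `distance` using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)  # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * distance) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a point emitter forward 5 mm and back; the round trip
# recovers the original field for the propagating frequency band.
field = np.zeros((256, 256), dtype=complex)
field[128, 128] = 1.0
fwd = angular_spectrum_propagate(field, 532e-9, 8e-6, 5e-3)
back = angular_spectrum_propagate(fwd, 532e-9, 8e-6, -5e-3)
```

The same routine, applied twice with the eyebox as the intermediate plane, is the essence of the double-step diffraction calculation described above.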

For a comfortable viewing experience, the near-eye display must provide an eyebox that is greater than the eye movement range. Active pupil tracking and passive pupil replication are the two main strategies. A representative schematic of active pupil tracking is shown in Fig. 5(c). According to the detected eye position, a micro–electro–mechanical-system (MEMS) mirror changes the incident angle of light on the SLM, adding a linear phase to the diffracted wave. After being focused by another HOE, the wavefront converges at the exit pupil, which dynamically follows the eye’s movement [45]. While the active method requires additional components to track the eye pupil’s movement, the passive method directly replicates the exit pupil into an array, thereby effectively expanding the eyebox. Park and Kim [48] demonstrated the expansion of the eyebox along the horizontal axis by multiplexing three different converging beams on a photopolymer film. Later, they extended this work to the vertical axis by leveraging the high-order diffractions of the SLM [Fig. 5(d)] [46]. Jeong et al. developed another passive eyebox expansion method using a custom HOE fabricated by holographic printing [49]. The resultant method expands the eyebox along both the horizontal and vertical axes while maintaining a large FOV (50°). The passive eyebox expansion method using holographic modulation has also been implemented in Maxwellian displays [50,51]. The conventional Maxwellian display optics uses refractive lenses to reduce the effective exit pupil into a pinhole, thereby rendering an all-in-focus image. In contrast, a holographic Maxwellian display replaces the refractive lenses with holograms, modulating the wavefront into an array of focused “pinholes” for eyebox expansion. The duplication of a pinhole-shaped pupil was accomplished by multiplexing concave mirrors into a single waveguide HOE [Fig. 6(a)] [50] or numerically encoding the hologram with multiple off-axis converging spherical waves [Fig. 6(b)] [51].

 

Fig. 6. Eyebox expansion in a Maxwellian display using holographic modulation through (a) HOE (reproduced with permission from [50], 2018, OSA) and (b) encoding CGH with multiplexed off-axis plane waves (reproduced with permission from [51], 2019, Springer Nature).


2. Speckle Noise

Holographic displays commonly use lasers as the illumination source due to the coherence requirement. However, the use of lasers introduces speckle noise, a grainy intensity pattern superimposed on the image. To suppress speckle noise, we can adopt three strategies: superposition, spatial coherence construction, and temporal coherence destruction.

In the superposition-based method, we first calculate a series of holograms for the same 3D object by adding statistically independent random phases to the complex amplitude, followed by sequentially displaying them on the SLM within the eye’s response time [Fig. 7(a)]. Due to the spatial randomness, the average contrast of the summed speckles in the reconstructed 3D image is reduced, leading to improved image quality [52]. This method has been used primarily in layer-based 3D models [27,53], where the calculation of the holograms is computationally efficient. Alternatively, we can use high-speed speckle averaging devices, such as a rotating or vibrating diffuser. Figure 7(b) shows an example of superposition by placing a rotating diffuser in front of the laser source. In this case, the speckle patterns are averaged within the SLM’s frame time at the expense of an increased form factor.
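The statistics behind this averaging can be checked numerically: averaging $N$ statistically independent, fully developed speckle patterns reduces the speckle contrast (standard deviation over mean) roughly by a factor of $\sqrt{N}$. A minimal simulation (illustrative only; the grid size and $N$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_pattern(n=256):
    """Fully developed speckle: intensity of a far field whose source
    has unit amplitude and uniformly random phase at every pixel."""
    phase = rng.uniform(0, 2 * np.pi, (n, n))
    field = np.fft.fft2(np.exp(1j * phase))
    return np.abs(field) ** 2

def contrast(img):
    """Speckle contrast C = std / mean (C = 1 for fully developed speckle)."""
    return img.std() / img.mean()

c1 = contrast(speckle_pattern())                                   # ~1
c16 = contrast(np.mean([speckle_pattern() for _ in range(16)], axis=0))  # ~1/4
```

In the display, the 16 averaged patterns correspond to 16 CGHs with independent random phases shown within the eye's integration time.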

 

Fig. 7. Speckle noise suppression by (a) superposition of multiple CGHs [53], (b) rotating diffuser, and (c) complex amplitude modulation using a single CGH (reproduced with permission from [29], 2017, OSA).


Speckle noise is caused by destructive spatial interference among the overlapping imaged point spread functions [54]. We can alleviate this problem by actively constructing the spatial coherence. Within this category, the most important method is complex amplitude modulation, which introduces constructive interference to the hologram. Rather than imposing random phases on the wavefront, complex amplitude modulation uses a “smooth” phase map, such as a uniform phase, to create constructive interference among overlapping imaged spots. To synthesize a desired complex amplitude field using a phase-only or amplitude-only CGH, we can use analytic multiplexing [55,56], double-phase decomposition [57,58], double-amplitude decomposition [26,29], optimization-enabled phase retrieval [59–61], or neural holography [62]. Figure 7(c) shows an example where the complex amplitude wavefront is analytically decomposed into two phase holograms that are loaded into different zones of an SLM. The system then uses a holographic grating filter to combine the holograms and reconstruct complex amplitude 3D images [29]. The major drawback of this method is the requirement that the complex wavefront contain only low frequencies, resulting in a small numerical aperture at the object side. This increases the depth of field and thereby weakens the focus cues. As an alternative solution, Makowski et al. introduced a spatial separation between adjacent image pixels to avoid overlaps in the image space, thereby eliminating spurious interference [54].
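Of the decompositions listed above, double-phase decomposition admits a particularly compact sketch: any complex value $a\,e^{i\varphi}$ with $a \le 1$ equals the average of two unit phasors $e^{i(\varphi \pm \cos^{-1} a)}$, so a complex field can be represented by two phase-only holograms. A minimal illustration (array sizes are arbitrary):

```python
import numpy as np

def double_phase(complex_field):
    """Decompose a complex field with |amplitude| <= 1 into two phase maps
    whose phasor average reproduces it: a*exp(i*phi) = (exp(i*t1) + exp(i*t2)) / 2."""
    amp = np.clip(np.abs(complex_field), 0.0, 1.0)
    phi = np.angle(complex_field)
    delta = np.arccos(amp)  # phase offset encoding the amplitude
    return phi + delta, phi - delta

# Random complex target field (amplitude normalized to [0, 1))
rng = np.random.default_rng(1)
target = rng.uniform(0, 1, (64, 64)) * np.exp(1j * rng.uniform(0, 2 * np.pi, (64, 64)))
t1, t2 = double_phase(target)
recon = 0.5 * (np.exp(1j * t1) + np.exp(1j * t2))  # equals target exactly
```

In practice the two phase maps are interleaved on adjacent SLM pixels or separate SLM zones, and a spatial filter performs the averaging optically.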

Last, using a light source with low temporal coherence can also suppress the speckle noise. Partially coherent light sources, such as a superluminescent light-emitting diode (sLED) and micro LED (mLED), are usually employed for this purpose [63]. Also, a spatial filter can be applied to an incoherent light source to shape its spatial coherence while leaving it temporally incoherent. As an example, the spatially filtered LED light source has been demonstrated in holographic displays to reduce the speckle noise [64,65]. Recently, Olwal et al. extended this method to an LED array and demonstrated high-quality 3D holography [66]. The drawbacks of using a partially coherent light source include reduced image sharpness and a shortened depth range. However, the latter can be compensated for by reconstructing the holographic image at a small distance near the SLM, followed by using either a tunable lens [67] or an eyepiece [68] to extend the depth range.

3. Accommodation

The accommodation cue in holographic displays can be accurately addressed by computing the CGH based on physical optical propagation and interference. In physically based CGH calculation, the virtual 3D object is digitally represented by a wavefront from point emitters [31,69] or polygonal tiles [70]. These two models usually require dense point-cloud or mesh sampling to reproduce a continuous and smooth depth cue. Although there are many methods to accelerate the CGH calculation from the point-cloud or polygon-based 3D models, such physically based CGHs face challenges in processing the enormous amount of data in real time. Moreover, occlusion is often not easily implemented in physically based CGHs, limiting virtual 3D objects to simple geometries.

The advance of computer graphics has led to two image-rendering models that can help a holographic display produce the accommodation cue more efficiently: the layer-based model and the stereogram-based model. The layer-based model renders a 3D object as multiple depth layers, followed by propagating the layer-associated wavefronts to the hologram through fast-Fourier-transform (FFT)-based diffraction [71–73]. To render a continuous 3D object with finite depth layers, Akeley et al. developed a depth-weighted blending (or depth-fused) algorithm [74,75]. This algorithm makes the image intensity at each depth plane proportional to the dioptric distance of the point from that plane to the viewer along a line of sight (Fig. 8) using a linear [76] or nonlinear [77] model. Other rendering methods include using an optimization algorithm to compute the layer contents that best match the imaged scene at the retina when the eye focuses at different distances [78], or using color to binary decomposition of a 3D scene into multiple binary images for digital micromirror device (DMD)-based display [79]. As an alternative solution to create continuous depth views by multiplane images, we can also consider displaying high density stacks of depth/focal planes from CGHs using high-speed focus-tunable optics [80,81].
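The linear blending rule can be written down directly: a point lying between two adjacent focal planes splits its intensity in proportion to its dioptric proximity to each plane. A minimal sketch (the function name and the plane placements are illustrative, not from the cited works):

```python
import numpy as np

def linear_depth_blend(depth_d, plane_near_d, plane_far_d):
    """Linear depth-weighted blending in diopters: return (w_near, w_far),
    the intensity weights assigned to the two adjacent focal planes."""
    span = plane_near_d - plane_far_d          # dioptric separation of the planes
    w_near = (depth_d - plane_far_d) / span    # proximity to the near plane
    w_near = np.clip(w_near, 0.0, 1.0)
    return w_near, 1.0 - w_near

# A point midway (in diopters) between planes at 2 D and 1 D gets equal weights
w_near, w_far = linear_depth_blend(1.5, 2.0, 1.0)
```

When the eye accommodates between the planes, the blended pair is perceived as a single point at the intermediate depth.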

 

Fig. 8. Holographic multiplane near-eye display based on linear depth blending rendering [53].


Despite being computationally efficient, the layer-based model has difficulties in rendering view-dependent visual effects, such as occlusions, shading, reflectance, refraction, and transparency. In contrast, the holographic stereogram model can provide both the accommodation cue and view-dependent visual effects. As illustrated in Fig. 9(a), this model first calculates the light field of a 3D object using a ray-based propagation method. Then it computes the CGH by converting the light field into a complex wavefront [82]. Simply put, the CGH is spatially partitioned into small holographic elements, referred to as “hogels,” that direct light rays (plane waves) in different directions to form the corresponding view images. Like the light field display, the holographic stereogram requires a choice of hogel size, imposing a hard tradeoff between spatial and angular resolutions. Notably, this tradeoff has recently been mitigated by two non-hogel-based methods. The first encodes a compressed light field into the hologram [83]. The second uses an overlap-added stereogram (OLAS) algorithm to convert the dense light field data into a hologram [Fig. 9(a)] [84,85], enabling more efficient computation and improving image quality.
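To make the hogel idea concrete, here is a deliberately simplified sketch: within each hogel, every spatial frequency corresponds to one outgoing ray direction, so the hogel wavefront is the inverse FFT of the local angular samples of the light field. All names, shapes, and the intensity-to-amplitude conversion are illustrative assumptions:

```python
import numpy as np

def stereogram_hologram(light_field, hogel=8):
    """Hogel-based stereogram sketch. light_field[iy, ix] holds the intensities
    of hogel x hogel ray directions leaving spatial position (iy, ix); each
    hogel's wavefront is the inverse FFT of the local angular amplitudes."""
    ny, nx = light_field.shape[:2]
    holo = np.zeros((ny * hogel, nx * hogel), dtype=complex)
    for iy in range(ny):
        for ix in range(nx):
            angular_amp = np.sqrt(light_field[iy, ix])  # ray intensity -> amplitude
            holo[iy * hogel:(iy + 1) * hogel,
                 ix * hogel:(ix + 1) * hogel] = np.fft.ifft2(angular_amp)
    return holo

# Example: 16 x 16 spatial samples, 8 x 8 ray directions per hogel
rng = np.random.default_rng(2)
lf = rng.uniform(0, 1, (16, 16, 8, 8))
cgh = stereogram_hologram(lf)  # 128 x 128 complex hologram
```

The fixed `hogel` size is exactly the spatial/angular resolution tradeoff discussed above: larger hogels resolve more ray directions but coarsen the spatial sampling.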

 

Fig. 9. Holographic stereogram and Maxwellian view. (a) Holographic stereogram for realistic 3D perception (included here by permission from [85], 2019, ACM). (b) Holographic Maxwellian view (reproduced with permission from [51], 2019, Springer Nature).


 

Fig. 10. Color holographic near-eye display using (a) time division (reproduced with permission from [86], 2019, OSA) and (b) metasurface HOE (reproduced with permission from [41], 2019, OSA).


The Maxwellian display is another strategy to mitigate the VAC by completely removing the accommodation cues [87,88]. Because the display needs to render only an all-in-focus image, the computational cost is minimized. In a holographic Maxwellian near-eye display, the complex hologram is a superimposed pattern of a real target image and a spherical phase. The light emitted from this hologram enters the eye pupil and forms an all-in-focus image on the retina, as depicted in Fig. 9(b) [51,89]. The flexible phase modulation of the CGH allows correcting wavefront errors in the pupil to produce Maxwellian retinal images for astigmatic eyes [89]. Lee et al. also developed a multi-functional system where they can switch between the holographic 3D view and the Maxwellian view using a switchable HOE [90] or simultaneously display both using temporal multiplexing [91]. Later, they further improved this method by using a foveated image rendering technique [92].
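The hologram described here, a real image superimposed with a converging spherical phase, can be sketched as follows. The paraxial lens phase and all parameter values are illustrative assumptions:

```python
import numpy as np

def maxwellian_hologram(image, wavelength, pixel_pitch, focus_dist):
    """Superimpose a converging (paraxial) spherical phase on a real image so
    the modulated light converges toward a point at the eye-pupil plane."""
    ny, nx = image.shape
    y = (np.arange(ny) - ny / 2) * pixel_pitch
    x = (np.arange(nx) - nx / 2) * pixel_pitch
    X, Y = np.meshgrid(x, y)
    # Quadratic phase of a lens focusing at distance `focus_dist`
    sphere = np.exp(-1j * np.pi * (X**2 + Y**2) / (wavelength * focus_dist))
    return np.sqrt(image) * sphere  # image intensity -> amplitude

# Example: a uniform test image, 532 nm illumination, 8 um pixels, 25 mm focus
img = np.ones((64, 64))
h = maxwellian_hologram(img, 532e-9, 8e-6, 25e-3)
```

Because all image light funnels through the focal point, the retinal image stays in focus regardless of accommodation; replacing the single spherical phase with a sum of off-axis converging phases yields the pupil-duplicated variants cited above.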

4. Full-Color Display

Displaying colors is challenging for holographic near-eye displays because the CGH is sensitive to the wavelength. There are two methods for full-color reproduction: temporal [86,93] and spatial division [94,95]. Temporal division calculates three sub-CGHs for the RGB components of a 3D object, followed by displaying them sequentially on a single SLM. Accordingly, an RGB laser source illuminates each sub-CGH at the corresponding wavelength [28,31,93]. Figure 10(a) shows a typical setup where each sub-CGH is sequentially illuminated from the RGB laser source. The double-phase encoding and frequency grating filtering are employed for complex amplitude modulation in each color channel [86]. In contrast, spatial division simultaneously displays three CGHs for RGB colors on three SLMs [94] or different zones of a single SLM [95], which are illuminated by lasers of the corresponding wavelengths. The reconstructed RGB holographic images are then optically combined and projected onto the retina. However, due to the use of multiple SLMs and lasers, the resultant systems generally suffer from a large form factor. To address this problem, Yang et al. developed a compact color rainbow holographic display [96], where they display only a single encoded hologram on the SLM under white LED illumination. A slit spatial filter is then used in the frequency domain to extract the RGB colors.

For AR applications, several full-color waveguide HOEs have been developed to combine virtual and real-world images. To enable the transmission of multiple wavelengths, we can fabricate a multilayer structure in the HOE recording, each layer responding to a different color [97]. Alternatively, a metasurface component can also be used for color mixing [41]. An example is shown in Fig. 10(b). The RGB illumination light is coupled into the waveguide through a single-period grating at different incident angles. After being transmitted to the eye side and coupled out through a binary metasurface CGH, the light recombines and forms a multicolor holographic image in the far field.

4. HOLOGRAPHIC NEAR-EYE DISPLAYS: IMMERSION

A. Field of View

The FOV is defined as the angular or lateral extent over which the displayed image spans. In holographic displays, the SLM (loaded with a Fresnel or Fourier hologram) is generally located at the exit pupil of the display system, so the diffraction angle ${\theta _{\max}}$ of the SLM determines the maximum size of the holographic image and, therefore, the system’s FOV. Given the SLM pixel size $p$ and the cone angle of the incident beam ${\theta _{{\rm in}}}$, the maximum diffraction angle can be calculated as [98]

$${\theta _{\max}} = {\sin ^{ - 1}}\left( {\lambda /2p + 2\sin ({\theta _{{\rm in}}}/2)} \right) \approx {\sin ^{ - 1}}\left( {\lambda /2p} \right) + {\theta _{{\rm in}}}.$$
If we consider plane-wave illumination, we can simplify Eq. (1) as ${\theta _{\max}} = {\sin ^{ - 1}}(\lambda /2p)$. To create an eye relief for comfortable wearing, a typical near-eye display usually uses an eyepiece to relay the wavefront on the SLM to the eye pupil, as shown in Fig. 11(a). Given a distance, ${d_{{\rm slm}}}$, between the SLM and the eyepiece, the eye relief, ${d_{{\rm eye}}}$, can be calculated as ${d_{{\rm eye}}} = |1/(1/f - 1/{d_{{\rm slm}}})|$, where $f$ is the focal length of the eyepiece. The FOV can then be calculated as the diffraction angle of the relayed SLM through the eyepiece:
$${\rm{FOV}} = 2{\theta _{{\max}}}\frac{{{d_{{\rm{slm}}}}}}{{{d_{{\rm{eye}}}}}} = \frac{{2{\theta _{{\max}}}}}{M},$$
where $M = {d_{{\rm{eye}}}}/{d_{{\rm{slm}}}}$ is the magnification of the eyepiece.
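Plugging representative numbers into Eqs. (1) and (2) makes the scales concrete. The specific values below (wavelength, pixel pitch, eyepiece focal length, and SLM distance) are assumptions chosen for illustration:

```python
import numpy as np

wavelength, p = 532e-9, 8e-6   # green laser, 8 um SLM pixel pitch (assumed)
f, d_slm = 50e-3, 150e-3       # eyepiece focal length, SLM-eyepiece distance (assumed)

theta_max = np.arcsin(wavelength / (2 * p))  # Eq. (1) with plane-wave illumination
d_eye = abs(1 / (1 / f - 1 / d_slm))         # eye relief from the lens equation
M = d_eye / d_slm                            # eyepiece magnification
fov_deg = np.degrees(2 * theta_max / M)      # Eq. (2), in degrees

# theta_max here is about 1.9 deg, consistent with the "generally less than
# 5 deg" statement below for typical SLM pixel sizes; with M = 0.5 the
# eyepiece demagnifies the pupil and roughly doubles the FOV.
```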
 

Fig. 11. FOV in a holographic near-eye display. (a) Holographic near-eye display involving an eyepiece. (b) Enlarging the FOV through pupil relay. (c) Enlarging the FOV through spherical-wave illumination.


The typical pixel size of an SLM varies from 3 µm to 12 µm. Therefore, the diffraction angle ${\theta _{{\max}}}$ is generally less than 5° under plane-wave illumination. To expand the FOV, we can use two strategies to increase the diffraction angle for a given SLM: pupil relay and spherical-wave illumination [68]. The first strategy uses a ${{4}}f$ relay system to de-magnify the SLM [Fig. 11(b)]. The imaged pixel size becomes smaller, leading to a larger diffraction angle ${\theta _{{\max}}}$ according to Eq. (1) and, therefore, a larger FOV. The second strategy uses a spherical wavefront to illuminate the SLM [Fig. 11(c)] [99], increasing the diffraction angle and thereby the FOV. In another implementation, Su et al. used an off-axis holographic lens to replace spherical-wave illumination, introducing the quadratic wavefront modulation after the SLM [100].

Despite being simple to implement, both strategies above sacrifice the eyebox size because of the finite space-bandwidth product. To alleviate this tradeoff, we can use multiple SLMs in a planar or curved configuration [101–107]. However, this method cannot be readily applied to near-eye displays because of its large form factor. A more practical approach is to use an SLM with a reduced pixel size, thereby allowing more pixels to be packed in the same area. Further reducing the pixel size of current liquid crystal on silicon (LCoS)- or DMD-based SLMs is challenging due to manufacturing constraints [108–110]. In contrast, the dielectric metasurface holds great promise in this aspect—it can encode CGHs with a pixel size at the subwavelength scale (${\sim}300\;{\rm nm}$), therefore enabling a much larger FOV (${\sim}60^\circ$) than current systems. Moreover, the metasurface hologram can control the polarization state of light at the pixel level [111], allowing more degrees of freedom for increasing the space-bandwidth product. As a novel technique, dielectric metasurface holography still faces many challenges, such as a lack of algorithms for nanoscale and vectorial diffraction, a high-cost and time-consuming fabrication process, and an inability to dynamically modulate the wavefront.

As an alternative solution, temporal multiplexing can be employed to increase the FOV at the expense of a reduced frame rate. Figure 12(a) illustrates a representative system using a temporal division and spatial tiling (TDST) technique [98]. Two CGHs with spatially separated diffractions are displayed on the SLM sequentially. The output images are then tiled on a curved surface, increasing the horizontal FOV by a factor of four. A similar method that uses resonant scanners has also been reported [112,113]. Figure 12(b) shows another temporal-multiplexing setup that utilizes the high-order diffractions of an SLM [114]. Li et al. tiled different diffraction orders of the SLM in the horizontal direction and passed a specific order at a time through a synchronized electro-shutter. A threefold improvement in the horizontal FOV has been demonstrated.

 

Fig. 12. Enlarging the FOV through (a) temporal division and spatial tiling (reproduced with permission from [98], 2013, OSA) and (b) temporal division and diffraction-order tiling [114].


B. Resolution and Foveated Rendering

Most current commercial near-eye displays provide an angular resolution of 10–15 pixels per degree (ppd), whereas the acuity of a normal adult is about 60 ppd. The resulting pixelated image jeopardizes the immersive experience. The foveated display technique has been exploited to improve the resolution in the central vision (${\sim} {{\pm}}\;{{5}}^\circ$) while rendering the peripheral areas with fewer pixels. When combined with eye tracking, the foveated display can provide a large FOV with an increased perceived resolution. The common implementation is to use multiple display panels to create images of varied resolutions [115,116]. The foveal and peripheral display regions can be dynamically driven by gaze tracking using a traveling micro-display and an HOE [117].

In holographic near-eye displays, foveated image rendering has been used primarily to reduce the computational load [118–123]. Figures 13(a) and 13(b) illustrate examples of using foveated rendering for generating CGHs for the point-cloud- [118–120], polygon- (mesh-) [121], and multiplane-based models [123]. All these implementations render a high-resolution image only for the fovea, while reducing resolution in peripheral areas, thereby significantly reducing the computation time for the CGH. To render a foveal image that follows the eye’s gaze angle, we must update the CGH accordingly. Figures 13(c) and 13(d) show the difference in image resolution when the eye gazes at different locations for the polygon- [121] and multiplane-based [123] models, respectively.
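As an illustration of such foveated rendering, the sketch below builds a per-pixel target-resolution map that keeps full acuity (about 60 ppd) inside the fovea and falls off with eccentricity from the gaze point. The 1/eccentricity falloff is a toy model assumed here for illustration only, not the scheme used in [118–123].

```python
import numpy as np

def target_resolution_map(ny, nx, gaze_px, deg_per_px,
                          full_ppd=60, min_ppd=10, fovea_deg=5.0):
    """Per-pixel target resolution (ppd): full acuity inside the fovea
    (~±5 deg of the gaze point), then a 1/eccentricity falloff clipped
    at min_ppd. The falloff law is an assumed toy model."""
    ys, xs = np.mgrid[0:ny, 0:nx]
    # eccentricity of each pixel from the gaze point, in degrees
    ecc = np.hypot(xs - gaze_px[0], ys - gaze_px[1]) * deg_per_px
    ppd = np.where(ecc <= fovea_deg,
                   full_ppd,
                   full_ppd * fovea_deg / np.maximum(ecc, fovea_deg))
    return np.clip(ppd, min_ppd, full_ppd)
```

A CGH generator could then sample each region of the 3D model at the resolution given by this map, so only the foveal region carries the full computational cost.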

 

Fig. 13. Foveated image rendering of (a) point-cloud and polygon-based models (reproduced with permission from [121], 2019, OSA) and (b) multiplane-based model [123]. The foveal content changes according to the eye gaze angle in (c) polygon-based model (reproduced with permission from [121], 2019, OSA) and (d) multiplane-based model [123].


C. Real-Time Interaction

1. Gesture and Haptic Interaction

Human–computer interaction is indispensable for enhancing the immersion experience. Gesture or haptic feedback has been applied widely in VR systems to let a user experience virtual objects in the displayed environment, providing a more realistic sensation that mimics the physical interaction process [124]. While haptic techniques rely on wearable devices such as haptic gloves, gesture input lets the user manipulate the virtual object in real time through hand and finger movements, which can be detected by a motion sensor. A state-of-the-art interactive bench-top holographic display was reported by Yamada et al. [125]. They used a CGH to display a holographic 3D image while employing a motion sensor (Leap Motion Inc.) to detect hand and finger gestures with high accuracy. However, real-time CGH display with gesture interaction has yet to be explored in holographic near-eye displays.

2. Real-Time CGH Calculation

For current holographic near-eye displays, CGHs are usually calculated offline due to the enormous amount of 3D data involved. For real-time interaction, fast CGH calculation is critical, which can be achieved through fast algorithms and hardware improvements.

To calculate the CGH, we commonly use algorithms based on the point-cloud, polygon, ray, and multiplane models. In the point-cloud model, the 3D object is rendered as a large number of point sources. The calculation of the corresponding CGH can be accelerated using the lookup table (LUT) [126–128] and wavefront recording plane (WRP) [129] methods. Yet, it still requires significant storage and a high data transfer rate. The polygon-based model depicts the 3D object as an aggregate of small planar polygons [70]. It is faster than the point-cloud model because it lowers the sampling rate and uses FFT operations [130,131]. However, the reconstructed image suffers from fringe artifacts [132]. The ray-based model is faster than the point-cloud and polygon models in CGH calculation [133,134]. Nonetheless, the capture or rendering of the light field incurs additional computational cost. So far, the multiplane model has been considered the best option for real-time interactive holographic near-eye displays because it involves only a finite number of FFT operations between the planes and thereby offers the fastest CGH calculation speed [27]. Notably, a recent study showed that machine learning can further boost CGH calculation for the multiplane model, enabling real-time generation of full-color holographic images at 1080p resolution [62].
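To make the multiplane approach concrete, the numpy sketch below back-propagates each depth layer to the SLM plane with the angular spectrum method and superposes the results. It is a minimal illustration under simple assumptions (random initial phases to decorrelate layers, naive phase-only encoding of the sum), not the optimized algorithm of [27].

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, z):
    """Propagate a complex field over distance z (meters) with the
    angular spectrum method; evanescent components are suppressed."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0))),
                 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_cgh(layers, depths, wavelength=532e-9, pitch=8e-6):
    """Back-propagate each amplitude layer from its depth to the SLM
    plane (z = 0) and superpose; keep the phase as a phase-only CGH."""
    hologram = np.zeros(layers[0].shape, dtype=complex)
    for amp, z in zip(layers, depths):
        # random initial phase decorrelates layers (a common heuristic)
        field = amp * np.exp(2j * np.pi * np.random.rand(*amp.shape))
        hologram += angular_spectrum_propagate(field, wavelength, pitch, -z)
    return np.angle(hologram)
```

Because the per-layer cost is just two FFTs, the total cost grows only linearly with the number of planes, which is the source of this model's speed advantage.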

On the other hand, advances in hardware, such as the deployment of field-programmable gate arrays (FPGAs) and graphics processing units (GPUs), can also increase the CGH calculation speed [135]. Recently, a powerful FPGA-based computer named “HORN” was developed for fast CGH calculation [136–138]. The HORN can execute CGH algorithms several thousand times faster than a personal computer. For example, it can calculate a hologram of ${{1024}} \times {{1024}}$ pixels from 1024 rendered multiplane images in just 0.77 s [139]. The practical use of FPGA-based approaches, however, is hampered by the long R&D cycle of FPGA board design. Alternatively, GPUs can be used for the same purpose. For instance, NVIDIA GeForce series GPUs can calculate a CGH of ${{6400}} \times {{3072}}$ pixels from a 3D object consisting of 2048 points in 55 ms [140]. As another example, using the layer-based CGH algorithm, Gilles et al. demonstrated real-time (24 fps) calculation of a CGH of ${{3840}} \times {{2160}}$ pixels [141]. We expect that high-performance GPUs will be the enabling tool for real-time interactive holographic displays. Although we have discussed only offline computing herein, real-time inline processing can be made possible by the synergistic integration of novel algorithms and hardware.
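To see why such hardware acceleration matters, consider a brute-force point-cloud CGH, whose cost scales as O(points × pixels). The sketch below is a deliberately simplified model (exact spherical wavelets, no occlusion, no LUT/WRP acceleration) that makes this scaling explicit; each added object point requires another full sweep over the SLM.

```python
import numpy as np

def pointcloud_cgh(points, amplitudes, nx, ny, pitch, wavelength=532e-9):
    """Brute-force point-cloud CGH: superpose one spherical wavelet per
    object point on the hologram plane. The O(n_points * nx * ny) cost
    is what LUT/WRP methods and GPU/FPGA hardware aim to reduce."""
    k = 2 * np.pi / wavelength
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    field = np.zeros((ny, nx), dtype=complex)
    for (px, py, pz), a in zip(points, amplitudes):
        # exact distance from the object point to each hologram pixel
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += a / r * np.exp(1j * k * r)
    return field
```

Each wavelet evaluation is embarrassingly parallel across pixels, which is why this workload maps so naturally onto GPUs and FPGA pipelines.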

5. OUTLOOK

We envision that the next-generation holographic near-eye displays will be as compact as regular eyeglasses while meeting the computing needs for all-day use. The ability to create a comfortable and immersive viewing experience is the key. As a rule of thumb, we must adapt the hardware design to the human visual system at both low and high levels. Low-level vision refers to a psychophysical process in which the eye acquires visual stimuli; it involves both the physical optics of the eye and the anatomical structure of the retina. In contrast, high-level vision refers to a psychometric process in which the brain interprets the image. It describes the signal processing in the visual cortex, such as perception, pattern recognition, and feature extraction. Figure 14 shows the hierarchical structure of signal processing in the human visual system.

 

Fig. 14. Signal processing of the human visual system.


A. Adaptation to Low-Level Vision

To adapt to low-level vision, we must match the FOV, resolution, and exit pupil of the display system with eye optics. For holographic near-eye displays, the major challenge lies in the tradeoff between the FOV and exit pupil (eyebox)—the product of these two factors is limited by the total number of pixels of the SLM. Based on the analysis in Section 4.A, under plane-wave illumination (${\theta _{{\rm{in}}}} = {{0}}$) and paraxial approximation (${{\sin}^{- 1}} x \approx x$), the product of the FOV and eyebox along one dimension can be derived as

$${\rm{FOV}} \cdot {\rm{eyebox}} = \frac{2\theta_{\max}}{M} \cdot MNp \approx 2 \cdot \frac{\lambda}{2p} \cdot Np = \lambda N, \tag{3}$$
where $N$ is the number of SLM pixels in one dimension. Figure 15 shows the relationship between the FOV and eyebox for SLMs with different pixel counts according to Eq. (3) ($\lambda = {{532}}\;{\rm{nm}}$). In general, the larger the format of the SLM, the better the balance we can reach between the FOV and the eyebox. Conventional holographic displays are based on phase-only SLMs. However, the manufacturing cost of large-format phase-only SLMs is high, restricting their practical use in consumer products. In contrast, the amplitude-only SLM, such as an LCD or a DMD, is much more cost efficient. Also, recent studies demonstrated that a complex wavefront can be effectively encoded into an amplitude-only CGH, allowing it to be displayed on an amplitude-only SLM [53,123,142]. Therefore, we envision that large-format amplitude-only SLMs will become mainstream devices in future holographic near-eye displays. For example, with a state-of-the-art 8K DMD [143,144], we can potentially achieve an 85° horizontal FOV with a ${{3}}\;{\rm{mm}} \times {{3}}\;{\rm{mm}}$ eyebox. An alternative direction is to exploit superresolution by mounting masks with fine structures on a regular SLM, thereby increasing the diffraction angle without compromising the total diffraction area. For example, the use of a diffusive medium [145], a random phase mask [146,147], and a non-periodic photon sieve [148] has been reported for this purpose. However, these additional masks complicate image reconstruction and degrade image quality. To solve this problem, we can explore machine-learning-based methods, which have been demonstrated to be effective in complex-media-based coherent imaging systems [149,150].
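The tradeoff in Eq. (3) is easy to evaluate numerically. The snippet below assumes a horizontal pixel count of 7680 (one plausible "8K" format; the exact count and pitch vary by device) and shows that a 3 mm eyebox leaves roughly 78° of horizontal FOV, in the same ballpark as the 85° figure quoted above; note that the paraxial approximation becomes loose at such wide angles.

```python
import numpy as np

# Space-bandwidth tradeoff of Eq. (3): FOV * eyebox = lambda * N
wavelength = 532e-9          # green illumination, as in Fig. 15
N = 7680                     # assumed horizontal pixel count of an 8K panel

product = wavelength * N     # invariant, in meters * radians
eyebox = 3e-3                # 3 mm eyebox
fov_rad = product / eyebox   # FOV in radians under the paraxial approximation
print(np.degrees(fov_rad))   # prints approximately 78 (degrees)
```

Halving the eyebox doubles the available FOV and vice versa; only a larger pixel count N relaxes the tradeoff itself.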
 

Fig. 15. Tradeoff relations between FOV and eyebox (wavelength is 532 nm in the calculation).


The resolution provided by a holographic display is determined by the modulation transfer function (MTF) of the eye at a specific pupil size (i.e., the smaller of the exit pupil of the display and the eye pupil). Adapting to the eye’s native resolution, therefore, means rendering an image with a varied resolution that matches the radial MTF of the eye. The foveated technique has been demonstrated to be highly effective in this regard, significantly reducing the computational cost and thereby holding promise to enable real-time human–computer interaction. Notably, as demonstrated in a recent work [151], the efficiency and quality of foveated rendering can be further improved by deep-learning-based methods. In addition, studies show that the human eye has lower resolution for blue light than for red and green [152,153]. Future holographic near-eye displays can take this fact into account when rendering different resolutions for the RGB channels.

For eyes with refractive errors such as myopia and hyperopia, adapting to the eye’s native resolution also means correcting for the defocus/astigmatism of the eye lens. Currently, most holographic near-eye displays do not provide this function, and users with visual impairment must wear extra prescription glasses. Holographic near-eye displays have a unique advantage in correcting for eye aberrations by simply adding a compensating phase map to the hologram, yet this direction remains largely unexplored.
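As a sketch of this idea, the function below builds a thin-lens (quadratic) phase map of a given power in diopters that can be added to the CGH phase before display. This is a minimal illustration: a practical system would derive the compensation from the measured wavefront error of the individual eye (e.g., Zernike defocus and astigmatism terms) rather than a pure quadratic term.

```python
import numpy as np

def defocus_compensation(nx, ny, pitch, wavelength, diopters):
    """Quadratic (thin-lens) phase map of the given optical power, to be
    added to the hologram phase to pre-compensate a defocus error.
    A negative power (e.g., -2 D for myopia) yields a diverging term."""
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    f = 1.0 / diopters                      # focal length of the correcting lens
    return -np.pi / (wavelength * f) * (X ** 2 + Y ** 2)

# e.g., for a -2 D prescription (hypothetical cgh_phase array):
# cgh_phase = (cgh_phase + defocus_compensation(nx, ny, 8e-6, 532e-9, -2.0)) % (2 * np.pi)
```

Because the correction is purely computational, it can be updated per user, or even per frame, without any extra optics.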

B. Adaptation to High-Level Vision

Adaptation to high-level vision requires a thorough understanding of human perception, particularly of the way signals are processed in the visual cortex. Although high-level vision is equally important, there is a lack of awareness of the role it plays in visual perception. An interesting fact is that a “perfect” image from the perspective of low-level vision may not be a “perfect” image from the perspective of high-level vision. For example, M. Banks’ group reported that the chromatic aberration induced by the eye optics improves the realism perceived by the brain [154]. The same group also revealed that the blurred point spread function caused by the aberrated crystalline lens, normally considered a downside of low-level vision, actually helps the brain interpret the image to form 3D perception [155]. As another example, Wetzstein’s group recently showed that rendering ocular parallax, a small depth-dependent image shift on the retina, improves perceptual realism in VR displays [156]. To reflect this principle in holographic near-eye displays, we must optimize the hardware for “perceptual realism” rather than “photorealism.” For instance, to create perfect visual stimuli from the perspective of high-level vision, we must reproduce the “desired” aberrations in the reconstructed image when calculating the CGH (Fig. 16). Although human perception has been studied extensively, many unknowns remain. Further investigation of this area, therefore, requires synergistic efforts from both optical engineers and neuroscientists.

 

Fig. 16. Reproduction of “perceptual realism” in holographic near-eye displays.


C. Challenges and Perspectives

Despite decades of engineering effort, the challenge of developing real-time, high-quality, true (i.e., pixel-precise) 3D holographic displays remains. This challenge stems from two problems: first, the limited space-bandwidth product of current SLMs and, second, the lack of fast algorithms that can generate high-quality holograms from 3D point clouds, meshes, or light fields in real time.

The limited space-bandwidth product imposes a fundamental tradeoff between the FOV (or image size) and the eyebox (or viewing-zone size) in near-eye and direct-view displays. Current 1080p or 4K SLM resolutions are adequate for near-eye display applications, but they are far from being able to support direct-view displays. For the latter application, large-scale panels with sub-micrometer-sized phase-modulating pixels or coherent emitter arrays are required, which is out of reach for today’s opto-electronic devices. However, progress has been made with solid-state LiDAR systems for autonomous driving. These LiDAR systems are optimized for beam-steering applications, but, similar to holographic displays, they require an array of coherent sources. Thus, we hope that continuing efforts to push the resolution and size of coherent arrays, for example, via advances in laser or photonic waveguide technology, will eventually translate into display applications.

Improving CGH algorithms has been the goal of much recent work. Yet, achieving high-quality true 3D holographic image synthesis at real-time framerates has remained an unsolved problem for decades. While conventional display technologies, such as liquid crystal or organic LED displays, can directly show a target image by setting their pixels’ states to match those of the image, holographic displays cannot. Holographic displays must generate a visible image indirectly, through the interference pattern formed with a reference wave at some distance in front of the SLM; when using a phase-only SLM, yet another layer of indirection is added to the computation. This challenge also demands ultra-precise and automatic calibration of holographic displays. Moreover, to the best of our knowledge, no existing algorithm has been demonstrated to generate, transmit, and convert 3D image data into a hologram offering true 3D display capabilities. One of the most promising routes to solving these long-standing challenges is to combine the methodology of modern artificial intelligence (AI) with the physical optics of holography [62].

6. CONCLUSION

The demand for AR/VR devices is ever increasing. The ability to create a comfortable and immersive viewing experience is critical for translating the technology from lab-based research to the consumer market. Holographic near-eye displays have unique advantages in addressing this unmet need by providing accurate per-pixel focal control. We presented a human-centric framework to review the advances in this field, and we hope such a perspective can inspire new thinking and raise awareness of the role of the human visual system in guiding future hardware design.

Funding

National Science Foundation (1553333, 1652150, 1839974); National Institutes of Health (R01EY029397, R21EB028375, R35GM128761); UCLA; Ford (Alliance Project); Sloan Fellowship; Okawa Research Grant; PECASE by the ARO.

Acknowledgment

We thank Prof. Hong Hua from the University of Arizona for helpful discussions.

Disclosures

The authors declare no conflicts of interest.

REFERENCES

1. I. E. Sutherland, “The ultimate display,” in Proceedings of IFIP (1965), Vol. 65, pp. 506–508.

2. T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016). [CrossRef]  

3. Y. Wang, W. Liu, X. Meng, H. Fu, D. Zhang, Y. Kang, R. Feng, Z. Wei, X. Zhu, and G. Jiang, “Development of an immersive virtual reality head-mounted display with high performance,” Appl. Opt. 55, 6969–6977 (2016). [CrossRef]  

4. E. Peli, “Real vision & virtual reality,” Opt. Photon. News 6(7), 28 (1995). [CrossRef]  

5. W. Cui and L. Gao, “All-passive transformable optical mapping near-eye display,” Sci. Rep. 9, 6064 (2019). [CrossRef]  

6. W. Cui and L. Gao, “Optical mapping near-eye three-dimensional display with correct focus cues,” Opt. Lett. 42, 2475–2478 (2017). [CrossRef]  

7. S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18, 11562–11573 (2010). [CrossRef]  

8. G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716–15725 (2009). [CrossRef]  

9. N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017). [CrossRef]  

10. R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in CHI Conference on Human Factors in Computing Systems (ACM, 2016), pp. 1211–1220.

11. K. Rathinavel, G. Wetzstein, and H. Fuchs, “Varifocal occlusion-capable optical see-through augmented reality display based on focus-tunable optics,” IEEE Trans. Visual Comput. Graphics 25, 3125–3134 (2019). [CrossRef]  

12. D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32, 220 (2013). [CrossRef]  

13. F.-C. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 34, 60 (2015). [CrossRef]  

14. H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display,” Opt. Express 22, 13484–13491 (2014). [CrossRef]  

15. C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017). [CrossRef]  

16. K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016). [CrossRef]  

17. B. C. Kress, “Human factors,” in Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets (SPIE, 2010).

18. B. Kress and T. Starner, “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. SPIE 8720, 87200A (2013). [CrossRef]  

19. C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990). [CrossRef]  

20. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008). [CrossRef]  

21. B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013). [CrossRef]  

22. H. Urey and K. D. Powell, “Microlens-array-based exit-pupil expander for full-color displays,” Appl. Opt. 44, 4930–4936 (2005). [CrossRef]  

23. T. Levola, “Diffractive optics for virtual reality displays,” J. Soc. Inf. Disp. 14, 467–475 (2006). [CrossRef]  

24. C. Yu, Y. Peng, Q. Zhao, H. Li, and X. Liu, “Highly efficient waveguide display with space-variant volume holographic gratings,” Appl. Opt. 56, 9390–9397 (2017). [CrossRef]  

25. M. K. Hedili, B. Soner, E. Ulusoy, and H. Urey, “Light-efficient augmented reality display with steerable eyebox,” Opt. Express 27, 12572–12581 (2019). [CrossRef]  

26. Q. Gao, J. Liu, J. Han, and X. Li, “Monocular 3D see-through head-mounted display via complex amplitude modulation,” Opt. Express 24, 17372–17383 (2016). [CrossRef]  

27. J.-S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23, 18143–18155 (2015). [CrossRef]  

28. E. Moon, M. Kim, J. Roh, H. Kim, and J. Hahn, “Holographic head-mounted display with RGB light emitting diode light source,” Opt. Express 22, 6526–6534 (2014). [CrossRef]  

29. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25, 8412–8424 (2017). [CrossRef]  

30. C. W. Ooi, N. Muramatsu, and Y. Ochiai, “Eholo glass: electroholography glass. a lensless approach to holographic augmented reality near-eye display,” in SIGGRAPH Asia 2018 Technical Briefs (ACM, 2018), p. 31.

31. A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017). [CrossRef]  

32. H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23, 32025–32034 (2015). [CrossRef]  

33. T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

34. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016). [CrossRef]  

35. P. Zhou, Y. Li, S. Liu, and Y. Su, “Compact design for optical-see-through holographic displays employing holographic optical elements,” Opt. Express 26, 22866–22876 (2018). [CrossRef]  

36. K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019). [CrossRef]  

37. C. Martinez, V. Krotov, B. Meynard, and D. Fowler, “See-through holographic retinal projection display concept,” Optica 5, 1200–1209 (2018). [CrossRef]  

38. W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism correction and quality optimization of computer-generated holograms for holographic waveguide displays,” Opt. Express 28, 5519–5527 (2020). [CrossRef]  

39. J. Xiao, J. Liu, Z. Lv, X. Shi, and J. Han, “On-axis near-eye display system based on directional scattering holographic waveguide and curved goggle,” Opt. Express 27, 1683–1692 (2019). [CrossRef]  

40. “Holographic optical elements and application,” IntechOpen, https://www.intechopen.com/books/holographic-materials-and-optical-systems/holographic-optical-elements-and-application.

41. Z. Huang, D. L. Marks, and D. R. Smith, “Out-of-plane computer-generated multicolor waveguide holography,” Optica 6, 119–124 (2019). [CrossRef]  

42. Z. Huang, D. L. Marks, and D. R. Smith, “Polarization-selective waveguide holography in the visible spectrum,” Opt. Express 27, 35631–35645 (2019). [CrossRef]  

43. R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019). [CrossRef]  

44. S. Y. Cho, M. Ono, H. Yoshida, and M. Ozaki, “Bragg-Berry flat reflectors for transparent computer-generated holograms and waveguide holography with visible color playback capability,” Sci. Rep. 10, 8201 (2020). [CrossRef]  

45. C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019). [CrossRef]  

46. M.-H. Choi, Y.-G. Ju, and J.-H. Park, “Holographic near-eye display with continuously expanded eyebox using two-dimensional replication and angular spectrum wrapping,” Opt. Express 28, 533–547 (2020). [CrossRef]  

47. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).

48. J.-H. Park and S.-B. Kim, “Optical see-through holographic near-eye-display with eyebox steering and depth of field control,” Opt. Express 26, 27076–27088 (2018). [CrossRef]  

49. J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27, 38006–38018 (2019). [CrossRef]  

50. S.-B. Kim and J.-H. Park, “Optical see-through Maxwellian near-to-eye display with an enlarged eyebox,” Opt. Lett. 43, 767–770 (2018). [CrossRef]  

51. C. Chang, W. Cui, J. Park, and L. Gao, “Computational holographic Maxwellian near-eye display with an expanded eyebox,” Sci. Rep. 9, 18749 (2019). [CrossRef]  

52. J. Amako, H. Miura, and T. Sonehara, “Speckle-noise reduction on kinoform reconstruction using a phase-only spatial light modulator,” Appl. Opt. 34, 3165–3171 (1995). [CrossRef]  

53. C. Chang, W. Cui, and L. Gao, “Holographic multiplane near-eye display based on amplitude-only wavefront modulation,” Opt. Express 27, 30960–30970 (2019). [CrossRef]  

54. M. Makowski, “Minimized speckle noise in lens-less holographic projection by pixel separation,” Opt. Express 21, 29205–29216 (2013). [CrossRef]  

55. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21, 20577–20587 (2013). [CrossRef]  

56. G. Xue, J. Liu, X. Li, J. Jia, Z. Zhang, B. Hu, and Y. Wang, “Multiplexing encoding method for full-color dynamic 3D holographic display,” Opt. Express 22, 18473–18482 (2014). [CrossRef]  

57. Y. Qi, C. Chang, and J. Xia, “Speckleless holographic display by complex modulation based on double-phase method,” Opt. Express 24, 30368–30378 (2016). [CrossRef]  

58. C. Chang, Y. Qi, J. Wu, J. Xia, and S. Nie, “Speckle reduced lensless holographic projection from phase-only computer-generated hologram,” Opt. Express 25, 6568–6580 (2017). [CrossRef]  

59. C. Chang, J. Xia, L. Yang, W. Lei, Z. Yang, and J. Chen, “Speckle-suppressed phase-only holographic three-dimensional display based on double-constraint Gerchberg–Saxton algorithm,” Appl. Opt. 54, 6994–7001 (2015). [CrossRef]  

60. P. Sun, S. Chang, S. Liu, X. Tao, C. Wang, and Z. Zheng, “Holographic near-eye display system based on double-convergence light Gerchberg-Saxton algorithm,” Opt. Express 26, 10140–10151 (2018). [CrossRef]  

61. P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019). [CrossRef]  

62. Y. Peng, S. Choi, N. Pandmanaban, and G. Wetzstein, “Neural holography with camera-in-the-loop training,” in SIGGRAPH Asia (ACM, 2020).

63. Y. Deng and D. Chu, “Coherence properties of different light sources and their effect on the image sharpness and speckle of holographic displays,” Sci. Rep. 7, 5893 (2017). [CrossRef]  

64. T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24, 2189–2199 (2016). [CrossRef]  

65. M. Chlipala and T. Kozacki, “Color LED DMD holographic display with high resolution across large depth,” Opt. Lett. 44, 4255–4258 (2019). [CrossRef]  

66. A. Olwal and B. Kress, “1D eyewear: peripheral, hidden LEDs and near-eye holographic displays for unobtrusive augmentation,” in ACM International Symposium on Wearable Computers (ISWC) (ACM, 2018), pp. 184–187.

67. D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

68. G. Li, “Study on improvements of near-eye holography: form factor, field of view, and speckle noise reduction,” Thesis (Seoul National University, 2018).

69. C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of Phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019). [CrossRef]  

70. K. Matsushima, “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44, 4607–4614 (2005). [CrossRef]  

71. S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20, 45–59 (2019). [CrossRef]  

72. J. Xia, W. Zhu, and I. Heynderickx, “41.1: three-dimensional electro-holographic retinal display,” SID Symp. Dig. Tech. Pap. 42, 591–594 (2011). [CrossRef]  

73. Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018). [CrossRef]  

74. K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23, 804–813 (2004). [CrossRef]  

75. S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004). [CrossRef]  

76. S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940–20952 (2011). [CrossRef]  

77. S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18, 11562–11573 (2010). [CrossRef]  

78. R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015). [CrossRef]  

79. K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graph. 24, 2857–2866 (2018). [CrossRef]  

80. J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019). [CrossRef]  

81. S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019). [CrossRef]  

82. L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017). [CrossRef]  

83. Z. Wang, L. M. Zhu, X. Zhang, P. Dai, G. Q. Lv, Q. B. Feng, A. T. Wang, and H. Ming, “Computer-generated photorealistic hologram using ray-wavefront conversion based on the additive compressive light field approach,” Opt. Lett. 45, 615–618 (2020). [CrossRef]  

84. J.-H. Park and M. Askari, “Non-hogel-based computer generated hologram from light field using complex field recovery technique from Wigner distribution function,” Opt. Express 27, 2562–2574 (2019). [CrossRef]  

85. N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38, 214 (2019). [CrossRef]  

86. Z. Zhang, J. Liu, Q. Gao, X. Duan, and X. Shi, “A full-color compact 3D see-through near-eye display system based on complex amplitude modulation,” Opt. Express 27, 7023–7035 (2019). [CrossRef]  

87. R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017). [CrossRef]  

88. P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019). [CrossRef]  

89. Y. Takaki and N. Fujimoto, “Flexible retinal image formation by holographic Maxwellian-view display,” Opt. Express 26, 22985–22999 (2018). [CrossRef]  

90. J. S. Lee, Y. K. Kim, and Y. H. Won, “See-through display combined with holographic display and Maxwellian display using switchable holographic optical element based on liquid lens,” Opt. Express 26, 19341–19355 (2018). [CrossRef]  

91. J. S. Lee, Y. K. Kim, and Y. H. Won, “Time multiplexing technique of holographic view and Maxwellian view using a liquid lens in the optical see-through head mounted display,” Opt. Express 26, 2149–2159 (2018). [CrossRef]  

92. J. S. Lee, Y. K. Kim, M. Y. Lee, and Y. H. Won, “Enhanced see-through near-eye display using time-division multiplexing of a Maxwellian-view and holographic display,” Opt. Express 27, 689–701 (2019). [CrossRef]  

93. M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19, 12008–12013 (2011). [CrossRef]  

94. H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010). [CrossRef]  

95. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20, 25130–25136 (2012). [CrossRef]  

96. X. Yang, P. Song, H. Zhang, and Q.-H. Wang, “Full-color computer-generated holographic near-eye display based on white light illumination,” Opt. Express 27, 38236–38249 (2019). [CrossRef]  

97. H.-Y. Wu, C.-W. Shin, and N. Kim, “Full-color holographic optical elements for augmented reality display,” in Holographic Materials and Applications, M. Kumar, ed. (IntechOpen, 2019).

98. Y.-Z. Liu, X.-N. Pang, S. Jiang, and J.-W. Dong, “Viewing-angle enlargement in holographic augmented reality using time division and spatial tiling,” Opt. Express 21, 12068–12076 (2013). [CrossRef]  

99. Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017). [CrossRef]  

100. Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018). [CrossRef]  

101. N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995). [CrossRef]  

102. T. Kozacki, M. Kujawińska, G. Finke, B. Hennelly, and N. Pandey, “Extended viewing angle holographic display system with tilted SLMs in a circular configuration,” Appl. Opt. 51, 1771–1780 (2012). [CrossRef]  

103. J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16, 12372–12386 (2008). [CrossRef]  

104. F. Yaraş, H. Kang, and L. Onural, “Circular holographic video display system,” Opt. Express 19, 9147–9156 (2011). [CrossRef]  

105. T. Kozacki, G. Finke, P. Garbat, W. Zaperty, and M. Kujawińska, “Wide angle holographic display system with spatiotemporal multiplexing,” Opt. Express 20, 27473–27481 (2012). [CrossRef]  

106. H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015). [CrossRef]  

107. H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019). [CrossRef]  

108. L. Huang, S. Zhang, and T. Zentgraf, “Metasurface holography: from fundamentals to applications,” Nanophotonics 7, 1169–1190 (2018). [CrossRef]  

109. Q. Jiang, G. Jin, and L. Cao, “When metasurface meets hologram: principle and advances,” Adv. Opt. Photon. 11, 518–576 (2019). [CrossRef]  

110. Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019). [CrossRef]  

111. R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018). [CrossRef]  

112. J. Li, Q. Smithwick, and D. Chu, “Full bandwidth dynamic coarse integral holographic displays with large field of view using a large resonant scanner and a galvanometer scanner,” Opt. Express 26, 17459–17476 (2018). [CrossRef]  

113. J. Li, Q. Smithwick, and D. Chu, “Scalable coarse integral holographic video display with integrated spatial image tiling,” Opt. Express 28, 9899–9912 (2020). [CrossRef]  

114. G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015). [CrossRef]  

115. G. Tan, Y.-H. Lee, T. Zhan, J. Yang, S. Liu, D. Zhao, and S.-T. Wu, “Foveated imaging for near-eye displays,” Opt. Express 26, 25076–25085 (2018). [CrossRef]  

116. S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018). [CrossRef]  

117. J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019). [CrossRef]  

118. J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016). [CrossRef]  

119. J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

120. J. Hong, “Foveation in near-eye holographic display,” in International Conference on Information and Communication Technology Convergence (ICTC) (2018), pp. 602–604.

121. Y.-G. Ju and J.-H. Park, “Foveated computer-generated hologram and its progressive update using triangular mesh scene model for near-eye displays,” Opt. Express 27, 23725–23738 (2019). [CrossRef]  

122. L. Wei and Y. Sakamoto, “Fast calculation method with foveated rendering for computer-generated holograms using an angle-changeable ray-tracing method,” Appl. Opt. 58, A258–A266 (2019). [CrossRef]  

123. C. Chang, W. Cui, and L. Gao, “Foveated holographic near-eye 3D display,” Opt. Express 28, 1345–1356 (2020). [CrossRef]  

124. W. Dangxiao, G. Yuan, L. Shiyi, Z. Yuru, X. Weiliang, and X. Jing, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019). [CrossRef]  

125. S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018). [CrossRef]  

126. M. E. Lucente, “Interactive computation of holograms using a look-up table,” J. Electron. Imaging 2, 28 (1993). [CrossRef]  

127. S.-C. Kim and E.-S. Kim, “Effective generation of digital holograms of three-dimensional objects using a novel look-up table method,” Appl. Opt. 47, D55–D62 (2008). [CrossRef]  

128. J. Liu, J. Jia, Y. Pan, and Y. Wang, “Overview of fast algorithm in 3D dynamic holographic display,” Proc. SPIE 8913, 89130X (2013). [CrossRef]  

129. T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18, 19504–19509 (2010). [CrossRef]  

130. Y. Pan, Y. Wang, J. Liu, X. Li, and J. Jia, “Fast polygon-based method for calculating computer-generated holograms in three-dimensional display,” Appl. Opt. 52, A290–A299 (2013). [CrossRef]  

131. Y.-Z. Liu, J.-W. Dong, Y.-Y. Pu, B.-C. Chen, H.-X. He, and H.-Z. Wang, “High-speed full analytical holographic computations for true-life scenes,” Opt. Express 18, 3345–3351 (2010). [CrossRef]  

132. X.-N. Pang, D.-C. Chen, Y.-C. Ding, Y.-G. Chen, S.-J. Jiang, and J.-W. Dong, “Image quality improvement of polygon computer generated holography,” Opt. Express 23, 19066–19073 (2015). [CrossRef]  

133. H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55, A135–A143 (2016). [CrossRef]  

134. Y. Takaki and K. Ikeda, “Simplified calculation method for computer-generated holographic stereograms from multi-view images,” Opt. Express 21, 9652–9663 (2013). [CrossRef]  

135. T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016). [CrossRef]  

136. T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002). [CrossRef]  

137. T. Ito, N. Masuda, K. Yoshimura, A. Shiraki, T. Shimobaba, and T. Sugie, “Special-purpose computer HORN-5 for a real-time electroholography,” Opt. Express 13, 1923–1932 (2005). [CrossRef]  

138. Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009). [CrossRef]  

139. N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010). [CrossRef]  

140. N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51, 7303–7307 (2012). [CrossRef]  

141. A. Gilles and P. Gioia, “Real-time layer-based computer-generated hologram calculation for the Fourier transform optical system,” Appl. Opt. 57, 8508–8517 (2018). [CrossRef]  

142. H. Kim, C.-Y. Hwang, K.-S. Kim, J. Roh, W. Moon, S. Kim, B.-R. Lee, S. Oh, and J. Hahn, “Anamorphic optical transformation of an amplitude spatial light modulator to a complex spatial light modulator with square pixels,” Appl. Opt. 53, G139–G146 (2014). [CrossRef]  

143. “ISE to feature world’s first commercial 8K DLP projector,” https://www.svconline.com/the-wire/ise-feature-world-s-first-commercial-8k-dlp-projector-410993.

144. G. Kayye, “DPI Readies Industry’s First 8K DLP Projector, Ships 4K 3-Chip DLP Projector at 12.5K Lumens—rAVe [PUBS],” https://www.ravepubs.com/dpi-readies-industrys-first-8k-dlp-projector-demos-delivers-4k-12-5k-lumens-3-chip-dlp/.

145. H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017). [CrossRef]  

146. E. Buckley, A. Cable, N. Lawrence, and T. Wilkinson, “Viewing angle enhancement for two- and three-dimensional holographic displays with random superresolution phase masks,” Appl. Opt. 45, 7334–7341 (2006). [CrossRef]  

147. W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

148. J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019). [CrossRef]  

149. R. Horisaki, R. Takagi, and J. Tanida, “Learning-based imaging through scattering media,” Opt. Express 24, 13738–13743 (2016). [CrossRef]  

150. Y. Li, Y. Xue, and L. Tian, “Deep speckle correlation: a deep learning approach toward scalable imaging through scattering media,” Optica 5, 1181–1190 (2018). [CrossRef]  

151. A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019). [CrossRef]  

152. C. Cepko, “Giving in to the blues,” Nat. Genet. 24, 99–100 (2000). [CrossRef]  

153. Z. Qiu, Z. Zhang, and J. Zhong, “Efficient full-color single-pixel imaging based on the human vision property—‘giving in to the blues’,” Opt. Lett. 45, 3046–3049 (2020). [CrossRef]  

154. S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017). [CrossRef]  

155. S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018). [CrossRef]  

156. R. Konrad, A. Angelopoulos, and G. Wetzstein, “Gaze-contingent ocular parallax rendering for virtual reality,” in ACM SIGGRAPH 2019 Talks (ACM, 2019), p. 56.

References

  • View by:
  • |
  • |
  • |

  1. I. E. Sutherland, “The ultimate display,” in Proceedings of lFIP (1965), Vol. 65, pp. 506–508.
  2. T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016).
    [Crossref]
  3. Y. Wang, W. Liu, X. Meng, H. Fu, D. Zhang, Y. Kang, R. Feng, Z. Wei, X. Zhu, and G. Jiang, “Development of an immersive virtual reality head-mounted display with high performance,” Appl. Opt. 55, 6969–6977 (2016).
    [Crossref]
  4. E. Peli, “Real vision & virtual reality,” Opt. Photon. News 6(7), 28 (1995).
    [Crossref]
  5. W. Cui and L. Gao, “All-passive transformable optical mapping near-eye display,” Sci. Rep. 9, 6064 (2019).
    [Crossref]
  6. W. Cui and L. Gao, “Optical mapping near-eye three-dimensional display with correct focus cues,” Opt. Lett. 42, 2475–2478 (2017).
    [Crossref]
  7. S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18, 11562–11573 (2010).
    [Crossref]
  8. G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716–15725 (2009).
    [Crossref]
  9. N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).
    [Crossref]
  10. R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in CHI Conference on Human Factors in Computing Systems (ACM, 2016), pp. 1211–1220.
  11. K. Rathinavel, G. Wetzstein, and H. Fuchs, “Varifocal occlusion-capable optical see-through augmented reality display based on focus-tunable optics,” IEEE Trans. Visual Comput. Graphics 25, 3125–3134 (2019).
    [Crossref]
  12. D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32, 220 (2013).
    [Crossref]
  13. F.-C. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 34, 60 (2015).
    [Crossref]
  14. H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display,” Opt. Express 22, 13484–13491 (2014).
    [Crossref]
  15. C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).
    [Crossref]
  16. K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).
    [Crossref]
  17. B. C. Kress, “Human factors,” in Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets (SPIE, 2010).
  18. B. Kress and T. Starner, “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. SPIE 8720, 87200A (2013).
    [Crossref]
  19. C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).
    [Crossref]
  20. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008).
    [Crossref]
  21. B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013).
    [Crossref]
  22. H. Urey and K. D. Powell, “Microlens-array-based exit-pupil expander for full-color displays,” Appl. Opt. 44, 4930–4936 (2005).
    [Crossref]
  23. T. Levola, “Diffractive optics for virtual reality displays,” J. Soc. Inf. Disp. 14, 467–475 (2006).
    [Crossref]
  24. C. Yu, Y. Peng, Q. Zhao, H. Li, and X. Liu, “Highly efficient waveguide display with space-variant volume holographic gratings,” Appl. Opt. 56, 9390–9397 (2017).
    [Crossref]
  25. M. K. Hedili, B. Soner, E. Ulusoy, and H. Urey, “Light-efficient augmented reality display with steerable eyebox,” Opt. Express 27, 12572–12581 (2019).
    [Crossref]
  26. Q. Gao, J. Liu, J. Han, and X. Li, “Monocular 3D see-through head-mounted display via complex amplitude modulation,” Opt. Express 24, 17372–17383 (2016).
    [Crossref]
  27. J.-S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23, 18143–18155 (2015).
    [Crossref]
  28. E. Moon, M. Kim, J. Roh, H. Kim, and J. Hahn, “Holographic head-mounted display with RGB light emitting diode light source,” Opt. Express 22, 6526–6534 (2014).
    [Crossref]
  29. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25, 8412–8424 (2017).
    [Crossref]
  30. C. W. Ooi, N. Muramatsu, and Y. Ochiai, “Eholo glass: electroholography glass. a lensless approach to holographic augmented reality near-eye display,” in SIGGRAPH Asia 2018 Technical Briefs (ACM, 2018), p. 31.
  31. A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017).
    [Crossref]
  32. H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23, 32025–32034 (2015).
    [Crossref]
  33. T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).
  34. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016).
    [Crossref]
  35. P. Zhou, Y. Li, S. Liu, and Y. Su, “Compact design for optical-see-through holographic displays employing holographic optical elements,” Opt. Express 26, 22866–22876 (2018).
    [Crossref]
  36. K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019).
    [Crossref]
  37. C. Martinez, V. Krotov, B. Meynard, and D. Fowler, “See-through holographic retinal projection display concept,” Optica 5, 1200–1209 (2018).
    [Crossref]
  38. W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism correction and quality optimization of computer-generated holograms for holographic waveguide displays,” Opt. Express 28, 5519–5527 (2020).
    [Crossref]
  39. J. Xiao, J. Liu, Z. Lv, X. Shi, and J. Han, “On-axis near-eye display system based on directional scattering holographic waveguide and curved goggle,” Opt. Express 27, 1683–1692 (2019).
    [Crossref]
  40. “Holographic optical elements and application,” IntechOpen, https://www.intechopen.com/books/holographic-materials-and-optical-systems/holographic-optical-elements-and-application .
  41. Z. Huang, D. L. Marks, and D. R. Smith, “Out-of-plane computer-generated multicolor waveguide holography,” Optica 6, 119–124 (2019).
    [Crossref]
  42. Z. Huang, D. L. Marks, and D. R. Smith, “Polarization-selective waveguide holography in the visible spectrum,” Opt. Express 27, 35631–35645 (2019).
    [Crossref]
  43. R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).
    [Crossref]
  44. S. Y. Cho, M. Ono, H. Yoshida, and M. Ozaki, “Bragg-Berry flat reflectors for transparent computer-generated holograms and waveguide holography with visible color playback capability,” Sci. Rep. 10, 8201 (2020).
    [Crossref]
  45. C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019).
    [Crossref]
  46. M.-H. Choi, Y.-G. Ju, and J.-H. Park, “Holographic near-eye display with continuously expanded eyebox using two-dimensional replication and angular spectrum wrapping,” Opt. Express 28, 533–547 (2020).
    [Crossref]
  47. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).
  48. J.-H. Park and S.-B. Kim, “Optical see-through holographic near-eye-display with eyebox steering and depth of field control,” Opt. Express 26, 27076–27088 (2018).
    [Crossref]
  49. J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27, 38006–38018 (2019).
    [Crossref]
  50. S.-B. Kim and J.-H. Park, “Optical see-through Maxwellian near-to-eye display with an enlarged eyebox,” Opt. Lett. 43, 767–770 (2018).
    [Crossref]
  51. C. Chang, W. Cui, J. Park, and L. Gao, “Computational holographic Maxwellian near-eye display with an expanded eyebox,” Sci. Rep. 9, 18749 (2019).
    [Crossref]
  52. J. Amako, H. Miura, and T. Sonehara, “Speckle-noise reduction on kinoform reconstruction using a phase-only spatial light modulator,” Appl. Opt. 34, 3165–3171 (1995).
    [Crossref]
  53. C. Chang, W. Cui, and L. Gao, “Holographic multiplane near-eye display based on amplitude-only wavefront modulation,” Opt. Express 27, 30960–30970 (2019).
    [Crossref]
  54. M. Makowski, “Minimized speckle noise in lens-less holographic projection by pixel separation,” Opt. Express 21, 29205–29216 (2013).
    [Crossref]
  55. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21, 20577–20587 (2013).
    [Crossref]
  56. G. Xue, J. Liu, X. Li, J. Jia, Z. Zhang, B. Hu, and Y. Wang, “Multiplexing encoding method for full-color dynamic 3D holographic display,” Opt. Express 22, 18473–18482 (2014).
    [Crossref]
  57. Y. Qi, C. Chang, and J. Xia, “Speckleless holographic display by complex modulation based on double-phase method,” Opt. Express 24, 30368–30378 (2016).
    [Crossref]
  58. C. Chang, Y. Qi, J. Wu, J. Xia, and S. Nie, “Speckle reduced lensless holographic projection from phase-only computer-generated hologram,” Opt. Express 25, 6568–6580 (2017).
    [Crossref]
  59. C. Chang, J. Xia, L. Yang, W. Lei, Z. Yang, and J. Chen, “Speckle-suppressed phase-only holographic three-dimensional display based on double-constraint Gerchberg–Saxton algorithm,” Appl. Opt. 54, 6994–7001 (2015).
    [Crossref]
  60. P. Sun, S. Chang, S. Liu, X. Tao, C. Wang, and Z. Zheng, “Holographic near-eye display system based on double-convergence light Gerchberg-Saxton algorithm,” Opt. Express 26, 10140–10151 (2018).
    [Crossref]
  61. P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).
    [Crossref]
  62. Y. Peng, S. Choi, N. Pandmanaban, and G. Wetzstein, “Neural holography with camera-in-the-loop training,” in SIGGRAPH Asia (ACM, 2020).
  63. Y. Deng and D. Chu, “Coherence properties of different light sources and their effect on the image sharpness and speckle of holographic displays,” Sci. Rep. 7, 5893 (2017).
    [Crossref]
  64. T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24, 2189–2199 (2016).
    [Crossref]
  65. M. Chlipala and T. Kozacki, “Color LED DMD holographic display with high resolution across large depth,” Opt. Lett. 44, 4255–4258 (2019).
    [Crossref]
  66. A. Olwal and B. Kress, “1D eyewear: peripheral, hidden LEDs and near-eye holographic displays for unobtrusive augmentation,” in ACM International Symposium on Wearable Computers (ISWC) (ACM, 2018), pp. 184–187.
  67. D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.
  68. G. Li, “Study on improvements of near-eye holography: form factor, field of view, and speckle noise reduction,” Thesis (Seoul National University, 2018).
  69. C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of Phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019).
    [Crossref]
  70. K. Matsushima, “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44, 4607–4614 (2005).
    [Crossref]
  71. S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20, 45–59 (2019).
    [Crossref]
  72. J. Xia, W. Zhu, and I. Heynderickx, “41.1: three-dimensional electro-holographic retinal display,” SID Symp. Dig. Tech. Pap. 42, 591–594 (2011).
    [Crossref]
  73. Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).
    [Crossref]
  74. K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23,804–813 (2004).
    [Crossref]
  75. S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
    [Crossref]
  76. S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940–20952 (2011).
    [Crossref]
  77. S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18, 11562–11573 (2010).
    [Crossref]
  78. R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).
    [Crossref]
  79. K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graph. 24, 2857–2866 (2018).
    [Crossref]
  80. J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019).
    [Crossref]
  81. S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).
    [Crossref]
  82. L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).
    [Crossref]
  83. Z. Wang, L. M. Zhu, X. Zhang, P. Dai, G. Q. Lv, Q. B. Feng, A. T. Wang, and H. Ming, “Computer-generated photorealistic hologram using ray-wavefront conversion based on the additive compressive light field approach,” Opt. Lett. 45, 615–618 (2020).
    [Crossref]
  84. J.-H. Park and M. Askari, “Non-hogel-based computer generated hologram from light field using complex field recovery technique from Wigner distribution function,” Opt. Express 27, 2562–2574 (2019).
    [Crossref]
  85. N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38, 214 (2019).
    [Crossref]
  86. Z. Zhang, J. Liu, Q. Gao, X. Duan, and X. Shi, “A full-color compact 3D see-through near-eye display system based on complex amplitude modulation,” Opt. Express 27, 7023–7035 (2019).
    [Crossref]
  87. R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).
    [Crossref]
  88. P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).
    [Crossref]
  89. Y. Takaki and N. Fujimoto, “Flexible retinal image formation by holographic Maxwellian-view display,” Opt. Express 26, 22985–22999 (2018).
    [Crossref]
  90. J. S. Lee, Y. K. Kim, and Y. H. Won, “See-through display combined with holographic display and Maxwellian display using switchable holographic optical element based on liquid lens,” Opt. Express 26, 19341–19355 (2018).
    [Crossref]
  91. J. S. Lee, Y. K. Kim, and Y. H. Won, “Time multiplexing technique of holographic view and Maxwellian view using a liquid lens in the optical see-through head mounted display,” Opt. Express 26, 2149–2159 (2018).
    [Crossref]
  92. J. S. Lee, Y. K. Kim, M. Y. Lee, and Y. H. Won, “Enhanced see-through near-eye display using time-division multiplexing of a Maxwellian-view and holographic display,” Opt. Express 27, 689–701 (2019).
    [Crossref]
  93. M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19, 12008–12013 (2011).
    [Crossref]
  94. H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010).
    [Crossref]
  95. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20, 25130–25136 (2012).
    [Crossref]
  96. X. Yang, P. Song, H. Zhang, and Q.-H. Wang, “Full-color computer-generated holographic near-eye display based on white light illumination,” Opt. Express 27, 38236–38249 (2019).
    [Crossref]
  97. H.-Y. Wu, C.-W. Shin, and N. Kim, “Full-color holographic optical elements for augmented reality display,” in Holographic Materials and Applications, M. Kumar, ed. (IntechOpen, 2019).
  98. Y.-Z. Liu, X.-N. Pang, S. Jiang, and J.-W. Dong, “Viewing-angle enlargement in holographic augmented reality using time division and spatial tiling,” Opt. Express 21, 12068–12076 (2013).
  99. Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).
  100. Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).
  101. N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).
  102. T. Kozacki, M. Kujawińska, G. Finke, B. Hennelly, and N. Pandey, “Extended viewing angle holographic display system with tilted SLMs in a circular configuration,” Appl. Opt. 51, 1771–1780 (2012).
  103. J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16, 12372–12386 (2008).
  104. F. Yaraş, H. Kang, and L. Onural, “Circular holographic video display system,” Opt. Express 19, 9147–9156 (2011).
  105. T. Kozacki, G. Finke, P. Garbat, W. Zaperty, and M. Kujawińska, “Wide angle holographic display system with spatiotemporal multiplexing,” Opt. Express 20, 27473–27481 (2012).
  106. H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).
  107. H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).
  108. L. Huang, S. Zhang, and T. Zentgraf, “Metasurface holography: from fundamentals to applications,” Nanophotonics 7, 1169–1190 (2018).
  109. Q. Jiang, G. Jin, and L. Cao, “When metasurface meets hologram: principle and advances,” Adv. Opt. Photon. 11, 518–576 (2019).
  110. Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).
  111. R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018).
  112. J. Li, Q. Smithwick, and D. Chu, “Full bandwidth dynamic coarse integral holographic displays with large field of view using a large resonant scanner and a galvanometer scanner,” Opt. Express 26, 17459–17476 (2018).
  113. J. Li, Q. Smithwick, and D. Chu, “Scalable coarse integral holographic video display with integrated spatial image tiling,” Opt. Express 28, 9899–9912 (2020).
  114. G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).
  115. G. Tan, Y.-H. Lee, T. Zhan, J. Yang, S. Liu, D. Zhao, and S.-T. Wu, “Foveated imaging for near-eye displays,” Opt. Express 26, 25076–25085 (2018).
  116. S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).
  117. J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
  118. J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).
  119. J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.
  120. J. Hong, “Foveation in near-eye holographic display,” in International Conference on Information and Communication Technology Convergence (ICTC) (2018), pp. 602–604.
  121. Y.-G. Ju and J.-H. Park, “Foveated computer-generated hologram and its progressive update using triangular mesh scene model for near-eye displays,” Opt. Express 27, 23725–23738 (2019).
  122. L. Wei and Y. Sakamoto, “Fast calculation method with foveated rendering for computer-generated holograms using an angle-changeable ray-tracing method,” Appl. Opt. 58, A258–A266 (2019).
  123. C. Chang, W. Cui, and L. Gao, “Foveated holographic near-eye 3D display,” Opt. Express 28, 1345–1356 (2020).
  124. D. Wang, Y. Guo, S. Liu, Y. Zhang, W. Xu, and J. Xiao, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019).
  125. S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018).
  126. M. E. Lucente, “Interactive computation of holograms using a look-up table,” J. Electron. Imaging 2, 28 (1993).
  127. S.-C. Kim and E.-S. Kim, “Effective generation of digital holograms of three-dimensional objects using a novel look-up table method,” Appl. Opt. 47, D55–D62 (2008).
  128. J. Liu, J. Jia, Y. Pan, and Y. Wang, “Overview of fast algorithm in 3D dynamic holographic display,” Proc. SPIE 8913, 89130X (2013).
  129. T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18, 19504–19509 (2010).
  130. Y. Pan, Y. Wang, J. Liu, X. Li, and J. Jia, “Fast polygon-based method for calculating computer-generated holograms in three-dimensional display,” Appl. Opt. 52, A290–A299 (2013).
  131. Y.-Z. Liu, J.-W. Dong, Y.-Y. Pu, B.-C. Chen, H.-X. He, and H.-Z. Wang, “High-speed full analytical holographic computations for true-life scenes,” Opt. Express 18, 3345–3351 (2010).
  132. X.-N. Pang, D.-C. Chen, Y.-C. Ding, Y.-G. Chen, S.-J. Jiang, and J.-W. Dong, “Image quality improvement of polygon computer generated holography,” Opt. Express 23, 19066–19073 (2015).
  133. H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55, A135–A143 (2016).
  134. Y. Takaki and K. Ikeda, “Simplified calculation method for computer-generated holographic stereograms from multi-view images,” Opt. Express 21, 9652–9663 (2013).
  135. T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016).
  136. T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002).
  137. T. Ito, N. Masuda, K. Yoshimura, A. Shiraki, T. Shimobaba, and T. Sugie, “Special-purpose computer HORN-5 for a real-time electroholography,” Opt. Express 13, 1923–1932 (2005).
  138. Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009).
  139. N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).
  140. N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51, 7303–7307 (2012).
  141. A. Gilles and P. Gioia, “Real-time layer-based computer-generated hologram calculation for the Fourier transform optical system,” Appl. Opt. 57, 8508–8517 (2018).
  142. H. Kim, C.-Y. Hwang, K.-S. Kim, J. Roh, W. Moon, S. Kim, B.-R. Lee, S. Oh, and J. Hahn, “Anamorphic optical transformation of an amplitude spatial light modulator to a complex spatial light modulator with square pixels,” Appl. Opt. 53, G139–G146 (2014).
  143. “ISE to feature world’s first commercial 8K DLP projector,” https://www.svconline.com/the-wire/ise-feature-world-s-first-commercial-8k-dlp-projector-410993 .
  144. G. Kayye, “DPI Readies Industry’s First 8K DLP Projector, Ships 4K 3-Chip DLP Projector at 12.5K Lumens—rAVe [PUBS],” https://www.ravepubs.com/dpi-readies-industrys-first-8k-dlp-projector-demos-delivers-4k-12-5k-lumens-3-chip-dlp/ .
  145. H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017).
  146. E. Buckley, A. Cable, N. Lawrence, and T. Wilkinson, “Viewing angle enhancement for two- and three-dimensional holographic displays with random superresolution phase masks,” Appl. Opt. 45, 7334–7341 (2006).
  147. W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.
  148. J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019).
  149. R. Horisaki, R. Takagi, and J. Tanida, “Learning-based imaging through scattering media,” Opt. Express 24, 13738–13743 (2016).
  150. Y. Li, Y. Xue, and L. Tian, “Deep speckle correlation: a deep learning approach toward scalable imaging through scattering media,” Optica 5, 1181–1190 (2018).
  151. A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).
  152. C. Cepko, “Giving in to the blues,” Nat. Genet. 24, 99–100 (2000).
  153. Z. Qiu, Z. Zhang, and J. Zhong, “Efficient full-color single-pixel imaging based on the human vision property—‘giving in to the blues’,” Opt. Lett. 45, 3046–3049 (2020).
  154. S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).
  155. S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018).
  156. R. Konrad, A. Angelopoulos, and G. Wetzstein, “Gaze-contingent ocular parallax rendering for virtual reality,” in ACM SIGGRAPH 2019 Talks (ACM, 2019), p. 56.

2020 (7)

2019 (33)

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).
[Crossref]

J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019).
[Crossref]

W. Dangxiao, G. Yuan, L. Shiyi, Z. Yuru, X. Weiliang, and X. Jing, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019).
[Crossref]

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Y.-G. Ju and J.-H. Park, “Foveated computer-generated hologram and its progressive update using triangular mesh scene model for near-eye displays,” Opt. Express 27, 23725–23738 (2019).
[Crossref]

L. Wei and Y. Sakamoto, “Fast calculation method with foveated rendering for computer-generated holograms using an angle-changeable ray-tracing method,” Appl. Opt. 58, A258–A266 (2019).
[Crossref]

Q. Jiang, G. Jin, and L. Cao, “When metasurface meets hologram: principle and advances,” Adv. Opt. Photon. 11, 518–576 (2019).
[Crossref]

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).
[Crossref]

H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).
[Crossref]

J.-H. Park and M. Askari, “Non-hogel-based computer generated hologram from light field using complex field recovery technique from Wigner distribution function,” Opt. Express 27, 2562–2574 (2019).
[Crossref]

N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38, 214 (2019).
[Crossref]

Z. Zhang, J. Liu, Q. Gao, X. Duan, and X. Shi, “A full-color compact 3D see-through near-eye display system based on complex amplitude modulation,” Opt. Express 27, 7023–7035 (2019).
[Crossref]

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).
[Crossref]

J. S. Lee, Y. K. Kim, M. Y. Lee, and Y. H. Won, “Enhanced see-through near-eye display using time-division multiplexing of a Maxwellian-view and holographic display,” Opt. Express 27, 689–701 (2019).
[Crossref]

X. Yang, P. Song, H. Zhang, and Q.-H. Wang, “Full-color computer-generated holographic near-eye display based on white light illumination,” Opt. Express 27, 38236–38249 (2019).
[Crossref]

C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019).
[Crossref]

J. Xiao, J. Liu, Z. Lv, X. Shi, and J. Han, “On-axis near-eye display system based on directional scattering holographic waveguide and curved goggle,” Opt. Express 27, 1683–1692 (2019).
[Crossref]

Z. Huang, D. L. Marks, and D. R. Smith, “Out-of-plane computer-generated multicolor waveguide holography,” Optica 6, 119–124 (2019).
[Crossref]

Z. Huang, D. L. Marks, and D. R. Smith, “Polarization-selective waveguide holography in the visible spectrum,” Opt. Express 27, 35631–35645 (2019).
[Crossref]

R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).
[Crossref]

J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27, 38006–38018 (2019).
[Crossref]

C. Chang, W. Cui, J. Park, and L. Gao, “Computational holographic Maxwellian near-eye display with an expanded eyebox,” Sci. Rep. 9, 18749 (2019).
[Crossref]

C. Chang, W. Cui, and L. Gao, “Holographic multiplane near-eye display based on amplitude-only wavefront modulation,” Opt. Express 27, 30960–30970 (2019).
[Crossref]

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).
[Crossref]

M. Chlipala and T. Kozacki, “Color LED DMD holographic display with high resolution across large depth,” Opt. Lett. 44, 4255–4258 (2019).
[Crossref]

C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of Phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019).
[Crossref]

S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20, 45–59 (2019).
[Crossref]

J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019).
[Crossref]

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).
[Crossref]

W. Cui and L. Gao, “All-passive transformable optical mapping near-eye display,” Sci. Rep. 9, 6064 (2019).
[Crossref]

K. Rathinavel, G. Wetzstein, and H. Fuchs, “Varifocal occlusion-capable optical see-through augmented reality display based on focus-tunable optics,” IEEE Trans. Visual Comput. Graphics 25, 3125–3134 (2019).
[Crossref]

M. K. Hedili, B. Soner, E. Ulusoy, and H. Urey, “Light-efficient augmented reality display with steerable eyebox,” Opt. Express 27, 12572–12581 (2019).
[Crossref]

K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019).
[Crossref]

2018 (20)

C. Martinez, V. Krotov, B. Meynard, and D. Fowler, “See-through holographic retinal projection display concept,” Optica 5, 1200–1209 (2018).
[Crossref]

P. Zhou, Y. Li, S. Liu, and Y. Su, “Compact design for optical-see-through holographic displays employing holographic optical elements,” Opt. Express 26, 22866–22876 (2018).
[Crossref]

P. Sun, S. Chang, S. Liu, X. Tao, C. Wang, and Z. Zheng, “Holographic near-eye display system based on double-convergence light Gerchberg-Saxton algorithm,” Opt. Express 26, 10140–10151 (2018).
[Crossref]

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).
[Crossref]

S.-B. Kim and J.-H. Park, “Optical see-through Maxwellian near-to-eye display with an enlarged eyebox,” Opt. Lett. 43, 767–770 (2018).
[Crossref]

J.-H. Park and S.-B. Kim, “Optical see-through holographic near-eye-display with eyebox steering and depth of field control,” Opt. Express 26, 27076–27088 (2018).
[Crossref]

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).
[Crossref]

K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graph. 24, 2857–2866 (2018).
[Crossref]

Y. Takaki and N. Fujimoto, “Flexible retinal image formation by holographic Maxwellian-view display,” Opt. Express 26, 22985–22999 (2018).
[Crossref]

J. S. Lee, Y. K. Kim, and Y. H. Won, “See-through display combined with holographic display and Maxwellian display using switchable holographic optical element based on liquid lens,” Opt. Express 26, 19341–19355 (2018).
[Crossref]

J. S. Lee, Y. K. Kim, and Y. H. Won, “Time multiplexing technique of holographic view and Maxwellian view using a liquid lens in the optical see-through head mounted display,” Opt. Express 26, 2149–2159 (2018).
[Crossref]

L. Huang, S. Zhang, and T. Zentgraf, “Metasurface holography: from fundamentals to applications,” Nanophotonics 7, 1169–1190 (2018).
[Crossref]

R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018).
[Crossref]

J. Li, Q. Smithwick, and D. Chu, “Full bandwidth dynamic coarse integral holographic displays with large field of view using a large resonant scanner and a galvanometer scanner,” Opt. Express 26, 17459–17476 (2018).
[Crossref]

G. Tan, Y.-H. Lee, T. Zhan, J. Yang, S. Liu, D. Zhao, and S.-T. Wu, “Foveated imaging for near-eye displays,” Opt. Express 26, 25076–25085 (2018).
[Crossref]

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).
[Crossref]

S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018).
[Crossref]

A. Gilles and P. Gioia, “Real-time layer-based computer-generated hologram calculation for the Fourier transform optical system,” Appl. Opt. 57, 8508–8517 (2018).
[Crossref]

Y. Li, Y. Xue, and L. Tian, “Deep speckle correlation: a deep learning approach toward scalable imaging through scattering media,” Optica 5, 1181–1190 (2018).
[Crossref]

S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018).
[Crossref]

2017 (13)

H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017).
[Crossref]

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).
[Crossref]

Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).
[Crossref]

R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).
[Crossref]

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).
[Crossref]

Y. Deng and D. Chu, “Coherence properties of different light sources and their effect on the image sharpness and speckle of holographic displays,” Sci. Rep. 7, 5893 (2017).
[Crossref]

C. Chang, Y. Qi, J. Wu, J. Xia, and S. Nie, “Speckle reduced lensless holographic projection from phase-only computer-generated hologram,” Opt. Express 25, 6568–6580 (2017).
[Crossref]

Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25, 8412–8424 (2017).
[Crossref]

A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017).
[Crossref]

C. Yu, Y. Peng, Q. Zhao, H. Li, and X. Liu, “Highly efficient waveguide display with space-variant volume holographic gratings,” Appl. Opt. 56, 9390–9397 (2017).
[Crossref]

W. Cui and L. Gao, “Optical mapping near-eye three-dimensional display with correct focus cues,” Opt. Lett. 42, 2475–2478 (2017).
[Crossref]

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).
[Crossref]

N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).
[Crossref]

2016 (11)

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).
[Crossref]

T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016).
[Crossref]

Y. Wang, W. Liu, X. Meng, H. Fu, D. Zhang, Y. Kang, R. Feng, Z. Wei, X. Zhu, and G. Jiang, “Development of an immersive virtual reality head-mounted display with high performance,” Appl. Opt. 55, 6969–6977 (2016).
[Crossref]

Q. Gao, J. Liu, J. Han, and X. Li, “Monocular 3D see-through head-mounted display via complex amplitude modulation,” Opt. Express 24, 17372–17383 (2016).
[Crossref]

G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016).
[Crossref]

T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24, 2189–2199 (2016).
[Crossref]

Y. Qi, C. Chang, and J. Xia, “Speckleless holographic display by complex modulation based on double-phase method,” Opt. Express 24, 30368–30378 (2016).
[Crossref]

R. Horisaki, R. Takagi, and J. Tanida, “Learning-based imaging through scattering media,” Opt. Express 24, 13738–13743 (2016).
[Crossref]

T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016).
[Crossref]

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).
[Crossref]

H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55, A135–A143 (2016).
[Crossref]

2015 (8)

X.-N. Pang, D.-C. Chen, Y.-C. Ding, Y.-G. Chen, S.-J. Jiang, and J.-W. Dong, “Image quality improvement of polygon computer generated holography,” Opt. Express 23, 19066–19073 (2015).
[Crossref]

H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).
[Crossref]

G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).
[Crossref]

C. Chang, J. Xia, L. Yang, W. Lei, Z. Yang, and J. Chen, “Speckle-suppressed phase-only holographic three-dimensional display based on double-constraint Gerchberg–Saxton algorithm,” Appl. Opt. 54, 6994–7001 (2015).
[Crossref]

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).
[Crossref]

H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23, 32025–32034 (2015).
[Crossref]

J.-S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23, 18143–18155 (2015).
[Crossref]

F.-C. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 34, 60 (2015).
[Crossref]

2014 (4)

2013 (9)

2012 (4)

2011 (4)

2010 (6)

2009 (2)

2008 (3)

2006 (2)

2005 (3)

2004 (2)

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23,804–813 (2004).
[Crossref]

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
[Crossref]

2002 (1)

T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002).
[Crossref]

2000 (1)

C. Cepko, “Giving in to the blues,” Nat. Genet. 24, 99–100 (2000).
[Crossref]

1999 (1)

T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

1995 (3)

E. Peli, “Real vision & virtual reality,” Opt. Photon. News 6(7), 28 (1995).
[Crossref]

J. Amako, H. Miura, and T. Sonehara, “Speckle-noise reduction on kinoform reconstruction using a phase-only spatial light modulator,” Appl. Opt. 34, 3165–3171 (1995).
[Crossref]

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).
[Crossref]

1993 (1)

M. E. Lucente, “Interactive computation of holograms using a look-up table,” J. Electron. Imaging 2, 28 (1993).
[Crossref]

1990 (1)

C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).
[Crossref]

Akeley, K.

S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940–20952 (2011).
[Crossref]

D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008).
[Crossref]

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23,804–813 (2004).
[Crossref]

Aksit, K.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Albert, R.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Albert, R. A.

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).
[Crossref]

Amako, J.

Ando, T.

T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

Angelopoulos, A.

R. Konrad, A. Angelopoulos, and G. Wetzstein, “Gaze-contingent ocular parallax rendering for virtual reality,” in ACM SIGGRAPH 2019 Talks (ACM, 2019), p. 56.

Askari, M.

Awazu, S.

Bang, K.

K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019).
[Crossref]

C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019).
[Crossref]

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).
[Crossref]

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

Banks, M. S.

S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018).
[Crossref]

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).

S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940–20952 (2011).

G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716–15725 (2009).

D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008).

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23, 804–813 (2004).

Blate, A.

K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graphics 24, 2857–2866 (2018).

Boev, A.

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).

Boudaoud, B.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).

Bourquin, S.

T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016).

Buckley, E.

Bulbul, A.

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).

Cable, A.

Cai, Z.

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).

Cao, L.

Cepko, C.

C. Cepko, “Giving in to the blues,” Nat. Genet. 24, 99–100 (2000).

Chakravarthula, P.

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).

Chang, C.

Chang, J.-H. R.

J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019).

Chang, K.-M.

C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019).

Chang, S.

Chen, B.-C.

Chen, C.

C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019).

Chen, D.-C.

Chen, J.

Chen, J.-S.

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).

J.-S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23, 18143–18155 (2015).

Chen, K.

F.-C. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 34, 60 (2015).

Chen, Y.

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).

Chen, Y.-G.

Chen, Z.

Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).

Chlipala, M.

Cho, J.

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).

G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016).

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

Cho, S. Y.

S. Y. Cho, M. Ono, H. Yoshida, and M. Ozaki, “Bragg-Berry flat reflectors for transparent computer-generated holograms and waveguide holography with visible color playback capability,” Sci. Rep. 10, 8201 (2020).

Choi, M.-H.

Choi, S.

Y. Peng, S. Choi, N. Padmanaban, and G. Wetzstein, “Neural holography with camera-in-the-loop training,” in SIGGRAPH Asia (ACM, 2020).

Choi, W.-Y.

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

Cholewiak, S. A.

S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018).

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).

Choo, H.-G.

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

Chu, D.

J. Li, Q. Smithwick, and D. Chu, “Scalable coarse integral holographic video display with integrated spatial image tiling,” Opt. Express 28, 9899–9912 (2020).

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).

J. Li, Q. Smithwick, and D. Chu, “Full bandwidth dynamic coarse integral holographic displays with large field of view using a large resonant scanner and a galvanometer scanner,” Opt. Express 26, 17459–17476 (2018).

Y. Deng and D. Chu, “Coherence properties of different light sources and their effect on the image sharpness and speckle of holographic displays,” Sci. Rep. 7, 5893 (2017).

Chu, D. P.

Cooper, E. A.

N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).

R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).

R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in CHI Conference on Human Factors in Computing Systems (ACM, 2016), pp. 1211–1220.

Cui, W.

Curcio, C. A.

C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).

Dai, P.

Dai, Z.

H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).

Dangxiao, W.

W. Dangxiao, G. Yuan, L. Shiyi, Z. Yuru, X. Weiliang, and X. Jing, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019).

Deng, Y.

Y. Deng and D. Chu, “Coherence properties of different light sources and their effect on the image sharpness and speckle of holographic displays,” Sci. Rep. 7, 5893 (2017).

Ding, Y.-C.

Dong, J.-W.

Duan, H.

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).

Duan, X.

Ducin, I.

Feng, Q. B.

Feng, R.

Finke, G.

Fowler, D.

Fructuoso, H. N.

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).

Fu, H.

Fuchs, H.

K. Rathinavel, G. Wetzstein, and H. Fuchs, “Varifocal occlusion-capable optical see-through augmented reality display based on focus-tunable optics,” IEEE Trans. Visual Comput. Graphics 25, 3125–3134 (2019).

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).

K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graphics 24, 2857–2866 (2018).

Fujimoto, N.

Fukaya, N.

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).

Gao, H.

H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).

Gao, J.

Gao, L.

Gao, Q.

Gao, X.

Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).

Garbat, P.

Georgiou, A.

A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017).

Gilles, A.

Gioia, P.

Girshick, A. R.

D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008).

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23, 804–813 (2004).

Goodall, T.

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).

Goodman, J. W.

J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).

Greer, T.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).

Guo, P.

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).

Hahn, J.

Hamada, Y.

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).

Han, J.

Hands, P. J. W.

Hashimura, S.

R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).

He, H.-X.

He, M.-Y.

C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019).

Hedili, M. K.

Heide, F.

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).

Hendrickson, A. E.

C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).

Hennelly, B.

Heynderickx, I.

J. Xia, W. Zhu, and I. Heynderickx, “41.1: three-dimensional electro-holographic retinal display,” SID Symp. Dig. Tech. Pap. 42, 591–594 (2011).

Hishinuma, S.

T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002).

Hoffman, D. M.

G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716–15725 (2009).

D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008).

Honda, T.

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).

Hong, J.

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

J. Hong, “Foveation in near-eye holographic display,” in International Conference on Information and Communication Technology Convergence (ICTC) (2018), pp. 602–604.

Hong, K.

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

Hong, S.

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

Horisaki, R.

Hsieh, P.-Y.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).

Hu, B.

Hu, Y.

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).

Hua, H.

Huang, F.-C.

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).

F.-C. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 34, 60 (2015).

Huang, L.

L. Huang, S. Zhang, and T. Zentgraf, “Metasurface holography: from fundamentals to applications,” Nanophotonics 7, 1169–1190 (2018).

R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018).

Huang, S.

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).

Huang, Y.-P.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).

Huang, Z.

Hwang, C.-Y.

Ichihashi, Y.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).

H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).

H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010).

Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009).

Ikeda, K.

Ito, T.

S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018).

T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016).

N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51, 7303–7307 (2012).

M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19, 12008–12013 (2011).

H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010).

T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18, 19504–19509 (2010).

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).

Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009).

T. Ito, N. Masuda, K. Yoshimura, A. Shiraki, T. Shimobaba, and T. Sugie, “Special-purpose computer HORN-5 for a real-time electroholography,” Opt. Express 13, 1923–1932 (2005).

T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002).

Jang, C.

K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019).

C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019).

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).

G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).

Javidi, B.

Jeong, J.

Jeong, Y.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).

G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016).

Ji, Y.-M.

Jia, J.

Jiang, G.

Jiang, Q.

Jiang, S.

Jiang, S.-J.

Jin, G.

Jing, X.

W. Dangxiao, G. Yuan, L. Shiyi, Z. Yuru, X. Weiliang, and X. Jing, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019).

Jo, Y.

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).

Ju, Y.-G.

Kadowaki, K.

R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).

Kakarenko, K.

Kakue, T.

S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018).

T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016).

Kalina, R. E.

C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).

Kang, H.

H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55, A135–A143 (2016).

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).

F. Yaraş, H. Kang, and L. Onural, “Circular holographic video display system,” Opt. Express 19, 9147–9156 (2011).

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

Kang, Y.

Kaplanyan, A. S.

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).

Kazempourradi, S.

S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20, 45–59 (2019).

Kilcher, L.

T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016).

Kim, D.

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

Kim, E.-S.

Kim, H.

Kim, H.-J.

Kim, J.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).

Kim, K.-S.

Kim, M.

Kim, N.

H.-Y. Wu, C.-W. Shin, and N. Kim, “Full-color holographic optical elements for augmented reality display,” in Holographic Materials and Applications, M. Kumar, ed. (IntechOpen, 2019).

Kim, S.

Kim, S.-B.

Kim, S.-C.

Kim, S.-H.

Kim, Y.

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

Kim, Y. K.

Kirby, A. K.

Kollin, J.

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).

Kollin, J. S.

A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017).

Kolodziejczyk, A.

Konrad, R.

R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).

N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).

R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in CHI Conference on Human Factors in Computing Systems (ACM, 2016), pp. 1211–1220.

R. Konrad, A. Angelopoulos, and G. Wetzstein, “Gaze-contingent ocular parallax rendering for virtual reality,” in ACM SIGGRAPH 2019 Talks (ACM, 2019), p. 56.

Kozacki, T.

Kress, B.

B. Kress and T. Starner, “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. SPIE 8720, 87200A (2013).

A. Olwal and B. Kress, “1D eyewear: peripheral, hidden LEDs and near-eye holographic displays for unobtrusive augmentation,” in ACM International Symposium on Wearable Computers (ISWC) (ACM, 2018), pp. 184–187.

Kress, B. C.

B. C. Kress, “Human factors,” in Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets (SPIE, 2020).

Krotov, V.

Kujawinska, M.

Kumar, B. V. K. V.

J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019).

Kunugi, T.

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).

Lanman, D.

D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32, 220 (2013).

Lawrence, N.

Lee, B.

K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019).

J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27, 38006–38018 (2019).

C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019).

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).

G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016).

G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).

B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013).

J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16, 12372–12386 (2008).

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

Lee, B.-R.

Lee, D.

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).

G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41, 2486–2489 (2016).

G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

Lee, J.

Lee, J. S.

Lee, K.

J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019).
[Crossref]

H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017).
[Crossref]

Lee, M. Y.

Lee, S.

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).
[Crossref]

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).
[Crossref]

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).
[Crossref]

G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).
[Crossref]

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

Lee, S.-Y.

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

Lee, Y.-H.

Lei, W.

Leimkühler, T.

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).
[Crossref]

Levola, T.

T. Levola, “Diffractive optics for virtual reality displays,” J. Soc. Inf. Disp. 14, 467–475 (2006).
[Crossref]

Li, B.

Li, G.

Li, H.

Li, J.

Li, S.

H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).
[Crossref]

Li, X.

Li, Y.

Lim, Y.

Lin, B.-S.

Lin, Q.

Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).
[Crossref]

Lin, W.-K.

Liu, J.

J. Xiao, J. Liu, Z. Lv, X. Shi, and J. Han, “On-axis near-eye display system based on directional scattering holographic waveguide and curved goggle,” Opt. Express 27, 1683–1692 (2019).
[Crossref]

Z. Zhang, J. Liu, Q. Gao, X. Duan, and X. Shi, “A full-color compact 3D see-through near-eye display system based on complex amplitude modulation,” Opt. Express 27, 7023–7035 (2019).
[Crossref]

H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).
[Crossref]

Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25, 8412–8424 (2017).
[Crossref]

Q. Gao, J. Liu, J. Han, and X. Li, “Monocular 3D see-through head-mounted display via complex amplitude modulation,” Opt. Express 24, 17372–17383 (2016).
[Crossref]

G. Xue, J. Liu, X. Li, J. Jia, Z. Zhang, B. Hu, and Y. Wang, “Multiplexing encoding method for full-color dynamic 3D holographic display,” Opt. Express 22, 18473–18482 (2014).
[Crossref]

X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21, 20577–20587 (2013).
[Crossref]

Y. Pan, Y. Wang, J. Liu, X. Li, and J. Jia, “Fast polygon-based method for calculating computer-generated holograms in three-dimensional display,” Appl. Opt. 52, A290–A299 (2013).
[Crossref]

J. Liu, J. Jia, Y. Pan, and Y. Wang, “Overview of fast algorithm in 3D dynamic holographic display,” Proc. SPIE 8913, 89130X (2013).
[Crossref]

Liu, N.

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).
[Crossref]

Liu, P.

Liu, Q.

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).
[Crossref]

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).
[Crossref]

Liu, S.

Liu, W.

Liu, X.

Liu, Y.-Z.

Lopes, W.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).
[Crossref]

Love, G. D.

S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018).
[Crossref]

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).
[Crossref]

G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716–15725 (2009).
[Crossref]

Lu, Y.

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).
[Crossref]

Lucente, M. E.

M. E. Lucente, “Interactive computation of holograms using a look-up table,” J. Electron. Imaging 2, 28 (1993).
[Crossref]

Luebke, D.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).
[Crossref]

D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32, 220 (2013).
[Crossref]

Luo, X.

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).
[Crossref]

Lv, G. Q.

Lv, Z.

Maeno, K.

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).
[Crossref]

Maimone, A.

A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017).
[Crossref]

Majercik, Z.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Makowski, M.

Marks, D. L.

Martinez, C.

Masuda, N.

N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51, 7303–7307 (2012).
[Crossref]

M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19, 12008–12013 (2011).
[Crossref]

H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010).
[Crossref]

T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18, 19504–19509 (2010).
[Crossref]

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).
[Crossref]

Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009).
[Crossref]

T. Ito, N. Masuda, K. Yoshimura, A. Shiraki, T. Shimobaba, and T. Sugie, “Special-purpose computer HORN-5 for a real-time electroholography,” Opt. Express 13, 1923–1932 (2005).
[Crossref]

Matoba, O.

Matsumoto, K.

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).
[Crossref]

Matsumoto, T.

T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

Matsushima, K.

Matusik, W.

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).
[Crossref]

McGuire, M.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Meng, X.

Meynard, B.

Ming, H.

Miura, H.

Molner, K.

R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).
[Crossref]

Moon, E.

Moon, S.

J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27, 38006–38018 (2019).
[Crossref]

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).
[Crossref]

Moon, W.

Muramatsu, N.

C. W. Ooi, N. Muramatsu, and Y. Ochiai, “Eholo glass: electroholography glass. a lensless approach to holographic augmented reality near-eye display,” in SIGGRAPH Asia 2018 Technical Briefs (ACM, 2018), p. 31.

Nakayama, H.

Narain, R.

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).
[Crossref]

Ng, R.

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).
[Crossref]

Nie, S.

Nishikawa, O.

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).
[Crossref]

North, T.

T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016).
[Crossref]

O’Brien, J. F.

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).
[Crossref]

Ochiai, Y.

C. W. Ooi, N. Muramatsu, and Y. Ochiai, “Eholo glass: electroholography glass. a lensless approach to holographic augmented reality near-eye display,” in SIGGRAPH Asia 2018 Technical Briefs (ACM, 2018), p. 31.

Oh, K.-J.

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

Oh, S.

Ohtsuka, S.

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
[Crossref]

Oi, R.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).
[Crossref]

H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).
[Crossref]

Oikawa, M.

Okada, N.

Okui, M.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).
[Crossref]

Okunev, M.

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).
[Crossref]

Olwal, A.

A. Olwal and B. Kress, “1D eyewear: peripheral, hidden LEDs and near-eye holographic displays for unobtrusive augmentation,” in ACM International Symposium on Wearable Computers (ISWC) (ACM, 2018), pp. 184–187.

Ono, M.

S. Y. Cho, M. Ono, H. Yoshida, and M. Ozaki, “Bragg-Berry flat reflectors for transparent computer-generated holograms and waveguide holography with visible color playback capability,” Sci. Rep. 10, 8201 (2020).
[Crossref]

Onural, L.

Ooi, C. W.

C. W. Ooi, N. Muramatsu, and Y. Ochiai, “Eholo glass: electroholography glass. a lensless approach to holographic augmented reality near-eye display,” in SIGGRAPH Asia 2018 Technical Briefs (ACM, 2018), p. 31.

Ozaki, M.

S. Y. Cho, M. Ono, H. Yoshida, and M. Ozaki, “Bragg-Berry flat reflectors for transparent computer-generated holograms and waveguide holography with visible color playback capability,” Sci. Rep. 10, 8201 (2020).
[Crossref]

R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).
[Crossref]

Ozaki, R.

R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).
[Crossref]

Padmanaban, N.

N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38, 214 (2019).
[Crossref]

R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).
[Crossref]

N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).
[Crossref]

Pan, Y.

Pandey, N.

Pandmanaban, N.

Y. Peng, S. Choi, N. Pandmanaban, and G. Wetzstein, “Neural holography with camera-in-the-loop training,” in SIGGRAPH Asia (ACM, 2020).

Pang, X.-N.

Park, G.

Park, J.

C. Chang, W. Cui, J. Park, and L. Gao, “Computational holographic Maxwellian near-eye display with an expanded eyebox,” Sci. Rep. 9, 18749 (2019).
[Crossref]

J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019).
[Crossref]

H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017).
[Crossref]

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

Park, J.-H.

Park, Y.

J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019).
[Crossref]

H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017).
[Crossref]

Peli, E.

E. Peli, “Real vision & virtual reality,” Opt. Photon. News 6(7), 28 (1995).
[Crossref]

Peng, Y.

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).
[Crossref]

N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38, 214 (2019).
[Crossref]

C. Yu, Y. Peng, Q. Zhao, H. Li, and X. Liu, “Highly efficient waveguide display with space-variant volume holographic gratings,” Appl. Opt. 56, 9390–9397 (2017).
[Crossref]

Y. Peng, S. Choi, N. Pandmanaban, and G. Wetzstein, “Neural holography with camera-in-the-loop training,” in SIGGRAPH Asia (ACM, 2020).

Powell, K. D.

Pryn, M. J.

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).
[Crossref]

Pu, Y.-Y.

Qi, Y.

Qiu, Z.

Rathinavel, K.

K. Rathinavel, G. Wetzstein, and H. Fuchs, “Varifocal occlusion-capable optical see-through augmented reality display based on focus-tunable optics,” IEEE Trans. Visual Comput. Graphics 25, 3125–3134 (2019).
[Crossref]

K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graph. 24, 2857–2866 (2018).
[Crossref]

Ravikumar, S.

Roh, J.

Rufo, G.

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).
[Crossref]

Sain, B.

R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018).
[Crossref]

Sakai, S.

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
[Crossref]

Sakamoto, Y.

Sang, X.

Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).
[Crossref]

Sankaranarayanan, A. C.

J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019).
[Crossref]

Sasaki, H.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).
[Crossref]

H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).
[Crossref]

Satake, S.

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).
[Crossref]

Sato, K.

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).
[Crossref]

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).
[Crossref]

Senoh, T.

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).
[Crossref]

H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).
[Crossref]

Shi, L.

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).
[Crossref]

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).
[Crossref]

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).
[Crossref]

Shi, X.

Shimizu, E.

T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

Shimobaba, T.

S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018).
[Crossref]

T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016).
[Crossref]

N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51, 7303–7307 (2012).
[Crossref]

M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19, 12008–12013 (2011).
[Crossref]

H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010).
[Crossref]

T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18, 19504–19509 (2010).
[Crossref]

Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009).
[Crossref]

T. Ito, N. Masuda, K. Yoshimura, A. Shiraki, T. Shimobaba, and T. Sugie, “Special-purpose computer HORN-5 for a real-time electroholography,” Opt. Express 13, 1923–1932 (2005).
[Crossref]

T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002).
[Crossref]

Shin, C.

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).
[Crossref]

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

Shin, C.-W.

H.-Y. Wu, C.-W. Shin, and N. Kim, “Full-color holographic optical elements for augmented reality display,” in Holographic Materials and Applications, M. Kumar, ed. (IntechOpen, 2019).

Shiraki, A.

Shirley, P.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Shiyi, L.

W. Dangxiao, G. Yuan, L. Shiyi, Z. Yuru, X. Weiliang, and X. Jing, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019).
[Crossref]

Shrestha, P. K.

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).
[Crossref]

Sloan, K. R.

C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).
[Crossref]

Smith, D. R.

Smithwick, Q.

Sochenov, A.

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).
[Crossref]

Sonehara, T.

Soner, B.

Song, P.

Spjut, J.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Srinivasan, P. P.

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).
[Crossref]

Starner, T.

B. Kress and T. Starner, “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. SPIE 8720, 87200A (2013).
[Crossref]

Stengel, M.

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).
[Crossref]

Stoykova, E.

Stramer, T.

N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).
[Crossref]

Su, W.-C.

Su, Y.

P. Zhou, Y. Li, S. Liu, and Y. Su, “Compact design for optical-see-through holographic displays employing holographic optical elements,” Opt. Express 26, 22866–22876 (2018).
[Crossref]

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).
[Crossref]

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).
[Crossref]

Sugie, T.

Sun, P.

Suszek, J.

Sutherland, I. E.

I. E. Sutherland, “The ultimate display,” in Proceedings of lFIP (1965), Vol. 65, pp. 506–508.

Suyama, S.

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
[Crossref]

Sypek, M.

Takada, H.

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
[Crossref]

Takada, N.

Takagi, R.

Takahashi, H.

T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

Takaki, Y.

Tan, G.

Tanaka, S.

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).
[Crossref]

Tang, C.

R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018).
[Crossref]

Tanida, J.

Tao, X.

Tian, L.

Uehira, K.

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).
[Crossref]

Ulusoy, E.

S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20, 45–59 (2019).
[Crossref]

M. K. Hedili, B. Soner, E. Ulusoy, and H. Urey, “Light-efficient augmented reality display with steerable eyebox,” Opt. Express 27, 12572–12581 (2019).
[Crossref]

Urey, H.

Wagner, M.

ACM Trans. Graph. (15)

D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32, 220 (2013).

F.-C. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 34, 60 (2015).

C. Jang, K. Bang, S. Moon, J. Kim, S. Lee, and B. Lee, “Retinal 3D: augmented reality near-eye display via pupil-tracked light field projection on retina,” ACM Trans. Graph. 36, 190 (2017).

A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36, 85 (2017).

C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37, 195 (2019).

P. Chakravarthula, Y. Peng, J. Kollin, H. Fuchs, and F. Heide, “Wirtinger holography for near-eye displays,” ACM Trans. Graph. 38, 213 (2019).

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23, 804–813 (2004).

R. Narain, R. A. Albert, A. Bulbul, G. J. Ward, M. S. Banks, and J. F. O’Brien, “Optimal presentation of imagery with focus cues on multi-plane displays,” ACM Trans. Graph. 34, 59 (2015).

J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37, 198 (2019).

L. Shi, F.-C. Huang, W. Lopes, W. Matusik, and D. Luebke, “Near-eye light field holographic rendering with spherical waves for wide field of view interactive 3D computer graphics,” ACM Trans. Graph. 36, 236 (2017).

N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38, 214 (2019).

R. Konrad, N. Padmanaban, K. Molner, E. A. Cooper, and G. Wetzstein, “Accommodation-invariant computational near-eye displays,” ACM Trans. Graph. 36, 88 (2017).

J. Kim, Y. Jeong, M. Stengel, K. Akşit, R. Albert, B. Boudaoud, T. Greer, J. Kim, W. Lopes, Z. Majercik, P. Shirley, J. Spjut, M. McGuire, and D. Luebke, “Foveated AR: dynamically-foveated augmented reality display,” ACM Trans. Graph. 38, 99 (2019).

A. S. Kaplanyan, A. Sochenov, T. Leimkühler, M. Okunev, T. Goodall, and G. Rufo, “DeepFovea: neural reconstruction for foveated rendering and video compression using learned statistics of natural videos,” ACM Trans. Graph. 38, 212 (2019).

S. A. Cholewiak, G. D. Love, P. P. Srinivasan, R. Ng, and M. S. Banks, “Chromablur: rendering chromatic eye aberration improves accommodation and realism,” ACM Trans. Graph. 36, 210 (2017).

Adv. Opt. Photon. (1)

Appl. Opt. (16)

T. Kozacki, M. Kujawińska, G. Finke, B. Hennelly, and N. Pandey, “Extended viewing angle holographic display system with tilted SLMs in a circular configuration,” Appl. Opt. 51, 1771–1780 (2012).

H. Nakayama, N. Takada, Y. Ichihashi, S. Awazu, T. Shimobaba, N. Masuda, and T. Ito, “Real-time color electroholography using multiple graphics processing units and multiple high-definition liquid-crystal display panels,” Appl. Opt. 49, 5993–5996 (2010).

E. Buckley, A. Cable, N. Lawrence, and T. Wilkinson, “Viewing angle enhancement for two- and three-dimensional holographic displays with random superresolution phase masks,” Appl. Opt. 45, 7334–7341 (2006).

N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51, 7303–7307 (2012).

A. Gilles and P. Gioia, “Real-time layer-based computer-generated hologram calculation for the Fourier transform optical system,” Appl. Opt. 57, 8508–8517 (2018).

H. Kim, C.-Y. Hwang, K.-S. Kim, J. Roh, W. Moon, S. Kim, B.-R. Lee, S. Oh, and J. Hahn, “Anamorphic optical transformation of an amplitude spatial light modulator to a complex spatial light modulator with square pixels,” Appl. Opt. 53, G139–G146 (2014).

L. Wei and Y. Sakamoto, “Fast calculation method with foveated rendering for computer-generated holograms using an angle-changeable ray-tracing method,” Appl. Opt. 58, A258–A266 (2019).

S.-C. Kim and E.-S. Kim, “Effective generation of digital holograms of three-dimensional objects using a novel look-up table method,” Appl. Opt. 47, D55–D62 (2008).

Y. Pan, Y. Wang, J. Liu, X. Li, and J. Jia, “Fast polygon-based method for calculating computer-generated holograms in three-dimensional display,” Appl. Opt. 52, A290–A299 (2013).

H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55, A135–A143 (2016).

K. Matsushima, “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44, 4607–4614 (2005).

C. Chang, J. Xia, L. Yang, W. Lei, Z. Yang, and J. Chen, “Speckle-suppressed phase-only holographic three-dimensional display based on double-constraint Gerchberg–Saxton algorithm,” Appl. Opt. 54, 6994–7001 (2015).

J. Amako, H. Miura, and T. Sonehara, “Speckle-noise reduction on kinoform reconstruction using a phase-only spatial light modulator,” Appl. Opt. 34, 3165–3171 (1995).

H. Urey and K. D. Powell, “Microlens-array-based exit-pupil expander for full-color displays,” Appl. Opt. 44, 4930–4936 (2005).

C. Yu, Y. Peng, Q. Zhao, H. Li, and X. Liu, “Highly efficient waveguide display with space-variant volume holographic gratings,” Appl. Opt. 56, 9390–9397 (2017).

Y. Wang, W. Liu, X. Meng, H. Fu, D. Zhang, Y. Kang, R. Feng, Z. Wei, X. Zhu, and G. Jiang, “Development of an immersive virtual reality head-mounted display with high performance,” Appl. Opt. 55, 6969–6977 (2016).

Appl. Sci. (1)

H. Gao, F. Xu, J. Liu, Z. Dai, W. Zhou, S. Li, Y. Yu, and H. Zheng, “Holographic three-dimensional virtual reality and augmented reality display based on 4K-spatial light modulators,” Appl. Sci. 9, 1182 (2019).

Comput. Phys. Commun. (2)

T. Shimobaba, S. Hishinuma, and T. Ito, “Special-purpose computer for holography HORN-4 with recurrence algorithm,” Comput. Phys. Commun. 148, 160–170 (2002).

N. Masuda, T. Sugie, T. Ito, S. Tanaka, Y. Hamada, S. Satake, T. Kunugi, and K. Sato, “Special purpose computer system with highly parallel pipelines for flow visualization using holography technology,” Comput. Phys. Commun. 181, 1986–1989 (2010).

IEEE Access (1)

S. Lee, J. Cho, B. Lee, Y. Jo, C. Jang, D. Kim, and B. Lee, “Foveated retinal optimization for see-through near-eye multi-layer displays,” IEEE Access 6, 2170–2180 (2018).

IEEE Photon. J. (1)

C. Chen, M.-Y. He, J. Wang, K.-M. Chang, and Q.-H. Wang, “Generation of phase-only holograms based on aliasing reuse and application in holographic see-through display system,” IEEE Photon. J. 11, 7000711 (2019).

IEEE Trans. Ind. Inf. (1)

T. Shimobaba, T. Kakue, and T. Ito, “Review of fast algorithms and hardware implementations on computer holography,” IEEE Trans. Ind. Inf. 12, 1611–1622 (2016).

IEEE Trans. Visual Comput. Graph. (2)

K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Visual Comput. Graph. 24, 2857–2866 (2018).

K. Rathinavel, G. Wetzstein, and H. Fuchs, “Varifocal occlusion-capable optical see-through augmented reality display based on focus-tunable optics,” IEEE Trans. Visual Comput. Graph. 25, 3125–3134 (2019).

J. Comp. Neurol. (1)

C. A. Curcio, K. R. Sloan, R. E. Kalina, and A. E. Hendrickson, “Human photoreceptor topography,” J. Comp. Neurol. 292, 497–523 (1990).

J. Disp. Technol. (1)

T. North, M. Wagner, S. Bourquin, and L. Kilcher, “Compact and high-brightness helmet-mounted head-up display system by retinal laser projection,” J. Disp. Technol. 12, 982–985 (2016).

J. Electron. Imaging (1)

M. E. Lucente, “Interactive computation of holograms using a look-up table,” J. Electron. Imaging 2, 28 (1993).

J. Inf. Disp. (2)

K. Bang, C. Jang, and B. Lee, “Curved holographic optical elements and applications for curved see-through displays,” J. Inf. Disp. 20, 9–23 (2019).

S. Kazempourradi, E. Ulusoy, and H. Urey, “Full-color computational holographic near-eye display,” J. Inf. Disp. 20, 45–59 (2019).

J. Soc. Inf. Disp. (1)

T. Levola, “Diffractive optics for virtual reality displays,” J. Soc. Inf. Disp. 14, 467–475 (2006).

J. Vision (2)

D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vision 8, 33 (2008).

S. A. Cholewiak, G. D. Love, and M. S. Banks, “Creating correct blur and its effect on accommodation,” J. Vision 18, 1 (2018).

Light Sci. Appl. (2)

Y. Hu, X. Luo, Y. Chen, Q. Liu, X. Li, Y. Wang, N. Liu, and H. Duan, “3D-Integrated metasurfaces for full-colour holography,” Light Sci. Appl. 8, 86 (2019).

R. Zhao, B. Sain, Q. Wei, C. Tang, X. Li, T. Weiss, L. Huang, Y. Wang, and T. Zentgraf, “Multichannel vectorial holographic display and encryption,” Light Sci. Appl. 7, 95 (2018).

Mem. Fac. Eng. (1)

T. Ando, T. Matsumoto, H. Takahashi, and E. Shimizu, “Head mounted display for mixed reality using holographic optical elements,” Mem. Fac. Eng. 40, 1–6 (1999).

Nanophotonics (1)

L. Huang, S. Zhang, and T. Zentgraf, “Metasurface holography: from fundamentals to applications,” Nanophotonics 7, 1169–1190 (2018).

Nat. Commun. (3)

J. Park, K. Lee, and Y. Park, “Ultrathin wide-angle large-area digital 3D holographic display using a non-periodic photon sieve,” Nat. Commun. 10, 1304 (2019).

S. Lee, Y. Jo, D. Yoo, J. Cho, D. Lee, and B. Lee, “Tomographic near-eye displays,” Nat. Commun. 10, 2497 (2019).

K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).

Nat. Genet. (1)

C. Cepko, “Giving in to the blues,” Nat. Genet. 24, 99–100 (2000).

Nat. Photonics (1)

H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11, 186–192 (2017).

Opt. Commun. (2)

Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).

Y. Su, Z. Cai, Q. Liu, L. Shi, F. Zhou, S. Huang, P. Guo, and J. Wu, “Projection-type dual-view holographic three-dimensional display and its augmented reality applications,” Opt. Commun. 428, 216–226 (2018).

Opt. Express (52)

S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940–20952 (2011).

S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18, 11562–11573 (2010).

T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24, 2189–2199 (2016).

C. Chang, W. Cui, and L. Gao, “Holographic multiplane near-eye display based on amplitude-only wavefront modulation,” Opt. Express 27, 30960–30970 (2019).

M. Makowski, “Minimized speckle noise in lens-less holographic projection by pixel separation,” Opt. Express 21, 29205–29216 (2013).

X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21, 20577–20587 (2013).

G. Xue, J. Liu, X. Li, J. Jia, Z. Zhang, B. Hu, and Y. Wang, “Multiplexing encoding method for full-color dynamic 3D holographic display,” Opt. Express 22, 18473–18482 (2014).

Y. Qi, C. Chang, and J. Xia, “Speckleless holographic display by complex modulation based on double-phase method,” Opt. Express 24, 30368–30378 (2016).

C. Chang, Y. Qi, J. Wu, J. Xia, and S. Nie, “Speckle reduced lensless holographic projection from phase-only computer-generated hologram,” Opt. Express 25, 6568–6580 (2017).

P. Sun, S. Chang, S. Liu, X. Tao, C. Wang, and Z. Zheng, “Holographic near-eye display system based on double-convergence light Gerchberg-Saxton algorithm,” Opt. Express 26, 10140–10151 (2018).

M.-H. Choi, Y.-G. Ju, and J.-H. Park, “Holographic near-eye display with continuously expanded eyebox using two-dimensional replication and angular spectrum wrapping,” Opt. Express 28, 533–547 (2020).

Z. Huang, D. L. Marks, and D. R. Smith, “Polarization-selective waveguide holography in the visible spectrum,” Opt. Express 27, 35631–35645 (2019).

J.-H. Park and S.-B. Kim, “Optical see-through holographic near-eye-display with eyebox steering and depth of field control,” Opt. Express 26, 27076–27088 (2018).

J. Jeong, J. Lee, C. Yoo, S. Moon, B. Lee, and B. Lee, “Holographically customized optical combiner for eye-box extended near-eye display,” Opt. Express 27, 38006–38018 (2019).

H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display,” Opt. Express 22, 13484–13491 (2014).

G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716–15725 (2009).

P. Zhou, Y. Li, S. Liu, and Y. Su, “Compact design for optical-see-through holographic displays employing holographic optical elements,” Opt. Express 26, 22866–22876 (2018).

W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism correction and quality optimization of computer-generated holograms for holographic waveguide displays,” Opt. Express 28, 5519–5527 (2020).

J. Xiao, J. Liu, Z. Lv, X. Shi, and J. Han, “On-axis near-eye display system based on directional scattering holographic waveguide and curved goggle,” Opt. Express 27, 1683–1692 (2019).

H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23, 32025–32034 (2015).

M. K. Hedili, B. Soner, E. Ulusoy, and H. Urey, “Light-efficient augmented reality display with steerable eyebox,” Opt. Express 27, 12572–12581 (2019).

Q. Gao, J. Liu, J. Han, and X. Li, “Monocular 3D see-through head-mounted display via complex amplitude modulation,” Opt. Express 24, 17372–17383 (2016).

J.-S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23, 18143–18155 (2015).

E. Moon, M. Kim, J. Roh, H. Kim, and J. Hahn, “Holographic head-mounted display with RGB light emitting diode light source,” Opt. Express 22, 6526–6534 (2014).

Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25, 8412–8424 (2017).

J.-H. Park and M. Askari, “Non-hogel-based computer generated hologram from light field using complex field recovery technique from Wigner distribution function,” Opt. Express 27, 2562–2574 (2019).

Y.-Z. Liu, X.-N. Pang, S. Jiang, and J.-W. Dong, “Viewing-angle enlargement in holographic augmented reality using time division and spatial tiling,” Opt. Express 21, 12068–12076 (2013).

J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16, 12372–12386 (2008).

F. Yaraş, H. Kang, and L. Onural, “Circular holographic video display system,” Opt. Express 19, 9147–9156 (2011).

T. Kozacki, G. Finke, P. Garbat, W. Zaperty, and M. Kujawińska, “Wide angle holographic display system with spatiotemporal multiplexing,” Opt. Express 20, 27473–27481 (2012).

J. Li, Q. Smithwick, and D. Chu, “Full bandwidth dynamic coarse integral holographic displays with large field of view using a large resonant scanner and a galvanometer scanner,” Opt. Express 26, 17459–17476 (2018).

J. Li, Q. Smithwick, and D. Chu, “Scalable coarse integral holographic video display with integrated spatial image tiling,” Opt. Express 28, 9899–9912 (2020).

G. Li, J. Jeong, D. Lee, J. Yeom, C. Jang, S. Lee, and B. Lee, “Space bandwidth product enhancement of holographic display using high-order diffraction guided by holographic optical element,” Opt. Express 23, 33170–33183 (2015).

G. Tan, Y.-H. Lee, T. Zhan, J. Yang, S. Liu, D. Zhao, and S.-T. Wu, “Foveated imaging for near-eye displays,” Opt. Express 26, 25076–25085 (2018).

Z. Zhang, J. Liu, Q. Gao, X. Duan, and X. Shi, “A full-color compact 3D see-through near-eye display system based on complex amplitude modulation,” Opt. Express 27, 7023–7035 (2019).

M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20, 25130–25136 (2012).

X. Yang, P. Song, H. Zhang, and Q.-H. Wang, “Full-color computer-generated holographic near-eye display based on white light illumination,” Opt. Express 27, 38236–38249 (2019).

Y. Takaki and N. Fujimoto, “Flexible retinal image formation by holographic Maxwellian-view display,” Opt. Express 26, 22985–22999 (2018).

J. S. Lee, Y. K. Kim, and Y. H. Won, “See-through display combined with holographic display and Maxwellian display using switchable holographic optical element based on liquid lens,” Opt. Express 26, 19341–19355 (2018).

J. S. Lee, Y. K. Kim, and Y. H. Won, “Time multiplexing technique of holographic view and Maxwellian view using a liquid lens in the optical see-through head mounted display,” Opt. Express 26, 2149–2159 (2018).

J. S. Lee, Y. K. Kim, M. Y. Lee, and Y. H. Won, “Enhanced see-through near-eye display using time-division multiplexing of a Maxwellian-view and holographic display,” Opt. Express 27, 689–701 (2019).

M. Oikawa, T. Shimobaba, T. Yoda, H. Nakayama, A. Shiraki, N. Masuda, and T. Ito, “Time-division color electroholography using one-chip RGB LED and synchronizing controller,” Opt. Express 19, 12008–12013 (2011).

T. Ito, N. Masuda, K. Yoshimura, A. Shiraki, T. Shimobaba, and T. Sugie, “Special-purpose computer HORN-5 for a real-time electroholography,” Opt. Express 13, 1923–1932 (2005).

Y. Ichihashi, H. Nakayama, T. Ito, N. Masuda, T. Shimobaba, A. Shiraki, and T. Sugie, “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17, 13895–13903 (2009).

R. Horisaki, R. Takagi, and J. Tanida, “Learning-based imaging through scattering media,” Opt. Express 24, 13738–13743 (2016).

Y.-G. Ju and J.-H. Park, “Foveated computer-generated hologram and its progressive update using triangular mesh scene model for near-eye displays,” Opt. Express 27, 23725–23738 (2019).

C. Chang, W. Cui, and L. Gao, “Foveated holographic near-eye 3D display,” Opt. Express 28, 1345–1356 (2020).

T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18, 19504–19509 (2010).

Y. Takaki and K. Ikeda, “Simplified calculation method for computer-generated holographic stereograms from multi-view images,” Opt. Express 21, 9652–9663 (2013).

Y.-Z. Liu, J.-W. Dong, Y.-Y. Pu, B.-C. Chen, H.-X. He, and H.-Z. Wang, “High-speed full analytical holographic computations for true-life scenes,” Opt. Express 18, 3345–3351 (2010).

X.-N. Pang, D.-C. Chen, Y.-C. Ding, Y.-G. Chen, S.-J. Jiang, and J.-W. Dong, “Image quality improvement of polygon computer generated holography,” Opt. Express 23, 19066–19073 (2015).

Opt. Lett. (6)

Opt. Photon. News (1)

E. Peli, “Real vision & virtual reality,” Opt. Photon. News 6(7), 28 (1995).

Optica (3)

Optik (1)

Y. Su, Z. Cai, W. Zou, L. Shi, F. Zhou, P. Guo, Y. Lu, and J. Wu, “Viewing angle enlargement in holographic augmented reality using an off-axis holographic lens,” Optik 172, 462–469 (2018).

OSA Contin. (1)

R. Ozaki, S. Hashimura, S. Yudate, K. Kadowaki, H. Yoshida, and M. Ozaki, “Optical properties of selective diffraction from Bragg-Berry cholesteric liquid crystal deflectors,” OSA Contin. 2, 3554 (2019).

Phys. Today (1)

B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013).

Proc. Natl. Acad. Sci. USA (1)

N. Padmanaban, R. Konrad, T. Stramer, E. A. Cooper, and G. Wetzstein, “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays,” Proc. Natl. Acad. Sci. USA 114, 2183–2188 (2017).

Proc. SPIE (4)

B. Kress and T. Starner, “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. SPIE 8720, 87200A (2013).

N. Fukaya, K. Maeno, O. Nishikawa, K. Matsumoto, K. Sato, and T. Honda, “Expansion of the image size and viewing zone in holographic display using liquid crystal devices,” Proc. SPIE 2406, 283–289 (1995).

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).

J. Liu, J. Jia, Y. Pan, and Y. Wang, “Overview of fast algorithm in 3D dynamic holographic display,” Proc. SPIE 8913, 89130X (2013).

Research (1)

P. K. Shrestha, M. J. Pryn, J. Jia, J.-S. Chen, H. N. Fructuoso, A. Boev, Q. Zhang, and D. Chu, “Accommodation-free head mounted display with comfortable 3D perception and an enlarged eye-box,” Research 2019, 9273723 (2019).

Sci. Rep. (6)

H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4, 6177 (2015).

S. Yamada, T. Kakue, T. Shimobaba, and T. Ito, “Interactive holographic display based on finger gestures,” Sci. Rep. 8, 2010 (2018).

W. Cui and L. Gao, “All-passive transformable optical mapping near-eye display,” Sci. Rep. 9, 6064 (2019).

S. Y. Cho, M. Ono, H. Yoshida, and M. Ozaki, “Bragg-Berry flat reflectors for transparent computer-generated holograms and waveguide holography with visible color playback capability,” Sci. Rep. 10, 8201 (2020).

C. Chang, W. Cui, J. Park, and L. Gao, “Computational holographic Maxwellian near-eye display with an expanded eyebox,” Sci. Rep. 9, 18749 (2019).

Y. Deng and D. Chu, “Coherence properties of different light sources and their effect on the image sharpness and speckle of holographic displays,” Sci. Rep. 7, 5893 (2017).

SID Symp. Dig. Tech. Pap. (1)

J. Xia, W. Zhu, and I. Heynderickx, “41.1: three-dimensional electro-holographic retinal display,” SID Symp. Dig. Tech. Pap. 42, 591–594 (2011).

Virtual Reality Intell. Hardware (1)

W. Dangxiao, G. Yuan, L. Shiyi, Z. Yuru, X. Weiliang, and X. Jing, “Haptic display for virtual reality: progress and challenges,” Virtual Reality Intell. Hardware 1, 136 (2019).

Vision Res. (1)

S. Suyama, S. Ohtsuka, H. Takada, K. Uehira, and S. Sakai, “Apparent 3-D image perceived from luminance-modulated two 2-D images displayed at different depths,” Vision Res. 44, 785–793 (2004).

Other (17)

A. Olwal and B. Kress, “1D eyewear: peripheral, hidden LEDs and near-eye holographic displays for unobtrusive augmentation,” in ACM International Symposium on Wearable Computers (ISWC) (ACM, 2018), pp. 184–187.

D. Kim, S. Lee, J. Cho, D. Lee, K. Bang, and B. Lee, “Enhancement of depth range in LED-based holographic near-eye display using focus tunable device,” in IEEE 28th International Symposium on Industrial Electronics (ISIE) (2019), pp. 2388–2391.

G. Li, “Study on improvements of near-eye holography: form factor, field of view, and speckle noise reduction,” Thesis (Seoul National University, 2018).

J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).

Y. Peng, S. Choi, N. Padmanaban, and G. Wetzstein, “Neural holography with camera-in-the-loop training,” in SIGGRAPH Asia (ACM, 2020).

I. E. Sutherland, “The ultimate display,” in Proceedings of IFIP (1965), Vol. 65, pp. 506–508.

R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in CHI Conference on Human Factors in Computing Systems (ACM, 2016), pp. 1211–1220.

B. C. Kress, “Human factors,” in Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets (SPIE, 2010).

“Holographic optical elements and application,” IntechOpen, https://www.intechopen.com/books/holographic-materials-and-optical-systems/holographic-optical-elements-and-application .

C. W. Ooi, N. Muramatsu, and Y. Ochiai, “Eholo glass: electroholography glass, a lensless approach to holographic augmented reality near-eye display,” in SIGGRAPH Asia 2018 Technical Briefs (ACM, 2018), p. 31.

W.-Y. Choi, K.-J. Oh, K. Hong, H.-G. Choo, J. Park, and S.-Y. Lee, “Generation of CGH with expanded viewing angle by using the scattering properties of the random phase mask,” in Frontiers in Optics + Laser Science APS/DLS (OSA, 2019), paper JTu3A.103.

R. Konrad, A. Angelopoulos, and G. Wetzstein, “Gaze-contingent ocular parallax rendering for virtual reality,” in ACM SIGGRAPH 2019 Talks (ACM, 2019), p. 56.

“ISE to feature world’s first commercial 8K DLP projector,” https://www.svconline.com/the-wire/ise-feature-world-s-first-commercial-8k-dlp-projector-410993 .

G. Kayye, “DPI Readies Industry’s First 8K DLP Projector, Ships 4K 3-Chip DLP Projector at 12.5K Lumens—rAVe [PUBS],” https://www.ravepubs.com/dpi-readies-industrys-first-8k-dlp-projector-demos-delivers-4k-12-5k-lumens-3-chip-dlp/ .

J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Near-eye foveated holographic display,” in Imaging and Applied Optics 2018 (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, PcAOP) (OSA, 2018), paper 3M2G.4.

J. Hong, “Foveation in near-eye holographic display,” in International Conference on Information and Communication Technology Convergence (ICTC) (2018), pp. 602–604.

H.-Y. Wu, C.-W. Shin, and N. Kim, “Full-color holographic optical elements for augmented reality display,” in Holographic Materials and Applications, M. Kumar, ed. (IntechOpen, 2019).



Figures (16)

Fig. 1. Categorization of holographic near-eye displays from the human-centric perspective.
Fig. 2. Visual fields of the human eye.
Fig. 3. Holographic see-through near-eye display with a beam splitter as an optical combiner. (a) Conceptual schematic setup. (b) The hologram is relayed and reflected into the eye by a beam splitter through a $4f$ system (reproduced with permission from [27], 2015, OSA).
Fig. 4. Holographic optical element (HOE) as an optical combiner. (a) Off-axis HOE geometry (included here by permission from [31], 2017, ACM). (b) Waveguide geometry (reproduced with permission from [32], 2015, OSA).
Fig. 5. Eyebox relay through (a) $4f$ system and (b) CGH. Eyebox expansion through (c) pupil tracking (included here by permission from [45], 2019, ACM) and (d) replication (reproduced with permission from [46], 2020, OSA).
Fig. 6. Eyebox expansion in a Maxwellian display using holographic modulation through (a) an HOE (reproduced with permission from [50], 2018, OSA) and (b) encoding the CGH with multiplexed off-axis plane waves (reproduced with permission from [51], 2019, Springer Nature).
Fig. 7. Speckle noise suppression by (a) superposition of multiple CGHs [53], (b) a rotating diffuser, and (c) complex amplitude modulation using a single CGH (reproduced with permission from [29], 2017, OSA).
Fig. 8. Holographic multiplane near-eye display based on linear depth-blending rendering [53].
Fig. 9. Holographic stereogram and Maxwellian view. (a) Holographic stereogram for realistic 3D perception (included here by permission from [85], 2019, ACM). (b) Holographic Maxwellian view (reproduced with permission from [51], 2019, Springer Nature).
Fig. 10. Color holographic near-eye display using (a) time division (reproduced with permission from [86], 2019, OSA) and (b) a metasurface HOE (reproduced with permission from [41], 2019, OSA).
Fig. 11. FOV in a holographic near-eye display. (a) Holographic near-eye display involving an eyepiece. (b) Enlarging the FOV through pupil relay. (c) Enlarging the FOV through spherical-wave illumination.
Fig. 12. Enlarging the FOV through (a) temporal division and spatial tiling (reproduced with permission from [98], 2013, OSA) and (b) temporal division and diffraction-order tiling [114].
Fig. 13. Foveated image rendering of (a) point-cloud and polygon-based models (reproduced with permission from [121], 2019, OSA) and (b) a multiplane-based model [123]. The foveal content changes according to the eye gaze angle in the (c) polygon-based model (reproduced with permission from [121], 2019, OSA) and (d) multiplane-based model [123].
Fig. 14. Signal processing of the human visual system.
Fig. 15. Tradeoff relations between FOV and eyebox (wavelength is 532 nm in the calculation).
Fig. 16. Reproduction of “perceptual realism” in holographic near-eye displays.

Equations (3)


$$\theta_{\max} = \sin^{-1}\left(\frac{\lambda}{2p} + 2\sin\frac{\theta_{in}}{2}\right) \approx \sin^{-1}\left(\frac{\lambda}{2p}\right) + \theta_{in}.$$
$$\mathrm{FOV} = 2\theta_{\max}\,\frac{d_{slm}}{d_{eye}} = \frac{2\theta_{\max}}{M},$$
$$\mathrm{FOV}\cdot\mathrm{eyebox} = \frac{2\theta_{\max}}{M}\cdot MNp \approx 2\,\frac{\lambda}{2p}\,Np = \lambda N,$$
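The FOV–eyebox invariant expressed by these equations can be checked numerically. The sketch below is illustrative only: the SLM parameters (3840 pixels at 8 µm pitch), the relay magnification $M$, plane-wave illumination ($\theta_{in} = 0$), and the reading of $p$ as pixel pitch and $N$ as pixel count are assumptions, with $\lambda = 532\ \mathrm{nm}$ following Fig. 15.

```python
import math

# Illustrative SLM parameters (assumptions, not values from the text):
wavelength = 532e-9   # illumination wavelength in m (532 nm, as in Fig. 15)
p = 8e-6              # SLM pixel pitch in m
N = 3840              # number of SLM pixels along one dimension
theta_in = 0.0        # illumination convergence half-angle (plane wave here)

# Maximum diffraction half-angle of the SLM (first equation).
theta_max = math.asin(wavelength / (2 * p) + 2 * math.sin(theta_in / 2))

# A relay with magnification M trades FOV against eyebox (second and
# third equations); M < 1 demagnifies the eyebox and widens the FOV.
M = 0.25
fov = 2 * theta_max / M   # field of view in rad
eyebox = M * N * p        # eyebox size in m

# The product stays approximately lambda * N, independent of M:
product = fov * eyebox
print(f"FOV: {math.degrees(fov):.1f} deg, eyebox: {eyebox * 1e3:.1f} mm")
print(f"FOV*eyebox = {product:.3e}, lambda*N = {wavelength * N:.3e}")
```

Sweeping $M$ reproduces the tradeoff plotted in Fig. 15: any gain in FOV costs a proportional loss of eyebox, so the product is fixed solely by the wavelength and the pixel count, i.e., the space–bandwidth product of the SLM.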