Abstract

We propose a three-dimensional (3D) head-mounted display (HMD) that provides a multi-focal function in a wearable form factor by using polarization-dependent optical path switching in a Savart plate. The multi-focal function is implemented by optically duplicating a micro display with a high pixel density of 1666 pixels per inch in the longitudinal direction according to the polarization state. The combination of the micro display, a fast-switching polarization rotator, and the Savart plate retains a small form factor suitable for a wearable device. The optical aberrations of the duplicated panels are investigated by ray tracing as functions of both wavelength and polarization state. The astigmatism and the lateral chromatic aberration of the extraordinary wave are compensated by modifying the Savart plate and by a sub-pixel shifting method, respectively. To verify the feasibility of the proposed system, a prototype of the HMD module for a monocular eye is implemented. The module has a compact size of 40 mm by 90 mm by 40 mm and a weight of 131 g, satisfying the wearable requirement. The micro display and the polarization rotator are synchronized in real time at 30 Hz, and two focal planes are formed 640 mm and 900 mm away from the eye box. In experiments, the prototype also provides an augmented reality function by combining the optically duplicated panels with a beam splitter. The multi-focal function of the optically duplicated panels, free from astigmatism and color dispersion after compensation, is verified. When light field optimization for the two additive layers is performed, perspective images are observed, and the integration of the real-world scene with high-quality 3D images is confirmed.

© 2016 Optical Society of America

1. Introduction

Blurring the distinction between the real-world scene and a computer-generated virtual world has long been the ultimate goal of the display field. Recently, virtual reality (VR) displays, which provide large field of view (FOV) three-dimensional (3D) imagery, have drawn public attention as a next-generation display technology. In a VR system, an immersive and realistic virtual world is constructed on the principle of binocular disparity [1, 2]. Although commercial VR devices effectively deliver virtual-world experiences, it is hard to prevent the visual fatigue that arises from the limited accommodation response at a fixed image plane [3, 4]. In addition, because the binocular images of conventional VR devices are designed for the average interpupillary distance of 65 mm, neglecting the variation of interpupillary distance across observers can cause depth distortion for some users [5]. Super multi-view and multi-focal systems can mitigate the visual fatigue of conventional VR devices by providing multiple perspective images that induce an accommodation response in the monocular eye. In a super multi-view system, two or more view images are projected onto the monocular eye to induce the accommodation depth cue [6–9]. However, it is difficult to implement a display optical system that provides multiple viewpoints within a wearable device, and to extend the coverage of the super multi-view system across the eye box in which the pupil of the human eye moves. 3D image reconstruction with multiple focal planes is a good candidate for solving visual fatigue in VR systems [10–24]. Many approaches have been reported for generating the multi-focal function in 3D displays: physical stacking of display panels [10–17], tunable optical components including liquid lenses and deformable mirrors [18–22], and polarization-multiplexed multi-focal systems using birefringent optical components [23–29].

The simplest way to realize the multi-focal function is to place physical panels at different depth positions. Exploiting the quasi-transparency of liquid crystal displays (LCDs), two or more LCDs are stacked in the longitudinal direction [12–14]. However, the image quality of such a system is severely degraded by the low transparency of the stacked configuration and by diffraction from the periodic pixel structure of the panels [13]. A state-of-the-art holographic optical element (HOE) technique has been proposed to overcome these drawbacks of panel-based multi-focal systems [17]. High transparency and resolution free from the diffraction limit are realized by combining multiple HOEs with high-resolution projectors. Although the HOE is a promising technique for realizing transparency and the multi-focal function simultaneously, several issues remain before commercialization: low efficiency of full-color generation, low uniformity of the reconstructed images, and bulky reconstruction optics.

Tunable optical components such as liquid lenses and deformable mirrors are another candidate for the multi-focal system. The electrically controllable shape of the tunable optics modulates the depth position of the image [18–22]. By changing the focal length of a liquid lens in sequential synchronization with the display image, multiple images can be shown at different depth positions [20, 21]. Although tunable optics improve the degree of freedom in the expressible image positions, slow switching speed, noise from intermediate states, state-dependent optical aberrations, and mechanical scaling issues limit the number of focal planes and the system parameters.

Similar to the operation of tunable optics, sequential switching of the polarization state also produces a multi-focal function when combined with birefringent elements such as liquid crystal (LC) lenses and birefringent optical components [23–29]. By switching the eigen polarization state of the image, the optical path in the birefringent material changes and the multi-focal function appears [30]. Owing to the fast response of modern polarization rotators, a cascade of birefringent elements and polarization rotators can produce more than two image planes. However, systems containing birefringent materials are difficult to design because of high cost, the limited choice of materials, manufacturing difficulty, and fragility [31]. Moreover, the astigmatism and color dispersion of the extraordinary wave increase the complexity of the system [23, 31–33]. Park et al. reported that the astigmatism of a single uniaxial crystal prevents the generation of the multi-focal function, and suppressed the axial component with a moving slit array [23]. These aberrations, which do not appear in isotropic optical systems, were not fully addressed in previous work and must be compensated for commercial use.

In this paper, we implement a compact and robust 3D head-mounted display (HMD) system with a multi-focal function. The multi-focal function is realized by double refraction in a Savart plate, a crossed pair of uniaxial plates with identical specifications. The manufacturing process and the aberration-aware optical design of a plane-parallel birefringent plate, whose two faces are flat and parallel, are easier than those of birefringent lenses and mirrors. To keep the multi-focal function robust, the astigmatism and the color dispersion in the Savart plate are investigated. The astigmatism is compensated by modifying the Savart plate into a symmetric optical configuration, and the chromatic aberration is compensated by a sub-pixel shifting method. A single micro organic light emitting diode (OLED) display with a high pixel density of 1666 pixels per inch is duplicated in the longitudinal direction according to the polarization state. The multi-focal planes of the micro OLED are combined with a beam splitter and a concave mirror to realize augmented reality (AR). From the well-defined geometrical relation of the separated image planes, layer images of a compressive light field display are calculated to reconstruct dense viewpoints inside the eye box of the HMD [13, 15]. In the experiments, a monocular prototype with real-time synchronization between the display image and the polarization rotator is implemented. The results verify that both the astigmatism and the chromatic aberration of the extraordinary wave are mitigated. Perspective images are observed as the observation position changes inside the eye box. The prototype also demonstrates high-quality 3D images and a wearable small form factor.

2. Principle

2.1 Multi-focal head-mounted display system using Savart plate

The basic concept of the proposed system is optical duplication of the micro display in the longitudinal direction. The duplication is realized by polarization-multiplexed optical path switching in a birefringent crystal, that is, double refraction. A Savart plate, which consists of a pair of plane-parallel birefringent plates, can duplicate the micro display in the longitudinal direction [34]. To mitigate the astigmatism of the extraordinary wave, a half wave plate (HWP) is inserted in the middle of the Savart plate, as shown in Fig. 1(a); a detailed analysis of the astigmatism compensation is given in the following section. The polarization state of the micro display is modulated by a fast-switching polarization rotator. By altering the eigen polarization state of the image in sequence, two virtual image planes are formed at different longitudinal positions. From the optically duplicated image planes, 3D images can be realized using the principle of the depth-fused display (DFD) or the compressive light field display. While the resolution of two physically stacked panels in conventional systems is limited by diffraction from the periodic pixel structure, optical duplication of a single panel is free from this diffraction limit until the pixel structure approaches the dimension of the visible wavelength. It is therefore possible to present high-resolution images by adopting a high-resolution micro display. Moreover, since the two virtual panels share the inherent device characteristics of one display, color reproducibility and uniformity issues are easily handled.

 

Fig. 1 Multi-focal HMD system using Savart plate: (a) schematic diagram of proposed system, (b) optically equivalent model of proposed system


Figure 1(b) shows the optically equivalent model of the proposed system. The duplicated planes are floated by a concave mirror or a convex lens, which separates them further in the longitudinal direction, and the AR function is realized by adding a beam splitter. The simple configuration of optical components retains a small form factor, satisfying the wearable requirement. From the equivalent model, the FOV and the positions of the virtual planes are calculated through the geometric relations between the optical components:

$$\theta_{window} = 2\tan^{-1}\!\left(\frac{W_{bs}}{2\,(d_{eye}+W_{bs})}\right), \tag{1}$$

$$\theta_{image} = 2\tan^{-1}\!\left(\frac{W_{a}}{2\,(d_{eye}+S_{a})}\right), \tag{2}$$

where θ_window and θ_image are the FOVs of the real-world scene and the display image, respectively, W_bs is the width of the beam splitter aperture, and d_eye is the distance between the eye pupil and the entrance of the HMD module. When the eye pupil is located at the entrance of the module, θ_window is maximized at 53 degrees. The FOV of the display image is determined by the width W_a of the aperture in the display part and the distance S_a between the aperture and the concave mirror. In the display part, the aperture is determined by the relative sizes of the active area of the micro display and the Savart plate.
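Equations (1) and (2) are easy to sanity-check numerically. The following sketch is illustrative only; the function names and the 25 mm beam-splitter width are assumptions, not values taken from the paper.

```python
import math

def fov_window_deg(w_bs_mm, d_eye_mm):
    # Eq. (1): see-through window FOV; the beam-splitter cube of width
    # w_bs adds an optical path of w_bs behind the module entrance.
    return math.degrees(2 * math.atan(w_bs_mm / (2 * (d_eye_mm + w_bs_mm))))

def fov_image_deg(w_a_mm, d_eye_mm, s_a_mm):
    # Eq. (2): FOV of the display image through the display-part aperture.
    return math.degrees(2 * math.atan(w_a_mm / (2 * (d_eye_mm + s_a_mm))))

# With the pupil at the module entrance (d_eye = 0), Eq. (1) reduces to
# 2*atan(1/2) for any beam-splitter width, matching the 53-degree
# maximum window FOV quoted in the text.
print(round(fov_window_deg(25.0, 0.0), 1))  # -> 53.1
```

Note that the 53-degree maximum is independent of the beam-splitter size, since W_bs cancels in Eq. (1) when d_eye = 0.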

When the duplicated planes are separated by Δs and both are located inside the focal length f of the concave mirror or convex lens, the separation distance Δs′ is magnified as follows:

$$\Delta s' = \frac{\Delta s\, f^{2}}{(s_f - f)\,(s_f + \Delta s - f)}, \tag{3}$$

where s_f is the position of the duplicated panel nearer the concave mirror. From Eq. (3), the geometrical relation of the multi-focal function is defined, and 3D images for AR can be reconstructed between the two panels. The displacement Δs between the duplicated planes depends on several parameters: the thickness of the Savart plate, the birefringence of the material, the orientation of the optic axis, and the wavelength of light. To clarify the relation between these parameters and the optical characteristics of the Savart plate, such as image displacement and aberrations, ray-tracing simulations considering these parameters are performed in the following sections. In Section 2.2, the multi-focal characteristics, including image displacement and astigmatism, are calculated as the orientation of the optic axis changes. The unique color dispersion of the extraordinary wave in the Savart plate is analyzed as a function of wavelength in Section 2.3.
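The longitudinal magnification of Eq. (3) can be sketched directly. The numbers below are illustrative, not the prototype's design values, and the function name is our own.

```python
def magnified_separation(ds_mm, s_f_mm, f_mm):
    # Eq. (3): separation of the two floated virtual planes produced by a
    # concave mirror (or convex lens) of focal length f when the nearer
    # duplicated panel sits at s_f and the farther at s_f + ds, both
    # inside the focal length.
    return ds_mm * f_mm**2 / ((s_f_mm - f_mm) * (s_f_mm + ds_mm - f_mm))

# The closer the panels sit to the focal plane, the larger the floated
# separation: a ~2 mm duplication can be stretched by more than an order
# of magnitude (illustrative values).
print(magnified_separation(2.0, 40.0, 50.0))  # -> 62.5
```

This is why a millimeter-scale displacement inside the Savart plate suffices to form image planes hundreds of millimeters apart.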

2.2 Astigmatism compensation by modified Savart plate

To investigate the polarization-dependent multi-focal function and the astigmatism in the Savart plate, a light ray cone passing through the Savart plate is analyzed while varying the orientation of the optic axis. A commercial calcite (CaCO3) plate whose optic axis is inclined at 45 degrees to the incident plane is used for both the simulation and the implementation of the prototype.

In a birefringent crystal, the well-known aberration of astigmatism appears because the refractive index for the extraordinary wave changes with the plane of incidence. As shown in Figs. 2(a) and 2(b), the normal surface shells of the index ellipsoid for the horizontal and vertical components of the extraordinary wave differ when a light ray cone is incident on a plane-parallel calcite plate with a thickness of 15 mm and an optic axis at 45 degrees to the incident plane. The detailed simulation conditions are listed in Table 1. For both cases, ray tracing is performed and the virtual image of a point light source is evaluated. The horizontal component in Fig. 2(a) shows both longitudinal and lateral shifts of the point light source, whereas the vertical component in Fig. 2(b) shows only the longitudinal shift. Since the two components form two different virtual point light sources, astigmatism appears when an image is observed through the plate. The anamorphic characteristic of a single birefringent crystal is thus not suitable for implementing the multi-focal function. In Fig. 2(c), the locus of the virtual image of the point light source is traced as the orientation of the optic axis changes. When the optic axis lies in the incident surface, the virtual images of the horizontal and vertical components of the extraordinary wave are formed on the positive and negative sides of that of the ordinary wave, respectively. As the angle of the optic axis to the incident surface increases, the virtual point of the horizontal component moves counterclockwise and that of the vertical component moves in the negative longitudinal direction. Near an optic axis angle of 90 degrees, the points of both components nearly coincide, but a linear polarization state can then no longer be used as the eigen polarization state: the eigen polarizations of a plane-parallel birefringent plate with the optic axis perpendicular to the incident surface are the azimuthal and radial polarizations [33]. Therefore, a single birefringent crystal is not appropriate for generating the multi-focal function [23].

 

Fig. 2 Astigmatism in plane-parallel calcite plate: (a) horizontal component (top view, optic axis at 45 degrees), (b) vertical component (side view, optic axis at 45 degrees), and (c) trajectory of virtual image of point light source with angle of optic axis



Table 1. Simulation conditions for astigmatism analysis

In an isotropic optical system, astigmatism is compensated with anamorphic lenses. In the proposed system, however, the calcite plate acts as an anamorphic element only for the extraordinary wave; an isotropic anamorphic lens would mitigate the astigmatism of the extraordinary wave but would introduce astigmatism into the ordinary wave. To solve the astigmatism of a uniaxial crystal, a Savart plate can be used to cancel out the anamorphic behavior. In a normal Savart plate, each eigen polarization state acts as the ordinary wave in the first plate and the extraordinary wave in the second, or vice versa, so the astigmatism from the extraordinary passage remains. A modified Savart plate in which an HWP is inserted between the two plates can solve this problem [33]: each axial component of the extraordinary wave then experiences two differently shaped normal surface shells, and the astigmatism is optically compensated.

Figure 3 shows the ray tracing results in the modified Savart plate. In the first calcite plate, Fig. 3(a), the horizontal component of the extraordinary wave experiences the intersection of a normal surface shell whose elliptical major axis is rotated by 45 degrees to the incident surface. After the first plate, the polarization of the ray is rotated by 90 degrees by the HWP. The eigen polarization is switched, so the wave that was extraordinary in the first plate remains extraordinary in the second plate, but the shape of the normal surface shell changes to that of the vertical component in the first plate, as shown in Fig. 3(b). In the second plate, the horizontal component of the extraordinary wave experiences the intersection of a normal surface shell whose elliptical major axis lies along the longitudinal direction. The ray trajectory of the horizontal component is therefore affected by both shapes of normal surface shell. Likewise, the vertical component of the extraordinary wave is affected by the two different shell shapes, as shown in Fig. 3(b). The optical path in the HWP is neglected because its thickness is very small. The rays traced from the point light source show that the virtual image positions calculated from the two components coincide in both the lateral and longitudinal directions. The results demonstrate that the astigmatism of a single uniaxial crystal is cancelled by the modified Savart plate and that the positions of the virtual images are well defined. Thus, the multi-focal function is realized by the simple structure of the modified Savart plate, and the pixel array of the micro display can be duplicated. Under the simulation conditions in Table 1, the longitudinal displacement between the virtual images of the ordinary and extraordinary waves is about 2 mm, and the lateral displacement is about 1.6 mm. The image dislocation of the two panels caused by the lateral displacement can be mitigated by pixel shifting or by a dual layer of modified Savart plates [33]. Since the displacement is directly proportional to the crystal thickness, the displacement per unit thickness can be calculated: 0.0674 mm (longitudinal) and 0.0549 mm (lateral) per unit thickness of the Savart plate.
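Because the displacement scales linearly with crystal thickness, the simulated per-thickness coefficients give a quick estimate. The sketch below assumes the coefficients are per millimeter of calcite, which is consistent with the roughly 2 mm and 1.6 mm displacements reported for two crossed 15 mm plates.

```python
def savart_displacement_mm(total_thickness_mm):
    # Simulated displacement per millimeter of Savart plate thickness
    # (Section 2.2): longitudinal and lateral components.
    LONG_PER_MM = 0.0674
    LAT_PER_MM = 0.0549
    return LONG_PER_MM * total_thickness_mm, LAT_PER_MM * total_thickness_mm

# Two crossed 15 mm calcite plates -> 30 mm in total:
longitudinal, lateral = savart_displacement_mm(30.0)
print(round(longitudinal, 1), round(lateral, 1))  # -> 2.0 1.6
```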

 

Fig. 3 Astigmatism compensation using Savart plate with half wave plate: (a) horizontal component (top view) and (b) vertical component (side view)


2.3 Chromatic aberration of extraordinary wave in lateral direction

Color dispersion in an imaging system is an important issue for presenting clear images. Although wavelength is the major factor of color dispersion in an isotropic optical system, the color dispersion in a birefringent material depends on the polarization state of the image as well as the wavelength, because the refractive index depends on both. To analyze the color dispersion, the spectrum of the high-pixel-density micro OLED (Olightek, SVGA050) is measured with a spectrometer (Ocean Optics, USB4000-VIS-N), as shown in Fig. 4(a). The center wavelengths of the three primary sub-pixels of the micro OLED are evaluated by calculating the power-weighted mean wavelength:

$$\lambda_c = \frac{1}{P_{total}} \int p(\lambda)\, \lambda\, d\lambda, \tag{4}$$

where λ_c is the center wavelength of a primary sub-pixel, P_total is the total power of the spectral density, p is the power spectral density of the micro OLED, and λ is the wavelength of the emitted light. Only spectral components above 50 percent of the peak intensity count are considered in the center wavelength calculation, and the noise from side lobes is neglected.
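The power-weighted mean of Eq. (4), with the 50-percent threshold described above, can be evaluated numerically. The Gaussian spectrum below is synthetic, standing in for the measured OLED spectrum, and the function name is our own.

```python
import numpy as np

def center_wavelength_nm(lam_nm, power):
    # Eq. (4): power-weighted mean wavelength. As in the text, spectral
    # samples below 50% of the peak are discarded to suppress side lobes.
    lam_nm = np.asarray(lam_nm, dtype=float)
    power = np.asarray(power, dtype=float)
    keep = power >= 0.5 * power.max()
    return np.sum(power[keep] * lam_nm[keep]) / np.sum(power[keep])

# Synthetic Gaussian peak at 530 nm (illustrative, not the measured
# green sub-pixel spectrum):
lam = np.linspace(400.0, 700.0, 601)
spec = np.exp(-0.5 * ((lam - 530.0) / 12.0) ** 2)
print(round(center_wavelength_nm(lam, spec), 1))  # -> 530.0
```

On a uniform wavelength grid the discrete weighted mean coincides with the integral form of Eq. (4).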

 

Fig. 4 Color dispersion in modified Savart plate: (a) spectrum of micro OLED and center wavelength, (b) chromatic aberration in modified Savart plate


For each eigen polarization state, the refractive index of calcite crystal is defined by Sellmeier equation as follows [35]:

$$\begin{cases} n_o^{2} = 2.69705 + \dfrac{0.0192064}{\lambda^{2}-0.01820} - 0.0151624\,\lambda^{2},\\[6pt] n_e^{2} = 2.18438 + \dfrac{0.0087309}{\lambda^{2}-0.01018} - 0.0024411\,\lambda^{2}, \end{cases} \tag{5}$$

where λ is the wavelength in micrometers, and n_o and n_e are the refractive indices for the ordinary and extraordinary waves, respectively. After calculating the refractive index for each polarization state and wavelength, ray tracing is performed. Figure 4(b) shows the color dispersion of both the ordinary and extraordinary waves. For the ordinary wave, only a longitudinal chromatic shift occurs, just as when white light passes through an isotropic material. For the extraordinary wave, in contrast, both longitudinal and lateral chromatic shifts occur. Although the longitudinal chromatic shift is hard to observe through an imaging system with a large depth of focus (DOF), such as the human visual system, the lateral chromatic shift strongly degrades the image quality; the human visual system is particularly sensitive to lateral chromatic aberration. As shown in Fig. 4(b), the lateral chromatic shift occurs within a range of 500 μm. Since the human eye can resolve a spatial difference of about one minute of arc, the lateral chromatic shift of the panels generated by the extraordinary wave becomes visible when the image is magnified by the concave mirror or convex lens shown in Fig. 1(b). In addition, neglecting the chromatic shift would distort the 3D images of the DFD or compressive light field display, which reconstruct the 3D image by pixel mapping among multiple panels. In an isotropic optical system, chromatic aberration is compensated by combining multiple lens elements and groups of different materials. It is hard to apply the same methodology to a birefringent optical component because of the high cost and limited choice of birefringent materials: combining two different birefringent materials could reduce the chromatic aberration, but astigmatism would arise from the difference between their refractive indices [31]. In this case, image processing that shifts the primary color channels opposite to the chromatic aberration is a simpler and better way to compensate the lateral chromatic shift. The required shift is approximated by converting the chromatic displacement into a number of sub-pixels as follows:
$$N_{green} = \left[\frac{\Delta C_{green} - p_{sub}}{p_p}\right], \qquad N_{blue} = \left[\frac{\Delta C_{blue} - 2\,p_{sub}}{p_p}\right], \tag{6}$$

where N_green and N_blue are the numbers of pixel shifts for the green and blue sub-pixels, respectively, and ΔC_green and ΔC_blue are the lateral chromatic shifts of the green and blue colors. The notation [·] denotes rounding to the nearest integer. Since the micro OLED has a red, green, and blue stripe sub-pixel structure and the red sub-pixel is the reference of the chromatic shift, the horizontal pixel shift must account for the sub-pixel pitch p_sub. The vertical pixel shift, in contrast, is approximated using only the pixel pitch p_p, so p_sub vanishes from Eq. (6). Under the simulation conditions in Table 2, the green channel is shifted by 3 and 2 pixels in the horizontal and vertical directions, respectively; for the blue channel, shifts of 4 and 3 pixels in the horizontal and vertical directions are required for chromatic aberration compensation.
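Equations (5) and (6) can be checked numerically. The Sellmeier coefficients below reproduce the handbook values for calcite (n_o ≈ 1.658 and n_e ≈ 1.486 at 589 nm); the pitch values in the shift example are placeholders, not the prototype's actual panel specifications, and the function names are our own.

```python
import math

def calcite_indices(lam_um):
    # Eq. (5): Sellmeier relations for calcite; the right-hand sides give
    # the squared indices, with the wavelength in micrometers.
    no2 = 2.69705 + 0.0192064 / (lam_um**2 - 0.01820) - 0.0151624 * lam_um**2
    ne2 = 2.18438 + 0.0087309 / (lam_um**2 - 0.01018) - 0.0024411 * lam_um**2
    return math.sqrt(no2), math.sqrt(ne2)

def horizontal_shift_px(delta_c_um, n_stripes, p_sub_um, p_p_um):
    # Eq. (6): whole-pixel shift for a colour channel whose sub-pixel is
    # n_stripes stripe positions away from the red reference
    # (1 for green, 2 for blue in an RGB stripe layout).
    return round((delta_c_um - n_stripes * p_sub_um) / p_p_um)

n_o, n_e = calcite_indices(0.5893)  # sodium D line
print(round(n_o, 4), round(n_e, 4))  # -> 1.6584 1.4864
```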


Table 2. Simulation conditions for color dispersion analysis

2.4 Additive type compressive light field display

The multi-focal function can reconstruct 3D images using the principle of the DFD or the compressive light field display. Since the DFD provides only a single viewpoint at a fixed pupil position, eye tracking or gaze tracking is usually required to provide the correct view image as the pupil moves [36]. A compressive light field display, in contrast, can generate multiple viewpoints with corresponding perspective images near the eye pupil [13, 16, 37]. Although the correlation among the view images in a compressive light field display is lower than in a DFD with eye tracking, a correlation sufficient for high-quality 3D reconstruction can be achieved when the optimized area is limited to the eye box. The compressive light field display is thus a good candidate for a 3D HMD system without eye tracking. Since both panels are transparent and mutually independent in the proposed method, the observed image is the summation of the images from each panel, as shown in Fig. 5(a). The layer images are generated by the additive-type compressive light field method [12, 17]:

$$l_t(x,y,u,v) = p_f\!\left(f_f(x,y,u,v)\right) + p_r\!\left(f_r(x,y,u,v)\right), \tag{7}$$

where l_t is the target light field of the 3D image at a single viewpoint, p_f and p_r are the pixel values of the front and rear layers, f_f and f_r are the mapping functions of the front and rear planes, x and y are the pixel positions corresponding to the target light field, and u and v are its tangential components. For all viewpoints inside the eye box and their corresponding perspective images, the layer images for the panels are optimized by solving a least squares problem [16].
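Stacking Eq. (7) over all viewpoints yields a linear system in the layer pixel values, which can be solved in the least squares sense. The sketch below is a toy 1-D version under an assumed shift-parallax mapping (viewpoint v pairs front pixel x with rear pixel x + shift); the paper's optimizer instead handles full 2-D light fields over 169 viewpoints, and the function name is our own.

```python
import numpy as np

def solve_additive_layers(target, shifts, n_px):
    # Least-squares solution of Eq. (7) for a toy 1-D light field:
    # viewpoint v sees front pixel x added to rear pixel x + shifts[v]
    # (a simplified parallax mapping assumed for illustration).
    n_views = len(shifts)
    A = np.zeros((n_views * n_px, 2 * n_px))
    b = np.zeros(n_views * n_px)
    row = 0
    for v, s in enumerate(shifts):
        for x in range(n_px):
            if 0 <= x + s < n_px:
                A[row, x] = 1.0              # front-layer unknown
                A[row, n_px + x + s] = 1.0   # rear-layer unknown
                b[row] = target[v, x]
            row += 1
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    # In practice the result is clipped to the panel's displayable range.
    return sol[:n_px], sol[n_px:]
```

Each row of the system is one instance of Eq. (7); minimizing the residual over all rows distributes the target light field between the two additive layers.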

 

Fig. 5 3D image reconstruction: (a) light field optimization for two additive layers, (b) real-time operation of two additive layers


Figure 5(b) shows the real-time synchronization of the optical components. While the duplicated panels are switched to present the correct layer images, the LC response of the polarization rotator is controlled by the applied voltage to modulate the oscillation axis of the linear polarization. After synchronization, the correct layer image appears on each image plane, and 3D images with correct perspective views are observed inside the eye box.

3. Experiments and results

3.1 Prototype of 3D head-mounted display with modified Savart plate

To investigate the feasibility of the proposed system and the reliability of the simulated aberration results, a prototype of the 3D HMD system for a monocular eye is implemented.

The simple optical structure of the HMD module provides the wearable function, as shown in Fig. 6(a). A mannequin head measuring 200 mm (width) by 200 mm (height) by 230 mm (depth) can wear the HMD module. The module is operated through power and video cables connected to a personal computer, which we expect could be replaced by a portable device or an integrated circuit. Figure 6(b) shows the detailed configuration of the module. The polarization state of the high-pixel-density micro OLED is modulated in real time by the polarization rotator (LC-Tec, PolarSpeed®-M(L)). After the modified Savart plate, two virtual display panels are formed according to the eigen polarization state. The separation between the two panels is expanded by the concave mirror, and the beam splitter deflects the image path. The concave mirror is employed, rather than a convex lens, to exclude the lens's own chromatic aberration and thereby isolate the chromatic aberration compensation of the extraordinary wave. A complementary metal-oxide-semiconductor (CMOS) camera located at the entrance of the module captures the perspective images inside the eye box. The driving voltage of the polarization rotator and the display images for the two planes are synchronized at 30 Hz. As shown in Fig. 6(c), the two virtual planes are floated in space with a separation of 260 mm, and both real objects and 3D images are observed simultaneously. The FOV of the 3D image in the prototype is limited by the small size of the micro OLED to 8.9 degrees horizontally and 6.3 degrees vertically. It can be improved by applying a larger display and a concave mirror or convex lens with a shorter focal length; a maximum 3D-image FOV of 37 degrees is obtained when the micro display is as large as the Savart plate and its active area abuts the entrance of the Savart plate. The detailed experimental conditions are listed in Table 3.

 

Fig. 6 Experimental setup: (a) wearable function of proposed system, (b) detailed configuration of HMD module, (c) experimental setup with real objects



Table 3. Experimental conditions

3.2 Experimental results

Figure 7 shows the verification of the multi-focal function and the chromatic aberration compensation. When a CMOS camera with a small-DOF lens focuses at 640 mm, 770 mm, and 900 mm, the captured image changes accordingly. When the camera focuses on the front plane, 640 mm from the eye box, the real object 'F' and the computer-generated image 'Front' are captured sharply, while the other real objects 'C' and 'R' and the image 'Rear' are blurred. When the camera focuses on the real object 'C', located between the front and rear planes, both virtual images are blurred. Focusing on the rear plane yields a sharp image of 'Rear' and the object 'R'. These focus changes verify that the multi-focal planes are well defined without astigmatism; Visualization 1 shows the focus change in real time. The magnified image in Fig. 7(a) shows the lateral chromatic shift expected from the simulation results of Section 2: taking the red channel as the reference, the blue channel is shifted toward the bottom right. The lateral chromatic shift of each color is calculated, and the base image is processed with the sub-pixel shifting method. When the compensated image is applied to the rear plane, the chromatic shift is reduced and a clean white image is observed in Fig. 7(b), verifying the feasibility of the sub-pixel shifting method.

 

Fig. 7 Focus changes between front and rear virtual planes: (a) focus change without compensation, (b) focus change with compensation (Visualization 1)


Figure 8 presents the perspective changes of 3D image generated by compressive light field display inside the eye box. The base images for both planes are optimized at viewpoints inside the eye box with 2 mm interval. The size of the eye box is 24 mm by 24 mm with 169 viewpoints. For investigating the perspective of 3D images containing three characters, series of perspective images are captured at the edge of the eye box. As shown in Fig. 8(a), three characters of ‘A’, ‘V’, and ‘O’ are computer-generated 3D images and real objects of ‘F’ and ‘R’ are arranged to make the word ‘FAVOR’. The character ‘A’ and ‘O’ are located at the same depth position of the object ‘F’ and ‘R’, respectively. The character ‘V’ is located in the middle of real objects. The relative perspective changes between real objects and computer generated images verify the correct reconstruction of 3D image in the space. The results show that both the object ‘F’ and image ‘A’ are located at the front plane and the object ‘R’ and ‘O’ are located at the rear plane. Though the eye box is so small with the size of 24 mm by 24 mm, which means that the perspective of 3D image is also small, clear and appropriate perspectives are observed. For clarifying the continuous perspective change of the 3D images, the Visualization 2 and Visualization 3 including perspective changes in both horizontal and vertical directions are provided. In captured images, ghost images are observed by low polarization contrast of polarization rotator. Unintended portion of the eigen polarization component acts as the leakage of image. The polarization rotator with high polarization contrast can mitigate the crosstalk from ghost image. In Fig. 8(b) and the Visualization 4, the focus change of car which is configured along front plane to rear plane is also provided. The front tire and couple dolls are real objects, and the car body and rear tire are computer-generated 3D images. 
The couple dolls are located at the depth position of the rear-view mirror. As shown in Fig. 8(b), the focus change is confirmed by comparing the degree of blur in the magnified images of the rear-view mirror. Such integration of a real object with its related information can serve diverse purposes in industrial, medical, and educational fields. In addition, the compressive light field images generated by the multi-focal planes can contribute to relieving visual fatigue [13, 16]. Visualization 5 provides an additional perspective change of the car. These experimental results confirm the feasibility of the multi-focal function with aberration compensation and the realization of 3D images in the prototype.
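The additive two-layer optimization behind these light field images can be sketched as a simple least-squares problem: each ray from a viewpoint sums the samples it crosses on the front and rear layers, and the layers are updated by back-projecting the residual. The sketch below is a hypothetical 1-D illustration (function name, disparity model, and gradient-descent update are our assumptions, not the paper's implementation, which optimizes over the full 169-viewpoint grid):

```python
import numpy as np

def optimize_additive_layers(target, d1, d2, iters=200, lr=0.5):
    """Least-squares optimization of two additive display layers (1-D sketch).

    target: (V, N) array of target light-field rays, one row per viewpoint.
    d1, d2: integer per-view pixel disparities of the front/rear layers
            (a ray from viewpoint v hits layer k at a pixel shifted by dk*v).
    Returns the two layer images (each length N) that best reproduce target.
    """
    V, N = target.shape
    l1 = np.zeros(N)
    l2 = np.zeros(N)
    for _ in range(iters):
        # Forward model: each ray adds the two layer samples it crosses.
        recon = np.stack([np.roll(l1, d1 * v) + np.roll(l2, d2 * v)
                          for v in range(V)])
        err = recon - target
        # Back-project the residual onto each layer (transpose of the shifts).
        g1 = sum(np.roll(err[v], -d1 * v) for v in range(V)) / V
        g2 = sum(np.roll(err[v], -d2 * v) for v in range(V)) / V
        l1 = np.clip(l1 - lr * g1, 0.0, 1.0)  # layers are physical: keep in [0, 1]
        l2 = np.clip(l2 - lr * g2, 0.0, 1.0)
    return l1, l2
```

In the prototype the corresponding 2-D optimization runs over the 169 viewpoints of the eye box; the 1-D version above only illustrates the additive forward model and its transpose.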

 

Fig. 8 3D images in proposed system: (a) perspective images, (b) focus changes (Visualization 2, Visualization 3, Visualization 4 and Visualization 5)


4. Conclusions

In this paper, a 3D HMD with multi-focal function is realized using a modified Savart plate. An HWP is inserted between the two plane-parallel calcite plates of the Savart plate to compensate astigmatism. A high-resolution micro OLED with 1666 ppi is optically duplicated into two virtual panels along the longitudinal direction. To ensure high-quality 3D images in the proposed system, the color dispersion of the extraordinary wave is also examined: by applying the sub-pixel shifting method to the red, green, and blue sub-pixels, the lateral chromatic aberration of the extraordinary wave is effectively reduced. The AR function with 3D images is realized by combining the duplicated panels with a concave mirror and a beam splitter. From the geometrical relation between the two panels, compressive light field images are represented by optimizing the light field of the additive layers. A prototype of the proposed system is implemented, and with the polarization rotator and the display image synchronized in real time at 30 Hz, the multi-focal function and the aberration compensation are confirmed. The perspective of the 3D image is also verified by comparison with real objects. The simple optical structure keeps the HMD module wearable despite the addition of the beam splitter: the module measures about 40 mm (width) by 90 mm (height) by 40 mm (depth) and weighs 131 g. The form factor can be further reduced by optimizing the specifications of the optical components, such as the focal length of the mirror and the size of the display. Moreover, the FOV of the system can be improved by adopting a larger micro display and reducing the distance between the display and the concave mirror, and the image distortion from pupil swim can be minimized by pre-distortion rendering of the target perspective images [38].
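The sub-pixel shifting method summarized above amounts to laterally pre-shifting each color channel of the displayed image so that, after the wavelength-dependent walk-off of the extraordinary wave in the Savart plate, the channels realign at the virtual panel. A minimal sketch, assuming integer per-channel offsets taken from a ray-tracing analysis (the function name and offset values are hypothetical):

```python
import numpy as np

def subpixel_shift(image, shifts):
    """Pre-shift each RGB channel to compensate lateral chromatic aberration.

    image:  (H, W, 3) RGB array displayed on the micro OLED.
    shifts: per-channel lateral offsets (dx_r, dx_g, dx_b) in pixels,
            equal to the walk-off each wavelength experiences in the plate.
    The channel is rolled by the opposite sign so the walk-off cancels.
    """
    out = np.empty_like(image)
    for c, dx in enumerate(shifts):
        out[..., c] = np.roll(image[..., c], -dx, axis=1)
    return out
```

In practice the required offsets differ per channel because the extraordinary refractive index, and hence the walk-off, varies with wavelength; sub-pixel-accurate shifts would use interpolation instead of an integer roll.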
We expect that the proposed system can be extended to a 3D HMD with more than two focal planes by combining multiple sets of the modified Savart plate with tunable lenses or deformable mirrors. Moreover, the simple optical configuration and the optical aberration compensation of the proposed system will enable high-quality 3D images in wearable display systems such as HMDs and smart watches.

Funding

National Research Foundation of Korea (501100003725)

Acknowledgments

This work was supported by the Brain Korea 21 Plus Project in 2016. The 3D car image used in the experiment was provided by aXel and used under a Creative Commons Attribution 3.0 license.

References and Links

1. J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013). [CrossRef]   [PubMed]  

2. B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013). [CrossRef]  

3. Y. Kim, J. Kim, K. Hong, H. K. Yang, J.-H. Jung, H. Choi, S.-W. Min, J.-M. Seo, J.-M. Hwang, and B. Lee, “Accommodative response of integral imaging in near distance,” J. Disp. Technol. 8(2), 70–78 (2012). [CrossRef]  

4. J. Nakamura, K. Tanaka, and Y. Takaki, “Increase in depth of field of eyes using reduced-view super multi-view displays,” Appl. Phys. Express 6(2), 022501 (2013). [CrossRef]  

5. C. C. Gordon, T. Churchill, C. E. Clauser, B. Bradtmiller, J. T. McConville, I. Tebbetts, and R. A. Walker, “Anthropometric survey of US army personnel: methods and summary statistics,” United States Army Natick Research, Development, and Engineering Center (1988).

6. Y. Takaki and N. Nago, “Multi-projection of lenticular displays to construct a 256-view super multi-view display,” Opt. Express 18(9), 8824–8835 (2010). [CrossRef]   [PubMed]  

7. Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, and K. Nakamura, “Super multi-view windshield display for long-distance image information presentation,” Opt. Express 19(2), 704–716 (2011). [CrossRef]   [PubMed]  

8. D. Teng, L. Liu, and B. Wang, “Super multi-view three-dimensional display through spatial-spectrum time-multiplexing of planar aligned OLED microdisplays,” Opt. Express 22(25), 31448–31457 (2014). [CrossRef]   [PubMed]  

9. L. Liu, Z. Pang, and D. Teng, “Super multi-view three-dimensional display technique for portable devices,” Opt. Express 24(5), 4421–4430 (2016). [CrossRef]  

10. K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23(3), 804–813 (2004). [CrossRef]  

11. G. Wetzstein, D. Lanman, W. Heidrich, and R. Raskar, “Layered 3D: tomographic image synthesis for attenuation-based light field and high dynamic range displays,” ACM Trans. Graph. 30(4), 95 (2011). [CrossRef]  

12. D. Lanman, G. Wetzstein, M. Hirsch, W. Heidrich, and R. Raskar, “Polarization fields: dynamic light field display using multi-layer LCDs,” ACM Trans. Graph. 30(6), 186 (2011). [CrossRef]  

13. F. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 33, 60 (2015).

14. S. Suyama, Y. Ishigure, H. Takada, K. Nakazawa, J. Hosohata, Y. Takao, and T. Fujikado, “Evaluation of visual fatigue in viewing a depth-fused 3-D display in comparison with a 2-D display,” NTT Tech. Rev. 3, 82–89 (2005).

15. X. Hu and H. Hua, “High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics,” Opt. Express 22(11), 13896–13903 (2014). [CrossRef]   [PubMed]  

16. S. Moon, S.-G. Park, C.-K. Lee, J. Cho, S. Lee, and B. Lee, “Computational multi-projection display,” Opt. Express 24(8), 9025–9037 (2016). [CrossRef]   [PubMed]  

17. S. Lee, C. Jang, S. Moon, J. Cho, and B. Lee, “Additive light field displays: realization of augmented reality with holographic optical elements,” ACM Trans. Graph. 35(4), 60 (2016). [CrossRef]  

18. Y. Kim, H. Choi, J. Kim, S. W. Cho, Y. Kim, G. Park, and B. Lee, “Depth-enhanced integral imaging display system with electrically variable image planes using polymer-dispersed liquid-crystal layers,” Appl. Opt. 46(18), 3766–3773 (2007). [CrossRef]   [PubMed]  

19. S. Liu, Y. Li, P. Zhou, X. Li, N. Rong, S. Huang, W. Lu, and Y. Su, “A multi-plane optical see-through head mounted display design for augmented reality applications,” J. Soc. Inf. Disp. 24(4), 246–251 (2016). [CrossRef]  

20. X. Hu and H. Hua, “Design and assessment of a depth-fused multi-focal-plane display prototype,” J. Disp. Technol. 10(4), 308–316 (2014). [CrossRef]  

21. R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in Proc. of the ACM Conference on Human Factors in Computing Systems (2016), pp. 1211–1220. [CrossRef]  

22. S. Yoon, H. Baek, S. W. Min, S.-G. Park, M. K. Park, S. H. Yoo, H. R. Kim, and B. Lee, “Implementation of active-type Lamina 3D display system,” Opt. Express 23(12), 15848–15856 (2015). [CrossRef]   [PubMed]  

23. J.-H. Park, S. Jung, H. Choi, and B. Lee, “Integral imaging with multiple image planes using a uniaxial crystal plate,” Opt. Express 11(16), 1862–1875 (2003). [CrossRef]   [PubMed]  

24. G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17(18), 15716–15725 (2009). [CrossRef]   [PubMed]  

25. C. K. Park, S. S. Lee, and Y. S. Hwang, “Depth-extended integral imaging system based on a birefringence lens array providing polarization switchable focal lengths,” Opt. Express 17(21), 19047–19054 (2009). [CrossRef]   [PubMed]  

26. S.-G. Park, S. Yoon, J. Yeom, H. Baek, S.-W. Min, and B. Lee, “Lamina 3D display: projection-type depth-fused display using polarization-encoded depth information,” Opt. Express 22(21), 26162–26172 (2014). [CrossRef]   [PubMed]  

27. H.-S. Chen, Y.-J. Wang, P.-J. Chen, and Y.-H. Lin, “Electrically adjustable location of a projected image in augmented reality via a liquid-crystal lens,” Opt. Express 23(22), 28154–28162 (2015). [CrossRef]   [PubMed]  

28. C.-K. Lee, S.-G. Park, S. Moon, and B. Lee, “Viewing zone duplication of multi-projection 3D display system using uniaxial crystal,” Opt. Express 24(8), 8458–8470 (2016). [CrossRef]   [PubMed]  

29. Y.-H. Lee, F. Peng, and S.-T. Wu, “Fast-response switchable lens for 3D and wearable displays,” Opt. Express 24(2), 1668–1675 (2016). [CrossRef]   [PubMed]  

30. M. Avendaño-Alejo and M. Rosete-Aguilar, “Optical path difference in a plane-parallel uniaxial plate,” J. Opt. Soc. Am. A 23(4), 926–932 (2006). [CrossRef]   [PubMed]  

31. T. Mu, C. Zhang, Q. Li, L. Zhang, Y. Wei, and Q. Chen, “Achromatic Savart polariscope: choice of materials,” Opt. Express 22(5), 5043–5051 (2014). [CrossRef]   [PubMed]  

32. J. P. Lesso, A. J. Duncan, W. Sibbett, and M. J. Padgett, “Aberrations introduced by a lens made from a birefringent material,” Appl. Opt. 39(4), 592–598 (2000). [CrossRef]   [PubMed]  

33. D. Schmid, T.-Y. Huang, S. Hazrat, R. Dirks, O. Hosten, S. Quint, D. Thian, and P. G. Kwiat, “Adjustable and robust methods for polarization-dependent focusing,” Opt. Express 21(13), 15538–15552 (2013). [CrossRef]   [PubMed]  

34. W. Jun-Fang, Z. Chun-Min, Z. Ying-Tang, L. Han-Chen, and Z. Xue-Jun, “Refraction of extraordinary rays and ordinary rays in the Savart polariscope,” Chin. Phys. B 17(7), 2504–2508 (2008). [CrossRef]  

35. X. Chen, Y. Luo, J. Zhang, K. Jiang, J. B. Pendry, and S. Zhang, “Macroscopic invisibility cloaking of visible light,” Nat. Commun. 2, 176 (2011). [CrossRef]   [PubMed]  

36. S.-G. Park, J.-Y. Hong, C.-K. Lee, and B. Lee, “Real-mode depth-fused display with viewer tracking,” Opt. Express 23(20), 26710–26722 (2015). [CrossRef]   [PubMed]  

37. D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32(6), 1–10 (2013). [CrossRef]  

38. Y. M. Kim, J. Yim, Y.-K. Ahn, and S.-W. Min, “Compensation of elemental image using multiple view vectors for off-axis integral floating system,” Appl. Opt. 53(10), 1975–1982 (2014). [CrossRef]   [PubMed]  


Takada, H.

S. Suyama, Y. Ishigure, H. Takada, K. Nakazawa, J. Hosohata, Y. Takao, and T. Fujikado, “Evaluation of visual fatigue in viewing a depth-fused 3-D display in comparison with a 2-D display,” NTT Tech. Rev. 3, 82–89 (2005).

Takaki, Y.

Takao, Y.

S. Suyama, Y. Ishigure, H. Takada, K. Nakazawa, J. Hosohata, Y. Takao, and T. Fujikado, “Evaluation of visual fatigue in viewing a depth-fused 3-D display in comparison with a 2-D display,” NTT Tech. Rev. 3, 82–89 (2005).

Tanaka, K.

J. Nakamura, K. Tanaka, and Y. Takaki, “Increase in depth of field of eyes using reduced-view super multi-view displays,” Appl. Phys. Express 6(2), 022501 (2013).
[Crossref]

Teng, D.

Thian, D.

Urano, Y.

Wang, B.

Wang, Y.-J.

Watt, S. J.

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23(3), 804–813 (2004).
[Crossref]

Wei, Y.

Wetzstein, G.

F. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 33, 60 (2015).

D. Lanman, G. Wetzstein, M. Hirsch, W. Heidrich, and R. Raskar, “Polarization fields: dynamic light field display using multi-layer LCDs,” ACM Trans. Graph. 30(6), 186 (2011).
[Crossref]

G. Wetzstein, D. Lanman, W. Heidrich, and R. Raskar, “Layered 3D: tomographic image synthesis for attenuation-based light field and high dynamic range displays,” ACM Trans. Graph. 30(4), 95 (2011).
[Crossref]

R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in Proc. of the ACM Conference on Human Factors in Computing Systems (2016), pp. 1211–1220.
[Crossref]

Wu, S.-T.

Xue-Jun, Z.

W. Jun-Fang, Z. Chun-Min, Z. Ying-Tang, L. Han-Chen, and Z. Xue-Jun, “Refraction of extraordinary rays and ordinary rays in the Savart polariscope,” Chin. Phys. B 17(7), 2504–2508 (2008).
[Crossref]

Yang, H. K.

Y. Kim, J. Kim, K. Hong, H. K. Yang, J.-H. Jung, H. Choi, S.-W. Min, J.-M. Seo, J.-M. Hwang, and B. Lee, “Accommodative response of integral imaging in near distance,” J. Disp. Technol. 8(2), 70–78 (2012).
[Crossref]

Yeom, J.

Yim, J.

Ying-Tang, Z.

W. Jun-Fang, Z. Chun-Min, Z. Ying-Tang, L. Han-Chen, and Z. Xue-Jun, “Refraction of extraordinary rays and ordinary rays in the Savart polariscope,” Chin. Phys. B 17(7), 2504–2508 (2008).
[Crossref]

Yoo, S. H.

Yoon, S.

Zhang, C.

Zhang, J.

X. Chen, Y. Luo, J. Zhang, K. Jiang, J. B. Pendry, and S. Zhang, “Macroscopic invisibility cloaking of visible light,” Nat. Commun. 2, 176 (2011).
[Crossref] [PubMed]

Zhang, L.

Zhang, S.

X. Chen, Y. Luo, J. Zhang, K. Jiang, J. B. Pendry, and S. Zhang, “Macroscopic invisibility cloaking of visible light,” Nat. Commun. 2, 176 (2011).
[Crossref] [PubMed]

Zhou, P.

S. Liu, Y. Li, P. Zhou, X. Li, N. Rong, S. Huang, W. Lu, and Y. Su, “A multi-plane optical see-through head mounted display design for augmented reality applications,” J. Soc. Inf. Disp. 24(4), 246–251 (2016).
[Crossref]

ACM Trans. Graph. (6)

K. Akeley, S. J. Watt, A. R. Girshick, and M. S. Banks, “A stereo display prototype with multiple focal distances,” ACM Trans. Graph. 23(3), 804–813 (2004).
[Crossref]

G. Wetzstein, D. Lanman, W. Heidrich, and R. Raskar, “Layered 3D: tomographic image synthesis for attenuation-based light field and high dynamic range displays,” ACM Trans. Graph. 30(4), 95 (2011).
[Crossref]

D. Lanman, G. Wetzstein, M. Hirsch, W. Heidrich, and R. Raskar, “Polarization fields: dynamic light field display using multi-layer LCDs,” ACM Trans. Graph. 30(6), 186 (2011).
[Crossref]

F. Huang, K. Chen, and G. Wetzstein, “The light field stereoscope: immersive computer graphics via factored near-eye light field displays with focus cues,” ACM Trans. Graph. 33, 60 (2015).

S. Lee, C. Jang, S. Moon, J. Cho, and B. Lee, “Additive light field displays: realization of augmented reality with holographic optical elements,” ACM Trans. Graph. 35(4), 60 (2016).
[Crossref]

D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32(6), 1–10 (2013).
[Crossref]

Adv. Opt. Photonics (1)

J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013).
[Crossref] [PubMed]

Appl. Opt. (3)

Appl. Phys. Express (1)

J. Nakamura, K. Tanaka, and Y. Takaki, “Increase in depth of field of eyes using reduced-view super multi-view displays,” Appl. Phys. Express 6(2), 022501 (2013).
[Crossref]

Chin. Phys. B (1)

W. Jun-Fang, Z. Chun-Min, Z. Ying-Tang, L. Han-Chen, and Z. Xue-Jun, “Refraction of extraordinary rays and ordinary rays in the Savart polariscope,” Chin. Phys. B 17(7), 2504–2508 (2008).
[Crossref]

J. Disp. Technol. (2)

X. Hu and H. Hua, “Design and assessment of a depth-fused multi-focal-plane display prototype,” J. Disp. Technol. 10(4), 308–316 (2014).
[Crossref]

Y. Kim, J. Kim, K. Hong, H. K. Yang, J.-H. Jung, H. Choi, S.-W. Min, J.-M. Seo, J.-M. Hwang, and B. Lee, “Accommodative response of integral imaging in near distance,” J. Disp. Technol. 8(2), 70–78 (2012).
[Crossref]

J. Opt. Soc. Am. A (1)

J. Soc. Inf. Disp. (1)

S. Liu, Y. Li, P. Zhou, X. Li, N. Rong, S. Huang, W. Lu, and Y. Su, “A multi-plane optical see-through head mounted display design for augmented reality applications,” J. Soc. Inf. Disp. 24(4), 246–251 (2016).
[Crossref]

Nat. Commun. (1)

X. Chen, Y. Luo, J. Zhang, K. Jiang, J. B. Pendry, and S. Zhang, “Macroscopic invisibility cloaking of visible light,” Nat. Commun. 2, 176 (2011).
[Crossref] [PubMed]

NTT Tech. Rev. (1)

S. Suyama, Y. Ishigure, H. Takada, K. Nakazawa, J. Hosohata, Y. Takao, and T. Fujikado, “Evaluation of visual fatigue in viewing a depth-fused 3-D display in comparison with a 2-D display,” NTT Tech. Rev. 3, 82–89 (2005).

Opt. Express (17)

X. Hu and H. Hua, “High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics,” Opt. Express 22(11), 13896–13903 (2014).
[Crossref] [PubMed]

S. Moon, S.-G. Park, C.-K. Lee, J. Cho, S. Lee, and B. Lee, “Computational multi-projection display,” Opt. Express 24(8), 9025–9037 (2016).
[Crossref] [PubMed]

Y. Takaki and N. Nago, “Multi-projection of lenticular displays to construct a 256-view super multi-view display,” Opt. Express 18(9), 8824–8835 (2010).
[Crossref] [PubMed]

Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, and K. Nakamura, “Super multi-view windshield display for long-distance image information presentation,” Opt. Express 19(2), 704–716 (2011).
[Crossref] [PubMed]

D. Teng, L. Liu, and B. Wang, “Super multi-view three-dimensional display through spatial-spectrum time-multiplexing of planar aligned OLED microdisplays,” Opt. Express 22(25), 31448–31457 (2014).
[Crossref] [PubMed]

L. Liu, Z. Pang, and D. Teng, “Super multi-view three-dimensional display technique for portable devices,” Opt. Express 24(5), 4421–4430 (2016).
[Crossref]

S.-G. Park, J.-Y. Hong, C.-K. Lee, and B. Lee, “Real-mode depth-fused display with viewer tracking,” Opt. Express 23(20), 26710–26722 (2015).
[Crossref] [PubMed]

T. Mu, C. Zhang, Q. Li, L. Zhang, Y. Wei, and Q. Chen, “Achromatic Savart polariscope: choice of materials,” Opt. Express 22(5), 5043–5051 (2014).
[Crossref] [PubMed]

D. Schmid, T.-Y. Huang, S. Hazrat, R. Dirks, O. Hosten, S. Quint, D. Thian, and P. G. Kwiat, “Adjustable and robust methods for polarization-dependent focusing,” Opt. Express 21(13), 15538–15552 (2013).
[Crossref] [PubMed]

S. Yoon, H. Baek, S. W. Min, S.-G. Park, M. K. Park, S. H. Yoo, H. R. Kim, and B. Lee, “Implementation of active-type Lamina 3D display system,” Opt. Express 23(12), 15848–15856 (2015).
[Crossref] [PubMed]

J.-H. Park, S. Jung, H. Choi, and B. Lee, “Integral imaging with multiple image planes using a uniaxial crystal plate,” Opt. Express 11(16), 1862–1875 (2003).
[Crossref] [PubMed]

G. D. Love, D. M. Hoffman, P. J. W. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17(18), 15716–15725 (2009).
[Crossref] [PubMed]

C. K. Park, S. S. Lee, and Y. S. Hwang, “Depth-extended integral imaging system based on a birefringence lens array providing polarization switchable focal lengths,” Opt. Express 17(21), 19047–19054 (2009).
[Crossref] [PubMed]

S.-G. Park, S. Yoon, J. Yeom, H. Baek, S.-W. Min, and B. Lee, “Lamina 3D display: projection-type depth-fused display using polarization-encoded depth information,” Opt. Express 22(21), 26162–26172 (2014).
[Crossref] [PubMed]

H.-S. Chen, Y.-J. Wang, P.-J. Chen, and Y.-H. Lin, “Electrically adjustable location of a projected image in augmented reality via a liquid-crystal lens,” Opt. Express 23(22), 28154–28162 (2015).
[Crossref] [PubMed]

C.-K. Lee, S.-G. Park, S. Moon, and B. Lee, “Viewing zone duplication of multi-projection 3D display system using uniaxial crystal,” Opt. Express 24(8), 8458–8470 (2016).
[Crossref] [PubMed]

Y.-H. Lee, F. Peng, and S.-T. Wu, “Fast-response switchable lens for 3D and wearable displays,” Opt. Express 24(2), 1668–1675 (2016).
[Crossref] [PubMed]

Phys. Today (1)

B. Lee, “Three-dimensional displays, past and present,” Phys. Today 66(4), 36–41 (2013).
[Crossref]

Other (2)

C. C. Gordon, T. Churchill, C. E. Clauser, B. Bradtmiller, J. T. McConville, I. Tebbetts, and R. A. Walker, “Anthropometric survey of US army personnel: methods and summary statistics,” United States Army Natick Research, Development, and Engineering Center (1988).

R. Konrad, E. A. Cooper, and G. Wetzstein, “Novel optical configurations for virtual reality: evaluating user preference and performance with focus-tunable and monovision near-eye displays,” in Proc. of the ACM Conference on Human Factors in Computing Systems (2016), pp. 1211–1220.
[Crossref]

Supplementary Material (5)

» Visualization 1: MOV (855 KB)      Focus change between front and rear planes
» Visualization 2: MOV (3746 KB)      Perspective change in horizontal direction
» Visualization 3: MOV (3497 KB)      Perspective change in vertical direction
» Visualization 4: MOV (1280 KB)      Focus change of car 3D image
» Visualization 5: MOV (3187 KB)      Perspective change of car 3D image


Figures (8)

Fig. 1 Multi-focal HMD system using Savart plate: (a) schematic diagram of the proposed system, (b) optically equivalent model of the proposed system

Fig. 2 Astigmatism in a plane-parallel calcite plate: ray tracing for (a) the horizontal component (top view, optic axis at 45 degrees), (b) the vertical component (side view, optic axis at 45 degrees), and (c) trajectory of the virtual image of a point light source versus the angle of the optic axis

Fig. 3 Astigmatism compensation using a Savart plate with a half-wave plate: (a) horizontal component (top view) and (b) vertical component (side view)

Fig. 4 Color dispersion in the modified Savart plate: (a) spectrum of the micro OLED and center wavelength, (b) chromatic aberration in the modified Savart plate

Fig. 5 3D image reconstruction: (a) light field optimization for two additive layers, (b) real-time operation of two additive layers

Fig. 6 Experimental setup: (a) wearable function of the proposed system, (b) detailed configuration of the HMD module, (c) experimental setup with real objects

Fig. 7 Focus changes between front and rear virtual planes: (a) focus change without compensation, (b) focus change with compensation (Visualization 1)

Fig. 8 3D images in the proposed system: (a) perspective images, (b) focus changes (Visualization 2, Visualization 3, Visualization 4, and Visualization 5)

Tables (3)

Table 1 Simulation conditions for astigmatism analysis

Table 2 Simulation conditions for color dispersion analysis

Table 3 Experimental conditions

Equations (7)


\[ \theta_{window} = 2\tan^{-1}\!\left( \frac{W_{bs}}{2(d_{eye} + W_{bs})} \right), \]
\[ \theta_{image} = 2\tan^{-1}\!\left( \frac{W_{a}}{2(d_{eye} + S_{a})} \right), \]
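Taking Eqs. (1)–(2) at face value, the window and image fields of view can be evaluated numerically. A minimal Python sketch, where the symbol readings (W_bs as the beam-splitter width, d_eye as the eye relief, W_a and S_a as the aerial-image width and distance) and all sample values are assumptions for illustration only:

```python
import math

def full_fov_deg(width: float, distance: float) -> float:
    """Full angle (degrees) subtended by an aperture of the given width
    at the given distance: theta = 2 * atan(width / (2 * distance))."""
    return math.degrees(2.0 * math.atan(width / (2.0 * distance)))

# Hypothetical values in millimeters.
W_bs, d_eye = 40.0, 20.0  # beam-splitter width, eye relief (assumed)
W_a, S_a = 30.0, 60.0     # aerial-image width and distance (assumed)

theta_window = full_fov_deg(W_bs, d_eye + W_bs)  # form of Eq. (1)
theta_image = full_fov_deg(W_a, d_eye + S_a)     # form of Eq. (2)
```

With these placeholder numbers the window subtends about 36.9 degrees and the image about 21.2 degrees; the point is only that both angles follow the same half-width-over-distance construction.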
\[ \Delta s' = \frac{\Delta s}{(s_f - f)(s_f + \Delta s - f)}\, f^2 , \]
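Equation (3) gives the longitudinal separation of the two duplicated image planes under a thin lens, and it can be cross-checked against a direct image-distance calculation. A sketch assuming s_f is the front-panel object distance, Δs the panel separation, and f the focal length (the numeric values are hypothetical):

```python
def image_distance(s: float, f: float) -> float:
    """Thin-lens image distance s' = s * f / (s - f) for object distance s."""
    return s * f / (s - f)

def delta_s_prime(s_f: float, ds: float, f: float) -> float:
    """Separation of the duplicated image planes, in the form of Eq. (3)."""
    return ds * f**2 / ((s_f - f) * (s_f + ds - f))

f, s_f, ds = 50.0, 55.0, 2.0  # hypothetical focal length and distances (mm)
direct = image_distance(s_f, f) - image_distance(s_f + ds, f)
# `direct` matches delta_s_prime(s_f, ds, f), confirming the closed form.
```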
\[ \lambda_c = \frac{1}{P_{total}} \int p(\lambda)\, \lambda\, d\lambda , \]
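Equation (4) defines the center wavelength as the power-weighted mean of the display spectrum. A discrete sketch of the same quantity, with a hypothetical three-sample green emission spectrum standing in for the measured micro-OLED spectrum:

```python
def center_wavelength(wavelengths, powers):
    """Discrete form of Eq. (4): lambda_c = (1 / P_total) * sum(p_i * lambda_i)."""
    p_total = sum(powers)
    return sum(p * lam for p, lam in zip(powers, wavelengths)) / p_total

# Hypothetical green emission sampled at three wavelengths (nm).
lams = [520.0, 530.0, 540.0]
pows = [0.25, 0.50, 0.25]
lam_c = center_wavelength(lams, pows)  # 530.0 nm for this symmetric spectrum
```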
\[ \begin{cases} n_o^2 = 2.69705 + 0.0192064/(\lambda^2 - 0.01820) - 0.0151624\,\lambda^2 , \\ n_e^2 = 2.18438 + 0.0087309/(\lambda^2 - 0.01018) - 0.0024411\,\lambda^2 , \end{cases} \]
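The coefficients in Eq. (5) are the standard dispersion relation for calcite with n squared on the left-hand side and λ in micrometers (the squares are easy to lose in text extraction). A quick numerical check reproduces the accepted calcite indices at the sodium D line:

```python
import math

def calcite_indices(lam_um: float):
    """Ordinary and extraordinary indices of calcite from the dispersion
    relation of Eq. (5); lam_um is the wavelength in micrometers."""
    l2 = lam_um ** 2
    n_o2 = 2.69705 + 0.0192064 / (l2 - 0.01820) - 0.0151624 * l2
    n_e2 = 2.18438 + 0.0087309 / (l2 - 0.01018) - 0.0024411 * l2
    return math.sqrt(n_o2), math.sqrt(n_e2)

n_o, n_e = calcite_indices(0.589)  # sodium D line
# n_o ≈ 1.658 and n_e ≈ 1.486, the textbook values for calcite
```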
\[ \begin{cases} N_{green} = \left[ \dfrac{\Delta C_{green}}{p_{sub}} \right], \\[6pt] N_{blue} = \left[ \dfrac{\Delta C_{blue}}{2\,p_{sub}} \right], \end{cases} \]
\[ l_t(x,y,u,v) = p_f\big( f_f(x,y,u,v) \big) + p_r\big( f_r(x,y,u,v) \big), \]
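Equation (7) is the additive forward model behind the light field optimization of Fig. 5(a): each ray of the target light field is assigned the sum of the front-panel and rear-panel pixel values it passes through, and the optimization solves for panel images that best reproduce the target. A deliberately simplified 1-D (flatland) sketch of this forward model, in which the ray parameterization, the layer depths, and the unit pixel pitch are all assumptions:

```python
def additive_ray_value(p_front, p_rear, x, u, z_front, z_rear):
    """1-D analogue of Eq. (7): a ray from pupil coordinate u (at z = 0)
    toward screen coordinate x (at z = 1) picks up the sum of the
    front- and rear-layer pixels it intersects."""
    x_f = u + (x - u) * z_front           # intersection with front layer
    x_r = u + (x - u) * z_rear            # intersection with rear layer
    i_f = int(round(x_f)) % len(p_front)  # nearest pixel, unit pitch
    i_r = int(round(x_r)) % len(p_rear)
    return p_front[i_f] + p_rear[i_r]

p_f = [0.0, 1.0, 0.0, 0.0]  # hypothetical 4-pixel front layer
p_r = [0.0, 0.0, 2.0, 0.0]  # hypothetical 4-pixel rear layer
# A head-on ray through x = u = 2 sees p_f[2] + p_r[2] = 2.0.
```

In the full system the same sum is taken over 4-D rays (x, y, u, v), and the panel images p_f, p_r are the unknowns of a least-squares fit to the target light field.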
