Optica Publishing Group

Flicker-free dual-volume augmented reality display using a pixelated interwoven integral floating technique with a geometric phase lens

Open Access

Abstract

A geometric phase (GP) integral floating display can provide multifocal three-dimensional (3D) augmented reality (AR) images with enhanced depth expression by switching the focal modes of the GP lens via polarization control. However, using temporal multiplexing to switch between the focal modes of GP optics causes flickering, because each 3D AR image is presented in full in separate frames and its temporal luminance profile becomes easily recognizable, particularly as the number of available focal modes increases. Here, we propose a novel integral floating technique to generate pixelated interwoven 3D AR images; half of each image is spatially interlaced with the other and both focal modes are presented simultaneously, resolving the flickering issue. The principle was verified via experimental demonstration and optically measured data.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

With the increasing interest in and demand for advanced augmented reality (AR) and virtual reality (VR) displays, various approaches have been proposed and employed for realizing immersive display devices [1–4]. In particular, the integral imaging display is a practical technique for providing full-parallax three-dimensional (3D) images because it utilizes existing flat panel display technologies [5–8]. Moreover, the integral imaging technique is expected to be one of the promising candidates to resolve the visual fatigue issue of AR and VR applications owing to its capability to provide a natural 3D perception [9–11]. However, despite the advantages described above, the limited range of expressible 3D depth remains unresolved and is a main bottleneck of integral imaging technology. Thus, it is essential to extend the depth range of the integral imaging display by increasing the number of focal planes [12].

As switchable wavefront-modulation optics compatible with a time-multiplexing scheme, geometric phase (GP) devices, made from birefringent materials satisfying a half-wave retarder condition, have recently attracted much attention in various optical systems. With a geometrically distributed optic-axis pattern, a single GP optical component provides two different phase modulations, selectable by the incident circular polarization state between the two orthogonal ones [13]. With the polarization-dependent light path control capability of GP optics, the efficiency of projection optics can be much improved [14], and a switchable beam-steering function can be implemented in a compact module [15,16]. Self-interference between two orthogonally polarized wavefront modulations in a thin optical layer can be utilized to realize compact digital holographic camera modules [17,18]. In addition, the GP technique is attractive for AR and VR applications, enabling compact optical parts [19,20] and polarization-selective optical parts that provide binocular images using a single display device [21,22]. Furthermore, the formation of multiple focal planes using the polarization-dependent optical properties of the GP lens has been verified to be effective in extending the depth expression of 3D displays for various applications, including AR and VR devices [23–28].

Therefore, GP integral floating, the combination of the GP technique with the integral imaging display, is expected to be a promising approach to improve the performance. For that purpose, we previously proposed a GP integral floating technique in which an active polarization switching device (APSD) is combined with a polarization-dependent focusing GP lens; it provides multifocal (concave/convex) modes and extends the range of volumetric expression to present a natural three-dimensional (3D) perception by matching the locations of AR images with those of real objects, as depicted in Fig. 1 [29].

Fig. 1. The basic principle of a bifocal integral floating technique using an APSD and a GP lens.

Basically, the GP lens has a bifocal optical property, switching between the convex and concave modes when the incident light has right-handed (RHCP) or left-handed (LHCP) circular polarization, respectively. Based on this, in conventional GP integral floating methods, a single-cell APSD produces 3D images with a single polarization state over the full screen in each frame [29]. Thus, the GP lens can operate in only a single focal mode per frame, corresponding to the incident polarization state (RHCP or LHCP). Further, a method to enhance the image quality of the GP integral floating technique has been proposed to resolve the color separation issue, in which the depth locations of the integrated red/green/blue images are separately compensated for the wavelength-dependent variation in the focal lengths of the GP lens [30]. Through the abovementioned approaches, the potential of the GP integral floating technique to present multi-volume 3D AR images has been verified.
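The polarization selectivity of the GP lens can be illustrated with a short Jones-calculus sketch. This is an illustrative reconstruction, not code from the paper, and the sign convention for the circular polarization vectors is an assumption:

```python
import numpy as np

# Jones vectors for circular polarization (sign convention assumed)
RHCP = np.array([1, -1j]) / np.sqrt(2)
LHCP = np.array([1,  1j]) / np.sqrt(2)

def gp_halfwave(phi):
    """Jones matrix of a half-wave GP element with local optic-axis angle phi.
    For a GP lens the axis pattern is phi(r) = pi * r**2 / (2 * wavelength * f)."""
    return np.array([[np.cos(2 * phi),  np.sin(2 * phi)],
                     [np.sin(2 * phi), -np.cos(2 * phi)]])

phi = 0.3  # arbitrary local axis angle (rad)
out_r = gp_halfwave(phi) @ RHCP  # -> exp(-2j*phi) * LHCP
out_l = gp_halfwave(phi) @ LHCP  # -> exp(+2j*phi) * RHCP

# Opposite handedness picks up the conjugate geometric phase (+/- 2*phi),
# so with a parabolic axis pattern the same element acts as a converging
# lens for one circular state and a diverging lens for the orthogonal one,
# and the handedness also flips on each pass.
```

Because the two handednesses see phase profiles of opposite sign, switching the incident polarization between RHCP and LHCP selects between the +100 mm and −100 mm focal modes of the commercial GP lens used later in the experiment.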

However, despite the outlined advances, an important issue remains: the temporal multiplexing technique employed to form the 3D AR images sequentially reduces the refresh rate of each focal mode, as shown in Fig. 2. This can induce a flickering phenomenon, i.e., a recognizable non-uniform temporal profile of image luminance levels [31,32]. Thus, active devices with at least twice the typical refresh rate (e.g., 120 Hz) are necessary to prevent the flickering problem in most temporal multiplexing volumetric displays, including the conventional GP integral floating approaches described above. Although this flickering problem is common to most temporal multiplexing volumetric displays that extend the expressible depth range [29], the restriction that only display devices and optical systems with high refresh rates can be used is a significant burden in the design of multi-focus volumetric display systems. Moreover, considering that the required refresh rate is proportional to the number of focal modes, extending the range of volumetric image representation becomes much harder because ever-faster devices are necessary to meet this requirement. Though it is possible to use existing display panels with higher refresh rates (e.g., 240 or 360 Hz) to support more volume spaces based on the conventional method, other challenges remain, including the increased amount of image data, which requires additional system resources to render and transmit images to the display panel. Thus, a novel method that reduces the recognizable flicker level without increasing the refresh rate would be considerably effective for extending the depth range of volumetric images without the limitations described above.

Fig. 2. Conventional geometric phase (GP) integral floating technique with change in luminance profile over an entire screen, causing flickering.

Accordingly, here we propose a novel method to resolve the flicker issue of GP integral floating AR displays at the same refresh rate. Instead of providing full 3D AR images in different frames, the proposed method presents pixelated interwoven 3D AR images using a pixelated retardation controller (PRC) combined with an interwoven dual-volume rendering technique. The more densely the 3D AR images are interwoven, i.e., the smaller the interwoven units, the weaker the recognizable flicker level becomes. We verified this technique through experimentation and comparison with the conventional approach via visualizations, showing that the proposed method can prevent flicker recognition without increasing the refresh rate or the processed data of the system. Additionally, optical measurement data is presented herein for numerical verification. The proposed method can be used to realize more volume spaces using a display panel and PRC with the same refresh rate.

2. Principles

The key idea of the proposed technique is to distribute the luminance change both temporally and spatially via pixelated interwoven 3D AR images so that observers can hardly recognize flickering when multi-depth volume images are expressed. To resolve the above issue of the conventional technique, the concept of pixelated polarization multiplexing, which presents a mixture of two display states with different optical properties, can be effective [33,34]. For that purpose, the proposed method adopts a PRC, which is capable of pixelated temporal polarization control, instead of a single-cell APSD, in which the polarization state is uniformly switched to one of the two circular polarizations. Thus, by combining the PRC with the display panel, we can realize pixelated interweaving units by controlling the polarization of each pixel forming the 3D AR images to be RHCP or LHCP, ensuring that the units are floated with the corresponding focal mode of the GP lens, as shown in Fig. 3. Therefore, the proposed system can distribute the presented images between both focal modes of the GP lens simultaneously within a single frame. Thus, it is possible to interweave two 3D AR images because one half of each of them is presented in each volume space simultaneously, as elucidated in Fig. 4. In the figure, the odd and even halves of 3D AR image 1 are displayed in the odd and even frames, respectively, whereas the odd and even halves of 3D AR image 2 are shown in the opposite order. Thus, an observer always sees halves of both 3D AR images, constituting odd or even units, in any frame, and the luminance change occurs on a unit-wise scale, not over the entire screen as in the conventional multiplexing scheme. With the proposed pixelated interwoven dual-volume expression scheme, a flicker-free GP integral floating system can be realized if the size of the interweaving unit is below the recognition limit of the observer. Considering that the spatial resolution of a human with normal vision (visual acuity of 1.0) is 30 cycles per degree, the interweaving unit size is recommended to be around 1 arcmin (1/60°) or smaller.
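The rendering of the two interwoven frames described above can be sketched as follows. This is an illustrative reconstruction rather than the authors' implementation, and the checkerboard arrangement of the interweaving units is an assumption:

```python
import numpy as np

def interwoven_frames(img1, img2, unit=1):
    """Interleave two elemental images into odd/even frames using a
    checkerboard of unit x unit pixel blocks. The same boolean mask would
    drive the PRC, so each block receives the circular polarization (and
    hence the GP-lens focal mode) matching the image it carries."""
    h, w = img1.shape
    yy, xx = np.indices((h, w))
    odd_units = ((yy // unit) + (xx // unit)) % 2 == 0
    frame_odd  = np.where(odd_units, img1, img2)  # odd frame
    frame_even = np.where(odd_units, img2, img1)  # even frame: roles swapped
    return frame_odd, frame_even

# Stand-ins for the two elemental images (uniform levels for clarity).
a = np.full((4, 4), 1.0)  # 3D AR image 1
b = np.full((4, 4), 0.0)  # 3D AR image 2
f_odd, f_even = interwoven_frames(a, b, unit=2)
```

Over any two consecutive frames, every pixel shows each image exactly once, so the time-averaged luminance is uniform across the screen; only unit-scale spatial structure changes between frames, which is the mechanism that suppresses perceived flicker.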

Fig. 3. The principle to form a pixelated interweaving unit by combining the pixels of elemental image on the display panel and the corresponding polarization modulation.

Fig. 4. Proposed flicker-free technique to provide pixelated interwoven 3D AR images with temporal and spatial distributions of luminance profile change.

A further requirement involves aligning and synchronizing the pixelated polarization modulation of the PRC with the rendering of the displayed 3D AR images, so that polarization-encoded pixelated elemental images are constructed at the two depth volumes. This was satisfied using two liquid crystal display (LCD) panels with the same specifications: one as the display panel and the other, with the polarizers on both sides removed, as the PRC. More specifically, LCD panel 2 modulates the polarization of the incident light at the same pixel scale as LCD panel 1 (the display panel). As the display panel and the PRC have the same external dimensions and internal pixel arrangement, precise alignment was accomplished by mounting both panels on a single fixing plate that matches their external borders. Finer alignment was achieved with software that shifts the elemental images in sub-pixel steps; the same software synchronizes the output signals to the display panel and the PRC. Thus, by combining these two panels with a lens array and GP lens after alignment and synchronization, a bifocal integral floating system adopting spatial-temporal retardation control with single-pixel focus modulation can be realized, minimizing flicker recognition even with the temporal multiplexing scheme employed for extending the depth range.

3. Experimental setup and results

The experimental setup to present two 3D AR images with different focal modes of the GP lens is shown in Fig. 5. 3D AR images 1 and 2 were located 257 and 317 mm away from the beam splitter, respectively. Furthermore, to verify that the 3D AR images provide proper 3D cues, we also placed two real objects at the expected locations of the 3D AR images, as depicted in Fig. 5, so that each corresponding pair shares the same focus and motion parallax cues. For this purpose, two real objects, a cat figure (object 1) and a toy car (object 2), were placed 60 mm apart. In the proposed system, a commercial polarization-dependent switching liquid crystal polymer GP lens (#33-466, Edmund Optics) with a 1-inch diameter and a bifocal property of ±100 mm under RHCP/LHCP illumination, as described above, was adopted [35]. The lens array comprised 15 × 15 rectangular elemental lenses, each with a 5-mm pitch and a 10-mm focal length. For the display panel and PRC, we used two identical LCD panels with a pixel pitch of 31.5 µm. Accordingly, the pixelated units of interwoven 3D AR images 1 and 2 subtend 1 arcmin at viewing distances of approximately 480 and 1,300 mm, respectively. Thus, flicker-free GP integral floating is expected to be largely achieved at a viewing distance of around 1 m, which covers most typical viewing conditions of various display applications. Pixelated interwoven 3D AR images 1 (blue die) and 2 (red die) are presented using a beam splitter, and the proposed structure is implemented using the two LCD panels. Photographs of the experimental setup are provided in Fig. 6.
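The 1-arcmin viewing-distance figures above follow from simple geometry. The sketch below is an illustration only: it computes the distance at which a feature of a given apparent size subtends 1 arcmin, and the example uses the raw panel pixel pitch; the paper's larger threshold distances (~480 and ~1,300 mm) arise because the floated units are magnified by the integral floating optics, whose magnification is not reproduced here.

```python
import math

def flicker_free_distance(unit_size_mm):
    """Distance (mm) beyond which a feature of the given apparent size
    subtends less than 1 arcmin, the normal-vision (1.0 acuity) limit."""
    one_arcmin = math.radians(1 / 60)
    return unit_size_mm / math.tan(one_arcmin)

# For an unmagnified 31.5 um pixel, the unit becomes unresolvable
# beyond roughly 108 mm; the magnified floated units require the
# longer distances reported in the text.
d = flicker_free_distance(0.0315)
print(round(d))  # ~108 mm
```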

Fig. 5. Concept of experimental setup comprising two LCD panels, a lens array, GP lens, and a beam splitter to realize the proposed flicker-free GP integral floating scheme.

Fig. 6. Pictures of the experimental setup to realize an AR-pixelated interwoven GP integral floating system.

First, Fig. 7 presents the experimental results captured at various viewpoints. In Fig. 7(a), the focus of the camera was fixed on object 1 (cat figure), and 3D AR image 1 (blue die) was captured clearly, while object 2 (toy car) and the corresponding 3D AR image 2 (red die) appear blurry. By contrast, when the focus changes to object 2, 3D AR image 2 is captured clearly, whereas object 1 and 3D AR image 1 are blurred, as shown in Fig. 7(b). These results show that 3D AR images 1 and 2 provide the same focus cues as objects 1 and 2, respectively, proving their depth-matching feature. Regarding the motion parallaxes, the gap between 3D AR image 1 and object 1 is almost fixed, while the relative distance between 3D AR image 1 and object 2 varies with the viewing position; 3D AR image 2 has the same relationship with object 2. Additionally, the motion parallax between 3D AR images 1 and 2 can be observed in both figures with different focus cues. Therefore, the focus cues and motion parallaxes of 3D AR images 1 and 2 are identical to those of the corresponding real objects 1 and 2, verifying that each 3D AR image is presented at the desired depth location as a dual-volume image. Although these results confirm that halves of both 3D AR images are realized within a frame by the proposed pixelated interweaving scheme, a limitation remains: the locations of the 3D AR images are still restricted to the vicinity of the two focal planes, and a gap can appear between the volumetric images if the focal planes are far apart [36]. To tackle this issue, further research, such as realizing a multi-focus volumetric display with more precise depth expression, would be necessary.

Fig. 7. Experimental results of parallaxes of 3D AR images 1 and 2 and the corresponding objects with camera focus on (a) 3D AR image 1 (blue die) and object 1 (cat figure), and (b) 3D AR image 2 (red die) and object 2 (toy car). Each 3D AR image presents the same focus cue and motion parallaxes with the corresponding depth-matching real objects.

Second, the effect of the interweaving unit size on flicker reduction is presented in Fig. 8. As can be observed in the odd frame, the conventional method with no interweaving presents the entire first 3D AR image at the location of object 1 only, while no image is projected at the location of object 2. In the even frame, by contrast, only the entire second 3D AR image appears at the location of object 2, whereas 3D AR image 1 disappears. Compared with the conventional principle with no interweaving, the proposed method shows halves of both images simultaneously in an interwoven manner. By using LCD panels with the same specifications for the display panel and the PRC, as described above, the experimental system could realize an interweaving unit with a size of a single pixel (31.5 µm). In addition, to compare the effect of the interweaving unit size on flicker reduction, we prepared further conditions with interweaving unit sizes of 3, 5, and 7 pixels. As presented in Fig. 8, the smaller the unit size, the harder it is to recognize the interwoven pattern and the spatial luminance distribution that causes the perception of flickering. Conversely, as the size of the interweaving unit increases, it becomes harder to maintain the 3D AR image quality, and the unpresented parts (empty spaces) between the units and their temporal luminance change become easier to perceive locally as a flicker phenomenon. Accordingly, we can confirm the effectiveness of pixelated interweaving of 3D AR images in minimizing the unit size of the spatial luminance distribution.

Fig. 8. Experimental results of the presented 3D AR images obtained with various interweaving unit-size conditions.

The quality of the pixelated interweaving scheme shown in Fig. 8 could be improved further. Since the current experimental setup was realized by stacking two commercial LCD panels, light rays with undesired paths, such as stray light and light partially reflected between the glass substrates of the panels, degrade the quality of the reconstructed image. Therefore, we expect that the picture quality of the proposed method can be improved by advanced techniques, such as the development of an LCD panel with an additional in-cell polarization-controlling layer, combining the functions of the display panel and PRC within a single device.

In addition, we prepared two movies of the experimental results as supplementary materials to exhibit the flicker reduction achieved with the proposed method. The changes in focus cue and parallax are demonstrated in Visualization 1; as a movie, it conveys the actual flicker level and confirms that a flicker-free condition is accomplished when the proposed technique is adopted. Visualization 2 presents a comparison of the recorded flicker phenomenon, with the conventional technique on the left side and the proposed scheme on the right. As confirmed from the movie, the flicker problem is significantly reduced and nearly unobservable when the proposed method is employed (right side).

Finally, the temporal luminance profiles of 3D AR image 1 obtained using the conventional and proposed methods are compared in Fig. 9. The data was measured using a luminance meter (CA-310, Konica Minolta) at a measuring frequency of 6 Hz. As shown in Fig. 9, the luminance fluctuation under the spatiotemporal retardation control at a refresh rate of 3 Hz is significantly smaller than that obtained conventionally. Even under this harsh condition of extremely low-frequency operation, the evaluation results prove that the temporal luminance levels remain almost constant, without fluctuation, owing to the proposed pixelated interwoven 3D AR image construction scheme, which resolves the flicker issue of the temporal multiplexing approach.
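The measured contrast between the two methods can be mirrored by a toy model of the per-frame luminance of one image region. This is an idealized sketch (binary on/off levels are assumed; real measured levels depend on the optics), not the measured data:

```python
import numpy as np

frames = np.arange(20)  # frame indices over an observation window

# Conventional: a full 3D AR image is shown only every other frame,
# so the luminance of its region toggles between on and off.
conventional = (frames % 2 == 0).astype(float)

# Proposed: half of the interweaving units are lit in every frame,
# so the region's luminance stays at a constant mid level.
proposed = np.full(frames.shape, 0.5)

# Same time-averaged luminance, but only the conventional profile
# fluctuates frame to frame, which is what is perceived as flicker.
print(conventional.mean(), conventional.std())  # 0.5 0.5
print(proposed.mean(), proposed.std())          # 0.5 0.0
```

The zero standard deviation of the modeled proposed profile corresponds to the flat measured curve in Fig. 9, while the alternating profile corresponds to the conventional fluctuation.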

Fig. 9. Optical measurement results providing normalized luminance profiles of 3D AR image 1 based on the conventional and proposed methods. The proposed technique presents a stable temporal luminance profile, while the conventional method exhibits fluctuations, which cause the flickering issue.

4. Conclusion

With growing demand for immersive display devices presenting 3D AR images, the integration of multiple volume spaces is expected to help provide natural depth cues with less visual fatigue. Though the temporal multiplexing approach is an effective and verified method for extending the expressible depth range with more focal states, the flickering caused by the reduced refresh rate of each volume space has been a significant hindrance. Herein, we proposed a pixelated interwoven GP integral floating technique to realize flicker-free dual-volume AR images. The method was verified by experimental results and optical measurement data. The adoption of this technique is expected to extend the volume expression of 3D AR images with more volume spaces, less burden on system performance, and a reduced amount of image data for rendering and transmission.

Funding

Institute of Information & Communications Technology Planning & Evaluation (2020-0-00924); National Research Foundation of Korea (2021R1A2C1011803); Ministry of Trade, Industry and Energy (20019235).

Acknowledgments

This work was partly supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2020-0-00924, Technology development of authoring tool for 3D holographic printing contents, 50%), the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1A2C1011803, Research on gaze-contingent hybrid volumetric display, 30%), and the Technology Innovation Program funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea) (No. 20019235, Deformable and immersive volumetric AR glass, 20%).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. X. Hu and H. Hua, “High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics,” Opt. Express 22(11), 13896–13903 (2014). [CrossRef]  

2. S. Liu, Y. Li, P. Zhou, S. Huang, Q. Chen, and Y. Su, “A multi-plane optical see-through head mounted display design for augmented reality applications,” J. Soc. Inf. Disp. 24(4), 246–251 (2016). [CrossRef]  

3. Y. Zhou, J. Zhang, and F. Fang, “Design of the varifocal and multifocal optical near-eye see-through display,” Optik 270, 169942 (2022). [CrossRef]  

4. T. Zhan, J. Zou, M. Lu, E. Chen, and S.-T. Wu, “Wavelength-multiplexed multi-focal-plane see-through near-eye displays,” Opt. Express 27(20), 27507–27513 (2019). [CrossRef]  

5. S.-W. Min, M. Hahn, J. Kim, and B. Lee, “Three-dimensional electro-floating display system using an integral imaging method,” Opt. Express 13(12), 4358–4369 (2005). [CrossRef]  

6. J. Hong, S.-W. Min, and B. Lee, “Integral floating display systems for augmented reality,” Appl. Opt. 51(18), 4201–4209 (2012). [CrossRef]  

7. Y. Takaki and Y. Yamaguchi, “Flat-panel type see-through three-dimensional display based on integral imaging,” Opt. Lett. 40(8), 1873–1876 (2015). [CrossRef]  

8. Y. Yamaguchi and Y. Takaki, “See-through integral imaging display with background occlusion capability,” Appl. Opt. 55(3), A144–A149 (2016). [CrossRef]  

9. H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display,” Opt. Express 22(11), 13484–13491 (2014). [CrossRef]  

10. C. Yao and D. Cheng, “Design of an optical see-through light-field near-eye display using a discrete lenslet array,” Opt. Express 26(14), 18292–18301 (2018). [CrossRef]  

11. J. Xiong, E.-L. Hsiang, Z. He, T. Zhan, and S.-T. Wu, “Augmented reality and virtual reality displays: emerging technologies and future perspectives,” Light: Sci. Appl. 10(1), 1–30 (2021). [CrossRef]  

12. X. Wang and H. Hua, “Depth-enhanced head-mounted light field displays based on integral imaging,” Opt. Lett. 46(5), 985–988 (2021). [CrossRef]  

13. J. Xiong and S.-T. Wu, “Planar liquid crystal polarization optics for augmented reality and virtual reality: From fundamentals to applications,” eLight 1(1), 3–20 (2021). [CrossRef]  

14. R. K. Komanduri, C. Oh, and M. J. Escuti, “Late-News Paper: Polarization Independent Projection Systems Using Thin Film Polymer Polarization Gratings and Standard Liquid Crystal Microdisplays,” Dig. Tech. Pap. - Soc. Inf. Disp. Int. Symp. 40(1), 487–490 (2009). [CrossRef]  

15. P. F. McManamon, P. J. Bos, M. J. Escuti, J. Heikenfeld, S. Serati, H. Xie, and E. A. Watson, “A review of phased array steering for narrow-band electrooptical systems,” Proc. IEEE 97(6), 1078–1096 (2009). [CrossRef]  

16. J. Kim, C. Oh, S. Serati, and M. J. Escuti, “Wide-angle, nonmechanical beam steering with high throughput utilizing polarization gratings,” Appl. Opt. 50(17), 2636–2639 (2011). [CrossRef]  

17. K. Choi, J. Yim, S. Yoo, and S.-W. Min, “Self-interference digital holography with a geometric-phase hologram lens,” Opt. Lett. 42(19), 3940–3943 (2017). [CrossRef]  

18. K. Choi, J. Yim, and S.-W. Min, “Achromatic phase shifting self-interference incoherent digital holography using linear polarizer and geometric phase lens,” Opt. Express 26(13), 16212–16225 (2018). [CrossRef]  

19. S. Moon, C.-K. Lee, S.-W. Nam, C. Jang, G.-Y. Lee, W. Seo, G. Sung, H.-S. Lee, and B. Lee, “Augmented reality near-eye display using Pancharatnam-Berry phase lenses,” Sci. Rep. 9(1), 6616 (2019). [CrossRef]  

20. S. Moon, S.-W. Nam, Y. Jeong, C.-K. Lee, H.-S. Lee, and B. Lee, “Compact augmented reality combiner using Pancharatnam-Berry phase lens,” IEEE Photonics Technol. Lett. 32(5), 235–238 (2020). [CrossRef]  

21. Y. Weng, D. Xu, Y. Zhang, X. Li, and S.-T. Wu, “Polarization volume grating with high efficiency and large diffraction angle,” Opt. Express 24(16), 17746–17759 (2016). [CrossRef]  

22. Y. H. Lee, G. Tan, T. Zhan, Y. Weng, G. Liu, F. Gou, F. Peng, N. V. Tabiryan, S. Gauza, and S. T. Wu, “Recent progress in Pancharatnam-Berry phase optical elements and the applications for virtual/augmented realities,” Opt. Data Process. Storage 3(1), 79–88 (2017). [CrossRef]  

23. T. Zhan, Y. H. Lee, and S. T. Wu, “High-resolution additive light field near-eye display by switchable Pancharatnam-Berry phase lenses,” Opt. Express 26(4), 4863–4872 (2018). [CrossRef]  

24. S. Li, Y. Liu, Y. Li, S. Liu, S. Chen, and Y. Su, “Fast-response Pancharatnam-Berry phase optical elements based on polymer-stabilized liquid crystal,” Opt. Express 27(16), 22522–22531 (2019). [CrossRef]  

25. C. Yoo, K. Bang, C. Jang, D. Kim, C. K. Lee, G. Sung, and B. Lee, “Dual-focal waveguide see-through near-eye display with polarization-dependent lenses,” Opt. Lett. 44(8), 1920–1923 (2019). [CrossRef]  

26. T. Zhan, Y.-H. Lee, G. Tan, J. Xiong, K. Yin, F. Gou, J. Zou, N. Zhang, D. Zhao, J. Yang, S. Liu, and S.-T. Wu, “Pancharatnam–Berry optical elements for head-up and near-eye displays,” J. Opt. Soc. Am. B 36(5), D52–D65 (2019). [CrossRef]  

27. T. Zhan, J. Xiong, J. Zou, and S.-T. Wu, “Multifocal displays: review and prospect,” PhotoniX 1(1), 10 (2020). [CrossRef]  

28. G. Tan, T. Zhan, Y.-H. Lee, J. Xiong, and S.-T. Wu, “Polarization-multiplexed multiplane display,” Opt. Lett. 43(22), 5651–5654 (2018). [CrossRef]  

29. M. Park, K.-I. Joo, H.-R. Kim, and H.-J. Choi, “An augmented-reality device with switchable integrated spaces using a bi-focal integral floating display,” IEEE Photonics J. 11(4), 1–8 (2019). [CrossRef]  

30. H.-J. Choi, Y. K. Park, H. Lee, K.-I. Joo, T.-H. Lee, S. Hong, and H.-R. Kim, “Compensation of color breaking in bi-focal depth-switchable integral floating augmented reality display with a geometrical phase lens,” Opt. Express 28(24), 35548–35560 (2020). [CrossRef]  

31. U. T. Keesey, “Variables determining flicker sensitivity in small fields,” J. Opt. Soc. Am. 60(3), 390–398 (1970). [CrossRef]  

32. C. W. Tyler, “Analysis of normal flicker sensitivity and its variability in the visuogram test,” Investig. Ophthalmol. Vis. Sci. 32(9), 2552–2560 (1991).

33. P.-Y. Chou, J.-Y. Wu, S.-H. Huang, C.-P. Wang, Z. Qin, C.-T. Huang, P.-Y. Hsieh, H.-H. Lee, T.-H. Lin, and Y.-P. Huang, “Hybrid light field head-mounted display using time-multiplexed liquid crystal lens array for resolution enhancement,” Opt. Express 27(2), 1164–1177 (2019). [CrossRef]  

34. Q. Li, H. Deng, C. Yang, W. He, and F. Zhong, “Locally controllable 2D/3D mixed display and image generation method,” Opt. Express 30(13), 22838–22847 (2022). [CrossRef]  

35. Edmund Optics, https://www.edmundoptics.com/knowledge-center/trending-in-optics/polarization-directed-flat-lenses/.

36. S. W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), L71–L74 (2005). [CrossRef]  

Supplementary Material (2)

Visualization 1: Presents the change of focus and motion parallax cues with the real level of flicker.
Visualization 2: Presents a comparison of the recorded flicker phenomenon using the conventional technique on the left side, while the proposed scheme is shown on the right. As confirmed from the movie, the flicker problem is significantly reduced and nearly unobservable when the proposed method is employed.
