Modeling and optimizing through plenoptic function for the dual lenticular lens-based directional autostereoscopic display system

Open Access

Abstract

We propose an autostereoscopic display system that ensures full resolution for multiple users through directional backlight and eye tracking technology. The steerable beam formed by the directional backlight can be regarded as the result of sparsely sampling the light field in space. Therefore, we propose an optimization algorithm based on a characterization of the state of the steerable beams, computed in matrix form using the plenoptic function. This optimization algorithm aims to optimize the exit pupil quality and ultimately enhance the viewing experience of the stereoscopic display. Numerical simulations are conducted, and the improvement in exit pupil quality achieved by the optimization scheme is verified. Furthermore, a prototype of the stereoscopic display that employs dual lenticular lens sheets for the directional backlight has been constructed using the optimized optical parameters. It provides 9 independent exit pupils at the optimal viewing distance of 400 mm, with an exit pupil resolution of 1/30. The field of view is ±16.7°, and the viewing distance range is 380 mm to 440 mm. At the optimal viewing distance of 400 mm, the average crosstalk of the system is 3%, and the dynamic brightness uniformity across the entire viewing plane reaches 85%. The brightness uniformity of the display at each exit pupil is higher than 88%.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Display technology has always aimed to provide viewers with more realistic and accurate information. With the development of liquid crystal display (LCD) technology, 2-dimensional (2D) displays have continuously improved in spatial and temporal resolution, meeting the demands of viewing 2D information. However, representing depth information has always been a challenge for achieving a more realistic viewing effect. 3-dimensional (3D) display technology can provide depth information and has important applications in many fields, such as military, medical, entertainment, and advertising [1,2,3]. There are many methods for realizing naked-eye 3D display, including binocular display, multi-view display, integral display, volumetric display, holographic display, and light field display. The main principle is to reproduce the distribution of the light field as realistically as possible. Integral display [4,5], volumetric display [6,7], holographic display [8,9], and light field display [10,11] are considered good choices for realizing true 3D because they can reproduce the distribution of the light field in real 3D space in the form of hogels, voxels, or points. These methods can provide all the required depth information, thereby avoiding the vergence-accommodation conflict. However, due to the limitations of current device resolution and computing power, these display methods still struggle to meet high-definition and real-time requirements, and there is still much room for improvement in imaging quality.

Autostereoscopic 3D display emphasizes that viewers can obtain greater viewing freedom without glasses or other auxiliary equipment. This is achieved through interaction between the viewer and the display system, aided by eye tracking techniques. There are several strategies for implementing an autostereoscopic display system. To our knowledge, the first autostereoscopic display system with eye tracking was proposed by Perlin and his group at New York University in 2000 [12,13]. It provides a single viewer with a true stereoscopic view of simulated objects while allowing the observer to move freely. The stripe pattern of the light shutter and the image pattern, which are interleaved into three successive time phases, are computed from the eye tracker information. The autostereoscopic technique has since developed towards multi-viewer scenarios. Surman et al. propose two multi-user autostereoscopic display systems with head tracking that provide images according to the users’ eye positions [14,15,16]. However, their display systems are complicated and bulky. Liou et al. propose a more compact multi-view full-resolution 3D display based on a lenticular lens array and a dynamically controllable LED backlight [17]. Liao’s group proposes a display system that provides different rendered images at different viewing angles and realizes full resolution by eye tracking [18,19].

Directional backlight is an important technique in autostereoscopic display, and numerous researchers have made significant contributions to backlighting technology. Hwang et al. suggest a directional backlight based on 3-colored volume-holographic optical elements (VHOEs) for mobile displays [20]. VHOEs, combined with a time-multiplexed technique, are used to control the direction of the backlight to form left and right viewing zones. Chen et al. propose a micro-grooved lightguide structure with a time-sequential driving scheme, which reduces image crosstalk and eliminates Moiré patterns [21]. Yoon et al. suggest using arrays of Lucius microprisms for directionally allocating light, which can be applied in autostereoscopic display [22]. Hayashi et al. use an elliptically shaped mirror in the directional backlight for a 23-inch two-view 3D system [23]. However, the literature mentioned above commonly reports two-view 3D systems rather than multiple views with a large viewing angle. Fattal et al., on the other hand, present a multi-directional diffractive backlight with a wide viewing zone of up to 180 degrees in principle [24]. Ishizuka et al. adopt a light dot matrix and a two-dimensional convex lens array to realize a collimated directional backlight, and the resulting distribution of viewing zones was analyzed for optimization [25]. Zhou’s group proposed a free-form-surface backlight that realizes an eye-tracked display system with 8 viewing zones [26,27]. It can support multiple users at the same time. Furthermore, the optimized backlight design guarantees low crosstalk and a favorable exit pupil (EP) distribution.

This paper presents a directional backlight 3D display system based on dual lenticular lenses, combining temporal-spatial multiplexing and eye-tracking technology to provide high-degree-of-freedom, full-resolution naked-eye 3D effects. To improve the performance of the display system, this paper introduces an optimization scheme for the optical parameters based on the matrix representation of ray tracing with the plenoptic function. The optimization scheme takes into account the quality of the distribution of EPs across the entire viewing region and uses a greedy algorithm to reach locally optimal solutions for the optical parameters, such as the relative position between the light source and lenticular lens sheet 1 and the relative position between the two lenticular lens sheets. It guarantees the overall high quality of the stereoscopic viewing effect.

The significance of this optimization scheme is twofold:

  • (1) It is common to observe variations between the parameters of the manufactured optical lens and the originally designed ones. As a result, it becomes necessary to make adjustments to other optical parameters based on the measured parameters of the actual lens.
  • (2) When designing optical parameters for display systems, it is often necessary to rely on partial viewing effects due to the complex and time-consuming nature of simulating a large number of lens arrays. However, the optimization scheme proposed in this paper is capable of automatically reaching the local optimum solution for each parameter, thereby ensuring the overall quality of the display system. This eliminates the tedious process of continuous manual parameter tuning during simulation.

These analyses help in achieving a homogeneous viewing experience through a low-cost, manageable, and general approach. To verify the reliability of the concepts proposed in this paper, we built a prototype, which provides 9 independent EPs at a viewing distance of 400 mm, with an EP resolution of 1/30. The field of view is ±16.7$^\circ $, and the average crosstalk of the system is 3%. The brightness uniformity within each EP is higher than 88%, and the brightness uniformity across the entire viewing plane reaches 85%. The viewing distance range is 380 mm to 440 mm for a proper stereo effect. Overall, our proposed approach effectively resolves the conflict between the number of viewpoints and viewing resolution, resulting in an improved, high-quality naked-eye autostereoscopic display system.

2. Principle

2.1 Directional backlight module based on dual lenticular lens

We implemented a directional backlighting technique using dual lenticular lenses. The overall structure comprises an addressable LED array, a lenticular lens sheet (LLS1), a vertical diffuser, and another lenticular lens sheet (LLS2), stacked in order, as shown in Fig. 1. In this setup, LLS1 concentrates the light rays emitted by the LED along the x-axis, creating light strips on its focal plane. Each unit of LLS2 redirects a light strip to form a steerable beam. These steerable beams propagate towards the viewing plane to form a sub-EP. The superimposition of multiple sub-EPs on the viewing plane forms a complete EP. The relationships among the relevant parameters are expressed in Eqs. (1)-(4).

$$p = \left( {1 + \frac{{{f_1}}}{{{d_1}}}} \right){t_1} = \; ({1 + {\beta_1}} ){t_1}$$
$${t_0} = p\frac{{{d_1}}}{{{f_1}}} = \frac{p}{{{\beta _1}}}$$
$$p = \left( {1 + \frac{{{f_2}}}{{{d_3} + {d_4}}}} \right){t_2} = \left( {1 + \frac{1}{{{\beta_2}}}} \right){t_2}$$
$$\Delta e = \; {\beta _2} \cdot \Delta p = \; {\beta _1} \cdot {\beta _2} \cdot \Delta {t_0}$$

Fig. 1. Optical structure of the proposed dual lenticular lens directional backlight.

In Eqs. (1)-(4), ${t_0}$, ${t_1}$, ${t_2}$, and p represent the periods of the LED array, LLS1, LLS2, and the light strip array, respectively. ${f_1}$ and ${f_2}$ are the focal lengths of LLS1 and LLS2. ${d_1}$, ${d_3}$, and ${d_4}$ are the distance between the LEDs and LLS1, the thickness of LLS2, and the viewing distance, respectively. $\Delta {t_0}$, $\Delta p$, and $\Delta e$ represent the distances between two sets of LED arrays, light strip arrays, and EPs, respectively. Here, we introduce two magnification factors, ${\beta _1}$ and ${\beta _2}$. ${\beta _1}$ compresses large-pitched LEDs into small-pitched, smaller-sized light strips, while ${\beta _2}$ has the opposite effect, for the following purposes: (1) the interval between adjacent LEDs, ${t_0}$, can be larger in the case of multiple EPs, allowing for a more flexible LED arrangement; (2) the period of the light strip, p, can be smaller, so that a more uniform backlight can be achieved; and (3) the thickness of the backlight system, primarily determined by ${d_1}$, can be effectively reduced.
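To make the backlight geometry concrete, the short sketch below evaluates Eqs. (1)-(4) for an assumed set of design values; all numbers (pitches, focal lengths, distances) are illustrative placeholders rather than the prototype's actual parameters.

```python
# Minimal sketch of Eqs. (1)-(4); all numeric values are illustrative placeholders.
def backlight_design(t1, f1, d1, f2, d3, d4, dt0):
    """Return light-strip pitch p, LED pitch t0, LLS2 pitch t2, and EP spacing de."""
    beta1 = f1 / d1                  # magnification of LLS1, Eqs. (1)-(2)
    p = (1 + beta1) * t1             # light-strip pitch, Eq. (1)
    t0 = p / beta1                   # LED pitch, Eq. (2)
    beta2 = (d3 + d4) / f2           # magnification of LLS2, Eq. (3)
    t2 = p / (1 + 1 / beta2)         # LLS2 pitch consistent with p, Eq. (3)
    de = beta1 * beta2 * dt0         # EP spacing for an LED offset dt0, Eq. (4)
    return p, t0, t2, de

# Example with placeholder values (all lengths in mm):
p, t0, t2, de = backlight_design(t1=0.2, f1=2.0, d1=10.0,
                                 f2=2.0, d3=2.0, d4=400.0, dt0=0.15)
print(f"p = {p:.3f} mm, t0 = {t0:.3f} mm, t2 = {t2:.4f} mm, de = {de:.1f} mm")
```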

2.2 Modeling for the autostereoscopic display based on plenoptic function

Two plenoptic functions are able to describe the information propagating from any point in the 3D space to another point at any time, as expressed by Eq. (5):

$$L[{{{({x,y,z,\theta ,\varphi ,\lambda } )}_{LED}},t} ]\mathop \Rightarrow \limits^{OMDF} L[{{{({x,y,z,\theta ,\varphi ,\lambda } )}_{EP}},t} ]$$

The light field originates from the LED, is modulated by the optical layers, and is finally perceived at the EP. OMDF is the modulation function of the optical system for the light source, including spatial modulation (SM), frequency modulation (FM), amplitude modulation (AM), as well as modulation in the temporal dimension. For directional backlight display technology with dual lenticular lenses, it is necessary to analyze the characteristics of the optical layers. The influence of the optical layers on the frequency and temporal dimensions can be ignored, so the wavelength and time terms of the plenoptic function can be omitted. Furthermore, since the display system only retains horizontal parallax, the optical layers mainly modulate the spatial distribution of the light field horizontally. Therefore, the final simplified light field propagation can be expressed as:

$$L_{EP}({u,{\theta_u}} )= L_{LED} [{SM \times ({x,{\theta_x}} )} ]$$

The positive angle direction is defined as clockwise from the optical axis to the light ray. The state matrices of the light ray emitted from the light source, $S_{BL}$, and of the light ray reaching the observation plane, $S_{OBV}$, are represented respectively as:

$$S_{BL}= \left( {\begin{array}{{c}} x\\ {{\theta_x}}\\ 1\\ 1 \end{array}} \right)$$
$$S_{OBV}= \; \left( {\begin{array}{{c}} u\\ {{\theta_u}}\\ 1\\ 0 \end{array}} \right)$$

The propagation of light through the backlight system can be divided into three sections, as shown in Fig. 2. In each section, light refracts or travels in a straight line. The refraction process, which describes the angle change of the light at the junction surface of different media, can be represented by a 4${\times} $4 matrix R; the transition process, which describes the change in coordinate position as light propagates along a straight line, can be represented by a 4${\times} $4 matrix T. The transformation between the initial state and the final state of the light ray can be calculated by Eq. (9):

$$S_{OBV}= \; {T_4} \times {R_3} \times {T_3} \times {R_2} \times {T_2} \times {R_1} \times {T_1} \times S_{BL}$$
$${T_4} = \left( {\begin{array}{{cccc}} 1&{{d_4}}&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&0 \end{array}} \right),\,{R_3} = \left( {\begin{array}{{cccc}} 1&0&0&0\\ 0&1&\gamma &0\\ 0&0&1&0\\ 0&0&0&0 \end{array}} \right),\,{T_3} = \left( {\begin{array}{{cccc}} 1&{{r_2} + {d_3}}&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&0 \end{array}} \right)$$

Fig. 2. The matrix representation of light ray propagation through multiple optical layers.

$${R_2} = \left( {\begin{array}{{cccc}} 1&0&0&0\\ {\frac{{1 - n}}{{n{r_2}}}}&{\frac{1}{n}}&{\frac{{n - 1}}{{n{r_2}}}w{t_2}}&0\\ 0&0&1&0\\ 0&0&0&0 \end{array}} \right),\,{T_2} = \left( {\begin{array}{{cccc}} 1&{{d_2}}&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&0 \end{array}} \right),$$
$${R_1} = \left( {\begin{array}{{cccc}} 1&0&0&0\\ {\frac{1}{{{f_1} + {r_1}}}}&0&0&{ - \frac{1}{{{f_1} + {r_1}}}}\\ 0&0&1&0\\ 0&0&0&0 \end{array}} \right),\,{T_1} = \left( {\begin{array}{{cccc}} {\frac{{ - {f_1}}}{{{d_1} + {r_1}}}}&0&{\frac{{{f_1} \times w{t_1}}}{{{d_1} + {r_1}}}}&{w{t_1}}\\ 0&1&0&0\\ 0&0&1&0\\ 1&{{d_1}}&0&0 \end{array}} \right)$$

Here, $SM = {T_4} \times {R_3} \times {T_3} \times {R_2} \times {T_2} \times {R_1} \times {T_1}$. ${T_1}$, ${T_2}$, ${T_3}$ and ${T_4}$ are transition matrices, while ${R_1}$, ${R_2}$ and ${R_3}$ are refraction matrices. Meanwhile, $\gamma = {\theta _{out}} - {\theta _{in}} = \arcsin ({n\sin {\theta_{in}}} )- {\theta _{in}}$, where n is the refractive index of the lens material, ${\theta _{in}}$ is the incident angle of the light ray, and ${\theta _{out}}$ is the exit angle. w denotes the index of the lens unit. There are eight state matrices, which represent the state of the light ray at different positions, as shown in Fig. 2.
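As a minimal numerical illustration of Eqs. (7)-(9), the sketch below assembles the transition and refraction matrices defined above and propagates a single ray state with NumPy. All geometric values are placeholders, and $\gamma$ is passed in as a fixed number here, whereas in the actual model it would be computed per ray from the incidence angle at the exit surface of LLS2.

```python
import numpy as np

# Sketch of the state-matrix propagation S_OBV = T4 R3 T3 R2 T2 R1 T1 S_BL (Eq. (9)).
def propagate(s_bl, f1, r1, r2, d1, d2, d3, d4, n, t1, t2, w, gamma):
    T1 = np.array([[-f1 / (d1 + r1), 0, f1 * w * t1 / (d1 + r1), w * t1],
                   [0, 1, 0, 0],
                   [0, 0, 1, 0],
                   [1, d1, 0, 0]])
    R1 = np.array([[1, 0, 0, 0],
                   [1 / (f1 + r1), 0, 0, -1 / (f1 + r1)],
                   [0, 0, 1, 0],
                   [0, 0, 0, 0]])
    T2 = np.array([[1, d2, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
    R2 = np.array([[1, 0, 0, 0],
                   [(1 - n) / (n * r2), 1 / n, (n - 1) / (n * r2) * w * t2, 0],
                   [0, 0, 1, 0],
                   [0, 0, 0, 0]])
    T3 = np.array([[1, r2 + d3, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
    R3 = np.array([[1, 0, 0, 0], [0, 1, gamma, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
    T4 = np.array([[1, d4, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
    SM = T4 @ R3 @ T3 @ R2 @ T2 @ R1 @ T1          # spatial modulation, Eq. (9)
    return SM @ s_bl

# Example ray leaving an LED at x = 0.05 mm with angle 0.02 rad (placeholder values):
s_bl = np.array([0.05, 0.02, 1.0, 1.0])            # Eq. (7)
s_obv = propagate(s_bl, f1=2.0, r1=1.0, r2=1.0, d1=10.0, d2=1.0,
                  d3=2.0, d4=400.0, n=1.49, t1=0.2, t2=0.5, w=3, gamma=0.01)
print("position u on the viewing plane:", s_obv[0])
```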

3. Optimization

One crucial aspect in evaluating the viewing experience of a stereoscopic display system is the characteristics of the EP. This article examines four EP characteristics based on the attributes of the sub-EPs. (1) EP position: it is essentially the average position of all sub-EPs, primarily determined by the coordinates of the light source and optical components, and can be adjusted by tuning the parameter x. (2) EP width: it is primarily determined by the level of superimposition of multiple sub-EPs on the viewing plane and can be adjusted by tuning the parameter ${d_1}$. (3) Relationship between brightness and angle at the EP: it is influenced by the brightness contribution of each steerable beam and can be adjusted by tuning the driving current of each LED. (4) Relationship between EP brightness and spatial position: it is determined by the brightness and width of individual sub-EPs, as well as the level of superimposition of multiple sub-EPs on the viewing plane. It determines the spacing between adjacent EPs and the crosstalk in stereoscopic viewing, and can be adjusted by tuning the parameters ${d_1}$ and ${d_2}$.

The first three characteristics can be considered static properties of the EP, which describe the properties within a single EP. The fourth characteristic, on the other hand, is a dynamic property that requires comprehensive consideration of all the EPs. These four indicators collectively influence the effectiveness of stereoscopic display. In the following section, numerical calculations based on the mathematical model presented in Section 2.2 are used to determine the distribution of rays constituting the sub-EPs, thereby obtaining evaluation functions for these four characteristics.

Since ${d_1} \gg {t_1}$, we can treat the energy of all rays incident on a single lens unit of LLS1 as equal. It can be approximately represented as:

$$I\left( {\rm \Theta } \right) = {\rm \; }I\left( \theta \right),\Theta \in \left[ {\theta -\theta _0,\; \theta + \theta _0} \right]$$
where $\theta $ is the angle of the light ray passing through the optical center of the lens unit, ${\theta _0}$ represents a small angular range that covers the lens unit. The status matrix for each ray emitted from the LED and reaching the viewing plane can be calculated by Eq. (9). Assuming that 80% of the rays in the ${w^{th}}$ sub-EP of the ${m^{th}}$ EP are distributed within a certain spatial range, the status matrices of the left and right boundaries of this range are denoted as $S_{m,w}^{8l}$ and $S_{m,w}^{8r}$, respectively.

For characteristic (1), we can calculate the center position ${u_m}$ of the ${m^{th}}$ EP as Eq. (11):

$${u_m} = \frac{1}{W}\mathop \sum \nolimits_{w = 1}^W {u_{m,w}} = \frac{1}{W}\mathop \sum \nolimits_{w = 1}^W \frac{{S_{m,w}^{8l}(1 )+ S_{m,w}^{8r}(1 )}}{2}$$
Here, the index 1 in $S_{m,w}^{8l}(1)$ denotes the first element of the status matrix.

For characteristic (2), the mean square error (MSE) of the W sub-EPs’ positions for the ${m^{th}}$ EP is represented as $MS{E_m}$, and the overall position error is represented as MSE:

$$MS{E_m} = \frac{1}{W}\mathop \sum \nolimits_{w = 1}^W {({{u_{m,w}} - {u_m}} )^2}$$
$$MSE = \frac{1}{M}\mathop \sum \nolimits_{m = 1}^M MS{E_m}$$
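Given the boundary state matrices $S_{m,w}^{8l}$ and $S_{m,w}^{8r}$ obtained from Eq. (9), the evaluation functions in Eqs. (11)-(13) reduce to a few array operations. The sketch below assumes the boundaries are stored as NumPy arrays of shape (M, W, 4); this storage layout is an assumption for illustration.

```python
import numpy as np

# Sketch of the evaluation functions in Eqs. (11)-(13).
# s8l, s8r: arrays of shape (M, W, 4) holding the final state matrices of the
# left/right 80% boundaries of the w-th sub-EP of the m-th EP.
def ep_centers_and_mse(s8l, s8r):
    u_mw = 0.5 * (s8l[..., 0] + s8r[..., 0])              # sub-EP centers (first element of S^8)
    u_m = u_mw.mean(axis=1)                               # EP centers, Eq. (11)
    mse_m = ((u_mw - u_m[:, None]) ** 2).mean(axis=1)     # per-EP spread, Eq. (12)
    return u_m, mse_m, mse_m.mean()                       # overall MSE, Eq. (13)
```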

According to Eq. (10), characteristic (3) can be sparsely represented by the brightness contribution from the ${w^{th}}$ steerable beam to the ${m^{th}}$ EP, as given in Eq. (14):

$$L_{m,w}^8 = \mathop \smallint \nolimits_\mathrm{\Omega }L({{\theta_{m,w}}} )d\omega = \mathop \smallint \nolimits_\mathrm{\Omega }\frac{{dI({{\theta_{m,w}}} )}}{{dA}}d\omega $$
${\theta _{m,w}}$ is the angle of ${S_3}$, $d\omega $ is a unit angle, $\mathrm{\Omega }$ is the angular range of the light strip, and $dA$ is the unit projected area along ${\theta _{m,w}}$.
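A simple way to evaluate Eq. (14) numerically is a discrete sum over the angular range of the light strip. In the sketch below, the intensity profile and projected-area term are hypothetical placeholders standing in for measured or ray-traced data.

```python
import numpy as np

# Discrete approximation of Eq. (14): brightness contributed by the w-th
# steerable beam, integrated over the angular range Omega of the light strip.
def beam_brightness(theta, intensity, projected_area):
    dw = np.gradient(theta)                 # unit angles d(omega)
    radiance = intensity / projected_area   # dI / dA
    return np.sum(radiance * dw)            # sum approximating the integral over Omega

theta = np.linspace(-0.05, 0.05, 101)       # angular range Omega (rad), placeholder
intensity = np.cos(theta) ** 2              # assumed intensity profile (placeholder)
projected_area = np.cos(theta)              # projected area along theta (placeholder)
print("L_mw ~", beam_brightness(theta, intensity, projected_area))
```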

The evaluation of characteristic (4) of the EP can be better understood by referring to Fig. 3. Figure 3(a) shows the relationship between the brightness distribution and the spatial position on the viewing plane for three sub-EPs as an example. When these sub-EPs are superimposed, they form the curve ${C_1} - {A_1} - {B_1} - {D_1}$ shown in Fig. 3(b). Assuming this curve represents the left EP, the corresponding right EP would be represented by the curve ${C_2} - {A_2} - {B_2} - {D_2}$. We aim to maximize the size of the smooth and observable region, represented by the A-B segment, of each EP. At the same time, we need to ensure that the positions of C and D do not exceed the positions of the adjacent EPs’ B and A, respectively. In an ideal scenario, this leads to minimal crosstalk. Assuming the desired spacing between the left and right EPs is E, the following condition must be satisfied:

$${\Delta _1} = E - \frac{{{B_2} - {A_2}}}{2} - \frac{{{B_1} - {A_1}}}{2} - ({{A_2} - {C_2}} )\ge 0$$

Fig. 3. Spatial brightness distribution. (a) Sub-EPs. (b) EPs for left eye and right eye.

The coordinates of each point in the equation are represented as follows: ${A_m} = \mathop {\max }\limits_{w = 1, \cdots W} S_{m,w}^{8l}(1 )$, ${B_m} = \mathop {\min }\limits_{w = 1, \cdots W} S_{m,w}^{8r}(1 )$, ${C_m} = \mathop {\min }\limits_{w = 1, \cdots W} S_{m,w}^{8l}(1 )$, ${D_m} = \mathop {\max }\limits_{w = 1, \cdots W} S_{m,w}^{8r}(1 )$.
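Using these boundary points, the constraint of Eq. (15) can be checked directly. The sketch below computes ${\Delta _1}$ for a left/right EP pair from the same (W, 4) boundary arrays as before; it is a sketch under those storage assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch of the constraint in Eq. (15) using the boundary points defined above.
# s8l_*, s8r_*: arrays of shape (W, 4) for the left and right EP of a pair;
# E: desired spacing between the left and right EPs.
def delta(s8l_left, s8r_left, s8l_right, s8r_right, E):
    A1, B1 = s8l_left[:, 0].max(), s8r_left[:, 0].min()     # left EP: A1, B1
    A2, B2 = s8l_right[:, 0].max(), s8r_right[:, 0].min()   # right EP: A2, B2
    C2 = s8l_right[:, 0].min()                              # right EP: C2
    return E - (B2 - A2) / 2 - (B1 - A1) / 2 - (A2 - C2)    # Eq. (15), must be >= 0
```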

EP characteristics (1), (2), and (4) can be optimized by adjusting the optical parameters ${d_1}$, x, and ${d_2}$. Consequently, we consider the evaluation functions of these three indicators as the objective functions of a nonlinear optimization problem, as shown in Eq. (16).

$$\min \mathrm{\mathbb{Z}}({{d_1},x,{d_2}} )= {\omega _1}f({{d_1}} )+ {\omega _2}g(x )+ {\omega _3}h({{d_2}} )$$
$$s.t.\textrm{}\left\{ {\begin{array}{{c}} {{d_1} \in ({10{t_1},100{t_1}} ]}\\ {x = \Delta {x_m} + ({k - 1} ){t_0},\Delta {x_m} \in [{0,100{t_1}^2/{f_1}\; } ]}\\ {m = 1,2, \cdots ,M\; and\; k = 1,2, \cdots ,K}\\ {{d_2} \in ({0,\textrm{}2{f_2}} )}\\ {h({{d_2}} )\ge 0}\\ {{\omega_1} \gg {\omega_2} \gg {\omega_3}} \end{array}} \right.$$
$$f({{d_1}} )= MS{E_m}$$
$$g(x )= |{{u_m} - {U_m}} |$$
$$h({{d_2}} )= {\Delta _m}$$
$\Delta {x_m}$ represents the coordinate offset of the LED corresponding to the ${m^{th}}$ EP, k represents the ${k^{th}}$ LED, and ${U_m}$ is the ideal position of the ${m^{th}}$ EP. The optimization process is illustrated in Fig. 4. The initialization values of ${d_1}$ and x are calculated with Eqs. (1)-(4) under linear conditions, and ${d_2}$ is set to ${f_2} - {r_2}$. A greedy algorithm is used to decompose the optimization problem into three sub-problems. ${\omega _1}$, ${\omega _2}$, and ${\omega _3}$ represent the priorities of the three optimization objective functions in the greedy algorithm, where a larger $\omega$ indicates a higher priority. These sub-problems are then solved sequentially to obtain local optimal solutions, and the final output is the local optimal solution of the overall problem.
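The greedy decomposition can be sketched as three sequential one-dimensional searches over the constraint set of Eq. (16). The solver below uses a coarse grid search purely for illustration; `f`, `g`, and `h` are assumed to be callables that wrap the ray-tracing model and return the values of Eqs. (17)-(19), and `g` is assumed to take the EP index m as a second argument.

```python
import numpy as np

# Greedy decomposition of Eq. (16): each sub-objective is minimized over its own
# parameter, in priority order (omega1 >> omega2 >> omega3), while the other
# parameters are held fixed.  A coarse grid search stands in for the actual 1-D solver.

def grid_min(objective, grid, feasible=lambda v: True):
    """Return the grid point with the smallest objective value among feasible points."""
    candidates = [(objective(v), v) for v in grid if feasible(v)]
    return min(candidates)[1]

def greedy_optimize(f, g, h, t1, f1, f2, M):
    # Sub-problem 1: minimize f(d1) = MSE_m over d1 in (10*t1, 100*t1]
    d1 = grid_min(f, np.linspace(10 * t1, 100 * t1, 200)[1:])
    # Sub-problem 2: minimize g(dx, m) = |u_m - U_m| for each EP m
    dx_grid = np.linspace(0.0, 100 * t1 ** 2 / f1, 200)
    dx = {m: grid_min(lambda v, m=m: g(v, m), dx_grid) for m in range(1, M + 1)}
    # Sub-problem 3: minimize h(d2) over d2 in (0, 2*f2) subject to h(d2) >= 0
    d2_grid = np.linspace(1e-6, 2 * f2, 400, endpoint=False)
    d2 = grid_min(h, d2_grid, feasible=lambda v: h(v) >= 0)
    return d1, dx, d2
```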

4. Experiments and results

4.1 Prototype setup

Based on the principles introduced previously, a prototype is built, as shown in Fig. 5, which consists of a directional backlight module, an LCD panel, and an eye tracking module. The eye tracking module adopts a One-Stage eye detection model guided by thermal infrared, which achieves a tracking speed of 6-8 ms and a tracking accuracy of 1.72 mm [28]. The directional backlight module mainly includes a high-density addressable LED array, a backlight control module (an STM32H7 array controlling the driving scheme and duty cycle, with TLC5927 chips as constant-current LED drivers), LLS1, a vertical diffusion film, LLS2, and PET film material used as an optical spacer. The eye tracking module sends the detected eye position information to the backlight control module, which turns on the corresponding backlight group. The display system parameters and optical component parameters are listed in Table 1.

Fig. 4. Optimization scheme based on the matrix representation of ray tracing with plenoptic function using greedy algorithm.

Fig. 5. Prototype of the proposed display system. (a) and (b) System structure for principle verification.

Table 1. Prototype parameters

The backlight, LLS1, and LLS2 are manually aligned according to the distribution of EPs on the viewing plane. Only one group of LEDs is turned on during the alignment process. When the EP observed on the viewing plane has vertical boundaries with a width of approximately 60 mm, and the display plane observed from the center of the EP is pure white, the three components are well aligned. If either condition is not met, the three components are not aligned and should be adjusted.

4.2 Numerical simulation results

According to Eq. (16), the optimization results are presented in Fig. 6. Figures 6(a)-(c) are boxplots depicting the distribution error between the sub-EPs and the ideal EP position for each EP. Purple represents the result obtained using the initialized values, yellow the result after optimizing ${d_1}$, blue the result after optimizing x, and red the result after optimizing both ${d_1}$ and x. The following conclusions can be drawn from the results:

  • (1) Optimizing the parameter ${d_1}$ leads to a more concentrated distribution of sub-EPs for the same EP, but it still deviates from the ideal position.
  • (2) Only optimizing the parameter x can reduce the position error of the sub-EPs from their ideal position. But as the viewing angle increases, the sub-EP position deviates from the ideal position more severely.
  • (3) After optimizing both the parameters ${d_1}$ and x, all the aforementioned issues can be addressed. The average distribution error of all sub-EPs of each EP is close to 0, indicating a relatively even distribution of the sub-EPs on both sides of the ideal EP position.

Fig. 6. Distribution error of sub-EPs. (a)∼(c) Boxplot of the distribution error between the sub-EP and the ideal EP position of each EP under different parameters. (d) The distribution error affected by the angle of the controllable beam before optimization. (e) The distribution error affected by the angle of the controllable beam after optimization.

The distribution error of the sub-EPs is primarily influenced by the angle of the controllable beam. This relationship is illustrated in Fig. 6(d) and (e), which depict the deviation of the sub-EPs from the ideal EP as a function of the controllable beam angle under the initial conditions and with optimized x and ${d_1}$, respectively. Under the initial conditions, the distance error of the sub-EPs from the ideal position increases with the controllable beam angle due to refraction. After parameter optimization, however, the degree of deviation is reduced. As a result, the visual quality of the entire display plane is ensured at each EP location.

Figure 7(a) illustrates the impact of changes in parameter ${d_2}$ on the sub-objective function $h({{d_2}} )$. The values considered here are those of h for the central EP (the fifth EP from left to right). It is observed that when ${d_2} = 0.995$ mm, the objective function satisfies $h \ge 0$ and reaches its minimum value. Based on this optimal value, the distribution of all EPs is obtained through numerical simulation, as shown in Fig. 7(b). Under ideal conditions, the effective range of each EP (the flat-top area in the figure) remains unaffected by the brightness of the surrounding EPs. As a result, crosstalk can be reduced to 0.

Fig. 7. (a) The impact of ${d_2}$ on the sub-objective function h. (b) Numerical simulated EP spatial brightness distribution.

Calculating with Eq. (14), the brightness uniformity of the display plane can be evaluated and adjusted at each viewpoint, as shown in Fig. 8. Figure 8(a) represents the simulated brightness distribution of the display plane when viewed from the leftmost EP. The brightness attenuates from left to right, with the highest attenuation being 75%. The adjusted brightness distribution is illustrated in Fig. 8(b), where the theoretical brightness uniformity reaches 98.73%. Figures 8(c)-(f) show the brightness distribution of the display plane for the ${2^{nd}}$ to ${5^{th}}$ EPs from the left, before and after adjustment, and the uniformity is improved in each case.

Fig. 8. Brightness uniformity of the display plane at different viewpoints. (a) and (b) for EP1 at the leftmost, (c) ∼ (f) for EP2 to EP5.

The adjacent EP distance is set to 60 mm, matching the interpupillary distance, as shown in Fig. 7(b). When the viewer moves dynamically in the horizontal direction, taking the left eye as an example, the changes in brightness and stereoscopic crosstalk are shown in Fig. 9(b) and (c), respectively. The dynamic brightness uniformity is 24%, and the highest stereoscopic crosstalk reaches 100%. However, the crosstalk within the optimal viewing area remains 0. To address these issues and improve the viewing experience, an alternative approach utilizes a high-density EP distribution with an EP distance of 30 mm. In this case, the dynamic brightness curve achieves a significantly higher uniformity of 92.6%, as shown in Fig. 9(a). Furthermore, the global stereoscopic crosstalk is theoretically zero.
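The dynamic brightness curve can be evaluated numerically from the simulated EP profiles. The sketch below assumes that at each eye position the backlight steers the nearest EP to the eye and that uniformity is taken as the min/max ratio of the resulting brightness envelope; both are assumptions of this sketch rather than definitions stated in the paper.

```python
import numpy as np

# Sketch of the dynamic brightness seen by a horizontally moving eye.
# x: eye positions (mm); ep_profiles: array (N_EP, len(x)) of simulated EP
# brightness curves; ep_centers: EP center positions (mm).
def dynamic_uniformity(x, ep_profiles, ep_centers):
    idx = np.abs(x[None, :] - ep_centers[:, None]).argmin(axis=0)  # nearest EP per position
    envelope = ep_profiles[idx, np.arange(len(x))]                 # brightness actually seen
    return envelope.min() / envelope.max()                         # assumed uniformity metric
```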

Fig. 9. The significance of high-density EP. (a) Dynamic brightness distribution with high-density EP and (b) with sparse EP. (c) Crosstalk with sparse EP.

To further validate the accuracy of the optimized parameters, a simulation model was established using the optical simulation software TracePro. Table 2 provides a comparison of the parameters before and after optimization. Figure 10(a) and (b) show the irradiance analysis of the EP at 0 mm and the EP at 90 mm after parameter optimization. Figure 10(c) presents the analysis of the EP at 0 mm before optimization, where significant deviation is observed for the small-angle EP, and the deviation becomes even more pronounced for the large-angle EP. After optimization, both the small-angle and large-angle EPs closely match the designed values.

Fig. 10. Sub-EP and EP brightness distribution of simulation in software at the desired viewing distance. (a) Center EP after optimization. (b) Edge EP after optimization. (c) Center EP before optimization.

Table 2. Parameters before and after optimization

4.3 Experimental results

The backlight brightness of the display system is adjusted according to Eq. (14). Figure 11 shows the viewing effect when the viewing position is 120 mm and the display shows a full white image. Figure 11(a) is the effect after dynamically adjusting the duty cycle of the LED drive current, while Fig. 11(b) shows the effect before adjustment. Without adjustment, the duty cycle of the drive current for each column of LEDs is 1/16, and the viewing brightness gradually decreases from right to left; the uniformity is 70%, and the LED drive signals for the leftmost and rightmost columns are shown in the embedded pictures. After adjustment, the duty cycle of the drive current for each column gradually decreases from 1/16 on the left side (LED-1) to 1/22 on the right side (LED-24), as shown in the embedded pictures, and the uniformity reaches 88%. Even at the edge EP, the brightness uniformity improves by a factor of 1.25.
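The column-wise duty cycles can be reproduced with a simple interpolation. The paper only states that the duty cycle decreases gradually from 1/16 (LED-1) to 1/22 (LED-24); linear interpolation of the duty cycle between those endpoints is an assumption of this sketch.

```python
import numpy as np

# Sketch of the per-column duty-cycle adjustment across the 24 LED columns,
# assuming a linear ramp of the duty cycle from 1/16 (LED-1) to 1/22 (LED-24).
duty = np.linspace(1 / 16, 1 / 22, 24)
for col, d in enumerate(duty, start=1):
    print(f"LED-{col:02d}: duty cycle = {d:.4f} (1/{1 / d:.1f})")
```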

Fig. 11. Experimental results of brightness adjustment of the display plane at the rightmost EP. (a) After optimization. (b) Before optimization. The embedded waveform represents the duty cycle of the LED drive signal.

Figure 12 shows the EPs projected onto the viewers’ eyes after being detected by the eye tracker. Figure 12(c) and (d) show the real-time detection results of the eye tracker. The display system can provide a stereo effect for at most two viewers.

Fig. 12. (a) and (b) are the projected EPs: in (a) the left image is black and the right image is white; in (b) the left image is white and the right image is black. (c) and (d) are the detection results.

The display effect observed by the viewer is shown in Fig. 13. Images (a) to (i) show the scene viewed from 9 different angles. The viewing distance is set at 400 mm, and the viewing zones are horizontally distributed within a range of -120 mm to 120 mm with a 30 mm interval to provide smooth motion parallax. The display brightness observed from each viewing angle is relatively uniform, including at large viewing angles, and there are no obvious dark areas at the edges of the display plane.

Fig. 13. Display images observed at different viewpoints.

The brightness distribution of the EPs is shown in Fig. 14. Each EP has a total width of 100 mm, with an area of approximately 30 mm exhibiting a uniform brightness distribution. The distance between adjacent EPs is 30 mm, which maintains a dynamic viewing brightness uniformity greater than 85% across the viewing area. Therefore, the viewer does not experience brightness changes or flicker while moving. The display’s theoretical crosstalk is zero. The crosstalk at position x is calculated by $C{T_i}(x )= \frac{{\mathop \sum \nolimits_{m \ne i} {I_m}(x )}}{{{I_i}(x )}}$, where $C{T_i}(x )$ represents the crosstalk for the ${i^{th}}$ EP at position x: the sum of the brightness of all other EPs at x divided by the brightness of the ${i^{th}}$ EP at x. In practice, the average crosstalk is 3%, which still satisfies the requirements for a stereoscopic display.
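The crosstalk formula above maps directly onto measured brightness curves. In the sketch below, `ep_profiles` is assumed to be an array of per-EP brightness curves sampled at the same positions x; the small `eps` term guards against division by zero where the ${i^{th}}$ EP is dark, and the averaging window is an assumption for illustration.

```python
import numpy as np

# Crosstalk CT_i(x): summed brightness of all other EPs at x divided by the
# brightness of the i-th EP at x.  ep_profiles has shape (N_EP, N_x).
def crosstalk(ep_profiles, i, eps=1e-12):
    others = ep_profiles.sum(axis=0) - ep_profiles[i]
    return others / (ep_profiles[i] + eps)

# Example: average crosstalk of the i-th EP over its nominal viewing window
# (e.g. the 30 mm flat-top region), selected by a boolean mask (an assumption).
def mean_crosstalk(ep_profiles, i, window_mask):
    return crosstalk(ep_profiles, i)[window_mask].mean()
```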

Fig. 14. (a) Brightness distribution of EP. (b) Brightness envelope. (c) Crosstalk distribution.

As the viewing distance deviates from the optimal position, the viewing experience degrades. Firstly, the position of the EP deviates from the set value, which might lead to a mismatch between the backlight control signal from the eye tracking module and the actual position of the EP (a position error of 2 mm at a viewing distance of 380 mm, and 5 mm at 440 mm). However, this problem can be solved with a stereo camera. Furthermore, the brightness uniformity across the viewing plane declines and crosstalk increases when the viewer deviates from the ideal viewing distance, as shown in Fig. 14(b) and (c). Therefore, the actual viewing distance range of the prototype is 380 mm to 440 mm, which ensures that the brightness uniformity stays above 80% and the crosstalk does not exceed 13%.

5. Conclusion

This paper proposes an autostereoscopic 3D display system that incorporates directional backlight and eye-tracking technology to achieve full-resolution, multi-user viewing. To further enhance the system performance, this paper presents an optimization scheme based on the matrix representation of ray tracing using the light field function. The light field on the observation plane can be used for quantitative characterization of the EP quality. Utilizing these evaluation functions as objective functions, a nonlinear programming task can automatically obtain a locally optimal solution for the optical parameters through a greedy algorithm. Numerical simulation demonstrates that the optimization scheme can improve the overall quality of the display system in areas such as brightness consistency, crosstalk, and the correspondence between the eye tracker and the backlight. We constructed a prototype based on the optimized optical parameters. The system provides 9 independent viewpoints at the optimum viewing distance of 400 mm, with an EP resolution of 1/30 and a viewing angle of ${\pm} 16.7^\circ $. The viewing distance range is 380 mm to 440 mm. At the optimum viewing distance of 400 mm, the brightness uniformity of the display at each EP is higher than 88%, while the brightness uniformity across the entire viewing plane reaches 85%. Theoretically, the crosstalk should be zero; our prototype introduces an average crosstalk of 3%, which is still very low for a stereoscopic display. Although this system is designed with spherical cylindrical lenses, the same dense-viewpoint scheme and optimization procedure can be applied to optimize the system parameters of other directional backlight optical systems based on non-ideal optical devices (e.g., optical systems with aberrations). However, when applying this backlight structure to displays of other sizes, challenges such as field of view, brightness uniformity, and Moiré fringes arise. In future research, we will enhance the display system performance and address these issues with feasible solutions.

Funding

National Key Research and Development Program of China (2022YFB3606600).

Acknowledgments

The authors acknowledge supports from National Key R&D Program of China (2022YFB3606600).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. R. Kamierczak and A. Szczepańska, “3D optical illusion as visualisation tools in spatial planning and development,” Sci. Rep. 12(1), 15730 (2022). [CrossRef]  

2. H. Ren, L. X. Ni, H. F. Li, et al., “Review on tabletop true 3D display,” J. Soc. Inf. Display 28(1), 75–91 (2020). [CrossRef]  

3. G. W. Chen, T. Q. Huang, Z. C. Fan, et al., “A naked eye 3D display and interaction system for medical education and training,” J. Biomed. Inform. 100, 103319 (2019). [CrossRef]  

4. W. W. Wang, G. X. Chen, Y. L. Weng, et al., “Large-scale microlens arrays on flexible substrate with improved numerical aperture for curved integral imaging 3D display,” Sci. Rep. 10(1), 11741 (2020). [CrossRef]  

5. Y. Xing, Y. P. Xia, S. Li, et al., “Annular sector elemental image array generation method for tabletop integral imaging 3D display with smooth motion parallax,” Opt. Express 28(23), 34706–34716 (2020). [CrossRef]  

6. R. Hirayama, D. M. Plasencia, N. Masuda, et al., “A volumetric display for visual, tactile and audio presentation using acoustic trapping,” Nature 575(7782), 320–323 (2019). [CrossRef]  

7. D. E. Smally, E. Nygaard, K. Squire, et al., “A photophoretic-trap volumetric display,” Nature 553(7689), 486–490 (2018). [CrossRef]  

8. P. A. Blanche, A. Bablumian, R. Voorakaranam, et al., “Holographic three-dimensional telepresence using large-area photorefractive polymer,” Nature 468(7320), 80–83 (2010). [CrossRef]  

9. B. Lee, D. Yoo, J. Jeong, et al., “Wide-angle speckleless DMD holographic display using structured illumination with temporal multiplexing,” Opt. Lett. 45(8), 2148–2151 (2020). [CrossRef]  

10. G. Wetzstein, D. Lanman, M. Hirsch, et al., “Tensor displays: compressive light field synthesis using multilayer displays with directional backlighting,” ACM Trans. Graph. 31(4), 1–11 (2012). [CrossRef]  

11. B. Y. Liu, X. Z. Sang, X. B. Yu, et al., “Time-multiplexed light field display with 120-degree wide viewing angle,” Opt. Express 27(24), 35728–35739 (2019). [CrossRef]  

12. K. Perlin, S. Paxia, and J. S. Kollin, “An autostereoscopic display,” in the 27th annual Conference on Computer Graphics & Interactive Techniques (2000), pp. 319–326.

13. K. Perlin, C. Poultney, J. S. Kollin, et al., “Recent advances in the NYU autostereoscopic display,” Proc. SPIE 4297, 196–203 (2001). [CrossRef]

14. P. Surman, R. S. Brar, I. Sexton, et al., “MUTED and HELIUM3D autostereoscopic displays,” IEEE International Conference on Multimedia and Expo, 1594–1599 (2010).

15. P. Surman, S. Day, E. Willman, et al., “HELIUM3D: A laser-scanned head-tracked autostereoscopic display,” 2011 International Conference on 3D Imaging (IC3D), 1–8 (2011).

16. R. S. Brar, P. Surman, I. Sexton, et al., “Laser-based head-tracker 3D display research,” J. Display Technol. 6(10), 531–543 (2010). [CrossRef]  

17. J. C. Liou and F. H. Chen, “Design and fabrication of optical system for time-multiplex autostereoscopic display,” Opt. Express 19(12), 11007–11017 (2011). [CrossRef]  

18. T. Q. Huang, B. X. Han, X. R. Zhang, et al., “High-performance autostereoscopic display based on the lenticular tracking method,” Opt. Express 27(15), 20421–20434 (2019). [CrossRef]  

19. Y. Meng, Y. Lyu, L. L. Chen, et al., “Motion parallax and lossless resolution autostereoscopic 3D display based on a binocular viewpoint tracking liquid crystal dynamic grating adaptive screen,” Opt. Express 29(22), 35456–35473 (2021). [CrossRef]  

20. Y. S. Hwang, F. K. Bruder, T. Fäcke, et al., “Time-sequential autostereoscopic 3-D display with a novel directional backlight system based on volume-holographic optical elements,” Opt. Express 22(8), 9820–9838 (2014). [CrossRef]  

21. C. H. Chen, Y. C. Yeh, and H. P. D. Shieh, “3-D mobile display based on Moiré-free dual directional backlight and driving scheme for image crosstalk reduction,” J. Display Technol. 4(1), 92–96 (2008). [CrossRef]  

22. H. Yoon, S. G. Oh, D. S. Kang, et al., “Arrays of Lucius microprisms for directional allocation of light and autostereoscopic three-dimensional displays,” Nat. Commun. 2(1), 455 (2011). [CrossRef]  

23. A. Hayashi, T. Kometani, A. Sakai, et al., “A 23-in. full-panel-resolution autostereoscopic LCD with a novel directional backlight system,” J. Soc. Inf. Display 18(7), 507–512 (2010). [CrossRef]  

24. D. Fattal, Z. Peng, T. Tran, et al., “A multi-directional backlight for a wide-angle, glasses-free 3D display,” 2013 IEEE Photonics Conference, 24–25 (2013).

25. S. Ishizuka, T. Mukai, and H. Kakeya, “Viewing zone of an autostereoscopic display with a directional backlight using a convex lens array,” J. Electron Imaging 23(1), 011002 (2014). [CrossRef]  

26. X. K. Li, J. Ding, H. T. Zhang, et al., “Adaptive glasses-free 3D display with extended continuous viewing volume by dynamically configured directional backlight,” OSA Continuum 3(6), 1555–1567 (2020). [CrossRef]  

27. P. Krebs, H. W. Liang, H. Fan, et al., “Homogeneous free-form directional backlight for 3D display,” Opt. Commun. 397, 112–117 (2017). [CrossRef]  

28. X. C. Li, Q. Q. Wu, B. P. Xiao, et al., “High-speed and robust infrared-guiding multiuser eye localization system for autostereoscopic display,” Appl. Opt. 59(14), 4199–4208 (2020). [CrossRef]  
