Wave optic modeling of semi-transparent textured curved surfaces

Abstract

Lightwave propagation through a multilayer structure of patterned semi-transparent curved surfaces is a fundamental issue for various display applications, ranging from practical industrial problems such as moiré patterns in touch panel displays to fundamental research issues in layered tensor displays and layered complex-modulation holographic displays. In this paper, a wave optic model of multiple layers of patterned semi-transparent curved surfaces is presented. The proposed model is based on the algorithmic framework used in the field of polygon computer-generated holograms, extended to incorporate the characteristic transmittance function of textured curved surfaces. As an industrial application, the proposed modeling method is used to numerically simulate the visual perception of the moiré patterns observed in layered structures of transparent phase-only patterned curved surfaces, as well as their variation with observation angle.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

As highly consumer-oriented functionality is integrated into displays, state-of-the-art display architectures are becoming increasingly versatile through the incorporation of a growing number of functional layers [1]. The combination of a flexible touch-screen functional layer and a high-resolution curved or flexible display panel is one such example, and it has become a mega-trend in industrial display technology. Such combinations of layered optical surface structures are found in numerous optical applications such as touch panel displays, parallax barrier displays, lenticular displays, integral imaging displays, multilayer tensor displays, and holographic displays [2–4]. Diffractive or holographic patterns on curved surfaces are promising advanced display applications [5]. However, the optical attachment of two different optical surface layers induces various optical and visual effects [6], some of which are desirable while others are unwanted from a practical point of view.

Display optics should cover not only the optical system of the display but also the viewer’s eye, and, from this perspective, a theoretical analysis of the visual perception of layered patterned curved surfaces is fundamental and widely applicable [7]. Furthermore, approaches to analyzing visual perception need to consider wave optic issues beyond the scope of the conventional geometric optics normally applied in display optics analysis. For instance, moiré effects in the display field are an issue particularly associated with this context. These patterns, which are noticeable to sensitive viewers, arise when the diffraction induced by the pixel structure of the display panel interacts optically with externally attached optical patterns. Transparent patterns, that is, phase-only patterns, can also generate visible moiré patterns through interaction with the periodic patterns of a display panel [8]. In particular, this type of moiré cannot be interpreted by conventional geometric optical analysis.

Therefore, in this paper, we develop a wave optic model of semi-transparent textured curved surfaces. The proposed method is based on wave optic theory, particularly polygon computer-generated hologram (CGH) theory [9]. Within this theoretical framework, a situation in which a viewer observes a curved surface can be analyzed as if the viewer were looking at a meshed surface composed of triangular facets. The computational wave optic model is constructed for light field transmission through a generally curved surface layer structure with semi-transparent transmittance characteristics and for its visual perception. The model focuses on the visual perception of the viewer’s observation, i.e., the light field distribution on the viewer’s retina plane. This paper is organized as follows. In Section 2, the wave optic model of the light field generated by layered surfaces featuring characteristic surface transmittance functions is described. In Section 3, the results of the simulation of the visual perception of various curved moiré patterns are presented. In Section 4, concluding remarks are given.

2. Wave optic modeling of optical curved surfaces represented by connected textured polygons

In this section, first, the fundamental wave optic model of an optical curved surface represented by connected textured triangular polygons is developed. Figure 1 schematically illustrates the model of a viewer’s eye watching a triangular facet. Without loss of generality, it is assumed that the eye is on the z-axis, that is, the line of sight coincides with the z-axis. In Fig. 1, the geometric imaging transformation of the human eye is illustrated, where the triangular facet P1P2P3 in object space is mapped to the triangular facet Pr1Pr2Pr3 in the retina space of the viewer’s eye. For the three apex points of the triangular facet, (x_1, y_1, z_1), (x_2, y_2, z_2), and (x_3, y_3, z_3), the apex points of the imaged triangular facet are obtained as (x_{r1}, y_{r1}, z_{r1}), (x_{r2}, y_{r2}, z_{r2}), and (x_{r3}, y_{r3}, z_{r3}), respectively. The center of mass of the triangular facet and the focal length of the eye lens are denoted by (x_c, y_c, z_c) and f, where f is given by 1/f = 1/(d_1 - z_c) + 1/d_2. Let (x_{rc}, y_{rc}, z_{rc}) denote the center of mass of the triangle Pr1Pr2Pr3 in retina space corresponding to the center (x_c, y_c, z_c) in object space. The imaging point of an apex point (x, y, z) of the object-space triangular facet is then given by (x_r, y_r, z_r) = (-D_2 x / D_1, -D_2 y / D_1, D_2 - d_2), where D_1 = d_1 - z and D_2 = 1/(1/f - 1/D_1). The triangles P1P2P3 and Pr1Pr2Pr3 define two planes in object and retina space, respectively, described by cos φ sin θ (x - x_c) + sin φ sin θ (y - y_c) + cos θ (z - z_c) = 0 (object space) and cos φ_r sin θ_r (x_r - x_{rc}) + sin φ_r sin θ_r (y_r - y_{rc}) + cos θ_r (z_r - z_{rc}) = 0 (retina space).
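The geometric imaging step above can be summarized in a short numerical sketch. The following Python fragment (an illustrative sketch, not the authors' code) maps the apex points of an object-space facet to retina space using the reconstructed relations 1/f = 1/(d_1 - z_c) + 1/d_2 and (x_r, y_r, z_r) = (-D_2 x/D_1, -D_2 y/D_1, D_2 - d_2); the numerical values of d_1, d_2, and the apex coordinates are assumptions chosen only for demonstration.

import numpy as np

def image_apex(p, d1, d2, f):
    """Map an object-space point p = (x, y, z) to retina space through a
    thin lens at z = d1, with the retina plane at z = d1 + d2."""
    x, y, z = p
    D1 = d1 - z                       # object distance from the eye lens
    D2 = 1.0 / (1.0 / f - 1.0 / D1)   # image distance behind the lens
    m = -D2 / D1                      # lateral magnification (image is inverted)
    return np.array([m * x, m * y, D2 - d2])

d1, d2 = 1.0, 0.024                   # facet center 1 m away, 24 mm eye depth (assumed)
zc = 0.0
f = 1.0 / (1.0 / (d1 - zc) + 1.0 / d2)   # eye focused on the facet center
apexes = [np.array([0.01, 0.0, 0.0]),
          np.array([0.0, 0.01, 0.0]),
          np.array([-0.01, -0.01, 0.0])]
print([image_apex(p, d1, d2, f) for p in apexes])   # z_r = 0: the facet is in focus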

Fig. 1 Geometric observation model of watching an elementary triangular facet.

Each triangular facet is assumed to be dressed with its own texture. The texture image is transported to retina space by the geometric imaging transformation, which is expressed based on the collinearity condition. In Fig. 1, the object point (x, y, z), the projection center (0, 0, d_1), and the mapping point (x_r, y_r, z_r) in retina space lie on a single line, and this relationship is expressed by

\[
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
= \begin{pmatrix} 0 - x_r \\ 0 - y_r \\ d_1 - (d_1 + d_2 + z_r) \end{pmatrix} t
+ \begin{pmatrix} x_r \\ y_r \\ d_1 + d_2 + z_r \end{pmatrix},
\tag{1}
\]
where the parameter t is obtained by substituting Eq. (1) into the plane equation of the facet,
\[
t = \frac{\cos\phi \sin\theta \,(x_c - x_r) + \sin\phi \sin\theta \,(y_c - y_r) + \cos\theta \,(z_c - z_r - d_1 - d_2)}
{\cos\phi \sin\theta \,(-x_r) + \sin\phi \sin\theta \,(-y_r) + \cos\theta \,\bigl(d_1 - (d_1 + d_2 + z_r)\bigr)}.
\tag{2}
\]
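For the back-projection of Eqs. (1) and (2), a minimal sketch is given below; it computes the ray parameter t for a retina point and intersects the ray with the facet plane. This is an illustrative Python fragment, not the implementation used in the paper, and the test values are arbitrary assumptions.

import numpy as np

def ray_parameter_t(xr, yr, zr, xc, yc, zc, theta, phi, d1, d2):
    """Parameter t of Eq. (2): intersection of the line through the retina
    point (xr, yr, d1 + d2 + zr) and the projection center (0, 0, d1) with
    the facet plane of normal angles (theta, phi) through (xc, yc, zc)."""
    num = (np.cos(phi) * np.sin(theta) * (xc - xr)
           + np.sin(phi) * np.sin(theta) * (yc - yr)
           + np.cos(theta) * (zc - zr - d1 - d2))
    den = (np.cos(phi) * np.sin(theta) * (-xr)
           + np.sin(phi) * np.sin(theta) * (-yr)
           + np.cos(theta) * (d1 - (d1 + d2 + zr)))
    return num / den

def back_project(xr, yr, zr, xc, yc, zc, theta, phi, d1, d2):
    """Object-space point of Eq. (1) corresponding to the retina point."""
    t = ray_parameter_t(xr, yr, zr, xc, yc, zc, theta, phi, d1, d2)
    start = np.array([xr, yr, d1 + d2 + zr])
    direction = np.array([0.0 - xr, 0.0 - yr, d1 - (d1 + d2 + zr)])
    return direction * t + start

# Consistency check with assumed values: the back-projected point lies on the
# facet plane, so the residual of the plane equation is ~0.
theta, phi, d1, d2 = np.deg2rad(10.0), np.deg2rad(30.0), 1.0, 0.024
p = back_project(1e-4, 2e-4, 1e-5, 0.0, 0.0, 0.0, theta, phi, d1, d2)
n = np.array([np.cos(phi) * np.sin(theta), np.sin(phi) * np.sin(theta), np.cos(theta)])
print(np.dot(n, p))   # ~0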
Next, let the local coordinate systems of the triangular facets P1P2P3 and Pr1Pr2Pr3 be (x', y', 0) and (x'_r, y'_r, 0), respectively. Each triangular facet has its own local coordinate system with the origin set at its center of mass. The local coordinates of a point in object space are obtained from its global coordinates as
\[
\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}
= \begin{pmatrix}
\cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\
-\sin\phi & \cos\phi & 0 \\
\sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta
\end{pmatrix}
\begin{pmatrix} x - x_c \\ y - y_c \\ z - z_c \end{pmatrix},
\tag{3}
\]
where θ and ϕ are the longitudinal and azimuthal angles denoted in the object space of Fig. 1. The local coordinate of the corresponding point in retina space is given by
\[
\begin{pmatrix} x'_r \\ y'_r \\ z'_r \end{pmatrix}
= \begin{pmatrix}
\cos\theta_r\cos\phi_r & \cos\theta_r\sin\phi_r & -\sin\theta_r \\
-\sin\phi_r & \cos\phi_r & 0 \\
\sin\theta_r\cos\phi_r & \sin\theta_r\sin\phi_r & \cos\theta_r
\end{pmatrix}
\begin{pmatrix} x_r - x_{rc} \\ y_r - y_{rc} \\ z_r - z_{rc} \end{pmatrix},
\tag{4}
\]
where θ_r and ϕ_r are the longitudinal and azimuthal angles denoted in retina space in Fig. 1. The geometric mapping between the local coordinate points (x', y', z') and (x'_r, y'_r, z'_r) reads as
\[
\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}
= \begin{pmatrix}
\cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\
-\sin\phi & \cos\phi & 0 \\
\sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta
\end{pmatrix}
\left[
\begin{pmatrix} 0 - x_r \\ 0 - y_r \\ d_1 - (d_1 + d_2 + z_r) \end{pmatrix} t
+ \begin{pmatrix} x_r - x_c \\ y_r - y_c \\ d_1 + d_2 + z_r - z_c \end{pmatrix}
\right],
\tag{5}
\]
where t is given by Eq. (2) and (x_r, y_r, z_r) is obtained by inverting Eq. (4) as
\[
\begin{pmatrix} x_r \\ y_r \\ z_r \end{pmatrix}
= \begin{pmatrix}
\cos\theta_r\cos\phi_r & -\sin\phi_r & \sin\theta_r\cos\phi_r \\
\cos\theta_r\sin\phi_r & \cos\phi_r & \sin\theta_r\sin\phi_r \\
-\sin\theta_r & 0 & \cos\theta_r
\end{pmatrix}
\begin{pmatrix} x'_r \\ y'_r \\ z'_r \end{pmatrix}
+ \begin{pmatrix} x_{rc} \\ y_{rc} \\ z_{rc} \end{pmatrix}.
\tag{6}
\]
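The coordinate transforms of Eqs. (3)-(6) reduce to applying one rotation matrix and its transpose. A minimal sketch, assuming the sign placement reconstructed above and arbitrary demonstration values, is:

import numpy as np

def rotation_global_to_local(theta, phi):
    """Rotation matrix of Eqs. (3) and (4) (sign placement as reconstructed
    above): global-to-local coordinates of a facet with longitudinal angle
    theta and azimuth phi."""
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(phi), np.sin(phi)
    return np.array([[ct * cp, ct * sp, -st],
                     [-sp,     cp,      0.0],
                     [st * cp, st * sp,  ct]])

theta, phi = np.deg2rad(25.0), np.deg2rad(40.0)   # assumed facet orientation
R = rotation_global_to_local(theta, phi)
print(np.allclose(R.T @ R, np.eye(3)))            # True: the inverse of Eq. (6) is the transpose

center = np.array([0.0, 0.0, -0.3])               # assumed facet center of mass
point = np.array([0.01, -0.02, -0.29])            # assumed point near the facet
local = R @ (point - center)                      # Eq. (3): global -> local
print(np.allclose(R.T @ local + center, point))   # True: local -> global, cf. Eq. (6)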
The above geometric transport mapping expresses a point (x, y) on the facet surface as a function of (x_r, y_r) through the plane equation, i.e., x = x(x_r, y_r) and y = y(x_r, y_r). Therefore, given a texture pattern on the triangular facet, the transported texture pattern is numerically obtained using the mapping (x, y) = (x(x_r, y_r), y(x_r, y_r)) and its inverse, (x_r, y_r) = (x_r(x, y), y_r(x, y)). Let us define the characteristic transmittance function on the retina local domain as T(x_r, y_r) and represent it by the angular spectrum integral in the local coordinate system as
\[
T(x_r, y_r) = \iint A_r(\alpha_r, \beta_r)\, e^{j2\pi(\alpha_r x_r + \beta_r y_r)}\, d\alpha_r\, d\beta_r .
\tag{7}
\]
In polygon CGH theory [9], the textured light pattern is represented in the global retina coordinate system by
\[
W(x_r, y_r, z_r) = \iint A_G(\alpha_r, \beta_r)\, e^{j2\pi(\alpha_r x_r + \beta_r y_r + \gamma_r z_r)}\, d\alpha_r\, d\beta_r ,
\tag{8}
\]
where the angular spectrum of the retina global field, AG(αr,βr), is given by

\[
\begin{aligned}
A_G(\alpha_r, \beta_r) = {}& \eta_0\, e^{j2\pi(\alpha_{r0} x_{rc} + \beta_{r0} y_{rc} + \gamma_{r0} z_{rc})}\,
A_L\!\bigl(\alpha'_r(\alpha_r, \beta_r) - \alpha'_{r0}(\alpha_{r0}, \beta_{r0}),\,
\beta'_r(\alpha_r, \beta_r) - \beta'_{r0}(\alpha_{r0}, \beta_{r0})\bigr) \\
& \times H\!\bigl(\gamma'_r(\alpha_r, \beta_r)\bigr)\,
e^{j2\pi\bigl(\alpha_r(-x_{rc}) + \beta_r(-y_{rc}) + \gamma_r(-z_{rc})\bigr)}\,
\left|\cos\theta_r + \frac{\sin\theta_r(\alpha_r\cos\phi_r + \beta_r\sin\phi_r)}{\gamma_r}\right| .
\end{aligned}
\tag{9}
\]
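A minimal numerical sketch of the angular spectrum representation in Eqs. (7) and (8) is given below: the texture (transmittance) of a facet is Fourier transformed with the FFT and propagated with the angular spectrum kernel. The local-to-global spectrum remapping of Eq. (9) is omitted, and the grid size, pitch, wavelength, and texture are illustrative assumptions.

import numpy as np

# Assumed sampling parameters and texture; these are illustrative only.
wavelength = 532e-9
N, pitch = 512, 2e-6
x = (np.arange(N) - N // 2) * pitch
X, Y = np.meshgrid(x, x)
texture = (np.sin(2 * np.pi * X / 20e-6) > 0).astype(float)   # hypothetical line texture

# Eq. (7) via FFT: local angular spectrum of the textured facet.
A = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(texture)))
freqs = np.fft.fftshift(np.fft.fftfreq(N, d=pitch))
AX, BX = np.meshgrid(freqs, freqs)                            # alpha_r, beta_r
gamma = np.sqrt(np.maximum(1.0 / wavelength**2 - AX**2 - BX**2, 0.0))

# Eq. (8)-style evaluation of the field a distance z away (evanescent waves dropped).
z = 1e-3
W = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(A * np.exp(1j * 2 * np.pi * gamma * z))))
print(W.shape)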

Based on the above theory, we simulated a color textured curved surface consisting of polygons [10]. The polygon model of the curved surface consists of 24 polygon facets, as shown in Fig. 2, and is located 1 m from the viewer in free space. By applying the polygon model, we obtained the mapped texture on the curved surface in the local retina coordinate system [11] and calculated the optical field distribution using Eqs. (7) and (8). When calculating the textured polygon surface, we employed the fast Fourier transform (FFT). Although some theoretical research has attempted to realize the analytic calculation of holographic fields of textured polygon surfaces [12], its applicability to generalized texture patterns remains limited. The texture example shown in Fig. 2(a) is a periodic pattern of three RGB lines. In this simulation result, we found dark line defects contaminating the surface texture at the borders between adjacent triangular facets of the connected polygon surface, as indicated in Fig. 2(b).

Fig. 2 (a) Simulation of color textured curved surface with a line-defect and (b) its polygon facet model.

As an initial approach to the line-defect problem, we first analyze a simpler non-textured curved surface consisting of 8 bare triangular facets. The curved surface is located 30 cm from the viewer, as shown in Fig. 3(a), and the image observed by the viewer’s eye, formed on the retina plane, is calculated. Figure 3(b) presents the resulting numerical simulation, wherein line defects are apparent along the border lines between neighboring polygon facets. It is well known that when the angular spectrum of the phase-matched triangular polygon is calculated by the analytic method [10,13], the line defects can be removed effectively. However, an analytic formula for arbitrarily textured triangular facets has not yet been developed and remains a challenging problem.

Fig. 3 (a) Simulation schematic of a simple polygon surface, (b) observation of the non-textured simple polygon surface with line defects, (c) intuitive analysis of the cause of line defect generation, and a comparison of (d) the analytic calculation and (e) the FFT-based calculation.

Thus, the present analytic angular spectrum method is very limited in expressing general textures of polygon facets. Therefore, for general textured objects, the use of the FFT is both popular and inevitable. However, a problem arises when using the FFT to calculate general textured triangular facets. When the phase-regularization technique is applied to FFT polygon calculations, it does not work, as presented in Figs. 2(a) and 3(b). The discrete approximation of the continuous angular spectrum integral taken in FFT polygon calculations induces numerical errors that lead to the line defect problem.

The angular spectrum of a tilted triangular facet has a characteristic shape of three line branches, each along the direction perpendicular to the corresponding side of the triangle, as illustrated in Fig. 3(c) [14]. When the triangular facet is tilted in space, the branches of the angular spectrum form distorted curvy lines. Step 1, Step 2, and Step (1 + 2) in Fig. 3(c) present the calculation of a simple polygon surface composed of two triangular facets connected by a shared edge. The angular spectrum branches of the edges of the left triangle, L1, L2, and L3, are denoted by A1, A2, and A3, respectively, while in Step 2 those of the right triangle, L4, L5, and L6, are denoted by A4, A5, and A6. In the analytic method, the angular spectrum contributions of the shared edges are exactly removed in Step (1 + 2), which is mathematically expressed by

\[
A_2 + A_4 = 0,
\tag{10}
\]
meaning that the contributions of the two shared edges L2 and L4 precisely cancel each other out and the line artifact does not develop, as was proven in previous research [10,13].

However, in the FFT calculation, the angular spectra of the shared edges do not precisely cancel each other out because of the considerable numerical errors induced by the discrete approximation of the FFT, which causes a non-zero line artifact at the border of the two adjacent polygons. This non-cancellation of the shared edges L2 and L4 is expressed in terms of the angular spectrum as follows:

\[
(A_2 + \varepsilon_2) + (A_4 + \varepsilon_4) \neq 0,
\tag{11}
\]
where the numerical noise of the angular spectra A_2 and A_4 is denoted by ε_2 and ε_4, respectively. In practice, for most industrial applications, including the wave optic analysis of curved or flat surfaces, such line artifacts are unacceptable.

Therefore, a method for the removal of the FFT-induced line artifacts is necessary. As a solution, we propose a technique that partially filters the angular spectrum. The filtering principle and the practical process are presented in Fig. 4(a). In the first step, the shared edges of the unit polygons of a given polygon mesh surface are identified. In the second step, the branches of the partial angular spectra corresponding to the shared edges are selected and extracted. For arbitrary spatially tilted triangular facets, we can deterministically trace their angular spectrum branches in an analytic manner based on our previous work [14]. Finally, in the third step, the selected branch of the angular spectrum that represents the shared edges in the spatial domain is removed. It is essential to note that the low-frequency part of the spectrum must be preserved during the filtering process because it contains the essential part of the texture pattern. A numerical example of the proposed partial filtering process for a single polygon of the target object is presented in Fig. 4(b), which shows the filtered angular spectrum. Figure 4(c) compares an observed image featuring line artifacts produced by the conventional method with the one featuring much-reduced line artifacts produced by the proposed method. As another example, for the R/G/B line textured curved surface, we apply the filtering process to all polygons and calculate the observed image of the curved surface with line defect removal. The proposed filtering process presented in Fig. 4(b) achieves successful line defect removal, and when Fig. 5(b) is compared with the result of the conventional method shown in Fig. 5(a), the improvement achieved by removing the line defects is apparent.
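The partial filtering step can be sketched as a frequency-domain mask that suppresses the narrow spectral band along the branch direction of a shared edge while always preserving the low-frequency region. The following Python fragment is an illustrative interpretation of this procedure, not the authors' implementation; the function name, the band half-width, and the protected radius are assumptions.

import numpy as np

def filter_shared_edge_branch(A, freqs, edge_angle, half_width, keep_radius):
    """Zero out the angular-spectrum branch attributed to a shared edge.

    A           : 2-D angular spectrum of one facet (complex array)
    freqs       : 1-D spatial-frequency axis used for both dimensions
    edge_angle  : in-plane orientation of the shared edge (radians); its
                  spectral branch is taken along the perpendicular direction,
                  as illustrated in Fig. 3(c)
    half_width  : half-width of the suppressed band (frequency units, assumed)
    keep_radius : low-frequency radius that is always preserved, since the
                  low-frequency signal carries the essential texture content
    """
    AX, BX = np.meshgrid(freqs, freqs)
    branch = np.array([np.cos(edge_angle + np.pi / 2.0), np.sin(edge_angle + np.pi / 2.0)])
    # Distance of each frequency sample from the line through the origin
    # along the branch direction (magnitude of the 2-D cross product).
    dist = np.abs(branch[0] * BX - branch[1] * AX)
    radius = np.hypot(AX, BX)
    mask = (dist < half_width) & (radius > keep_radius)
    return np.where(mask, 0.0, A)

# Hypothetical usage with the spectrum A and axis freqs from the sketch above:
# A_filtered = filter_shared_edge_branch(A, freqs, np.deg2rad(30.0), 2e3, 5e3)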

Fig. 4 (a) Filtering process of the angular spectrum with line defects and (b) its simulation. (c) A comparison of the conventional and proposed methods.

Fig. 5 Full-color simulation results by (a) the conventional method and (b) the proposed method.

3. Layered optical surfaces with a characteristic surface transmittance function

The wave optic modeling of curved surfaces developed in the previous section is now extended into a model of semi-transparent textured curved surfaces. It is used to analyze the visual perception of the moiré patterns observed in layered structures composed of transparent patterned curved surfaces and to analyze the diverse variation of those patterns with changes in observation angle. In practice, electrode-patterned transparent touch panel sheets can be interpreted as phase-only modulation surfaces. In general, a moiré pattern appears when two or more curved surface layers possessing periodic amplitude modulation patterns overlap. However, the following simulation shows that strong moiré patterns can also be generated by transparent sheets possessing periodic phase-modulation patterns such as those normally engraved on transparent electrode sheets. If the surfaces or sheets are transparent or semi-transparent and feature phase modulation characteristics, the wave optic model should be applied carefully, since such transparent sheets are not effectively analyzed by the geometric optics methods popularly applied in industry. Figure 6(a) illustrates the observation of a double layer of amplitude-modulated curved surfaces separated by a finite distance. The viewer sees a moiré pattern through the layered surface. The wave optic analysis algorithm for this typical structure is schematically described in Fig. 6(a), and the moiré pattern simulation results are presented in Figs. 6(b)-6(d), in which the variation of the moiré pattern with respect to the viewer’s focus is simulated. According to the proposed analysis scheme, an algorithm for calculating the optical field of a moiré pattern on the retina plane of the viewer is devised, where the patterns of the front and rear curved surfaces are denoted by u_1(x, y, z) and u_2(x, y, z), respectively. The corresponding optical fields at the retina plane are denoted by F_1(x, y) and F_2(x, y), respectively. In this scheme, the forward wave propagation from object space to the retina plane of the viewer and the backward wave propagation from the retina plane to object space are described complementarily by the cascaded Fresnel transform (CdFr) and the inverse cascaded Fresnel transform (ICdFr), respectively [14,15]. The cascaded Fresnel transform pair is an essential theoretical part of the wave optic modeling of vision systems. CdFr is the serially cascaded transformation composed of a first Fresnel transform section from the object-space plane to the eye lens plane and a second Fresnel transform section from the eye lens plane to the retina plane in the eyeball. CdFr is used for the wave optic imaging of object space onto the retina plane, through which most wave optic effects related to the human vision system, such as the accommodation effect, can be simulated. ICdFr is the backward propagation transform from the retina plane to a specified plane in object space, i.e., the mathematical inverse of CdFr. ICdFr is used for designing the holographic field at a specified object plane that eventually produces a given image on the retina plane. To find the holographic field at a specific object plane that generates a specific image on the retina plane, we perform ICdFr with that retina-plane image as input.
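A minimal sketch of the CdFr/ICdFr pair is given below, assuming the transfer-function form of the Fresnel propagator on a fixed sampling grid and a thin-lens phase factor; it ignores the object-to-retina coordinate scaling and the constant phase terms that a full implementation would carry, and the function names are hypothetical.

import numpy as np

def fresnel_propagate(u, wavelength, pitch, z):
    """Fresnel propagation over distance z (transfer-function method); the
    constant phase factor exp(j*k*z) is omitted consistently."""
    N = u.shape[0]
    f = np.fft.fftfreq(N, d=pitch)
    FX, FY = np.meshgrid(f, f)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

def lens_phase(N, pitch, wavelength, focal):
    """Thin-lens phase factor centered on the sampling grid."""
    x = (np.arange(N) - N // 2) * pitch
    X, Y = np.meshgrid(x, x)
    return np.exp(-1j * np.pi * (X**2 + Y**2) / (wavelength * focal))

def cdfr(u, wavelength, pitch, L, d_eye, f_eye):
    """CdFr sketch: object plane -> eye lens (distance L) -> retina (d_eye)."""
    N = u.shape[0]
    u_lens = fresnel_propagate(u, wavelength, pitch, L) * lens_phase(N, pitch, wavelength, f_eye)
    return fresnel_propagate(u_lens, wavelength, pitch, d_eye)

def icdfr(F, wavelength, pitch, L, d_eye, f_eye):
    """ICdFr sketch: exact numerical inverse of cdfr on the same grid."""
    N = F.shape[0]
    u_lens = fresnel_propagate(F, wavelength, pitch, -d_eye) * np.conj(lens_phase(N, pitch, wavelength, f_eye))
    return fresnel_propagate(u_lens, wavelength, pitch, -L)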
Let us assume that the distance from the retina plane to the eye lens is denoted by d_eye. The front and rear patterns are placed at distances L_1 and L_2 from the eye lens, respectively. Let us define the holographic fields ū_1(x, y, z = L_1) of the curved surface field distribution u_1(x, y, z) and ū_2(x, y, z = L_1) of the other curved surface field distribution u_2(x, y, z) at the z = L_1 plane. The holographic field of the front surface, ū_1(x, y), is obtained by the inverse cascaded Fresnel transform [14,15]

\[
\bar{u}_1(x, y) = \mathrm{ICdFr}\{ F_1(x, y);\, L_1,\, d_{\mathrm{eye}},\, f_{\mathrm{eye}}(L_1) \},
\tag{12}
\]
where L_1 and d_eye are the distance between the front surface and the eye lens and that between the eye lens and the retina plane, respectively, and the eye focus is set to f_eye(L_1) = 1/(1/L_1 + 1/d_eye). When the viewer changes the eye focus to z = L_2, the observed image becomes defocused, and the observed optical field at the retina plane is then calculated by the forward cascaded Fresnel transform of the holographic field ū_1(x, y),
\[
F_1(x, y) = \mathrm{CdFr}\{ \bar{u}_1(x, y);\, L_1,\, d_{\mathrm{eye}},\, f_{\mathrm{eye}}(L_2) \},
\tag{13}
\]
where the focal length of the eye lens is adjusted to f_eye(L_2) = 1/(1/L_2 + 1/d_eye). The first, second, and third parameters of CdFr and ICdFr are the distance from the object plane to the eye lens plane, the distance from the eye lens plane to the retina plane, and the focal length of the eye lens, respectively [14,15]. The eye lens focus can vary to adjust the viewer’s attention to a certain plane in object space. Similarly to the front surface, the holographic field ū_2(x, y) of the rear surface u_2(x, y, z) at z = L_1 is obtained by using ICdFr,
\[
\bar{u}_2(x, y) = \mathrm{ICdFr}\{ F_2(x, y);\, L_1,\, d_{\mathrm{eye}},\, f_{\mathrm{eye}}(L_2) \}.
\tag{14}
\]
Here, the optical field ū_2(x, y) is the diffraction field distribution of the rear surface pattern at the front surface position z = L_1, as represented in the lower panel of Fig. 6(a). The in-focus image of the rear surface seen in Fig. 6(c) is obtained by F_2(x, y) = CdFr{ū_2(x, y); L_1, d_eye, f_eye(L_2)}, where the eye focus is set to z = L_2. It becomes out of focus when the eye focus is set to z = L_1; in this case, the defocused field distribution is obtained by F_2(x, y) = CdFr{ū_2(x, y); L_1, d_eye, f_eye(L_1)}.
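Using the hypothetical cdfr/icdfr helpers sketched above, the focus-dependent observation of Eqs. (11)-(14) can be outlined as follows; the grid parameters and the placeholder retina fields are assumptions for illustration only.

import numpy as np

# Assumed grid and geometry (illustrative values, not those of the paper);
# cdfr/icdfr are the hypothetical helpers sketched above.
wavelength, pitch, N = 532e-9, 4e-6, 1024
d_eye, L1, L2 = 0.024, 0.30, 0.35
f_at = lambda L: 1.0 / (1.0 / L + 1.0 / d_eye)    # f_eye(L) used in Eqs. (11)-(14)

F1 = np.ones((N, N), dtype=complex)               # placeholder retina field of the front surface
F2 = np.ones((N, N), dtype=complex)               # placeholder retina field of the rear surface

u1_bar = icdfr(F1, wavelength, pitch, L1, d_eye, f_at(L1))         # Eq. (11): front hologram at z = L1
F1_defocus = cdfr(u1_bar, wavelength, pitch, L1, d_eye, f_at(L2))  # Eq. (12): front surface seen out of focus
u2_bar = icdfr(F2, wavelength, pitch, L1, d_eye, f_at(L2))         # Eq. (14): rear hologram at z = L1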

Fig. 6 (a) Schematic illustration of the observation process for multiple layers of semi-transparent surfaces. Results of the numerical simulation of the observation of (b) the front surface, (c) the rear surface, (d) the masked rear pattern, and (e) its moiré pattern with a focus on z=L1.

The front surface is supposed to have its own complex texture pattern, which can be modeled by the transmittance function of the front surface. The characteristic transmittance function of the front surface is given by T_1(x, y), which is an inverted projection image of the front-surface field ū_1(x, y). This means that, in this model, the bright part of ū_1(x, y) occludes the rear surface while the dark part of ū_1(x, y) is actually transparent to the rear surface. Here, the two curved surfaces are supposed to emit coherent light fields from their bright parts, as shown in Figs. 6(b) and 6(c). In object space, part of the optical field from the rear surface is filtered by the transmittance function of the front surface. The overlapped image of the front curved surface and the occluded second surface in the plane z = L_1 is represented by

\[
\bar{M}(x, y) = \bar{u}_1(x, y) + T_1(x, y)\, \bar{u}_2(x, y),
\tag{15}
\]
and the corresponding field in the retina plane is represented by the field pattern M(x, y) = F_1(x, y) + T_1(x, y) F_2(x, y). The multiplication operation in Eq. (15) is similar to the silhouette masking method [16] used to implement the occlusion effect in CGHs. The occluded rear surface image at the focus of z = L_1 is presented in Fig. 6(d). M(x, y) is understood to be an overlapped image with the eye focused on the front surface at z = L_1, as seen in Fig. 6(e). The viewer perceives the moiré pattern of the two curved surfaces focused at L_1 by watching the defocused rear surface through the focused semi-transparent front surface. In addition, the optical field M̄(x, y) can be considered a holographic field representing the overlapped image of the two curved surfaces. When the viewer changes the eye focus to the rear surface at z = L_2, the viewer sees a different moiré pattern, represented by

\[
M(x, y; z = L_2) = \mathrm{CdFr}\{ \bar{M}(x, y);\, L_1,\, d_{\mathrm{eye}},\, f_{\mathrm{eye}}(L_2) \}.
\tag{16}
\]
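Continuing with the hypothetical variables of the previous sketches (u1_bar, u2_bar, cdfr, f_at, and the grid parameters), the occlusion masking and refocusing of Eqs. (15) and (16) can be outlined as follows; the construction of the mask T1 from ū_1 is an assumed illustration of the inverse-projection masking described above.

import numpy as np

# Continuing the hypothetical sketch above: u1_bar and u2_bar are the
# holographic fields of Eqs. (11) and (14) at z = L1, and cdfr/f_at are the
# helpers defined earlier. The mask below is an assumed illustration of the
# inverse-projection transmittance T1(x, y).
T1 = 1.0 - np.abs(u1_bar) / (np.abs(u1_bar).max() + 1e-12)    # bright parts of u1_bar occlude the rear surface
M_bar = u1_bar + T1 * u2_bar                                  # Eq. (15): masked overlap at z = L1
M_L2 = cdfr(M_bar, wavelength, pitch, L1, d_eye, f_at(L2))    # Eq. (16): moiré image refocused to z = L2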

In Fig. 7, the situation in which a viewer sees a curved surface with a moiré pattern generated by crossed tilted periodic patterns from several directions is addressed. The viewer sees the same curved surface, but the perceived moiré patterns vary with the observation direction. The observation angle is changed from 6° to 24° along the longitudinal direction, as shown in Fig. 7(b). The simulation results show that the curvature of the surface becomes more noticeable and that the moiré pattern changes as the viewing angle increases. Figure 7(c) shows the accommodation effect of the moiré pattern at different viewer foci. When the viewer focuses at 1.98 m (the front panel), a clear moiré pattern is obtained. On the other hand, blurred moiré patterns are observed when the viewer focuses at 0.5 m and 0.2 m from the eye in free space, with vertical color bands developing around the left and right edges of the surfaces. In Fig. 8, the effect of the proposed line defect filtering is tested in the curved surface moiré simulation. As in Fig. 5, the moiré pattern simulated by the conventional method without the filtering process presents line defects crossing the curved panel, whereas filtering the line defects on the curved surface effectively removes them from the simulated moiré pattern. Since, in industrial applications, this type of non-physical numerical defect is unacceptable, the modeling of curved surfaces requires the line defect filtering described in this paper.

Fig. 7 (a) Two curved surfaces which are integrated into the object space are observed at various observation angles. (b) Color moiré simulation results for various observation angles and (c) variation in the accommodation effect with a change of focus.

Fig. 8 Moiré patterns of two curved surfaces calculated by (a) the conventional method and (b) the proposed method. The line defects visible in the result of the conventional non-filtering method are removed in the result of the proposed method.

Figure 9 presents an industrial application that may produce a complicated moiré pattern. The moiré patterns produced by a bottom layer of a flexible organic light emitting diode (OLED) display and an upper flexible touch screen panel (TSP) are investigated. In this case, the ITO electrode of the touch screen panel is visually transparent, so the conventional geometric optical analysis is not applicable and the wave optic analysis must be employed. The specifications of the typical PenTile pixel pattern of the OLED display and the ITO pattern of the TSP are presented in Figs. 9(a) and 9(b), respectively. The red, green, and blue wavelengths of the RGB OLED are set to 632 nm, 532 nm, and 473 nm, respectively.

Fig. 9 Industrial example: (a) 1543 × 1896 periodic patterns of OLED pixels and (b) 4714 × 1811 periodic ITO patterns are used in this example. (c) The moiré pattern formed by the layered OLED pixel and ITO structure. In this simulation, only the green pixels of the OLED are assumed to be turned on; the other colors are turned off.

The OLED pixel pattern of 162 μm × 174 μm and the ITO pattern of 138 μm × 70 μm are supposed to cover a slightly curved rectangular surface area of 33 cm × 22 cm in the form of periodic patterns. The OLED pixel pattern and the ITO pattern are represented by u_2(x, y, z) and u_1(x, y, z), respectively. The characteristic function of the front surface, T_1(x, y), is given by the holographic field of u_1(x, y, z) at z = L_1, ū_1(x, y), which represents the phase modulation function of the slightly curved transparent ITO-patterned front surface. In this simulation, the thickness and refractive index of the ITO pattern on the TSP are assumed to be 50 nm and 1.9, respectively. In order to calculate the moiré pattern over the 33 cm by 22 cm area, a simulation resolution of 50001 × 50001 is necessary, which means that the light field variables in this analysis, such as F_1, F_2, ū_1, and ū_2, are 50001 × 50001 complex matrices. The resultant moiré pattern of the layered structure, with only the green pixels turned on, is shown in Fig. 9(c). Periodic stripe moiré patterns in the vertical direction caused by the layered structure are observed in Fig. 9(c). We sometimes found that the shape of the simulated moiré pattern can appear different on different monitors. If the resolution of the picture is larger than the resolution of the monitor, the computer may distort the moiré by interpolating or deleting image data according to the displayed picture size. In the simulation result, the moiré pattern can be decomposed into low-frequency and high-frequency parts. The low-frequency part of the moiré pattern delivers a visible stripe pattern that deteriorates the display image quality. However, in general, the removal of this type of moiré pattern is difficult in practical industrial display applications since this annoying moiré pattern is very sensitive to small mismatches in panel alignment.
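The phase-only character of the ITO layer follows from its optical path difference. A minimal sketch, assuming a thin-element approximation in which each ITO-covered sample adds a phase of 2π(n − 1)d/λ with the stated d = 50 nm and n = 1.9, is given below; the binary placeholder pattern is an assumption, not the actual ITO layout.

import numpy as np

# Thin-element sketch of the phase-only ITO transmittance: an ITO-covered
# sample adds a phase of 2*pi*(n - 1)*d/lambda (d = 50 nm, n = 1.9 as stated).
n_ito, d_ito = 1.9, 50e-9
wavelengths = {"R": 632e-9, "G": 532e-9, "B": 473e-9}

def ito_transmittance(pattern, wavelength):
    """pattern: binary array, 1 where ITO is present, 0 elsewhere (assumed form)."""
    phase = 2.0 * np.pi * (n_ito - 1.0) * d_ito / wavelength
    return np.exp(1j * phase * pattern)

pattern = (np.random.rand(256, 256) > 0.5).astype(float)   # placeholder ITO layout
for name, lam in wavelengths.items():
    T1 = ito_transmittance(pattern, lam)
    print(name, round(float(np.angle(T1).max()), 3))        # phase step per color, roughly 0.45-0.6 rad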

4. Concluding remarks

We have presented a theoretical framework for layered or overlapped curved surfaces with surface texture patterns representing amplitude or phase transmittance functions. The numerical analysis method was applied to simulate complicated moiré patterns on layered curved surfaces and the industrial problem of the moiré patterns expected in flexible OLED displays with TSP panels. The numerical analysis of moiré patterns in industrial applications has been developed in this paper, but the technical removal of the moiré patterns remains a difficult problem that necessitates extended research. The proposed model can be useful for such practical research on this problem. Furthermore, the proposed model is extensively applicable to the image quality analysis of general multilayered displays such as liquid crystal displays with TSP, holographic three-dimensional displays, and tensor displays, as well as flexible OLEDs with TSP.

Funding

Samsung Display Co. Ltd.

References

1. M. S. Sarwar, Y. Dobashi, C. Preston, J. K. M. Wyss, S. Mirabbasi, and J. D. W. Madden, “Bend, stretch, and touch: Locating a finger on an actively deformed transparent sensor array,” Sci. Adv. 3(3), e1602200 (2017).

2. Y.-M. Ji and J.-H. Park, “Dual layered display that presents auto-stereoscopic 3D images to multiple viewers in arbitrary positions,” J. Soc. Inf. Disp. 24(10), 641–650 (2016).

3. D. K. G. de Boer, M. G. H. Hiddink, M. Sluijter, O. H. Willemsen, and S. T. de Zwart, “Switchable lenticular based 2D/3D displays,” Proc. SPIE 6490, 64900R (2007).

4. G. Wetzstein, D. Lanman, M. Hirsch, and R. Raskar, “Tensor displays: compressive light field synthesis using multilayer displays with directional backlighting,” ACM Trans. Graph. 31, 80 (2012).

5. L. Zhou, A. Wanga, S.-C. Wu, J. Sun, S. Park, and T. N. Jackson, “All-organic active matrix flexible display,” Appl. Phys. Lett. 88(8), 083502 (2006).

6. H. Song, G. Sung, S. Choi, K. Won, H.-S. Lee, and H. Kim, “Optimal synthesis of double-phase computer generated holograms using a phase-only spatial light modulator with grating filter,” Opt. Express 20(28), 29844–29853 (2012).

7. A. Roudaut, H. Pohl, and P. Baudisch, “Touch input on curved surfaces,” in Proceedings of the CHI ’11 SIGCHI Conference on Human Factors in Computing Systems (2011), pp. 1011–1020.

8. Y. Kim, G. Park, J.-H. Jung, J. Kim, and B. Lee, “Color moiré pattern simulation and analysis in three-dimensional integral imaging for finding the moiré-reduced tilted angle of a lens array,” Appl. Opt. 48(11), 2178–2187 (2009).

9. H. Kim, J. Hahn, and B. Lee, “Mathematical modeling of triangle-mesh-modeled 3D surface objects for digital holography,” Appl. Opt. 47, D117 (2008).

10. J.-H. Park, H.-J. Yeom, H.-J. Kim, H. Zhang, B. Li, Y.-M. Ji, and S.-H. Kim, “Removal of line artifacts on mesh boundary in computer generated hologram by mesh phase matching,” Opt. Express 23(6), 8006–8013 (2015).

11. E. Moon, M. Kim, J. Roh, H. Kim, and J. Hahn, “Holographic head-mounted display with RGB light emitting diode light source,” Opt. Express 22(6), 6526–6534 (2014).

12. Y.-M. Ji, H. Yeom, and J.-H. Park, “Efficient texture mapping by adaptive mesh division in mesh-based computer generated hologram,” Opt. Express 24(24), 28154–28169 (2016).

13. D. Im, E. Moon, Y. Park, D. Lee, J. Hahn, and H. Kim, “Phase-regularized polygon computer-generated holograms,” Opt. Lett. 39(12), 3642–3645 (2014).

14. D. Im, J. Cho, J. Hahn, B. Lee, and H. Kim, “Accelerated synthesis algorithm of polygon computer-generated holograms,” Opt. Express 23(3), 2863–2871 (2015).

15. J. Roh, K. Kim, E. Moon, S. Kim, B. Yang, J. Hahn, and H. Kim, “Full-color holographic projection display system featuring an achromatic Fourier filter,” Opt. Express 25(13), 14774–14782 (2017).

16. K. Matsushima, M. Nakamura, and S. Nakahara, “Silhouette method for hidden surface removal in computer holography and its acceleration using the switch-back technique,” Opt. Express 22(20), 24450–24465 (2014).
