Optica Publishing Group

Generation of 360° three-dimensional display using circular-aligned OLED microdisplays

Open Access

Abstract

A 360° all-around multiview three-dimensional (3D) display system based on coarse-pitch circular-aligned OLED microdisplays is proposed. The magnified virtual color images projected from the microdisplays serve as stereo images, which create separate eyeboxes for the viewer. By inserting baffles, a transitional stereo image assembled from two spatially complementary segments of adjacent stereo images is presented to a complementary fusing zone (CFZ) located between adjacent eyeboxes. For a moving observation point, the spatial ratio of the two complementary segments evolves gradually, resulting in continuously changing transitional stereo images and thus overcoming the problem of discontinuous motion parallax. This controllable light-ray fusing technology, enabled by the inherently large divergence angle of OLED pixels, greatly decreases the number of display panels required for 360° multiview 3D display. A prototype system with only 67 full-color OLED microdisplays is set up to demonstrate 360° 3D color display. The developed system is free from mechanical moving elements, high-speed components and diffusion screens.

© 2015 Optical Society of America

1. Introduction

Enabling viewers to perceive position-dependent three-dimensional (3D) scenes from all directions, 360° display has long been pursued in 3D technology. Based on the “voxel”, volumetric 3D display technologies inherently have a 360° viewing region. Each “voxel” is physically located at the spatial position where it is supposed to be, reflecting/emitting light from that position in all directions to form a real image in the eyes of viewers. Such “voxels” can provide all depth cues. However, since these “voxels” stay on display no matter which direction the viewer looks from, the displayed objects appear transparent. For the majority of display applications, the accuracy, efficiency and effectiveness of 3D visualization tasks are hindered by 3D displays without hidden-line removal [1]. The holographic technique provides natural spatial 3D effects, but the space-bandwidth characteristics of commercially available spatial light modulators lead to a limited field of view (FOV) and display size. Spatial tiling of multiple spatial light modulators [2], time-multiplexing of a high-speed spatial light modulator [3,4], or a hybrid of the two [5] have been developed for a larger FOV or display size, but a practical holographic 3D display with both moderate FOV and moderate display size has not been reported. By presenting position-dependent views of a 3D scene to viewers, the multiview display can offer both stereo and motion parallax depth cues. Due to its compatibility with existing plane-image systems based on flat two-dimensional (2D) display panels [6], multiview-technology-based 360° display systems have flourished. Roughly, they can be categorized into four types: high-speed projector, multi-projection, hybrid projection and rotating-LED-array systems [7]. In the high-speed projector system, many images are projected onto a rotating screen in a time-sequential manner and then redirected to surrounding viewpoints [8–11].
In order to obtain dense-pitch viewpoints for a continuous 3D effect, a super-high-speed projector is crucial for this type of system. To alleviate the extravagant demand on projector speed, a hybrid system was proposed by introducing several projectors into the high-speed type system [12]. Recently, H. Urey developed a hybrid multiview display system by designing an ingenious rotating retro-reflective diffuser screen. Although it is not a 360° display system, the technique deserves emphasis here due to its merit of a large viewing region along both the horizontal (700 mm) and vertical (500 mm) directions simultaneously, achieved with only two non-high-speed projectors [13]. The rotating-LED-array system is based on parallax panoramagram technology [14]. The LED array rotates coaxially inside a cylindrical parallax barrier, and every LED pixel switches dynamically according to its angular position. Through the parallax barrier, a stereo image fused in time by LED pixels rotating through different angular positions is presented to the eye at the corresponding positions. All three of the above types of 360° multiview display systems need mechanical moving elements to rotate a screen or an LED array. Such sophisticated mechanical elements introduce noise and heavily decrease the system's reliability. The multi-projection system, implemented by S. Yoshida et al. [15], is free from such mechanical moving elements and high-speed components. A great number of circular-aligned projectors are employed to project images onto a static curved screen. At any viewpoint, the pupil collects fractional slit-like images from different projectors to form an appropriate stereo image. To eliminate black stripes between slit-like images resulting from the physical interval between adjacent projectors, the curved screen must diffuse the incident rays over a certain horizontal angle. But large-scale diffusion brings obvious crosstalk, chromatic dispersion and non-uniformity of the display luminosity [16]. Therefore, dense-pitch projectors are needed for less diffusion. Experimentally, the latest prototype of S. Yoshida's system used 120 projectors to cover a viewing area of 150°, only about two-fifths of the ideal 360° [17]. R. Otsuka et al. proposed another multi-projection system with a relatively simple structure, but a rotating screen was used and the number of displayed stereo images was too limited to obtain continuous motion parallax [18].

In this paper, a novel 360° multiview 3D display system belonging to the multi-projection category is proposed. A group of circular-aligned projecting units project magnified virtual images as stereo images into the 3D display zone. Each unit is constructed from an OLED microdisplay, an imaging lens and a pair of baffles. By blocking part of the microdisplay light-rays with the baffles, complementary fusing zones (CFZs) are generated, in which continuously changing transitional stereo images, assembled from two spatially complementary segments of adjacent stereo images, are presented to a moving pupil. Mechanical moving elements, high-speed components and diffusion screens are not needed in the proposed system. In contrast to the method of S. Yoshida, our method implements a real 360° viewing region using coarse-pitch projecting units. Since no diffusion screen is used, chromatic dispersion is avoided, making the proposed system suitable for color 3D display.

2. Proposed 360° multiview display system

A projecting unit k of the proposed system is schematically drawn in Fig. 1(a); it is composed of an OLED microdisplay, an imaging lens and a pair of baffles. The optical axis of the imaging lens passes perpendicularly through the geometrical center of the microdisplay. Through the imaging lens, a magnified virtual color image projected from the microdisplay serves as the stereo image. Here, the stereo image k projected from the microdisplay k is imaged onto a specific zone EkE'k, centered on the point O in the display plane k. The lens k is processed into a rectangular shape, with two marginal points Mk-1 and Mk along the horizontal direction. By attaching baffles to the vertical sides of the imaging lens, the light-rays from the microdisplay that exceed the corresponding imaging lens are blocked. According to geometric optics, two kinds of zones exist in the viewing region: the eyebox of the whole stereo image and the SVZk, as shown in Fig. 1(a). Following the definition in refs [19,20], an eyebox of the whole stereo image denotes a virtual box suspended in the air within which the user's eye can see the whole stereo image. In this paper, the eyebox is a 3D space denoted as VZk. For an observation point in the VZk zone, the whole stereo image k displayed on the whole EkE'k region is visible. But when the observation point moves into the adjacent SVZk zone, a segment of the EkE'k region becomes invisible. For example, for an observation point A in the SVZk zone, only the segment DkE'k of the stereo image k is visible, denoted Pk→A in the figure. Dk is the intersection point of the visual line AMk with the display plane k.

Fig. 1 Optical arrangement of the proposed 360° multiview 3D display system with only two adjacent projecting units being drawn for simplicity: (a) Structure of the projecting unit k; (b) The adjacent projecting unit k + 1 with an off-set angle to the unit k; (c) Diagram of the viewing region when two adjacent projecting units are activated simultaneously.


Let the point O be the circle center, and let another projecting unit k+1 be arrayed along the circumference, as shown in Fig. 1(b). The offset angle between the two projecting units is precisely designed to guarantee end-to-end alignment of the imaging lenses. A baffle and a marginal point Mk are shared by the two adjacent projecting units. The stereo image k+1 projected from the microdisplay k+1 appears on a new display plane k+1, with Ek+1 and E'k+1 as the new marginal points. Geometrically, Ek, E'k, Ek+1 and E'k+1 all lie on a circle that determines the available zone for stereo images. If the observation point A discussed above is also within SVZk+1, as shown in Fig. 1(b), the segment Ek+1Dk+1 of the stereo image k+1, denoted Pk+1→A, is visible to this point. Dk+1 is the intersection point of the visual line AMk with the display plane k+1. That is, the points Dk and Dk+1 both lie on the visual line AMk.

Figure 1(c) shows the situation when the two adjacent projecting units are activated simultaneously. According to geometric optics, there exists an overlapping zone between the two SVZs of the adjacent projecting units, which is called the CFZ in this paper. A zone between a CFZ and an adjacent VZ is named a PVZ. In Fig. 1(c), these zones are denoted with the subscript k, k+1 or k~k+1 according to their ownership. For any observation point in a VZ zone, e.g. VZk, the stereo image k displayed on the whole EkE'k region is visible, as discussed above. But for the point A in the CFZ zone, the segment Pk→A of stereo image k and the segment Pk+1→A of stereo image k+1 are presented simultaneously. Seen along the visual line, the two segments link up at Dk (Dk+1), spatially tiling a transitional stereo image. When the observation point A in the CFZ moves, the connection point Dk (Dk+1) moves in linkage with it, but in the reverse direction. For example, when the observation point moves clockwise toward A', the connection point Dk (Dk+1) shifts counterclockwise toward D'k (D'k+1). In the observed transitional stereo image, the segment from stereo image k shrinks to the G'kD'k zone and the segment from stereo image k+1 expands to the D'k+1Gk+1 zone; they are denoted Pk→A' and Pk+1→A' respectively in Fig. 1(c).

Subsequently, when the observation point moves into a PVZ zone, e.g. PVZk, a small segment of the EkE'k region becomes invisible. The maximum invisible segment is the EkGk region, reached when the observation point arrives at the farthest sideline of the PVZk zone. Gk is the intersection point of the line MkEk+1 with the display plane k, and G'k is the point symmetric to Gk with respect to the point O. To guarantee that a full-size stereo image is visible from any observation point of a PVZ, the GkG'k part of the EkE'k region (the Gk+1G'k+1 part of the Ek+1E'k+1 region) is taken as the effective display region for the stereo image k (k+1) in the proposed system. As a result, the effective zone for all stereo images shrinks to a region on the horizontal plane with a radius R3 equal to half the length of GkG'k, as shown in Fig. 1(c). Actually, the amount of shrinkage is negligible when Δθ is not too large. Under this condition, a displayed image confined to the effective zone is wholly visible from any observation point in a compound zone constructed from one VZ flanked by its two adjacent PVZs, which is denoted CVZ. Then, with the help of the continuously changing transitional stereo image, the observed image evolves from the complete stereo image k to the complete stereo image k+1 in a spatial “point by point” manner as the observation point moves from CVZk to CVZk+1.

As an inherent characteristic, the light emission of an OLED pixel presents a very large divergence angle. For the above two circular-aligned projecting units, after transformation through the imaging lenses, the intensity distribution of light from all OLED pixels in the corresponding CVZ and CFZ is approximately homogeneous along the circumference centered at the point O. As a result, the observed image shows no obvious light intensity fluctuation as an observation point scans across these zones.

Table 1 lists the system parameters and variables. The magnification of the stereo image is β = -v/u. According to geometric optics, the horizontal angular ranges of the three kinds of zones are calculated as:

Table 1. Definitions of symbols

$$\theta_{VZ} = 2\arctan\!\left(\frac{\beta d_x - D}{2v}\right)$$
$$\theta_{PVZ} = \arcsin\!\left(\frac{0.5\,\beta d_x \cos(0.5\Delta\theta)}{v - 0.5\,\beta d_x \sin(0.5\Delta\theta)}\right) - \arcsin\!\left(\frac{0.5\,\beta d_x \cos(0.5\Delta\theta)}{v + 0.5\,\beta d_x \sin(0.5\Delta\theta)}\right)$$
$$\theta_{CFZ} = 2\arcsin\!\left(\frac{0.5\,\beta d_x \cos(0.5\Delta\theta)}{v + 0.5\,\beta d_x \sin(0.5\Delta\theta)}\right) \tag{1}$$
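As a numerical sanity check, Eq. (1) can be evaluated directly. The sketch below (Python) implements the three zone angles; the parameter values used in the example are not taken from the tables of this paper but are illustrative assumptions chosen to be consistent with the zone angles reported later for the prototype.

```python
import math

def zone_angles(beta_dx, D, v, delta_theta):
    """Horizontal angular extents (degrees) of the VZ, PVZ and CFZ zones, Eq. (1).

    beta_dx    : horizontal size of a stereo image (|magnification| x panel width), mm
    D          : horizontal width of the rectangular imaging lens, mm
    v          : image distance (radius of the lens circle), mm
    delta_theta: angular pitch between adjacent projecting units, degrees
    """
    half = math.radians(delta_theta / 2)
    theta_vz = 2 * math.degrees(math.atan((beta_dx - D) / (2 * v)))
    s_minus = math.degrees(math.asin(
        0.5 * beta_dx * math.cos(half) / (v - 0.5 * beta_dx * math.sin(half))))
    s_plus = math.degrees(math.asin(
        0.5 * beta_dx * math.cos(half) / (v + 0.5 * beta_dx * math.sin(half))))
    theta_pvz = s_minus - s_plus
    theta_cfz = 2 * s_plus
    return theta_vz, theta_pvz, theta_cfz

# Assumed illustrative values (beta*d_x = 37.84 mm, D = 22.5 mm, v = 240 mm):
vz, pvz, cfz = zone_angles(beta_dx=37.84, D=22.5, v=240.0, delta_theta=360 / 67)
print(f"VZ = {vz:.4f} deg, PVZ = {pvz:.4f} deg, CFZ = {cfz:.4f} deg")
```

With these assumed inputs the three angles come out close to the prototype values quoted in Section 3 (about 3.66°, 0.0335° and 9.0°).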

Replacing the observation point by a pupil of diameter 2d, a connection point becomes a region, denoted the blending region (BR) in this paper. As shown in Fig. 2, for a pupil in CFZk~k+1 with marginal points B1 and B2, the marginal points of the BR, i.e. D1(k) and D2(k) on the display plane k, and D1(k+1) and D2(k+1) on the display plane k+1, lie on the extension lines of B1Mk and B2Mk. Outside the BR, the G'kD2(k) segment of stereo image k and the D1(k+1)Gk+1 segment of stereo image k+1 are visible to the pupil. Within the BR, however, the content presented to the pupil is a hybrid of the two stereo images' contents in spatially varying proportions through an interpolation fitting. Specifically, along a visual line MkQkQk+1, the content presented by the BR is

$$I_{Q_kQ_{k+1}} = \alpha_{Q_k} I_{Q_k} + \alpha_{Q_{k+1}} I_{Q_{k+1}} \tag{2}$$
where $I_{Q_k}$ and $I_{Q_{k+1}}$ represent the contents at the point Qk from the stereo image k and at the point Qk+1 from the stereo image k+1, respectively. The weight factors $\alpha_{Q_k}$ and $\alpha_{Q_{k+1}}$ depend on the positions of the points Qk and Qk+1, evolving spatially and continuously:

Fig. 2 Optical diagram showing the observed transitional stereo image when the pupil is covered by a CFZ.


$$\alpha_{Q_k} = \frac{\overline{D_{2(k)}Q_k}}{\overline{D_{1(k)}D_{2(k)}}}, \qquad \alpha_{Q_{k+1}} = \frac{\overline{Q_{k+1}D_{1(k+1)}}}{\overline{D_{1(k+1)}D_{2(k+1)}}} \tag{3}$$

So, as the visual line changes from D1(k)D1(k+1) to D2(k)D2(k+1), the image content presented along the corresponding visual line within the BR changes gradually in space from stereo image k to stereo image k+1. Compared with the transitional stereo image presented to an ideal observation point, which is a tiling of two segments from the respective stereo images, an additional hybrid BR is inserted into the transitional stereo image for a pupil in a CFZ.
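The blending rule of Eqs. (2)-(3) amounts to a linear interpolation across the BR. A minimal sketch, assuming a 1-D coordinate along the display plane running from D1 to D2 (the function and variable names are illustrative, not from the paper):

```python
def blended_intensity(q, d1, d2, i_k, i_k1):
    """Content along a visual line crossing the blending region at position q.

    d1, d2 are the positions of the BR marginal points D1 and D2 on a 1-D axis
    along the display plane; i_k and i_k1 are the contents of stereo images
    k and k+1 at the crossing points. Implements Eq. (2) with the weights
    of Eq. (3)."""
    alpha_k = (d2 - q) / (d2 - d1)    # weight of stereo image k: 1 at D1, 0 at D2
    alpha_k1 = (q - d1) / (d2 - d1)   # weight of stereo image k+1: 0 at D1, 1 at D2
    return alpha_k * i_k + alpha_k1 * i_k1

# At the BR edges the blend reduces to a single image's content:
print(blended_intensity(0.0, 0.0, 1.0, 10.0, 20.0))  # 10.0 (pure image k)
print(blended_intensity(1.0, 0.0, 1.0, 10.0, 20.0))  # 20.0 (pure image k+1)
```

The two weights always sum to one, so a constant scene reproduces itself exactly across the BR.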

Actually, when a pupil moves from a CVZ to its adjacent CVZ, another situation exists: a pupil straddling adjacent CVZ and CFZ. By the same analysis, the observed transitional stereo image is then a spatial tiling of segments from one stereo image and a BR.

Equations (2)-(3) mathematically indicate that, along a visual line, the content presented to a moving pupil undergoes a gradual temporal transition as the BR sweeps across. For example, the visual line D2(k+1)D2(k) in Fig. 2 is at the upper edge of the BR and presents the content of the stereo image k. But when the pupil moves clockwise by a distance of 2d (the diameter of the pupil), the visual line D2(k+1)D2(k) becomes the lower edge of the BR, and the presented content changes to that of the stereo image k+1. In short, the above analysis clearly shows that the observed transitional stereo image changes continuously from the stereo image k to the stereo image k+1 as the pupil moves from VZk to the adjacent VZk+1. This evolution takes place not only in a spatial “point by point” way over the whole observed transitional stereo image, but also in a gradual temporal way in the content presented at each display point.

Using multiple projecting units to build a full circle of projectors in the proposed system, all CVZs and CFZs together form a 360° viewing region. Through the continuous changing of transitional stereo images, a 360° multiview 3D display with continuous motion parallax is implemented.

3. Experiments and results

A display system is set up to implement the idea described above, as shown in Fig. 3. To show the inner structure more clearly, six lenses and five baffles were removed for the photograph. Full-color OLED microdisplays from OLEiD of China are used. The imaging lenses are processed into a rectangular shape of 22.5 × 39 mm², and the chosen imaging lenses are achromatic for color 3D image display. In total, 67 OLED microdisplays are assembled in a full-circle arrangement in the prototype system.

Fig. 3 Photograph of the experimental display system.


Table 2 summarizes the parameters used in the prototype system. According to Eq. (1), θVZ = 3.6601°, θPVZ = 0.0335° and θCFZ = 8.9997°. The shrinkage of the display size is evaluated as 2(R2 - R3) = 2v·sinθPVZ = 0.28 mm, a rather small value compared with the available zone for stereo images. Along the radius direction, the radial viewing range of viewers is determined by the maximum size of the CVZ, which is D/(2tan(0.5θVZ - θPVZ)) = 358.7 mm. This size is too large to draw to scale in Fig. 1 and Fig. 2, so the two figures are not to scale; in the real system R1 is much larger than v. The allowable nearest observation position is close to the imaging lenses, at a distance of v from the display plane. At this observation distance, a displayed pixel on the display plane subtends arcsin(|β|(dx/Rx)/v) ≈ 1 arcmin, which equals the resolution criterion of a human eye. For observation distances within 240~598.7 (i.e. 240 + 358.7) mm in the prototype system, the size of a display pixel on the display plane is fine enough for viewers. Generally, a circular observing track on which the circumferential length of a CVZ equals the pupil size 2d is the preferred observing path (i.e. a circle of R1 = 550.9 mm), as shown in Fig. 1. The pupil should not be too far away from the system, because obvious crosstalk from adjacent stereo images appears even when the pupil is centered at the viewpoint of the corresponding stereo image. Conversely, if the pupil is too close to the system, the CVZs are larger than the pupil, so the observed stereo image stays invariant until the pupil moves out of a CVZ. Naturally, the views perceived by a moving viewer should be ever-changing; invariant views due to an overly large circumferential CVZ size weaken the natural 3D effect provided by the proposed system, although the transition between adjacent stereo images remains continuous.
Therefore, a circumferential viewing region around the preferred observing path is taken as the optimum viewing region. The intersection points of the midline of each VZ with the preferred observing path (e.g. VPk and VPk+1 in Fig. 1(c)) are taken as the viewpoints for the corresponding stereo images. For symmetry, the displayed 3D object is set to 35 × 35 × 35 mm³, ensuring that the whole scene is observable. With a vertical size of 39 mm, the processed imaging lens provides a vertical VZ of 44.2 mm in which the whole displayed scene is visible.

Table 2. Input system parameters in the prototype system

A uniform light intensity distribution along the circular observing path is a necessary condition for the proposed system. Owing to the circular symmetry of the optical structure, the sector between the midline of one VZ and the midline of the adjacent CFZ is the element zone, e.g. the sector between lines OVPk and OMk in Fig. 1(c). The light intensity distribution in this element zone represents that of the whole display region, which consists of 67 × 2 such element zones. With all microdisplays active and all pixels set to the maximum intensity value, the light intensity distribution in the element zone was measured with a CS-2000A luminance meter at 13 equally spaced points along the preferred observing path, taking VPk as the starting point. Among them, two points belong to VZk and the other points belong to the CFZ. The measured values are 78.3, 78.0, 77.8, 77.4, 76.9, 76.8, 76.1, 75.8, 75.2, 74.6, 74.5, 74.1, and 74.1 cd/m², respectively. A small light intensity fluctuation (<6%) is confirmed, so the intensities of the different observed stereo images or transitional stereo images are nearly uniform.
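The sub-6% fluctuation can be checked directly from the listed values. In the sketch below the fluctuation is computed as (max - min)/max, which is an assumed definition since the paper does not state its metric:

```python
# Measured luminance values at the 13 points along the preferred observing path
luminance = [78.3, 78.0, 77.8, 77.4, 76.9, 76.8, 76.1,
             75.8, 75.2, 74.6, 74.5, 74.1, 74.1]

# Peak-to-peak fluctuation relative to the brightest point, in percent
fluctuation = (max(luminance) - min(luminance)) / max(luminance) * 100
print(f"intensity fluctuation: {fluctuation:.2f}%")  # about 5.36%
```

The result stays below the 6% bound quoted in the text.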

Crosstalk distribution of the realized prototype was evaluated. Similar to ref [13], the crosstalk is quantified by Eq. (4).

$$\mathrm{crosstalk}(\%) = \frac{\mathrm{reflection}}{\mathrm{signal}} \times 100 \tag{4}$$
Here “reflection” represents the maximum luminance of light reflected or scattered by the baffles, and “signal” denotes the maximum luminance of light without interference from the baffles. Owing to the circular symmetry discussed above, the crosstalk at the above 13 measurement points can characterize the crosstalk in the prototype system. The light intensity values measured above include both the “reflection” and the “signal”. To obtain the signal intensity at the 13 measurement points, all baffles were removed. When measuring the “signal” values at points in CVZk, the stereo image k is activated at the maximum intensity and all other microdisplays are powered off. When measuring “signal” values at points in CFZk~k+1, the corresponding transitional stereo images (i.e. the spatial tiling of two complementary segments from stereo images k and k+1) are activated at the maximum intensity while the remaining segments of stereo images k and k+1 are set to 0; other microdisplays are powered off. To eliminate the background light, which exists even when the driving voltage of the display pixels on the powered-on microdisplays k and k+1 is set to 0, two temporary baffles are used to block light from the two remaining segments. The measured results are: 77.2, 77.0, 76.2, 75.6, 75.0, 74.8, 74.1, 73.9, 72.8, 72.1, 72.2, 71.5, and 71.0 cd/m². According to Eq. (4), the maximum crosstalk is less than 4%, a rather small value.
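With the baffle-free “signal” values above, Eq. (4) can be applied point by point. In the sketch below, “reflection” is taken as the difference between the with-baffle and without-baffle measurements at each point; this is an assumption about how the two data sets combine, not a procedure stated in the paper:

```python
with_baffles = [78.3, 78.0, 77.8, 77.4, 76.9, 76.8, 76.1,
                75.8, 75.2, 74.6, 74.5, 74.1, 74.1]   # signal + reflection
signals = [77.2, 77.0, 76.2, 75.6, 75.0, 74.8, 74.1,
           73.9, 72.8, 72.1, 72.2, 71.5, 71.0]        # baffles removed

# Eq. (4): crosstalk(%) = reflection / signal * 100, per measurement point
crosstalk = [(t - s) / s * 100 for t, s in zip(with_baffles, signals)]
print(f"crosstalk range: {min(crosstalk):.2f}% .. {max(crosstalk):.2f}%")
```

Under this assumed decomposition the per-point crosstalk stays at the few-percent level reported in the text.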

A globe model is displayed to demonstrate the proposed idea and system. Using a CCD camera (Optronis CR5000X2) located at positions with equal angular spacing along the preferred observing path, the captured color images are shown in Fig. 4. The diaphragm of the CCD is always set to 5 mm, the average size of a human pupil. Under this condition, the width of the BR zone in the captured 2D transitional stereo image is about 3.86 mm.
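The ~3.86 mm BR width follows from projecting the pupil through the shared lens corner Mk onto the display plane. A small geometric sketch, assuming v = 240 mm (the nearest observation distance quoted above) and the preferred path radius R1 = 550.9 mm:

```python
def blending_region_width(pupil_diameter, v, r1):
    """Project the pupil through the shared marginal point Mk onto the display
    plane: similar triangles give width = pupil * v / (r1 - v), where v is the
    lens-to-display-plane distance and r1 the pupil's distance from the
    center O (an assumed geometric model, consistent with Fig. 2)."""
    return pupil_diameter * v / (r1 - v)

width = blending_region_width(5.0, 240.0, 550.9)
print(f"BR width: {width:.2f} mm")  # about 3.86 mm
```

With the 5 mm diaphragm this reproduces the stated 3.86 mm BR width.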

Fig. 4 Captured color images at an angular spacing of 36° along the preferred observing path when the display system is working.


Five subjects observed the displayed 3D image along the preferred observing path in the lab environment. Although a “ghosting” phenomenon can be seen around the BR zone, the subjective reports of the five subjects indicate that the perceived motion parallax is continuous.

4. Discussions on the ghosting phenomenon and the limitation of the display size

In the prototype system, a relatively large value of Δθ = 360°/67 = 5.4° is adopted. This angle is also the offset between the viewing directions of adjacent stereo images in the prototype system. A large Δθ means an obvious difference between the contents of adjacent stereo images, and a spatial tiling of two segments from adjacent stereo images with obvious differences results in the “ghosting” phenomenon. Decreasing Δθ alleviates the “ghosting” problem effectively by using adjacent stereo images with closer similarity. A verification experiment is designed to demonstrate this alleviation with a smaller Δθ. The experimental setup is the same, except that u is set to 54.9 mm and only six adjacent projecting units are employed. The magnification of the stereo image changes to β' = 11.7. The globe model used above is scaled by β'/β = 2.35 in proportion for display in the verification experiment. For comparison purposes, the spatial ratio of the BR zone to the available zone for stereo images should be kept consistent between the prototype system and the verification system. To satisfy this requirement, the CCD, with a diaphragm of 2 mm, is located in a CFZ zone 141.7 mm away from the imaging lenses in the verification experiment.

The image captured from the verification display system is shown in Fig. 5(b); Fig. 5(a) is the corresponding image captured from the prototype system. Since a smaller CCD diaphragm is used in the verification experiment, the captured image is dimmer; for comparison purposes, the brightness of Fig. 5(b) is increased through digital image processing. Obviously, the “ghosting” problem is effectively alleviated when a smaller angular spacing between adjacent microdisplays is employed. The same five subjects reported no obvious “ghosting” when observing the displayed 3D image. Taking this value of Δθ as the preferred trade-off angle, an optimized system can be implemented with the input system parameters in Table 3.

Fig. 5 The same transitional stereo image captured from the prototype system and from the verification experiment. For comparison purposes, the two images are zoomed to the same size here.


Table 3. Optimized input system parameters

The reachable display size of the proposed system depends on the system parameters through Eq. (5).

$$2R_2 = \beta d_x = (v/u)\,d_x = v\left(\frac{1}{f} + \frac{1}{v}\right)d_x = \left(\frac{R}{f} + 1\right)d_x \tag{5}$$
Here 2R2 is taken as the reachable display size for simplicity, and R is the curvature radius of the circle of aligned projecting units, which is equal to v. Obviously, a larger R brings a larger display size, but to maintain good image definition the preferred observation distance cannot be much larger than the distance of distinct vision (250 mm). Therefore, according to Eq. (5), a smaller f and a larger dx can further increase the display size. In the proposed system, the aperture of the imaging lens must be large enough to cover the whole mechanical size of an OLED microdisplay. So, imaging lenses with a short focal length and a large aperture, together with microdisplays with a larger display area and a smaller mechanical size, are desirable in future work to further improve the display size.
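Eq. (5) reduces to a one-line size estimate. The sketch below uses hypothetical parameter values for illustration only; they are not taken from Table 2 or Table 3:

```python
def reachable_display_size(R, f, d_x):
    """Eq. (5): 2*R2 = (R/f + 1) * d_x, where R is the curvature radius of the
    circle of projecting units (equal to v), f the lens focal length and
    d_x the horizontal size of the microdisplay's pixel array (all in mm)."""
    return (R / f + 1) * d_x

# Hypothetical example: R = 240 mm, f = 40 mm, d_x = 22.5 mm
size = reachable_display_size(240.0, 40.0, 22.5)  # (6 + 1) * 22.5 = 157.5 mm
print(f"reachable display size: {size:.1f} mm")
```

The formula makes the two design levers explicit: halving f or doubling d_x roughly doubles the reachable size at a fixed observation radius.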

Employing the optimized system parameters, 180 OLED microdisplays whose mechanical size is nearly identical to their display area are needed for a real 360° 3D display system. The display size of the projected stereo image can reach 263 × 263 mm². The radial viewing range is 60 mm and the radius of the preferred observation path is R1 = 691 mm. At this observation distance, the resolution criterion for a human eye, i.e. 1 arcmin, is satisfied. Under this setup the system still features the merit of “coarse-pitch” microdisplays, compared with S. Yoshida's system [17], where 288 projectors were needed for 360° 3D display. Admittedly, OLED microdisplays with a resolution of 1309 × 1309 (as listed in Table 3) and a display area almost equal to the mechanical size are not available at present; once such microdisplays become available, the proposed system can provide a better 3D display effect.

5. Conclusions

In conclusion, a novel 360° 3D color display system with coarse-pitch circular-aligned OLED microdisplays is realized through controllable fusing of optical rays from adjacent OLED microdisplays. The proposed technology not only greatly decreases the required number of display panels in the display system, but also removes the dependence on mechanical moving elements, high-speed components and diffusion screens. Since no diffusion screen is used, chromatic dispersion is avoided, making the proposed system suitable for color 3D display. These merits endow our system with higher practicability compared with conventional 360° multiview 3D display systems.

Acknowledgments

The authors gratefully acknowledge support from the Natural Science Foundation of China (Grant No. U1201254) and the National High Technology Research and Development Program of China (Nos. 2013AA03A106 and SQ2015AA0300066).

References and links

1. J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013).

2. F. Yaraş, H. Kang, and L. Onural, “Circular holographic video display system,” Opt. Express 19(10), 9147–9156 (2011).

3. D. Teng, L. Liu, Z. Wang, B. Sun, and B. Wang, “All-around holographic three-dimensional light field display,” Opt. Commun. 285(21-22), 4235–4240 (2012).

4. Y. Sando, D. Barada, and T. Yatagai, “Holographic 3D display observable for multiple simultaneous viewers from all horizontal directions by using a time division method,” Opt. Lett. 39(19), 5555–5557 (2014).

5. T. Kozacki, G. Finke, P. Garbat, W. Zaperty, and M. Kujawińska, “Wide angle holographic display system with spatiotemporal multiplexing,” Opt. Express 20(25), 27473–27481 (2012).

6. J. Y. Son and B. Javidi, “Three-dimensional imaging methods based on multi-view images,” J. Disp. Technol. 1(1), 125–140 (2005).

7. Y. Takaki and J. Nakamura, “Generation of 360-degree color three-dimensional images using a small array of high-speed projectors to provide multiple vertical viewpoints,” Opt. Express 22(7), 8779–8789 (2014).

8. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an interactive 360° light field display,” ACM Trans. Graph. 26(3), 40 (2007).

9. C. Yan, X. Liu, H. Li, X. Xia, H. Lu, and W. Zheng, “Color three-dimensional display with omnidirectional view based on a light-emitting diode projector,” Appl. Opt. 48(22), 4490–4495 (2009).

10. H. Horimai, D. Horimai, T. Kouketsu, P. Lim, and M. Inoue, “Full-color 3D display system with 360 degree horizontal viewing angle,” Proc. of the Int. Symposium of 3D and Contents, 7–10 (2010).

11. G. E. Favalora and O. S. Cossairt, “Theta-parallax-only (TPO) displays,” US Patent 7,364,300.

12. Y. Takaki and S. Uchida, “Table screen 360-degree three-dimensional display using a small array of high-speed projectors,” Opt. Express 20(8), 8848–8861 (2012).

13. O. Eldes, K. Akşit, and H. Urey, “Multi-view autostereoscopic projection display using rotating screen,” Opt. Express 21(23), 29043–29054 (2013).

14. T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: cylindrical 3D display viewable from 360 degrees,” J. Vis. Commun. Image Represent. 21(5-6), 586–594 (2010).

15. S. Yoshida, “fVisiOn: glasses-free tabletop 3D display to provide virtual 3D media naturally alongside real media,” Proc. SPIE 8384, 838411 (2012).

16. S. Yoshida, M. Kawakita, and H. Ando, “Light-field generation by several screen types for glasses-free table 3D display,” in 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON), 1–4 (2011).

17. S. Yoshida, “Real-time rendering of multi-perspective images for a glasses-free tabletop 3D display,” in 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON), 1–4 (2013).

18. R. Otsuka, T. Hoshino, and Y. Horry, “Transpost: A novel approach to the display and transmission of 360 degrees-viewable 3D solid images,” IEEE Trans. Vis. Comput. Graph. 12(2), 178–185 (2006).

19. M. K. Hedili, M. O. Freeman, and H. Urey, “Transmission characteristics of a bidirectional transparent screen based on reflective microlenses,” Opt. Express 21(21), 24636–24646 (2013).

20. M. K. Hedili, M. O. Freeman, and H. Urey, “Microlens array-based high-gain screen design for direct projection head-up displays,” Appl. Opt. 52(6), 1351–1357 (2013).
