Abstract

We present a Fourier rainbow holographic imaging approach. It combines standard laser holographic recording with a novel horizontal parallax only holographic display. In the display, the rainbow effect is introduced in an illumination module by a high-frequency diffraction grating and a white light LED source. The display is addressed by a Fourier rainbow digital hologram (FRDH), which encodes a defocused object field with spatial frequency components in one direction removed by hologram slitting and with the spherical phase factor removed. It is shown theoretically and experimentally that the method extends the viewing zone of the classical viewing window display in the vertical and longitudinal directions, which improves the comfort of observation. It is also validated numerically and experimentally that the numerical slitting applied within FRDH generation improves the reconstruction depth of the display, here up to 400 mm.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Since the hologram was invented by Gabor, many researchers have focused on holographic displays as a solution that provides a perfect 3D image without the support of special glasses or external devices. In holography, complex wave fields are reconstructed so that the observer sees optical copies of real or simulated objects. Recent advances in display technology have raised expectations for more realistic 3D reproduction [1]. However, the current state of 3D imaging by digital holography is still far from commercial expectations, such as those shaped by sci-fi movies like Star Wars.

The high demand on the space bandwidth product (SBP) is the most significant technological limitation of holographic displays. The SBP is determined by the number of pixels of the spatial light modulator (SLM). Since the product of viewing angle and field of view is proportional to the SBP of the display, for a given SLM it is impossible to increase the size of the reconstructed image without sacrificing viewing angle.

One important research direction is to physically improve the SBP of SLMs. Although increasing the SBP physically is difficult, the problem has been approached in many studies. Some investigations involve commercially available SLMs, where researchers have multiplexed SLMs physically or virtually in space to increase the image dimensions or the viewing angle [2–7]. These approaches require a large number of physical/virtual SLMs; moreover, aligning multiple SLMs requires a complicated optical system.

An interesting alternative, which allows reconstruction of large images, is the viewing window (VW) display approach. In this method, each point of the 3D image is reconstructed only within a narrow angular range, so the SLM bandwidth requirement is much reduced. The disadvantage is that observation is restricted to the generated VW, which is consequently small. An essential and integral part of the system is viewer-tracking technology with an optical steering system that relocates the VW according to the observer’s position [8]. Tracking may not be required if fast scanning and a high SLM refresh rate are implemented [7].

The horizontal parallax only (HPO) approach is another solution to the high SBP requirement [2, 3, 9–13]. At the expense of vertical parallax, it reduces the technical requirements of the display, including the number of SLMs and the amount of data. Many approaches place a vertical diffuser screen at the plane of the reconstructed object to extend the viewing zone in the vertical direction. A disadvantage of this method is a strong limitation of reconstruction depth; in [4], application of the vertical diffuser limited the reconstruction depth to 60 mm. Another weakness is the additional speckle noise generated by the diffuser. Speckle noise is an inherent disadvantage of using a laser; therefore, some studies use LED illumination to reduce it [14–17].

Classical rainbow holography is a well-known and highly successful HPO approach [9]. In the recording step, the technique uses a physical slit that removes high-frequency components in one direction, while the reconstruction step applies an off-axis white light source that multiplies the slit of the hologram in the same direction. This provides comfortable perception of a sharp 3D image, visible under white light illumination. Some researchers have used computer generated rainbow holography for optical reconstruction by hologram printing [18–20]. In these approaches, a grating function producing the rainbow effect is implemented numerically in the computer generated hologram (CGH) data. In [21], this CGH-based rainbow approach is applied within a holographic display setup. However, in this method, the SBP of the SLM is too small to generate the rainbow effect for a large 3D hologram. In order to generate the rainbow effect without any cost to the SBP of the display SLM, in this work we investigate a method of introducing external rainbow illumination. The concept of external rainbow illumination was introduced in a volumetric imaging technique achieving high depth selectivity [22].

In this paper, we propose the concept of Fourier Rainbow Holography, which is based on laser capture and white light reconstruction of a spherical object wave. Our work focuses on an end-to-end digital rainbow holographic solution for real objects, from capture to optical reconstruction. The solution involves standard laser lensless digital Fourier holographic recording or computer hologram generation, a white light HPO holographic display in which the SLM is illuminated by a rainbow beam produced by the first diffraction order of a diffraction grating, and numerical processing. Within the numerical processing the Fourier Rainbow Digital Hologram (FRDH) is generated, where the slit is applied directly to the hologram data. The simulations and experiments show that the selection of the slit width is a compromise between the display reconstruction depth and the spatial resolution at the SLM plane. For the display, a concept of an HPO white light LED display with rainbow modulation produced by a high-frequency grating is introduced to provide the resolution and transverse viewing angle of a classical rainbow hologram; this is a new development based on the VW display. In Sections 5 and 6, it is shown that the method extends the viewing zone of the display in the vertical and longitudinal directions, so the performance is improved when the observer’s eye is located out of the VW plane. When an eye is within the viewing zone, the entire reconstructed image can be seen. The display does not reproduce the colors of an object; it provides views whose colors change with the observer’s position.

This work is organized as follows: in Section 2, the concept of Fourier Rainbow Holography is introduced; in Sections 3 and 4, the generation of the FRDH and our implementation setup are explained; Section 5 provides a discussion of reconstruction and spectral view analysis; in Section 6 experimental results are shown; Section 7 concludes the paper.

2. Concept of Fourier rainbow holography

In this paper, we propose the technique of Fourier Rainbow Holography. It involves laser lensless Fourier digital holographic recording, a white light LED Fourier rainbow holographic display that utilizes a grating as a source of external dispersion, and hologram processing responsible for slitting and transfer of the holographic image. Figure 1 shows an illustration of the image formation process from data capture to display, where subscripts 1 and 2 indicate the capture and display systems, respectively. Figure 1(a) illustrates the content registration for the FRDH, while Fig. 1(b) depicts the reconstruction in the rainbow display.

Fig. 1 Geometry for (a) generation, and (b) reconstruction of the FRDH.

The content for the FRDH is captured at the primary hologram plane (PHP). The captured digital hologram (DH) contains the Fourier transform (FT) of the object field from the image plane (x1, y1). The two major numerical elements of FRDH generation are the limitation of spatial frequencies in the x1 direction with the numerical slit and the numerical transfer of the hologram data from the PHP to the secondary hologram plane (SHP) of the display, which is an optical copy of the SLM plane. The details of FRDH generation, including the computer generation of holographic content, are given in Section 3.

In the display system the FRDH is reconstructed with the rainbow beam produced by the dispersion of a diffraction grating. The rainbow beam is generated by a collimated white light source illuminating the grating at the angle of the −1st diffraction order, θ = sin−1(λ2/d), so that the SLM is illuminated at 0° for λ2 = 540 nm; here d is the period of the grating. Figure 1(b) illustrates the reconstruction process in the image domain of the display system. The full implementation of the display system is discussed in Section 4. Here, it is assumed that the FRDH is reconstructed from the SHP, which is a transversely magnified (M = −6) optical copy of the SLM plane. At this plane there is a field lens, which restores the spherical phase factor of the object wave that was subtracted by the reference wave in the recording process. Such a flattened wave is also used during the FRDH generation. The FRDH generated for λ2 is reconstructed by an assembly of plane waves of different tilts and wavelengths. The reconstruction for each wavelength focuses at the corresponding VW at u2 with a wavelength-dependent coordinate. The intensity distribution obtained at this plane for all wavelengths is referred to as the rainbow VW (RVW). Formation of the RVW extends the viewing zone of the display in the vertical and longitudinal directions; this effect is discussed in Section 5. Section 6 illustrates reconstructed images observed from different eye positions inside the viewing zone. The display provides orthoscopic reconstructions, which are to be viewed with the naked eye. The pupil of the eye can be considered a spectral filter; its physical dimensions enable observation of only a limited spectral range of the reconstructed color components [23].
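The illumination geometry described above can be checked numerically. The following sketch (not the authors' code) applies the grating equation with the grating parameters of the experimental setup from Section 4 (830 lines/mm, λ2 = 540 nm) and computes the residual tilt of the replay wave at other wavelengths:

```python
import numpy as np

# Sketch of the rainbow illumination geometry (not the authors' code).
# The -1st grating order for lambda_2 = 540 nm is steered normal to the SLM;
# other wavelengths leave the grating with a small residual tilt.
d = 1e-3 / 830                 # grating period [m], 830 lines/mm (Section 4)
lam2 = 540e-9                  # design wavelength [m]
theta = np.arcsin(lam2 / d)    # illumination angle theta = asin(lambda_2/d)

def residual_angle(lam):
    """Tilt of the -1st diffraction order w.r.t. the SLM normal [rad]."""
    return np.arcsin(np.sin(theta) - lam / d)

print(round(np.degrees(theta), 1))                   # ~26.6 deg
print(round(np.degrees(residual_angle(540e-9)), 6))  # 0.0 at the design wavelength
```

For these parameters the grating equation yields θ ≈ 26.6°, slightly below the 28° quoted for the physical alignment in Section 4; the small difference presumably reflects experimental adjustment.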

3. Registration and generation of FRDH

The content for the FRDH is captured at the PHP of the lensless Fourier digital holographic system, as shown in Fig. 1(a). The recorded object term OR* is the product of the object wave O and the conjugated reference wave R*:

\[ OR^*(u_1,v_1) = O_c(u_1,v_1,-R_1) = O(u_1,v_1)\exp\left[-\frac{i\pi\left(u_1^2+v_1^2\right)}{\lambda_1 R_1}\right], \tag{1} \]
where λ1 is the registration wavelength. The notation Oc(u1, v1, −R1) expresses the compact space bandwidth product (CSBP) [24] representation, which is the object wave O with the paraxial spherical phase factor of radius R1 removed. The optical fields in (u1, v1) and (x1, y1) are related by the Fresnel propagation integral, which in lensless Fourier holography reduces to the FT:
\[ O_c(x_1,y_1,R_1) = \iint O_c(u_1,v_1,-R_1)\exp\left[\frac{2\pi i\left(u_1 x_1+v_1 y_1\right)}{\lambda_1 R_1}\right]du_1\,dv_1. \tag{2} \]
This shows that a change of the hologram size affects only the image resolution. The numerical part of FRDH generation has four steps, illustrated in the flow chart in Fig. 2(a): (i) application of the slit to the hologram, (ii) reconstruction of the object wave with the FT, (iii) spatial filtering of the twin image and zero order, and positioning in (x1, y1), and (iv) propagation of the result by the distance R2 − Ff to the SHP. The geometry of the image transfer within generation of the FRDH is shown in Fig. 2(b). The first step reduces the size of the hologram in the vertical direction as:
\[ H^r(u_1,v_1) = \left(\left|R(u_1,v_1)\right|^2 + \left|O(u_1,v_1)\right|^2 + RO^*(u_1,v_1) + OR^*(u_1,v_1)\right)\Pi\!\left(u_1 S^{-1}\right), \tag{3} \]
where S is the width of the slit and the superscript r denotes the rainbow hologram. The pixel pitch of the capture CCD can differ from that of the display. In this work the captured hologram Hr(u1, v1) is stretched to the primary hologram of the display Hr(u2, v2) by a simple change of pixel pitch using the parameter t (u1 = t−1u2), which represents the pixel mismatch between the capture CCD and the display at (u2, v2); no resampling is applied. It is also assumed that the hologram is reconstructed with R2, which can differ from R1. Both geometrical discrepancies introduce an image magnification and an axial shift to R2 during the optical reconstruction. A detailed discussion of the effects of the parameter differences between the capture and display systems and the resulting image magnification is provided in [15].
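Steps (i)–(iii) of the FRDH generation can be sketched with NumPy on a stand-in hologram. The shapes below follow Section 3 (2160 × 1920 hologram, half of the field taken after the FT); the slit width in pixels assumes the 2 mm slit of the experiments sampled at the capture-CCD pitch of 3.45 µm, and step (iv), the CFP propagation, is omitted:

```python
import numpy as np

# Sketch of steps (i)-(iii) on a stand-in hologram (not the authors' code).
# Shapes follow Section 3: 2160 x 1920 hologram; the 2 mm slit is assumed
# sampled at the capture-CCD pitch of 3.45 um (~580 px); step (iv), the CFP
# propagation to the SHP, is omitted here.
rng = np.random.default_rng(0)
H = rng.standard_normal((2160, 1920))          # stand-in captured hologram

# (i) apply the slit: zero every pixel outside a 2 mm central band
S_pix = int(round(2e-3 / 3.45e-6))             # slit width in pixels (580)
Hr = np.zeros_like(H)
c = H.shape[1] // 2
Hr[:, c - S_pix // 2 : c + S_pix // 2] = H[:, c - S_pix // 2 : c + S_pix // 2]
zero_frac = 1.0 - (Hr != 0).mean()             # ~0.70 ("70% of pixels are zeros")

# (ii) reconstruct the object field with a centred 2-D FFT
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(Hr)))

# (iii) keep the half containing the upright image: 1080 x 1920 px
obj = field[: field.shape[0] // 2, :]
print(obj.shape, round(zero_frac, 2))
```

With these assumed pitches, the zeroed fraction of pixels comes out close to the 70% quoted in Section 3 for the 2 mm slit.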

Fig. 2 Numerical processing of FRDH generation: (a) flow chart, (b) illustration of image transfer.

Disregarding the magnification effect, the object term of the primary hologram of the display is denoted as $O_c^r(u_2,v_2,R_2)=\Pi(u_2 S_2^{-1})\,OR^*(u_2,v_2)$, and its reconstruction with the FT results in an object field $O_c^r(x_2,y_2,R_2)$. This field is propagated to the SHP (ξ2, η2). The propagation step produces the final form of the FRDH used for reconstruction. The propagation is realized using the Confocal Fresnel Propagation (CFP) [15], which changes the sample size of the propagated beam according to the radius ratio $F_f R_2^{-1}$. This feature allows DH reconstruction at an arbitrary axial location with the same pixel count, with no loss of quality, and without additional computational cost. Thus, the generation step of the FRDH imposes no restriction on the location of the holographic image with respect to the SHP.
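The CFP itself is detailed in [15]; as a simpler stand-in that illustrates the related property (the output sampling grid rescales with the propagation geometry), a single-FFT Fresnel transform can be sketched, whose output pitch λz/(NΔ) scales linearly with distance:

```python
import numpy as np

# Not the CFP of [15]: a minimal single-FFT Fresnel transform illustrating the
# related property that the output sampling grid rescales with the propagation
# distance (pitch_out = lambda * z / (N * pitch_in)).
def fresnel_1fft(u0, pitch_in, wavelength, z):
    """Propagate an N x N field by z; return the field and its new pitch."""
    N = u0.shape[0]
    x = (np.arange(N) - N // 2) * pitch_in
    X, Y = np.meshgrid(x, x)
    chirp = np.exp(1j * np.pi * (X**2 + Y**2) / (wavelength * z))
    u1 = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(u0 * chirp)))
    return u1, wavelength * abs(z) / (N * pitch_in)

u0 = np.ones((256, 256), dtype=complex)
_, p1 = fresnel_1fft(u0, 8e-6, 540e-9, 0.3)
_, p2 = fresnel_1fft(u0, 8e-6, 540e-9, 0.6)
print(round(p2 / p1, 6))  # 2.0: doubling z doubles the output pitch
```

Unlike this sketch, the CFP of [15] keeps the pixel count fixed while rescaling the grid by the radius ratio, which is what removes the restriction on the axial image location.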

The method of FRDH generation described above assumes a captured real object. Nevertheless, the idea can also be implemented for a computer generated rainbow hologram (CGRH). In this case the generation has two steps. In step 1 the rainbow hologram is calculated from a set of N self-illuminated point sources [25, 26] with coordinates (xn, yn, zn) representing the object wave at the PHP:

\[ O_c^r(u_2,v_2,F_f) = \exp\left[-\frac{i\pi\left(u_2^2+v_2^2\right)}{\lambda_2 F_f}\right]\sum_{n=1}^{N}\exp\left\{\frac{i\pi\left[(u_2-x_n)^2+(v_2-y_n)^2\right]}{\lambda_2 z_n}\right\}. \tag{4} \]
The slit is here imposed by limiting the computational domain of the hologram. Step 2 is propagation to the SHP, carried out by the FT as shown in Eq. (2), using Ff instead of R1.
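Step 1 can be sketched as follows. The grid and two-point "cloud" are illustrative only, and the quadratic phases are assumed in the standard Fresnel form exp[iπr²/(λz)], consistent with Eq. (1):

```python
import numpy as np

# Sketch of step 1 of CGRH generation (illustrative 512 x 512 grid and a
# 2-point cloud; quadratic phases assumed in the standard Fresnel form
# exp[i*pi*r^2/(lambda*z)], with the field-lens curvature over Ff removed).
lam2, Ff = 540e-9, 0.6
N_px, pitch = 512, 8e-6
u = (np.arange(N_px) - N_px // 2) * pitch
U, V = np.meshgrid(u, u)

pts = [(0.0, 0.0, 0.40), (1e-3, -1e-3, 0.45)]   # (xn, yn, zn) in metres
H = np.zeros((N_px, N_px), dtype=complex)
for xn, yn, zn in pts:                           # spherical wave per point
    H += np.exp(1j * np.pi * ((U - xn) ** 2 + (V - yn) ** 2) / (lam2 * zn))
H *= np.exp(-1j * np.pi * (U**2 + V**2) / (lam2 * Ff))   # flatten with Ff

print(H.shape)  # magnitude of H is bounded by len(pts)
```

In this sketch the slit of the rainbow hologram would simply be the finite extent of the computational grid, as stated above.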

In our implementation the slit is directly introduced to the primary hologram of size 2160 × 1920 pixels, which has twice the vertical pixel count of the display SLM (Holoeye 1080P, 1920 × 1080 pixels, pixel pitch ΔSLM = 8 µm). The pixels outside the slit are set to zero (e.g., for a 2 mm slit 70% of the pixels are zeros). The hologram is reconstructed using the FT, and the left half of the reconstructed field (1080 × 1920 pixels), which contains the upright image, is taken. The value of R2, which for good resolution should be between 400 and 800 mm, sets the plane of the object focus. Next, this focused object wave is propagated by the distance R2 − Ff for λ2 = 540 nm to the SHP (ξ2, η2) with the CFP method. The propagation gives the final form of the FRDH, which is converted to a phase-only hologram using the complex coding scheme [14].

The slit width imposed on the hologram should be selected in relation to the axial distance of the reconstructed object from the SHP. When the object is reconstructed close to the SHP, the highest resolution of the display is obtained, and slitting should not be used. For more distant reconstructions, imposing the slit increases the imaging depth at the expense of resolution at the SHP. In [27] the effect of the slit on the resolution of views for different reconstruction depths is investigated experimentally; it is shown that slitting can improve the imaging depth.

In this work, to provide a more quantitative analysis, the slitting is studied numerically. For this purpose, observation of reconstructed holograms generated for different slit widths is simulated. The holograms are generated for object points at different axial locations. The observer’s eye is simulated by a 4 mm pupil at u2. For each hologram, a set of intensity reconstructions for different wavelengths is calculated at the longitudinal location of the analyzed object point. The reconstructions are computed for the white light spectrum with a 0.1 nm step and only for the light transmitted through the eye pupil. Finally, the intensities for all wavelengths are summed and the full width at half maximum (FWHM) is calculated. Example reconstructions of point sources are shown in Figs. 3(a) and 3(b) for z2 = 0 and z2 = −300 mm, respectively. The first plot illustrates the resolution loss at the SLM plane caused by slitting. Figure 3(b) shows that for deep objects the slitting improves the quality of the observed reconstruction. Both plots include the corresponding FWHM measures. Figure 3(c) presents the results of the entire simulation, where for each object point the factor FWHMz2/FWHMREF is calculated. The plot shows the resolution change for different reconstruction distances when applying different slit widths. Consider the example of a 2 mm slit: for z2 = −300 mm and z2 = 0 mm the FWHM factor is 11 and 3.4, respectively, while for the case of no slit it is 40 and 1. When applying a smaller slit (S = 1 mm) the resolution is almost constant over the entire simulated reconstruction depth, with an FWHM factor of approximately 7. In the experiments, a 2 mm slit was selected for reconstruction of objects with large depth.
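The FWHM metric used in this simulation can be implemented as a short helper. Here it is verified on a Gaussian test profile, for which the analytic value is 2√(2 ln 2)σ ≈ 2.355σ; the profile is illustrative, not a reconstruction from our display:

```python
import numpy as np

# Helper mirroring the metric of the slit-width study: full width at half
# maximum of a sampled intensity profile. Verified on a Gaussian test
# profile, for which FWHM = 2*sqrt(2*ln 2)*sigma ~ 2.355*sigma.
def fwhm(x, I):
    """FWHM of the sampled intensity profile I(x)."""
    I = I - I.min()
    half = I.max() / 2.0
    above = np.where(I >= half)[0]
    return x[above[-1]] - x[above[0]]

x = np.linspace(-5.0, 5.0, 100001)
sigma = 1.0
I = np.exp(-x**2 / (2 * sigma**2))
print(round(fwhm(x, I), 3))  # ~2.355
```

In the study above this measure is applied to the wavelength-summed intensity of the simulated point reconstruction, and the reported factor is its ratio to the reference value FWHMREF.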

Fig. 3 Simulated axial perception of reconstructed points for different axial positions: spatial distributions of views of reconstructed point sources at (a) z2 = 0 mm, (b) z2 = −300 mm; (c) distributions of FWHMz2/FWHMREF factor; FWHMz2 is calculated value of FWHM for z2; reference FWHMREF is calculated for z2 = 0 mm and S = 6.75 mm. Data was simulated for parameters of our display setup.

4. Fourier rainbow holographic display

The experimental realization of the Fourier rainbow holographic display employs a phase-only SLM, a diffraction grating, and incoherent white light LED illumination. The display, shown in Fig. 4, consists of two main modules: illumination and imaging. The first module produces the rainbow illumination beam with a white light LED source and a high-frequency diffraction grating DG. In our implementation a linear transmissive diffraction grating with 830 lines/mm is used. The diffraction grating is placed between the SLM and the LED source (Doric, LED W55, fiber core 960 µm, NA 0.5). The collimated beam from the white light source, formed by the collimator lens C (Fc = 300 mm), is incident on the diffraction grating at the angle θ = 28° in the vertical direction. This illumination angle is chosen so that the −1st diffraction order for the selected central wavelength, here λ2 = 540 nm, illuminates the SLM in the normal direction. In the system the correct beam direction is obtained using a mirror placed between the grating and the collimator lens.

Fig. 4 Fourier rainbow holographic display.

The imaging module of the system enables generation and observation of large 3D objects with the naked eye. For this, the system has transverse magnification and a field lens focused at the RVW plane. The 4F system with magnification ratio M = −6 is composed of two lenses L1 and L2 (F1 = 100 mm, F2 = 600 mm). The display employs the complex coding scheme, which is supported experimentally with a cut-off filter in the Fourier plane of the 4F system [14, 28]. The filter removes the conjugate term of the hologram displayed on the SLM, leaving the reconstructed complex object wave. The curvature of this object wave is supplemented by the field lens Lf (Ff = 600 mm). It is worth noting that in our implementation the diffraction grating is illuminated from the bottom. Thus, the blue wavelength component illuminates the SLM from below, while the red one illuminates it from the top. Our imaging system has negative magnification; therefore, the VW for blue is formed at the bottom of the RVW.

5. Reconstruction and view analysis of FRDH

In the display the FRDH, generated for the normal direction and wavelength λ2, is reconstructed by a set of plane wave components of different angles and wavelengths. Figure 1(b) illustrates the reconstruction process with two example plane wave components k2 and k2’ of the rainbow. The first has the normal incidence angle and the wavelength of the FRDH generation; the second has another wavelength. Taking into account the optical magnification M, the angle of k2’ is

\[ \beta_2' = \frac{\left(\lambda_2-\lambda_2'\right)\sin\theta}{M\lambda_2}. \tag{5} \]
In Eq. (5) the paraxial approximation is applied around the optical axis of the system. For wavelength λ2’ the reconstruction is obtained at the longitudinal coordinate:
\[ z_2' = \frac{z_2 F_f \lambda_2}{\lambda_2' F_f + z_2\left(\lambda_2-\lambda_2'\right)}, \tag{6} \]
where z2 is the axial position of the image for k2. The reconstruction for k2’ is shifted in the transverse direction by β2’z2’. This transverse shift introduces image blur, which limits the reconstruction depth of rainbow holography [23].
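Equations (5) and (6), as transcribed above, can be checked numerically with the display parameters (Ff = 600 mm, M = −6, θ from the 830 lines/mm grating at 540 nm); at the design wavelength both the replay tilt and the axial shift must vanish:

```python
import numpy as np

# Numeric sanity check of Eqs. (5)-(6) as transcribed in the text (a sketch
# with assumed parameters: Ff = 600 mm, M = -6, 830 lines/mm grating, 540 nm).
d = 1e-3 / 830
lam2, Ff, M = 540e-9, 0.6, -6
theta = np.arcsin(lam2 / d)

def beta2p(lam2p):
    """Eq. (5): tilt of the replay plane wave for wavelength lam2p [rad]."""
    return (lam2 - lam2p) * np.sin(theta) / (M * lam2)

def z2p(z2, lam2p):
    """Eq. (6): axial image position for wavelength lam2p [m]."""
    return z2 * Ff * lam2 / (lam2p * Ff + z2 * (lam2 - lam2p))

print(beta2p(lam2), z2p(-0.3, lam2))  # both unchanged at the design wavelength
```

For λ2’ ≠ λ2 the tilt β2’ and the shifted position z2’ together give the transverse displacement β2’z2’ responsible for the rainbow blur discussed above.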

In a VW display the eye has to be placed precisely at the VW plane; when the image is observed by an eye away from u2, only a central part of the image is seen. In the rainbow display, the SLM is illuminated at different incidence angles for different wavelengths. As a result, each wavelength diffracted by the SLM converges at a different spatial location in the focal plane of the field lens. All reconstruction wavelengths contained in the white light source form an RVW extended in the vertical direction. The viewing zone of the display is determined by two converging rays of different wavelengths diffracted from the edges of the SLM, as shown in Fig. 5. If we denote the range of perceptible wavelengths as [λ2B, λ2R], the size of the viewing zone can be approximated as:

\[ \Delta VZ_{u_2} = \frac{F_f\left(\lambda_{2R}-\lambda_{2B}\right)}{2M}\left[\frac{1}{d}+\frac{1}{\Delta_{SLM}}\right], \tag{7} \]
\[ \Delta VZ_{z_2} = B_{x_2} F_f\left[\frac{1}{B_{x_2}-\Delta VZ_{u_2}}-\frac{1}{B_{x_2}+\Delta VZ_{u_2}}\right], \tag{8} \]
where Bx2 is the size of the magnified SLM. If we assume that the spectral range of the white light source extends from 400 nm to 700 nm, the sizes of the viewing zone calculated using Eqs. (7) and (8) are 19.9 mm and 272.5 mm in the vertical and longitudinal directions, respectively. For comparison, the viewing zone of the VW display is shown in green, to scale, in Fig. 5.
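As a consistency sketch for Eq. (8), taking the quoted vertical zone of 19.9 mm and assuming Bx2 to be the magnified SLM width (1920 pixels × 8 µm × |M| = 92.16 mm) reproduces the quoted longitudinal extent to within rounding:

```python
# Consistency check of Eq. (8) (a sketch with assumed Bx2): using the quoted
# vertical zone of 19.9 mm and Bx2 = 1920 * 8 um * |M| = 92.16 mm, the
# longitudinal zone comes out close to the quoted 272.5 mm.
Ff = 0.6                        # field lens focal length [m]
Bx2 = 1920 * 8e-6 * 6           # magnified SLM width [m]
dVZ_u2 = 19.9e-3                # vertical viewing-zone size from Eq. (7) [m]
dVZ_z2 = Bx2 * Ff * (1 / (Bx2 - dVZ_u2) - 1 / (Bx2 + dVZ_u2))
print(round(dVZ_z2 * 1e3, 1))   # ~271.8 mm
```

The small residual difference from 272.5 mm is consistent with the rounding of ΔVZu2 to 19.9 mm.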

Fig. 5 Viewing zone of the rainbow display; k2R+ and k2B− are related to the reconstruction waves k2R and k2B by the diffraction angle of the SLM, respectively.

When an eye is located within the viewing zone of the rainbow display, the entire reconstruction is seen, but in colors that depend on the observer’s position. Let us evaluate the rainbow spread, which is the range of wavelengths seen by the viewer in the image. For this, consider the geometry of Fig. 6(a), illustrating observation of the reconstruction of a hologram of a point P2 of angular coordinate α2 from perspective α2’. For this observation angle, the eye sees point P2 reconstructed with a plane wave of angle β2’, related to the corresponding wavelength by Eq. (5). Since α2’ = α2 + β2’, the wavelength at which the reconstructed point P2 is perceived is:

\[ \lambda_2' = \lambda_2 + \frac{\lambda_2 M\left(\alpha_2-\alpha_2'\right)}{\sin\theta}. \tag{9} \]
The rainbow spread of the entire view can be evaluated by taking the angular coordinates α2 of the two boundary points: ½FoV and −½FoV. For on-axis observation the rainbow spread is

\[ \Delta\lambda_{on\text{-}axis} = \left|\frac{FoV\,\lambda_2 M\left(F_f z_p^{-1}-1\right)}{\sin\theta}\right|. \tag{10} \]

Fig. 6 (a) Geometry for evaluating the rainbow spreads and (b) visualization of the spectral range of the observed view for different eye positions. Data was simulated for parameters of our display setup.

For off-axis observations the rainbow spread can be computed directly from Eq. (9) by analyzing the wavelengths of the boundary points. In Fig. 6(b), the calculated rainbow spreads of the observed views for various eye positions are visualized. The solid, dashed, and dotted lines are the boundaries of the rainbow spreads for on-axis and off-axis eye locations: solid lines for up = 0, dashed lines for up = 10 mm, and dotted lines for up = −10 mm.

The color dots in Fig. 6(b) indicate the colors of the views when the eye is at the RVW plane. When the eye is out of the RVW plane, a 3D image in rainbow colors with a different spectral range is observed. For instance, for on-axis observation at zp = 500 mm, the observed image contains colors corresponding to wavelengths between 455 nm and 610 nm.
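The on-axis spread relation Δλ = |FoV·λ2·M·(Ff/zp − 1)/sinθ| can be inverted to find the field of view implied by the quoted 455–610 nm spread at zp = 500 mm. The FoV is not stated explicitly in the text, so this is a consistency exercise with assumed parameters, not a reported value:

```python
import numpy as np

# Inverting the on-axis rainbow-spread relation
#   dlam = |FoV * lam2 * M * (Ff/zp - 1) / sin(theta)|
# to find the FoV implied by the quoted 455-610 nm spread at zp = 500 mm.
# A consistency exercise with assumed parameters, not a reported value.
d = 1e-3 / 830
lam2, Ff, M = 540e-9, 0.6, 6
theta = np.arcsin(lam2 / d)       # grating equation (~26.6 deg)
zp, dlam = 0.5, 610e-9 - 455e-9

FoV = dlam * np.sin(theta) / (lam2 * M * abs(Ff / zp - 1))
print(round(np.degrees(FoV), 1))  # implied FoV ~6.1 deg
```

An implied field of view of a few degrees is plausible for the object sizes and observation distances used in Section 6.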

6. Experimental results

The proposed Fourier Rainbow Holographic solution is evaluated in three experiments. The first presents the image quality for a real 3D object, the second shows the visual properties of the obtained views for varying observation positions, and the third illustrates the imaging depth. All images presented in this section were captured with a zoom lens camera (Canon EOS 5D Mark II), whose settings (F# = 7.1, focal length = 50 mm) were selected to simulate the perception conditions of the human eye [29]. As shown in Fig. 7, an image captured with the selected camera parameters occupies only a small part of the camera sensor, which is the case in Figs. 7, 8, 9, and 10; therefore, enlarged parts of the images are presented in all experiments. It should also be noted that the quality of the holograms observed with the naked eye is higher than that of the images recorded with the camera.

Fig. 7 Reconstruction of FRDH of the “Lowiczanka” figurine, (a) captured and (b) enlarged image; (c) camera photo of the object.

Fig. 8 Reconstruction of the FRDH photographed for three vertical camera locations at the RVW plane. Visualization 2 shows continuous change of views for the camera moving in y direction.

Fig. 9 Reconstruction of the FRDH photographed for four longitudinal camera locations on the optical axis. Visualization 3 presents the continuous change of views for camera positions from zp = 500 mm to zp = 700 mm.

Fig. 10 Reconstruction of the CGRH designed and photographed for different reconstruction distances.

In the first experiment a rainbow hologram of a real 3D object was optically reconstructed in the rainbow holographic display. The FRDH displayed on the SLM was generated for λ2 = 540 nm from the DH of the “Lowiczanka” figurine. The object dimensions are 120 mm height, 60 mm width, and 50 mm depth. The parameters of the capture system are as follows: camera resolution 2448 × 2050; camera pixel pitch 3.45 µm; object distance 1060 mm from the PHP. To provide the highest resolution the object was centered around the SLM surface, so no slitting was applied. The reconstructed hologram was photographed by a camera placed at zp = 680 mm, which is 80 mm out of the RVW. Figure 7 presents images of the reconstructed “Lowiczanka” object and its color photo. This experiment shows that when the viewing position is out of the VW, the holographic image contains many colors. In Fig. 7(b) a single perspective is shown; Visualization 1, in turn, illustrates the figurine rotating over a range of ±15°. For this purpose, a synthetic aperture hologram of size 2540 × 60000 pixels was used as input. Taking different parts of the hologram for FRDH generation allows reconstruction of different views of the object. Since controlling the F# is disabled in the movie mode of our camera, each frame of the visualizations was captured separately and the frames were combined into the movie.

In the second experiment the reconstruction of the FRDH of “Lowiczanka” was photographed for different camera positions relative to the RVW plane, simulating different eye locations. Figure 8 illustrates images captured for three vertical positions of the camera in the RVW plane. For these camera locations, different single-color reconstructions are viewed: a green image is observed for the axial position of the viewer, blue for the bottom, and red for the top (see the color marking points in Fig. 6(b) for zp = 600 mm). Visualization 2 shows the continuous change of colors when moving the camera up and down in the RVW plane. The range of vertical movement is limited by the RVW size, here 19.9 mm. Figure 9 presents the images captured for four axial positions of the camera: zp = 500, 550, 650, and 700 mm. Visualization 3 illustrates the continuous change of eye position from zp = 500 mm to zp = 700 mm. When the camera is out of the RVW plane, different parts of the image are observed in various ranges of rainbow colors (see the color bars marked in Fig. 6(b) for zp = 500 mm and zp = 550 mm). The experiment shows that our display extends the viewing zone in the vertical and longitudinal directions; thus, the comfort of observation is increased. For the two axial positions of the camera corresponding to ±100 mm distance from the RVW plane, it is worth noting that: (i) the color order in the reconstructed object is inverted on opposite sides of the RVW plane (for +100 mm the top part is red and the bottom is blue, while for −100 mm it is the opposite), and (ii) the reconstructed object starts to be cut in the horizontal direction.

The final experiment evaluates the imaging depth of the rainbow display. For this purpose a CGRH was generated from point cloud data and optically reconstructed for different reconstruction distances: R2 = 600, 700, 800, and 900 mm. A point structure was selected to easily visualize the decrease of resolution. The 3D model consists of 7000 points and represents a “dog”. The object at the SHP has the following dimensions: 50 mm width, 70 mm height, and 25 mm depth. For each reconstruction distance the object was rescaled to keep the same angular size. Each hologram was generated for λ2 = 540 nm and a 2 mm slit width. The slit size was selected to increase the depth of the reconstructed object without significant reduction of its imaging quality close to the SHP. The generation method of the CGRH is described in Section 3. Figure 10 shows optical reconstructions photographed by the camera located at zp = 680 mm. The camera is always focused on the same part of the 3D model, here the cheek. In Fig. 10 the blur of the points increases with reconstruction distance; however, up to R2 = 800 mm, which is 200 mm out of the SHP, the blur remains small and the quality of the reconstructed CGRH is satisfactory.

7. Conclusions

This paper presents the full imaging chain of Fourier Rainbow Holography, which is based on capturing, processing, and reconstructing converging object beams. The DH from the lensless Fourier holographic capture system is converted into the FRDH through numerical processing and reconstructed in the proposed rainbow display setup. This allows rainbow hologram reconstruction of large, real, orthoscopic 3D objects in a rainbow holographic display with a viewing angle extended in the vertical direction.

In the display, the rainbow effect is introduced by a high-frequency diffraction grating and a white light LED illumination module. The display SLM is addressed with the FRDH, which encodes the complex amplitude taken from an arbitrary defocused object plane with its spherical phase factor removed. The FRDH can be generated numerically or captured optically. The two major numerical processing steps of FRDH generation are the limitation of transverse spatial frequencies with the numerical slit and the numerical transfer of the hologram to the SLM plane. Due to the FT relation between the distributions at the object and hologram planes, the slit is applied directly to the hologram data.

The proposed approach converts the VW display into a rainbow display and extends the viewing zone in the vertical and longitudinal directions; thus, the comfort of observation is increased. In the VW display the eye has to be precisely placed in yz, while in the proposed Fourier rainbow holographic display, with its enlarged viewing zone, the eye can move in yz. Different observation positions at the RVW plane give holographic views of different single colors, while moving the eye along the optical axis allows observation of the image in a range of rainbow colors. The possibility of generating a full-color rainbow hologram of a real object for hologram printing was reported in [30]; with that solution, a full-color image can be observed from a small viewing zone.

This work presents an HPO display with rainbow illumination using a white light LED and a high-frequency diffraction grating. The rainbow illumination extends the viewing zone; thus, it can be considered a means of reducing the SBP requirements. It has been shown that the SBP requirements can be further reduced by slitting. The display provides high quality rainbow color reconstruction of large 3D objects, also in depth, with a white light LED. For objects situated close to the SHP there is no need for slitting, and the display provides its highest resolution, which corresponds to the resolution of the VW display. For more distant reconstructions, imposing the slit increases the imaging depth at the expense of resolution at the SHP.

Funding

Cross-Ministry Giga KOREA Project, Ministry of Science, ICT and Future Planning (MSIP), Korea (GigaKOREA GK18D0100); statutory funds of Warsaw University of Technology.

References

1. Y. Pan, J. Liu, X. Li, and Y. Wang, “A review of dynamic holographic three-dimensional display: algorithms, devices, and systems,” IEEE Trans. Ind. Inform., 1599–1610 (2016). [CrossRef]  

2. J. Hong, Y. Kim, H. J. Choi, J. Hahn, J. H. Park, H. Kim, S. W. Min, N. Chen, and B. Lee, “Three-dimensional display technologies of recent interest: principles, status, and issues [Invited],” Appl. Opt. 50(34), H87–H115 (2011). [CrossRef]   [PubMed]  

3. L. Onural, F. Yaraş, and H. Kang, “Digital holographic three-dimensional video displays,” Proc. IEEE 99(4), 576–589 (2011). [CrossRef]  

4. T. Kozacki, M. Kujawińska, G. Finke, B. Hennelly, and N. Pandey, “Extended viewing angle holographic display system with tilted SLMs in a circular configuration,” Appl. Opt. 51(11), 1771–1780 (2012). [CrossRef]   [PubMed]  

5. Y. Takaki and M. Nakaoka, “Scalable screen-size enlargement by multi-channel viewing-zone scanning holography,” Opt. Express 24(16), 18772–18781 (2016). [CrossRef]   [PubMed]  

6. Y. Lim, K. Hong, H. Kim, H. E. Kim, E. Y. Chang, S. Lee, T. Kim, J. Nam, H. G. Choo, J. Kim, and J. Hahn, “360-degree tabletop electronic holographic display,” Opt. Express 24(22), 24999–25009 (2016). [CrossRef]   [PubMed]  

7. H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4(1), 6177 (2015). [CrossRef]   [PubMed]  

8. S. Reichelt, R. Häussler, N. Leister, G. Fütterer, H. Stolle, and A. Schwerdtner, “Holographic 3-D displays electro-holography within the grasp of commercialization,” in Book of Advances in Lasers and Electro Optics, N. Costa and A. Cartaxo (Academic, 2010).

9. S. A. Benton and V. M. Bove, Holographic Imaging, (John Wiley & Sons, 2008).

10. V. M. Bove Jr, W. J. Plesniak, T. Quentmeyer, and J. Barabas, “Real-time holographic video images with commodity PC hardware,” Proc. SPIE 5664, 255–262 (2005). [CrossRef]  

11. D. E. Smalley, Q. Y. Smithwick, V. M. Bove Jr, J. Barabas, and S. Jolly, “Anisotropic leaky-mode modulator for holographic video displays,” Nature 498(7454), 313–317 (2013). [CrossRef]   [PubMed]  

12. Y. Takaki and N. Okada, “Hologram generation by horizontal scanning of a high-speed spatial light modulator,” Appl. Opt. 48(17), 3255–3260 (2009). [CrossRef]   [PubMed]  

13. N. Okada and Y. Takaki, “Horizontally scanning holography to enlarge both image size and viewing zone angle,” Proc. SPIE 7233, 723309 (2009). [CrossRef]  

14. T. Kozacki and M. Chlipala, “Color holographic display with white light LED source and single phase only SLM,” Opt. Express 24(3), 2189–2199 (2016). [CrossRef]   [PubMed]  

15. T. Kozacki, M. Chlipala, and P. L. Makowski, “Color Fourier orthoscopic holography with laser capture and an LED display,” Opt. Express 26(9), 12144–12158 (2018). [CrossRef]   [PubMed]  

16. E. Moon, M. Kim, J. Roh, H. Kim, and J. Hahn, “Holographic head-mounted display with RGB light emitting diode light source,” Opt. Express 22(6), 6526–6534 (2014). [CrossRef]   [PubMed]  

17. H. Araki, N. Takada, H. Niwase, S. Ikawa, M. Fujiwara, H. Nakayama, T. Kakue, T. Shimobaba, and T. Ito, “Real-time time-division color electroholography using a single GPU and a USB module for synchronizing reference light,” Appl. Opt. 54(34), 10029–10034 (2015). [CrossRef]   [PubMed]  

18. D. Leseberg and O. Bryngdahl, “Computer-generated rainbow holograms,” Appl. Opt. 23(14), 2441–2447 (1984). [CrossRef]   [PubMed]  

19. H. Yoshikawa and H. Taniguchi, “Computer generated rainbow hologram,” Opt. Rev. 6(2), 118–123 (1999). [CrossRef]  

20. H. Yoshikawa and T. Yamaguchi, “Computer-generated holograms for 3D display,” Chin. Opt. Lett. 7(12), 1079–1082 (2009). [CrossRef]  

21. T. Yamaguchi and H. Yoshikawa, “Real time calculation for holographic video display,” Proc. SPIE 6136, 61360T (2006). [CrossRef]  

22. W. Sun and G. Barbastathis, “Rainbow volume holographic imaging,” Opt. Lett. 30(9), 976–978 (2005). [CrossRef]   [PubMed]  

23. J. C. Wyant, “Image blur for rainbow holograms,” Opt. Lett. 1(4), 130–132 (1977). [CrossRef]   [PubMed]  

24. T. Kozacki and K. Falaggis, “Angular spectrum-based wave-propagation method with compact space bandwidth for large propagation distances,” Opt. Lett. 40(14), 3420–3423 (2015). [CrossRef]   [PubMed]  

25. J.-H. Park, “Recent progress in computer-generated holography for three-dimensional scenes,” J. Inform. Displ. 18(1), 1–12 (2017). [CrossRef]  

26. A. Symeonidou, D. Blinder, and P. Schelkens, “Colour computer-generated holography for point clouds utilizing the Phong illumination model,” Opt. Express 26(8), 10282–10298 (2018). [CrossRef]   [PubMed]  

27. H. Choo, M. Chlipala, and T. Kozacki, “Image blur and visual perception for rainbow holographic display,” Proc. SPIE 10679, 106790S (2018).

28. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21(18), 20577–20587 (2013). [CrossRef]   [PubMed]  

29. R. Cicala, “The Camera Versus the Human Eye,” https://petapixel.com/2012/11/17/the-camera-versus-the-human-eye.

30. Y. Shi, H. Wang, Y. Li, H. Jin, and L. Ma, “Practical method for color computer-generated rainbow holograms of real-existing objects,” Appl. Opt. 48(21), 4219–4226 (2009). [CrossRef]   [PubMed]  


Supplementary Material (3)

NameDescription
Visualization 1       Reconstructed figurine rotating through 30 degrees
Visualization 2       Continuous change of color of the view of the reconstructed figurine for a camera moving in the y direction
Visualization 3       Continuous change of the view of the reconstructed figurine for different axial camera positions



Figures (10)

Fig. 1 Geometry for (a) generation and (b) reconstruction of the FRDH.
Fig. 2 Numerical processing of FRDH generation: (a) flow chart, (b) illustration of image transfer.
Fig. 3 Simulated axial perception of reconstructed points for different axial positions: spatial distributions of views of reconstructed point sources at (a) z2 = 0 mm, (b) z2 = −300 mm; (c) distributions of the FWHMz2/FWHMREF factor; FWHMz2 is the FWHM calculated for z2, and the reference FWHMREF is calculated for z2 = 0 mm and S = 6.75 mm. Data were simulated for the parameters of our display setup.
Fig. 4 Fourier rainbow holographic display.
Fig. 5 Viewing zone of the rainbow display; k2R+ and k2B- are related to the reconstruction waves k2R and k2B through the diffraction angle of the SLM, respectively.
Fig. 6 (a) Geometry for evaluating the rainbow spreads and (b) visualization of the spectral range of the observed view for different eye positions. Data were simulated for the parameters of our display setup.
Fig. 7 Reconstruction of the FRDH of the “Lowiczanka” figurine: (a) captured and (b) enlarged image; (c) camera photo of the object.
Fig. 8 Reconstruction of the FRDH photographed for three vertical camera locations at the RVW plane. Visualization 2 shows the continuous change of views for a camera moving in the y direction.
Fig. 9 Reconstruction of the FRDH photographed for four longitudinal camera locations on the optical axis. Visualization 3 presents the continuous change of views as the camera position varies from zp = 500 mm to zp = 700 mm.
Fig. 10 Reconstruction of the CGRH designed and photographed for different reconstruction distances.

Equations (10)

$$O R^*(u_1,v_1)=O_c(u_1,v_1,R_1)=O(u_1,v_1)\exp\!\left[\frac{i\pi\left(u_1^2+v_1^2\right)}{\lambda_1 R_1}\right],\tag{1}$$

$$O_c(x_1,y_1,R_1)=\iint O_c(u_1,v_1,R_1)\exp\!\left\{\frac{2\pi i\left(u_1 x_1+v_1 y_1\right)}{\lambda_1 R_1}\right\}\mathrm{d}u_1\,\mathrm{d}v_1.\tag{2}$$

$$H_r(u_1,v_1)=\Big(\left|R(u_1,v_1)\right|^2+\left|O(u_1,v_1)\right|^2+R O^*(u_1,v_1)+O R^*(u_1,v_1)\Big)\,\Pi\!\left(\frac{u_1}{S_1}\right),\tag{3}$$

$$O_c^r(u_2,v_2,F_f)=\exp\!\left[\frac{2\pi i\left(u_2^2+v_2^2\right)}{\lambda_2 F_f}\right]\sum_{n=1}^{N}\exp\!\left\{\frac{2\pi i\left[(u_2-x_n)^2+(v_2-y_n)^2\right]}{\lambda_2 z_n}\right\}.\tag{4}$$

$$\beta_2'=\frac{\left(\lambda_2-\lambda_2'\right)\sin\theta}{M\lambda_2}.\tag{5}$$

$$z_2'=\frac{z_2 F_f \lambda_2}{\lambda_2' F_f+z_2\left(\lambda_2-\lambda_2'\right)},\tag{6}$$

$$\Delta V\!Z_{u_2}=\frac{F_f\left(\lambda_{2R}-\lambda_{2B}\right)}{2M}\left[\frac{1}{d}+\frac{1}{\Delta_{SLM}}\right],\tag{7}$$

$$\Delta V\!Z_{z_2}=B_{x_2}F_f\left[\frac{1}{B_{x_2}-\Delta V\!Z_{u_2}}-\frac{1}{B_{x_2}+\Delta V\!Z_{u_2}}\right],\tag{8}$$

$$\lambda_2'=\lambda_2+\frac{\lambda_2 M\left(\alpha_2-\alpha_2'\right)}{\sin\theta}.\tag{9}$$

$$\Delta\lambda_{\text{on axis}}=\left|\frac{FoV\,\lambda_2 M\left(F_f z_p^{-1}-1\right)}{\sin\theta}\right|.\tag{10}$$
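The wavelength-shift relations above for the angular displacement β2′ and the axial reconstruction distance z2′ can be evaluated numerically. The sketch below encodes the two formulas directly; all parameter values are illustrative assumptions, not the values of the paper's setup. As a sanity check, at the design wavelength the angular shift vanishes and the axial position reduces to z2.

```python
import numpy as np

def beta2_prime(lam2, lam2p, theta, M):
    """Angular displacement of a reconstructed point when the readout
    wavelength lam2p differs from the design wavelength lam2."""
    return (lam2 - lam2p) * np.sin(theta) / (M * lam2)

def z2_prime(z2, Ff, lam2, lam2p):
    """Axial reconstruction distance for readout wavelength lam2p."""
    return z2 * Ff * lam2 / (lam2p * Ff + z2 * (lam2 - lam2p))

# Illustrative (assumed) parameters:
lam2 = 532e-9            # design wavelength [m]
Ff = 0.4                 # field-lens focal length [m]
z2 = -0.1                # object plane position [m]
theta = np.deg2rad(30)   # grating diffraction angle
M = 1.0                  # system magnification

# Sanity check at the design wavelength: no angular shift, and the
# axial position is recovered (z2' ≈ z2).
b0 = beta2_prime(lam2, lam2, theta, M)
z0 = z2_prime(z2, Ff, lam2, lam2)

# A blue-shifted readout wavelength displaces the point both
# angularly and axially:
b_blue = beta2_prime(lam2, 460e-9, theta, M)
z_blue = z2_prime(z2, Ff, lam2, 460e-9)
```

Scanning the readout wavelength over the LED spectrum with these two functions traces out the chromatic spread of a reconstructed point, which is the mechanism behind the rainbow viewing zone.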
