This Letter proposes a new optical architecture for inline digital holography based on a double-sideband filter applied simultaneously at the Fourier plane. The proposed architecture not only allows removal of the conjugate images in the reconstruction process but also reduces the distortions that usually appear when a single-sideband filter is used. We first introduce the mathematical model that explains the method and then describe the optical setup used for the implementation. The optical system includes a parallel aligned liquid crystal display placed at the Fourier plane that, properly combined with linear polarizers, simultaneously filters positive and negative frequencies. This feature makes the device useful for recording dynamic processes. Finally, we tested the setup by recording a holographic movie of microscopic moving objects placed at different planes.
© 2015 Optical Society of America
Holograms are intensity or phase distributions that arise from the interference between the light transmitted (or reflected) by an object and a given reference beam. In the reconstruction process, a reconstruction beam coherently illuminates the hologram, and the optical field diffracted by the hologram is formed by three different contributions: a direct wave (proportional to the reference beam), an object conjugate wave (real image of the object), and an optical field proportional to the object wave (virtual image of the object). Depending on the optical arrangement used for the registration and reconstruction of holograms, the visualization of the object virtual image may be degraded by defocused images related to the object conjugate wave.
Current advances achieved in optoelectronic devices, such as spatial light modulators and detectors (CCD, CMOS), make digital holography a technique widely used in a large number of applications [1,2]. In digital holography, the interference between the object and the reference wavefronts is recorded onto a pixelated optoelectronic sensor. Afterward, this recorded image is used for the reconstruction of the object wave by numerical methods [3,4]. Two main optical arrangements are used for digital holography: off-axis (OA) holography [5] and inline (IL) holography [6]. In OA holography, proposed by Leith and Upatnieks, the reference and the object beams form a relative angle at the registration plane. In this way, the object, the conjugate, and the reconstruction waves propagate along different directions and can be observed separately in the reconstruction. Thus, the real image can be selected by spatially filtering the reconstructed planes [7,8].
However, OA configurations require a recording medium with high spatial resolution, and they are more sensitive to vibrations and air flows than IL configurations. In IL configurations, the object virtual image (related to the object wave) and the object real image (related to the object conjugate wave) overlap in the reconstruction process, so they usually cannot be observed separately.
Several techniques have been proposed to remove the unwanted conjugate image in hologram reconstruction. For instance, a spatial filtering of the reconstruction planes of digital holograms was proposed in [9]. To achieve this goal, the desired digitally reconstructed image is isolated from its surrounding pixels, but the selected area still contains noise from the undesired conjugate image. Another method to remove the influence of conjugate images was proposed in [10], where the hologram is recorded in the far field of the object. In such a situation, when the hologram is reconstructed, the object image is focused while the conjugate image appears strongly defocused, so it acts merely as background noise.
Reference [11] proposes another interesting approach, based on the phase-shifting technique. In this approach, a CCD camera captures a set of interferograms obtained by illuminating the object with different reference waves related to different phase shifts. These phase shifts are in general performed through small displacements of a mirror mounted on a piezoelectric device. Although this technique is very suitable for IL applications, the time-sequential acquisition required to capture the different interferograms makes it difficult to apply to dynamic processes. In addition, the configuration used in this technique is sensitive to vibrations and air flows.
An alternative technique, known as single-sideband (SSB) holography, was originally proposed by Bryngdahl and Lohmann [12]. Due to its simplicity, it can also be applied to dynamic objects. The method stops half of the spatial frequency spectrum at the Fourier plane during the recording of the hologram. Then the other sideband of the spectrum is blocked during the reconstruction step. The original technique was improved and adapted to digital holography [13,14] and has already been experimentally tested for fluid velocimetry [15]. To this end, a collimated beam illuminates the particles and their distribution is imaged onto a CCD using a convergent lens. At the focal plane of the lens, half of the spectrum is blocked by a knife edge. Consequently, a slight deformation of the image is observed in the reconstruction process because not all of the information of the wavefront was registered.
In this work, we propose a digital IL holographic method that yields reconstructed images that are free of distortions and uncorrupted by conjugate images. The technique exploits the idea of compensating the deformations introduced by a single-sideband filter by simultaneously applying double-sideband (DSB) filtering. After providing the DSB filtering proof of concept, a particular experimental setup, based on a liquid crystal on silicon (LCoS) display, was implemented and tested to track particles placed at different planes.
First we will describe the technique by analyzing light propagation through the IL configuration shown in Fig. 1.
Figure 1 shows two different particles placed at two different planes, labeled P1 and P2, respectively. The system is illuminated by a plane wave that is diffracted by the particles, and the Fourier spectrum is obtained at the focal plane of the convergent lens L1 [solid lines in Fig. 1]. In addition, L1 also images an intermediate plane onto the CCD camera [dashed lines in Fig. 1]. Let us denote as U(x, y) the complex value of the electric field at the CCD camera that would be obtained if there were no filter at the Fourier plane.
By assuming that we are dealing with an almost transparent object, we can write

U(x, y) = 1 + u(x, y), with |u(x, y)| ≪ 1, (1)

which leads to an intensity at the CCD plane

I(x, y) = |U(x, y)|² ≈ 1 + u(x, y) + u*(x, y). (2)

In turn, if a single-sideband filter is introduced at the Fourier plane to block the frequencies ν < 0, the amplitude distribution at the CCD plane becomes

U₊(x, y) = 1/2 + u₊(x, y), (3)

where u₊ is the part of u containing only the frequencies ν > 0, and the constant background, centered at ν = 0, is halved by the filter edge. The corresponding intensity registered at the CCD is

I₊(x, y) = |U₊(x, y)|² ≈ 1/4 + (1/2)u₊(x, y) + (1/2)u₊*(x, y). (4)

Whereas the second term ((1/2)u₊) in Eq. (4) only contains frequencies ν > 0, the third term ((1/2)u₊*) only contains frequencies ν < 0. Afterward, the Fourier transform of the intensity registered at the CCD [Eq. (4)] can be digitally calculated and the frequencies ν < 0 removed, which allows us to eliminate the ((1/2)u₊*) term (i.e., the unwanted conjugate term). In this situation Eq. (4) becomes

Ĩ₊(x, y) = 1/8 + (1/2)u₊(x, y). (5)

Analogously, the complementary single-sideband filter blocking the frequencies ν > 0 can be applied at the Fourier plane, and the frequencies ν > 0 digitally removed from the registered intensity. By performing this action, we obtain an expression similar to Eq. (5):

Ĩ₋(x, y) = 1/8 + (1/2)u₋(x, y), (6)

where u₋ contains only the frequencies ν < 0 of u. Since u₊ + u₋ = u, adding both filtered distributions retrieves the object wave:

Ĩ₊(x, y) + Ĩ₋(x, y) = 1/4 + (1/2)u(x, y). (7)
In this way the full complex amplitude, without the contribution of the unwanted conjugate waves, is obtained. From this information, the wavefront can be reconstructed at any arbitrary position by using the Rayleigh–Sommerfeld diffraction integral [16].
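As an illustrative check of this derivation, the following one-dimensional numpy sketch (our own illustration, not code from the Letter; the grid size, band limit, and object amplitude are arbitrary assumptions) simulates the two single-sideband recordings, digitally removes the conjugate half of each intensity spectrum, and verifies that the sum of both results recovers the object wave as in Eq. (7):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
nu = np.fft.fftfreq(N)                      # sampled spatial frequencies

def half_spectrum(spec, keep_positive):
    """Block one half of the frequency axis; the on-edge zero-frequency
    component is halved, mimicking a knife edge crossing the optical axis."""
    out = spec.copy()
    out[nu == 0] *= 0.5
    out[(nu < 0) if keep_positive else (nu > 0)] = 0.0
    return out

# Weak, band-limited object wave u(x) with |u| << 1 and no DC component,
# so the constant background "1" carries the whole zero frequency.
S = (rng.normal(size=N) + 1j * rng.normal(size=N)) * (np.abs(nu) < 0.2)
S[nu == 0] = 0.0
u = np.fft.ifft(S)
u *= 0.01 / np.abs(u).max()

# Physical single-sideband filters at the Fourier plane [Eq. (3)] and the
# complementary intensities recorded by the two cameras [Eq. (4)].
I_plus = np.abs(np.fft.ifft(half_spectrum(np.fft.fft(1 + u), True))) ** 2
I_minus = np.abs(np.fft.ifft(half_spectrum(np.fft.fft(1 + u), False))) ** 2

# Digital removal of the conjugate half of each intensity spectrum
# [Eqs. (5) and (6)], then sum of both contributions [Eq. (7)].
rec = (np.fft.ifft(half_spectrum(np.fft.fft(I_plus), True))
       + np.fft.ifft(half_spectrum(np.fft.fft(I_minus), False)))

# Up to second-order terms in |u|, the sum equals 1/4 + u/2: the object
# wave is recovered free of the conjugate term.
assert np.allclose(rec, 0.25 + u / 2, atol=1e-3)
```

Note that each single-sideband result alone carries only one half of the object spectrum; only the sum compensates the deformation, which is the point of the DSB scheme.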
To implement the DSB filtering described above, we propose the optical architecture shown in Fig. 2(a). This setup allows us to apply the two sideband filters simultaneously, which is a very interesting feature for dynamic applications. In this architecture a laser beam, spatially filtered and then collimated by the convergent lens L1, is used as the light source. A linear polarizer LP1 is used to properly set the polarization angle of the incident beam. After LP1, the object under study, represented by the P1 and P2 planes, is placed. The Fourier plane is located at the back focal plane of a second convergent lens L2 [solid lines in Fig. 2(a)], which simultaneously images an intermediate plane, situated between the P1 and P2 planes, onto two different CCD cameras [dashed lines in Fig. 2(a)]. A beam splitter (BS) is used to properly image this plane onto the CCD1 and CCD2 planes.
Since one of the goals of the proposed method is to record holograms of dynamic processes, the DSB filtering must be performed simultaneously. Suitable devices for this operation include phase plates, ferroelectric crystals (FECs), and liquid crystal displays (LCDs). In this case, we used the optical device shown in Fig. 2(b), which is formed by a parallel aligned liquid crystal display (PA-LCD) in combination with linear polarizers.
Next we describe the device performance. The linear polarizer LP1 was set to illuminate the PA-LCD with a beam linearly polarized at 45° from the laboratory vertical. The PA-LCD can be modeled as a linear variable phase plate oriented at 0° and whose retardance depends on the addressed voltage. The voltage amplitude and its spatial distribution are selected in such a way that one half of the display introduces a retardance of 0 and the other half a retardance of π [see Fig. 2(b)]. Thus, whereas the incident linear polarization is not modified when it passes through the first PA-LCD half (retardance 0), the light passing through the second half (retardance π) is rotated 90°, so its linear polarization becomes oriented at 135° [Fig. 2(b)]. Afterward, the wavefront divided at the BS is projected, respectively, onto the linear analyzers LP2 and LP3. These two analyzers are orthogonally oriented to each other, namely, at 45° and at 135°, respectively. In this configuration, the linear analyzers LP2 and LP3 act as upper and lower sideband filters, and thus the intensities registered at the CCD1 and CCD2 cameras correspond to those described in Eqs. (5) and (6), respectively. Therefore, these complementary intensity images are achieved simultaneously. By computing Eq. (7), the full complex amplitude without the contribution of unwanted conjugate waves is obtained in real time.
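The polarization bookkeeping above can be checked with a short Jones-calculus sketch (our own illustration, not code from the Letter; the matrices are the textbook forms for an ideal linear polarizer and a variable phase plate, and the function names are ours):

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its axis at theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]])

def pa_lcd(delta):
    """PA-LCD half: variable linear phase plate oriented at 0 deg,
    retardance delta (global phase omitted)."""
    return np.array([[1.0, 0.0], [0.0, np.exp(1j * delta)]], dtype=complex)

j_in = np.array([1.0, 1.0]) / np.sqrt(2)   # beam after LP1: linear at 45 deg

def transmitted(delta, analyzer_angle):
    """Intensity behind an analyzer for light crossing one display half."""
    out = polarizer(analyzer_angle) @ (pa_lcd(delta) @ j_in)
    return float(np.vdot(out, out).real)

lp2, lp3 = np.deg2rad(45), np.deg2rad(135)

# Retardance 0 half: polarization unchanged -> passes LP2, blocked by LP3.
assert np.isclose(transmitted(0.0, lp2), 1.0)
assert np.isclose(transmitted(0.0, lp3), 0.0)

# Retardance pi half: polarization rotated to 135 deg -> blocked by LP2,
# passes LP3. Each analyzer therefore selects one half of the Fourier plane.
assert np.isclose(transmitted(np.pi, lp2), 0.0)
assert np.isclose(transmitted(np.pi, lp3), 1.0)
```

Because each analyzer transmits only the light from one half of the Fourier plane, the two camera arms implement the two complementary sideband filters at once.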
The optical architecture proposed in Fig. 2(a) was experimentally implemented as shown in Fig. 3. A linearly polarized laser (Research Electro-Optics, Model R-30995, wavelength 633 nm, output power 17 mW) was used as the light source. The laser beam was expanded and filtered by a spatial filter. Lenses L1 and L2 were used to collimate the beam and to image a plane onto the CCDs, respectively. At the focal plane of L2, as the spatial light modulator, we used a parallel aligned electrically controlled birefringence LCoS display distributed by HOLOEYE, so the setup is adapted to operate in reflection. The display, named PLUTO, is an active matrix reflective device with 1920 × 1080 pixels and a 0.7 in. diagonal. The pixel pitch is 8.0 μm and the display has a fill factor of 87%. We used CCD cameras (Basler, Model piA1000-60gm) with a Truesense Imaging KAI-1020 CCD sensor, which delivers 60 frames per second at 1 MP resolution.
The proposed technique and setup were tested with an object composed of a fixed micrometric reticle, placed at P1, and a thin glass plate with 100 μm microspheres randomly distributed on one of its faces. The thin glass plate is mounted on a rotation stage placed at the P2 plane and thus constitutes the dynamic object. The distance between both planes is 100 mm.
The intermediate plane, set between the diffracting microspheres and the reticle object, is located 50 mm from P2 and 690 mm (object distance, s) from L2. CCD1 and CCD2 image this plane, simultaneously recording complementary holograms (Fig. 4). Note that the spots corresponding to defocused particles are deformed because of the single-sideband filter applied to obtain each hologram [upper Fig. 4(a) and lower Fig. 4(b), respectively].
The digital Fourier transform of both holograms is computed and the corresponding half frequency plane is blocked, as detailed in the mathematical fundamentals. After that, the inverse Fourier transform is applied to get the complex amplitudes. By adding both distributions, the compensated complex amplitude of the object is obtained.
Next, the complex amplitude was digitally propagated along the optical axis using the Rayleigh–Sommerfeld diffraction integral. In this case, zero propagation distance corresponds to the reconstruction of the intermediate plane imaged onto the CCDs. The reconstruction of the micrometric reticle image in Fig. 5(a) shows that its conjugate image is removed and the microspheres are defocused. Analogously, when the particles are focused, the conjugate image is not visible and the reticle is defocused [Fig. 5(b)]. Unlike the holograms shown in Fig. 4, the images displayed in Fig. 5 are free of distortions because of the double-sideband filter used.
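The numerical refocusing step can be sketched with the angular spectrum method, an exact way of evaluating the Rayleigh–Sommerfeld integral for sampled fields (a minimal numpy illustration of our own; the grid size, sampling pitch, and propagation distance below are arbitrary assumptions, not the experimental values):

```python
import numpy as np

def propagate(field, wavelength, pitch, z):
    """Refocus a sampled complex field to a plane a distance z away using
    the angular spectrum method: FFT, multiply by the free-space transfer
    function exp(i*kz*z), inverse FFT. Evanescent components are dropped."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2     # squared axial frequency
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Sanity check: an on-axis plane wave only acquires a global phase, so its
# magnitude is unchanged after propagation.
field = np.ones((64, 64), dtype=complex)
out = propagate(field, 633e-9, 8.0e-6, 0.05)      # 633 nm, 8 um pitch, 50 mm
assert np.allclose(np.abs(out), 1.0)
```

Scanning z in such a routine, frame by frame, is what produces the refocusing sequences shown in the movie.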
The movie (Fig. 6) shows the capability of the method and the setup to record holograms of dynamic objects. The first part illustrates only a refocusing process (i.e., for a fixed time, a scan along the optical axis up to the microsphere plane is performed). Next, for a fixed distance, the focused microspheres are rotated in the transverse plane. Finally, the refocusing process is applied to scan the optical axis in the reverse direction while the microspheres are in motion. The refocusing process ends when the micrometric reticle is focused.
For a proper experimental implementation of the technique, accurate alignment of the optical elements in the setup, with special attention to the two CCD cameras, must be ensured. This is because simultaneous intensity images of the same object region with the same magnification must be recorded. This can be achieved by using a single convergent lens [L2 in Fig. 2(a)] properly placed before the beam-splitter element. To ensure that both cameras are properly set in the setup, we applied an experimental calibration strategy (e.g., using a test object with specific marks to guide the pixel-to-pixel alignment), which allowed us to place the CCDs at the best focal plane and to image the same object region. Misalignments between the CCD cameras lead, to a certain extent, to a decrease in the quality of the final reconstructed hologram. In addition, slight differences between the sensors (which may occur in industrial fabrication processes) may also influence the results. Nevertheless, as shown by the experimental tests conducted, accurate alignment of the system makes the proposed setup robust enough to provide satisfactory results.
To summarize, this Letter presents what we believe is a new IL holographic technique based on double-sideband filtering. It allows the distortion-free removal of spurious conjugate images and is suitable for capturing dynamic processes. After providing the mathematical foundations of the double-sideband technique, we presented an optical architecture able to filter the negative and positive frequencies simultaneously. In addition, an experimental implementation was conducted, in which the double-sideband filtering is achieved using a PA-LCD combined with properly oriented linear polarizers. Finally, the proposed method and the optical setup were tested with a dynamic object. A movie shows the ability of the technique to scan, along the optical axis, objects placed at different planes, an operation that can be performed for each frame of the movie. This feature makes the technique attractive for tracking microscopic objects at frame rates limited only by the speed of the detection device.
ANPCYT PICT (2014-2432); Catalan Government (SGR 2014-1639); CONACyT (207633, 250850); Fondos FEDER; Spanish MINECO (FIS2012-39158-C02-01); UBACyT (20020130100727BA).
1. T. Kreis, Handbook of Holographic Interferometry (Wiley-VCH Verlag GmbH, 2005), pp. 81–92.
2. B. M. Hennelly, D. P. Kelly, N. Pandey, and D. Monaghan, Proceedings of the China–Ireland Information and Communications Technologies Conference (National University of Ireland Maynooth, 2009), pp. 241–245.
3. U. Schnars and W. P. O. Jüptner, Meas. Sci. Technol. 13, R85 (2002). [CrossRef]
4. S. Grilli, P. Ferraro, S. De Nicola, A. Finizio, G. Pierattini, and R. Meucci, Opt. Express 9, 294 (2001). [CrossRef]
5. E. N. Leith and J. Upatnieks, J. Opt. Soc. Am. 55, 569 (1965). [CrossRef]
6. D. Gabor, Nature 161, 777 (1948). [CrossRef]
7. E. N. Leith and J. Upatnieks, J. Opt. Soc. Am. 52, 1123 (1962). [CrossRef]
8. E. Cuche, P. Marquet, and C. Depeursinge, Appl. Opt. 39, 4070 (2000). [CrossRef]
9. G. Pedrini, P. Fröning, H. Fessler, and H. J. Tiziani, Appl. Opt. 37, 6262 (1998). [CrossRef]
10. J. B. DeVelis, G. B. Parrent Jr., and B. J. Thompson, J. Opt. Soc. Am. 56, 423 (1966). [CrossRef]
11. I. Yamaguchi and T. Zhang, Opt. Lett. 22, 1268 (1997). [CrossRef]
12. O. Bryngdahl and A. Lohmann, J. Opt. Soc. Am. 58, 620 (1968). [CrossRef]
13. T. Mishina, F. Okano, and I. Yuyama, Appl. Opt. 38, 3703 (1999). [CrossRef]
14. Y. Takaki and Y. Tanemoto, Appl. Opt. 48, H64 (2009). [CrossRef]
15. V. Palero, J. Lobera, N. Andres, and M. P. Arroyo, Opt. Lett. 39, 3356 (2014). [CrossRef]
16. G. Pedrini, W. Osten, and Y. Zhang, Opt. Lett. 30, 833 (2005). [CrossRef]
17. A. Lizana, A. Marquez, L. Lobato, Y. Rodange, I. Moreno, C. Iemmi, and J. Campos, Opt. Express 18, 10581 (2010). [CrossRef]