Abstract

Six-pack holography is adapted to reject out-of-focus objects in dynamic samples, using a single camera exposure and without any scanning. By illuminating the sample from six different angles in parallel using a low-coherence source, out-of-focus objects are laterally shifted in six different directions when projected onto the focal plane. Pixel-wise averaging of the six reconstructed images then creates a significantly clearer image, with rejection of out-of-focus objects. Dynamic imaging results are shown for swimming microalgae and flowing microbeads, including numerical refocusing by Fresnel propagation. The averaged images reduced the contribution of out-of-focus objects by up to 83% in comparison to standard holograms captured using the same light source, further improving the system's sectioning capabilities. Both simulation and experimental results are presented.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Holography can capture the complex wave fronts of samples, thereby recording not only the sample amplitude but also its phase, which enables post-capture refocusing to other axial distances by digital propagation [1]. This is achieved by interfering the coherent light that interacted with the sample with a clean reference beam that contains no sample modulation. In contrast to bright-field microscopy, however, where light is focused onto the sample by a condenser lens, in holographic microscopy a collimated beam of light (i.e. a plane wave) is typically used to illuminate the sample. This is because illuminating with focused light (i.e. a spherical wave) would introduce a large curvature in the recorded phase; since the sample phase is typically much smaller than this curvature, the sample information becomes difficult to reconstruct. However, illuminating with a collimated beam leads to a significant decrease in the numerical aperture (NA) of the optical system, and correspondingly increases the depth of field. This causes out-of-focus (OOF) objects in 3D samples to have a strong presence in the image, such that when an OOF object passes either behind or in front of the in-focus object, it has visible effects on both the amplitude and phase images.

Six-pack holography (6PH) is an off-axis holographic imaging technique that is able to record six complex wave fronts simultaneously, by projecting onto the digital camera six pairs of interfering beams, and rejecting unwanted interferences using coherence gating [2,3]. Since these complex wave fronts do not overlap in the spatial frequency domain, they can be fully reconstructed. 6PH is an optimized case of spatial holographic multiplexing or angular holographic multiplexing [4], which has previously been applied for various applications, including single-shot Jones matrix acquisition [5], recording fast events [6], acquiring fluorescence and hologram images in the same exposure [7], and recording multi-wavelength holograms in a single shot [8].

Previously, we have used 6PH to obtain super-resolution by synthetic aperture in simultaneous off-axis digital holographic imaging [3]. In the current paper, we adapt 6PH to obtain a new approach for rejection of OOF objects in off-axis holography by simultaneous imaging of six different perspective views in six holographic channels, thereby improving sectioning in the reconstructed dynamic complex wave fronts, regardless of the illumination source's temporal or spatial coherence. Thus, even when using a low-coherence source with coherence gating, our method is able to further improve the system's sectioning capabilities. Our technique is similar in principle to that of light field microscopy [9–11], though in 6PH the lateral resolution is not sacrificed for improved axial resolution by using microlens arrays, and complex wave fronts are captured rather than intensity images, enabling techniques such as numerical refocusing using Fresnel propagation [12] and holographic synthetic aperture [13]. In addition, the presented application does not require a sparse sample, as is required in compressive holography [14–16]. Most importantly, the proposed technique is simultaneous and does not require scanning over time, as in other optical sectioning methods, such as confocal microscopy [17] or optical coherence tomography (OCT), either using a low temporal-coherence source [18,19] or a low spatial-coherence source in reflection mode [20].

2. Rejection of OOF objects with 6PH

In standard off-axis digital holography, the recorded hologram, containing linear fringes, is digitally Fourier transformed, and the cross-correlation (CC) term is cropped and inverse Fourier transformed to produce the complex wave front image from a single camera exposure. The linear parallel fringes in the hologram act as a carrier frequency for the complex wave front data, shifting the CC terms away from the center of the spatial frequency domain, where the DC term is located, as illustrated in Fig. 1(a). In 6PH, instead of a single pair of sample and reference beams, six pairs of sample and reference beams are used together. In the spatial frequency domain, this produces a pattern of six CC terms and their complex conjugates without overlapping any of these terms, as illustrated in Fig. 1(b), enabling the reconstruction of the six complex wave fronts contained in the six sample beams from a single camera exposure. To avoid unwanted interferences of nonmatching beams, coherence gating [3,2123] is used. This is done by using a partially coherent light source and introducing different optical path delays to each of the six beam pairs, such that each beam pair is delayed by at least the coherence length of the light source relative to the other five pairs, and thus only the matching sample and reference beam pairs will create off-axis interferences.
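
As a concrete illustration of this single-exposure reconstruction, the following sketch demodulates one off-axis channel with NumPy; it is not the authors' processing code, and the carrier position and crop radius are hypothetical parameters that depend on the actual off-axis angle and camera sampling.

```python
import numpy as np

def reconstruct_channel(hologram, carrier_center, crop_radius):
    """Demodulate one off-axis holographic channel.

    hologram: 2D real-valued camera frame containing linear fringes.
    carrier_center: (row, col) of the chosen CC term in the centered spectrum.
    crop_radius: half-size (in pixels) of the square window cropped around the CC term.
    """
    # Centered 2D Fourier transform of the recorded hologram
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))

    # Crop a window around the chosen cross-correlation (CC) term
    cy, cx = carrier_center
    cc = spectrum[cy - crop_radius:cy + crop_radius,
                  cx - crop_radius:cx + crop_radius]

    # Re-center the cropped band and inverse transform to obtain the complex wave front
    field = np.fft.ifft2(np.fft.ifftshift(cc))
    return field  # amplitude = np.abs(field), phase = np.angle(field)
```

For a six-pack hologram, the same routine would simply be applied at each of the six CC-term positions visible in the power spectrum.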


Fig. 1. Comparison of the spatial frequency domains of: (a) standard off-axis holography, and (b) 6PH. CC, cross-correlation term; CC*, complex conjugate of corresponding CC term; DC, auto-correlation term.


In order to reduce the presence of OOF objects using 6PH, one can illuminate the sample from six different off-axis angles around the optical axis. As illustrated in Fig. 2, each image produced using a different illumination angle provides a different perspective view of the 3D sample, with the position of in-focus objects remaining constant in all images, while the positions of OOF objects shift in accordance with their projections onto the focal plane. This shift is determined by the magnitude of the off-axis illumination angle, its direction along the sample plane, and the distance of the object from the focal plane. As shown in Fig. 2(a), the lateral shift distance d is equal to z·tan(θ), where z is the distance of the object from the focal plane, and θ is the off-axis illumination angle.
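
For orientation, the following small calculation applies the d = z·tan(θ) relation to the off-axis angles used later in the experimental system (23.4° and 27.5°); the chosen z value is only illustrative.

```python
import numpy as np

z_um = 5.1                      # example distance of an OOF object from the focal plane (um)
for theta_deg in (23.4, 27.5):  # off-axis illumination angles of the system described later
    d_um = z_um * np.tan(np.radians(theta_deg))
    print(f"theta = {theta_deg:4.1f} deg -> lateral shift d = {d_um:.2f} um")
# Gives roughly 2.2 um and 2.7 um, and opposing beams shift the projection
# in opposite directions, doubling the separation between their projections.
```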


Fig. 2. Diagrams illustrating the OOF object rejection by the proposed averaging method that can be implemented by 6PH. (a) Diagram showing the correlation between off-axis illumination angle and shift magnitude. (b) – (g) Illustrations of six different perspective view images, using six illumination angles from around the optical axis, in the case where an OOF cell is directly above an in-focus cell. The central object represents an in-focus cell, and the other objects represent the same single OOF cell from the six different perspectives. (h) The averaged image produced from (b) – (g), where the OOF objects appear in six different places, shifted from the in-focus object. (i) Expected image for θ = 0°, i.e. using a single on-axis perspective, where the OOF object occludes the in-focus object.


Assuming a 3D sample composed of objects of roughly the same diameter, such as beads or a suspension of cells, and assuming illumination angles as in the proposed system, ideally the lateral shift d of the OOF object should be greater than the diameter of the object, so that its shifted projections do not overlap the in-focus object. In such a case, pixel-wise averaging of the six perspective images illustrated in Figs. 2(b)–2(g) produces an image in which the in-focus object remains clearly visible, while the contribution of the OOF object in the final image is decreased to one sixth of its previous value, as shown in Figs. 2(h) and 2(i). This is due to the fact that the projection images of the OOF object are shifted in different directions in each amplitude image, and thus do not overlap in the averaged image.
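
A minimal sketch of this pixel-wise averaging step, assuming the six amplitude images have already been reconstructed and share the same field of view (variable names are illustrative):

```python
import numpy as np

def average_perspectives(amplitude_images):
    """Pixel-wise average of the six perspective amplitude images.

    In-focus objects coincide in all channels and are preserved, while each
    OOF projection appears at a different location per channel and is
    diluted to roughly one sixth of its single-channel contribution.
    """
    stack = np.stack(amplitude_images, axis=0)  # shape (6, H, W)
    return stack.mean(axis=0)
```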

3. Simulation results

First, to demonstrate the method and analyze its limitations, a computer simulation was implemented. The simulation was performed by systematic propagation of a plane wave through the model by increments of 65 nm along the angle of illumination, using the Fresnel propagation algorithm. Once the propagation through the model was complete, the resulting complex wave front image was refocused to the focal plane of z = 0. In this simulation, backscattering is not included; however, scattering along the beam path from multiple objects is included. The magnification, numerical aperture, wavelength, and pixel size used in the simulation were chosen to be identical to those of the optical system, presented later in the paper. Using this simulation, the effect of averaging the six amplitude images produced by the same six illumination angles to be used in the optical system was analyzed. We first analyzed a model of one object, and then analyzed our sectioning capability with a two-object model. The model used for this simulation was an absorptive sphere with a diameter of 5.1 µm, as shown in Fig. 3(a). Figure 3(b) shows the single-channel amplitude image of the in-focus sphere, and Fig. 3(c) shows the single-channel amplitude image of the same sphere after moving it 5.1 µm out of focus, and illuminating it at an off-axis angle of 23.4°. Figure 3(d) shows the averaged image produced using all six amplitude images when using the same six illumination angles as in the 6PH optical system presented later.
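
The propagation step itself can be sketched with the transfer-function form of the Fresnel approximation; this is a generic implementation rather than the authors' simulation code, and the sampling parameters passed to it are placeholders.

```python
import numpy as np

def fresnel_propagate(field, wavelength, pixel_pitch, z):
    """Propagate a 2D complex field by a distance z (Fresnel transfer function).

    field: complex 2D array; wavelength, pixel_pitch, z: same length units (e.g. metres).
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)   # spatial frequencies along x (cycles/unit)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)   # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    # Paraxial (Fresnel) transfer function
    H = np.exp(1j * 2 * np.pi * z / wavelength) * \
        np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

In the simulation described above, such a step would be applied repeatedly in 65 nm increments through the absorptive model, and then once more with the appropriate distance to refocus the exiting field back to z = 0.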


Fig. 3. Simulation images for a single sphere and an OOF sphere above a pattern of lines. (a) 3D representation of the single sphere model. (b) Amplitude image of an in-focus sphere as reconstructed from a single-channel hologram. (c) Amplitude image of the sphere from (a) at a z-distance of 5.1 µm, when illuminating the sample at an off-axis angle of 23.4°, as reconstructed from a single-channel hologram. (d) Averaged image produced from (c) and the five other amplitude images using the same six illumination angles as in the optical system presented later, as reconstructed from a six-pack hologram. Scale bar in (d) applies to (b) – (d). (e) 3D representation of the USAF target model, an in-focus line pattern with an OOF sphere. (f) Amplitude image of (e) using on-axis illumination, as reconstructed from a single-channel hologram. (g) Averaged image of (e) using six amplitude images as reconstructed from a six-pack hologram, with the same six illumination angles as in the optical system presented later. (h) Synthetic aperture produced using the same six complex wave fronts used to create (g). Circles indicate the six different spatial frequency bands. (i) Synthetic aperture amplitude image produced using the synthetic aperture from (h). Scale bar in (i) applies also to (f), (g).


In order to assess the averaging effect, the intensity differences of the images in Figs. 3(c) and 3(d) from the background value of 1.0 (white) were calculated. The image in Fig. 3(c) possessed a mean difference of 0.36, and a maximum difference of 0.88, while the image in Fig. 3(d) possessed a mean difference of 0.10, and a difference of 0.19 at the location of the maximum difference in Fig. 3(c). Based on these values, the averaged image reduces the mean contribution of the OOF object in this case by a factor of 3.6 and reduces the maximum contribution, that is, the darkest spot, by a factor of 4.6. While one might expect that the darkest spot would be reduced by a factor of 6, this is only true when there is no overlap between the six amplitude images. Assuming a sample containing multiple identical spheres such as the one in this simulation, the minimum z-distance between the centers of two spheres directly above one another, one in-focus and one OOF, is equal to the diameter of the spheres, 5.1 µm. Thus, the minimum reduction factor in this case is 4.6, while the maximum is 6, for objects further from focus. This points to an inherent limitation of this technique, which is that the minimum reduction factor is dependent on the dimensions of the objects in the sample. Non-spherical objects can be closer together and have larger contributions to the image in their x-y dimensions, leading to significant overlap between the amplitude images. This effect is illustrated in Visualization 1, which shows the averaged image of a sphere of 5.1 µm diameter as it moves out of focus. A non-spherical object may have averaged images similar to those of the spherical object shown in the video prior to reaching a z-distance of 5.1 µm.
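
The reduction factors quoted above follow from the deviation of each image from the unit (white) background; a sketch of this comparison, assuming single_channel and averaged hold the amplitude images of Figs. 3(c) and 3(d):

```python
import numpy as np

def oof_contribution(image, background=1.0):
    """Mean and peak absolute deviation of an amplitude image from the background."""
    deviation = np.abs(image - background)
    return deviation.mean(), deviation.max()

def reduction_factors(single_channel, averaged, background=1.0):
    mean_sc, max_sc = oof_contribution(single_channel, background)
    mean_av, _ = oof_contribution(averaged, background)
    # Averaged-image deviation evaluated at the darkest spot of the single channel
    darkest = np.unravel_index(np.argmax(np.abs(single_channel - background)),
                               single_channel.shape)
    peak_av = abs(averaged[darkest] - background)
    return mean_sc / mean_av, max_sc / peak_av  # ~3.6 and ~4.6 in the example above
```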

Next, extension of the averaging technique by using synthetic aperture methods was tested, in which the shifted spatial frequencies acquired in each of the imaging channels were used to compose a larger spatial frequency content image [13]. For this purpose, the model that was used simulated an absorptive sphere 1.6 µm in diameter, placed 2.0 µm above a line pattern, similar to that of a single element from a 1951 USAF resolution target, comprised of 3 absorptive lines, each 1.95 µm wide, with 1.95 µm of empty space between them, as shown in Fig. 3(e). Figure 3(f) shows the single-channel image of this sample, using an on-axis illumination beam. Visibility of the pattern is 0.06 at the edges, with the middle of the central line being obscured completely by the OOF sphere. Figure 3(g) shows the averaged image using the same illumination angles as in the proposed optical system. In this case, the visibility of the pattern is increased to 0.12. Figure 3(h) shows the synthetic aperture constructed from the same six complex wave fronts (i.e. CC terms) used to create Fig. 3(g). In order to maintain the averaging effect, wherever the CC terms overlapped in the spatial frequency domain during the synthetic aperture construction, the average value of the overlapping frequencies was used. The resulting synthetic aperture amplitude image, shown in Fig. 3(i), possesses both the OOF object rejection of the averaged image, and the increased lateral resolution of a synthetic aperture image, with a visibility of 0.29 in this case. Visualization 2 illustrates the effects of the illumination angle on the averaged and synthetic aperture images. In this video, the arrangement of the beams around the optical axis is the same as in the optical system, except with all the off-axis illumination angles being equal and increasing.
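
A sketch of the synthetic aperture assembly described above, assuming each of the six complex wave fronts comes with the (row, column) offset of its spatial frequency band inside the enlarged spectrum; the offsets and output size are illustrative and would follow from the actual illumination angles.

```python
import numpy as np

def synthetic_aperture(fields, band_offsets, out_shape):
    """Compose an enlarged spectrum from several complex wave fronts.

    fields: list of 2D complex wave fronts (one per illumination angle).
    band_offsets: list of (row, col) top-left positions of each band in the
        enlarged spectrum, determined by the corresponding illumination angle.
    out_shape: (rows, cols) of the enlarged spectrum.
    """
    acc = np.zeros(out_shape, dtype=complex)
    counts = np.zeros(out_shape, dtype=int)
    for field, (r0, c0) in zip(fields, band_offsets):
        band = np.fft.fftshift(np.fft.fft2(field))
        h, w = band.shape
        acc[r0:r0 + h, c0:c0 + w] += band
        counts[r0:r0 + h, c0:c0 + w] += 1
    # Average wherever bands overlap, as done for the spectrum in Fig. 3(h)
    spectrum = acc / np.maximum(counts, 1)
    return np.fft.ifft2(np.fft.ifftshift(spectrum))  # higher-resolution complex image
```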

Following this, the effect of the off-axis illumination angles on the averaged image was tested using a model similar to that of a simple biological cell. The model possessed a mildly absorptive ellipsoidal membrane and two identical absorptive spherical organelles, as shown in Fig. 4(a). The membrane had x and y diameters of 5.2 µm and a z diameter of 7.9 µm. The organelles had diameters of 1.6 µm, with one organelle being in-focus, and the second organelle being 2.6 µm above the focal plane.


Fig. 4. Simulation images for a biological cell model. (a) 3D representation of the cell model. (b) Single-channel amplitude image of the cell model using on-axis illumination. (c) Averaged image of (a) using the same six illumination angles as in the optical system presented later. (d) Averaged image of (a) with six off-axis illumination angles set to 45°.


Figure 4(b) shows the single-channel amplitude image of the cell model using on-axis illumination, while Fig. 4(c) shows the averaged image of the cell model using the same six illumination angles as in the proposed optical system (mean off-axis angle value: 26.1°), and Fig. 4(d) shows the averaged image with the six off-axis illumination angles set to 45°. From Figs. 4(b)–4(d) we can see the effect of larger illumination angles on the effective depth of field and the OOF object removal process. In Fig. 4(b), at 0° illumination, the upper organelle has a strong presence in the image, while in Fig. 4(c) this has been improved by averaging using angles of 26.1°, with the edges of the cell being clearer and the cytoplasm being brighter. In Fig. 4(d), the averaged image produced using angles of 45° is even brighter. The mean values of the pixels in the area of the cell were measured for Figs. 4(b)–4(d), and were determined to be 0.58, 0.66, and 0.73, respectively. These values indicate that the effective depth of field of the averaged image decreases as the off-axis illumination angles increase, as the higher illumination angles cause the layers of the model that are closer to the focal plane to undergo greater lateral shifts and be further averaged out. In addition, it is apparent that at 45° illumination from six angles, the image begins to suffer from a spatial frequency imbalance, resulting in six visibly more resolved (brighter) regions along the three axes of illumination (as the six beams illuminate in pairs of opposing angles). In between these six higher-resolution regions, there are six less resolved (darker) regions, where the resolution remains closer to what is seen in Fig. 4(c). This effect is further illustrated in Visualization 3, where the averaged images using illumination angles from 0° to 45° are shown.

Finally, the effect of object concentration on the averaged image was tested using a 16.6×16.6×16.6 µm model composed of multiple absorptive spheres of 2 µm diameter, at various homogeneous concentrations, with a single in-focus sphere always at the center of the model and of the resulting averaged image. The simulation used the same six illumination angles as in the proposed system. At the maximum concentration of 182.89 pM, in which the objects touch each other, as shown in Fig. 5(a), neither the single-channel amplitude image nor the averaged image enables the in-focus object to be distinguished, with the Michelson visibility of the in-focus object being 0.12 in the averaged image. At a lower concentration of 118.31 pM, shown in Fig. 5(b), the in-focus object begins to be visible in the averaged image, with a visibility of 0.28. At the concentration of 54.19 pM shown in Fig. 5(c), the in-focus object still cannot be distinguished in the single-channel amplitude image, yet it may be distinguished in the averaged image, with a visibility of 0.45. Finally, at the concentration of 11.70 pM shown in Fig. 5(d), the in-focus object is still obscured in the single-channel image, yet in the averaged image the in-focus object is highly visible, with a visibility of 0.63. The in-focus object visibility in the averaged image continues to improve linearly as the concentration decreases, as illustrated in Fig. 5(e) and Visualization 4, with the maximum possible visibility for the in-focus object being 0.75.
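
The visibility values quoted in this section are Michelson visibilities evaluated over a region of interest around the in-focus object; a minimal sketch:

```python
import numpy as np

def michelson_visibility(roi):
    """Michelson visibility (Imax - Imin) / (Imax + Imin) of an image region."""
    i_max, i_min = float(roi.max()), float(roi.min())
    return (i_max - i_min) / (i_max + i_min)
```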


Fig. 5. Simulation of applying 6PH sectioning for a 3D scene containing absorptive spheres distributed at high concentrations. (a) 182.89 pM concentration. (b) 118.31 pM concentration. (c) 54.19 pM concentration. (d) 11.70 pM concentration. (a – d) Left: 3D representation of the model. Middle: Single-channel (regular holography) amplitude image. Right: Averaged image demonstrating the sectioning capability of 6PH. (e) Graph of the in-focus object visibility in the averaged image as a function of concentration.


Next, the effect of the averaging technique on the optical path delay (OPD) images produced by holography was tested. OPD is equal to the phase of the complex wave front image multiplied by the illumination wavelength and divided by 2π. In this simulation, which was generated using the Rytov approximation [24], a simple human sperm cell head model was used. The model, shown in Fig. 6(a), possessed an ellipsoidal membrane with a refractive index (RI) of 1.35 (indicated in cyan), cytoplasm of RI 1.38 (indicated in green), and a nucleus of RI 1.42 (indicated in red), with the surrounding medium having an RI of 1.33. The membrane had an x diameter of 5.2 µm, a y diameter of 3.5 µm, and a z diameter of 1.9 µm. The nucleus had an x diameter of 1.9 µm, a y diameter of 1.9 µm, and a z diameter of 1.0 µm. The model was aligned parallel to the xy plane. Figure 6(b) shows the OPD when illuminating the model on-axis (0°). Figures 6(c)–6(h) show the six OPD images produced when using the same six illumination angles as in the proposed optical system, and Fig. 6(i) shows the averaged image produced from the six OPD images. In this case, the resulting averaged OPD image is similar to the on-axis image, but still has a mean error of 11%.
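
The OPD conversion stated above is a simple rescaling of the reconstructed phase; a minimal sketch (phase unwrapping, which may be needed for thicker objects, is deliberately omitted here):

```python
import numpy as np

def phase_to_opd(field, wavelength):
    """Optical path delay from a reconstructed complex field.

    OPD(x, y) = wavelength * phase(x, y) / (2 * pi), in the units of wavelength.
    """
    return wavelength * np.angle(field) / (2 * np.pi)
```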


Fig. 6. Simulation of applying 6PH sectioning to OPD images, with sample aligned in the xy plane. (a) 3D representation of the cell model. (b) On-axis illumination OPD image. (c) – (h) The six different off-axis illumination OPD images using illumination angles as in the proposed system. (i) Averaged OPD image produced from images (c) – (h). (j) Tomographically reconstructed OPD image produced from images (c) – (h). Scale bar and color bar in (j) apply to all figures except (a).


An attempt was made to improve these results using tomographic reconstruction via the optical diffraction tomography (ODT) algorithm [25]. The tomographic 3D RI map was reconstructed using the same phase maps used for Figs. 6(c)–6(h), and an OPD image was reconstructed from the 3D RI map by subtracting the RI value of the medium from the RI value of each voxel and multiplying by the voxel thickness. This converted the value of each voxel from RI to OPD, as OPD = h(ns − nm), where h is the object thickness, ns is the RI of the sample, and nm is the RI of the surrounding medium. Once the voxel values had been converted to OPD values, they were summed along the z-axis to produce the reconstructed OPD image shown in Fig. 6(j). The result is only slightly more accurate than averaging, with a mean error of 6%, though the 2D shape resembles the on-axis image less closely. This is likely due to reconstructing the 3D RI map from only six projections, which is not enough to produce a high-quality tomographic reconstruction of the 3D RI map.
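
The projection of the reconstructed 3D RI map back into an OPD image, as described above, amounts to a voxel-wise subtraction of the medium RI followed by a sum along z; a sketch, where ri_map is the reconstructed RI distribution ordered as (z, y, x) and dz is the voxel thickness (both assumptions about the data layout):

```python
import numpy as np

def ri_map_to_opd(ri_map, n_medium, dz):
    """Collapse a 3D refractive-index map into a 2D OPD image.

    Each voxel contributes (n_s - n_m) * dz, summed along the z-axis:
    OPD(x, y) = sum_z (n(x, y, z) - n_medium) * dz.
    """
    return ((ri_map - n_medium) * dz).sum(axis=0)
```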


Fig. 7. Simulation of applying 6PH sectioning to OPD images, with sample aligned in the yz plane. (a) 3D representation of the cell model. (b) On-axis illumination OPD image. (c) – (h) The six different off-axis illumination OPD images using illumination angles as in the proposed system. (i) Averaged OPD image produced from images (c) – (h). (j) Tomographically reconstructed OPD image produced from images (c) – (h). Scale bar and color bar in (j) apply to all figures except (a).


As the OPD profile of an object may be significantly different depending on its orientation, the same simulation was repeated using the same model, except with the model cell aligned in the yz plane, as shown in Fig. 7(a). The on-axis illumination image for this case is shown in Fig. 7(b), and the six OPD images produced from the six illumination angles are shown in Figs. 7(c)–7(h). By this stage, it was evident that the OPD profiles of the six off-axis illumination images were significantly different from that of the on-axis image. Figure 7(i) shows the resulting averaged OPD image produced from the six images, in which the mean error is 21%. The OPD image reconstructed using the ODT algorithm and summation is shown in Fig. 7(j), and is even worse than the averaged image, with a mean error of 34%. Based on these results, it is clear that both the averaging method and the tomographic method were incapable of guaranteeing accurate results in our system when imaging dynamic samples, in which the orientation of the sample is unpredictable. Thus, this averaging method should be used for amplitude reconstruction rather than phase reconstruction.

4. Experimental system

The proposed 6PH system for OOF object rejection, shown in Fig. 8(a), is based on a modified Mach-Zehnder interferometer, which is adapted for high magnification microscopy (in contrast to Ref. [3]). Partially coherent laser light is emitted by a supercontinuum laser source (NKT SuperK EXTREME), followed by an acousto-optical filter (NKT SuperK SELECT) and a laserline filter (central wavelength: 632.8 nm, full width at half maximum: 3 nm), collectively designated LC in Fig. 8(a). This laser light illuminates a diffractive beam splitter DBS (DigitalOptics Corporation), and is split into an 11 × 7 pattern of 77 collimated beams that diverge from the optical axis. These beams then pass through lens L1 (50 mm focal length) to reach the first 50:50 beam splitter, BS1, where they are split into the sample and reference arms of the system. As the beams pass through the lenses, they alternate between travelling in parallel while each individual beam is focusing, and collectively converging onto the optical axis while each individual beam is collimated.


Fig. 8. 6PH system for complex wave front imaging with rejection of OOF objects. (a) The optical microscopy system. LC, low-coherence supercontinuum laser source; DBS, diffractive beam splitter; L1 – L14, lenses; BS1, BS2, beam splitters; P1, P2, periscopes; PDP, phase delay plate; S, sample; MO, microscope objective; M1, M2, mirrors; ND, neutral density filter; C, digital camera. Red line indicates the optical axis. (b) Diagram of the sample and reference beams chosen, and their positions relative to the optical axis before the PDPs. S1 – S6, sample beam positions before L4 in the sample arm; R1 – R6, reference beam positions before L10 in the reference arm. The optical axis is normal to the image and marked by ⊗. Each square of the grid is 1.36 mm × 1.36 mm in size. (c) The corresponding sample and reference PDP structures, ensuring that non-matching beam pairs will not interfere with each other due to coherence gating.


In the sample arm, the 77 beams, traveling in parallel, enter lens L2 (50 mm focal length) and pass through periscope P1, comprised of 2 mirrors located along the optical axis with an angle of 270° between their reflective faces, and a retroreflector mirror with an angle of 90° between its reflective faces. They then pass through lens L3 (focal length 150 mm) and all but the six desired sample beams are blocked, as shown in Fig. 8(b). The six beams then pass through phase delay plate PDP, a set of 2D stairs comprised of six steps of glass, with each step being 2 mm thick, as illustrated in the left of Fig. 8(c). Each beam passes through a different step, which introduces an OPD of 1.16 mm between each of the beams, which is greater than the coherence length of the laser, 42.4 µm, preventing the beams from interfering with each other. The six sample beams then pass through aspheric lens L4 (focal length 12.7 mm), causing them to illuminate the sample S at six different angles from around the optical axis. Beams 1–4 illuminate the sample at an off-axis angle of 27.5°, while beams 5 and 6 illuminate it at an off-axis angle of 23.4° (in contrast to Ref. [3]). The sample is then imaged by a microscope comprised of microscope objective MO (Zeiss Plan-APOCHROMAT, 63× magnification, 1.4 numerical aperture, in contrast to Ref. [3]) and the tube lens L5 (focal length 150 mm). Following this, additional magnification is added by a 4f system (in contrast to Ref. [3]), composed of L6 (focal length 80 mm) and L7 (focal length 125 mm), and, after passing through 50:50 beam splitter BS2, the sample is imaged on the digital camera C (GS3-U3-23S6M, FLIR, 12-bit monochromatic CMOS, 1200 × 1920 square pixels of 5.86 µm each). The total magnification of the microscope is 90×, and the diffraction limited spot size is 452 nm.
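
The figures quoted in this paragraph can be cross-checked with a short calculation. Note that the coherence-length convention used below, λ²/(πΔλ), is only one of several in use and is chosen because it reproduces the quoted 42.4 µm; the glass refractive index is inferred from the quoted per-step OPD rather than stated in the text, and the spot-size criterion λ/NA is an assumption that matches the quoted 452 nm.

```python
import numpy as np

wavelength = 632.8e-9   # central wavelength (m)
bandwidth = 3e-9        # laserline filter FWHM (m)

coherence_length = wavelength**2 / (np.pi * bandwidth)   # ~42.5 um with this convention
print(f"coherence length ~ {coherence_length * 1e6:.1f} um")

step_thickness = 2e-3   # PDP step thickness (m)
opd_per_step = 1.16e-3  # quoted OPD introduced per step (m)
n_glass = opd_per_step / step_thickness + 1              # implied glass index ~1.58 (assumption)
print(f"implied glass index ~ {n_glass:.2f}")

numerical_aperture = 1.4
spot = wavelength / numerical_aperture                   # ~452 nm
print(f"diffraction-limited spot ~ {spot * 1e9:.0f} nm")
```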

At the beginning of the reference arm, the optics are virtually identical to those of the sample arm, with lenses L8 and L9 and periscope P2 being identical to lenses L2 and L3 and periscope P1, respectively, except that the retroreflector of P2 is mounted on a translation stage, allowing the optical path lengths of the sample and reference arms to be matched by slightly adjusting the retroreflector position. The purpose of lenses L1 – L3, L8, and L9, in both the sample and reference arms, is to pass the pattern of 77 beams to the PDPs while magnifying the pattern by a factor of 3, making it simpler to block the undesired beams and increasing the off-axis illumination angle after L4. After the beams pass through lens L9, a different set of six beams than those chosen in the sample arm are allowed to pass, while all other beams are blocked. The six reference beams are arranged in a different pattern, as shown in Fig. 8(b), as this pattern produces the necessary off-axis angles for 6PH. Following this, the six reference beams pass through another PDP, designed to match the different pattern, as illustrated on the right of Fig. 8(c). This introduces the same OPD effect as the sample PDP, as it is designed to ensure that each reference beam has only one matching sample beam that experiences the same delay, and thus will only interfere with that one sample beam. The beams then pass through lenses L10 (focal length 75 mm), L11 (focal length 75 mm), L12 (aspheric, focal length 50.8 mm), L13 (focal length 50 mm), and L14 (aspheric, focal length 152 mm) and BS2, and interfere with their corresponding sample beams on the digital camera plane, producing the multiplexed six-pack hologram. Reference beams 2, 3, 5, and 6 illuminate the camera at an off-axis angle of approximately 1.5°, and beams 1 and 4 illuminate the camera at an off-axis angle of approximately 2.0°. A neutral density filter, ND (optical density 1.4), is placed in the reference arm between L10 and L11 in order to match the sample and reference beam intensities, due to the large difference in intensity caused by the microscope magnification. A 2-inch thick rectangular prism of glass is placed between L11 and L12 to compensate for the large OPD introduced by the MO.

Once the six-pack hologram is captured by the digital camera, a digital 2D Fourier transform is performed, and the six CC terms are individually cropped and inverse Fourier transformed to produce the six complex wave front images. A background six-pack hologram containing no sample was also captured, and pixel-wise division of the sample complex wave fronts by the corresponding background complex wave fronts was done to remove stationary aberrations.
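
A sketch of this background-compensation step, assuming the same channel demodulation is applied to both the sample and background six-pack holograms (names are illustrative):

```python
import numpy as np

def compensate_background(sample_field, background_field, eps=1e-12):
    """Pixel-wise division of a sample complex wave front by the matching
    background (no-sample) wave front, removing stationary aberrations.
    A small eps guards against division by zero in dark regions."""
    return sample_field / (background_field + eps)
```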

All lenses in the system are arranged in a 4f configuration, except for lenses L13 and L14, between which the distance is 125 mm. This is required in order to create the off-axis angles while having all six reference beams illuminate the same point on the camera, and to still maintain near-identical optical path lengths for the sample and reference arms.

The phase delay plates used in this system were hand-made by gluing together, using optical glue, the 2 mm thick sections of glass that comprise each step, which may have led to undesired reflections and a decrease in beam intensities. The system could be improved by using phase delay plates made of solid glass. Furthermore, in this system a filtered supercontinuum laser source was used to produce light of a limited coherence length. Other collimated, partially coherent light sources, such as a laser diode, can be used instead. However, any decrease in the coherence length will decrease the area over which interference is observed on the camera, while increasing the coherence length can lead to cross-talk between the beams. The latter problem can be solved by increasing the step thickness of the PDPs.

5. Experimental results

A sample of microalgae (Chlamydomonas reinhardtii) freely swimming in water was imaged at 50 fps, with the six-pack hologram of a single video frame shown in Fig. 9(a). In this multiplexed hologram, there is one in-focus cell and a single OOF microalga just above it. Focusing was done manually, using a cell resting on the glass slide. The digital Fourier transform of this frame was taken, with the corresponding spatial frequency power spectrum shown in Fig. 9(b), and from the six CC terms the six amplitude images, shown in Figs. 9(c)–9(h), were reconstructed. Following this, the amplitude images were pixel-wise averaged to produce the averaged image shown in Fig. 9(i). It can be seen that in each of the six amplitude images the OOF cell is projected onto a different location in the image, with the OOF cell fully overlaying the in-focus cell in Fig. 9(d). If Fig. 9(d) were the only amplitude image available, as in standard holography, the in-focus cell would remain obscured. However, in the averaged image, the OOF cell is averaged out of the image, allowing the in-focus cell to remain visible. In addition, the averaging process acts as a condenser would, decreasing the depth of field of the system. As the average off-axis angle of illumination is 26.1°, which corresponds to a condenser of numerical aperture 0.44, the effective depth of field in this system is decreased from 560 nm to 337 nm. Due to this, the averaged image is not identical to any of the single-channel amplitude images, as the depth of field is significantly smaller than the typical length of the cell, which is 10 µm. Rather, only the region of the cell that is at the focal plane is visible. A video of the single-channel amplitude and averaged images of this sample is shown in Visualization 5.


Fig. 9. Dynamic 6PH experimental results of swimming microalgae. (a) Six-pack hologram, inset shows fringes magnified ten times. (b) Spatial frequency power spectrum of (a). (c) – (h) The amplitude images produced from each of the six holographic channels contained in (a). (i) Averaged image generated from images (c) – (h). Scale bar in (i) applies to all images except (b).


As the averaged image is constructed from six complex wave front images, it is possible to numerically refocus the image through use of Fresnel propagation, which is illustrated in Fig. 10. Figure 10(a) shows the amplitude image from a single holographic channel, in which there is an in-focus cell with an overlaying OOF cell. In Fig. 10(b) the averaged image is shown, in which the OOF cell is averaged out. Figures 10(c) and 10(d) show the same images as 10(a) and 10(b), respectively, after numerically refocusing by 9.1 µm. A video of the refocusing of the dynamic sample is provided in Visualization 6.


Fig. 10. In-focus and numerically refocused images reconstructed from the 6PH data of the swimming microalgae shown in Fig. 9. (a) Amplitude image from a single channel. (b) Averaged image produced from (a) and the five other amplitude images. (c), (d) Numerically refocused versions of (a) and (b), respectively.


A thick sample of 12 µm polystyrene beads flowing in water was also imaged at 50 fps, with the results shown in Fig. 11 and Visualization 7. Using the process described above, the averaged image was reconstructed from a single frame of the video. The cropped video frame and associated spatial frequency power spectrum are shown in Figs. 11(a) and 11(b), respectively.


Fig. 11. Dynamic 6PH experimental results of 12 µm polystyrene beads flowing in water. (a) Cropped six-pack hologram. Inset shows fringes magnified five times. (b) Spatial frequency power spectrum of the six-pack hologram. (c) Single-channel amplitude image of beads in water, with one OOF bead occluding an in-focus bead, reconstructed from a single CC term from (b). (d) Averaged image as obtained from the six parallel imaging channels.


In the standard, single-channel amplitude image shown in Fig. 11(c), which was reconstructed from a single CC term taken from Fig. 11(b), there are two beads: a single in-focus bead in the center, and an occluding OOF bead, seen as a smaller, darker circle at the bottom of the image, almost completely obscuring the in-focus bead. In the averaged image shown in Fig. 11(d), however, obtained with the same low-coherence source, the presence of the OOF bead is reduced by 83% (5/6) and it is barely visible in comparison to Fig. 11(c), allowing the in-focus bead to be seen clearly. This further demonstrates the significant improvement offered by the 6PH sectioning approach over standard holography with the same low-coherence light source. The full video of the dynamic sample is shown in Visualization 7.

6. Conclusion

A new technique for dynamically rejecting OOF objects in holography using 6PH was presented. We have shown that by averaging the six amplitude images obtained simultaneously by 6PH, it is possible to decrease the contribution of OOF objects at a single location by up to 83%. This method can be combined with synthetic aperture techniques to both decrease the presence of OOF objects and increase lateral resolution. Using 6PH, the averaged images can be obtained from a single camera exposure, and we have shown that the averaged image can still be numerically refocused as in regular digital holography. These techniques enable holographic imaging of dynamic samples containing multiple objects that can occlude the objects of interest, as in the swimming microalgae sample, allowing sectioned numerical refocusing in samples where this would normally be difficult to obtain. While the averaging technique is effective, averaging the six laterally shifted images creates six fainter copies of each OOF object. This causes the OOF object to be present over a larger area of the image, limiting the allowable spatial density of the sample, though we have shown that the technique remains effective even at high sample concentrations.

While applying the averaging technique to OPD images would be of significant value, we have shown it is not possible to consistently reconstruct accurate OPD images from the six off-axis illumination images produced by our system, as dynamic samples will have significantly different OPD profiles when observed from different perspectives. Averaging in the case where the sample length is aligned with the optical axis results in a quantitative phase image that barely resembles an on-axis illumination phase image of the same object. Furthermore, use of 3D tomographic reconstruction algorithms to reproduce the OPD image also fails in such cases, producing even less accurate results due to the limited number of projection images.

In the future, the presented system can be improved by using separate arrays of lenses for each reference beam, thereby decoupling off-axis interference angle and reference beam magnification, allowing the reference beams to be larger and better match the sample beam size. The SNR can also be improved by using a camera with a larger dynamic range. In addition, the possibility of improving OOF object rejection by training a deep learning network using images from our simulation could be investigated.

Funding

H2020 European Research Council (678316).

Acknowledgments

We thank Prof. Iftach Yacoby for supplying the microalgae samples.

Disclosures

The authors declare no conflicts of interest.

References

1. E. Cuche, F. Bevilacqua, and C. Depeursinge, “Digital holography for quantitative phase-contrast imaging,” Opt. Lett. 24(5), 291–293 (1999). [CrossRef]  

2. M. Rubin, G. Dardikman, S. K. Mirsky, N. A. Turko, and N. T. Shaked, “Six-pack off-axis holography,” Opt. Lett. 42(22), 4611–4614 (2017). [CrossRef]  

3. S. K. Mirsky and N. T. Shaked, “First experimental realization of six-pack holography and its application to dynamic synthetic aperture superresolution,” Opt. Express 27(19), 26708–26720 (2019). [CrossRef]  

4. N. T. Shaked, V. Micó, M. Trusiak, A. Kuś, and S. K. Mirsky, “Off-axis digital holographic multiplexing for rapid wavefront acquisition and processing,” Adv. Opt. Photonics 12(3), 556–611 (2020). [CrossRef]  

5. X. Liu, B. Y. Wang, and C. S. Guo, “One-step Jones matrix polarization holography for extraction of spatially resolved Jones matrix of polarization-sensitive materials,” Opt. Lett. 39(21), 6170–6173 (2014). [CrossRef]  

6. S. Suzuki, Y. Nozaki, and H. Kimura, “High-speed holographic microscopy for fast-propagating cracks in transparent materials,” Appl. Opt. 36(28), 7224–7233 (1997). [CrossRef]  

7. Y. N. Nygate, G. Singh, I. Barnea, and N. T. Shaked, “Simultaneous off-axis multiplexed holography and regular fluorescence microscopy of biological cells,” Opt. Lett. 43(11), 2587–2590 (2018). [CrossRef]  

8. N. A. Turko, P. J. Eravuchira, I. Barnea, and N. T. Shaked, “Simultaneous three-wavelength unwrapping using external digital holographic multiplexing module,” Opt. Lett. 43(9), 1943–1946 (2018). [CrossRef]  

9. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light Field Microscopy,” in Proc. SIGGRAPH (2006), Vol. 25, pp. 924–934.

10. N. C. Pégard, H. Liu, N. Antipa, M. Gerlock, H. Adesnik, and L. Waller, “Compressive light-field microscopy for 3D neural activity recording,” Optica 3(5), 517–524 (2016). [CrossRef]  

11. H. Li, C. Guo, D. Kim-Holzapfel, W. Li, Y. Altshuller, B. Schroeder, W. Liu, Y. Meng, J. B. French, K. I. Takamaru, M. A. Frohman, and S. Jia, “Fast, volumetric live-cell imaging using high-resolution light-field microscopy,” Biomed. Opt. Express 10(1), 29–49 (2019). [CrossRef]  

12. F. Dubois, L. Joannes, and J. Legros, “Improved three-dimensional imaging with a digital holography microscope with a source of partial spatial coherence,” Appl. Opt. 38(34), 7085–7094 (1999). [CrossRef]  

13. J. H. Massig, “Digital off-axis holography with a synthetic aperture,” Opt. Lett. 27(24), 2179–2181 (2002). [CrossRef]  

14. D. J. Brady, K. Choi, D. L. Marks, R. Horisaki, and S. Lim, “Compressive Holography,” Opt. Express 17(15), 13040–13049 (2009). [CrossRef]  

15. K. Choi, R. Horisaki, J. Hahn, S. Lim, D. L. Marks, T. J. Schulz, and D. J. Brady, “Compressive holography of diffuse objects,” Appl. Opt. 49(34), H1–H10 (2010). [CrossRef]  

16. C. F. Cull, D. A. Wikner, J. N. Mait, M. Mattheiss, and D. J. Brady, “Millimeter-wave compressive holography,” Appl. Opt. 49(19), E67–E82 (2010). [CrossRef]  

17. J. B. Pawley, Handbook of Biological Confocal Microscopy, 2nd ed. (Plenum Press, 1995).

18. D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991). [CrossRef]  

19. A. M. Rollins, M. D. Kulkarni, S. Yazdanfar, R. Ung-arunyawee, and J. A. Izatt, “In vivo video rate optical coherence tomography,” Opt. Express 3(6), 219–229 (1998). [CrossRef]  

20. A. Ahmad, T. Mahanty, V. Dubey, A. Butola, B. S. Ahluwalia, and D. S. Mehta, “Effect on the longitudinal coherence properties of a pseudothermal light source as a function of source size and temporal coherence,” Opt. Lett. 44(7), 1817–1820 (2019). [CrossRef]  

21. N. H. Abramson, H. I. Bjelkhagen, and H. J. Caulfield, “The ABCs of Space-time-coherence Recording in Holography,” J. Mod. Opt. 38(7), 1399–1406 (1991). [CrossRef]  

22. G. Dardikman, N. A. Turko, N. Nativ, S. K. Mirsky, and N. T. Shaked, “Optimal spatial bandwidth capacity in multiplexed off-axis holography for rapid quantitative phase reconstruction and visualization,” Opt. Express 25(26), 33400–33415 (2017). [CrossRef]  

23. I. Yamaguchi and T. Zhang, “Phase-shifting digital holography,” Opt. Lett. 22(16), 1268–1270 (1997). [CrossRef]  

24. P. Müller, M. Schürmann, and J. Guck, “The Theory of Diffraction Tomography,” arXiv: 1507.00466v3 [q-bio.QM] (2016).

25. Y. Sung, W. Choi, C. Fang-Yen, K. Badizadegan, R. R. Dasari, and M. S. Feld, “Optical diffraction tomography for high resolution live cell imaging,” Opt. Express 17(1), 266–277 (2009). [CrossRef]  


Supplementary Material (7)

Visualization 1
Visualization 2
Visualization 3
Visualization 4
Visualization 5
Visualization 6
Visualization 7


Figures (11)

Fig. 1. Comparison of the spatial frequency domains of: (a) standard off-axis holography, and (b) 6PH. CC, cross-correlation term; CC*, complex conjugate of corresponding CC term; DC, auto-correlation term.
Fig. 2. Diagrams illustrating the OOF object rejection achieved by the proposed averaging method implemented with 6PH. (a) Diagram showing the relation between the off-axis illumination angle and the lateral shift magnitude. (b) – (g) Illustrations of the six perspective-view images obtained with six illumination angles around the optical axis, for the case where an OOF cell is directly above an in-focus cell. The central object represents the in-focus cell, and the other objects represent the same single OOF cell as seen from the six perspectives. (h) The averaged image produced from (b) – (g), where the OOF object appears in six different places, shifted away from the in-focus object. (i) Expected image for a single on-axis perspective (θ = 0°), where the OOF object occludes the in-focus object.
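To make the geometry in Fig. 2 concrete, the following is a minimal sketch, not the authors' code, of the two steps the figure illustrates: the approximate lateral shift of an object defocused by z when viewed under a plane wave tilted by θ in the sample medium (shift ≈ z·tan θ), and the pixel-wise averaging of the six perspective-view amplitude images. The function names and the simple projection model are illustrative assumptions.

```python
import numpy as np

def lateral_shift_um(z_um, theta_deg):
    # Approximate lateral displacement (µm) of an object defocused by z_um when
    # projected onto the focal plane under illumination tilted by theta_deg
    # (angle taken in the sample medium); simple geometric projection model.
    return z_um * np.tan(np.radians(theta_deg))

def average_views(amplitude_stack):
    # Pixel-wise mean of the six perspective-view amplitude images, shaped (6, H, W).
    # The in-focus object overlaps in all six views, while each shifted OOF copy
    # contributes only ~1/6 of its original contrast.
    return np.asarray(amplitude_stack).mean(axis=0)

# Example with the values of Fig. 3: a sphere defocused by 5.1 µm and
# illuminated at 23.4° is displaced by roughly 2.2 µm in each view.
print(lateral_shift_um(5.1, 23.4))
```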
Fig. 3. Simulation images for a single sphere and an OOF sphere above a pattern of lines. (a) 3D representation of the single sphere model. (b) Amplitude image of an in-focus sphere as reconstructed from a single-channel hologram. (c) Amplitude image of the sphere from (a) at a z-distance of 5.1 µm, when illuminating the sample at an off-axis angle of 23.4°, as reconstructed from a single-channel hologram. (d) Averaged image produced from (c) and the five other amplitude images using the same six illumination angles as in the optical system presented later, as reconstructed from a six-pack hologram. Scale bar in (d) applies to (b) – (d). (e) 3D representation of the USAF target model, an in-focus line pattern with an OOF sphere. (f) Amplitude image of (e) using on-axis illumination, as reconstructed from a single-channel hologram. (g) Averaged image of (e) using six amplitude images as reconstructed from a six-pack hologram, with the same six illumination angles as in the optical system presented later. (h) Synthetic aperture produced using the same six complex wave fronts used to create (g). Circles indicate the six different spatial frequency bands. (i) Synthetic aperture amplitude image produced using the synthetic aperture from (h). Scale bar in (i) applies also to (f), (g).
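Fig. 3(h)–(i) also uses the six complex wave fronts to build a synthetic aperture. The sketch below illustrates the general idea under simple assumptions: each field's spectrum is pasted into an enlarged frequency plane at an offset given by its illumination carrier frequency, overlapping bands are averaged, and the stitched spectrum is inverse transformed. The function name, the padding factor, and the overlap handling are assumptions, not the paper's implementation.

```python
import numpy as np

def synthesize_aperture(fields, carrier_freqs_px, pad=2):
    # fields: list of six (H, W) complex wave fronts, one per illumination angle.
    # carrier_freqs_px: list of (fy, fx) offsets, in pixels of the padded spectrum,
    # at which each band sits relative to the spectrum center.
    H, W = fields[0].shape
    synth = np.zeros((pad * H, pad * W), dtype=complex)
    count = np.zeros((pad * H, pad * W))
    for field, (fy, fx) in zip(fields, carrier_freqs_px):
        spec = np.fft.fftshift(np.fft.fft2(field))
        y0 = pad * H // 2 - H // 2 + int(round(fy))
        x0 = pad * W // 2 - W // 2 + int(round(fx))
        synth[y0:y0 + H, x0:x0 + W] += spec
        count[y0:y0 + H, x0:x0 + W] += 1
    synth[count > 0] /= count[count > 0]            # average overlapping bands
    return np.fft.ifft2(np.fft.ifftshift(synth))    # enlarged-aperture complex image
```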
Fig. 4. Simulation images for a biological cell model. (a) 3D representation of the cell model. (b) Single-channel amplitude image of the cell model using on-axis illumination. (c) Averaged image of (a) using the same six illumination angles as in the optical system presented later. (d) Averaged image of (a) with six off-axis illumination angles set to 45°.
Fig. 5. Simulation of applying 6PH sectioning for a 3D scene containing absorptive spheres distributed at high concentrations. (a) 182.89 pM concentration. (b) 118.31 pM concentration. (c) 54.19 pM concentration. (d) 11.70 pM concentration. (a – d) Left: 3D representation of the model. Middle: Single-channel (regular holography) amplitude image. Right: Averaged image demonstrating the sectioning capability of 6PH. (e) Graph of the in-focus object visibility in the averaged image as a function of concentration.
Fig. 6. Simulation of applying 6PH sectioning to OPD images, with sample aligned in the xy plane. (a) 3D representation of the cell model. (b) On-axis illumination OPD image. (c) – (h) The six different off-axis illumination OPD images using illumination angles as in the proposed system. (i) Averaged OPD image produced from images (c) – (h). (j) Tomographically reconstructed OPD image produced from images (c) – (h). Scale bar and color bar in (j) apply to all figures except (a).
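Figs. 6 and 7 show optical path delay (OPD) images computed from the reconstructed complex wave fronts. As a reminder of that conversion, the sketch below maps the unwrapped phase to OPD through OPD = λΔφ/(2π); the default wavelength value and the use of scikit-image's 2D phase unwrapping are illustrative assumptions.

```python
import numpy as np
from skimage.restoration import unwrap_phase  # 2D phase unwrapping

def opd_map_nm(complex_field, wavelength_nm=633.0):
    # Convert a reconstructed complex wave front to an OPD image in nanometers,
    # assuming any background (no-sample) phase has already been subtracted.
    phase_rad = unwrap_phase(np.angle(complex_field))
    return wavelength_nm * phase_rad / (2.0 * np.pi)
```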
Fig. 7. Simulation of applying 6PH sectioning to OPD images, with sample aligned in the yz plane. (a) 3D representation of the cell model. (b) On-axis illumination OPD image. (c) – (h) The six different off-axis illumination OPD images using illumination angles as in the proposed system. (i) Averaged OPD image produced from images (c) – (h). (j) Tomographically reconstructed OPD image produced from images (c) – (h). Scale bar and color bar in (j) apply to all figures except (a).
Fig. 8. 6PH system for complex wave front imaging with rejection of OOF objects. (a) The optical microscopy system. LC, low-coherence supercontinuum laser source; DBS, diffractive beam splitter; L1 – L14, lenses; BS1, BS2, beam splitters; P1, P2, periscopes; PDP, phase delay plate; S, sample; MO, microscope objective; M1, M2, mirrors; ND, neutral density filter; C, digital camera. Red line indicates the optical axis. (b) Diagram of the sample and reference beams chosen, and their positions relative to the optical axis before the PDPs. S1 – S6, sample beam positions before L4 in the sample arm; R1 – R2, reference beam positions before L10 in the reference arm. The optical axis is normal to the image and marked by ⊗. Each square of the grid is 1.36 mm × 1.36 mm in size. (c) The corresponding sample and reference PDP structures, ensuring that non-matching beam pairs will not interfere with each other due to coherence gating.
Fig. 9. Dynamic 6PH experimental results of swimming microalgae. (a) Six-pack hologram, inset shows fringes magnified ten times. (b) Spatial frequency power spectrum of (a). (c) – (h) The amplitude images produced from each of the six holographic channels contained in (a). (i) Averaged image generated from images (c) – (h). Scale bar in (i) applies to all images except (b).
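Fig. 9(b)–(i) summarizes the reconstruction path from a six-pack hologram: Fourier transform the hologram, crop each of the six CC terms, inverse transform each crop to obtain a complex field, and average the six amplitude images. A minimal sketch of this path is given below, assuming the CC-term centers and crop radius are known and the six channels are already co-registered; it is not the authors' processing code.

```python
import numpy as np

def reconstruct_channel(hologram, cc_center, radius):
    # Reconstruct one complex wave front from a multiplexed off-axis hologram
    # by cropping a single cross-correlation (CC) term out of its spectrum.
    spec = np.fft.fftshift(np.fft.fft2(hologram))
    r, c = cc_center
    crop = spec[r - radius:r + radius, c - radius:c + radius]
    return np.fft.ifft2(np.fft.ifftshift(crop))     # complex field of that channel

def averaged_amplitude(hologram, cc_centers, radius):
    # Pixel-wise average of the six amplitude images reconstructed from the
    # six CC terms of a six-pack hologram (channels assumed co-registered).
    amps = [np.abs(reconstruct_channel(hologram, c, radius)) for c in cc_centers]
    return np.mean(amps, axis=0)
```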
Fig. 10. In-focus and numerically refocused images reconstructed from the 6PH data of the swimming microalgae shown in Fig. 9. (a) Amplitude image from a single channel. (b) Averaged image produced from (a) and the five other amplitude images. (c), (d) Numerically refocused versions of (a) and (b), respectively.
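The numerical refocusing used for Fig. 10(c)–(d) can be carried out by angular-spectrum (Fresnel) propagation of the reconstructed complex field. Below is a minimal sketch of such a propagator; the wavelength and the effective pixel size in the sample plane must be supplied by the user, and the implementation details are assumptions rather than the paper's exact routine.

```python
import numpy as np

def refocus(field, dz_um, wavelength_um, pixel_um):
    # Propagate a complex field by dz_um using the angular-spectrum method.
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_um)             # spatial frequencies (1/µm)
    fy = np.fft.fftfreq(ny, d=pixel_um)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength_um**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * dz_um * kz) * (arg > 0)         # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)
```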
Fig. 11. Dynamic 6PH experimental results of 12 µm polystyrene beads flowing in water. (a) Cropped six-pack hologram. Inset shows fringes magnified five times. (b) Spatial frequency power spectrum of the six-pack hologram. (c) Single-channel amplitude image of beads in water, with one OOF bead occluding an in-focus bead, reconstructed from a single CC term from (b). (d) Averaged image as obtained from the six parallel imaging channels.
