Objects imaged through thin scattering media can be reconstructed with knowledge of the complex transmission function of the diffuser. We demonstrate image reconstruction of static and dynamic objects with numerical phase conjugation in a lensless setup. Data is acquired by single-shot intensity capture of an object coherently illuminated and obscured by an inhomogeneous medium, i.e. light diffracted at a specimen is scattered by a polycarbonate diffuser and the resulting speckle field is recorded. As a preparation step, which has to be performed only once before imaging, the complex speckle field diffracted by the diffuser to the camera chip is measured interferometrically, which allows reconstruction of the transmission function of the diffuser. After insertion of the specimen, the speckle field in the camera plane changes, and the complex field of the sample can be reconstructed from the new intensity distribution. After the initial interferometric measurement of the diffuser field, the method is robust with respect to a subsequent misalignment of the diffuser. The method can be extended to image objects placed between a pair of thin scattering plates. Since the object information is contained in a single speckle intensity pattern, it is possible to image dynamic processes at video rate.
© 2014 Optical Society of America
Information of a light field scattered by a thin random medium is not lost, but instead is coded in the complex amplitude of the scattered light field. For example, an object obscured by a scattering medium can be reconstructed by reversing the scattering process holographically [1–4]. Since the advent of deformable mirror devices, even time-dependent wavefront distortions, as introduced by atmospheric inhomogeneities impeding astronomical observations, can be corrected in real time [5]. Today, spatial light modulators (SLMs) enable focusing of light through nearly opaque diffusive media [6–8]. Large scattering angles of the diffuser even enable a reduced spot size, allowing imaging below the resolution limit by scanning the focus [9] and exploiting the memory effect [10]. In [11] an optimized SLM was used to image a millimeter-sized object after reflection off paper and through a thin polycarbonate diffuser with thermal light. A faster algorithm was presented in [12], reducing SLM optimization times from minutes to milliseconds. In an alternative SLM-based method, the phase distribution of second harmonic radiation of the scatterer is measured via off-axis holography, and the calculated conjugated phase distribution is displayed on the SLM to reverse the scattering process [13]. In [14] imaging through a scatterer was reported by sequentially measuring its transmission matrix with an SLM operated as a common path interferometer. A non-invasive approach to image fluorescent structures hidden by a strong scatterer (with no access to the space behind the scatterer) is demonstrated in [15]. A holographic approach was reported where an off-axis hologram of an object is recovered by lateral movement of the diffuser and imaged onto a sensor with a lens [16]. Recently a method was presented that provides imaging through turbid media using digital holography [17].
Here we demonstrate a lensless single-shot technique for the reconstruction of objects obscured by a thin scattering medium. For this purpose the complex field of a coherent beam passing through the diffuser is first measured interferometrically in a camera plane located a few centimeters behind the diffuser. When the distorted intensity distribution of an object placed behind the diffuser is captured, the previously measured phase information of the undisturbed speckle field (without an object inserted) enables numerical reversal of the scattering process. After the initial calibration this can be done at video rate. We show that even potential misalignments, such as a lateral shift of the diffuser plate, can be tracked and compensated by numerical re-calibration. Furthermore, imaging of a sample embedded between two diffuser plates is also possible with a modified reconstruction procedure.
2. Object reconstruction
A monochromatic wave in a plane (x, y) perpendicular to the direction of propagation z can be described as U(x, y) = A(x, y)exp[iϕ(x, y)], where A(x, y) and ϕ(x, y) are amplitude and phase distribution, respectively, with an intensity distribution I(x, y) = |U(x, y)|2. Scattering of such a wave when passing through a non-absorbing diffusive medium is caused by refractive index variations n(x, y) in the material, which change the phase distribution of the light field and locally alter the direction of propagation. A laser beam will evolve into a complex speckle field.
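The evolution of a laser beam into a speckle field behind a thin phase screen can be illustrated with FFT-based angular-spectrum propagation. The sketch below is a simplified model with illustrative parameters (a per-pixel uniform random phase as a crude stand-in for a real diffuser), not a description of the actual experiment:

```python
import numpy as np

def angular_spectrum(u, wl, dx, z):
    """Propagate a square complex field u by distance z (angular spectrum)."""
    fx = np.fft.fftfreq(u.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    # non-propagating (evanescent) components get kz = 0; none occur here
    kz = 2*np.pi*np.sqrt(np.maximum(0.0, 1.0/wl**2 - FX**2 - FY**2))
    return np.fft.ifft2(np.fft.fft2(u)*np.exp(1j*kz*z))

rng = np.random.default_rng(0)
n, dx, wl = 256, 9e-6, 632.8e-9          # grid, 9 um pixel pitch, HeNe wavelength
phase = rng.uniform(0, 2*np.pi, (n, n))  # random phase imprinted by the screen
u_sensor = angular_spectrum(np.exp(1j*phase), wl, dx, 0.07)  # 7 cm behind it
speckle = np.abs(u_sensor)**2
# fully developed speckle has a contrast (std/mean of the intensity) close to 1
print(round(float(speckle.std()/speckle.mean()), 2))
```

Since the transfer function has unit modulus for all propagating components, the total intensity is conserved while the initially uniform intensity develops into a high-contrast speckle pattern.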
Here we consider the situation where the object is placed behind a thin polycarbonate diffuser and illuminated with coherent light, so that the light diffracted at the object is scattered into a speckle pattern when passing through the diffuser. Recovery of the object field is then achieved using the principles of digital inline holography and digital phase conjugation.
The method is based on measuring the refractive index distribution of the diffusive medium itself prior to measuring the inline hologram of the sample through the scatterer. For this purpose we apply phase stepping interferometry (PSI) to measure the phase distribution of the speckle field caused by the diffuser. For a time-independent diffuser this interferometric measurement has to be performed only a single time before data recording, allowing reconstruction of the object field from single shot measurements, which enables image acquisition at video rate. If the diffuser moves with respect to the camera, e.g. by a gradual loss of adjustment due to vibrations or thermal effects, it is possible to numerically re-adjust the setup, without requiring a new interferometric calibration.
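For N ≥ 3 equidistant phase steps δk = 2πk/N, the interferograms Ik = |U + exp(−iδk)|² (unit-amplitude reference assumed) determine the complex field as U = (1/N) Σk Ik exp(−iδk), since the sums of exp(−iδk) and exp(−2iδk) over a full period vanish. A minimal sketch with hypothetical function names:

```python
import numpy as np

def phase_step_reconstruct(frames, steps):
    """Recover the complex field from N >= 3 equidistant phase-stepped
    interferograms, frames[k] = |U + exp(-i*steps[k])|^2 (unit reference):
    U = (1/N) * sum_k I_k * exp(-i*delta_k)."""
    acc = np.zeros_like(frames[0], dtype=complex)
    for I, d in zip(frames, steps):
        acc += I*np.exp(-1j*d)
    return acc/len(frames)

# synthetic speckle-like field and its six-step interferograms
rng = np.random.default_rng(1)
U = rng.uniform(0.2, 1.0, (64, 64))*np.exp(1j*rng.uniform(0, 2*np.pi, (64, 64)))
steps = [2*np.pi*k/6 for k in range(6)]           # six equidistant steps
frames = [np.abs(U + np.exp(-1j*d))**2 for d in steps]
U_rec = phase_step_reconstruct(frames, steps)
print(np.allclose(U_rec, U))  # True
```

In the noise-free synthetic case the recovery is exact; with measurement noise, more steps average it down, which may be why six steps were used for the calibration described below.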
The method can also be extended to a scenario where an object is placed between a pair of diffusers. In this case two interferometric measurements need to be performed prior to data acquisition in order to enable object reconstruction, which afterwards can again be done with single shot camera exposures.
2.1. Image formation and reconstruction of an object obscured by a diffuser
Figure 1 shows the transverse planes relevant for object reconstruction, which are distinguished with primes and double primes. In the following the formation of the image in the sensor plane is described. The illuminating monochromatic plane wave is partially scattered by the object and further propagates to the diffuser. The complex amplitude of the object field in the diffuser plane takes on the form
U′obj(x, y) = A′obj(x, y) exp[iϕ′obj(x, y)]. (1)
For the field in the camera plane it is assumed that the light scattered by the object passes through the diffuser and adds to the field caused by the diffuser alone (without inserted object), i.e. in this case the object is assumed to create only a small perturbation of the total field (this is the typical assumption of inline holography), and the total field in the sensor plane is A″d exp(iϕ″d) + A″obj exp(iϕ″obj). The intensity distribution in the sensor plane is then given by
I″CCD = A″d² + A″obj² + 2A″d A″obj cos(ϕ″d − ϕ″obj). (4)
Subtracting the diffuser intensity A″d², multiplying Eq. (4) with exp(iϕ″d)/A″d, and neglecting A″obj², which is assumed to be much smaller than A″d², yields an expression for the object complex amplitude through the diffuser in the sensor plane,
A″obj exp(iϕ″obj) = (I″CCD − A″d²) exp(iϕ″d)/A″d − A″obj exp[i(2ϕ″d − ϕ″obj)]. (5)
The second term, denoted as “noise”, actually corresponds to the minus first diffraction order of the inline hologram, which, after back-propagation to the sample plane, will result in a diffuse background, due to the fact that it includes the phase term exp[i(2ϕ″d − ϕ″obj)], where ϕ″d is the (random) phase of the diffuser. This phase term will not be compensated by the following phase conjugation step, but the resulting field in the sample plane will be just a diffuse speckle pattern of low intensity, which is superposed on the reconstructed object field. For the following calculations this field contribution will thus be neglected.
The subtraction of the diffuser intensity distribution in the first term of Eq. (5) can be interpreted as a pseudo darkfield method, resulting in images with a suppressed background. The denominator A″d of this term is typically neglected (i.e. set to unity), in order to avoid artefacts caused by zeros in this term.
If the “noise” term is neglected, all factors on the right hand side of Eq. (5) are known: amplitude and phase distribution of the diffuser’s speckle pattern in the sensor plane, A″d and ϕ″d, respectively, are determined by the initial interferometric calibration measurement, and I″CCD is the single-shot intensity capture of the object through the diffuser. For reconstruction, the right hand side of Eq. (5) is numerically propagated to the diffuser plane with standard FFT-based scalar diffraction propagation [18],
U′(x, y) = 𝒫s→d{(I″CCD − A″d²) exp(iϕ″d)/A″d}. (6)
Recovering the object field of Eq. (1) then requires compensating the (known) phase disturbance of the diffuser, exp(iϕ′d), which is done by numerical phase conjugation, i.e. by multiplication with the complex conjugate distribution, exp(−iϕ′d). The phase disturbance introduced by the diffuser is known from the previous interferometric calibration measurement, i.e. it can be obtained by propagating the initially measured complex amplitude A″d exp[iϕ″d] from the camera plane to the diffuser plane, according to
A′d exp(iϕ′d) = 𝒫s→d{A″d exp(iϕ″d)}. (7)
Further propagation to the object plane then yields
Uobj(x, y) = 𝒫d→obj{U′(x, y) exp(−iϕ′d)}. (8)
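The reconstruction steps of section 2.1 (subtract the calibrated diffuser intensity, attach the calibrated phase, back-propagate to the diffuser plane, phase-conjugate, propagate to the object plane) can be checked in a small end-to-end simulation. Everything below is an illustrative model with assumed parameters and function names, not the authors' processing code; the denominator A″d is set to unity, as described above:

```python
import numpy as np

def propagate(u, wl, dx, z):
    """Angular-spectrum propagation over distance z (z < 0: backwards)."""
    fx = np.fft.fftfreq(u.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz = 2*np.pi*np.sqrt(np.maximum(0.0, 1.0/wl**2 - FX**2 - FY**2))
    return np.fft.ifft2(np.fft.fft2(u)*np.exp(1j*kz*z))

n, dx, wl = 256, 9e-6, 632.8e-9       # grid, pixel pitch, HeNe wavelength
z_od, z_ds = 0.02, 0.05               # object->diffuser, diffuser->sensor
rng = np.random.default_rng(3)
phi_d = rng.uniform(0, 2*np.pi, (n, n))          # diffuser phase screen

# forward model: weakly absorbing disk, then diffuser, then sensor
y, x = np.indices((n, n)) - n//2
mask = (x**2 + y**2) < 30**2
t_obj = 1.0 - 0.3*mask                           # object transmission
U_cam = propagate(propagate(t_obj, wl, dx, z_od)*np.exp(1j*phi_d), wl, dx, z_ds)
I_ccd = np.abs(U_cam)**2                         # single-shot capture

# calibration: diffuser field alone, known interferometrically
U_cal = propagate(np.exp(1j*phi_d), wl, dx, z_ds)
A_d, p_d = np.abs(U_cal), np.angle(U_cal)

# reconstruction: extract, back-propagate, phase-conjugate, refocus
holo = (I_ccd - A_d**2)*np.exp(1j*p_d)           # denominator A_d set to unity
u_diff = propagate(holo, wl, dx, -z_ds)          # back to the diffuser plane
u_diff *= np.exp(-1j*np.angle(propagate(U_cal, wl, dx, -z_ds)))  # conjugation
rec = np.abs(propagate(u_diff, wl, dx, -z_od))   # back to the object plane
print(rec[mask].mean() > 2*rec[~mask].mean())    # disk beats the background
```

The absorbing disk reappears bright on a dim background: the twin-image term carries the random diffuser phase and is spread diffusely over the whole object plane, consistent with the discussion of the “noise” term above.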
2.2. Reconstruction of an object placed between two diffusers
The image reconstruction routine can be extended to a scenario where an object is placed between a pair of diffusers in tandem. In this case two interferometric measurements are needed prior to object data acquisition: one for measuring the complex amplitude of light scattered by the first diffuser, d1, which is closest to the digital sensor, A″d1 exp[iϕ″d1], and one for measuring the complex amplitude of light scattered by both diffusers, A″d2 exp[iϕ″d2]. Reconstruction is then performed similarly to the previous situation, using
A″obj exp(iϕ″obj) ≈ (I″CCD − A″d2²) exp(iϕ″d2)/A″d2, (9)
followed by back-propagation to the plane of diffuser d1, numerical phase conjugation with exp(−iϕ′d1), and further propagation to the object plane.
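The two-diffuser procedure of this section can also be sketched numerically. All distances, parameters, and function names below are illustrative assumptions (in particular the distance between diffuser 2 and the object, which is not specified); the normalization by the reconstructed illumination speckle, with its minima regularized, follows the variant discussed in the experimental section:

```python
import numpy as np

def propagate(u, wl, dx, z):
    """Angular-spectrum propagation over distance z (z < 0: backwards)."""
    fx = np.fft.fftfreq(u.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz = 2*np.pi*np.sqrt(np.maximum(0.0, 1.0/wl**2 - FX**2 - FY**2))
    return np.fft.ifft2(np.fft.fft2(u)*np.exp(1j*kz*z))

n, dx, wl = 256, 9e-6, 632.8e-9
z_2o, z_o1, z_1s = 0.02, 0.02, 0.05   # d2->object, object->d1, d1->sensor
rng = np.random.default_rng(5)
phi1 = rng.uniform(0, 2*np.pi, (n, n))           # diffuser d1 (sensor side)
phi2 = rng.uniform(0, 2*np.pi, (n, n))           # diffuser d2 (source side)

y, x = np.indices((n, n)) - n//2
t_obj = 1.0 - 0.3*((x**2 + y**2) < 30**2)        # object between the diffusers
mask = t_obj < 1.0

# forward model: plane wave -> d2 -> object -> d1 -> sensor
U_ill = propagate(np.exp(1j*phi2), wl, dx, z_2o)           # speckle illumination
U_cam = propagate(propagate(U_ill*t_obj, wl, dx, z_o1)*np.exp(1j*phi1),
                  wl, dx, z_1s)
I_ccd = np.abs(U_cam)**2

# calibrations: d1 alone, and both diffusers without the object
U_cal1 = propagate(np.exp(1j*phi1), wl, dx, z_1s)
U_cal2 = propagate(propagate(np.exp(1j*phi2), wl, dx, z_2o + z_o1)
                   * np.exp(1j*phi1), wl, dx, z_1s)
A2, p2 = np.abs(U_cal2), np.angle(U_cal2)

# reconstruction: extract object term against the two-diffuser reference field
conj1 = np.exp(-1j*np.angle(propagate(U_cal1, wl, dx, -z_1s)))  # conjugate d1
holo = (I_ccd - A2**2)*np.exp(1j*p2)             # denominator set to unity
rec = propagate(propagate(holo, wl, dx, -z_1s)*conj1, wl, dx, -z_o1)

# normalize by the reconstructed illumination speckle (minima regularized)
U_ill_rec = propagate(propagate(U_cal2, wl, dx, -z_1s)*conj1, wl, dx, -z_o1)
rec_norm = np.abs(rec)/np.maximum(np.abs(U_ill_rec), 0.5*np.abs(U_ill_rec).mean())
print(rec_norm[mask].mean() > rec_norm[~mask].mean())
```

Clipping the illumination amplitude at a fraction of its mean before the division is one way to limit the artefacts that the speckle minima would otherwise cause.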
3. Experimental results
The experiments were performed with an expanded 632.8 nm HeNe laser, split into a probe and a reference beam and recombined in a Mach-Zehnder interferometer. Figure 2(a) shows an illustration of the scenario where an object is illuminated in transmission with coherent light and obscured by a diffuser (Edmund Optics holographic diffuser, 1°). The intensity distribution of the resulting speckle field is captured on a 14-bit CCD sensor (pco.4000s) with 4008×2672 pixels and a pixel size of 9×9 μm². The distances from object to diffuser and from object to digital sensor are 2 cm and 7 cm, respectively. Prior to object data acquisition, phase stepping interferometry [19] is applied to measure amplitude and phase distribution of the speckle field caused by the diffuser (without an object inserted). For this, the reference wave of the Mach-Zehnder interferometer, recombined with the probe beam by a beam splitter cube, is shifted by a minimum of three equidistant phase steps in the interval [0, 2π] by changing the length of the reference beam path with a piezo crystal translation stage. Figure 2(b) provides an impression of the scattering strength of the diffuser, showing a conventional white light photograph (Canon EOS 550d with zoom objective) of an obscured insect.
For initial calibration of the holographic setup, the phase distribution ϕ″d of the expanded laser beam scattered at the diffuser (without an inserted object) was determined with phase shifting interferometry using 6 equidistant phase steps. Afterwards, object information is captured in single-shot images with an exposure time of 1.2 ms. In Fig. 3 the performance of the described method is compared to standard digital inline holography without (a) and with (b) a diffuser inserted in the optical path. For reasons of comparability the inline hologram is inverted prior to propagation to the object plane, resulting in a background-subtracted image with no loss of contrast. Note that the reconstructed image in (a) shows fringes surrounding the object, due to the fact that both the image of the object and a twin image, caused by the minus first diffraction order, are simultaneously reconstructed. Figure 3(c) shows a reconstructed image of the same specimen obscured by a diffuser, obtained by means of the numerical phase conjugation procedure described in section 2.1. As expected from the discussion in section 2.1, no twin image is visible here. Instead, the light corresponding to the minus first diffraction order of the hologram is distributed as a uniform background noise of low intensity [20]. The amplitude distribution shown in (d) is reconstructed at a plane 12 mm out of focus, demonstrating the ability to focus at a different transverse plane of interest. A comparison of the image contrast is given in Fig. 3(e) as one-dimensional amplitude profiles along the blue, red and green lines in Figs. 3(a), 3(c), and 3(d), respectively. The colored squares 1 and 2 indicate the areas of 70×70 pixels in which the mean signals μ̄max and μ̄min were computed to evaluate the averaged contrasts C = (μ̄max − μ̄min)/(μ̄max + μ̄min).
The averaged contrasts were 0.28 for direct inline holography (with no diffuser inserted), and 0.46 and 0.45 for in-focus and out-of-focus imaging through the thin diffuser, respectively.
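The averaged contrast C = (μ̄max − μ̄min)/(μ̄max + μ̄min) between two 70×70 pixel regions can be computed in a few lines; the function name, test image, and ROI corners below are illustrative assumptions:

```python
import numpy as np

def averaged_contrast(img, roi_max, roi_min, size=70):
    """C = (mu_max - mu_min)/(mu_max + mu_min) between two size x size
    regions given by their (row, col) top-left corners."""
    (r1, c1), (r2, c2) = roi_max, roi_min
    mu_max = img[r1:r1 + size, c1:c1 + size].mean()
    mu_min = img[r2:r2 + size, c2:c2 + size].mean()
    return (mu_max - mu_min)/(mu_max + mu_min)

img = np.zeros((200, 200))
img[:, :100] = 1.0                                  # bright half, dark half
print(averaged_contrast(img, (60, 10), (60, 120)))  # -> 1.0
```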
Experimentally it turns out that the performance crucially depends on using the proper axial propagation distance in the operator 𝒫s→d in Eqs. (6) and (7), i.e. the distance between sensor and diffuser. Since a single image acquisition is sufficient for object reconstruction, the method enables capture of dynamic objects, with the capability for imaging in real time. An enclosed video (Media 1) shows a series of reconstructed images of an obscured moving insect at the frame rate of the digital sensor (1 s), starting with a visualization of the reconstruction routine of the first frame.
In a further experiment we demonstrate that an object can still be reconstructed when the diffuser is shifted in the lateral direction. In this case the previously measured complex amplitude of the speckle field (caused by the diffuser only) is numerically shifted in the computer to compensate for the diffuser’s misalignment before reconstruction starting from Eq. (5) is applied. Figure 4(a) shows the reconstructed image of an insect before displacement of the diffuser. In Fig. 4(b) the diffuser has been shifted laterally by 10 μm, and the object is reconstructed without accounting for this, resulting in a loss of image contrast. In Fig. 4(c) the shift of the diffuser was numerically compensated by also shifting the initial calibration image by the same distance before performing the described reconstruction procedure. Now the original image information can be reconstructed with the same quality as before. The numerical compensation of the diffuser displacement is possible as long as the corresponding speckle field in the camera plane only marginally changes its structure when being shifted.
Assuming a more general situation where the amount of displacement is unknown, the required shift of the reference complex amplitude can be determined with an automated algorithm, where the two-dimensional cross-correlation C between the original speckle intensity distribution, Iref, and the intensity distribution after misalignment of the diffuser, Ishift, is maximized,
C(Δx, Δy) = Σx,y Iref(x, y) Ishift(x + Δx, y + Δy). (10)
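The shift maximizing the cross-correlation of two speckle intensity patterns can be found efficiently via the FFT correlation theorem. The sketch below (illustrative function name and synthetic patterns; integer-pixel shifts and periodic boundaries assumed) recovers a known displacement:

```python
import numpy as np

def find_shift(I_ref, I_shift):
    """Integer-pixel shift maximizing the circular cross-correlation of two
    intensity patterns, computed via FFT (correlation theorem)."""
    c = np.fft.ifft2(np.fft.fft2(I_ref).conj()*np.fft.fft2(I_shift)).real
    idx = np.unravel_index(np.argmax(c), c.shape)
    # map indices above n/2 to negative shifts
    return tuple(i - n if i > n//2 else i for i, n in zip(idx, c.shape))

rng = np.random.default_rng(4)
speckle = rng.random((128, 128))
shifted = np.roll(speckle, (5, -3), axis=(0, 1))   # known displacement
print(find_shift(speckle, shifted))  # (5, -3)
```

In an experiment the correlation peak would be evaluated on the measured reference and misaligned speckle images, and the calibration field shifted accordingly before reconstruction.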
For lateral shifts larger than 10 μm, object reconstruction starting from Eq. (5) leads to a decrease in image contrast and eventually becomes impossible for shifts larger than 50 μm, because the intensity distributions of reference and object speckle field no longer resemble each other well enough. In this case the object complex amplitude in the sensor plane is approximated by √(I″CCD) exp(iϕ″d), i.e. the first term of Eq. (5) is replaced by the product of the square root of the measured intensity distribution, √(I″CCD), and the shifted phase function of the reference field. This results in a “bright-field image” of the sample, which is, however, less sensitive to the diffuser alignment. Figure 4(d) shows the reconstruction of an object with the disordered medium horizontally displaced by d = 0.2 mm. The reference complex amplitude was shifted accordingly. Although here the advantage of background subtraction cannot be exploited, the image is clearly recognizable in the “bright-field mode”. Note that for a normal interferometric imaging method, compensation of a misalignment on the order of 0.2 mm, corresponding to about 300 wavelengths, would be a challenge.
3.1. Imaging inside a pair of thin diffusers
For imaging an object situated between two diffusers the difficulty arises that the object and one of the diffusers are illuminated with a disordered wavefront, see Fig. 5(a). In this case it is necessary to measure the phase distributions of the light field scattered by diffuser one, and by both diffusers prior to object data acquisition. The corresponding setup is sketched in Fig. 5(a).
Figure 5(b) shows a conventional photograph (Canon EOS 550d with zoom objective) of an insect placed between the two diffusers (Edmund Optics holographic diffusers, 1°), focused at the position of the object. Obviously the object cannot be resolved by standard photography. The results of digital holographic approaches for image reconstruction are shown in Fig. 6. In these cases the camera objective is removed, i.e. recording is done without a lens. The first approach, displayed in (a), uses the concept of inline holography, i.e. the speckle pattern recorded in the camera plane is numerically propagated into the object plane. Obviously this approach does not work, due to the wavefront distortions introduced by the diffusers. The second approach, displayed in (b), uses the method described in section 2.2, i.e. it starts from Eq. (9) and performs back-propagation under consideration of the calibrated scattered fields of the two diffusers, including numerical phase conjugation to compensate for the phase contributions of diffuser 1. In this case one obtains a sharp image of the object, illuminated by the speckle field of diffuser 2, which introduces some artefacts. These disturbances can be minimized in a further step by normalizing the image with the speckle intensity distribution which is created by diffuser 2 in the sample plane. For this purpose the complex amplitude of the light field passing both diffusers, A″d2 exp(iϕ″d2), is propagated from the camera plane to the plane of diffuser 1, where its (known) phase distribution ϕ′d1 is subtracted. The obtained complex amplitude is further propagated to the object plane, where it thus corresponds to the amplitude of the speckle field which illuminates the sample. Division of the reconstructed object amplitude (i.e. the result displayed in (b)) by the reconstructed speckle illumination amplitude yields the normalized amplitude image shown in Fig. 6(c).
There the resolution is slightly enhanced with respect to (b), but at the cost of an increased background, which is due to the fact that the reconstructed speckle field contains regions of low intensity (the minima of the speckle field), which create artefacts when they are used for image normalization. An enclosed video (Media 2) shows a series of reconstructed images of two moving insects inside a pair of diffusers at the frame rate of the digital sensor (1 s), starting with a visualization of the reconstruction routine of the first frame, including normalization with the speckle field of diffuser 2.
4. Conclusions
The feasibility of imaging objects obscured by a diffuser, or even embedded between two thin diffusive media, may have advantages in microscopic applications. In principle it is possible to first characterize the diffusive screen with an initial interferometric measurement, and afterwards detect any change within the medium by only imaging the scattered light intensity, without the need for further interferometric measurements. This single-shot technique allows imaging of dynamic processes within a diffuse object at the frame rate of the used camera. The lensless holographic approach has the additional advantage that during data recording no focusing is needed, i.e. the digital image reconstruction allows numerical re-focusing into any selected object plane. Holographic image reconstruction through such diffusers has the advantageous side effect that the disturbing twin image, which appears in standard inline holography, is suppressed. The method is also robust with respect to a loss of adjustment after the initial calibration, which might occur through thermal drifts or vibrations, and which can be numerically compensated by tracking the cross-correlation between the actual speckle intensity distribution and the speckle distribution during calibration.
The feasibility of imaging an object embedded between two diffusers suggests that it may also be possible to image an object hidden behind a diffuser by illuminating it from the front side (the observation side), i.e. through the diffuser, since in this situation, too, the illuminating light is first distorted by the diffuser, and the scattered object field is then distorted again when passing back through the diffuser, similar to our experimental situation. This would have interesting practical applications in microscopy, where imaging through scattering media is a significant challenge.
Acknowledgments
This work was supported by the ERC Advanced Grant 247024 catchIT, and by the Austrian Science Fund (FWF), Project No. P19582-N20.
References and links
1. E. N. Leith and J. Upatnieks, “Holographic imagery through diffusing media,” J. Opt. Soc. Am. 56, 523 (1966). [CrossRef]
2. J. W. Goodman, W. H. Huntley Jr., D. W. Jackson, and M. Lehmann, “Wavefront reconstruction imaging through random media,” Appl. Phys. Lett. 8, 311–313 (1966). [CrossRef]
3. H. Kogelnik and K. S. Pennington, “Holographic imaging through a random medium,” J. Opt. Soc. Am. 58(2), 273–274 (1968). [CrossRef]
4. I. Freund, “Looking through walls and around corners,” Physica A 168, 49–65 (1990). [CrossRef]
5. R. K. Tyson, Principles of Adaptive Optics (Academic, 2010). [CrossRef]
7. I. M. Vellekoop, A. Lagendijk, and A. P. Mosk, “Exploiting disorder for perfect focusing,” Nat. Photon. 4, 320–322 (2010). [CrossRef]
8. M. Nixon, O. Katz, E. Small, Y. Bromberg, A. A. Friesem, Y. Silberberg, and N. Davidson, “Real-time wavefront shaping through scattering media by all-optical feedback,” Nat. Photon. 7, 919–924 (2013). [CrossRef]
9. E. G. van Putten, D. Akbulut, J. Bertolotti, W. L. Vos, A. Lagendijk, and A. P. Mosk, “Scattering lens resolves sub-100 nm structures with visible light,” Phys. Rev. Lett. 106, 193905 (2011). [CrossRef] [PubMed]
11. O. Katz, E. Small, and Y. Silberberg, “Looking around corners and through thin turbid layers in real time with scattered incoherent light,” Nat. Photon. 6, 549–553 (2012). [CrossRef]
12. D. B. Conkey, A. M. Caravaca-Aguirre, and R. Piestun, “High-speed scattering medium characterization with application to focusing light through turbid media,” Opt. Express 20, 1733–1740 (2012). [CrossRef] [PubMed]
13. C. Hsieh, Y. Pu, R. Grange, G. Laporte, and D. Psaltis, “Imaging through turbid layers by scanning the phase conjugated second harmonic radiation from a nanoparticle,” Opt. Express 18, 20723–20731 (2010). [CrossRef] [PubMed]
14. S. M. Popoff, G. Lerosey, R. Carminati, M. Fink, A. C. Boccara, and S. Gigan, “Measuring the transmission matrix in optics: an approach to the study and control of light propagation in disordered media,” Phys. Rev. Lett. 104, 100601 (2010). [CrossRef] [PubMed]
16. A. Kumar Singh, D. N. Naik, G. Pedrini, M. Takeda, and W. Osten, “Looking through a diffuser and around an opaque surface: A holographic approach,” Opt. Express 22, 7694–7701 (2014). [CrossRef]
17. S. Li and J. Zhong, “Dynamic imaging through turbid media based on digital holography,” J. Opt. Soc. Am. A 31, 480–486 (2014). [CrossRef]
18. B. C. Kress and P. Meyrueis, Applied Digital Optics (Wiley, 2009). [CrossRef]
19. D. Malacara, Optical Shop Testing (Wiley, 2007). [CrossRef]
20. S. Bernet, W. Harm, A. Jesacher, and M. Ritsch-Marte, “Lensless digital holography with diffuse illumination through a pseudo-random phase mask,” Opt. Express 19, 25113–25124 (2011). [CrossRef]