Fresnel Incoherent Correlation Holography (FINCH) allows digital reconstruction of incoherently illuminated objects from intensity records acquired with a Spatial Light Modulator (SLM). The article presents a wave optics model of FINCH that allows analytical calculation of the Point Spread Function (PSF) for both the optical and digital parts of the imaging and takes into account a Gaussian aperture for spatial bounding of the light waves. The 3D PSF is used to determine the diffraction limits of the lateral and longitudinal size of a point image created in the FINCH set-up. Lateral and longitudinal resolution is investigated both theoretically and experimentally using quantitative measures introduced for two-point imaging. The dependence of the resolving power on the system parameters is studied, and an optimal geometry of the set-up is designed with regard to the best lateral and longitudinal resolution. The theoretical results are confirmed by experiments in which a light emitting diode (LED) is used as a spatially incoherent source to create object holograms using the SLM.
© 2011 OSA
3D imaging has recently been the subject of increased interest for its numerous applications in molecular biomedical investigation, on-line industrial product inspection, micro-fabrication investigation, etc. In bioimaging, scanning confocal microscopy and wide-field imaging are the main 3D imaging techniques. Confocal microscopy provides excellent sectioning with high axial resolution achieved by rejecting out-of-focus light, but scanning slows down the imaging process. Wide-field microscopy gives real-time 2D imaging, but construction of a 3D image requires z-scanning and complicated data processing. 3D imaging of incoherently illuminated objects can also be realized by scanning holography [2], or by projection holography based on computed holograms synthesized from tens of images of the object captured from different view angles [3, 4].
Recently, Fresnel Incoherent Correlation Holography (FINCH) has been proposed for 3D imaging of incoherently illuminated objects. This method combines the principles of optical and digital holography with spatial light modulation. Spatially incoherent light scattered by the observed object is collimated, transformed by an SLM, and subsequently captured by a CCD camera. The SLM splits the incident waves and the set-up operates as a one-way interferometer producing holograms of separate object points. The holograms are then digitally reconstructed by applying the Fresnel transform. Recently, FINCH was tested with mercury arc lamp illumination and with fluorescent objects. The method was successfully used to design a fluorescent microscope [7] and a holographic synthetic aperture system [8, 9]. Although the method was demonstrated in a variety of applications, a complete mathematical description allowing detailed analysis of the FINCH imaging has not yet been published. In [5, 6, 10], FINCH was described by a formal notation, which did not allow analysis of the geometrical parameters of the system or accurate assessment of the properties of digital images. In the most recent of these articles, interference records of a point object were described by the geometric parameters of the system and the lateral magnification of the digital image was determined. The lateral magnification was then used to discuss the influence of the geometric parameters on the lateral resolution.
This article presents a wave optics model of FINCH that considers the real parameters of the experimental set-up and uses a variable Gaussian aperture for spatial bounding of the light waves. Mathematical operations that were only formally outlined in earlier articles are performed analytically in the proposed model. The result is the 3D PSF, valid for both the optical and digital parts of the FINCH imaging. We have verified that the relationship between the geometrical parameters designed in [10, 11] for optimal lateral resolution is not optimal in terms of physical criteria applied to two-point imaging, and results in a reduction of the lateral resolution provided by the collimating optics. In addition, we determine for the first time the diffraction limit for the longitudinal resolution of the FINCH imaging and propose an optimal geometric configuration with regard to the best longitudinal resolution. Furthermore, we show that the digitally reconstructed image is fully coherent even when the object is illuminated by spatially incoherent light. This important fact was not discussed in previous studies concerning optimal resolution in FINCH [5, 11]. The analysis of the geometrical parameters of the FINCH imaging presented in earlier work has been extended to calculate the longitudinal magnification of 3D objects. In the experimental part of the paper, we reconstructed the 3D PSF from experimental data and compared it with theoretical predictions. Transverse and longitudinal resolution was experimentally investigated for various system parameters and the best values available in the optimal configuration were determined. An LED of wavelength 632 nm was used as the source of incoherent light in the experiments.
2. Computational model of FINCH
The basic principle of FINCH can be explained using Fig. 1, which shows both the optical and digital parts of the experiment. The observed object is placed near the focal plane of the collimating lens with focal length f 0 and illuminated by an incoherent source. Light scattered by the object is collimated, transformed by the SLM and captured by the CCD camera. The SLM acts as a diffractive beam splitter, so each wave impacting on it is doubled. The light from each point of the object is split into a signal and a reference wave that interfere and create a Fresnel zone structure on the CCD. The resulting hologram of the observed object is created as an incoherent superposition of the interference patterns of the individual points. In the experimental part of FINCH, three holograms of the object Ij, j = 1, 2, 3, are recorded with different phase shifts between the signal and reference waves. The recorded holograms are then processed and the observed object is digitally reconstructed using the Fresnel transform.
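The three-exposure procedure can be illustrated with a short numerical sketch. The combination of the records used below is the standard three-step phase-shifting formula; the wavelength, grid and focal distance are illustrative assumptions, not the parameters of the experiment.

```python
import numpy as np

# Point-object hologram at the CCD: a plane reference wave and a paraxial
# spherical signal wave, recorded three times with SLM phase shifts
# 0, 2*pi/3 and 4*pi/3 (assumed illustrative values throughout).
lam = 633e-9                    # wavelength [m]
zs = 0.15                       # signal-wave focus distance from the CCD [m]
N, dx = 512, 8e-6               # grid samples and pixel pitch [m]
x = (np.arange(N) - N//2) * dx
X, Y = np.meshgrid(x, x)
r2 = X**2 + Y**2

Ur = np.ones_like(r2, dtype=complex)           # reference wave (unchanged wavefront)
Us = np.exp(1j * np.pi * r2 / (lam * zs))      # spherical signal wave
thetas = [0.0, 2*np.pi/3, 4*np.pi/3]
I = [np.abs(Ur + Us * np.exp(1j*t))**2 for t in thetas]

# Three-step combination: cancels the zero order and one twin image,
# leaving a single cross term proportional to conj(Us).
t1, t2, t3 = [np.exp(1j*t) for t in thetas]
T = I[0]*(t3 - t2) + I[1]*(t1 - t3) + I[2]*(t2 - t1)

# For this choice of shifts the surviving term is (-3*sqrt(3)*1j)*conj(Us).
assert np.allclose(T, -3*np.sqrt(3)*1j*np.conj(Us))
```

The cancellation follows because the three weighting coefficients sum to zero and also annihilate one of the two cross terms, so only a single twin image survives in T.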
In optics, the PSF is used as a basic imaging function that makes it possible to examine the characteristics and quality of the created image. To study the properties of the FINCH imaging, the PSF of the entire imaging chain, including both the optical and digital parts, must be determined. In previous articles, this task was formulated using a formal notation, but a precise mathematical description of the intensity distribution in the digital image is still not available.
In this paper, the PSF is calculated for an arbitrary object point P 0 with coordinates (x 0, y 0, z 0), where z 0 < 0 is used if the object lies in front of the collimating lens. Its paraxial image Pr is created by the collimating lens and has coordinates (xr, yr, zr) measured in a coordinate system with origin at the center of the CCD (see Fig. 1). The image is formed by paraxial rays, which represent the normals of a paraboloidal wave converging to the image point Pr. This paraboloidal wave is incident on the SLM and splits into reference and signal waves with complex amplitudes Ur and Usj. The splitting of the incoming waves is ensured by the transmission function of the SLM (see Fig. 1). The reference and signal waves originate from the same object point P 0 and are spatially coherent. If the difference of their optical paths is less than the coherence length of the source, the waves interfere at the CCD and create an interference pattern with the shape of a Fresnel zone plate. Following [5], Eq. (4), the function T can be simplified and, using Eq. (3), its quadratic phase factor can be interpreted as a lens; following [10, 11], it is termed the diffractive lens in this article. The focal length of the diffractive lens is given as 1/fl = 1/zs − 1/zr and its axis is shifted laterally relative to the origin of the coordinate system of the object space. The focal length fl and the shift of the axis (X′, Y′) are uniquely determined by the position of the object point and the system parameters. Applying the lens equation, we can write
In FINCH, 3D objects are reconstructed using a complex function T, which is created from the intensity records Ij according to Eq. (5). The complex amplitude of the image U′ is then obtained as a convolution of T and the impulse response function of free propagation of light, h. When calculating the PSF, the quadratic phase of the digital lens Eq. (7) is used in the convolution. In this case, substituting Eq. (7) into Eq. (13) and applying the Fresnel approximation of h, we obtain
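The digital part of the imaging, i.e., the convolution of T with the Fresnel impulse response h, can also be sketched numerically. The fragment below (grid size, pixel pitch and reconstruction distance are illustrative assumptions) back-propagates the hologram term of a point object and recovers a sharp focus in the plane z′ = zs.

```python
import numpy as np

lam = 633e-9                      # wavelength [m] (assumed)
zs = 0.15                         # reconstruction distance [m] (assumed)
N, dx = 512, 8e-6                 # samples and CCD pixel pitch [m] (assumed)
x = (np.arange(N) - N//2) * dx
X, Y = np.meshgrid(x, x)

# Hologram term of a point object (the cross term kept by phase shifting).
T = np.exp(-1j * np.pi * (X**2 + Y**2) / (lam * zs))

# Fresnel propagation over a distance z, applied as a transfer function
# in the Fourier domain (equivalent to convolution with h).
fx = np.fft.fftfreq(N, d=dx)
FX, FY = np.meshgrid(fx, fx)
def fresnel(u, z):
    H = np.exp(-1j * np.pi * lam * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

U = fresnel(T, zs)                # image amplitude in the plane z' = zs
I = np.abs(U)**2
assert np.unravel_index(np.argmax(I), I.shape) == (N//2, N//2)
```

The intensity peaks at the grid center, i.e., the point object is refocused at the distance given by the focal length of its diffractive lens.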
3. Geometrical optics approach to FINCH imaging
The PSF can be calculated analytically under simplified conditions which allow the Fourier transform in Eq. (14) to be performed. The task is easiest in the approximation of ray optics, which does not consider the spatial bounding of light. In this case, the aperture function becomes constant, Q = 1. At the distance z′ = fl, the quadratic phase term in Eq. (14) is eliminated. This situation corresponds to a sharp image determined by the Dirac delta function. Equation (17) shows that the object point P 0(x 0, y 0, z 0) is digitally reconstructed as an image point P′ whose position is given by the coordinates x′ = X′, y′ = Y′, and z′ = fl. The symbol m used in Eq. (10) then represents the lateral magnification defined as the ratio between the image size and the true size in object space, m = x′/x 0 = y′/y 0. Ray optics provides a simple interpretation of FINCH imaging in which the longitudinal position of the object point z 0 determines the focal length of the diffractive lens and the lateral position (x 0, y 0) causes a displacement of the lens axis. The paraxial image always lies on the axis of the lens, so the system exhibits image space telecentricity. This feature of the diffractive lens was used in previous work to determine the lateral magnification, but the link between the image position and the transverse shift of the lens axis was not clarified. The lateral magnification Eq. (11) agrees with the previously determined value and demonstrates its dependence on the set-up parameters and the object distance z 0. It is interesting to note that the lateral magnification is independent of the focal length of the lens implemented by the SLM. The derivative dm/dz 0 shows that the dependence of m on z 0 can be eliminated by an appropriate choice of Δ1,
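The ray-optics picture can be cross-checked with elementary thin-lens bookkeeping. The sketch below is not the paper's Eqs. (8)-(11) but an independent reconstruction under assumed sign conventions: it traces the reference and signal wave foci to the CCD plane, evaluates 1/fl = 1/zs − 1/zr, and confirms that fl approaches fd − Δ2 for an object approaching the focal plane.

```python
def focal_of_diffractive_lens(z0, f0=0.2, fd=0.75, d1=0.25, d2=0.6):
    """fl of the diffractive lens for an object point at z0 < 0 (meters).

    Thin-lens sketch (assumed conventions): the collimating lens images the
    point; the SLM leaves the reference wave untouched and adds the power
    1/fd to the signal wave; zs and zr are measured from the CCD.
    """
    zi = 1.0 / (1.0/f0 + 1.0/z0)     # image distance behind the collimating lens
    zr = (zi - d1) - d2              # reference-wave focus, measured from the CCD
    s = zi - d1                      # signal-wave focus before the SLM lens acts
    v = 1.0 / (1.0/s + 1.0/fd)       # ... and after it
    zs = v - d2
    return 1.0 / (1.0/zs - 1.0/zr)

# Near the focal plane (z0 -> -f0) the reference wave is collimated and
# fl approaches fd - d2 = 0.15 m for the assumed parameters.
fl = focal_of_diffractive_lens(-0.2001)
assert abs(fl - 0.15) < 2e-3

# The longitudinal magnification alpha = dfl/dz0, evaluated numerically.
eps = 1e-5
alpha = (focal_of_diffractive_lens(-0.2001 + eps)
         - focal_of_diffractive_lens(-0.2001 - eps)) / (2 * eps)
```

The numeric derivative in the last lines mirrors the definition α = dfl/dz 0 used in the next paragraph.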
The longitudinal magnification of the FINCH imaging is defined as α = dfl/dz 0. Using Eq. (8), it can be written with the help of Eq. (9). The derived relations for fl, m and α are valid for a general position of the observed object and can be used in the analysis of two special cases.
Special case I: Planar object located in focal plane of collimating lens
The relations for the geometric parameters of the imaging, Eqs. (8), (11) and (19), are simplified when a planar object is placed in the focal plane of the collimating lens, z 0 → – f 0. In this case, we can write
Special case II: Lensless FINCH imaging
The FINCH imaging can also be implemented in a simplified system in which the object is recorded without the use of a collimating lens. In this case, Eqs. (8), (11) and (19) are used in the limit f 0 → ∞ and with Δ1 = 0,
4. PSF for diffraction limited FINCH imaging
In order to examine the diffraction limited FINCH imaging, the lateral restriction of light in the system must be considered. The proposed model takes into account the fixed apertures of the optical components and also considers a variable size of the holograms recorded on the CCD. The holograms are laterally confined by a Gaussian function, which allows the PSF to be calculated analytically. The aperture function used in Eq. (13) then takes the form of Eq. (26), whose width parameter is then defined as
When the Gaussian aperture is used, a point object can be reconstructed by analytical calculation. The complex amplitude of the image is obtained from Eq. (14) and can be written as Eq. (28). The normalized intensity of the point image follows from Eq. (32) used with Eq. (28) and can be written in the form
The radius of the Gaussian diffraction spot Δr′ can be defined by a fall of the normalized intensity to the value I′N (Δr′) = 1/e 2, as given by Eq. (34). Using Eq. (28), the Strehl ratio can be arranged into the form of Eq. (37). Equations (34) and (37) are of general validity, but the numerical aperture NA′ must be analyzed separately for the system with a collimating lens and for the lensless experiment.
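Although the analytic expressions Eqs. (34) and (40) are only referenced symbolically here, the Gaussian-aperture spot size can be checked numerically: a converging wave apodized by a Gaussian of 1/e2 intensity radius w and focused over a distance fl yields a focal spot of 1/e2 intensity radius close to λfl/(πw), the standard Gaussian-beam estimate. All values below are illustrative assumptions.

```python
import numpy as np

N, dx = 2048, 2.5e-6              # grid samples and pitch [m] (assumed)
lam, fl, w = 633e-9, 0.15, 1.5e-3 # wavelength, focus distance, aperture radius
x = (np.arange(N) - N//2) * dx
X, Y = np.meshgrid(x, x)
r2 = X**2 + Y**2

# Gaussian-apodized wave converging over the distance fl.
U0 = np.exp(-r2 / w**2) * np.exp(-1j * np.pi * r2 / (lam * fl))

# Fresnel propagation by fl (transfer-function method).
fx = np.fft.fftfreq(N, d=dx)
FX, FY = np.meshgrid(fx, fx)
U = np.fft.ifft2(np.fft.fft2(U0) * np.exp(-1j*np.pi*lam*fl*(FX**2 + FY**2)))
I = np.abs(U)**2

# 1/e^2 intensity radius of the focal spot, read off along the central row,
# compared with the Gaussian-beam estimate lam*fl/(pi*w) (~20 um here).
row = I[N//2, N//2:]
r_e2 = np.argmax(row < row[0] / np.e**2) * dx
assert np.isclose(r_e2, lam * fl / (np.pi * w), atol=3e-6)
```

The agreement holds because fl is far shorter than the Rayleigh range of the apodized beam, so the focus is waist-limited rather than truncation-limited.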
FINCH with collimating lens
If we assume an object in the focal plane of the collimating lens, fl is given by Eq. (21) and the radius of the intensity records Δρc can be simply estimated using ray tracing and the Nyquist criterion. By means of ray optics, the radius Δρr can be determined by the simple relation
When Δρc given by Eq. (38) is used, the aperture of the diffractive lens is equal to the aperture of the modulator lens, NA′ = ρSLM/fd. For the Strehl coefficient D = 1/e 2, the transverse and longitudinal sizes of the diffraction spot can be written as
The intensity profile in the plane (x′, z′) obtained using Eq. (28) is illustrated in Fig. 2(b) for the parameters f 0 = 200 mm, fd = 750 mm, Δ1 = 250 mm and Δ2 = 600 mm. For comparison, the PSF reconstructed from the CCD records is shown in Fig. 2(a). The parameters of the experiment were the same as those used in the calculation. As can be observed in Figs. 2(a) and 2(b), the sizes and shapes of the demonstrated PSFs are in very good agreement.
When the lensless configuration is used with |z 0| < fd, the radius of the recorded holograms is given as Δρc = ρCCD. The numerical aperture of the diffractive lens can then be written as NA′ = ρCCD/flL, where flL is given by Eq. (23). In this case, the transverse and longitudinal sizes of the diffraction spot Eq. (40) depend on the separation distance between the SLM and the CCD.
5. Optimal geometric configuration of FINCH
In conventional imaging systems, the Lagrange invariant can be used to express the lateral magnification as a ratio of the object and image numerical apertures, m = NA/NA′. If the Rayleigh criterion is applied to a diffraction limited image, two points are resolved when their mutual distance is Δr′ ≈ λ/NA′, corresponding to the radius of the image spot of a point object. The related distance in object space can be written as Δr = Δr′/m ≈ λ/NA, so that the resolution is also diffraction limited. In FINCH, which includes both optical and digital imaging, the Lagrange invariant is not inherently fulfilled. The resolution enabled by the object aperture NA is not fully exploited in the reconstructed image if the parameters of the set-up are inappropriately chosen. In the presented analysis, the optimal geometry of the experiment is discussed for two different configurations of the FINCH set-up.
FINCH imaging with collimating optics
To simplify the discussion, an object placed near the front focal plane of the collimating lens is considered. In this case, Eqs. (21) and (22) can be used for the lateral magnification mF and the focal length flF of the diffractive lens. Assuming that the transverse dimensions of the collimating lens and the SLM are the same, the object and image numerical apertures can be written as NA = ρSLM/f 0 and NA′ = Δρc/flF. Using Eq. (38), the ratio of the numerical apertures can be written as NA/NA′ = |mF|fd /Δ2. The lateral resolution in object space is given as Δr = Δr′/|mF|. Using the image resolution Eq. (40), Δr can be expressed analytically; its dependence on Δ2 is illustrated in Fig. 3 for different focal lengths of the collimating lenses. If Δ2 ≥ fd, the lateral resolution of the FINCH imaging is limited by the resolution of the collimating optics. In a FINCH set-up with Δ2 < fd, the lateral resolution of the collimating optics is reduced. In [10], an optimal FINCH configuration was analyzed using Eq. (15), which takes into account the size of the reconstructed image expressed by the parameter a. If its value is chosen according to the technical conditions as a = 1, the separation distance Δ2 = fd/2 is obtained [10, 11]. In this case, the lateral resolution of the two-point object is two times worse than the diffraction limited resolution of the collimating lens Δr 0. When Eq. (15) of [10] is used with a = 0, the CCD position ensuring the best two-point resolution is obtained.
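The scaling just discussed can be checked with a few lines of arithmetic. In the sketch below (parameter values are illustrative assumptions), the object-space resolution is modeled as Δr ≈ (λ/NA)·max(1, fd/Δ2), which reproduces the statements above: the collimating-lens limit is reached for Δ2 ≥ fd, and the resolution is twice worse for Δ2 = fd/2.

```python
import numpy as np

lam = 633e-9
f0, fd = 0.2, 0.75            # collimating-lens and SLM-lens focal lengths [m]
rho_slm = 6e-3                # aperture radius [m] (assumed)
NA = rho_slm / f0

def dr(D2):
    """Object-space two-point resolution vs SLM-CCD distance D2 (sketch)."""
    return (lam / NA) * max(1.0, fd / D2)

assert np.isclose(dr(fd / 2) / dr(fd), 2.0)   # D2 = fd/2: twice worse
assert np.isclose(dr(1.5 * fd), dr(fd))       # D2 >= fd: lens-limited
```

The `max(1, ...)` factor encodes the two regimes: below fd the hologram aperture, not the collimating lens, limits the reconstruction.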
The longitudinal resolution in object space is given as Δz = Δz′/|αF|, where αF denotes the longitudinal magnification. Using Eq. (22), Δz can be rewritten in a form whose dependence on Δ2 is shown in Fig. 3. The technical difficulty related to the Δ2 = fd setting will be discussed in the experimental section.
Lensless FINCH imaging
In lensless FINCH, each point of the object emits a spherical wave, which is captured by the SLM and subsequently divided into reference and signal waves interfering at the CCD. The lateral resolutions in object and image space are related as Δr = Δr′/|mL|, where Δr′ can be obtained from Eq. (40) and mL is given by Eq. (25). Assuming fd > 0 and |z 0| < fd, the numerical aperture of the diffractive lens can be written as NA′ = ρCCD/flL, where flL is given by Eq. (23). The lateral resolution Δr can then be expressed in the form of Eq. (44), from which the best lateral resolution can be determined. The dependence is illustrated in Fig. 4 for fd = 750 mm and the object distances z 0 = −50 mm, −75 mm and −100 mm. For |z 0| << fd, the optimal distance between the SLM and the CCD is approximately given as Δ2 ≈ –z 0 and the best resolution in these planes remains unchanged.
The longitudinal resolution in object space is defined as Δz = Δz′/αL. For chosen parameters of the experiment, the optimal position of the CCD camera Δ2 can be determined from the corresponding extremal condition; the results are shown in Fig. 4.
6. Coherent digital reconstruction of two-point object
The analysis of the resolving power of FINCH is performed for the imaging of two point sources Pj(xj, yj, zj), j = 1, 2, emitting uncorrelated light waves of equal amplitude. If the intensity records Eq. (4) are taken for the two-point object, the complex function T given by Eq. (5) can be written as Eq. (47). The digital image is obtained by Eq. (13) applied to T given by Eq. (47). Its complex amplitude U′TP can be written by means of Eqs. (12) and (30), where the coordinates (x 0, y 0, z 0) are replaced by (xj, yj, zj), j = 1, 2. Though the observed point sources emit incoherent light, the digital image is linear in the complex amplitude and the total intensity I′TP = |U′TP|2 is given by the interference law. Equation (49) describes the 3D intensity distribution in the digital image of the two-point object and can be used for the analysis of the lateral and longitudinal resolution of the FINCH imaging. The lateral resolution is examined for the object points P 1(x 0 + Δx/2, 0, z 0) and P 2(x 0 – Δx/2, 0, z 0). As a criterion we use the quantity Vx defined by Eq. (53). Since Eq. (53) is formally identical with the visibility of interference patterns, Vx is called the lateral visibility in this paper.
Two-point resolution in coherent light depends on the relative phase Δϕ associated with the observed sources. When Δϕ = π/2, the intensity of the image is identical to that resulting from incoherent sources. When the sources are in phase (Δϕ = 0), the resolution is worse in coherent light than in incoherent light. If the sources are in phase opposition (Δϕ = π), destructive interference occurs and the point sources are better resolved with coherent illumination than with incoherent light. A detailed analysis of the resolution in coherent light can be found in [13]. If the observed object points P 1 and P 2 are placed symmetrically with respect to the optical axis (x 0 = 0), their diffraction images interfere constructively at the axial point P′(0, 0, z′) and Δϕ = 0 for an arbitrary separation distance Δx. For a two-point object laterally shifted to the position x 0 ≠ 0, the phase difference Δϕ oscillates in the interval <0, 2π> if the distance of the points Δx is slightly changed. In the calculated visibility, fast oscillations appear in the dependence of Vx on Δx. When the visibility is determined from experimental data, the oscillations do not occur. The reason is that the change in the relative phase causes slight changes of the interference pattern that are not resolved in the CCD detection.
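The phase dependence described above can be illustrated with a simple one-dimensional model of two coherent Gaussian image spots. The peak-to-midpoint measure below is analogous to, but not identical with, the visibility of Eq. (53); the spot width and separation are illustrative assumptions.

```python
import numpy as np

w, sep = 1.0, 2.0                           # spot radius and separation (arb. units)
x = np.linspace(-6, 6, 1201)
mid = x.size // 2                           # index of the midpoint between the spots
g = lambda c: np.exp(-(x - c)**2 / w**2)    # Gaussian amplitude spot

def visibility(dphi):
    """Peak-to-midpoint dip of the coherent two-point image."""
    I = np.abs(g(-sep/2) + np.exp(1j * dphi) * g(sep/2))**2
    return (I.max() - I[mid]) / (I.max() + I[mid])

# Incoherent reference: intensities add.
I_inc = g(-sep/2)**2 + g(sep/2)**2
V_inc = (I_inc.max() - I_inc[mid]) / (I_inc.max() + I_inc[mid])

assert np.isclose(visibility(np.pi), 1.0)   # phase opposition: complete dip
assert visibility(0.0) < V_inc              # in-phase: worse than incoherent
```

The two assertions reproduce the qualitative statements of this paragraph: Δϕ = π gives an exact null between the spots, while Δϕ = 0 fills in the dip.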
The longitudinal resolution is studied for two axial points specified by coordinates P 1(0, 0, z 0 + Δz/2) and P 2(0, 0, z 0 – Δz/2). The longitudinal visibility Vz then can be defined as
7. Experimental determination of resolving power of FINCH imaging
In the proposed experiment, a two-point incoherent source was recorded by the CCD camera and its digital reconstruction was performed. The FINCH imaging was analyzed for different mutual lateral and longitudinal positions of the point sources, and the experimentally determined resolving power was compared with the calculated one.
7.1. Design and parameters of the experimental set-up
The set-up used for the measurement of the FINCH resolution is shown in Fig. 5. Spatially incoherent light emitted by the collimated LED (Thorlabs M625L2-C1, 625 nm, 280 mW) is transmitted through a laser line filter (Thorlabs FL632.8-3, FWHM 3 nm) and coupled into two single mode fibers (Thorlabs SM 600, MFD 4.3 μm) with holders mounted on XYZ translation stages. The faces of the fibers are used as point sources, which can be precisely positioned in the object space near the focal plane of the collimating lens. The collimated beams pass through a polarizer, an iris diaphragm and a beam splitter and fall on the reflective SLM Hamamatsu X10468 (16 mm x 12 mm, 792 x 600 pixels). The polarizer is used to optimize the phase operation of the SLM. The SLM is addressed by computer generated holograms enabling splitting of the input beam into the signal and reference waves. The reference wave is transmitted with an unchanged wavefront shape. The signal wave is created by a quadratic phase modulation equivalent to the action of a lens with the focal length fd, and three different phase shifts 0, 2π/3 and 4π/3 are successively imposed on it. Light reflected by the SLM is diverted to the CCD camera (QImaging Retiga-4000R, 15 mm x 15 mm, 2048 x 2048 pixels) and three records of the interference patterns created by the signal and reference waves are subsequently acquired. The intensity records are processed in a PC and the digital image of the observed object is created. In this way, the 3D PSF was restored and the transverse and longitudinal resolution of the two-point object was examined for various parameters of the set-up.
7.2. Experimental results
A preliminary measurement was focused on the PSF reconstruction. In this case, only one fiber was used for the hologram recording. The approximate location of the image was determined by paraxial optics, and the accurate intensity distribution in the vicinity of the paraxial image point was created by the reconstruction algorithms. Using the Fresnel transform, transverse intensity profiles were evaluated in a sequence of planes near the paraxial image plane. These data were used for image reconstruction in the planes (x′, z′) and (y′, z′) of the image space, from which information about the transverse and longitudinal size of the image spot can be obtained. The PSF reconstructed from the experimental data is compared with the calculated PSF in Fig. 2 for the parameters f 0 = 200 mm, fd = 750 mm, Δ1 = 250 mm, and Δ2 = 600 mm.
When examining the lateral resolution, the visibility was experimentally determined for different transverse distances Δx of the observed point sources. The measurements were carried out for different ratios Δ2/fd and the experimental results were compared with the theoretical predictions. In the measurements, the fiber faces were placed in the focal plane of the collimating lens and the lateral distance of the fibers was successively changed using micrometer displacements. Holograms of the two-point object were recorded by the CCD camera and then used for the digital image reconstruction. The planes of sharp imaging were determined using a numerical procedure implemented in the software for digital imaging. In separate transverse planes of the image space, the first and second order moments were calculated from the intensity profile and the standard deviations σx and σy were evaluated. The longitudinal position where the spot size given by σx and σy becomes minimal was identified as the plane of best focus. In this plane, the two-point image was evaluated and the value of the lateral visibility Vx defined by Eq. (53) was determined. The visibility evaluated for transverse distances of the point sources Δx varying from 0 to 0.14 mm is shown in Fig. 6(a). The experimental data were acquired in the system with the parameters f 0 = 200 mm, fd = 750 mm, Δ1 = 250 mm and with the varying distances Δ2 = 0.25 fd, 0.5 fd, 0.95 fd, 1.0 fd and 1.5 fd. The experimental results confirm the theoretical prediction that the transverse resolution of the FINCH imaging is limited by the resolution of the collimating lens if Δ2 ≥ fd. Figure 6(a) shows that for such positions of the CCD, the experimental visibility is close to the theoretical curve corresponding to the diffraction limited resolution of the collimating lens (dashed line in Fig. 6(a)). The visibility curves obtained for Δ2 = 0.25 fd and 0.5 fd show that the resolution of the collimating lens is substantially reduced when Δ2 < fd. This trend is also evident from Fig. 6(b), which shows the separation distance Δx corresponding to the visibility Vx = 0.8 in dependence on the ratio Δ2/fd. The slightly lower experimental resolution than the theoretical limit can be attributed to imperfections of the optical set-up, but the general trends are clearly reproduced. The improvement in the lateral resolution achieved by changing the position of the CCD camera is illustrated in Fig. 7 for the separation distances of the point sources Δx = 25 μm and 65 μm and the camera positions Δ2 = 1.2 fd and 0.4 fd.
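The moment-based search for the plane of best focus can be sketched as follows; the Gaussian test spot and its width are synthetic assumptions used only to validate the moment formulas.

```python
import numpy as np

def spot_moments(I, x, y):
    """Centroid and standard deviations of a transverse intensity profile."""
    X, Y = np.meshgrid(x, y)
    w = I / I.sum()                          # normalized intensity weights
    cx, cy = (w * X).sum(), (w * Y).sum()    # first-order moments (centroid)
    sx = np.sqrt((w * (X - cx)**2).sum())    # second-order moments -> sigma_x
    sy = np.sqrt((w * (Y - cy)**2).sum())
    return cx, cy, sx, sy

# Check on a synthetic Gaussian spot of known width and position.
x = np.linspace(-1, 1, 401)
X, Y = np.meshgrid(x, x)
s_true = 0.1
I = np.exp(-((X - 0.2)**2 + Y**2) / (2 * s_true**2))
cx, cy, sx, sy = spot_moments(I, x, x)
assert np.isclose(cx, 0.2, atol=1e-3) and np.isclose(sx, s_true, rtol=1e-2)
```

In a full reconstruction, `spot_moments` would be evaluated in each transverse plane and the plane minimizing sx and sy taken as the best focus, as described above.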
The analysis of the longitudinal resolution was carried out for two points located on the optical axis of the collimating lens. The observed point sources were again realized by the fibers and their longitudinal separation distance Δz was altered by precise mechanical displacements. Holograms were recorded by the CCD for each position of the two-point source and its images were created numerically using the Fresnel transform. Lateral intensity profiles were evaluated in a sequence of planes separated by a fine longitudinal step adapted to the depth of field of the system. In this way, the 3D intensity of the two-point source was reconstructed and used to determine the longitudinal visibility Vz defined by Eq. (56). The dependence of the visibility Vz on the longitudinal separation distance of the point sources Δz was found for different positions of the CCD given by Δ2 = fd ± 50 mm (Δ2/fd = 1.06 and 0.93), Δ2 = fd ± 375 mm (Δ2/fd = 1.5 and 0.5) and Δ2 = fd ± 550 mm (Δ2/fd = 1.73 and 0.26). The results are shown in Fig. 8(a). For each position of the CCD, the distance Δz for which the visibility drops to the value Vz = 0.8 was determined from the dependence of Vz on Δz. The separation distances Δz for which the visibility drops to this level differ for different positions of the CCD; their dependence on the ratio Δ2/fd is shown in Fig. 8(b). Figure 9 shows the intensity distribution obtained by digital reconstruction of the holographic records of two point sources located on the axis of the collimating lens. The CCD camera was at the position Δ2 = 0.8 fd and the longitudinal distances of the observed point sources were Δz = 2.5 mm and 3.5 mm, respectively. The experimental results again confirm the prediction that the best longitudinal resolution is reached for Δ2 ≈ fd. In this case, the longitudinal resolution is limited by the longitudinal resolution of the collimating lens.
The larger discrepancy in the longitudinal resolution can be explained by the fact that the longitudinal magnification is much more sensitive to parameter inaccuracies than the lateral magnification.
At this point it is worth repeating that the optimal transverse resolution of FINCH imaging is obtained for Δ2 ≥ fd; hence, the optimum for both the transverse and longitudinal two-point resolution can be achieved for Δ2 equal to or slightly larger than fd. At this position of the detector, the wave generated by the modulator is in focus, so that the detected interference patterns have a small size. In the experiment, the two-point image was successfully reconstructed even from records taken directly at the distance Δ2 = 1.0 fd, where the transverse dimension of the interference pattern was close to the diffraction limited Airy pattern. In this case, however, special care had to be taken to overcome the technical limitation caused by the large difference in irradiance between the signal and reference waves: multiple CCD exposures had to be combined for each record in order to accumulate enough light from the reference wave while avoiding overexposure by the focused signal wave.
The article presents a wave model of FINCH, which allows calculation of the PSF and a subsequent analysis of the lateral and longitudinal resolution of a two-point object. For the first time, the diffraction limits of the lateral and longitudinal resolution achievable in FINCH imaging are calculated and examined experimentally. The main results can be summarized as follows:
- In the ray approximation, the relationships between geometrical parameters of the object and its digitally created image were found and used to determine the lateral and longitudinal magnification of the image.
- The three-dimensional diffraction limited PSF was calculated and compared with the image spot reconstructed from experimental data acquired in the FINCH set-up for a point object.
- Transverse and longitudinal resolution of FINCH imaging was examined theoretically and experimentally using the visibility function defined for a two-point source implemented by the LED coupled to single-mode fibers.
- The transverse and longitudinal resolution of two-point object was investigated experimentally for different parameters of the set-up with very good agreement with theory.
- It was verified that the transverse and longitudinal resolution of FINCH imaging is limited by the collimating optics; this limit can be reached only if the distance Δ2 between the CCD camera and the SLM is equal to or larger than the focal length of the SLM lens, fd. We proved both theoretically and experimentally that when Δ2 is shorter than fd, the best two-point resolution is not reached and the transverse and longitudinal resolution provided by the collimating lens is significantly reduced.
The theoretical and experimental results presented in the article are applicable to the estimation of the image quality of FINCH imaging achieved in an experiment with given parameters. The complete mathematical treatment and the discussion of coherence properties may be useful for the study of the transfer functions of FINCH imaging. The new findings on the longitudinal magnification and resolution can be valuable for the imaging analysis of 3D objects.
This work was supported by the Czech Ministry of Education, Projects No. MSM6198959213 and MSM0021630508, the Czech Ministry of Industry and Trade, project No. FR-TI1/364, IGA project Modern optics and applications PrF 2010 005, and the Grant Agency of the Czech Republic, project No. 202/08/0590.
References and links
2. G. Indebetouw, A. El Maghnouji, and R. Foster, “Scanning holographic microscopy with transverse resolution exceeding the Rayleigh limit and extended depth of focus,” J. Opt. Soc. Am. A 22, 892–898 (2005). [CrossRef]
3. Y. Li, D. Abookasis, and J. Rosen, “Computer-generated holograms of three-dimensional realistic objects recorded without wave interference,” Appl. Opt. 40, 2864–2870 (2001). [CrossRef]
4. Y. Sando, M. Itoh, and T. Yatagai, “Holographic three-dimensional display synthesized from three-dimensional Fourier spectra of real existing objects,” Opt. Lett. 28, 2518–2520 (2003). [CrossRef] [PubMed]
7. J. Rosen and G. Brooker, “Non-scanning motionless fluorescence three-dimensional holographic microscopy,” Nat. Photonics 2, 190–195 (2008). [CrossRef]
10. B. Katz, D. Wulich, and J. Rosen, “Optimal noise suppression in Fresnel incoherent correlation holography (FINCH) configured for maximum imaging resolution,” Appl. Opt. 49, 5757–5763 (2010). [CrossRef] [PubMed]
12. B. E. A. Saleh and M. C. Teich, Fundamentals of Photonics (J. Wiley, 1991). [CrossRef]
13. S. Van Aert and D. Van Dyck, “Resolution of coherent and incoherent imaging systems reconsidered–classical criteria and a statistical alternative,” Opt. Express 14, 3830–3839 (2006). [CrossRef] [PubMed]