Colored point spread function engineering for parallel confocal microscopy

Open Access

Abstract

Using the color selectivity of a spatial light modulator (SLM) both for tailoring the excitation beam at one wavelength and for multiplexing the image at the red-shifted fluorescence wavelength, it is possible to parallelize confocal microscopy, i.e. to simultaneously detect an axial stack (z-stack) of a sample. For this purpose, two diffractive patterns, one steering the excitation light and the other manipulating the emission light, are combined within the same area of the SLM, which acts as a pure phase modulator. A recently demonstrated technique allows one to combine the patterns with high diffraction efficiency and low crosstalk, using the extended phase shifting capability of the SLM, which covers multiples of 2π at the respective wavelengths. As a first demonstration, we compare standard confocal imaging with simultaneous image acquisition in two separate sample planes and obtain comparable results.

© 2016 Optical Society of America

1. Introduction

Imaging speed is an important parameter in confocal fluorescence microscopy, particularly when dynamic samples are imaged. Therefore, various methods have been developed to parallelize confocal imaging, such as the spinning disc methods [2, 3] or programmable array microscopy [1]. While these approaches allow for real-time frame rates, they are nevertheless bound to a single imaging plane and do not support fast 3D imaging. Here we demonstrate a novel approach suitable for fluorescence microscopy, using color-dependent multi-spot diffraction patterns. Such patterns can generate multiple excitation spots at arbitrary 3D positions and simultaneously diffract the generated fluorescence onto a camera. Both excitation and detection multiplexing are performed with a single multi-order diffractive pattern (DP) displayed on a phase-only SLM. Using the extended phase modulation range of the SLM, it is possible to design DPs that act practically independently at the excitation and emission wavelengths (460/515 nm), thus facilitating individually tailored excitation and detection PSFs in a scanning microscope. The DPs reconstructed upon illumination with the respective wavelengths are good approximations to the original template DPs, i.e. in the ideal case the diffraction efficiencies are only slightly reduced (on the order of 15%), and there is only low crosstalk (on the order of 2%) between the patterns. The corresponding method of combining DPs that reconstruct independent phase masks at selected wavelengths has already been used in earlier work to produce various kinds of polychromatic diffractive optical elements [4–11]. In combination with programmable SLMs offering an extended phase modulation range [12], the method has been used to project color holograms by combining independent DPs for three red-green-blue wavelengths [13], to employ an SLM for both optical trapping and Fourier imaging at two different wavelengths [14], and recently for multicolor localization microscopy, which simultaneously detects the z-positions of two types of fluorophores with different emission wavelengths [15].

2. Principle of operation

Briefly, the method of combining several independent DPs, each calculated for a different wavelength, into a single multi-order DP is based on two effects: the dispersion behavior of the liquid crystal pixels, and the fact that not the total phase profile but only the phase modulation modulo 2π determines the effect of a DP at a given wavelength.

Liquid crystal pixels are addressed by applying a voltage U, which changes their orientation and consequently their refractive index (for the appropriate polarization direction) n(U). The corresponding phase retardation ϕ(U, λ) is

ϕ(U, λ) = (2πd/λ) n(U),
where d is the thickness of the liquid crystal layer. Correspondingly, the phase scales inversely with the wavelength. The fact that the phase of a wave diffracted by an SLM is only defined modulo 2π can be used to add different multiples of 2π to each pixel of a DP without changing its diffraction properties at the design wavelength. However, the situation changes if the DP is reconstructed at another wavelength: the total phase then scales inversely with the wavelength, and taking this phase modulo 2π gives completely different results, depending on the absolute phase value of a pixel. This behavior can be exploited by choosing, for each pixel, an appropriate phase offset (in multiples of 2π) such that two template phase values (modulo 2π) at two (or even more) wavelengths are closely approximated simultaneously. Since this can be done for each DP pixel individually, two (or more) independent phase patterns can be combined into a single pattern with a larger phase range, which reconstructs the original patterns at their design wavelengths. The detailed algorithm is described in [13–15].
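As a rough illustration of this per-pixel combination step, the following sketch (a simplification under stated assumptions, not the authors' exact implementation of [13–15]) selects, for every pixel, the SLM gray level whose calibrated phase retardations at the two wavelengths best match the two template phases simultaneously. The lookup tables and the linear calibration at the end are hypothetical placeholders; a real device requires measured calibration curves.

```python
import numpy as np

def combine_patterns(target1, target2, phi1_lut, phi2_lut):
    """Combine two single-wavelength phase templates into one multi-order pattern.

    target1, target2 : 2D arrays, desired phases (modulo 2*pi) at the two wavelengths.
    phi1_lut, phi2_lut : 1D arrays, calibrated phase retardation of each SLM gray
        level at the two wavelengths, spanning several multiples of 2*pi.
    Returns a gray-level pattern that approximates both templates simultaneously.
    """
    best_cost = np.full(target1.shape, np.inf)
    best_level = np.zeros(target1.shape, dtype=np.uint8)
    for g, (p1, p2) in enumerate(zip(phi1_lut, phi2_lut)):
        e1 = np.angle(np.exp(1j * (p1 - target1)))   # phase error at lambda_1, wrapped to [-pi, pi)
        e2 = np.angle(np.exp(1j * (p2 - target2)))   # phase error at lambda_2
        cost = e1**2 + e2**2                         # joint squared error for this gray level
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_level[better] = g
    return best_level

# Hypothetical linear calibration (illustrative only):
levels = np.arange(256)
phi_460 = levels / 255 * 7.4 * np.pi      # assumed phase stroke at 460 nm
phi_515 = phi_460 * 460 / 515             # roughly rescaled at 515 nm, dispersion ignored
```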

A concrete example of this procedure is sketched in Fig. 1. For applications in parallel confocal microscopy, both an excitation PSF at a wavelength of 460 nm and a detection PSF at a wavelength of 515 nm have to be controlled independently by an SLM located in a Fourier plane of the optical path. The excitation PSF is manipulated to generate two focal spots on the optical axis in the sample volume, separated by a certain longitudinal (z-)distance. This is achieved by a binary Fresnel lens (displayed in Fig. 1(a), top, not to scale), which adds a positive and a negative divergence (of the same absolute value) to the first and minus-first diffraction orders of the transmitted light beam. The corresponding excitation beam path is sketched in Fig. 1(b) at the top. In the detection path (sketched in Fig. 1(b) at the bottom), the fluorescence light emitted from the two excited spots is again split into two. Of the resulting four beams, two can be focused onto the detector if the detection DP is chosen to be a suitable off-axis binary Fresnel lens (sketched in Fig. 1(a), bottom). Such a lens compensates for the defocus of light emitted from the two different target depths and additionally diffracts the beams sideways, so that the two excitation sites are sharply imaged onto the camera sensor with a certain lateral separation.
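A minimal sketch of how such binary Fresnel patterns could be generated numerically is given below; all parameter values are assumptions for illustration, since the paper does not list the lens powers or carrier frequencies. A binary 0/π Fresnel lens sends its ±1 diffraction orders into lenses of focal length +f and −f, and adding a linear carrier before binarization yields the off-axis variant used in the detection path.

```python
import numpy as np

def binary_fresnel(nx, ny, pitch, wavelength, f, carrier=0.0):
    """Binary (0/pi) Fresnel lens: the +1/-1 diffraction orders carry opposite
    defocus (+f and -f); a nonzero linear carrier (cycles/m along x) additionally
    deflects the two orders sideways (off-axis lens)."""
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    phase = np.pi * (X**2 + Y**2) / (wavelength * f) + 2 * np.pi * carrier * X
    return np.where(np.mod(phase, 2 * np.pi) < np.pi, 0.0, np.pi)

# Illustrative values (not the experimental ones): 600 x 600 pixels of 20 um pitch
excitation_dp = binary_fresnel(600, 600, 20e-6, 460e-9, f=2.0)               # bifocal excitation
detection_dp = binary_fresnel(600, 600, 20e-6, 515e-9, f=2.0, carrier=5e3)   # off-axis detection
```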


Fig. 1 Principle of combining diffractive patterns for two-color operation, used for parallel imaging. (a): The two diffractive patterns (top and bottom) are individually calculated for operation at their respective wavelengths of 460 nm (top) and 515 nm (bottom), each covering a phase range of 2π at its design wavelength (gray levels correspond to the voltage levels applied at the respective SLM pixels). Note that the voltage covers only a range between 0 and 72 at 460 nm, and between 0 and 85 at 515 nm, out of 256 accessible voltage levels. With the method described in the text, the two diffractive phase patterns are combined into a single one (middle), which now covers the entire accessible voltage range. (b): The SLM displaying the combined phase pattern is placed in a plane conjugate to the back aperture of the microscope objective. Illumination with the excitation beam at 460 nm accesses the first diffractive pattern (top pattern in (a)), producing a PSF with two diffraction-limited spots in two different z-planes. The corresponding fluorescence light at 515 nm, passing back through the objective to the SLM, accesses the second diffractive pattern (bottom of (a)), which produces a PSF with two spots focusing at separate positions of the camera chip, again with a depth separation corresponding to the excitation spots. As a result, the camera detects two sharp, in-focus images of the two excitation spots in the sample.


Using the above-mentioned method to combine the two DPs for individually shaping excitation and emission light, one obtains a final phase pattern like that displayed in the middle of Fig. 1(a) (not to scale). This pattern exploits the entire available phase range of the SLM. The efficiency of each of the two DPs reconstructed at its design wavelength is about 70% of the value achievable with a DP optimized for just one wavelength, and the crosstalk, i.e. the diffraction efficiency of either wavelength at the “wrong” DP, is about 10%. Note that these values are worse than those reported in [13, 14], where efficiencies on the order of 85% and crosstalk levels below 2% have been achieved, because the separation between the excitation wavelength of 460 nm and the detection wavelength of 515 nm is significantly narrower than in the previous experiments (430 nm and 520 nm, respectively).
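Efficiency and crosstalk figures of this kind can be estimated numerically by projecting the phase actually realized at each wavelength onto the respective target pattern. The sketch below is a toy check with two blazed-grating templates and the same assumed linear phase calibration as above (illustrative numbers, not the paper's evaluation procedure): the overlap with the intended grating gives the efficiency, the overlap with the other grating the crosstalk.

```python
import numpy as np

def overlap_efficiency(realized, target):
    """Power fraction diffracted into the order defined by the phase template
    'target' (exact for a blazed-grating template, used here as a test case)."""
    return np.abs(np.mean(np.exp(1j * (realized - target))))**2

# Two blazed-grating test templates (periods 8 px and 5 px) and an assumed
# linear calibration; all numbers are illustrative, not the experimental ones.
X = np.arange(128)[None, :] * np.ones((128, 1))      # x coordinate in pixels
t460 = np.mod(2 * np.pi * X / 8.0, 2 * np.pi)
t515 = np.mod(2 * np.pi * X / 5.0, 2 * np.pi)
lut460 = np.arange(256) / 255 * 7.4 * np.pi
lut515 = lut460 * 460 / 515

# per-pixel gray level minimizing the joint wrapped phase error (cf. Sec. 2)
e1 = np.angle(np.exp(1j * (lut460[:, None, None] - t460)))
e2 = np.angle(np.exp(1j * (lut515[:, None, None] - t515)))
g = np.argmin(e1**2 + e2**2, axis=0)

print("460 nm efficiency:", overlap_efficiency(lut460[g], t460))  # desired grating
print("460 nm crosstalk: ", overlap_efficiency(lut460[g], t515))  # 'wrong' grating
print("515 nm efficiency:", overlap_efficiency(lut515[g], t515))
print("515 nm crosstalk: ", overlap_efficiency(lut515[g], t460))
```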

3. Experiments

A sketch of the experimental setup is shown in Fig. 2. A collimated, expanded excitation laser beam with a wavelength of 460 nm, emitted by a single-mode diode laser, passes a first beam splitter (BS) and a folding mirror, which reflects it onto the surface of a phase-only, reflective SLM (Hamamatsu LCOS SLM X10468-01, 792 × 600 pixels, pixel size: 20 × 20 μm2) at a small angle of incidence of about 10° with respect to the surface normal. The DPs displayed on the SLM can be modulated over a phase range of up to 7.4π at our excitation wavelength of 460 nm, and up to 6.2π at the emission wavelength of 515 nm. The SLM displays a phase pattern like that sketched in Fig. 1(a), which is designed to control both the excitation and the emission beams individually at their respective wavelengths. Thus it shapes the excitation PSF in order to focus at two different z-planes within the sample volume. The beam then passes two subsequent 4f-relay systems, each containing a galvo scan mirror for beam steering along one axis. The beam then passes a second beam splitter (BS) and reaches the back aperture of a microscope objective, which is conjugated to both galvo mirrors and to the SLM. The objective (Olympus 60x NA 1.25, oil immersion) focuses the excitation beam into the sample volume. For demonstration purposes we used a sample made of an aqueous fluorescein layer sandwiched between a microscope slide and a coverslip. The thickness of the layer was approximately 10 to 20 μm.


Fig. 2 Experimental setup for simultaneous confocal imaging in two different z-planes. Details are explained in the text.


In the detection path, the fluorescent light emitted by the sample passes again through the objective. For control purposes, part of the fluorescent light is reflected by the second BS and focused onto a first camera (cam 1). Thus, camera 1 records the excitation light pattern within the fluorescent sample, as shown in (b), top. Note that the two excitation spots cannot be resolved, as they lie on top of each other. The remainder of the fluorescence light passes the scanning and relay system and is again diffracted by the DP at the SLM, which, at the emission wavelength, is designed to split the beam into two conjugate diffraction orders. The detection DP adds positive/negative tilt and defocus to these orders such that the two excitation sites in the sample volume are sharply imaged onto camera 2 (Hamamatsu ORCA-Flash4.0 V2) at a specific lateral distance. The corresponding image recorded with camera 2 is displayed in (b), bottom, where two images of the fluorescent surface appear side by side. Note that each of the two spots actually consists of two beams, one of which is focused, while the other focuses in a plane 4 μm above/below the visible spot. However, with high-NA imaging, only one of the planes is sharply in focus, whereas the other beam just produces a diffuse background, which slightly lowers the image contrast. During the scanning operation, a small region of the camera sensor containing these two foci is continuously read out at a frame rate of 400 Hz. The collected data are processed using pixel reassignment [16] to render sharp confocal images, a concept known as image scanning microscopy (ISM) [17, 18].
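The pixel-reassignment step itself can be summarized in a few lines. The sketch below is a minimal version under simplifying assumptions: a single confocal spot per camera ROI (the actual experiment processes two spots, one per image half), hypothetical variable names, and nearest-pixel accumulation; it is not the authors' reconstruction code.

```python
import numpy as np

def ism_reassign(frames, scan_xy, spot_center, cam_px_per_scan_px, out_shape, alpha=0.5):
    """Minimal image scanning microscopy (ISM) reconstruction by pixel reassignment.

    frames      : (N, h, w) camera ROIs, one per scan position
    scan_xy     : (N, 2) scan positions (x, y) in output-pixel units
    spot_center : (row, col) nominal position of the confocal spot within each ROI
    alpha       : reassignment factor, 0.5 for matched excitation/detection PSFs
    """
    image = np.zeros(out_shape)
    h, w = frames.shape[1:]
    dy = (np.arange(h) - spot_center[0]) / cam_px_per_scan_px   # detector offsets (rows)
    dx = (np.arange(w) - spot_center[1]) / cam_px_per_scan_px   # detector offsets (cols)
    DY, DX = np.meshgrid(dy, dx, indexing="ij")
    for frame, (sx, sy) in zip(frames, scan_xy):
        # each detected count is reassigned halfway between scan and detection position
        iy = np.round(sy + alpha * DY).astype(int)
        ix = np.round(sx + alpha * DX).astype(int)
        ok = (iy >= 0) & (iy < out_shape[0]) & (ix >= 0) & (ix < out_shape[1])
        np.add.at(image, (iy[ok], ix[ok]), frame[ok])
    return image
```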

As an example, the method is used to simultaneously record two confocal images from different depths within a GFP-labelled mouse brain slice, which is a highly scattering tissue. Figure 3 shows the results of such a scan. The left column shows two confocal images from two z-planes with an axial distance of 4 μm, simultaneously recorded by camera 2. This is compared with images of the same two planes which were sequentially acquired (second column). In this case, the SLM was used only as a plane mirror, i.e. without displaying any structured DP, and image acquisition was performed as in standard ISM. Comparing the simultaneously acquired images with the respective “standard” (sequentially acquired) ISM images reveals no significant deterioration caused by the parallel imaging method. For further comparison, a standard wide-field image of the sample (recorded with camera 1) is displayed at the right side of the figure. As expected, no details from different sample planes can be resolved in this image. Since each fluorescence image is split into two in the case of simultaneous acquisition, the respective image intensities (the total photon counts in each image are indicated in the respective figures) are reduced by a factor of approximately 3, which is more than the ideally expected factor of 2, probably due to some diffusely scattered light components at the DPs displayed on the SLM.


Fig. 3 Comparison of two simultaneously acquired images of different depths (left column) in a GFP-labelled mouse brain slice with a control experiment in which the same planes were sequentially acquired (second column). The axial separation between the planes is 4 μm. The insets give the total number of pixel counts, which are proportional to the respective image intensities. For comparison, a wide-field fluorescence image of the same sample is shown at the right.


The potential of the method for parallel imaging of a larger number of confocal spots is indicated in Fig. 4. Here, a sandwiched fluorescein sample was used for demonstration. The SLM was programmed to generate a PSF consisting of 5×5 spots (focused in the same plane), which can be used to acquire a confocal image of the whole sample by scanning only a square area with a side length corresponding to the distance between two spots. In Fig. 4(a) the corresponding DP was programmed to control only the excitation wavelength at 460 nm (corresponding voltage pattern shown in Fig. 4(a) at the left). The resulting excitation pattern, made visible by fluorescent molecules at the plane sample surface, is recorded by camera 1 (middle of row (a)) and shows the expected spot array. However, the image of camera 2, which is situated such that it records the fluorescence light after it has passed the SLM again, shows only a blurred image, due to the second diffraction process at the DP. Thus, the resulting image is a cross-correlation between the excitation and detection PSFs (which in this case are identical, if one neglects Stokes-shift-induced effects), i.e. the 5×5 spot array cross-correlated with itself, which appears correspondingly blurred. Figure 4(b) shows the same experiment repeated with a DP that is programmed to provide the previous 5×5 spot array in the excitation path, but just a uniform phase distribution for the emission light, thus acting as a plane mirror. Accordingly, the image of camera 1 (which records the excitation pattern) shows the spot array, and the image recorded by camera 2 now also shows the same, almost undisturbed spot array. These experiments indicate that programming the SLM with DPs controlling both excitation and emission light simultaneously allows one to double-pass the SLM without significant image deterioration, which may be advantageous in several practical implementations.
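One simple way to compute such a multi-spot DP is the "gratings and lenses" recipe sketched below: the phase of a superposition of blazed gratings, each optionally carrying its own defocus term for an individual z-position. All parameters are illustrative assumptions, and an iterative hologram design (Gerchberg-Saxton type), as presumably used for optimized patterns, would give more uniform spots.

```python
import numpy as np

def spot_array_dp(nx, ny, pitch, wavelength, focal, spacing, n=5, zs=None, seed=0):
    """Phase-only DP for an n x n spot array behind a lens of focal length 'focal':
    phase of a superposition of tilt (lateral shift) and defocus (axial shift) terms."""
    rng = np.random.default_rng(seed)
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    k = 2 * np.pi / wavelength
    offsets = (np.arange(n) - (n - 1) / 2) * spacing
    field = np.zeros_like(X, dtype=complex)
    for i, ox in enumerate(offsets):
        for j, oy in enumerate(offsets):
            tilt = k * (ox * X + oy * Y) / focal                 # lateral spot displacement
            z = 0.0 if zs is None else zs[i, j]
            defocus = -k * z * (X**2 + Y**2) / (2 * focal**2)    # axial spot displacement
            field += np.exp(1j * (tilt + defocus + 2 * np.pi * rng.random()))
    return np.mod(np.angle(field), 2 * np.pi)

# e.g. 5 x 5 spots in one focal plane, 3 um apart, for an effective focal length of 3 mm:
dp_excitation = spot_array_dp(600, 600, 20e-6, 460e-9, focal=3e-3, spacing=3e-6)
```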


Fig. 4 Example of colored PSF engineering using a more complex excitation and filtering pattern of 5 × 5 spots. (a) shows a diffractive pattern designed only for the excitation wavelength (460 nm), i.e. it covers a phase range of 2π at that wavelength, corresponding to a voltage range between 0 and 72. Camera 1 directly images the excitation pattern in the sample plane, showing the corresponding pattern of fluorescent spots. The image of camera 2 is acquired after the light has passed the SLM in both the excitation and detection paths, and thus records a convolution of the excitation and detection PSFs (of different colors), which in this case (with an uncontrolled detection PSF) is a diffuse pattern of multiplexed spots. (b) shows the same experiment with a combined diffractive pattern, designed to create a 5 × 5 spot array for the excitation light (at 460 nm, see cam 1) and just a plane-mirror function at the emission wavelength of 515 nm. As before, the image of camera 2 shows the convolution between excitation and imaging PSFs, which in this case corresponds to the almost undisturbed spot array.


The results of further experiments indicate other potential applications of the method. Figure 5(a) shows the result obtained with a DP which combines a 5×5 spot array optimized for the excitation wavelength with a spiral phase function of order 5 (a higher order was chosen to make the structures more discernible), designed for the fluorescence emission wavelength. The corresponding cross-correlation thus produces a 5×5 array of doughnut rings. In a practical application it would be advantageous to choose a helicity of 1 for the spiral phase function, which would pave the way for multiplexed confocal spiral phase contrast imaging [19]. Note that the seemingly strong zero-order intensity in the center of the image consists mainly of the first diffraction order of the “wrong” diffractive pattern (which produces a focused doughnut in the image center), leaking through with an efficiency of about 10% due to the crosstalk between the two patterns. This crosstalk can be strongly reduced if a fluorescent dye with a larger separation between excitation and emission wavelengths is chosen.
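The emission-path DP in this case is simply a spiral (vortex) phase mask; a short sketch (with assumed grid parameters) is:

```python
import numpy as np

def spiral_phase(nx, ny, charge=5):
    """Spiral (vortex) phase mask of the given helical charge for the emission path;
    charge=1 would realize the edge-enhancing spiral phase contrast filter of [19]."""
    x = np.arange(nx) - nx / 2
    y = np.arange(ny) - ny / 2
    X, Y = np.meshgrid(x, y)
    return np.mod(charge * np.arctan2(Y, X), 2 * np.pi)

# combined with the 5 x 5 excitation DP via the two-wavelength method of Sec. 2
emission_dp = spiral_phase(600, 600, charge=5)
```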


Fig. 5 In (a) the diffractive pattern for the excitation wavelength (460 nm) was designed as the 5 × 5 spot array shown in Fig. 4, whereas the DP in the imaging path (515 nm) was designed to produce a doughnut beam of order 5. The corresponding light pattern seen by camera 2 is a cross-correlation of both, and thus corresponds to an array of doughnut rings. (b) shows the result of a similar experiment in which the DP for the excitation beam was calculated to produce a 5 × 5 spot array, projected onto the surface of the sample, with the spots programmed to focus in different (randomly distributed) z-planes. The DP in the imaging path was designed to produce a double-helix PSF. The corresponding image seen by camera 2 thus shows an array of double helices, which stem from excitation spots that are in focus and out of focus at the fluorescent surface. An attached movie shows the effect of scanning the z-position of the sample, which results in a rotation of the helical PSFs. During the axial scan, the helical PSFs that come into focus at the sample surface appear with a different orientation than those focusing in other planes (see Visualization 1).


Figure 5(b) shows a similar experiment, with a DP producing a 5×5 spot array in the excitation path, however with each spot focusing at a different, randomly chosen depth (axial positions distributed within a range of 10 μm). This DP is combined with a phase function producing a double-helix PSF in the emission path. The corresponding phase pattern is displayed in Fig. 5(c). The resulting cross-correlation between excitation and detection PSFs thus produces an array of 5×5 helical PSFs. Each helical PSF consists of two intensity maxima arranged on a line, with the actually emitting fluorophore located at its center. Interestingly, the orientation of the line connecting the two maxima is an extremely sensitive measure of the axial position of the emitter, which was used in previous experiments to obtain axial superresolution in confocal imaging [20, 21]. As a result, camera 2 records an array of 5×5 double-helix PSFs, which are sharply imaged only for the excitation spots focused at the plane sample surface, and blurred for defocused spots. An attached movie (Visualization 1) shows the effect of scanning the axial position of the sample over a range of 4 μm. Since, with changing distance between the microscope objective and the sample surface, other spots become sharply focused at the sample surface, these spots now produce sharp images of their double-helix PSFs. Simultaneously, the movement of the sample is accompanied by a rotation of the PSFs, which allows one to accurately measure the axial displacement.
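Extracting the axial position from such data amounts to measuring the orientation of the line connecting the two lobes and mapping it to depth with a calibration. The sketch below illustrates the idea with hypothetical thresholds and a linear angle-to-z model; it is not the authors' analysis code.

```python
import numpy as np
from scipy.ndimage import center_of_mass, label, maximum_filter

def helix_angle(roi):
    """Orientation (rad, pi-periodic) of the line connecting the two lobes of a
    double-helix PSF, from the two brightest local maxima of a background-subtracted ROI."""
    img = roi - np.median(roi)
    peaks = (maximum_filter(img, size=5) == img) & (img > 0.3 * img.max())
    labels, n = label(peaks)
    coms = center_of_mass(img, labels, index=range(1, n + 1))
    # keep the two lobes with the highest peak intensity
    coms = sorted(coms, key=lambda c: -img[int(round(c[0])), int(round(c[1]))])[:2]
    (y1, x1), (y2, x2) = coms
    return np.arctan2(y2 - y1, x2 - x1) % np.pi

def angle_to_z(theta, slope_rad_per_um, theta0):
    """Convert lobe orientation to an axial position using a calibrated linear model."""
    return (theta - theta0) / slope_rad_per_um
```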

4. Discussion

We have shown that the extended phase modulation range of SLMs can be used to shape the PSFs for different wavelengths separately. This was used for simultaneous confocal imaging of a sample in two separate planes. Extending the method to scanning even more planes is possible by creating more excitation spots and splitting the field in the camera plane into a correspondingly larger number of images, each sharply focusing a different excited sample plane. Although this goes along with a reduction of the image intensity in each of the recorded sample planes and increasing scattering-induced crosstalk between channels, such a scheme might be advantageous in situations where it is important to observe events in multiple zones across a volume simultaneously, such as in neuronal signalling. Multi-channel detectors supporting the required acquisition speed for such applications are already available.

Independent control over excitation and emission wavelengths also suggests other applications, such as parallel confocal spiral phase contrast microscopy, or parallel superresolution microscopy using advanced Fourier manipulation methods such as helical point spread function engineering for accurate depth measurements within sample volumes.

Funding

ERC Advanced Grant 247024 catchIT.

References and links

1. M. Liang, R. L. Stehr, and A. W. Krause, "Confocal pattern period in multiple-aperture confocal imaging systems with coherent illumination," Opt. Lett. 22, 751–753 (1997).

2. M. A. A. Neil, T. Wilson, and R. Juskaitis, "A light efficient optically sectioning microscope," J. Microsc. 189, 114–117 (1998).

3. R. Gräf, J. Rietdorf, and T. Zimmermann, "Live cell spinning disk microscopy," Adv. Biochem. Eng. Biotechnol. 95, 57–75 (2005).

4. D. Faklis and G. M. Morris, "Spectral properties of multiorder diffractive lenses," Appl. Opt. 34, 2462–2468 (1995).

5. D. W. Sweeney and G. E. Sommargren, "Harmonic diffractive lenses," Appl. Opt. 34, 2469–2475 (1995).

6. S. Noach, A. Lewis, Y. Arieli, and N. Eisenberg, "Integrated diffractive and refractive elements for spectrum shaping," Appl. Opt. 35, 3635–3639 (1996).

7. I. M. Barton, P. Blair, and M. Taghizadeh, "Dual-wavelength operation diffractive phase elements for pattern formation," Opt. Express 1, 54–59 (1997).

8. J. Bengtsson, "Kinoforms designed to produce different fan-out patterns for two wavelengths," Appl. Opt. 37, 2011–2020 (1998).

9. T. R. M. Sales and D. H. Raguin, "Multiwavelength operation with thin diffractive elements," Appl. Opt. 38, 3012–3018 (1999).

10. Y. Ogura, N. Shirai, J. Tanida, and Y. Ichioka, "Wavelength-multiplexing diffractive phase elements: design, fabrication, and performance evaluation," J. Opt. Soc. Am. A 18, 1082–1092 (2001).

11. A. J. Caley, A. J. Waddie, and M. R. Taghizadeh, "A novel algorithm for designing diffractive optical elements for two colour far-field pattern formation," J. Opt. A: Pure Appl. Opt. 7, 276–279 (2005).

12. V. Calero, P. García-Martínez, J. Albero, M. M. Sánchez-López, and I. Moreno, "Liquid crystal spatial light modulator with very large phase modulation operating in high harmonic orders," Opt. Lett. 38, 4663–4666 (2013).

13. A. Jesacher, S. Bernet, and M. Ritsch-Marte, "Colour hologram projection with an SLM by exploiting its full phase modulation range," Opt. Express 22, 20530–20541 (2014).

14. A. Jesacher, S. Bernet, and M. Ritsch-Marte, "Combined holographic optical trapping and optical image processing using a single diffractive pattern displayed on a spatial light modulator," Opt. Lett. 39, 5337–5340 (2014).

15. Y. Shechtman, L. E. Weiss, A. S. Backer, M. Y. Lee, and W. E. Moerner, "Multicolour localization microscopy by point-spread-function engineering," Nat. Photon. 10, 590–594 (2016).

16. C. J. R. Sheppard, "Super-resolution in confocal imaging," Optik 80, 53–54 (1988).

17. C. Müller and J. Enderlein, "Image scanning microscopy," Phys. Rev. Lett. 104, 198101 (2010).

18. C. J. R. Sheppard, S. B. Mehta, and R. Heintzmann, "Superresolution by image scanning microscopy using pixel reassignment," Opt. Lett. 38, 2889–2892 (2013).

19. M. Guillon and M. A. Lauterbach, "Quantitative confocal spiral phase contrast," J. Opt. Soc. Am. A 31, 1215–1225 (2014).

20. R. Piestun, Y. Y. Schechner, and J. Shamir, "Propagation-invariant wave fields with finite energy," J. Opt. Soc. Am. A 17, 294–303 (2000).

21. S. R. P. Pavani, M. A. Thompson, J. S. Biteen, S. J. Lord, N. Liu, R. J. Twieg, R. Piestun, and W. E. Moerner, "Three-dimensional single-molecule fluorescence imaging beyond the diffraction limit using a double-helix point spread function," Proc. Natl. Acad. Sci. USA 106, 2995–2999 (2009).

Supplementary Material (1)

Visualization 1: AVI movie (140 KB), related to Fig. 5.
