Optica Publishing Group

Adaptive optics light-sheet microscopy based on direct wavefront sensing without any guide star

Open Access

Abstract

We propose an adaptive optics light-sheet fluorescence microscope (AO-LSFM) for closed-loop aberration correction in the emission path, providing intrinsic instrumental simplicity and high accuracy compared to previously reported schemes. The approach is based on direct wavefront sensing, i.e., not on time-consuming iterative algorithms, and does not require any guide star, thus reducing instrumental complexity and/or sample preparation constraints. The design is based on a modified Shack–Hartmann wavefront sensor compatible with extended sources such as images from optical sectioning microscopes. We report an AO-LSFM setup based on such a sensor, including characterization of the sensor performance, and demonstrate, for the first time to the best of our knowledge, a significant contrast improvement on neuronal structures at depth in the ex vivo adult Drosophila brain.

© 2019 Optical Society of America

When targeting reliable diagnosis and efficient therapy of neurological diseases, understanding brain function is of key importance. Neuroimaging now benefits from spectacular breakthroughs in biochemistry and microscopy, thanks to the availability of new genetically encoded reporters such as GCaMP and of advanced optical sectioning techniques such as light-sheet or multiphoton microscopy [1–3]. These techniques can provide structural and functional images of neuronal networks with high spatiotemporal resolution in semi-transparent animal models such as zebrafish or Drosophila larvae [4,5], with minimized phototoxicity, in particular in the case of light-sheet microscopy. Still, one major limitation of such techniques when imaging at large depths is the presence of optical aberrations, which arise from inhomogeneities of the sample as well as from residual aberrations of the optical setup.

Thus, there have been considerable recent efforts to develop adaptive optics (AO) methods that compensate for optical aberrations in microscopy and provide increased contrast and resolution in depth [6,7]. Reported approaches mainly differ in the wavefront (WF) sensing method, which falls into two categories: indirect WF sensing, based on iterative algorithms that estimate the WF from its impact on image quality, and direct WF sensing, based on real-time WF measurement from a point source in the imaging plane acting as a “guide star,” as originally defined in astronomical AO. The former makes use of a merit function such as image intensity or sharpness [8,9], does not require a WF sensor, and still provides image enhancement under moderate scattering conditions. It is, however, time-consuming, which increases phototoxicity, whereas direct WF sensing, usually based on a Shack–Hartmann (SH) WF sensor, provides the best reliability, speed, and compatibility with in vivo imaging [7].

Because of its low phototoxicity and speed, the LSFM is now increasingly used for both structural and functional neuroimaging [10]. Only a couple of AO setups using direct WF sensing have been proposed for the LSFM, targeting increased contrast and resolution and mainly applied to the correction of aberrations in the emission path. Because direct WF sensing requires a guide star, reported setups either make use of fluorescent beads in the sample [11], which is practically incompatible with biological studies, in particular in neuroscience, or use an ultrafast laser to locally induce a fluorescent source through a nonlinear multiphoton process together with a scan/descan acquisition geometry [12], at the cost of significant instrumental complexity. Recently, Lawrence et al. [13] proposed an AO light-sheet setup using a scene-based SH WF sensor, as originally developed in astronomy and Earth observation [14,15]; this approach avoids the need for a guide star, providing a simple AO implementation based on direct WF sensing, but aberration correction was not demonstrated.

In this Letter, we report for the first time AO-LSFM images of neuronal structures involved in sleep behavior in the ex vivo adult Drosophila brain, showing significant image quality improvement in depth, using closed-loop AO correction of aberrations in the emission path without the use of a guide star. We describe our experimental setup, including the detailed design and performance characterization of the specific SH WF sensor that provides compatibility with extended sources. We discuss the impact of sample characteristics on AO performance, such as WF sensing accuracy and anisoplanatism, and propose strategies to further enhance the technique regarding photometry and extension of the corrected field-of-view (FOV).

Our optical setup is presented in Fig. 1 and is composed of a light-sheet excitation module and an AO loop included in the emission path. The conventional light-sheet illumination module includes a 488 nm laser source (Cobolt), a 10× NA 0.3 objective lens (Olympus) with a 1 mm diameter diaphragm at the pupil plane to generate a pencil beam, a galvanometer mirror (Thorlabs) used to scan the beam over the FOV, and a custom sample chamber filled with a sample medium, with a 0.17 mm thick coverslip as the optical interface. This illumination setup creates a Gaussian beam with a 7 μm thickness full width at half-maximum (FWHM) and a Rayleigh range of 160 μm, both measured in the object plane. The design of the AO part corresponds to a closed-loop configuration with the deformable mirror (DM) (Mirao52e, Imagine Eyes) placed before the custom, extended-source SH sensor (ESSH). Imaging is performed with a 25× NA 0.95 water-immersion objective lens (Leica) and an sCMOS camera (Hamamatsu Orca Flash v2). A conventional SH (Imagine Optic Haso3 First, 32×40 microlenses, λ/100 accuracy) is also included as a reference SH sensor for performance assessment.


Fig. 1. Optical setup. L1-5, relay lenses; IF, interference fluorescence filter; M, mirror; DM, deformable mirror; D1-2, field diaphragms; BS1-2, 50:50 nonpolarizing beamsplitters; CAM, imaging camera. P1-4 are conjugated pupil planes.


The ESSH WF sensor is made of a 17×23 microlens array, each microlens defining a 43×43 pixel region on a global-shutter scientific CMOS camera with 6.9 μm pixels, a conjugation lens, and an adjustable, square-shaped field diaphragm that avoids crosstalk between subimages formed by adjacent microlenses. The illumination beam scan rate is significantly higher than the ESSH camera acquisition rate, so that each ESSH exposure integrates a uniformly illuminated FOV. The focal length/diameter ratio of each microlens is fmic/Dmic = 17, corresponding to a low numerical aperture (NA), and each subimage is sampled just above the diffraction limit to provide a good tradeoff between the FOV defined by the size of a microlens and the measurement accuracy. WF slopes are directly computed from the positions of intercorrelation peaks between subimages, as previously reported in the field of astronomy [16,17]. Each microlens defines a 132×132 μm FOV in the object plane, and the reconstructed WF, obtained using a conventional zonal method, is representative of an average aberration map over this FOV. Figure 2 shows a raw ESSH image of a fluorescent live cell and comparative images from one microlens and from the LSFM camera.
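As an illustration, the shift between two subimages can be estimated from their intercorrelation peak as sketched below (a minimal NumPy sketch with parabolic sub-pixel refinement; the actual implementation of [16,17] may differ in its peak-fitting details):

```python
import numpy as np

def subimage_shift(ref, img):
    """Estimate the 2D shift of img relative to ref from the peak of their
    circular cross-correlation, with parabolic sub-pixel refinement."""
    # Cross-correlation computed in the Fourier domain
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, float) - np.array(corr.shape) // 2
    # Parabolic interpolation around the peak along each axis
    for ax in (0, 1):
        idx = list(peak)
        c0 = corr[tuple(idx)]
        idx[ax] = (peak[ax] - 1) % corr.shape[ax]
        cm = corr[tuple(idx)]
        idx[ax] = (peak[ax] + 1) % corr.shape[ax]
        cp = corr[tuple(idx)]
        denom = cm - 2.0 * c0 + cp
        if denom != 0:
            shift[ax] += 0.5 * (cm - cp) / denom
    return shift  # (dy, dx) in pixels
```

For a pure translation, the estimate is exact up to the interpolation accuracy; the WF slope over a microlens is then proportional to this shift.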


Fig. 2. (a) Raw ESSH image of a HeLa cell (GFP—tubulin). (b) Image on scientific camera. (c) Zoom on one ESSH subimage.


Since WF measurement accuracy drives AO performance, characterization of the custom ESSH sensor is required. For this purpose, we use the reference SH sensor to drive the DM, such that a set of pure Zernike modes of known amplitudes is generated, taking into account aberrations of the optical setup as well as differential aberrations between the two WF sensing paths. A sample made of sparse 2 μm fluorescent beads (ThermoFisher, emission λ = 515 nm) deposited between two coverslips provides both a guide star for the SH sensor, by selecting one bead at the center of the FOV using field diaphragm D2, and an extended source for the ESSH, by using its field diaphragm D1 of larger extent. Static aberrations of the optical system are first measured and corrected before adding known aberrations. Figure 3 shows the WF difference between measurements from the two sensors, as computed using Zernike coefficients, demonstrating a relative accuracy better than λ/50 over a range of ±200 nm rms of induced 3rd-order aberrations. Since the two sensors have a significant difference in spatial resolution and since the two WF measurement paths have different magnifications, the calculation of the Zernike pupil size and center for each sensor is likely to exhibit some relative inaccuracy: this probably explains the increasing residual WF difference at larger generated WF amplitudes, particularly for spherical aberration, which is very sensitive to pupil edge effects. These results nonetheless demonstrate good accuracy of the ESSH, adapted to AO-based imaging.
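A sketch of such a cross-calibration metric is given below, assuming both sensors report Zernike coefficients in an orthonormal (Noll-normalized) basis, in which case the rms of the WF difference is simply the Euclidean norm of the coefficient differences. The example values are hypothetical, for illustration only:

```python
import numpy as np

def wf_rms_difference(coeffs_a, coeffs_b):
    """rms of the WF difference between two sensors, from their Zernike
    coefficient vectors (assumed Noll-normalized, same units, e.g. nm rms)."""
    d = np.asarray(coeffs_a, float) - np.asarray(coeffs_b, float)
    return float(np.sqrt(np.sum(d ** 2)))

# Hypothetical example: 100 nm of astigmatism generated on the DM, with small
# residual coma/spherical discrepancies reported by the ESSH.
c_ref_sh = [100.0, 0.0, 0.0]
c_essh = [95.0, 3.0, -2.0]
residual = wf_rms_difference(c_ref_sh, c_essh)  # nm rms
```

At λ = 515 nm, a residual of a few nm rms corresponds to the λ/50-level agreement reported in Fig. 3.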


Fig. 3. ESSH WF measurement accuracy for 3rd-order aberrations as compared to the reference SH.


Since the ESSH method is based on imaging, WF measurement accuracy is also driven by the characteristics of the image of the sample through the microlenses, whereas the conventional SH approach benefits from diffraction-limited images of a mandatory point source. In particular, the spatial frequency content of the subimages formed by the ESSH impacts the geometry of the intercorrelation peaks and, as a consequence, their localization accuracy, which drives the WF measurement accuracy. Each microlens of the ESSH acts as a low-pass filter in the Fourier domain, with an NA of NAmic ≈ Dmic/2fmic ≈ 0.03, as compared to the NA of 0.95 of the microscope objective. Intercorrelations of ESSH subimages are thus based on low spatial frequencies of the object. To characterize the impact of this parameter, we simulated objects of variable 2D spatial frequency (generated from random sets of 2D sine patterns), computed the corresponding image through a microlens, sampled in accordance with our ESSH design, calculated its localization error for a given image shift, and converted this error to the proportional WF error at the level of a microlens.
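A stripped-down version of such a simulation might look as follows (hypothetical subimage size and cutoff; the published simulation uses sub-pixel shifts and ESSH-accurate sampling, whereas this sketch uses integer shifts and a simple correlation-peak estimate):

```python
import numpy as np

rng = np.random.default_rng(1)

def lowpass(img, fc):
    """Ideal low-pass filter at cutoff fc (cycles/pixel), mimicking the
    microlens acting as a low-pass filter (NA ~0.03 vs. 0.95)."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    keep = np.hypot(fy, fx) <= fc
    return np.fft.ifft2(np.fft.fft2(img) * keep).real

def shift_estimate(ref, img):
    """Integer-pixel shift from the intercorrelation peak."""
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return np.array(peak) - np.array(corr.shape) // 2

# Hypothetical object: sum of random 2D sine patterns below the cutoff
n, fc = 43, 0.25
y, x = np.mgrid[:n, :n]
obj = np.zeros((n, n))
for _ in range(10):
    f = rng.uniform(0.05, 1.0) * fc                 # frequency below cutoff
    theta, phi = rng.uniform(0, 2 * np.pi, 2)       # random direction, phase
    obj += np.sin(2 * np.pi * f * (np.cos(theta) * x + np.sin(theta) * y) + phi)

ref = lowpass(obj, fc)
img = np.roll(ref, (2, -1), axis=(0, 1))            # known applied shift
err = shift_estimate(ref, img) - np.array([2, -1])  # localization error
```

The localization error, converted to WF slope over a microlens, yields the WF error plotted as a function of normalized object frequency in Fig. 4.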

The results are presented in Fig. 4: the expected accuracy is better than λ/20 over the whole spatial frequency range of a microlens, except for very low frequencies corresponding to very smooth patterns, which fail to provide accurate intercorrelation but can easily be rejected by high-pass prefiltering of the subimages. Moreover, for the highest spatial frequencies, typically corresponding to the fluorescent beads used in the previous experiment, the expected accuracy is better than λ/50, consistent with the results of Fig. 3. WF sensing based on the ESSH is likely to fail for objects solely containing high spatial frequencies, in particular above the cutoff frequency of a microlens. When studying samples where the structures of interest are very small, this suggests that an additional labeling, for example structural, might benefit the reported approach—all the more so since such labeling can be done at a dedicated emission wavelength used specifically for WF measurement, avoiding the need to share photons between the WF sensing and imaging paths, as done in the present straightforward setup.
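The high-pass prefiltering of subimages mentioned above could, for instance, be implemented as a smooth Fourier-domain filter; the threshold f_lo below is a hypothetical value, to be tuned on real subimages:

```python
import numpy as np

def highpass_prefilter(sub, f_lo=0.02):
    """Suppress very low spatial frequencies (below ~f_lo cycles/pixel)
    before intercorrelation; a smooth (Gaussian) rolloff avoids ringing."""
    fy = np.fft.fftfreq(sub.shape[0])[:, None]
    fx = np.fft.fftfreq(sub.shape[1])[None, :]
    f = np.hypot(fy, fx)
    gain = 1.0 - np.exp(-(f / f_lo) ** 2)  # ~0 at DC, ~1 well above f_lo
    return np.fft.ifft2(np.fft.fft2(sub) * gain).real
```

The filter removes the smooth background that broadens the intercorrelation peak while leaving the informative mid frequencies essentially untouched.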


Fig. 4. WF error as a function of the sample normalized spatial frequency (fcESSH = cutoff frequency of a microlens). Each point (blue dot) is the average WF error computed from a set of 10 sine patterns randomly generated—to take into account sampling effects—and a corresponding set of 10 image shifts randomly distributed over a range describing a WF error of ±λ. Error bars are thus calculated from 100 measurements.


To demonstrate the image quality improvement of the reported AO-LSFM setup, we imaged a freshly dissected adult Drosophila brain, with sleep neurons expressing green fluorescent protein [GFP (23E10-GAL4 > UAS-CD8-GFP)], without any sample processing such as fixation or clarification. Figure 5 shows raw AO-LSFM images of cell bodies of the ExF12 neuronal group of the dorsal fan-shaped body [18] at a depth of approximately 40 μm, without and with AO, and without any deconvolution applied. In this experiment, static aberrations of the optical setup as well as differential aberrations between the imaging camera and the WF sensing path have been precompensated, so that the comparative images show only the effect of correcting aberrations arising from the sample. The AO loop runs at approximately 2 Hz in this experiment, currently limited by the amount of signal available to the ESSH, since the fluorescence signal is shared between imaging and WF sensing in this first version of the setup. The WF residual drops from 80 to 25 nm rms through the AO process, based on the correction of the first 30 modes of the interaction matrix. This residual approximately corresponds to λ/20, below the Maréchal criterion for a diffraction-limited image.


Fig. 5. GFP-expressing ExF12 neurons of the dorsal fan-shaped body around 40 μm deep inside a freshly dissected adult Drosophila brain, without (top) and with (bottom) AO. (a)–(d) Intensity profiles of cell bodies along the pink and green lines showing SNR and resolution enhancement with AO. Inset: 350×350 μm² full-FOV image. Dotted square: ESSH/AO correction FOV.


Intensity profiles along cellular structures show clear contrast and resolution enhancement using AO. On a small cell body of about 10 μm in diameter, profiles (pink line in Fig. 5) show a 50% increase in contrast, while on a larger cell, profiles (green line in Fig. 5) show that AO correction provides a sharper cell contour. The FWHM of the profiles without and with AO decreases from 12 to 9 μm, suggesting a significant resolution increase already at moderate depth. The ESSH FOV is marked using a dotted square in Fig. 5: structures outside this area do not fully benefit from AO correction.
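The FWHM of such an intensity profile can be extracted, for example, by linear interpolation at the half-maximum crossings (a generic sketch, not necessarily the exact procedure used here; the background is taken as the profile minimum):

```python
import numpy as np

def fwhm(profile, pixel_size):
    """FWHM of a 1D intensity profile, with linear interpolation at the
    half-maximum crossings; the background is taken as the profile minimum."""
    p = np.asarray(profile, float)
    p = p - p.min()
    half = p.max() / 2.0
    above = np.where(p >= half)[0]
    left, right = int(above[0]), int(above[-1])

    def crossing(i, j):
        # Linear interpolation of the half-max crossing between samples i, j
        return i + (half - p[i]) / (p[j] - p[i]) * (j - i)

    lo = crossing(left - 1, left) if left > 0 else float(left)
    hi = crossing(right + 1, right) if right < len(p) - 1 else float(right)
    return (hi - lo) * pixel_size
```

Applied to the profiles of Fig. 5 with the camera pixel size in the object plane, this yields the 12 μm and 9 μm values quoted above.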

In the ESSH method applied to microscopy, as previously explained, NAmic is much smaller than NAobj. The depth of focus (DoF) of a microlens is thereby significantly larger than the DoF of the objective, so that each microlens provides an in-focus image of the sample over a large axial extent. Therefore, to provide an accurate WF measurement corresponding to the imaging plane, optical sectioning is a mandatory feature when using the ESSH, so that intercorrelations are computed on structures of the imaging plane only. This corresponds to providing a “guide plane” for the ESSH, instead of a guide star for the SH. This constraint is, however, quite acceptable, since modern microscopy techniques such as the LSFM rely on optical sectioning to provide 3D imaging with an enhanced SNR. Also, we argued that, because the ESSH approach uses low spatial frequencies of the sample, a supplementary structural labeling would benefit both WF sensing robustness and photometry. This constraint is also more acceptable for biologists than using beads, such labeling being already widely used for anatomical mapping, in conjunction or not with specific functional reporters, in particular in neuroimaging. We will implement dual-labeling sample preparation and corresponding detection in the next version of our AO-LSFM setup.
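The DoF mismatch can be quantified with the common approximation DoF ≈ nλ/NA² (the exact prefactor depends on the criterion chosen); with the values of the reported setup, the microlens DoF exceeds that of the objective by roughly three orders of magnitude, which is why a "guide plane" is needed:

```python
# Rough depth-of-focus comparison, DoF ~ n * lambda / NA^2, using values
# from the reported setup (approximation only; prefactors vary by criterion).
lam = 0.515e-6            # GFP emission wavelength (m)
n_medium = 1.33           # water-immersion medium
na_mic, na_obj = 0.03, 0.95

def dof(na):
    return n_medium * lam / na ** 2

dof_ratio = dof(na_mic) / dof(na_obj)  # = (na_obj / na_mic)^2, about 1000
```

With these numbers the microlens DoF is several hundred micrometers, while the objective DoF is below a micrometer: without optical sectioning, out-of-plane structures would contribute to the intercorrelations.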

Sampling of the subimages in the ESSH is a key design parameter: each subimage defines the WF measurement FOV, and a minimal sampling is required to ensure accurate intercorrelation, so that more pixels per microlens are necessary than in a conventional SH. As a result, the ESSH usually provides coarser WF sampling than an SH, or requires the use of megapixel cameras. This might be seen as a limitation when targeting high-order aberration characterization. In particular, this design parameter needs to be considered carefully in conjunction with the DM used, to ensure that the spatial correction modes of the DM, which depend on its actuator geometry, are sufficiently sampled by the WF sensor. In our setup, the DM exhibits eight actuators along a diameter corresponding to 14 microlenses of the ESSH, which provides proper sampling of the WF with respect to the DM spatial capabilities.

Anisoplanatism is a practical limitation of AO when targeting aberration-corrected images of complex samples over a large FOV. The typical size of the isoplanatic patch for brain samples has been estimated at 30–150 μm [12,19,20], depending on the sample, imaging depth, and optical setup. We used these results to define the FOV of a microlens (132×132 μm²) in our design, as a preliminary tradeoff favoring the corrected image size. Assessment of the optimal isoplanatic patch will be performed as a next step. Also, since the targeted FOV is typically >400 μm, sequential WF measurements with a transverse motion of the diaphragm, each followed by DM compensation, can provide characterization and correction of aberrations over the full FOV of the objective, at the expense of additional acquisition time.

The reported setup provides a new, simple AO method to compensate for aberrations in the emission path of the LSFM. We demonstrated an SNR and contrast enhancement when imaging GFP-expressing neurons tens of microns deep inside a live Drosophila brain. A full AO correction, including strategies for the excitation path, will be implemented as the next step of our instrumental development.

Funding

Agence Nationale de la Recherche (ANR) (INOVAO 2018); IDEX Paris-Saclay, Initiative de Recherche Stratégique 206 (BrainScopes); Institut National de la Santé et de la Recherche Médicale (INSERM).

Acknowledgment

We thank A. Fourgeaud (ESPCI) for providing custom mechanical parts and L. Bourdieu (ENS-IBENS) for fruitful discussion. F. Rouyer is supported by INSERM.

REFERENCES

1. S. Chamberland, H. H. Yang, M. M. Pan, S. W. Evans, S. Guan, M. Chavarha, Y. Yang, C. Salesse, H. Wu, J. C. Wu, T. R. Clandinin, K. Toth, M. Z. Lin, and F. St-Pierre, eLife 6, e25690 (2017).

2. A. M. Packer, L. E. Russell, H. W. P. Dalgleish, and M. Häusser, Nat. Methods 12, 140 (2015).

3. T.-W. Chen, T. J. Wardill, Y. Sun, S. R. Pulver, S. L. Renninger, A. Baohan, E. R. Schreiter, R. A. Kerr, M. B. Orger, V. Jayaraman, L. L. Looger, K. Svoboda, and D. S. Kim, Nature 499, 295 (2013).

4. W. C. Lemon, S. R. Pulver, B. Höckendorf, K. McDole, K. Branson, J. Freeman, and P. J. Keller, Nat. Commun. 6, 7924 (2015).

5. X. Liang, T. E. Holy, and P. H. Taghert, Science 351, 976 (2016).

6. M. J. Booth, Light Sci. Appl. 3, e165 (2014).

7. N. Ji, Nat. Methods 14, 374 (2017).

8. N. Olivier, D. Débarre, and E. Beaurepaire, Opt. Lett. 34, 3145 (2009).

9. M. Pedrazzani, V. Loriette, P. Tchenio, S. Benrezzak, D. Nutarelli, and A. Fragola, J. Biomed. Opt. 21, 036006 (2016).

10. P. J. Keller and M. B. Ahrens, Neuron 85, 462 (2015).

11. R. Jorand, G. L. Corre, J. Andilla, A. Maandhui, C. Frongia, V. Lobjois, B. Ducommun, and C. Lorenzo, PLoS ONE 7, e35795 (2012).

12. T.-L. Liu, S. Upadhyayula, D. E. Milkie, V. Singh, K. Wang, I. A. Swinburne, K. R. Mosaliganti, Z. M. Collins, T. W. Hiscock, J. Shea, A. Q. Kohrman, T. N. Medwig, D. Dambournet, R. Forster, B. Cunniff, Y. Ruan, H. Yashiro, S. Scholpp, E. M. Meyerowitz, D. Hockemeyer, D. G. Drubin, B. L. Martin, D. Q. Matus, M. Koyama, S. G. Megason, T. Kirchhausen, and E. Betzig, Science 360, eaaq1392 (2018).

13. K. Lawrence, Y. Liu, R. Ball, A. J. VanLeuven, J. D. Lauderdale, and P. Kner, in Imaging and Applied Optics (3D, AO, AIO, COSI, DH, IS, LACSEA, LS&C, MATH, pcAOP) (Optical Society of America, 2018).

14. V. Michau, G. Rousset, and J.-C. Fontanella, in Real Time and Post Facto Solar Image Correction, R. R. Radick, ed. (1993), p. 124.

15. V. Michau, J.-M. Conan, T. Fusco, M. Nicolle, C. Robert, M.-T. Velluet, and E. Piganeau, in Atmospheric Optical Modeling, Measurement, and Simulation II (2006), Vol. 6303.

16. M. Rais, J.-M. Morel, C. Thiebaut, J.-M. Delvit, and G. Facciolo, Appl. Opt. 55, 7836 (2016).

17. N. Anugu, P. J. V. Garcia, and C. M. Correia, Mon. Not. R. Astron. Soc. 476, 300 (2018).

18. J. M. Donlea, D. Pimentel, and G. Miesenböck, Neuron 81, 860 (2014).

19. J. Zeng, P. Mahou, M.-C. Schanne-Klein, E. Beaurepaire, and D. Débarre, Biomed. Opt. Express 3, 1898 (2012).

20. J.-H. Park, L. Kong, Y. Zhou, and M. Cui, Nat. Methods 14, 581 (2017).
