
Volumetric light sheet imaging with adaptive optics correction

Abstract

Light sheet microscopy has developed quickly over the past decades and become a popular method for imaging live model organisms and other thick biological tissues. For rapid volumetric imaging, an electrically tunable lens can be used to rapidly change the imaging plane in the sample. For larger fields of view and higher NA objectives, the electrically tunable lens introduces aberrations in the system, particularly away from the nominal focus and off-axis. Here, we describe a system that employs an electrically tunable lens and adaptive optics to image over a volume of 499 × 499 × 192 μm³ with close to diffraction-limited resolution. Compared to the system without adaptive optics, the performance shows an increase in signal to background ratio by a factor of 3.5. While the system currently requires 7 s/volume, it should be straightforward to increase the imaging speed to under 1 s per volume.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Light Sheet Fluorescence Microscopy (LSFM) is a valuable tool for biological imaging because it combines optical sectioning with high speed over a large field of view [1]. An additional advantage is that the overall light-dose can be kept low, minimizing photobleaching and phototoxicity [2]. This makes LSFM an ideal choice for high-speed cellular resolution imaging of model organisms including zebrafish larvae. In particular, LSFM enables rapid imaging of the intact zebrafish nervous system in vivo over long periods of time. The strengths of LSFM have led to many new discoveries in neuroscience research [2–4].

A central goal of modern neuroscience is to obtain a mechanistic understanding of higher brain functions under healthy and diseased conditions [5]. The zebrafish is a vertebrate organism that holds promise for understanding brain function and neurological disorders, due to its small size and transparency in the larval stages. Zebrafish offer a wide range of genetic tools for neurophysiological approaches [5], and the central nervous system (CNS) of the zebrafish larva 4–7 days post fertilization (dpf) consists of approximately 100,000 neurons, making it the simplest vertebrate animal to study [6,7]. In order to understand how the neural circuits function in concert under normal or abnormal conditions, it is essential to be able to measure and monitor neural activity across large regions of the brain over minutes to hours. Imaging the CNS of zebrafish larvae with LSFM has proven to be an excellent method for accomplishing such a task [2–4]. However, in many studies [8–12], only a single 2D plane of the CNS has been imaged. In order to have a clearer understanding of functional connectivity across the CNS, high-speed volumetric imaging is required to follow neural events happening throughout the brain.

In LSFM, a thin sheet of illumination excites a thin slice of the specimen, a few microns thick, around the focal plane of the imaging objective lens. A 3D volumetric image of the sample can be acquired by capturing a series of 2D images along the optical axis of the imaging objective lens. This is most easily achieved by moving the sample across the illumination plane. However, translating the sample is slow, and the associated vibrations can disturb the behavior of live specimens.

Several different LSFM approaches have been developed to achieve fast volumetric imaging [13–16]. Most of these use a galvo scanner to scan the light sheet through the sample. To capture an in-focus image while the light sheet is scanning, the focal plane of the imaging objective must also be scanned to coincide with the illuminated plane [17]. This can be done by scanning the imaging objective with a piezo stage [18], by changing the focal power of a lens in the imaging path [19], or by using remote focusing [20]. Another approach is to extend the depth of focus of the imaging point spread function so that the imaging plane is determined only by the light sheet [13].

A popular method is to add an electrically tunable lens (ETL) in the detection path of the LSFM for scanning the focal plane [21–23]. This method has several advantages. First, there is no mechanical movement of the sample or detection objective, as there is with rapid scanning of the imaging objective. Second, in contrast to the depth of focus extension methods, no post-processing step is required. Lastly, it is a cost effective approach to improving the volumetric imaging speed of the system. This combination was first demonstrated by Fahrbach et al. to investigate the beating heart of a zebrafish [19]. A volumetric imaging speed of 30 Hz over a 100 $\mu$m axial range was reached using this method. Although the ETL allows for an increase in imaging speed, it also introduces aberrations [24–26]. The aberrations vary across the imaging area and, as the ETL is tuned to larger focal powers, grow stronger from the center of the image outwards towards its margins. This constrains the usable field of view and scan range [19,25].

Using a lower NA objective lens is a common practice to keep the aberrations low [19,27,28]. Aberrations are exacerbated when a higher NA objective lens is used; for example, spherical aberration increases with NA$^4$. To compensate, a smaller field of view and shorter tuning range of the ETL are used with higher NA systems [29–31]. Deep learning based deconvolution algorithms have been implemented to improve the image quality [32]. However, a shortcoming of the deconvolution approach is that it cannot recover lost phase information and residual artifacts remain after deconvolution [33].

Adaptive optics (AO) is an effective method to correct optical aberrations in fluorescence microscopy [34–36]. AO uses a dynamic element such as a deformable mirror or spatial light modulator to actively compensate for the phase distortion of the light as it travels through the imaging system. The optimum deformable mirror setting is achieved either by direct wavefront sensing, typically with a Shack-Hartmann wavefront sensor (SHWFS), or by an iterative sensorless method. For direct wavefront sensing, a guide star is generally needed. Because light sheet microscopy uses a widefield detection scheme, implementing such an approach is usually complex and costly [37,38]. A SHWFS can also be used in light sheet microscopy without a guide star by using extended scene wavefront sensing [39,40]. This approach works best with sparse fluorescent labeling that allows features to be easily identified on the SHWFS. On the other hand, the sensorless method provides a cost-effective solution to achieve the best wavefront estimation without additional requirements in terms of sample preparation or a high power pulsed laser. The application of AO correction to the detection side of a LSFM was first shown by Bourgenot et al. [41].

Here, we demonstrate a light sheet microscope system that combines sensorless AO for system aberration correction and an ETL for volumetric imaging with high resolution and a large field of view. The AO reduces the aberrations introduced by the ETL both along the optical axis and off-axis. This system improves the signal to background ratio by a factor of $3.5$ across the entire field of view compared to the same system without AO correction. In addition, we show that this system is fast enough to detect Pentylenetetrazol-induced changes in neuronal calcium associated with seizures in the central nervous system of $5$-$7$ dpf zebrafish larvae.

2. Materials and methods

2.1 LSFM setup

The layout of the LSFM, shown in Fig. 1, is a modification of the design used in our previous work and incorporates the ETL as in Fahrbach et al. [42–44]. In the illumination path, the laser beam is expanded through a set of achromatic doublet lens pairs by a factor of $3$ (Thorlabs AC254-025-A-ML, Thorlabs AC127-075-A-ML). Then a cylindrical lens (Thorlabs ACY254-050-A) forms a line-profile on a two-axis galvo scanner (Thorlabs GVSM002). Two lenses (Thorlabs AC127-050-A-ML, Thorlabs AC127-030-A-ML) image the galvo scanner onto the back pupil plane of the illumination objective (Olympus UMPLFLN 10XW, 0.3 NA) with a magnification of 0.6. The 0.3 NA water-dipping illumination objective forms the light sheet at the focal plane of the detection objective (Olympus LUMPLFLN 40XW, 0.8 NA). The detection path images the sample onto the camera (Hamamatsu Orca Flash 4.0 v2) with an overall magnification of 26.67. The detection objective and tube lens (Olympus U-TLU) form the first image of the sample. This image is then relayed by lens pairs L1-L2 (Thorlabs AC508-100-A-ML, efl 100 mm, and Edmund Optics 45-417, efl 200 mm), L3-L4 (Edmund Optics 45-418, efl 300 mm, and Thorlabs ACT508-200-A-ML, efl 200 mm), and L5-L6 (Thorlabs AC508-300-A-ML and AC508-150-A-ML) to the camera. The tube lens and lens L1 image the back pupil of the detection objective onto the Deformable Mirror (DM) (Boston Micromachines Multi-3.5-DM) with a pupil size of 4 mm. After the DM, the pupil of the detection objective is further relayed onto the ETL (Optotune EL-10-30-TC) and the Shack-Hartmann Wavefront Sensor (SHWFS) with a pupil size of 6 mm. The path is split between the imaging camera and the SHWFS with a non-polarizing beamsplitter (Thorlabs BS013). An offset lens (Thorlabs ACN254-075-A, efl −75 mm) shifts the tuning range of the ETL to $\frac {1}{F_{\textrm{ETL,eff}}}$ from $-3.05$ to $+11.96$ m$^{-1}$. The SHWFS is home-built and consists of a lenslet array (Edmund Optics 86-745, 300 $\mu$m pitch, 4.8 mm efl), a relay lens pair (Thorlabs MAP1075150-A, 2x magnification), and a camera (PCO Edge $4.2$) [39,43]. Here, the SHWFS is used only for calibration of the DM. The green HeNe laser is inserted into the system with the flip mirror (Fig. 1), and the DM influence functions are measured on the SHWFS. The SHWFS can also be used for direct wavefront sensing.
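As a consistency check on the quoted tuning range, the short sketch below treats the ETL and the $-75$ mm offset lens as thin lenses in contact so that their optical powers add; the raw ETL power range used is the one quoted in Section 2.2. This is an illustration of the relationship between raw and effective powers, not part of the instrument software.

```python
# Sketch: effective tuning range of the ETL plus offset lens, assuming two
# thin lenses in contact so that their powers add. The raw ETL power range
# is the one quoted in Section 2.2.
f_offset_mm = -75.0
p_offset = 1000.0 / f_offset_mm              # offset-lens power, ~ -13.33 1/m
p_etl_raw = (10.27, 25.29)                   # raw ETL power range, 1/m
p_eff = tuple(round(p + p_offset, 2) for p in p_etl_raw)
print(p_eff)                                 # ~ (-3.06, 11.96) 1/m, as quoted
```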

Fig. 1. Schematic of the optical setup. The illumination objective is a $10 \times$ $0.3$NA water dipping objective, and the detection objective is a $40 \times$ $0.8$NA water dipping objective. T1: $3 \times$ magnification lens pair ($25$ mm and $75$ mm efl); T2: 5/3 demagnification lens pair ($50$ mm and $30$ mm efl); TL: Tube lens ($180$ mm efl); T3: $8 \times$ magnification lens pair ($25$ mm and $200$ mm efl). L1-2: Relay lenses (f1=$100$ mm and f2=$200$ mm). L3-4: Relay lenses (f3=$300$ mm and f4=$200$ mm); L5-6: magnifying lens pair ($300$ mm and $150$ mm efl). DM: Deformable Mirror. CL: cylindrical lens ($50$ mm efl). OL: offset lens ($-75$ mm efl). ETL: Electrically Tunable Lens. SHWFS: Home-built Shack-Hartmann Wavefront Sensor. The green laser is used to calibrate the DM and the SHWFS. The overall magnification of the detection path is $26.67$.

2.2 ETL and galvo mirror synchronization

The maximum tunable range of the ETL in our system is approximately 211 $\mu$m, which can be calculated from Eq. (1), where the range of $\frac {1}{F_{\textrm{ETL,eff}}}$ is given above. $\delta Z$ is the position of the focal plane relative to the nominal focal plane. For each value of $\delta Z$, the galvo must be positioned to illuminate the corresponding image plane. The general procedure for calibration involves driving the ETL through the axial scanning range while the light sheet is parked at sequential $\delta Z$ positions, see Fig. 2(a).

$$\delta Z = \frac{-1}{M^2_{\textrm{det}}}\cdot \frac{1}{F_{\textrm{ETL,eff}}}\cdot \left(\frac{L_{1}}{L_{2}}\right)^2\cdot L_{3}^2$$
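As an illustration, a minimal numerical sketch of Eq. (1) is given below. $M_{\textrm{det}}$, $L_{1}$, and $L_{2}$ follow the values given in the setup description; the assignment of $L_{3}$ to the 200 mm lens in front of the ETL is our assumption about the notation, made because it reproduces the quoted $\approx 211~\mu$m scan range, and should not be read as the calibration itself.

```python
import numpy as np

# Sketch of Eq. (1): axial shift of the imaging plane for a given effective
# ETL power. M_det, L1, and L2 follow the text; L3 = 0.2 m is an assumption
# about which relay lens enters the formula.
def delta_z(inv_f_eff, m_det=26.67, l1=0.100, l2=0.200, l3=0.200):
    """inv_f_eff in 1/m, focal lengths in metres; returns dz in metres."""
    return -(1.0 / m_det**2) * inv_f_eff * (l1 / l2) ** 2 * l3 ** 2

inv_f = np.array([-3.05, 11.96])             # effective ETL power range, 1/m
dz = delta_z(inv_f)
print(f"axial scan range ~ {1e6 * abs(dz[1] - dz[0]):.0f} um")   # ~ 211 um
```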

Fig. 2. (a) Representation of dual direction scanning of the ETL for a parked light sheet position. (b) Sinusoidal driving signal of the ETL. (c) Plot of the image sharpness indicating the best frame index for each light sheet axial position. (d) Calibrated ETL focal power as a function of the light sheet galvo setting.

We used a sine wave to drive the ETL to $60$ positions within the range from $10.27$ m$^{-1}$ to $25.29$ m$^{-1}$. At each light sheet position, a stack of images is captured as the ETL is scanned through one period of the sine wave. Then the light sheet is moved and the ETL is scanned again. The light sheet is moved in axial steps of $7.64$ microns over a range of 229 microns. Due to hysteresis of the ETL, the optimal settings of the ETL for each light sheet position are slightly different depending on whether the ETL is being driven towards higher or lower optical power, see Fig. 2(d). After these image stacks are captured, each of the stacks is evaluated to determine the ETL setting for which the brightest plane is in focus. This is done by calculating the sharpness metric for each image, shown in Fig. 2(c). We imaged a sample of sparse beads to perform the calibration.
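A minimal sketch of the frame-selection step is shown below, assuming the calibration stacks have already been acquired as a NumPy array of shape (light-sheet positions, ETL frames, y, x). The gradient-energy sharpness measure and the function names are illustrative; the instrument software may use a different sharpness metric.

```python
import numpy as np

def sharpness(img):
    """Gradient-energy sharpness: larger when fine structure is in focus."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.sum(gx**2 + gy**2))

def best_etl_frames(stacks):
    """For each parked light-sheet position, return the index of the ETL
    frame with the sharpest image (one index per position)."""
    n_pos, n_frames = stacks.shape[:2]
    best = np.empty(n_pos, dtype=int)
    for p in range(n_pos):
        scores = [sharpness(stacks[p, f]) for f in range(n_frames)]
        best[p] = int(np.argmax(scores))
    return best

# Rising and falling halves of the sine period can be evaluated separately
# to capture the hysteresis visible in Fig. 2(d).
```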

2.3 Mapped-AO correction

We use a sensorless approach to determine the optimal setting for the deformable mirror [45]. We correct each Zernike mode independently, acquiring images with the Zernike amplitude set to $+0.25~\mu$m, $0\; \mu$m, and $-0.25~\mu$m. The images captured by the sCMOS camera can be evaluated using one of three different image metrics, and the values provided by the metric are fit to a quadratic function to determine the optimum amplitude for each Zernike mode [46]. The metric is evaluated over specific subregions of the image, Fig. 3(a). The three metrics we use are the maximum intensity, the Fourier metric [46], and the RMS metric [47]. The maximum intensity metric is used in regions where a single point-like feature dominates, such as neuronal cell bodies. The Fourier metric is a weighted measure of the high spatial frequencies and is less affected by low intensity [46]. It is used in regions that contain many detailed features and fewer bright regions, such as neuronal axons. The RMS metric, which evaluates the standard deviation of the image, is used for regions with a mix of detailed features and featureless areas, such as the area close to the optic tectum of the fish. The metrics are illustrated in Fig. 3(c). We correct Zernike modes Z4 to Z14, a total of $11$ modes [48]. The aberrations caused by the electrically tunable lens are within the correction capability of the DM up to Z14. Here, we use the RMS metric to correct the bead images.
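A minimal sketch of the mode-by-mode optimisation is given below: three images are acquired at amplitudes of $-0.25$, $0$, and $+0.25~\mu$m, the chosen metric is evaluated on a sub-region, and a parabola through the three values gives the estimated optimum. The functions set_zernike() and acquire_image() are placeholders for the instrument control layer, and the RMS metric shown is one simple choice among the three metrics described above.

```python
import numpy as np

AMPLITUDES = np.array([-0.25, 0.0, 0.25])    # trial amplitudes in micrometres

def rms_metric(img):
    """RMS metric: standard deviation of the pixel values in the sub-region."""
    return float(np.std(img))

def parabolic_optimum(amps, metrics):
    """Fit m(a) = c2*a^2 + c1*a + c0 and return the vertex -c1/(2*c2),
    falling back to the best sampled amplitude if the fit is not concave."""
    c2, c1, _ = np.polyfit(amps, metrics, 2)
    if c2 >= 0:                              # no interior maximum
        return float(amps[int(np.argmax(metrics))])
    return float(np.clip(-c1 / (2 * c2), amps.min(), amps.max()))

def correct_mode(set_zernike, acquire_image, metric, mode, roi):
    """Estimate the optimal amplitude of one Zernike mode on one sub-region."""
    values = []
    for a in AMPLITUDES:
        set_zernike(mode, a)                 # apply the trial amplitude on the DM
        values.append(metric(acquire_image()[roi]))
    set_zernike(mode, 0.0)                   # reset before the next mode
    return parabolic_optimum(AMPLITUDES, np.array(values))

# Modes Z4-Z14 would be treated one after another; the best amplitudes are
# then combined into the mirror setting for that region and axial range.
```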

Fig. 3. Mapped-AO correction for ETL-LSFM. (a) Schematic of the larval zebrafish optic tectum showing the different regions used for AO correction. The red shaded lines indicate the four regions that the image has been divided into, and smaller red arrows mark the radius of the central region while the larger red arrows point to the regions of interest where the metric is evaluated in each region (green boxes). (b) Schematic depiction of the acquisition process. For each z-position, an image is taken for each region with the corresponding correction. The AO settings for each region are updated every few slices with settings for a new axial range. (c) Depiction of the different image metrics used for sensorless AO. The red hatched regions (the ring and the center areas) indicate the high-frequency and low-frequency components of the image, respectively. The metric is selected based on its performance for different samples and regions.

We divide the image into multiple subregions, then use the sensorless approach to map out the optimum AO correction settings for each subregion for the given axial setting of the ETL. The same procedure is then repeated for multiple axial positions. We refer to this approach as Mapped AO. The number and size of the sub-regions for obtaining the AO settings, as well as the radial distance of each sub-region from the center of the image, can be adjusted and evaluated by the user. Here, we divide the image into a central circular region and three radial regions each spanning 120 degrees. The central circular region has a radius of $97.6~\mu$m ($400$ pixels). Once the AO settings are determined, they are applied as the volume is being acquired. This procedure is shown in Fig. 3(b). At each axial position, 4 images are acquired with the different AO corrections. Every few slices, the AO corrections are updated. The time required to correct one Zernike mode is 375 ms. The time to correct for an entire volume is 82.5 s. This full correction must only be done once on a bead sample.
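A minimal sketch of the region division is given below: a central disc of 400-pixel radius plus three 120-degree outer sectors, returned as boolean masks that partition the camera frame. The frame size and the sector orientation are assumptions for illustration; in the instrument these parameters can be adjusted by the user.

```python
import numpy as np

def region_masks(ny, nx, r_center=400, n_sectors=3):
    """Return boolean masks [central disc, sector 0, sector 1, ...] that
    partition the image into the regions used for mapped-AO."""
    yy, xx = np.mgrid[0:ny, 0:nx]
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
    r = np.hypot(yy - cy, xx - cx)
    theta = np.mod(np.arctan2(yy - cy, xx - cx), 2 * np.pi)

    center = r <= r_center
    masks = [center]
    for k in range(n_sectors):
        lo, hi = k * 2 * np.pi / n_sectors, (k + 1) * 2 * np.pi / n_sectors
        masks.append(~center & (theta >= lo) & (theta < hi))
    return masks

masks = region_masks(2048, 2048)             # assumed full-frame sCMOS image
assert np.all(sum(m.astype(int) for m in masks) == 1)   # non-overlapping cover
```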

2.4 Image fusion

We divide the image into $4$ regions and use $5$ axial ranges with different AO settings. Therefore, for each z step, $4$ images are captured with different AO corrections. The final raw image stack contains a total of $144$ images (4 images for each of 36 axial slices). For each z step, we cut out of each image the region corresponding to its AO correction and merge the regions into the final fused image. No blending is used in this process.
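A minimal sketch of the fusion step is given below, assuming the raw data are arranged as an array of shape (z slices, regions, y, x) and the masks come from the region division above; each raw image contributes only the pixels inside its own region, with a hard cut and no blending, matching the description above.

```python
import numpy as np

def fuse_stack(raw, masks):
    """Fuse per-region AO-corrected images into one stack, region by region."""
    n_z, n_regions, ny, nx = raw.shape
    fused = np.zeros((n_z, ny, nx), dtype=raw.dtype)
    for z in range(n_z):
        for r, mask in enumerate(masks):
            fused[z][mask] = raw[z, r][mask]     # hard cut, no blending
    return fused
```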

2.5 Sample preparation

We use sub-diffraction fluorescent beads and biological samples to test the performance of our system. We prepare bead phantoms with $200$ nm yellow-green (ThermoFisher Scientific F8811) fluorescent beads. The beads are first diluted with DI water (18.2 M$\Omega\cdot$cm) in a ratio of 1:100. Then, the stock is mixed with 3$\%$ agarose gel in a ratio of 250 to 1. A microcapillary tube is used to draw up the solution and to hold the sample after the gel has been gently extruded.

Larval zebrafish (Danio rerio) were obtained from lines maintained in the University of Georgia Zebrafish Facility following standard procedures [49]. Embryos and larvae were staged using standard staging criteria [49,50]. Wild-type fish of the WIK strain and nacre (mitfaw2/w2) were originally obtained from the Zebrafish International Research Center (ZIRC). Fish transgenic for Tg[elavl3:GCaMP5g] on nacre (mitfaw2/w2) [18] were obtained from Dr. Ahrens (Janelia Farm HHMI), which were crossed into the crystal (nacrew2/w2;albb4/b4;roya9/a9) background. Crystal zebrafish [51] were originally obtained from Dr. Hindges (Kings College London). Fish transgenic for Tg[dlx6a-1.4dlx5a-dlx6a:GFP] were obtained from Dr. M. Ekker (University of Ottawa). All experimental procedures were conducted in accordance with National Institutes of Health guidelines for use of zebrafish in research under protocols approved and overseen by the University of Georgia Institutional Animal Care and Use Committee.

For neural activity imaging of live Tg[elavl3:GCaMP5g] transgenic zebrafish, we followed the procedure described in [42], where we first paralyze 4-7 day post fertilization transgenic zebrafish larvae using alpha-bungarotoxin (125.25 $\mu$M $\alpha$-BTX). To induce seizure-like events, 15 mM Pentylenetetrazol (PTZ) was added to the bath. For live imaging of Tg[dlx6a-1.4dlx5a-dlx6a:GFP], fish were anesthetized with tricaine (MS-222). Immobilized larvae were transferred to a solution of $3\%$ low melting point agarose and mounted in a glass capillary for imaging.

2.6 Signal to background ratio and calcium signal calculations

In this work, we calculate the peak signal to background ratio (SBR) and compare the SBR between the images acquired with and without AO across the imaging field and along the axial range. We first calculated the mean intensity value of four different dark regions measuring 150 $\times$ 150 pixels$^2$ around the corners of the images, and used the average value as the background value of the image. Then, the peak signal value of a $256 \times 256$ pixels$^2$ sliding window is measured across the entire image with a step of 50 pixels. The SBR is then calculated along the center of the image horizontally as well as vertically as the ratio of the peak signal to the background.
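A minimal sketch of this calculation is shown below: the background is the mean of four 150 × 150 pixel corner patches, and the signal is the maximum inside a 256 × 256 pixel window slid across the image in 50-pixel steps. The exact window placement and the restriction to the central horizontal and vertical lines in the published analysis may differ from this simplified full-field map.

```python
import numpy as np

def background_level(img, patch=150):
    """Mean intensity of four dark corner patches, used as the background."""
    corners = [img[:patch, :patch], img[:patch, -patch:],
               img[-patch:, :patch], img[-patch:, -patch:]]
    return float(np.mean([c.mean() for c in corners]))

def sliding_peak_sbr(img, win=256, step=50):
    """Map of peak signal-to-background ratio over a sliding window."""
    bg = background_level(img)
    ny, nx = img.shape
    sbr = []
    for y in range(0, ny - win + 1, step):
        row = [img[y:y + win, x:x + win].max() / bg
               for x in range(0, nx - win + 1, step)]
        sbr.append(row)
    return np.array(sbr)
```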

The change in fluorescence intensity is an indicator of the calcium signal. Here, we calculated the fluorescence intensity changes ($\Delta$F/F) using Eq. (2), where $F_{t}$ is the sum of the pixel values of the region of interest at each time point, and $F_{\mu }$ is the average of $F_{t}$ over the entire time range. This does not include the first ten time points, which are omitted because the neural activity during that period is a reaction to the laser being turned on.

$$\Delta F /F = \frac{F_{t}-F_{\mu}}{F_{\mu}}$$
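A minimal sketch of this calculation, assuming the time series is stored as a (time, y, x) array and the region of interest is a boolean mask; the first ten time points are skipped as described above.

```python
import numpy as np

def delta_f_over_f(stack, roi, skip=10):
    """Compute Eq. (2) for one region of interest over the time series."""
    f_t = stack[skip:, roi].sum(axis=1).astype(float)   # summed ROI signal F_t
    f_mu = f_t.mean()                                   # time-averaged F_mu
    return (f_t - f_mu) / f_mu
```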

3. Results

Figure 4 shows an example of imaging fluorescent beads over a $499 \times 499 \times 192~\mu$m$^3$ volume without AO. As the imaging plane moves away from the nominal focal plane, Fig. 4(f), strong off-axis aberrations become apparent. Without AO, these aberrations either limit the usable field of view to a $\sim 100~\mu$m circle at the center of the image or limit the usable axial range to $\sim 30~\mu$m. Close to the nominal focal position, Fig. 4(f), the dominant aberration is astigmatism. As the power of the ETL increases, the aberrations become increasingly strong. The aberrations are oriented radially, justifying the choice of radial regions for AO correction, Fig. 3.

Fig. 4. Demonstration of the aberrations caused by the ETL without AO. The sample is 200 nm YG fluorescent beads embedded in agar. Images at depths of (a) $-148 \mu$m, (b) $-117 \mu$m, (c) $-86 \mu$m, (d) $-54 \mu$m, (e) $-23 \mu$m, and (f) $8 \mu$m. The scale bar in (f) is $100 \mu$m. (g)-(k) closeups from (c) at distances of 192, 163, 115, 59, and 14 $\mu$m from the center of the image. (l)-(p) Simulated PSFs approximating the images in (g)-(k). (q) The magnitude of different aberrations as a function of distance from the simulations in (l-p). The RMS is the total wavefront error excluding tip, tilt, and defocus. Scale bar in (g) is 10 $\mu$m.

To estimate the aberrations, we simulate the Point Spread Function (PSF) at 5 locations across the field of view. Figure 4(g-k) shows beads at different distances, from 14 to 192 $\mu$m, from the center of the image. Below, in Fig. 4(l-p), we simulate the PSF using the first 14 Zernike modes. The Zernike modes are estimated by minimizing the difference between the measured bead image and the simulated PSF. Because the field of view is crowded and a single bead cannot be isolated, the estimated Zernike amplitudes and corresponding PSF are imperfect, but they nonetheless give an estimate of the aberrations across the field of view, as shown in Fig. 4(q). The result demonstrates that the aberrations change relatively slowly over the center of the image, but the change increases towards the edge, indicating that multiple radial regions might provide a better correction. Here, we use 4 regions to maintain the imaging speed.
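A simplified sketch of this estimation step is given below: a PSF is simulated from a circular pupil carrying a few Zernike aberrations, and the amplitudes are adjusted to minimise the squared difference with a measured, background-subtracted bead image resampled onto the same grid. The restriction to three modes and the plain Fourier-optics model are simplifications of the 14-mode fit described above, not the analysis code used for Fig. 4(q).

```python
import numpy as np
from scipy.optimize import minimize

N = 64                                        # pupil / PSF grid size
yy, xx = np.mgrid[-1:1:1j * N, -1:1:1j * N]
rho, theta = np.hypot(xx, yy), np.arctan2(yy, xx)
pupil = (rho <= 1.0).astype(float)

# Three illustrative Zernike phase maps (unit amplitude, in radians).
modes = np.array([
    rho**2 * np.cos(2 * theta),               # astigmatism
    (3 * rho**3 - 2 * rho) * np.cos(theta),   # coma
    6 * rho**4 - 6 * rho**2 + 1,              # spherical
])

def simulate_psf(amplitudes):
    """Intensity PSF for the given Zernike amplitudes, normalised to sum 1."""
    phase = np.tensordot(amplitudes, modes, axes=1)
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()

def fit_zernikes(measured):
    """Estimate Zernike amplitudes from an N x N, background-subtracted bead image."""
    target = measured / measured.sum()
    cost = lambda a: float(np.sum((simulate_psf(a) - target)**2))
    res = minimize(cost, x0=np.zeros(len(modes)), method="Nelder-Mead")
    return res.x                              # amplitudes in radians
```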

We first tested the performance of mapped-AO on a fluorescent bead sample. We acquired an image stack using mapped-AO and compared it with the volume acquired without AO correction, Fig. 5. Figure 5(a-f) shows a comparison of 2D slices from the volume without AO (top-left) to the volume with mapped-AO (bottom-right) over a 192 $\mu$m axial range. As can be seen, the PSF is close to diffraction-limited over a large portion of the field of view after AO correction, while it is significantly larger before correction. An enlarged image of the red box shown in Fig. 5(a) is plotted in Fig. 5(g-l) for 3 axial positions spaced by 44.48 $\mu$m. As can be seen, the resolution of the system is improved across the axial range. The image field is divided into four sections for correction, and the corresponding correction wavefronts are shown in Fig. 6(d). The magnitude of the correction for coma, astigmatism, and spherical aberration for each region is plotted in Fig. 6(a-c). We see a general increase in the amount of coma for all regions as the axial position moves away from zero. That the coma in the center region also increases indicates that the system is not perfectly aligned on axis. We also see an increase in spherical aberration with axial position, which is to be expected. To quantify the improvement with AO, we then calculated the SBR across the center of the FOV both vertically and horizontally for each z slice of the image stack and averaged the SBR of the image stack along the z-axis. With mapped-AO correction, the overall SBR is $3.40$ times higher along the vertical direction and $3.68$ times higher along the horizontal direction than without correction. This result is shown in Fig. 7(a)-(b). The aberrations increase towards the maximum end of the axial range. The calculated SBR has a flat-top appearance because the calculation is done in 256 pixel regions. We see the greatest improvement in the PSF near the center of the image. This is due to a combination of strong aberrations near the center of the image and the limited ability to correct the strong aberrations at the edges of the image.

Fig. 5. Mapped-AO correction for ETL-LSFM on beads. (a)-(f) Bead images acquired with and without AO using three divided regions at different axial positions in a 3D stack (every 32 $\mu$m). (g)-(l) The enlarged box region in (a). Three axial positions are shown here (every 32 $\mu$m). (m) and (n) The axial images of the beads. A line profile through a bead is plotted in (o) with FWHM of 6.07 $\mu$m and 4.49 $\mu$m before and after AO. The peak intensity is increased by a factor of 2.4. The scale bars in (g) and (m) are $20 \mu$m.

Fig. 6. The correction wavefronts using the mapped-AO settings for the different regions. (a)-(c) Plots of the Coma, Astigmatism, and Spherical modes at different axial positions and regions. (d) 6 different settings are used across the axial range. The color map corresponds to $-1.5~\mu$m to $+1.5~\mu$m.

Fig. 7. The SBR comparison of an image volume at 5 different axial positions. The green curve is the average across the axial range.

Next, we applied this approach to imaging a 6 dpf zebrafish larva in which GFP is expressed in dlx4/6 positive neurons. We acquired a $499 \times 499 \times 192~\mu$m$^3$ volume image stack. The AO settings determined from the bead data are applied as the larva volume is acquired. Three slices from the volumetric image are shown in Fig. 8. Figure 8(a), (b) and (c) show slices without AO correction at depths of 37.1, 53.1, and 69.1 $\mu$m from the top of the fish, respectively. Figure 8(d), (e) and (f) show the corresponding images with AO correction. The improvement in detail can be seen in the regions indicated by the boxes, which are enlarged in Fig. 8(g-r). Without AO correction, the cell bodies of the neurons are elongated, which may lead to misinterpretation in later analysis. After the correction, the neuron cell bodies appear more spherical. In our system, the DM and ETL are synchronized using software commands, which limits the overall imaging speed to $194$ ms per slice and $7$ seconds per volume ($17.5$ ms$/\mu$m). This is primarily due to the speed of serial communication with the ETL, galvo driver, and DM.

Fig. 8. 6dpf Tg1.4dlx5a-dlx6a:GFPw2/w2 zebrafish larva over a $192~\mu$m axial range. Three different axial positions in a 3D stack at 37.1, 53.1, and 69.1 $\mu$m from the top of the fish. (a-c) Without AO correction. (d-f) With AO correction. (g-l) Insets from (a-c). (m-r) Insets from (d-f). The scale bar in (a) is $100~\mu$m.

Figure 9 shows 4 consecutive volumes from a movie of a 6 dpf zebrafish larva neuronally expressing the GCaMP calcium indicator and treated with PTZ. Zebrafish larvae treated with PTZ are known to undergo seizure-like events [42]. A strong signal can be seen in the middle two frames, Fig. 9(b) and (c). PTZ-induced changes in neuronal calcium associated with seizures typically last on the order of several seconds, and our system is fast enough to record the location and frequency of these events.

Fig. 9. A 6dpf elavl3:GCaMP5g;mitfaw2/w2 PTZ treated zebrafish larva is imaged over $499 \times 499\times 192 \mu$m$^3$. Each volume contains $36$ images and is acquired with a spacing of $5.33\mu$m. A $5$ ms exposure time was used for this data. The dataset is 64 volumes. (a)-(d) Maximum Intensity Projections of four consecutive time points showing a change in neural activity over time. (e) Maximum intensity projection of a single image volume over $192 \mu m$. Depth is coded in color. (f) $\Delta$F$/$F plotted over time for six different subregions indicated in (e). See Visualization 1, Visualization 2, Visualization 3 and Visualization 4.

4. Discussion and conclusion

We have demonstrated imaging of a 6 dpf zebrafish larva over a volume of $499 \times 499 \times 192~\mu$m$^3$ using the combination of an ETL and AO. Scanning the imaging plane with the ETL introduces off-axis aberrations which would significantly limit both the field of view and the axial range without the addition of AO. The aberrations are a function of the azimuthal angle and the distance from the center of the image. To correct the aberrations, we divide the field of view into a central circular region and three outer regions each spanning 120 degrees. We then correct each region individually by capturing 4 volumes with the separate corrections. The volumes are then combined to create a volumetric image corrected over the whole field of view. More regions could be used to improve the level of correction. For example, 4 outer sections could be used instead of 3, or multiple radial regions in each section could be used, improving correction at the expense of speed. In our approach, the AO settings are determined once from data taken on fluorescent beads and applied to biological samples scanned over the same volume. This avoids the additional acquisition time and photobleaching that running the AO optimization during biological imaging would entail.

In the current system, the ETL, DM, and galvo are driven by software commands, resulting in a rather slow imaging speed. It takes 1.75 s per raw volume, resulting in 7 s to acquire all 4 raw volumes needed for the corrected final image volume. The time is mostly the result of passing commands through serial interfaces. We plan to speed this up in future work by running the ETL and galvo with continuous sinusoidal signals so that frequent software commands are not needed. A difficulty with this is the DM control, because the DM cannot be hardware triggered to load new wavefront settings. It is possible that the wavefront could be changed through an interrupt mechanism, but a DM that could operate as a hardware-triggered slave device would significantly improve performance. The output of the mapped-AO could be further combined with other software based image correction methods, such as spatially varying deconvolution or a deep learning deconvolution method. Mapped-AO could also be done with an initial correction on the biological sample, allowing for the correction of sample aberrations as well as system aberrations.

The ability to rapidly image the zebrafish central nervous system in three-dimensions at high resolution will significantly improve our understanding of the propagation of seizure events. Most studies to date have focused on imaging a two-dimensional slice of the CNS using either confocal microscopy or light-sheet imaging. In that case only a portion of the event is captured, and one can only estimate its total duration, the origin of the event, and its direction of propagation. In our future work, we plan to address these questions using volumetric light sheet microscopy with mapped-AO.

Funding

National Science Foundation (1350654); National Institutes of Health (1F31NS115496, R01NS090645).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. Huisken, J. Swoger, F. Del Bene, J. Wittbrodt, and E. H. K. Stelzer, “Optical sectioning deep inside live embryos by selective plane illumination microscopy,” Science 305(5686), 1007–1009 (2004). [CrossRef]  

2. E. M. Hillman, V. Voleti, W. Li, and H. Yu, “Light-sheet microscopy in neuroscience,” Annu. Rev. Neurosci. 42(1), 295–313 (2019). [CrossRef]  

3. P. J. Keller, M. B. Ahrens, and J. Freeman, “Light-sheet imaging for systems neuroscience,” Nat. Methods 12(1), 27–29 (2015). [CrossRef]  

4. S. Corsetti, F. Gunn-Moore, and K. Dholakia, “Light sheet fluorescence microscopy for neuroscience,” J. Neurosci. Methods 319, 16–27 (2019). [CrossRef]  

5. R. W. Friedrich, G. A. Jacobson, and P. Zhu, “Circuit neuroscience in zebrafish,” Curr. Biol. 20(8), R371–R381 (2010). [CrossRef]  

6. A. Hill, C. V. Howard, U. Strahle, and A. Cossins, “Neurodevelopmental defects in zebrafish (danio rerio) at environmentally relevant dioxin (TCDD) concentrations,” Toxicol. Sci. 76(2), 392–399 (2003). [CrossRef]  

7. A. A. Wanner and A. Vishwanathan, “Methods for mapping neuronal activity to synaptic connectivity: lessons from larval zebrafish,” Front. Neural Circuits 12, 89 (2018). [CrossRef]  

8. L. Turrini, C. Fornetto, G. Marchetto, M. C. Müllenbroich, N. Tiso, A. Vettori, F. Resta, A. Masi, G. Mannaioni, F. S. Pavone, and F. Vanzi, “Optical mapping of neuronal activity during seizures in zebrafish,” Sci. Rep. 7(1), 3025 (2017). [CrossRef]  

9. R. E. Rosch, P. R. Hunter, T. Baldeweg, K. J. Friston, and M. P. Meyer, “Calcium imaging and dynamic causal modelling reveal brain-wide changes in effective connectivity and synaptic dynamics during epileptic seizures,” PLoS Comput. Biol. 14(8), e1006375 (2018). [CrossRef]  

10. J. Liu and S. C. Baraban, “Network properties revealed during multi-scale calcium imaging of seizure activity in zebrafish,” eNeuro 6(1), ENEURO.0041-19.2019 (2019). [CrossRef]  

11. A. Brenet, R. Hassan-Abdi, J. Somkhit, C. Yanicostas, and N. Soussi-Yanicostas, “Defective excitatory/inhibitory synaptic balance and increased neuron apoptosis in a zebrafish model of dravet syndrome,” Cells 8(10), 1199 (2019). [CrossRef]  

12. J. Liu, K. A. Salvati, and S. C. Baraban, “In vivo calcium imaging reveals disordered interictal network dynamics in epileptic stxbp1b zebrafish,” iScience 24(6), 102558 (2021). [CrossRef]  

13. O. E. Olarte, J. Andilla, D. Artigas, and P. Loza-Alvarez, “Decoupled illumination detection in light sheet microscopy for fast volumetric imaging,” Optica 2(8), 702–705 (2015). [CrossRef]  

14. S. Quirin, N. Vladimirov, C.-T. Yang, D. S. Peterka, R. Yuste, and M. B. Ahrens, “Calcium imaging of neural circuits with extended depth-of-field light-sheet microscopy,” Opt. Lett. 41(5), 855–858 (2016). [CrossRef]  

15. R. Tomer, M. Lovett-Barron, I. Kauvar, A. Andalman, V. Burns, S. Sankaran, L. Grosenick, M. Broxton, S. Yang, and K. Deisseroth, “SPED light sheet microscopy: fast mapping of biological system structure and function,” Cell 163(7), 1796–1806 (2015). [CrossRef]  

16. W. C. Lemon, S. R. Pulver, B. Höckendorf, K. McDole, K. Branson, J. Freeman, and P. J. Keller, “Whole-central nervous system functional imaging in larval Drosophila,” Nat. Commun. 6(1), 7924 (2015). [CrossRef]  

17. P. Haslehurst, Z. Yang, K. Dholakia, and N. Emptage, “Fast volume-scanning light sheet microscopy reveals transient neuronal events,” Biomed. Opt. Express 9(5), 2154–2167 (2018). [CrossRef]

18. M. B. Ahrens, M. B. Orger, D. N. Robson, J. M. Li, and P. J. Keller, “Whole-brain functional imaging at cellular resolution using light-sheet microscopy,” Nat. Methods 10(5), 413–420 (2013). [CrossRef]  

19. F. O. Fahrbach, F. F. Voigt, B. Schmid, F. Helmchen, and J. Huisken, “Rapid 3D light-sheet microscopy with a tunable lens,” Opt. Express 21(18), 21010–21026 (2013). [CrossRef]  

20. E. Botcherby, R. Juškaitis, M. Booth, and T. Wilson, “An optical technique for remote focusing in microscopy,” Opt. Commun. 281(4), 880–887 (2008). [CrossRef]  

21. C. Zuo, Q. Chen, W. Qu, and A. Asundi, “High-speed transport-of-intensity phase microscopy with an electrically tunable lens,” Opt. Express 21(20), 24060–24075 (2013). [CrossRef]  

22. J. M. Jabbour, B. H. Malik, C. Olsovsky, R. Cuenca, S. Cheng, J. A. Jo, Y.-S. L. Cheng, J. M. Wright, and K. C. Maitland, “Optical axial scanning in confocal microscopy using an electrically tunable lens,” Biomed. Opt. Express 5(2), 645–652 (2014). [CrossRef]  

23. Y. Nakai, M. Ozeki, T. Hiraiwa, R. Tanimoto, A. Funahashi, N. Hiroi, A. Taniguchi, S. Nonaka, V. Boilot, R. Shrestha, J. Clark, N. Tamura, V. M. Draviam, and H. Oku, “High-speed microscopy with an electrically tunable lens to image the dynamics of in vivo molecular complexes,” Rev. Sci. Instrum. 86(1), 013707 (2015). [CrossRef]  

24. Y.-K. Fuh, J.-K. Chen, and P.-W. Chen, “Characterization of electrically tunable liquid lens and adaptive optics for aberration correction,” Optik 126(24), 5456–5459 (2015). [CrossRef]  

25. C. Efstathiou and V. M. Draviam, “Electrically tunable lenses – eliminating mechanical axial movements during high-speed 3D live imaging,” J. Cell Sci. 134(16), jcs258650 (2021). [CrossRef]  

26. J. A. Strother, “Reduction of spherical and chromatic aberration in axial-scanning optical systems with tunable lenses,” Biomed. Opt. Express 12(6), 3530–3552 (2021). [CrossRef]  

27. T. Funane, S. S. Hou, K. M. Zoltowska, S. J. van Veluw, O. Berezovska, A. T. N. Kumar, and B. J. Bacskai, “Selective plane illumination microscopy (SPIM) with time-domain fluorescence lifetime imaging microscopy (FLIM) for volumetric measurement of cleared mouse brain samples,” Rev. Sci. Instrum. 89(5), 053705 (2018). [CrossRef]  

28. D. P. Ryan, E. A. Gould, G. J. Seedorf, O. Masihzadeh, S. H. Abman, S. Vijayaraghavan, W. B. Macklin, D. Restrepo, and D. P. Shepherd, “Automatic and adaptive heterogeneous refractive index compensation for light-sheet microscopy,” Nat. Commun. 8(1), 612 (2017). [CrossRef]  

29. A. B. Kashekodi, T. Meinert, R. Michiels, and A. Rohrbach, “Miniature scanning light-sheet illumination implemented in a conventional microscope,” Biomed. Opt. Express 9(9), 4263–4274 (2018). [CrossRef]  

30. D. Wang, S. Xu, P. Pant, E. Redington, S. Soltanian-Zadeh, S. Farsiu, and Y. Gong, “Hybrid light-sheet and light-field microscope for high resolution and large volume neuroimaging,” Biomed. Opt. Express 10(12), 6595–6610 (2019). [CrossRef]  

31. B. Liu, C. M. Hobson, F. M. Pimenta, E. Nelsen, J. Hsiao, T. O’Brien, M. R. Falvo, K. M. Hahn, and R. Superfine, “VIEW-MOD: a versatile illumination engine with a modular optical design for fluorescence microscopy,” Opt. Express 27(14), 19950–19972 (2019). [CrossRef]  

32. E.-S. Cho, S. Han, K.-H. Lee, C.-H. Kim, and Y.-G. Yoon, “3DM: deep decomposition and deconvolution microscopy for rapid neural activity imaging,” Opt. Express 29(20), 32700–32711 (2021). [CrossRef]

33. J. G. McNally, T. Karpova, J. Cooper, and J. A. Conchello, “Three-dimensional imaging by deconvolution microscopy,” Methods 19(3), 373–385 (1999). [CrossRef]  

34. N. Ji, “Adaptive optical fluorescence microscopy,” Nat. Methods 14(4), 374–380 (2017). [CrossRef]  

35. J. M. Girkin, S. Poland, and A. J. Wright, “Adaptive optics for deeper imaging of biological samples,” Curr. Opin. Biotechnol. 20(1), 106–110 (2009). [CrossRef]  

36. J. M. Girkin and M. T. Carvalho, “The light-sheet microscopy revolution,” J. Opt. 20(5), 053002 (2018). [CrossRef]  

37. R. Jorand, G. Le Corre, J. Andilla, A. Maandhui, C. Frongia, V. Lobjois, B. Ducommun, and C. Lorenzo, “Deep and clear optical imaging of thick inhomogeneous samples,” PLoS One 7(4), e35795 (2012). [CrossRef]  

38. T.-L. Liu, S. Upadhyayula, D. E. Milkie, et al., “Observing the cell in its native state: imaging subcellular dynamics in multicellular organisms,” Science 360(6386), 1 (2018). [CrossRef]  

39. K. Lawrence, Y. Liu, R. Ball, A. J. VanLeuven, J. D. Lauderdale, and P. Kner, “Scene-based Shack-Hartmann wavefront sensor for light-sheet microscopy,” Proc. SPIE 10502, 105020B (2018). [CrossRef]

40. A. Hubert, F. Harms, R. Juvénal, P. Treimany, X. Levecq, V. Loriette, G. Farkouh, F. Rouyer, and A. Fragola, “Adaptive optics light-sheet microscopy based on direct wavefront sensing without any guide star,” Opt. Lett. 44(10), 2514–2517 (2019). [CrossRef]  

41. C. Bourgenot, C. D. Saunter, J. M. Taylor, J. M. Girkin, and G. D. Love, “3D adaptive optics in a light sheet microscope,” Opt. Express 20(12), 13252–13261 (2012). [CrossRef]  

42. Y. Liu, S. Dale, R. Ball, A. J. VanLeuven, A. Sornborger, J. D. Lauderdale, and P. Kner, “Imaging neural events in zebrafish larvae with linear structured illumination light sheet fluorescence microscopy,” Neurophotonics 6(01), 1 (2019). [CrossRef]  

43. Y. Liu, K. Lawrence, J. D. Lauderdale, and P. Kner, “Sensorless and sensor based adaptive optics for light sheet microscopy,” Proc. SPIE 11248, 1124806 (2020). [CrossRef]  

44. F. O. Fahrbach, V. Gurchenkov, K. Alessandri, P. Nassoy, and A. Rohrbach, “Light-sheet microscopy in thick media using scanned Bessel beams and two-photon fluorescence excitation,” Opt. Express 21(11), 13824–13839 (2013). [CrossRef]  

45. M. J. Booth, “Wavefront sensorless adaptive optics for large aberrations,” Opt. Lett. 32(1), 5–7 (2007). [CrossRef]  

46. K. F. Tehrani, J. Xu, Y. Zhang, P. Shen, and P. Kner, “Adaptive optics stochastic optical reconstruction microscopy (AO-STORM) using a genetic algorithm,” Opt. Express 23(10), 13677–13692 (2015). [CrossRef]  

47. J. R. Fienup and J. J. Miller, “Aberration correction by maximizing generalized sharpness metrics,” J. Opt. Soc. Am. A 20(4), 609–620 (2003). [CrossRef]  

48. R. J. Noll, “Zernike polynomials and atmospheric turbulence,” J. Opt. Soc. Am. 66(3), 207–211 (1976). [CrossRef]  

49. M. Westerfield, The Zebrafish Book: A Guide for the Laboratory Use of Zebrafish Danio (“Brachydanio Rerio”) (University of Oregon, 2007).

50. C. B. Kimmel, W. W. Ballard, S. R. Kimmel, B. Ullmann, and T. F. Schilling, “Stages of embryonic development of the zebrafish,” Dev. Dyn. 203(3), 253–310 (1995). [CrossRef]

51. P. Antinucci and R. Hindges, “A crystal-clear zebrafish for in vivo imaging,” Sci. Rep. 6(1), 29490 (2016). [CrossRef]  

Supplementary Material (4)

Visualization 1: A 6dpf PTZ treated zebrafish larva is imaged over a 400 x 400 x 192 micron volume. Each volume contains 36 images and is acquired with a spacing of 5.33 microns. A 5 ms exposure time was used for this data.
Visualization 2: A 6dpf PTZ treated zebrafish larva is imaged over a 400 x 400 x 192 micron volume. Each volume contains 36 images and is acquired with a spacing of 5.33 microns. A 5 ms exposure time was used for this data.
Visualization 3: A 6dpf PTZ treated zebrafish larva is imaged over a 400 x 400 x 192 micron volume. Each volume contains 36 images and is acquired with a spacing of 5.33 microns. A 5 ms exposure time was used for this data.
Visualization 4: A 6dpf PTZ treated zebrafish larva is imaged over a 400 x 400 x 192 micron volume. Each volume contains 36 images and is acquired with a spacing of 5.33 microns. A 5 ms exposure time was used for this data.
