Optica Publishing Group

Four-dimensional visualization of zebrafish cardiovascular and vessel dynamics by a structured illumination microscope with electrically tunable lens

Open Access

Abstract

We established a four-dimensional (4D) microscopy method using structured illumination for optical sectioning and an electrically tunable lens for axial scanning. With its fast imaging capability, the dynamics of the cardiovascular system and cerebral vessels of the zebrafish were imaged at two stacks (25 layers) per second with lateral/axial resolutions of 0.6 µm and 1.8 µm, respectively. Time-lapse imaging clearly shows the contraction–relaxation response of the beating heart at different cardiac phases and the different mobilities of blood vessels in different regions. This new 4D technique will facilitate in vivo imaging of organ function and development, as well as drug responses, in small-sized animals.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

Corrections

6 March 2020: A typographical correction was made to the body text.

1. Introduction

As one of the most widely used animal models, the zebrafish has been extensively employed in studies of genomic function, organ development, and drug screening. Owing to its small size and transparency, optical imaging techniques have been widely used to visualize its structures and functions in vivo [1,2], for example in studies of hematopoiesis [3–5] and blood vessel formation [6]. Additionally, in studies of vascular development and human vascular disease in the zebrafish, it is necessary to image the blood vessels and visually observe changes in them [7–10]. Real-time imaging and lineage tracing of zebrafish are also the primary means for accurately determining the number of mural cells (MCs) required to cover the endothelial cell (EC) tube and for identifying the origin of MCs [11,12]. Because of the three-dimensional (3D) structure of the cardiovascular system and blood vessels, and their fast dynamics (typically on the order of subseconds), a 4D time-lapse imaging methodology is an indispensable tool. Nevertheless, to date, such a tool has been hardly available to most biological labs.

To obtain the 3D structure, optical sectioning is first required to reject the out-of-focus fluorescence signals in each image plane. Optical sectioning imaging could be performed using confocal laser scanning microscopy (CLSM) [13], two-photon laser scanning fluorescence microscopy (2PLSM) [14], or light sheet fluorescence microscopy (LSFM) [15–17]. However, CLSM and 2PLSM have slow acquisition speeds because they are point-scanning methods. Hence, they are not suitable for fast 4D imaging. LSFM uses lateral illumination to allow the generation of fluorescence signals only at the focal plane, hence effectively diminishing out-of-focus signals. However, this technique requires a specific optical design and immobilization of the sample, which hinders its broad availability in common biological labs.

Optical sectioning can also be performed by structured illumination microscopy (SIM) [18–20]. The method was first introduced by Neil and coworkers in 1997 [21,22] and was later adapted by Zeiss in their Apotome 2 microscope [23]. In the setup, a grating was used to generate a striped illumination pattern, and a step motor or a piezoelectric motor was used to change the pattern position to capture multiple images. A postprocessing algorithm was then used to reconstruct image sections. To increase the imaging speed, a system with a microstructured, stripe-array, light-emitting diode for structured illumination was later developed, so that no moving parts were required [24]. In 2010, Heintzmann et al. proposed a method to acquire three polarized images with a single camera exposure to further increase the imaging speed [25]. Dan et al. also reported the use of a digital micromirror device (DMD) for structured illumination to obtain optical sectioning [26]. The group later reported a color SIM with a digital color camera and used it for the 3D imaging of mixed pollen grains and insects [27]. However, most current SIM setups were designed for cell biology and emphasize super-resolution applications. To our knowledge, optical sectioning imaging of live small-sized animal models using SIM has not been reported previously. Besides instrumentation, optical sectioning with SIM also requires a reconstruction algorithm to obtain the final image. HiLo microscopy uses one structured-illumination image and one standard uniform-illumination image to achieve optical sectioning through spectral processing [28,29].

Based on the benefits of the aforementioned SIM optical sectioning imaging methods, 2D SIM imaging can yield good imaging results at increased speeds. However, when implementing 3D volume imaging, the general method moves the stage or objective lens to achieve multilayer scanning mechanically. Moving specimens with piezo or stepper motors achieves high displacement accuracy, but the imaging speed is limited by the response and settling time of the mechanical movement. Recently, electrically tunable lenses (ETLs) have been extensively used in microscopic imaging to increase the volume imaging speed. Helmchen and Denk measured the population activity of neurons in 40 µm sections and achieved scan rates of up to 20–30 Hz following the introduction of an ETL in the excitation path of a two-photon microscope [30]. An ETL was additionally used to improve the imaging speed in LSFM [31]. Sancataldo et al. developed a novel wide-field (WF) detection system based on an ETL that could track multiple individual nanoscale emitters in 3D over a tunable axial range with nanometric localization precision [32]. Zuo et al. also presented a high-speed transport-of-intensity equation (TIE) quantitative phase microscopy technique by combining an ETL with a conventional transmission microscope. This allowed diffraction-limited, through-focus intensity stacks to be collected at 15 frames per second (fps) [33].

In this study, we propose a fast 3D imaging method using structured light illumination for optical sectioning and an ETL for refocusing. Two fringe images with a phase difference of π were collected at each plane to reconstruct optically sectioned images with a modified HiLo algorithm. The dynamics of the cardiovascular system and cerebral vessels of the zebrafish were imaged at a speed of two stacks (25 layers) per second with lateral/axial resolutions of 0.6 µm and 1.8 µm, respectively. The 3D time-lapse imaging clearly shows the contraction–relaxation response of the beating heart at different cardiac phases and the different mobilities of blood vessels in different regions.

2. Materials and methods

2.1 Experimental device and system control

The schematic of the structured-light-illumination imaging system based on an ETL is shown in Fig. 1(a). The system uses a digital micromirror device (DMD, Texas Instruments Inc, USA) as a spatial light modulator (SLM) to produce high-contrast structured stripes; the DMD has a pixel size of 13.6 µm × 13.6 µm and an imaging matrix of 1024 × 768 pixels. A noncoherent light source (U-HGLGPS, Olympus Inc, Japan) was used to illuminate the DMD, which effectively suppressed speckle. The DMD plane and the sample plane are conjugate to each other; thus, the DMD-modulated light excites the fluorescence of the specimen. The fluorescence is then acquired by a high-quantum-efficiency scientific complementary metal-oxide-semiconductor (sCMOS) camera (Prime BSI, Teledyne Photometrics Inc, USA), which has a maximum full-frame rate of 63 fps at 11-bit gray levels.


Fig. 1. (a) Schematic of the imaging system (DMD, digital micromirror device; L1, L2, L3, optical lenses; M1, M2, mirrors; DM, dichroic mirror; ETL, electrically tunable lens; OBJ, objective lens). The illustration in the upper right corner is a schematic view of the ETL. (b) Control sequence diagram of the system. The FPGA outputs two signals to trigger the start of the camera (light blue) and the ETL focus change (red); the DMD loads two opposite structured patterns at each plane and is turned off between them; the green area represents the camera's read-out time, while the yellow area represents the exposure time only when the DMD is turned on; 8 ms is the ETL settling time.


An ETL (EL-10-30-Ci, Optotune Inc, Switzerland) and an objective lens made up the nonmechanical focus module that achieved 3D scanning. The ETL is a container filled with optical fluid and sealed with an elastomeric polymer film. An electromagnetic actuator exerts pressure on the container, which causes the lens to bend; therefore, the focal length of the lens can be changed by applying different voltages to the actuator. Benefiting from the refocusing speed of the lens, the acquisition speed of the system can be tripled compared with a traditional system using piezoelectric mechanical movements: the response time of the liquid lens is only approximately 2 ms, and the stabilization time of the liquid is only 4–6 ms. More importantly, we used a field-programmable gate array (FPGA) to automatically collect data, generate DMD patterns, and change the ETL voltage. The control sequence diagram of the device is shown in Fig. 1(b). The FPGA triggers the operation of the camera. The DMD is then triggered by the camera to load the pattern once all rows of the camera have begun to expose. After the second image is exposed in the camera (upon the onset of the red signal in Fig. 1(b)), the FPGA increases the ETL's analog voltage to change the focal length, and a waiting time of 8 ms allows the ETL to settle at the new focal plane. The acquisition loops according to this sequence. The time needed for each acquisition can be calculated using the following formula:

$$Time = Exposure\_time \times Layers \times 2 + Rows \times 7.74\,\mathrm{\mu s} \times Layers + 8\,\mathrm{ms} \times Layers,$$
where $Layers$ represents the number of layers scanned, and the number of lines of the image is represented by $Rows$. The readout time of each row is 7.74 µs; thus, the frame read-out time is the number of lines multiplied by this value. The third term of the formula is the settling time of the ETL.
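As a quick sanity check, the acquisition-time formula above can be evaluated numerically. The parameter values in the example (5 ms exposure, 21 layers, 1024 rows) are illustrative assumptions, not the exact settings of every experiment reported here:

```python
def acquisition_time(exposure_s, layers, rows,
                     row_readout_s=7.74e-6, etl_settle_s=8e-3):
    """Total time per stack: two exposures per layer, plus the
    rolling-shutter readout and the ETL settling time per layer."""
    return (exposure_s * layers * 2
            + rows * row_readout_s * layers
            + etl_settle_s * layers)

# Example: 5 ms exposure, 21 layers, 1024 camera rows -> ~0.544 s per stack
t_stack = acquisition_time(0.005, 21, 1024)
```

This makes clear that the ETL settling term (8 ms per layer) dominates once exposures are short, motivating the fast liquid lens over slower mechanical refocusing.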

2.2 Optical sectioning algorithm

We modified the HiLo algorithm [28,34] so that it can also reconstruct 3D optically sectioned images of thick samples. The algorithm is divided into three steps, as shown in Fig. 2(a). First, five iterations of Richardson–Lucy deconvolution [35] were performed on the two raw images with a phase difference of π ($\textrm{r}1$ and $\textrm{r}2$), yielding two deconvolved raw images $\textrm{r}1^{\prime}$ and $\textrm{r}2^{\prime}$. A standard optical transfer function (OTF) was used as the apodization function for the deconvolution:

$$OTF(k) = \left\{ {\begin{array}{ll} {\frac{2}{\pi }\left[ {{{\cos }^{ - 1}}\left( {\frac{{|k|}}{{{k_c}}}} \right) - \frac{{|k|}}{{{k_c}}}\sqrt {1 - {{\left( {\frac{{|k|}}{{{k_c}}}} \right)}^2}} } \right],}& {|k |\le {k_c}}\\ {0,}& {|k |> {k_c}} \end{array}} \right.$$
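For reference, this apodization function can be sampled on a frequency grid as follows; this is a minimal sketch assuming the standard normalized form of the diffraction-limited incoherent OTF, which equals 1 at zero frequency and falls to 0 at the cutoff:

```python
import numpy as np

def incoherent_otf(k, kc):
    """Diffraction-limited incoherent OTF, normalized to 1 at k = 0.
    k: array of spatial frequencies; kc: cutoff frequency."""
    rho = np.clip(np.abs(k) / kc, 0.0, 1.0)       # normalized frequency
    otf = (2.0 / np.pi) * (np.arccos(rho) - rho * np.sqrt(1.0 - rho ** 2))
    return np.where(np.abs(k) <= kc, otf, 0.0)    # zero beyond cutoff
```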
where $k$ is the spatial frequency vector and ${k_c}$ is the cutoff frequency of the system. Then, we followed the HiLo algorithm [28,34] to obtain sectioned images by merging the in-focus, complementary high-frequency ${I_{hp}}(x,y)$ and low-frequency ${I_{lp}}(x,y)$ image components. A wide-field (WF) image was calculated as $(\textrm{r}1^{\prime} + \textrm{r}2^{\prime})/2$, and the high-frequency components were extracted by a high-pass filter. Since the out-of-focus signal always has low frequency and low local contrast, it can be separated from the low-frequency components of the WF image by using local contrast information. To obtain the contrast, the ratio $R(k)$ was constructed in the algorithm
$${R}({k}) = {FT}[2 \times \textrm{r}1^{\prime}/(\textrm{r}1^{\prime} + \textrm{r}2^{\prime})],$$
where $FT$ represents the Fourier transform and $(\textrm{r}1^{\prime} + \textrm{r}2^{\prime})/2$ is the wide-field image. $R(k)$ represents the illumination pattern and consists of two conjugate parts ${R_ + }(k)$ and ${R_ - }(k)$. Single-sideband demodulation was used to isolate ${R_ + }(k)$, followed by an inverse Fourier transform to obtain ${R_ + }(x,y)$. The conjugate term is expressed as $R_ + ^\ast (x,y)$. The image local contrast $C(x,y)$ is then given by
$$C(x,y) = \sqrt {[{R_ + }(x,y)R_ + ^\ast (x,y)]} \textrm{ }.$$
Therefore, the final optical sectioning image ${I_{hilo}}(x,y)$ can be expressed as
$$\begin{aligned} {I_{hilo}}(x,y) &= {I_{hp}}(x,y) + \eta {I_{lp}}(x,y)\\ &= HP[{(\textrm{r}1^{\prime} + \textrm{r}2^{\prime})/2}] + \eta LP[\sqrt {{R_ + }(x,y)R_ + ^\ast (x,y)} \times (\textrm{r}1^{\prime} + \textrm{r}2^{\prime})/2], \end{aligned}$$
where $\eta$ is a scaling factor introduced to ensure a seamless transition of the frequency content of ${I_{hilo}}(x,y)$ across the cutoff frequency, and $HP$ and $LP$ represent complementary high-pass and low-pass filters.
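The merging pipeline of Eqs. (3)–(5) can be sketched as below. This is a simplified illustration rather than the authors' exact implementation: the deconvolution step is omitted, the single-sideband demodulation keeps the positive half of the horizontal frequency axis (assuming vertical fringes), and Gaussian masks with a hypothetical relative cutoff `kc_frac` provide the complementary high-pass/low-pass split:

```python
import numpy as np

def hilo_section(r1, r2, eta=1.0, kc_frac=0.1):
    """Sketch of the HiLo merge from two pi-shifted fringe images r1, r2."""
    wf = (r1 + r2) / 2.0                         # wide-field image
    ratio = 2.0 * r1 / (r1 + r2 + 1e-12)         # argument of Eq. (3)
    spec = np.fft.fftshift(np.fft.fft2(ratio))
    h, w = spec.shape
    ssb = np.zeros_like(spec)                    # single-sideband demodulation:
    ssb[:, w // 2:] = spec[:, w // 2:]           # keep positive-kx half only
    r_plus = np.fft.ifft2(np.fft.ifftshift(ssb))
    contrast = np.sqrt(np.real(r_plus * np.conj(r_plus)))  # C(x, y), Eq. (4)
    ky, kx = np.indices((h, w))
    k2 = (kx - w // 2) ** 2 + (ky - h // 2) ** 2
    lp = np.exp(-k2 / (2.0 * (kc_frac * min(h, w)) ** 2))  # Gaussian LP mask

    def filt(img, mask):                         # apply a spectral mask
        return np.real(np.fft.ifft2(np.fft.ifftshift(
            np.fft.fftshift(np.fft.fft2(img)) * mask)))

    # Eq. (5): high-pass of WF plus eta-weighted low-pass of contrast * WF
    return filt(wf, 1.0 - lp) + eta * filt(contrast * wf, lp)
```

Using `1 - lp` as the high-pass mask guarantees the two filters sum to unity, which is what lets `eta` tune a seamless spectral transition.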


Fig. 2. Principle of the optical sectioning algorithm. (a) Flow chart of the algorithm. The three main steps, consisting of deconvolution, HiLo, and inverted Gaussian attenuation, are marked in different colors. (HP filter, high-pass filter; LP filter, low-pass filter; FT, Fourier transform; iFT, inverse Fourier transform). (b) Principle of inverted Gaussian attenuation to decrease the out-of-focus signal in the spectral domain. (WF, wide field). (c) Demonstration of the optical sectioning capability with a pumpkin stem slice sample. The lower panel shows the normalized intensity along the dotted lines. (OS, optical sectioning image; Scale bar: 1 µm)


Finally, to further suppress the residual out-of-focus information in thick samples, we used an inverted Gaussian function to attenuate the low-frequency (defocus) components, as shown in Fig. 2(b). The final optical sectioning image $I_{os}$ can be written as:

$$I_{os} = iFT[FT({I_{hilo}}) \times GaussAtt(a,{\sigma ^2})],$$
where $FT$ and $iFT$ represent the Fourier transform and inverse Fourier transform, respectively. An inverted Gaussian function $GaussAtt(a,{\sigma ^2}) = 1 - a \times \exp [ - ({x^2} + {y^2})/(2{\sigma ^2})]$ with two adjustable parameters $a$ and ${\sigma ^2}$ was used. Our algorithm's optical sectioning capability was demonstrated with a pumpkin stem slice sample, as shown in Fig. 2(c).
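A minimal sketch of this inverted Gaussian attenuation step (Eq. 6); the default values of `a` and `sigma2` are placeholders to be tuned per sample, not the parameters used in the paper:

```python
import numpy as np

def gauss_att_filter(img, a=0.8, sigma2=25.0):
    """Attenuate residual low-frequency (defocus) content of I_hilo by
    multiplying its centered spectrum with 1 - a*exp(-(kx^2+ky^2)/(2*sigma^2))."""
    h, w = img.shape
    ky, kx = np.indices((h, w))
    kx = kx - w // 2                       # frequency coords centered on DC
    ky = ky - h // 2
    mask = 1.0 - a * np.exp(-(kx ** 2 + ky ** 2) / (2.0 * sigma2))
    spec = np.fft.fftshift(np.fft.fft2(img)) * mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec)))
```

Because the mask equals `1 - a` at DC, a uniform background is scaled down by that factor, while frequencies well above `sigma` pass through unchanged.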

2.3 Preparation for in vivo zebrafish and fluorescent microsphere imaging

A stable transgenic zebrafish model (referred to as Flk1:EGFP), which expresses green fluorescent protein in cardiovascular tissue, was used in our experiments. We selected zebrafish with sufficient fluorescence expression on the 4th to 6th day after fertilization for imaging. To maintain a stable posture of the fish, we used agarose (UltraPure Low Melting Point Agarose, ThermoFisher Inc, USA) to restrain the larvae. The agarose was melted in a 70 °C water bath. When the agarose had cooled to 30 °C, live fish were gently placed in the agar and immobilized at the bottom of a glass dish. No anesthetic was used so as not to affect the cardiovascular conditions. The zebrafish preparation procedures were carried out in accordance with the Guide for the Care and Use of Laboratory Animals (published by the U.S. National Academy of Sciences, ISBN 0-309-05377-3) and were approved by the animal ethics committee of the Suzhou Institute of Biomedical Engineering and Technology, CAS.

We used 200 nm fluorescent microspheres (FluoSpheres carboxylate-modified microspheres, 0.2 µm, yellow-green fluorescent, ThermoFisher Inc, USA), whose size is smaller than the resolution limit of the imaging system. We added 0.3 µl of microspheres to 200 µl of absolute ethanol, and the solution was then added to a glass-bottom dish. Finally, after evaporating the ethanol in the dish, distilled water was injected. The pumpkin stem slice with autofluorescence was a commercial teaching microscope sample manufactured by Suzhou Shenying Optics Inc.

3. Results and discussion

3.1 Calibration of device parameters


Fig. 3. Calibration of various system parameters. (a) Relationship between the focal plane position and the voltage for different objective lenses. (b) Relationship between the relative magnification and imaging depth with the 40× objective. The magnification was characterized by measuring the distance between two beads. (c) Evaluation of the spatial resolutions using 200 nm fluorescent beads. Scale bar: 7 µm. (d) Enlarged image of fluorescent beads. Scale bar: 2 µm. (e) Axial images of the beads. Scale bar: 2 µm. (f, g) Fitted curves for grayscale values of the lateral and axial images.


To calibrate the relationship between the focal plane of the ETL and the voltage, we used the actuator (EL-E-OF-A, Optotune Inc, Switzerland) to achieve different focal lengths with different objective lenses (60×, numerical aperture (NA) = 1.4; 40×, NA = 0.65; 20×, NA = 0.4). Additionally, a Z-axis nanometer stage was used to measure the position of the focal plane (Fig. 3(a)). After each change of the ETL voltage, we moved the sample along the z axis with the nanometer stage until the sample was in focus again, and the moved distance was read. We found that the voltage and the focal positions were linearly related. In addition, since the ETL was placed directly behind the objective, changing its focal length by applying a voltage also changes the system magnification. For this reason, we measured the change in the relative magnification at different voltages for the objective (40×, NA = 0.65, Olympus Inc, Japan) used in our experiment (Fig. 3(b)). We characterized the magnification changes by measuring the distance between two beads at different focal planes. The curve was used to correct the magnification of the raw images during reconstruction in large-scale imaging. To evaluate the spatial resolution of the SIM microscope, we used green fluorescent microspheres (diameter = 200 nm) as test samples. The sizes of the microspheres were far below the resolution limit of the microscope when the 40×/NA = 0.65 objective was used. We measured the five beads in the box in Fig. 3(c). The intensity distributions of the 200 nm fluorescent beads in the lateral and axial planes are shown in Figs. 3(f) and 3(g). We quantified the lateral and axial resolutions of the system as 0.58 ± 0.02 µm and 1.83 ± 0.1 µm, respectively, with neither deconvolution nor the optical sectioning algorithm applied.
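The quoted resolutions come from fitting the bead intensity profiles in Figs. 3(f) and 3(g). One simple way to extract the full width at half maximum (FWHM) from a near-Gaussian profile is a parabola fit to the log-intensity; this is an illustrative sketch, not the authors' fitting routine:

```python
import numpy as np

def fwhm_gaussian_fit(x, y):
    """FWHM of a near-Gaussian profile via a parabola fit to log(y).
    Only the bright core (> 20% of peak) is used to avoid log of noise."""
    core = y > 0.2 * y.max()
    a, b, c = np.polyfit(x[core], np.log(y[core]), 2)  # log-Gaussian is quadratic
    sigma = np.sqrt(-1.0 / (2.0 * a))                  # a = -1/(2*sigma^2)
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma    # FWHM = 2.3548 * sigma
```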

3.2 Imaging of the cardiovascular system of live zebrafish

Given that the zebrafish is a small, transparent animal, it is an ideal model for 3D imaging. Imaging of the zebrafish cardiovascular system could contribute to the study of embryonic cardiovascular development and zebrafish heart regeneration [21,22]. However, it is difficult to observe the heart dynamics in 3D because of its fast beating: the zebrafish heart beats approximately 100 times per minute. We bred genetically modified zebrafish and imaged the cardiovascular system. The position and size of the heart are indicated by the blue squares in Fig. 4(a), which shows a 2D image of a zebrafish on the fifth day after fertilization. The image was captured with a 4×/NA = 0.13 objective lens. The distribution of the zebrafish blood vessels can be clearly observed. Figure 4(b) is a schematic of the heart of the zebrafish, which consists of a ventricle and an atrium connected with blood vessels. Figures 4(c) and 4(d), respectively, show the 3D imaging results of the veins and arteries connected to the heart (see Visualization 1 and Visualization 2). Figure 4(e) is a 3D heart image acquired postmortem from a zebrafish. The detailed imaging parameters are listed in Appendix Table 1.


Fig. 4. Images of the cardiovascular system of the zebrafish. (a) Two-dimensional (2D) zebrafish image with the use of a 4× objective lens, Scale bar: 200 µm. (b) Schematic of the heart of the zebrafish. The illustration in the upper right corner marks the location of the zebrafish heart. (c, d) Blood vessels connected to the heart. Scale bar: 10 µm. (e) Static heart image acquired with a 40× objective. Scale bar: 20 µm. (f-h) Cardiac images at different phases of the cardiac cycle. The yellow dotted line indicates the shape of the heart. Scale bar: 10 µm.


A 3D image of the four cardiovascular components can be observed in the reference file (see Visualization 3). Figures 4(f-h) show the cardiovascular images at different phases. The shapes of the components are indicated with dotted lines. The exposure time for Figs. 4(f-h) was set to 5 ms, and 21 layers of images were acquired over a depth of 21.924 µm in 544.87 ms. As demonstrated by the movie of the time-lapse series (see Visualization 4), this imaging speed was sufficient to capture the dynamic heartbeat pattern and overall movements. The experimental results demonstrate the ability of the system to image the cardiovascular system of the zebrafish. This system can thus be used to study the cardiac morphogenesis of the zebrafish, heart growth, and even cardiac regeneration. Similarly, the device can also help in the development of cardiac drugs, such as drugs controlling the heart rate.

3.3 Imaging of cerebral blood vessels in the zebrafish

The head of the zebrafish has the most abundant blood vessels, and these vessels move relatively slowly, so we imaged this region deeper and for longer and quantitatively measured the movement of the blood vessels. Figure 5 demonstrates the feasibility of 3D imaging of complex head vascular structures in the zebrafish. We scanned the cerebral blood vessels with a step length of 1.044 µm and a 40×/NA = 0.65 objective lens (Fig. 5(a)). We acquired 10 time points continuously over 5.32 s to observe the dynamic changes of the blood vessels (see Visualization 5). We chose three time points and took a cross-section at the yellow box to observe the movement of the blood vessels along the Z-axis. Obvious vascular motion can be observed at the locations marked by the yellow arrows in Fig. 5(b-d). Our system provides a reliable tool for the study of the distribution and movement of zebrafish cerebral vessels. With this platform, we can therefore extend previous two-dimensional research, such as on blood vessel growth and vascular function, to the dynamic three-dimensional mode. In Fig. 5(e), the imaging depth reached 41.76 µm, and the total imaging time over 56 time points was 71.597 s (matrix size = 512 × 512 pixels). Longer acquisitions, however, are limited by the memory and computing capacity of our current computers. To observe the changes of specimen structures with depth, we coded the different depths with different colors. The color change from blue to orange represents a change in imaging depth from 0 to 40 µm, as shown in Fig. 5(f). These figures demonstrate the out-of-focus light rejection and optical sectioning capability of SIM. In Fig. 5(g), we measured the movement of a blood vessel during this time, whereby the two colors represent the first and last time points, respectively. The blood vessels moved by 2.6 and 3.2 µm in the x and y directions, respectively.
Visualization 6 shows the 3D image of the specimen observed from different angles after 3D reconstruction with the use of the software Imaris (Bitplane Inc, Switzerland). Therefore, we can quantitatively measure the three-dimensional displacement of blood vessels with our system.
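A depth-coded projection like Fig. 5(f) can be produced by mapping each pixel's brightest depth to a color and scaling by the maximum intensity projection (MIP). The sketch below is illustrative; the linear blue-to-orange map is an assumption standing in for whatever colormap was actually used:

```python
import numpy as np

def depth_coded_mip(stack):
    """stack: (z, y, x) intensity volume. Returns a (y, x, 3) RGB image:
    color encodes the depth of the brightest voxel, brightness the MIP."""
    z = stack.shape[0]
    mip = stack.max(axis=0)
    depth = stack.argmax(axis=0) / max(z - 1, 1)   # 0 (top) .. 1 (bottom)
    rgb = np.empty(mip.shape + (3,))
    rgb[..., 0] = depth                 # red grows with depth
    rgb[..., 1] = 0.6 * depth           # green: deep pixels -> orange (1, 0.6, 0)
    rgb[..., 2] = 1.0 - depth           # blue fades with depth
    return rgb * (mip / (mip.max() + 1e-12))[..., None]
```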


Fig. 5. Three-dimensional (3D) imaging of cerebral blood vessels in the zebrafish. (a) Vessel image acquired with a 40× objective lens (yellow box area shown in Fig. 4(a).) Scale bar: 20 µm. (b-d) Cross-sectional image at different time points in the yellow box in (a). The yellow arrows indicate the locations of the obvious movements. (e) Deep vascular imaging with a 40× objective lens, (green box area shown in Fig. 4(a)). Scale bar: 20 µm. (f) Depth-coding maximum intensity projection (MIP) images. Scale bar: 20 µm. (g) Quantitative measurement of vessel movement from the beginning to the end-time point, Scale bar: 2 µm.


3.4 Imaging of intersegmental vessels and peri-intestinal vessels


Fig. 6. Intersegmental and peri-intestinal vessels. (a) White and yellow boxes indicate the intersegmental and peri-intestinal vessels, respectively. Scale bar: 20 µm. (b) 3D view of an intersegmental vessel. (c) Images of the peri-intestinal blood vessels at two time points showing obvious movements. Scale bar: 20 µm.


We observed fish blood vessels in the region corresponding to the red box in Fig. 4(a). Biologists are very interested in this area because the blood vessels are located around the intestines of the fish, and the movement of the blood vessels reflects gastrointestinal motility. As anticipated, this can allow additional studies of the digestion patterns of the zebrafish [21]. We recorded the 3D motion of the tail vessels at an increased acquisition speed with a 40× objective lens (Fig. 6(a)). Every scan cycle contained 21 layers and lasted only 666.02 ms (matrix size = 486 × 628 pixels) (see Visualization 7). Figure 6(b) shows an enlarged view of the 3D structure of the intersegmental vessel of the zebrafish, which is almost stationary. However, the blood vessels in the yellow frame move very quickly. Accordingly, Fig. 6(c) shows the changes in vessel positions at two time points. The movement around the intestine visually reflects the gastrointestinal motility of the zebrafish, indicating that our 3D dynamic imaging could potentially allow us to study the digestive system of the zebrafish.

4. Discussion

With structured illumination for optical sectioning and an ETL for axial refocusing, we realized fast 3D dynamic imaging at 2 stacks (25 layers) per second. However, the acquisition speed is still limited by the exposure and readout time of the camera. Using a higher-intensity light source could effectively reduce the exposure time and hence enable faster dynamic imaging. The high-quantum-efficiency (95%) sCMOS camera used in our experiment has a relatively low readout speed. By using the latest sCMOS cameras (such as the Sona 4.2B-6, Andor Inc, UK) with readout speeds of up to 74 frames/s (16 bit, 2048 × 2048 pixels), the performance of our system could be further improved.

Three-dimensional optical sectioning imaging with an ETL significantly improves the focusing speed, thus providing the possibility of imaging fast-moving samples. However, the ETL has an inherent shortcoming: it generates a variety of aberrations in the imaging system when changing the focus. Nevertheless, these have only a minor impact on our current imaging system, and corrections are not necessary. In subsequent experiments, adaptive optics may be included to correct the aberrations for higher-resolution imaging [36,37].

In addition, our device could easily be expanded into a multicolor imaging system with multi-band filters and multi-camera combinations. Such a multicolor system could capture the 3D dynamic process of blood flow through the blood vessels and heart, provided they are labelled with different types of fluorescent protein. The system also has the potential to image other thick fluorescently labelled samples, such as brain tissue sections and organoids.

5. Summary

In this study, a fast 4D microscopic imaging system based on an ETL was designed to image the vessels and heart of the zebrafish. The system has a focusing speed 2–3 times that of a traditional mechanical optical sectioning system. To solve the problem of fringe residuals caused by inaccurate phase shifts, we introduced and optimized the HiLo algorithm, making it more suitable for optical sectioning imaging of thick samples. The system obtained good imaging results of the cardiovascular system and vessels of the zebrafish. The experimental results confirm the effectiveness of the imaging method and provide a new tool for the study of the cardiovascular and neurovascular systems of the zebrafish.

Appendix

Parameters list

The reconstruction parameters of images in Fig. 4–Fig. 6 are listed in Table 1.


Table 1. Parameters in Fig. 4–Fig. 6

Funding

National Key Research and Development Program of China (2017YFC0110100); National Natural Science Foundation of China (61805272).

Acknowledgments

We are grateful to the team led by Weijun Pan at the Shanghai Institute of Nutrition and Health for providing the zebrafish samples.

Disclosures

The method presented in this paper is the subject of a China patent application (2019100412767).

References

1. L. Andrés-Delgado, M. Peralta, N. Mercader, and J. Ripoll, “Dynamic focusing in the zebrafish beating heart,” Proc. SPIE 9717, 971717 (2016). [CrossRef]  

2. M. Mickoleit, B. Schmid, M. Weber, F. O. Fahrbach, S. Hombach, S. Reischauer, and J. Huisken, “High-resolution reconstruction of the beating zebrafish heart,” Nat. Methods 11(9), 919–922 (2014). [CrossRef]  

3. D. G. Ransom, P. Haffter, J. Odenthal, A. Brownlie, E. Vogelsang, R. N. Kelsh, F. J. van Eeden, M. Furutani-Seiki, and M. Granato, “Characterization of zebrafish mutants with defects in embryonic hematopoiesis,” Development 123, 311–319 (1996).

4. D. Li, W. Xue, M. Li, M. Dong, J. Wang, X. Wang, X. Li, K. Chen, W. Zhang, and S. Wu, “VCAM-1 + macrophages guide the homing of HSPCs to a vascular niche,” Nature 564(7734), 119–124 (2018). [CrossRef]  

5. S. C. Watkins, S. Maniar, M. Mosher, B. L. Roman, M. Tsang, and C. M. St Croix, “High resolution imaging of vascular function in zebrafish,” PLoS One 7(8), e44018 (2012). [CrossRef]  

6. Y. R. Cha and B. M. Weinstein, “Visualization and experimental analysis of blood vessel formation using transgenic zebrafish,” Birth Defects Res., Part C 81(4), 286–296 (2007). [CrossRef]  

7. S. Isogai, M. Horiguchi, and B. M. Weinstein, “The vascular anatomy of the developing zebrafish: an atlas of embryonic and early larval development,” Dev. Biol. 230(2), 278–301 (2001). [CrossRef]  

8. L. D. Covassin, J. A. Villefranc, M. C. Kacergis, B. M. Weinstein, and N. D. Lawson, “Distinct genetic interactions between multiple Vegf receptors are required for development of different blood vessel types in zebrafish,” Proc. Natl. Acad. Sci. 103(17), 6554–6559 (2006). [CrossRef]  

9. K. Yaniv, S. Isogai, D. Castranova, L. Dye, J. Hitomi, and B. M. Weinstein, “Live imaging of lymphatic development in the zebrafish,” Nat. Med. 12(6), 711–716 (2006). [CrossRef]  

10. K. S. Okuda, J. W. Astin, J. P. Misa, M. V. Flores, K. E. Crosier, and P. S. Crosier, “Lyve1 expression reveals novel lymphatic vessels and new mechanisms for lymphatic vessel development in zebrafish,” Development 139(13), 2381–2391 (2012). [CrossRef]  

11. K. Ando, S. Fukuhara, N. Izumi, H. Nakajima, H. Fukui, R. N. Kelsh, and N. Mochizuki, “Clarification of mural cell coverage of vascular endothelial cells by live imaging of zebrafish,” Development 143(8), 1328–1339 (2016). [CrossRef]  

12. E. Kochhan, A. Lenard, E. Ellertsdottir, L. Herwig, M. Affolter, H.-G. Belting, and A.-F. Siekmann, “Blood flow changes coincide with cellular rearrangements during blood vessel pruning in zebrafish embryos,” PLoS One 8(10), e75060 (2013). [CrossRef]  

13. J.-A. Conchello and J. W. Lichtman, “Optical sectioning microscopy,” Nat. Methods 2(12), 920–931 (2005). [CrossRef]  

14. F. Helmchen and W. Denk, “Deep tissue two-photon microscopy,” Nat. Methods 2(12), 932–940 (2005). [CrossRef]  

15. A. H. Voie, D. Burns, and F. Spelman, “Orthogonal-plane fluorescence optical sectioning: three - dimensional imaging of macroscopic biological specimens,” J. Microsc. 170(3), 229–236 (1993). [CrossRef]  

16. B.-C. Chen, W. R. Legant, K. Wang, L. Shao, D. E. Milkie, M. W. Davidson, C. Janetopoulos, X. Wu, J.-A. Hammer, and Z. Liu, “Lattice light-sheet microscopy: imaging molecules to embryos at high spatiotemporal resolution,” Science 346(6208), 1257998 (2014). [CrossRef]  

17. P. J. Keller, A. D. Schmidt, J. Wittbrodt, and E. H. Stelzer, “Digital scanned laser light-sheet fluorescence microscopy (DSLM) of zebrafish and Drosophila embryonic development,” Cold Spring Harb. Protoc. 2011(10), pdb.prot065839 (2011). [CrossRef]

18. M. Arigovindan, J. W. Sedat, and D. A. Agard, “Effect of depth dependent spherical aberrations in 3D structured illumination microscopy,” Opt. Express 20(6), 6527–6541 (2012). [CrossRef]  

19. L. Gao, L. Shao, C. D. Higgins, J. S. Poulton, M. Peifer, M. W. Davidson, X. Wu, B. Goldstein, and E. Betzig, “Noninvasive imaging beyond the diffraction limit of 3D dynamics in thickly fluorescent specimens,” Cell 151(6), 1370–1385 (2012). [CrossRef]  

20. D. Li, L. Shao, B. C. Chen, X. Zhang, M. Zhang, B. Moses, D. E. Milkie, J. R. Beach, J. A. Hammer, M. Pasham, T. Kirchhausen, M. A. Baird, M. W. Davidson, P. Xu, and E. Betzig, “Extended-resolution structured illumination imaging of endocytic and cytoskeletal dynamics,” Science 349(6251), aab3500 (2015). [CrossRef]

21. M. A. Neil, R. Juškaitis, and T. Wilson, “Method of obtaining optical sectioning by using structured light in a conventional microscope,” Opt. Lett. 22(24), 1905–1907 (1997). [CrossRef]  

22. M. Neil, R. Juškaitis, and T. Wilson, “Real time 3D fluorescence microscopy by two beam interference illumination,” Opt. Commun. 153(1-3), 1–4 (1998). [CrossRef]  

23. H. Bauch and J. Schaffer, “Optical sections by means of ‘structured illumination’: background and application in fluorescence microscopy,” Photonik Int. 5, 86–88 (2006).

24. V. Poher, H. Zhang, G. Kennedy, C. Griffin, S. Oddos, E. Gu, D. S. Elson, J. M. Girkin, P. French, and M. D. Dawson, “Optical sectioning microscopes with no moving parts using a micro-stripe array light emitting diode,” Opt. Express 15(18), 11196–11206 (2007). [CrossRef]  

25. K. Wicker and R. Heintzmann, “Single-shot optical sectioning using polarization-coded structured illumination,” J. Opt. 12(8), 084010 (2010). [CrossRef]  

26. D. Dan, M. Lei, B. Yao, W. Wang, M. Winterhalder, A. Zumbusch, Y. Qi, L. Xia, S. Yan, Y. Yang, P. Gao, T. Ye, and W. Zhao, “DMD-based LED-illumination super-resolution and optical sectioning microscopy,” Sci. Rep. 3(1), 1116 (2013). [CrossRef]  

27. J. Qian, M. Lei, D. Dan, B. Yao, X. Zhou, Y. Yang, S. H. Yan, J. W. Min, and X. H. Yu, “Full-color structured illumination optical sectioning microscopy,” Sci. Rep. 5(1), 14513 (2015). [CrossRef]  

28. S. Santos, K. K. Chu, D. Lim, N. Bozinovic, T. N. Ford, C. Hourtoule, A. C. Bartoo, S. K. Singh, and J. Mertz, “Optically sectioned fluorescence endomicroscopy with hybrid-illumination imaging through a flexible fiber bundle,” J. Biomed. Opt. 14(3), 030502 (2009). [CrossRef]

29. T. N. Ford, D. Lim, and J. Mertz, “Fast optically sectioned fluorescence HiLo endomicroscopy,” J. Biomed. Opt. 17(2), 021105 (2012). [CrossRef]  

30. B. F. Grewe, F. F. Voigt, M. van’t Hoff, and F. Helmchen, “Fast two-layer two-photon imaging of neuronal cell populations using an electrically tunable lens,” Biomed. Opt. Express 2(7), 2035–2046 (2011). [CrossRef]  

31. R. Galland, G. Grenci, A. Aravind, V. Viasnoff, V. Studer, and J.-B. Sibarita, “3D high- and super-resolution imaging using single-objective SPIM,” Nat. Methods 12(7), 641–644 (2015). [CrossRef]

32. G. Sancataldo, L. Scipioni, T. Ravasenga, L. Lanzanò, A. Diaspro, A. Barberis, and M. Duocastella, “Three-dimensional multiple-particle tracking with nanometric precision over tunable axial ranges,” Optica 4(3), 367–373 (2017). [CrossRef]  

33. C. Zuo, Q. Chen, W. Qu, and A. Asundi, “High-speed transport-of-intensity phase microscopy with an electrically tunable lens,” Opt. Express 21(20), 24060–24075 (2013). [CrossRef]  

34. J. Mazzaferri, D. Kunik, J. M. Belisle, K. Singh, S. Lefrancois, and S. Costantino, “Analyzing speckle contrast for HiLo microscopy optimization,” Opt. Express 19(15), 14508–14517 (2011). [CrossRef]  

35. F. Ströhl and C. F. Kaminski, “A joint Richardson-Lucy deconvolution algorithm for the reconstruction of multifocal structured illumination microscopy data,” Methods Appl. Fluoresc. 3(1), 014002 (2015). [CrossRef]  

36. D. Débarre, E. J. Botcherby, M. J. Booth, and T. Wilson, “Adaptive optics for structured illumination microscopy,” Opt. Express 16(13), 9290–9305 (2008). [CrossRef]  

37. M. Žurauskas, I. M. Dobbie, R. M. Parton, M. A. Phillips, A. Göhler, I. Davis, and M. J. Booth, “IsoSense: frequency enhanced sensorless adaptive optics through structured illumination,” Optica 6(3), 370–379 (2019). [CrossRef]  

Supplementary Material (7)

Visualization 1: The vein of the zebrafish cardiovascular system
Visualization 2: The artery of the zebrafish cardiovascular system
Visualization 3: The shape of the four components of the zebrafish cardiovascular system
Visualization 4: Cardiac images at different phases of the cardiac cycle
Visualization 5: Reconstruction of zebrafish vessels in the head region
Visualization 6: Optical-sectioning SIM time-lapse recording of zebrafish vessels in the head region
Visualization 7: Dynamic three-dimensional visualization of the intersegmental and peri-intestinal vessels in zebrafish



Figures (6)

Fig. 1. (a) Schematic of the imaging system (DMD, digital micromirror device; L1, L2, L3, optical lenses; M1, M2, mirrors; DM, dichroic mirror; ETL, electrically tunable lens; OBJ, objective lens). The inset in the upper right corner is a schematic view of the ETL. (b) Control sequence diagram of the system. The FPGA outputs two signals that trigger the camera start (light blue) and the ETL focus change (red); the DMD loads two opposite structured patterns at each plane and is turned off between them; the green areas represent the camera's read-out time, and the yellow areas represent the exposure time while the DMD is on; 8 ms is the ETL settling time.
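The per-plane acquisition sequence described in (b) can be sketched as a simple event scheduler. The 8 ms ETL settling time comes from the caption; the exposure and read-out durations below are placeholder assumptions, not the paper's actual camera settings.

```python
def plane_schedule(exposure_s, readout_s, settle_s=0.008, n_patterns=2):
    """Return (event, start, end) tuples for one axial plane:
    the ETL settles, then each of the two opposite DMD patterns is
    displayed and exposed, followed by a camera read-out."""
    t, events = 0.0, []
    events.append(("etl_settle", t, t + settle_s))
    t += settle_s
    for i in range(n_patterns):
        events.append((f"expose_pattern_{i + 1}", t, t + exposure_s))
        t += exposure_s
        events.append((f"readout_{i + 1}", t, t + readout_s))
        t += readout_s
    return events

# Hypothetical numbers: 5 ms exposure, 4 ms rolling-shutter read-out
sched = plane_schedule(0.005, 0.004)
plane_time = sched[-1][2]  # total time per plane in seconds
```

Summing such plane schedules over all layers gives the stack time budget of Eq. (1).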
Fig. 2. Principle of the optical sectioning algorithm. (a) Flow chart of the algorithm. The three main steps, consisting of deconvolution, HiLo, and inverted Gaussian attenuation, are marked in different colors (HP filter, high-pass filter; LP filter, low-pass filter; FT, Fourier transform; iFT, inverse Fourier transform). (b) Principle of inverted Gaussian attenuation, which suppresses the out-of-focus signal in the spectral domain (WF, wide field). (c) Demonstration of the optical sectioning capability on a pumpkin stem slice sample. The lower panel shows the normalized intensity along the dotted lines (OS, optical sectioning image; scale bar: 1 µm).
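The HiLo-plus-attenuation pipeline of (a) and (b) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the contrast estimate, the Gaussian filter widths (`sigma_hilo`, `sigma_att`), and the attenuation depth `a` are all illustrative assumptions.

```python
import numpy as np

def optical_section(r1, r2, eta=1.0, sigma_hilo=8.0, a=0.8, sigma_att=4.0):
    """Minimal HiLo-style sectioning sketch. r1, r2 are images taken
    under two opposite structured illumination patterns."""
    wf = 0.5 * (r1 + r2)                    # uniform (wide-field) image
    c = 2.0 * r1 / (r1 + r2 + 1e-12) - 1.0  # local contrast estimate
    # Gaussian low-/high-pass pair built in the Fourier domain
    ny, nx = wf.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    lp = np.exp(-(fx**2 + fy**2) * (2 * np.pi * sigma_hilo) ** 2 / 2)
    i_lp = np.fft.ifft2(np.fft.fft2(np.abs(c) * wf) * lp).real
    i_hp = np.fft.ifft2(np.fft.fft2(wf) * (1 - lp)).real
    i_hilo = i_hp + eta * i_lp
    # inverted Gaussian attenuation: damp residual low-frequency haze
    att = 1 - a * np.exp(-(fx**2 + fy**2) * (2 * np.pi * sigma_att) ** 2 / 2)
    return np.fft.ifft2(np.fft.fft2(i_hilo) * att).real

# Synthetic demo: a flat object lit by complementary stripe patterns
y, x = np.mgrid[0:64, 0:64]
stripes = 0.5 * (1 + np.sin(2 * np.pi * x / 8))
r1 = stripes + 0.1
r2 = (1 - stripes) + 0.1
sec = optical_section(r1, r2)
```

In-focus structure carries high stripe contrast and survives the low-pass branch, while defocused background is removed by the high-pass branch and the final attenuation step.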
Fig. 3. Calibration of various system parameters. (a) Relationship between the focal-plane position and the voltage for different objective lenses. (b) Relationship between the relative magnification and the imaging depth with the 40× objective. The magnification was characterized by measuring the distance between two beads. (c) Evaluation of the spatial resolution using 200 nm red fluorescent beads. Scale bar: 7 µm. (d) Enlarged image of the fluorescent beads. Scale bar: 2 µm. (e) Axial images of the beads. Scale bar: 2 µm. (f, g) Fitted intensity profiles of the lateral and axial bead images.
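Panels (f, g) extract the resolution by fitting bead intensity profiles. A minimal FWHM estimator, shown here on a synthetic Gaussian profile whose width (sigma = 0.255 µm, so FWHM ≈ 0.6 µm, matching the stated lateral resolution) is a hypothetical stand-in for a measured bead profile:

```python
import numpy as np

def fwhm(x, y):
    """Estimate the full width at half maximum of a single-peaked
    profile by linear interpolation at the half-maximum crossings."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]
    # interpolate the left and right half-maximum crossings
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

x = np.linspace(-3, 3, 601)            # position in microns (hypothetical sampling)
y = np.exp(-x**2 / (2 * 0.255**2))     # synthetic bead profile, sigma = 0.255 um
w = fwhm(x, y)                         # FWHM = 2.355 * sigma for a Gaussian
```

In practice one would fit a Gaussian to the measured profile rather than threshold it, but the half-maximum crossing gives the same width for noise-free data.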
Fig. 4. Images of the cardiovascular system of the zebrafish. (a) Two-dimensional (2D) zebrafish image acquired with a 4× objective lens. Scale bar: 200 µm. (b) Schematic of the heart of the zebrafish. The inset in the upper right corner marks the location of the zebrafish heart. (c, d) Blood vessels connected to the heart. Scale bar: 10 µm. (e) Static heart image acquired with a 40× objective. Scale bar: 20 µm. (f-h) Cardiac images at different phases of the cardiac cycle. The yellow dotted line outlines the shape of the heart. Scale bar: 10 µm.
Fig. 5. Three-dimensional (3D) imaging of cerebral blood vessels in the zebrafish. (a) Vessel image acquired with a 40× objective lens (yellow boxed area in Fig. 4(a)). Scale bar: 20 µm. (b-d) Cross-sectional images at different time points within the yellow box in (a). The yellow arrows indicate the locations of obvious movements. (e) Deep vascular imaging with a 40× objective lens (green boxed area in Fig. 4(a)). Scale bar: 20 µm. (f) Depth-coded maximum intensity projection (MIP) images. Scale bar: 20 µm. (g) Quantitative measurement of vessel movement from the first to the last time point. Scale bar: 2 µm.
Fig. 6. Intersegmental and peri-intestinal vessels. (a) White and yellow boxes indicate the intersegmental and peri-intestinal vessels, respectively. Scale bar: 20 µm. (b) 3D view of an intersegmental vessel. (c) Images of the peri-intestinal blood vessels at two time points showing obvious movements. Scale bar: 20 µm.

Tables (1)


Table 1. Parameters in Fig. 4–Fig. 6

Equations (6)


$$\mathrm{Time} = \mathrm{Exposure\_time} \times \mathrm{Layers} \times 2 + \mathrm{Rows} \times 7.74\,\mu\mathrm{s} \times \mathrm{Layers} + 8\,\mathrm{ms} \times \mathrm{Layers},$$

$$\mathrm{OTF}(k) = \begin{cases} \dfrac{2}{\pi}\left[\cos^{-1}(k) - k\sqrt{1-k^{2}}\right], & |k| \le k_{c} \\ 0, & |k| > k_{c} \end{cases}$$

$$R(k) = \mathrm{FT}\!\left[\,2 r_{1}/(r_{1}+r_{2})\,\right],$$

$$C(x,y) = \left[R^{+}(x,y) - R^{-}(x,y)\right].$$

$$I_{hilo}(x,y) = I_{hp}(x,y) + \eta I_{lp}(x,y) = \mathrm{HP}\!\left[(r_{1}+r_{2})/2\right] + \eta\,\mathrm{LP}\!\left[\left(R^{+}(x,y)-R^{-}(x,y)\right) \times (r_{1}+r_{2})/2\right],$$

$$I_{os} = \mathrm{iFT}\!\left[\mathrm{FT}(I_{hilo}) \times \mathrm{GaussAtt}(a,\sigma^{2})\right],$$
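The first equation maps directly to a per-stack time budget: two patterns per layer, a rolling read-out proportional to the number of rows, and the ETL settling time at every layer. A minimal calculator, using illustrative numbers (5 ms exposure, 25 layers, 1024 rows) rather than the paper's actual camera settings:

```python
def stack_time(exposure_s, layers, rows, row_readout_s=7.74e-6, settle_s=8e-3):
    """Acquisition time for one z-stack: two structured patterns per
    layer, row-by-row read-out, plus ETL settling at each layer."""
    return (exposure_s * layers * 2
            + rows * row_readout_s * layers
            + settle_s * layers)

t = stack_time(0.005, 25, 1024)  # seconds for one 25-layer stack
```

With these placeholder values the stack time is well under one second, consistent with the reported rate of two 25-layer stacks per second.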