Optica Publishing Group

Accurate calibration of beam trajectories in scanning optical imaging systems

Open Access

Abstract

We present a calibration method for finding the coordinates of points in the trajectory of the scanning beam in flying-spot imaging devices. Our method is based on laterally translating the field of view on the imaging object plane by introducing additional beam deflections. We show that laterally translating the field of view provides a series of images whose relative translations are equal to the distances between the points in the scanning pattern to be calibrated. We show how these distances are mapped to the coordinates of the trajectory points. As an example, we demonstrate the calibration of the scanning patterns in an optical system with two independent microelectromechanical system based scanners. Our method profits from a large collection of distance measurements to find the trajectory coordinates, thereby minimizing the effect of random sources of uncertainty in the positions of points in the scanning pattern. We have found that we are capable of finding the coordinates of points in the scanning patterns with accuracy greater than the optical resolution of the imaging system.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

In flying-spot imaging systems, an image is formed by the measured intensity of backscattered light as a beam travels, deflected by a scanner, along a trajectory of sampling points on the object plane. This trajectory depends largely on the signals driving the scanner, but it is also affected by the optical and mechanical properties of the beam-deflecting system. As a result, the actual scanning trajectory differs from the presumed, theoretical trajectory. Accurate knowledge of the position of the sampling points is crucial for the correct reconstruction of the image, and hence the need for a calibration procedure that accurately maps the acquired, distorted image onto the image with proper geometry.

In common solutions to the calibration problem, it is assumed that a model either of the imaging system or of its distortions is well known. Features on a calibration target are then used to find the parameters of a model of the image distortions [1–6]. Accurate calibration thus requires proper choices of both a model of the system’s distortions and a reference target with adequate spatial features that can be resolved by the imaging system. Moreover, beam scanning patterns that are sparse with respect to the system’s optical resolution cannot sample all of the target’s features. An exemplary case of this class of patterns is the Lissajous scanning pattern widely used in a growing number of ophthalmic imaging systems [7,8].

In this Letter, we present a trajectory calibration method that does not depend on a baseline of feature sizes or spatial distribution, thus obviating the need for a traceable calibration target. Instead, the calibration is possible using an arbitrary object. Furthermore, our method does not require a priori knowledge of a model of the imaging system’s distortions. In particular, our method is applicable to systems in which sampling is sparse or non-uniformly distributed. Remarkably, our method is capable of calibrating trajectories with accuracy greater than the system’s optical resolution. Our method differs from common approaches in that it is based on determining the spatial coordinates of each individual point within the scanning trajectory in lieu of fitting a model of optical distortions to a target’s features and then estimating the trajectory based on this model.

We find the coordinates in a two-step process. First, we calculate the distances between each pair of points in the scanning trajectory. Next, we localize the coordinates that match these distances using an optimization technique. In the first step, we use a raster scanner to translate the starting position of the scanning beam on the object plane. During a complete excursion of the raster scanner, each point in the scanning trajectory forms a laterally translated image of the object. The displacements between images directly correspond to the distances between the scanning trajectory points. We then compute the distance between laterally displaced images for every pair of points. Following this approach, we effectively translate the problem of mapping a set of unknown coordinate points into a multitude of straightforward image registration tasks. Coordinate localization is performed by numerically minimizing a cost function involving the collected distances.

To demonstrate the effectiveness of our method, we have calibrated the scanning trajectories of two independent scanning laser ophthalmoscopes (SLOs) that simultaneously acquire images of the same target; see Fig. 1. A pair of 2D resonant microelectromechanical system (MEMS) scanners (OP-6111, Opus Microsystems, Taiwan), one in each SLO, oscillates at 22,400 Hz on the fast axis and 1400 Hz on the slow axis to produce a pair of perpendicular, sparse Lissajous scanning patterns on the object as shown in Fig. 2(a). These patterns scan the sample at 2800 frames/s. The object plane is purposely sampled so that the distance between the lines in each pattern is much larger than the diameter of the imaging spot. The two beams are combined by a polarization beam splitter and passed through our calibration tool. The tool is composed of one pair of galvanometric scanners that additionally deflect the beams in a raster fashion. The wavelength of the light used in the experiments is 850 nm, and the calculated optical resolution of the imaging system is 6.2 µm. The data acquisition and system synchronization are performed using custom-made electronics and software (AM2M Ltd. L.P, Poland).


Fig. 1. Scanning optical system under calibration and calibration tool. L1-L10, lenses; PD1-PD2, avalanche photodiodes; BS1-BS2, non-polarizing beam splitters; PBS, polarizing beam splitter; PC1-PC2, fiber polarization controllers; MM1-MM2, 2D resonant MEMS scanners; GS, pair of galvanometric scanners; SLD, superluminescent diode. Scanning trajectories generated by scanners MM1, MM2, and GS are schematically plotted next to the scanners.



Fig. 2. Relations between the Lissajous and raster scanning trajectories. (a) L-frames for SLO1 and SLO2. Both Lissajous patterns laterally translated to the position ${\vec r_j}$ of the raster pattern and overlaid simultaneously on the raster pattern image of a USAF 1951 target. (b) R-frames generated by two different points of a Lissajous pattern laterally displaced by the motion of the galvoscanner shown by the red line. The distance between points ${\vec l_i}$ and ${\vec l_k}$ from an L-frame ${{\bf L}_j}$ is the same as the lateral displacement between R-frames ${{\bf R}_i}$ and ${{\bf R}_k}$; see Eq. (1). (c) Relations between the L-frames and R-frames on matrix ${\bf P}$. The projections of all the frames for rows and columns are shown in Visualization 1 and Visualization 2, respectively.


For each one of the SLOs, the scanning beam follows a trajectory that is a combination of the two scanning patterns: the Lissajous pattern to be calibrated, generated by the 2D resonant MEMS scanner, and the raster pattern introduced by the galvanometric scanners in our calibration tool. This compound trajectory can be described as follows. Let the Lissajous pattern be a set of $N$ points ${\bf L} = ({\vec l_1}, \ldots, {\vec l_i}, \ldots, {\vec l_N})$ and the raster pattern a set of $M$ points ${\bf R} = ({\vec r_1}, \ldots, {\vec r_j}, \ldots, {\vec r_M})$. In these definitions, $\vec l = ({l_x},{l_y})$ and $\vec r = ({r_x},{r_y})$ are the coordinates of the points of the Lissajous and raster patterns, respectively. The coordinates of points acquired during the experiment can be described as sums of the two coordinates and presented in the form of a matrix ${\bf P}$ with elements ${\vec p_{ij}} = {\vec l_i} + {\vec r_j}$. The columns ${{\bf L}_j} = {\bf L} + {\vec r_j}$ of this matrix are the Lissajous patterns (L-frames) acquired at the ${\vec r_j}$ coordinates of the raster pattern, and the rows ${{\bf R}_i} = {\bf R} + {\vec l_i}$ can be interpreted as the raster images (R-frames) laterally translated to the position ${\vec l_i}$ in the Lissajous pattern. The connection between the points in the Lissajous patterns and their associated raster images is depicted in Fig. 2(b). Figure 2(c) shows the composition of matrix ${\bf P}$.
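The composition of matrix ${\bf P}$ is straightforward to sketch numerically. The following Python fragment is a minimal illustration with small, synthetic patterns and hypothetical axis frequencies (not the experimental parameters); it builds ${\bf P}$ from Lissajous and raster coordinates and verifies the column and row identities:

```python
import numpy as np

# Hypothetical, reduced sizes for illustration; the Letter uses
# N = 4096 Lissajous points and a 500 x 500 raster grid.
N, M = 64, 25

# Lissajous pattern L: N points (l_x, l_y) from assumed axis frequencies.
t = np.linspace(0.0, 1.0, N, endpoint=False)
L = np.stack([np.sin(2 * np.pi * 16 * t),      # fast axis
              np.sin(2 * np.pi * t)], axis=1)  # slow axis

# Raster pattern R: M points (r_x, r_y) on a 5 x 5 grid.
g = np.linspace(-1.0, 1.0, 5)
rx, ry = np.meshgrid(g, g)
R = np.stack([rx.ravel(), ry.ravel()], axis=1)

# Matrix P with elements p_ij = l_i + r_j (shape N x M x 2).
P = L[:, None, :] + R[None, :, :]

# Column j is the L-frame L_j = L + r_j; row i is the R-frame R_i = R + l_i.
assert np.allclose(P[:, 3, :], L + R[3])
assert np.allclose(P[5, :, :], R + L[5])
```

In this representation, the difference of two rows, `P[i] - P[k]`, is the constant vector `L[i] - L[k]` for every raster point, which is exactly the property exploited by Eq. (1).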

We now define an image-to-image operator $\vec s({{\bf R}_i},{{\bf R}_k})$ that calculates the mean distance between all corresponding pairs of points in the two images ${{\bf R}_i}$ and ${{\bf R}_k}$. This operator is the vectorial distance between the elements of rows ${{\bf R}_i}$ and ${{\bf R}_k}$ of matrix ${\bf P}$:

$$\vec s({{\bf R}_i},{{\bf R}_k}) = {{\bf R}_i} - {{\bf R}_k} = {\bf R} + {\vec l_i} - ({\bf R} + {\vec l_k}) = {\vec l_i} - {\vec l_k}.$$

From Eq. (1), the relative lateral displacement between any two raster images ${{\bf R}_i}$ and ${{\bf R}_k}$ equals the distance between points ${\vec l_i}$ and ${\vec l_k}$ in the Lissajous pattern. We can now use any of the many available methods for image registration, including cross correlation [9], to efficiently calculate the displacements $\vec s({{\bf R}_i},{{\bf R}_k})$ between images.
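As a sketch of the registration step, here using simple phase correlation implemented with NumPy FFTs on a synthetic image pair (the Letter relies on the method of Ref. [9]; any robust registration technique works), a known integer-pixel shift between two frames can be recovered as follows:

```python
import numpy as np

def register_shift(img_a, img_b):
    """Estimate the circular integer-pixel translation d such that
    img_a(x) = img_b(x - d), via phase correlation."""
    cross = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    cross /= np.abs(cross) + 1e-12           # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real          # delta-like peak at the shift
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), float)
    shape = np.array(corr.shape)
    peak[peak > shape / 2] -= shape[peak > shape / 2]  # wrap to signed shifts
    return peak                              # (dy, dx)

# Synthetic demonstration: apply a known shift and recover it.
rng = np.random.default_rng(0)
base = rng.random((128, 128))
shifted = np.roll(base, shift=(5, -3), axis=(0, 1))
print(register_shift(shifted, base))         # recovers the (5, -3) shift
```

Subpixel refinement of the correlation peak would be needed in practice to exploit the full accuracy the method offers; the integer-pixel version above only illustrates the principle.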

In the second stage of the calibration method, we minimize a cost function to find the calibrated Lissajous pattern ${{\bf L}^c}$ with coordinates ${\vec l^c}$ arranged in such a way that all the mutual distances $\vec l_i^c - \vec l_k^c$ match the distances found using operator $\vec s({{\bf R}_i},{{\bf R}_k})$:

$${{\bf L}^c} = \mathop {{\arg}\, {\min}}\limits_{{{\vec l}^c} \in {{\bf L}^c}} \sum\limits_{i,k = 1}^N |(\vec l_i^c - \vec l_k^c) - \vec s({{\bf R}_i},{{\bf R}_k})|.$$

Coordinate-localization calculations can be performed using a variety of specialized minimization methods, including multidimensional scaling (MDS) [10], self-organized maps [11], or stochastic neighbor embedding [12,13], to name a few examples. MDS is particularly well suited to this task, as it takes the pairwise distances among a set of individual objects and localizes them into a configuration of spatial points. In particular, we solve Eq. (2) using an MDS-based algorithm [14]. The starting coordinates of the points ${\vec l^c}$ are derived from a theoretical Lissajous pattern ${{\bf L}^t}$ calculated from the nominal values of the MEMS scanners’ resonant frequencies.
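Because the displacements $\vec s$ obtained from registration are vector valued, a least-squares variant of Eq. (2) has a closed-form minimizer up to a global translation: each point is the average of its row of displacements. The sketch below uses synthetic points and noise, and this closed form stands in for the MDS-based algorithm of Ref. [14] employed in the Letter; it also illustrates the noise suppression gained from the many pairwise measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
L_true = rng.normal(size=(N, 2))      # hypothetical ground-truth pattern

# Noisy pairwise vector displacements, standing in for the registered
# R-frame shifts of Eq. (1): s_ik = l_i - l_k + noise.
sigma = 0.05
S = L_true[:, None, :] - L_true[None, :, :]
S += rng.normal(scale=sigma, size=S.shape)

# Least-squares localization: averaging row i of S gives l_i minus the
# pattern centroid; add the centroid back to fix the free translation.
L_est = S.mean(axis=1) + L_true.mean(axis=0)

rms = np.sqrt(np.mean(np.sum((L_est - L_true) ** 2, axis=1)))
print(rms)  # well below sigma: N-fold averaging suppresses the noise
```

If only scalar (unsigned) distances were available, this closed form would not apply and the problem would become the classical MDS setting, which motivates the choice of an MDS-based solver in the Letter.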

The calibrated coordinates ${\vec l^c}$ are calculated in the same units as the displacements $\vec s({{\bf R}_i},{{\bf R}_k})$. In our experiment, even if different fields of view have different field distortions, the overlapping sections of any pair of images are equally affected by them. As a result, any pair of images will always be accurately registered, provided that the images overlap. If quantitative spatial information in specific units is required after calibration, it is readily obtained by imaging any resolution target that allows for calculation of the raster image pixel size.

To experimentally validate our method, we have performed two different experiments. In the first experiment, we set the pair of galvanometric scanners GS to form a raster scanning pattern consisting of $1000 \times 1000$ pixels over an area of $7.0^\circ \times 7.0^\circ$, and we acquired one R-frame for each of the SLOs with the MEMS scanners turned off, namely, frame ${\bf R}_0^1$ for SLO1 and frame ${\bf R}_0^2$ for SLO2 [see Fig. 4(a) for ${\bf R}_0^1$]. This pair of frames is then used to find the offset between the two SLOs $\vec s({\bf R}_0^1,{\bf R}_0^2)$ to properly register the images obtained from the two SLOs.

In the second experiment, we set the 2D MEMS scanners in both SLOs to cover an angular extent of ${2.8^ \circ} \times {2.2^ \circ}$ with 4096 pixels in each acquired Lissajous pattern. Next, we set the pair of galvanometric scanners GS to form a raster scanning pattern consisting of $500 \times 500$ pixels over an area of $4.0^\circ \times 4.0^\circ$ and filled matrix ${\bf P}$ for each of the SLOs, as shown in Fig. 2(c). Both matrix ${{\bf P}^1}$ (for SLO1) and ${{\bf P}^2}$ (for SLO2) are filled column-wise, and the data from Lissajous patterns ${\bf L}_j^1$ and ${\bf L}_j^2$ are collected after the galvanometric scanner is fully stopped at consecutive positions ${\vec r_j}$ of the raster pattern. Visualization 1 shows the acquisition of all L-frames, and Visualization 2 shows the resulting R-frames for matrices ${{\bf P}^1}$ and ${{\bf P}^2}$. Next, we apply the calibration method as described above using data from both matrices ${{\bf P}^1}$ and ${{\bf P}^2}$ independently. Finally, we apply the correction for the displacement $\vec s({\bf R}_0^1,{\bf R}_0^2)$ measured in the first experiment to matrix ${{\bf P}^1}$.

Figure 3(a) shows the results of calibration with the calibrated Lissajous patterns ${{\bf L}^{c,1}}$ and ${{\bf L}^{c,2}}$ overlaid on the theoretical Lissajous patterns ${{\bf L}^{t,1}}$ and ${{\bf L}^{t,2}}$ for SLO1 and SLO2, respectively. In Fig. 3(b), we show the histograms of the angular errors between corresponding points in the theoretical and calibrated patterns for both SLOs. In our experiment, these differences are as high as 0.13°. This maximum error is equivalent to approximately seven Airy disc radii.


Fig. 3. (a) Theoretical ${{\bf L}^t}$ (dotted lines) and calibrated ${{\bf L}^c}$ (solid lines) Lissajous patterns. (b) Histogram of distances between corresponding points from ${{\bf L}^t}$ and ${{\bf L}^c}$ in Airy disc radii and degrees. Lissajous patterns and histograms from SLO1 and SLO2 are plotted in blue and orange, respectively.


The effect of trajectory calibration is clearly appreciated by comparing the images of a USAF 1951 resolution target that result from compounding all the R-frames obtained from both matrices ${{\bf P}^1}$ and ${{\bf P}^2}$ acquired in the second experiment. Figure 4(b) shows the image obtained from the R-frames laterally translated using coordinates $\vec l_i^{t,1}$ and $\vec l_i^{t,2}$ from the theoretical Lissajous scanning patterns ${{\bf L}^{t,1}}$ and ${{\bf L}^{t,2}}$. For comparison, Fig. 4(c) shows the image obtained from the same R-frames but laterally translated using coordinates $\vec l_i^{c,1}$ and $\vec l_i^{c,2}$ from the calibrated Lissajous scanning patterns ${{\bf L}^{c,1}}$ and ${{\bf L}^{c,2}}$. The resolution of the single raster image shown in Fig. 4(a) is limited only by the system’s optical resolution. We find upon inspection that the images in Figs. 4(a) and 4(c) have comparable resolution, which evidences that minimizing the cost function in Eq. (2) provides an optimal estimate ${{\bf L}^c}$ with an error well below the optical resolution of the system. Any uncertainty introduced in the relative positions of the raster images would result in a lower image resolution in Fig. 4(c). A major advantage of using the set of accurately registered R-frames to produce the target’s image is the visible reduction in noise in Fig. 4(c) (see inset). This improvement is explained by the compounding process, which uses a weighted average of the R-frames, resulting in suppression of uncorrelated noise from the images to a level below visibility.


Fig. 4. Images of a 1951 USAF resolution target obtained (a) with a single raster scan, (b) using registration of R-frames laterally translated to the coordinates $\vec l_i^t$ from the theoretical Lissajous scanning pattern ${{\bf L}^t}$, and (c) using registration of R-frames laterally translated to the coordinates $\vec l_i^c$ from the calibrated Lissajous scanning pattern ${{\bf L}^c}$. Corresponding experimental contrast transfer function plots for horizontal and vertical elements of the target are shown below each image in blue and orange, respectively.


The improvement in resolution due to calibration is also shown quantitatively by the measured contrast transfer function plots for each of the images in Fig. 4. In Figs. 4(a) and 4(c), we find that the contrast falls below 20% only for spatial frequencies above 80 lp/mm (line pairs per mm), which corresponds well to the maximum theoretical resolution of 6.2 µm for a beam wavelength of 850 nm, a beam diameter of 3.4 mm, and an objective lens with a focal length of 20 mm. The insets distinctly show the lines in the third element of the sixth group of our resolution target, which has a line width of 6.2 µm. The lines of the same element are blurred beyond recognition in Fig. 4(b).

We have demonstrated our calibration method using two independent Lissajous patterns. However, our method is capable of accurately calibrating the positions of a collection of points in any arbitrary scanning trajectory. Moreover, the method can be applied to accurately calibrate a scanning imaging device used in conjunction with an SLO, such as an ophthalmic optical coherence tomography (OCT) imaging system. Remarkably, accurate calibration well below the optical resolution can be achieved without the need for a calibrated resolution target. Our calibration method makes use of a multiplicity of distances to compound the accurate position of the scanning trajectory. This diversity of measurements suppresses random, uncorrelated noise from the calibrated trajectory: assuming that the signal power is constant between subsequent measurements, averaging $N$ measurements of distance reduces their variance by a factor of $N$, thereby reducing the position noise level by a factor of $\sqrt N$. At the same time, the number of available distances, $N(N - 1)/2$, scales as ${N^2}$, imposing a constraint on computation time. The trade-off between sampling density and the extent of the raster-pattern field of view therefore provides a straightforward, flexible mechanism to tune accuracy while limiting computation cost.
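The $\sqrt N$ argument above can be checked with a quick numerical experiment (synthetic Gaussian measurement noise, arbitrary units): the standard deviation of an average of $N$ noisy measurements falls as $1/\sqrt N$.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, trials = 1.0, 20000

stds = []
for n in (1, 4, 16, 64):
    # Average n noisy measurements, repeated over many independent trials.
    means = rng.normal(scale=sigma, size=(trials, n)).mean(axis=1)
    stds.append(means.std())

# Each entry should be close to sigma / sqrt(n): 1.0, 0.5, 0.25, 0.125.
print([round(s, 3) for s in stds])
```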

In summary, we have presented a method for accurately calibrating the true trajectory of the beam in beam scanning imaging devices. We have successfully calibrated trajectories with accuracy greater than the optical resolution of the imaging system, as evidenced by spatial resolution measurements. Since we take advantage of a large collection of position calculations, we can effectively suppress the effect of random noise in the localization of trajectory points below our instrument detection levels. Our method provides a robust, noise-resistant technique for accurate localization, applicable in imaging devices with demanding spatial accuracy requirements, including the mutual alignment of concurrent multi-modal ophthalmic imaging systems.

Funding

Fundacja na rzecz Nauki Polskiej (POIR.04.04.00-00-2070/16-00).

Acknowledgment

The project “FreezEYE Tracker—ultrafast system for image stabilization in biomedical imaging” is carried out within the TEAM TECH program of the Foundation for Polish Science co-financed by the European Union under the European Regional Development Fund. The authors acknowledge Krzysztof Dalasiński, Maciej Nowakowski, Anna Szkulmowska, and Krystian Wróbel for insightful comments and technical support. The authors thank AM2M Ltd. L.P for providing the custom-made electronics for synchronous data acquisition in the optical system used in the experiments.

Disclosures

AM2M Ltd. L.P: MS (I).

Data Availability

Data underlying the results presented in this Letter may be obtained from the authors upon reasonable request.

REFERENCES

1. A. Manakov, H. Seidel, and I. Ihrke, in Vision, Modeling, and Visualization (VMV) (2011), pp. 207–214.

2. Z. Zhakypov, E. Golubovic, and A. Sabanovic, in IECON 2013-39th Annu. Conf. IEEE Ind. Electron. Soc. (2013), pp. 4138–4143.

3. S. Yang, L. Yang, G. Zhang, T. Wang, and X. Yang, Nanomanuf. Metrol. 1, 180 (2018). [CrossRef]  

4. L. Leybaert, A. de Meyer, C. Mabilde, and M. J. Sanderson, J. Microsc. 219, 133 (2005). [CrossRef]  

5. Z. Zhang, IEEE Trans. Pattern Anal. Mach. Intell. 22, 1330 (2000). [CrossRef]  

6. P. García-Gómez, S. Royo, N. Rodrigo, and J. R. Casas, Sensors 20, 2898 (2020). [CrossRef]  

7. Y. Chen, Y.-J. Hong, S. Makita, and Y. Yasuno, Biomed. Opt. Express 9, 1111 (2018). [CrossRef]  

8. M. M. Bartuzel, K. Wróbel, S. Tamborski, M. Meina, M. Nowakowski, K. Dalasiński, A. Szkulmowska, and M. Szkulmowski, Biomed. Opt. Express 11, 3164 (2020). [CrossRef]  

9. G. D. Evangelidis and E. Z. Psarakis, IEEE Trans. Pattern Anal. Mach. Intell. 30, 1858 (2008). [CrossRef]  

10. W. S. Torgerson, Psychometrika 30, 379 (1965). [CrossRef]  

11. T. Kohonen, Proc. IEEE 78, 1464 (1990). [CrossRef]  

12. G. Hinton and S. T. Roweis, in NIPS (2002), Vol. 15, p. 833.

13. L. Van der Maaten and G. Hinton, J. Mach. Learn. Res. 9, 2579 (2008).

14. G. T. F. de Abreu and G. Destino, in IEEE Wirel. Commun. Netw. Conf. (2007), pp. 4430–4434.

Supplementary Material (2)

Visualization 1: Image reconstruction using calibrated Lissajous scanning pattern. (a) L-frame from SLO1. (b) L-frame from SLO2. (c) Raster trajectory formed by the galvanometric scanner. (d) Image of USAF 1951 resolution target compounded from calibrated L-frames.
Visualization 2: Lissajous scanning pattern reconstruction using the distance between R-frames. (a) R-frame from SLO1. (b) R-frame from SLO2. (c) Calibrated Lissajous pattern from SLO1. (d) Calibrated Lissajous pattern from SLO2.
