Abstract
Large field-of-view (FOV) calibration is indispensable for ensuring the accuracy of vision measurement systems for large aviation components. We propose an improved separated-parameter calibration method for large-FOV binocular vision measurement with high flexibility and accuracy. First, the camera parameters are separately calibrated according to the sub-area features of the image. Subsequently, a stereoscopic calibration object is devised based on an analysis of the spatial calibration accuracy. The mean error of the proposed method is experimentally obtained as 0.13 mm for a FOV of 2.0 m × 1.5 m. Its feasibility and effectiveness for field measurement are validated by a workshop calibration.
© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
The quality of connection and assembly of large force-bearing composite components governs the characteristics of airplanes [1]. To improve assembly quality, it is important to monitor assembly conditions, which requires reconstructing the surfaces of the assembled components. Owing to its advantages, such as its non-contact nature, high measurement accuracy, and good stability, measurement based on binocular vision has been extensively applied to detect the size of large aviation components [2–4].
The high-accuracy calibration of binocular cameras significantly impacts the measurement results [5]. Conventional calibration methods aim to establish the correspondence between two-dimensional (2D) information and standard three-dimensional (3D) information by capturing images of a calibration object at different angles. The size of the calibration object should be approximately equal to the field of view (FOV) to ensure calibration accuracy [6,7]. For example, Zhang’s calibration method [7], which is widely used to accurately calibrate vision systems, acquires images of a 2D calibration object at 20-30 different positions for a FOV of approximately 170 mm × 250 mm. However, for a measurement space exceeding 1.5 m, large calibration objects with high accuracy are required, which are expensive to manufacture and difficult to operate in a workshop. Force-bearing aviation components made of composite materials exhibit features such as large geometrical size and wide variations in the depth of field (DOF). To measure the geometrical parameters of force-bearing components accurately and efficiently, it is important to develop an accurate calibration method for binocular vision measurement systems in industrial fields.
Several studies have focused on calibration methods for large FOVs [8–10]. M. Brückner et al. proposed a calibration method for multiple cameras that separated intrinsic and extrinsic parameters and utilized a robot arm [11]. The robot arm was employed to locate the spatial positions and poses of the cameras, which eliminated the dependence on calibration objects. However, the FOV was limited by the range of the robot arm, and the calibration accuracy was poor owing to the low positioning accuracy of the robot arm. Roviramás et al. designed a calibration object with uniformly distributed features based on Zhang’s calibration, and the baseline of the binocular cameras was utilized to optimize the calibration results [12]. This method accommodated a large FOV; however, placing and operating the designed calibration object was very difficult. Abedi et al. presented a method for group geometrical calibration of multi-camera imaging systems [13]. A pyramid with small triangular patterns of opposite colors was designed, and image rectification was performed based on an ideal circle line. This method is useful for circular multi-camera systems, but it cannot be applied to large-field calibration in industrial applications. Wei et al. presented a flexible calibration method for a binocular vision sensor using a planar target with several parallel lines [14]. The structural parameters of the binocular vision sensor were estimated according to the vanishing-feature constraints and spacing constraints of the parallel lines. This method can be used for measurements with a FOV of 300 mm × 300 mm. Yang et al. proposed a calibration method that utilized a coordinate measuring machine (CMM) to drive luminous target points, building virtual stereo targets [15]; it covered a large FOV of 1500 mm × 700 mm. However, this method is difficult to apply in industrial fields because it uses a CMM as the locating tool. Jia et al. proposed an improved camera calibration method based on perpendicularity compensation for binocular stereo vision systems [16]. The accuracy of this calibration method approached 99.91% in a large FOV. However, because a large controlling platform is required, this method is currently difficult to apply in industrial fields.
The existing large-FOV calibration methods are primarily based on two strategies. The first strategy simplifies the structure of the calibration object; however, the overall size of the object must remain large to cover the FOV, making it inconvenient to move in a workshop. The second strategy calibrates the vision measurement system according to spatial features such as parallel lines; however, the attainable calibration accuracy is limited. For the high-accuracy measurement of large aviation parts, measurement convenience and accuracy should be guaranteed simultaneously. To this end, we propose an improved separated-parameter calibration method for binocular vision measurement with a large FOV to facilitate the measurement of large components. According to the imaging mechanism, image distortion differs across spatial positions; therefore, the camera parameters are separately calibrated using the sub-area features of the images. Meanwhile, to accurately determine the initial values of the camera parameters, the measurement accuracy in the imaging space is examined to support the design of the calibration object.
The rest of the paper is organized as follows. Section 2 explains the measurement principle. Section 3 describes the proposed improved separated-parameter calibration method. Section 4 presents the design of the stereoscopic calibration object. Section 5 presents the laboratory and field experiments that verify the effectiveness of the proposed method. Section 6 concludes this paper.
2. Measurement principle
A schematic of binocular stereo vision system is shown in Fig. 1. The measured point P is captured synchronously by the binocular camera. Further, it is reconstructed based on triangulation. The transformation of measured points from 2D information to 3D information follows the perspective projection model, as expressed in Eq. (1) [17]. The world coordinate system ${O_w}{X_w}{Y_w}{Z_w}$ represents the actual 3D coordinates of the measured object in space. The camera coordinate system ${O_c}{X_c}{Y_c}{Z_c}$ indicates the inherent 3D coordinates in the camera. The image points are represented in the physical image coordinate system ${O_o}xy$ and the corresponding 2D pixel coordinate system ${O_p}uv$ on the 2D image.
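The perspective projection of Eq. (1) can be sketched as a minimal pinhole model. All numeric values below are illustrative only (they are not the system's calibration results), and the function name is introduced here for illustration:

```python
import numpy as np

def project_point(K, R, t, Xw):
    """Project a 3D world point to pixel coordinates via the pinhole model:
    s [u, v, 1]^T = K [R | t] [Xw, Yw, Zw, 1]^T, with s the depth Zc."""
    Xc = R @ Xw + t          # world -> camera coordinate system
    uvw = K @ Xc             # camera -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]  # perspective division (s = Zc)

# Illustrative intrinsics and pose
K = np.array([[2000.0, 0.0, 1024.0],
              [0.0, 2000.0, 768.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 1000.0])   # camera 1 m in front of the world origin
uv = project_point(K, R, t, np.array([100.0, 50.0, 0.0]))
```

With these numbers the point lands at pixel (1224, 868), i.e. the world offset scaled by f/Zc and shifted by the principal point.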
Owing to machining errors and optical-system assembly errors between the camera and lens, there are different types of nonlinear errors between the actual and ideal image points [18]. The main types of camera distortion are radial distortion, centrifugal (decentering) distortion, and thin-prism distortion. For the measurement of aviation parts, the FOV is large, and the effect of thin-prism and centrifugal distortions is much smaller than that of radial distortion [19].
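A minimal sketch of the dominant radial term, assuming the common two-coefficient model in normalized image coordinates (the coefficient values below are hypothetical, chosen only to show how the displacement grows toward the corners):

```python
import numpy as np

def radial_distort(x, y, k1, k2):
    """Two-term radial distortion in normalized image coordinates:
    x_d = x (1 + k1 r^2 + k2 r^4), and likewise for y."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

# Illustrative coefficients; displacement near the principal point vs. a corner
k1, k2 = -0.08, 0.015
cx, cy = radial_distort(0.05, 0.05, k1, k2)   # near the principal point
ex, ey = radial_distort(0.60, 0.45, k1, k2)   # near an image corner
center_shift = np.hypot(cx - 0.05, cy - 0.05)
corner_shift = np.hypot(ex - 0.60, ey - 0.45)
```

Because the displacement scales with r³ (and r⁵), the corner shift is orders of magnitude larger than the shift near the principal point, which motivates the corner-based distortion calibration described in Section 3.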
3. Improved separated-parameter calibration with a large FOV
Radial distortion increases gradually outward from the center of the image. Thus, the maximum distortion is observed at the four corners of the image, and there is minimal distortion near the principal point. For the field measurement of large-sized parts, it is difficult to manufacture large standard objects with high accuracy. Therefore, to ensure the accuracy of workshop measurements, the image acquisition area is partitioned effectively according to the imaging characteristics of the distortion distribution. The camera parameter matrix and distortion coefficients are separately calculated based on the local characteristics of the captured image to realize high-accuracy calibration in the workshop with a large FOV.
In this study, an improved separated-parameter calibration method is proposed for large-FOV measurement. A schematic of the calibration system is shown in Fig. 2. The initial values of the camera parameter matrix are quickly obtained using a small 3D calibration target placed in the central area of the FOV, where the distortion is small. In addition, 2D calibration targets of collinear points are set at the four corners of the image, where the distortion is largest, and the distortion coefficients are calculated based on the collinearity constraint. Finally, taking the minimum reprojection error as the objective function, the parameter matrix and distortion coefficients are optimized to calibrate the binocular camera. Using this method, the camera parameters are separately calibrated based on the local characteristics of the large FOV. This avoids the need to manufacture high-accuracy, large-sized calibration targets and increases the flexibility and reliability of the calibration method in large-field measurements.
3.1 Solution for initial intrinsic and extrinsic parameters
The initial camera parameters are calibrated using a 3D calibration object, which is shown in Fig. 3. The world coordinates of the center of each spatial calibration target point are defined as $({{X_w},{Y_w},{Z_w}} )$, which can be acquired using the high-accuracy 3D calibration object. The pixel coordinates of the centers of the target points are defined as $({u,v} )$. The initial parameter matrix of the camera is solved using a single image, as described below.
The perspective projection model containing the intrinsic and extrinsic parameter matrices is shown in Eq. (1). The specific solution is obtained as follows.
The projection matrix ${\textbf M} = \left[ {\begin{array}{cccc} {{m_{11}}}&{{m_{12}}}&{{m_{13}}}&{{m_{14}}}\\ {{m_{21}}}&{{m_{22}}}&{{m_{23}}}&{{m_{24}}}\\ {{m_{31}}}&{{m_{32}}}&{{m_{33}}}&{{m_{34}}} \end{array}} \right]$, $s = {z_i}$, and $({{x_i},{y_i},{z_i}} )$ are the coordinates of the point in the camera coordinate system. Equation (3) is constructed according to the coordinate transformation relationship:
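One standard way to solve for the entries of M from the 3D/2D correspondences is the direct linear transformation (DLT): each target point yields two homogeneous linear equations, and the stacked system is solved by SVD. The sketch below uses illustrative synthetic values, not the paper's calibration data:

```python
import numpy as np

def dlt_projection_matrix(world_pts, pixel_pts):
    """Estimate the 3x4 projection matrix M from at least six world/pixel
    correspondences; the smallest right singular vector of the stacked
    system gives M up to scale, fixed here by setting m34 = 1."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    M = Vt[-1].reshape(3, 4)
    return M / M[2, 3]          # fix the free scale so that m34 = 1

# Synthetic check with illustrative intrinsics/extrinsics
K = np.array([[2000.0, 0.0, 1024.0], [0.0, 2000.0, 768.0], [0.0, 0.0, 1.0]])
M_true = K @ np.hstack([np.eye(3), np.array([[0.0], [0.0], [1000.0]])])
world = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100],
                  [100, 100, 50], [50, 100, 100], [100, 50, 100], [30, 70, 20]],
                 dtype=float)
homog = np.hstack([world, np.ones((8, 1))])
proj = homog @ M_true.T
pixels = proj[:, :2] / proj[:, 2:]
M_est = dlt_projection_matrix(world, pixels)
```

Note that the world points must be non-coplanar for the system to have a unique (up to scale) solution, which is why a 3D calibration object suffices for a single-image solution.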
3.2 Initial solution of distortion coefficient
For large-FOV measurements, the imaging distortion of the camera is not negligible. According to the behavior of radial distortion, the most critical distortion regions are the four corners of the image. To improve the accuracy and reliability of the distortion coefficients, a large-FOV distortion-coefficient estimation method that exploits collinearity constraints in the four corner areas is proposed. As shown in Fig. 4, the image is divided into five regions: A, B, C, D, and E, where the radial distortion of region E is the smallest. The initial values of the camera calibration parameters can be calculated using target points in this region. The radial distortion of the remaining four regions near the edges of the image is relatively large. Four sets of collinear constraint points are arranged in the four corner regions, and four collinear points are selected in each region.
The linear cross ratio is defined as ${C_R}$, which can be expressed by Eq. (10) and Eq. (11). The world coordinates of the four collinear points are given as $({{X_a},{Y_a},{Z_a}} )$, $({{X_b},{Y_b},{Z_b}} )$, $({{X_c},{Y_c},{Z_c}} )$ and $({{X_d},{Y_d},{Z_d}} )$. The corresponding image coordinates are $({{x_a},{y_a}} )$, $({{x_b},{y_b}} )$, $({{x_c},{y_c}} )$ and $({{x_d},{y_d}} )$, respectively.
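Since Eqs. (10) and (11) are not reproduced here, the sketch below uses one common convention for the cross ratio; the key property exploited by the calibration is that the cross ratio of four collinear points is invariant under ideal (distortion-free) pinhole imaging, so any deviation in the image is attributable to lens distortion. The homography values are illustrative:

```python
import numpy as np

def cross_ratio(p1, p2, p3, p4):
    """Cross ratio of four collinear points, in one common convention:
    CR = (|P1P3| / |P2P3|) / (|P1P4| / |P2P4|)."""
    d = lambda a, b: float(np.linalg.norm(np.asarray(b, float) - np.asarray(a, float)))
    return (d(p1, p3) / d(p2, p3)) / (d(p1, p4) / d(p2, p4))

# Four collinear points on the line y = 2x + 1 (parameters t = 0, 1, 2, 4)
pts = [np.array([t, 2.0 * t + 1.0]) for t in (0.0, 1.0, 2.0, 4.0)]

# An arbitrary (mild) homography standing in for ideal pinhole imaging
H = np.array([[1.2, 0.1, 5.0], [0.0, 0.9, -2.0], [1e-4, 2e-4, 1.0]])
def warp(p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

cr_before = cross_ratio(*pts)                    # 1.5 for this configuration
cr_after = cross_ratio(*[warp(p) for p in pts])  # unchanged by the homography
```

Radial distortion is not a homography, so it breaks this invariance; the distortion coefficients can therefore be estimated by requiring that undistorted image points restore the known world-space cross ratio [20].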
3.3 Parameter optimization
To further improve the calibration accuracy, the parameter matrix and distortion coefficients should be optimized. Based on the principle of the minimization of reprojection error, the nonlinear global optimization objective function is established as follows:
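The objective function itself is not reproduced above; a minimal refinement sketch, assuming a two-coefficient radial model and a single camera view, can be written with SciPy's Levenberg-Marquardt solver. All numeric values are synthetic and illustrative:

```python
import numpy as np
from scipy.optimize import least_squares

def rodrigues(r):
    """Rotation matrix from an axis-angle (Rodrigues) vector."""
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)
    k = r / theta
    K = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def residuals(p, world_pts, pixel_pts):
    """Stacked reprojection residuals for the parameter vector
    p = (fx, fy, u0, v0, k1, k2, rvec, t)."""
    fx, fy, u0, v0, k1, k2 = p[:6]
    R, t = rodrigues(p[6:9]), p[9:12]
    Xc = world_pts @ R.T + t                      # world -> camera frame
    x, y = Xc[:, 0] / Xc[:, 2], Xc[:, 1] / Xc[:, 2]
    r2 = x ** 2 + y ** 2
    s = 1.0 + k1 * r2 + k2 * r2 ** 2              # radial distortion factor
    uv = np.column_stack([fx * x * s + u0, fy * y * s + v0])
    return (uv - pixel_pts).ravel()

# Synthetic check: recover known parameters from noise-free observations
rng = np.random.default_rng(0)
true = np.array([2000.0, 2000.0, 1024.0, 768.0, -0.05, 0.01,
                 0.01, -0.02, 0.03, 5.0, -3.0, 1500.0])
world = rng.uniform([-300, -300, -100], [300, 300, 100], (30, 3))
pixels = residuals(true, world, np.zeros((30, 2))).reshape(-1, 2)
init = true + np.array([30, -30, 10, -10, 0.01, -0.005,
                        0.003, 0.003, -0.003, 1.0, 1.0, 10.0])
fit = least_squares(residuals, init, args=(world, pixels), method='lm')
```

Starting from the DLT-based initial values keeps the nonlinear problem well conditioned, so the LM iteration converges to the global minimum rather than a spurious local one.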
4. Design of stereoscopic calibration object
To ensure the accuracy and stability of the calibration, the 3D target points should exhibit high spatial-position accuracy and repetition accuracy. Meanwhile, the target points should not be occluded, so that the binocular camera can capture all effective target points. In addition, to improve the calibration speed and accuracy, the target points captured by the left and right cameras should exhibit stable features so that they can be matched quickly without confusion arising from different image-acquisition angles. To fulfill these requirements, a stereoscopic calibration object was designed in this study. As the number and spatial distribution of the target points govern the reliability of the calibration object, the number of target points, the DOF, and the spatial distribution on the calibration target were determined by performing accuracy tests. Finally, the calibration object with 3D target points was designed based on the results of this analysis.
4.1 Analysis of structural parameter for stereoscopic calibration object
To determine the number and distribution of target points, the influence of these structural parameters on the calibration accuracy is discussed in this section. As shown in Fig. 5(a), a calibration control field is established using a laser tracker (Leica AT960, measurement error < ± (15 µm + 6 µm/m)) in the measurement space. The FOV, DOF, front DOF, and back DOF are 640 mm × 480 mm, 250 mm, 120 mm, and 130 mm, respectively.
The 3D coordinates of the target points are obtained using the laser tracker. To ensure the consistency of coordinate collection, the 2D coordinates of the target points are obtained using visual target balls (radius: $19.05_{\textrm{ - }0.0127}^{\textrm{ + }0.0000}$ mm, position accuracy of the retro-reflective dot at the center of the sphere: 12.7 µm), which are similar in size to the target balls used by the laser tracker (radius: 19.05 mm ± 2.54 µm, centering of optics: < ±3.05 µm, sphericity: ≤3.05 µm). To verify the effect of the number and distribution of target points on the calibration, the measurement space is divided into three calibration planes: the focal plane, front-depth plane, and back-depth plane. Each calibration plane is arranged into 8 rows and 7 columns. The 3D and 2D coordinates of the target points were obtained as follows. (1) The 2D coordinates of the seven target points in one row were synchronously captured by the binocular vision system. Each visual target was then replaced with a laser tracker target, and the corresponding 3D coordinates of these seven points were measured by the laser tracker. The relative positions of these targets were determined by seven drift nests that were evenly distributed in one row and fixed with hot-melt adhesive. The measurement system is shown in Fig. 5. (2) Each row of each plane was then measured separately. The binocular vision system captured twenty-four images to obtain the 2D coordinates of the target points, and the laser tracker performed 168 measurements to obtain their 3D coordinates. In total, the three calibration planes contain 168 target points, which constitute a complete calibration field, as shown in Fig. 6(a). Furthermore, the binocular vision system was calibrated by selecting different target points in this measurement space to analyze the effect of the number and distribution of target points on the calibration accuracy.
To ensure the reliability of the analysis, the lengths of two one-dimensional (1D) standard rulers with different lengths (475.0201 mm and 350.0156 mm) were measured at 10 different positions to determine the measurement accuracy. The location distribution of the target ruler is shown in Fig. 6(b). The measurement of each standard ruler was repeated three times. The mean value of the absolute errors of the distance measurements of the standard rulers is defined as the measurement error, which is used for the qualitative analysis of calibration accuracy.
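The error metric defined above can be sketched directly; the sample lengths below are illustrative stand-ins, not the experimental data:

```python
import numpy as np

def ruler_measurement_error(reconstructed_lengths, nominal_length):
    """Mean absolute error between reconstructed ruler lengths and the
    nominal (laser-tracker-calibrated) length, as defined in the text."""
    r = np.asarray(reconstructed_lengths, float)
    return float(np.mean(np.abs(r - nominal_length)))

# Illustrative values: two repeat measurements of a 475.02 mm nominal length
err = ruler_measurement_error([475.05, 474.99], 475.02)
```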
Under the condition of a large FOV, the number of target points for calibration, image-depth distribution, and spatial distribution significantly influence the calibration results. We analyze the calibration accuracy by considering these three factors.
- (1) Number of points: we selected 6, 10, 20, 30, 40, 50, 60, 70, 80, 90, and 100 points to calibrate the cameras and examine the impact of the number of points on the calibration accuracy. The calibration method requires at least six sets of world coordinates of target points with their corresponding pixel coordinates; thus, the minimum number of target points is six. The selected target points should cover the calibration field as much as possible and be evenly distributed within the measurement space. The target points were randomly selected three times to calibrate the binocular system. The absolute measurement errors of the binocular vision system calibrated with different numbers of target points are shown in Fig. 7(a). When the number of target points reaches 20-30, the calibration error decreases significantly, and when it reaches 60, the measurement result becomes stable. Owing to the perturbation introduced by random selection, the calibration error increases slightly when the number of points reaches 40-50; however, with more than 20 target points, the measurement errors are relatively stable. Therefore, to ensure both measurement accuracy and computational efficiency, selecting approximately 20-30 target points is sufficient for the initial calibration of the camera parameters.
- (2) Image-depth distribution: the effect of the DOF on the calibration accuracy was analyzed using the target points of different calibration planes and the whole set of target points in the calibration field. Thirty target points on each calibration plane were randomly selected to calibrate the binocular cameras. The measurement errors are shown in Fig. 7(b). The accuracy and stability of the measurement results obtained from calibration in the focal plane are better than those in the other calibration planes (random planes 1 and 2). The calibration accuracy and stability are highest when the entire field is covered. Therefore, a 3D calibration reference is more advantageous than a 2D calibration plane.
- (3) Spatial distribution: according to the radial distortion model, the image distortion varies at different positions in the FOV, and the locations of the calibration target points in the measured space affect the calibration results. The established calibration field is divided into sub-areas (Zones A, B, C, D, and E), as shown in Fig. 6(a). In each region, 30 target points are selected to calibrate the camera and determine the calibration accuracy for different distributions. The measurement errors are shown in Fig. 7(c). The measurement accuracy of calibration in the middle of the FOV (Zone E) is higher than that in the four corners. Thus, the camera parameters can be preliminarily calculated from calibration results in the middle of the FOV, and the distortion parameters can be assessed by calibration in the four corners. Fig. 7(c) thereby verifies the feasibility and rationality of the proposed separation of camera parameters and distortion regions.
4.2 Design of the calibration object and coordinate extraction
According to the above qualitative analysis of the calibration accuracy in the measurement space, the structural design of the stereoscopic calibration object should meet the following requirements: 1) the characteristic elements should be distributed in 3D space to ensure global measurement accuracy; 2) the number of characteristic elements should be approximately 30, which ensures near-optimal calibration accuracy; 3) to ensure the accuracy of the initial calibration parameters, the size of the device should be approximately equal to that of the central region of the calibration FOV; 4) for convenience and reliability, the stereoscopic calibration object should have a stable design and be easy to operate.
A suitable feature element is the key to ensuring the coordinate-acquisition accuracy of the calibration object. A silicon carbide ceramic ball has many advantages, such as high accuracy, light weight, high strength, a low coefficient of thermal expansion, and a non-magnetic nature, and it can be used in conjunction with metal elements. Therefore, 25 standard silicon carbide ceramic balls were utilized as the feature elements of the stereoscopic calibration object. The centers of these ceramic balls were defined as the target points for camera calibration. According to the structural requirements, the stereoscopic calibration object was composed of standard ceramic balls, adapters, support rods, the bases of the ceramic balls, and flush bolts, as shown in Fig. 8. To avoid the occlusion of calibration features and to ensure adequate spatial calibration information, a 5 × 5 ladder spatial structure was adopted. The size of the calibration field surrounded by the ceramic balls was 320 mm × 360 mm, and the calibration depth was 320 mm. All these specifications met the calibration requirements of the central FOV. The ceramic balls and the supporting rods were connected through detachable adapters, and each adapter and supporting rod were connected by a precision screw, which could be easily assembled and dismantled. This ensured the repetitive positioning accuracy of the device. The supporting rods and the base were connected through countersunk head bolts to ensure the structural stability of the calibration object.
When the stereoscopic calibration object was fixed, the central positions of the 25 ceramic balls were relatively fixed. As shown in Fig. 9(a), the 3D coordinates of each spherical center are fitted by measuring four contact points of the ceramic ball using a coordinate measuring machine (Zeiss Prismo Navigator, measurement error < 2 µm). Thus, the 3D coordinates of the center of each ceramic ball $({{X_w},{Y_w},{Z_w}} )$ are accurately obtained. The image of the space calibration object captured by the camera is preprocessed, and the sub-pixel central coordinates of the ceramic balls are obtained using the ellipse fitting method. Thus, the image coordinates $({u,v} )$ of the target points are acquired, as shown in Fig. 9(b) and Fig. 9(c).
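The ellipse-fitting step can be sketched as a direct least-squares conic fit: boundary points of the imaged ball are fitted to a general conic, whose center gives the sub-pixel target coordinates. This is a minimal stand-in for the paper's (unspecified) fitting procedure, demonstrated on synthetically sampled ellipse points:

```python
import numpy as np

def conic_center(xs, ys):
    """Fit a general conic a x^2 + b x y + c y^2 + d x + e y + f = 0 to
    boundary points (least squares via SVD) and return its center."""
    A = np.column_stack([xs**2, xs * ys, ys**2, xs, ys, np.ones_like(xs)])
    _, _, Vt = np.linalg.svd(A)
    a, b, c, d, e, f = Vt[-1]
    # The center solves the gradient system [2a b; b 2c][x0 y0]^T = [-d -e]^T
    M = np.array([[2 * a, b], [b, 2 * c]])
    return np.linalg.solve(M, [-d, -e])

# Synthetic boundary: ellipse centered at (3, 2), semi-axes 4 and 2, rotated 30 deg
phi = np.pi / 6
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
xs = 3.0 + 4.0 * np.cos(t) * np.cos(phi) - 2.0 * np.sin(t) * np.sin(phi)
ys = 2.0 + 4.0 * np.cos(t) * np.sin(phi) + 2.0 * np.sin(t) * np.cos(phi)
center = conic_center(xs, ys)   # recovers approximately (3, 2)
```

In practice the boundary points would come from sub-pixel edge detection on the preprocessed image; the conic-center step itself is then a small linear solve per ball.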
4.3 Reliability verification of stereoscopic calibration object
For measurements in industrial fields, aviation parts have complex features, and the image-acquisition locations are not fixed. Thus, we established a binocular measurement system to verify the reliability of the designed calibration object. Five images of the calibration object were captured at five random positions in the central region of the field, as shown in Fig. 10. The centers of the target balls of the calibration object were extracted, and the distances of the balls from the center of the central ball were reconstructed; these were set as the evaluation criterion for repeatability errors. Twenty-four distances were acquired, as shown in Table 1.
The experimental results demonstrate that the recognition rate of the centers of the standard balls is 100%. The reconstructed distances at position 1 are set as the reference values, and the differences between the reconstructed distances at positions 2-5 and the reference distances are defined as the absolute errors. The average absolute error of the reconstructed distances is 0.0447 mm, and the average relative error is 0.0323%. The repetitive accuracy is up to 99.97%.
5. Experimental analysis
5.1 Experimental validation of calibration accuracy
To validate the feasibility of the proposed method, a binocular vision measurement system was built in the laboratory. As shown in Fig. 11, the measurement system consisted of two high-resolution cameras (VC-12MC, resolution 4096 × 3072, Vieworks, Korea) with a FOV of 2.0 m × 1.5 m and a nominal focal length of 20 mm. The calibration object was arranged in the center of the FOV, while the collinear points were arranged in the four corners. The 3D coordinates of the target points had been calibrated precisely in advance, and the relative positions of the stereo calibration object and the collinear points remained unchanged during the calibration process. Following the method proposed in Section 3, the calibration parameters of the binocular camera are shown in Table 2. The radial distortion coefficients of the image were calculated from the four respective sets of collinear points, and each position was measured three times to ensure the reliability of the distortion coefficients.
The focal lengths of the cameras can be obtained from Table 2 as follows:
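The conversion itself (Table 2 values are not reproduced here) multiplies the pixel-unit focal length fx by the sensor pixel pitch. The fx value below is illustrative, and the 5.5 µm pitch is an assumption for the VC-12MC's sensor:

```python
# Illustrative fx in pixels from an intrinsic matrix (not the Table 2 value)
fx_pixels = 3650.0
pixel_pitch_mm = 0.0055        # assumed 5.5 um pixel pitch (assumption)
f_mm = fx_pixels * pixel_pitch_mm   # physical focal length in millimeters
```

With these numbers, f_mm comes out near the 20 mm nominal focal length, which serves as a sanity check on the calibrated intrinsics.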
Table 3 shows that the maximum measurement error of the proposed vision-measurement system is 0.225 mm, while the FOV is 2.0 m × 1.5 m. Further, the minimum measurement error is 0.052 mm, and the average measurement error is 0.115 mm. The maximum relative error is 0.028%, and the average relative error is 0.019%. The proposed method fulfills requirements for large-FOV measurement. Moreover, only one image of the calibration object is captured in this method to realize high accuracy camera calibration, which greatly improves the measurement efficiency.
5.2 Field validation of calibration accuracy
To validate the feasibility of the proposed method for workshop calibration, the same binocular vision-measurement system with a FOV of 2.0 m × 1.5 m was built in an aviation assembly shop. The system is shown in Fig. 13. The initial parameter matrices of the binocular cameras are shown in Table 4. The camera calibration parameters after global optimization are shown in Table 5.
The focal lengths of the cameras can be obtained from Table 4 as follows:
Considering the measurement results obtained by the laser tracker as the reference, the absolute measurement errors of binocular vision system in the three coordinate directions are shown in Fig. 14.
Table 6 and Fig. 14 show that the maximum measurement error of the proposed vision-measurement system is 0.35 mm in a single coordinate direction, while the FOV is 2.0 m × 1.5 m. Further, the minimum measurement error is 0.01 mm, and the average measurement error is 0.130 mm. Approximately 97.92% of the coordinate measurement errors are less than 0.3 mm. In particular, the average error in the z direction is just 0.116 mm. This can effectively improve the accuracy of the traditional calibration methods in the direction of DOF as well as the global measurement accuracy for large FOV. Further, this fulfills the measurement requirements for large aviation parts with complex environment, and the feasibility of the camera calibration method is verified simultaneously.
The results of the proposed separated-parameter calibration method and Zhang’s calibration method (calibration object size: 600 mm × 800 mm; fifteen images of the calibration object placed at different positions in the measurement space were captured by the left and right cameras) [7] are compared in Table 7. The standard ruler was commercially obtained (Brunson 803-MCP, tube length tolerance ± 0.003 mm); its length was 600.0197 mm. The standard ruler was placed at 10 distinct positions according to the distribution diagram in Fig. 5(b). Compared with Zhang's calibration method, the average absolute error of the proposed method decreased from 0.3487 mm to 0.1103 mm, and the relative error decreased from 0.058% to 0.018%.
Moreover, several key points were placed on the surface of an aviation part and measured by the binocular vision system with the FOV of 2.0 m × 1.5 m. The system was calibrated by the proposed separated-parameter calibration method (SPCM) and by Zhang’s calibration method (ZCM), respectively. The distribution of these key points is shown in Fig. 15(a). The 3D coordinates of the points measured by the laser tracker (Leica AT901, measurement error < ± (15 µm + 6 µm/m)) are defined as the standard coordinates. The distances from points 2-9 to point 1 were taken as the evaluation criterion. The absolute and relative errors of the distances measured by the calibrated binocular vision system are shown in Fig. 15(b).
The experimental results demonstrate that the maximum relative error of the proposed separated-parameter calibration method is 0.028%, the minimum relative error is 0.001%, and the average relative error is 0.012%. For Zhang’s calibration method, the maximum relative error is 0.273%, the minimum is 0.051%, and the average is 0.136%. Compared with Zhang's calibration method, the proposed method achieves higher calibration accuracy for field measurement. Moreover, the proposed method can calibrate a binocular vision system with a large FOV by capturing only one image of the calibration object, which greatly improves calibration efficiency and accuracy, especially for field measurement with a large FOV.
6. Conclusions
We proposed an improved calibration method based on parameter separation, which is primarily intended for calibrating a binocular vision system with a large FOV in complex industrial environments. The space to be measured was partitioned according to the imaging characteristics of radial distortion. The initial parameter matrix was obtained using the designed calibration target, which was placed in the center of the field, where the distortion is smallest. The initial distortion coefficients were calculated from four groups of collinear points placed in the corners of the field, where the distortion is largest. The calibration parameters were then optimized by minimizing the reprojection error using the Levenberg-Marquardt (LM) method. The proposed method can quickly calibrate binocular cameras with a large FOV using only one image. The influence of the number and distribution of target points on the calibration accuracy was qualitatively assessed, and based on this analysis, a stereoscopic calibration object with standard ceramic target balls was designed. The accuracy of the measuring method was verified in the laboratory and in an aviation measurement field with a size of 2.0 m × 1.5 m, where the average error of a measurement point was 0.115 mm and 0.130 mm, respectively. The measurement accuracy of the proposed method is significantly improved compared with that of the traditional calibration method, and the proposed method meets the requirements of field measurement in terms of both speed and accuracy. Therefore, we believe that the proposed method exhibits immense potential for aviation measurement over large fields. In future research, the distortion model could be further improved. Moreover, accuracy evaluation in the application field should be further studied to improve the accuracy and reliability of field measurements.
Funding
Key Technologies Research and Development Program (2018YFA0703304); National Natural Science Foundation of China (51905077); China Postdoctoral Science Foundation (2019M651110); Liaoning Revitalization Talents Program (XLYC1801008, XLYC1807086).
Disclosures
The authors declare no conflicts of interest.
References
1. F. Mas, J. Ríos, J. L. Menéndez, and A. Gómez, “A process-oriented approach to modeling the conceptual design of aircraft assembly lines,” Int. J. Adv. Manuf. Technol. 67(1-4), 771–784 (2013). [CrossRef]
2. C. Li, C. Zhou, C. Miao, Y. Yan, and J. Yu, “Binocular vision profilometry for large-sized rough optical elements using binarized band-limited pseudo-random patterns,” Opt. Express 27(8), 10890–10899 (2019). [CrossRef]
3. G. Xu, L. Sun, X. Li, J. Su, Z. Hao, and X. Lu, “Global calibration and equation reconstruction methods of a three dimensional curve generated from a laser plane in vision measurement,” Opt. Express 22(18), 22043–22055 (2014). [CrossRef]
4. Z. Liu, X. Li, F. Li, and G. Zhang, “Flexible dynamic measurement method of three-dimensional surface profilometry based on multiple vision sensors,” Opt. Express 23(1), 384–400 (2015). [CrossRef]
5. Y. I. Abdel-Aziz, H. M. Karara, and M. Hauck, “Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry,” Photogramm. Eng. Rem. S. 81(2), 103–107 (2015). [CrossRef]
6. M. E. Loaiza, A. B. Raposo, and M. Gattass, “Multi-camera calibration based on an invariant pattern,” Comput. Graph. 35(2), 198–207 (2011). [CrossRef]
7. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). [CrossRef]
8. Z. Wei, W. Zou, G. Zhang, and K. Zhao, “Extrinsic parameters calibration of multi-camera with non-overlapping fields of view using laser scanning,” Opt. Express 27(12), 16719–16737 (2019). [CrossRef]
9. X. Pan and Z. Liu, “High-accuracy calibration of line-structured light vision sensor by correction of image deviation,” Opt. Express 27(4), 4364–4385 (2019). [CrossRef]
10. O. Burggraaff, N. Schmidt, J. Zamorano, K. Pauly, S. Pascual, C. Tapia, E. Spyrakos, and F. Snik, “Standardized spectral and radiometric calibration of consumer cameras,” Opt. Express 27(14), 19075–19101 (2019). [CrossRef]
11. M. Brückner, F. Bajramovic, and J. Denzler, “Intrinsic and extrinsic active self-calibration of multi-camera systems,” Mach. Vision Appl. 25(2), 389–403 (2014). [CrossRef]
12. F. Roviramás, Q. Wang, and Q. Zhang, “Design parameters for adjusting the visual field of binocular stereo cameras,” Biosyst. Eng. 105(1), 59–70 (2010). [CrossRef]
13. F. Abedi, Y. Yang, and Q. Liu, “Group geometric calibration and rectification for circular multi-camera imaging system,” Opt. Express 26(23), 30596–30613 (2018). [CrossRef]
14. Z. Wei and X. Liu, “Vanishing feature constraints calibration method for binocular vision sensor,” Opt. Express 23(15), 18897–18914 (2015). [CrossRef]
15. B. Yang, L. Zhang, N. Ye, X. Feng, and T. Li, “Camera calibration technique of wide-area vision measurement,” Acta Opt. Sin. 32(9), 0915001 (2012). [CrossRef]
16. Z. Jia, J. Yang, W. Liu, F. Wang, Y. Liu, L. Wang, C. Fan, and K. Zhao, “Improved camera calibration method based on perpendicularity compensation for binocular stereo vision measurement system,” Opt. Express 23(12), 15205–15223 (2015). [CrossRef]
17. D. Herrera, J. Kannala, and J. Heikkilä, “Joint depth and color camera calibration with distortion correction,” IEEE Trans. Pattern Anal. Mach. Intell. 34(10), 2058–2064 (2012). [CrossRef]
18. L. Ma, Y. Q. Chen, and K. L. Moore, “Analytical piecewise radial distortion model for precision camera calibration,” IEE Proc.-Vis. Image Signal Process. 153(4), 468–474 (2006). [CrossRef]
19. M. Zhang, L. Jin, G. Li, Y. Wu, and S. Han, “Camera distortion calibration method based on straight line characteristics,” Acta Opt. Sin. 35(6), 0615001 (2015). [CrossRef]
20. G. Zhang, J. He, and X. Yang, “Calibrating camera radial distortion with cross-ratio invariability,” Opt. Laser Technol. 35(6), 457–461 (2003). [CrossRef]