Abstract

Planar-target-based calibration of structured-light vision sensors fails to produce reliable and accurate results in outdoor environments with complex lighting because the feature points cannot be located accurately. To address this issue, a novel high-accuracy calibration method that corrects image deviation is proposed in this paper for the line-structured light vision sensor. A mathematical solution for the location uncertainty of stripe points is derived, and the location uncertainty of both target feature points and stripe points is established. The location deviation of all points is then computed through a large-scale nonlinear multi-step optimization constrained by these uncertainties. After compensation, the planar-target-based calibration method is applied to solve the light-plane equation. Simulated and physical experiments were carried out to evaluate the proposed method. The results show that it remains robust under large feature-point localization deviations and achieves the same measurement accuracy as planar-target calibration under ideal imaging conditions, which relaxes the requirements on the calibration environment and gives the method substantial practical engineering value.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement






Figures (13)

Fig. 1 Calibration schematic of the structured light vision sensor.
Fig. 2 Location uncertainty of target feature points and stripe points. (a) Line-structured light calibration image; (b) target feature point; (c) stripe point; (d) grayscale distribution of the target feature point; (e) grayscale distribution of the stripe point; (f) location uncertainty of the target feature point; (g) location uncertainty of the stripe point.
Fig. 3 Calibration accuracy under different noise levels of the target feature points. (a)-(d) RMS errors of the light-plane parameters a, b, c, and d, respectively.
Fig. 4 Calibration accuracy under different noise levels of the stripe points. (a)-(d) RMS errors of the light-plane parameters a, b, c, and d, respectively.
Fig. 5 On-site structured light vision sensor and calibration target. (a) Line-structured light vision sensor of the wheel-size measurement system installed on an outdoor railway; (b) metal planar target with embedded LEDs.
Fig. 6 Ideal images captured in a shelter.
Fig. 7 Structured light calibration images captured outdoors under complex lighting conditions. Three situations are shown: the first row shows uneven stripe width, the second row low brightness of the target feature points, and the third row reflections from the target.
Fig. 8 Correction of feature-point location deviations under complex lighting conditions. Green crosses are feature points before correction, red crosses after correction; green asterisks are stripe points before correction, red asterisks after correction. (a) Target feature points under reflective conditions; (b) target and stripe points under strong reflection and over-exposure; (c) target and stripe points where the stripe intersects the feature points; (d) target and stripe points under low and uneven brightness; (e) target and stripe points under stripe over-exposure and stray-light interference; (f) target and stripe points against a high-brightness background.
Fig. 9 Two-cylinder target used for measurement accuracy evaluation.
Fig. 10 Distributions of the light-plane parameters a, b, c, and d ((a)-(d), respectively) obtained with the four calibration methods.
Fig. 11 Online dynamic wheel-size measurement on an outdoor railway line.
Fig. 12 Wheel cross profiles obtained with different calibration methods. (a) Freight-train wheel; (b) passenger-train wheel.
Fig. 13 Online dynamic wheel-size measurement results. (a) Tread-wear measurements; (b) flange-thickness measurements.

Tables (5)

Table 1 Deviation of Target Feature Points Before and After Correction
Table 2 Deviation of Stripe Points Before and After Correction
Table 3 Line-Structured Light Vision Sensor Parameters Calibrated by Different Methods
Table 4 Distance Results of Calibration Precision via Different Methods (mm)
Table 5 Cylinder Radius Results of Calibration Accuracy via Different Methods (mm)

Equations (21)


(1) $f[(t n_u + u_0),\,(t n_v + v_0)] = g(u_0, v_0) + t n_u g_u(u_0, v_0) + t n_v g_v(u_0, v_0) + \tfrac{1}{2} t^2 n_u^2 g_{uu}(u_0, v_0) + t^2 n_u n_v g_{uv}(u_0, v_0) + \tfrac{1}{2} t^2 n_v^2 g_{vv}(u_0, v_0).$

(2) $t = -\dfrac{n_u g_u + n_v g_v}{n_u^2 g_{uu} + 2 n_u n_v g_{uv} + n_v^2 g_{vv}}.$
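
To make Eqs. (1)-(2) concrete, the sketch below evaluates the sub-pixel stripe center at a single pixel from Gaussian-derivative responses, taking the normal direction (n_u, n_v) as the Hessian's dominant eigenvector as in Steger's detector. The function name, the NumPy usage, and the sample derivative values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def steger_subpixel_point(g_u, g_v, g_uu, g_uv, g_vv, u0, v0):
    """Sub-pixel stripe-center estimate at pixel (u0, v0) from Gaussian
    derivatives of the image, per Eqs. (1)-(2)."""
    # Normal direction (n_u, n_v): eigenvector of the Hessian whose
    # eigenvalue has the largest absolute value.
    H = np.array([[g_uu, g_uv], [g_uv, g_vv]])
    eigvals, eigvecs = np.linalg.eigh(H)
    n_u, n_v = eigvecs[:, np.argmax(np.abs(eigvals))]
    # Offset along the normal that zeros the first directional derivative, Eq. (2).
    t = -(n_u * g_u + n_v * g_v) / (n_u**2 * g_uu + 2 * n_u * n_v * g_uv + n_v**2 * g_vv)
    # Accept the point only if the extremum lies inside the current pixel.
    if abs(t * n_u) <= 0.5 and abs(t * n_v) <= 0.5:
        return u0 + t * n_u, v0 + t * n_v
    return None

# Example with hand-picked derivative values at one pixel.
print(steger_subpixel_point(g_u=2.0, g_v=0.5, g_uu=-8.0, g_uv=0.3, g_vv=-1.0, u0=120, v0=45))
```

In practice the derivatives g_u through g_vv come from convolving the image with Gaussian-derivative kernels whose scale matches the stripe width.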
(3) $\dot{F}_{\hat{h}}(c_0, r_0) = \dot{F}_{\hat{I}}(c_0, r_0) + \dot{F}_{\hat{N}}(c_0, r_0) = 0,$

(4) $\dot{F}_{\hat{h}}(c_0, r = 0) = \dot{F}_{\hat{I}}(0,0) + \dot{F}_{\hat{N}}(0,0) + \big(\ddot{F}_{\hat{I}}(0,0) + \ddot{F}_{\hat{N}}(0,0)\big) c_0 + O(c_0) = 0.$

(5) $c_0 \approx -\dot{F}_{\hat{N}}(0,0) \,/\, \ddot{F}_{\hat{I}}(0,0).$

(6) $\hat{\sigma}_N^2 = \dfrac{\sigma_N^2}{8\pi \sigma_g^4}.$

(7) $\sigma_{c_0}^2 = \dfrac{\hat{\sigma}_N^2}{\ddot{F}_{\hat{I}}(0,0)^2}.$

(8) $\ddot{F}_{\hat{I}}(0,0) = \dfrac{M \sigma_w}{(\sigma_g^2 + \sigma_w^2)^{3/2}}.$

(9) $\sigma_{c_0}^2 = \dfrac{(\sigma_w^2 + \sigma_g^2)^3}{8\pi \sigma_g^4 \sigma_w^2}\left(\dfrac{\sigma_N^2}{M^2}\right).$
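
Eq. (9) can be evaluated directly: given the image noise level σ_N, the Gaussian smoothing scale σ_g, the stripe width parameter σ_w, and the stripe amplitude M, the standard deviation of the detected stripe center follows immediately. The sketch below is a minimal numerical check with illustrative parameter values.

```python
import math

def stripe_center_std(sigma_N, sigma_g, sigma_w, M):
    """Location uncertainty (standard deviation) of a stripe point per Eq. (9)."""
    var = (sigma_w**2 + sigma_g**2) ** 3 / (8 * math.pi * sigma_g**4 * sigma_w**2) * (sigma_N**2 / M**2)
    return math.sqrt(var)

# Illustrative numbers: 8-bit image noise of ~2 gray levels, stripe amplitude ~200.
print(stripe_center_std(sigma_N=2.0, sigma_g=1.5, sigma_w=2.0, M=200.0))
```

Presumably this uncertainty (together with the corresponding value for the target feature points) supplies the bounds U in the constraint of Eq. (19).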
(10) $\rho\, p_u = K\,[\,r_1 \;\; r_2 \;\; t\,]\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = H_{3\times 3}\begin{bmatrix} x \\ y \\ 1 \end{bmatrix},$

(11) $\begin{cases} u_d = u_u + (u_u - u_0)(k_1 r^2 + k_2 r^4) \\ v_d = v_u + (v_u - v_0)(k_1 r^2 + k_2 r^4) \end{cases}$

(12) $p_n = \begin{bmatrix} u_n \\ v_n \\ 1 \end{bmatrix} = \begin{bmatrix} u_d + \Delta u \\ v_d + \Delta v \\ 1 \end{bmatrix}.$
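
As a small worked example of Eqs. (11)-(12), the sketch below applies the two-coefficient radial distortion model and then forms the observed point as the distorted point plus a deviation (Δu, Δv). The distortion coefficients, principal point, and deviation values are made-up sample numbers, not calibrated ones.

```python
import numpy as np

def distort(u_u, v_u, u0, v0, k1, k2):
    """Map undistorted pixel coordinates to distorted ones, Eq. (11)."""
    r2 = (u_u - u0) ** 2 + (v_u - v0) ** 2
    factor = k1 * r2 + k2 * r2**2          # k1*r^2 + k2*r^4
    u_d = u_u + (u_u - u0) * factor
    v_d = v_u + (v_u - v0) * factor
    return u_d, v_d

# Observed (noisy) point = distorted point plus the deviation (Δu, Δv) of Eq. (12).
u_d, v_d = distort(u_u=700.0, v_u=520.0, u0=640.0, v0=512.0, k1=-2e-7, k2=1e-13)
du, dv = 0.15, -0.08                       # example deviations to be estimated
p_n = np.array([u_d + du, v_d + dv, 1.0])
print(p_n)
```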
(13) $e_1 = \min\Big\{ \sum_{i=1}^{M}\sum_{j=1}^{N} D\big(p_{ij},\, p_{n(ij)}\big) + \Big|\sum_{i=1}^{M}\sum_{j=1}^{N} p_{ij} - \sum_{i=1}^{M}\sum_{j=1}^{N} p_{n(ij)}\Big| \Big\},$

(14) $e_2 = \min\Big\{ \sum_{i=1}^{M}\sum_{j=1}^{N} \mathrm{Dist}\big(Q_j,\, \tilde{Q}_{ij}\big) + \Big|\sum_{j=1}^{N} Q_j - \sum_{i=1}^{M}\sum_{j=1}^{N} \tilde{Q}_{ij}\Big| \Big\}.$

(15) $e_3 = \min\Big\{ \sum_{i=1}^{N} \Big\{ \sum_{j=1}^{K} \big| CR_h(Q_j, Q_{j+1}; Q_{j+2}, Q_{j+3}) - CR_h(p_{u(ij)}, p_{u(i(j+1))}; p_{u(i(j+2))}, p_{u(i(j+3))}) \big| + \sum_{j=1}^{L} \big| CR_v(Q_j, Q_{j+1}; Q_{j+2}, Q_{j+3}) - CR_v(p_{u(ij)}, p_{u(i(j+1))}; p_{u(i(j+2))}, p_{u(i(j+3))}) \big| \Big\} \Big\},$

(16) $e_4 = \min\Big( \sum_{i=1}^{M}\sum_{j=1}^{N} G\big(\hat{Q}_{ij},\, F_i\big) \Big),$

(17) $e_5 = \min\Big( \sum_{i=1}^{M}\sum_{j=1}^{N} \mathrm{Dist}\big(\hat{Q}_{ij},\, \tilde{c}_i\big) \Big).$

(18) $E(a) = \underbrace{\underbrace{e_1 + e_2 + e_3}_{(1)} + \underbrace{e_4 + e_5}_{(2)}}_{(3)},$

(19) $\begin{cases} -U_{\Delta u_{ij}} \le \Delta u_{ij} \le U_{\Delta u_{ij}} \\ -U_{\Delta v_{ij}} \le \Delta v_{ij} \le U_{\Delta v_{ij}} \\ -U_{\Delta u_{ik}^{l}} \le \Delta u_{ik}^{l} \le U_{\Delta u_{ik}^{l}} \\ -U_{\Delta v_{ik}^{l}} \le \Delta v_{ik}^{l} \le U_{\Delta v_{ik}^{l}} \end{cases}$
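
The constrained deviation estimation implied by Eqs. (18)-(19) can be sketched as a box-constrained minimization: all per-point deviations are stacked into one parameter vector and bounded by their location uncertainties. In the sketch below a toy objective (collinearity of the corrected points) stands in for the full terms e_1-e_5, and SciPy's L-BFGS-B bounded minimizer is only one possible solver; none of this is claimed to be the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
pts = np.column_stack([np.linspace(0, 100, 20), 0.5 * np.linspace(0, 100, 20) + 3])
pts_noisy = pts + rng.normal(0, 0.3, pts.shape)          # detected points with deviation
U = np.full(pts.shape, 0.9)                              # per-point uncertainty bounds, Eq. (19)

def objective(delta_flat):
    """Stand-in for E(a): residual of the corrected points to their best-fit line."""
    corrected = pts_noisy + delta_flat.reshape(pts.shape)
    A = np.column_stack([corrected[:, 0], np.ones(len(corrected))])
    coef, *_ = np.linalg.lstsq(A, corrected[:, 1], rcond=None)
    return float(np.sum((A @ coef - corrected[:, 1]) ** 2))

bounds = [(-u, u) for u in U.ravel()]                    # -U <= Δ <= U, Eq. (19)
res = minimize(objective, x0=np.zeros(pts.size), method="L-BFGS-B", bounds=bounds)
print(res.fun, res.x[:4])
```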
(20) $Q_{ik}^{l} = s \tilde{H}_i^{-1} \tilde{p}_{ik}^{l},$

(21) $\hat{Q}_{ik}^{l} = \tilde{R}_i Q_{ik}^{l} + \tilde{t}_i.$
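
Finally, a sketch of how Eqs. (20)-(21) feed the light-plane solution: corrected stripe pixels are mapped onto the target plane through the inverse homography, transformed into the camera frame with the target pose, and a plane ax + by + cz + d = 0 is fitted to the accumulated 3D points by SVD. The homographies, poses, and pixel lists below are placeholders, not calibrated values.

```python
import numpy as np

def stripe_pixels_to_camera(H_i, R_i, t_i, stripe_pixels):
    """Eqs. (20)-(21): image pixels -> target-plane points -> camera-frame points."""
    pts_cam = []
    for (u, v) in stripe_pixels:
        q = np.linalg.inv(H_i) @ np.array([u, v, 1.0])   # s * H^-1 * p, up to scale
        q = q / q[2]                                     # normalize so the last entry is 1
        Q_target = np.array([q[0], q[1], 0.0])           # point on the target plane z = 0
        pts_cam.append(R_i @ Q_target + t_i)             # Eq. (21): into the camera frame
    return np.array(pts_cam)

def fit_plane(points):
    """Least-squares plane a*x + b*y + c*z + d = 0 through 3D points (via SVD)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                                      # direction of smallest variance
    return np.append(normal, -normal @ centroid)         # [a, b, c, d]

# Placeholder homographies/poses for two target positions (illustrative only).
poses = [
    (np.array([[800.0, 5.0, 320.0], [2.0, 805.0, 240.0], [0.0, 0.0, 1.0]]),
     np.eye(3), np.array([0.0, 0.0, 500.0])),
    (np.array([[790.0, -4.0, 310.0], [3.0, 795.0, 250.0], [0.0, 0.0, 1.0]]),
     np.array([[0.98, 0.0, 0.2], [0.0, 1.0, 0.0], [-0.2, 0.0, 0.98]]),
     np.array([10.0, 0.0, 520.0])),
]
pixels = [(100 + 10 * k, 200 + 3 * k) for k in range(10)]
all_pts = np.vstack([stripe_pixels_to_camera(H, R, t, pixels) for H, R, t in poses])
print(fit_plane(all_pts))
```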
