Abstract

We present UWSPSM, an uncertainty-weighted stereo pose solution method based on projection vectors, which addresses pose estimation for feature-point-based stereo vision measurement systems. First, a covariance matrix is used to represent the directional uncertainty of each feature point, and a projection matrix integrates this uncertainty into the stereo-vision pose estimation. Next, the optimal translation vector is solved from the projection vectors of the feature points, and the point depths are updated from the same projection vectors. In the absolute orientation stage, a singular value decomposition computes the relative attitude matrix, and the two stages are iterated until the result converges. Finally, the convergence of the proposed algorithm is proved theoretically by the global convergence theorem. When extended to stereo vision, the fixed geometric relationship between the cameras is introduced as a constraint on the stereoscopic pose estimation, so that only one set of pose parameters is optimized for the two captured images during iteration; the two cameras are thus effectively bound into a single camera, which improves accuracy and efficiency while enhancing measurement reliability. Experimental results show that the proposed algorithm converges quickly, achieves high precision and good robustness, and tolerates different degrees of error uncertainty, making it promising for practical applications.
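The absolute-orientation stage described in the abstract (computing the relative attitude matrix by singular value decomposition) can be illustrated with a standard weighted least-squares point-set alignment in the style of the Arun et al. and Horn methods cited in the references. This is a generic sketch, not the paper's exact UWSPSM formulation: the function name and the simple scalar per-point weights standing in for the covariance-based uncertainty weighting are illustrative assumptions.

```python
import numpy as np

def absolute_orientation_svd(P, Q, w=None):
    """Estimate rotation R and translation t minimizing
    sum_i w_i * ||R @ P[i] + t - Q[i]||^2 via the SVD of the
    weighted cross-covariance (Arun/Horn-style closed form)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    w = np.ones(len(P)) if w is None else np.asarray(w, float)
    w = w / w.sum()
    # Weighted centroids of the two point sets
    cp = w @ P
    cq = w @ Q
    # Weighted cross-covariance of the centered point sets
    H = (P - cp).T @ ((Q - cq) * w[:, None])
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1 (a proper rotation)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

In an iterative scheme such as the one the abstract outlines, a solver of this form would be called once per iteration with updated depths (and hence updated 3D points) until the pose estimate converges.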

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement


References


  1. R. Laganiere, S. Gibert, and G. Roth, “Robust object pose estimation from feature-based stereo,” IEEE Trans. Instrum. Meas. 55(4), 1270–1280 (2006).
    [Crossref]
  2. W. B. Dong and V. Isler, “A novel method for the extrinsic calibration of a 2D laser rangefinder and a camera,” IEEE Sensors J. 18(10), 4200–4211 (2018).
    [Crossref]
  3. K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
    [Crossref]
  4. K. Zhang, Z. Q. Cao, J. R. Liu, Z. J. Fang, and M. Tan, “Real-Time Visual Measurement With Opponent Hitting Behavior for Table Tennis Robot,” IEEE Trans. Instrum. Meas. 67(4), 811–820 (2018).
    [Crossref]
  5. J. Wang, X. Wang, F. Liu, Y. Gong, H. H. Wang, and Z. Qin, “Modeling of binocular stereo vision for remote coordinate measurement and fast calibration,” Opt. Lasers Eng. 54(1), 269–274 (2014).
    [Crossref]
  6. Y. Liu, Z. Chen, W. J. Zheng, H. Wang, and J. G. Liu, “Monocular visual-inertial SLAM: continuous preintegration and reliable initialization,” Sensors 17(11), 2613 (2017).
    [Crossref]
  7. V. Lepetit, F. Moreno-Noguer, and P. Fua, “EPnP: An accurate O(n) solution to the PnP problem,” Int. J. Comput. Vis. 81(2), 155–166 (2009).
    [Crossref]
  8. R. Valenti, N. Sebe, and T. Gevers, “Combining head pose and eye location information for gaze estimation,” IEEE Trans. Image Process. 21(2), 802–815 (2012).
    [Crossref]
  9. S. Li, C. Xu, and M. Xie, “A robust O(n) solution to the perspective-n-point problem,” IEEE Trans. Pattern Anal. Mach. Intell. 34(7), 1444–1450 (2012).
    [Crossref]
  10. Y. Zheng, S. Sugimoto, and M. Okutomi, “ASPnP: An accurate and scalable solution to the perspective-n-point problem,” IEICE Trans. Inf. Syst. E96-D(7), 1525–1535 (2013).
    [Crossref]
  11. C. K. Sun, H. Dong, B. S. Zhang, and P. Wang, “An orthogonal iteration pose estimation algorithm based on an incident ray tracking model,” Meas. Sci. Technol. 29(9), 095402 (2018).
    [Crossref]
  12. M. Y. Li and K. Hashimoto, “Accurate object pose estimation using depth only,” Sensors 18(4), 1045 (2018).
    [Crossref]
  13. J. S. Cui, J. Huo, and M. Yang, “Novel method of calibration with restrictive constraints for stereo-vision system,” J. Mod. Opt. 63(9), 835–846 (2016).
    [Crossref]
  14. J. Schlobohm, A. Pösch, E. Reithmeier, and B. Rosenhahn, “Improving contour based pose estimation for fast 3D measurement of free form objects,” Measurement 92, 79–82 (2016).
    [Crossref]
  15. K. Yan, R. Zhao, H. Tian, E. Liu, and Z. Zhang, “A high accuracy method for pose estimation based on rotation parameters,” Measurement 122, 392–401 (2018).
    [Crossref]
  16. H. Nguyen, D. Nguyen, Z. Wang, H. Kieu, and M. Le, “Real-time, high-accuracy 3d imaging and shape measurement,” Appl. Opt. 54(1), A9–A17 (2015).
    [Crossref]
  17. P. Kellnhofer, T. Ritschel, K. Myszkowski, and H. P. Seidel, “Optimizing disparity for motion in depth,” Comput. Graph. Forum. 32(4), 143–152 (2013).
    [Crossref]
  18. Z. Cai, X. Liu, A. Li, Q. Tang, X. Peng, and B. Z. Gao, “Phase-3D mapping method developed from back projection stereovision model for fringe projection profilometry,” Opt. Express 25(2), 1262–1277 (2017).
    [Crossref]
  19. Y. Cui, F. Q. Zhou, Y. X. Wang, L. Liu, and H. Gao, “Precise calibration of binocular vision system used for vision measurement,” Opt. Express 22(8), 9134–9149 (2014).
    [Crossref]
  20. N. El Batteoui, M. Merras, A. Saaidi, and K. Satori, “Camera self-calibration with varying intrinsic parameters by an unknown three-dimensional scene,” Vis. Comput. 30(5), 519–530 (2014).
    [Crossref]
  21. W. M. Li and D. Zhang, “A high-accuracy monocular self-calibration method based on the essential matrix and bundle adjustment,” J. Mod. Opt. 61(19), 1556–1563 (2014).
    [Crossref]
  22. A. Censi and D. Scaramuzza, “Calibration by correlation using metric embedding from nonmetric similarities,” IEEE Trans. Pattern Anal. Mach. Intell. 35(10), 2357–2370 (2013).
    [Crossref]
  23. R. M. Haralick, H. Joo, C. N. Lee, X. Zhang, V. G. Vaidya, and M. B. Kim, “Pose estimation from corresponding point data,” IEEE Trans. Syst., Man, Cybern. 19(6), 1426–1446 (1989).
    [Crossref]
  24. C. P. Lu, G. D. Hager, and E. Mjolsness, “Fast and globally convergent pose estimation from video images,” IEEE Trans. Pattern Anal. Mach. Intell. 22(6), 610–622 (2000).
    [Crossref]
  25. G. Schweighofer and A. Pinz, “Robust pose estimation from a planar target,” IEEE Trans. Pattern Anal. Mach. Intell. 28(12), 2024–2030 (2006).
    [Crossref]
  26. C. Xu, L. Zhang, L. Cheng, and R. Koch, “Pose estimation from line correspondences: A complete analysis and a series of solutions,” IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1209–1222 (2017).
    [Crossref]
  27. G. Caron, A. Dame, and E. Marchand, “Direct model based visual tracking and pose estimation using mutual information,” Image Vis. Comput. 32(1), 54–63 (2014).
    [Crossref]
  28. J. Huo, G. Y. Zhang, J. S. Cui, and M. Yang, “A novel algorithm for pose estimation based on generalized orthogonal iteration with uncertainty-weighted measuring error of feature points,” J. Mod. Opt. 65(3), 331–341 (2018).
    [Crossref]
  29. X. K. Miao, F. Zhu, and Y. M. Hao, “A new pose estimation method based on uncertainty-weighted errors of the feature points,” J. Optoelectronics Laser 23(7), 1348–1355 (2012).
  30. J. S. Cui, C. W. Min, X. Y. Bai, and J. R. Cui, “An Improved Pose Estimation Method Based on Projection Vector with Noise Error Uncertainty,” IEEE Photon. J. 11(2), 1–16 (2019).
    [Crossref]
  31. S. Ghosh, R. Ray, S. R. Vadali, S. N. Shome, and S. Nandy, “Reliable pose estimation of underwater dock using single camera: a scene invariant approach,” Mach. Vis. Appl. 27(2), 221–236 (2016).
    [Crossref]
  32. D. Raviv, C. Barsi, N. Naik, M. Feigin, and R. Raskar, “Pose estimation using time-resolved inversion of diffuse light,” Opt. Express 22(17), 20164–20176 (2014).
    [Crossref]
  33. L. M. Zhang, F. Zhu, Y. M. Hao, and W. Pan, “Rectangular-structure-based pose estimation method for non-cooperative rendezvous,” Appl. Opt. 57(21), 6164–6173 (2018).
    [Crossref]
  34. J. G. Wang, X. Xiao, and B. Javidi, “Three-dimensional integral imaging with flexible sensing,” Opt. Lett. 39(24), 6855–6858 (2014).
    [Crossref]
  35. Z. F. Luo, K. Zhang, Z. G. Wang, J. Zheng, and Y. X. Chen, “3D pose estimation of large and complicated workpieces based on binocular stereo vision,” Appl. Opt. 56(24), 6822–6836 (2017).
    [Crossref]
  36. R. M. Steele and C. Jaynes, “Feature uncertainty arising from covariant image noise,” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) (2005), pp. 1063–1070.
  37. P. J. Huber and E. M. Ronchetti, Robust Statistics (John Wiley & Sons, 2009).
  38. G. B. Chang, T. H. Xu, and Q. X. Wang, “M-estimator for the 3D symmetric Helmert coordinate transformation,” J. Geod. 92(1), 47–58 (2018).
    [Crossref]
  39. G. B. Chang, T. H. Xu, and Q. X. Wang, “Error analysis of the 3D similarity coordinate transformation,” GPS Solut. 21(3), 963–971 (2017).
    [Crossref]
  40. G. B. Chang, T. H. Xu, Q. X. Wang, and M. Liu, “Analytical solution to and error analysis of the quaternion based similarity transformation considering measurement errors in both frames,” Measurement 110, 1–10 (2017).
    [Crossref]
  41. O. D. Faugeras and M. Hebert, “The representation, recognition, and locating of 3D shapes from range data,” Mach. Intell. Pattern Recogn. 3, 13–51 (1986).
    [Crossref]
  42. B. K. P. Horn, “Closed-form solution of absolute orientation using unit quaternions,” J. Opt. Soc. Am. A 4(4), 629–642 (1987).
    [Crossref]
  43. K. S. Arun, T. S. Huang, and S. D. Blostein, “Least-squares fitting of two 3-D point sets,” IEEE Trans. Pattern Anal. Mach. Intell. PAMI-9(5), 698–700 (1987).
    [Crossref]
  44. D. G. Luenberger and Y. Y. Ye, Linear and Nonlinear Programming, 3rd ed. (Springer, 2008).
  45. P. Anandan and M. Irani, “Factorization with uncertainty,” Int. J. Comput. Vis. 49(2/3), 101–116 (2002).
    [Crossref]

[Crossref]

Sun, C. K.

C. K. Sun, H. Dong, B. S. Zhang, and P. Wang, “An orthogonal iteration pose estimation algorithm based on an incident ray tracking model,” Meas. Sci. Technol. 29(9), 095402 (2018).
[Crossref]

Tan, M.

K. Zhang, Z. Q. Cao, J. R. Liu, Z. J. Fang, and M. Tan, “Real-Time Visual Measurement With Opponent Hitting Behavior for Table Tennis Robot,” IEEE Trans. Instrum. Meas. 67(4), 811–820 (2018).
[Crossref]

Tang, Q.

Tian, H.

K. Yan, R. Zhao, H. Tian, E. Liu, and Z. Zhang, “A high accuracy method for pose estimation based on rotation parameters,” Measurement 122, 392–401 (2018).
[Crossref]

Vadali, S. R.

S. Ghosh, R. Ray, S. R. Vadali, S. N. Shome, and S. Nandy, “Reliable pose estimation of underwater dock using single camera: a scene invariant approach,” Mach. Vis. Appl. 27(2), 221–236 (2016).
[Crossref]

Vaidya, V. G.

R. M. Haralick, H. Joo, C. N. Lee, X. Zhang, V. G. Vaidya, and M. B. Kim, “Pose estimation from corresponding point data,” IEEE Trans. Syst., Man, Cybern. 19(6), 1426–1446 (1989).
[Crossref]

Valenti, R.

R. Valenti, N. Sebe, and T. Gevers, “Combining head pose and eye location information for gaze estimation,” IEEE Trans. on Image Process. 21(2), 802–815 (2012).
[Crossref]

Wang, H.

Y. Liu, Z. Chen, W. J. Zheng, H. Wang, and J. G. Liu, “Monocular visual-inertial SLAM: continuous preintegration and reliable initialization,” Sensors 17(11), 2613 (2017).
[Crossref]

Wang, H. H.

J. Wang, X. Wang, F. Liu, Y. Gong, H. H. Wang, and Z. Qin, “Modeling of binocular stereo vision for remote coordinate measurement and fast calibration,” Opt. Lasers Eng. 54(1), 269–274 (2014).
[Crossref]

Wang, J.

J. Wang, X. Wang, F. Liu, Y. Gong, H. H. Wang, and Z. Qin, “Modeling of binocular stereo vision for remote coordinate measurement and fast calibration,” Opt. Lasers Eng. 54(1), 269–274 (2014).
[Crossref]

Wang, J. G.

Wang, P.

C. K. Sun, H. Dong, B. S. Zhang, and P. Wang, “An orthogonal iteration pose estimation algorithm based on an incident ray tracking model,” Meas. Sci. Technol. 29(9), 095402 (2018).
[Crossref]

Wang, Q. X.

G. B. Chang, T. H. Xu, and Q. X. Wang, “M-estimator for the 3D symmetric Helmert coordinate transformation,” J. Geod. 92(1), 47–58 (2018).
[Crossref]

G. B. Chang, T. H. Xu, and Q. X. Wang, “Error analysis of the 3D similarity coordinate transformation,” GPS Solut. 21(3), 963–971 (2017).
[Crossref]

G. B. Chang, T. H. Xu, Q. X. Wang, and M. Liu, “Analytical solution to and error analysis of the quaternion based similarity transformation considering measurement errors in both frames,” Measurement 110, 1–10 (2017).
[Crossref]

Wang, X.

J. Wang, X. Wang, F. Liu, Y. Gong, H. H. Wang, and Z. Qin, “Modeling of binocular stereo vision for remote coordinate measurement and fast calibration,” Opt. Lasers Eng. 54(1), 269–274 (2014).
[Crossref]

Wang, Y. X.

Wang, Z.

K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
[Crossref]

H. Nguyen, D. Nguyen, Z. Wang, H. Kieu, and M. Le, “Real-time, high-accuracy 3d imaging and shape measurement,” Appl. Opt. 54(1), A9–A17 (2015).
[Crossref]

Wang, Z. G.

Wei, H.

K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
[Crossref]

Xiang, X. J.

K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
[Crossref]

Xiao, X.

Xie, M.

S. Li, C. Xu, and M. Xie, “A robust O(n) solution to the perspective-n-point problem,” IEEE Trans. Pattern Anal. Mach. Intell. 34(7), 1444–1450 (2012).
[Crossref]

Xu, C.

C. Xu, L. Zhang, L. Cheng, and R. Koch, “Pose estimation from line correspondences: A complete analysis and a series of solutions,” IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1209–1222 (2017).
[Crossref]

S. Li, C. Xu, and M. Xie, “A robust O(n) solution to the perspective-n-point problem,” IEEE Trans. Pattern Anal. Mach. Intell. 34(7), 1444–1450 (2012).
[Crossref]

Xu, T. H.

G. B. Chang, T. H. Xu, and Q. X. Wang, “M-estimator for the 3D symmetric Helmert coordinate transformation,” J. Geod. 92(1), 47–58 (2018).
[Crossref]

G. B. Chang, T. H. Xu, Q. X. Wang, and M. Liu, “Analytical solution to and error analysis of the quaternion based similarity transformation considering measurement errors in both frames,” Measurement 110, 1–10 (2017).
[Crossref]

G. B. Chang, T. H. Xu, and Q. X. Wang, “Error analysis of the 3D similarity coordinate transformation,” GPS Solut. 21(3), 963–971 (2017).
[Crossref]

Yan, K.

K. Yan, R. Zhao, H. Tian, E. Liu, and Z. Zhang, “A high accuracy method for pose estimation based on rotation parameters,” Measurement 122, 392–401 (2018).
[Crossref]

Yang, M.

J. Huo, G. Y. Zhang, J. S. Cui, and M. Yang, “A novel algorithm for pose estimation based on generalized orthogonal iteration with uncertainty-weighted measuring error of feature points,” J. Mod. Optic. 65(3), 331–341 (2018).
[Crossref]

J. S. Cui, J. Huo, and M. Yang, “Novel method of calibration with restrictive constrains for stereo-vision system,” J. Mod. Opt. 63(9), 835–846 (2016).
[Crossref]

Ye, Y. Y.

D. G. Luenberger and Y. Y. Ye, Linear and Nonlinear Programming. 3rd ed, Springer, New York, 2008.

Yin, L.

K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
[Crossref]

Zhang, B. S.

C. K. Sun, H. Dong, B. S. Zhang, and P. Wang, “An orthogonal iteration pose estimation algorithm based on an incident ray tracking model,” Meas. Sci. Technol. 29(9), 095402 (2018).
[Crossref]

Zhang, D.

W. M. Li and D. Zhang, “A high-accuracy monocular self-calibration method based on the essential matrix and bundle adjustment,” J. Mod. Opt. 61(19), 1556–1563 (2014).
[Crossref]

Zhang, G. Y.

J. Huo, G. Y. Zhang, J. S. Cui, and M. Yang, “A novel algorithm for pose estimation based on generalized orthogonal iteration with uncertainty-weighted measuring error of feature points,” J. Mod. Optic. 65(3), 331–341 (2018).
[Crossref]

Zhang, K.

K. Zhang, Z. Q. Cao, J. R. Liu, Z. J. Fang, and M. Tan, “Real-Time Visual Measurement With Opponent Hitting Behavior for Table Tennis Robot,” IEEE Trans. Instrum. Meas. 67(4), 811–820 (2018).
[Crossref]

Z. F. Luo, K. Zhang, Z. G. Wang, J. Zheng, and Y. X. Chen, “3D pose estimation of large and complicated workpieces based on binocular stereo vision,” Appl. Opt. 56(24), 6822–6836 (2017).
[Crossref]

Zhang, L.

C. Xu, L. Zhang, L. Cheng, and R. Koch, “Pose estimation from line correspondences: A complete analysis and a series of solutions,” IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1209–1222 (2017).
[Crossref]

Zhang, L. M.

Zhang, X.

R. M. Haralick, H. Joo, C. N. Lee, X. Zhang, V. G. Vaidya, and M. B. Kim, “Pose estimation from corresponding point data,” IEEE Trans. Syst., Man, Cybern. 19(6), 1426–1446 (1989).
[Crossref]

Zhang, Z.

K. Yan, R. Zhao, H. Tian, E. Liu, and Z. Zhang, “A high accuracy method for pose estimation based on rotation parameters,” Measurement 122, 392–401 (2018).
[Crossref]

Zhao, R.

K. Yan, R. Zhao, H. Tian, E. Liu, and Z. Zhang, “A high accuracy method for pose estimation based on rotation parameters,” Measurement 122, 392–401 (2018).
[Crossref]

Zheng, J.

Zheng, W. J.

Y. Liu, Z. Chen, W. J. Zheng, H. Wang, and J. G. Liu, “Monocular visual-inertial SLAM: continuous preintegration and reliable initialization,” Sensors 17(11), 2613 (2017).
[Crossref]

Zheng, Y.

Y. Zheng, S. Sugimoto, and M. Okutomi, “ASpnp: An accurate and scalable solution to the perspective-n-point problem,” ICIET Trans. Inf. Syst. E96(7), 1525–1535 (2013).
[Crossref]

Zhou, F. Q.

Zhou, K.

K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
[Crossref]

Zhu, F.

L. M. Zhang, F. Zhu, Y. M. Hao, and W. Pan, “Rectangular-structure-based pose estimation method for non-cooperative rendezvous,” Appl. Opt. 57(21), 6164–6173 (2018).
[Crossref]

X. K. Miao, F. Zhu, and Y. M. Hao, “A new pose estimation method based on uncertainty-weighted errors of the feature points,” J. Optoelectronics Laser 23(7), 1348–1355 (2012).

Appl. Opt. (3)

Comput. Graph. Forum. (1)

P. Kellnhofer, T. Ritschel, K. Myszkowski, and H. P. Seidel, “Optimizing disparity for motion in depth,” Comput. Graph. Forum. 32(4), 143–152 (2013).
[Crossref]

GPS Solut. (1)

G. B. Chang, T. H. Xu, and Q. X. Wang, “Error analysis of the 3D similarity coordinate transformation,” GPS Solut. 21(3), 963–971 (2017).
[Crossref]

ICIET Trans. Inf. Syst. (1)

Y. Zheng, S. Sugimoto, and M. Okutomi, “ASpnp: An accurate and scalable solution to the perspective-n-point problem,” ICIET Trans. Inf. Syst. E96(7), 1525–1535 (2013).
[Crossref]

IEEE Access (1)

K. Zhou, X. J. Xiang, Z. Wang, H. Wei, and L. Yin, “Complete initial solutions for iterative pose estimation from planar objects,” IEEE Access 6, 22257–22266 (2018).
[Crossref]

IEEE Photon. J. (1)

J. S. Cui, C. W. Min, X. Y. Bai, and J. R. Cui, “An Improved Pose Estimation Method Based on Projection Vector with Noise Error Uncertainty,” IEEE Photon. J. 11(2), 1–16 (2019).
[Crossref]

IEEE Sensors J. (1)

W. B. Dong and V. Isler, “A novel method for the extrinsic calibration of a 2D laser rangefinder and a camera,” IEEE Sensors J. 18(10), 4200–4211 (2018).
[Crossref]

IEEE Trans. Instrum. Meas. (2)

R. Laganiere, S. Gibert, and G. Roth, “Robust object pose estimation from feature-based stereo,” IEEE Trans. Instrum. Meas. 55(4), 1270–1280 (2006).
[Crossref]

K. Zhang, Z. Q. Cao, J. R. Liu, Z. J. Fang, and M. Tan, “Real-Time Visual Measurement With Opponent Hitting Behavior for Table Tennis Robot,” IEEE Trans. Instrum. Meas. 67(4), 811–820 (2018).
[Crossref]

IEEE Trans. on Image Process. (1)

R. Valenti, N. Sebe, and T. Gevers, “Combining head pose and eye location information for gaze estimation,” IEEE Trans. on Image Process. 21(2), 802–815 (2012).
[Crossref]

IEEE Trans. Pattern Anal. (1)

K. S. Arun, T. S. Huang, and S. D. Blostein, “Least-squares fitting of two 3-d point sets,” IEEE Trans. Pattern Anal. PAMI-9(5), 698–700 (1987).
[Crossref]

IEEE Trans. Pattern Anal. Mach. Intell. (5)

A. Censi and D. Scaramuzza, “Calibration by correlation using metric embedding from nonmetric similarities,” IEEE Trans. Pattern Anal. Mach. Intell. 35(10), 2357–2370 (2013).
[Crossref]

C. P. Lu, G. D. Hager, and E. Mjolsness, “Fast and globally convergent pose estimation from video images,” IEEE Trans. Pattern Anal. Mach. Intell. 22(6), 610–622 (2000).
[Crossref]

G. Schweighofer and A. Pinz, “Robust pose estimation from a planar target,” IEEE Trans. Pattern Anal. Mach. Intell. 28(12), 2024–2030 (2006).
[Crossref]

C. Xu, L. Zhang, L. Cheng, and R. Koch, “Pose estimation from line correspondences: A complete analysis and a series of solutions,” IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1209–1222 (2017).
[Crossref]

S. Li, C. Xu, and M. Xie, “A robust O(n) solution to the perspective-n-point problem,” IEEE Trans. Pattern Anal. Mach. Intell. 34(7), 1444–1450 (2012).
[Crossref]

IEEE Trans. Syst., Man, Cybern. (1)

R. M. Haralick, H. Joo, C. N. Lee, X. Zhang, V. G. Vaidya, and M. B. Kim, “Pose estimation from corresponding point data,” IEEE Trans. Syst., Man, Cybern. 19(6), 1426–1446 (1989).
[Crossref]

Image Vis. Comput. (1)

G. Caron, A. Dame, and E. Marchand, “Direct model based visual tracking and pose estimation using mutual information,” Image Vis. Comput. 32(1), 54–63 (2014).
[Crossref]

Int. J. Comput. Vis. (2)

V. Lepetit, F. Moreno-Noguer, and P. Fua, “EPnP: An accurate O(n) solution to the PnP problem,” Int. J. Comput. Vis. 81(2), 155–166 (2009).
[Crossref]

P. Anandan and M. Irani, “Factorization with uncertainty,” Int. J. Comput. Vis. 49(2/3), 101–116 (2002).
[Crossref]

J. Geod. (1)

G. B. Chang, T. H. Xu, and Q. X. Wang, “M-estimator for the 3D symmetric Helmert coordinate transformation,” J. Geod. 92(1), 47–58 (2018).
[Crossref]

J. Mod. Opt. (2)

W. M. Li and D. Zhang, “A high-accuracy monocular self-calibration method based on the essential matrix and bundle adjustment,” J. Mod. Opt. 61(19), 1556–1563 (2014).
[Crossref]

J. S. Cui, J. Huo, and M. Yang, “Novel method of calibration with restrictive constrains for stereo-vision system,” J. Mod. Opt. 63(9), 835–846 (2016).
[Crossref]

J. Mod. Optic. (1)

J. Huo, G. Y. Zhang, J. S. Cui, and M. Yang, “A novel algorithm for pose estimation based on generalized orthogonal iteration with uncertainty-weighted measuring error of feature points,” J. Mod. Optic. 65(3), 331–341 (2018).
[Crossref]

J. Opt. Soc. Am. A (1)

J. Optoelectronics Laser (1)

X. K. Miao, F. Zhu, and Y. M. Hao, “A new pose estimation method based on uncertainty-weighted errors of the feature points,” J. Optoelectronics Laser 23(7), 1348–1355 (2012).

Mach. Intell. Pattern Recogn. (1)

O. D. Faugeras and M. Hebert, “The representation, recognition, and locating of 3D shapes from range data,” Mach. Intell. Pattern Recogn. 3, 13–51 (1986).
[Crossref]

Mach. Vis. Appl. (1)

S. Ghosh, R. Ray, S. R. Vadali, S. N. Shome, and S. Nandy, “Reliable pose estimation of underwater dock using single camera: a scene invariant approach,” Mach. Vis. Appl. 27(2), 221–236 (2016).
[Crossref]

Meas. Sci. Technol. (1)

C. K. Sun, H. Dong, B. S. Zhang, and P. Wang, “An orthogonal iteration pose estimation algorithm based on an incident ray tracking model,” Meas. Sci. Technol. 29(9), 095402 (2018).
[Crossref]

Measurement (3)

J. Schlobohm, A. Pösch, E. Reithmeier, and B. Rosenhahn, “Improving contour based pose estimation for fast 3D measurement of free form objects,” Measurement 92, 79–82 (2016).
[Crossref]

K. Yan, R. Zhao, H. Tian, E. Liu, and Z. Zhang, “A high accuracy method for pose estimation based on rotation parameters,” Measurement 122, 392–401 (2018).
[Crossref]

G. B. Chang, T. H. Xu, Q. X. Wang, and M. Liu, “Analytical solution to and error analysis of the quaternion based similarity transformation considering measurement errors in both frames,” Measurement 110, 1–10 (2017).
[Crossref]

Opt. Express (3)

Opt. Lasers Eng. (1)

J. Wang, X. Wang, F. Liu, Y. Gong, H. H. Wang, and Z. Qin, “Modeling of binocular stereo vision for remote coordinate measurement and fast calibration,” Opt. Lasers Eng. 54(1), 269–274 (2014).
[Crossref]

Opt. Lett. (1)

Sensors (2)

Y. Liu, Z. Chen, W. J. Zheng, H. Wang, and J. G. Liu, “Monocular visual-inertial SLAM: continuous preintegration and reliable initialization,” Sensors 17(11), 2613 (2017).
[Crossref]

M. Y. Li and K. Hashimoto, “Accurate object pose estimation using depth only,” Sensors 18(4), 1045 (2018).
[Crossref]

Vis. Comput. (1)

N. El Batteoui, M. Merras, A. Saaidi, and K. Satori, “Camera self-calibration with varying intrinsic parameters by an unknown three-dimensional scene,” Vis. Comput. 30(5), 519–530 (2014).
[Crossref]

Other (3)

R. M. Steele and C. Jaynes, “Feature uncertainty arising from covariant image noise,” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) (2005), pp. 1063–1070.

P. J. Huber and E. M. Ronchetti, Robust Statistics. John Wiley & Sons, New Jersey, 2009.

D. G. Luenberger and Y. Y. Ye, Linear and Nonlinear Programming. 3rd ed, Springer, New York, 2008.


Figures (14)

Fig. 1. Stereo-vision feature projection imaging schematic, showing the coordinate systems involved in stereo-vision pose estimation and their relationships, the imaging process and imaging errors of the feature points, and the fixed pose constraint between the two cameras.
Fig. 2. Schematic diagram of ellipsoid point uncertainty. There are three main uncertainty states: (I) a/b > 1, (II) a/b = 1, (III) a/b < 1.
Fig. 3. Schematic diagram of the pose estimation algorithm, comprising four processes: preparation, imaging, accuracy modeling, and the iterative algorithm. Each process also shows its input-output relationships.
Fig. 4. Pose estimation results (convergence rate and iteration number): the attitude-angle and relative-position results of CSPM, SWSPSM, DWSPSM and UWSPSM are compared. (a) Convergence rate and iteration number for the attitude parameters $\varphi$, $\theta$ and $\psi$. (b) Convergence rate and iteration number for the position parameters $T_x$, $T_y$ and $T_z$.
Fig. 5. Pose estimation results (pose error and feature number): the attitude-angle and relative-position errors of CSPM, SWSPSM, DWSPSM and UWSPSM are compared. (a) Position error versus the number of feature points. (b) Attitude error versus the number of feature points.
Fig. 6. Estimation error versus the number of feature points: the position and attitude accuracy of our algorithm is compared with the POSIT, OI, EPnP and DLT algorithms. (a) Position accuracy. (b) Attitude accuracy.
Fig. 7. Pose estimation errors under different noise types, comparing CSPM, SWSPSM, DWSPSM and UWSPSM. (a) Position error. (b) Attitude error.
Fig. 8. Pose estimation results (computation time and feature points): computation time of CSPM, SWSPSM, DWSPSM and UWSPSM as the number of feature points increases.
Fig. 9. Pose estimation results (estimation error and elliptical noise): relative position and attitude errors of CSPM, SWSPSM, DWSPSM and UWSPSM as the feature-point uncertainty varies, showing the robustness of the algorithm. (a) Position error with elliptical uncertainty. (b) Attitude error with elliptical uncertainty.
Fig. 10. Estimation error versus elliptical noise, compared with the POSIT, OI, EPnP and DLT algorithms to show the robustness and accuracy of our algorithm. (a) Pos_error with elliptical uncertainty. (b) Rot_error with elliptical uncertainty.
Fig. 11. Measurement experiment system and tested object. (a) The experimental setup consists of a two-camera stereo vision rig and a motion device. (b) Stereo images of the tested motion device in different motion states; upper row: left camera, lower row: right camera.
Fig. 12. Static swing-angle measurement errors of UWSPSM and CSPM. (a) UWSPSM yaw, pitch and composite angle errors. (b) CSPM yaw, pitch and composite angle errors.
Fig. 13. Dynamic swing-angle measurement errors of UWSPSM and CSPM. (a) UWSPSM dynamic swing-angle results. (b) CSPM dynamic swing-angle results.
Fig. 14. Pendulum-center position measurement errors. (a) UWSPSM results. (b) CSPM results.

Tables (2)

Table 1. Test parameters.

Table 2. Experimental results.

Equations (40)


$$P_{ij}^c = R_j P_i^t + T_j$$

$$z_{ij}^c \begin{bmatrix} u_{ij} \\ v_{ij} \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx_j & \gamma_j & u_{j0} \\ 0 & 1/dy_j & v_{j0} \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_{ij} \\ y_{ij} \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx_j & 0 & u_{j0} \\ 0 & 1/dy_j & v_{j0} \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f_j & 0 & 0 \\ 0 & f_j & 0 \\ 0 & 0 & 1 \end{bmatrix} [R_j \,|\, T_j] \begin{bmatrix} x_i^w \\ y_i^w \\ z_i^w \\ 1 \end{bmatrix} = H_j A_j [R_j \,|\, T_j] X$$

$$v_{ij} = \frac{1}{\sqrt{u_{ij}^2 + v_{ij}^2 + f_j^2}} \begin{bmatrix} u_{ij} \\ v_{ij} \\ f_j \end{bmatrix}$$

$$P_{ij}^c = s_{ij} v_{ij}$$

$$P_{ij}^c = v_{ij}^T P_{ij}^c \, v_{ij} = v_{ij}^T (R_j P_i^t + T_j) \, v_{ij}$$

$$e^2(R_j, T_j, \{s_{ij}\}) = \sum_{j=l}^{r} \sum_{i=1}^{n} \left\| s_{ij} v_{ij} - (R_j P_i^t + T_j) \right\|^2$$
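The projection-vector construction above can be sketched numerically. This is a hedged, minimal illustration (the helper name `projection_vector` is hypothetical, and a single camera with focal length in pixel units is assumed): a camera-frame point lying on the line of sight is exactly $s_{ij} v_{ij}$, with the depth recovered by $s_{ij} = v_{ij}^T P_{ij}^c$.

```python
import numpy as np

def projection_vector(u, v, f):
    # Unit line-of-sight vector through image point (u, v) for focal length f,
    # per v_ij = (u, v, f)^T / sqrt(u^2 + v^2 + f^2).
    w = np.array([u, v, f], dtype=float)
    return w / np.linalg.norm(w)

v = projection_vector(128.0, -64.0, 1200.0)
P = 5.0 * v                   # a camera-frame point on the ray, at depth 5
s = v @ P                     # scalar projection recovers the depth s = v^T P
print(np.allclose(s * v, P))  # P = s v holds exactly for points on the ray
```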
$$\begin{cases} R_j^T R_j = I \\ \det R_j > 0 \end{cases}$$

$$\begin{cases} R_r = R_c R_l \\ T_r = T_c + R_c T_l \end{cases}$$

$$P_{ir}^c = v_{ir}^T P_{ir}^c \, v_{ir} = v_{ir}^T (R_c R_l P_i^t + T_c + R_c T_l) \, v_{ir}$$

$$e^2(R, T, \{s_i\}) = \sum_{i=1}^{n} \left\| s_{ir} v_{ir} - (R_r P_i^t + T_r) \right\|^2 = \sum_{i=1}^{n} \left\| s_{ir} v_{ir} - (R_c R_l P_i^t + T_c + R_c T_l) \right\|^2$$

$$\begin{cases} (R_l^T R_r)_m - (R_l^T R_r)_h = 0 \\ \left[ R_l^T (P_r^c - P_l^c) \right]_m - \left[ R_l^T (P_r^c - P_l^c) \right]_h = 0 \end{cases}$$
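The fixed stereo constraint $R_r = R_c R_l$, $T_r = T_c + R_c T_l$, which binds the right-camera pose to the left-camera pose through the constant extrinsics $(R_c, T_c)$, can be sketched as follows (the helper name and the example extrinsic values are hypothetical):

```python
import numpy as np

def right_camera_pose(R_c, T_c, R_l, T_l):
    # Compose the left-camera pose with the fixed left-to-right extrinsics
    # (R_c, T_c): R_r = R_c R_l, T_r = T_c + R_c T_l.
    return R_c @ R_l, T_c + R_c @ T_l

# Consistency check: mapping a target point with (R_r, T_r) must agree with
# mapping it into the left camera first and then applying the extrinsics.
R_l = np.eye(3)
T_l = np.array([0.0, 0.0, 3.0])
th = np.deg2rad(5.0)              # small yaw between the two cameras (example)
R_c = np.array([[np.cos(th), 0.0, np.sin(th)],
                [0.0, 1.0, 0.0],
                [-np.sin(th), 0.0, np.cos(th)]])
T_c = np.array([-0.2, 0.0, 0.0])  # example stereo baseline
R_r, T_r = right_camera_pose(R_c, T_c, R_l, T_l)
P = np.array([0.1, -0.2, 1.0])
print(np.allclose(R_r @ P + T_r, R_c @ (R_l @ P + T_l) + T_c))
```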
$$Q^{-1} = \sum_{(u,v)} \omega(u,v,1) \begin{bmatrix} I_u I_u & I_u I_v & 0 \\ I_v I_u & I_v I_v & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

$$Q = U \Sigma U^T$$

$$Q^{-1} = U \Sigma^{-1} U^T$$

$$F = \Sigma^{-1/2} U^T$$

$$\begin{cases} \hat{C}_i = (\hat{u}_i, \hat{v}_i, 1)^T = F_i C_i = F_i (u_i, v_i, 1)^T \\ \hat{C}'_i = (\hat{u}'_i, \hat{v}'_i, 1)^T = F_i C'_i = F_i (u'_i, v'_i, 1)^T \end{cases}$$

$$\Sigma^{-1} = \mathrm{diag}\!\left( 1/\sigma_1^2,\; 1/\sigma_2^2,\; 1 \right)$$

$$error = \left\| F_i C_i - F_i C'_i \right\|^2$$

$$e(R,T) = \sum_{i=1}^{n} \left\| \hat{C}_i - \hat{C}'_i \right\|^2 = \sum_{i=1}^{n} \left\| F_i C_i - F_i C'_i \right\|^2$$
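The whitening step $F = \Sigma^{-1/2} U^T$ can be sketched as below. This is a minimal illustration with a hypothetical helper name; `numpy.linalg.eigh` is used here as the decomposition of the symmetric positive-definite covariance $Q = U \Sigma U^T$. After whitening, the transformed coordinates $\hat{C} = F C$ have identity covariance, so anisotropic (elliptical) point uncertainty is normalized away.

```python
import numpy as np

def whitening_matrix(Q):
    # F = Sigma^{-1/2} U^T from the symmetric eigendecomposition Q = U Sigma U^T
    # (eigh plays the role of the SVD for a symmetric positive-definite matrix).
    w, U = np.linalg.eigh(Q)
    return np.diag(1.0 / np.sqrt(w)) @ U.T

# Elliptical point uncertainty: variance 4 along u, 1 along v,
# homogeneous third coordinate fixed at unit variance.
Q = np.diag([4.0, 1.0, 1.0])
F = whitening_matrix(Q)
C = np.array([2.0, 3.0, 1.0])               # measured homogeneous point (u, v, 1)
C_hat = F @ C                               # uncertainty-normalized coordinates
print(np.allclose(F @ Q @ F.T, np.eye(3)))  # whitened covariance is the identity
```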
$$e^2(R,T) = \sum_{i=1}^{n} \left\| \hat{C}_i - \hat{C}'_i \right\|^2 = \sum_{i=1}^{n} \left\| F_{ij} H_j A_j P_{ij}^c - F_{ij} v_{ij}^T (R_j P_i^t + T_j) v_{ij} \right\|^2 = \sum_{i=1}^{n} \left\| F_{ij} H_j A_j (R_c R_j P_i^t + T_c + R_c T_j) - F_{ij} v_{ij}^T (R_c R_j P_i^t + T_c + R_c T_j) v_{ij} \right\|^2$$

$$\nabla_T \, e^2(R, T) = 0$$

$$T = \frac{1}{n} \sum_{i=1}^{n} \left\{ F_{ij} \left[ v_{ij}^T (R_c R_j P_i^t + T_c + R_c T_j) \, v_{ij} - H_j A_j R_c R_j P_i^t \right] \right\}$$

$$\left[ nI - \sum_{i=1}^{n} (v_{ij} v_{ij}^T) \right] T_j = \sum_{i=1}^{n} \left\{ F_{ij} \left[ (v_{ij} v_{ij}^T - I) H_j A_j R_c R_j P_i^t \right] \right\}$$

$$m^T \left( nI - \sum_{i=1}^{n} (v_{ij} v_{ij}^T) \right) m = \sum_{i=1}^{n} \| m_i \|^2 - \sum_{i=1}^{n} \left( m_i^T v_{ij} v_{ij}^T m_i \right) = \sum_{i=1}^{n} \left( \| m_i \|^2 - \| v_{ij}^T m_i \|^2 \right)$$

$$T(R) = \left[ nI - \sum_{i=1}^{n} (v_{ij} v_{ij}^T) \right]^{-1} \sum_{i=1}^{n} \left\{ F_{ij} \left[ (v_{ij} v_{ij}^T - I) H_j A_j R_c R_j P_i^t \right] \right\}$$
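The closed-form translation $T(R)$ above can be sketched in code. This is a simplified, hedged sketch (hypothetical helper name): it drops the uncertainty-weighting factor $F_{ij}$ and the intrinsic terms $H_j A_j$, and uses a single camera rotation $R$ in place of $R_c R_j$, keeping only the linear system $\big[nI - \sum v_{ij} v_{ij}^T\big] T = \sum (v_{ij} v_{ij}^T - I) R P_i^t$.

```python
import numpy as np

def optimal_translation(R, pts, vs):
    # Simplified (unweighted, single-camera) form of T(R):
    # [n I - sum(v v^T)] T = sum((v v^T - I) R P).
    n = len(pts)
    V = [np.outer(v, v) for v in vs]          # line-of-sight projectors v v^T
    lhs = n * np.eye(3) - sum(V)
    rhs = sum((V[i] - np.eye(3)) @ (R @ pts[i]) for i in range(n))
    return np.linalg.solve(lhs, rhs)

# With noise-free line-of-sight vectors the true translation is recovered
# exactly, because each residual (v v^T - I)(R P + T) vanishes at the truth.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, (6, 3))          # synthetic object points
T_true = np.array([0.2, -0.1, 5.0])
R_true = np.eye(3)
vs = [(p + T_true) / np.linalg.norm(p + T_true) for p in pts]
print(np.allclose(optimal_translation(R_true, pts, vs), T_true))
```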
$$s_{ij}(R) = v_{ij}^T \left( R_c R_j P_i^t + T(R) \right)$$

$$P_{ij}^c(R) = s_{ij}(R) \, v_{ij} = v_{ij}^T \left( R_c R_j P_i^t + T(R) \right) v_{ij}$$

$$e^2(R,T) = \sum_{i=1}^{n} \left\| \left( R_c R_j P_i^t + T(R) \right) - P_{ij}^c(R) \right\|^2$$

$$P_{ij}^c(R) = s_{ij}^{(k)}(R) \, v_{ij}$$

$$R^{(k+1)} = \arg\min \sum_{i=1}^{n} \left\| \left( R_c R_j P_i^t + T(R) \right) - P_{ij}^c(R^{(k)}) \right\|^2$$

$$e^2(R^{(k)}, T^{(k)}, \{s_i^{(k)}\}) = \sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k)} P_i^t + T_c + R_c T^{(k)} \right) \right\|^2$$

$$\begin{aligned}
e^2(R^{(k+1)}, T^{(k+1)}, \{s_i^{(k+1)}\}) &= \sum_{i=1}^{n} \left\| s_{ij}^{(k+1)} v_{ij} - \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right\|^2 \\
&= \sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right\|^2 \\
&\quad + \sum_{i=1}^{n} \left( s_{ij}^{(k)} - s_{ij}^{(k+1)} \right) \left[ 2 v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} - s_{ij}^{(k)} v_{ij} \right) + \left( s_{ij}^{(k)} - s_{ij}^{(k+1)} \right) \| v_{ij} \|^2 \right] \\
&= \sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right\|^2 \\
&\quad + \sum_{i=1}^{n} \left( s_{ij}^{(k)} - s_{ij}^{(k+1)} \right) \left[ 2 v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} - s_{ij}^{(k)} v_{ij} \right) + s_{ij}^{(k)} - s_{ij}^{(k+1)} \right] \\
&= \sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right\|^2 \\
&\quad + \sum_{i=1}^{n} \left\{ \left( s_{ij}^{(k+1)} \right)^2 - 2 s_{ij}^{(k+1)} v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) + 2 s_{ij}^{(k)} v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) - \left( s_{ij}^{(k)} \right)^2 \right\} \\
&= \sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right\|^2 \\
&\quad + \sum_{i=1}^{n} \left\{ \left( s_{ij}^{(k+1)} - v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right)^2 - \left( s_{ij}^{(k)} - v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right)^2 \right\}
\end{aligned}$$

$$R^{(k+1)} = \arg\min \sum_{i=1}^{n} \left\| \left( R_c R_j P_i^t + T(R) \right) - s_{ij}(R^{(k)}) \, v_{ij} \right\|^2$$

$$\sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right\|^2 \le \sum_{i=1}^{n} \left\| s_{ij}^{(k)} v_{ij} - \left( R_c R^{(k)} P_i^t + T_c + R_c T^{(k)} \right) \right\|^2 = e^2(R^{(k)}, T^{(k)}, \{s_i^{(k)}\})$$

$$e^2(R^{(k+1)}, T^{(k+1)}, \{s_i^{(k+1)}\}) \le e^2(R^{(k)}, T^{(k)}, \{s_i^{(k)}\}) - \sum_{i=1}^{n} \left( s_{ij}^{(k)} - v_{ij}^T \left( R_c R^{(k+1)} P_i^t + T_c + R_c T^{(k+1)} \right) \right)^2$$

$$e^2(R^{(k+1)}, T^{(k+1)}, \{s_i^{(k+1)}\}) \le e^2(R^{(k)}, T^{(k)}, \{s_i^{(k)}\})$$
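The two-stage iteration (depth/translation update from the projection vectors, then an SVD-based attitude update) and the monotone decrease established above can be sketched for a single camera without uncertainty weighting. This is a hedged sketch with hypothetical helper names; the rotation step uses the Arun-style least-squares absolute-orientation solution named in the text:

```python
import numpy as np

def svd_rotation(obj_pts, cam_pts):
    # Absolute-orientation step: rotation aligning centered object points with
    # centered camera-frame points via SVD, with a det correction for reflections.
    P = obj_pts - obj_pts.mean(axis=0)
    Q = cam_pts - cam_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    return Vt.T @ D @ U.T

def oi_iterate(pts, vs, iters=30):
    # Two-stage loop: given R, solve the optimal T and depths s_i from the
    # projection vectors; then update R by SVD. The object-space error e^2
    # is non-increasing across iterations, matching the convergence proof.
    n = len(pts)
    V = [np.outer(v, v) for v in vs]
    lhs = n * np.eye(3) - sum(V)
    solve_T = lambda R: np.linalg.solve(
        lhs, sum((V[i] - np.eye(3)) @ (R @ pts[i]) for i in range(n)))
    err = lambda R, T: sum(
        np.sum(((np.eye(3) - V[i]) @ (R @ pts[i] + T)) ** 2) for i in range(n))
    R, errors = np.eye(3), []
    for _ in range(iters):
        T = solve_T(R)
        errors.append(err(R, T))
        cam = np.stack([V[i] @ (R @ pts[i] + T) for i in range(n)])  # s_i v_i
        R = svd_rotation(pts, cam)
    T = solve_T(R)
    errors.append(err(R, T))
    return R, T, errors

# Monotone convergence on noise-free synthetic data:
rng = np.random.default_rng(1)
pts = rng.uniform(-1.0, 1.0, (8, 3))
a = np.deg2rad(15.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a), np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([0.1, 0.2, 6.0])
vs = [(R_true @ p + T_true) / np.linalg.norm(R_true @ p + T_true) for p in pts]
R, T, errors = oi_iterate(pts, vs)
print(all(e2 <= e1 + 1e-12 for e1, e2 in zip(errors, errors[1:])))
```

The monotonicity check mirrors the inequality $e^2(R^{(k+1)}, T^{(k+1)}, \{s_i^{(k+1)}\}) \le e^2(R^{(k)}, T^{(k)}, \{s_i^{(k)}\})$: each cycle can only reduce the object-space error, which is the property the global convergence theorem relies on.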
$$\begin{bmatrix} \varphi \\ \theta \\ \psi \end{bmatrix} = \begin{bmatrix} 20 \\ 10 \\ 15 \end{bmatrix}$$

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} 0.8\,\mathrm{m} \\ 0.05\,\mathrm{m} \\ 3\,\mathrm{m} \end{bmatrix}$$

Resolution: 1280 pixel × 1024 pixel; Focal length: 27 mm; Cell size: 5.4 µm × 5.4 µm; Quantization error: 0.3 pixel; Image point extraction error: 0.5 pixel.

$$\begin{cases} pos\_err(\%) = \left\| t_{true} - t_{est} \right\| / \left\| t_{true} \right\| \\ rot\_err(\%) = \left\| R_{true} - R_{est} \right\| / \left\| R_{true} \right\| \end{cases}$$
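The relative error metrics above can be sketched as follows. This is a minimal illustration with a hypothetical helper name; the matrix norm is taken as the Frobenius norm, which is an assumption since the text does not specify it:

```python
import numpy as np

def pose_errors(t_true, t_est, R_true, R_est):
    # Relative position and rotation errors, expressed as percentages:
    # pos_err = ||t_true - t_est|| / ||t_true||, rot_err analogous for R
    # (Frobenius norm assumed for the rotation matrices).
    pos_err = 100.0 * np.linalg.norm(t_true - t_est) / np.linalg.norm(t_true)
    rot_err = 100.0 * np.linalg.norm(R_true - R_est) / np.linalg.norm(R_true)
    return pos_err, rot_err

t_true = np.array([3.0, 0.0, 4.0])   # ||t_true|| = 5
t_est = np.array([3.0, 0.0, 4.5])    # 0.5 absolute position error
pos, rot = pose_errors(t_true, t_est, np.eye(3), np.eye(3))
print(pos, rot)  # → 10.0 0.0
```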