Abstract

Calibration of a camera's intrinsic and extrinsic parameters is a procedure of significance in current imaging-based optical metrology. Improvements in two aspects, feature detection and overall optimization, are investigated here by using an active phase target and statistically constrained bundle adjustment (SCBA). Observations from experiment and simulation show that feature detection can be enhanced by “virtual defocusing” and windowed polynomial fitting when sinusoidal fringe patterns are used as the active phase target. SCBA can be applied to avoid the difficult measurement of the active target. As a typical calibration result in our experiment, the root mean square of the reprojection error is reduced to 0.0067 pixels with the proposed method.
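As a rough illustration of the feature-detection idea summarized above, the following Python sketch combines four-step phase shifting, virtual defocusing by numerical blurring, and windowed polynomial fitting. The function names, the Gaussian blur, the window size, and the first-order (plane) fit are illustrative assumptions for the sake of the example, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of feature detection with an
# active phase target: four-step phase shifting, "virtual defocusing"
# by numerical blurring, and windowed polynomial (plane) fitting to
# find the sub-pixel point where the two phase maps reach target values.
import numpy as np
from scipy.ndimage import gaussian_filter

def four_step_phase(I0, I1, I2, I3):
    """Wrapped phase from four fringe images shifted by pi/2."""
    return np.arctan2(I3 - I1, I0 - I2)

def virtual_defocus(img, sigma=2.0):
    """Blur a captured image numerically to suppress the LCD pixel grid
    (an assumed Gaussian stand-in for optical defocusing)."""
    return gaussian_filter(img, sigma)

def locate_feature(phi_x, phi_y, target, guess, half_win=7):
    """Refine one feature point to sub-pixel precision.

    phi_x, phi_y : unwrapped phase maps encoding screen x and y.
    target       : (phi_x0, phi_y0) phase pair that defines the feature.
    guess        : (row, col) integer pixel near the feature.
    """
    r0, c0 = guess
    rs = np.arange(r0 - half_win, r0 + half_win + 1)
    cs = np.arange(c0 - half_win, c0 + half_win + 1)
    C, R = np.meshgrid(cs, rs)                    # pixel coordinates
    A = np.column_stack([np.ones(R.size), C.ravel(), R.ravel()])
    # First-order polynomial fit phi ~ a + b*u + c*v inside the window.
    ax, _, _, _ = np.linalg.lstsq(A, phi_x[np.ix_(rs, cs)].ravel(), rcond=None)
    ay, _, _, _ = np.linalg.lstsq(A, phi_y[np.ix_(rs, cs)].ravel(), rcond=None)
    # Intersect the two fitted planes with the target phase values.
    M = np.array([[ax[1], ax[2]], [ay[1], ay[2]]])
    b = np.array([target[0] - ax[0], target[1] - ay[0]])
    u, v = np.linalg.solve(M, b)
    return v, u                                   # sub-pixel (row, col)
```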

© 2013 Optical Society of America






Figures (6)

Fig. 1.

Sinusoidal fringe patterns displayed on an LCD screen can be used as active targets for camera calibration.

Fig. 2.

The in-focus pixel grid of the screen can be removed by using the virtual defocusing (VD) method.

Fig. 3.

The VD method improves the precision of the phase measurement: phase-error distributions without VD (left) and with VD (right), and the error histogram (middle).

Fig. 4.

Windowed polynomial fitting greatly enhances feature detection.

Fig. 5.

With the enhanced feature detection, the reprojection errors can be reduced to 0.025 pixels, or even further, to 0.0067 pixels, if SCBA is applied.

Fig. 6.

An experiment is carried out to calibrate a camera with the proposed method: (a) the screen poses and (b) the distribution of the reprojection errors.

Equations (2)


$$[\mathbf{P}^{*},\,\mathbf{T}_{n}^{*},\,\mathbf{X}_{wm}^{*}] = \mathop{\arg\min}_{(\mathbf{P},\,\mathbf{T}_{n},\,\mathbf{X}_{wm})} \sum_{n=1}^{N} \sum_{m=1}^{M} \left\| \mathbf{m}_{nm} - f(\mathbf{P},\mathbf{T}_{n},\mathbf{X}_{wm}) \right\|^{2},$$

$$\sum_{m=1}^{M} \mathbf{X}_{wm}^{*} = \sum_{m=1}^{M} \mathbf{X}_{wm}, \quad n_{x}^{*} = 0, \quad n_{y}^{*} = 0, \quad \sum_{m=1}^{M} \left( x_{wm}^{*} y_{wm} - x_{wm} y_{wm}^{*} \right) \approx 0, \quad \sum_{m=1}^{M} \left\| \mathbf{X}_{wm}^{*} - \mathbf{X}_{wm} \right\|^{2} \approx 0.$$
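As a rough illustration of how the objective in the first equation and the statistical constraints in the second might be combined in practice, the sketch below stacks the reprojection residuals with weighted penalty terms and hands them to a generic nonlinear least-squares solver. The projection model, parameter packing, and penalty weight are assumptions made for the example; this is not the authors' implementation.

```python
# Sketch only: enforce the statistical constraints as weighted penalties
# appended to the reprojection residuals, then minimize with a generic
# nonlinear least-squares solver. `project` and `unpack` are assumed
# helpers (pinhole projection with distortion, parameter-vector packing).
import numpy as np
from scipy.optimize import least_squares

def scba_residuals(params, obs, X_nominal, unpack, project, w=1e3):
    """obs[n][m]: observed image point of target point m in view n."""
    P, T, Xw = unpack(params)            # intrinsics, poses, target points
    res = []
    for n, Tn in enumerate(T):           # reprojection term (first equation)
        for m, Xm in enumerate(Xw):
            res.extend(obs[n][m] - project(P, Tn, Xm))
    # Statistical constraints (second equation) as penalties; the plane-
    # normal conditions n_x* = n_y* = 0 are omitted here for brevity.
    d = Xw - X_nominal
    res.extend(w * d.sum(axis=0))                         # centroid preserved
    res.append(w * np.sum(Xw[:, 0] * X_nominal[:, 1]
                          - X_nominal[:, 0] * Xw[:, 1]))  # no net rotation
    res.append(w * np.sum(d ** 2))                        # small deviation
    return np.asarray(res)

# Usage, assuming x0, obs, X_nominal, unpack, and project already exist:
# sol = least_squares(scba_residuals, x0,
#                     args=(obs, X_nominal, unpack, project))
```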
