Abstract

To obtain 3D information of large areas, wide-angle lens cameras are used to keep the number of cameras as small as possible. However, since the images are highly distorted, errors in the point correspondences increase and the recovered 3D information can be erroneous. To increase the amount of data obtained from the images and to improve the 3D information, trinocular sensors are used. In this paper a calibration method for a trinocular sensor formed with wide-angle lens cameras is proposed. First, the pixel locations in the images are corrected using a set of constraints which define the image formation in a trinocular system. Once the pixel locations have been corrected, the lens distortion and the trifocal tensor are computed.

© 2012 OSA
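
The two-stage procedure summarized in the abstract (constraint-based point correction, followed by estimation of the lens distortion and the trifocal tensor) can be pictured with the following minimal Python sketch. All names here (`correct_points`, `constraint_residuals`) are hypothetical illustrations, not the paper's implementation; the actual constraints correspond to the cost terms J_CR, J_ST, J_VP, J_HL and J_TS listed under Equations below.

```python
# Minimal sketch of the two-stage calibration summarized in the abstract.
# The names are hypothetical; the real method minimizes the combined cost J
# defined in the Equations section of the paper.
import numpy as np
from scipy.optimize import least_squares

def correct_points(detected, constraint_residuals):
    """Stage 1: move the detected pixel locations until the geometric and
    trinocular constraint residuals are (approximately) zero.

    detected             -- (N, 2) array of points detected in an image
    constraint_residuals -- callable mapping an (N, 2) point set to the stacked
                            residuals of the constraints (cross ratio,
                            straightness, vanishing points, horizon line, ...)
    """
    detected = np.asarray(detected, dtype=float)
    sol = least_squares(lambda x: constraint_residuals(x.reshape(-1, 2)),
                        detected.ravel())
    return sol.x.reshape(-1, 2)

# Stage 2 (not shown): with the corrected points, fit a lens distortion model
# and estimate the trifocal tensor from the point correspondences.
```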






Figures (3)

Fig. 1

(a) Geometric constraints for correcting the point locations detected in the image. The cross ratio of collinear points is preserved under perspective projection. Points are corrected to lie on straight lines. Parallel lines in the scene meet at a vanishing point in the image, and all vanishing points lie on the horizon line. (b) Trifocal tensor constraints used for pixel location correction in distorted images. The planes that contain the lines in the images intersect at the 3D line of the calibration template. Pixel locations in one image are corrected with reference to their corresponding points in the other two images.
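
The cross-ratio constraint mentioned in panel (a) compares the cross ratio of four consecutive collinear image points with that of the corresponding template points (the J_CR term in the Equations section). A minimal sketch follows, assuming the common distance-ratio convention CR = (|q1q3|·|q2q4|)/(|q2q3|·|q1q4|); the paper's exact convention is not reproduced here.

```python
import numpy as np

def cross_ratio(q1, q2, q3, q4):
    """Cross ratio of four collinear 2D points.

    Uses the convention CR = (|q1q3| * |q2q4|) / (|q2q3| * |q1q4|); other
    orderings exist, so this is only an illustrative choice.
    """
    d = lambda a, b: np.linalg.norm(np.asarray(a, float) - np.asarray(b, float))
    return (d(q1, q3) * d(q2, q4)) / (d(q2, q3) * d(q1, q4))

# Four equally spaced collinear points have cross ratio 4/3:
print(cross_ratio([0, 0], [1, 0], [2, 0], [3, 0]))  # 1.333...
```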

Fig. 2

(a) Blue dots represent points detected in the distorted image of Fig. 3(b). Red dots represent the undistorted points corrected with the proposed method. Pixels are moved so that all projective geometry and trifocal sensor constraints are satisfied. (b) The number of errors in the point correspondences between images (outliers) is varied. Results with the proposed method do not depend on the number of outliers, since the point locations are corrected before the trifocal tensor is computed.

Fig. 3

Results with real data. (a), (b), and (c) show 640×480 images captured with an Axis 212 PTZ camera fitted with a 2.7 mm lens, which gives an 85° field of view. (d), (e), and (f) show the images corrected with the proposed method. White dots are the set of corresponding points used to calibrate the trifocal tensor.

Equations (8)


$$J_{CR} = \sum_{l=1}^{n} \sum_{k=1}^{m-3} \bigl( CR(q_{k,l}, q_{k+1,l}, q_{k+2,l}, q_{k+3,l}) - CR(p_1, p_2, p_3, p_4) \bigr)^2$$

$$J_{ST} = \sum_{l=1}^{n} \sum_{k=1}^{m} d(q_{k,l}, l_l) = \sum_{l=1}^{n} \sum_{k=1}^{m} \frac{\lvert a_l \cdot u_{k,l} + b_l \cdot v_{k,l} + c_l \cdot w_{k,l} \rvert}{\sqrt{a_l^2 + b_l^2}}$$

$$J_{VP} = \sum_{i=1}^{4} \sum_{l=1}^{t} d(q_{vp_i}, l_l) = \sum_{i=1}^{4} \sum_{l=1}^{t} \frac{\lvert a_l \cdot u_{vp_i} + b_l \cdot v_{vp_i} + c_l \cdot w_{vp_i} \rvert}{\sqrt{a_l^2 + b_l^2}}$$

$$J_{HL} = \sum_{i=1}^{4} d(q_{vp_i}, l_h) = \sum_{i=1}^{4} \frac{\lvert a_h \cdot u_{vp_i} + b_h \cdot v_{vp_i} + c_h \cdot w_{vp_i} \rvert}{\sqrt{a_h^2 + b_h^2}}$$

$$\begin{bmatrix} v_l \\ v_i^l \end{bmatrix} \cdot n_i^l = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

$$J_{TS} = \sum_{i=1}^{3} \sum_{l=1}^{s} V_i^l \cdot n_i^l$$

$$J = J_{CR} + J_{ST} + J_{VP} + J_{HL} + J_{TS}$$

$$e_d = \frac{1}{n} \sum_{i=1}^{n} \sqrt{(\bar{u} - u_o)^2 + (\bar{v} - v_o)^2}$$
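
The straightness, vanishing-point, and horizon-line terms above all accumulate point-to-line distances of the form |a·u + b·v + c·w| / √(a² + b²). The sketch below shows that distance for a homogeneous point q = (u, v, w) and a line l = (a, b, c); the function name and example values are chosen only for illustration.

```python
import numpy as np

def point_line_distance(q, l):
    """Distance term |a*u + b*v + c*w| / sqrt(a**2 + b**2) for a homogeneous
    point q = (u, v, w) and a line l = (a, b, c), as used in J_ST, J_VP, J_HL.
    """
    q, l = np.asarray(q, float), np.asarray(l, float)
    return abs(l @ q) / np.hypot(l[0], l[1])

# Example: the point (2, 3) against the line x = 0, i.e. l = (1, 0, 0):
print(point_line_distance([2.0, 3.0, 1.0], [1.0, 0.0, 0.0]))  # 2.0
```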
