Abstract

Camera-based optical metrology relies crucially on a proper identification of the intrinsic (optical center position) and extrinsic (camera position) parameters of the camera used. A novel approach for processing phase data from multiple views of a target grid is presented, allowing these identifications within a pinhole camera model. First, the homography associated with the perspective distortion of each grid image is accurately identified using a phase-locking loop. Then, the affine transform for each view is determined under the constraint of an identical set of intrinsic camera parameters. This requires slightly adjusting each homography. As the data are highly redundant, a criterion has to be chosen to optimize the result. The chosen criterion is the simultaneous minimization of the standard deviation of the in-plane grid line displacements between the acquired grid images and the reconstructed ones. Experimental results demonstrate the efficiency of the method.
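To make the optimization criterion concrete, the following minimal Python sketch (an illustration only, not the authors' implementation; the function names and the pitch value are hypothetical) evaluates the quantity that is minimized: the standard deviation of the in-plane grid-line displacement residuals between acquired and reconstructed phase maps, accumulated over both in-plane directions and all views. It assumes the usual grid-method relation u = p·δφ/(2π), where p is the grid pitch.

```python
import numpy as np

def residual_std(phase_meas, phase_rec, pitch):
    """Standard deviation of grid-line displacement residuals (one view, one direction).

    phase_meas, phase_rec : 2-D arrays of measured and reconstructed grid phases [rad]
    pitch                 : grid pitch, in the length unit of the returned displacements
    """
    # Grid-method relation: in-plane displacement = pitch * phase difference / (2*pi)
    u = pitch * (phase_meas - phase_rec) / (2.0 * np.pi)
    return np.std(u)

def calibration_cost(views, pitch):
    """Sum of displacement-residual standard deviations over all views and both directions.

    views : list of dicts with keys 'phix_meas', 'phix_rec', 'phiy_meas', 'phiy_rec'
    """
    cost = 0.0
    for v in views:
        cost += residual_std(v['phix_meas'], v['phix_rec'], pitch)
        cost += residual_std(v['phiy_meas'], v['phiy_rec'], pitch)
    return cost
```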

© 2017 Optical Society of America






Figures (7)

Fig. 1. Example of an acquired grid image. The corresponding field of view is around 60–70 mm wide.

Fig. 2. Projection of a target grid onto a camera sensor. The sensor origin O (distinct from P) is not indicated, and d is the distance PC.

Fig. 3. Projection center lies on the sphere with diameter XY.

Fig. 4. Minimal circle (in blue, w = 45 deg) and another circle defined by the vanishing points with w = 65 deg, corresponding to an in-plane rotation of 20 deg.

Fig. 5. Subtraction fields between the experimental and reconstructed grid line positions (x left and y right) corresponding to the acquired image shown in Fig. 1. The gray-level scale (black to white) is [−15 deg, +15 deg] (phase) or [−35 μm, +35 μm] (grid line displacement).

Fig. 6. Minimal circles and radical axes of the identified homographies, at three zoom levels. The sensor edges are represented with dashed lines. The upper-left subfigure shows all the homography radical circles and axes; the sensor is the barely visible rectangle in the center. The upper-right subfigure is a zoomed view of the sensor area. Popt is the point that lies at a minimum distance from all the radical axes (see the least-squares sketch after the figure captions). The bottom view is a closer zoom showing the projections of Popt onto all the radical axes.

Fig. 7. Closeup view of the initial image (left) and after addition of Gaussian noise (σ = 30 gray levels, right).
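The caption of Fig. 6 describes Popt as the point lying at minimum distance from all the radical axes. A minimal least-squares sketch of that computation is given below (illustrative only; the line data are hypothetical). Each radical axis is written in normal form n·x = c with ‖n‖ = 1, so the squared distance from a point x to an axis is (n·x − c)² and the optimum solves a small normal-equation system.

```python
import numpy as np

def closest_point_to_lines(normals, offsets):
    """Point minimizing the sum of squared distances to a set of 2-D lines.

    Line i is given in normal form normals[i] . x = offsets[i], with |normals[i]| = 1.
    Returns the least-squares concurrency point (2-vector).
    """
    N = np.asarray(normals, dtype=float)   # shape (n_lines, 2), unit normals
    c = np.asarray(offsets, dtype=float)   # shape (n_lines,)
    # Minimize sum_i (N[i] @ x - c[i])^2  ->  normal equations (N^T N) x = N^T c
    return np.linalg.solve(N.T @ N, N.T @ c)

# Example: three nearly concurrent lines
normals = [[1.0, 0.0], [0.0, 1.0], [np.sqrt(0.5), np.sqrt(0.5)]]
offsets = [1.0, 2.0, np.sqrt(0.5) * 3.1]   # exact concurrency would need 3.0
p_opt = closest_point_to_lines(normals, offsets)
print(p_opt)   # close to (1, 2)
```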

Tables (3)

Table 1. Standard Deviations of Subtractions Between Reconstructed and Experimental Grid Line Positions, Equivalently Expressed as Phases δϕx and δϕy or Displacements ux and uy

Table 2. Location (x,y) of the Optimal Concurrency Point Popt and Its Projections onto the Radical Axes, from Which the Corresponding Image Distances Cz Can Be Calculated

Table 3. Standard Deviations of Subtractions Between Reconstructed and Experimental Grid Line Positions, Equivalently Expressed as Phases δϕx and δϕy or Displacements ux and uy, Using Homographies Adjusted to Derive from a Single Projection Center

Equations (29)


(1)  \( \overrightarrow{CM}\,' = \mathsf{R}\left(\overrightarrow{CM}\right) + \mathbf{t}. \)

(2)  \( \begin{pmatrix} x_{M'} \\ y_{M'} \\ z_{M'} \\ 1 \end{pmatrix} = \begin{pmatrix} r_{00} & r_{01} & r_{02} & t_x \\ r_{10} & r_{11} & r_{12} & t_y \\ r_{20} & r_{21} & r_{22} & t_z \\ 0 & 0 & 0 & 1 \end{pmatrix} \cdot \begin{pmatrix} x_M \\ y_M \\ 0 \\ 1 \end{pmatrix}, \)

(3)  \( \begin{pmatrix} s\,x_m \\ s\,y_m \\ s \end{pmatrix} = \begin{pmatrix} r_{00} & r_{01} & t_x \\ r_{10} & r_{11} & t_y \\ r_{20}/d & r_{21}/d & t_z/d \end{pmatrix} \cdot \begin{pmatrix} x_M \\ y_M \\ 1 \end{pmatrix}. \)
(4)  \( \begin{cases} x' = \dfrac{a x + b y + c}{g x + h y + k}, \\[1ex] y' = \dfrac{d x + e y + f}{g x + h y + k}, \end{cases} \)

(5)  \( \mathsf{H} = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & k \end{pmatrix}. \)

(6)  \( \mathsf{H} = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & 1 \end{pmatrix}. \)

(7)  \( \mathsf{H} = \begin{pmatrix} a & b & 0 \\ d & e & 0 \\ g & h & 1 \end{pmatrix}. \)
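As a concrete illustration of Eqs. (3)-(6), the sketch below (illustrative only, not the authors' implementation; the pose values are hypothetical) builds the 3x3 homography of Eq. (3) from an assumed rotation R, translation t, and distance d, normalizes its homogeneous scale as in Eq. (6), and maps planar target points to sensor coordinates with the rational expressions of Eq. (4).

```python
import numpy as np

def homography_from_pose(R, t, d):
    """3x3 homography of Eq. (3): planar target (z_M = 0) -> sensor coordinates."""
    H = np.array([
        [R[0, 0],     R[0, 1],     t[0]],
        [R[1, 0],     R[1, 1],     t[1]],
        [R[2, 0] / d, R[2, 1] / d, t[2] / d],
    ])
    return H / H[2, 2]   # fix the homogeneous scale so that k = 1, as in Eq. (6)

def apply_homography(H, xy):
    """Map target points (n, 2) to image points (n, 2) with Eq. (4)."""
    x, y = xy[:, 0], xy[:, 1]
    a, b, c = H[0]
    d, e, f = H[1]
    g, h, k = H[2]
    denom = g * x + h * y + k
    return np.column_stack(((a * x + b * y + c) / denom,
                            (d * x + e * y + f) / denom))

# Example: target tilted by 10 deg about the x axis, 0.5 m in front of the camera
theta = np.deg2rad(10.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])
t = np.array([0.0, 0.0, 0.5])
H = homography_from_pose(R, t, d=0.5)
grid_nodes = np.array([[0.0, 0.0], [0.01, 0.0], [0.0, 0.01]])   # target coordinates
print(apply_homography(H, grid_nodes))
```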
(8)  \( \begin{cases} z_{tw} = t\,e^{iw} = g + ih, \\[0.5ex] z_{ru} = r\,e^{iu} = \dfrac{a - ib}{g - ih}, \\[0.5ex] z_{sv} = s\,e^{iv} = \dfrac{d - ie}{g - ih}, \end{cases} \)

(9)  \( \begin{cases} a + ib = z_{tw}\,\bar{z}_{ru}, \\ d + ie = z_{tw}\,\bar{z}_{sv}, \\ g + ih = z_{tw}, \end{cases} \)

(10)  \( \begin{pmatrix} a & b & 0 \\ d & e & 0 \\ g & h & 1 \end{pmatrix} = \begin{pmatrix} rt\cos(w-u) & rt\sin(w-u) & 0 \\ st\cos(w-v) & st\sin(w-v) & 0 \\ t\cos(w) & t\sin(w) & 1 \end{pmatrix}. \)

(11)  \( \begin{cases} x' = \dfrac{\Re[(a+ib)(x-iy)]}{1+\Re[(g+ih)(x-iy)]} = \dfrac{\Re(z_{tw}\bar{z}_{ru}\bar{z})}{1+\Re(z_{tw}\bar{z})}, \\[1.5ex] y' = \dfrac{\Re[(d+ie)(x-iy)]}{1+\Re[(g+ih)(x-iy)]} = \dfrac{\Re(z_{tw}\bar{z}_{sv}\bar{z})}{1+\Re(z_{tw}\bar{z})}, \end{cases} \)
(12)  \( \begin{cases} \mathbf{X} = \begin{pmatrix} a/g \\ d/g \end{pmatrix} = \begin{pmatrix} \dfrac{rt\cos(w-u)}{t\cos(w)} \\[1.5ex] \dfrac{st\cos(w-v)}{t\cos(w)} \end{pmatrix} = \begin{pmatrix} r\cos(u) + r\sin(u)\tan(w) \\ s\cos(v) + s\sin(v)\tan(w) \end{pmatrix} = \overrightarrow{O\Gamma} + \mathbf{R}\tan(w), \\[3ex] \mathbf{Y} = \begin{pmatrix} b/h \\ e/h \end{pmatrix} = \begin{pmatrix} \dfrac{rt\sin(w-u)}{t\sin(w)} \\[1.5ex] \dfrac{st\sin(w-v)}{t\sin(w)} \end{pmatrix} = \begin{pmatrix} r\cos(u) - r\sin(u)\cot(w) \\ s\cos(v) - s\sin(v)\cot(w) \end{pmatrix} = \overrightarrow{O\Gamma} - \mathbf{R}\cot(w), \end{cases} \)

(13)  \( \begin{cases} \overrightarrow{O\Gamma} = \begin{pmatrix} r\cos(u) \\ s\cos(v) \end{pmatrix} = \Re\begin{pmatrix} z_{ru} \\ z_{sv} \end{pmatrix}, \\[1.5ex] \mathbf{R} = \begin{pmatrix} r\sin(u) \\ s\sin(v) \end{pmatrix} = \Im\begin{pmatrix} z_{ru} \\ z_{sv} \end{pmatrix}, \end{cases} \)

(14)  \( \overrightarrow{YX} = \mathbf{R}\,[\tan(w) + \cot(w)] = \dfrac{2}{\sin(2w)}\,\mathbf{R}. \)

(15)  \( \begin{cases} \mathbf{X}_{\min} = \overrightarrow{O\Gamma} + \mathbf{R}, \\ \mathbf{Y}_{\min} = \overrightarrow{O\Gamma} - \mathbf{R}. \end{cases} \)
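The complex parametrization of Eqs. (8)-(15) can be evaluated directly from identified coefficients (a, b, d, e, g, h). The following sketch (illustrative; the coefficient values are hypothetical) computes z_tw, z_ru, z_sv, the vanishing points X and Y, the vectors OΓ and R, and hence the minimal circle of Eq. (15), centered at Γ with radius |R|.

```python
import numpy as np

def minimal_circle(a, b, d, e, g, h):
    """Vanishing points and minimal circle from homography coefficients, Eqs. (8)-(15)."""
    z_tw = g + 1j * h                        # Eq. (8)
    z_ru = (a - 1j * b) / (g - 1j * h)
    z_sv = (d - 1j * e) / (g - 1j * h)

    X = np.array([a / g, d / g])             # Eq. (12): vanishing points
    Y = np.array([b / h, e / h])

    OG = np.array([z_ru.real, z_sv.real])    # Eq. (13): vector O-Gamma (circle center)
    R = np.array([z_ru.imag, z_sv.imag])     # Eq. (13): vector R (circle radius)

    return dict(z_tw=z_tw, z_ru=z_ru, z_sv=z_sv, X=X, Y=Y,
                center=OG, radius=np.linalg.norm(R),
                X_min=OG + R, Y_min=OG - R)  # Eq. (15): ends of the minimal-circle diameter

# Hypothetical coefficients of a mildly perspective homography
out = minimal_circle(a=1.00, b=0.02, d=-0.03, e=0.98, g=1e-4, h=-2e-4)
print(out["center"], out["radius"])
```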
(16)  \( \begin{cases} x'(x,y) \simeq X_x x + X_y y + X_{x^2} x^2 + X_{xy} xy + X_{y^2} y^2 + X_{x^3} x^3 + X_{x^2 y} x^2 y + X_{x y^2} x y^2 + X_{y^3} y^3, \\ y'(x,y) \simeq Y_x x + Y_y y + Y_{x^2} x^2 + Y_{xy} xy + Y_{y^2} y^2 + Y_{x^3} x^3 + Y_{x^2 y} x^2 y + Y_{x y^2} x y^2 + Y_{y^3} y^3. \end{cases} \)

(17)  \( \begin{cases} (X_x, X_y, X_{x^2}, X_{xy}, X_{y^2}, X_{x^3}, X_{x^2 y}, X_{x y^2}, X_{y^3}) = (a,\ b,\ -ag,\ -ah-bg,\ -bh,\ ag^2,\ 2agh+bg^2,\ ah^2+2bgh,\ bh^2), \\ (Y_x, Y_y, Y_{x^2}, Y_{xy}, Y_{y^2}, Y_{x^3}, Y_{x^2 y}, Y_{x y^2}, Y_{y^3}) = (d,\ e,\ -dg,\ -dh-eg,\ -eh,\ dg^2,\ 2dgh+eg^2,\ dh^2+2egh,\ eh^2). \end{cases} \)
(18)  \( \begin{cases} a = X^0_x, \\ b = X^0_y, \\ d = Y^0_x, \\ e = Y^0_y, \\ g = -X^0_{x^2}/a \ \text{ or } \ -Y^0_{x^2}/d \ \text{ or } \ 0, \\ h = -X^0_{y^2}/b \ \text{ or } \ -Y^0_{y^2}/e \ \text{ or } \ 0. \end{cases} \)

(19)  \( \delta\Omega = \begin{pmatrix} \delta X_x \\ \delta X_y \\ \delta X_{x^2} \\ \delta X_{xy} \\ \delta X_{y^2} \\ \delta X_{x^3} \\ \delta X_{x^2 y} \\ \delta X_{x y^2} \\ \delta X_{y^3} \\ \delta Y_x \\ \delta Y_y \\ \delta Y_{x^2} \\ \delta Y_{xy} \\ \delta Y_{y^2} \\ \delta Y_{x^3} \\ \delta Y_{x^2 y} \\ \delta Y_{x y^2} \\ \delta Y_{y^3} \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ -g & 0 & 0 & 0 & -a & 0 \\ -h & -g & 0 & 0 & -b & -a \\ 0 & -h & 0 & 0 & 0 & -b \\ g^2 & 0 & 0 & 0 & 2ag & 0 \\ 2gh & g^2 & 0 & 0 & 2(ah+bg) & 2ag \\ h^2 & 2gh & 0 & 0 & 2bh & 2(ah+bg) \\ 0 & h^2 & 0 & 0 & 0 & 2bh \\ 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & -g & 0 & -d & 0 \\ 0 & 0 & -h & -g & -e & -d \\ 0 & 0 & 0 & -h & 0 & -e \\ 0 & 0 & g^2 & 0 & 2dg & 0 \\ 0 & 0 & 2gh & g^2 & 2(dh+eg) & 2dg \\ 0 & 0 & h^2 & 2gh & 2eh & 2(dh+eg) \\ 0 & 0 & 0 & h^2 & 0 & 2eh \end{pmatrix} \cdot \begin{pmatrix} \delta a \\ \delta b \\ \delta d \\ \delta e \\ \delta g \\ \delta h \end{pmatrix}. \)

(20)  \( (\delta a, \delta b, \delta d, \delta e, \delta g, \delta h)^t = \mathsf{S}_{\mathrm{p\_inv}} \cdot \delta\Omega, \)
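A sketch of the linearized identification of Eqs. (17), (19), and (20) is given below (illustrative only; using numpy.linalg.pinv for S_p_inv is an assumption, and the coefficient values are hypothetical). It builds the 18 polynomial coefficients from (a, b, d, e, g, h), assembles the 18x6 sensitivity matrix of Eq. (19), and recovers small parameter corrections from a coefficient perturbation δΩ through the pseudoinverse of Eq. (20).

```python
import numpy as np

def coeffs(a, b, d, e, g, h):
    """18 polynomial coefficients (X..., Y...) of Eq. (17)."""
    X = [a, b, -a * g, -a * h - b * g, -b * h,
         a * g**2, 2 * a * g * h + b * g**2, a * h**2 + 2 * b * g * h, b * h**2]
    Y = [d, e, -d * g, -d * h - e * g, -e * h,
         d * g**2, 2 * d * g * h + e * g**2, d * h**2 + 2 * e * g * h, e * h**2]
    return np.array(X + Y)

def sensitivity(a, b, d, e, g, h):
    """18x6 sensitivity matrix of Eq. (19): derivatives of the coefficients."""
    def block(p, q):   # one direction; columns ordered (p, q, g, h)
        return np.array([
            [1,         0,         0,                   0],
            [0,         1,         0,                   0],
            [-g,        0,         -p,                  0],
            [-h,        -g,        -q,                  -p],
            [0,         -h,        0,                   -q],
            [g**2,      0,         2 * p * g,           0],
            [2 * g * h, g**2,      2 * (p * h + q * g), 2 * p * g],
            [h**2,      2 * g * h, 2 * q * h,           2 * (p * h + q * g)],
            [0,         h**2,      0,                   2 * q * h],
        ])
    S = np.zeros((18, 6))
    S[:9, [0, 1, 4, 5]] = block(a, b)   # X coefficients depend on (a, b, g, h)
    S[9:, [2, 3, 4, 5]] = block(d, e)   # Y coefficients depend on (d, e, g, h)
    return S

# Recover small parameter corrections from perturbed coefficients, Eq. (20)
a, b, d, e, g, h = 1.0, 0.02, -0.03, 0.98, 1e-4, -2e-4
delta = np.array([1e-6, -2e-6, 3e-6, 1e-6, 2e-7, -1e-7])   # (da, db, dd, de, dg, dh)
d_omega = coeffs(*(np.array([a, b, d, e, g, h]) + delta)) - coeffs(a, b, d, e, g, h)
recovered = np.linalg.pinv(sensitivity(a, b, d, e, g, h)) @ d_omega
print(recovered)   # approximately equal to delta, up to linearization error
```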
(21)  \( \begin{cases} z_{ru} \rightarrow \dfrac{e^{i\alpha}}{k}\, z_{ru}, \\[1ex] z_{sv} \rightarrow \dfrac{e^{i\alpha}}{k}\, z_{sv}, \\[1ex] z_{tw} \rightarrow k\, e^{i\alpha}\, z_{tw}. \end{cases} \)

(22)  \( \Re\!\left(\dfrac{e^{i\alpha}}{k} z_{ru}\right)\Im\!\left(\dfrac{e^{i\alpha}}{k} z_{ru}\right) + \Re\!\left(\dfrac{e^{i\alpha}}{k} z_{sv}\right)\Im\!\left(\dfrac{e^{i\alpha}}{k} z_{sv}\right) = 0. \)

(23)  \( \Im\!\left(z_{rusv}^2\, e^{2i\alpha}\right) = 0, \)

(24)  \( z_{rusv} = \sqrt{z_{ru}^2 + z_{sv}^2}, \)

(25)  \( \alpha = -\arg(z_{rusv}) + m\pi/2, \quad m \in \mathbb{Z}, \)
(26)  \( C_z^2 = \mathbf{R}^2 - \overrightarrow{O\Gamma}^{\,2}. \)

(27)  \( C_z^2 = \Im\!\left(\dfrac{z_{rusv}\, e^{i\alpha}}{k}\right)^{\!2} - \Re\!\left(\dfrac{z_{rusv}\, e^{i\alpha}}{k}\right)^{\!2} = -\Re\!\left(\dfrac{z_{rusv}^2\, e^{2i\alpha}}{k^2}\right) = \dfrac{|z_{rusv}|^2}{k^2}. \)

(28)  \( k = \dfrac{|z_{rusv}|}{C_z}, \)

(29)  \( k\, e^{i\alpha} = \dfrac{i^m\, \bar{z}_{rusv}}{C_z}, \quad m \in \mathbb{Z}. \)
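Finally, the homography adjustment of Eqs. (21)-(29) can be sketched as follows (illustrative only; the parameter values and the prescribed image distance C_z are hypothetical). Given the complex parameters of one view, it computes z_rusv, the rotation α, and the scale k, then returns the adjusted parameters; the check at the end verifies the relation of Eq. (27).

```python
import numpy as np

def adjust_homography(z_ru, z_sv, z_tw, C_z, m=1):
    """Adjust one view's complex parameters to match a prescribed image distance C_z.

    Follows Eqs. (24), (25), (28) and applies the substitution of Eq. (21);
    the integer m selects one of the solutions of Eq. (25).
    """
    z_rusv = np.sqrt(z_ru**2 + z_sv**2)            # Eq. (24), complex square root
    alpha = -np.angle(z_rusv) + m * np.pi / 2      # Eq. (25)
    k = abs(z_rusv) / C_z                          # Eq. (28)
    factor = np.exp(1j * alpha) / k                # factor applied to z_ru, z_sv in Eq. (21)
    return factor * z_ru, factor * z_sv, k * np.exp(1j * alpha) * z_tw

# Hypothetical complex parameters of one view (same coefficients as the earlier sketch)
z_tw = 1e-4 - 2e-4j
z_ru = (1.00 - 0.02j) / np.conj(z_tw)
z_sv = (-0.03 - 0.98j) / np.conj(z_tw)
z_ru2, z_sv2, z_tw2 = adjust_homography(z_ru, z_sv, z_tw, C_z=5000.0, m=1)

# Check Eq. (27): for the adjusted parameters, Im^2 - Re^2 of z_rusv equals C_z^2
z_rusv2 = np.sqrt(z_ru2**2 + z_sv2**2)
print(np.sqrt(z_rusv2.imag**2 - z_rusv2.real**2))   # close to 5000
```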
