Abstract

A structured-light system using the binary defocusing technique has the potential for broader application because of its high speed, its freedom from gamma calibration, and its lack of a rigid synchronization requirement between the camera and the projector. However, existing calibration methods fail to achieve high accuracy for a structured-light system with an out-of-focus projector. This paper proposes a method that accurately calibrates a structured-light system even when the projector is out of focus, enabling high-accuracy, high-speed measurement with the binary defocusing method. Experiments demonstrate that the proposed calibration approach performs consistently under different degrees of defocus, and that a root-mean-square error of about 73 μm can be achieved within a calibration volume of 150 (H) mm × 250 (W) mm × 200 (D) mm.

© 2014 Optical Society of America

Supplementary Material (1)

Media 1: MOV (1388 KB)


Figures (17)

Fig. 1. Pinhole model of the camera.

Fig. 2. Design of the calibration board.

Fig. 3. Defocused optical system with the parameter ω.

Fig. 4. Illustration of the optical transfer function (OTF) for a defocused system. (a) Example of the OTF with defocusing parameter ω = 2λ. (b) Cross sections of OTFs with different defocusing parameters ω.

Fig. 5. Illustration of the point spread function (PSF) for a defocused system. (a) Example of the normalized PSF with defocusing parameter ω = 2λ. (b) Cross sections of normalized PSFs with different defocusing parameters ω.

Fig. 6. Model of a structured-light system with an out-of-focus projector.

Fig. 7. Pinhole model of the structured-light system.

Fig. 8. Photograph of the dual-camera structured-light system.

Fig. 9. Pixel geometry of the structured-light system devices. (a) DMD projector pixel geometry. (b) CMOS camera pixel geometry.

Fig. 10. Examples of captured images. (a) Captured fringe image with horizontal pattern projection. (b) Captured fringe image with vertical pattern projection. (c) Captured image with pure white projection.

Fig. 11. Example of finding circle centers for the camera and the projector. (a) One calibration pose. (b) Circle centers extracted from (a). (c) Mapped image for the projector from (b).

Fig. 12. Reprojection error caused by nonlinear distortion: (a) error for the camera and (b) error for the projector.

Fig. 13. Absolute phase retrieval using the three-frequency phase-unwrapping algorithm. (a) Picture of the spherical object. (b) Wrapped phase map from patterns with fringe period T = 18 pixels. (c) Wrapped phase map from patterns with fringe period T = 21 pixels. (d) Wrapped phase map from patterns with fringe period T = 154 pixels. (e) Unwrapped phase map obtained with the three-frequency temporal phase-unwrapping algorithm.

Fig. 14. Illustration of three different defocusing degrees. (a) Captured fringe image under defocusing degree 1 (projector in focus). (b) Captured fringe image under defocusing degree 2 (projector slightly defocused). (c) Captured fringe image under defocusing degree 3 (projector greatly defocused). (d)–(f) Corresponding intensity cross sections of (a)–(c).

Fig. 15. Measurement results of a spherical surface under three different defocusing degrees; the rms errors estimated on (d), (h), and (l) are ±71, ±77, and ±73 μm, respectively. (a) Captured fringe image under defocusing degree 1 (projector in focus). (b) Reconstructed 3D result under defocusing degree 1. (c) Cross section of the 3D result and the ideal circle under defocusing degree 1. (d) Error estimated from (b). (e)–(h) Corresponding figures of (a)–(d) under defocusing degree 2 (projector slightly defocused). (i)–(l) Corresponding figures of (a)–(d) under defocusing degree 3 (projector greatly defocused).

Fig. 16. Illustration of the measured diagonals on the calibration board.

Fig. 17. Real-time 3D shape measurement result. (a) One captured fringe image. (b)–(d) Three frames of the recorded video (Media 1).
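
Figure 13 illustrates temporal phase unwrapping with three fringe periods (T = 18, 21, and 154 pixels). A common way to implement one step of such an algorithm, sketched below as a generic example rather than the authors' exact multifrequency procedure, is to let the coarser, already-unwrapped phase determine the fringe order of the finer wrapped phase.

import numpy as np

def unwrap_with_reference(phi_fine, Phi_coarse, period_ratio):
    # phi_fine: wrapped phase of the finer pattern, in (-pi, pi].
    # Phi_coarse: unwrapped phase of a coarser pattern; period_ratio = T_coarse / T_fine.
    # The scaled coarse phase predicts the fine phase up to noise, so rounding the
    # difference to the nearest multiple of 2*pi recovers the fringe order k.
    k = np.round((Phi_coarse * period_ratio - phi_fine) / (2 * np.pi))
    return phi_fine + 2 * np.pi * k

# Applied hierarchically from the coarsest to the finest fringe period (illustrative only).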

Tables (1)

Table 1. Measurement Result of Two Diagonals on Calibration Board

Equations (31)

I_k(x, y) = I'(x, y) + I''(x, y) \cos[\phi(x, y) + 2k\pi/N],
\phi(x, y) = \tan^{-1}\left[ \frac{\sum_{k=1}^{N} I_k \sin(2k\pi/N)}{\sum_{k=1}^{N} I_k \cos(2k\pi/N)} \right].
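
As a numerical illustration of the two phase-shifting relations above, the wrapped phase can be computed with an arctangent over the N captured fringe images. The sketch below is a minimal example only; the array layout and the shift-index convention (k = 1..N) are assumptions, not the authors' code.

import numpy as np

def wrapped_phase(images):
    # images: array of shape (N, H, W); frame k-1 carries the phase shift 2*k*pi/N.
    N = images.shape[0]
    k = np.arange(1, N + 1).reshape(-1, 1, 1)
    num = np.sum(images * np.sin(2 * np.pi * k / N), axis=0)
    den = np.sum(images * np.cos(2 * np.pi * k / N), axis=0)
    # Wrapped phase in (-pi, pi], per the arctangent relation above
    return np.arctan2(num, den)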
s^c I^c = A^c [R^c, t^c] X^w.
A^c = \begin{bmatrix} \alpha & \gamma & u_0^c \\ 0 & \beta & v_0^c \\ 0 & 0 & 1 \end{bmatrix},
\mathrm{Dist}^c = [k_1 \;\; k_2 \;\; p_1 \;\; p_2 \;\; k_3]^T,
u'^c = u^c (1 + k_1 r^2 + k_2 r^4 + k_3 r^6),
v'^c = v^c (1 + k_1 r^2 + k_2 r^4 + k_3 r^6).
u''^c = u'^c + [2 p_1 u v + p_2 (r^2 + 2u^2)],
v''^c = v'^c + [p_1 (r^2 + 2v^2) + 2 p_2 u v].
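
The distortion model above (radial coefficients k1, k2, k3 and tangential coefficients p1, p2) can be applied to ideal normalized camera coordinates as in the minimal sketch below; the function name and argument order follow the five-element distortion vector and are illustrative, not the authors' implementation.

def distort(u, v, dist):
    # dist = [k1, k2, p1, p2, k3]; (u, v) are ideal normalized image coordinates.
    k1, k2, p1, p2, k3 = dist
    r2 = u * u + v * v
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    # Radial distortion followed by the tangential terms, as in the equations above
    u_d = u * radial + 2 * p1 * u * v + p2 * (r2 + 2 * u * u)
    v_d = v * radial + p1 * (r2 + 2 * v * v) + 2 * p2 * u * v
    return u_d, v_d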
i(x, y) = o(x, y) \otimes \mathrm{psf}(x, y).
\mathrm{psf}(x, y) = \left| \frac{1}{2\pi} \iint_{-\infty}^{+\infty} f(u, v)\, e^{i(xu + yv)}\, \mathrm{d}u\, \mathrm{d}v \right|^2 = |F(x, y)|^2,
f(u, v) = \begin{cases} t(u, v)\, e^{j \frac{2\pi}{\lambda} \omega(u, v)}, & \text{for } u^2 + v^2 \le 1, \\ 0, & \text{for } u^2 + v^2 > 1, \end{cases}
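
The convolution relation i(x, y) = o(x, y) ⊗ psf(x, y) is what makes binary defocusing work: blurring a binary fringe with the projector's PSF yields a quasi-sinusoidal pattern. The one-dimensional sketch below uses a Gaussian as a stand-in for the PSF purely for illustration; the equations above model the actual PSF through the pupil function f(u, v).

import numpy as np
from scipy.ndimage import gaussian_filter1d

period = 18                                  # fringe period in projector pixels
x = np.arange(10 * period)
binary = (np.sin(2 * np.pi * x / period) >= 0).astype(float)   # o(x): binary fringe
for sigma in (0.0, 2.0, 6.0):                # increasing amounts of defocus
    # i(x) = o(x) convolved with an assumed Gaussian psf of width sigma
    fringe = gaussian_filter1d(binary, sigma) if sigma > 0 else binary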
I(s_0, t_0) = O(s_0, t_0) \times \mathrm{OTF}(s_0, t_0).
\overline{\mathrm{OTF}}(s_0, t_0) = \frac{\mathrm{OTF}(s_0, t_0)}{\mathrm{OTF}(0, 0)}.
f(u, v) = \begin{cases} e^{j \frac{2\pi}{\lambda} \omega (u^2 + v^2)}, & \text{for } u^2 + v^2 \le 1, \\ 0, & \text{for } u^2 + v^2 > 1, \end{cases}
\mathrm{OTF}(s) = \frac{\iint_{-\infty}^{+\infty} f\left(u + \frac{s}{2}, v\right) f^{*}\left(u - \frac{s}{2}, v\right) \mathrm{d}u\, \mathrm{d}v}{\iint_{-\infty}^{+\infty} |f(u, v)|^2\, \mathrm{d}u\, \mathrm{d}v},
\mathrm{OTF}(s) = 2 J_1(a)/a, \quad a = \frac{4\pi}{\lambda} \omega s,
I^p(u^p, v^p) = I^c(u^c, v^c) \times \frac{\mathrm{OTF}^p(0, 0)}{\mathrm{OTF}^c(0, 0)},
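
The defocused OTF cross section above, OTF(s) = 2 J1(a)/a with a = (4π/λ) ω s, is easy to evaluate numerically. The sketch below uses SciPy's first-order Bessel function; the sample wavelength and defocus values are arbitrary placeholders, not values from the paper.

import numpy as np
from scipy.special import j1   # Bessel function of the first kind, order 1

def defocused_otf(s, omega, wavelength):
    # a -> 0 gives the in-focus limit OTF = 1
    a = 4 * np.pi * omega * np.asarray(s, dtype=float) / wavelength
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(a == 0, 1.0, 2 * j1(a) / a)

# Example cross section for omega = 2*lambda, as illustrated in Fig. 4
s = np.linspace(0.0, 1.0, 200)
otf_curve = defocused_otf(s, omega=2 * 0.5e-6, wavelength=0.5e-6)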
\phi_{va}^c(u^c, v^c) = \phi_{va}^p(v^p) = \phi_{va}.
\phi_{ha}^c(u^c, v^c) = \phi_{ha}^p(u^p) = \phi_{ha},
s^c I^c = A^c [R^c, t^c] X^w,
s^p I^p = A^p [R^p, t^p] X^w.
s^c I^c = A^c [E_3, 0] X^w,
s^p I^p = A^p [R, t] X^w.
v^p = \phi_{va}^c(u^c, v^c) \times P / (2\pi)
u^p = \phi_{ha}^c(u^c, v^c) \times P / (2\pi),
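
With the absolute phase converted to a projector coordinate (v^p from the vertical-fringe phase, or u^p from the horizontal one), each camera pixel and its projector line define a linear system for the world point X^w through the projection equations above. The sketch below is a generic least-squares triangulation of that form; the matrix names and the choice of v^p are assumptions for illustration, not the authors' exact solver.

import numpy as np

def triangulate(Mc, Mp, uc, vc, vp):
    # Mc, Mp: 3x4 projection matrices (A^c[E3, 0] and A^p[R, t]); vp from the absolute phase.
    # Eliminating the scale factors gives one linear equation per measured coordinate.
    A = np.vstack([
        uc * Mc[2] - Mc[0],    # camera u^c
        vc * Mc[2] - Mc[1],    # camera v^c
        vp * Mp[2] - Mp[1],    # projector v^p (vertical-fringe phase line)
    ])
    A3, b = A[:, :3], -A[:, 3]          # homogeneous coordinate of X^w fixed to 1
    xyz, *_ = np.linalg.lstsq(A3, b, rcond=None)
    return xyz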
A^c = \begin{bmatrix} 1698.02 & 0 & 383.062 \\ 0 & 1691.49 & 294.487 \\ 0 & 0 & 1 \end{bmatrix},
A^p = \begin{bmatrix} 1019.05 & 0 & 316.763 \\ 0 & 2014.01 & 841.891 \\ 0 & 0 & 1 \end{bmatrix},
\mathrm{Dist}^c = [0.0905249 \;\; 0.320865 \;\; 0 \;\; 0 \;\; 0]^T.
M^c = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix},
M^p = \begin{bmatrix} 0.952329 & 0.00367422 & 0.305051 & 162.986 \\ 0.0281659 & 0.996716 & 0.0759252 & 146.152 \\ 0.30377 & 0.0808978 & 0.949305 & 95.6518 \end{bmatrix}.
