Abstract

A 3D shape measurement method for scenes with strong interreflections is presented. Traditional optical 3D shape measurement methods such as fringe projection profilometry (FPP) cannot handle regions that contain strong interreflections, which cause the 3D shape measurement to fail. In the proposed method, epipolar imaging with speckle patterns is used to eliminate the effects of interreflections and obtain an initial 3D shape measurement result. Regional fringe projection based on this initial result is then applied to achieve high-accuracy measurement. Experimental results show that the proposed method can measure regions that contain strong interreflections with high accuracy.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

Figures (15)

Fig. 1 FPP in the presence of interreflections. (a) Typical setup of an FPP system; (b) Fringe patterns with strong glossy interreflections between concave surfaces (blue solid line: direct light path; red dashed line: indirect light path).
Fig. 2 Framework of the proposed method.
Fig. 3 Determining the rows required to perform epipolar imaging. (a) Pixels that satisfy p ∩ r; (b) Pixels that satisfy q ∩ r; (c) Pixels that satisfy (p ∪ q) ∩ r; (d) Mask after morphological operations. The gray band marks the rows required to perform epipolar imaging.
Fig. 4 Epipolar imaging in the FPP system. The projected epipolar line becomes two lines in the captured image due to interreflections, and the line of direct light can be identified by the epipolar constraint.
Fig. 5 Epipolar imaging procedure for a single epipolar line. The original projected pattern $I(x,y)$ is multiplied by the projector mask $M_i(x,y)$ to obtain the masked projected pattern $\tilde{I}_i(x,y)$, and the original captured image $I'_i(x,y)$ is multiplied by the camera mask $M'_i(x,y)$ to obtain the masked image $\tilde{I}'_i(x,y)$, in which interreflections are eliminated.
Fig. 6 Schematic diagram of epipolar rectification and inverse epipolar rectification. I and IV are the image planes after epipolar rectification; II and III are the image planes of the projector and the camera before epipolar rectification, respectively. The original projected pattern is multiplied by the horizontal epipolar mask to obtain the masked projected pattern (I); inverse epipolar rectification is performed to obtain the actual projected pattern (I−II); the projected pattern is captured by the camera (II−III); epipolar rectification makes the projected epipolar line in the captured image horizontal again (III−IV).
Fig. 7 Speckle patterns projected on the concave surfaces. (a) Full-field projected speckle pattern; (b) Epipolar imaging speckle pattern.
Fig. 8 Stereo matching results with different projected speckle patterns. (a) Difference between captured positive and negative speckle patterns; (b) Original projected speckle pattern (0°); (c) Stereo matching result with (b); (d) Right-skewed projected speckle pattern (25°); (e) Left-skewed projected speckle pattern (−25°); (f) Stereo matching result with (b), (d), and (e) simultaneously.
Fig. 9 Regional fringe projection and imaging of a single region. The original fringe pattern $I_\phi(x,y)$ is multiplied by the regional projector mask $M(x,y)$ to obtain the regional masked fringe pattern $\tilde{I}_\phi(x,y)$, and the original captured fringe image $I'_\phi(x,y)$ is multiplied by the regional camera mask $M'(x,y)$ to obtain the regional masked fringe image $\tilde{I}'_\phi(x,y)$.
Fig. 10 Measurement system and measured objects. (a) Measurement system formed by a camera and a projector; (b) Aluminium alloy workpiece with concave surfaces; (c) Ceramic bowl; (d) Etalon formed by two steel gauge blocks.
Fig. 11 3D shape measurement by fringe projection. (a) HDR vertical fringes; (b) Unwrapped phase retrieved from the vertical fringes; (c) Measurement result containing erroneous data caused by interreflections; (d) HDR horizontal fringes; (e) Unwrapped phase retrieved from the horizontal fringes; (f) Measurement result after the erroneous data are excluded using the horizontal fringes.
Fig. 12 Epipolar imaging with speckle patterns. (a) Captured image of a single epipolar line; (b) Fused speckle pattern obtained by epipolar imaging (only the rows that contain interreflections are projected); (c) Measurement result of epipolar imaging (blue region).
Fig. 13 Regional fringe projection and imaging. (a) Camera mask of region 1; (b) Camera mask of region 2; (c) Fused HDR fringe image of two regions; (d) Captured fringe image of region 1; (e) Captured fringe image of region 2; (f) Measurement result of regional fringe projection.
Fig. 14 Measurement results of a ceramic bowl. (a) HDR fringes in FPP; (b) Fused speckle pattern obtained by epipolar imaging (only the rows that contain interreflections are projected); (c) Segmented regions obtained by normal clustering (one gray level represents one region); (d) Measurement result of FPP; (e) Measurement result of epipolar imaging (blue region); (f) Measurement result of regional fringe projection.
Fig. 15 Error distribution of fitting two planes. (a) FPP; (b) Epipolar imaging with speckle patterns; (c) Regional fringe projection.

Tables (1)

Table 1 Fitting plane error comparison (unit: mm)

Equations (20)

$$I_i(x,y)=I_A(x,y)+I_B(x,y)\cos\!\left[\phi(x,y)+\frac{\pi}{2}i\right],\tag{1}$$
$$\phi(x,y)=\arctan\frac{I_3(x,y)-I_1(x,y)}{I_0(x,y)-I_2(x,y)},\tag{2}$$
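As a concrete illustration, here is a minimal NumPy sketch of the four-step phase-shifting relations (1)-(2); the arrays I0–I3 are assumed to be the four captured fringe images with phase shifts of 0, π/2, π, and 3π/2.

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Four-step phase shifting, Eqs. (1)-(2): recover the wrapped phase
    from four fringe images shifted by pi/2 between exposures."""
    # arctan2 keeps the correct quadrant and tolerates I0 == I2
    return np.arctan2(I3 - I1, I0 - I2)
```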
$$\Phi_1(x,y)=\phi_1(x,y)+2\pi\,\mathrm{round}\left\{\frac{1}{2\pi}\left[\frac{\lambda_2}{\lambda_2-\lambda_1}\bigl(\phi_1(x,y)-\phi_2(x,y)\bigr)-\phi_1(x,y)\right]\right\},\tag{3}$$
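A sketch of the two-frequency temporal phase unwrapping of Eq. (3), assuming wrapped phases phi1 and phi2 measured with fringe wavelengths lam1 < lam2; wrapping the phase difference into [0, 2π) is an implementation detail not spelled out in Eq. (3).

```python
import numpy as np

def temporal_unwrap(phi1, phi2, lam1, lam2):
    """Two-wavelength temporal phase unwrapping, Eq. (3): the beat phase
    lam2/(lam2 - lam1) * (phi1 - phi2) approximates the absolute phase of
    the high-frequency fringes, from which the fringe order is rounded."""
    beat = lam2 / (lam2 - lam1) * np.mod(phi1 - phi2, 2 * np.pi)
    order = np.round((beat - phi1) / (2 * np.pi))
    return phi1 + 2 * np.pi * order
```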
$$I_1(x,y)=I_{A1}(x,y)+I_{B1}(x,y)\cos\!\left[\phi_1(x,y)\right],\tag{4}$$
$$I_2(x,y)=I_{A2}(x,y)+I_{B2}(x,y)\cos\!\left[\phi_2(x,y)\right]=I_{A2}+I_{B2}\cos\!\left[\phi_1(x,y)+\Delta\phi(x,y)\right],\tag{5}$$
$$I_s(x,y)=I_{As}(x,y)+I_{Bs}(x,y)\cos\!\left[\phi_s(x,y)\right],\tag{6}$$
$$\phi_s(x,y)=\phi_1(x,y)+\Delta\phi_s(x,y),\tag{7}$$
$$\tan\Delta\phi_s(x,y)=\frac{I_{B2}(x,y)\sin\!\left[\Delta\phi(x,y)\right]}{I_{B1}(x,y)+I_{B2}(x,y)\cos\!\left[\Delta\phi(x,y)\right]},\tag{8}$$
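The additional phase introduced when the two fringe components are superposed, Eq. (8), can be evaluated directly; a small sketch with illustrative array names:

```python
import numpy as np

def synthesized_phase_offset(I_B1, I_B2, delta_phi):
    """Eq. (8): phase offset of the superposed pattern formed by two fringe
    components with modulations I_B1, I_B2 and mutual phase lag delta_phi."""
    return np.arctan2(I_B2 * np.sin(delta_phi),
                      I_B1 + I_B2 * np.cos(delta_phi))
```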
$$\left\{\begin{aligned}
p&:\ \frac{\left|\Phi_c(x,y)-\Phi_p(x,y)\right|}{2\pi}\,\lambda_1>\varepsilon_1\\
q&:\ \sum_{i=1}^{n}\left|d_i(x,y)-\frac{1}{n}\sum_{i=1}^{n}d_i(x,y)\right|>\varepsilon_2\\
r&:\ I_B(x,y)>\varepsilon_3
\end{aligned}\right.,\tag{9}$$
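A sketch of how the three conditions might be combined into a mask of interreflection-affected pixels, following Fig. 3 ((p ∪ q) ∩ r). The reconstruction of condition p above, the stack of redundant disparity maps d_stack, and the thresholds eps1–eps3 are assumptions based on the notation, not the authors' code.

```python
import numpy as np

def interreflection_mask(Phi_c, Phi_p, d_stack, I_B, lam1, eps1, eps2, eps3):
    """Combine conditions p, q, r of Eq. (9) into a binary error mask.
    p: camera and projector absolute phases disagree;
    q: the redundant disparity estimates d_i are inconsistent;
    r: only pixels with sufficient fringe modulation are considered."""
    p = np.abs(Phi_c - Phi_p) / (2 * np.pi) * lam1 > eps1
    q = np.sum(np.abs(d_stack - d_stack.mean(axis=0)), axis=0) > eps2
    r = I_B > eps3
    return (p | q) & r
```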
$$I(x,y)=\sum_{i=1}^{n}\tilde{I}_i(x,y)=\sum_{i=1}^{n}I(x,y)\,M_i(x,y),\tag{10}$$
$$\tilde{I}'(x,y)=\sum_{i=1}^{n}\tilde{I}'_i(x,y)=\sum_{i=1}^{n}I'_i(x,y)\,M'_i(x,y),\tag{11}$$
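A sketch of the epipolar-imaging loop behind Eqs. (10)-(11): each projector mask keeps one epipolar line of the pattern, the corresponding camera mask keeps only the directly illuminated line in the capture, and the masked captures are summed into a fused image. The `capture` callable standing in for the projector-camera hardware is a placeholder.

```python
def epipolar_imaging(pattern, projector_masks, camera_masks, capture):
    """Project the pattern one epipolar line at a time and fuse the masked
    captures, Eqs. (10)-(11). `capture(img)` projects img and returns the
    camera frame; masks are 0/1 arrays at the projector/camera resolution."""
    fused = None
    for M_p, M_c in zip(projector_masks, camera_masks):
        captured = capture(pattern * M_p)   # one masked projection
        masked = captured * M_c             # keep only the direct-light line
        fused = masked if fused is None else fused + masked
    return fused
```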
$$l'=Fp,\tag{12}$$
$$l=F^{T}p',\tag{13}$$
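Given a calibrated fundamental matrix F between projector and camera, Eqs. (12)-(13) map a pixel in one device to its epipolar line in the other; a short sketch in homogeneous coordinates:

```python
import numpy as np

def epipolar_line_in_camera(F, p_projector):
    """Eq. (12): l' = F p, the epipolar line (a, b, c) in the camera image
    of projector pixel p = (x, y)."""
    return F @ np.array([p_projector[0], p_projector[1], 1.0])

def epipolar_line_in_projector(F, p_camera):
    """Eq. (13): l = F^T p', the epipolar line in the projector image
    of camera pixel p' = (x, y)."""
    return F.T @ np.array([p_camera[0], p_camera[1], 1.0])
```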
$$\mathrm{NCC}(x,y,d)=\frac{\sum_{u,v}\left[I_c(u,v)-\bar{I}_c\right]\left[I_p(u-d,v)-\bar{I}_p\right]}{\left\{\sum_{u,v}\left[I_c(u,v)-\bar{I}_c\right]^{2}\sum_{u,v}\left[I_p(u-d,v)-\bar{I}_p\right]^{2}\right\}^{1/2}},\tag{14}$$
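A sketch of speckle stereo matching with the normalized cross-correlation of Eq. (14): for a camera pixel, the disparity that maximizes the correlation with the projector pattern along the rectified epipolar line is selected. The window size and candidate disparity range are illustrative, and no bounds checking is done.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches, Eq. (14)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def best_disparity(img_c, img_p, x, y, disparities, w=5):
    """Pick the disparity d maximizing the NCC between the camera patch at
    (x, y) and the projector patch at (x - d, y)."""
    patch_c = img_c[y - w:y + w + 1, x - w:x + w + 1]
    scores = [ncc(patch_c, img_p[y - w:y + w + 1, x - d - w:x - d + w + 1])
              for d in disparities]
    return disparities[int(np.argmax(scores))]
```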
$$\tilde{I}_\phi(x,y)=I_\phi(x,y)\,M(x,y),\tag{15}$$
$$\tilde{I}'_\phi(x,y)=I'_\phi(x,y)\,M'(x,y),\tag{16}$$
$$M'_i(x,y)=\begin{cases}0,&\left\langle \mathbf{n}_i,\mathbf{n}(x,y)\right\rangle\ge\theta\\[1mm]1,&\left\langle \mathbf{n}_i,\mathbf{n}(x,y)\right\rangle<\theta\end{cases},\tag{17}$$
$$\mathbf{n}(x,y)=\frac{\left(P_{\mathrm{left}}-P_{\mathrm{right}}\right)\times\left(P_{\mathrm{up}}-P_{\mathrm{down}}\right)}{\left\|\left(P_{\mathrm{left}}-P_{\mathrm{right}}\right)\times\left(P_{\mathrm{up}}-P_{\mathrm{down}}\right)\right\|},\tag{18}$$
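A sketch of the normal-based segmentation behind Eqs. (17)-(18): pixel normals are computed from the four neighboring points of the initial 3D result, and a region's camera mask keeps the pixels whose normal is within an angle θ of the region's representative normal n_i. The array layouts are assumptions.

```python
import numpy as np

def pixel_normal(P, x, y):
    """Eq. (18): surface normal at pixel (x, y) from its four neighbors;
    P is an (H, W, 3) array of reconstructed 3D points."""
    v1 = P[y, x - 1] - P[y, x + 1]   # P_left - P_right
    v2 = P[y - 1, x] - P[y + 1, x]   # P_up - P_down
    n = np.cross(v1, v2)
    return n / np.linalg.norm(n)

def region_camera_mask(normals, n_region, theta):
    """Eq. (17): mask of pixels whose normal deviates from the region
    normal n_region by less than theta (radians); normals is (H, W, 3)."""
    cos_angle = np.clip(normals @ n_region, -1.0, 1.0)
    return (np.arccos(cos_angle) < theta).astype(np.uint8)
```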
$$M_i\!\left[x-d(x,y),y\right]=M'_i(x,y),\tag{19}$$
$$\Phi_{\mathrm{region}}(x,y)=\phi_{\mathrm{region}}(x,y)+2\pi\,\mathrm{round}\left\{\frac{\Phi_{\mathrm{projector}}\!\left[x-d(x,y),y\right]-\phi_{\mathrm{region}}(x,y)}{2\pi}\right\},\tag{20}$$
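Finally, a sketch of the regional phase unwrapping of Eq. (20): the wrapped phase of one region is unwrapped against the absolute phase defined on the projector, looked up at the column x − d(x, y) given by the disparity from the initial epipolar-imaging measurement. Equal projector/camera widths and the array names are assumptions.

```python
import numpy as np

def regional_unwrap(phi_region, Phi_projector, disparity):
    """Eq. (20): unwrap the regional wrapped phase using the projector's
    absolute phase at the matched column x - d(x, y)."""
    H, W = phi_region.shape
    ys, xs = np.mgrid[0:H, 0:W]
    xp = np.clip(np.round(xs - disparity).astype(int), 0, W - 1)
    reference = Phi_projector[ys, xp]          # projector phase per camera pixel
    order = np.round((reference - phi_region) / (2 * np.pi))
    return phi_region + 2 * np.pi * order
```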
