Abstract

We discuss physical and information-theoretic limits of optical 3D metrology. Based on these fundamental considerations, we introduce a novel single-shot 3D movie camera that comes close to these limits. The camera is designed for the 3D acquisition of macroscopic live scenes. Like a hologram, each movie frame encompasses the full 3D information about the object surface, and the observation perspective can be varied while watching the 3D movie. The camera combines single-shot ability with a point-cloud density close to the theoretical limit: no space-bandwidth is wasted on pattern codification. With 1-megapixel sensors, the 3D camera delivers nearly 300,000 independent 3D points in each frame. The 3D data display a lateral resolution and a depth precision limited only by physics. The approach is based on multi-line triangulation, and the requisite low-cost technology is simple: two properly positioned, synchronized cameras suffice to solve the profound ambiguity problem omnipresent in 3D metrology.

© 2017 Optical Society of America


References


  1. N. L. Lapa and Y. A. Brailov, “System and method for three-dimensional measurement of the shape of material objects,” U.S. Patent No. US 7,768,656 B2 (2010).
  2. B. Freedman, A. Shpunt, M. Machline, and Y. Arieli, “Depth mapping using projected patterns,” U.S. Patent Application No. US 2010/0118123 A1 (2010).
  3. H. Kawasaki, R. Furukawa, R. Sagawa, and Y. Yagi, “Dynamic scene shape reconstruction using a single structured light pattern,” IEEE Conference on CVPR, 1–8 (2008).
  4. R. Sagawa, R. Furukawa, and H. Kawasaki, “Dense 3D reconstruction from high frame-rate video using a static grid pattern,” IEEE Trans. Pattern Anal. Mach. Intell. 36(9), 1733–1747 (2014).
    [Crossref] [PubMed]
  5. B. Harendt, M. Große, M. Schaffer, and R. Kowarschik, “3D shape measurement of static and moving objects with adaptive spatiotemporal correlation,” Appl. Opt. 53(31), 7507–7515 (2014).
    [Crossref] [PubMed]
  6. S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using GOBO projection,” Opt. Lasers Eng. 87, 90–96 (2016).
    [Crossref]
  7. N. Matsuda, O. Cossairt, and M. Gupta, “MC3D: Motion Contrast 3D Scanning,” 2015 IEEE International Conference on Computational Photography (ICCP), Houston, TX, 2015, pp. 1–10.
  8. W. Lohry and S. Zhang, “High-speed absolute three-dimensional shape measurement using three binary dithered patterns,” Opt. Express 22(22), 26752–26762 (2014).
    [Crossref] [PubMed]
  9. W. Lohry, V. Chen, and S. Zhang, “Absolute three-dimensional shape measurement using coded fringe patterns without phase unwrapping or projector calibration,” Opt. Express 22(2), 1287–1301 (2014).
    [Crossref] [PubMed]
  10. H. Nguyen, D. Nguyen, Z. Wang, H. Kieu, and M. Le, “Real-time, high-accuracy 3D imaging and shape measurement,” Appl. Opt. 54(1), A9–A17 (2015).
    [Crossref] [PubMed]
  11. R. Ishiyama, S. Sakamoto, J. Tajima, T. Okatani, and K. Deguchi, “Absolute phase measurements using geometric constraints between multiple cameras and projectors,” Appl. Opt. 46(17), 3528–3538 (2007).
    [Crossref] [PubMed]
  12. K. Zhong, Z. Li, Y. Shi, C. Wang, and Y. Lei, “Fast phase measurement profilometry for arbitrary shape objects without phase unwrapping,” Opt. Lasers Eng. 51(11), 1213–1222 (2013).
    [Crossref]
  13. C. Bräuer-Burchardt, P. Kühmstedt, and G. Notni, “Phase unwrapping using geometric constraints for high-speed fringe projection based 3D measurements,” Proc. SPIE 8789, 878906 (2013).
    [Crossref]
  14. K. Song, S. Hu, X. Wen, and Y. Yan, “Fast 3D shape measurement using Fourier transform profilometry without phase unwrapping,” Opt. Lasers Eng. 84, 74–81 (2016).
    [Crossref]
  15. G. Häusler and S. Ettl, “Limitations of optical 3D sensors,” in Optical Measurement of Surface Topography, R. Leach, ed. (Springer, 2011).
  16. V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl. Opt. 23(18), 3105–3108 (1984).
    [Crossref] [PubMed]
  17. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983).
    [Crossref] [PubMed]
  18. G. Häusler and W. Heckel, “Light sectioning with large depth and high resolution,” Appl. Opt. 27(24), 5165–5169 (1988).
    [Crossref] [PubMed]
  19. F. Willomitzer, S. Ettl, C. Faber, and G. Häusler, “Single-shot three-dimensional sensing with improved data density,” Appl. Opt. 54(3), 408–417 (2015).
    [Crossref]
  20. H. O. Saldner and J. M. Huntley, “Temporal phase unwrapping: application to surface profiling of discontinuous objects,” Appl. Opt. 36(13), 2770–2775 (1997).
    [Crossref] [PubMed]
  21. M. Servin, J. M. Padilla, A. Gonzalez, and G. Garnica, “Temporal phase-unwrapping of static surfaces with 2-sensitivity fringe-patterns,” Opt. Express 23(12), 15806–15815 (2015).
    [Crossref] [PubMed]
  22. S. Ettl, O. Arold, Z. Yang, and G. Häusler, “Flying Triangulation: an optical 3D sensor for the motion-robust acquisition of complex objects,” Appl. Opt. 51(2), 281–289 (2012).
    [Crossref] [PubMed]
  23. F. Willomitzer, S. Ettl, O. Arold, and G. Häusler, “Flying Triangulation - a motion-robust optical 3D sensor for the real-time shape acquisition of complex objects,” AIP Conf. Proc. 1537, 19–26 (2013).
    [Crossref]
  24. X. Laboureux and G. Häusler, “Localization and registration of three-dimensional objects in space—where are the limits?” Appl. Opt. 40(29), 5206–5216 (2001).
    [Crossref] [PubMed]
  25. G. Häusler and D. Ritter, “Parallel three-dimensional sensing by color-coded triangulation,” Appl. Opt. 32(35), 7164–7169 (1993).
    [Crossref] [PubMed]
  26. J. Geng, “Rainbow three-dimensional camera: New concept of high-speed three-dimensional vision systems,” Opt. Eng. 35(2), 376–383 (1996).
    [Crossref]
  27. C. Schmalz and E. Angelopoulou, “Robust single-shot structured light,” IEEE Workshop on Projector–Camera Systems, (2010).
  28. M. Young, E. Beeson, J. Davis, S. Rusinkiewicz, and R. Ramamoorthi, “Viewpoint-coded structured light,” IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), (2007).
  29. R. G. Dorsch, G. Häusler, and J. M. Herrmann, “Laser triangulation: Fundamental uncertainty in distance measurement,” Appl. Opt. 33(7), 1306–1314 (1994).
    [Crossref] [PubMed]
  30. G. Häusler, “Ubiquitous coherence - boon and bale of the optical metrologist,” Speckle Metrology, Trondheim. Proc. SPIE 4933, 48–52 (2003).
    [Crossref]
  31. G. Häusler, “Speckle and Coherence,” in Encyclopedia of Modern Optics, B. Guenther, ed. (Elsevier, 2004).
  32. J. Habermann, “Statistisch unabhängige Specklefelder zur Reduktion von Messfehlern in der Weißlichtinterferometrie,” Diploma Thesis, University Erlangen-Nuremberg (2002).
  33. F. Schiffers, F. Willomitzer, S. Ettl, Z. Yang, and G. Häusler, “Calibration of multi-line-light-sectioning,” DGaO-Proceedings 2014, 12 (2014).
  34. YouTube channel of the authors’ research group: www.youtube.com/user/Osmin3D
  35. C. Wagner and G. Häusler, “Information theoretical optimization for optical range sensors,” Appl. Opt. 42(27), 5418–5426 (2003).
    [Crossref] [PubMed]
  36. G. Häusler, C. Faber, F. Willomitzer, and P. Dienstbier, “Why can’t we purchase a perfect single shot 3D-sensor?” DGaO-Proceedings 2012, A8 (2012).


Supplementary Material (6)

Name                  Description
» Visualization 1       Person watching a free-viewpoint 3D movie
» Visualization 2       3D movie with unidirectional lines of a human face
» Visualization 3       3D movie with unidirectional lines of a folded paper
» Visualization 4       3D movie with crossed lines of a human face
» Visualization 5       Color 3D movie with crossed lines of a human face
» Visualization 6       3D movie with crossed lines of a bouncing ping pong ball



Figures (12)

Fig. 1

(a) Nyquist sampling allows for precise sub-pixel line localization and high distance precision. (b) The minimum distance between projected lines is three times the pixel pitch.

Fig. 2

The achievable number of lines L depends on the triangulation angle θ and the unique measurement depth Δz.
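The bound stated in this caption can be made concrete: each projected line may wander laterally by up to Δz·tan θ on the sensor, so neighboring lines must be spaced at least that far apart, which caps the line count L. The following is a minimal sketch of that bound; the field width, depth range, and angle values used below are illustrative assumptions, not values from the paper.

```python
import math

def max_unique_lines(field_width, depth_range, theta_deg):
    """Upper bound on the number of distinguishable projected lines:
    a line shifts laterally by up to depth_range * tan(theta) over the
    unique measurement depth, so neighbors must be spaced at least that far."""
    return field_width / (depth_range * math.tan(math.radians(theta_deg)))

# Example (assumed values): 100 mm field, 50 mm depth range, 7 deg angle
# yields roughly 16 lines; shrinking the angle admits more lines.
lines_7deg = max_unique_lines(100.0, 50.0, 7.0)
lines_2deg = max_unique_lines(100.0, 50.0, 2.0)
```

This illustrates the trade-off in the caption: a small triangulation angle allows many unambiguous lines but, as the later equations show, degrades the depth precision.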

Fig. 3

Setup comprising a projector P that projects a static line pattern and two cameras C1 and C2. For a vertical line pattern, the horizontal distance between the nodal points of the projector and the cameras defines the related triangulation angles θ1 and θ2. (a) Front view of the setup. (b) View from top, illustrating related angles and measurement volumes.

Fig. 4

Unique indexing by combining two camera images: (a) Object. (b) and (c) Images of the object with two cameras, seen from different triangulation angles (known indices in (b) are color coded). (d) and (e) 3D data, directly calculated from (b) and (c). (f) Noisy 3D data from (d), correctly indexed (color coded), back-projected onto the chip of C2, together with the line image of C2. (g) Correctly indexed lines of C2, assigned from the indices delivered by C1 (color coded). (h) Final 3D model evaluated from (g) with correct indices and low noise.
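The indexing procedure of this caption can be sketched in one dimension: the small-angle camera C1 yields a unique but noisy depth; back-projecting that depth into the large-angle camera C2 picks the correct line among C2's ambiguous candidates, after which C2's precise geometry delivers the low-noise result. The angles, line pitch, and noise value below are illustrative assumptions, not values from the paper.

```python
import math

THETA1 = math.radians(4)    # small angle: unique but noisy depth (C1)
THETA2 = math.radians(25)   # large angle: precise but ambiguous depth (C2)
PITCH = 1.0                 # assumed apparent line spacing on the C2 chip

def depth_from_shift(shift, theta):
    """Triangulation: depth equals the lateral line shift divided by tan(theta)."""
    return shift / math.tan(theta)

def index_and_refine(shift_c1, shifts_c2):
    """Resolve C2's line-index ambiguity with the coarse C1 depth, then refine."""
    z_coarse = depth_from_shift(shift_c1, THETA1)
    predicted = z_coarse * math.tan(THETA2)                  # back-projected shift
    best = min(shifts_c2, key=lambda s: abs(s - predicted))  # nearest C2 line
    return depth_from_shift(best, THETA2)

# Example: true depth 2.0; C1 reports a noisy shift (depth error 0.3);
# C2 sees three indistinguishable candidate lines spaced by PITCH.
z_true = 2.0
noisy_c1 = (z_true + 0.3) * math.tan(THETA1)
candidates = [z_true * math.tan(THETA2) + k * PITCH for k in (-1, 0, 1)]
z_fine = index_and_refine(noisy_c1, candidates)
```

In this toy run the coarse C1 error (0.3) is much smaller than the candidate spacing seen through C2, so the correct line is selected and the refined depth equals the true depth.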

Fig. 5

Raw data (without post-processing) from a single-shot measurement with a line density of ~160 lines/field. (a) Camera image of a human face with projected lines. (b) 3D model from different perspectives, evaluated from one single video frame. (c) Close-up view of the 3D data, illustrating the low noise (~200 µm).

Fig. 6

Single-shot measurement (raw data) of a folded piece of paper. Edges are preserved: since there is no spatial line encoding, the full space bandwidth can be exploited. (a) 3D model of the folded paper. (b) Close-up view. The object was continuously moved during the measurement. A video can be found in Visualization 3 or [34].

Fig. 7

Line images and noise. Left column: (a) Line image with laser illumination. (b) Line image with reduced spatial and temporal coherence. Center column: magnified line images with sub-pixel maximum location. Right column: calculated distance with uncertainty.

Fig. 8

Original patterns to be projected onto the object visualize the density of 3D points (with magnification windows). (a) Unidirectional line pattern (see section 2). (b) Crossed lines for higher point density. The reader can zoom in to resolve the patterns.

Fig. 9

Setup with crossed line projection: Two cameras C1 and C2 and the projector P produce four independent triangulation angles: θ1x, θ1y, θ2x, and θ2y. Each line direction is evaluated separately. (a) Front view of the setup. (b) Perspective sketch with triangulation angles.

Fig. 10

(a) Single frames of a 3D movie of a “talking face”, acquired with crossed-line single-shot triangulation. The different perspectives (viewpoints 1, 2, and 3) are all taken from the same video frame. (b) and (c) Close-up views of the 3D data, illustrating the relatively low noise. The corresponding motion picture can be seen in Visualization 4 and in [34].

Fig. 11

Three frames from a color 3D movie. The monochrome cameras were replaced by color cameras, and the color texture was acquired along the projected lines from the same frame as the 3D data. The movie can be seen in Visualization 5 and in [34].

Fig. 12

Line images seen by C1 (small θ1) and C2 (large θ2). (a) The line image of C1 displays a unique “phase distortion” (< 2π) and can be considered a perfect image-plane hologram of the object surface. (b) The line image of C2 displays a large phase modulation (> 2π) that does not allow for simple unique decoding.

Equations (7)

$$L \;\sim\; \Delta x \,\frac{1}{\Delta z \,\tan\theta}.$$

$$d_2' \;>\; 2\,\delta x' \,\frac{\sin\theta_2}{\sin\theta_1}.$$

$$\delta z_{\mathrm{coh}} \;=\; \frac{C_s}{2\pi}\,\frac{\lambda}{\sin u_{\mathrm{obs}}\,\sin\theta},$$

$$\delta x'_{\mathrm{coh}} \;=\; \frac{C_s}{2\pi}\,\frac{\lambda}{\sin u'_{\mathrm{obs}}},$$

$$c_{\mathrm{spat}} \;=\; \frac{\sin u_{\mathrm{obs}}}{\sin u_{\mathrm{ill}}} \;=\; 0.3.$$

$$c_{\mathrm{pol}} \;=\; 0.5.$$

$$c_{\mathrm{pix}} \;=\; \frac{\lambda}{d_{\mathrm{pix}}\,\sin u_{\mathrm{obs}}} \;=\; 0.95.$$
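The speckle-contrast reduction factors listed above multiply, and the residual contrast sets the coherent depth-noise limit. A minimal numerical illustration follows; only the factors 0.3, 0.5, and 0.95 come from the equations above, while the wavelength and aperture values are assumptions chosen for illustration.

```python
import math

def depth_noise_coh(wavelength, sin_u_obs, theta_rad, contrast):
    """Coherent depth-noise limit: dz = (C_s / 2*pi) * lambda / (sin u_obs * sin theta)."""
    return contrast / (2 * math.pi) * wavelength / (sin_u_obs * math.sin(theta_rad))

# Combined speckle contrast after spatial, polarization, and pixel averaging
C_s = 0.3 * 0.5 * 0.95          # = 0.1425

# Assumed parameters: lambda = 0.85 um, sin u_obs = 0.01, theta = 25 deg
dz = depth_noise_coh(0.85e-6, 0.01, math.radians(25), C_s)   # metres
```

This shows the leverage of reduced coherence: without the contrast reductions (C_s = 1), the depth noise would be about seven times larger for the same geometry.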
