Abstract

A shape-connection technique based on pattern recognition of three-dimensional shapes is presented. In this technique, the object shape is reconstructed by laser scanning and image processing. When an occlusion appears, the object is reconstructed from multiple views, which yields multiple partial shapes of the object. These parts are then assembled to obtain the complete object shape. To perform the assembly, a matching procedure based on Hu moments is applied to a transverse section of the multiple views. The depth of the transverse section is computed by an approximation network built from the behavior of the laser line and the camera position. The vision parameters are also deduced by the network and by image processing. In this manner, the shape connection is achieved automatically by computational algorithms, so errors of physical measurement are not passed to the reconstruction system, and the performance and accuracy of the reconstruction system are improved. This is shown by comparing the results obtained with the proposed technique against those obtained with a contact method. Thus, a contribution to laser metrology for shape connection is achieved.

© 2008 Optical Society of America






Figures (15)

Fig. 1. Experimental setup.

Fig. 2. Geometry of the experimental setup.

Fig. 3. Geometry of the central projection obtained from Fig. 2.

Fig. 4. Maximum position from a set of pixels fitted by Bezier curves.

Fig. 5. (a) Laser line projected on a dummy face. (b) Contour of the laser line corresponding to (a). (c) Laser line occlusion. (d) Broken contour corresponding to the laser line in (c). (e) Complete contour corresponding to the laser line in (d).

Fig. 6. Structure of the network.

Fig. 7. Geometry of the setup at different camera positions.

Fig. 8. Magnitude of $s(x,y)$ for different camera positions.

Fig. 9. (a) Transverse section of the first view. (b) Transverse section of the second view.

Fig. 10. (a) Geometry of the occluded region at the initial camera position $l_1$. (b) Geometry of the region observed from camera positions $l_0$ and $l_1$. (c) Surface profiled with the camera placed at positions $l_0$ and $l_1$.

Fig. 11. (a) Geometry of an optical axis perpendicular to the reference plane along the x axis. (b) Geometry of an optical axis perpendicular to the reference plane along the y axis. (c) Derivative $dk/ds$ for an optical axis perpendicular to the x axis and for an optical axis not perpendicular to the x axis. (d) Object dimension produced by the Bezier network at the first camera position.

Fig. 12. (a) Plastic fruit to be profiled. (b) Occlusion of the laser line projected on the plastic fruit. (c) Three-dimensional shape of the first view of the plastic fruit. (d) Three-dimensional shape of the second view of the plastic fruit. (e) Three-dimensional shape of the plastic fruit.

Fig. 13. (a) Shape of the first view of the dummy face. (b) Shape of the second view of the dummy face. (c) Three-dimensional shape of the dummy face.

Fig. 14. (a) Occlusion of the laser line on the surface of a plastic hippopotamus. (b) Complete laser line projected on the surface of the hippopotamus. (c) Laser line of (b) captured from the other camera position. (d) Object profile corresponding to the laser lines of (b) and (c). (e) Complete shape of the hippopotamus.

Fig. 15. Bezier curves of a laser line that is not smooth.

Equations (44)

Equations on this page are rendered with MathJax.

(1) $s(x,y)=x_{A}-x_{B}.$
(2) $P(u)=\sum_{i=0}^{n}\binom{n}{i}(1-u)^{n-i}u^{i}p_{i},\qquad \binom{n}{i}=\frac{n!}{i!\,(n-i)!},\qquad 0\leq u\leq 1.$

(3) $P(u)=\binom{n}{0}(1-u)^{n}u^{0}p_{0}+\binom{n}{1}(1-u)^{n-1}u\,p_{1}+\cdots+\binom{n}{n}(1-u)^{0}u^{n}p_{n},\qquad 0\leq u\leq 1.$

(4) $x(u)=\binom{n}{0}(1-u)^{n}u^{0}x_{0}+\binom{n}{1}(1-u)^{n-1}u\,x_{1}+\cdots+\binom{n}{n}(1-u)^{0}u^{n}x_{n},\qquad 0\leq u\leq 1,$

(5) $I(u)=\binom{n}{0}(1-u)^{n}u^{0}I_{0}+\binom{n}{1}(1-u)^{n-1}u\,I_{1}+\cdots+\binom{n}{n}(1-u)^{0}u^{n}I_{n},\qquad 0\leq u\leq 1.$
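Equations (2)-(5) parameterize the pixel positions and their intensities with the same Bernstein basis, so the laser-line maximum can be read off the fitted curve with sub-pixel resolution (cf. Fig. 4). The following is a minimal sketch of that idea, not the authors' code; the function names and the five-pixel window are illustrative.

```python
import numpy as np
from math import comb

def bezier(values, u):
    """Evaluate the degree-n Bezier curve of Eq. (2) at u in [0, 1],
    using `values` as the control points p_i."""
    n = len(values) - 1
    return sum(comb(n, i) * (1 - u)**(n - i) * u**i * p
               for i, p in enumerate(values))

def subpixel_peak(x, intensity, samples=1000):
    """Fit x(u) and I(u) (Eqs. (4) and (5)) to a pixel window and return
    the x position where the fitted intensity is maximal."""
    u = np.linspace(0.0, 1.0, samples)
    xu = np.array([bezier(x, t) for t in u])
    Iu = np.array([bezier(intensity, t) for t in u])
    return xu[np.argmax(Iu)]

# Example: a laser-line intensity profile sampled at five pixels.
x_pix = np.array([10, 11, 12, 13, 14], dtype=float)
I_pix = np.array([40, 180, 250, 190, 60], dtype=float)
print(subpixel_peak(x_pix, I_pix))   # maximum located near pixel 12
```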
(6) $v=a_{0}+a_{1}x_{A},$

(7) $u=b_{0}+b_{1}s,$

(8) $B_{ij}=B_{i}(u)\,B_{j}(v),$

(9) $B_{i}(u)=\binom{n}{i}u^{i}(1-u)^{n-i},\qquad B_{j}(v)=\binom{m}{j}v^{j}(1-v)^{m-j},$

(10) $\binom{n}{i}=\frac{n!}{i!\,(n-i)!},\qquad \binom{m}{j}=\frac{m!}{j!\,(m-j)!}.$

(11) $H(u,v)=\sum_{i=0}^{n}\sum_{j=0}^{m}w_{ij}\,h_{i}\,B_{i}(u)\,B_{j}(v),\qquad 0\leq u\leq 1,\ 0\leq v\leq 1,$

(12)
\begin{align*}
H_{00}(u,v)=h_{0}&=w_{00}h_{0}B_{0}(u)B_{0}(v)+w_{01}h_{1}B_{0}(u)B_{1}(v)+\cdots+w_{0n}h_{n}B_{0}(u)B_{n}(v)\\
&\quad+\cdots+w_{m0}h_{0}B_{m}(u)B_{0}(v)+\cdots+w_{mn}h_{n}B_{m}(u)B_{n}(v),\\
H_{01}(u,v)=h_{1}&=w_{00}h_{0}B_{0}(u)B_{0}(v)+w_{01}h_{1}B_{0}(u)B_{1}(v)+\cdots+w_{mn}h_{n}B_{m}(u)B_{n}(v),\\
&\hspace{0.5em}\vdots\\
H_{mn}(u,v)=h_{n}&=w_{00}h_{0}B_{0}(u)B_{0}(v)+w_{01}h_{1}B_{0}(u)B_{1}(v)+\cdots+w_{mn}h_{n}B_{m}(u)B_{n}(v),
\end{align*}
where each equation is evaluated at the $(u,v)$ of its own training sample.

(13)
\begin{align*}
H_{00}&=w_{00}\beta_{0,0}+w_{01}\beta_{0,1}+\cdots+w_{mn}\beta_{0,mn},\\
H_{01}&=w_{00}\beta_{1,0}+w_{01}\beta_{1,1}+\cdots+w_{mn}\beta_{1,mn},\\
&\hspace{0.5em}\vdots\\
H_{mn}&=w_{00}\beta_{mn,0}+w_{01}\beta_{mn,1}+\cdots+w_{mn}\beta_{mn,mn}.
\end{align*}

(14) $\begin{bmatrix}\beta_{0,0}&\beta_{0,1}&\beta_{0,2}&\cdots&\beta_{0,mn}\\ \beta_{1,0}&\beta_{1,1}&\beta_{1,2}&\cdots&\beta_{1,mn}\\ \vdots&\vdots&\vdots&&\vdots\\ \beta_{mn,0}&\beta_{mn,1}&\beta_{mn,2}&\cdots&\beta_{mn,mn}\end{bmatrix}\begin{bmatrix}w_{00}\\ w_{01}\\ \vdots\\ w_{mn}\end{bmatrix}=\begin{bmatrix}H_{00}\\ H_{01}\\ \vdots\\ H_{mn}\end{bmatrix}.$
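Given training samples $(u,v)$ from Eqs. (6) and (7) and the known heights $h_i$, Eq. (14) is an ordinary linear system in the weights $w_{ij}$. A sketch of how it could be assembled and solved by least squares follows, assuming the $h_i$ indexing of Eq. (11) and equal degrees $n=m$ for brevity; all names are illustrative.

```python
import numpy as np
from math import comb

def bernstein(k, n, t):
    """Bernstein basis function B_k(t) of degree n, Eq. (9)."""
    return comb(n, k) * t**k * (1 - t)**(n - k)

def solve_weights(u_pts, v_pts, h, targets):
    """Assemble the beta matrix of Eq. (14) and solve for the weights w_ij.

    u_pts, v_pts : parametric coordinates of the training samples (Eqs. (6)-(7))
    h            : the n+1 known heights h_i appearing in Eq. (11)
    targets      : the target network outputs H at each (u, v) sample
    """
    n = len(h) - 1
    m = n  # illustrative simplification: same degree in both directions
    rows = [[h[i] * bernstein(i, n, u) * bernstein(j, m, v)
             for i in range(n + 1) for j in range(m + 1)]
            for u, v in zip(u_pts, v_pts)]
    beta = np.array(rows)
    w, *_ = np.linalg.lstsq(beta, np.asarray(targets), rcond=None)
    return w.reshape(n + 1, m + 1)
```

Once the weights are known, evaluating Eq. (11) at any $(u,v)$ interpolates the depth produced by the network.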
(15) $z_{i}^{0}=\frac{z_{i}+f}{x_{c}-x_{B}}+0.$

(16) $h=D-\frac{f_{0}}{s(x,y)+(x_{c}-x_{A})}.$
(17) $m_{p,q}=\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}x^{p}y^{q}f(x,y)\,dx\,dy,$

(18) $m_{pq}=\sum_{x=0}^{M-1}\sum_{y=0}^{N-1}x_{i}^{p}\,y_{j}^{q}\,f(x,y),$

(19) $x_{c}=\frac{m_{10}}{m_{00}}\quad\text{and}\quad y_{c}=\frac{m_{01}}{m_{00}}.$

(20) $\mu_{pq}=\sum_{x=0}^{M-1}\sum_{y=0}^{N-1}(x-x_{c})^{p}(y-y_{c})^{q}f(x,y).$

(21) $\eta_{pq}=\frac{\mu_{pq}}{\mu_{00}^{\gamma}},\qquad \gamma=\frac{p+q}{2}+1,$

(22) $\phi_{1}=\eta_{20}+\eta_{02},$

(23) $\phi_{2}=(\eta_{20}-\eta_{02})^{2}+4\eta_{11}^{2},$

(24) $\phi_{3}=(\eta_{30}-3\eta_{12})^{2}+(3\eta_{21}-\eta_{03})^{2},$

(25) $\phi_{4}=(\eta_{30}+\eta_{12})^{2}+(\eta_{21}+\eta_{03})^{2},$

(26) $\phi_{5}=(\eta_{30}-3\eta_{12})(\eta_{30}+\eta_{12})\bigl[(\eta_{30}+\eta_{12})^{2}-3(\eta_{21}+\eta_{03})^{2}\bigr]+(3\eta_{21}-\eta_{03})(\eta_{21}+\eta_{03})\bigl[3(\eta_{30}+\eta_{12})^{2}-(\eta_{21}+\eta_{03})^{2}\bigr],$

(27) $\phi_{6}=(\eta_{20}-\eta_{02})\bigl[(\eta_{30}+\eta_{12})^{2}-(\eta_{21}+\eta_{03})^{2}\bigr]+4\eta_{11}(\eta_{30}+\eta_{12})(\eta_{21}+\eta_{03}),$

(28) $\phi_{7}=(3\eta_{21}-\eta_{03})(\eta_{30}+\eta_{12})\bigl[(\eta_{30}+\eta_{12})^{2}-3(\eta_{21}+\eta_{03})^{2}\bigr]-(\eta_{30}-3\eta_{12})(\eta_{21}+\eta_{03})\bigl[3(\eta_{30}+\eta_{12})^{2}-(\eta_{21}+\eta_{03})^{2}\bigr].$
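Equations (17)-(28) are the standard moment pipeline ending in Hu's seven invariants, which are used here to match transverse sections of the multiple views. The following is a direct NumPy transcription of those formulas (illustrative, not the authors' code):

```python
import numpy as np

def hu_moments(f):
    """Compute the seven Hu invariants, Eqs. (22)-(28), of image section f."""
    M, N = f.shape
    x, y = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")

    def m(p, q):                                     # raw moments, Eq. (18)
        return np.sum(x**p * y**q * f)

    xc, yc = m(1, 0) / m(0, 0), m(0, 1) / m(0, 0)    # centroid, Eq. (19)

    def eta(p, q):                                   # normalized moments, Eq. (21)
        mu = np.sum((x - xc)**p * (y - yc)**q * f)   # central moments, Eq. (20)
        return mu / m(0, 0) ** ((p + q) / 2 + 1)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)

    phi1 = n20 + n02
    phi2 = (n20 - n02)**2 + 4 * n11**2
    phi3 = (n30 - 3*n12)**2 + (3*n21 - n03)**2
    phi4 = (n30 + n12)**2 + (n21 + n03)**2
    phi5 = ((n30 - 3*n12) * (n30 + n12)
            * ((n30 + n12)**2 - 3*(n21 + n03)**2)
            + (3*n21 - n03) * (n21 + n03)
            * (3*(n30 + n12)**2 - (n21 + n03)**2))
    phi6 = ((n20 - n02) * ((n30 + n12)**2 - (n21 + n03)**2)
            + 4 * n11 * (n30 + n12) * (n21 + n03))
    phi7 = ((3*n21 - n03) * (n30 + n12)
            * ((n30 + n12)**2 - 3*(n21 + n03)**2)
            - (n30 - 3*n12) * (n21 + n03)
            * (3*(n30 + n12)**2 - (n21 + n03)**2))
    return np.array([phi1, phi2, phi3, phi4, phi5, phi6, phi7])
```

Two transverse sections can then be declared a match when the distance between their invariant vectors is minimal, since the invariants are insensitive to translation, scale, and rotation.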
(29) $\int_{-\infty}^{\infty}f(x)\,dx=1.$

(30) $\sum_{x=0}^{n}f(x)=1.$

(31) $T(r)=(L-1)\int_{0}^{r}p_{r}(\omega)\,d\omega.$

(32) $T(r_{k})=(L-1)\sum_{j=0}^{k}p_{r}(r_{j}),\qquad p_{r}(n_{k})=\frac{n_{k}}{MN},\qquad k=0,1,2,3,\ldots,L-1.$
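Equation (32) is the discrete histogram-equalization transfer function. A compact sketch for an image with integer gray levels (e.g., 8-bit, $L=256$); names are illustrative:

```python
import numpy as np

def equalize(img, L=256):
    """Histogram equalization following Eq. (32):
    T(r_k) = (L - 1) * sum_{j<=k} p_r(r_j), with p_r(n_k) = n_k / (M N)."""
    hist = np.bincount(img.ravel(), minlength=L)   # occurrence counts n_k
    p = hist / img.size                            # p_r(n_k) = n_k / (M N)
    T = np.round((L - 1) * np.cumsum(p)).astype(img.dtype)
    return T[img]                                  # map every gray level through T
```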
(33) $k_{i}=\frac{f\,h_{i}}{s_{ij}+x_{c}-x_{A}}.$

(34) $z_{i}^{a}=\frac{z_{i}+f}{\eta\,(x_{c}-x_{B})}+a.$

(35) $D-h_{i}^{a}=\frac{D-h_{i}+f}{\eta\,(s_{ij}+x_{c}-x_{A})}+a,$

(36)
\begin{align*}
h_{0}&=D-\frac{fa}{\eta\,(s_{00}+x_{c}-x_{A})}, &
h_{1}&=D-\frac{fa}{\eta\,(s_{01}+x_{c}-x_{A})}, &
h_{2}&=D-\frac{fa}{\eta\,(s_{02}+x_{c}-x_{A})},\\
h_{3}&=D-\frac{fa}{\eta\,(s_{03}+x_{c}-x_{A})}, &
h_{4}&=D-\frac{fa}{\eta\,(s_{04}+x_{c}-x_{A})}, &
h_{5}&=D-\frac{fa}{\eta\,(s_{05}+x_{c}-x_{A})}.
\end{align*}

(37) $t_{i}=\frac{(y_{c}-y_{p})\,f\,(D-h_{i})}{\eta\,(b-q_{i-1})}.$

(38) $t_{1}=\frac{(y_{c}-y_{p})\,f\,(D-h_{1})}{\eta\,(b-q_{0})},\qquad t_{2}=\frac{(y_{c}-y_{p})\,f\,(D-h_{1})}{\eta\,(b-q_{1})},\qquad t_{3}=\frac{(y_{c}^{a}-y_{p}^{a})\,f\,(D-h_{1})}{\eta\,(b-q_{2})}.$

(39) $x_{B}=\frac{fa}{D-h_{i}}+x_{c}.$
(40) $\mathrm{rms}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(h_{oi}-h_{ci}\bigr)^{2}},$
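Equation (40) is the figure of merit used to compare the optical reconstruction with the contact method. In NumPy, with illustrative array names:

```python
import numpy as np

def rms_error(h_obs, h_contact):
    """Root-mean-square error of Eq. (40) between the optically measured
    heights h_o and the contact-method heights h_c."""
    h_obs, h_contact = np.asarray(h_obs), np.asarray(h_contact)
    return np.sqrt(np.mean((h_obs - h_contact) ** 2))
```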
(41)
\begin{align*}
P(u)&=\tfrac{1}{2}(1-u)^{2}P_{0}+\tfrac{1}{2}(-2u^{2}+2u+1)P_{1}+\tfrac{1}{2}u^{2}P_{2}\\
&\quad+\tfrac{1}{2}(1-u)^{2}P_{1}+\tfrac{1}{2}(-2u^{2}+2u+1)P_{2}+\tfrac{1}{2}u^{2}P_{3}\\
&\quad+\tfrac{1}{2}(1-u)^{2}P_{2}+\tfrac{1}{2}(-2u^{2}+2u+1)P_{3}+\tfrac{1}{2}u^{2}P_{4}\\
&\quad+\tfrac{1}{2}(1-u)^{2}P_{3}+\tfrac{1}{2}(-2u^{2}+2u+1)P_{4}+\tfrac{1}{2}u^{2}P_{5}.
\end{align*}

(42) $P(u)=w_{0}(1-u)^{3}P_{0}+3w_{1}(1-u)^{2}u\,P_{1}+3w_{2}(1-u)u^{2}P_{2}+w_{3}u^{3}P_{3}.$
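Equation (41) blends overlapping triples of control points with quadratic B-spline basis functions to smooth a laser line that is not smooth (cf. Fig. 15). A minimal sketch, with an invented six-point contour:

```python
import numpy as np

def quadratic_bspline(points, samples=20):
    """Evaluate the uniform quadratic B-spline of Eq. (41): each segment
    blends three consecutive control points P_k, P_{k+1}, P_{k+2}."""
    points = np.asarray(points, dtype=float)
    u = np.linspace(0.0, 1.0, samples)
    b0 = 0.5 * (1 - u) ** 2              # basis for P_k
    b1 = 0.5 * (-2 * u**2 + 2 * u + 1)   # basis for P_{k+1}
    b2 = 0.5 * u ** 2                    # basis for P_{k+2}
    segments = [np.outer(b0, points[k]) + np.outer(b1, points[k + 1])
                + np.outer(b2, points[k + 2])
                for k in range(len(points) - 2)]
    return np.vstack(segments)

# Smoothing a jagged laser-line contour: (x, intensity) control points.
line = [(0, 5), (1, 9), (2, 4), (3, 8), (4, 3), (5, 7)]
smooth = quadratic_bspline(line)   # four smooth segments, P_0..P_5 as in Eq. (41)
```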
(43) $n=\left(\frac{z_{\alpha}\,\sigma_{x}}{e}\right)^{2},$

(44) $z_{\alpha}=\frac{e\sqrt{n}}{\sigma_{x}}.$
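Equations (43) and (44) relate the number of samples $n$, the confidence coefficient $z_{\alpha}$, the standard deviation $\sigma_{x}$, and the admissible error $e$. A worked example with illustrative values (not taken from the paper):

```python
from math import ceil, sqrt

# Illustrative values: 95% confidence (z_alpha = 1.96),
# standard deviation 0.5 mm, admissible error e = 0.1 mm.
z_alpha, sigma_x, e = 1.96, 0.5, 0.1

n = ceil((z_alpha * sigma_x / e) ** 2)   # Eq. (43): n = (z_a * sigma_x / e)^2
print(n)                                 # 97 measurements

z_check = e * sqrt(n) / sigma_x          # Eq. (44) recovers z_alpha from n
```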
