Abstract

We present a three-dimensional (3D) measurement and imaging method based on a multicamera system. In the proposed system, projected images of 3D objects are captured by cameras located at random positions on a circumference, and the 3D objects are then reconstructed numerically. We introduce an angle correction function to improve the quality of the reconstructed object: it corrects the angle errors caused by position errors in the projected images that arise from the finite pixel size of the image sensor. Numerical results show that a point source is reconstructed successfully when the angle correction function is applied. We also demonstrate the method experimentally: two objects are placed on a computer-controlled rotary stage, the projected images are captured by a single camera, and the two objects are reconstructed successfully from 33 projected images.
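A minimal sketch of the angle-correction idea described above (the function and variable names, and the test geometry, are our own illustrative assumptions, not the authors' code): a reference point at known coordinates (x0, z0) projects onto the sensor line at x_k = x0·cos θ_k + z0·sin θ_k. Because the sensor reports x_k only to pixel precision, this relation, which is quadratic in sin θ_k, can be solved in reverse to recover a corrected camera angle from the measured coordinate.

```python
import numpy as np

def corrected_angles(x_k, x0, z0):
    """Solve x_k = x0*cos(theta) + z0*sin(theta) for theta.

    Squaring the cosine term gives a quadratic in sin(theta); both roots
    are returned, and the physically meaningful one must be selected from
    context (e.g., proximity to the nominal camera angle).
    """
    a = x0**2 + z0**2
    disc = (x_k * z0)**2 - (x_k**2 - x0**2) * a  # discriminant of the quadratic
    root = np.sqrt(disc)
    return (np.arcsin((x_k * z0 + root) / a),
            np.arcsin((x_k * z0 - root) / a))

# Round-trip check: project a point at a known angle, then recover the angle.
x0, z0 = 35.0, 40.0                 # illustrative reference-point position
theta = np.deg2rad(10.0)            # illustrative true camera angle
x_k = x0 * np.cos(theta) + z0 * np.sin(theta)
candidates = corrected_angles(x_k, x0, z0)
```

One of the two returned roots matches the true angle; the other corresponds to the second intersection of the projection line with the circle of radius sqrt(x0² + z0²).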

© 2008 Optical Society of America





Figures (13)

Fig. 1

Schematic of the proposed 3D measurement and imaging system based on multiple cameras randomly distributed on a circumference.

Fig. 2

Schemes of (a) recording and (b) reconstruction.

Fig. 3

Illustration of the angle correction required due to the finite pixel size of the image sensor.

Fig. 4

Reconstructed results of a point source located at (35, 40): (a) intensity profile reconstructed by the proposed method, (b) intensity profile reconstructed by the conventional method, and (c) their profiles along the z axis.

Fig. 5

Reconstructed results of 169 point sources. (a) Peak intensity positions, and (b) peak intensity values in the proposed method. (c) and (d) Results of peak intensity positions and peak intensity values in the conventional method.

Fig. 6

Comparison of the reconstructed point sources with and without the phase correction function as a function of the number of cameras when the camera angle range is 32°. (a) Peak value, (b) FWHM along the x axis, and (c) FWHM along the z axis.

Fig. 7

Effect of the camera angle range in the reconstructed point sources when the number of cameras is 9. (a) Peak value, and (b) FWHM along the z axis.

Fig. 8

Experimental setup.

Fig. 9

Parts of the projected images obtained at angles of (a) −15.40°, (b) −8.52°, (c) 0.11°, (d) 9.06°, and (e) 15.62°.

Fig. 10

Reconstructed intensity distributions in the xy plane at (a) z = 4 pixels, (b) z = 50 pixels, (c) z = 105 pixels, and (d) z = 127 pixels.

Fig. 11

Reconstructed intensity profiles in the xz plane when the observation angle ranges are (a) 32°, (b) 64°, and (c) 180° in the case of a number of cameras of 33.

Fig. 12

Reconstructed intensity distributions in the xy plane when the observation angle ranges are (a) 32°, (b) 64°, and (c) 180° in the case of a number of cameras of 33.

Fig. 13

Reconstructed intensity distributions in the xy plane when the number of cameras is (a) 5, (b) 9, (c) 17, and (d) 32.

Equations (7)


(1) $(x_k, y_k) = (x \cos\theta_k + z \sin\theta_k,\; y)$

(2) $\tilde{P}_k(u) = \int p_k(x_k) \exp\!\left(-i \frac{2\pi}{\lambda f} x_k u\right) \mathrm{d}x_k$

(3) $\tilde{P}_k(s) = \sum_{n=0}^{N-1} p_k(n) \exp\!\left(-i \frac{2\pi n s}{N}\right)$

(4) $r(p\Delta x, q\Delta z) = \sum_{s=0}^{N-1} \sum_{k=0}^{M-1} \tilde{P}_k(s) \exp\!\left\{ i \frac{2\pi s \Delta u}{\lambda f} \left(p\Delta x \cos\theta_k + q\Delta z \sin\theta_k\right)\right\}$

(5) $r(p, q) = \sum_{s=0}^{N-1} \sum_{k=0}^{M-1} \tilde{P}_k(s) \exp\!\left\{ i \frac{2\pi s}{N} \left(p \cos\theta_k + q \sin\theta_k\right)\right\}$

(6) $\theta_k = \sin^{-1}\!\left( \dfrac{[x_k] z_0 \pm \sqrt{\left([x_k] z_0\right)^2 - \left([x_k]^2 - x_0^2\right)\left(x_0^2 + z_0^2\right)}}{x_0^2 + z_0^2} \right)$

(7) $r(p, q) = \sum_{s=0}^{N-1} \sum_{k=0}^{M-1} \tilde{P}_k(s) \exp\!\left\{ i \frac{2\pi s}{N} \left(p \cos\theta_k + q \sin\theta_k\right)\right\}$
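The discrete reconstruction (1D DFT of each projection, then a phase-weighted double sum over the frequency index s and the camera index k) can be sketched as follows. This is a hedged illustration with our own variable names and a synthetic point-source test, not the authors' implementation; it uses unfiltered DFT-domain backprojection on an N x N grid.

```python
import numpy as np

def backproject(projections, thetas, N):
    """DFT-domain backprojection of 1D projections p_k taken at angles theta_k:
        r(p, q) = | sum_k sum_s P~_k(s) exp{ i 2*pi*s/N (p cos(theta_k) + q sin(theta_k)) } |
    where P~_k is the DFT of the k-th projection."""
    grid = np.arange(N)
    P, Q = np.meshgrid(grid, grid, indexing="ij")
    s = np.arange(N)
    r = np.zeros((N, N), dtype=complex)
    for pk, th in zip(projections, thetas):
        Pk = np.fft.fft(pk)                       # P~_k(s)
        coord = P * np.cos(th) + Q * np.sin(th)   # where (p, q) lands on the sensor line
        # Sum over s: an inverse DFT evaluated at the (generally non-integer)
        # projection coordinate of every reconstruction grid point.
        r += (Pk * np.exp(2j * np.pi * coord[..., None] * s / N)).sum(axis=-1)
    return np.abs(r)

# Synthetic check: project one point source from several angles, then reconstruct.
N, x0, z0 = 64, 20, 30
thetas = np.deg2rad(np.linspace(-16, 16, 9))      # 9 cameras over a 32-degree range
projections = []
for th in thetas:
    pk = np.zeros(N)
    # Ideal pinhole projection, rounded to the nearest sensor pixel.
    pk[int(round(x0 * np.cos(th) + z0 * np.sin(th))) % N] = 1.0
    projections.append(pk)
r = backproject(projections, thetas, N)
peak = np.unravel_index(np.argmax(r), r.shape)
```

Because the cameras span only a 32-degree range, the reconstructed point spreads along the z axis much more than along x, which is consistent with the angle-range behavior reported in Figs. 6 and 7.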
