Abstract

In this paper we present a high-resolution technique to passively sense, detect, and recognize a 3D object using computational integral imaging. We show that the use of a non-stationary microlens array reduces the quantization error in longitudinal distance estimation. The proposed method overcomes the Nyquist upper limit on resolution. We use 3D nonlinear correlation to recognize the 3D coordinates and shape of the desired object.

© 2003 Optical Society of America


References


  1. H. E. Ives, "Optical properties of a Lippmann lenticulated sheet," J. Opt. Soc. Am. 21, 171-176 (1931). [CrossRef]
  2. H. J. Caulfield, ed., Handbook of Optical Holography (Academic, London, 1979).
  3. F. Okano, J. Arai, H. Hoshino, and I. Yuyama, "Three-dimensional video system based on integral photography," Opt. Eng. 38, 1072-1077 (1999). [CrossRef]
  4. H. Arimoto and B. Javidi, "Integral three-dimensional imaging with digital reconstruction," Opt. Lett. 26, 157-159 (2001). [CrossRef]
  5. T. Okoshi, Three-Dimensional Imaging Techniques (Academic, New York, 1971).
  6. B. Javidi and E. Tajahuerce, "Three-dimensional object recognition by use of digital holography," Opt. Lett. 25, 610-612 (2000). [CrossRef]
  7. J. Rosen, "Three-dimensional electro-optical correlation," J. Opt. Soc. Am. A 15, 430-436 (1998). [CrossRef]
  8. J. Rosen, "Electrooptical correlators for three-dimensional pattern recognition," in Image Recognition and Classification: Algorithms, Systems, and Applications, B. Javidi, ed. (Marcel Dekker, New York, 2002).
  9. Y. Frauel and B. Javidi, "Digital three-dimensional image correlation by use of computer-reconstructed integral imaging," Appl. Opt. 41, 5488-5496 (2002). [CrossRef] [PubMed]
  10. J. Jang and B. Javidi, "Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics," Opt. Lett. 27, 324-326 (2002). [CrossRef]
  11. A. Stern and B. Javidi, "3D image sensing and reconstruction with time-division multiplexed computational integral imaging (CII)," Appl. Opt. 42, 7036-7042 (2003). [CrossRef]
  12. R. Y. Tsai and T. S. Huang, "Multiframe image restoration and registration," in Advances in Computer Vision and Image Processing, Vol. 1 (JAI Press, 1984), pp. 317-339.
  13. A. M. Tekalp, Digital Video Processing (Prentice Hall, Upper Saddle River, NJ, 1995).
  14. S. Kim, N. Bose, and H. Valenzuela, "Recursive reconstruction of high resolution image from noisy undersampled multiframes," IEEE Trans. Acoust. Speech Signal Process. 38, 1013-1027 (1990). [CrossRef]
  15. D. Keren, S. Peleg, and R. Brada, "Image sequence enhancement using subpixel displacement," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 1988), pp. 742-746.
  16. B. Frieden and H. Aumann, "Image reconstruction from multiple 1-D scans using filtered localization projection," Appl. Opt. 26, 3615-3621 (1987). [CrossRef]
  17. N. Shah and A. Zakhor, "Multiframe spatial resolution enhancement of color video," in Proceedings of the IEEE International Conference on Image Processing, Lausanne, Switzerland (IEEE, 1996), pp. 985-988.
  18. R. Schultz and R. Stevenson, "Improved definition video frame enhancement," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Detroit, MI (IEEE, 1995), pp. 2169-2172.
  19. B. Javidi, "Nonlinear joint power spectrum based optical correlators," Appl. Opt. 28, 2358-2367 (1989). [CrossRef] [PubMed]
  20. A. Mahalanobis, "On the optimality of the MACH filter for detection of targets in noise," Opt. Eng. 36, 2642-2648 (1997). [CrossRef]
  21. C. Chesnaud, Ph. Réfrégier, and V. Boulet, "Statistical region snake-based segmentation adapted to different physical noise models," IEEE Trans. Pattern Anal. Mach. Intell. 21, 1145-1157 (1999). [CrossRef]
  22. B. Javidi and J. L. Horner, eds., Real-Time Optical Information Processing (Academic, New York, 1994).


Supplementary Material (4)

» Media 1: GIF (30 KB)     
» Media 2: GIF (55 KB)     
» Media 3: AVI (675 KB)     
» Media 4: AVI (196 KB)     



Figures (16)

Fig. 1. (GIF 29.6 KB) Non-stationary microlens array integral imaging system.

Fig. 2. (GIF 55 KB) The sequence of low-resolution images obtained by time multiplexing for one of the elemental images.

Fig. 3. Effect of different object depths on the elemental images.

Fig. 4. Depth change Δz as a function of z1 that produces a one-pixel shift at the CCD plane, for different lens numbers n. u=0 mm, d=5.2 mm, and the CCD pixel size is 12 µm × 12 µm.

Fig. 5. The reconstructed high-resolution elemental image.

Fig. 6. Minimum detectable longitudinal depth change (Δz) for different microlens numbers n, for both the stationary lenslet array and time-multiplexed integral imaging (moving lenslets), for z1=80 mm, d=5.2 mm, and a CCD pixel size of 12 µm × 12 µm.

Fig. 7. (AVI 163 KB) Sequence of time-multiplexed integral images.

Fig. 8. The maximum detectable depth that retains a minimum depth estimation resolution of 1 mm. Both stationary and moving microlens arrays are considered. n is the lenslet number in the array. d=5.2 mm, the CCD pixel size is 12 µm × 12 µm, and the number of time-multiplexed images is N=5.

Fig. 9. (MOV 675 KB) Example of detecting the projected pixel coordinates in different elemental images.

Fig. 10. 2D image of the 3D input scene. The objects are placed at 79 mm and 75 mm from the lenslet array.

Fig. 11. (MOV 200 KB) Part of the input sequence of elemental images.

Fig. 12. (a) One of the low-resolution elemental images; (b) the reconstructed high-resolution elemental image.

Fig. 13. 2D color map of the experimental depth results for the scene shown in Fig. 10.

Fig. 14. Reconstructed 3D scene for the 3D objects in Fig. 10.

Fig. 15. Slices from the 3D correlation output for the 3D bug object in Fig. 10: (a) output plane 75 mm from the lenslet; (b) output plane 76 mm from the lenslet; (c) output plane 79 mm from the lenslet.

Fig. 16. Slices from the 3D correlation output for the object on the left-hand side of Fig. 10: (a) output plane 79 mm from the microlens array; (b) output plane 80 mm from the microlens array; (c) output plane 75 mm from the microlens array.

Equations (13)

Equations on this page are rendered with MathJax.

$$z_1 = \frac{d\,u_n}{x_1}, \qquad z_2 = \frac{d\,u_n}{x_2}$$
$$\Delta z = d\,u_n \left(\frac{\Delta x}{x_1 x_2}\right) = d\,(n\phi - u)\left(\frac{\Delta x}{x_1 x_2}\right)$$
$$\Delta z \ge \frac{c\,z_{\min}^2}{d\,(n\phi - u) - c\,z_{\min}}$$
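The quantization-limited depth resolution above is easy to evaluate numerically. A minimal sketch, assuming illustrative values for the lenslet pitch φ and the lens index n (only d, u, and the pixel size c are quoted in the figure captions):

```python
# Depth-estimation quantization error, following the bound above:
#   Delta z = c * z^2 / (d*(n*phi - u) - c*z)

def depth_quantization_error(z, n, phi, d, c, u=0.0):
    """Smallest resolvable depth change at distance z (all lengths in mm)."""
    baseline = d * (n * phi - u)   # effective triangulation baseline term
    return c * z**2 / (baseline - c * z)

# d = 5.2 mm and a 12 um x 12 um pixel are quoted in the figure captions;
# the lenslet pitch phi = 1 mm and lens index n = 10 are assumed values.
c, d, phi = 12e-3, 5.2, 1.0
dz_near = depth_quantization_error(z=80.0, n=10, phi=phi, d=d, c=c)
dz_far = depth_quantization_error(z=160.0, n=10, phi=phi, d=d, c=c)
print(dz_near, dz_far)
```

As the bound predicts, the error grows roughly as z² for depths well inside the baseline term, which is why the far object is resolved much more coarsely than the near one.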
$$\hat{g}^{(j+1)}(x) = \hat{g}^{(j)}(x) + \frac{1}{N}\sum_{i=1}^{N}\left[\left(\left(\hat{g}^{(j)}(x) \ast h_{\mathrm{psf}}(x)\right)\!\downarrow_{D} -\, f_i(x)\right)\!\uparrow_{U} \ast\, h_{\mathrm{bp}}(x)\right]$$
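The high-resolution elemental image is recovered by an iterative back-projection update of this kind: simulate each low-resolution frame from the current estimate, re-project the residual onto the high-resolution grid, and filter with a back-projection kernel. A 1-D NumPy sketch under simplifying assumptions (shift-only motion, an assumed box PSF, and h_bp taken equal to h_psf), with the residual formed as observed minus simulated:

```python
import numpy as np

def downsample(x, D):
    return x[::D]

def upsample(x, U):
    y = np.zeros(len(x) * U)
    y[::U] = x
    return y

def blur(x, h):
    return np.convolve(x, h, mode="same")

# Ground-truth high-resolution signal and N shifted low-resolution frames.
rng = np.random.default_rng(0)
g_true = rng.random(64)
D = 2                      # down/up-sampling factor
h_psf = np.ones(3) / 3.0   # assumed box point-spread function
shifts = [0, 1]            # per-frame shifts in high-resolution pixels
frames = [downsample(np.roll(blur(g_true, h_psf), -s), D) for s in shifts]

# Iterative back-projection: re-project each frame's residual
# (observed minus simulated) and apply the back-projection kernel.
g_init = upsample(frames[0], D)    # crude initial estimate
g = g_init.copy()
for _ in range(50):
    corr = np.zeros_like(g)
    for s, f in zip(shifts, frames):
        sim = downsample(np.roll(blur(g, h_psf), -s), D)
        corr += np.roll(upsample(f - sim, D), s)
    g = g + blur(corr / len(frames), h_psf)   # h_bp = h_psf (assumption)
```

With two frames shifted by one high-resolution pixel, the downsampled grids interleave, which is exactly what lets the iteration recover detail above the single-frame Nyquist limit.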
$$z_{\max} = \frac{-\Delta z_{\min} + \sqrt{\Delta z_{\min}^2 + \dfrac{4\,\Delta z_{\min}\,d\,(n\phi - u)}{c}}}{2}$$
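This maximum working distance follows from solving the earlier resolution bound as a quadratic in z. A quick numerical consistency check, using the same illustrative φ and n as before (assumed values, not taken from the paper):

```python
import math

def z_max(dz_min, n, phi, d, c, u=0.0):
    """Largest depth whose quantization error stays within dz_min (lengths in mm)."""
    b = d * (n * phi - u)
    return (-dz_min + math.sqrt(dz_min**2 + 4.0 * dz_min * b / c)) / 2.0

# d = 5.2 mm and a 12 um pixel from the captions; phi and n are assumed.
c, d, phi, n = 12e-3, 5.2, 1.0, 10
zm = z_max(dz_min=1.0, n=n, phi=phi, d=d, c=c)

# Consistency: substituting z_max back into Delta z = c z^2 / (b - c z)
# recovers the requested dz_min.
b = d * n * phi
dz_back = c * zm**2 / (b - c * zm)
print(zm, dz_back)
```

The round trip confirms that the quadratic-formula root is term-by-term consistent with the Δz expression it was derived from.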
$$d_{n,m}(x_p, y_q, x_i, y_j) = \sum_{x=-W_x/2}^{W_x/2}\;\sum_{y=-W_y/2}^{W_y/2}\left[I_{0,0}(x - x_p,\, y - y_q) - I_{n,m}(x - x_i,\, y - y_j)\right]^2, \qquad p, i = 1, \ldots, N_x;\;\; q, j = 1, \ldots, N_y$$
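The matching cost above is a windowed sum of squared differences between a patch of the reference elemental image I₀,₀ and candidate patches of elemental image Iₙ,ₘ. A minimal NumPy sketch on synthetic data (the window size W and search range are assumptions for illustration):

```python
import numpy as np

def ssd(patch_ref, patch):
    """Sum of squared differences between two equal-size windows."""
    return float(np.sum((patch_ref - patch) ** 2))

rng = np.random.default_rng(1)
I00 = rng.random((32, 32))
I_nm = np.roll(I00, shift=(2, 3), axis=(0, 1))  # elemental image shifted by (2, 3)

W = 8                            # assumed matching window size
ref = I00[12:12 + W, 12:12 + W]  # reference window around a chosen pixel

# Exhaustive search over candidate displacements, as in the double sum above.
best = min(
    ((dy, dx) for dy in range(-4, 5) for dx in range(-4, 5)),
    key=lambda s: ssd(ref, I_nm[12 + s[0]:12 + s[0] + W, 12 + s[1]:12 + s[1] + W]),
)
print(best)
```

The minimizing displacement is the projected-pixel disparity between the two elemental images; it is this disparity that is converted to depth in the equations above.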
$$z(u, v; n, m) = \hat{z}(u, v; n, m) \pm n_z(u, v; n, m, \hat{z})$$
$$\hat{z}(u, v) = \frac{1}{N_u N_v}\sum_{n=1}^{N_u}\sum_{m=1}^{N_v}\hat{z}(u, v; n, m)$$
$$z(u, v) = \hat{z}(u, v) + n_z(u, v)$$
$$\sigma_z^2 = \frac{1}{N_u N_v}\sum_{n=1}^{N_u}\sum_{m=1}^{N_v}\frac{\Delta z^2(u, v; n, m, z)}{12}$$
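The per-pair depth estimates are averaged, and the residual quantization noise follows the uniform-quantizer rule Δ²/12 averaged over all (n, m) pairs. A small sketch with synthetic per-pair estimates and step sizes (the numbers are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
Nu, Nv = 4, 5

# Synthetic per-pair depth estimates z_hat(u,v;n,m) around a true depth of
# 80 mm, and per-pair quantization step sizes Delta z(u,v;n,m,z).
z_hat_nm = 80.0 + rng.uniform(-0.5, 0.5, size=(Nu, Nv))
dz_nm = rng.uniform(0.8, 1.2, size=(Nu, Nv))

z_hat = z_hat_nm.mean()              # averaged depth estimate, as above
sigma2 = np.mean(dz_nm**2 / 12.0)    # averaged uniform-quantization variance
print(z_hat, sigma2)
```

Averaging over many lens pairs is what makes the final estimate finer than any single pair's quantization step.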
$$C = \mathrm{IFT}\left[\,|S|^{k}\exp(j\phi_S)\;|R|^{k}\exp(-j\phi_R)\,\right]$$
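The kth-law nonlinear correlator raises the Fourier magnitudes of the scene S and reference R to the power k while keeping the scene phase and conjugating the reference phase. A 1-D NumPy sketch (k = 0.3 is an assumed nonlinearity index):

```python
import numpy as np

def kth_law_correlation(scene, ref, k=0.3):
    """C = IFT[ |S|^k e^{j phi_S} * |R|^k e^{-j phi_R} ]."""
    S = np.fft.fft(scene)
    R = np.fft.fft(ref)
    C = np.fft.ifft(np.abs(S)**k * np.exp(1j * np.angle(S))
                    * np.abs(R)**k * np.exp(-1j * np.angle(R)))
    return np.abs(C)

rng = np.random.default_rng(3)
ref = rng.random(64)
scene = np.roll(ref, 17)          # target embedded at a known shift

c = kth_law_correlation(scene, ref)
peak = int(np.argmax(c))
print(peak)
```

With k < 1 the magnitude compression de-emphasizes strong low-frequency content, which is what sharpens the correlation peak relative to a linear matched filter; the peak location still marks the target's position.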
