Abstract

This paper examines the performance of a low-cost, miniature, wide field-of-view (FOV) visual sensor that combines advanced pinhole optics with recent CMOS imager technology. The pinhole camera is often disregarded because of its apparent simplicity, its small light-gathering aperture and its limited image finesse. However, its angular field can be dramatically widened using only a few off-the-shelf micro-optical elements. We show that, combined with a modern high-sensitivity silicon-based digital retina, it can be a practical device for building a self-motion estimation sensor for mobile applications, such as the stabilization of a robotic micro-flyer.

© 2005 Optical Society of America

References

  1. D.V. Wick, T. Martinez, S.R. Restaino and B.R. Stone, "Foveated imaging demonstration," Opt. Express 10, 60-65 (2002).
    [PubMed]
  2. R. Volkel, M. Eisner and K.J. Weible, "Miniaturized imaging system," Microelectron. Eng. 67-68, 461-472 (2003).
    [CrossRef]
  3. J. Neumann, C. Fermuller and Y. Aloimonos, "Eyes from eyes: new cameras for structure from motion," in IEEE Proceedings of the Third Workshop on Omnidirectional Vision (Copenhagen, Denmark, 2002), 19-26.
    [CrossRef]
  4. T. Netter and N. Franceschini, "A robotic aircraft that follows terrain using a neuromorphic eye," in IEEE Proceedings of the Conference on Intelligent Robots and Systems (Lausanne, Switzerland, 2002), 129-134.
    [CrossRef]
  5. K. Hoshino, F. Mura, H. Morii, K. Suematsu and I. Shimoyama, "A small-sized panoramic scanning visual sensor inspired by the fly's compound eye," in IEEE Proceedings of the Conference on Robotics and Automation (ICRA, Leuven, Belgium, 1998), 1641-1646.
  6. R. Hornsey, P. Thomas, W. Wong, S. Pepic, K. Yip and R. Krishnasamy, "Electronic compound eye image sensor: construction and calibration," in Sensors and Camera Systems for Scientific, Industrial, and Digital Photography Applications V, M.M. Blouke, N. Sampat and R. Motta, eds., Proc. SPIE 5301, 13-24 (San Jose, Calif., 2004).
    [CrossRef]
  7. J. Neumann, C. Fermuller, Y. Aloimonos and V. Brajovic, "Compound eye sensor for 3D ego motion estimation," in IEEE Proceedings of the Conference on Intelligent Robots and Systems (Sendai, Japan, 2004).
  8. J. Tanida, T. Kumagai, K. Yamada, S. Miyatake, K. Ishida, T. Morimoto, N. Kondou, D. Miyazaki and Y. Ichioka, "Thin observation module by bound optics (TOMBO): concept and experimental verification," Appl. Opt. 40 (11), 1806-1813 (2001).
    [CrossRef]
  9. J. Kim, K.H. Jeong and L.P. Lee, "Artificial ommatidia by self-aligned microlenses and waveguides," Opt. Lett. 30, 5-7 (2005).
  10. J. Gluckman and S.K. Nayar, "Egomotion and omnidirectional cameras," in IEEE Proceedings of the Conference on Computer Vision and Pattern Recognition (Bombay, India, 1998), 999-1005.
  11. P. Baker, R. Pless, C. Fermuller and Y. Aloimonos, "New eyes for shape and motion estimation," in IEEE Proceedings of the First International Workshop on Biologically Motivated Computer Vision, Lecture Notes in Computer Science 1811 (Springer-Verlag, 2000), 118-128.
    [CrossRef]
  12. J.J. Koenderink and A.J. Van Doorn, "Facts on optic flow," Biol. Cybern. 56, 247-254 (1987).
    [CrossRef]
  13. T. Tian, C. Tomasi and D. Heeger, "Comparison of approaches to egomotion estimation," in IEEE Proceedings of the Conference on Computer Vision and Pattern Recognition (San Francisco, Calif., 1996), 315-320.
  14. M. Franz, J. Chahl and H. Krapp, "Insect-inspired estimation of egomotion," Neural Comput. 16, 2245-2260 (2004).
    [CrossRef]
  15. G. Adiv, "Inherent ambiguities in recovering 3D motion and structure from a noisy flow field," IEEE Trans. Pattern Anal. Mach. Intell. 11 (5), 477-489 (1989).
    [CrossRef]
  16. C. Fermuller and Y. Aloimonos, "Observability of 3D motion," Int. J. Comput. Vision 37 (1), 46-63 (2000).
  17. J. Neumann, "Computer vision in the space of light rays: plenoptic video geometry and polydioptric camera design," Ph.D. dissertation, Department of Computer Science, University of Maryland (2004).
  18. S. Srinivasan and R. Chellappa, "Noise-resilient estimation of optical flow by use of overlapped basis functions," J. Opt. Soc. Am. A 16, 493-507 (1999).
    [CrossRef]
  19. A. Bruhn, J. Weickert and C. Schnorr, "Combining the advantages of local and global optic flow methods," in DAGM Proceedings of the Symposium on Pattern Recognition (2002), 457-462.
  20. C. Fermuller, Y. Aloimonos, P. Baker, R. Pless, J. Neumann and B. Stuart, "Multi-camera networks: eyes from eyes," in IEEE Proceedings of the Workshop on Omnidirectional Vision (2000), 11-18.
  21. T. Martinez, D.V. Wick and S.R. Restaino, "Foveated, wide field-of-view imaging system using a liquid crystal spatial light modulator," Opt. Express 8 (10), 555-560 (2001).
    [CrossRef] [PubMed]
  22. J. Beckstead and S. Nordhauser, "360 degree/forward view integral imaging system," US Patent 6028719, InterScience Inc., 22 February 2000.
  23. R. Constantini and S. Susstrunk, "Virtual sensor design," in Sensors and Camera Systems for Scientific, Industrial, and Digital Photography Applications V, Proc. SPIE 5301, 408-419 (2004).
    [CrossRef]
  24. T.H. Nilsson, "Incident photometry: specifying stimuli for vision and light detectors," Appl. Opt. 22, 3457-3464 (1983).
    [CrossRef] [PubMed]
  25. M. Young, "Pinhole optics," Appl. Opt. 10, 2763-2767 (1971).
    [CrossRef] [PubMed]
  26. K.D. Mielenz, "On the diffraction limit for lensless imaging," J. Res. Natl. Inst. Stand. Technol. 104 (1999).
  27. B. Fowler, A.E. Gamal, D. Yang and H. Tian, "A method for estimating quantum efficiency for CMOS image sensors," in Solid State Sensor Arrays: Development and Applications, Proc. SPIE 3301, 178-185 (1998).
    [CrossRef]
  28. P.B. Catrysse and B.A. Wandell, "Optical efficiency of image sensor pixels," J. Opt. Soc. Am. A 19 (2002).
    [CrossRef]
  29. J.M. Franke, "Field-widened pinhole camera," Appl. Opt. 18 (1979).
    [CrossRef] [PubMed]
  30. http://www.ovt.com
  31. R. Steveninck and W. Bialek, "Timing and counting precision in the blowfly visual system," in Methods in Neural Networks IV, J. Van Hemmen, J.D. Cowan and E. Domany, eds. (Springer-Verlag, Heidelberg, 2001), 313-371.
  32. W. Bialek, "Thinking about the brain," in Les Houches Lectures on the Physics of Biomolecules and Cells, H. Flyvbjerg, F. Jülicher, P. Ormos and F. David, eds. (Springer-Verlag, Les Ulis, France, 2002), 485-577.

Figures (8)

Fig. 1.

Spherical sensor model: illustration of the image formation and of the measurement of the optic flow pi induced by the sensor motion. θ and ϕ are the azimuthal and polar angles, respectively.
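
To make the spherical model concrete, the sketch below numerically evaluates the motion field of the first equation in the Equations section for a set of viewing directions. It is a minimal illustration, not the authors' code: NumPy, the function name, and the example inputs T (translation), R (angular velocity) and D (depths) are our assumptions.

    import numpy as np

    def spherical_motion_field(d, D, T, R):
        """Optic flow p_i on the view sphere: d holds unit viewing
        directions (N x 3), D the scene depths (N,), T the translation
        (3,) and R the angular velocity (3,)."""
        trans = -(T - (d @ T)[:, None] * d) / D[:, None]  # translational part, scaled by 1/depth
        rot = -np.cross(R, d)                             # rotational part, depth-independent
        return trans + rot

    # A direction aligned with the translation sees zero flow (the focus
    # of expansion); a perpendicular direction sees flow of magnitude |T|/D.
    d = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    print(spherical_motion_field(d, np.array([2.0, 2.0]),
                                 np.array([1.0, 0.0, 0.0]), np.zeros(3)))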

Fig. 2.

Perspective transformation model of a wide-angle imaging system; radial image distortions are non-linear. The projection function r = f(ϕ), with f: ℜ+ → ℜ+, can be derived from the geometry of the optical unit; for imaging devices that have a locus of projection centers (e.g. a fisheye lens), f is characterized by calibration.
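
To make the role of f concrete, here is a minimal sketch that assumes an equidistant projection r = k·ϕ, a common wide-angle model chosen purely for illustration; the paper's actual f must be derived from the optical unit geometry or measured by calibration, and the scale k is hypothetical.

    import numpy as np

    K = 250.0  # assumed pixels-per-radian scale of the equidistant model

    def project(phi):
        """Radial image distance r (pixels) for a ray at polar angle phi."""
        return K * phi

    def unproject(x, y):
        """Map an image point (principal point at the origin) back to a
        unit viewing direction on the sphere."""
        phi = np.hypot(x, y) / K      # invert the assumed f
        theta = np.arctan2(y, x)      # azimuth is unchanged by the projection
        return np.array([np.sin(phi) * np.cos(theta),
                         np.sin(phi) * np.sin(theta),
                         np.cos(phi)])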

Fig. 3.

Minimum level of scene illuminance I0 versus exposure time Δt. This estimate is plotted (a) for different ray incidence angles θmax with the SNR fixed at 38 dB, and (b) for different target SNRs at θmax = 45°. For all plots: λ ≈ 550 nm, z = 7.5 μm, r = 1.8 mm (height of a 1/3" sensor), η = 37%, and an optical-efficiency attenuation slope of -1.5%/deg.

Fig. 4.

Field-widened pinhole camera. It consists of only three optical elements (half-ball lens + aperture disc + plano-convex lens) cemented together to form a compact optical system.

Fig. 5.

(a) Half-FOV as a function of x; (b) reflectance as a function of x; (c) half-FOV as a function of θ, with different PCX lens geometries chosen such that the reflectance at the edge of the field does not exceed ~20%; (d) field angle ϕ as a function of the ratio d/f. These estimates are plotted for several refractive indices (N).
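
The ~20% bound in (b) and (c) refers to Fresnel reflection losses at the lens surface, which rise steeply toward grazing incidence. Below is a short sketch of the textbook unpolarized Fresnel reflectance (our illustration, assuming air on the incident side; not the authors' code).

    import numpy as np

    def fresnel_reflectance(theta_i, N):
        """Unpolarized reflectance for an air-to-glass interface of
        refractive index N at incidence angle theta_i (radians)."""
        theta_t = np.arcsin(np.sin(theta_i) / N)  # Snell's law
        rs = (np.cos(theta_i) - N * np.cos(theta_t)) / (np.cos(theta_i) + N * np.cos(theta_t))
        rp = (N * np.cos(theta_i) - np.cos(theta_t)) / (N * np.cos(theta_i) + np.cos(theta_t))
        return 0.5 * (rs**2 + rp**2)

    # For N = 1.5 the losses pass ~20% near 70-75 deg incidence, which is
    # what bounds the usable half-FOV for a given lens geometry.
    print(fresnel_reflectance(np.deg2rad(80.0), 1.5))  # ~0.39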

Fig. 6.

Half-FOV as a function of θ when using a half-ball lens with a lower refractive index than that of the PCX lens.

Fig. 7.

From left to right: a field-widened pinhole image acquired in our office (illuminance ~150 lux); an image of a test target (the straight lines of the chessboard grid are severely distorted); an external view of the box-shaped test target (chessboard pattern on every side).

Fig. 8.

Sample of the computed optic flow field for four different single-camera motions: (a) color code, where the color represents the direction ψpi = arctan(yi/xi) of a flow vector pi = [xi; yi]T (e.g. red for ψpi = 0°, yellow for ψpi = 90°, green for ψpi = ±180°, blue for ψpi = -90°) and the brightness shows its magnitude (i.e. darker as |pi| diminishes); (b) translation along the Y-axis; (c) rotation about the Z-axis; (d) translation along the X-axis; (e) rotation about the X-axis (cf. reference frame in Fig. 6); (f), (g), (h) and (i) theoretical projections of the motion field for the corresponding camera motions. This simple example illustrates that the measured 2D velocity vectors qualitatively match the expected optic flow field of each simple motion.
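
A minimal sketch of the color coding described in (a), with hue following the flow direction around the color wheel and brightness following the magnitude; the HSV mapping is our approximation, and the exact hue anchors of the original palette may differ.

    import numpy as np
    import colorsys

    def flow_to_rgb(px, py):
        """Color-code a 2D flow field: hue follows the direction
        psi = arctan2(py, px), and the value channel darkens as the
        magnitude |p| diminishes."""
        psi = np.arctan2(py, px)
        mag = np.hypot(px, py)
        hue = (psi % (2 * np.pi)) / (2 * np.pi)  # 0 deg -> red, then around the wheel
        val = mag / (mag.max() + 1e-12)          # normalize brightness to the peak speed
        rgb = [colorsys.hsv_to_rgb(h, 1.0, v)
               for h, v in zip(hue.ravel(), val.ravel())]
        return np.array(rgb).reshape(px.shape + (3,))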

Tables (1)

Table 1. Specifications of the off-the-shelf components used for the realization of our optical sensor system.

Equations (13)

Equations on this page are rendered with MathJax.

$$\mathbf{p}_i = \frac{\partial \mathbf{d}_i}{\partial t} = -\frac{1}{D_i}\left(\mathbf{T} - (\mathbf{T}\cdot\mathbf{d}_i)\,\mathbf{d}_i\right) - \mathbf{R}\times\mathbf{d}_i$$

$$(\mathbf{t}\times\mathbf{d}_i)\cdot\left(\mathbf{p}_i + \mathbf{R}\times\mathbf{d}_i\right) = (\mathbf{t}\times\mathbf{d}_i)\cdot\left(-\frac{1}{D_i}\left(\mathbf{t} - (\mathbf{t}\cdot\mathbf{d}_i)\,\mathbf{d}_i\right)\right) = 0.$$

$$\frac{\partial I_i}{\partial t} = -\frac{\partial I_i}{\partial \mathbf{d}_i}\cdot\mathbf{p}_i.$$

$$\frac{\partial I_i}{\partial t} = \frac{\partial I_i}{\partial \mathbf{d}_i}\cdot\frac{\mathbf{t}}{D_i} + \left(\mathbf{d}_i\times\frac{\partial I_i}{\partial \mathbf{d}_i}\right)\cdot\mathbf{R}.$$

$$\mathbf{P}_i = \begin{bmatrix} \frac{\partial\theta}{\partial x} & \frac{\partial\theta}{\partial y} \\ \frac{\partial\phi}{\partial x} & \frac{\partial\phi}{\partial y} \end{bmatrix}\begin{bmatrix} \frac{\partial x}{\partial t} \\ \frac{\partial y}{\partial t} \end{bmatrix} = \begin{bmatrix} -\frac{y}{x^2+y^2} & \frac{x}{x^2+y^2} \\ \frac{\partial\phi}{\partial x} & \frac{\partial\phi}{\partial y} \end{bmatrix}\mathbf{p}_i'$$

$$\mathrm{SNR} \propto \sqrt{\bar{n}\,\eta}$$

$$\varnothing = \sqrt{\beta\,\lambda\,\frac{d\,f}{d+f}} \approx \sqrt{\beta\,\lambda\,f} \quad \text{in the limit } d \gg f.$$

$$(F/\#)^2 \approx \frac{f}{\beta\,\lambda} \approx \frac{r}{\beta\,\lambda\,\tan(\theta_{\max})}$$

$$\sin\theta_a = \frac{\sqrt{1 - x^2 + \cot^2\theta} - x\cot\theta}{1 + \cot^2\theta}$$

$$\mathrm{FOV} = 2\,(\theta_a + \theta_{\mathrm{IN}})$$

$$\theta = \theta_a + \theta_{\mathrm{OUT}}.$$

$$\delta C^2 = \left(\frac{\delta I}{I_0}\right)^2 \approx 2\cdot C^2\cdot\left(\frac{\delta\theta}{\delta\rho}\right)^2.$$

$$\mathrm{SN_cR} = M\cdot\frac{\delta C^2}{\delta N_c^2} \approx M\cdot 2\cdot C^2\cdot\left(\frac{\delta\theta}{\delta\rho}\right)^2\cdot\bar{n}\,\Delta t.$$
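
As a worked example of the pinhole sizing relations above, the sketch below evaluates the optimal diameter Ø ≈ √(βλf) and the resulting F-number for the parameters quoted in Fig. 3. The design constant β = 2 is our assumption; commonly quoted values for the optimal pinhole constant range roughly from 1 to 4.

    import math

    wavelength = 550e-9            # m, as in Fig. 3
    r = 1.8e-3                     # m, half-height of a 1/3" sensor
    theta_max = math.radians(45)   # edge-of-field ray angle
    beta = 2.0                     # assumed pinhole design constant

    f = r / math.tan(theta_max)                  # pinhole-to-sensor distance
    diameter = math.sqrt(beta * wavelength * f)  # optimal diameter in the d >> f limit
    f_number = f / diameter                      # consistent with (F/#)^2 ~ f / (beta * lambda)

    print(f"f = {f * 1e3:.2f} mm, diameter = {diameter * 1e6:.1f} um, F/# = {f_number:.0f}")
    # -> f = 1.80 mm, diameter = 44.5 um, F/# = 40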
