Abstract

This paper examines the performance of a low-cost, miniature, wide field-of-view (FOV) visual sensor that combines advanced pinhole optics with recent CMOS imager technology. The pinhole camera is often dismissed because of its apparent simplicity, small aperture, and limited image quality. However, its angular field can be dramatically widened using only a few off-the-shelf micro-optical elements. Paired with a modern high-sensitivity silicon-based digital retina, we show that it can be a practical device for self-motion estimation in mobile applications, such as the stabilization of a robotic micro-flyer.

© 2005 Optical Society of America


References


  1. D.V. Wick, T. Martinez, S.R. Restaino, and B.R. Stone, “Foveated imaging demonstration,” Opt. Express 10, 60–65 (2002).
    [PubMed]
  2. R. Volkel, M. Eisner, and K.J. Weible, “Miniaturized imaging system,” Microelectron. Eng. 67–68, 461–472 (2003).
    [Crossref]
  3. J. Neumann, C. Fermuller, and Y. Aloimonos, “Eyes from eyes: new cameras for structure from motion,” in IEEE Proceedings of Third Workshop on Omnidirectional Vision (Copenhagen, Denmark, 2002), 19–26.
    [Crossref]
  4. T. Netter and N. Franceschini, “A robotic aircraft that follows terrain using a neuro-morphic eye,” in IEEE Proceedings of Conference on Intelligent Robots and Systems (Lausanne, Switzerland, 2002), 129–134.
    [Crossref]
  5. K. Hoshino, F. Mura, H. Morii, K. Suematsu, and I. Shimoyama, “A small-sized panoramic scanning visual sensor inspired by the fly’s compound eye,” in IEEE Proceedings of Conference on Robotics and Automation (ICRA, Leuven, Belgium, 1998), 1641–1646.
  6. R. Hornsey, P. Thomas, W. Wong, S. Pepic, K. Yip, and R. Krishnasamy, “Electronic compound eye image sensor: construction and calibration,” in Sensors and Camera Systems for Scientific, Industrial, and Digital Photography Applications V, M.M. Blouke, N. Sampat, and R. Motta, eds., Proc. SPIE 5301, 13–24 (San Jose, Calif., 2004).
    [Crossref]
  7. J. Neumann, C. Fermuller, Y. Aloimonos, and V. Brajovic, “Compound eye sensor for 3D ego motion estimation,” in IEEE Proceedings of Conference on Intelligent Robots and Systems (Sendai, Japan, 2004).
  8. J. Tanida, T. Kumagai, K. Yamada, S. Miyatake, K. Ishida, T. Morimoto, N. Kondou, D. Miyazaki, and Y. Ichioka, “Thin observation module by bound optics (TOMBO): concept and experimental verification,” Appl. Opt. 40 (11), 1806–1813 (2001).
    [Crossref]
  9. J. Kim, K.H. Jeong, and L.P. Lee, “Artificial ommatidia by self-aligned microlenses and waveguides,” Opt. Lett. 30, 5–7 (2005).
  10. J. Gluckman and S.K. Nayar, “Egomotion and omnidirectional cameras,” in IEEE Proceedings of Conference on Computer Vision and Pattern Recognition (Bombay, India, 1998), 999–1005.
  11. P. Baker, R. Pless, C. Fermuller, and Y. Aloimonos, “New eyes for shape and motion estimation,” in Proceedings of the First International Workshop on Biologically Motivated Computer Vision, Lecture Notes in Computer Science 1811 (Springer-Verlag, 2000), 118–128.
    [Crossref]
  12. J.J. Koenderink and A.J. Van Doorn, “Facts on optic flow,” Biol. Cybern. 56, 247–254 (1987).
    [Crossref]
  13. T. Tian, C. Tomasi, and D. Heeger, “Comparison of approaches to egomotion estimation,” in IEEE Proceedings of Conference on Computer Vision and Pattern Recognition (San Francisco, US Calif., 1996), 315–320.
  14. M. Franz, J. Chahl, and H. Krapp, “Insect-inspired estimation of egomotion,” Neural Comput. 16, 2245–2260 (2004).
    [Crossref]
  15. G. Adiv, “Inherent ambiguities in recovering 3D motion and structure from a noisy field,” IEEE Trans. Pattern Anal. Mach. Intell. 11 (5), 477–489 (1989).
    [Crossref]
  16. C. Fermuller and Y. Aloimonos, “Observability of 3D motion,” Int. J. Comput. Vision 37(1), 46–63 (2000).
  17. J. Neumann, “Computer vision in the space of light rays: plenoptic video geometry and polydioptric camera design,” Ph.D. dissertation, Department of Computer Science, University of Maryland (2004).
  18. S. Srinivasan and R. Chellappa, “Noise-resilient estimation of optical flow by use of overlapped basis functions,” J. Opt. Soc. Am. A 16, 493–507 (1999).
    [Crossref]
  19. A. Bruhn, J. Weickert, and C. Schnorr, “Combining the advantages of local and global optic flow methods,” in DAGM Proceedings of Symposium on Pattern Recognition (2002), 457–462.
  20. C. Fermuller, Y. Aloimonos, P. Baker, R. Pless, J. Neumann, and B. Stuart, “Multi-camera networks: eyes from eyes,” in IEEE Proceedings of Workshop on Omnidirectional Vision, (2000), 11–18.
  21. T. Martinez, D.V. Wick, and S.R. Restaino, “Foveated, wide field-of-view imaging system using a liquid crystal spatial light modulator,” Opt. Express 8 (10), 555–560 (2001).
    [Crossref] [PubMed]
  22. J. Beckstead and S. Nordhauser (InterScience Inc.), “360 degree/forward view integral imaging system,” U.S. Patent 6,028,719 (22 February 2000).
  23. R. Constantini and S. Susstrunk, “Virtual sensor design,” in Sensors and Camera Systems for Scientific, Industrial, and Digital Photography Applications V, Proc. SPIE 5301, 408–419 (2004).
    [Crossref]
  24. T.H. Nilsson, “Incident photometry: specifying stimuli for vision and light detectors,” Appl. Opt. 22, 3457–3464 (1983).
    [Crossref] [PubMed]
  25. M. Young, “Pinhole optics,” Appl. Opt. 10, 2763–2767 (1971).
    [Crossref] [PubMed]
  26. K.D. Mielenz, “On the diffraction limit for lensless imaging,” J. Res. Natl. Inst. Stand. Technol. 104 (1999).
  27. B. Fowler, A.E. Gamal, D. Yang, and H. Tian, “A method for estimating quantum efficiency for CMOS image sensors,” in Solid State Sensor Arrays - Development and Applications, Proc. SPIE 3301, 178–185 (1998).
    [Crossref]
  28. P.B. Catrysse and B.A. Wandell, “Optical efficiency of image sensor pixels,” J. Opt. Soc. Am. A 19 (2002).
    [Crossref]
  29. J.M. Franke, “Field-widened pinhole camera,” Appl. Opt. 18 (1979).
    [Crossref] [PubMed]
  30. http://www.ovt.com
  31. R. de Ruyter van Steveninck and W. Bialek, “Timing and counting precision in the blowfly visual system,” in Methods in Neural Networks IV, J. van Hemmen, J.D. Cowan, and E. Domany, eds. (Springer-Verlag, Heidelberg, 2001), 313–371.
  32. W. Bialek, “Thinking about the brain,” in Les Houches Lectures on Physics of Bio-Molecules and Cells, H. Flyvbjerg, F. Jülicher, P. Ormos, and F. David, eds. (Springer-Verlag, Les Ulis, 2002), 485–577.




Figures (8)

Fig. 1.

Spherical sensor model: illustration of image formation and of the optic flow measurements pi induced by the sensor motion. θ and ϕ are the azimuthal and polar angles, respectively.

Fig. 2.

Perspective transformation model of a wide-angle imaging system. Radial image distortions are non-linear. The projection function r = f(ϕ), with f: ℝ⁺ → ℝ⁺, can be derived from the geometry of the optical unit; for imaging devices that have a locus of projection centers (e.g., a fisheye lens), f is characterized by calibration.

Fig. 3.

Minimum level of scene illuminance I0 versus exposure time Δt, plotted (a) for different ray incidence angles θmax with the SNR fixed at 38 dB, and (b) for different expected SNRs at θmax = 45°. For all plots: λ ≈ 550 nm, z = 7.5 μm, r = 1.8 mm (height of a 1/3” sensor), η = 37%, and an optical-efficiency attenuation slope of -1.5%/deg.

Fig. 4.

Field-widened pinhole camera. It consists of only three optical elements (half-ball lens + aperture disc + plano-convex lens) cemented together to form a compact optical system.

Fig. 5.

(a) Half-FOV as a function of x; (b) reflectance as a function of x; (c) half-FOV as a function of θ for different PCX lens geometries chosen so that the reflectance at the edge of the field does not exceed ~20%; (d) field angle ϕ as a function of the ratio d/f. These estimates are plotted for several refractive indices N.

Fig. 6.

Half-FOV as a function of θ when the half-ball has a lower refractive index than that of the PCX lens.

Fig. 7.

From left to right: a field-widened pinhole image acquired in our office (illuminance ~150 lux); an image of a test target (the straight lines of the chessboard grid are severely distorted); an external view of the box-shaped test target (chessboard pattern on every side).

Fig. 8.

Samples of the computed optic flow field for four different single-camera motions: (a) color code, where the color represents the direction ψpi = arctan(yi/xi) of a flow vector pi = [xi, yi]^T (e.g., red for ψpi = 0°, yellow for ψpi = 90°, green for ψpi = ±180°, blue for ψpi = -90°) and the brightness encodes its magnitude (darker as |pi| diminishes); (b) translation along the Y-axis; (c) rotation about the Z-axis; (d) translation along the X-axis; (e) rotation about the X-axis (cf. the reference frame in Fig. 6); (f), (g), (h), and (i) theoretical projections of the motion field for the corresponding camera motions. This simple example illustrates that the measured 2D velocity vectors qualitatively match the expected optic flow field of simple motions; a sketch of the color mapping in (a) follows below.
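The color code of panel (a) maps flow direction to hue and flow magnitude to brightness. A minimal NumPy/Matplotlib sketch of such a mapping follows (an illustration, not the authors' implementation; plain HSV places yellow at 60° rather than the caption's 90°, so this wheel only approximates the paper's palette).

```python
import numpy as np
import matplotlib.colors as mcolors

def flow_to_rgb(flow):
    """Map a flow field of shape (H, W, 2) to an RGB image: hue encodes
    the direction psi = arctan2(y, x), brightness encodes the magnitude."""
    x, y = flow[..., 0], flow[..., 1]
    psi = np.arctan2(y, x)                           # direction in (-pi, pi]
    mag = np.hypot(x, y)
    hsv = np.zeros(flow.shape[:2] + (3,))
    hsv[..., 0] = (psi % (2 * np.pi)) / (2 * np.pi)  # red at 0 deg (approximate wheel)
    hsv[..., 1] = 1.0                                # full saturation
    hsv[..., 2] = mag / (mag.max() + 1e-12)          # darker as |p| diminishes
    return mcolors.hsv_to_rgb(hsv)
```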

Tables (1)


Table 1. Specifications of the off-the-shelf components used for the realization of our optical sensor system.

Equations (13)

Equations on this page are rendered with MathJax.

$$ p_i = \frac{\partial d_i}{\partial t} = -\frac{1}{D_i}\left(T - (T \cdot d_i)\,d_i\right) - R \times d_i \tag{1} $$

$$ (T \times d_i) \cdot (p_i + R \times d_i) = -(T \times d_i) \cdot \frac{1}{D_i}\left(T - (T \cdot d_i)\,d_i\right) = 0. \tag{2} $$

$$ \frac{\partial I_i}{\partial t} = -\frac{\partial I_i}{\partial d_i} \cdot p_i. \tag{3} $$

$$ \frac{\partial I_i}{\partial t} = \frac{\partial I_i}{\partial d_i} \cdot \frac{T}{D_i} + \left(d_i \times \frac{\partial I_i}{\partial d_i}\right) \cdot R. \tag{4} $$

$$ P_i = \begin{bmatrix} \partial\theta/\partial x & \partial\theta/\partial y \\ \partial\phi/\partial x & \partial\phi/\partial y \end{bmatrix} \begin{bmatrix} \partial x/\partial t \\ \partial y/\partial t \end{bmatrix} = \begin{bmatrix} -\frac{y}{x^2+y^2} & \frac{x}{x^2+y^2} \\ \partial\phi/\partial x & \partial\phi/\partial y \end{bmatrix} p_i' \tag{5} $$
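As a numerical sanity check on Eqs. (1)–(2) (a sketch under the reconstructed sign conventions, not the authors' code), the following Python snippet builds the spherical motion field for a random viewing direction and verifies that the depth-independent epipolar constraint of Eq. (2) vanishes.

```python
import numpy as np

def motion_field(d, T, R, D):
    """Optic flow of unit viewing direction d on the view sphere for
    camera translation T, rotation rate R, and scene depth D (Eq. 1)."""
    return -(T - np.dot(T, d) * d) / D - np.cross(R, d)

rng = np.random.default_rng(0)
d = rng.normal(size=3); d /= np.linalg.norm(d)  # random viewing direction
T = rng.normal(size=3)                          # translation velocity
R = rng.normal(size=3)                          # rotation rate
p = motion_field(d, T, R, D=4.2)

# Epipolar constraint (Eq. 2): (T x d) . (p + R x d) = 0 for any depth D,
# because T x d is orthogonal to both T and d.
residual = np.dot(np.cross(T, d), p + np.cross(R, d))
print(f"epipolar residual: {residual:.2e}")     # ~0 up to rounding
```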
$$ \mathrm{SNR} \propto \sqrt{\bar{n}\,\eta} \tag{6} $$

$$ \varnothing = \beta \sqrt{\lambda \, \frac{d f}{d + f}} \approx \beta \sqrt{\lambda f} \quad \text{in the limit } d \gg f. \tag{7} $$

$$ (F/\#)^2 \approx \frac{f}{\beta^2 \lambda} \approx \frac{r}{\beta^2 \lambda \tan(\theta_{\max})} \tag{8} $$
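A brief numerical sketch of the pinhole-sizing relations of Eqs. (7)–(8); the input values below, including Rayleigh's classic constant β ≈ 1.9, are illustrative assumptions rather than design numbers from the paper (see Refs. 25–26 for optimal choices of β).

```python
import numpy as np

# Assumed example values (not taken from this paper):
lam = 550e-9   # wavelength [m]
f = 1.8e-3     # pinhole-to-sensor distance [m]
d_obj = 1.0    # object distance [m]
beta = 1.9     # Rayleigh's classic pinhole constant

diameter = beta * np.sqrt(lam * d_obj * f / (d_obj + f))  # Eq. (7)
diameter_far = beta * np.sqrt(lam * f)                    # limit d >> f
f_number = np.sqrt(f / (beta**2 * lam))                   # from Eq. (8)

print(f"pinhole diameter: {diameter*1e6:.1f} um "
      f"(far limit {diameter_far*1e6:.1f} um)")
print(f"F-number: f/{f_number:.0f}")
```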
$$ \sin\theta_a = \frac{\sqrt{1 - x^2 + \cot^2\theta} - x \cot\theta}{1 + \cot^2\theta} \tag{9} $$

$$ \mathrm{FOV} = 2\,(\theta_a + \theta_{IN}) \tag{10} $$

$$ \theta = \theta_a + \theta_{OUT}. \tag{11} $$
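A quick consistency check of the reconstructed Eq. (9), taking x as the normalized ray offset: for x = 0 the formula reduces to θa = θ, as a ray through the center of the half-ball should be undeviated. A minimal sketch:

```python
import numpy as np

def theta_a(x, theta):
    """Eq. (9): geometry angle theta_a for normalized ray offset x
    and incidence angle theta (radians)."""
    c = 1.0 / np.tan(theta)
    s = (np.sqrt(1 - x**2 + c**2) - x * c) / (1 + c**2)
    return np.arcsin(s)

theta = np.radians(35.0)
print(np.degrees(theta_a(0.0, theta)))  # -> 35.0: central ray unchanged
print(np.degrees(theta_a(0.5, theta)))  # off-center ray, theta_a < theta
```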
$$ \delta C^2 = \left(\frac{\delta I}{I_0}\right)^2 \approx 2\, C^2 \left(\frac{\delta\theta}{\delta\rho}\right)^2. \tag{12} $$

$$ \mathrm{SNR}_c = M \cdot \frac{\delta C^2}{\delta N_c^2} \approx 2\, M\, C^2 \left(\frac{\delta\theta}{\delta\rho}\right)^2 \bar{n}\,\Delta t. \tag{13} $$
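To make Eq. (13) concrete, a small numerical example; every value below is an illustrative assumption, not a measurement from the paper.

```python
import numpy as np

# Illustrative values only (assumptions, not the paper's measurements):
M = 100            # number of pooled measurements
C = 0.3            # scene contrast
dtheta_drho = 0.5  # ratio of angular displacement to receptor spacing
n_bar = 1e4        # mean photon rate per pixel [photons/s]
dt = 0.01          # exposure time [s]

snr_c = 2 * M * C**2 * dtheta_drho**2 * n_bar * dt  # Eq. (13)
print(f"contrast SNR: {snr_c:.0f} ({10*np.log10(snr_c):.1f} dB)")
```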
