Abstract

This paper describes an application for arrays of narrow-field-of-view sensors with parallel optical axes. These devices exhibit complementary characteristics with respect to conventional perspective-projection or angular-projection imaging devices. Conventional imaging devices measure rotational egomotion directly by measuring the angular velocity of the projected image. Translational egomotion cannot be measured directly by these devices because the induced image motion depends on the unknown range of the viewed object. A known translational motion, however, generates image velocities that can be used to recover the ranges of objects and hence the three-dimensional (3D) structure of the environment. A new method is presented for computing egomotion and range using the properties of linear arrays of independent narrow-field-of-view optical sensors. Their approximately parallel projection can be used to measure translational egomotion directly from the velocity of the image, whereas a known rotational motion of the paraxial sensor array generates image velocities that can be used to recover the 3D structure of the environment. Results from tests of an experimental array confirm these properties.

© 2014 Optical Society of America
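
To make the complementary behavior concrete, the following sketch (not taken from the paper; a minimal illustration under assumed conditions) estimates translational egomotion from the shift of the parallel-projection image between two samples of a 1D array. Because a parallel projection preserves physical scale regardless of range, the image displacement in millimeters equals the array's own translation. The cross-correlation shift estimator and all names and parameters here are illustrative stand-ins for whatever image-velocity estimator the paper itself uses.

```python
import numpy as np

def estimate_translation_speed(profile_t0, profile_t1, spacing_mm, dt_s):
    """Estimate translational egomotion of a paraxial (parallel-projection) array.

    Under parallel projection the image displacement on the array equals the
    physical translation of the array, independent of object range, so the
    translation speed follows directly from the measured image shift.

    profile_t0, profile_t1 : 1D intensity profiles from the array (NumPy arrays)
    spacing_mm             : spacing between array elements (mm)
    dt_s                   : time between the two samples (s)
    """
    # Zero-mean the profiles and locate the peak of their cross-correlation
    # (an illustrative shift estimator, not the paper's own algorithm).
    a = profile_t0 - profile_t0.mean()
    b = profile_t1 - profile_t1.mean()
    lag = np.argmax(np.correlate(b, a, mode="full")) - (len(a) - 1)
    image_shift_mm = lag * spacing_mm
    # The sign of the recovered motion depends on the coordinate convention;
    # its magnitude equals the array's translation speed.
    return image_shift_mm / dt_s
```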


Figures (11)

Fig. 1.

Comparison between a device generating a perspective projection view (top) and a parallel projection view (bottom). The essential observation is that the size of the object on the imaging plane depends on object distance for a perspective projection but is independent of object distance for a parallel projection.
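
As a worked comparison (standard thin-lens geometry, not a statement from the paper; the focal-length symbol $f_L$ is introduced here), an object of size $S$ at distance $d$ images under perspective projection to a size of approximately

$$s_{\text{perspective}} \approx \frac{f_L\, S}{d},$$

which shrinks as $d$ grows, whereas under parallel projection the image size is simply $s_{\text{parallel}} = S$, independent of $d$.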

Fig. 2.

Range measurement by rotating a paraxial array. θ is the angle through which the array is rotated, r is the perpendicular distance of the point object from the array, R is the distance of the object from the pivot point of the array, and ω = dθ/dt is the angular velocity of the array.

Fig. 3.

Two simple ways of approximating a true parallel projection device. (Left) A long, thin, nonreflective tube in front of each element in the array allows only light parallel to the direction of the tube to pass through. (Right) A photodiode at the focus of a parabolic mirror can form a single unit in an array. Both solutions capture little light.

Fig. 4.

Paraxial array in which individual units have finite fields of view. The array views a point P in a scene with intensity profile f(x) and range profile r(x) along the length of the array; each element has an angular intensity response g(θ) as a function of view angle. The spatial position at which P projects onto the array is ξ, which is determined by r(x) and θ. It can be shown that as the array moves along the x direction at rate v, the response profile of the array also moves at rate v.
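
The following sketch (an illustration under stated assumptions, not code from the paper) simulates the array response described above. It assumes a Gaussian angular acceptance for g(θ) and approximates the implicit relation ξ/r(x − vt + ξ) = tan θ by ξ ≈ r(x) tan θ, i.e., the range profile is taken to vary slowly across each unit's narrow field of view; all function names and parameters are hypothetical.

```python
import numpy as np

def array_response(f, r, acceptance_fwhm_deg, element_x_mm, n_angles=201):
    """Simulate the response profile o(x) of a paraxial array of narrow-FOV units.

    f : callable, scene intensity profile f(x) along the array direction (mm)
    r : callable, scene range profile r(x) (mm)
    acceptance_fwhm_deg : assumed FWHM of each unit's angular response g(theta)
    element_x_mm : positions of the array elements (mm)
    """
    element_x_mm = np.asarray(element_x_mm, dtype=float)
    sigma = np.radians(acceptance_fwhm_deg) / 2.355        # Gaussian width from FWHM
    # The model integrates g(theta) over +/- pi/2; with a narrow acceptance the
    # integrand is negligible beyond a few widths, so the sum is truncated there.
    thetas = np.linspace(-5 * sigma, 5 * sigma, n_angles)
    g = np.exp(-0.5 * (thetas / sigma) ** 2)
    g /= g.sum()                                            # normalized g(theta)
    response = np.empty_like(element_x_mm)
    for i, x in enumerate(element_x_mm):
        # A ray at view angle theta from the element at x meets the scene at a
        # lateral offset xi ~ r(x) * tan(theta)  (from xi / r = tan(theta)).
        xi = r(x) * np.tan(thetas)
        response[i] = np.sum(g * f(x + xi))
    return response
```

Evaluating the same scene at increasing ranges reproduces the effect shown in Fig. 5: a larger r(x) maps the same angular window onto a wider patch of the scene, low-pass filtering the image.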

Fig. 5.

(Left) True parallel projection array; (right) paraxial array composed of units with finite fields of view. As range increases the image produced by the array on the left is unaffected, whereas the image produced by the array on the right is increasingly low-pass filtered by the finite angular acceptance window of each unit.

Fig. 6.

(Left) The experimental implementation of the device was a camera on a computer-controlled traverse; images were captured by progressively sampling along the traverse. (Right) A region of 3×3 pixels (1°) near the optical axis of the camera was chosen to approximate a single unit in the array. It is apparent that as range increases, the contrast of the image of the stripe pattern decreases. The array was 80 mm long with 0.207 mm intervals between units of the array.

Fig. 7.

Pattern chosen for the initial tests of both the range and egomotion algorithms. This pattern provides a wide range of spatial frequencies but no spatial structure. The array only imaged the central area of the pattern, not the background.

Fig. 8.

(Left) Egomotion computations across a total displacement of 5 mm, using an array of length 30 mm containing 150 units. The array was placed in front of a random dot pattern at ranges of 400 mm in one experiment and 1500 mm in another. The dashed line represents a perfectly linear relationship between the true value of egomotion and the measured results. (Right) Array of length 40 mm containing 10 units, at ranges of 500 and 1000 mm, respectively, in two different experiments, as well as in a more realistic scene at an average range of 1700 mm (location b in Fig. 10). Maximum motion was 8 mm.

Fig. 9.

Two paraxial arrays with equivalent properties, one capable of rotating each optical element, the other capable of rotating the entire array as a unit.

Fig. 10.

Test scene for evaluating the range computation scheme. The scene consists of an assemblage of discarded computer equipment used as a target for range computations. The approximate locations where range computations were undertaken are circled and labeled from a to e. The true ranges of these items varied from 1200 to 2200 mm. There was a maximum error of approximately 50 mm in determining the ground-truth ranges of the target locations because they were oriented randomly with respect to the camera.

Fig. 11.

(Left) The range target was a random dot pattern. The array had a length of 80 mm and contained 320 units. For measurement of range, a 0.5° rotation of the array was simulated. The dashed line indicates the expected response of the device, and the circles indicate the range measurements. (Right) Range targets were chosen from a pile of discarded computer equipment. The letters in the legend refer to locations shown in Fig. 10. Positions on the objects were selected only if they had a reasonable amount of contrast nearby. The array had a length of 80 mm and contained 320 units. The simulated rotation was 0.5°. The array had a length of 60 mm and contained eight units. The target was the range key from Fig. 10.

Equations (8)


$$\frac{dx}{dt} = R\cos(\theta)\cdot\frac{d\theta}{dt}.$$
$$R = \frac{dx/dt}{(d\theta/dt)\cos(\theta)} = \frac{dx/dt}{\omega\cos(\theta)},$$
$$r = R\cos\theta = \frac{dx/dt}{\omega}.$$
$$r = \frac{\text{Image Velocity}}{\text{Array Angular Velocity}}.$$
$$\frac{\xi}{r(x - vt + \xi)} = \tan\theta.$$
$$P(\theta, x, t) = f\left[x - vt + \xi(\theta,\, x - vt)\right].$$
$$o(x, t) = \int_{-\pi/2}^{\pi/2} g(\theta)\, P(\theta,\, x - vt)\, d\theta,$$
$$\frac{\lambda}{\theta_\delta} \leq A,$$
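
As a minimal worked example of the range recovery expressed by the relations above (using a cross-correlation shift estimator as a stand-in for the paper's own image-velocity measurement; names and parameters are illustrative), a known small rotation of the array converts the measured image displacement directly into range:

```python
import numpy as np

def estimate_range_mm(profile_before, profile_after, spacing_mm, rotation_rad,
                      view_angle_rad=0.0):
    """Recover range from a small, known rotation of the paraxial array.

    From dx/dt = R cos(theta) * dtheta/dt: the slant range is
    R = dx / (dtheta * cos(theta)), and the perpendicular range
    r = R cos(theta) reduces to r = dx / dtheta.

    profile_before, profile_after : 1D intensity profiles spanning the rotation
    spacing_mm     : spacing between array elements (mm)
    rotation_rad   : known rotation applied between the two samples (rad)
    view_angle_rad : view angle theta of the target off the array normal (rad)
    """
    # Illustrative shift estimator (cross-correlation), not the paper's own.
    a = profile_before - profile_before.mean()
    b = profile_after - profile_after.mean()
    lag = np.argmax(np.correlate(b, a, mode="full")) - (len(a) - 1)
    dx_mm = abs(lag) * spacing_mm                           # image displacement magnitude
    R_mm = dx_mm / (rotation_rad * np.cos(view_angle_rad))  # slant range R
    return R_mm * np.cos(view_angle_rad)                    # perpendicular range r
```

For example, with the 0.5° (≈ 0.0087 rad) simulated rotation used in Fig. 11, a target at 1000 mm displaces the image by roughly 1000 mm × 0.0087 ≈ 8.7 mm along the array.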
