Abstract

A family of reflective surfaces is presented that, when imaged by a camera, can capture a global view of the visual environment. By using these surfaces in conjunction with conventional imaging devices, it is possible to produce fields of view in excess of 180° that are not affected by the distortions and aberrations found in refractive wide-angle imaging devices. By solving a differential equation expressing the camera viewing angle as a function of the angle of incidence on a reflective surface, a family of appropriate surfaces has been derived. The surfaces preserve a linear relationship between the angle of incidence of light onto the surface and the angle of reflection onto the imaging device, as does a normal mirror. However, the gradient of this linear relationship can be varied as desired to produce a larger or smaller field of view. The resulting family of surfaces has a number of applications in surveillance and machine vision.
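For intuition, the closed-form member of this family given later in the paper, cos[θ(1+α)/2] = (r/r0)^(−(1+α)/2), can be sampled numerically. The following is a minimal sketch (ours, not the authors' code) that traces the mirror profile for a chosen angular magnification α:

```python
import numpy as np

def mirror_profile(theta, alpha=7.0, r0=1.0):
    """Radial distance r(theta) of the reflective surface, from the
    closed-form solution cos[theta*(1+alpha)/2] = (r/r0)^(-(1+alpha)/2).
    theta is the camera's radial viewing angle (radians); each theta
    sees the direction of elevation phi = alpha * theta."""
    k = (1.0 + alpha) / 2.0
    return r0 * np.cos(k * theta) ** (-1.0 / k)

# Valid while the cosine argument stays below 90 deg: theta < 180/(1+alpha) deg.
theta = np.radians(np.linspace(0.0, 22.0, 50))   # alpha = 7: valid up to 22.5 deg
r = mirror_profile(theta)
x, z = r * np.sin(theta), r * np.cos(theta)      # profile in camera coordinates
```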

© 1997 Optical Society of America

References

  1. M. V. Srinivasan, “Generalized gradient schemes for the measurement of two-dimensional image motion,” Biol. Cybernet. 63, 421–431 (1990).
  2. T. R. Halfhill, “See you around,” Byte 20(5), 85–90 (1995).
  3. K. Kato, T. Nakanishi, A. Shio, K. Ishii, “Structure from image sequences captured through a monocular extra-wide angle lens,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE Computer Society Press, Los Alamitos, Calif., 1994), pp. 919–924.
  4. G. A. Horridge, “The compound eye of insects,” Sci. Am. 237, 108–120 (July 1977).
  5. H. H. Wolff, “Wide angle television display system,” U.S. Patent 4,246,603 (20 January 1981).
  6. C. R. Driskell, “Color panoramic laser projector,” U.S. Patent 3,992,718 (16 November 1976).
  7. W. V. Dykes, “Wide angle single channel projection apparatus,” U.S. Patent 3,998,532 (21 December 1976).
  8. J. Hong, X. Tan, B. Pinette, R. Weiss, E. M. Riseman, “Image-based homing,” in Proceedings of the 1991 IEEE International Conference on Robotics and Automation (IEEE, New York, 1991), pp. 620–625.
  9. J. S. Chahl, M. V. Srinivasan, “Visual computation of ego-motion using an image interpolation technique,” Biol. Cybernet. 74, 405–411 (1996).
  10. T. Röfer, “Controlling a wheelchair with image-based homing,” in AISB Workshop on Spatial Reasoning in Mobile Robots and Animals (The Society for the Study of Artificial Intelligence and the Simulation of Behaviour, Manchester, UK, 1997), pp. 66–75.
  11. K. Kamejima, H. Yamamoto, Y. Nakano, M. Fujie, T. Iwamoto, K. Honma, “Picture processing apparatus,” U.S. Patent 4,549,208 (22 October 1985).
  12. M. Tistarelli, G. Sandini, “On the advantages of polar and log-polar mapping for direct estimation of time-to-impact from optical flow,” IEEE Trans. Pattern Anal. Mach. Intell. 15, 401–410 (1993).

Figures (18)

Fig. 1

Processes involved in producing a panoramic image from a montage of normal still or video images (Ref. 2): (1) Images are captured at regular angular intervals so that each image slightly overlaps its neighbors. (2) The images are stitched together by matching features in the overlapping regions (if accurate stepped motion control is available, this procedure is simplified). (3) Cylindrical distortion is added to reproduce the effect of imaging the environment with a curved (angular) sensor rather than the series of flat (planar) sensors used in the acquisition phase. (4) The image is displayed (adapted from Ref. 2).
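As a rough illustration of this montage pipeline (not part of the paper), OpenCV's high-level stitching module performs steps (2) and (3) above; the input file pattern pan_*.jpg is hypothetical:

```python
import glob
import cv2

# Step (1) of the caption: overlapping frames captured at regular
# angular intervals, assumed here to be saved as pan_000.jpg, pan_001.jpg, ...
images = [cv2.imread(p) for p in sorted(glob.glob("pan_*.jpg"))]

# Steps (2) and (3): feature matching in the overlap regions, then
# warping onto a curved compositing surface.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)

# Step (4): store/display the panoramic result.
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
```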

Fig. 2

Wide-angle imaging with a reflective surface. If the correct profile is chosen for a reflective surface placed over an upward facing camera, very-wide-angle imaging is possible. Radial angle θ is the angle of light rays impinging on the camera with respect to the optical axis of the camera. Angle of elevation ϕ is the angle of incoming light with respect to the vertical axis of the surface.

Fig. 3

Polished aluminum cone that, when placed above a camera, produces a panoramic strip image of the environment. The cone angle β must be chosen with reference to ϕ, the expected angle of rays impinging on the cone surface, and θ, the radial angle of light entering the camera.

Fig. 4

Elevation angular magnification α, the ratio of the change in angle of elevation δϕ to the corresponding change in radial angle δθ.

Fig. 5

Relationships used to derive a family of surfaces for increasing the field of view of a conventional imaging device: r, distance from the camera to the surface at radial angle θ; γ, difference in angle between the vector described by r and θ and the tangent to the surface.

Fig. 6

Ray-traced images illustrating the views captured by a camera placed underneath mirrors of various shapes. All images were captured with the camera 0.5 units from the apex of the surfaces. A, World inhabited by the camera and the surfaces. The black spheres were equiangularly spaced along the vertical at 20° intervals, starting from the upper pole, and at 45° intervals about the horizontal. The upward-facing camera is in the center of the image. The background is textured somewhat like clouds and blue sky. B, Image from a reflective sphere of radius 0.115 units. The angle imaged exceeds 280° diagonally. However, the images of the black spheres are not equally spaced in the radial direction, particularly near the perimeter where extreme compression occurs. The lens of the camera can be seen in the center of this image and the other reflected images. C, Image from a polynomial surface (rectangular hyperboloid) with an angular magnification α of 3. The images of the black spheres are equidistantly spaced. D, Image from a polynomial surface with an angular magnification of 7. The angle imaged exceeds 280° diagonally. The images of the spheres are again equidistantly spaced, and more spheres are imaged. Note the unavoidable elongation of the spheres in the azimuthal direction. This is the result of mapping a spherical coordinate system onto a plane.

Fig. 7

Different types of misalignment of the camera with respect to the surface: A, Correct location of the camera; B, camera placed too close; C, camera placed too far away; D, camera’s optical axis not coincident with the axis of symmetry of the surface.

Fig. 8

Effect of displacing the camera along the optical axis (the y axis, perpendicular to the imaging plane). The surface is designed for a nominal angular magnification of 7 at distance r0 = 50 mm (solid line). When the camera is moved closer to the surface, the effective angular magnification of the camera is decreased, and when it is moved farther away, the angular magnification is increased. However, the deviation of the curve from a straight line (which would indicate constant angular magnification) is small even for large displacements.

Fig. 9

Effect of different misalignments for a surface with an angular magnification of 7 and r0 = 0.5 units: A, Camera moved 0.1 units farther away along the optical axis. The spacing between the black spheres remains nearly constant across the radius of the image, although the effective angular magnification appears to increase. B, Camera moved 0.1 units closer to the surface. The effective angular magnification across the radius appears constant, although less environment is visible. C, Camera moved off the vertical axis of the surface by 0.1 units. The image then becomes highly distorted.

Fig. 10

When the camera is shifted away from the vertical axis of the surface, the angular magnification of the surface is unaffected; however, the imaged angle is offset by a constant amount. This graph shows only the effect for the part of the surface lying along the axis of translation; other regions would be affected differently because of the circular geometry of the device. The important result is the angular offset in the part of the environment imaged, indicating that alignment along directions perpendicular to the optical axis is critical.

Fig. 11

By using an arrangement of this type with A > 90°, it is possible to image the entire environment, except for the parallelepiped in the shadow of the camera. The shadow is of constant area in the environment, unlike the occluded regions in the earlier examples, which had angular dimensions.

Fig. 12

One solution to a number of practical problems in implementing such a device. The camera is concealed within the primary reflector, which is important for robustness. The secondary reflector has A < 90°, which, given the correct geometry, prevents it from viewing the region of the primary reflector that would contain its own reflection. The primary reflector has A > 90°, which causes the shadow of the secondary mirror to be a cylinder in space rather than a cone. The output image would not contain any part of the camera or secondary reflector, and the hole in the world at the center of the image would be restricted to the projected area of the secondary reflective surface.

Fig. 13

Experimental implementation of the global imaging system. A, Three surfaces produced by turning aluminum on a CNC lathe; the mirrored finish was achieved by polishing with metal polishes of various grades. B, Assembled device enclosed in a glass tube for rigidity and protection against dust; internal reflection did not appear to be a major problem. C, Image produced by the device; the field of view is approximately 240°. D, Result of unwarping image C.

Fig. 14

Polar CCD array able to provide direct unwarping by clocking out each ring of pixels as a standard video row, yielding a rectangular, unwarped image.
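In software, the same ring-to-row mapping can be approximated by inverting the transform x = y_u cos(2πx_u), y = y_u sin(2πx_u) listed in the Equations section. A minimal nearest-neighbor sketch (ours, not the authors' implementation):

```python
import numpy as np

def unwarp(img, out_w=720, out_h=240):
    """Map each ring of the circular input image to one row of the
    output, emulating the polar CCD of Fig. 14. Column index x_u is
    a fraction of a full turn; row index y_u is the radius in pixels."""
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    r_max = min(cx, cy)
    out = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    for row in range(out_h):
        radius = r_max * row / out_h
        for col in range(out_w):
            ang = 2.0 * np.pi * col / out_w
            x = int(round(cx + radius * np.cos(ang)))
            y = int(round(cy + radius * np.sin(ang)))
            if 0 <= x < w and 0 <= y < h:
                out[row, col] = img[y, x]
    return out
```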

Fig. 15

Image captured in the Australian National University’s Student Union Courtyard with a surface of angular magnification of 7.

Fig. 16

Cone allowing the camera to be replaced by a nodal point at O, defined as the mirror image of the real nodal point at N; this simplifies the derivation.

Fig. 17

Boundary conditions of the differential equation to be derived.

Fig. 18

Geometric analysis of dual-surface configuration.

Equations (29)

$\phi = 180^\circ - 2\beta + \theta.$
$\alpha = \delta\phi / \delta\theta,$
$\tan\gamma = r\,\mathrm{d}\theta/\mathrm{d}r,$
$\gamma = \tan^{-1}(r\,\mathrm{d}\theta/\mathrm{d}r).$
$\dfrac{\mathrm{d}}{\mathrm{d}\theta}\left[\tan^{-1}\left(r\dfrac{\mathrm{d}\theta}{\mathrm{d}r}\right)\right] = \kappa,$
$\tan^{-1}(r\,\mathrm{d}\theta/\mathrm{d}r) = \kappa\theta + A,$
$\cot(\kappa\theta + A)\,\mathrm{d}\theta = \mathrm{d}r/r.$
$\ln[\sin(\kappa\theta + A)]/\kappa = \ln r + \ln B = \ln(Br),$
$\sin(\kappa\theta + A) = \exp[\kappa \ln(Br)] = (Br)^{\kappa}.$
$\delta\phi = -\delta\theta - 2\kappa\,\delta\theta,$
$\mathrm{d}\phi/\mathrm{d}\theta = -1 - 2\kappa,$
$\kappa = -(1 + \alpha)/2.$
$\sin[A - \theta(1 + \alpha)/2] = (Br)^{-(1 + \alpha)/2}.$
$\sin[90^\circ - \theta(1 + \alpha)/2] = (Br)^{-(1 + \alpha)/2},$
$\cos[\theta(1 + \alpha)/2] = (Br)^{-(1 + \alpha)/2}.$
$B^{-(1 + \alpha)/2} = r_0^{(1 + \alpha)/2}.$
$\cos[\theta(1 + \alpha)/2] = (r/r_0)^{-(1 + \alpha)/2}.$
$x = y_u \cos(2\pi x_u), \quad y = y_u \sin(2\pi x_u),$
$2q = (b/2)\cos(90^\circ - \alpha), \quad q = (b/4)\sin\alpha; \quad q/p = \tan\alpha, \quad p = q/\tan\alpha = \frac{b}{4}\sin\alpha\,\frac{\cos\alpha}{\sin\alpha} = (b/4)\cos\alpha.$
$\dfrac{\mathrm{d}}{\mathrm{d}\theta}\left[\tan^{-1}\left(r\dfrac{\mathrm{d}\theta}{\mathrm{d}r}\right)\right] = \kappa,$
$\tan^{-1}(r\,\mathrm{d}\theta/\mathrm{d}r) = \kappa\theta + A,$
$r\,\mathrm{d}\theta/\mathrm{d}r = \tan(\kappa\theta + A),$
$\mathrm{d}r/r = \mathrm{d}\theta/\tan(\kappa\theta + A).$
$\ln[\sin(\kappa\theta + A)]/\kappa = \ln r + \ln B = \ln(Br), \quad \sin(\kappa\theta + A) = (Br)^{\kappa} = Cr^{\kappa} \ (C \equiv B^{\kappa}),$
$90^\circ = \kappa(90^\circ - \alpha) + A; \quad A = \kappa\alpha + 90^\circ(1 - \kappa).$
$\sin[\kappa(\theta + \alpha) + 90^\circ(1 - \kappa)] = Cr^{\kappa}.$
$\mathrm{length}(OB) = l + p = d\sin\alpha + (b/4)\cos\alpha = r_0.$
$\sin[\kappa(90^\circ - \alpha + \alpha) + 90^\circ(1 - \kappa)] = Cr_0^{\kappa}, \quad \sin 90^\circ = Cr_0^{\kappa}; \quad C = 1/r_0^{\kappa}.$
$(r_0/r)^{\kappa} = \cos[\kappa(\theta + \alpha - 90^\circ)],$
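As a numerical sanity check (our sketch, not from the paper), one can verify that the surface profile reconstructed from these equations makes γ = tan⁻¹(r dθ/dr) a straight line of slope κ = −(1+α)/2, as the derivation requires:

```python
import numpy as np

alpha, r0 = 7.0, 1.0
k = (1.0 + alpha) / 2.0                     # kappa = -(1 + alpha)/2 = -k

theta = np.linspace(0.01, np.radians(20.0), 200)
r = r0 * np.cos(k * theta) ** (-1.0 / k)    # surface profile r(theta)
dr_dtheta = r * np.tan(k * theta)           # analytic derivative of this profile

# gamma = arctan(r * dtheta/dr); its slope in theta should equal kappa.
gamma = np.arctan2(r, dr_dtheta)
kappa_est = np.gradient(gamma, theta)

print(np.allclose(kappa_est, -k))           # True: constant slope kappa
```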
