Abstract

A new formula for a catadioptric panoramic lens with an equidistance projection scheme has been derived. The fabricated lens has a field of view that is wider than that of any previously reported panoramic lens, and the nonimaged region near the back of the camera has a constant volume with zero angular extension.

© 2005 Optical Society of America

Figures (16)

Fig. 1

Schematic diagram illustrating the operational principle of the catadioptric panoramic lens.

Fig. 2

Schematic diagram illustrating specular reflections in the panoramic mirror along with the variables needed for analyzing the mirror profile.

Fig. 3

Camera field of view determined by the image sensor size and the focal length of the lens.

Fig. 4

(a) Profile of a panoramic mirror with the equidistance projection. The design parameters are θ1 = 10°, θ2 = 20°, δ1 = 180°, δ2 = 45°, and ρi = ρ1 = 5.0 cm. (b) A magnified view of the mirror in (a) near the center hole. Note that the figure is not to scale.

Fig. 5

Mirror profile in Fig. 4 (dotted curve) along with the best numerical fit (solid curve) by use of a sixth-order polynomial.
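A minimal sketch of how such a fit can be reproduced with NumPy, assuming hypothetical arrays rho and z that sample the mirror profile (in practice they would come from integrating the mirror equation, as sketched after the Equations section); the placeholder profile below is illustrative only:

```python
import numpy as np

# Placeholder samples of the mirror profile z(rho); in practice these would
# come from the numerically integrated profile of Fig. 4.
rho = np.linspace(0.5, 5.0, 200)
z = 10.0 - 0.2 * rho**2            # illustrative stand-in, not the actual mirror

# Least-squares fit of the sixth-order polynomial z(rho) = sum_n C_n * rho**n (Eq. (17)).
coeffs = np.polynomial.polynomial.polyfit(rho, z, deg=6)   # C_0 ... C_6
z_fit = np.polynomial.polynomial.polyval(rho, coeffs)
print("fitting coefficients:", coeffs)
print("maximum fit residual:", np.max(np.abs(z_fit - z)))
```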

Fig. 6

Calculated ray trajectories for the panoramic mirror in Fig. 4. The maximum allowable diameter of the camera is D.

Fig. 7

Zenith angles δ of the incident rays versus the corresponding distances ξ in the image sensor. The distance is measured from the optical axis to the pixel that the reflected ray hits.
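The expected shape of this plot follows directly from Eqs. (11) and (12) in the Equations section below: the image-plane distance is a linear function of the zenith angle, which is precisely the equidistance projection,

```latex
\xi = f\tan\theta = f(\beta\delta + \psi)
\quad\Longrightarrow\quad
\frac{d\xi}{d\delta} = f\beta = \text{constant}.
```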

Fig. 8

Inverting-type panoramic mirror profile along with the ray trajectories. The design parameters are θ1 = 10°, θ2 = 20°, δ1 = 70°, δ2 = 180°, and ρi = ρ2 = 12.1070 cm.

Fig. 9

Schematic diagram illustrating the location and the size of the planar ring mirror in a folded design. Only the case with δ1 = 180° is considered.
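As a worked example (using the Fig. 4 design parameters, which correspond to this δ1 = 180° case), the fold height and the ring-mirror radii follow from Eqs. (20), (22), and (23) in the Equations section below; the variable names are illustrative:

```python
import numpy as np

# Design parameters taken from the Fig. 4 caption.
theta1, theta2 = np.radians(10.0), np.radians(20.0)
rho1 = 5.0   # cm, rim radius of the curved mirror

# Height of the fold plane and inner/outer radii of the planar ring mirror.
z_o = rho1 / np.tan(theta2)     # z_o = rho1 * cot(theta2), about 13.74 cm
rho_I = z_o * np.tan(theta1)    # inner radius, about 2.42 cm
rho_O = z_o * np.tan(theta2)    # outer radius, equal to rho1 = 5.0 cm
print(f"z_o = {z_o:.2f} cm, rho_I = {rho_I:.2f} cm, rho_O = {rho_O:.2f} cm")
```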

Fig. 10

Schematic diagram of a folded panoramic mirror.

Fig. 11

Profiles and locations of the curved and the planar mirrors in the folded panoramic mirror. The curved mirror is identical to the one depicted in Fig. 4.

Fig. 12

Photograph of the fabricated folded panoramic mirror along with the digital camera. When in operation, the camera lens should protrude through the center hole of the curved mirror.

Fig. 13

Schematic diagram of the experimental setup for verifying the projection scheme of the folded panoramic mirror.

Fig. 14

(a) Exemplary panoramic image taken by use of the setup depicted in Fig. 13. (b) A magnified view of the right side of the planar ring mirror.

Fig. 15

Panoramic image of the university library. The camera is pointing toward the ceiling (i.e., zenith).

Fig. 16

Panoramic image in Fig. 15 after the polar-to-rectangular transformation.
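A minimal sketch of one way to carry out such a polar-to-rectangular (unwarping) transformation with nearest-neighbor sampling; the image size, center, and radius limits below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def unwarp_panorama(img, center, r_min, r_max, out_height=256, out_width=1024):
    """Map an annular panoramic image onto a rectangular panorama.

    img         : H x W (x C) array holding the raw panoramic image
    center      : (cx, cy) pixel coordinates of the optical axis in img
    r_min, r_max: inner and outer radii (pixels) of the annular image region
    """
    cx, cy = center
    # Output grid: columns sweep the azimuth 0..2*pi, rows sweep radius r_max..r_min.
    azimuth = np.linspace(0.0, 2.0 * np.pi, out_width, endpoint=False)
    radius = np.linspace(r_max, r_min, out_height)
    rr, aa = np.meshgrid(radius, azimuth, indexing="ij")
    # Nearest-neighbor source pixel for every output pixel.
    x = np.clip(np.round(cx + rr * np.cos(aa)).astype(int), 0, img.shape[1] - 1)
    y = np.clip(np.round(cy + rr * np.sin(aa)).astype(int), 0, img.shape[0] - 1)
    return img[y, x]

# Illustrative usage with a random placeholder image.
img = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
panorama = unwarp_panorama(img, center=(320, 240), r_min=60, r_max=220)
```

For smoother output, the nearest-neighbor lookup could be replaced by bilinear interpolation (for example with scipy.ndimage.map_coordinates).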

Tables (1)

Table 1 Fitting Coefficients of the Panoramic Mirror Profile by Use of Sixth-Order Polynomial in ρ

Equations (23)

(1) $\dfrac{d\delta}{d\theta} = g$
(2) $\dfrac{d\xi}{d\delta} = m$
(3) $\tan\phi = \dfrac{d\rho}{dz}$
(4) $\cot\phi = \dfrac{dz}{d\rho} = \dfrac{dz/d\theta}{d\rho/d\theta}$
(5) $\cot\phi = \dfrac{r'\cos\theta - r\sin\theta}{r'\sin\theta + r\cos\theta}$
(6) $\dfrac{r'}{r} = \dfrac{\sin\theta + \cot\phi(\theta)\cos\theta}{\cos\theta - \cot\phi(\theta)\sin\theta}$
(7) $\dfrac{dr}{r} = \dfrac{\sin\theta + \cot\phi(\theta)\cos\theta}{\cos\theta - \cot\phi(\theta)\sin\theta}\,d\theta$
(8) $r(\theta) = r(\theta_i)\exp\left[\displaystyle\int_{\theta_i}^{\theta}\dfrac{\sin\theta + \cot\phi(\theta)\cos\theta}{\cos\theta - \cot\phi(\theta)\sin\theta}\,d\theta\right]$
(9) $\phi = (\delta + \theta)/2$
(10) $\delta(\theta) = 2\phi(\theta) - \theta$
(11) $\xi = f\tan\theta$
(12) $\tan\theta = \beta\delta + \psi$
(13) $\beta = \dfrac{\tan\theta_2 - \tan\theta_1}{\delta_2 - \delta_1}$
(14) $\psi = \dfrac{\delta_2\tan\theta_1 - \delta_1\tan\theta_2}{\delta_2 - \delta_1}$
(15) $\phi(\theta) = \dfrac{\tan\theta - \psi + \beta\theta}{2\beta}$
(16) $\theta_V = \tan^{-1}\left(\dfrac{H}{2f}\right)$
(17) $z(\rho) = \displaystyle\sum_{n=0}^{6} C_n\rho^n$
(18) $\left.\dfrac{dz}{d\rho}\right|_{\theta_s} = \displaystyle\sum_{n=1}^{6} nC_n\rho_s^{\,n-1}$
(19) $\phi(\theta_s) = \dfrac{\pi}{2} - \tan^{-1}\left(\left.\dfrac{dz}{d\rho}\right|_{\theta_s}\right)$
(20) $z_o = \rho_1\tan\left(\dfrac{\pi}{2} - \theta_2\right) = \rho_1\cot\theta_2$
(21) $Z(\rho) = z_o - [z(\rho) - z_o] = 2z_o - z(\rho)$
(22) $\rho_I = z_o\tan\theta_1$
(23) $\rho_O = z_o\tan\theta_2 = \rho_1$
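A minimal numerical sketch tying these relations together for the Fig. 4 design parameters: φ(θ) follows from Eq. (15), r(θ) from a trapezoidal evaluation of the integral in Eq. (8), and the resulting δ-ξ relation can be checked for the constant slope demanded by the equidistance projection. The focal length, the grid density, and the choice to scale the profile so that the rim sits at ρ = 5.0 cm are assumptions of this sketch:

```python
import numpy as np

# Design parameters from the Fig. 4 caption (converted to radians).
theta1, theta2 = np.radians(10.0), np.radians(20.0)
delta1, delta2 = np.radians(180.0), np.radians(45.0)
rho_rim = 5.0   # cm; assumed here to be the radial coordinate of the mirror rim
f = 2.0         # cm; illustrative focal length, not a value from the paper

# Coefficients of the linear mapping tan(theta) = beta*delta + psi, Eqs. (13) and (14).
beta = (np.tan(theta2) - np.tan(theta1)) / (delta2 - delta1)
psi = (delta2 * np.tan(theta1) - delta1 * np.tan(theta2)) / (delta2 - delta1)

def phi(th):
    """Mirror tangent angle phi(theta), Eq. (15)."""
    return (np.tan(th) - psi + beta * th) / (2.0 * beta)

# Integrate d(ln r)/d(theta) of Eq. (7) with the trapezoidal rule to get r(theta), Eq. (8).
theta = np.linspace(theta1, theta2, 2001)
cot_phi = 1.0 / np.tan(phi(theta))
integrand = (np.sin(theta) + cot_phi * np.cos(theta)) / (np.cos(theta) - cot_phi * np.sin(theta))
dtheta = theta[1] - theta[0]
log_r = np.concatenate(([0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * dtheta)))
r = np.exp(log_r)
r *= rho_rim / (r[-1] * np.sin(theta2))   # scale so the mirror rim sits at rho = 5.0 cm

# Mirror profile in cylindrical coordinates (cf. Fig. 4).
rho, z = r * np.sin(theta), r * np.cos(theta)

# Projection check: delta(theta) from Eq. (10), xi from Eq. (11).
delta = 2.0 * phi(theta) - theta
xi = f * np.tan(theta)
slope = np.gradient(xi, delta)
print(f"mirror rim at rho = {rho[-1]:.2f} cm, z = {z[-1]:.2f} cm")
print(f"d(xi)/d(delta) spans {slope.min():.5f} to {slope.max():.5f} (f*beta = {f * beta:.5f})")
```

Because tan θ is exactly linear in δ by construction, the computed slope dξ/dδ should match fβ to numerical precision, confirming the equidistance behaviour plotted in Fig. 7.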
