Abstract

A novel panoramic stereo imaging system with double panoramic annular lenses (PAL) is introduced in this paper. The proposed system consists of two coaxial PAL units that image onto a single sensor, producing two concentric image rings. The object position can be calculated by triangulation. The novelty of this system is that the central blind zone of one PAL unit is partly used by the other as an image zone. The stereo field of view is 60°–105° × 360°. The depth-extraction resolution reaches 1 line/cm at a distance of about 500 mm. The F-number of the system is about 3, which facilitates use in low-illumination environments. A design is presented and the depth-extraction precision is analyzed.
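The triangulation step described above can be sketched numerically. A minimal Python sketch, following the geometry of Eqs. (5)–(6) on this page; the function name, the sample field angles, and the 40 mm baseline are illustrative assumptions, not values from the paper:

```python
import math

def locate_point(theta_upper_deg, theta_lower_deg, d_mm):
    """Triangulate an object point seen by both PAL units.

    theta_upper/theta_lower are the field angles (from the optical axis)
    at which the upper and lower units see the point; d_mm is the
    effective baseline between their entrance pupils.  Returns (s, h):
    radial distance from the axis and height along the axis, in mm.
    """
    tu = math.radians(theta_upper_deg)
    tl = math.radians(theta_lower_deg)
    s = d_mm * math.sin(tu) * math.sin(tl) / math.sin(tu - tl)
    h = d_mm * math.sin(tu) * math.cos(tl) / math.sin(tu - tl)
    return s, h

# Hypothetical example: field angles 80 and 70 degrees, 40 mm baseline.
s, h = locate_point(80.0, 70.0, 40.0)
```

Note the denominator sin(θupper − θlower): the depth estimate degrades as the two view angles converge, which is why the resolution analysis later in the paper tracks the angular difference.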

© 2012 OSA


References


  1. W. J. Fowski and M. M. Birnbaum, “Panoramic annular lens attitude determination system (PALADS),” Proc. SPIE 2466, 108–117 (1995).
    [CrossRef]
  2. F. A. Allahdadi and M. Chrisp, “SEDS, earth, atmosphere, and space imaging system (SEASIS),” Proc. SPIE 2214, 257–268 (1994).
    [CrossRef]
  3. T. Svoboda and T. Pajdla, “Panoramic cameras for 3D computation,” in Proceedings of the Czech Pattern Recognition Workshop, 63–70 (Czech Society for Pattern Recognition, 2000).
  4. S. Baker and S. K. Nayar, “A theory of catadioptric image formation,” in Proceedings of the Sixth International Conference on Computer Vision, 175–196 (1998).
  5. G. Kweon, K. Kim, Y. Choi, G. Kim, H. Kim, and S. Yang, “A catadioptric double-panoramic lens with the equi-distance projection for a rangefinder application,” Proc. SPIE 5613, 29–42 (2004).
    [CrossRef]
  6. G. Jang, S. Kim, and I. Kweon, “Single-camera panoramic stereo system with single-viewpoint optics,” Opt. Lett. 31(1), 41–43 (2006).
    [CrossRef] [PubMed]
  7. Z. Zhu and K. D. Rajasekar, “Panoramic virtual stereo vision of cooperative mobile robots for localizing 3D moving objects,” in Proceedings of the IEEE Workshop on Omnidirectional Vision, 29–36 (2000).
  8. G. N. E. Weech, J. A. Gilbert, and D. R. Matthys, “A stereoscopic system for radial metrology,” in Proceedings of the 2001 SEM Annual Conference and Exposition, 199–202 (2001).
  9. D. R. Matthys, J. A. Gilbert, and S. B. Fair, “Characterization of optical systems for radial metrology,” in Proceedings of the SEM IX International Congress on Experimental Mechanics, 104–107 (2000).
  10. H. R. Fallah and J. Maxwell, “Higher-order pupil aberrations in wide angle and panoramic optical systems,” Proc. SPIE 2774, 342–351 (1996).
    [CrossRef]
  11. Z. L. Feng, “Studies on the characteristics and applications of the annular imaging system with high resolving power,” Ph.D. dissertation (2008).
  12. S. Niu, J. Bai, X. Y. Hou, and G. G. Yang, “Design of a panoramic annular lens with a long focal length,” Appl. Opt. 46(32), 7850–7857 (2007).
    [CrossRef] [PubMed]
  13. I. Powell, “Design study of an infrared panoramic optical system,” Appl. Opt. 35(31), 6190–6194 (1996).
    [CrossRef] [PubMed]
  14. I. Kopilović, B. Vágvölgyi, and T. Szirányi, “Application of panoramic annular lens for motion analysis tasks: surveillance and smoke detection,” in Proceedings of the International Conference on Pattern Recognition (ICPR'00), Barcelona, Spain, 4, 714–717 (2000).



Figures (13)

Fig. 1

Configuration of the stereo PAL imaging system. It is composed of two PAL units. Two rays coming from object point P travel along two different paths (colored dashed lines) and form images P1 and P2 on the image plane. By measuring the image radii to obtain θupper, θlower and the baseline d, the location parameters of P (H and S) can be calculated by triangulation.

Fig. 2

The blue ring is the image area of the PALlower unit. The red ring is the image area of the PALupper unit. The green circle is the blind area of the PALupper unit. Φ is the azimuth angle of P in Fig. 1. P1 and P2 are the images of P, whose radii are rupper and rlower, respectively. rlower(max), rlower(min), rupper(max) and rupper(min) are the boundaries of the two PAL units.
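The ring layout described in this caption implies a simple pixel-to-zone mapping. A minimal Python sketch; the ring boundary radii below are hypothetical placeholders, not the design's actual values:

```python
import math

# Hypothetical ring boundaries (mm on the sensor) for the two image zones.
R_UPPER = (1.0, 5.0)    # r_upper(min), r_upper(max)
R_LOWER = (5.0, 11.0)   # r_lower(min), r_lower(max)

def classify_pixel(x_mm, y_mm):
    """Map a sensor point to polar (r, phi) and name its image zone."""
    r = math.hypot(x_mm, y_mm)
    phi = math.degrees(math.atan2(y_mm, x_mm)) % 360.0
    if R_UPPER[0] <= r < R_UPPER[1]:
        zone = "upper"   # red ring: PALupper image area
    elif R_LOWER[0] <= r <= R_LOWER[1]:
        zone = "lower"   # blue ring: PALlower image area
    else:
        zone = "blind"   # green circle or outside both rings
    return r, phi, zone
```

Because the rings are concentric, corresponding image points P1 and P2 share the same azimuth Φ, so stereo matching reduces to a one-dimensional search along each radius.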

Fig. 3

d0 is the initial baseline length at the minimal field angles (θupper(min) and θlower(min)) of PAL units. P0 is located by angles and baseline length. P1 is another object point located by θupper, θlower and d(θupper, θlower). Δd(θupper) and Δd(θlower) are the shifts of entrance pupils.

Fig. 4

Schematic diagram of the double PAL system. The coordinates of P can be calculated from θupper, θlower, and d(θupper, θlower) using trigonometry.

Fig. 5

A typical PAL block and its ray tracing. Blue region A is the inner curved imaging region of the PAL block; pink region B is region A mirrored by surface S2; green region C is the extension of B to the optical axis. Surface S1 carries a refractive coating; surface S2, the front mirror of the PAL block, carries a ring-shaped reflective coating.

Fig. 6

(a) Structure of the lower PAL optical unit. (b) MTF of the 60°, 70°, 80°, 90°, 100°, and 105° fields at 70 lp/mm. Different colors represent the tangential and sagittal MTF of the different fields; the black curve denotes the diffraction limit.

Fig. 7

The reversed lower PAL unit with added lens groups. These groups conjugate the upper unit with the lower unit. The image plane shown is the intermediate image plane of the upper PAL unit.

Fig. 8

(a) Structure of the upper PAL optical system. (b) MTF of the 60°, 70°, 80°, 90°, 100°, and 105° fields at 70 lp/mm. Different colors represent the tangential and sagittal MTF of the different fields; the black curve denotes the diffraction limit.

Fig. 9

(a) Structure of the upper PAL optical unit. (b) MTF of the 60°, 70°, 80°, 90°, 100°, and 105° fields at 70 lp/mm. Different colors represent the tangential and sagittal MTF of the different fields; the black curve denotes the diffraction limit.

Fig. 10

The optical structure of the combination of two PAL units.

Fig. 11

S0 is an object point. S1 and S2 are the two extreme positions within one pixel that cannot be distinguished by the sensor. Their distances to the optical axis are s0, s1, and s2, respectively.

Fig. 12

Resolution vs. the two field angles of the system, in line/cm.

Fig. 13

Incident ray trajectories for the extreme positions of P at which the depth resolution is 1 line/cm.

Equations (14)


$$\begin{cases}
r_{\mathrm{lower}}(\min)=f_{\mathrm{lower}}\,\theta_{\mathrm{lower}}(\min)\\
r_{\mathrm{upper}}(\max)=M\,f_{\mathrm{upper}}\,\theta_{\mathrm{upper}}(\max)\\
r_{\mathrm{lower}}(\min)=r_{\mathrm{upper}}(\max)\\
r_{\mathrm{lower}}(\max)=f_{\mathrm{lower}}\,\theta_{\mathrm{lower}}(\max)
\end{cases}\tag{1}$$

$$\frac{\theta-\theta_1}{\theta_2-\theta_1}=\frac{r-r_1}{r_2-r_1},\qquad 0<\theta_1<\theta_2\tag{2}$$

$$r(\theta)=r_0+r_1\theta+r_2\theta^2+\dots+r_N\theta^N,\qquad 0<\theta_{\min}\le\theta\le\theta_{\max}\tag{3}$$

$$d(\theta_{\mathrm{upper}},\theta_{\mathrm{lower}})=d_0+\Delta d(\theta_{\mathrm{upper}})-\Delta d(\theta_{\mathrm{lower}})\tag{4}$$

$$s=d(\theta_{\mathrm{upper}},\theta_{\mathrm{lower}})\,\frac{\sin\theta_{\mathrm{upper}}\sin\theta_{\mathrm{lower}}}{\sin(\theta_{\mathrm{upper}}-\theta_{\mathrm{lower}})}\tag{5}$$

$$h=d(\theta_{\mathrm{upper}},\theta_{\mathrm{lower}})\,\frac{\sin\theta_{\mathrm{upper}}\cos\theta_{\mathrm{lower}}}{\sin(\theta_{\mathrm{upper}}-\theta_{\mathrm{lower}})}\tag{6}$$

$$\theta_{\mathrm{upper}}(r_1)=7.5506+12.9445\,r_1+4.3679\,r_1^2-0.9897\,r_1^3+0.0621\,r_1^4-8.4180\times10^{-4}\,r_1^5\tag{7}$$

$$\theta_{\mathrm{lower}}(r_2)=8.2310+5.4260\,r_2+0.5897\,r_2^2-0.0305\,r_2^3\tag{8}$$

$$\theta_{\mathrm{upper}}(\Delta d_{\mathrm{upper}})=58.6473+13.6664\,\Delta d_{\mathrm{upper}}-2.3130\,\Delta d_{\mathrm{upper}}^2+0.3028\,\Delta d_{\mathrm{upper}}^3-0.0050\,\Delta d_{\mathrm{upper}}^4\tag{9}$$

$$\theta_{\mathrm{lower}}(\Delta d_{\mathrm{lower}})=58.8308+7.0935\,\Delta d_{\mathrm{lower}}-0.5176\,\Delta d_{\mathrm{lower}}^2+0.0304\,\Delta d_{\mathrm{lower}}^3-4.8514\times10^{-4}\,\Delta d_{\mathrm{lower}}^4\tag{10}$$

$$d\theta_{\mathrm{upper}}=\left(12.9445+8.7358\,r_1-2.9691\,r_1^2+0.2484\,r_1^3-4.209\times10^{-3}\,r_1^4\right)\times d(r)\tag{11}$$

$$d\theta_{\mathrm{lower}}=\left(5.4260+1.1794\,r_2-0.0915\,r_2^2\right)\times d(r)\tag{12}$$

$$\begin{cases}\theta'_{\mathrm{upper}}=\theta_{\mathrm{upper}}+d\theta_{\mathrm{upper}}\\\theta'_{\mathrm{lower}}=\theta_{\mathrm{lower}}-d\theta_{\mathrm{lower}}\\d'=d_0+\Delta d(\theta'_{\mathrm{upper}},\theta'_{\mathrm{lower}})\end{cases}\quad\text{and}\quad\begin{cases}\theta''_{\mathrm{upper}}=\theta_{\mathrm{upper}}-d\theta_{\mathrm{upper}}\\\theta''_{\mathrm{lower}}=\theta_{\mathrm{lower}}+d\theta_{\mathrm{lower}}\\d''=d_0+\Delta d(\theta''_{\mathrm{upper}},\theta''_{\mathrm{lower}})\end{cases}\tag{13}$$

$$R(\theta_{\mathrm{upper}},\theta_{\mathrm{lower}})=\frac{10}{s_2-s_1}\tag{14}$$
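Eqs. (7)–(14) chain together into a depth-resolution estimate: map the two image radii to field angles, perturb each angle by the one-pixel uncertainty of Eqs. (11)–(12), triangulate the two extreme positions, and apply Eq. (14). A hedged Python sketch; the 40 mm baseline d0, the 5 µm pixel pitch, and the sample radii are illustrative assumptions, and the pupil-shift corrections Δd of Eqs. (9)–(10) are omitted, so the baseline is held fixed:

```python
import math

# Radius-to-angle fits from Eqs. (7)-(8): image radius (mm) -> field angle (deg).
THETA_UPPER = [7.5506, 12.9445, 4.3679, -0.9897, 0.0621, -8.4180e-4]
THETA_LOWER = [8.2310, 5.4260, 0.5897, -0.0305]

def poly(c, x):
    """Evaluate sum(c[k] * x**k) in Horner form."""
    acc = 0.0
    for coef in reversed(c):
        acc = acc * x + coef
    return acc

def dpoly(c, x):
    """Derivative of the same polynomial, as used in Eqs. (11)-(12)."""
    return sum(k * coef * x ** (k - 1) for k, coef in enumerate(c) if k > 0)

def radial_distance(theta_u_deg, theta_l_deg, d_mm):
    """Eq. (5): distance s of the object point from the optical axis (mm)."""
    tu, tl = math.radians(theta_u_deg), math.radians(theta_l_deg)
    return d_mm * math.sin(tu) * math.sin(tl) / math.sin(tu - tl)

def depth_resolution(r1, r2, d0_mm, pix_mm=0.005):
    """Eqs. (11)-(14): resolution in line/cm for a one-pixel radius error."""
    tu, tl = poly(THETA_UPPER, r1), poly(THETA_LOWER, r2)
    dtu = dpoly(THETA_UPPER, r1) * pix_mm   # Eq. (11), d(r) = one pixel
    dtl = dpoly(THETA_LOWER, r2) * pix_mm   # Eq. (12)
    # Eq. (13): the two extreme angle pairs one pixel apart
    s1 = radial_distance(tu + dtu, tl - dtl, d0_mm)
    s2 = radial_distance(tu - dtu, tl + dtl, d0_mm)
    return 10.0 / (s2 - s1)                 # Eq. (14); s is in mm

# Illustrative radii: r1 = 4 mm on the upper ring, r2 = 8 mm on the lower ring.
R = depth_resolution(4.0, 8.0, 40.0)
```

The factor 10 in Eq. (14) converts the millimeter spread s2 − s1 into lines per centimeter; the resolution falls off as the two field angles approach each other, since sin(θupper − θlower) in Eq. (5) shrinks.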
