Abstract

We compute a family of double-mirror catadioptric sensors with an ultrawide field of view and no distortion. The two concentric mirrors are rotationally symmetric, and the inside mirror is a revolved conic section. The mapping between the object and image planes is linear, hence the lack of distortion. The equations describing the outer mirror are determined by the projection induced by the inside mirror and by the rectifying property of the sensor. Solving the resulting nonlinear ordinary differential equations yields the cross section of the secondary mirror. The sensors we present require no further digital processing.

© 2014 Optical Society of America






Figures (18)

Fig. 1. POV-Ray simulation of the catadioptric sensor with a conic primary mirror and orthographic projection, in solid colors for easy visualization.

Fig. 2. General design setup of the double-mirror catadioptric sensors with conic primary mirrors.

Fig. 6. Profile picture of the concave up parabolic primary mirror with orthographic projection.

Fig. 7. POV-Ray simulation of the catadioptric sensor with the concave up parabolic primary mirror under orthographic projection in the test room.

Fig. 10. \(h>0\).

Fig. 11. Profile picture of the catadioptric sensor with the concave down parabolic primary mirror coupled with orthographic projection.

Fig. 12. POV-Ray simulation of the catadioptric sensor with the concave down parabolic primary mirror under orthographic projection in the test room.

Fig. 13. Profile picture of the catadioptric sensor with the sideways parabolic primary mirror coupled with perspective projection.

Fig. 14. POV-Ray simulation of the catadioptric sensor with the sideways parabolic primary mirror under orthographic projection in the test room.

Fig. 15. Profile picture of the catadioptric sensor with the elliptic primary mirror coupled with perspective projection.

Fig. 16. POV-Ray simulation of the catadioptric sensor with the elliptic primary mirror under orthographic projection in the test room.

Fig. 17. Hyperbola \(16x+16y-16xy+12y^2=0\), shown in red, and the section of the hyperbola whose revolution about the y axis gives the primary mirror, shown in blue.

Fig. 18. POV-Ray simulation of the catadioptric sensor with the hyperbolic primary mirror under orthographic projection in the test room.

Equations (35)

Equations on this page are rendered with MathJax.

\[ P:(x_0,0)\mapsto(Kax_0,K). \]
\[ \theta(a)=2\arctan(al), \]
\[ \lim_{a\to\infty}\theta(a)=\pi. \]
\[ \vec{n}=\vec{v}+\vec{w}. \]
\[ \vec{w}=\langle Kax_0-x,\,K-f(x)\rangle. \]
\[ \vec{w}=\left\langle ax_0-\frac{x}{K},\,1-\frac{f(x)}{K}\right\rangle, \]
\[ \lim_{K\to\infty}\left\langle ax_0-\frac{x}{K},\,1-\frac{f(x)}{K}\right\rangle, \]
\[ \vec{w}=\langle ax_0,\,1\rangle. \]
\[ y=\frac{1}{4p}(x-h)^2+k. \]
\[ y=\frac{1}{2}x^2+x,\qquad 0\le x\le 1. \]
\[ x_0=\frac{y-x-1+\sqrt{x^2+2x+1+y^2}}{x+1}. \]
\[ \vec{v}=\langle -1-x,\,-y\rangle,\qquad \vec{w}=\langle ax_0,\,1\rangle. \]
\[ \vec{n}=\left\langle \frac{-x-1}{\sqrt{x^2+2x+1+y^2}}+\frac{1}{\sqrt{1+a^2x_0^2}},\ \frac{-y}{\sqrt{x^2+2x+1+y^2}}+\frac{ax_0}{\sqrt{1+a^2x_0^2}}\right\rangle. \]
\[ \frac{dy}{dx}=\frac{-x\sqrt{1+a^2x_0^2}-\sqrt{1+a^2x_0^2}+\sqrt{x^2+2x+1+y^2}}{-y\sqrt{1+a^2x_0^2}+ax_0\sqrt{x^2+2x+1+y^2}}. \]
\[ y=\frac{1}{4p}(x-h)^2+k. \]
\[ y=-\frac{1}{2}x^2+x,\qquad 0\le x\le 1. \]
\[ \vec{v}=\langle 1-x,\,-y\rangle,\qquad \vec{w}=\langle ax_0,\,1\rangle. \]
\[ \frac{dy}{dx}=\frac{\dfrac{x-1}{\sqrt{y^2+x^2-2x+1}}+\dfrac{2x_0}{\sqrt{4x_0^2+1}}}{\dfrac{y}{\sqrt{y^2+x^2-2x+1}}+\dfrac{1}{\sqrt{4x_0^2+1}}}, \]
\[ x_0=\frac{-y+x-1+\sqrt{y^2+x^2-2x+1}}{x-1}, \]
\[ x=\frac{1}{2}y^2+y,\qquad 0\le y\le 1+\sqrt{3}. \]
\[ \vec{v}=\langle 1,\,0\rangle,\qquad \vec{w}=\langle ax_0,\,1\rangle. \]
\[ \frac{dy}{dx}=\frac{\sqrt{4+8y+20y^2+16y^3+4y^4}+4y+2y^2}{2(1+y)}. \]
\[ \theta(a,f,h,k,p)=afh+2pkf, \]
\[ 3y^2+4y-4x+2xy+3x^2=0. \]
\[ \frac{dy_1}{dx_1}=\frac{dy_1(t)/dt}{dx_1(t)/dt}=\frac{2+3x_1(t)-y_1(t)}{2+x_1(t)-3y_1(t)}. \]
\[ \vec{v}=\langle x_1(t)-x_2(t),\,y_1(t)-y_2(t)\rangle. \]
\[ \frac{\alpha(t)\,dy_2(t)}{\alpha(t)\,dx_2(t)}=\frac{dy_2(t)/dt}{dx_2(t)/dt}. \]
\[ \begin{vmatrix} x_1(t) & y_1(t) & 1\\ 1 & 0 & 1\\ x_2(t) & y_2(t) & 1 \end{vmatrix}=0. \]
\[ 16x+16y-16xy+12y^2=0. \]
\[ \frac{dy}{dx}=\frac{dy_1(t)/dt}{dx_1(t)/dt}=\frac{-2+2y_1(t)}{2+3y_1(t)-2x_1(t)}. \]
\[ \frac{dx_1(t)}{dt}=2+3y_1(t)-2x_1(t), \]
\[ \frac{dy_1(t)}{dt}=-2+2y_1(t). \]
\[ x_0=\frac{2x_1(t)}{y_1(t)+2}. \]
\[ \vec{v}=\langle x_1(t)-x_2(t),\,y_1(t)-y_2(t)\rangle, \]
\[ \vec{w}=\langle ax_0,\,1\rangle. \]
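
As a rough, self-contained illustration of how a secondary-mirror cross section of this kind can be traced numerically, the sketch below integrates the reflection-law ODE for the concave up parabolic primary \(y=\tfrac{1}{2}x^2+x\) (focus at \((-1,0)\)) under orthographic projection. This is not the authors' code: the rectification constant \(a=2\), the seed point \((2,0)\) on the reflected axial ray, the integration range, and the use of SciPy's solve_ivp are all illustrative assumptions.

# Minimal numerical sketch (not the authors' code) of tracing the secondary-mirror
# profile once the primary mirror and the rectifying condition are fixed.
# Assumptions (for illustration only): the primary is the concave up parabola
# y = x**2/2 + x with focus at (-1, 0), the camera is orthographic with vertical
# rays, the required outgoing direction for camera coordinate x0 is <a*x0, 1>,
# the rectification constant is a = 2, and the profile is seeded at (2, 0),
# which lies on the reflected axial ray x0 = 0.
import numpy as np
from scipy.integrate import solve_ivp

A = 2.0  # rectification constant (assumed value)

def x0_of(x, y):
    """Camera-ray coordinate whose reflected ray (through the focus (-1, 0))
    passes through the secondary-mirror point (x, y)."""
    m = y / (x + 1.0)                      # slope of the ray from the focus
    return m - 1.0 + np.sqrt(1.0 + m * m)  # positive root of the quadratic

def slope(x, y):
    """Mirror slope dy/dx from the reflection law: the surface normal bisects
    the unit vector back toward the focus and the required outgoing direction."""
    x0 = x0_of(x, y)
    v = np.array([-1.0 - x, -y])           # back along the incoming ray, toward the focus
    w = np.array([A * x0, 1.0])            # required outgoing direction <a*x0, 1>
    n = v / np.linalg.norm(v) + w / np.linalg.norm(w)
    return -n[0] / n[1]                    # tangent is perpendicular to the normal

def rhs(x, y):
    return [slope(x, y[0])]

def past_rim(x, y):
    # Terminate if the camera coordinate would pass the primary's rim at x0 = 1.
    return x0_of(x, y[0]) - 1.0
past_rim.terminal = True

sol = solve_ivp(rhs, (2.0, 6.0), [0.0], events=past_rim, max_step=0.01)
print("secondary-mirror profile points (x, y):")
for xi, yi in zip(sol.t[::50], sol.y[0][::50]):
    print(f"  {xi:6.3f}  {yi:6.3f}")

Swapping in the concave down parabola \(y=-\tfrac{1}{2}x^2+x\) would mainly change the focus used in v to \((1,0)\) and the corresponding relation for \(x_0\); the reflection-law slope computation stays the same.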
