Abstract

In the image-based optical measurement method known as axially distributed sensing (ADS), the depth information of a scene is retrieved by evaluating images recorded with a camera positioned at multiple points along a common optical axis. We describe the design of a monoscopic lens that simultaneously images the scene from two different on-axis points of perspective onto a single camera. Designed for use in a territorial surveillance video system, this lens enables the capture of 3D scene information based on ADS in a single shot. We present the physical foundations of this approach and its implementation as a compact lens system, and we demonstrate the usability of the system prototype with performance tests.

© 2014 Optical Society of America

Figures (10)

Fig. 1.

Acquisition of distance information based on axially distributed sensing: the viewing angle for an object, indicated by the dotted and dashed lines tilted to the optical axis, changes depending on its distance from the camera. With images from multiple camera positions, the object distance can be calculated from its relative magnification in the images.
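To illustrate the principle, the following minimal Python sketch recovers the object distance from the image heights of the same object in two such axially separated views. It assumes equal focal lengths for both viewpoints; the function name and numerical values are hypothetical and serve only as an example (cf. Eqs. (4) and (5) in the equation list below).

# Minimal sketch of the ADS distance estimate illustrated in Fig. 1.
# Assumes equal focal lengths for both views; names and values are hypothetical.

def distance_from_magnification(h_near: float, h_far: float, d_ep: float) -> float:
    """Estimate the object distance d_o from the image heights of one object
    seen from two entrance pupils separated by d_ep along the optical axis.

    With equal focal lengths, M = h_near / h_far = d_ep / d_o + 1,
    which inverts to d_o = d_ep / (M - 1).
    """
    m = h_near / h_far
    if m <= 1.0:
        raise ValueError("The height ratio must exceed 1 for a finite distance.")
    return d_ep / (m - 1.0)


if __name__ == "__main__":
    # Hypothetical measurement: the object appears 103 px tall in the view whose
    # entrance pupil is nearer to the scene and 100 px tall in the farther view,
    # with the entrance pupils 0.15 m apart.
    print(f"d_o = {distance_from_magnification(103.0, 100.0, 0.15):.2f} m")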

Fig. 2.

Simplified geometrical construction for imaging a scene: for calculating the object height ho from focal length f and object image height hi, the distance do of the object to the center of perspective [entrance pupil (EP)] has to be known.

Fig. 3.

Simplified geometrical construction of an ADS-based imaging situation. The ratio M=h1/h2 of the recorded object image heights increases with increasing distance dEP between the EPs.
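As an illustrative calculation with assumed values (not taken from the paper): for equal focal lengths, an EP separation of dEP = 0.2 m, and an object at do = 10 m, the ratio is M = 0.2/10 + 1 = 1.02; doubling the separation to dEP = 0.4 m doubles the deviation from unity, giving M = 1.04.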

Fig. 4.

Schematic drawing of the ADS-based setup for single-shot distance acquisition, with the system housing indicated by the surrounding dashed line. In the volume behind the lens opening (rightmost element, with the field lens), the beam splitter BS1 divides the beam path. The entrance pupil (EP) of lens system LS1 in the upper beam path (subsystem 1) is closer to the scene than the EP of LS2 (subsystem 2). The beam splitter BS2 on the left side parallelizes the optical axes so that the focused images of the two subsystems lie side by side in the CCD sensor plane.

Fig. 5.

Simulated image vignetting in the design and optimization of the optical imaging system for single-shot ADS using Zemax optical design software. The optical design was performed with a single Zemax project file for each of the two imaging subsystems.

Fig. 6.

Imaging system for single-shot ADS-based measurement in a surveillance system: (a) design drawing of the optical system with a custom caging system (housing entrance on the right, CCD sensor on the left) and (b) photo of the manufactured system. The system’s height is approximately 12 cm.

Fig. 7.

Optical distortions and rotational misalignment of the two imaging subsystems. (a) Checkerboard test pattern recorded by the developed imaging setup with best-fit grids. In the close-ups the deviation of the recorded pattern from the best-fit pattern shows the system’s distortion. (b) A rotational misalignment between the imaging subsystems is visible in the close-up of the best-fit grids.
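The rotational misalignment visible in (b) can, for example, be quantified from the detected checkerboard corners of the two subimages. The following is a minimal sketch, assuming OpenCV and NumPy; the checkerboard size, the image file name, and the simple split of the frame into an upper and a lower subimage are assumptions for illustration, not details taken from the paper.

# Minimal sketch: estimate the rotation and relative scale between the two
# subimages from a recorded checkerboard target. Pattern size, file name, and
# the half-frame split are assumptions, not details from the paper.
import cv2
import numpy as np

PATTERN = (9, 6)  # assumed number of inner checkerboard corners (cols, rows)

def refined_corners(img):
    """Detect and sub-pixel-refine the checkerboard corners in one subimage."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, pts = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        raise RuntimeError("Checkerboard not found in subimage.")
    return cv2.cornerSubPix(
        gray, pts, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))

frame = cv2.imread("checkerboard_capture.png")      # full sensor image
half = frame.shape[0] // 2
upper, lower = frame[:half], frame[half:]           # the two subimages

# Similarity transform (rotation, scale, shift) mapping the upper grid onto the
# lower one; corner ordering is assumed consistent between the two detections.
A, _ = cv2.estimateAffinePartial2D(refined_corners(upper), refined_corners(lower))
angle_deg = np.degrees(np.arctan2(A[1, 0], A[0, 0]))
scale = float(np.hypot(A[0, 0], A[1, 0]))
print(f"rotational misalignment: {angle_deg:.3f} deg, relative scale: {scale:.4f}")

The scale factor returned alongside the rotation corresponds to the relative magnification of the target in the two subimages.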

Fig. 8.

Calibration measurements with the designed ADS-based imaging system for single-shot measurement: (a) photo of the test setup and (b) image captured with the developed imaging setup. The separate images (subimages) are formed by the imaging subsystems, whose centers of perspective are separated along a common optical axis. (c) Close-up of the upper subimage.

Fig. 9.

Screenshot of the software application for the distance sensor realized with the designed ADS-based imaging system for single-shot measurement. In the main window the scene, imaged by the optical subsystems, is shown with segmented object boundaries; the calculated object distance in meters is displayed on the upper right.

Fig. 10.

Comparison of measured and calculated values of the image height ratio M as a function of object distance.
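A minimal sketch, assuming NumPy and matplotlib, of how such a comparison can be produced from the model M(do) = (f1/f2)(dEP/do + 1); the parameter values are placeholders, and the measurement arrays are left empty because the actual data are not reproduced here.

# Minimal sketch: model curve of the image height ratio M versus object
# distance, overlaid with measured ratios. All parameter values below are
# placeholders, not values from the paper.
import numpy as np
import matplotlib.pyplot as plt

f1_over_f2 = 1.0          # assumed focal length ratio of the two subsystems
d_ep = 0.15               # assumed entrance pupil separation in metres

d_o = np.linspace(1.0, 20.0, 200)            # object distances in metres
m_model = f1_over_f2 * (d_ep / d_o + 1.0)    # calculated ratio, cf. Eq. (3)

measured_d = np.array([])  # fill in: object distances of the test targets (m)
measured_m = np.array([])  # fill in: image height ratios measured at those distances

plt.plot(d_o, m_model, label="calculated M")
plt.scatter(measured_d, measured_m, marker="x", label="measured M")
plt.xlabel("object distance d_o (m)")
plt.ylabel("image height ratio M")
plt.legend()
plt.show()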

Tables (4)

Table 1. Requirements (Req) for the Design of the Optical System for the Single-Shot ADS System

Table 2. Operands of the Merit Function Used for Optimization of the Optical System Design in Zemax

Table 3. Properties of the Designed Imaging Subsystems from Simulations with Zemax

Table 4. Difference between Values ΔM = M(do1) − M(do2), Depending on the Distances do1 and do2 to the Measurement Setup

Equations (9)

(1)  $h_o = \dfrac{h_i}{f}\, d_o$.
(2)  $h_1 = \dfrac{h_o}{d_o}\, f_1 \quad \text{and} \quad h_2 = \dfrac{h_o}{d_{EP} + d_o}\, f_2$.
(3)  $M = \dfrac{f_1}{f_2} \left( \dfrac{d_{EP}}{d_o} + 1 \right)$,
(4)  $d_o = \dfrac{d_{EP}}{M \frac{f_2}{f_1} - 1}$.
(5)  $M = \dfrac{d_{EP}}{d_o} + 1$,
(6)  $|M - 1| = \left| \dfrac{d_{EP}}{d_o} \right|$.
(7)  $d = \dfrac{f^2}{c_{\mathrm{tol}} \cdot F/\#} + \dfrac{f}{\beta_p}$.
(8)  $d_n = \dfrac{d}{2}$.
(9)  $\dfrac{\partial M}{\partial d_o} = -\dfrac{d_{EP}}{d_o^2}$.
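The relations above translate directly into a small distance-estimation routine. The following minimal Python sketch implements Eqs. (3), (4), and (9); the pupil separation and test distances are illustrative assumptions, not values from the paper.

# Minimal sketch of Eqs. (3), (4), and (9); all numerical values are
# illustrative assumptions, not values from the paper.

def height_ratio(d_o, d_ep, f1=1.0, f2=1.0):
    """Eq. (3): image height ratio M of the two subsystems for an object at d_o."""
    return (f1 / f2) * (d_ep / d_o + 1.0)

def object_distance(m, d_ep, f1=1.0, f2=1.0):
    """Eq. (4): object distance recovered from a measured ratio M."""
    return d_ep / (m * f2 / f1 - 1.0)

def sensitivity(d_o, d_ep):
    """Eq. (9): change of M per unit change of d_o (equal focal lengths)."""
    return -d_ep / d_o**2

if __name__ == "__main__":
    d_ep = 0.15  # assumed entrance pupil separation in metres
    for d in (2.0, 5.0, 10.0):
        m = height_ratio(d, d_ep)
        print(f"d_o = {d:5.1f} m -> M = {m:.4f} -> recovered d_o = "
              f"{object_distance(m, d_ep):.2f} m, dM/dd_o = {sensitivity(d, d_ep):+.5f} 1/m")

Since the sensitivity in Eq. (9) falls off quadratically with distance, a larger entrance pupil separation dEP directly improves the distance resolution for far objects.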
