Abstract

A time-of-flight (ToF) depth sensor produces noisy range data whose quality depends on scene properties such as surface material and reflectivity. Measurements frequently contain saturated or severely noisy depth values, so the effective depth accuracy falls far below the sensor's ideal specification. In this paper, we propose a hybrid exposure technique for depth imaging with a ToF sensor that improves depth quality. Our method automatically determines an optimal depth for each pixel using two exposure conditions. To demonstrate its effectiveness, we compare the proposed algorithm with two conventional methods, both qualitatively and quantitatively, and show that it performs better.
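For intuition, the minimal per-pixel fusion sketch below follows the spirit of the hybrid exposure idea; the function name, saturation test, and threshold are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def fuse_two_exposures(depth_long, depth_short, amp_long, sat_threshold):
    """Illustrative two-exposure fusion (assumed rule, not the paper's method):
    keep the long-exposure depth where its amplitude is valid, and fall back to
    the short-exposure depth where the long exposure saturated."""
    saturated = amp_long >= sat_threshold        # hypothetical saturation test
    return np.where(saturated, depth_short, depth_long)
```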

© 2014 Optical Society of America


References


  1. P. Henry, M. Krainin, E. Herbst, X. Ren, D. Fox, “RGB-D mapping: using depth cameras for dense 3-D modeling of indoor environments,” in RGB-D: Advanced Reasoning with Depth Cameras Workshop in conjunction with RSS (The MIT Press, Cambridge, Massachusetts, 2010).
  2. S. Foix, G. Alenya, C. Torras, “Lock-in time-of-flight (ToF) cameras: a survey,” IEEE Sens. J. 11(9), 1917–1926 (2011).
    [CrossRef]
  3. A. Colaco, A. Kirmani, N.-W. Gong, T. McGarry, L. Watkins, V. K. Goyal, “3dim: Compact and Low Power Time-of-Flight Sensor for 3D Capture Using Parametric Signal Processing,” Proc. Int. Image Sensor Workshop, 349–352 (2013).
  4. S. Schuon, C. Theobalt, J. Davis, S. Thrun, “High-quality scanning using time-of-flight depth superresolution,” IEEE Conference on Computer Vision and Pattern Recognition Workshop, 1–7 (2008).
  5. H. Shim, S. Lee, “Performance evaluation of time-of-flight and structured light depth sensors in radiometric/geometric variations,” Opt. Eng. 51(1), 94401–94414 (2012).
    [CrossRef]
  6. S. B. Gokturk, H. Yalcin, C. Bamji, “A Time-Of-Flight Depth Sensor - System Description, Issues and Solutions,” IEEE Conference on Computer Vision and Pattern Recognition Workshop, 35–44 (2004).
  7. V. Brajovic, T. Kanade, “A Sorting Image Sensor: An Example of Massively Parallel Intensity-to-Time Processing for Low-Latency Computational Sensors,” Proceedings of the IEEE International Conference on Robotics and Automation, 2, 1638–1643, (1996).
    [CrossRef]
  8. S. K. Nayar, V. Branzoi, “Adaptive dynamic range imaging: optical control of pixel exposures over space and time,” IEEE International Conference on Computer Vision, 1168–1175 (2003).
    [CrossRef]
  9. M. Aggarwal, N. Ahuja, “High Dynamic Range Panoramic Imaging,” International Conference on Computer Vision, 2–9 (2001).
  10. O. Yadid-Pecht, E.R. Fossum, “Wide intrascene dynamic range CMOS APS using dual sampling,” IEEE Trans. Electron Dev. 44(10), 1721–1723 (1997).
    [CrossRef]
  11. P. Debevec, J. Malik, “Recovering high dynamic range radiance maps from photographs,” in ACM SIGGRAPH classes (ACM Press, 2008, 31).
  12. T. Mertens, J. Kautz, F. Van Reeth, “Exposure Fusion,” IEEE Pacific Conference on Computer Graphics and Applications, 382–390 (2007).
    [CrossRef]
  13. D. Yang, A. El Gamal, “Comparative Analysis of SNR for Image Sensors with Enhanced Dynamic Range,” In Proceedings of the SPIE Electronic Imaging, 3649 (1999).
  14. U. Hahne, M. Alexa, “Exposure Fusion for Time-Of-Flight Imaging,” Comput. Graph. Forum 30(7), 1887–1894 (2011).
    [CrossRef]
  15. B. Buttgen, T. Oggier, M. Lehmann, R. Kaufmann, F. Lustenberger, “CCD/CMOS lock-in pixel for range imaging: Challenges, limitations and state-of-the-art,” 1st range imaging research day, 21–32 (2005).
  16. L. Jovanov, A. Pižurica, W. Philips, “Fuzzy logic-based approach to wavelet denoising of 3D images produced by time-of-flight cameras,” Opt. Express 18(22), 22651–22676 (2010).
    [CrossRef] [PubMed]




Figures (8)

Fig. 1

Two distinct groups of depth exposure profiles for various objects. x-axis: exposure time (ms); y-axis: depth measurement (m).

Fig. 2

Fourier analysis of a set of depth measurements. First row: examples of depth maps recorded at different exposure times (t). Second row: amplitude of the frequency responses (logarithmic scale). Third row: phase of the frequency responses. Each column corresponds to the same depth map. Depth maps with saturated pixels produce periodic patterns in their frequency responses.
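As a rough illustration of the analysis in Fig. 2, the sketch below (assuming NumPy and a 2-D depth map array) computes the log-amplitude and phase spectra in which such saturation-induced periodic patterns would show up.

```python
import numpy as np

def depth_spectrum(depth_map):
    """Return log-amplitude and phase of a depth map's 2-D spectrum,
    as visualized in Fig. 2 (a saturated map tends to show periodic peaks)."""
    spectrum = np.fft.fftshift(np.fft.fft2(depth_map))
    log_amplitude = np.log1p(np.abs(spectrum))
    phase = np.angle(spectrum)
    return log_amplitude, phase
```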

Fig. 3

3D visualization of the Living Room Scene in frontal view (top) and profile view (bottom). Note that [14] sacrifices precision to alleviate strong saturation.

Fig. 4

3D visualization of experimental results. Top-left corner: amplitude image of the target scene. In (c), each yellow box highlights a saturated region.

Fig. 5

Close-up of saturated pixels. Top: Living Room Scene; Bottom: Kitchen Scene.

Fig. 6

Temporal repeatability across different scenes. Overall standard deviations are 2.10 cm (single), 6.16 cm ([14]), and 0.42 cm (proposed).
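A simple way to compute this kind of temporal-repeatability measure, assuming a stack of depth frames captured from a static scene, is sketched below; the array layout and averaging choice are assumptions.

```python
import numpy as np

def temporal_repeatability(depth_frames):
    """depth_frames: array of shape (num_frames, H, W) from a static scene.
    Returns the mean per-pixel standard deviation (in the depth units used)."""
    per_pixel_std = np.std(depth_frames, axis=0)
    return per_pixel_std.mean()
```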

Fig. 7

Residuals of plane fitting. Overall residuals are 0.506 cm (single), 0.34 cm ([14]), and 0.19 cm (proposed).

Fig. 8

Spatial repeatability using plane fitting.
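The plane-fitting residual used in Figs. 7 and 8 can be approximated, for a nominally planar region, by a least-squares fit such as the sketch below; using the RMS form of the residual is an assumption.

```python
import numpy as np

def plane_fit_residual(points):
    """points: (N, 3) array of 3-D points sampled from a nominally planar surface.
    Fits z = a*x + b*y + c by least squares and returns the RMS residual."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residuals = points[:, 2] - A @ coeffs
    return np.sqrt(np.mean(residuals ** 2))
```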

Equations (6)


\( y_t = s_t \, (x + \varepsilon_t). \)

\( [d, H, t_0] = \operatorname*{argmin}_{d,\, t_0} \sum_{t = t_0}^{t_0 + H} \left\| s_t \, R(t, t_0) \, y_t - d \right\|^2. \)

\( R(t, t_0) = \dfrac{|t - t_0|}{H} \times \dfrac{1}{t_0^{\alpha}}. \)

\( \varepsilon_{t_l} = y_{t_l} - x, \qquad \varepsilon_{t_s} = y_{t_s} - x. \)

\( \varepsilon_{\delta} = \varepsilon_{t_s} - \varepsilon_{t_l} = y_{t_s} - y_{t_l}, \qquad s_t = 1. \)

\( \hat{x}_t = \mathfrak{F}^{-1}\!\left( L \, \hat{Y}_t \right). \)
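As one possible reading of the last equation, the sketch below multiplies the depth map's spectrum by a mask \(L\) and inverts the transform; the ideal low-pass disc used for \(L\) and the cutoff parameter are assumptions made for illustration only.

```python
import numpy as np

def filter_depth_frequency(depth_map, cutoff_fraction=0.25):
    """Sketch of x_hat_t = F^{-1}(L * Y_hat_t): multiply the depth map's
    spectrum by a mask L (here an assumed ideal low-pass disc) and invert."""
    Y_hat = np.fft.fftshift(np.fft.fft2(depth_map))
    h, w = depth_map.shape
    yy, xx = np.ogrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    radius = cutoff_fraction * min(h, w) / 2.0
    L = (xx ** 2 + yy ** 2) <= radius ** 2       # assumed low-pass mask
    filtered = np.fft.ifft2(np.fft.ifftshift(L * Y_hat))
    return np.real(filtered)
```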
