Abstract

A time-of-flight (ToF) depth sensor produces noisy range data owing to scene properties such as surface material and reflectivity. Sensor measurements frequently contain saturated or severely noisy depth values, so the effective depth accuracy falls far below the ideal specification. In this paper, we propose a hybrid exposure technique for depth imaging with a ToF sensor to improve depth quality. Our method automatically determines an optimal depth for each pixel using two exposure conditions. To demonstrate its effectiveness, we compare the proposed algorithm with two conventional methods, both qualitatively and quantitatively, and show its superior performance.
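The per-pixel selection between two exposure conditions described above can be illustrated with a minimal sketch. The function name, the use of a zero reading to mark saturation, and the simple mask-based fallback are all assumptions for illustration, not the paper's actual algorithm:

```python
import numpy as np

def fuse_depth(depth_long, depth_short, sat_mask_long):
    """Hypothetical per-pixel hybrid-exposure fusion: keep the long-exposure
    depth where it is valid, and fall back to the short-exposure depth at
    pixels that saturated under the long exposure."""
    return np.where(sat_mask_long, depth_short, depth_long)

# Toy example: a 2x2 depth map where a 0.0 reading marks a saturated pixel.
d_long = np.array([[1.0, 1.2], [0.0, 1.1]])
d_short = np.array([[1.1, 1.3], [0.9, 1.2]])
mask = (d_long == 0.0)
print(fuse_depth(d_long, d_short, mask))  # saturated pixel replaced by 0.9
```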

© 2014 Optical Society of America


References

  1. P. Henry, M. Krainin, E. Herbst, X. Ren, and D. Fox, “RGB-D mapping: using depth cameras for dense 3-D modeling of indoor environments,” in RGB-D: Advanced Reasoning with Depth Cameras Workshop, in conjunction with RSS (The MIT Press, Cambridge, Massachusetts, 2010).
  2. S. Foix, G. Alenya, and C. Torras, “Lock-in time-of-flight (ToF) cameras: a survey,” IEEE Sens. J. 11(9), 1917–1926 (2011).
  3. A. Colaco, A. Kirmani, N.-W. Gong, T. McGarry, L. Watkins, and V. K. Goyal, “3dim: Compact and Low Power Time-of-Flight Sensor for 3D Capture Using Parametric Signal Processing,” in Proc. Int. Image Sensor Workshop, 349–352 (2013).
  4. S. Schuon, C. Theobalt, J. Davis, and S. Thrun, “High-quality scanning using time-of-flight depth superresolution,” in IEEE Conference on Computer Vision and Pattern Recognition Workshops, 1–7 (2008).
  5. H. Shim and S. Lee, “Performance evaluation of time-of-flight and structured light depth sensors in radiometric/geometric variations,” Opt. Eng. 51(1), 94401–94414 (2012).
  6. S. B. Gokturk, H. Yalcin, and C. Bamji, “A Time-Of-Flight Depth Sensor - System Description, Issues and Solutions,” in IEEE Conference on Computer Vision and Pattern Recognition Workshops, 35–44 (2004).
  7. V. Brajovic and T. Kanade, “A Sorting Image Sensor: An Example of Massively Parallel Intensity-to-Time Processing for Low-Latency Computational Sensors,” in Proceedings of the IEEE International Conference on Robotics and Automation, 2, 1638–1643 (1996).
  8. S. K. Nayar and V. Branzoi, “Adaptive dynamic range imaging: optical control of pixel exposures over space and time,” in IEEE International Conference on Computer Vision, 1168–1175 (2003).
  9. M. Aggarwal and N. Ahuja, “High Dynamic Range Panoramic Imaging,” in International Conference on Computer Vision, 2–9 (2001).
  10. O. Yadid-Pecht and E. R. Fossum, “Wide intrascene dynamic range CMOS APS using dual sampling,” IEEE Trans. Electron Dev. 44(10), 1721–1723 (1997).
  11. P. Debevec and J. Malik, “Recovering high dynamic range radiance maps from photographs,” in ACM SIGGRAPH Classes (ACM Press, 2008), paper 31.
  12. T. Mertens, J. Kautz, and F. Van Reeth, “Exposure Fusion,” in IEEE Pacific Conference on Computer Graphics and Applications, 382–390 (2007).
  13. D. Yang and A. El Gamal, “Comparative Analysis of SNR for Image Sensors with Enhanced Dynamic Range,” in Proceedings of SPIE Electronic Imaging, 3649 (1999).
  14. U. Hahne and M. Alexa, “Exposure Fusion for Time-Of-Flight Imaging,” Comput. Graph. Forum 30(7), 1887–1894 (2011).
  15. B. Buttgen, T. Oggier, M. Lehmann, R. Kaufmann, and F. Lustenberger, “CCD/CMOS lock-in pixel for range imaging: Challenges, limitations and state-of-the-art,” in 1st Range Imaging Research Day, 21–32 (2005).
  16. L. Jovanov, A. Pižurica, and W. Philips, “Fuzzy logic-based approach to wavelet denoising of 3D images produced by time-of-flight cameras,” Opt. Express 18(22), 22651–22676 (2010).




Figures (8)

Fig. 1. Two distinct groups of depth exposure profiles for various objects. x axis: exposure (ms); y axis: depth measurement (m).

Fig. 2. Fourier analysis of a set of depth measurements. First row: examples of depth maps recorded at different exposure times (t). Second row: amplitude of the frequency responses (logarithmic scale). Third row: phase of the frequency responses. Each column corresponds to the same depth map. Depth maps with saturated pixels produce periodic patterns in the frequency responses.

Fig. 3. 3D visualization of the Living Room scene in frontal view (top) and profile view (bottom). Note that [14] sacrifices precision to alleviate the strong saturation.

Fig. 4. 3D visualization of experimental results. Top-left corner: amplitude image of the target scene. In (c), each yellow box highlights a saturated region.

Fig. 5. Close-ups of saturated pixels. Top: Living Room scene; bottom: Kitchen scene.

Fig. 6. Temporal repeatability across different scenes. Overall standard deviations are 2.10 cm (single), 6.16 cm ([14]), and 0.42 cm (proposed).

Fig. 7. Residuals of plane fitting. Overall residuals are 0.506 cm (single), 0.34 cm ([14]), and 0.19 cm (proposed).

Fig. 8. Spatial repeatability using plane fitting.
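The plane-fitting residual metric used in Figs. 7 and 8 can be sketched as a least-squares fit of a plane to the measured point cloud. The explicit-plane form z = ax + by + c and the function name are illustrative assumptions; the paper does not specify its fitting procedure here:

```python
import numpy as np

def plane_residuals(points):
    """Fit a plane z = a*x + b*y + c to an (N, 3) point cloud by least
    squares and return the per-point residuals (assumed metric form)."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return points[:, 2] - A @ coef

# Points lying exactly on a plane should yield near-zero residuals.
x, y = np.meshgrid(np.arange(4.0), np.arange(4.0))
pts = np.c_[x.ravel(), y.ravel(), 2 * x.ravel() + 3 * y.ravel() + 1]
print(np.abs(plane_residuals(pts)).max())
```

A summary statistic such as the overall residual in Fig. 7 would then be, e.g., the RMS or mean absolute value of these residuals.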

Equations (6)

\begin{align}
y_t &= s_t \left( x + \varepsilon_t \right). \\
[d, H, t_0] &= \operatorname*{argmin}_{d,\, t_0} \sum_{t = t_0}^{t_0 + H} s_t\, R(t, t_0)\, \lVert y_t - d \rVert^2. \\
R(t, t_0) &= \frac{\lvert t - t_0 \rvert}{H} \times \frac{1}{t_0^{\alpha}}. \\
\varepsilon_{t_l} &= y_{t_l} - x, \qquad \varepsilon_{t_s} = y_{t_s} - x. \\
\varepsilon_{\delta} &= \varepsilon_{t_s} - \varepsilon_{t_l} = y_{t_s} - y_{t_l}, \qquad s_t = 1. \\
\hat{x}_t &= \mathfrak{F}^{-1} \left( L \circ \hat{Y}_t \right).
\end{align}
