Abstract

We present an inexpensive architecture for converting a frequency-modulated continuous-wave LiDAR system into a compressive-sensing-based depth-mapping camera. Instead of raster scanning to obtain depth maps, compressive sensing is used to significantly reduce the number of measurements. Ideally, our approach requires only two difference detectors. Due to the large flux entering the detectors, the signal amplification from heterodyne detection, and the effects of background subtraction from compressive sensing, the system can obtain higher signal-to-noise ratios than detector-array-based schemes while scanning a scene faster than is possible through raster scanning. Moreover, by efficiently storing only 2m data points from m < n measurements of an n-pixel scene, we can easily extract depths by solving only two linear equations with efficient convex-optimization methods.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement



[Crossref]

Sakurai, K.

Sandborn, P. A.

B. Behroozpour, P. A. Sandborn, N. Quack, T. J. Seok, Y. Matsui, M. C. Wu, and B. E. Boser, “Electronic-Photonic Integrated Circuit for 3D Microimaging,” IEEE J. Solid-State Circuits 52, 161–172 (2017).
[Crossref]

Sankaranarayanan, A.

V. Cevher, A. Sankaranarayanan, M. F. Duarte, D. Reddy, R. G. Baraniuk, and R. Chellappa, “Compressive sensing for background subtraction,” in European Conference on Computer Vision (Springer, 2008), pp. 155–168.

Sarazzi, D.

F. Remondino, L. Barazzetti, F. Nex, M. Scaioni, and D. Sarazzi, “UAV photogrammetry for mapping and 3d modeling–current status and future perspectives,” Int. Arch. Photogramm. Remote. Sens. Spatial Inf. Sci. 38, C22 (2011).

Satyan, N.

Scaioni, M.

F. Remondino, L. Barazzetti, F. Nex, M. Scaioni, and D. Sarazzi, “UAV photogrammetry for mapping and 3d modeling–current status and future perspectives,” Int. Arch. Photogramm. Remote. Sens. Spatial Inf. Sci. 38, C22 (2011).

Seok, T. J.

B. Behroozpour, P. A. Sandborn, N. Quack, T. J. Seok, Y. Matsui, M. C. Wu, and B. E. Boser, “Electronic-Photonic Integrated Circuit for 3D Microimaging,” IEEE J. Solid-State Circuits 52, 161–172 (2017).
[Crossref]

Smisek, J.

J. Smisek, M. Jancosek, and T. Pajdla, “3D with kinect,” in Consumer Depth Cameras for Computer Vision, (Springer, 2013), pp. 3–25.
[Crossref]

Soldevila, F.

F. Soldevila, P. Clemente, E. Tajahuerce, N. Uribe-Patarroyo, P. Andrés, and J. Lancis, “Computational imaging with a balanced detector,” Sci. Rep. 6, 29181 (2016).
[Crossref] [PubMed]

Sorin, W. V.

M. Nazarathy, W. V. Sorin, D. M. Baney, and S. A. Newton, “Spectral analysis of optical mixing measurements,” J. Light. Technol. 7, 1083–1096 (1989).
[Crossref]

Stann, B. L.

B. L. Stann, W. C. Ruff, and Z. G. Sztankay, “Intensity-modulated diode laser radar using frequency-modulation/continuous-wave ranging techniques,” Opt. Eng. 35, 3270–3279 (1996).
[Crossref]

Stettner, R.

R. Stettner, “Compact 3D flash lidar video cameras and applications,” in Laser Radar Technology and Applications XV, Vol. 7684 (International Society for Optics and Photonics, 2010).
[Crossref]

St-Onge, B.

K. Lim, P. Treitz, M. Wulder, B. St-Onge, and M. Flood, “Lidar remote sensing of forest structure,” Prog. Phys. Geogr. 27, 88–106 (2003).
[Crossref]

Stoppa, D.

F. Remondino and D. Stoppa, TOF Range-Imaging Cameras, Vol. 68121 (Springer, 2013).
[Crossref]

Sun, B.

M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7, 12010 (2016).
[Crossref]

Sun, M.-J.

M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7, 12010 (2016).
[Crossref]

Sun, T.

M. F. Duarte, M. A. Davenport, D. Takbar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25, 83–91 (2008).
[Crossref]

Sztankay, Z. G.

B. L. Stann, W. C. Ruff, and Z. G. Sztankay, “Intensity-modulated diode laser radar using frequency-modulation/continuous-wave ranging techniques,” Opt. Eng. 35, 3270–3279 (1996).
[Crossref]

Tajahuerce, E.

F. Soldevila, P. Clemente, E. Tajahuerce, N. Uribe-Patarroyo, P. Andrés, and J. Lancis, “Computational imaging with a balanced detector,” Sci. Rep. 6, 29181 (2016).
[Crossref] [PubMed]

Takbar, D.

M. F. Duarte, M. A. Davenport, D. Takbar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25, 83–91 (2008).
[Crossref]

Takeuchi, N.

Treitz, P.

K. Lim, P. Treitz, M. Wulder, B. St-Onge, and M. Flood, “Lidar remote sensing of forest structure,” Prog. Phys. Geogr. 27, 88–106 (2003).
[Crossref]

Ueno, T.

Uribe-Patarroyo, N.

F. Soldevila, P. Clemente, E. Tajahuerce, N. Uribe-Patarroyo, P. Andrés, and J. Lancis, “Computational imaging with a balanced detector,” Sci. Rep. 6, 29181 (2016).
[Crossref] [PubMed]

Vasilyev, A.

Verma, V.

T. Gerrits, D. Lum, J. Howell, V. Verma, R. Mirin, and S. W. Nam, “A short-wave infrared single photon camera,” in Imaging and Applied Optics 2017, of 2017 Technical Digest Series (Optical Society of America, 2017), paper CTu4B.5.

Wang, C.

W.-K. Yu, X.-F. Liu, X.-R. Yao, C. Wang, Y. Zhai, and G.-J. Zhai, “Complementary compressive imaging for the telescopic system,” Sci. Rep. 4, 05834 (2014).

Wang, H.

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101, 141123 (2012).
[Crossref]

Ware, M. R.

Wiener, N.

N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series (MIT, 1964).

Wong, F. N. C.

Wu, L.-A.

W.-K. Yu, X.-R. Yao, X.-F. Liu, R.-M. Lan, L.-A. Wu, G.-J. Zhai, and Q. Zhao, “Compressive microscopic imaging with “positive–negative” light modulation,” Opt. Commun 371, 105–111 (2016).
[Crossref]

Wu, M. C.

B. Behroozpour, P. A. Sandborn, N. Quack, T. J. Seok, Y. Matsui, M. C. Wu, and B. E. Boser, “Electronic-Photonic Integrated Circuit for 3D Microimaging,” IEEE J. Solid-State Circuits 52, 161–172 (2017).
[Crossref]

Wulder, M.

K. Lim, P. Treitz, M. Wulder, B. St-Onge, and M. Flood, “Lidar remote sensing of forest structure,” Prog. Phys. Geogr. 27, 88–106 (2003).
[Crossref]

Xu, W.

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101, 141123 (2012).
[Crossref]

Yang, J.

W. Yin, S. Morgan, J. Yang, and Y. Zhang, “Practical compressive sensing with toeplitz and circulant matrices,” Proc. SPIE 7744, 77440K (2010).
[Crossref]

Yao, X.-R.

W.-K. Yu, X.-R. Yao, X.-F. Liu, R.-M. Lan, L.-A. Wu, G.-J. Zhai, and Q. Zhao, “Compressive microscopic imaging with “positive–negative” light modulation,” Opt. Commun 371, 105–111 (2016).
[Crossref]

W.-K. Yu, X.-R. Yao, X.-F. Liu, L.-Z. Li, and G.-J. Zhai, “Three-dimensional single-pixel compressive reflectivity imaging based on complementary modulation,” Appl. Opt. 54, 363–367 (2015).
[Crossref]

W.-K. Yu, X.-F. Liu, X.-R. Yao, C. Wang, Y. Zhai, and G.-J. Zhai, “Complementary compressive imaging for the telescopic system,” Sci. Rep. 4, 05834 (2014).

Yariv, A.

Yin, W.

W. Yin, S. Morgan, J. Yang, and Y. Zhang, “Practical compressive sensing with toeplitz and circulant matrices,” Proc. SPIE 7744, 77440K (2010).
[Crossref]

Yu, W.-K.

W.-K. Yu, X.-R. Yao, X.-F. Liu, R.-M. Lan, L.-A. Wu, G.-J. Zhai, and Q. Zhao, “Compressive microscopic imaging with “positive–negative” light modulation,” Opt. Commun 371, 105–111 (2016).
[Crossref]

W.-K. Yu, X.-R. Yao, X.-F. Liu, L.-Z. Li, and G.-J. Zhai, “Three-dimensional single-pixel compressive reflectivity imaging based on complementary modulation,” Appl. Opt. 54, 363–367 (2015).
[Crossref]

W.-K. Yu, X.-F. Liu, X.-R. Yao, C. Wang, Y. Zhai, and G.-J. Zhai, “Complementary compressive imaging for the telescopic system,” Sci. Rep. 4, 05834 (2014).

Zhai, G.-J.

W.-K. Yu, X.-R. Yao, X.-F. Liu, R.-M. Lan, L.-A. Wu, G.-J. Zhai, and Q. Zhao, “Compressive microscopic imaging with “positive–negative” light modulation,” Opt. Commun 371, 105–111 (2016).
[Crossref]

W.-K. Yu, X.-R. Yao, X.-F. Liu, L.-Z. Li, and G.-J. Zhai, “Three-dimensional single-pixel compressive reflectivity imaging based on complementary modulation,” Appl. Opt. 54, 363–367 (2015).
[Crossref]

W.-K. Yu, X.-F. Liu, X.-R. Yao, C. Wang, Y. Zhai, and G.-J. Zhai, “Complementary compressive imaging for the telescopic system,” Sci. Rep. 4, 05834 (2014).

Zhai, Y.

W.-K. Yu, X.-F. Liu, X.-R. Yao, C. Wang, Y. Zhai, and G.-J. Zhai, “Complementary compressive imaging for the telescopic system,” Sci. Rep. 4, 05834 (2014).

Zhang, Y.

W. Yin, S. Morgan, J. Yang, and Y. Zhang, “Practical compressive sensing with toeplitz and circulant matrices,” Proc. SPIE 7744, 77440K (2010).
[Crossref]

Zhao, C.

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101, 141123 (2012).
[Crossref]

Zhao, Q.

W.-K. Yu, X.-R. Yao, X.-F. Liu, R.-M. Lan, L.-A. Wu, G.-J. Zhai, and Q. Zhao, “Compressive microscopic imaging with “positive–negative” light modulation,” Opt. Commun 371, 105–111 (2016).
[Crossref]

Zhu, C.

R. H. Byrd, P. Lu, J. Nocedal, and C. Zhu, “A limited memory algorithm for bound constrained optimization,” SIAM J. on Sci. Comput. 16, 1190–1208 (1995).
[Crossref]

Adv. Opt. Photonics (1)

J. Geng, “Structured-light 3d surface imaging: a tutorial,” Adv. Opt. Photonics 3, 128–160 (2011).
[Crossref]

Appl. Opt. (4)

Appl. Phys. Lett. (1)

C. Zhao, W. Gong, M. Chen, E. Li, H. Wang, W. Xu, and S. Han, “Ghost imaging lidar via sparsity constraints,” Appl. Phys. Lett. 101, 141123 (2012).
[Crossref]

IEEE J. Solid-State Circuits (1)

B. Behroozpour, P. A. Sandborn, N. Quack, T. J. Seok, Y. Matsui, M. C. Wu, and B. E. Boser, “Electronic-Photonic Integrated Circuit for 3D Microimaging,” IEEE J. Solid-State Circuits 52, 161–172 (2017).
[Crossref]

IEEE Signal Process. Lett. (1)

M. Hashemi and S. Beheshti, “Adaptive noise variance estimation in bayesshrink,” IEEE Signal Process. Lett. 17, 12–15 (2010).
[Crossref]

IEEE Signal Process. Mag. (1)

M. F. Duarte, M. A. Davenport, D. Takbar, J. N. Laska, T. Sun, K. F. Kelly, and R. G. Baraniuk, “Single-pixel imaging via compressive sampling,” IEEE Signal Process. Mag. 25, 83–91 (2008).
[Crossref]

IEEE Trans. Inf. Theory (2)

D. L. Donoho, “Compressed sensing,” IEEE Trans. Inf. Theory 52, 1289–1306 (2006).
[Crossref]

M. L. Malloy and R. D. Nowak, “Near-optimal adaptive compressed sensing,” IEEE Trans. Inf. Theory 60, 4001–4012 (2014).
[Crossref]

Int. Arch. Photogramm. Remote. Sens. Spatial Inf. Sci. (1)

F. Remondino, L. Barazzetti, F. Nex, M. Scaioni, and D. Sarazzi, “UAV photogrammetry for mapping and 3d modeling–current status and future perspectives,” Int. Arch. Photogramm. Remote. Sens. Spatial Inf. Sci. 38, C22 (2011).

ISPRS J. Photogramm. Remote. Sens. (1)

C. Mallet and F. Bretar, “Full-waveform topographic lidar: State-of-the-art,” ISPRS J. Photogramm. Remote. Sens. 64, 1–16 (2009).
[Crossref]

J. Light. Technol. (1)

M. Nazarathy, W. V. Sorin, D. M. Baney, and S. A. Newton, “Spectral analysis of optical mixing measurements,” J. Light. Technol. 7, 1083–1096 (1989).
[Crossref]

Mach. Vis. Appl. (1)

R. Horaud, M. Hansard, G. Evangelidis, and C. Ménier, “An overview of depth cameras and range scanners based on time-of-flight technologies,” Mach. Vis. Appl. 27, 1005–1020 (2016).
[Crossref]

Nat. Commun. (1)

M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb, and M. J. Padgett, “Single-pixel three-dimensional imaging with time-based depth resolution,” Nat. Commun. 7, 12010 (2016).
[Crossref]

Opt. Commun (1)

W.-K. Yu, X.-R. Yao, X.-F. Liu, R.-M. Lan, L.-A. Wu, G.-J. Zhai, and Q. Zhao, “Compressive microscopic imaging with “positive–negative” light modulation,” Opt. Commun 371, 105–111 (2016).
[Crossref]

Opt. Eng. (2)

B. L. Stann, W. C. Ruff, and Z. G. Sztankay, “Intensity-modulated diode laser radar using frequency-modulation/continuous-wave ranging techniques,” Opt. Eng. 35, 3270–3279 (1996).
[Crossref]

M.-C. Amann, T. Bosch, M. Lescure, R. Myllyla, and M. Rioux, “Laser ranging: a critical review of usual techniques for distance measurement,” Opt. Eng. 40, 10–19 (2001).
[Crossref]

Opt. Express (4)

Proc. SPIE (1)

W. Yin, S. Morgan, J. Yang, and Y. Zhang, “Practical compressive sensing with toeplitz and circulant matrices,” Proc. SPIE 7744, 77440K (2010).
[Crossref]

Prog. Phys. Geogr. (1)

K. Lim, P. Treitz, M. Wulder, B. St-Onge, and M. Flood, “Lidar remote sensing of forest structure,” Prog. Phys. Geogr. 27, 88–106 (2003).
[Crossref]

Sci. Rep. (2)

W.-K. Yu, X.-F. Liu, X.-R. Yao, C. Wang, Y. Zhai, and G.-J. Zhai, “Complementary compressive imaging for the telescopic system,” Sci. Rep. 4, 05834 (2014).

F. Soldevila, P. Clemente, E. Tajahuerce, N. Uribe-Patarroyo, P. Andrés, and J. Lancis, “Computational imaging with a balanced detector,” Sci. Rep. 6, 29181 (2016).
[Crossref] [PubMed]

Sensors (1)

K. Khoshelham and S. O. Elberink, “Accuracy and resolution of kinect depth data for indoor mapping applications,” Sensors 12, 1437–1454 (2012).
[Crossref] [PubMed]

SIAM J. on Sci. Comput. (1)

R. H. Byrd, P. Lu, J. Nocedal, and C. Zhu, “A limited memory algorithm for bound constrained optimization,” SIAM J. on Sci. Comput. 16, 1190–1208 (1995).
[Crossref]

Other (16)

R. Fletcher, Practical Methods of Optimization (John Wiley & Sons, 2013).

N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series (MIT, 1964).

Y. C. Eldar and G. Kutyniok, Compressed Sensing: Theory and Applications (Cambridge University, 2012).
[Crossref]

A. Kadambi and P. T. Boufounos, “Coded aperture compressive 3-d lidar,” in IEEE International Conference on Acoustics, Speech and Signal Processing (IEEE, 2015), pp. 1166–1170.

A. Colaço, A. Kirmani, G. A. Howland, J. C. Howell, and V. K. Goyal, “Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2012), pp. 96–102.

G. S. Cheok, M. Juberts, and M. Franaszek, “3D Imaging Systems for Manufacturing, Construction, and Mobility (NIST TN 1682),” Tech. Note (NIST TN)-1682 (2010).

J. Smisek, M. Jancosek, and T. Pajdla, “3D with kinect,” in Consumer Depth Cameras for Computer Vision, (Springer, 2013), pp. 3–25.
[Crossref]

A. A. Frank and M. Nakamura, “Laser radar for a vehicle lateral guidance system,” (1993). US Patent 5,202,742.

F. Amzajerdian, D. Pierrottet, L. Petway, G. Hines, and V. Roback, “Lidar systems for precision navigation and safe landing on planetary bodies,” in International Symposium on Photoelectronic Detection and Imaging 2011, (International Society for Optics and Photonics, 2011), p. 819202.
[Crossref]

R. Stettner, “Compact 3D flash lidar video cameras and applications,” in Laser Radar Technology and Applications XV, Vol. 7684 (International Society for Optics and Photonics, 2010).
[Crossref]

A. E. Johnson and J. F. Montgomery, “Overview of terrain relative navigation approaches for precise lunar landing,” in IEEE Aerospace Conference (2008), pp. 1–10.

V. Cevher, A. Sankaranarayanan, M. F. Duarte, D. Reddy, R. G. Baraniuk, and R. Chellappa, “Compressive sensing for background subtraction,” in European Conference on Computer Vision (Springer, 2008), pp. 155–168.

T. Gerrits, D. Lum, J. Howell, V. Verma, R. Mirin, and S. W. Nam, “A short-wave infrared single photon camera,” in Imaging and Applied Optics 2017, of 2017 Technical Digest Series (Optical Society of America, 2017), paper CTu4B.5.

M. H. Conde, “Compressive sensing for the photonic mixer device,” in Compressive Sensing for the Photonic Mixer Device, (Springer, 2017), pp. 207–352.
[Crossref]

S. Antholzer, C. Wolf, M. Sandbichler, M. Dielacher, and M. Haltmeier, “A framework for compressive time-of-flight 3d sensing,” https://arxiv.org/abs/1710.10444 .

F. Remondino and D. Stoppa, TOF Range-Imaging Cameras, Vol. 68121 (Springer, 2013).
[Crossref]


Figures (6)

Fig. 1 (Proposed Experiment) HWP: half-wave plate, PBS: polarizing beam-splitter, DMD: digital micro-mirror device, BS: beam-splitter. A linearly chirped laser is split into two beams, designated the local oscillator and the signal, by a HWP and a PBS. The signal illuminates a target scene, and the reflected radiation images the scene onto a DMD. The DMD takes pseudo-random spatial projections, consisting of ±1 pixel values, and directs them to balanced heterodyne detectors that mix them with the local oscillator.
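The FMCW ranging principle behind Fig. 1 can be illustrated with a minimal numpy sketch: mixing a linearly chirped field with a delayed copy of itself yields a beat note at f_beat = Δν·τ/T, so depth follows as d = f_beat·T·c/(2Δν). All chirp parameters below are assumed for illustration only.

```python
import numpy as np

c = 3.0e8                    # speed of light (m/s)
T = 1.0e-3                   # chirp duration (s), assumed
dnu = 100.0e9                # chirp bandwidth (Hz), assumed
d = 1.5                      # target depth (m), assumed
tau = 2 * d / c              # round-trip delay

fs = 10.0e6                  # scope sampling rate (Hz), assumed
t = np.arange(0, T, 1 / fs)

# Beat note of the form sin(2*pi*(dnu*tau/T)*t + phi), cf. Eq. (2).
f_beat = dnu * tau / T
sig = np.sin(2 * np.pi * f_beat * t)

# Recover the beat frequency from the spectrum and convert back to depth.
spec = np.abs(np.fft.rfft(sig))
f_est = np.fft.rfftfreq(t.size, 1 / fs)[np.argmax(spec)]
d_est = f_est * T * c / (2 * dnu)
print(round(d_est, 3))       # ~1.5 m
```

Each beat frequency maps linearly to a depth, which is why a scene with several objects produces the multi-peak spectra shown in Fig. 4.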
Fig. 2 Image (a) presents a 3-dimensional scene composed of Lambertian-scattering targets. Image (b) presents the depth map we wish to compressively recover.
Fig. 3 Image (a) presents an illumination profile carrying 1 W of total power. The scene is discretized to 128 × 128 pixels with a maximum power per pixel of 119 μW. Modeling each object as a Lambertian scatterer and using a 2 inch collection optic, image (b) presents a single DMD projection of the reflected radiation as seen by one detector. The maximum power per pixel is of order nanowatts.
Fig. 4 Image (a) shows the 2 MHz Lorentzian linewidth of the beat notes expected from a laser with a 1 MHz Lorentzian linewidth, as seen by an oscilloscope sampling at 33.3 MHz. Image (b) shows the positive-frequency components of a single noiseless projection from Fig. 3 when using a zero-linewidth laser. Accounting for a 2 MHz beat-note linewidth, image (c) shows the broadened frequency components. Including a peak SNR = 5 (based on the SNR of the brightest object, or pixel), image (d) presents a realistic noisy signal one would measure in an experiment. Image (e) presents the result of denoising the signal in (d) with a BayesShrink denoising filter using a symlet-20 wavelet decomposition. Image (f) presents the result of deconvolving the 2 MHz Lorentzian linewidth from the denoised signal in (e) using a Wiener filter; it is the cleaned signal from the (1,0) projection. A similarly cleaned (0,1) signal is then subtracted from (f) and used to form yI and yIν.
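The Wiener-deconvolution step of Fig. 4(f) can be sketched with numpy alone: a spectrum of delta-like beat notes is broadened by a Lorentzian kernel and then sharpened with the standard Wiener deconvolution filter H*/( |H|² + 1/SNR ). The grid size, linewidth, and SNR below are assumed for illustration, not taken from the paper.

```python
import numpy as np

N = 1024
f = np.arange(N)

# "True" spectrum: two beat notes (two depths) as delta-like peaks.
true = np.zeros(N)
true[200] = 1.0
true[400] = 0.6

# Lorentzian broadening kernel with an assumed FWHM of `gamma` bins,
# built centered and rolled so circular convolution leaves peaks in place.
gamma = 8.0
k = (gamma / 2) ** 2 / ((f - N // 2) ** 2 + (gamma / 2) ** 2)
k /= k.sum()
k = np.roll(k, -N // 2)

observed = np.real(np.fft.ifft(np.fft.fft(true) * np.fft.fft(k)))

# Wiener deconvolution: G = H* / (|H|^2 + 1/SNR), with an assumed SNR.
H = np.fft.fft(k)
snr = 100.0
G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
recovered = np.real(np.fft.ifft(np.fft.fft(observed) * G))

print(int(np.argmax(recovered)))   # brightest beat note stays at bin 200
```

The regularizing 1/SNR term keeps the division stable where the Lorentzian response is small, which is what allows the broadened peaks in Fig. 4(c) to be narrowed without amplifying noise.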
Fig. 5 With PSNR = 5, image (a) shows a typical total-variation-minimization reconstruction from a 25% sample-ratio intensity-measurement vector yI. Shapes are identified, but the pixel values are incorrect. Image (b) is a binary mask generated by hard-thresholding image (a). After representing image (b) in a sparse basis, such as a Haar-wavelet decomposition, least squares is performed on the m/3 largest signal components to construct xI and xIν. Applying Eq. (11) yields the depth map in image (c). Images (d), (e), and (f) demonstrate the same process for a 5% sample ratio, again with PSNR = 5. Image (g) is the true depth map, presented for easy comparison. Images (h) and (i) are smoothed depth maps obtained by applying a 4 × 4 pixel averaging kernel to depth maps (f) and (c), respectively.
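The final depth-extraction step of Fig. 5 reduces to an element-wise ratio: once the intensity image xI and the frequency-weighted image xIν are reconstructed, Eq. (11) gives d = (xIν / xI)·Tc/(2Δν) per pixel. A minimal sketch, with assumed chirp parameters and a hypothetical 3-pixel scene:

```python
import numpy as np

c = 3.0e8          # speed of light (m/s)
T = 1.0e-3         # chirp period (s), assumed
dnu = 100.0e9      # chirp bandwidth (Hz), assumed

# Hypothetical reconstructed images: per-pixel beat frequencies and intensities.
beat = np.array([2.0e5, 4.0e5, 8.0e5])   # beat frequency per pixel (Hz)
x_I = np.array([1.0, 0.5, 0.25])         # intensity image
x_Inu = x_I * beat                       # frequency-weighted image

# Eq. (11), element-wise; guard against empty (zero-intensity) pixels.
with np.errstate(divide="ignore", invalid="ignore"):
    d = np.where(x_I > 0, x_Inu / x_I, 0.0) * T * c / (2 * dnu)

print(d)   # depths in metres: [0.3 0.6 1.2]
```

Because the ratio cancels the reflectivity, only the beat frequency survives, which is why the depth map is insensitive to the incorrect pixel magnitudes visible in Fig. 5(a).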
Fig. 6 (a) Reconstruction Mean-Squared Error: Each data point corresponds to the average mean squared error from 10 different reconstructions of different data sets for varying sample ratios (m/n) and peak-SNR cases. The standard error for each mean is given by the shaded regions. (b) Depth Uncertainty to Toroid: Each data point represents the average uncertainty in depth to the toroid within Fig. 2. Points were calculated by first considering the standard deviation of each pixel over 10 different reconstructions. Standard deviations associated with only the toroid pixels were then averaged. Different peak-SNR scenarios, FWHM Lorentzian linewidths for the beat-note frequency uncertainties, and sample ratios were considered and compared to the results obtained by simulating raster scans. Raster scans were not affected by linewidth uncertainty.

Equations (14)


$$E(t) = A \exp\!\left(2\pi i\left[\nu_0 + \frac{\nu_f - \nu_0}{2T}\,t\right]t\right), \tag{1}$$

$$P_\mathrm{Scope}(t) = \frac{\epsilon_0 c}{2}\left[A_\mathrm{Sig}^2 + A_\mathrm{LO}^2 + 2A_\mathrm{LO}A_\mathrm{Sig}\sin\!\left(2\pi\,\frac{\Delta\nu\,\tau}{T}\,t + \phi\right)\right], \tag{2}$$

$$d = \frac{\nu\, T c}{2\,\Delta\nu}. \tag{3}$$

$$\sum_j E_\mathrm{Sig}(t-\tau_j) = \sum_j A_j \exp\!\left(2\pi i\left[\nu_0 + \frac{\Delta\nu}{2T}(t-\tau_j)\right](t-\tau_j)\right), \tag{4}$$

$$P_\mathrm{Scope}(t) = \frac{\epsilon_0 c}{2}\left(\left|\frac{E_\mathrm{LO}(t) + i\sum_j E_\mathrm{Sig}(t-\tau_j)}{\sqrt{2}}\right|^2 - \left|\frac{i E_\mathrm{LO}(t) + \sum_j E_\mathrm{Sig}(t-\tau_j)}{\sqrt{2}}\right|^2\right) \tag{5}$$

$$= \epsilon_0 c \sum_j A_\mathrm{LO} A_j \sin\!\left(2\pi\,\frac{\Delta\nu\,\tau_j}{T}\,t + \phi_j\right), \tag{6}$$

$$\operatorname*{arg\,min}_{\hat{x}\in\mathbb{R}^n} \left\|A\hat{x} - y\right\|_2^2 + \alpha\,\mathrm{TV}(\hat{x}), \tag{7}$$

$$P_\mathrm{Scope}(t) = \epsilon_0 c \sum_j A_\mathrm{LO}\, A\, x_{I_j} \sin\!\left(2\pi\,\frac{\Delta\nu\,\tau_j}{T}\,t + \phi_j\right). \tag{8}$$

$$y_I = \sum_{\nu_+=1}^{N} \left|\mathcal{F}\!\left[P_\mathrm{Scope}(t)\right]_{\nu_+}\right|, \tag{9}$$

$$y_{I\nu} = \sum_{\nu_+=1}^{N} \left|\mathcal{F}\!\left[P_\mathrm{Scope}(t)\right]_{\nu_+}\right|\nu_+, \tag{10}$$

$$d = \frac{\hat{x}_{I\nu}}{\hat{x}_I}\,\frac{T c}{2\,\Delta\nu}, \tag{11}$$

$$s_{m/3} = P\,\Psi\,\hat{M}\!\left(\operatorname*{arg\,min}_{x_I\in\mathbb{R}^n} \left\|A x_I - y_I\right\|_2^2 + \alpha\,\mathrm{TV}(x_I)\right), \tag{12}$$

$$\hat{x}_I = M\,\Psi^{-1} P^T\!\left(\operatorname*{arg\,min}_{s_{m/3}} \left\|A\,\Psi^{-1} P^T s_{m/3} - y_I\right\|_2^2\right), \tag{13}$$

$$\hat{x}_{I\nu} = M\,\Psi^{-1} P^T\!\left(\operatorname*{arg\,min}_{s_{m/3}} \left\|A\,\Psi^{-1} P^T s_{m/3} - y_{I\nu}\right\|_2^2\right). \tag{14}$$
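The support-restricted least-squares debiasing of Eqs. (12)-(14) can be sketched in a few lines: after the TV step identifies the scene's support, the unknowns are restricted to the largest transform coefficients and a small, well-posed least-squares problem is re-solved. In this sketch a random orthonormal matrix stands in for the Haar-wavelet transform Ψ, and all sizes are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 32                       # pixels, measurements (assumed sizes)

# Stand-in orthonormal sparsifying basis (in place of a Haar wavelet Psi).
Psi, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Ground-truth image that is exactly sparse in Psi: m/3 active coefficients.
support = rng.choice(n, size=m // 3, replace=False)
s_true = np.zeros(n)
s_true[support] = rng.standard_normal(m // 3)
x_true = Psi.T @ s_true

# Sensing matrix and measurements (Gaussian stands in for the +/-1 DMD rows).
A = rng.standard_normal((m, n))
y = A @ x_true

# Restrict to the support (the role of P^T) and solve the small least squares.
B = A @ Psi.T[:, support]           # maps the kept coefficients to measurements
s_hat, *_ = np.linalg.lstsq(B, y, rcond=None)

x_hat = Psi.T[:, support] @ s_hat
print(float(np.linalg.norm(x_hat - x_true)))   # near zero on the true support
```

Because only m/3 coefficients remain unknown against m measurements, the system is overdetermined and the debiased solve is cheap, which is the point of storing just the two measurement vectors yI and yIν.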
