Abstract

Real-time detection and tracking of fast moving objects has important applications in various fields. However, available methods, especially low-cost ones, can hardly achieve real-time, long-duration detection and tracking. Here we report an image-free and cost-effective method for detecting and tracking a fast moving object in real time and over long durations. The method employs a spatial light modulator and a single-pixel detector for data acquisition: it illuminates the target with Fourier basis patterns and collects the resulting light signal with the single-pixel detector. The object is detected and tracked directly from the single-pixel measurements, without image reconstruction, and the detection and tracking algorithm is computationally efficient. We experimentally demonstrate a temporal resolution of 1,666 frames per second using a 10,000 Hz digital micro-mirror device, with a latency on the order of microseconds and only 600 bytes of data acquired per frame. The method therefore allows fast moving objects to be detected and tracked in real time and for long durations. This image-free approach might open up a new avenue for highly efficient spatial information acquisition.
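The reported temporal resolution is consistent with the pattern budget suggested by Fig. 2: assuming six Fourier basis patterns per tracked position (three phase-shifted patterns for $\tilde{I}(f_x, 0)$ and three for $\tilde{I}(0, f_y)$), a 10,000 Hz digital micro-mirror device gives $10{,}000 / 6 \approx 1{,}666$ frames per second.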

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

Supplementary Material (3)

Visualization 1: Raw images captured by the side camera for the h-shaped track.
Visualization 2: Raw images captured by the side camera for the s-shaped track.
Visualization 3: Comparison of tracking results between the proposed image-free method and the image-based method.


Figures (4)

Fig. 1. (a) Experimental setup. (b) h-shaped track. (c) s-shaped track. Scale bar = 15 mm.
Fig. 2. Fourier basis patterns. Patterns (a)-(c) are used for the acquisition of $\tilde{I}(f_x, 0)$ and (d)-(f) for $\tilde{I}(0, f_y)$.
Fig. 3. Single-pixel measurements $\bar{D}$: (a) for the h-shaped track and (b) for the s-shaped track.
Fig. 4. Tracking results for the (a) h-shaped and (b) s-shaped tracks. Red circles: results of the proposed method; blue solid dots: results from high-speed photography with the side camera.

Equations (6)


$$P(x, y \mid f_x, f_y, \varphi_0) = A + B\cos[2\pi(f_x x + f_y y) + \varphi_0],$$
$$D = \langle I(x, y), P(x, y)\rangle = \iint I(x, y)\,\{A + B\cos[2\pi(f_x x + f_y y) + \varphi_0]\}\,\mathrm{d}x\,\mathrm{d}y,$$
$$I(x - x_0, y - y_0) = \mathcal{F}^{-1}\{\tilde{I}(f_x, f_y)\exp[-j2\pi(f_x x_0 + f_y y_0)]\},$$
$$\tilde{I} = (2D_0 - D_{2\pi/3} - D_{4\pi/3}) + \sqrt{3}\,j\,(D_{2\pi/3} - D_{4\pi/3}),$$
$$x_0 = -\frac{1}{2\pi f_x}\arg\{\tilde{I}(f_x, 0) - \tilde{I}_{\mathrm{bg}}(f_x, 0)\},$$
$$y_0 = -\frac{1}{2\pi f_y}\arg\{\tilde{I}(0, f_y) - \tilde{I}_{\mathrm{bg}}(0, f_y)\}.$$
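Below is a minimal numerical sketch in Python, not the authors' implementation, of how these relations yield a position estimate: three phase-shifted Fourier patterns per axis give the spectrum slices $\tilde{I}(f_x, 0)$ and $\tilde{I}(0, f_y)$ through the three-step combination, and the object coordinates follow from the phase of the background-subtracted coefficients. The grid size, spatial frequencies, Gaussian "object", and random background are illustrative assumptions.

import numpy as np

N = 128                                   # pattern resolution (illustrative assumption)
fx = fy = 1.0 / N                         # lowest nonzero spatial frequency per axis (assumed)
A, B = 0.5, 0.5                           # pattern offset and contrast
phases = (0.0, 2 * np.pi / 3, 4 * np.pi / 3)
yy, xx = np.mgrid[0:N, 0:N]

def pattern(fx_, fy_, phi):
    # Fourier basis pattern P(x, y | fx, fy, phi0) = A + B cos[2*pi*(fx*x + fy*y) + phi0]
    return A + B * np.cos(2 * np.pi * (fx_ * xx + fy_ * yy) + phi)

def spectrum_slice(scene, fx_, fy_):
    # Single-pixel measurements D = <I, P> for three phase shifts, combined by the
    # three-step formula into one complex Fourier coefficient of the scene.
    D = [np.sum(scene * pattern(fx_, fy_, phi)) for phi in phases]
    return (2 * D[0] - D[1] - D[2]) + np.sqrt(3) * 1j * (D[1] - D[2])

def small_object(x0, y0, sigma=2.0):
    # Illustrative stand-in for the moving object: a small Gaussian spot at (x0, y0).
    return np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))

rng = np.random.default_rng(0)
background = 0.3 * rng.random((N, N))     # static background (assumed)
I_bg_x = spectrum_slice(background, fx, 0.0)
I_bg_y = spectrum_slice(background, 0.0, fy)

scene = background + small_object(x0=40.3, y0=75.8)   # object at an "unknown" position
I_x = spectrum_slice(scene, fx, 0.0)
I_y = spectrum_slice(scene, 0.0, fy)

# Position from the phase of the background-subtracted coefficients (Fourier shift
# theorem); the modulo removes the 2*pi phase ambiguity, whose period is 1/fx pixels.
x_est = (-np.angle(I_x - I_bg_x) / (2 * np.pi * fx)) % (1.0 / fx)
y_est = (-np.angle(I_y - I_bg_y) / (2 * np.pi * fy)) % (1.0 / fy)
print(f"estimated position: ({x_est:.2f}, {y_est:.2f})")   # ~ (40.30, 75.80)

Choosing the lowest nonzero frequency keeps the phase-to-position mapping unambiguous over the whole field of view; a higher frequency steepens the phase slope but wraps every $1/f_x$ pixels.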