Abstract

Mean shift is a well-known and efficient algorithm for tracking infrared targets. However, in complex scenes with clutter, varying illumination, and occlusion, the traditional mean shift tracker often converges to a local maximum and loses the real infrared target. To cope with these problems, an improved mean shift tracking algorithm based on multicue fusion is proposed. Exploiting the characteristics of humans in infrared images, the algorithm first extracts gray and edge cues and then uses motion information to guide them, yielding motion-guided gray and edge cues that are fused adaptively within the mean shift framework. Finally, an automatic model update further improves tracking performance. Experimental results show that, compared with the traditional mean shift algorithm, the presented method greatly improves the accuracy and robustness of infrared human tracking in complex scenes.
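The pipeline summarized above (gray and edge cues gated by a frame-difference motion mask, then fused with normalized weights) can be sketched in NumPy. This is an illustrative reconstruction, not the authors' implementation; the function names, the bin count, and the threshold constant `k` are assumptions.

```python
import numpy as np

def gray_pdm(frame, model_hist, n_bins=32):
    """Backproject a normalized gray-level model histogram onto the frame,
    giving a pixelwise probability distribution map (PDM)."""
    bins = np.minimum((frame.astype(np.int64) * n_bins) // 256, n_bins - 1)
    return model_hist[bins]

def motion_mask(frame_t, frame_t1, k=2.0):
    """Frame-difference motion mask: threshold |f_t - f_{t-1}| at
    mean + k * std of the difference image (an assumed rule in the
    spirit of the paper's T = m + k*sigma)."""
    d = np.abs(frame_t.astype(np.float64) - frame_t1.astype(np.float64))
    thr = d.mean() + k * d.std()
    return (d > thr).astype(np.float64)

def fuse(pdms, weights):
    """Adaptive fusion: weighted sum of the motion-guided cue PDMs,
    with the weights normalized to sum to 1."""
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()
    return sum(wi * p for wi, p in zip(w, pdms))
```

The same `fuse` call accepts any number of cues, so an edge PDM built from an orientation histogram can be added without changing the fusion step.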

© 2009 Optical Society of America


References


  1. M. Yasuno, S. Ryousuke, N. Yasuda, and M. Aoki, “Pedestrian detection and tracking in far infrared images,” in IEEE Conference on Intelligent Transportation Systems, 2005 (IEEE, 2005), pp. 182-187.
  2. R. M. Liu, E. Q. Liu, J. Yang, Y. Zeng, F. L. Wang, and Y. Cao, “Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction,” Appl. Opt. 46, 7780-7791 (2007).
  3. K. Brunnstrom, B. N. Schenkman, and B. Jacobson, “Object detection in cluttered infrared images,” Opt. Eng. 42, 388-399 (2003).
  4. A. Bal and M. S. Alam, “Dynamic target tracking with fringe-adjusted joint transform correlation and template matching,” Appl. Opt. 43, 4874-4881 (2004).
  5. M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking,” IEEE Trans. Signal Process. 50, 174-188 (2002).
  6. K. Nummiaro, E. Koller-Meier, and L. V. Gool, “An adaptive color-based particle filter,” Image Vision Comput. 21, 99-110 (2003).
  7. M. Isard and A. Blake, “CONDENSATION: conditional density propagation for visual tracking,” Int. J. Comput. Vision 29, 5-28 (1998).
  8. H. T. Hu, Z. L. Jing, and S. Q. Hu, “Track before detect for point targets with particle filter in infrared image sequences,” Chin. Opt. Lett. 3, 322-325 (2005).
  9. D. Comaniciu, V. Ramesh, and P. Meer, “Real-time tracking of non-rigid objects using mean shift,” in IEEE Conference on Computer Vision and Pattern Recognition, 2000 (IEEE, 2000), pp. 142-149.
  10. C. Yang, R. Duraiswami, and L. Davis, “Efficient mean-shift tracking via a new similarity measure,” in IEEE Conference on Computer Vision and Pattern Recognition, 2005 (IEEE, 2005), pp. 176-183.
  11. D. Comaniciu and V. Ramesh, “Mean shift and optimal prediction for efficient object tracking,” in IEEE Conference on Image Processing, 2000 (IEEE, 2000), pp. 70-73.
  12. K. Fukunaga and L. Hostetler, “The estimation of the gradient of a density function, with applications in pattern recognition,” IEEE Trans. Inf. Theory 21, 32-40 (1975).
  13. Y. Cheng, “Mean shift, mode seeking, and clustering,” IEEE Trans. Pattern Anal. Mach. Intell. 17, 790-799 (1995).
  14. D. Comaniciu, V. Ramesh, and P. Meer, “Kernel-based object tracking,” IEEE Trans. Pattern Anal. Mach. Intell. 25, 564-577 (2003).
  15. D. Comaniciu and P. Meer, “Mean shift: a robust approach toward feature space analysis,” IEEE Trans. Pattern Anal. Mach. Intell. 24, 603-619 (2002).
  16. J. G. Ling, E. Q. Liu, H. Y. Liang, and J. Yang, “Infrared target tracking with kernel-based performance metric and eigenvalue-based similarity measure,” Appl. Opt. 46, 3239-3252 (2007).
  17. C. F. Shan, T. N. Tan, and Y. C. Wei, “Real-time hand tracking using a mean shift embedded particle filter,” Pattern Recogn. 40, 1958-1970 (2007).
  18. J. Jeyakar, R. V. Babu, and K. R. Ramakrishnan, “Robust object tracking with background-weighted local kernels,” Comput. Vision Image Underst. 112, 296-309 (2008).
  19. R. V. Babu, P. Perez, and P. Bouthemy, “Robust tracking with motion estimation and local kernel-based color modeling,” Image Vision Comput. 25, 1205-1216 (2007).
  20. H. Liu, Z. Yu, H. B. Zha, Y. X. Zou, and L. Zhang, “Robust human tracking based on multi-cue integration and mean-shift,” Pattern Recogn. Lett. (to be published).
  21. P. Brasnett, L. Mihaylova, D. Bull, and N. Canagarajah, “Sequential Monte Carlo tracking by fusing multiple cues in video sequences,” Image Vision Comput. 25, 1217-1227 (2007).
  22. D. Borghys, P. Verlinde, C. Perneel, and M. Acheroy, “Multilevel data fusion for the detection of targets using multispectral image sequences,” Opt. Eng. 37, 477-484 (1998).
  23. J. S. Hu, C. W. Juan, and J. J. Wang, “A spatial-color mean-shift object tracking algorithm with scale and orientation estimation,” Pattern Recogn. Lett. 29, 2165-2173 (2008).
  24. F. Aherne, N. Thacker, and P. Rockett, “The Bhattacharyya metric as an absolute similarity measure for frequency coded data,” Kybernetika 34, 363-368 (1997).
  25. J. W. Davis, “OTCBVS benchmark dataset collection,” http://www.cse.ohio-state.edu/otcbvs-bench/.

Figures (13)

Fig. 1. Block diagram of the proposed infrared human tracking system.

Fig. 2. Example of a combined PDM based on motion-guided gray and edge cues: (a) frame 120 of S1, (b) gray PDM, (c) edge PDM, (d) motion difference image, (e) motion-guided gray PDM, (f) motion-guided edge PDM, (g) combined PDM based on the motion-guided gray and edge cues.

Fig. 3. Flow chart of the adaptive multicue fusion based mean shift tracking algorithm.

Fig. 4. Sequence S1, in which the target is disturbed by background of similar gray level; tracking results for frames 6, 82, and 152: (a)-(c) presented method; (d)-(f) traditional mean shift method.

Fig. 5. Tracking error plot of both methods for sequence S1.

Fig. 6. Sequence S2, in which the target has low gray values; tracking results for frames 5, 75, and 103: (a)-(c) presented method; (d)-(f) traditional mean shift method.

Fig. 7. Tracking error plot of both methods for sequence S2.

Fig. 8. Sequence S3, in which the illumination changes; tracking results for frames 5, 10, and 117: (a)-(c) presented method; (d)-(f) traditional mean shift method.

Fig. 9. Weights assigned to the motion-guided gray and edge cues for sequence S3. The vertical dotted lines mark the frames shown in Fig. 8.

Fig. 10. Comparison between our adaptive multicue fusion scheme and the standard nonadaptive one: (a)-(c) tracking results for frames 5, 10, and 117 with the adaptive scheme; (d)-(f) the same frames with the nonadaptive scheme.

Fig. 11. Tracking error plot for sequence S3 of our adaptive multicue fusion tracker and the standard nonadaptive multicue fusion tracker.

Fig. 12. Flow chart of the occlusion handling process.

Fig. 13. Sequence S4, in which the target suffers from occlusions; tracking results of the presented method: (a) frame 39, (b) frame 45, (c) frame 56, (d) frame 75, (e) frame 90, (f) frame 120.

Tables (2)

Table 1. Video Sequences Used in Our Experiments

Table 2. Processing Time Comparison of Both Methods for a Frame (in Seconds)

Equations (24)


(1)  h_g(u) = \sum_{i=1}^{n} \delta\bigl(b(x_i) - u\bigr), \quad u = 1, \ldots, B_G

(2)  p_g(x_i, t) = p_{\mathrm{gray}}(Z_i \mid M_g, T)

(3)  G(x) = \left[ \left( \frac{\partial I}{\partial x} \right)^2 + \left( \frac{\partial I}{\partial y} \right)^2 \right]^{1/2}

(4)  \alpha(x) = \tan^{-1}\!\left( I_y / I_x \right)

(5)  \hat{\alpha}(x) = \begin{cases} \alpha(x), & G(x) > \mathrm{Th}, \\ 0, & \text{otherwise}. \end{cases}

(6)  h_e(v) = \sum_{i=1}^{n} \delta\bigl(b^*(x_i) - v\bigr), \quad v = 1, \ldots, B_E

(7)  p_e(x_i, t) = p_{\mathrm{edge}}(Z_i \mid M_e, T)

(8)  D(x, t) = \sum_{l=-1}^{1} \sum_{h=-1}^{1} \bigl| f_t(x+l, y+h) - f_{t-1}(x+l, y+h) \bigr|

(9)  \hat{D}(x, t) = \bigl[ D(x, t) + D(x, t-1) \bigr] / 2

(10)  T = m + k\sigma

(11)  M_D(x, t) = \begin{cases} 1, & \hat{D}(x, t) > T, \\ 0, & \hat{D}(x, t) \le T. \end{cases}

(12)  p_{mg}(x_i, t) = p_g(x_i, t) \, M_D(x_i, t)

(13)  p_{me}(x_i, t) = p_e(x_i, t) \, M_D(x_i, t)

(14)  p_j(x_i, t) \equiv p_j(Z_i \mid M_j, T)

(15)  p(x_i, t) = \sum_{j=1}^{s} \omega_j \, p_j(x_i, t), \quad \sum_{j=1}^{s} \omega_j = 1

(16)  \rho\bigl( h_{\mathrm{mod}}(t-1), h_{\mathrm{tar}}(t-1) \bigr) = \sum_{i=1}^{B} \bigl[ h_{\mathrm{mod}}(i, t-1) \, h_{\mathrm{tar}}(i, t-1) \bigr]^{1/2}

(17)  d\bigl( h_{\mathrm{mod}}(t-1), h_{\mathrm{tar}}(t-1) \bigr) = \bigl[ 1 - \rho\bigl( h_{\mathrm{mod}}(t-1), h_{\mathrm{tar}}(t-1) \bigr) \bigr]^{1/2}

(18)  \hat{\omega}_j(t) = \frac{1}{d_j^2\bigl( h_{\mathrm{mod}}(t-1), h_{\mathrm{tar}}(t-1) \bigr)}, \quad j = 1, \ldots, s

(19)  \omega_j(t) = \frac{\hat{\omega}_j(t)}{\sum_{j=1}^{s} \hat{\omega}_j(t)}, \quad j = 1, \ldots, s

(20)  h_{\mathrm{mod}}(t) = h_{\mathrm{mod}}(t-1) + e^{-\alpha \left[ 1 - \rho\left( h_{\mathrm{mod}}(t-1), h_{\mathrm{tar}}(t-1) \right) \right]} \, h_{\mathrm{tar}}(t-1)

(21)  M_{00}(t) = \sum_i p(x_i, t)

(22)  M_{01}(t) = \sum_i x_i \, p(x_i, t)

(23)  \hat{P} = \frac{M_{01}}{M_{00}} = \frac{\sum_i x_i \, p(x_i, t)}{\sum_i p(x_i, t)}

(24)  P = \hat{P}, \quad s = k \left( M_{00} \right)^{1/2}
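As an illustrative sketch (not the authors' code), the Bhattacharyya similarity, the reliability weights derived from it, and the moment-based target location used in the equations above can be implemented as follows; the histogram shapes and the small numerical guard are assumptions.

```python
import numpy as np

def bhattacharyya(h_mod, h_tar):
    """Bhattacharyya coefficient rho of two normalized histograms and
    the derived distance d = sqrt(1 - rho)."""
    rho = float(np.sqrt(h_mod * h_tar).sum())
    return rho, float(np.sqrt(max(1.0 - rho, 0.0)))

def adaptive_weights(model_hists, target_hists):
    """Per-cue reliability weights: w_j proportional to 1 / d_j^2,
    normalized to sum to 1. A small guard avoids division by zero
    when a cue matches perfectly (d_j = 0)."""
    raw = []
    for h_mod, h_tar in zip(model_hists, target_hists):
        _, d = bhattacharyya(h_mod, h_tar)
        raw.append(1.0 / max(d * d, 1e-12))
    raw = np.asarray(raw)
    return raw / raw.sum()

def centroid(pdm):
    """Target location from image moments of the fused PDM:
    P = M01 / M00, returned as (x, y)."""
    ys, xs = np.mgrid[0:pdm.shape[0], 0:pdm.shape[1]]
    m00 = pdm.sum()
    return np.array([(xs * pdm).sum() / m00, (ys * pdm).sum() / m00])
```

A cue whose model and candidate histograms agree closely thus dominates the fusion, which is the adaptivity the weighting equations express.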
