Abstract

An improved infrared (IR) target-tracking algorithm based on mean shift is proposed, combining a mean-shift gradient-matched search strategy with a feature-classification-based tracking algorithm. An improved target representation model is constructed by using the likelihood ratio of the gray-level features of the target and its local background as a weight on the target region's original kernel histogram. An expression for the mean-shift vector under this model is derived, and a criterion for updating the model is presented. Experimental results show that the algorithm increases the shift weight of target pixels and suppresses background disturbance.
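The weighting idea in the abstract can be sketched in a few lines of NumPy: build a spatially weighted (kernel) gray-level histogram of the target region, estimate gray-level densities for the target and its local background, and multiply the histogram by their likelihood ratio. This is an illustrative sketch, not the paper's implementation; the bin count, the Epanechnikov profile, and the smoothing constant `eps` are assumptions.

```python
import numpy as np

def epanechnikov(r2):
    # Epanechnikov kernel profile k(r^2); zero outside the unit radius.
    return np.where(r2 < 1.0, 1.0 - r2, 0.0)

def kernel_histogram(patch, center, h, n_bins=32):
    # Spatially weighted gray-level histogram of a target patch:
    # pixels near the center contribute more than pixels at the edge.
    ys, xs = np.indices(patch.shape)
    r2 = ((ys - center[0]) ** 2 + (xs - center[1]) ** 2) / h ** 2
    w = epanechnikov(r2).ravel()
    bins = (patch.ravel() * n_bins // 256).astype(int)
    hist = np.bincount(bins, weights=w, minlength=n_bins)
    return hist / max(hist.sum(), 1e-12)

def likelihood_weighted_model(target, background, center, h,
                              n_bins=32, eps=1e-6):
    # Weight each histogram bin by the likelihood ratio of that gray
    # level appearing in the target versus the local background,
    # then renormalize -- the weighting idea described in the abstract.
    q = kernel_histogram(target, center, h, n_bins)
    p_obj, _ = np.histogram(target, bins=n_bins, range=(0, 256), density=True)
    p_bg, _ = np.histogram(background, bins=n_bins, range=(0, 256), density=True)
    L = (p_obj + eps) / (p_bg + eps)
    model = L * q
    return model / max(model.sum(), 1e-12)
```

Bins dominated by background gray levels get a small likelihood ratio and are suppressed, which is the intended effect on background disturbance.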

© 2012 Optical Society of America


References


  1. D. Comaniciu, V. Ramesh, and P. Meer, “Kernel-based object tracking,” IEEE Trans. Pattern Anal. Mach. Intell. 25, 564–577 (2003).
  2. Z. S. Sun, J. X. Sun, J. Z. Song, and S. Qiao, “Anti-occlusion arithmetic for moving object tracking,” Opt. Precision Eng. 15, 267–271 (2007).
  3. J. Cheng and J. Yang, “Novel infrared object tracking method based on mean shift,” J. Infrared Millim. Waves 24, 231–235 (2005).
  4. A. Yilmaz, K. Shafique, and M. Shah, “Target tracking in airborne forward looking infrared imagery,” Image Vis. Comput. 21, 623–635 (2003).
  5. R. T. Collins, Y. X. Liu, and M. Leordeanu, “Online selection of discriminative tracking features,” IEEE Trans. Pattern Anal. Mach. Intell. 27, 1631–1643 (2005).
  6. H. T. Chen, T. L. Liu, and C. S. Fuh, “Probabilistic tracking with adaptive feature selection,” in Proceedings of IEEE Conference on Pattern Recognition (IEEE, 2004), pp. 736–739.
  7. S. Avidan, “Ensemble tracking,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2005), pp. 494–501.
  8. Z. Yin and R. Collins, “Spatial divide and conquer with motion cues for tracking through clutter,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2006), pp. 570–577.
  9. H. Wang, M.-W. Ren, and J.-Y. Yang, “New infrared object tracking algorithm based on smog model,” J. Infrared Millim. Waves 27, 252–256 (2008).
  10. J. Q. Wang and Y. Yagi, “Integrating color and shape-texture features for adaptive real-time object tracking,” IEEE Trans. Image Process. 17, 235–240 (2008).
  11. J. S. Shaik and K. M. Iftekharuddin, “Automated tracking and classification of infrared images,” in Proceedings of IEEE Conference on Neural Networks (IEEE, 2003), pp. 1201–1206.
  12. W. Ketchantang, S. Derrode, L. Martin, and S. Bourennane, “Pearson-based mixture model for color object tracking,” Machine Vis. Appl. 19, 457–466 (2008).
  13. E. Maggio and A. Cavallaro, “Hybrid particle filter and mean shift tracker with adaptive transition model,” in Proceedings of IEEE Conference on Acoustics, Speech and Signal Processing (IEEE, 2005), pp. 221–224.
  14. W. He, X. Zhao, and L. Zhang, “Online feature extraction and selection for object tracking,” in Proceedings of IEEE Conference on Mechatronics and Automation (IEEE, 2007), pp. 3497–3502.
  15. H. Z. Wang, D. Suter, K. Schindler, and C. Shen, “Adaptive object tracking based on an effective appearance filter,” IEEE Trans. Pattern Anal. Mach. Intell. 29, 1661–1667 (2007).


Figures (9)

Fig. 1.

Target and local background region.

Fig. 2.

Updating scheme of the target model.

Fig. 3.

Images of (a) the target region and (b) the candidate target region.

Fig. 4.

Shift-weighted images of the candidate target region with (a) the local gray-mean feature, (b) the local standard deviation feature, and (c) these two features fused, obtained using the method proposed by Yilmaz et al. [4].

Fig. 5.

Shift-weighted images of the candidate target region with (a) the local gray-mean feature, (b) the local standard deviation feature, and (c) these two features fused, obtained using our proposed algorithm according to Eq. (16).

Fig. 6.

Tracking results for the cyclist target. The dashed circle is the tracking result of our proposed algorithm. The solid circle is the tracking result of the algorithm proposed by Comaniciu et al. [1]. The solid square with rounded corners is the tracking result of the algorithm proposed by Yilmaz et al. [4]. The images in the first row, from left to right, correspond to the 2nd, 8th, 19th, and 25th frames. The images in the second row, from left to right, correspond to the 41st, 48th, 61st, and 80th frames.

Fig. 7.

Bhattacharyya coefficient curves of the target model. The dashed curve depicts the Bhattacharyya coefficient between the target model in the first frame (the reference frame) and the likelihood ratio weighted kernel histogram model of the target region. The solid curve depicts the Bhattacharyya coefficient between the target model obtained by adaptive updating and the likelihood ratio weighted kernel histogram model of the target region.

Fig. 8.

Tracking results for an entire moving car using our proposed algorithm. The images in the first row, from left to right, correspond to the 3rd, 18th, 30th, and 44th frames. The images in the second row, from left to right, correspond to the 52nd, 57th, 65th, and 76th frames.

Fig. 9.

Tracking results for part of a moving car using our proposed algorithm. The images in the first row, from left to right, correspond to the 2nd, 40th, 50th, and 57th frames. The images in the second row, from left to right, correspond to the 59th, 80th, 90th, and 96th frames.

Equations (19)


(1) $q_u(y^*) = C \sum_{j=1}^{n} k\!\left(\left\|\frac{y^* - x_j^*}{h}\right\|^2\right) \delta[b(x_j^*) - u]$,

(2) $u(x) = [f(x), g(x)]$,

(3) $p_{\mathrm{obj}}(u \mid T) = \sum_{i=1}^{M} \omega_i^{\mathrm{obj}}\, p_{\mathrm{obj}}(u \mid i)$,

(4) $p_{\mathrm{obj}}(u \mid i) = \dfrac{\exp\!\left[-0.5\,(u - \mu_i^{\mathrm{obj}})^{\mathrm{T}} \Sigma_i^{-1} (u - \mu_i^{\mathrm{obj}})\right]}{(2\pi)^{0.5d}\, |\Sigma_i^{\mathrm{obj}}|^{0.5}}$,

(5) $p_{\mathrm{bg}}(u \mid B) = \sum_{j=1}^{N} \omega_j^{\mathrm{bg}}\, p_{\mathrm{bg}}(u \mid j)$,

(6) $L_u = \dfrac{p_{\mathrm{obj}}(u \mid T)}{p_{\mathrm{bg}}(u \mid B)}$,

(7) $\hat{q}_u(y^*) = C_T\, L_{y^*}(u)\, \phi_u(y^*)$,

(8) $\phi_u(y^*) = C_Q \sum_{j=1}^{n} k_1\!\left(\left\|\frac{u(x_j^*) - u}{h_1}\right\|^2\right) k_2\!\left(\left\|\frac{y^* - x_j^*}{h_2}\right\|^2\right)$,

(9) $\hat{p}_u(y) = C_C\, L_y(u)\, \phi_u(y)$,

(10) $\phi_u(y) = C_P \sum_{i=1}^{m} k_1\!\left(\left\|\frac{u(x_i) - u}{h_1}\right\|^2\right) k_2\!\left(\left\|\frac{y - x_i}{h_2}\right\|^2\right)$,

(11) $s(y) = 1 - \rho[\hat{q}(y^*), \hat{p}(y)]$,

(12) $\rho(y) = \rho[\hat{q}(y^*), \hat{p}(y)] = \sum_u \sqrt{\hat{q}_u(y^*)\, \hat{p}_u(y)}$.

(13) $\rho(y) \approx 0.5 \sum_u [\hat{q}_u(y^*)\, \hat{p}_u(y_0)]^{0.5} + 0.5 \sum_u \hat{p}_u(y)\, [\hat{q}_u(y^*)/\hat{p}_u(y_0)]^{0.5}$.

(14) $\rho(y) \approx 0.5 \sum_u [\hat{q}_u(y^*)\, \hat{p}_u(y_0)]^{0.5} + 0.5\, C_P (C_C C_T)^{0.5} \sum_{i=1}^{m} w_i\, k_2\!\left(\left\|\frac{y - x_i}{h_2}\right\|^2\right)$,

(15) $w_i = \sum_u [L_{y^*}(u)\, L_y(u)]^{0.5}\, [\phi_u(y^*)/\hat{p}_u(y_0)]^{0.5}\, k_1\!\left(\left\|\frac{u(x_i) - u}{h_1}\right\|^2\right)$.

(16) $w_i = \sum_u L_{y^*}(u)\, [\phi_u(y^*)/\phi_u(y_0)]^{0.5}\, k_1\!\left(\left\|\frac{u(x_i) - u}{h_1}\right\|^2\right)$.

(17) $y_1 = \sum_{i=1}^{m} x_i w_i \Big/ \sum_{i=1}^{m} w_i$.

(18) $T_{BC}^0 = -\sum_u r_u^0 \log r_u^0$,

(19) $r_u^0 = \min[p_{\mathrm{obj}}^0(u),\, p_{\mathrm{bg}}^0(u)]$.
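The shift weights $w_i$ drive the standard mean-shift iteration toward the new target location via the centroid update $y_1 = \sum_i x_i w_i / \sum_i w_i$ above. The sketch below shows that update iterated to convergence in plain NumPy; the weight function passed in and the pixel convergence tolerance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mean_shift_step(coords, weights):
    # One mean-shift update: the new center is the weight-normalized
    # centroid of the candidate-region pixels, y1 = sum(x_i w_i) / sum(w_i).
    x = np.asarray(coords, dtype=float)   # (N, 2) pixel coordinates
    w = np.asarray(weights, dtype=float)  # (N,) shift weights
    return (x * w[:, None]).sum(axis=0) / w.sum()

def track(weight_fn, y0, coords, max_iter=20, tol=0.5):
    # Iterate the centroid update until the shift falls below tol pixels.
    # weight_fn maps the current center estimate y to per-pixel weights;
    # in the paper's setting it would evaluate the shift weights w_i
    # at the candidate region around y.
    y = np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        y_new = mean_shift_step(coords, weight_fn(y))
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y
```

With weights that peak on the target and vanish on the background, the iteration climbs the weighted density and settles on the target center.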
