Abstract

We developed spatiotemporal fusion techniques for improving target detection and automatic target recognition. We also investigated real infrared (IR) sensor clutter noise. The sensor noise was collected by a 256 × 256 IR sensor viewing various scenes (trees, grass, roads, buildings, etc.). More than 95% of the sensor pixels showed near-stationary clutter noise that was uncorrelated between pixels as well as across time frames. In a few pixels (covering the grass near the road), however, the sensor noise showed nonstationary properties, with a mean that increased or decreased across time frames. The natural noise extracted from the IR sensor, as well as computer-generated noise with Gaussian and Rayleigh distributions, was used to test and compare different spatiotemporal fusion strategies. Finally, we proposed two advanced detection schemes: the double-thresholding and reverse-thresholding techniques. These techniques may be applied to complicated clutter situations (e.g., very high clutter or nonstationary clutter) where the traditional constant-false-alarm-rate (CFAR) technique may fail.
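As a hypothetical illustration (not code from the paper), the double-thresholding idea can be sketched for the scenario of Fig. 16, where the target mean lies midway between two clutter means (m_t − m_c1 = m_c2 − m_t = 2σ): a single threshold cannot reject both clutter types, but a pair of thresholds bracketing the target distribution can. All means, σ, sample sizes, and threshold values below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): double thresholding when the
# target mean lies between two clutter means, as in Fig. 16
# (m_t - m_c1 = m_c2 - m_t = 2*sigma). A single threshold cannot reject
# both clutter types; two thresholds bracketing the target can.
rng = np.random.default_rng(1)
sigma = 1.0
m_t = 0.0                            # target mean (assumed)
m_c1, m_c2 = -2 * sigma, 2 * sigma   # clutter means (assumed, per the 2*sigma relation)
target = rng.normal(m_t, sigma, 100_000)
clutter = np.concatenate([rng.normal(m_c1, sigma, 50_000),
                          rng.normal(m_c2, sigma, 50_000)])

lo, hi = -sigma, sigma               # illustrative double thresholds
pd = np.mean((target > lo) & (target < hi))      # detection probability
pfa = np.mean((clutter > lo) & (clutter < hi))   # false-alarm probability
print(pd, pfa)  # pd ~0.68 vs pfa ~0.16
```

With a single threshold placed anywhere in this scenario, at least one clutter population would largely pass; the bracketing pair keeps the false-alarm probability low while retaining most target samples.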

© 2004 Optical Society of America


References


  1. H.-W. Chen and T. Olson, “Integrated spatiotemporal multiple sensor fusion system design,” in Sensor Fusion: Architectures, Algorithms, and Applications VI, B. V. Dasarathy, ed., Proc. SPIE 4731, 204–215 (2002).
  2. H.-W. Chen and T. Olson, “Adaptive spatiotemporal multiple sensor fusion,” Opt. Eng. 42, 1481–1495 (2003).
  3. A. Mahalanobis, B. V. K. Vijaya Kumar, S. R. F. Sims, and J. Epperson, “Unconstrained correlation filters,” Appl. Opt. 33, 3751–3759 (1994).
  4. A. Mahalanobis, B. V. K. Vijaya Kumar, and S. R. F. Sims, “Distance-classifier correlation filters for multiclass target recognition,” Appl. Opt. 35, 3127–3133 (1996).
  5. S. Haykin, Adaptive Filter Theory (Prentice-Hall, Englewood Cliffs, N.J., 1986).
  6. A. Papoulis, Probability, Random Variables, and Stochastic Processes, 3rd ed. (McGraw-Hill, New York, 1991).
  7. W. B. Davenport, Probability and Random Processes (McGraw-Hill, New York, 1970).
  8. M. I. Skolnik, Radar Handbook (McGraw-Hill, New York, 1970).
  9. E. Waltz and J. Llinas, Multisensor Data Fusion (Artech House, Norwood, Mass., 1990).
  10. L. A. Klein, Sensor and Data Fusion Concepts and Applications, 2nd ed. (SPIE Press, Bellingham, Wash., 1999).




Figures (16)

Fig. 1

Spatiotemporal fusion.

Fig. 2

ROC performances of predetection and postdetection integration. The pdf’s for (a) a single frame and (b) a 25-frame average. (c) Accumulated probability curves for a single frame (solid curves) and a 25-frame average (dashed curves). (d) ROC curves: solid curve (top), predetection averaging 7 frames; dashed curve (bottom), predetection averaging 3 frames; dash-dotted curve (upper middle), postdetection 5 out of 7; dotted curve (lower middle), postdetection 6 out of 7.

Fig. 3

Target detection in a two-target situation: squares, target 1; circles, target 2; triangles, clutter noise.

Fig. 4

Gaussian pdf’s of target versus clutter noise. The arrowed line represents the single threshold.

Fig. 5

Baseline (single-frame) detection ROC performance for (a) IR and (b) RF sensors.

Fig. 6

Additive spatiotemporal fusion (window = 3 frames) for IR one-target (a) and two-target (b) cases and RF one-target (c) and two-target (d) cases.

Fig. 7

Additive spatiotemporal fusion (window = 5 frames) for IR one-target (a) and two-target (b) cases and RF one-target (c) and two-target (d) cases.

Fig. 8

Additive and MIN fusions (window = 5 frames) for IR one-target (a) and two-target (b) cases and RF one-target (c) and two-target (d) cases.

Fig. 9

Persistency test (window = 5 frames) for IR one-target (a) and two-target (b) cases and RF one-target (c) and two-target (d) cases.

Fig. 10

Additive fusion and persistency test (window = 5 frames) for IR one-target (a) and two-target (b) cases and RF one-target (c) and two-target (d) cases.

Fig. 11

Combination of additive fusion and persistency test (window = 5 frames) for IR one-target (a) and two-target (b) cases and RF one-target (c) and two-target (d) cases.

Fig. 12

Autocorrelations of real and computer-generated noise: (a) of the stationary IR noise sequence in (b), (c) of the nonstationary IR noise sequence in (d), and (e) of a computer-generated stationary noise sequence. GWN denotes Gaussian white noise.

Fig. 13

(a) Nonstationary noise sequence before detrending, with its autocorrelation function in (b); im(143,:) denotes the 1D slice cut from the 143rd row of the 2D image. (c) The same noise sequence after detrending, with its autocorrelation function in (d).

Fig. 14

Target detection with real IR sensor noise: (a) single-target detection performance; (b) stationary target noise sequence; (c) two-target detection performance; (d) nonstationary target noise sequence.

Fig. 15

Combination of predetection and postdetection with real IR sensor noise (integration window = 3 frames) for (a) single-target and (b) two-target cases.

Fig. 16

Gaussian pdf’s of multiple clutter types versus the target.

Equations (16)


$$p_t = p_{t1} + p_{t2}, \qquad p_c = p_{c1} + p_{c2},$$

$$p_t = p_{t1} + p_{t2} + \cdots + p_{tn}, \qquad p_c = p_{c1} + p_{c2} + \cdots + p_{cn}.$$

$$f_Z(z) = \int_0^{\infty} f_X(x)\, f_Y(z - x)\, dx.$$

$$f_{p_t}(p_t) = \int_0^{\infty} f_{p_{t1}}(p_{t1})\, f_{p_{t2}}(p_t - p_{t1})\, dp_{t1},$$

$$f_{p_c}(p_c) = \int_0^{\infty} f_{p_{c1}}(p_{c1})\, f_{p_{c2}}(p_c - p_{c1})\, dp_{c1}.$$

$$p_t = p_{t1}\, p_{t2}, \qquad p_c = p_{c1}\, p_{c2}.$$

$$f_Z(z) = \int_0^{\infty} \frac{1}{|x|}\, f_X(x)\, f_Y\!\left(\frac{z}{x}\right) dx.$$

$$f_{p_t}(p_t) = \int_0^{\infty} \frac{1}{|p_{t1}|}\, f_{p_{t1}}(p_{t1})\, f_{p_{t2}}\!\left(\frac{p_t}{p_{t1}}\right) dp_{t1},$$

$$f_{p_c}(p_c) = \int_0^{\infty} \frac{1}{|p_{c1}|}\, f_{p_{c1}}(p_{c1})\, f_{p_{c2}}\!\left(\frac{p_c}{p_{c1}}\right) dp_{c1}.$$

$$\ln p_t = \ln p_{t1} + \ln p_{t2}, \qquad \ln p_c = \ln p_{c1} + \ln p_{c2}.$$

$$f_Z(z) = f_X(z)\,[1 - F_Y(z)] + f_Y(z)\,[1 - F_X(z)],$$

$$f_Z(z) = f_X(z)\, F_Y(z) + f_Y(z)\, F_X(z).$$

$$p_t = \min(p_{t1},\, p_{t2}), \qquad p_c = \min(p_{c1},\, p_{c2}).$$

$$p_t = \max(p_{t1},\, p_{t2}), \qquad p_c = \max(p_{c1},\, p_{c2}).$$

$$P_{k:N} = \sum_{j=k}^{N} \binom{N}{j}\, p^{j} (1 - p)^{N - j},$$

$$m_t - m_{c1} = m_{c2} - m_t = 2\sigma.$$
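Two of the relations above can be sketched numerically (this is an illustration, not code from the paper): the k-out-of-N persistency-test probability P_{k:N}, and a Monte Carlo check of MIN fusion using exponential variables as an assumed example, where the minimum of independent exponentials with rates λ₁ and λ₂ is exponential with rate λ₁ + λ₂.

```python
import numpy as np
from math import comb

# k-out-of-N persistency test: probability of exceeding the threshold in
# at least k of N frames, given per-frame exceedance probability p (P_{k:N}).
def persistency_prob(p, k, N):
    return sum(comb(N, j) * p**j * (1 - p)**(N - j) for j in range(k, N + 1))

print(persistency_prob(0.9, 5, 7))  # detect in >= 5 of 7 frames, ~0.9743

# MIN fusion checked by Monte Carlo with two independent exponential
# variables (scales 1 and 2, i.e., rates 1 and 0.5): their minimum is
# exponential with rate 1.5, so its mean should be ~2/3.
rng = np.random.default_rng(0)
x = rng.exponential(1.0, 200_000)
y = rng.exponential(2.0, 200_000)
z = np.minimum(x, y)
print(z.mean())  # ~0.667
```

The same Monte Carlo pattern can be used to spot-check the additive (convolution) and multiplicative fusion pdf's against histograms of x + y and x * y.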
