Abstract

Military sensor applications include tasks such as the surveillance of activity and searching for roadside explosives. These tasks involve identifying and tracking specific objects in a cluttered scene. Unfortunately, the probability of accomplishing these tasks is not predicted by the traditional detect, recognize, and identify (DRI) target acquisition models. The reason why many security and surveillance tasks are functionally different from the traditional DRI tasks is described. Experiments using characters and simple shapes illustrate the problem with using the DRI model to predict the probability of identifying individual objects. The current DRI model is extended to predict specific object identification by including the frequency spectrum content of target contrast. The predictions of the new model match experimental data.

© 2007 Optical Society of America


Figures (24)

Fig. 1

Target set used to establish difficulty of identifying handheld objects. The whole set contains 12 aspects of each of the objects shown.

Fig. 2

DRI target set recommended for field testing thermal imagers. Each row shows three aspects of the tactical vehicle listed at right.

Fig. 3

Plot of PID (ordinate) versus range (abscissa) showing typical, slow falloff in probability with range.

Fig. 4

Plots of the CTF and target contrast (C_TGT) versus spatial frequency. The square root of the ratio of C_TGT to CTF is integrated over all spatial frequencies where C_TGT exceeds CTF.
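As a quick numerical illustration (the values here are hypothetical, chosen only to show how the integrand behaves): at a spatial frequency where $C_{\mathrm{TGT}} = 0.04$ and $\mathrm{CTF}_{\mathrm{sys}} = 0.01$, the integrand contributes $(0.04/0.01)^{1/2} = 2$ to the integral; at a frequency where $C_{\mathrm{TGT}} = 0.008$ falls below $\mathrm{CTF}_{\mathrm{sys}} = 0.01$, that frequency lies outside the integration band and contributes nothing.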

Fig. 5

Probability of detecting a sine wave (ordinate) versus sine-wave amplitude (abscissa). The plot is for an actual threshold modulation of 0.0025. The solid curve shows the ideal response, with zero probability below threshold and certainty above threshold. The dotted curve shows a realistic estimate of the probability.

Fig. 6

Schematic of the conceptual imager used in the analysis. The scene is imaged onto the FPA by a lens, and the FPA output is viewed on a display. The combined blur of the lens and FPA is assumed to be Gaussian. The FPA sample spacing is 1 mrad in both the horizontal and vertical directions.
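Because the blur is assumed to be Gaussian, its effect is easy to express in the frequency domain. The short Python sketch below is illustrative only; the blur width sigma_mrad is a placeholder, not a value taken from the paper.

```python
import numpy as np

def gaussian_mtf(xi_cyc_per_mrad, sigma_mrad):
    """MTF of a Gaussian blur spot with standard deviation sigma_mrad (mrad).

    The Fourier transform of a Gaussian exp(-x^2 / (2 sigma^2)) is
    exp(-2 pi^2 sigma^2 xi^2), with xi in cycles per mrad.
    """
    return np.exp(-2.0 * (np.pi * sigma_mrad * xi_cyc_per_mrad) ** 2)

# With the stated 1 mrad sample spacing, the half-sample (Nyquist) frequency is 0.5 cycles/mrad.
xi = np.linspace(0.0, 0.5, 51)           # spatial frequencies up to the half-sample rate
mtf = gaussian_mtf(xi, sigma_mrad=0.5)   # placeholder blur width of 0.5 mrad
```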

Fig. 7

Two character sets, one on top and one on the bottom. The characters within each set are equally recognizable.

Fig. 8

Construction of the Fig. 7 characters. The line width is 10 m, the character height is 90 m, and the character width is 60 m.

Fig. 9

Character presentation during subject testing. The subject sees one character from each set. The subject selects the appropriate button beneath the character.

Fig. 10

Square, hexagon, octagon, and circle used in shape experiment.

Fig. 11

Presentation of shapes to subjects during the test. The buttons at the bottom left indicate to the subject that, in this instance, the left-hand shape is either a circle or a hexagon; the buttons at the bottom right indicate that the right-hand shape is either a circle or an octagon.

Fig. 12

PID (ordinate) versus DRI metric value (abscissa). Data are shown for both color and monochromatic (mono) displays. The dashed curve is the DRI prediction using the Φ_50 from a least-squares fit. The solid curve illustrates that changing Φ_50 does not bring the model and data into agreement.

Fig. 13

PID (ordinate) versus DRI model metric (abscissa) for discriminating the circle and octagon. The diamonds are data, and the curves are model predictions. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit. The solid curve shows the prediction using a smaller Φ_50.

Fig. 14

PID (ordinate) versus DRI model metric (abscissa) for discriminating the circle and hexagon. The diamonds are data, and the curves are model predictions. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit. The solid curve shows the prediction using a smaller Φ_50.

Fig. 15

PID (ordinate) versus DRI model metric (abscissa) for discriminating the hexagon and octagon. The diamonds are data, and the curves are model predictions. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit. The solid curve shows the prediction using a smaller Φ_50.

Fig. 16

PID (ordinate) versus DRI model metric (abscissa) for discriminating the square and hexagon. The diamonds are data, and the curves are model predictions. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit. The solid curve shows the prediction using a smaller Φ_50.

Fig. 17

Relationship between the feature contrast of the target set and the CTF of the eye for the DRI model. Spatial frequency is in cycles per meter on the target, not cycles per milliradian at the imager. Because the target set contains discrimination cues of all sizes, something is visible at all ranges. The probability of ID depends on imager resolution.

Fig. 18

Illustration that, for a specific target or a set of like targets, discrimination cues have a limited frequency spectrum. The imager must be able to resolve the specific cues. At some range the available cues become blurred, and the probability of ID drops to zero.

Fig. 19

PID (ordinate) versus the metric using the character FFT (abscissa). Data are shown for both color and monochromatic (mono) displays. The dashed curve is the prediction using the Φ_50 from a least-squares fit.

Fig. 20

PID (ordinate) versus the metric using the FFT of the circle (abscissa) for discriminating the circle and octagon. The diamonds are data. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit.

Fig. 21

PID (ordinate) versus the metric using the FFT of the circle (abscissa) for discriminating the circle and hexagon. The diamonds are data. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit.

Fig. 22

PID (ordinate) versus the metric using the FFT of the hexagon (abscissa) for discriminating the hexagon and octagon. The diamonds are data. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit.

Fig. 23

PID (ordinate) versus the metric using the FFT of the square (abscissa) for discriminating the square and hexagon. The diamonds are data. The dashed curve shows the prediction using the Φ_50 that gives the best least-squares fit.

Fig. 24

The FFT of the circle, square, hexagon, or octagon is used to predict metric values; shown here are the minimum and maximum metric values at each range.

Tables (4)

Table 1 Blur and Noise Applied at Each Range in Character Experiment

Table 2 Measured PID for Character Experiment

Table 3 Range and Blur Combinations for Shape Experiment

Table 4 Measured PID for Shape Experiment

Equations (17)

Equations on this page are rendered with MathJax.

P(\mathrm{Rng}) = \dfrac{\text{number correct}}{24\,N_{\mathrm{obs}}}.
V_{50} = \Phi_{50} L_{\mathrm{TGT}},
\mathrm{CTF}_{\mathrm{sys}}(\xi) = \dfrac{\mathrm{CTF}(\xi)}{\mathrm{MTF}(\xi)}.
\mathrm{CTF}_{\mathrm{sys}}^{2}(\xi) = \dfrac{\mathrm{CTF}^{2}(\xi)}{\mathrm{MTF}^{2}(\xi)}\left(1 + \alpha^{2}\sigma^{2}\right).
C_{\mathrm{TGT}\text{-}0} = \dfrac{\left[(\Delta\mu)^{2} + \sigma_{\mathrm{tgt}}^{2}\right]^{1/2}}{2\,\mu_{\mathrm{scene}}}.
\mathrm{TTP} = \int_{\xi_{\mathrm{low}}}^{\xi_{\mathrm{cut}}} \left[\dfrac{C_{\mathrm{TGT}}}{\mathrm{CTF}_{\mathrm{sys}}(\xi)}\right]^{1/2} \mathrm{d}\xi.
\delta\!\left(\dfrac{C_{\mathrm{TGT}}}{\mathrm{CTF}_{\mathrm{sys}}(\xi)}\right) = \begin{cases} 1 & \text{for } C_{\mathrm{TGT}}/\mathrm{CTF}_{\mathrm{sys}}(\xi) \ge 1, \\ 0 & \text{for } C_{\mathrm{TGT}}/\mathrm{CTF}_{\mathrm{sys}}(\xi) < 1. \end{cases}
\mathrm{TTP} = \int \left[\delta\!\left(\dfrac{C_{\mathrm{TGT}}}{\mathrm{CTF}_{\mathrm{sys}}(\xi)}\right)\dfrac{C_{\mathrm{TGT}}}{\mathrm{CTF}_{\mathrm{sys}}(\xi)}\right]^{1/2} \mathrm{d}\xi.
\mathrm{TTP} = \left\{ \int \left[\dfrac{\delta\, C_{\mathrm{TGT}}}{\mathrm{CTF}_{H}(\xi)}\right]^{1/2} \mathrm{d}\xi \int \left[\dfrac{\delta\, C_{\mathrm{TGT}}}{\mathrm{CTF}_{V}(\eta)}\right]^{1/2} \mathrm{d}\eta \right\}^{1/2}.
\mathrm{TTP} = \left[ \iint \dfrac{\delta\, C_{\mathrm{TGT}}(\xi,\eta)}{\left[\mathrm{CTF}_{\mathrm{sys}}(\xi)\,\mathrm{CTF}_{\mathrm{sys}}(\eta)\right]^{1/2}} \,\mathrm{d}\xi\,\mathrm{d}\eta \right]^{1/2}.
\Phi = \dfrac{\mathrm{TTP}}{\mathrm{Rng}},
\Phi = \left[ \dfrac{1}{\mathrm{Rng}^{2}} \iint \dfrac{\delta\, C_{\mathrm{TGT}}(\xi,\eta)}{\left[\mathrm{CTF}_{\mathrm{sys}}(\xi)\,\mathrm{CTF}_{\mathrm{sys}}(\eta)\right]^{1/2}} \,\mathrm{d}\xi\,\mathrm{d}\eta \right]^{1/2}.
\mathrm{PID} = \dfrac{(\Phi/\Phi_{50})^{E}}{1 + (\Phi/\Phi_{50})^{E}},
E = 1.51 + 0.24\,\dfrac{\Phi}{\Phi_{50}}.
a_{i} = \exp\!\left(-\,i^{2}\pi/\mathrm{blur}^{2}\right),
\mathrm{PID} = \dfrac{P(\mathrm{Rng}) - P_{\mathrm{chance}}}{1 - P_{\mathrm{chance}}}.
\Phi = \left[ \dfrac{1}{\mathrm{Rng}^{2}} \iint \dfrac{\delta\, \left|\mathrm{FFT}(\xi,\eta,\mathrm{Rng})\right|}{\left[\mathrm{CTF}_{\mathrm{sys}}(\xi)\,\mathrm{CTF}_{\mathrm{sys}}(\eta)\right]^{1/2}} \,\mathrm{d}\xi\,\mathrm{d}\eta \right]^{1/2}.
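To make the chain from metric to probability concrete, here is a minimal Python sketch of the calculation implied by the equations above. It assumes the two-dimensional target contrast spectrum (or |FFT|) and the two one-dimensional system CTFs have already been sampled on a frequency grid; the function names, array conventions, and sampling details are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def phi_metric(c_tgt, ctf_xi, ctf_eta, d_xi, d_eta, rng):
    """Discrete sketch of the Phi integral: sum the supra-threshold contrast-to-CTF
    ratio over the frequency grid, then take the square root and divide by range.

    c_tgt       : 2D array, target contrast spectrum C_TGT(xi, eta) or |FFT(xi, eta, Rng)|
    ctf_xi      : 1D array, system threshold CTF_sys(xi) (horizontal frequencies)
    ctf_eta     : 1D array, system threshold CTF_sys(eta) (vertical frequencies)
    d_xi, d_eta : frequency sample spacings
    rng         : range, in the units used to define Phi_50
    """
    # Geometric mean of the two one-dimensional thresholds, [CTF_sys(xi) CTF_sys(eta)]^(1/2)
    ctf_2d = np.sqrt(np.outer(ctf_xi, ctf_eta))
    # The delta term keeps only frequencies where the target contrast exceeds threshold
    visible = c_tgt >= ctf_2d
    integral = np.sum(c_tgt[visible] / ctf_2d[visible]) * d_xi * d_eta
    return np.sqrt(integral) / rng  # Phi = [integral / Rng^2]^(1/2)

def pid_from_phi(phi, phi50):
    """Target transfer probability function with the range-dependent exponent E."""
    e = 1.51 + 0.24 * phi / phi50
    r = (phi / phi50) ** e
    return r / (1.0 + r)

def chance_corrected(p_measured, p_chance):
    """Remove the guessing floor from a measured proportion correct."""
    return (p_measured - p_chance) / (1.0 - p_chance)
```

By construction, pid_from_phi returns 0.5 when phi equals phi50, consistent with Φ_50 being the metric value at which half of the identifications succeed.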
