Abstract

Upper and lower bounds are derived for the area under the receiver-operating-characteristic (ROC) curve of binary hypothesis testing. These results are compared with the area-under-the-curve (AUC) approximation and the AUC lower bound recently reported by Barrett et al. [J. Opt. Soc. Am. A 15, 1520 (1998)].

© 1999 Optical Society of America

References

  1. H. H. Barrett, C. K. Abbey, E. Clarkson, “Objective assessment of image quality. III. ROC metrics, ideal observers, and likelihood-generating functions,” J. Opt. Soc. Am. A 15, 1520–1535 (1998).
  2. H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I (Wiley, New York, 1968), Sec. 2.6.
  3. S. Kotz, N. L. Johnson, eds. in chief, Encyclopedia of Statistical Sciences, Vol. 5 (Wiley, New York, 1985), pp. 176–181.
  4. The upper bound in Eq. (8) is the familiar Bhattacharyya bound; see, e.g., H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I (Wiley, New York, 1968), Sec. 2.7; the lower bound, which is less well known, is derived in Appendix B.
  5. Our results will still be valid if Λ is a discrete or mixed random variable, but the derivation will require inclusion of randomized tests.
  6. J. H. Shapiro, B. A. Capron, R. C. Harney, “Imaging and target detection with a heterodyne-reception optical radar,” Appl. Opt. 20, 3292–3313 (1981).
  7. The task of evaluating Pe can be avoided—at the expense of some weakening of our AUC bounds—by using upper and lower bounds on the error probability found from the conditional semi-invariant moment-generating function of the likelihood ratio under hypothesis H0. For a recent discussion of such Pe bounds see M. V. Burnashev, “On one useful inequality in testing of hypotheses,” IEEE Trans. Inf. Theory 44, 1668–1670 (1998). In most cases, these bounds will be appreciably tighter than the Bhattacharyya-distance results given in Eq. (8).
  8. H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I (Wiley, New York, 1968), p. 137.

Figures (3)

Fig. 1

Receiver operating characteristic (ROC) example and geometry for obtaining AUC bounds: solid curve, ROC for the exponential-statistics example from Section 3 with CNR=5; circle, η=1 ROC point; dashed line, ROC tangent line at the η=1 ROC point; dotted–dashed lines, chords from the points (0, 0) and (1, 1) to the η=1 ROC point. ROC convexity ensures that the area under the dashed line is an upper bound on the AUC and the area under the dotted–dashed lines is a lower bound on the AUC.
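For readers who want to reproduce this construction numerically, the following minimal Python sketch (ours, not the article's; variable names are illustrative) locates the η=1 point of the exponential-statistics ROC, P_D = P_F^{1/(1+CNR)}, and evaluates the chord and tangent areas that give the lower and upper AUC bounds:

    # Exponential-statistics ROC from Section 3: P_D = P_F**(1/(1+CNR)).
    CNR = 5.0
    a = 1.0 / (1.0 + CNR)                      # ROC exponent

    # The ROC slope dP_D/dP_F equals the threshold eta, so the eta = 1 point
    # satisfies a * P_F**(a - 1) = 1.
    PF_star = (1.0 + CNR) ** (-(1.0 + CNR) / CNR)
    PD_star = PF_star ** a

    # Minimum error probability (equally likely hypotheses) at the eta = 1 point.
    Pe = PF_star / 2.0 + (1.0 - PD_star) / 2.0

    # Area under the chords (0,0)->(PF*,PD*)->(1,1): the lower bound 1 - Pe.
    auc_lower = (PD_star * PF_star / 2.0
                 + (1.0 - PD_star) * (1.0 - PF_star) / 2.0
                 + PD_star * (1.0 - PF_star))

    # Area under the unit-slope tangent at the eta = 1 point: the upper bound 1 - 2*Pe**2.
    auc_upper = 1.0 - 2.0 * Pe ** 2

    auc_exact = (1.0 + CNR) / (2.0 + CNR)
    print(auc_lower, auc_exact, auc_upper)     # lower <= exact <= upper

At CNR=5 this gives roughly 0.79 ≤ AUC ≤ 0.91 around the exact value (1+CNR)/(2+CNR) ≈ 0.857, matching the geometry sketched in the figure.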

Fig. 2

AUC-versus-CNR curves for the exponential-statistics example: solid curve, exact result; dashed curve, our upper bound; dotted–dashed curve, our lower bound; dotted curve, Barrett et al.1 lower bound.

Fig. 3

AUC-versus-CNR curves for the exponential-statistics example: solid curve, exact result; dashed curve, our upper bound; dotted–dashed curve, composition of our lower bound with that of Barrett et al.1; dotted curve, Barrett et al. approximation.
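The bound curves in Figs. 2 and 3 can be regenerated from the closed-form expressions for the exponential-statistics example listed under Equations below. A short Python sketch (ours, not the article's; the Barrett et al. approximation curve is omitted here) tabulates the exact AUC, our two bounds, the Barrett et al. lower bound, and the composite lower bound versus CNR:

    import math

    def auc_quantities(cnr):
        """Exact AUC and bound curves for the exponential-statistics example."""
        auc_exact = (1.0 + cnr) / (2.0 + cnr)
        # Minimum error probability and Bhattacharyya distance for this example.
        pe = 0.5 * (1.0 - (1.0 / (1.0 + cnr)) ** (1.0 / cnr) * cnr / (1.0 + cnr))
        d_b = math.log((2.0 + cnr) / (2.0 * math.sqrt(1.0 + cnr)))
        lower_ours = 1.0 - pe                       # AUC >= 1 - Pe
        upper_ours = 1.0 - 2.0 * pe ** 2            # AUC <= 1 - 2 Pe^2
        lower_barrett = 1.0 - 0.5 * math.exp(-2.0 * d_b)
        lower_composite = max(lower_ours, lower_barrett)
        return auc_exact, lower_ours, upper_ours, lower_barrett, lower_composite

    for cnr in (0.5, 1.0, 2.0, 5.0, 10.0, 20.0):
        print(cnr, [round(v, 4) for v in auc_quantities(cnr)])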

Equations (28)

(1)  \Lambda(X) \equiv \frac{p_{x|H_1}(X|H_1)}{p_{x|H_0}(X|H_0)} \;\overset{\text{say }H_1}{\underset{\text{say }H_0}{\gtrless}}\; \eta,
(2)  \mathrm{AUC} \equiv \int_0^1 P_D \, \mathrm{d}P_F \ge 1 - \exp(-2d_B)/2,
(3)  \mathrm{AUC} \approx 1 - Q\!\left(2\sqrt{d_B}\right),
(4)  d_B \equiv -\ln \int \mathrm{d}X \, [\,p_{x|H_0}(X|H_0)\, p_{x|H_1}(X|H_1)\,]^{1/2},
(5)  Q(x) \equiv \int_x^{\infty} \mathrm{d}t \, \frac{\exp(-t^2/2)}{\sqrt{2\pi}}.
(6)  P_e = Q(d_M/2),
(7)  d_M = [\,(\mathbf{m}_1 - \mathbf{m}_0)^{\mathrm{T}} \boldsymbol{\Sigma}^{-1} (\mathbf{m}_1 - \mathbf{m}_0)\,]^{1/2},
(8)  \{1 - [1 - \exp(-2d_B)]^{1/2}\}/2 \le P_e \le \exp(-d_B)/2.
(9)  P_D = \int_{\eta}^{\infty} \mathrm{d}\lambda \, p_{\Lambda|H_1}(\lambda|H_1),
(10)  P_F = \int_{\eta}^{\infty} \mathrm{d}\lambda \, p_{\Lambda|H_0}(\lambda|H_0).
(11)  \frac{\mathrm{d}P_D}{\mathrm{d}P_F} = \frac{\mathrm{d}P_D/\mathrm{d}\eta}{\mathrm{d}P_F/\mathrm{d}\eta} = \frac{-p_{\Lambda|H_1}(\eta|H_1)}{-p_{\Lambda|H_0}(\eta|H_0)} = \eta.
(12)  1 - P_e \le \mathrm{AUC} \le 1 - 2P_e^2,
(13)  P_e = 2^{-1} - 4^{-1} \int \mathrm{d}X \, \big|\, p_{x|H_1}(X|H_1) - p_{x|H_0}(X|H_0) \,\big|.
(14)  1 - \exp(-d_B)/2 \le \mathrm{AUC} \le 1 - \big[\,1 - \sqrt{1 - \exp(-2d_B)}\,\big]^2/2.
(15)  P_D = P_F^{1/(1+\mathrm{CNR})} \quad \text{for } 0 \le P_F \le 1,
(16)  \mathrm{AUC} = \frac{1 + \mathrm{CNR}}{2 + \mathrm{CNR}}.
(17)  d_B = \ln\!\left(\frac{2 + \mathrm{CNR}}{2\sqrt{1 + \mathrm{CNR}}}\right),
(18)  P_e = \frac{1}{2}\left[\,1 - \left(\frac{1}{1 + \mathrm{CNR}}\right)^{1/\mathrm{CNR}} \frac{\mathrm{CNR}}{1 + \mathrm{CNR}}\,\right].
(19)  P_e \le \sqrt{(1 - \mathrm{AUC})/2} \le \exp(-d_B)/2,
(20)  P_e \le \sqrt{(1 - \mathrm{AUC})/2} \le \sqrt{P_e/2}.
(21)  \mathrm{AUC} \ge \max\!\big[\,1 - P_e,\; 1 - \exp(-2d_B)/2\,\big].
(22)  Q(d_M/2) \le \sqrt{(1 - \mathrm{AUC})/2} \le \exp(-d_M^2/8)/2.
(23)  Q(x) > \frac{\exp(-x^2/2)}{\sqrt{2\pi}\, x}\left(1 - \frac{1}{x^2}\right) \quad \text{for } x > 0,
(24)  P_e = P_F^*/2 + (1 - P_D^*)/2.
(25)  [\,1 - P_D(0)\,]\, P_F(1)/2 = [\,P_F^* + (1 - P_D^*)\,]^2/2 = 2P_e^2,
(26)  \mathrm{AUC} \ge P_D^* P_F^*/2 + (1 - P_D^*)(1 - P_F^*)/2 + P_D^*(1 - P_F^*) = 1 - [\,P_F^*/2 + (1 - P_D^*)/2\,] = 1 - P_e,
(27)  2^{-1} - 4^{-1} \int \mathrm{d}X \, \big| p_{x|H_1}(X|H_1) - p_{x|H_0}(X|H_0) \big| = 2^{-1} - 4^{-1} \int_{Z_1} \mathrm{d}X \, [\,p_{x|H_1}(X|H_1) - p_{x|H_0}(X|H_0)\,] - 4^{-1} \int_{Z_0} \mathrm{d}X \, [\,p_{x|H_0}(X|H_0) - p_{x|H_1}(X|H_1)\,] = 2^{-1} \int_{Z_0} \mathrm{d}X \, p_{x|H_1}(X|H_1) + 2^{-1} \int_{Z_1} \mathrm{d}X \, p_{x|H_0}(X|H_0) = P_e.
(28)  \left\{\int \mathrm{d}X \, \big| p_{x|H_1}(X|H_1) - p_{x|H_0}(X|H_0) \big|\right\}^2 \le \int \mathrm{d}X \, \big[\sqrt{p_{x|H_1}(X|H_1)} - \sqrt{p_{x|H_0}(X|H_0)}\,\big]^2 \times \int \mathrm{d}X \, \big[\sqrt{p_{x|H_1}(X|H_1)} + \sqrt{p_{x|H_0}(X|H_0)}\,\big]^2 = [\,2 - 2\exp(-d_B)\,][\,2 + 2\exp(-d_B)\,] = 4[\,1 - \exp(-2d_B)\,].
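As a quick consistency check (ours, not part of the original equation list), Eq. (16) follows from Eq. (15) by direct integration of the ROC:

\mathrm{AUC} = \int_0^1 P_F^{\,1/(1+\mathrm{CNR})} \, \mathrm{d}P_F = \frac{1}{1 + 1/(1+\mathrm{CNR})} = \frac{1+\mathrm{CNR}}{2+\mathrm{CNR}}.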
