Wolfram Library Archive


Beyond the 2 × 2 contingency table: A primer on entropies and mutual information in various scenarios involving m diagnostic categories and n categories of diagnostic tests

Gilbert Reibnegger
Organization: Medical University of Graz
Department: Institute of Physiological Chemistry, Center of Physiological Medicine
Journal / Anthology

Clinica Chimica Acta
Year: 2013
Volume: 425
Page range: 97-103

Background: Usual evaluation tools for diagnostic tests, such as sensitivity/specificity and ROC analyses, are designed for the discrimination between two diagnostic categories using dichotomous test results. Information theoretical quantities such as mutual information allow in-depth analysis of more complex discrimination problems, including continuous test results, but are rarely used in clinical chemistry. This paper provides a primer on useful information theoretical concepts with a strong focus on typical diagnostic scenarios.

Methods and results: Information theoretical concepts are briefly explained. MATHEMATICA CDF documents are provided which compute entropies and mutual information as functions of pretest probabilities and the distribution of test results among the categories, and allow interactive exploration of the behavior of these quantities in comparison with more conventional diagnostic measures. Using data from a previously published study, the application of information theory to practical diagnostic problems involving up to 4 × 4 contingency tables is demonstrated.

Conclusions: Information theoretical concepts are particularly useful for diagnostic problems requiring more than the usual binary classification. Quantitative test results can be properly analyzed, and in contrast to popular concepts such as ROC analysis, the effects of variations of pre-test probabilities of the diagnostic categories can be explicitly taken into account.
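The quantities central to the abstract, entropies and mutual information computed from an m × n contingency table of diagnostic and test categories, can be sketched in a few lines of Python. This is a hypothetical illustration, not the paper's MATHEMATICA CDF code; the example joint table assumes a pretest probability of 0.5, sensitivity 0.9, and specificity 0.8 for a 2 × 2 case.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """Mutual information I(D; T) in bits from a joint probability table
    joint[i][j] = P(disease category i, test category j); works for any m x n."""
    p_d = [sum(row) for row in joint]        # marginal over disease categories
    p_t = [sum(col) for col in zip(*joint)]  # marginal over test categories
    h_joint = entropy(p for row in joint for p in row)
    return entropy(p_d) + entropy(p_t) - h_joint

# 2 x 2 example: pretest probability 0.5, sensitivity 0.9, specificity 0.8
joint = [[0.45, 0.05],   # diseased:     test positive / test negative
         [0.10, 0.40]]   # non-diseased: test positive / test negative
print(round(mutual_information(joint), 4))  # ≈ 0.3973 bits
```

Because the joint table is built directly from the pretest probabilities, varying them immediately changes the computed mutual information; this makes explicit the dependence on pretest probabilities that the abstract contrasts with ROC analysis.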

*Applied Mathematics > Information Theory

Diagnostic test evaluation, Information theory, Entropy, Mutual information