Abstract
tests that are expected to yield the largest decrease in the uncertainty about a patient’s diagnosis. Diagnostic uncertainty is often captured by an information measure. In this paper, we study the Shannon entropy, the Gini
index, and the misclassification error for this purpose. We argue that, over a large range of values, the first derivative of the Gini index can be regarded as an approximation of the first derivative of the Shannon entropy, and that the differences between the two derivative functions outside this range can explain the different test sequences observed in practice. We further argue that the misclassification error is less suited for test selection, as it tends to select tests arbitrarily. Experimental results obtained with a real-life probabilistic network in oncology support our observations.
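To make the comparison concrete, the three impurity measures mentioned in the abstract can be sketched for a binary class distribution (p, 1 − p). This is a minimal illustration, not code from the paper; the function names and the numerical-derivative helper are our own.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    q = 1.0 - p
    return -(p * np.log2(p) + q * np.log2(q))

def gini_index(p):
    """Gini index G(p) = 2*p*(1-p) for two classes."""
    return 2.0 * p * (1.0 - p)

def misclassification_error(p):
    """Misclassification error E(p) = min(p, 1-p)."""
    return min(p, 1.0 - p)

def derivative(f, p, h=1e-6):
    """Central-difference approximation of the first derivative of f at p."""
    return (f(p + h) - f(p - h)) / (2.0 * h)

# Near p = 0.5, both H and G attain their maximum, so both derivatives
# vanish there; away from the extremes they remain roughly proportional,
# which is the regime the abstract's approximation argument concerns.
for p in (0.35, 0.5, 0.65):
    print(p, derivative(shannon_entropy, p), derivative(gini_index, p))
```

Note that E(p) is piecewise linear with a kink at p = 0.5, whereas H and G are strictly concave; this flatness is what allows the misclassification error to rank many candidate tests equally and hence to break ties arbitrarily.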
Original language | Undefined |
---|---|
Title of host publication | Proceedings of Artificial Intelligence in Medicine Europe (AIME) |
Editors | R Bellazzi, A Abu-Hanna, J Hunter |
Place of Publication | Berlin |
Publisher | Springer |
Pages | 316-326 |
Number of pages | 11 |
Publication status | Published - 2007 |
Event | 11th Conference on Artificial Intelligence in Medicine, AIME 2007, Amsterdam, Netherlands; 7 Jul 2007 → 11 Jul 2007; Conference number: 11 |
Publication series
Name | Lecture Notes in Artificial Intelligence |
---|---|
Publisher | Springer Verlag |
Number | LNCS4549 |
Volume | 4594 |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Conference
Conference | 11th Conference on Artificial Intelligence in Medicine, AIME 2007 |
---|---|
Abbreviated title | AIME |
Country/Territory | Netherlands |
City | Amsterdam |
Period | 7/07/07 → 11/07/07 |
Keywords
- EWI-10747
- METIS-241770
- IR-61840
- misclassification error
- Shannon entropy
- Test selection
- Gini index