On the Behaviour of Information Measures for Test Selection

D. Sent, L.C. van der Gaag

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


    In diagnostic decision-support systems, test selection amounts to selecting, in a sequential way, tests that are expected to yield the largest decrease in the uncertainty about a patient’s diagnosis. To capture this diagnostic uncertainty, an information measure is often used. In this paper, we study the Shannon entropy, the Gini index, and the misclassification error for this purpose. We argue that, over a large range of values, the first derivative of the Gini index can be regarded as an approximation of the first derivative of the Shannon entropy. We also argue that the differences between the derivative functions outside this range can explain the different test sequences observed in practice. We further argue that the misclassification error is less suited for test-selection purposes, as it is likely to show a tendency to select tests arbitrarily. Experimental results obtained with a real-life probabilistic network in oncology support our observations.
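    The three measures compared in the abstract are standard impurity functions over a probability distribution. For the binary case (a diagnosis that holds with probability p), they can be sketched as follows; this is an illustrative sketch of the textbook definitions, not code from the paper:

    ```python
    import math

    def shannon_entropy(p):
        """Shannon entropy (in bits) of the binary distribution (p, 1 - p)."""
        # The convention 0 * log(0) = 0 is handled by skipping zero terms.
        return sum(-q * math.log2(q) for q in (p, 1.0 - p) if q > 0.0)

    def gini_index(p):
        """Gini index 1 - sum_i p_i^2 for the binary distribution (p, 1 - p)."""
        return 1.0 - (p * p + (1.0 - p) ** 2)

    def misclassification_error(p):
        """Probability of error when predicting the most likely diagnosis."""
        return min(p, 1.0 - p)
    ```

    All three measures vanish at p = 0 and p = 1 and peak at p = 0.5 (entropy 1 bit, Gini 0.5, error 0.5); they differ in curvature, which is what drives the derivative-based comparison the paper makes.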
    Original language: Undefined
    Title of host publication: Proceedings of Artificial Intelligence in Medicine Europe (AIME)
    Editors: R. Bellazzi, A. Abu-Hanna, J. Hunter
    Place of publication: Berlin
    Number of pages: 11
    Publication status: Published - 2007
    Event: 11th Conference on Artificial Intelligence in Medicine, AIME 2007 - Amsterdam, Netherlands
    Duration: 7 Jul 2007 - 11 Jul 2007
    Conference number: 11

    Publication series

    Name: Lecture Notes in Artificial Intelligence
    Publisher: Springer Verlag
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349


    Conference: 11th Conference on Artificial Intelligence in Medicine, AIME 2007
    Abbreviated title: AIME


    • misclassification error
    • Shannon entropy
    • Test selection
    • Gini index
