Abstract
This paper reviews several measures developed for evaluating XML information retrieval (IR) systems. We argue that these measures, some of which are currently in use by the INitiative for the Evaluation of XML Retrieval (INEX), are complicated, hard to understand, and hard to explain to users of XML IR systems. To show the value of keeping things simple, we report alternative evaluation results for the official runs submitted to INEX 2004 using simple metrics, and demonstrate their value for INEX.
Original language | English |
---|---|
Title of host publication | Proceedings of the INEX 2005 Workshop on Element Retrieval Methodology |
Subtitle of host publication | Held at the University of Glasgow, 30 July 2005 |
Editors | Andrew Trotman, Mounia Lalmas, Norbert Fuhr |
Place of Publication | Otago, New Zealand |
Publisher | University of Otago |
Pages | 6-13 |
Number of pages | 8 |
ISBN (Print) | 0-473-10228-5 |
Publication status | Published - 30 Jul 2005 |
Event | 4th International Workshop of the Initiative for the Evaluation of XML Retrieval, INEX 2005 - University of Glasgow, Glasgow, Scotland, United Kingdom<br>Duration: 30 Jul 2005 → 30 Jul 2005 |
Workshop
Workshop | 4th International Workshop of the Initiative for the Evaluation of XML Retrieval, INEX 2005 |
---|---|
Abbreviated title | INEX 2005 |
Country/Territory | United Kingdom |
City | Glasgow, Scotland |
Period | 30/07/05 → 30/07/05 |
Keywords
- IR-66444
- DB-XMLIR: XML INFORMATION RETRIEVAL
- EWI-7257
- METIS-225877