Abstract

One of the challenges of modern information retrieval is to rank the most relevant documents at the top of the typically large system output. This calls for choosing appropriate methods to evaluate system performance. Traditional performance measures, such as precision and recall, are based on binary relevance judgments and are not appropriate for multi-grade relevance. The main objective of this paper is to propose a framework for system evaluation based on user preference of documents. It is shown that the notion of user preference is general and flexible enough for formally defining and interpreting multi-grade relevance. We review 12 evaluation methods and compare their similarities and differences. We find that the normalized distance performance measure is a good choice in terms of sensitivity to document rank order, giving higher credit to systems for their ability to retrieve highly relevant documents.
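The abstract's point that binary measures cannot reward retrieval of highly relevant documents can be illustrated with a minimal sketch. The function names, document labels, and relevance grades below are illustrative assumptions, not taken from the paper: two rankings that differ in where they place a highly relevant document receive identical precision and recall once grades are collapsed to binary relevance.

```python
# Minimal sketch (not the paper's method): standard precision and recall
# treat relevance as binary, so "highly relevant" and "marginally relevant"
# documents count the same.

def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k retrieved documents that are relevant."""
    return sum(1 for d in ranked[:k] if d in relevant) / k

def recall_at_k(ranked, relevant, k):
    """Fraction of all relevant documents found in the top k."""
    return sum(1 for d in ranked[:k] if d in relevant) / len(relevant)

# Hypothetical graded judgments: 2 = highly relevant, 1 = marginal, 0 = not.
grades = {"d1": 2, "d2": 1, "d3": 0, "d4": 2}
relevant = {d for d, g in grades.items() if g > 0}  # binarized judgments

run_a = ["d1", "d4", "d2", "d3"]  # highly relevant documents ranked first
run_b = ["d2", "d1", "d4", "d3"]  # marginal document ranked first

# Both runs score identically under the binary measures,
# even though run_a is clearly preferable under graded relevance.
print(precision_at_k(run_a, relevant, 3), precision_at_k(run_b, relevant, 3))
print(recall_at_k(run_a, relevant, 3), recall_at_k(run_b, relevant, 3))
```

This is exactly the gap that rank-order-sensitive measures such as the normalized distance performance measure discussed in the paper are meant to close.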

Details

Title
Evaluating information retrieval system performance based on user preference
Author
Zhou, Bing; Yao, Yiyu
Pages
227-248
Publication year
2010
Publication date
Jun 2010
Publisher
Springer Nature B.V.
ISSN
0925-9902
e-ISSN
1573-7675
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
200196104
Copyright
Springer Science+Business Media, LLC 2010