dc.contributor.author | Chatzichristofis, Savvas A. | |
dc.contributor.author | Zagoris, Konstantinos | |
dc.contributor.author | Arampatzis, Avi | |
dc.date.accessioned | 2017-11-01T09:54:24Z | |
dc.date.available | 2017-11-01T09:54:24Z | |
dc.date.issued | 2011 | |
dc.identifier.isbn | 978-1-4503-0757-4 | |
dc.identifier.uri | http://hdl.handle.net/11728/10189 | |
dc.description.abstract | Traditional tools for information retrieval (IR) evaluation, such as
TREC’s trec_eval, have outdated command-line interfaces with
many unused features, or ‘switches’, accumulated over the years.
They are usually seen as cumbersome applications by new IR researchers,
steepening the learning curve. We introduce a platform-independent
application for IR evaluation with a graphical easy-to-use
interface: the TREC_Files Evaluator. The application supports
most of the standard measures used for evaluation in TREC, CLEF,
and elsewhere, such as MAP, P10, P20, and bpref, as well as the Averaged
Normalized Modified Retrieval Rank (ANMRR) proposed
by MPEG for image retrieval evaluation. Additional features include
a batch mode and statistical significance testing of the results
against a pre-selected baseline. | en_UK |
dc.language.iso | en | en_UK |
dc.publisher | Association for Computing Machinery (ACM), United States | en_UK |
dc.relation.ispartofseries | SIGIR '11: Proceedings of the 34th International ACM SIGIR Conference on Research and Development in Information Retrieval, Beijing, China, July 24-28, 2011 | |
dc.rights | Copyright is held by the author/owner(s). | en_UK |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | en_UK |
dc.subject | Research Subject Categories::TECHNOLOGY | en_UK |
dc.subject | TREC files | en_UK |
dc.subject | Evaluation Measurements | en_UK |
dc.title | The TREC files: the (ground) truth is out there | en_UK |
dc.type | Working Paper | en_UK |
dc.doi | 10.1145/2009916.2010164 | en_UK |