Evaluating retrieval over sessions: the TREC Session Track 2011–2014. Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval

Carterette, Ben, Clough, Paul, Hall, Mark, Kanoulas, Evangelos and Sanderson, Mark (2016) Evaluating retrieval over sessions: the TREC Session Track 2011–2014. Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval. SIGIR 2016, 17/07/2016-21/07/2016, Pisa, Italy, pp. 685-688, ISBN 978-1-4503-4069-4, DOI https://doi.org/10.1145/2911451.2914675.

carteretteetal2016 M Hall.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.

Abstract

Information Retrieval (IR) research has traditionally focused on serving the best results for a single query, so-called ad hoc retrieval. However, users typically search iteratively, refining and reformulating their queries during a session. A key challenge in the study of this interaction is the creation of suitable evaluation resources to assess the effectiveness of IR systems over sessions. This paper describes the TREC Session Track, which ran from 2010 through to 2014 and focused on forming test collections that included various forms of implicit feedback. We describe the test collections; a brief analysis of the differences between datasets over the years; and the evaluation results that demonstrate that the use of user session data significantly improved effectiveness.

Item Type: Conference or Workshop Item (Proceedings)
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Computing and Information Systems
Date Deposited: 23 Nov 2016 14:47
URI: http://repository.edgehill.ac.uk/id/eprint/8281
