Toward Eye Gaze Enhanced Information Retrieval Relevance Feedback
Abstract
Information Retrieval (IR) is dedicated to retrieving documents relevant to a user's query. The literature in this field shows that gathering relevance judgments from the user on the documents retrieved by the IR system increases the overall quality of the system. This relevance information is processed to refine the user's initial query, in a process called Relevance Feedback.
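As an illustration only (the specific formulation used in this work is not stated here), a classical instantiation of relevance feedback is the Rocchio scheme, which refines the original query vector with the centroids of the documents judged relevant and non-relevant:

\vec{q}_{new} = \alpha\,\vec{q}_0 + \frac{\beta}{|D_r|} \sum_{\vec{d} \in D_r} \vec{d} \; - \; \frac{\gamma}{|D_{nr}|} \sum_{\vec{d} \in D_{nr}} \vec{d}

where D_r and D_{nr} denote the sets of documents the user marked as relevant and non-relevant, and \alpha, \beta, \gamma are tuning weights.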
Since it is cumbersome and time-consuming for the user to provide such information explicitly, our hypothesis is that eye gaze information could be used to implicitly estimate the user's interests, and thus support the relevance feedback mechanism.
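A minimal sketch of this idea, assuming fixations have already been mapped to snippet areas of interest and using a hypothetical gaze-time threshold (the function name, threshold value, and data below are illustrative assumptions, not part of this work):

```python
from collections import defaultdict

def implicit_relevance(fixations, min_total_ms=800):
    """fixations: iterable of (snippet_id, fixation_duration_ms) pairs.
    Returns the snippet ids whose accumulated gaze time suggests interest."""
    total = defaultdict(float)
    for snippet_id, duration_ms in fixations:
        total[snippet_id] += duration_ms
    # Snippets read long enough are treated as implicit (pseudo-relevant) feedback.
    return {sid for sid, ms in total.items() if ms >= min_total_ms}

# Example with fabricated gaze data, for illustration only:
fixations = [("doc12", 300), ("doc12", 650), ("doc07", 150), ("doc03", 900)]
print(implicit_relevance(fixations))  # {'doc12', 'doc03'}
```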
The main research question tackled here is twofold: (1) what is the user's behavioural model at the visual level in an information retrieval task, and how can this model determine the user's interests; and (2) how can such eye gaze elements be integrated effectively into a relevance feedback mechanism in classical IR systems that present result lists composed of document extracts (called snippets)?
To achieve this goal, we split the problem into the following steps: (a) to model the user's behaviour in front of a result list composed of snippets; (b) to define the eye gaze elements to be acquired and the way to link them to the user's interest in document contents; (c) to build relevance feedback mechanisms able to use these elements; and (d) to evaluate the proposal on classical IR test collections and compare it to other relevance feedback approaches.
The work presented here focuses on the first two elements above: we define an experimental context to gather relevant information about the user's behaviour in front of a result display composed of snippets, and we deduce the eye movement (EM) elements that will need to be acquired in order to perform IR relevance feedback.