Professor Paul Clough from the Information School is giving an invited talk at the Forum for Information Retrieval Evaluation (FIRE) 2014 conference, held in Bangalore, India, from 5 to 7 December.
His talk, entitled "Examining the Limits of Crowdsourcing for Relevance Assessment", is based on work he has undertaken with the UK National Archives.
Evaluation is instrumental in developing and managing effective information retrieval systems and in ensuring high levels of user satisfaction. Using crowdsourcing as part of this process has been shown to be viable. Less well understood are the limits of crowdsourcing for evaluation, particularly for domain-specific search.
Professor Clough will present results comparing relevance assessments gathered through crowdsourcing with those provided by a domain expert, used to evaluate different search engines in a large government archive. While the crowdsourced judgments rank the tested search engines in the same order as the expert judgments, crowdsourced workers appear unable to distinguish between different levels of highly accurate search results in the way that expert assessors can. The talk will examine the nature of this limitation in the experiment and discuss the viability of crowdsourcing for evaluating search in specialist settings.