Using crowdsourcing for TREC relevance assessment

Authors: Omar Alonso, Stefano Mizzaro

Institution: 1. Microsoft Corp., 1065 La Avenida, Mountain View, CA 94043, USA; 2. Dept. of Maths and Computer Science, University of Udine, Via delle Scienze, 206, 33100 Udine, Italy

Abstract: Crowdsourcing has recently gained a lot of attention as a tool for conducting different kinds of relevance evaluations. At a very high level, crowdsourcing describes the outsourcing of tasks to a large group of people instead of assigning such tasks to an in-house employee. This approach makes it possible to conduct information retrieval experiments extremely fast, with good results and at low cost.

Keywords: IR evaluation; Test collections; Relevance assessment; Crowdsourcing; TREC; Amazon Mechanical Turk; Experimental design