This is an expansion package for the Terrier IR Platform, called CrowdTerrier. CrowdTerrier enables the semi-automatic generation of relevance assessments for a variety of document ranking tasks using crowdsourcing. The aim of CrowdTerrier is to reduce the time and expertise required to effectively crowdsource relevance assessments by abstracting away from the complexities of the crowdsourcing process. It achieves this by automating the assessment process as much as possible, via a close integration of the IR system that ranks the documents (Terrier) and the crowdsourcing marketplace that is used to assess those documents (Amazon's Mechanical Turk (MTurk)). The following page describes the package:

[WWW] http://richardmccreadie.blogspot.co.uk/2012/11/crowdterrier-automatic-crowdsourced.html
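The workflow described above (rank documents with Terrier, bundle them into crowdsourcing tasks, and aggregate worker labels into relevance judgements) can be illustrated with a small self-contained sketch. Note that all function names below are hypothetical illustrations of the concept; they do not correspond to the actual CrowdTerrier or MTurk APIs:

```python
from collections import Counter

def bundle_hits(ranked_docs, docs_per_hit=5):
    """Split a ranked document list into MTurk-style task bundles (HITs)."""
    return [ranked_docs[i:i + docs_per_hit]
            for i in range(0, len(ranked_docs), docs_per_hit)]

def aggregate_labels(worker_labels):
    """Majority-vote aggregation of per-document worker votes into judgements.

    worker_labels: dict mapping doc_id -> list of 0/1 relevance votes.
    Returns a dict mapping doc_id -> aggregated 0/1 relevance label.
    """
    return {doc: Counter(votes).most_common(1)[0][0]
            for doc, votes in worker_labels.items()}

# Ten ranked documents bundled into HITs of five documents each
hits = bundle_hits(["doc%d" % i for i in range(10)])

# Simulated worker votes for two of the documents
qrels = aggregate_labels({"doc0": [1, 1, 0], "doc1": [0, 0, 1]})
```

In practice the package automates these steps for you; the sketch only shows the shape of the pipeline, with the IR system supplying the ranking and the marketplace supplying the per-document votes.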

Package Contributions

Terms of Use

The software is provided free of charge 'as is' for research purposes under the Mozilla Public License:

The source and binary forms of Terrier are subject to the 
Mozilla Public License Version 1.1 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at http://www.mozilla.org/MPL/.

Software distributed under the License is distributed on
an "AS IS" basis, WITHOUT WARRANTY OF ANY KIND, either
expressed or implied. See the License for the specific language
governing rights and limitations under the License.
CrowdTerrier is Copyright (C) 2013 the University of Glasgow.
All Rights Reserved.

Furthermore, by using this Terrier expansion package, you agree to cite the following publication(s) in any future works and/or publications that use this package:

@inproceedings{mccreadie2012crowdterrier,
 author = {McCreadie, Richard and Macdonald, Craig and Ounis, Iadh},
 title = {CrowdTerrier: Automatic Crowdsourced Relevance Assessments with Terrier},
 booktitle = {Proceedings of SIGIR'12},
 year = {2012},
 location = {Portland, Oregon},
}

The package can be downloaded here: CrowdsourcingPackage.tar.gz

last edited 2013-07-28 17:51:46 by 0402393m