Published in AAAI Conference on Human Computation and Crowdsourcing, 2017
This paper introduces Drafty, a platform that enlists visitors to an editable dataset as “user-editors”. It records and analyzes user-editors’ within-page interactions to construct user interest profiles, creating a cyclical feedback mechanism that enables Drafty to target requests for specific corrections to individual user-editors. To validate the automatically generated interest profiles, we surveyed participants who performed self-created tasks with Drafty and found that their user interest score was 3.2 points higher on data they were interested in than on data they had no interest in. Next, a 7-month live experiment compared the efficacy of user-editor corrections depending on whether users were asked to review data that matched their interests. Our findings suggest that user-editors are approximately 3 times more likely to provide accurate corrections for data matching their interest profiles, and about 2 times more likely to provide corrections in the first place.