Predicting Crowdsourced Decisions on Toxic Behavior in Online Games

A reported advance in the game of predicting what mobs will decide about yobs who run riot in online games:

“STFU NOOB! Predicting Crowdsourced Decisions on Toxic Behavior in Online Games,” Jeremy Blackburn and Haewoon Kwak, arXiv:1404.5905, epub April 23, 2014. (Thanks to investigator Evie Tsing for bringing this to our attention.) The authors, at the University of South Florida, USA, and Telefonica Research, Barcelona, Spain, explain:

“One problem facing players of competitive games is negative, or toxic, behavior. League of Legends, the largest eSport game, uses a crowdsourcing platform called the Tribunal to judge whether a reported toxic player should be punished or not. The Tribunal is a two stage system requiring reports from those players that directly observe toxic behavior, and human experts that review aggregated reports…. [This] system… naturally requires tremendous cost, time, and human efforts. In this paper, we propose a supervised learning approach for predicting crowdsourced decisions on toxic behavior with large-scale labeled data collections; over 10 million user reports involved in 1.46 million toxic players and corresponding crowdsourced decisions. Our result shows good performance in detecting overwhelmingly majority cases and predicting crowdsourced decisions on them.”
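
The task the abstract describes is, at heart, a standard supervised-classification setup: each reported case becomes a feature vector built from the aggregated reports, and the label is the Tribunal's verdict (punish or pardon). Here is a minimal sketch of that kind of pipeline, assuming hypothetical features (report counts, report-category fractions, a performance score) and synthetic data; it is an illustration only, not the authors' actual feature set or model.

```python
# Illustrative sketch: predict a Tribunal-style verdict (punish vs. pardon)
# from aggregated report features. Features and data here are assumptions
# made up for demonstration, not those used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cases = 5000

# Hypothetical per-case features aggregated across user reports.
X = np.column_stack([
    rng.integers(1, 20, n_cases),    # number of reports filed against the player
    rng.random(n_cases),             # fraction of reports citing verbal abuse
    rng.random(n_cases),             # fraction of reports citing intentional feeding
    rng.normal(0.0, 1.0, n_cases),   # normalized in-game performance score
])

# Synthetic "crowdsourced decision": 1 = punish, 0 = pardon.
logits = 0.3 * X[:, 0] + 2.0 * X[:, 1] + 1.5 * X[:, 2] - 0.5 * X[:, 3] - 4.0
y = (rng.random(n_cases) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Train a classifier on labeled cases, then score held-out cases.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

probs = clf.predict_proba(X_test)[:, 1]
print(f"ROC AUC on held-out cases: {roc_auc_score(y_test, probs):.3f}")
```

The point of the sketch is simply that, once the crowd has labeled enough cases, a classifier can be trained to reproduce its verdicts automatically, which is what makes the approach attractive given the "tremendous cost, time, and human efforts" the authors mention.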