HughPickens.com [hughpickens.com] writes:
Tom Simonite writes at MIT Technology Review that the Wikimedia Foundation is rolling out new
software trained to know the difference between an honest mistake and intentional vandalism [technologyreview.com] in an effort to make editing Wikipedia less psychologically bruising. One motivation for the project is a
significant decline in the number of people considered active contributors [dailydot.com] to the flagship English-language Wikipedia: it has fallen by 40 percent over the past eight years, to about 30,000. Research indicates that the problem is rooted in Wikipedians’ complex bureaucracy and their often hard-line responses to newcomers’ mistakes, enabled by semi-automated tools that make deleting new changes easy. The new
ORES system, for “Objective Revision Evaluation Service,” [wikimedia.org] can be trained to score the quality of new changes to Wikipedia and judge whether an edit was made in good faith or not. ORES lets editing tools direct reviewers to the most damaging changes first. The software can also help editors treat rookie or innocent mistakes more appropriately, says Aaron Halfaker, who helped diagnose the problem and is now leading a project trying to fight it. “I suspect
the aggressive behavior of Wikipedians [wikipedia.org] doing quality control is because they’re making judgments really fast and they’re not encouraged to have a human interaction with the person,” says Halfaker. “This enables a tool to say, ‘If you’re going to revert this, maybe you should be careful and send the person who made the edit a message.’”
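As a rough illustration of the triage Halfaker describes, here is a minimal sketch of how a tool might act on ORES-style scores. The “damaging” and “goodfaith” names match the models ORES publicly exposes, but the thresholds and the `triage()` helper below are assumptions for illustration, not Wikimedia’s actual configuration:

```python
# Hypothetical triage logic for an ORES-backed editing tool.
# ORES returns per-model probabilities in [0.0, 1.0]; the threshold
# values here are illustrative assumptions.

def triage(damaging_prob, goodfaith_prob,
           damaging_threshold=0.8, goodfaith_threshold=0.5):
    """Route an edit based on ORES-style probability scores."""
    if damaging_prob < damaging_threshold:
        return "accept"                # likely a fine edit; leave it alone
    if goodfaith_prob >= goodfaith_threshold:
        return "revert-with-message"   # probably an honest mistake: be gentle
    return "revert"                    # likely vandalism: plain revert

# A damaging edit that still looks good-faith gets the softer treatment:
print(triage(damaging_prob=0.9, goodfaith_prob=0.7))  # revert-with-message
print(triage(damaging_prob=0.9, goodfaith_prob=0.1))  # revert
print(triage(damaging_prob=0.2, goodfaith_prob=0.9))  # accept
```

The point of the separate good-faith score is exactly the behavior in the middle branch: the tool can still undo a bad edit while prompting the reviewer to send the newcomer a message instead of a bare revert.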
Original Submission