Tom Simonite writes at MIT Technology Review that the Wikimedia Foundation is rolling out new software trained to know the difference between an honest mistake and intentional vandalism in an effort to make editing Wikipedia less psychologically bruising. One motivation for the project is a significant decline in the number of people considered active contributors to the flagship English-language Wikipedia: it has fallen by 40 percent over the past eight years, to about 30,000.
Research indicates that the problem is rooted in Wikipedians' complex bureaucracy and their often hard-line responses to newcomers' mistakes, enabled by semi-automated tools that make reverting new changes easy. The new ORES system (for "Objective Revision Evaluation Service") can be trained to score the quality of new changes to Wikipedia and to judge whether an edit was made in good faith. ORES lets editing tools direct reviewers to the most damaging changes first. The software can also help editors treat rookie or innocent mistakes more appropriately, says Aaron Halfaker, who helped diagnose the problem and now leads the project trying to fix it. "I suspect the aggressive behavior of Wikipedians doing quality control is because they're making judgments really fast and they're not encouraged to have a human interaction with the person," says Halfaker. "This enables a tool to say, 'If you're going to revert this, maybe you should be careful and send the person who made the edit a message.'"
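To make the "score the edit, then decide how to treat the editor" idea concrete, here is a minimal Python sketch. It assumes the shape of the public ORES v3 REST endpoint (per-model `damaging` and `goodfaith` probabilities keyed by wiki and revision ID); the sample response, the helper names, and the thresholds are illustrative assumptions, not part of the real service contract.

```python
# Sketch: consuming ORES scores to triage an edit.
# The URL shape follows the public ORES v3 API; treat details as assumptions.

def ores_url(wiki, rev_id, models=("damaging", "goodfaith")):
    """Build an ORES v3 scoring URL for one revision."""
    return ("https://ores.wikimedia.org/v3/scores/"
            f"{wiki}/{rev_id}?models={'|'.join(models)}")

def needs_gentle_handling(response, wiki, rev_id, damage_threshold=0.8):
    """Flag an edit that looks damaging but was probably made in good
    faith -- i.e. a rookie mistake that deserves a message, not a revert."""
    scores = response[wiki]["scores"][str(rev_id)]
    p_damaging = scores["damaging"]["score"]["probability"]["true"]
    p_goodfaith = scores["goodfaith"]["score"]["probability"]["true"]
    return p_damaging >= damage_threshold and p_goodfaith >= 0.5

# Fabricated example response in the v3 shape, for illustration only:
sample = {"enwiki": {"scores": {"1234": {
    "damaging": {"score": {"prediction": True,
                           "probability": {"true": 0.91, "false": 0.09}}},
    "goodfaith": {"score": {"prediction": True,
                            "probability": {"true": 0.72, "false": 0.28}}},
}}}}

print(ores_url("enwiki", 1234))
print(needs_gentle_handling(sample, "enwiki", 1234))  # True: damaging, but good-faith
```

A quality-control tool could use a flag like this to swap an instant revert for a "careful, send them a message" prompt, which is exactly the behavior change Halfaker describes.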
(Score: 0) by Anonymous Coward on Monday December 07 2015, @05:44PM
SOME of Wikipedia is a farce; most of it is quite valuable.
I think there needs to be some sort of branching so that different versions are easily accessible. Some automation could be enabled so that any version with enough views or somesuch can be linked/highlighted on the main landing page. We even git a good tool for it! (sorry)
(Score: 2) by takyon on Monday December 07 2015, @07:20PM
Your idea could be very effective.
Wikipedia has been working on flagged/trusted revisions for a long time, so that anybody can still edit an article while the default view shows an approved version.
Instead of trying to fork Wikipedia, why not create an outside system (browser extension?) that curates which edits make it to the top? Edits that get instantly reverted by bots or admins still exist in the revision history. Bored people could create whitelists of users. Pick a list and start browsing. Bots and overzealous deletionist admins don't make it onto the whitelist. Wikipedia continues to do the heavy lifting by hosting the content.
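The whitelist idea above can be sketched in a few lines. This assumes revision history in roughly the shape the MediaWiki API returns it (newest first, with a `user` field per revision); the usernames, revision IDs, and helper name are made-up illustrations.

```python
# Sketch of whitelist curation: show the newest revision whose author is on
# a user-chosen whitelist, skipping bots and unlisted editors.
# Data below is fabricated for illustration.

def latest_whitelisted(revisions, whitelist):
    """Return the newest revision made by a whitelisted user, or None.
    `revisions` is ordered newest-first, as MediaWiki returns them."""
    for rev in revisions:
        if rev["user"] in whitelist:
            return rev
    return None

history = [  # newest first
    {"revid": 503, "user": "RevertBot",   "comment": "Reverted edits"},
    {"revid": 502, "user": "NewEditor42", "comment": "Added sources"},
    {"revid": 501, "user": "LongTimer",   "comment": "Copyedit"},
]
trusted = {"NewEditor42", "LongTimer"}

best = latest_whitelisted(history, trusted)
print(best["revid"])  # 502: the bot's revert is skipped
```

An extension built this way never has to host content: it only picks which already-hosted revision to display, which is why Wikipedia keeps doing the heavy lifting.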
That leaves the problem of content that has actually been deleted from Wikipedia. The extension could integrate a search of http://deletionpedia.org/en/Main_Page [deletionpedia.org] into Wikipedia's search results. Internal Wikipedia links to deleted articles could be redirected to the deletionpedia content.
(Score: 2) by K_benzoate on Monday December 07 2015, @08:21PM
I've found that it's a decent starting point for reference material if the topic is dry and rather non-political. If I want to look up some quick facts about a certain species of plant, or chemical compound, or equation, it does pretty well. They're usually well sourced if I need to dig deeper. I wouldn't trust it for something like the Israel-Palestine dispute, global warming, or radical feminism. Too heated, too many ideologues all trying to get their vision published.
Climate change is real and primarily caused by human activity.