
posted by chromas on Tuesday August 27 2019, @02:44PM
from the internet-hate-machine dept.

Researchers propose a new approach for dismantling online hate networks

How do you get rid of hate speech on social platforms? Until now, companies have generally tried two approaches. One is to ban individual users who are caught posting abuse; the other is to ban the large pages and groups where people who practice hate speech organize and promote their noxious views.

But what if this approach is counterproductive? That's the argument in an intriguing new paper out today in Nature from Neil Johnson, a professor of physics at George Washington University, and researchers at GW and the University of Miami. The paper, "Hidden resilience and adaptive dynamics of the global online hate ecology," explores how hate groups organize on Facebook and the Russian social network VKontakte — and how they resurrect themselves after platforms ban them.

As Noemi Derzsy writes in her summary in Nature:

Johnson et al. show that online hate groups are organized in highly resilient clusters. The users in these clusters are not geographically localized, but are globally interconnected by 'highways' that facilitate the spread of online hate across different countries, continents and languages. When these clusters are attacked — for example, when hate groups are removed by social-media platform administrators (Fig. 1) — the clusters rapidly rewire and repair themselves, and strong bonds are made between clusters, formed by users shared between them, analogous to covalent chemical bonds. In some cases, two or more small clusters can even merge to form a large cluster, in a process the authors liken to the fusion of two atomic nuclei. Using their mathematical model, the authors demonstrated that banning hate content on a single platform aggravates online hate ecosystems and promotes the creation of clusters that are not detectable by platform policing (which the authors call 'dark pools'), where hate content can thrive unchecked.
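That cluster-level resilience is easy to visualize with a toy graph simulation. The sketch below is not the authors' model (the paper's equations are not reproduced in the summary); it is a minimal Python/networkx illustration, with made-up cluster sizes and a made-up rewiring rule, of the qualitative behavior described above: dense clusters joined by sparse 'highways', a platform ban that removes one cluster, and banned users rewiring into the survivors.

```python
# Illustrative only: a hypothetical networkx sketch of the qualitative
# dynamics the summary describes. Cluster sizes, the highway layout,
# and the rewiring rule are all made-up assumptions.
import random
import networkx as nx

random.seed(42)

G = nx.Graph()
clusters = []
for c in range(3):
    members = [f"c{c}_u{i}" for i in range(20)]
    clusters.append(members)
    # Each cluster is a dense clique of users.
    G.add_edges_from(nx.complete_graph(members).edges)

# Sparse inter-cluster 'highway' links between representative users.
for a, b in [(0, 1), (1, 2), (0, 2)]:
    G.add_edge(clusters[a][0], clusters[b][0])

# A platform bans cluster 0 outright.
banned = clusters[0]
G.remove_nodes_from(banned)

# Rapid rewiring: banned users return under new accounts and attach
# to the surviving clusters, forging new inter-cluster bonds.
for user in banned:
    G.add_edge(user + "_new", random.choice(clusters[1] + clusters[2]))

# The network knits itself back into one connected ecosystem.
print(nx.number_connected_components(G))  # -> 1
```

The rewired accounts that end up bridging the surviving clusters play the role of the shared users that the summary compares to covalent bonds.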

[...] The researchers advocate a four-step approach to reduce the influence of hate networks.

  1. Identify smaller, more isolated clusters of hate speech and ban those users, rather than going after the largest and most prominent groups first.
  2. Instead of wiping out entire small clusters, ban small samples from each cluster at random. This would theoretically weaken each cluster over time without inflaming the entire hive. (A sketch of this sampling strategy appears after the list.)
  3. Recruit users opposed to hate speech to engage with members of the larger hate clusters directly. (The authors explain: "In our data, some white supremacists call for a unified Europe under a Hitler-like regime, and others oppose a united Europe. Similar in-fighting exists between hate-clusters of the KKK movement. Adding a third population in a pre-engineered format then allows the hate-cluster extinction time to be manipulated globally.")
  4. Identify hate groups with competing views and pit them against one another, in an effort to sow doubt in the minds of participants.
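Step 2 is the most algorithmically concrete of the four, so here is a hedged sketch of what "ban small samples from each cluster at random" could look like over a graph of communities. The 10% ban fraction, cluster sizes, and random-graph parameters are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of step 2: ban a small random sample from every
# cluster rather than deleting whole clusters. The ban_fraction and
# cluster parameters are my own illustrative choices.
import random
import networkx as nx

random.seed(1)

def make_cluster(tag, size=30, p=0.3, seed=0):
    """A dense random community standing in for one hate cluster."""
    g = nx.gnp_random_graph(size, p, seed=seed)
    return nx.relabel_nodes(g, {i: f"{tag}_{i}" for i in g.nodes})

G = nx.union_all(make_cluster(f"cluster{k}", seed=k) for k in range(4))

def ban_random_sample(G, ban_fraction=0.1):
    """Remove a random sample of users from each connected cluster."""
    for comp in [list(c) for c in nx.connected_components(G)]:
        k = max(1, int(ban_fraction * len(comp)))
        G.remove_nodes_from(random.sample(comp, k))

before = G.number_of_nodes()
ban_random_sample(G)
print(f"banned {before - G.number_of_nodes()} of {before} users")
```

Per the summary's framing, this kind of distributed, low-intensity removal would weaken clusters gradually while avoiding the single-platform crackdowns that push users into undetectable 'dark pools'.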

Hidden resilience and adaptive dynamics of the global online hate ecology[$], Nature (DOI: 10.1038/s41586-019-1494-7)


Original Submission

 
  • (Score: 5, Insightful) by Thexalon (636) on Tuesday August 27 2019, @05:11PM (#886210) (4 children)

    The metaphor I've been using to describe it: Someone comes around and slaps a bumper sticker on your car. And if you try to either remove it or cover it up, they start saying "That's not fair, I have freedom of speech!!"

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 0) by Anonymous Coward on Tuesday August 27 2019, @06:37PM (#886295) (3 children)

    That kind of falls apart when you have companies as large as YouTube, Facebook, and Twitter, which have become public squares of sorts, and which routinely bribe our government. However, I don't think that forcing them to accept all speech via government regulation is the answer. All of those companies are toxic to begin with for other reasons (privacy), so no one should use them anyway. We need real decentralized platforms, not centralized trash that facilitates mass surveillance.

    • (Score: 1, Insightful) by Anonymous Coward on Tuesday August 27 2019, @09:57PM (#886425) (2 children)

      No, they are not "public squares". They are private auditoriums and the owner controls the door. Just because they are popular does not make them public.

      • (Score: 1, Insightful) by Anonymous Coward on Tuesday August 27 2019, @10:59PM (#886452)

        So is the phone company. Guess what? It got itself all sorts of regulation for exactly these reasons. Title II exists because of this exact argument.

        If you want a Title II-like regime for the internet, this is *exactly* how you go about it.

      • (Score: 0) by Anonymous Coward on Wednesday August 28 2019, @02:53PM (#886813)

        No, the issue is that Google has historically sold itself as a public forum and rejected the assertion that it is a publisher for legal purposes. I don't have a problem with Google being a public forum, but that means they can't block or regulate content. If they want to be a publisher, that's fine, but then they can be sued for allowing violent/whatever content on their site.