SoylentNews is people

posted by LaminatorX on Tuesday March 10 2015, @09:32PM   Printer-friendly
from the why-we-can't-have-nice-things dept.

Jonathan Mahler writes in the NYT that in much the same way that Facebook swept through the dorm rooms of America’s college students a decade ago, the social app Yik Yak, which shows anonymous messages from users within a 1.5-mile radius, is now taking college campuses by storm. "Think of it as a virtual community bulletin board — or maybe a virtual bathroom wall at the student union," writes Mahler. "It has become the go-to social feed for college students across the country to commiserate about finals, to find a party or to crack a joke about a rival school." And while much of the chatter is harmless, some of it is not. “Yik Yak is the Wild West of anonymous social apps,” says Danielle Keats Citron. “It is being increasingly used by young people in a really intimidating and destructive way.” Since the app’s introduction a little more than a year ago, Yik Yak has been used to issue threats of mass violence on more than a dozen college campuses, including the University of North Carolina, Michigan State University and Penn State. Racist, homophobic and misogynist “yaks” have generated controversy at many more, among them Clemson, Emory, Colgate and the University of Texas. At Kenyon College, a “yakker” proposed a gang rape at the school’s women’s center.

Colleges are largely powerless to deal with the havoc Yik Yak is wreaking. The app’s privacy policy prevents schools from identifying users without a subpoena, court order or search warrant, or an emergency request from a law-enforcement official with a compelling claim of imminent harm. Esha Bhandari, a staff attorney at the American Civil Liberties Union, argues that "banning Yik Yak on campuses might be unconstitutional," especially at public universities or private colleges in California where the so-called Leonard Law protects free speech. She said it would be like banning all bulletin boards in a school just because someone posted a racist comment on one of the boards. In one sense, the problem with Yik Yak is a familiar one. Anyone who has browsed the comments of an Internet post is familiar with the sorts of intolerant, impulsive rhetoric that the cover of anonymity tends to invite. But Yik Yak’s particular design can produce especially harmful consequences, its critics say. “It’s a problem with the Internet culture in general, but when you add this hyper-local dimension to it, it takes on a more disturbing dimension,” says Elias Aboujaoude. “You don’t know where the aggression is coming from, but you know it’s very close to you.”

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by NotSanguine on Wednesday March 11 2015, @12:47PM

    We need names associated with mods

    When somebody mods a comment then we should know who did it

    Maybe that would make people think twice about doing bad mods

    The admins and editors know who is modding what. While it's difficult to keep a handle on all mods, it should be fairly clear if some folks are ignoring the moderation guidelines and using mod points to promote their personal agendas rather than improving the quality of discussion.

    I'm not sure if the admins have tools (I suspect it would be appropriate DB queries) to look at users' moderation behavior. Presumably someone who mods down more than they mod up might be subject to closer scrutiny.

    It wouldn't hurt (I know you guys have tons of work to do, but this might be quite useful in identifying those who abuse the moderation system) to have weekly/biweekly/monthly reports detailing the ratio of upmods to downmods and the frequency of the various types of up/down mods for users. Assuming that most users are responsible and judicious in their moderation behavior, taking a closer look at the outliers might provide good information.
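A report like that could be a fairly small script. The sketch below is purely illustrative — the log format, mod-type names, and thresholds are assumptions, not the actual Slash schema — but it shows the shape of the idea: tally each user's upmods and downmods, then flag users whose downmod share is unusually high.

```python
# Hypothetical moderation log: (moderator, mod_type) pairs. In the real
# backend this would come from a DB query; the names here are made up.
MOD_LOG = [
    ("alice", "insightful"), ("alice", "informative"), ("alice", "troll"),
    ("bob", "troll"), ("bob", "flamebait"), ("bob", "overrated"),
    ("bob", "troll"), ("carol", "interesting"),
]

UP_MODS = {"insightful", "informative", "interesting", "funny", "underrated"}
DOWN_MODS = {"troll", "flamebait", "offtopic", "redundant", "overrated"}

def mod_report(log):
    """Per-moderator [upmods, downmods] counts."""
    report = {}
    for user, mod in log:
        report.setdefault(user, [0, 0])
        if mod in UP_MODS:
            report[user][0] += 1
        elif mod in DOWN_MODS:
            report[user][1] += 1
    return report

def outliers(report, max_down_ratio=0.75, min_total=3):
    """Flag moderators whose downmod share exceeds the threshold.

    The 0.75 cutoff and minimum-activity floor are arbitrary knobs,
    just to keep one-off mods from being flagged as a pattern.
    """
    flagged = []
    for user, (ups, downs) in report.items():
        total = ups + downs
        if total >= min_total and downs / total > max_down_ratio:
            flagged.append(user)
    return flagged
```

With the sample log above, `outliers(mod_report(MOD_LOG))` would flag only the user who does nothing but downmod; a mixed up/down record stays under the threshold.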

    Perhaps this could even be crowdsourced, with the reports being anonymized and an ever-changing group of users identifying those who might be abusing the moderation system.

    I don't know enough about how things are set up on the back-end of SN, but since SN is "people", why don't we use that resource to identify those who may be abusing the moderation system and, based on that information, have the admins take action as appropriate?

    No, no, you're not thinking; you're just being logical. --Niels Bohr
    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 1, Interesting) by Anonymous Coward on Wednesday March 11 2015, @03:09PM (#156093)

    There are all kinds of patterns in moderation you can look for.
    Like the frequency of mods between particular user accounts, whether up or down.
    But before we go off hunting bad actors, how about we decide if there is even a problem in the first place?
    A handful of bursty cases isn't a systemic problem; it's just a glitch.

    At a bare minimum, come up with some well-defined behaviours that we can agree are bad for the site, then start running reports looking for those behaviours over a period of time, say six months, and see how frequent they really are.
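One candidate "well-defined behaviour" is the pattern mentioned above: one account repeatedly moderating another account's comments. A minimal sketch, assuming a hypothetical event log of (moderator, target author, timestamp, direction) tuples — none of these field names come from the actual Slash database:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical mod events; dates and names are illustrative only.
EVENTS = [
    ("dave", "erin", datetime(2015, 1, 5), "down"),
    ("dave", "erin", datetime(2015, 2, 9), "down"),
    ("dave", "erin", datetime(2015, 3, 1), "down"),
    ("frank", "erin", datetime(2015, 3, 2), "up"),
]

def repeated_pairs(events, since, min_count=3):
    """Moderator->author pairs with at least min_count mods since a
    cutoff date, regardless of direction. min_count is an arbitrary
    threshold to be agreed on before running the report."""
    counts = Counter(
        (mod, author) for mod, author, ts, _ in events if ts >= since
    )
    return {pair: n for pair, n in counts.items() if n >= min_count}

# Run the report over roughly a six-month window.
window_start = datetime(2015, 3, 15) - timedelta(days=182)
flagged = repeated_pairs(EVENTS, window_start)
```

Publishing the output of something like this over a real six-month window would show whether such pairs are common or vanishingly rare, which is exactly the "is there even a problem" question.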

    Then publish those reports and we can all decide if there really is a problem worth bothering with.