posted by janrinok on Tuesday August 12 2014, @07:56PM   Printer-friendly
from the they-can't-think-twice-if-they-didn't-think-once dept.

A 14-year-old student selected as a Google Science Fair 2014 finalist has come up with Rethink, a project that asks teenagers to reread hurtful messages before sending them off and having to deal with consequences they never considered before hitting Send. Trisha Prabhu entered the contest at 13 with a distinct distaste for cyber-bullying and a desire to come up with a solution that would help teenagers think twice. "I am looking forward to a future where we have conquered cyber-bullying!" she said in her project notes. Her hypothesis: if adolescents aged 12 to 18 were given an alert mechanism that suggested they revisit their decision to post a hurtful message on social media, the number of hurtful messages would be lower than among adolescents not provided with such an alert.

Thinking twice before doing something is not an especially strong skill among teenagers. "As found in this research," she observed in her project description, "the Prefrontal Cortex is not fully developed during adolescence years." Study methods: she created Baseline and Rethink systems, both of which asked users whether they were willing to post a series of predetermined messages online. She ran 1,500 trials, and the results were very encouraging.
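The alert mechanism she hypothesizes can be illustrated with a minimal sketch. This is not her implementation: the word list, the function names, and the confirmation callback are all hypothetical illustrations of the general idea of gating a hurtful-looking message behind a "are you sure?" prompt.

```python
# Hypothetical sketch of a Rethink-style alert gate. The flagged-term list
# is a crude stand-in for whatever detection the real project used.
HURTFUL_TERMS = {"stupid", "loser", "ugly", "hate you"}

def looks_hurtful(message: str) -> bool:
    """Crude check: does the message contain any flagged term?"""
    text = message.lower()
    return any(term in text for term in HURTFUL_TERMS)

def rethink_gate(message: str, confirm) -> bool:
    """Return True if the message should be posted.

    `confirm` is a callback that asks the user to re-read the message
    and returns True only if they still want to send it.
    """
    if looks_hurtful(message):
        return confirm(message)
    return True
```

The point of the design is that nothing is censored: a benign message passes straight through, and a flagged one still posts if the user confirms after re-reading, which is exactly the "think twice" intervention her hypothesis tests.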

  • (Score: 3, Interesting) by cafebabe (894) on Tuesday August 12 2014, @08:45PM (#80592) Journal

    The Greater Internet Fuckwad Theory [penny-arcade.com] has an isomorphism with the fire triangle [wikipedia.org]. Specifically:-

    Anti-social person <=> Heat
    Anonymity <=> Oxygen
    Audience <=> Fuel

    If you want to discourage fires, remove heat, oxygen or fuel. If you want to discourage fuckwads, do likewise. Unfortunately, fuckwads are more persistent and various reasonable schemes have failed.

    Removing anonymity doesn't work and, in particular, a real names policy definitely doesn't work. However, there is definitely much which can be done to control audience. SlashCode's karma system was an early attempt. Systems could be investigated which delay anti-social messages or progressively restrict distribution.
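The closing idea above, delaying anti-social messages or progressively restricting their distribution, could look something like the following sketch. The hostility score (assumed to come from some upstream classifier or moderation signal), the thresholds, and the stage fractions are all invented for illustration.

```python
# Sketch of progressive distribution: an anti-social message is not blocked,
# but its visibility is held back and then widened in stages.
def distribution_schedule(hostility: float):
    """Yield (delay_seconds, visible_fraction) stages for a message.

    hostility: 0.0 (benign) to 1.0 (clearly anti-social).
    """
    if hostility < 0.3:
        yield (0, 1.0)              # benign: publish everywhere at once
        return
    # Hostile messages reach a progressively larger share of the audience,
    # with a delay that grows with the hostility score and doubles per stage.
    delay = int(hostility * 3600)   # up to an hour before first exposure
    for fraction in (0.01, 0.1, 1.0):
        yield (delay, fraction)
        delay *= 2
```

In fire-triangle terms this removes fuel rather than oxygen: the author stays anonymous and uncensored, but the audience for a hostile message arrives slowly enough that moderation can act first.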

    --
    1702845791×2
  • (Score: 0) by Anonymous Coward on Tuesday August 12 2014, @08:48PM (#80595)

    > SlashCode's karma system was an early attempt.

    Slashcode's karma with metamoderation, at least.
    I sure miss metamoderation here; without it, moderation is just a way to cheer for your team.

  • (Score: 3, Interesting) by kaszz (4211) on Tuesday August 12 2014, @11:30PM (#80651) Journal

    I like your analysis. The sequence of events seems to be that some lowlife, who may have the IQ but no respect, harasses someone, and then the other dead fish start slapping that person too. Going by what people have written about how they handled such situations, the most effective strategy seems to be to deliver harsh pain quickly to the lead bully, and to show that you are willing to escalate things without remorse.

    In the cyberbullying situation, one way to crash it is perhaps to catastrophically undermine the social status of the bullies.

    • (Score: 2) by cafebabe (894) on Wednesday August 13 2014, @04:15AM (#80703) Journal

      Historically, transgressions by a child were corrected with physical violence from adults. Environments where large numbers of children met were partitioned and contained within a strict hierarchy. Also, bad behavior was visibly corrected before it could spread.

      Online, this disappears because the market is decided by having the largest scale-free network. This is like having a playground with millions of kids and very few supervisors.

      To fix this, the emphasis has been on reducing anonymity. This also has the effect of being financially attractive to advertisers. Various organizations have flim-flammed with account policies in an attempt to reduce anonymity. However, few have attempted to dynamically restrict the immediacy or reach of anti-social comments. I believe this idea has been resisted because it would reduce the ability to cache data and it would increase server load.
