posted by Fnord666 on Tuesday February 26 2019, @05:31PM
from the serious-mental-health-issues dept.

Tips for committing suicide are appearing in children's cartoons on YouTube and the YouTube Kids app.

https://arstechnica.com/science/2019/02/youtube-kids-cartoons-include-tips-for-committing-suicide-docs-warn/

YouTube Kids is theoretically a curated collection of videos that are safe for kids. Apparently, someone forgot to tell YouTube/Google that. Sure, YouTube probably has some butt-covering clause in its EULA, but that doesn't excuse such an oversight. It's very easy to see overworked, stressed parents giving their kids access to YouTube/YouTube Kids. Kids have enough to deal with without having to deal with grown-ups' twisted thoughts. This seems more like insidious minds at work to me. What do you think?


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Touché) by urza9814 on Tuesday February 26 2019, @06:13PM (10 children)

    by urza9814 (3954) on Tuesday February 26 2019, @06:13PM (#807126) Journal

    As far as I can tell, it's not really the people posting these videos who claim they are suitable for children; that's entirely up to Google. This also isn't the first time there's been controversy around violent videos being included by Google's automated filters. Some jackass at Google decided that they're so brilliant that they can program an algorithm that can perfectly determine what is suitable for children, and they failed miserably. Multiple times. And yet Google keeps insisting that it works anyway.

    At this point, I don't see how you can blame anyone other than Google. They are well aware of this problem, yet they still keep telling parents that there is no problem. They're sociopaths that will do absolutely anything for a buck.

  • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @06:27PM (4 children)

    by Anonymous Coward on Tuesday February 26 2019, @06:27PM (#807137)

    Some jackass at Google decided that they're so brilliant that they can program an algorithm that can perfectly determine what is suitable for children, and they failed miserably. Multiple times. And yet Google keeps insisting that it works anyway.

    Same as the parent algorithm: if (isAnimated) isSafeToAbandonKids:=true

    • (Score: 3, Funny) by AssCork on Tuesday February 26 2019, @06:33PM (3 children)

      by AssCork (6255) on Tuesday February 26 2019, @06:33PM (#807142) Journal

      You forgot to trap for Anime;

      if (isAnimated) if (!hasTentacles) isSafeToAbandonKids:=true

      --
      Just popped-out of a tight spot. Came out mostly clean, too.
      • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @07:32PM

        by Anonymous Coward on Tuesday February 26 2019, @07:32PM (#807193)

        No, I didn't. It would require a parent to know about tentacle porn, which most squares don't. The simple algorithm is how my mother operated.

      • (Score: 3, Touché) by Pino P on Tuesday February 26 2019, @10:43PM (1 child)

        by Pino P (4721) on Tuesday February 26 2019, @10:43PM (#807305) Journal

        if (isAnimated) if (!hasTentacles) isSafeToAbandonKids:=true

        Thanks for disappointing fans of The Little Mermaid.

        • (Score: 2) by c0lo on Wednesday February 27 2019, @02:15AM

          by c0lo (156) Subscriber Badge on Wednesday February 27 2019, @02:15AM (#807425) Journal

          Context, mate, the devil is in the context.

          E.g., how about


          isSafeToAbandonKids:=true
          // pretend we have an algo taking the decision
          if (isAnimated) if (!hasTentacles) isSafeToAbandonKids:=true

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 3, Informative) by takyon on Tuesday February 26 2019, @06:41PM (3 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday February 26 2019, @06:41PM (#807150) Journal

    Yeah, there is reasonable doubt whether the removed videos were specifically targeting young children (although I'm sure some will argue that nobody under 18 or nobody at all should see the EVILLLL Filthy Frank green screen meme). pedimom.com / Dr. Free N. Hess also stands to gain from some good ol' false outrage.

    This is just another indication that YouTube and Google at large are terrible at any sort of curation. They don't want humans in the loop, and they hate that they have to pay human support staff (mechanical turks). This is probably the right approach for colossal social media entities in the end. After all, something like a human lifetime's worth of video is uploaded to YouTube every day (400 hours per minute = 65 years per day). But while machine learning can do some amazing things, it can't act as a superintelligent strong AI. So the AI-based solutions are pissing everyone off by censoring too much and not enough at the same time.

    The YouTube platform has a lot of valuable content on it and is almost too big to fail at this point. YouTube Kids on the other hand could be an easier problem to tackle if it consisted of only whitelisted videos. The problem is that YouTube would rather let the throngs of content creators experiment in order to find what kids will click on tens of millions of times (even if it ends up being Elsa from Frozen sticking a needle in Spiderman's ass [thedailybeast.com]). Remember: advertising to kids is considered several times more valuable than advertising to other demographics (on YouTube at least).
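
    Purely as a sketch of what a whitelist-only YouTube Kids could look like (made-up video IDs and function names, obviously not anything Google actually runs): default-deny, where a video only surfaces after a human curator explicitly adds it.

    # Hypothetical whitelist-only filter: no classifier in the loop.
    APPROVED_FOR_KIDS = {"kids-video-001", "kids-video-002"}  # maintained by human reviewers

    def can_show_to_kids(video_id):
        # Anything not explicitly approved never reaches the app.
        return video_id in APPROVED_FOR_KIDS

    print(can_show_to_kids("kids-video-001"))     # True
    print(can_show_to_kids("random-upload-xyz"))  # False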

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by Pino P on Tuesday February 26 2019, @10:36PM (2 children)

      by Pino P (4721) on Tuesday February 26 2019, @10:36PM (#807303) Journal

      even if it ends up being Elsa from Frozen sticking a needle in Spiderman's ass

      I gathered from a couple other episodes that Elsa Agnarrsdaughter is dating Peter Parker in that series. I haven't seen the one with the injection, so please clue me into what I'm missing: If Peter needs periodic injections of medication that his doctor prescribed to treat a condition, what's wrong with showing that?

      • (Score: 2) by takyon on Tuesday February 26 2019, @11:37PM

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday February 26 2019, @11:37PM (#807337) Journal

        I've only seen some of the highlights; I haven't followed the... lore of that channel, so to speak. 😂

        The injection thing is a bit of a theme in these live action videos and weird kid app games. DailyMotion [dailymotion.com] appears to have a better selection than the same search on YouTube.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by bzipitidoo on Wednesday February 27 2019, @09:01AM

        by bzipitidoo (4388) on Wednesday February 27 2019, @09:01AM (#807518) Journal

        Turn a kid loose on YouTube, and it's only a matter of time before they stumble onto something perverted.

        A lot of people are using action-figure toys or costumes to make cute videos, and most of them are okay. But the 3-year-old found a video of some guy in a Spiderman costume taking a leak outdoors. The camera was behind him so you couldn't see any anatomy, or whether the guy was faking it with a garden hose, but the suggestion was obvious. I stopped that one, but the kid found more questionable videos. There were the superheroes wearing skin-colored paper over their butt cheeks, made to look like the butt cheeks had been cut out of the costumes. Wasn't sure what that was supposed to be about, so I let it be for a while. Finally stopped those as well. Then there were these videos showing food preparation with those cones that are used to put decorative frosting on cakes. They added faces to the cones, animated the faces with that squirmy, uncomfortable look of a child trying to hold in bodily waste, and dubbed fart noises over the audio while the cones were squeezing out frosting, guacamole, or pureed whatever. The 3-year-old thought those were hilarious, which figures.

        Have to keep an eye on YouTube. It's definitely not PBS.

  • (Score: 2) by DeathMonkey on Tuesday February 26 2019, @07:34PM

    by DeathMonkey (1380) on Tuesday February 26 2019, @07:34PM (#807196) Journal

    Wow, that's actually pretty messed up.

    I think there's an interesting discussion to be had about whether these curated services are good or not...

    But if you're advertising your service as kid-friendly, you better make at least a good-faith effort to deliver on that promise, and they have completely failed in that regard.

    Suicide tips stashed in otherwise benign cartoons are just the latest ghastly twist in the corruption of kids’ content on YouTube and YouTube Kids. For years, the video-sharing company has struggled with a whack-a-mole-style effort to keep a variety of disturbing and potentially scarring content out of videos targeting children. Videos have been found with adult content ranging from foul language to depictions of mass shootings, alcohol use, fetishes, human trafficking stories, and sexual situations. Many contain—and attract clicks with—popular cartoon characters, such as Elsa from the 2013 animated Disney film Frozen. This chilling phenomenon has been referred to as Elsagate. Though YouTube has deleted channels and removed videos, Hess points out that it’s still easy to find a plethora of “horrifying” content aimed at children on YouTube Kids.