
posted by Fnord666 on Tuesday February 26 2019, @05:31PM
from the serious-mental-health-issues dept.

Tips for committing suicide are appearing in children's cartoons on YouTube and the YouTube Kids app.

https://arstechnica.com/science/2019/02/youtube-kids-cartoons-include-tips-for-committing-suicide-docs-warn/

YouTube Kids is theoretically a curated collection of videos that are safe for kids. Apparently, someone forgot to tell YouTube/Google that. Sure, YouTube probably has some butt-covering clause in their EULA. That doesn't excuse such an oversight, though. It's very easy to see overworked, stressed parents giving their kids access to YouTube/YouTube Kids. Kids have enough to deal with without having to deal with grown-ups' twisted thoughts. This seems more like insidious minds at work to me. What do you think?


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by Runaway1956 on Tuesday February 26 2019, @05:58PM (25 children)

    by Runaway1956 (2926) Subscriber Badge on Tuesday February 26 2019, @05:58PM (#807116) Journal

    Nahhh, I gotta disagree. I'm pretty insensitive to a lot of this social justice nonsense. Social pressures don't affect me much. I'm the asocial asshole, remember? I just don't care a helluva lot if some twit insists on killing himself. But, this is fucked up. You don't use the internet to target innocent children, trying to encourage them to kill themselves. That's beyond sick. That kind of thing is decidedly anti-social, far beyond my realm of asocial thinking. I can't write that kind of conduct off to Darwin.

    I will agree with your statement, "Parents, watch your children now or watch them die later." It is up to the parents, always has been, always will be.

  • (Score: 5, Funny) by The Mighty Buzzard on Tuesday February 26 2019, @06:09PM (3 children)

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday February 26 2019, @06:09PM (#807122) Homepage Journal

    I'll agree insofar as to say it should not be done deceptively, but that's as far as I can back you. If someone wants to take their life, I'm not going to tell them they shouldn't. I'd prefer they do it efficiently and with as little mess as possible, though. It's just good manners.

    --
    My rights don't end where your fear begins.
    • (Score: 2) by DeathMonkey on Tuesday February 26 2019, @07:27PM (2 children)

      by DeathMonkey (1380) on Tuesday February 26 2019, @07:27PM (#807190) Journal

      If someone wants to take their life, I'm not going to tell them they shouldn't

      YouTube Kids appears to be targeted at 3- to 8-year-olds. Yeah... I think I'm OK with telling 8-year-olds not to kill themselves.

  • (Score: 5, Touché) by urza9814 on Tuesday February 26 2019, @06:13PM (10 children)

    by urza9814 (3954) on Tuesday February 26 2019, @06:13PM (#807126) Journal

    As far as I can tell, it's not really the people posting these videos who claim they are suitable for children; that's entirely up to Google. This also isn't the first time there's been controversy around violent videos being included by Google's automated filters. Some jackass at Google decided that they're so brilliant that they can program an algorithm that can perfectly determine what is suitable for children, and they failed miserably. Multiple times. And yet Google keeps insisting that it works anyway.

    At this point, I don't see how you can blame anyone other than Google. They are well aware of this problem, yet they still keep telling parents that there is no problem. They're sociopaths that will do absolutely anything for a buck.

    • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @06:27PM (4 children)

      by Anonymous Coward on Tuesday February 26 2019, @06:27PM (#807137)

      Some jackass at Google decided that they're so brilliant that they can program an algorithm that can perfectly determine what is suitable for children, and they failed miserably. Multiple times. And yet Google keeps insisting that it works anyway.

      Same as the parent algorithm: if (isAnimated) isSafeToAbandonKids:=true

      • (Score: 3, Funny) by AssCork on Tuesday February 26 2019, @06:33PM (3 children)

        by AssCork (6255) on Tuesday February 26 2019, @06:33PM (#807142) Journal

        You forgot to trap for Anime;

        if (isAnimated) if (!hasTentacles) isSafeToAbandonKids:=true

        --
        Just popped-out of a tight spot. Came out mostly clean, too.
        • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @07:32PM

          by Anonymous Coward on Tuesday February 26 2019, @07:32PM (#807193)

          No I didn't. It would require a parent to know about tentacle porn, which most squares don't. The simple algorithm is how my mother operated.

        • (Score: 3, Touché) by Pino P on Tuesday February 26 2019, @10:43PM (1 child)

          by Pino P (4721) on Tuesday February 26 2019, @10:43PM (#807305) Journal

          if (isAnimated) if (!hasTentacles) isSafeToAbandonKids:=true

          Thanks for disappointing fans of The Little Mermaid.

          • (Score: 2) by c0lo on Wednesday February 27 2019, @02:15AM

            by c0lo (156) Subscriber Badge on Wednesday February 27 2019, @02:15AM (#807425) Journal

            Context, mate, the devil is in the context.

            E.g., how about

            isSafeToAbandonKids:=true
            // pretend we have an algo making the decision
            if (isAnimated) if (!hasTentacles) isSafeToAbandonKids:=true

            --
            https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 3, Informative) by takyon on Tuesday February 26 2019, @06:41PM (3 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday February 26 2019, @06:41PM (#807150) Journal

      Yeah, there is reasonable doubt whether the removed videos were specifically targeting young children (although I'm sure some will argue that nobody under 18 or nobody at all should see the EVILLLL Filthy Frank green screen meme). pedimom.com / Dr. Free N. Hess also stands to gain from some good ol' false outrage.

      This is just another indication that YouTube and Google at large are terrible at any sort of curation. They don't want humans in the loop, and they hate that they have to pay human support staff (mechanical turks). Relying on automation is probably the right approach for colossal social media entities in the end. After all, something like a human lifetime's worth of video is uploaded to YouTube every day (400 hours per minute ≈ 65 years per day). But while machine learning can do some amazing things, it can't act as a superintelligent strong AI. So the AI-based solutions are pissing everyone off by censoring too much and not enough at the same time.
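
      (A quick back-of-the-envelope check of that conversion, sketched in Python; the only input is the 400-hours-per-minute figure quoted above.)

      upload_hours_per_minute = 400
      hours_per_day = upload_hours_per_minute * 60 * 24   # 576,000 hours of video per day
      years_per_day = hours_per_day / (24 * 365)          # roughly 65.8 years of video per day
      print(round(years_per_day, 1))                      # prints 65.8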

      The YouTube platform has a lot of valuable content on it and is almost too big to fail at this point. YouTube Kids on the other hand could be an easier problem to tackle if it consisted of only whitelisted videos. The problem is that YouTube would rather let the throngs of content creators experiment in order to find what kids will click on tens of millions of times (even if it ends up being Elsa from Frozen sticking a needle in Spiderman's ass [thedailybeast.com]). Remember: advertising to kids is considered several times more valuable than advertising to other demographics (on YouTube at least).
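
      (For contrast with the default-allow joke upthread, a minimal sketch in Python of what whitelist-only curation could look like; the names and IDs here are hypothetical, not YouTube's actual API.)

      approved_video_ids = {"abc123", "def456"}   # human-reviewed allowlist (illustrative IDs)

      def is_safe_for_kids(video_id):
          # whitelist model: default-deny; only explicitly reviewed videos pass
          return video_id in approved_video_ids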

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Pino P on Tuesday February 26 2019, @10:36PM (2 children)

        by Pino P (4721) on Tuesday February 26 2019, @10:36PM (#807303) Journal

        even if it ends up being Elsa from Frozen sticking a needle in Spiderman's ass

        I gathered from a couple other episodes that Elsa Agnarrsdaughter is dating Peter Parker in that series. I haven't seen the one with the injection, so please clue me into what I'm missing: If Peter needs periodic injections of medication that his doctor prescribed to treat a condition, what's wrong with showing that?

        • (Score: 2) by takyon on Tuesday February 26 2019, @11:37PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday February 26 2019, @11:37PM (#807337) Journal

          I've only seen some of the highlights; I haven't followed the... lore of that channel, so to speak. 😂

          The injection thing is a bit of a theme in these live-action videos and weird kid-app games. DailyMotion [dailymotion.com] appears to have a better selection than the same search on YouTube.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by bzipitidoo on Wednesday February 27 2019, @09:01AM

          by bzipitidoo (4388) on Wednesday February 27 2019, @09:01AM (#807518) Journal

          Turn a kid loose on YouTube, and it's only a matter of time before they stumble onto something perverted.

          Lots of people are using action-figure toys or costumes to make cute videos, and most of them are okay. But the 3-year-old found a video of some guy in a Spiderman costume taking a leak outdoors. The camera was behind him, so you couldn't see any anatomy, or whether the guy was faking it with a garden hose, but the suggestion was obvious. I stopped that one, but the kid found more questionable videos. There were the superheroes wearing skin-colored paper over their butt cheeks, made to look like the butt cheeks had been cut out of the costumes. I wasn't sure what that was supposed to be about, so I let it be for a while. Finally stopped those as well. Then there were these videos showing food preparation with those cones that are used to put decorative frosting on cakes. They added faces to the cones, animated the faces with that squirmy, uncomfortable look of a child trying to hold in bodily waste, and dubbed fart noises over the audio while the cones were squeezing out frosting, guacamole, or pureed whatever. The 3-year-old thought those were hilarious, which figures.

          Have to keep an eye on YouTube. It's definitely not PBS.

    • (Score: 2) by DeathMonkey on Tuesday February 26 2019, @07:34PM

      by DeathMonkey (1380) on Tuesday February 26 2019, @07:34PM (#807196) Journal

      Wow, that's actually pretty messed up.

      I think there's an interesting discussion to be had about whether these curated services are good or not...

      But if you're advertising your service as kid-friendly, you better make at least a good-faith effort to deliver on that promise, and they have completely failed in that regard.

      Suicide tips stashed in otherwise benign cartoons are just the latest ghastly twist in the corruption of kids’ content on YouTube and YouTube Kids. For years, the video-sharing company has struggled with a whack-a-mole-style effort to keep a variety of disturbing and potentially scarring content out of videos targeting children. Videos have been found with adult content ranging from foul language to depictions of mass shootings, alcohol use, fetishes, human trafficking stories, and sexual situations. Many contain—and attract clicks with—popular cartoon characters, such as Elsa from the 2013 animated Disney film Frozen. This chilling phenomenon has been referred to as Elsagate. Though YouTube has deleted channels and removed videos, Hess points out that it’s still easy to find a plethora of “horrifying” content aimed at children on YouTube Kids.

  • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @06:43PM

    by Anonymous Coward on Tuesday February 26 2019, @06:43PM (#807151)

    I am personally disappointed in you for going down this "think of the children" rat hole.

  • (Score: 3, Insightful) by DannyB on Tuesday February 26 2019, @06:45PM (7 children)

    by DannyB (5839) Subscriber Badge on Tuesday February 26 2019, @06:45PM (#807153) Journal

    I agree with the point that this is not Darwin at all. This is some sick people. And generally we no longer seem to identify them as sick. We just call them trolls, or we laugh it off.

    The people doing this may not realize how sick it is. They may think it is funny. Like the Tide Pod Challenge. Or a Bottled Water Taste Contest.

    Their inability to realize they have crossed a line says something. You don't put suicide instructions into a children's video. What kind of sick person does that?

    --
    The people who rely on government handouts and refuse to work should be kicked out of congress.
    • (Score: 1, Interesting) by Anonymous Coward on Tuesday February 26 2019, @06:56PM

      by Anonymous Coward on Tuesday February 26 2019, @06:56PM (#807168)

      What kind of sick person does that?

      Anybody that's interested in mocking the faux outrage coming from the people here and there. Remember the torture and murder your tax dollars buy overseas! And down south... You people elect child killing psychos every two years for *economic growth*. Where's the beef?

    • (Score: 2) by takyon on Tuesday February 26 2019, @07:03PM (1 child)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday February 26 2019, @07:03PM (#807174) Journal

      You don't put suicide instructions into a children's video.

      What is a children's video? Some kids, or perhaps millions of kids, are watching YouTube without any supervision at all. They are coming across all kinds of edgy content.

      Here [youtube.com] we have what looks like a Splatoon "machinima" [wikipedia.org] type of video with the offensive green screen clip edited in (not spliced into a random video, but intentionally placed as part of the video). This content managed to sneak into YouTube Kids, a supposedly safer walled garden within YouTube for young children, because GooTube uses dumb machine learning algorithms to do almost everything on the platform.

      I would not be surprised if the entire video, complete with the "sideways for attention, longways for results" meme, was created and uploaded by a 12-year-old child.

      But yeah, if you are worried about your kids turning weird or dying, keep them far away from the Internet and YouTube. Just search "elsa spiderman" to find some of the disturbing content that is like crack cocaine or heroin for kids.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by DannyB on Tuesday February 26 2019, @08:53PM

        by DannyB (5839) Subscriber Badge on Tuesday February 26 2019, @08:53PM (#807249) Journal

        What is a children's video?

        and then

        Some kids, or perhaps millions of kids, are watching YouTube without any supervision at all.

        Those are two unrelated things.

        A children's video is intended for an audience of children. That's what a children's video is. Some topics, like putting suicide instructions into a child's video, might be a funny joke for an episode of South Park, but the intended audience of South Park (hopefully) is not children. (And I haven't watched it for over ten years now.)

        Letting children roam busy streets or YouTube unsupervised is a lack of parental responsibility.

        Also the internet is not a babysitter. Back in the day, a VHS tape could entertain a child for a short time. Today the internet seems to be used for that, but with no upper limit on time. Here's a bright shiny kid's tablet, have fun!

        --
        The people who rely on government handouts and refuse to work should be kicked out of congress.
    • (Score: 2, Interesting) by DeathMonkey on Tuesday February 26 2019, @07:36PM

      by DeathMonkey (1380) on Tuesday February 26 2019, @07:36PM (#807199) Journal

      And generally we no longer seem to identify them as sick. We just call them trolls, or we laugh it off.

      Or, they're just being themselves....

      The claim that all the racists on the internet are just pretend trolls suffers from a severe lack of evidence.

    • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @09:48PM

      by Anonymous Coward on Tuesday February 26 2019, @09:48PM (#807279)

      What kind of sick person does that?

      The kind of sick person(s) who hate your country, your standards, your way of life, your values, your beliefs, etc. If they can destroy your kids and thus destroy you, they win. They are also the despicable lot who run scams of all sorts by phone, email, and letter, and who want to invade your country, usually disguised as 'refugees'. It also applies at a higher level to state players who want to destabilize your society, democracy, and way of life so their side wins (CN, RU, etc.). And all these online empires that are accountable to nobody are enabling them to succeed....

    • (Score: 3, Insightful) by driverless on Wednesday February 27 2019, @01:22AM (1 child)

      by driverless (4770) on Wednesday February 27 2019, @01:22AM (#807401)

      You don't put suicide instructions into a children's video.

      Which kid would even recognise what they were, let alone act on them?

      What kind of sick person does that?

      Someone trolling for instant outrage and mass media coverage.

      It's worked brilliantly, hasn't it?

      • (Score: 2) by DannyB on Wednesday February 27 2019, @02:44PM

        by DannyB (5839) Subscriber Badge on Wednesday February 27 2019, @02:44PM (#807607) Journal

        Someone trolling for instant outrage and mass media coverage.

        It's worked brilliantly, hasn't it?

        That would not invalidate the assertion that they are sick people.

        --
        The people who rely on government handouts and refuse to work should be kicked out of congress.
  • (Score: 0) by Anonymous Coward on Wednesday February 27 2019, @01:45AM

    by Anonymous Coward on Wednesday February 27 2019, @01:45AM (#807414)

    But, this is fucked up. You don't use the internet to target innocent children, trying to encourage them to kill themselves.

    Ahh, you're just old. I wouldn't do this to kids, but it's really worse for concerned adults than anything. I'll bet you most of the people doing it aren't even adults.
    I wouldn't do this now, but I know that from ages 11 to 17 I would have thought this was funny as fuck. Younger than that, I couldn't say what my reaction would have been, but I doubt I would have cared. By 8 I'd watched Elmer Fudd, Daffy Duck, and Bugs Bunny hang, shoot, poison, and decapitate themselves hundreds of times.
    Watching someone die on TV or in a video game or whatever wasn't anything compared to the abuse my wholesome-seeming, whitebread, middle-class suburban family dumped on me right in front of teachers, school counselors, therapists, and neighbors, and nobody lifted a finger or raised an eyebrow. They were, however, the sorts of people who would make a big deal about protecting me from this.

    It's important as we age to remember the reality of being young, or you'll find yourself choking your 8-year-old while you wonder if his bad behavior is from Ren and Stimpy or R-rated movies.