
posted by janrinok on Friday October 14 2022, @09:21PM
from the what-would-YOU-decide? dept.

Explained: Why a new lawsuit targeting Google & YouTube can potentially change the internet forever - Technology News, Firstpost:

Legislators have often debated whether social media platforms and search result aggregators should be held responsible for objectionable content that users post, which then gets recommended to other users by an algorithm based on those users' interests.

The Supreme Court of the United States of America is now going to consider a case against Google, which may settle the debate and potentially change the internet forever.

The Supreme Court of the US is going to hear the case of Gonzalez v. Google. The case was filed by the parents of Nohemi Gonzalez, who was killed in the 2015 ISIS attack in Paris.

Gonzalez's family is suing Google, claiming that YouTube, which is owned by Google, violated the Anti-Terrorism Act when its algorithm recommended ISIS videos to other users. The complaint states that YouTube not only hosts videos that are used by ISIS to recruit terrorists but also recommends these videos to users, instead of taking them down as per their content moderation policies.

Google and several social media companies have been sued earlier as well for the content that they host on their platforms. However, they have sought protection under Section 230 of the Communications Decency Act, which states that no provider of an interactive computer service "shall be treated as the publisher or speaker of any information" provided by another information content provider, meaning its users.

[...] If Google wins the case, nothing changes. However, if Google loses the case, the ramifications may be huge.

Google, YouTube and several social media platforms have often cited Section 230 and its fundamentals in lawsuits where they have been taken to task for content that they host. It has also allowed them to coyly state that while the algorithm pushes certain types of content, the algorithm itself has no bias and mainly considers what people are engaging with. If Google loses the case, social media platforms will no longer be able to cite Section 230. Moreover, they will be held liable not only for the content they host but also for the content their algorithms recommend.


Original Submission

 
  • (Score: 0) by Anonymous Coward on Friday October 14 2022, @09:34PM (8 children)

    by Anonymous Coward on Friday October 14 2022, @09:34PM (#1276637)

    Just remove the algorithm that "recommends" the vids

    • (Score: 5, Insightful) by tekk on Friday October 14 2022, @09:50PM (7 children)

      by tekk (5704) Subscriber Badge on Friday October 14 2022, @09:50PM (#1276641)

That's exactly the problem at hand, tbh. They want to drive engagement, which involves using all of their personalized metrics and heuristics, but they *also* want to be treated as if they aren't recommending anything, trying to hide behind the mask of "Well, a *human* doesn't recommend it to you personally."

      • (Score: 2, Insightful) by khallow on Friday October 14 2022, @10:07PM (5 children)

        by khallow (3766) Subscriber Badge on Friday October 14 2022, @10:07PM (#1276645) Journal

but they *also* want to be treated as if they aren't recommending anything, trying to hide behind the mask of "Well, a *human* doesn't recommend it to you personally."

        I'm fine with that. You probably should be too.

Keep in mind that this ruling would give companies like Google increased incentive to filter material, and that will bite us should it come to pass. Remember, Google's bread and butter is advertising based on its search engine. They have to recommend or they don't have a product to monetize.

        • (Score: 5, Insightful) by legont on Saturday October 15 2022, @12:30AM (2 children)

          by legont (4179) on Saturday October 15 2022, @12:30AM (#1276659)

Just wait till Google recommending terrorist videos to you is used as evidence that you are a terrorist.

          --
          "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
          • (Score: 1) by khallow on Saturday October 15 2022, @06:07AM

            by khallow (3766) Subscriber Badge on Saturday October 15 2022, @06:07AM (#1276686) Journal

Just wait till Google recommending terrorist videos to you is used as evidence that you are a terrorist.

            Just as easy to do when they have greater control over the recommendations.

          • (Score: 2) by darkfeline on Sunday October 16 2022, @11:10PM

            by darkfeline (1030) on Sunday October 16 2022, @11:10PM (#1276905) Homepage

There's an easy solution: voting for and supporting policies and social norms against government overreach and the nanny state.

            --
            Join the SDF Public Access UNIX System today!
        • (Score: 4, Interesting) by tekk on Saturday October 15 2022, @01:55PM (1 child)

          by tekk (5704) Subscriber Badge on Saturday October 15 2022, @01:55PM (#1276732)

Not particularly. It's still pretty clear to me that Google is exercising editorial power, because we've seen them change "the algorithm" to favor certain classes of content before, and I'm sure we will again. Even if they don't pick specific videos, the engineers inside know what's being recommended and have some degree of control over it.

I deleted it from my previous post for length, but I'd have much less of a problem with their stated stance if the results weren't so personalized. Were I a lawmaker looking at this sort of thing (god forbid), I'd probably suggest that something based on the content rather than the watcher might be appropriate: some sort of weighting of views and similarity to the current video's tags. Maybe even go really crazy with it and let the person who uploaded the video recommend videos for the sidebar, just make it so that they can only recommend 1 or 2 of their own videos or something.
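For illustration only, here is a minimal sketch of the content-based ranking proposed above: sidebar candidates are scored by tag similarity to the current video, weighted by view count, and the uploader may pin a couple of their own picks. The data structures and names are hypothetical; nothing here reflects how YouTube actually works.

```python
# Hypothetical sketch: rank sidebar candidates by tag similarity to the
# current video, damped by views, with no per-viewer signals at all.
from dataclasses import dataclass, field
from math import log1p

@dataclass
class Video:
    video_id: str
    uploader: str
    tags: set[str] = field(default_factory=set)
    views: int = 0

def content_score(current: Video, candidate: Video) -> float:
    """Jaccard similarity of tags, weighted by the log of the view count."""
    if not current.tags or not candidate.tags:
        return 0.0
    overlap = len(current.tags & candidate.tags)
    union = len(current.tags | candidate.tags)
    return (overlap / union) * log1p(candidate.views)

def build_sidebar(current: Video, catalog: list[Video],
                  uploader_picks: list[Video], max_own: int = 2,
                  size: int = 10) -> list[Video]:
    """The uploader may pin up to `max_own` of their own videos; the rest
    of the sidebar is filled purely by tag similarity, ignoring the viewer."""
    pinned = [v for v in uploader_picks if v.uploader == current.uploader][:max_own]
    rest = sorted(
        (v for v in catalog
         if v.video_id != current.video_id and v not in pinned),
        key=lambda v: content_score(current, v),
        reverse=True,
    )
    return pinned + rest[: size - len(pinned)]
```

The relevant property of this sketch is that every viewer of the same video sees the same sidebar, since nothing about the watcher enters the score.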

          • (Score: 1) by khallow on Sunday October 16 2022, @05:47PM

            by khallow (3766) Subscriber Badge on Sunday October 16 2022, @05:47PM (#1276864) Journal
Keep in mind that they are required by law to exercise some degree of editorial power in the US, EU, and other parts of the world. And your proposal to limit this power to the content on display doesn't solve the underlying problem. The problem isn't Google selectively interfering with a subset of people based on user context, but that it can interfere with everyone's view of the internet.
      • (Score: 3, Interesting) by NotSanguine on Saturday October 15 2022, @10:07PM

That's exactly the problem at hand, tbh. They want to drive engagement, which involves using all of their personalized metrics and heuristics, but they *also* want to be treated as if they aren't recommending anything, trying to hide behind the mask of "Well, a *human* doesn't recommend it to you personally."

An interesting point. It brings up a potential argument: while the content is third-party content, and thus protected under CDA Section 230, the recommendations they make are their own first-party content.

        Note that this doesn't address moderation of the third-party content (which is protected under Section 230), but rather the recommendations and arrangement of that third-party content.

        And there is, in fact, some precedent for this, as private entities can claim copyright on publication of third-party content, such as state legal codes [latimes.com].

        As such, it could be argued that the recommendations (i.e., the other content and the order that other content is displayed) are first-party content and not subject to Section 230.

        IANAL, YMMV, etc.

        --
        No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 3, Touché) by SomeGuy on Friday October 14 2022, @10:08PM (1 child)

    by SomeGuy (5632) on Friday October 14 2022, @10:08PM (#1276646)

    Legislators have often debated whether social media platforms and search result aggregators should be held responsible for objectionable content that users post

    So any responsibility for the objectionable content in their ADVERTISING?

    • (Score: 2, Flamebait) by mcgrew on Saturday October 15 2022, @01:10PM

      by mcgrew (701) <publish@mcgrewbooks.com> on Saturday October 15 2022, @01:10PM (#1276723) Homepage Journal

      "Aggregate" means "mixed", not "advertising". The two terms aren't mutually exclusive, but they're not synonyms.

      Google "dictionary". You badly need one.

      --
      Carbon, The only element in the known universe to ever gain sentience
  • (Score: 5, Interesting) by Runaway1956 on Saturday October 15 2022, @12:28AM (6 children)

    by Runaway1956 (2926) Subscriber Badge on Saturday October 15 2022, @12:28AM (#1276658) Homepage Journal

    We already have a huge echo chamber on the internet, and the media giants only reinforce the echoing.

    Youtube is a good example. If I sign in, the recommendations, and the results of searches, are - predictable. Signing out, and/or using a different browser, or going to Youtube from another operating system gives totally different results. This has been so for years. Check your own Youtube experience - don't they keep recommending the same old stuff?

    Come on Youtube - it's true that I watched a dozen versions of 'Gimme Shelter' back to back a couple years ago. That DOES NOT MEAN that I want to watch one or all of those videos every time I sign in!

So, if Google loses, there may be a LOT of stuff they won't recommend, out of an excess of caution. That can only make the echo chamber more suffocating.

    --
Abortion is the number one killer of children in the United States.
    • (Score: 4, Interesting) by legont on Saturday October 15 2022, @12:35AM (5 children)

      by legont (4179) on Saturday October 15 2022, @12:35AM (#1276660)

Even without login, they track me down. What I want most often when I read, say, Google News is to see what the majority of users read, not what Google thinks I want to read. I have other sources for what I want, but I need to compare them to the "mainstream", which is quite impossible these days.

      --
      "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
      • (Score: 3, Informative) by Anonymous Coward on Saturday October 15 2022, @01:05AM (4 children)

        by Anonymous Coward on Saturday October 15 2022, @01:05AM (#1276662)

        Yep. This exactly. I'm on dynamic DNS and I regularly clear out all the cookies and crap. Go to youtube and start watching random videos and it is strikingly obvious when they suddenly link you back to your shadow account. My shadow account for some reason is linked to Sabaton (Dreadnought), Foster the People (Pumped Up Kicks), and Lehto's Law.

        They probably can identify me from the above.

        • (Score: 0) by Anonymous Coward on Saturday October 15 2022, @03:22AM (1 child)

          by Anonymous Coward on Saturday October 15 2022, @03:22AM (#1276682)

          Well I don't know what two of those are, but at least you have good taste in Music :)

          • (Score: 1, Interesting) by Anonymous Coward on Saturday October 15 2022, @04:28AM

            by Anonymous Coward on Saturday October 15 2022, @04:28AM (#1276684)

            Lehto's Law is just this lawyer who talks about funny court cases. I think he does one a day. I watched a few of them once.
            Which music did you like, Sabaton or Foster the People?

        • (Score: 0) by Anonymous Coward on Saturday October 15 2022, @11:31AM

          by Anonymous Coward on Saturday October 15 2022, @11:31AM (#1276709)

          I don't think that's supposed to happen if you clear cookies. Have you tried browsing YouTube in incognito only?

        • (Score: 0) by Anonymous Coward on Sunday October 16 2022, @08:49AM

          by Anonymous Coward on Sunday October 16 2022, @08:49AM (#1276821)

          Try using a VM. So much harder to track, especially when you change the OS every so often.

  • (Score: 3, Insightful) by hopdevil on Saturday October 15 2022, @01:56AM (1 child)

    by hopdevil (3356) on Saturday October 15 2022, @01:56AM (#1276675)

    Algorithms don't kill people; people kill people

    • (Score: 0) by Anonymous Coward on Saturday October 15 2022, @01:11PM

      by Anonymous Coward on Saturday October 15 2022, @01:11PM (#1276724)

      Unless the algorithm is powering a Killer Robot

  • (Score: 2) by Opportunist on Saturday October 15 2022, @08:37AM (2 children)

    by Opportunist (5545) on Saturday October 15 2022, @08:37AM (#1276690)

    Let's face it, people, Google recommends content that it has reason to believe you're interested in. My viewing habits on Youtube steer it to believe that I would be interested in videos of people tinkering and toying with IoT gadgets and building stuff out of Arduinos and the like. It's not because I search for cat videos, ya know...

So if someone gets videos from radical terror groups in their recommendations, it's probably not because they were looking at videos of boy scouts helping grannies across the road.
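By contrast with the content-based sketch earlier, here is a rough sketch of the engagement-driven logic described in the comment above, where the viewer's history rather than the current video drives the ranking. Again, the names and structures are made up purely for illustration, not anything Google has disclosed.

```python
# Hypothetical sketch: build an interest profile from watch history and
# rank candidates by how well they match it. Nothing here looks at the
# candidate videos on their own merits, only at prior engagement.
from collections import Counter

def interest_profile(watch_history: list[set[str]]) -> Counter:
    """Count how often each topic tag appears in the viewer's history."""
    profile: Counter = Counter()
    for tags in watch_history:
        profile.update(tags)
    return profile

def rank_for_viewer(profile: Counter,
                    candidates: list[tuple[str, set[str]]]) -> list[str]:
    """Order candidate (video_id, tags) pairs by how well they match the profile."""
    def match(tags: set[str]) -> int:
        return sum(profile[t] for t in tags)
    return [vid for vid, tags in
            sorted(candidates, key=lambda c: match(c[1]), reverse=True)]

# A history full of Arduino/IoT tinkering keeps producing Arduino/IoT
# recommendations, because the ranking reflects only past engagement.
history = [{"arduino", "iot"}, {"arduino", "esp32"}, {"iot", "home-automation"}]
print(rank_for_viewer(interest_profile(history),
                      [("v1", {"arduino", "esp32"}),
                       ("v2", {"cat-videos"}),
                       ("v3", {"iot"})]))
```

The feedback loop falls straight out of the design: whatever a viewer engaged with before dominates the profile, so it dominates what gets surfaced next.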

    • (Score: 2) by PiMuNu on Saturday October 15 2022, @01:12PM (1 child)

      by PiMuNu (3823) on Saturday October 15 2022, @01:12PM (#1276725)

      True, but what will The Algorithm recommend if, indeed, one does select for videos of boy scouts helping grannies across the road?

  • (Score: 5, Interesting) by mcgrew on Saturday October 15 2022, @01:21PM (2 children)

    by mcgrew (701) <publish@mcgrewbooks.com> on Saturday October 15 2022, @01:21PM (#1276727) Homepage Journal

    Social media. Don't like that the twits at Twitter or the faces at Farcebook banned you for your terrorist posts calling for death to all non-whites? Start your own asshole web site, as cheap as $15 a year and maybe cheaper. Yes, a site like Soylent or Twatter costs a lot more than a simple site like my journal (the regular journal at mcgrew.info, not my S/N journal).

When I first got on the internet back in the '90s, a domain name alone, never mind hosting, was over a hundred bucks to register. Today you would have to be destitute to not be able to afford your own web site.

    --
    Carbon, The only element in the known universe to ever gain sentience
    • (Score: 1, Interesting) by Anonymous Coward on Saturday October 15 2022, @04:25PM (1 child)

      by Anonymous Coward on Saturday October 15 2022, @04:25PM (#1276745)

      https://archive.ph/2wGUA [archive.ph]

      That is now an outdated article following the recent drama. If your little website steps out of line, which means posting content that is controversial in the $CURRENT_YEAR but not illegal under US laws, you will be kicked off by hosts, domain registrars, DDoS protection rackets, banks, payment processors, physical mail providers, CAPTCHA services, and more. Without an effective solution for mitigating DDoS attacks, your site is effectively dead for as long as people are angry enough to rent botnets to attack it.

      This is not a problem for you right now, but if you become too popular and wrongthink is detected in your work, you get to join the club.

      • (Score: 0) by Anonymous Coward on Monday October 17 2022, @05:13AM

        by Anonymous Coward on Monday October 17 2022, @05:13AM (#1276944)
        It's not as broad as you think it is. Most web hosts' terms of service explicitly state that you are not supposed to use their services to incite violence, even if it might not meet the stronger "imminent lawless action" standard of Brandenburg v. Ohio. This is what the majority of those websites who "step out of line" run afoul of. Wake me up when your controversial content doesn't include incitement to violence of this sort. Give one specific example of someone who's been deplatformed over anything other than truly illegal content or general incitement to violence if you don't think this is true.