posted by janrinok on Sunday November 17 2024, @05:32AM

From the horse's own mouth:

The Guardian has announced it will no longer post content on Elon Musk's social media platform, X, from its official accounts.

In an announcement to readers, the news organisation said it considered that the benefits of being on the platform formerly known as Twitter were now outweighed by the negatives, citing the "often disturbing content" found on it.

"We wanted to let readers know that we will no longer post on any official Guardian editorial accounts on the social media site X," the Guardian said.
...
Responding to the announcement, Musk posted on X that the Guardian was "irrelevant" and a "laboriously vile propaganda machine".

Last year National Public Radio (NPR), the non-profit US media organisation, stopped posting on X after the social media platform labelled it as "state-affiliated media". PBS, a US public TV broadcaster, suspended its posts for the same reason.

This month the Berlin film festival said it was quitting X, without citing an official reason, and last month the North Wales police force said it had stopped using X because it was "no longer consistent with our values".

In August the Royal National Orthopaedic Hospital said it was leaving X, citing an "increased volume of hate speech and abusive commentary" on the platform.


This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Monday November 18 2024, @05:52PM (#1382324) (6 children)

    Well, what do you expect? It's khallow.

    He talks out of his ass so frequently that I'm surprised when he actually makes sense.

    Whatever.

  • (Score: 0, Flamebait) by khallow (3766) Subscriber Badge on Monday November 18 2024, @08:53PM (#1382360) Journal (5 children)
    I quoted the relevant part of the Section 230 law itself and you're still bullshitting. If you aren't going to act serious, I'm not going to treat you seriously.
    • (Score: 0) by Anonymous Coward on Monday November 18 2024, @09:09PM (#1382365)
    • (Score: 0) by Anonymous Coward on Monday November 18 2024, @09:13PM (#1382366) (3 children)

      Quote what you like, but relevant jurisprudence says you're flat wrong. And you will continue to be. Hilariously so. And we'll keep laughing at you. So go ahead and quadruple down with your ridiculousness [techdirt.com]:

      If you said “Section 230 requires all moderation to be in ‘good faith’ and this moderation is ‘biased’ so you don’t get 230 protections”

      You are, yet again, wrong. At least this time you’re using a phrase that actually is in the law. The problem is that it’s in the wrong section. Section (c)(2)(A) does say that:

              No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

      However, that’s just one part of the law, and as explained earlier, nearly every Section 230 case about moderation hasn’t even used that part of the law, instead relying on Section (c)(1)’s separation of an interactive computer service from the content created by users. Second, the good faith clause is only in half of Section (c)(2). There’s also a separate section, which has no good faith limitation, that says:

              No provider or user of an interactive computer service shall be held liable on account of… any action taken to enable or make available to information content providers or others the technical means to restrict access to material….

      So, again, even if (c)(2) applied, most content moderation could avoid the “good faith” question by relying on that part, (c)(2)(B), which has no good faith requirement.

      However, even if you could somehow come up with a case where the specific moderation choices were somehow crafted such that (c)(1) and (c)(2)(B) did not apply, and only (c)(2)(A) were at stake, even then, the “good faith” modifier is unlikely to matter, because a court trying to determine what constitutes “good faith” in a moderation decision is making a very subjective decision regarding expression choices, which would create massive 1st Amendment issues. So, no, the “good faith” provision is of no use to you in whatever argument you’re making.

      • (Score: 1) by khallow (3766) Subscriber Badge on Monday November 18 2024, @09:52PM (#1382378) Journal (2 children)
        That assumes (c)(2)(B) applies - like users censoring other posts with moderation. When it doesn't, like when you have a third party doing censoring on your site, then you're stuck with (c)(2)(A).
        • (Score: 0) by Anonymous Coward on Monday November 18 2024, @10:21PM (#1382387) (1 child)

          That assumes (c)(2)(B) applies - like users censoring other posts with moderation. When it doesn't, like when you have a third party doing censoring on your site, then you're stuck with (c)(2)(A).

          You mean like when people with a clue mod you down? Does that mean you think you can hold SoylentNews liable for third-party moderations? Or that you could sue SoylentNews for deleting your posts altogether? Your "reasoning" isn't. It's just making unsupported claims that something you *wish* was so is actually so. Sorry, thinking doesn't make it so.

          And there's no jurisprudence to support your bullshit either.

          Go ahead. Show me one case. Even one case where a judge has ruled that way. Just one.

          CDA was enacted 28 years ago. If that were actually the law we should have seen at least one case where that happened.

          I won't hold my breath.

          • (Score: 1) by khallow (3766) Subscriber Badge on Monday November 18 2024, @10:41PM (#1382392) Journal

            You mean like when people with a clue mod you down? Does that mean you think you can hold SoylentNews liable for third-party moderations? Or that you could sue SoylentNews for deleting your posts altogether? Your "reasoning" isn't. It's just making unsupported claims that something you *wish* was so is actually so. Sorry, thinking doesn't make it so.

            My post said the opposite of what you assert above. No, I don't mean that, and if you had read my post you would have known that.

            That assumes (c)(2)(B) applies - like users censoring other posts with moderation. When it doesn't, like when you have a third party doing censoring on your site, then you're stuck with (c)(2)(A).

            I find it bizarre that you even quote it and still don't get what it says. Combine that inability to read with the fact that you've now reposted the TechDirt link six times rather than argue in good faith, and it's clear you don't have a serious argument.

            I guess I'll just have to talk to the grown ups in the thread instead.