
SoylentNews is people

posted by martyb on Tuesday April 11 2017, @07:41PM   Printer-friendly
from the so-long-and-thanks-for-all-the-fish? dept.

This piece of news over at Ars Technica may have some startling implications.

The Digital Millennium Copyright Act's so-called "safe harbor" defense to infringement is under fire from a paparazzi photo agency. A new court ruling says the defense may not always be available to websites that host content submitted by third parties.

A LiveJournal community hosted posts containing celebrity photos, and a paparazzi agency that owns some of those photos took exception. Since the site moderated the posts that appeared, the appeals court ruled that just shouting "safe harbor" is insufficient - the court should investigate the extent to which the moderators curated the input.

As the MPAA wrote in an amicus brief:

If the record supports Mavrix’s allegations that LiveJournal solicited and actively curated posts for the purpose of adding, rather than removing, content that was owned by third parties in order to draw traffic to its site, LiveJournal would not be entitled to summary judgment on the basis of the safe harbor...

It's hard to argue with that: a site that actively solicits and then posts content owned by others seems to fall afoul of current copyright legislation in the USA.

But I can't help thinking of the impact this may have on SoylentNews. If left to stand, this ruling could make running a site such as SN a very tricky line to walk.


Original Submission

 
  • (Score: 2) by urza9814 (3954) on Wednesday April 12 2017, @04:05PM (#492836) Journal

    Quoting the parent comment: "Active moderation under strict liability for any misses is not financially viable, and any large community with zero moderation becomes 4chan and is not viable. They hope to make those the only two choices available and win."

    No. It means *bad* moderation is not viable.

    It means that if you are actively reviewing every post and deciding exactly what you want to display on your site, then yes, you are responsible for what you choose to display. That seems pretty basic, and I don't really understand why it's in dispute here.

    But that doesn't prohibit a system like the one SN uses, or the way Amazon.com moderates its product reviews. You don't hide anything (unless it's reported and found to violate the site's terms; moderating only flagged posts is substantially different from moderating every single submission). Instead, you give users extra information and provide a system where they can use that information to perform the moderation themselves. Better still, you could put together a "web of trust" type system. The point is that if a single provider totally controls everything that gets posted, then it all gets treated as though it was posted by that single provider. If the users themselves are in control, then the company isn't liable for merely supporting their communication.
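    The distinction the comment draws can be sketched in a few lines. This is a hypothetical illustration, not SoylentNews' actual implementation: the `Comment` class, the -1..5 score range, and the threshold filter are assumptions modeled on Slash-style moderation. The operator never curates; users assign moderation points, and each reader chooses a personal display threshold.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Comment:
        author: str
        body: str
        score: int = 1  # every comment starts visible at the base score

        def moderate(self, delta: int) -> None:
            # Users, not the site operator, adjust the score
            # (e.g. +1 Insightful, -1 Troll); clamp to a -1..5 range.
            self.score = max(-1, min(5, self.score + delta))

    def visible(comments, threshold):
        # Each reader picks a personal threshold; nothing is deleted,
        # low-scored comments are merely hidden for that reader.
        return [c for c in comments if c.score >= threshold]

    posts = [Comment("alice", "Good point"), Comment("bob", "flamebait")]
    posts[0].moderate(+1)   # another user mods this up
    posts[1].moderate(-2)   # two users mod this down
    print([c.author for c in visible(posts, threshold=1)])  # ['alice']
    ```

    The key design point: the site ships the same data to everyone and each user applies their own filter, so no single party decides what is displayed.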

    This decision is actually a further strike AGAINST the ability of major content providers to control the narrative. They can remove comments entirely and thereby reduce content, or they can give up on controlling the narrative within those comments; what they can no longer do is curate those comments without exposing themselves to liability for whatever gets posted.
