
SoylentNews is people

posted by martyb on Tuesday April 11 2017, @07:41PM   Printer-friendly
from the so-long-and-thanks-for-all-the-fish? dept.

This piece of news over at Ars Technica may have some startling implications.

The Digital Millennium Copyright Act's so-called "safe harbor" defense to infringement is under fire from a paparazzi photo agency. A new court ruling says the defense may not always be available to websites that host content submitted by third parties.

A LiveJournal community hosted photos of celebrities, and the paparazzi agency Mavrix, which owns some of those photos, took exception. Since the site moderated the posts that appeared, the appeals court ruled that simply invoking "safe harbor" is insufficient - a court should investigate the extent to which the moderators curated the submissions.

As the MPAA wrote in an amicus brief:

If the record supports Mavrix’s allegations that LiveJournal solicited and actively curated posts for the purpose of adding, rather than removing, content that was owned by third parties in order to draw traffic to its site, LiveJournal would not be entitled to summary judgment on the basis of the safe harbor...

It's hard to argue with that: a site that actively solicits and then posts content owned by others seems to fall afoul of current copyright legislation in the USA.

But I can't help thinking of the impact this may have on SoylentNews... if left to stand, this ruling could make running a site such as SN a very tricky line to walk.


Original Submission

 
  • (Score: 3, Insightful) by jmorris on Tuesday April 11 2017, @08:15PM (2 children)

    by jmorris (4844) on Tuesday April 11 2017, @08:15PM (#492430)

In the old world, where six companies control pretty much every movie, TV show, magazine and book, establishing and maintaining a unified Narrative was and is easy. In the Internet world there are many publishers, and control of a Narrative is difficult. One of these worlds has a future and one does not. Which one survives is still a very live question; this case is but one attempt to kill the new and save the old. Active moderation under strict liability for any misses is not financially viable and any large community with zero moderation becomes 4chan and not viable. They hope to make those the only two choices available and win.

  • (Score: 2) by kaszz on Tuesday April 11 2017, @11:55PM

    by kaszz (4211) on Tuesday April 11 2017, @11:55PM (#492536) Journal

Moderation can be enabled on an external basis by the same means as crypto certs are revoked. Every user posts a signed list of identifiers for posts they think are junk. Every reader configures their reader software to use lists from users they think have good judgement. Voilà: moderation has been accomplished without the site itself participating.

    (lists are kept in a separate forum to be posted and read by machine software only)
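[Ed. note: a minimal sketch of the scheme described above, in Python. HMAC with a per-user key stands in for the real public-key signatures the comment implies, and all names (`sign_junk_list`, `filter_posts`, the post IDs) are hypothetical; this is an illustration of the idea, not an implementation anyone has shipped.]

```python
import hmac
import hashlib
import json

def sign_junk_list(user_key: bytes, junk_ids: list) -> dict:
    """Publish a signed list of post IDs the user considers junk."""
    payload = json.dumps(sorted(junk_ids)).encode()
    sig = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return {"junk_ids": sorted(junk_ids), "sig": sig}

def verify_junk_list(user_key: bytes, record: dict) -> bool:
    """Check that a junk list really came from the keyholder."""
    payload = json.dumps(record["junk_ids"]).encode()
    expected = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

def filter_posts(posts: dict, trusted: dict) -> dict:
    """Hide posts flagged by any trusted user whose list verifies.

    posts:   post_id -> body
    trusted: username -> (key, signed_record)
    """
    hidden = set()
    for user, (key, record) in trusted.items():
        if verify_junk_list(key, record):  # silently skip bad signatures
            hidden.update(record["junk_ids"])
    return {pid: body for pid, body in posts.items() if pid not in hidden}

# A reader who trusts only "alice" applies her junk list:
alice_key = b"alice-secret"
alice_list = sign_junk_list(alice_key, ["p2"])
posts = {"p1": "useful", "p2": "spam", "p3": "useful"}
visible = filter_posts(posts, {"alice": (alice_key, alice_list)})
print(sorted(visible))  # ['p1', 'p3']
```

The site never touches the lists; each reader chooses whose judgement to import, which is exactly the property that would keep the operator out of the curation business.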

  • (Score: 2) by urza9814 on Wednesday April 12 2017, @04:05PM

    by urza9814 (3954) on Wednesday April 12 2017, @04:05PM (#492836) Journal

    Active moderation under strict liability for any misses is not financially viable and any large community with zero moderation becomes 4chan and not viable. They hope to make those the only two choices available and win.

    No. It means *bad* moderation is not viable.

    It means that if you are actively going and checking every post and deciding exactly what you want to display on your site, then yes, you are responsible for what you choose to display. That seems pretty basic, and I don't really understand why it's in dispute here.

But that doesn't prohibit a system like SN uses, or how Amazon.com moderates their product reviews. You don't hide anything (unless it's reported and found to violate whatever terms, etc., etc. -- moderating only flagged posts is substantially different from moderating every single submission). Instead, you give users extra information and provide a system where they can use that information to perform any moderation themselves. Or, even better, you could put together a "web of trust" type system. The point is that if you have a single provider totally controlling everything that gets posted, then it all gets treated as though it was posted by that single provider. If the users themselves are in control, then the company isn't liable for merely supporting their communication.
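[Ed. note: the reader-controlled moderation described above can be sketched in a few lines of Python. The score values and threshold are hypothetical; the point is only that the reader, not the site, decides what is hidden.]

```python
def visible_posts(posts, threshold):
    """Return IDs of posts whose community score meets the reader's threshold.

    posts: list of (post_id, score) pairs, where score is the aggregate
           of user moderations; the site stores scores but hides nothing.
    """
    return [pid for pid, score in posts if score >= threshold]

# Each reader picks their own threshold:
posts = [("c1", 3), ("c2", -1), ("c3", 1)]
print(visible_posts(posts, 1))   # ['c1', 'c3']
print(visible_posts(posts, -5))  # ['c1', 'c2', 'c3']
```

Nothing is ever deleted server-side: a reader with threshold -5 sees everything, which is the distinction the comment draws between user-driven filtering and provider curation.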

    This decision is actually a further strike AGAINST the ability of major content providers to control the narrative. They can remove comments and therefore reduce content, or they can give up all hope of controlling the narrative within those comments. But they can no longer attempt to control the narrative within those comments without exposing themselves to liability for whatever gets posted.