

posted by janrinok on Sunday July 19 2015, @12:20PM
from the timely-discussion dept.

We recently discussed reddit's woes and the hiring of a new CEO. However, we have seen communities come and go for many years.

Clay Shirky wrote about Communitree, an early online community launched in 1978: "Communitree was founded on the principles of open access and free dialogue... And then, as time sets in, difficulties emerge. In this case, one of the difficulties was occasioned by the fact that one of the institutions that got hold of some modems was a high school. ... the boys weren't terribly interested in sophisticated adult conversation. They were interested in fart jokes. They were interested in salacious talk. ... the adults who had set up Communitree were horrified, and overrun by these students. The place that was founded on open access had too much open access, too much openness. They couldn't defend themselves against their own users. The place that was founded on free speech had too much freedom."

There are two clear trends. One is that platforms offering less input and customization tend to grow bigger. Note how Geocities was replaced by Myspace, which was then replaced by Facebook and Twitter. These newer systems take away personal freedom of expression and make people follow a 'prescribed' system, albeit an easier one to use. The other trend is that communities that try to be truly free and open end up either stifled by that openness or give up. The only obvious exception is a platform that lets us simply filter out everything we don't want to see, which devolves into a series of the feared echo chambers. With the excessive amount of data and the build-up of complex rules on how information is shared, where does this leave us? It seems that, like the famous iron triangle, the combination of free (and legal) speech with the possibility of diverse opinions, a cohesive group, and growth only lets you pick two.

It seems to me this is a wicked problem, perhaps unsolvable. But I wonder if the community thinks there are other design options? Is this even possible with human nature as it is?


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by Geotti on Sunday July 19 2015, @12:52PM

    by Geotti (1146) on Sunday July 19 2015, @12:52PM (#211030) Journal

    Only thing I could imagine is granting more freedom as we (as individuals and as a society) mature, or have a karma system where abusers are eventually filtered out by the system, i.e. censorship in both cases: one voluntarily and the other mandatory.

  • (Score: 3, Interesting) by Anonymous Coward on Sunday July 19 2015, @01:41PM

    by Anonymous Coward on Sunday July 19 2015, @01:41PM (#211046)

    I see your idea and think it is a great starting point but I'm not convinced it would work.

    To draw a parallel, there is a reason that juvenile justice systems exist. When you're still young, you are impulsive and you don't really know yet what you're doing or what the impact of your actions will be. In a way, we want to allow you to fuck up a bit with the ability to wipe the slate clean. We have built in this ability for leniency. And I think it has been shown time and time again that leniency is a good thing. In a way, you actually need to give _more_ freedom to those who don't know yet how to deal with it so that they become accustomed to it. Maybe it's a bit like telling your kid "Sure, you can eat *all* of your halloween candy in one go, but if you feel sick afterwards, that's your own fault... however, that doesn't mean I will take away all your candy if you do get sick, I'll still let you have candy".
    I think your idea of using karma is very interesting but I'm eager to see the implementation of it and the rules applied by it.

    Let me give you an idea of how exposure to 'things' is a good thing: look at the accessibility of alcohol in the US compared to a European country. In the European country, kids grow up with alcohol, they learn about it early on, and unlike in the US there is not really this hard line of '21 years of age'. In the US, at 21 years of age, the floodgates suddenly open with respect to access to alcohol and people drink themselves into a coma. This doesn't happen in Europe because kids there grow up with it and get accustomed to it.
    I think a similar thing needs to happen with free speech.

    Lastly, I would like to state categorically that I am against any form of censorship. I consider hateful, repulsive, or any other kind of undesirable speech as something that should never be suppressed, if for nothing more than to let us easily identify who the morons and retards of our society are.
    And then there is one freedom that comes even before freedom of speech. It is a freedom that literally no one - not even a nation-state - can take away from you, and that freedom is the freedom to be offended; it is *not* the freedom to not be offended! But you have this freedom to be offended and so do I, no one can take it away from us.
    So next time someone's speech offends you, say "thank you" and exercise your right to be offended. But don't expect the other party to stop offending you because that would be taking away one of your core rights. And just like you can be offended by what I say, so I may exercise my right to be offended by something you say, but that doesn't mean you should be barred from saying it.

    • (Score: 0) by Anonymous Coward on Sunday July 19 2015, @01:44PM

      by Anonymous Coward on Sunday July 19 2015, @01:44PM (#211048)

      When you're still young, you are impulsive and you don't really know yet what you're doing or what the impact of your actions will be.

      Sadly, you describe most adults, to only a slightly lesser degree. Or maybe the species itself is too young.

    • (Score: 3, Interesting) by Geotti on Sunday July 19 2015, @05:07PM

      by Geotti (1146) on Sunday July 19 2015, @05:07PM (#211122) Journal

      Unfortunately, the problem with hate speech by talented and charismatic individuals is that it can affect a critical mass of people who decide to change the system in a "bad" way, so a line has to be drawn to preserve the development of our society in a "positive" direction. I wouldn't know where, or how, to draw the line, though.

      • (Score: 3, Informative) by Ethanol-fueled on Sunday July 19 2015, @08:06PM

        by Ethanol-fueled (2792) on Sunday July 19 2015, @08:06PM (#211165) Homepage

        As Soylent News' resident dickhead, I think that it's pretty obvious that you cannot have free speech and be insanely profitable at the same time. Look at Slashdot and Reddit as perfect examples of admins fighting the community over free-speech issues.

        To host a truly free-speech forum you have to do it for the love of it, and ideally you'd take it in the neck to have a lawyer on retainer as the staff have discussed. I have actually considered leaving SN a few times because I am starting to feel like I do not belong here, there's no sport in trolling this place...it's like hunting squirrels with a rocket launcher. That's not to say that this isn't a good place, because it is. The discussions are great and the staff are great and I am happy to be a paid subscriber. But I am increasingly feeling like an outsider and sometimes I feel that my presence here is divisive and counterproductive.

        But I am an asshole, and will always be one. That's not negotiable.

        • (Score: 2) by Marand on Sunday July 19 2015, @08:27PM

          by Marand (1081) on Sunday July 19 2015, @08:27PM (#211169) Journal

          But I am increasingly feeling like an outsider and sometimes I feel that my presence here is divisive and counterproductive.

          I'd rather have that than an echo chamber. We need the conflict of dissenting opinions, even minority ones and trolls, to keep us thinking about our own opinions. I know, some people just blindly ignore or argue with anything they find disagreeable without thought, but that isn't everyone.

        • (Score: 2, Informative) by Absolutely.Geek on Monday July 20 2015, @03:44AM

          by Absolutely.Geek (5328) on Monday July 20 2015, @03:44AM (#211271)

          Don't go; your posts are some of the most worthwhile to read......maybe not the most informative, but often a humorous counterpoint to something serious that highlights a different view....albeit in an expletive-filled way.

          The reason to have protected speech is to have the freedom to offend; if there were no offensive speech there would be no need to protect speech. Trying to protect the masses from one point of view or another is flawed; and depending on your position it is probably called propaganda.

          BTW I don't mind trolls as long as they are trolling on topic; trolling off topic is just idiocy. Sometimes a good troll can really get a debate going, allowing us to find a path to resolution or the old "agree to disagree".

          --
          Don't trust the police or the government - Shihad: My mind's sedate.
        • (Score: 4, Insightful) by mojo chan on Monday July 20 2015, @07:40AM

          by mojo chan (266) on Monday July 20 2015, @07:40AM (#211320)

          Being an asshole is fine. Reddit is fine with that, Slashdot is fine with that, Soylent is fine with it. The problem is when people start to break the law, or drive others away from the site.

          Breaking the law is an easy one. If you post material that is illegal where the site is operated from, or make illegal threats, or use the platform to commit crimes, I don't think many people would dispute that the content needs to be removed. It's not a free speech issue insofar as the site must comply with the law, although you could argue that the government is suppressing you by making laws. Most of that stuff is pretty clear cut though: child exploitation, posting private information, harassment, etc.

          Driving others out of the discussion is harder to define. The test has to be what a "reasonable" person would think. For example, spamming threads on race issues with masses of copy/pasted white supremacist ranting and images is likely to stifle that debate. You could argue that it is a person's right to express themselves by spamming in that way, but it's also the site operator's right not to give them a platform or listen to them. If the site operator wants an inclusive debate they are within their rights to block that kind of material. Like any debate it's impossible if one person is preventing others from taking part by screaming/spamming constantly. So it's really a question of whether you want an 8chan-style forum where rational debate is impossible, or somewhere like Soylent where the moderation system suppresses posts that are detrimental to the discourse.

          --
          const int one = 65536; (Silvermoon, Texture.cs)
          • (Score: 0) by Anonymous Coward on Monday July 20 2015, @01:39PM

            by Anonymous Coward on Monday July 20 2015, @01:39PM (#211406)

            The problem is when people start to break the law, or drive others away from the site.

            there are technical solutions to this that a site like SN could adopt.

            1. host in a country that's not overrun by lawyers (like some small Pacific island nation)
            2. prevent posting of anything but plain text (a picture may say a thousand words, but if you can't make a concise point then tl;dr)
            3. maybe prevent posting of hyperlinks, except maybe from sites like wikipedia
            4. point out to users in the terms of use/service that you are free to post what you like, but others are too so if you don't like what others are saying, get the fuck over it or gtfo
            5. make it easy for users to ignore other users that offend/troll/stalk/spam them
            6. build a more intelligent moderation system that is more personalized and can adapt based on words in comments by users that you ignore, words in comments that you reply to, etc (this is probably the most difficult but might be the critical ingredient for a killer forum; a rough sketch follows below)
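
            for #6, a rough python sketch of the kind of thing i mean (word lists, names and thresholds are all made up; it just learns from who you ignore vs. who you reply to, nothing clever):

                from collections import Counter
                import math
                import re

                def tokenize(text):
                    # lowercase word tokens; a deliberately crude stand-in for real text processing
                    return re.findall(r"[a-z']+", text.lower())

                class PersonalFilter:
                    # learns from comments by posters you ignore (negative signal) and posters
                    # you reply to (positive signal), then scores new comments with a
                    # naive-bayes-style log-likelihood ratio; smoothing and thresholds are arbitrary
                    def __init__(self):
                        self.good = Counter()   # words from comments you engaged with
                        self.bad = Counter()    # words from comments by ignored users

                    def learn(self, text, ignored):
                        (self.bad if ignored else self.good).update(tokenize(text))

                    def score(self, text):
                        # positive = looks like stuff you engage with, negative = looks ignored
                        g_total = sum(self.good.values()) + 1
                        b_total = sum(self.bad.values()) + 1
                        s = 0.0
                        for w in tokenize(text):
                            p_good = (self.good[w] + 1) / g_total   # add-one smoothing
                            p_bad = (self.bad[w] + 1) / b_total
                            s += math.log(p_good / p_bad)
                        return s

                # usage: hide anything scoring below a user-chosen threshold
                f = PersonalFilter()
                f.learn("first post lol fart joke spam spam", ignored=True)
                f.learn("interesting point about moderation systems and karma", ignored=False)
                print(f.score("another fart joke"))           # likely negative
                print(f.score("a thoughtful take on karma"))  # likely positive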

            point is that it shouldn't be possible to break the law by merely posting a text comment. if it is, then the law is incompatible with free speech, which is why i wouldn't host a free speech web forum in the USA or a number of other countries that purport to promote free speech but lock away anyone they don't like for supposedly promoting extremist views or engaging in terrorism or some shit

            also, if anyone is offended by someone else's comments to the point where they aren't satisfied by merely ignoring them but feel the need to impose their own will on the other person's right to free speech, then that person is free to fuck off

            this is one reason why as much as i hate greedy neocon chickenhawk republicans, i really hate progressive socialists because while they claim to want a fair go for all, their real motivation appears to be to take (by force no less) the hard earned income of working people via taxation and give it to whomever they see fit, just so they can feel like they have made the world a better place and have warm fuzzies inside. fuck that! i don't give a shit what your ideology is, if you think forcing anyone else to partake in your master plan is righteous you are an asshole and deserve to die in a fire. freedom is for everyone without exception; not everyone except for those you don't agree with/don't like/are jealous of/etc.

  • (Score: 1) by number11 on Sunday July 19 2015, @03:47PM

    by number11 (1170) Subscriber Badge on Sunday July 19 2015, @03:47PM (#211090)

    Only thing I could imagine is granting more freedom as we (as individuals and as a society) mature, or have a karma system where abusers are eventually filtered out by the system, i.e. censorship in both cases: one voluntarily and the other mandatory.

    The karma thing might work, if it has a way to keep retesting the user. E.g. provide a positive bias, add 0.2 (or whatever) points per month, so that someone who has committed karma suicide is gradually given a chance to redeem themselves. But any system depends on keeping a critical mass of "good" users, so that the dominant culture remains "good". If you reach a point where the "bad" users outweigh the "good" ones, the culture deteriorates.
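
    A tiny sketch of that redemption drift, for the sake of being concrete (the cap is invented; the 0.2/month is just the illustrative number from above):

        KARMA_CAP = 50            # assumed site-wide ceiling, purely illustrative
        MONTHLY_REDEMPTION = 0.2  # the "0.2 (or whatever) points per month" above

        def apply_monthly_redemption(users):
            # run once a month: nudge every account's karma upward so someone who has
            # "committed karma suicide" slowly earns back the ability to participate,
            # without letting anyone drift past the normal cap
            for user in users:
                user["karma"] = min(KARMA_CAP, user["karma"] + MONTHLY_REDEMPTION)

        # e.g. a troll sitting at -24 would need 24 / 0.2 = 120 months of silence
        # (ten years) to drift back to zero without earning any positive mods
        accounts = [{"name": "troll", "karma": -24.0}, {"name": "regular", "karma": 38.5}]
        apply_monthly_redemption(accounts)
        print(accounts)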

    • (Score: 2) by mojo chan on Monday July 20 2015, @07:44AM

      by mojo chan (266) on Monday July 20 2015, @07:44AM (#211325)

      The problem with karma is that it is vulnerable to modbombing. I had that happen to me on Slashdot recently when MRAs were going nuts on the Ask Brianna Wu story. A quick email to the admins fixed it (I've been one of the top posters for years, so they could see that two accounts were systematically attacking me) but my point is that it's not immune to abuse.

      --
      const int one = 65536; (Silvermoon, Texture.cs)
      • (Score: 2) by penguinoid on Monday July 20 2015, @07:21PM

        by penguinoid (5331) on Monday July 20 2015, @07:21PM (#211527)

        I've been wondering about what would happen with a different moderation system -- it would have to keep track of who moderated which post, and build up something like a web of trust but for moderation. However, it would need some sort of division into groups, which in English would summarize to "respected modder for Republicans" or "respected modder for copyright fanatics" etc. Also, every user would get mod points but by using the mod points they separate themselves into those groups. The objective of this would be to build a leaky echo chamber where easily offended people are exposed to mostly things they agree with, but also to the best of things they disagree with -- and people who don't mod themselves into an echo chamber, of course, get to see a more balanced set of views. (The mod system might also need to examine the content of the post for key words and phrases, so that what it suspects to be a "disagree" mod gets positive karma when viewed from the opposite side).
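
        Roughly, the bookkeeping might look like the sketch below (the trust measure and the "leak" weight are placeholders, not a worked-out design):

            from collections import defaultdict

            class TrustWeightedMods:
                # every mod is recorded per (moderator, post); a reader trusts moderators in
                # proportion to past agreement with the reader's own mods, and a small "leak"
                # keeps well-modded posts from outside your circle visible
                def __init__(self, leak=0.2):
                    self.mods = defaultdict(dict)   # post_id -> {moderator: +1 or -1}
                    self.leak = leak

                def record(self, post_id, moderator, value):
                    self.mods[post_id][moderator] = value

                def trust(self, reader):
                    # agreement rate with every other moderator, over posts both have modded
                    agree, seen = defaultdict(int), defaultdict(int)
                    for votes in self.mods.values():
                        if reader not in votes:
                            continue
                        for mod, v in votes.items():
                            if mod == reader:
                                continue
                            seen[mod] += 1
                            if v == votes[reader]:
                                agree[mod] += 1
                    return {m: agree[m] / seen[m] for m in seen}

                def score(self, post_id, reader):
                    # trust-weighted score, with the leak so out-of-group mods still count a little
                    t = self.trust(reader)
                    return sum(v * (t.get(mod, 0.0) + self.leak)
                               for mod, v in self.mods[post_id].items() if mod != reader)

            wot = TrustWeightedMods()
            wot.record("p1", "alice", +1); wot.record("p1", "bob", +1)
            wot.record("p2", "alice", -1); wot.record("p2", "bob", +1)
            wot.record("p3", "bob", +1)
            print(wot.score("p3", "alice"))   # bob agreed with alice on 1 of 2 posts, so 0.5 + 0.2 leak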

        --
        RIP Slashdot. Killed by greedy bastards.
        • (Score: 1) by khallow on Thursday July 23 2015, @01:30PM

          by khallow (3766) Subscriber Badge on Thursday July 23 2015, @01:30PM (#212643) Journal

          I think some of the drama could be averted with a singular value decomposition-based system. People mod things they like or don't like and then the system crudely ranks posts/articles/people/whatever respectively. I don't know how you'd go from that to implement your leaky echo chamber, but it would at least allow for a highly relative ranking system. I believe that Google implements something like that as part of its page rank thing.
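
          As a toy example of what I mean (the matrix is made up; rows are users, columns are posts, +1/-1 are up/down mods):

              import numpy as np

              # tiny made-up mod matrix just to show the mechanics
              ratings = np.array([
                  [ 1,  1, -1,  0],
                  [ 1,  0, -1, -1],
                  [-1, -1,  1,  1],
                  [ 0, -1,  1,  1],
              ], dtype=float)

              # truncated SVD: keep the top k singular values as a low-rank "taste" model
              U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
              k = 2
              approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

              # approx[u, p] is a predicted affinity of user u for post p, including posts
              # the user never moderated, i.e. a relative, per-user ranking rather than
              # one global score
              for u in range(approx.shape[0]):
                  print(f"user {u} would rank posts: {np.argsort(-approx[u])}")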
  • (Score: 3, Interesting) by HiThere on Monday July 20 2015, @07:21PM

    by HiThere (866) Subscriber Badge on Monday July 20 2015, @07:21PM (#211528) Journal

    I like a mix of the moderation idea with an automatically splintering net, so that you can choose not to hear, e.g., fart jokes. This requires a rather formidable filter that adapts to the user, which probably means it should run locally. Essentially it's a development of an adaptive spam filter. Just like spam filters, you should be able to initialize it to only accept messages which match someone else's criteria (anybody else's that you choose). But then you should be able to hit "Like" and "Don't like" buttons to allow it to adapt to your taste.

    Would this be a good thing? Clearly not ideal. And current filters would have both false positives and false negatives, so the filter needs to give each rejected message a summated score, so that when you scan through the junk you can first encounter the messages that it wasn't really sure it should have marked as junk. But notice I said "summated". The score would probably need to be calculated in multiple dimensions so that, e.g., rejecting political messages that you didn't want to receive is separated from commercial messages that you didn't want to receive, and both from, say, scatological humor. Etc.

    This allows free speech, but also allows people to not be inundated with messages they don't want on an individualized basis.
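
    As a crude illustration of the multi-dimensional scoring (the categories and word lists here are stand-ins for what the "Like"/"Don't like" buttons would actually learn):

        # dimensions and word lists are placeholders; a real filter would learn them
        # from the user's own Like / Don't-like clicks
        CATEGORIES = {
            "political": {"election", "senator", "policy"},
            "commercial": {"buy", "discount", "offer"},
            "scatological": {"fart", "poop"},
        }

        def dimension_scores(text):
            # crude per-dimension score: fraction of the message's words that hit each
            # category's word list; returns one score per dimension, not a single number
            words = text.lower().split()
            if not words:
                return {c: 0.0 for c in CATEGORIES}
            return {c: sum(w in vocab for w in words) / len(words)
                    for c, vocab in CATEGORIES.items()}

        def review_order(rejected_messages):
            # sort the junk pile so the messages the filter was least sure about
            # (lowest maximum dimension score) come first when the user scans it
            return sorted(rejected_messages, key=lambda m: max(dimension_scores(m).values()))

        junk = ["buy this amazing discount offer now",
                "a mild fart joke",
                "an ambiguous message about nothing in particular"]
        for msg in review_order(junk):
            print(msg, dimension_scores(msg))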

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.