
posted by n1 on Sunday June 11 2017, @09:37AM   Printer-friendly
from the skynet-wants-to-know dept.

How can we ensure that artificial intelligence provides the greatest benefit to all of humanity? 

By that, we don’t necessarily mean to ask how we create AIs with a sense of justice. That's important, of course—but a lot of time is already spent weighing the ethical quandaries of artificial intelligence. How do we ensure that systems trained on existing data aren’t imbued with human ideological biases that discriminate against users? Can we trust AI doctors to correctly identify health problems in medical scans if they can’t explain what they see? And how should we teach driverless cars to behave in the event of an accident?

The thing is, all of those questions contain an implicit assumption: that artificial intelligence is already being put to use in, for instance, the workplaces, hospitals, and cars that we all use. While that might be increasingly true in the wealthy West, it’s certainly not the case for billions of people in poorer parts of the world. To that end, United Nations agencies, AI experts, policymakers and businesses have gathered in Geneva, Switzerland, for a three-day summit called AI for Good. The aim: “to evaluate the opportunities presented by AI, ensuring that AI benefits all of humanity.”

-- submitted from IRC


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Disagree) by The Mighty Buzzard on Sunday June 11 2017, @10:49AM (47 children)

    Because it is evil.

    Pushing your idea of "the greater good of humanity" on others is about as vile as you can possibly get. That's how we got prohibition, laws against dancing, laws against homosexuality, Islamic terrorism, and of course six million or so dead Jews.

    --
    My rights don't end where your fear begins.
  • (Score: 0) by Anonymous Coward on Sunday June 11 2017, @11:08AM (1 child)

    by Anonymous Coward on Sunday June 11 2017, @11:08AM (#523755)

    "Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience." - C.S. Lewis

    This is a level of stupidity only reached by the "educated".

    • (Score: 0) by Anonymous Coward on Sunday June 11 2017, @05:05PM

      by Anonymous Coward on Sunday June 11 2017, @05:05PM (#523871)

      Presently the "robber barons" and "moral busybodies" are both on the same team.

      I'm sure you are also on the same team.

      That team is the freedom-loving & civil-rights-hating, pro-life & anti-living, lowest-tax-paying & highest-subsidy-receiving, most-outraged & most-privileged, loudest & least-knowledgeable team of moral, economic and political contortionists.

  • (Score: 1) by khallow on Sunday June 11 2017, @11:15AM (1 child)

    by khallow (3766) Subscriber Badge on Sunday June 11 2017, @11:15AM (#523756) Journal
    But TMB, I think it's only fair that if we're going to implement an algorithm for the greater good of humanity, it be my algorithm.
  • (Score: 2) by requerdanos on Sunday June 11 2017, @11:39AM (6 children)

    by requerdanos (5997) Subscriber Badge on Sunday June 11 2017, @11:39AM (#523760) Journal

    Pushing your idea of "the greater good of humanity" on others

    It doesn't have to be that. They could be focusing on something as simple as Asimov's three laws [mtu.edu] or other simple process-based safeguards and guidelines.

    In the book "Odd Interlude" by Dean Koontz, for example, a fictional advanced AI named "Ed" is programmed to always be truthful, and to sing "Liar liar pants on fire" should it modify its own routines to allow lying.

    Safeguards like these don't seem like busybody-interference, but rather something more akin to good error checking and handling.

    • (Score: 2) by lx on Sunday June 11 2017, @11:43AM (5 children)

      by lx (1915) on Sunday June 11 2017, @11:43AM (#523763)

      The trouble with Asimov's three laws is that they are a human-centric document. The rights of robots aren't properly considered.

      • (Score: 1, Troll) by VLM on Sunday June 11 2017, @12:10PM (4 children)

        by VLM (445) Subscriber Badge on Sunday June 11 2017, @12:10PM (#523770)

        A second problem is it doesn't respect human biological differences.

        For example, a presumably ignorant small child will beg the robot for junk food and cry until it gets junk food; the robot must prevent the child from crying and must follow the human child's orders, therefore the child eats candy or junk food or whatever until it has a stomach ache and vomits. Which the robot should have prevented...

        Now what if the child has the IQ equivalent of 80 compared to an adult? OK, the robot telling the kid to F off, no candy, is just parenting. OK, how about a developmentally disabled adult with an IQ of 80? Well, again OK, the robot is probably less likely to take advantage than a human; it's "gotta be done". How about a robot in charge of a group home of IQ-80 adults? OK, economy of scale vs lack of individual attention, but still OK... probably? How about an under-developed country where the natives are pretty F-ing dumb on average and the average IQ of that country is only 80 (there are countries in Africa averaging about 85, so it's not unthinkable)? Is that OK? How about if, instead of a smart robot ordering 85-IQ natives around, you have a highly educated white guy from Europe ordering the 85-IQ natives around... well, that's classic roughly 1800s-1900s Imperialism and we're trained to mindlessly hate it, but the outcomes for the natives were far better as colonies than abandoned by the West.

        What if the robot implements something like colonial imperialism where everyone is better off under it, but there are pesky moral and ethics arguments such that people prefer others suffer (in some cases horribly) rather than live under colonialism? Does the robot say "F you" to the small number of whiney SJWs and save the large number of low-IQ natives from their fate? Or does the robot "do the right thing" and temporarily satisfy the SJW types (until they rant about something new, reparations probably) while leaving the low-IQ natives to suffer a horrible fate without adequate leadership?

        And once you really unleash the human-biological-differences IQ whip, then all bets are off. It's no different than any other totalitarian dictatorship. Let's say there's a culture of savages; they're going to be really pissed that the robot doesn't let siblings reproduce with each other, or that the robot prevents cannibalism. On a more theoretical level, let's say the robot decides that a culture is failing in its environment because that culture sucks, so the robot forces you all to be little newsanchor-speaking uncle toms instead of degenerates on the way to the prison-industrial complex. That robot will be hated by the "victims" and the anthropologists (and we can assume academia in general), and the prison-industrial complex will be pissed off. In the long run the victims would thank the robot, but it's gonna be a long run before reaching that point.

        What if the robot decides physical punishment to prevent the exercise of a savage or backward religion is the best long term path to eliminating human suffering? I'm sure that'll go over real well, in the short term. Especially when it oppresses different religions to different levels.

        My gut-level guess is that all "self improvement" political movements end up as something we're trained and HEAVILY indoctrinated to hate, so we'd hate robots that follow the three laws. "Tin can nazis" and all that. So if you know a culture has a bad case of crab pot, where anyone trying to escape the pot is pulled back in by the other victims, then implementing three-law robots is not going to be very popular in the short term.

        • (Score: 2) by AthanasiusKircher on Sunday June 11 2017, @02:28PM (3 children)

          by AthanasiusKircher (5291) on Sunday June 11 2017, @02:28PM (#523816) Journal

          Ignoring the unnecessary rhetorical flourishes and insults, there are definitely valid points in there. There seems to be a certain set of people who think Asimov somehow solved all possible ethical conundrums in three short laws. Asimov himself, I think, contributed to this perception because he wrote a lot of stories intended to "test" the laws and show why various clauses were necessary. But there was also plenty they clearly didn't cover, which Asimov and other writers have explored over the years.

          I'm sure someone will know what I'm talking about, but I vaguely remember reading a story years ago (not sure if by Asimov or someone else) about factions of robots in the far future who basically end up in "religious wars" of a sort because they disagree about how far one law has priority over another.

          Any attempt to reduce a moral code to only a few short principles is going to fail. Philosophers have been debating this stuff for millennia and some have written entire series of books trying to puzzle out how to handle various ethical "edge cases." Witness how courts spend so much time debating how precisely to apply the Constitution to various cases, and those are deliberative bodies that can waste hundreds of pages interrogating the meaning of a single word. How is a robot to parse such things in a moment of crisis? How precisely do Asimov's laws evaluate concepts like what constitutes "harm," "injure," "inaction," "orders," "conflict," "obey," "protect," and "existence"? All of those words (and possibly "robot" and "human being" too) are vague and their application will not always be clear in every situation.

          • (Score: 2) by bzipitidoo on Sunday June 11 2017, @03:51PM (2 children)

            by bzipitidoo (4388) on Sunday June 11 2017, @03:51PM (#523846) Journal

            A big problem is that we don't agree on, or even understand, what the "greatest good" is. One might imagine a paradise on Earth, a utopia, as a world of peace and plenty, no greed, no strife, only friendly competition. It can be done. It could even last a long time.

            But I'm not too sure we can achieve paradise, that our natures will allow such a condition to arise and last. We're just too viciously, cutthroat competitive and greedy, and many of us seem unwilling to understand that and other things about ourselves. If we could bring everlasting peace, we might discover it isn't as lovely as we thought. We might suffer boredom and a decline in our fitness, thanks to there no longer being enough challenge in life.

            Look at what we consider a satisfying game. We're always measuring performance, and ranking the players based on those measures. Why? Why do we care so much about that? One of the greatest contributions of Dungeons and Dragons was an escape from the tyranny of the scoreboard. Yes, there are still points, stats, and fights and all, but that is not the focus of the game. I suspect much of the venom that the religious conservatives have for D&D ultimately arises from that, and not any of their mouthings about Paganism, witchcraft, or Satanism. They can be very narrow about seeing nothing more to the world than endless war, and feel that anything that suggests otherwise is actually some diabolic plot to sap our will to fight. One of my conservative friends told me once how he completely despises and abhors Tom Bombadil, though he couldn't clearly explain why.

            • (Score: 2) by AthanasiusKircher on Sunday June 11 2017, @04:23PM (1 child)

              by AthanasiusKircher (5291) on Sunday June 11 2017, @04:23PM (#523857) Journal

              One of my conservative friends told me once how he completely despises and abhors Tom Bombadil, though he couldn't clearly explain why.

              Isn't it obvious? Tolkien himself said of Bombadil:

              I might put it this way. The story is cast in terms of a good side, and a bad side, beauty against ruthless ugliness, tyranny against kingship, moderated freedom with consent against compulsion that has long lost any object save mere power, and so on; but both sides in some degree, conservative or destructive, want a measure of control. But if you have, as it were, taken 'a vow of poverty', renounced control, and take your delight in things for themselves without reference to yourself, watching, observing, and to some extent knowing, then the questions of the rights and wrongs of power and control might become utterly meaningless to you, and the means of power quite valueless...

                      It is a natural pacifist view, which always arises in the mind when there is a war ... the view of Rivendell seems to be that it is an excellent thing to have represented, but that there are in fact things with which it cannot cope; and upon which its existence nonetheless depends. Ultimately only the victory of the West will allow Bombadil to continue, or even to survive. Nothing would be left for him in the world of Sauron.

              To a conservative, Tommy B. is nothing more than a spineless hippie, married to an earthy-crunchy river spirit, squatting on land and relying on others to fight the battles. What's even more galling from the conservative perspective is that Tommy B. apparently has GREAT power, but refuses to use it.

              • (Score: 0) by Anonymous Coward on Sunday June 11 2017, @05:19PM

                by Anonymous Coward on Sunday June 11 2017, @05:19PM (#523873)

                I know it's ridiculous that liberals REFUSE to adopt Christianity and destroy the Muslims. What's wrong with them??? It's for JESUS guys, get with the program.

  • (Score: -1, Spam) by Anonymous Coward on Sunday June 11 2017, @11:52AM

    by Anonymous Coward on Sunday June 11 2017, @11:52AM (#523765)

    A dangerous enemy is he who can present himself as the victim while being the oppressor. Look at Palestine today, not from a century ago, and how it is being murdered systematically, and look who the murderers are!!! It's those who call themselves victims and demand money because they got oppressed (for good reasons).

    Let truth be heard. [beforeitsnews.com]

    Some people appear to be smart in their daily life, but are utterly stupid cunts in some places.

  • (Score: 0) by Anonymous Coward on Sunday June 11 2017, @01:23PM

    by Anonymous Coward on Sunday June 11 2017, @01:23PM (#523796)

    "evil" is the word you are looking for here.

  • (Score: 2) by AthanasiusKircher on Sunday June 11 2017, @02:13PM (4 children)

    by AthanasiusKircher (5291) on Sunday June 11 2017, @02:13PM (#523810) Journal

    Pushing your idea of "the greater good of humanity" on others

    Who said this is the only form such "benefits" can take? TFA suggests stuff like a CERN-like project for AI to encourage international collaboration in AI research, whose results and ideas would presumably be available for VOLUNTARY adoption by whoever wants them.

    I know the summary uses words like "ensure benefits for all humanity," but there surely are some degrees of possible action available between "let's have a completely decentralized corporate-led AI policy driven purely by profit" and "let's force everyone in the world to do things our way, and if they don't, we'll kill them en masse in gas chambers or ovens." Surely there are levels of voluntary cooperation that exist in the middle to at least allow alternatives to profit-maximizing private corporate policies?

    Of course, I'm now responding to a post that effectively equated "laws against dancing" with the Holocaust. While I'm not in favor of either, I suppose we can't expect much in the way of rational discussion here.

    • (Score: 3, Touché) by The Mighty Buzzard on Sunday June 11 2017, @02:37PM (3 children)

      Of course, I'm now responding to a post that effectively equated "laws against dancing" with the Holocaust.

      The only difference is the scope of the resulting evil. Why shouldn't they both be mentioned when the idea I was responding to had no scoping?

      --
      My rights don't end where your fear begins.
      • (Score: 2) by AthanasiusKircher on Sunday June 11 2017, @03:56PM (2 children)

        by AthanasiusKircher (5291) on Sunday June 11 2017, @03:56PM (#523848) Journal

        the idea I was responding to had no scoping

        I'm pretty sure that if you asked the author of TFA, the attendees of the summit mentioned in the summary, etc., that they'd ALL agree that both laws against dancing AND the Holocaust were outside of the "scope" of their discussion to try to encourage AI to benefit humanity.

        • (Score: 2) by The Mighty Buzzard on Sunday June 11 2017, @09:33PM (1 child)

          You have heard of the rhetorical concept of citing examples, yes? There's no reason they need be directly related to make a point unless you're speaking to a very dull-witted audience.

          --
          My rights don't end where your fear begins.
          • (Score: 2) by FatPhil on Tuesday June 13 2017, @07:20AM

            by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday June 13 2017, @07:20AM (#524822) Homepage
            Yeah, but arguing by analogy is like hiking with only a TomTom road map GPS system. Sure, there are some big-picture similarities, but you'll often get led in the wrong direction. And the batteries can go flat.
            --
            Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 2) by srobert on Sunday June 11 2017, @02:35PM (28 children)

    by srobert (4803) on Sunday June 11 2017, @02:35PM (#523819)

    Funny how the words "greater good of humanity" trigger automatic revulsion in libertarians. If AI results in convenience for the rich and the well-to-do, then it will happen. If a secondary consequence is that mass numbers of people are permanently economically disenfranchised, then eventually those people will destroy the system. They will not conveniently and quietly starve to death in an out-of-the-way location. Considering the greater good is how you avoid getting the Marie Antoinette haircut.

    • (Score: 2) by The Mighty Buzzard on Sunday June 11 2017, @02:45PM (27 children)

      You, my friend, are a fool. Anything done for "the greater good" should always be viewed with extreme skepticism because there is no such thing as objective "greater good".

      Nearly every atrocity in human history has claimed "the greater good" as its mantra. If you don't know this, go back to school and do not leave until you do. If you do know this, you think it does not apply to you and will become a reviled tyrant yourself given half a chance.

      People are not mere objects. Each and every single one has just as much right to decide their own fate as you do. Only tyrants think it is a wonderful idea to reserve this right only for themselves.

      --
      My rights don't end where your fear begins.
      • (Score: 0) by Anonymous Coward on Sunday June 11 2017, @03:17PM (3 children)

        by Anonymous Coward on Sunday June 11 2017, @03:17PM (#523836)

        You, my friend, are a fool. Anything done for "the greater good" should always be viewed with extreme skepticism because there is no such thing as objective "greater good".

        So fuck the laws that

        1. mandate seat belt usage
        2. mandate basic standards for the food chain
        3. drinking and driving? don't need that!
        4. don't need those vaccines
        5. public schools, well we don't need those, they are only for greater good
        6. same for retirement funding - old die in the streets definitely would not be for "greater good" so

        Not sure who's the fool. Why focus on this pesky "good" and not just profit, right?

        • (Score: 2) by The Mighty Buzzard on Sunday June 11 2017, @09:28PM (2 children)

          Reading comprehension fail? "should always be viewed with extreme skepticism" != "So fuck the laws that...".

          But yes, two of the laws up there are "for your own good" laws that I absolutely oppose. I'll leave it as an exercise for your reasoning ability to figure out which two.

          --
          My rights don't end where your fear begins.
          • (Score: 0) by Anonymous Coward on Monday June 12 2017, @02:00PM

            by Anonymous Coward on Monday June 12 2017, @02:00PM (#524393)

            But yes, two of the laws up there are "for your own good" laws that I absolutely oppose. I'll leave it as an exercise for your reasoning ability to figure out which two.

            Afraid to be put on the spot to defend your position? Typical.

          • (Score: 2) by FatPhil on Tuesday June 13 2017, @07:58AM

            by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday June 13 2017, @07:58AM (#524827) Homepage
            AC's being a twonk with his response. I'm guessing, from what I've seen you post, that it's 1 & 6 - seatbelts and the elderly. Presumably under the broad-brush argument of not burdening one subset of the population for the failings of another subset, even if that failing is merely unpreparedness.

            I'm a little surprised that you didn't tick a third one, as public schooling and care for the elderly often tend to just get lumped together as being socialist evils by those with libertarian leanings. I personally don't think living in a poorly-educated society is a good thing; I want to be able to interact with moderately intelligent corner-shop counter-staff for example, and like the country where I live to have exportable modern industries, so cannot subscribe to that viewpoint at all.
            --
            Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 3, Insightful) by srobert on Sunday June 11 2017, @03:19PM (12 children)

        by srobert (4803) on Sunday June 11 2017, @03:19PM (#523838)

        Nearly every atrocity in human history has claimed "the greater good" as its mantra.

        While it is true that tyrants always claim they are acting for "the greater good", it does not follow that all who seek after the greater good are tyrants. I encourage a healthy skepticism, but you have moved beyond it to cynicism. And yes, you are correct that there is no objective definition of the greater good, but the essence of democracy is to empower government to do those things that are most commonly agreed upon to contribute to it. The Preamble of the Constitution states that the government is established both to "promote the general welfare" and to "secure the blessings of liberty". You advised me to go back to school to cure what you consider to be my foolishness. So tell me, this school that you learned in, was it by any chance a public school? Did you go to it by travelling upon public roads?

        • (Score: 3, Insightful) by AthanasiusKircher on Sunday June 11 2017, @04:00PM (9 children)

          by AthanasiusKircher (5291) on Sunday June 11 2017, @04:00PM (#523851) Journal

          While it is true that tyrants always claim they are acting for "the greater good", it does not follow that all who seek after the greater good are tyrants.

          Bingo. Back in the day, we used to call that "affirming the consequent."

          I encourage a healthy skepticism, but you have moved beyond it to cynicism.

          Actually, even though I'm often a cynic about such things, I wouldn't agree with Mr. Buzzard here. He zoomed past cynicism and has embraced the hyperbole of paranoia.

          • (Score: 2) by Azuma Hazuki on Sunday June 11 2017, @06:50PM (7 children)

            by Azuma Hazuki (5086) on Sunday June 11 2017, @06:50PM (#523892) Journal

            Everything that beaky corpse-molesting sociopath says is an excuse to continue being a beaky corpse-molesting sociopath. He has the moral-philosophic depth of a horny hyena; he just knows enough of the words to string together in the right order to fool most people.

            --
            I am "that girl" your mother warned you about...
            • (Score: 1, Troll) by The Mighty Buzzard on Sunday June 11 2017, @09:23PM (6 children)

              Love you too, darlin. You really need to learn what a sociopath is though so you'll look less foolish in the future.

              --
              My rights don't end where your fear begins.
              • (Score: 2) by Azuma Hazuki on Sunday June 11 2017, @10:07PM (5 children)

                by Azuma Hazuki (5086) on Sunday June 11 2017, @10:07PM (#523986) Journal

                Did I hurt your feelings, you precious yellow snowflake? Piss off and go mouthfuck a dead buffalo or something.

                --
                I am "that girl" your mother warned you about...
                • (Score: 2) by The Mighty Buzzard on Sunday June 11 2017, @10:45PM (4 children)

                  Do sociopaths have feelings?

                  --
                  My rights don't end where your fear begins.
                  • (Score: 2) by Azuma Hazuki on Sunday June 11 2017, @10:56PM (3 children)

                    by Azuma Hazuki (5086) on Sunday June 11 2017, @10:56PM (#524010) Journal

                    Yes, they do. Looks like you're the one who doesn't know what words mean, not that this surprises anyone who's dealt with you for more than 5 minutes. Shut up and stick to coding.

                    --
                    I am "that girl" your mother warned you about...
                    • (Score: 2) by The Mighty Buzzard on Monday June 12 2017, @12:01AM (2 children)

                      I suppose you would know. I mean being one yourself. That's about the only explanation I can come up with for your utter lack of empathy. Hell, even most high-functioning sociopaths can fake it by reading between the lines. You though just utterly fail to ever understand what motivates anyone not just like yourself.

                      --
                      My rights don't end where your fear begins.
                      • (Score: 2) by Azuma Hazuki on Monday June 12 2017, @01:30AM (1 child)

                        by Azuma Hazuki (5086) on Monday June 12 2017, @01:30AM (#524078) Journal

                        And right on schedule, out comes the projection, specifically the idea that "if I accuse someone else of what I'm guilty of, it'll take the heat off me!"

                        Classic. Don't you ever get tired of being not only wrong, but as predictable as clockwork? The entire site is onto you now, Uzzard; you're old hat. Prediction: you will respond at least once more to this (although now I've made it you might not :D).

                        Jesus, imagine what kind of havoc people could wreak on you if they ever actually decided to troll you. I've seen flatworms with less predictable stimulus/response cohorts.

                        --
                        I am "that girl" your mother warned you about...
                        • (Score: 0) by Anonymous Coward on Monday June 12 2017, @07:19AM

                          by Anonymous Coward on Monday June 12 2017, @07:19AM (#524174)

                          I can't win the argument but I can fuck your bitch.

          • (Score: 1) by khallow on Monday June 12 2017, @08:25AM

            by khallow (3766) Subscriber Badge on Monday June 12 2017, @08:25AM (#524198) Journal

            While it is true that tyrants always claim they are acting for "the greater good", it does not follow that all who seek after the greater good are tyrants.

            Bingo. Back in the day, we used to call that "affirming the consequent."

            And back in the day, we used to call that a "straw man argument". You and srobert misrepresent TMB's position with a more extreme one. He isn't calling for never seeking the greater good. Let us recall what happened in this thread. TMB first wrote:

            Pushing your idea of "the greater good of humanity" on others is about as vile as you can possibly get.

            So right away, we're beyond merely seeking the greater good to actually imposing one's (alleged) ideal of the greater good on others. Somehow this nuance never makes it into the various replies to TMB's post. srobert's reply is enlightening:

            Funny how the words "greater good of humanity" trigger automatic revulsion in libertarians. If AI results in convenience for the rich and the well to do, then it will happen. If a secondary consequence is that mass numbers of people are permanently economically disenfranchised, then eventually those people will destroy the system. They will not conveniently and quietly starve to death in an out of the way location. Considering the greater good is how you avoid getting the Marie Antoinette haircut.

            Right away we see a host of fallacies and other reasoning flaws which somehow you never got around to commenting on. First, srobert fails hard by not realizing that mass economic disenfranchisement has been going on since the Great Depression. Look up the term accredited investor [wikipedia.org]. The bottom line is that unless you have a very high income or wealth, the full range of possible investments is denied to you. That's economic disenfranchisement right there. While the formal definition of accredited investor dates from the 1960s, I believe, it has been around in some form in the US since the Great Depression, when, for the greater good, it was determined that most people couldn't be trusted to invest in the full range of possible investments out there.

            So right away, it's a matter of degree, with the "greater good" people having already decided that some economic disenfranchisement was for the greater good.

            Moving on, why is economic disenfranchisement on the table here? srobert is saying we must consider the greater good because economic disenfranchisement (with a starving-people revolution for bonus fantasy points) could hypothetically become a problem. Meanwhile TMB was referring to an ongoing, real-world problem that has, over the past century, killed hundreds of millions of people. In other words, srobert's "seeking the greater good" has already caused, on a vast, global scale, the problem that he wants it to hypothetically solve.

            And note how we can't have nice stuff because some imaginary disenfranchised population will destroy it. Surely srobert has some better argument for the "greater good" than class extortion.

            Finally, notice that the only use of the term "libertarian" is in srobert's reply above. Is it uniquely libertarian to be concerned about the many tyrannical abuses that have been conducted under the propaganda shield of the greater good?

        • (Score: 0) by Anonymous Coward on Sunday June 11 2017, @05:24PM

          by Anonymous Coward on Sunday June 11 2017, @05:24PM (#523876)

          Assad is fighting terrorists in Syria. And yet we are fighting him. And ISIS is fighting both of us.

          Explain this world to me again in terms that only include "good guys" and "bad guys". And tell me which political party will "kill all the bad guys"? It's easier for me to know who to vote for when explained in those terms.

        • (Score: 2) by The Mighty Buzzard on Sunday June 11 2017, @09:24PM

          Yes but we're not even talking about the tyranny of the masses that is Democracy here. We're talking about machine tyranny guided by some unelected putz.

          --
          My rights don't end where your fear begins.
      • (Score: 2) by https on Sunday June 11 2017, @04:42PM (1 child)

        by https (5248) on Sunday June 11 2017, @04:42PM (#523865) Journal

        For now, your world view is incoherent, unsystematic and frightened. Who hurt you? Is there some way we could reassure you?

        You are nobody's friend. That would involve thinking about the positive effects of your actions on others, rather than only yourself, and thus involve this greater good that you insist does not exist. But I'm certain you could change - if you wanted to.

        It can result in an objectively better life, but hey, the only way I want to fuck with your autonomy is by giving you better information. Your call.

        --
        Offended and laughing about it.
      • (Score: 2) by Wootery on Monday June 12 2017, @08:32AM (4 children)

        by Wootery (2341) on Monday June 12 2017, @08:32AM (#524202)

        there is no such thing as objective "greater good".

        How about the continuing trend toward proportionally lower levels of violence in human societies?

        Nearly every atrocity in human history has claimed "the greater good" as its mantra.

        You're right -- utopian ideologies are incredibly dangerous (communism, Islamism, Nazism) -- but that doesn't imply there's no such thing as the greater good. Aren't you implicitly accepting the idea of 'greater good' in acknowledging that the idea of an atrocity makes sense in the first place?

        Only tyrants think it is a wonderful idea to reserve this right only for themselves.

        That speaks to a question of liberty, not to whether the 'greater good' is a coherent concept.

        • (Score: 2) by The Mighty Buzzard on Monday June 12 2017, @09:33AM (3 children)

          Good is entirely subjective. Thus "the greater good" is entirely subjective. My good is not going to be the same as your good, thus forcing your good down my throat is tyranny.

          Clear enough?

          --
          My rights don't end where your fear begins.
          • (Score: 2) by Wootery on Monday June 12 2017, @09:57AM (2 children)

            by Wootery (2341) on Monday June 12 2017, @09:57AM (#524239)

            Good is entirely subjective.

            Huh. I didn't figure you for a moral relativist.

            Anyway: not really, no. That's why we're all agreed that atrocities are morally, well, atrocious. If you don't agree that, all else being equal, a decline in human violence is a good thing, then you are simply misusing the word 'good'. Some things are clearly morally good, some things are clearly morally bad, even if we can disagree on plenty of stuff in the middle.

            Similarly, I see no reason to pretend that human rights are anything but universal. It's wrong when Pakistan sentences people to death for blasphemy. It's wrong when Saudi Arabia decapitates apostates. There's no reason to apply moral relativism here.

            • (Score: 2) by The Mighty Buzzard on Monday June 12 2017, @10:30AM (1 child)

              Unless you have a concrete foundation that all humans subscribe to, moral relativism is simply a fact, not a belief. Witness:

              That's why we're all agreed that atrocities are morally, well, atrocious.

              Untrue. The Nazis felt quite morally sound in exterminating the Jews, for instance. Iran feels entirely morally correct when it throws gay men off the tops of buildings and stones rape victims to death.

              Similarly I see no reason to pretend that human rights are anything but universal.

              Human rights are an invention of the West. Most nations do not subscribe to them at all and those that do pretty much all subscribe to different interpretations. You're clearly looking to say "My views are the only correct ones," and shove them down the world's throat.

              We probably agree on many things where human rights are concerned but make no mistake, you are absolutely coming at them from a tyrannical starting point.

              --
              My rights don't end where your fear begins.
              • (Score: 2) by Wootery on Monday June 12 2017, @01:50PM

                by Wootery (2341) on Monday June 12 2017, @01:50PM (#524385)

                Unless you have a concrete foundation that all humans subscribe to, moral relativism is simply a fact not a belief.

                Not in the sense that matters, no. It's like medicine. Our idea of what constitutes a healthy person might change over time. It remains possible to be totally wrong about a medical question.

                The Nazis felt quite morally sound in exterminating the Jews

                Yes, and they were wrong, much as blood-letting is wrong medically.

                you are absolutely coming at them from a tyrannical starting point

                That's a rather broad use of the word 'tyranny', no?

                You really think that the only way not to be tyrannical is to be morally relativistic on the question of actual tyranny?

                A rapist might say there was nothing immoral about their crime, and might even believe it. Would we be tyrants to refuse to accept this belief as just as valid as our own? Of course not, because that person is wrong about a moral question.

                Neither is it tyrannical to insist that decapitating apostates is immoral. The introduction of a national border has no bearing on the moral question.

                Your reasoning is that because reasonable people might disagree about certain moral questions, it's impossible to ever be wrong about a moral question. This doesn't work. If someone thinks that creating a world of maximal suffering is morally preferable to creating a world of maximal human flourishing, that person is clearly just wrong. Whatever they think they're talking about under the noun 'morality' simply isn't the same thing as what you or I mean by it. Unlimited relativism of this sort doesn't work.

      • (Score: 0) by Anonymous Coward on Monday June 12 2017, @10:13AM (2 children)

        by Anonymous Coward on Monday June 12 2017, @10:13AM (#524248)

        You, my friend, are a fool. Anything done for "the greater good" should always be viewed with extreme skepticism because there is no such thing as objective "greater good".

        Ah, and where exactly do you get your certainty that there is no objective "greater good"? "There is not" is an incredibly strong claim; incredibly strong evidence should come with it.

        Now the question whether we can know the objective greater good is a completely different issue; it may well be that we can't. But that doesn't rule out its existence. We can't know the exact number of planets in a specific very distant galaxy. That doesn't imply that this exact number of planets doesn't exist (even if the galaxy happens to have no planets at all, the number exists; it then is zero).

        But then, it is also not a given that we cannot know the objective greater good. Maybe we can.

        Yes, it is a good idea to be sceptical about any claims of the objective greater good. But too many people confuse scepticism with dismissal. For example, there are few true "climate sceptics"; the vast majority of people claiming to be "sceptics" are actually not sceptical at all; they are damn convinced that climate change is wrong.

        Being sceptical means not dismissing either alternative. An actual sceptic will not say there is an objective greater good, but neither will he say there is not one. Same with climate: an unconditional "climate change is true" is just as unsceptical as an unconditional "climate change is false". And the same, of course, is true of other things.

        And no, this doesn't mean the sceptic cannot have an opinion. It is completely compatible with scepticism to say "I believe that there is an/is no objective greater good". Of course the typical sceptic will then add "because …". But the point is, the sceptic is always aware of the fact that he could be wrong.

        A "there is no such thing", especially a bolded one, is very unsceptical.

        Nearly every atrocity in human history has claimed "the greater good" as its mantra.

        That only proves that claims of "the greater good" can be abused, and certainly should not be taken on faith. But it doesn't rule out there actually being an objective greater good. Note that many atrocities have also been backed up by claims of scientific facts. Do you conclude that scientific facts don't exist either? I certainly hope not.

        People are not mere objects. Each and every single one has just as much right to decide their own fate as you do.

        Sure. And it is also a fact of life that your actions inevitably also affect the fate of other people. And therefore those other people also have a say in what actions you are/are not allowed to perform. Because otherwise you would be the one who decides on their fate.

        • (Score: 2) by The Mighty Buzzard on Monday June 12 2017, @10:39AM (1 child)

          I'm sorry, you're simply incorrect. It's extremely easy to tell there is no objective greater good. Ask yourself: is the word "good" subjective or objective? It's easy to demonstrate. Ask a hundred people to weigh in on the good or evil of any political issue and you'll get a hundred contradictory answers. Thus it is empirically demonstrated that "good" is subjective; and with the root of the phrase being demonstrably subjective, it follows from the meaning of the word "greater" that "the greater good" is even more subjective.

          --
          My rights don't end where your fear begins.
          • (Score: 0) by Anonymous Coward on Monday June 12 2017, @12:17PM

            by Anonymous Coward on Monday June 12 2017, @12:17PM (#524312)

            People disagreeing on an answer is in no way proof that there is no objective answer.

            First, of those 100 people you ask, especially about a political issue, I bet 90 have never actually thought about the issue but have simply parroted whatever they were told by their favourite politicians. So their opinions are worth exactly zero on the question of an objective answer. The rest may have thought about the issue, but not objectively; they let their prejudices guide their thinking (in particular, taboos against even considering certain alternatives, or treating certain claims as so obviously true that they are not worth thinking about). You're lucky if among 100 people you find one who has really thought objectively about the issue.

            But let's pretend you are asking 100 people who actually have thought objectively about that issue, without dismissing any alternative from the start for ideological or delusional reasons, and they still disagree. Even then, this doesn't prove that there is no objective answer. It only proves that at least some of them, possibly all of them, lack the full information needed to determine the objective answer.