
SoylentNews is people

posted by janrinok on Sunday June 22 2014, @08:37AM   Printer-friendly
from the sit-quicksort-sit dept.

Interesting analogy comparing algorithms to dogs.

There is a theory among evolutionary anthropologists that dogs evolved from beasts to pets because the canines that continued to survive were those that gained social intelligence. The wolves that thousands of years ago hung around the edges of human settlements began to interpret human intentions and moods. That is, their brains began to be wired to tune into people's brains. Over time, this meant their behavior and even their appearance changed to become less fierce, more attuned to human emotions, and more symbiotic. In other words, they became dogs.

I mention the evolution of dogs because we're at the point now where we're living with another non-human species that is far more dangerous and powerful than canines ever were: algorithms. The UK government just announced 220 million pounds sterling for "big data and algorithm" research. What you see on Facebook is determined by algorithms. Amazon's (and Spotify's and Netflix's et al.) recommendation engines are all algorithms. An algorithm now controls the temperature in my house through my Nest thermostat. If you interact with the digital world at all (and who doesn't?), you are coming into contact with an algorithm. We need to ensure that these coded systems understand our needs and intentions in order to create products that feel human and humane.

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Sunday June 22 2014, @08:44AM

    by Anonymous Coward on Sunday June 22 2014, @08:44AM (#58631)

    Algorithms are like, social, bro! You gots to believes me! Follow me on twatfuck! Please!

  • (Score: 4, Insightful) by Dunbal on Sunday June 22 2014, @09:43AM

    by Dunbal (3515) on Sunday June 22 2014, @09:43AM (#58641)

    "The wolves that thousands of years ago hung around the edges of human settlements began to interpret human intentions and moods."

    Evolution doesn't work that way. You could say instead: the wolves that successfully interpreted human intentions and moods received a reward for hanging around human settlements, whereas those that did not were disadvantaged. I won't dispute that attitude and mood could have some genetic, heritable component that favored docile wolves over aggressive ones (aggression, for instance, can be "bred" into or out of a dog line), but simply hanging around people doesn't cause a new gene to mutate. However, canines who receive an ADVANTAGE because of a gene do become more successful hanging around people than canines that don't, and thus manage to spread their genes better over time. The summary has it backwards, as if somehow human behavior could magically cause genetic mutation.
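    That dynamic, mutation is random while selection does the directing, can be sketched in a toy simulation. This is purely illustrative, not a real population-genetics model; the "docility" trait, population size, and fitness weights are all made up for the sake of the example:

    ```python
    import random

    # Toy illustration: a heritable "docility" trait spreads because docile
    # wolves gain a survival advantage near human settlements. Mutation here
    # is random noise; only the selection step favors docility.
    random.seed(42)

    POP, GENS = 100, 50

    def reproduce(parent):
        # Offspring inherit the parent's docility plus a small random mutation.
        return min(1.0, max(0.0, parent + random.gauss(0, 0.05)))

    population = [random.random() * 0.2 for _ in range(POP)]  # start mostly fierce

    for _ in range(GENS):
        # Fitness near humans grows with docility: docile wolves are tolerated
        # (and fed scraps), aggressive ones are driven off.
        parents = random.choices(population,
                                 weights=[0.1 + d for d in population], k=POP)
        population = [reproduce(p) for p in parents]

    print(round(sum(population) / POP, 2))  # mean docility drifts upward
    ```

    Note that no individual wolf "learns" docility into its genes; the hanging-around only matters because it converts an existing heritable difference into a reproductive advantage.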

    "and even their appearance changed to become less fierce"

    This I will dispute. It is well known that dogs were bred from wolves artificially by man. We have created them in our image, selecting for both physical and character attributes.

    • (Score: 4, Funny) by Horse With Stripes on Sunday June 22 2014, @10:18AM

      by Horse With Stripes (577) on Sunday June 22 2014, @10:18AM (#58648)

      simply hanging around people doesn't cause a new gene to mutate.

      I don't know ... hanging around humans has certainly turned me into a mutant.

    • (Score: 3, Insightful) by tathra on Sunday June 22 2014, @11:35AM

      by tathra (3367) on Sunday June 22 2014, @11:35AM (#58665)

      ...simply hanging around people doesn't cause a new gene to mutate.

      Mutate, no, but gene expression can change via epigenetics just by eating different foods, or by being stressed, and other minor things. This could allow certain traits to become more or less pronounced as a precursor to actual mutations, so the author really might not be that far off from the truth. Evolution is complicated, and we still only barely have a clue how gene expression works (hopefully everybody knows by now that the so-called "junk DNA" is more important than once thought, since it helps dictate gene expression, but how all of that is coded is still a massive mystery, and last I knew epigenetics was still a giant unknown too).

      • (Score: 2) by c0lo on Sunday June 22 2014, @01:57PM

        by c0lo (156) Subscriber Badge on Sunday June 22 2014, @01:57PM (#58693) Journal
        Epigenetics [wikipedia.org] - if DNA were sheet music, epigenetics would be interpretation.
        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 5, Informative) by lrmo on Sunday June 22 2014, @01:54PM

      by lrmo (838) on Sunday June 22 2014, @01:54PM (#58692)

      "We have created them in our image, selecting for both physical and character attributes."

      While we have in recent years selected for the physical attributes we want in dogs, research suggests that dogs developed less fierce, cuter physical traits simply by becoming tame.

      There is an ongoing experiment in domesticating foxes that selects breeding pairs purely on their tameness and not on any other trait. Despite selecting only for tameness, the foxes began exhibiting other external signs of domestication as well, including floppy ears and spotting patterns in their fur.

      A link for reference [wikipedia.org]. It is quite an interesting study. The wiki page does not mention that, while breeding for tameness, the experimenters also went the other direction and have spent several generations breeding foxes to be hyper-aggressive.

  • (Score: 4, Insightful) by BradTheGeek on Sunday June 22 2014, @09:46AM

    by BradTheGeek (450) on Sunday June 22 2014, @09:46AM (#58642)

    This is a null analogy if ever there was one. Dogs evolved until we could have a hand in their breeding. Yet, some dogs still attack humans.

    That doesn't matter though. Algorithms are a totally human construct. They already do what we, or a subset of us, want. Facebook's data-mining algorithms already do what a human group wants: they feed Facebook data and serve us 'better access' to our 'friends'. They, and other algorithms, are not some wild force to be tamed. If anything needs taming and domestication, it's the subset of our species that feels it is okay to take all they can, or to use our private lives for their betterment.

    • (Score: 2) by davester666 on Sunday June 22 2014, @05:08PM

      by davester666 (155) on Sunday June 22 2014, @05:08PM (#58740)

      Yes, the algorithms are already evolving as their masters, the large corporations and governments, want. The writer/submitter is delusional if he believes that the algorithms will somehow evolve in any way that is beneficial or desirable for regular people.

  • (Score: 4, Insightful) by Horse With Stripes on Sunday June 22 2014, @10:28AM

    by Horse With Stripes (577) on Sunday June 22 2014, @10:28AM (#58651)

    That article is utter garbage. Algorithms make binary decisions? Play 'beat the algorithm'? Algorithms don't respond in human-centric ways? They should be understandable?

    This whole pile of shit sounds like it was written by some clown with a degree who just discovered that the basic foundation of how computers work is by processing a list of instructions. Algorithms should be tamed? Sorry, bro, but you should be tazed.

  • (Score: 4, Insightful) by choose another one on Sunday June 22 2014, @10:59AM

    by choose another one (515) Subscriber Badge on Sunday June 22 2014, @10:59AM (#58657)

    The article is wrong on so many levels, it could have come straight from the marketing division of the Sirius Cybernetics Corporation.

    Algorithms & computers are good precisely because of their non-human traits. We like them because they do what they are ******* told, are reliable, repeatable, analysable, consistent, uncomplaining - in fact everything humans are not. If they screw up it's because we got the algorithm wrong or put the wrong input in. Computers and algorithms are "what you get is what you asked for", but the article wants them to be "what you get is what we magically determine you really wanted even though you didn't ask for it". Wrong (or at least, it's absolutely not what I want from my algorithms).

    The author thinks that magically / telepathically guessing what someone really wants you to do (presumably with 100% accuracy, or we'd send the device back as faulty), no matter what you were actually asked to do, is a human trait. Wrong. It's not, even if my wife thinks it should be and gets upset that I don't have it.

    The author thinks algorithms would be better if more human-like, the logical extension of which is that they should be imbued with human like behaviour too. Wrong. Like saying "we built a full motion voice commanded RealDoll but that wasn't real enough so we gave it PMT and the menopause, at the same time, because more human like is better".

    Or as a much better writer than me put it a long time ago:

    "GPP feature?" said Arthur. "What's that?"
    "Oh, it says Genuine People Personalities."
    "Oh," said Arthur, "sounds ghastly."
    A voice behind them said, "It is."

    "Ghastly," continued Marvin, "it all is. Absolutely ghastly. Just don't even talk about it. Look at this door," he said, stepping through it. The irony circuits cut into his voice modulator as he mimicked the style of the sales brochure. "All the doors in this spaceship have a cheerful and sunny disposition. It is their pleasure to open for you, and their satisfaction to close again with the knowledge of a job well done."

    • (Score: 1) by Horse With Stripes on Sunday June 22 2014, @11:13AM

      by Horse With Stripes (577) on Sunday June 22 2014, @11:13AM (#58659)

      We like them because they do what they are ******* told, are reliable, repeatable, analysable, consistent, uncomplaining - in fact everything humans are not.

      In fact everything that users are not.

      The author thinks that magically / telepathically guessing what someone really wants you to do (presumably with 100% accuracy or we'd send the device back as faulty)

      If my computer starts guessing what I want I'm going to send it back as faulty with a note that reads "Please keep HAL where he belongs, m'kay?". Asimov's "Three Laws of Robotics" won't work if our algorithms are anything like humans.

      • (Score: 3, Insightful) by maxwell demon on Sunday June 22 2014, @12:36PM

        by maxwell demon (1608) on Sunday June 22 2014, @12:36PM (#58677) Journal

        Your computer already guesses what you want. Granted, the algorithms are generally not very effective, but what do you think Microsoft's adaptive Start menu or Firefox's "awesome bar" are doing if not trying to guess what you want? What is Autocorrect doing other than trying to guess what you wanted to write? And not only your computer: what do you think Google Instant is doing? Or "did you mean ..."? Or all those algorithms for targeted advertisements?
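        For illustration, the crudest form of the guessing these features do can be sketched in a few lines: rank previously seen words by frequency and suggest the most common completion of a prefix. (The word list here is invented, and real autocomplete and Autocorrect systems are far more elaborate; this only shows the basic idea.)

        ```python
        from collections import Counter

        # Count how often each word has been seen before.
        history = Counter(["algorithm", "algorithm", "algebra",
                           "dog", "dogma", "dog"])

        def suggest(prefix):
            # Guess: the most frequently seen word starting with the prefix.
            matches = [(w, n) for w, n in history.items() if w.startswith(prefix)]
            return max(matches, key=lambda m: m[1])[0] if matches else None

        print(suggest("alg"))  # "algorithm" was seen more often than "algebra"
        print(suggest("do"))   # "dog" (seen twice) beats "dogma" (seen once)
        ```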

        --
        The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2) by BradTheGeek on Sunday June 22 2014, @12:26PM

      by BradTheGeek (450) on Sunday June 22 2014, @12:26PM (#58674)

      This is a fucking test post to see if the above shit censorship was SN or user.

  • (Score: 1, Funny) by Anonymous Coward on Sunday June 22 2014, @11:17AM

    by Anonymous Coward on Sunday June 22 2014, @11:17AM (#58661)

    Looks like some mod bot has taken offense to any criticism of this article. I guess the subject of this story is evolving right before our soylent eyes. Ahh, progress ... now we're better than ever!

    • (Score: 2) by c0lo on Sunday June 22 2014, @02:02PM

      by c0lo (156) Subscriber Badge on Sunday June 22 2014, @02:02PM (#58696) Journal

      Looks like some mod bot has taken offense to any criticism of this article.

      If life taught me anything, it would be: never play the smartass during PMS.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 3, Interesting) by clone141166 on Sunday June 22 2014, @12:54PM

    by clone141166 (59) on Sunday June 22 2014, @12:54PM (#58681)

    This article anthropomorphises algorithms into some sort of sentient species. Algorithms are not a "species". Algorithms are about as far from natural evolution as you can possibly get. They are entirely human-made constructs with no sentience or self-awareness.

    Algorithms should not be expected to predict what humans want; only humans know what they want. We should take the complete opposite approach to the one this article suggests: put in the time and effort to *design* algorithms that accept complicated user inputs, letting humans control and tailor the actions of algorithms to their own needs.

    Instead this article suggests that algorithms should somehow magically evolve the sentience to decide what humans want. This point of view is one of the core problems in modern application design. Computers are not people, algorithms are not sentient, and algorithms are written by programmers. Applications that try to *guess* the needs of users are a pain to use. Programmers are not omnipotent or clairvoyant; they cannot create algorithms that will predict exactly what all users will want an application to do. Modern software design needs to shift focus away from trying to guess what users want and instead refocus on providing interfaces that let users fully control and configure applications to do what they actually want.

    If programmers stopped treating users like idiots and provided the necessary configuration and control interfaces to their programs, we wouldn't need sentient algorithms to guess at what we wanted.
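    The contrast can be sketched concretely. Here is a hypothetical feed whose behavior is driven entirely by explicit, user-set knobs rather than inference; the `FeedConfig` fields and `build_feed` function are invented for the example, not any real site's API:

    ```python
    from dataclasses import dataclass

    @dataclass
    class FeedConfig:
        # Every behavior is user-controlled, nothing is guessed.
        sort_by: str = "newest"   # "newest" or "score"
        min_score: int = 0        # hide posts below this score

    def build_feed(posts, cfg):
        keep = [p for p in posts if p["score"] >= cfg.min_score]
        key = (lambda p: -p["time"]) if cfg.sort_by == "newest" \
              else (lambda p: -p["score"])
        return sorted(keep, key=key)

    posts = [{"id": 1, "time": 10, "score": 5},
             {"id": 2, "time": 20, "score": 1},
             {"id": 3, "time": 15, "score": 4}]

    # Defaults: newest first, nothing hidden.
    print([p["id"] for p in build_feed(posts, FeedConfig())])
    # User opts in to score ordering with a threshold of 2.
    print([p["id"] for p in build_feed(posts, FeedConfig("score", min_score=2))])
    ```

    The point is that the same algorithm serves every user without a line of guesswork, because the preference lives in a setting the user can see and change.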

    • (Score: 1) by clone141166 on Sunday June 22 2014, @12:57PM

      by clone141166 (59) on Sunday June 22 2014, @12:57PM (#58682)

      *omniscient - programmers aren't omnipotent either, it would be nice if we were though :P

    • (Score: 2) by c0lo on Sunday June 22 2014, @02:08PM

      by c0lo (156) Subscriber Badge on Sunday June 22 2014, @02:08PM (#58697) Journal

      If programmers stopped treating users like idiots and provided the necessary configuration and control interfaces to their programs, we wouldn't need sentient algorithms to guess at what we wanted.

      All you need to configure sendmail is some text files. Good luck treating users any way you like.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 3, Funny) by Qattus on Sunday June 22 2014, @03:02PM

    by Qattus (951) on Sunday June 22 2014, @03:02PM (#58707)

    I would not really want an algorithm to be like my dog. He is highly evolved to get me to do what he wants!

  • (Score: 2) by AsteroidMining on Sunday June 22 2014, @05:04PM

    by AsteroidMining (3556) on Sunday June 22 2014, @05:04PM (#58738)

    "You keep using that word. I do not think it means what you think it means."