
posted by Fnord666 on Thursday February 01 2018, @03:01PM   Printer-friendly
from the can-it-tell-which-personality-is-currently-active? dept.

This psychologist's "gaydar" research makes us uncomfortable. That's the point.
Michal Kosinski used artificial intelligence to detect sexual orientation. Let him explain why.
By Brian Resnick (@B_resnick, brian@vox.com), Jan 29, 2018, 12:00pm EST

In September, Stanford researcher Michal Kosinski published a preprint of a paper that made an outlandish claim: The profile pictures we upload to social media and dating websites can be used to predict our sexual orientation.

Kosinski, a Polish psychologist who studies human behavior from the footprints we leave online, has a track record of eyebrow-raising results. In 2013, he co-authored a paper that found that people's Facebook "likes" could be used to predict personal characteristics like personality traits (a finding that reportedly inspired the conservative data firm Cambridge Analytica).

For the new paper, Kosinski and his co-author Yilun Wang built a program that used common artificial intelligence software to scan more than 30,000 photos uploaded to an unnamed dating site. The software's job? To find patterns that could distinguish a gay person's face from a straight person's.
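The summary doesn't describe the method in detail, but the general recipe this kind of study follows is: extract numeric features from face photos with a pretrained neural network, then fit a simple classifier on labelled examples. Below is a minimal, hypothetical Python sketch of that recipe; the backbone (ResNet-18), file names, and labels are illustrative assumptions, not the authors' actual pipeline.

# Illustrative sketch only -- not Wang and Kosinski's actual pipeline.
# General shape: pretrained CNN as a fixed feature extractor,
# plus a simple classifier trained on labelled face photos.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.linear_model import LogisticRegression

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # drop the ImageNet classification head
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(paths):
    # Turn a list of image file paths into a matrix of CNN features.
    with torch.no_grad():
        batch = torch.stack([preprocess(Image.open(p).convert("RGB")) for p in paths])
        return backbone(batch).numpy()

# Hypothetical labelled data: image paths and 0/1 orientation labels.
train_paths = ["face_001.jpg", "face_002.jpg"]
train_labels = [0, 1]

clf = LogisticRegression(max_iter=1000).fit(embed(train_paths), train_labels)
probability = clf.predict_proba(embed(["new_face.jpg"]))[:, 1]

In practice a study like the one described would need tens of thousands of labelled photos (the paper reportedly scanned more than 30,000) and careful validation, but the overall shape is no more exotic than the snippet above - which is part of what makes the result unsettling.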

https://www.vox.com/science-and-health/2018/1/29/16571684/michal-kosinski-artificial-intelligence-faces

I hate terms like "must-see TV" and "must read." But this article comes pretty close to "must read" for those who wish to understand where computers are going to take us. In particular, read the conversation between Resnick and Kosinski - the research is not really about homosexuality, but about analyzing people in general.

Michal Kosinski

Exactly.

It proves to be uncomfortably accurate at making predictions.

We know that companies are already collecting this data and using such black boxes to predict future behavior. Google, Facebook, and Netflix are doing this.

Basically, most of the modern platforms are just virtually based on recording digital footprints and predicting future behavior.

Psychologists would say, "Oh, yes, that's true, but not personality. This is just pseudoscience." I'm like, wait. You can accept that you can predict 57 things, but if I say, "What about 58?" you say, "This is absolutely theoretically impossible. This is pseudoscience. How can you even say that?"

Science or pseudoscience, we can bet that corporate America and the government are going to be using this.

A smart person with a computer and access to the internet can judge the sexual orientation of anyone in the world, or of millions of people simultaneously, with very little effort, which makes the lives of homophobes and oppressive regimes just a tiny bit easier.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Interesting) by bzipitidoo on Thursday February 01 2018, @05:25PM (4 children)

    by bzipitidoo (4388) on Thursday February 01 2018, @05:25PM (#631563) Journal

    One point of privacy is to evade bias. It seems many people hold a ton of unfair biases. In addition to the ones we hear about all the time, there's a well-known favoritism toward taller people just for being taller. It's so ingrained it's in our expressions, as in "look up to" and "look down on" someone. Favoritism toward the more physically attractive is another bias we live with.

    Employers have had to take proactive steps to reduce bias in hiring as much as possible, deliberately ignoring some kinds of info about job seekers. We'll probably see more of that as it becomes harder to keep private details about oneself private.

    Another is evading bad laws, such as copyright. If no one had any privacy, the entertainment cartels could actually enforce their vision of artificial scarcity and engage in far more rent seeking than they are able to now. As it is, ISPs are mostly on their customers' side on that one, refusing to link IP addresses to names. Without privacy, no one could ever get away with speeding or running a red light, no matter the circumstances - such as a traffic light that is malfunctioning and will not change, or a medical emergency.

  • (Score: 2) by DannyB on Thursday February 01 2018, @05:51PM

    by DannyB (5839) Subscriber Badge on Thursday February 01 2018, @05:51PM (#631584) Journal

    Employers have had to take proactive steps to reduce bias in hiring as much as possible

    Landlords don't have to. And neither do cake baking shops.

    And soon . . . there's a gaydar app for that!

    If an algorithm can learn to recognize sexual orientation from a face, it won't be long until an AI can also recognize whether someone is white or not.

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 0) by Anonymous Coward on Thursday February 01 2018, @07:28PM

    by Anonymous Coward on Thursday February 01 2018, @07:28PM (#631635)

    This is a nice look into how our brains work. It covers some aspects of bias, which infects us all.

    https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow [wikipedia.org]

  • (Score: 2) by dry on Thursday February 01 2018, @11:08PM (1 child)

    by dry (223) on Thursday February 01 2018, @11:08PM (#631749) Journal

    Some orchestras have moved to blind auditions and ended up with much more diverse orchestras.

    • (Score: 0) by Anonymous Coward on Friday February 02 2018, @12:40AM

      by Anonymous Coward on Friday February 02 2018, @12:40AM (#631795)

      Does that mean the violin and cello players won't all be hot sexy chicks anymore?

      Bummer.