
posted by janrinok on Sunday September 08 2019, @03:17AM

Unsolicited nudes detected and deleted by AI

Software that can detect and delete unsolicited penis pictures sent via private messages on Twitter is being developed by researchers in Seattle. The project was started after developer Kelsey Bressler was sent an unsolicited nude photo by a man. She is now helping a friend refine an artificial intelligence system that can detect the unwanted penis pictures and delete them before they are ever seen.

She said social networks could do more to protect users from cyber-flashing. "When you receive a photo unsolicited it feels disrespectful and violating," Ms Bressler told the BBC. "It's the virtual equivalent of flashing someone in the street. You're not giving them a chance to consent, you are forcing the image on them, and that is never OK."

To test and train the artificial intelligence system, Ms Bressler and her team set up a Twitter inbox where men were invited to "send nudes for science". So many volunteered their nude photos that the team has had to close the inbox.

Related: "Deep Nude" App Removed By Developers After Brouhaha
GitHub Censors "Sexually Obscene" DeepNude Code


Original Submission

 
  • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday September 08 2019, @04:53PM (#891338) Journal (1 child)

    And this developer wants to use this awesome tech for a crusade against unwanted porn?

    Facebook and other social media sites have been working with nudity detection for years.

    https://venturebeat.com/2018/10/24/facebook-used-ai-to-remove-8-7-million-images-of-child-nudity-last-quarter/ [venturebeat.com]

    Detecting that an image contains nudity or a penis is an easier problem than identifying a specific person in an image, or detecting jaywalkers in real time so you don't run them over.

    Applied at the scale of one person's inbox instead of an entire social media network, the occasional false positive hardly matters: the user can review and manually override the block, as in the sketch below.
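    A minimal sketch of what such a per-user filter might look like. Everything here is hypothetical: nude_score stands in for whatever trained classifier the project actually uses, whose API has not been published.

```python
# Hypothetical per-user DM filter: quarantine high-confidence detections,
# let the user review and release false positives.
from dataclasses import dataclass, field

@dataclass
class InboxFilter:
    threshold: float = 0.9              # block only confident detections
    quarantine: list = field(default_factory=list)

    def nude_score(self, image_bytes: bytes) -> float:
        """Placeholder for a real trained classifier returning P(nudity)."""
        raise NotImplementedError("plug a trained model in here")

    def on_message(self, sender: str, image_bytes: bytes) -> bool:
        """Return True if the image should be shown to the user."""
        if self.nude_score(image_bytes) >= self.threshold:
            # Hide the image but keep it, so a false positive can be
            # recovered instead of the message being lost outright.
            self.quarantine.append((sender, image_bytes))
            return False
        return True

    def release(self, index: int) -> bytes:
        """Manual override: the user reviews and unblocks a quarantined image."""
        sender, image_bytes = self.quarantine.pop(index)
        return image_bytes
```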

    > We'll have an all new engine for automated takedowns of allegedly copyright infringing content, no due process need apply.

    That exists: see YouTube's Content ID. That is infamous for detecting copyrighted music and automatically siphoning off ad revenue to license holders. But it can also be used to detect reused clips, even taking transformations of the content into account.
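    Content ID's actual fingerprinting is proprietary, but a toy perceptual hash shows why simple transformations (re-encoding, resizing, brightness shifts) don't defeat this kind of detection. A sketch assuming Pillow is installed, with placeholder file names and an arbitrary 5-bit match threshold:

```python
# Toy perceptual "average hash": far simpler than Content ID, but robust
# to re-encoding, mild scaling, and brightness shifts in the same spirit.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """64-bit hash: shrink to 8x8 grayscale, threshold pixels at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p >= mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Placeholder file names; for video, the same idea is applied per frame.
if hamming(average_hash("uploaded_frame.png"),
           average_hash("registered_frame.png")) <= 5:
    print("probable match: route ad revenue to the rights holder")
```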

    Nothing Kelsey Bressler does here will change the fact that these algorithms, and the hardware running them, keep improving. Copyright holders have an intense interest in getting these technologies working smoothly; they are already used to detect illegal content, and machine learning recognition algorithms are regarded as necessary to YouTube's survival. Without them, advertisers will continually flee the website whenever the Wall Street Journal hypes up ads appearing on racist or sexual videos.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 0) by Anonymous Coward on Monday September 09 2019, @04:23AM (#891542)

    > That is infamous for detecting copyrighted music and automatically siphoning off ad revenue to license holders.

    It's also infamous for being abysmally inaccurate and for completely ignoring fair use.

    > Without them, advertisers will continually flee the website whenever the Wall Street Journal hypes up ads appearing on racist or sexual videos.

    Absolutely. And it doesn't matter how poorly the algorithms work or how many innocent people are caught in the crossfire; the appearance of doing something, anything, is what matters. Journalists are utterly inept (or malicious) when they suggest that YouTube and other sites can conjure a magical algorithm to detect Bad Content and make it all go away.