Unsolicited nudes detected and deleted by AI

posted by janrinok on Sunday September 08 2019, @03:17AM

Software that can detect and delete unsolicited penis pictures sent via private messages on Twitter is being developed by researchers in Seattle. The project was started after developer Kelsey Bressler was sent an unsolicited nude photo by a man. She is now helping a friend refine an artificial intelligence system that can detect the unwanted penis pictures and delete them before they are ever seen.
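
The article gives no implementation details, but the detect-and-delete flow it describes maps onto an ordinary image-moderation filter. A minimal sketch in Python, assuming a binary image classifier already fine-tuned for the task; the checkpoint name nsfw_classifier.pt, the 0.9 threshold, and the deletion hook are all hypothetical, not details from the project:

    import torch
    from PIL import Image
    from torchvision import models, transforms

    # Standard ImageNet-style preprocessing for a 224x224 classifier.
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Hypothetical checkpoint: a ResNet-18 with a single-logit head,
    # fine-tuned to separate explicit from benign images.
    model = models.resnet18()
    model.fc = torch.nn.Linear(model.fc.in_features, 1)
    model.load_state_dict(torch.load("nsfw_classifier.pt", map_location="cpu"))
    model.eval()

    def should_delete(image_path: str, threshold: float = 0.9) -> bool:
        """Score one attached image; True means drop the DM before it is shown."""
        img = Image.open(image_path).convert("RGB")
        batch = preprocess(img).unsqueeze(0)           # shape (1, 3, 224, 224)
        with torch.no_grad():
            prob = torch.sigmoid(model(batch)).item()  # logit -> probability
        return prob >= threshold

The high threshold is deliberate: a filter like this fails in the direction of deleting legitimate photos, which is exactly the complaint raised about YouTube's algorithms in the comments below.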

She said social networks could do more to protect users from cyber-flashing. "When you receive a photo unsolicited it feels disrespectful and violating," Ms Bressler told the BBC. "It's the virtual equivalent of flashing someone in the street. You're not giving them a chance to consent, you are forcing the image on them, and that is never OK."

To test and train the artificial intelligence system, Ms Bressler and her team set up a Twitter inbox where men were invited to "send nudes for science". So many volunteered their nude photos that the team has had to close the inbox.
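
Purely as illustration of what "train" means here: if the volunteered photos were hand-labelled into explicit and benign folders, fine-tuning a pretrained network on them could look like the following (the data/ layout, hyperparameters, and single-epoch loop are assumptions, not details from the article):

    import torch
    from torch import nn
    from torchvision import datasets, models, transforms

    train_tf = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])
    # Expects data/explicit/ and data/benign/ subfolders (illustrative names).
    dataset = datasets.ImageFolder("data", transform=train_tf)
    loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

    # Start from ImageNet weights; replace the classifier head with one logit.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, 1)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.BCEWithLogitsLoss()

    model.train()
    for images, labels in loader:  # one pass over the donated images
        optimizer.zero_grad()
        logits = model(images).squeeze(1)
        loss = loss_fn(logits, labels.float())
        loss.backward()
        optimizer.step()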

Related: "Deep Nude" App Removed By Developers After Brouhaha
GitHub Censors "Sexually Obscene" DeepNude Code


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Monday September 09 2019, @04:23AM (#891542)

    > That [YouTube's Content ID] is infamous for detecting copyrighted music and automatically siphoning off ad revenue to license holders.

    It's also infamous for being abysmal and completely ignoring fair use.

    > Without them, advertisers will continually flee the website whenever the Wall Street Journal hypes up ads appearing on racist or sexual videos.

    Absolutely. And it doesn't matter how poorly the algorithms work or how many innocent people are caught in the crossfire; the appearance of doing something, anything, is what is important. The journalists are utterly inept (or malicious) to suggest that YouTube and other sites can create a magical algorithm to detect Bad Content and make it all go away.