
Unsolicited nudes detected and deleted by AI

posted by janrinok on Sunday September 08 2019, @03:17AM

Software that can detect and delete unsolicited penis pictures sent via private messages on Twitter is being developed by researchers in Seattle. The project was started after developer Kelsey Bressler was sent an unsolicited nude photo by a man. She is now helping a friend refine an artificial intelligence system that can detect the unwanted penis pictures and delete them before they are ever seen.
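The BBC report gives no implementation details. Purely as an illustration, a filter of this kind boils down to running each incoming image through a binary classifier and suppressing it when the score crosses a threshold. Everything in the sketch below (the model's two-class output head, the preprocessing, the threshold value) is an assumption for illustration, not anything published by Bressler's project:

```python
# Hypothetical sketch of the filtering step: score an incoming DM image
# with a binary classifier and decide whether to delete it before display.
# The classifier head ([safe, nsfw]) and the threshold are assumptions.
from PIL import Image
import torch
from torchvision import transforms

NSFW_THRESHOLD = 0.9  # assumed cutoff; a real system would tune this

# Standard ImageNet-style preprocessing for a pretrained vision model
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def should_delete(image_path: str, model: torch.nn.Module) -> bool:
    """Return True if the classifier flags the image as an unwanted nude."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)  # shape (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)  # assumed two-class head: [safe, nsfw]
        prob_nsfw = torch.softmax(logits, dim=1)[0, 1].item()
    return prob_nsfw >= NSFW_THRESHOLD
```

A bot sitting on the inbox would call something like should_delete() on every attachment and drop flagged images before the client renders them; the false-positive case is exactly the evidence problem raised in the comments below.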

She said social networks could do more to protect users from cyber-flashing. "When you receive a photo unsolicited it feels disrespectful and violating," Ms Bressler told the BBC. "It's the virtual equivalent of flashing someone in the street. You're not giving them a chance to consent, you are forcing the image on them, and that is never OK."

To test and train the artificial intelligence system, Ms Bressler and her team set up a Twitter inbox where men were invited to "send nudes for science". So many volunteered their nude photos that the team has had to close the inbox.
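Hypothetically, training on a corpus collected that way reduces to ordinary transfer learning: fine-tune a pretrained image model on two labelled folders of volunteered photos. The directory layout, model choice, and hyperparameters below are invented for illustration; nothing here is from the actual project:

```python
# Hypothetical fine-tuning sketch: adapt a pretrained ResNet-18 to a
# two-class (safe vs. nsfw) task using crowd-sourced images.
# Expects made-up subdirectories submissions/safe/ and submissions/nsfw/.
import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

dataset = datasets.ImageFolder("submissions", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # replace head: safe vs. nsfw

criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.fc.parameters(), lr=1e-3)  # train the head only

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Freezing the pretrained backbone and training only the new head is a common choice when the labelled dataset is small, which a single crowd-sourced inbox presumably is.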

Related: "Deep Nude" App Removed By Developers After Brouhaha
GitHub Censors "Sexually Obscene" DeepNude Code


Original Submission

 
  • (Score: 0) by Anonymous Coward on Sunday September 08 2019, @06:30AM (#891216) (1 child)

    That raises a question: Will an automatically deleted picture count as evidence? You can't look at the picture to check whether it really was a penis, or whether it was just misidentified as such by the software, since the image was deleted. It would suck to send a completely harmless picture to someone who has that software installed, have the software misidentify it on the receiving end, and then get fined and listed as a "sex offender".

    Only if you live in Texas. Yet another good reason not to live in that hell hole.

  • (Score: 2) by Runaway1956 (2926) Subscriber Badge on Sunday September 08 2019, @07:00AM (#891220) Journal

    Only if you live in Texas.

    That is probably not correct. Texas could claim jurisdiction whenever any of the following applies:
    1. pic was sent FROM Texas
    2. pic was sent TO Texas
    3. pic was routed THROUGH Texas
    4. device the pic was stored on was transported into, through, or even over Texas
    5. someone from Texas inadvertently stumbled over the pic
    6. pic was stored on a server in Texas

    The US has set plenty of precedents by which it assumes jurisdiction globally. Surely Texas can follow the Feds' example.