Unsolicited nudes detected and deleted by AI
Software that can detect and delete unsolicited penis pictures sent via private messages on Twitter is being developed by researchers in Seattle. The project was started after developer Kelsey Bressler was sent an unsolicited nude photo by a man. She is now helping a friend refine an artificial intelligence system that can detect the unwanted penis pictures and delete them before they are ever seen.
She said social networks could do more to protect users from cyber-flashing. "When you receive a photo unsolicited it feels disrespectful and violating," Ms Bressler told the BBC. "It's the virtual equivalent of flashing someone in the street. You're not giving them a chance to consent, you are forcing the image on them, and that is never OK."
To test and train the artificial intelligence system, Ms Bressler and her team set up a Twitter inbox where men were invited to "send nudes for science". So many volunteered their nude photos that the team has had to close the inbox.
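The BBC piece does not describe the team's actual model, but a filter like this is typically a binary image classifier run behind a confidence threshold before a DM is ever displayed. The sketch below illustrates that general shape in Python/PyTorch; the network choice, the checkpoint name, the class ordering, and the inbox hook are all assumptions for illustration, not details from the project.

```python
# A minimal sketch of an explicit-image DM filter, assuming a fine-tuned
# convolutional classifier. The checkpoint, class ordering, and DM hook
# are hypothetical; the article does not describe the real implementation.

import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing for a convolutional classifier.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A ResNet with its final layer replaced for a two-class task
# (benign vs. explicit). Fine-tuned weights would be loaded here.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
# model.load_state_dict(torch.load("nude_filter.pt"))  # hypothetical checkpoint
model.eval()

THRESHOLD = 0.9  # delete only when the classifier is confident

def is_explicit(image_path: str) -> bool:
    """Return True if the image is classified as explicit."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)
    return probs[0, 1].item() > THRESHOLD  # class 1 = explicit (assumed)

def filter_dm(image_path: str) -> None:
    """Hypothetical inbox hook: drop flagged images before display."""
    if is_explicit(image_path):
        print(f"Deleted {image_path} before display.")
    else:
        print(f"Allowed {image_path}.")
```

The high threshold is the key design choice in a setup like this: a false positive silently deletes a legitimate photo, so the filter should err on the side of letting images through and only remove what it is confident about.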
Related: "Deep Nude" App Removed By Developers After Brouhaha
GitHub Censors "Sexually Obscene" DeepNude Code
(Score: 2, Insightful) by Anonymous Coward on Sunday September 08 2019, @07:10AM
"Ok, so how do they know if it's unsolicited or not? That would be the algorithm that interests me."
Presumably, she is the one who decides whether it is unsolicited, much as I decide that solicitations from Nigerian princes asking for help moving a fortune should end up in the spam folder. I don't actually have to see the message to know that's where it belongs.
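Neither the article nor the comment spells out a rule, but one plausible answer is to combine the classifier's verdict with a simple consent heuristic. The sketch below is purely illustrative: the fields and the rule itself are assumptions, not the project's actual logic.

```python
# A hypothetical consent heuristic. Twitter's real API and the project's
# actual rule are not described in the article; these fields are
# illustrative only.

def is_unsolicited(sender_id: str, recipient_follows: set[str],
                   recipient_opt_in: bool) -> bool:
    """Treat an image as unsolicited unless the recipient has signalled
    some form of consent: they follow the sender, or they explicitly
    opted in to receiving images."""
    if recipient_opt_in:
        return False
    return sender_id not in recipient_follows

def should_delete(image_is_explicit: bool, sender_id: str,
                  recipient_follows: set[str],
                  recipient_opt_in: bool) -> bool:
    # Delete only when the image is both explicit and unsolicited.
    return image_is_explicit and is_unsolicited(
        sender_id, recipient_follows, recipient_opt_in)
```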
"Image recognition isn't exactly cutting edge."
Indeed it isn't. I'd recommend not testing its limits if you want to score with her. Just sayin'.