from the AI-can-tell-by-some-of-the-pixels-and-from-seeing-quite-a-few-shops-in-my-time dept.
This AI Can Tell When Faces in Photos Were Photoshopped
Fake photos are a rampant issue in our digital age, but researchers are working hard to restore a greater degree of trust to photography. One team has created a new AI that can detect when faces in photos were manipulated using Photoshop.
The researchers at Adobe and UC Berkeley have published their work in a new paper titled "Detecting Photoshopped Faces by Scripting Photoshop," which explains how the new method can figure out whether Photoshop's Face Aware Liquify feature was used.
[...] While humans were only able to detect the edited faces 53% of the time, the AI managed to correctly catch 99% of them. What's even more impressive is that in addition to figuring out whether and where a photo was manipulated, the AI could also undo those edits and bring that photo back toward its original state.
Also at Adobe Blog and DIYPhotography.
(Score: 3, Insightful) by nishi.b on Monday June 17 2019, @04:04PM (1 child)
How long until a GAN is trained with this tool to learn how to create faces that evade this type of detection?
(Score: 0) by Anonymous Coward on Monday June 17 2019, @10:05PM
Evading detection isn't enough; the output still needs to look like a face.
(Score: 1) by fustakrakich on Monday June 17 2019, @04:52PM (3 children)
What if I used Corel or Microsoft Paint?
Politics and criminals are the same thing..
(Score: 0) by Anonymous Coward on Monday June 17 2019, @05:08PM (1 child)
or excessive makeup?
(Score: 2) by c0lo on Monday June 17 2019, @10:07PM
It's fair to say that enough makeup effectively generates a new face without Adobe. This should be made illegal.
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 3, Interesting) by Luke on Monday June 17 2019, @10:27PM
Eh? Don't you mean GIMP!
Actually it _may_ make a difference, from what I read in the paper.
As far as I could tell, the only tool they used to generate the test 'fakes' was Photoshop. Although I've no specific knowledge of Photoshop, I expect it has its own particular way of doing things that may in itself generate a 'marker' that could identify changes, given enough comparative data.
The paper discusses 'pixel-wise reconstruction loss', 'resizing methods (bicubic and bilinear), JPEG compression, brightness, contrast, and saturation' as some indicators. At a high level I would have thought other editing tools may handle things a little differently and thus could reduce the accuracy of the process at this stage.
However I'd imagine that in the future any such inaccuracies would shrink as the dataset grew and the process was refined - much like facial recognition has improved since the '60s - although it still has a long way to go, as does voice recognition, it would appear.
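For what it's worth, the perturbations the paper lists (resizing, JPEG compression, brightness/contrast/saturation changes) are standard data-augmentation tricks for making a detector robust to benign global edits. A minimal sketch of brightness/contrast jitter on a grayscale pixel list (function name and parameters are mine, purely illustrative, not the paper's code):

```python
import random

def jitter(pixels, brightness=0.1, contrast=0.1, rng=None):
    """Randomly perturb brightness and contrast of 0-255 grayscale pixels.

    Training a detector on such perturbed copies teaches it to ignore
    benign global changes and focus on local warping artifacts.
    """
    rng = rng or random.Random()
    b = rng.uniform(-brightness, brightness) * 255  # additive brightness shift
    c = 1.0 + rng.uniform(-contrast, contrast)      # multiplicative contrast gain
    mean = sum(pixels) / len(pixels)                # contrast pivots around the mean
    return [max(0, min(255, round((p - mean) * c + mean + b))) for p in pixels]
```

A real pipeline would also randomly resize and re-JPEG the image, but the idea is the same: the label ("edited" vs. "original") stays fixed while these nuisance transforms vary.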
(Score: 1, Interesting) by Anonymous Coward on Tuesday June 18 2019, @02:27AM
Are they detecting by analyzing probabilities of the least significant bit? Would a stego insertion of noise mask the marker?
There are all sorts of ways to put extra stuff into images that are not obvious upon visual inspection.
The art of doing this - known as steganography - is a tool of choice for covert communication.
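For readers unfamiliar with the mechanics, LSB embedding can be sketched in a few lines (a toy illustration over a flat list of 0-255 pixel values - function names are mine, and this is not the detection method from the paper):

```python
def embed_lsb(pixels, message):
    """Hide message bytes in the least significant bit of each pixel."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("message too long for carrier")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear LSB, then set it to the message bit
    return out

def extract_lsb(pixels, n_bytes):
    """Recover n_bytes hidden with embed_lsb."""
    out = bytearray()
    for j in range(n_bytes):
        byte = 0
        for bit in pixels[j * 8:(j + 1) * 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)
```

Each carrier pixel changes by at most 1, which is invisible to the eye but shifts the statistics of the LSB plane - which is exactly what the LSB-probability analysis mentioned above would look for.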