Deepfakes are being used not only to make people laugh, but to make disturbingly convincing celebrity sex tapes and to derail political campaigns.
Experts warn that as artificial intelligences become more sophisticated, the fakes are increasingly hard to spot.
Deepfake researcher Nina Schick, whose book Deep Fakes and the Infocalypse: What You Urgently Need To Know is an urgent wake-up call about the danger to democracy posed by AI, told the Daily Star: “Humans will never be able to detect deepfakes… it’s already there – to the naked eye they’re perfect.”
But an experimental new AI tool could provide some hope.
The system, developed by computer scientists from the University at Buffalo (SUNY), boasts a 94% success rate at detecting portrait-style deepfakes.
The researchers, Shu Hu, Yuezun Li, and Siwei Lyu, realised that the tiny reflections in a real person’s eyes match each other in a way that the reflections in computer-generated eyes do not.
The team explain in their paper, published on arXiv: “When the subject’s eyes look straight at the camera and the light sources or reflections in the surrounding environment are relatively far away from the subject (i.e., the ‘portrait setting’), the two eyes see the same scene.”
The new AI system spots these tiny differences and assigns a realism score to the image – the lower the score, the more likely it is that the face in the photo is a deepfake.
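The core idea can be sketched in a few lines of code: isolate the bright specular highlights in each eye and measure how well the two patterns overlap. This toy Python version is only an illustration of that principle, not the Buffalo team’s actual code – the function names (`highlight_mask`, `reflection_similarity`) and the simple threshold-plus-IoU scoring are assumptions for the sake of the example.

```python
# Toy illustration of comparing the two eyes' reflections (not the
# authors' implementation): threshold bright pixels in each grayscale
# eye crop, then score the overlap of the two highlight masks with
# intersection-over-union (IoU). High score = reflections agree.
import numpy as np

def highlight_mask(eye_crop, thresh=0.9):
    """Binary mask of bright (specular) pixels in a grayscale eye crop."""
    return eye_crop >= thresh

def reflection_similarity(left_eye, right_eye, thresh=0.9):
    """IoU of the two eyes' highlight masks: near 1.0 means the
    reflections match (consistent with a real photo), near 0.0
    means they differ (a possible deepfake tell)."""
    a = highlight_mask(left_eye, thresh)
    b = highlight_mask(right_eye, thresh)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0  # no highlights at all - nothing to compare
    return np.logical_and(a, b).sum() / union

# Synthetic example: matching highlights vs. mismatched ones
left = np.zeros((8, 8)); left[3:5, 3:5] = 1.0
right_real = left.copy()                          # same reflection
right_fake = np.zeros((8, 8)); right_fake[1:2, 6:8] = 1.0
print(reflection_similarity(left, right_real))    # 1.0
print(reflection_similarity(left, right_fake))    # 0.0
```

A real pipeline would first need face and iris detection to extract the eye crops, which is where the “portrait-style only” limitation described below comes from.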
In tests, the system spotted almost every example it was shown from Phillip Wang’s This Person Does Not Exist site, while correctly identifying a large number of portrait photos of real people taken from Flickr as genuine.
At present the technique only works on portrait-style photos. In more casual compositions, say the researchers, there’s a higher chance of “false positive” results where the system flags a real person's photo as a deepfake.
The system is also highly dependent on natural light reflections being present in the subject’s eyes.
“In the future, we will investigate these aspects and further improve the effectiveness of our methods,” say the SUNY researchers.
Of course, now they're in a race against time as the deepfake makers will be seeking to use the SUNY data to make even more convincing images.