Now, Adobe has trained Artificial Intelligence (AI) to detect facial manipulation in images that have been edited with Photoshop software.
In partnership with UC Berkeley, Adobe has developed a deepfake detection technology that uses AI to identify manipulated images. The new software is designed to determine whether or not an image has been edited with Photoshop.
This AI-powered Adobe technology helps to identify the distortion of false images, videos, audio, and documents. The team of researchers used a Convolutional Neural Network (CNN), an architecture designed specifically for image recognition, in this detection program. The CNN is able to spot the use of Photoshop's Face Aware Liquify feature, which can be used to change facial features and lip expressions.
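The core operation a CNN stacks many times to learn local patterns, such as warping artifacts, is the 2D convolution. The following is a minimal illustrative sketch in pure Python; the actual Adobe/Berkeley model, its architecture, and its learned filters are not public:

```python
# Minimal 2D convolution: the building block a convolutional neural
# network (CNN) repeats to detect local image patterns. Illustrative
# only -- the real detector's weights are learned from training data.

def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    acc += image[y + ky][x + kx] * kernel[ky][kx]
            row.append(acc)
        out.append(row)
    return out

# A Laplacian-style kernel responds to abrupt local changes -- the
# kind of low-level cue a trained CNN can learn to pick up on.
kernel = [[0, 1, 0],
          [1, -4, 1],
          [0, 1, 0]]

flat = [[5.0] * 5 for _ in range(5)]   # a uniform image region
print(conv2d(flat, kernel)[0][0])      # 0.0: no local structure here
```

In a real CNN the kernel values are not hand-picked like this; they are learned during training so that the filters respond to whatever artifacts distinguish edited faces from originals.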
Adobe researcher Oliver Wang said,
We started by showing image pairs (an original and an alteration) to people who knew that one of the faces was altered. For this approach to be useful, it should be able to perform significantly better than the human eye at identifying edited faces.
With this latest technology, the trained neural network can correctly identify 99 percent of the edited photos. The tool detects the use of Adobe's Face Aware Liquify feature, which makes it easy to make subtle tweaks to someone's face in a picture.
If people can easily tell that an image has been distorted by editing, they are better able to protect themselves from being misled.
The detection tool highlights the areas of the image that appear to be modified. Since the training is based on an established Adobe tool, the AI technology can even reverse the edits to produce a very good approximation of the original photo.
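The "reverse the edits" step can be pictured as predicting, for each pixel, where the Liquify warp moved it from, and then resampling the image at those source positions. Below is a toy sketch under that assumption; the real system predicts a dense warp field with a neural network, whereas here the field is simply given:

```python
# Toy illustration of undoing a known warp. A displacement field
# says where each output pixel should be sampled from; resampling
# at those positions rebuilds an approximation of the original.
# The real system *predicts* this field; here it is hand-written.

def unwarp(image, flow):
    """flow[y][x] = (dy, dx): take the pixel at (y + dy, x + dx)."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dy, dx = flow[y][x]
            sy = min(max(y + dy, 0), h - 1)   # clamp to image bounds
            sx = min(max(x + dx, 0), w - 1)
            out[y][x] = image[sy][sx]
    return out

# "Edited" image: one bright pixel pushed one step to the right.
edited = [[0, 0, 0],
          [0, 0, 9],
          [0, 0, 0]]
# Recovery field: swap the two middle-row pixels, moving the
# bright pixel back toward the center.
flow = [[(0, 0)] * 3,
        [(0, 0), (0, 1), (0, -1)],
        [(0, 0)] * 3]
print(unwarp(edited, flow))   # [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
```

Because the detector is trained against a known editing tool, the warps it must undo come from a constrained family, which is what makes a close approximation of the original photo feasible.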
Gavin Miller, head of research at Adobe, said that public awareness matters even more than such technologies.
Beyond technologies like this, the best defense will be a sophisticated public who know that content can be manipulated - often to delight them, but sometimes to mislead them.