Monash University and the Australian Federal Police (AFP) are developing a new “data poisoning” tool designed to stop criminals from creating malicious AI-generated content, including deepfakes and child abuse material. The tool, called ‘Silverer’, works by subtly altering images before they are uploaded.
The tool is being built at the AiLECS Lab, a collaboration between the AFP and Monash. Data poisoning involves altering pixels in a way that is invisible to humans but “tricks” AI models trained on the data: when criminals try to use the poisoned images, the models produce inaccurate, skewed, or unrecognisable results.
“Before a person uploads images on social media or the internet, they can modify them using Silverer,” said Project Lead Elizabeth Perry. “This will alter the pixels to trick AI models and the resulting generations will be very low-quality, covered in blurry patterns, or completely unrecognisable.”
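Silverer’s actual algorithm has not been published, but the general data-poisoning idea described above, shifting pixel values by an amount too small for the eye to notice, can be sketched roughly as follows. The function name, the perturbation budget `epsilon`, and the use of uniform random noise are all illustrative assumptions, not details of the real tool.

```python
import random

def poison_pixels(pixels, epsilon=2, seed=0):
    """Shift each 8-bit intensity value by at most +/-epsilon levels.

    A generic illustration of data poisoning, NOT Silverer's
    (unpublished) method: the change is far too small for a human to
    notice, but it alters the exact values an AI model would train on.
    """
    rng = random.Random(seed)
    poisoned = []
    for value in pixels:
        shifted = value + rng.randint(-epsilon, epsilon)
        # Clamp so the result stays a valid 8-bit intensity (0-255).
        poisoned.append(max(0, min(255, shifted)))
    return poisoned

# Example: a flat list of grey pixel values (one colour channel).
image = [128] * 16
poisoned = poison_pixels(image)
# Every pixel changes by at most 2 levels out of 255.
assert all(abs(p - o) <= 2 for p, o in zip(poisoned, image))
```

Real systems use perturbations optimised against specific model families rather than random noise, which is what makes the poisoned data so disruptive to training while remaining imperceptible.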
Harmful images and videos
The AFP has identified an increase in AI-generated child abuse material. Digital forensics expert and AiLECS Co-Director Associate Professor Campbell Wilson said the generation of fake images is a growing problem. “Currently, these AI-generated harmful images and videos are relatively easily created using open source technology and there’s a very low barrier to entry for people to use these algorithms,” Associate Professor Wilson said.
AFP Commander Rob Nelson said the tool could also help investigators by cutting down the volume of fake material to wade through.
“We don’t anticipate any single method will be capable of stopping the malicious use or re-creation of data, however, what we are doing is similar to placing speed bumps on an illegal drag racing strip,” Commander Nelson said. “We are building hurdles to make it difficult for people to misuse these technologies.”
The ‘Silverer’ prototype has been in development for the past 12 months, and discussions are under way about deploying it internally at the AFP. The project’s goal is to create an easy-to-use tool that lets ordinary Australians protect their data.