The Australian Federal Police Association (AFPA) supports the bill to be introduced by Independent Member of Parliament Kate Chaney to criminalise the possession and distribution of artificial intelligence (AI) tools designed to create child exploitation and sexual abuse material.
Australia’s current laws do not explicitly ban the downloading or use of AI generators built for exploitation and abuse content. These tools, which are increasingly available online, allow unlimited production of harmful material. This makes policing more difficult and increases risks to children.
AFPA President Alex Caruana said the bill is a vital step in updating the law.
“AI-created child exploitation and abuse material reduces law enforcement’s ability to detect and intervene. It diverts critical resources and complicates efforts to identify real victims.
“Criminals have always looked for new ways to hide their behaviour. AI gives them another tool to create, share and conceal material that harms children. Without clear laws, police are left on the back foot, fighting with one hand tied behind their back.
“These tools can mimic real children or use images of actual victims. That makes it extremely difficult to tell the difference between a digital fake and a real child. Every hour police spend chasing AI-generated material is an hour taken away from finding and rescuing a real child in danger,” Mr Caruana said.
The bill would create new offences for using a carriage service to download, access, supply or promote technology designed to generate child exploitation material, and for scraping or distributing data to train such tools. The maximum penalty would be 15 years in prison. It also includes protections to allow law enforcement and intelligence agencies to investigate these crimes.
The AFPA supports the views of child safety experts including former detective inspector Jon Rouse and Colm Gannon from the International Centre for Missing and Exploited Children, who stress that these AI tools have no public benefit. Mr Caruana said the AFPA will continue to push for stronger protections, stressing that this is not an issue that can be delayed.
“Every day that these tools remain unregulated, more children are put at risk. The harm is real, it is immediate, and it is devastating. This bill gives us a clear and practical way to act.
“We welcome this bill and urge all parliamentarians to support it. The safety of children must come before everything else. Law enforcement needs the legal framework and the resources to stay ahead of offenders. Anything less puts lives at risk,” Mr Caruana said.
Media Contact:
Troy Roberts, AFPA Media and Government Relations Manager (02) 6285 1677 | troy.r@afpa.org.au