Deepfake 'nudify' services used by thousands of Australians withdrawn after government warning

The decision comes after students were found to be using the services to create fake nude images of their peers.


eSafety commissioner Julie Inman Grant said her organisation is working with the government on reforms to restrict access to "nudify" tools. Source: AAP / Mick Tsikas

This article contains references to child sexual exploitation.

Three of the most widely used "nudify" services, linked to AI-generated sexual exploitation material of school children, have been withdrawn from Australia.

The UK-based company behind the services withdrew access after the eSafety Commission issued an official warning in September over fears the company was allowing users to create artificially generated child sexual exploitation material.

Allowing such material contravened Australia's mandatory code, which requires all online industry members to take meaningful steps to tackle the worst-of-the-worst online content.

About 100,000 Australians were visiting the "nudify" services each month, and the tools have featured in high-profile cases of students creating fake nude images of their classmates.

The takedowns showed Australia's codes and standards were working to make the online world safer, eSafety commissioner Julie Inman Grant said.

"We know 'nudify' services have been used to devastating effect in Australian schools," Inman Grant said.

"With this major provider blocking their use by Australians, we believe it will have a tangible impact on the number of Australian school children falling victim to AI-generated child sexual exploitation."

She said the provider had failed to prevent its services being used to create child sexual exploitation material, after marketing features like "undressing any girl" and options for "schoolgirl" image generation and "sex mode".

Reports to eSafety about digitally altered images, including deepfakes, from people under the age of 18 have doubled in the past 18 months.

Four out of five reports involved the targeting of women and girls.

The action follows global AI model-hosting platform Hugging Face changing its terms of service after warnings that Australians were misusing some of its generative tools to create child sexual exploitation material.

Hugging Face's new terms require users to minimise the risks associated with models that they upload, specifically to prevent generating child sexual exploitation or pro-terror material.

The company must enforce those terms when it becomes aware of breaches, or risk fines of up to $49.5 million.

Inman Grant said her organisation is working with the government on reforms to restrict access to "nudify" tools.

If you or someone you know is impacted by sexual assault, call 1800RESPECT on 1800 737 732, or call the National Sexual Abuse and Redress Support Service on 1800 211 028.

