Deepfake Abuse Reports Double in Australia as eSafety Targets Undressing Apps Used by 100,000
This piece is freely available to read. Become a paid subscriber today and help keep Mencari News financially afloat so that we can continue to pay our writers for their insight and expertise.
Australian regulators are investigating a United Kingdom-based company behind popular “undressing apps” used by at least 100,000 Australians, eSafety Commissioner Julie Inman Grant revealed Friday. Deepfake image-based abuse reports to the government agency have doubled over the past 18 months, with incidents now occurring at least weekly in Australian schools.
Australia’s online safety regulator is escalating action against artificial intelligence-powered applications that create non-consensual intimate images, announcing Friday it is targeting one of the most widely used nudifying services while warning the problem represents “just the tip of the iceberg.”
eSafety Commissioner Julie Inman Grant disclosed the investigation during a press conference with Communications Minister Anika Wells, revealing the agency is working with British counterpart Ofcom to take action against a UK-based company whose undressing applications are used by approximately 100,000 people in Australia.
“We are taking some action today against some nudifying services, in this case against a company that probably make some of the most popular undressing apps that are used by at least 100,000 people in Australia,” Inman Grant said. “And we are just ramping up our investigation. They’re based in the UK, and we’re working with our sister regulator, Ofcom, to take further action.”
The announcement comes as New South Wales police investigate a case involving deepfake images in Sydney schools, with eSafety investigators already in contact with state police, the Ride Center and the Department of Education regarding the incident.
Inman Grant said deepfake image-based abuse incidents are now happening at least once weekly in Australian schools, prompting the eSafety Commission to release a specialized incident management tool for educational institutions to help them determine when to involve police and when to request content removal through the federal agency.
“We’ve seen a doubling of deepfaked image-based abuse reports to us over the past 18 months,” Inman Grant said. “So it’s still a small proportion of our overall image-based abuse reports. But I think it’s just the tip of the iceberg.”
The commissioner emphasized deepfake image-based abuse already falls under the government’s existing image-based abuse scheme, which maintains a 98 percent success rate in removing reported content. However, she warned emerging technologies are dramatically accelerating the threat.
Inman Grant specifically cited OpenAI’s recent release of Sora, an AI-powered social video application that combines video generation and image manipulation, enabling users to harvest images of others and create hyper-realistic deepfake videos within seconds.
“This is real cause for concern. This is really putting potential online harms on steroids,” Inman Grant said of the technological developments.
When asked whether social media and technology companies are doing enough to protect Australian children from exposure to such content, Inman Grant said companies “could do much more.”
She noted the upcoming social media minimum age restrictions will establish important structures to prevent children’s access to major platforms, while the eSafety Commission is using transparency powers, codes and standards to address the broader technology sector.
“Questions have been asked about sites like 4chan and why aren’t these being covered by the social media minimum age,” Inman Grant said. “It’s because we are targeting social media sites and where young people are. What our codes will be able to do is to tackle the worst of the worst sites.”
Communications Minister Anika Wells said the federal government announced at the start of September plans to restrict nudify apps across the country, building on existing state and territory laws that ban either the creation or distribution of such content.
“We want to stop it before it starts. We want to move from an ambulance at the bottom of the cliff setting where... as much as we do important and vigorous work, it’s always a case of whack-a-mole to a place where these things never reach the app store in the first place,” Wells said.
The minister said work to restrict nudify apps announced in September will continue “as quickly as we can,” though she did not provide a specific timeline for implementation.
When asked about international approaches to the problem, Inman Grant said Australia is leading in this policy area, with other countries asking Australian officials about their methods and strategies.
“I think Australia is leading, and frankly, they’re asking us about what we’re doing and how we’re doing it, and we’re continuing to have those conversations with our international counterparts,” Inman Grant said.
The escalating threat posed by nudifying applications and deepfake technology has emerged as a parallel concern alongside the government’s push to implement social media age restrictions by December 10.
Officials are positioning their multi-pronged approach—combining the minimum age laws, enhanced enforcement against specific harmful applications, and comprehensive education initiatives—as necessary to address the rapidly evolving landscape of online harms facing young Australians.
The eSafety Commission maintains a repository of resources and complaint mechanisms on its website, allowing any Australian experiencing online abuse including cyberbullying, image-based abuse, gore content or child sexual abuse material to request content removal and harm remediation in real time.
Inman Grant described the eSafety website as “a rich repository of not just resources, but the ability for any Australian who’s experiencing online abuse” to access support, emphasizing the agency’s high success rate in getting harmful content removed.
The investigation of the UK-based undressing app company represents one of the eSafety Commission’s most significant enforcement actions targeting AI-powered tools designed to create non-consensual intimate imagery.
With deepfake technology becoming increasingly accessible and sophisticated, Australian regulators are racing to establish frameworks that can prevent harm before it occurs rather than simply responding after images are created and distributed.
The doubling of deepfake abuse reports over 18 months, combined with weekly incidents in schools, suggests the problem is accelerating faster than traditional regulatory and educational responses can address, prompting officials to pursue both immediate enforcement actions and longer-term structural reforms to platform access and technological development.
Sustaining Mencari Requires Your Support
Independent journalism costs money. Help us continue delivering in-depth investigations and unfiltered commentary on the world's real stories. Your financial contribution enables thorough investigative work and thoughtful analysis, all supported by a dedicated community committed to accuracy and transparency.
Subscribe today to unlock our full archive of investigative reporting and fearless analysis. Subscribing to independent media outlets represents more than just information consumption—it embodies a commitment to factual reporting.
As well as knowing you’re keeping Mencari (Australia) alive, you’ll also get:
Breaking news AS IT HAPPENS - Gain instant access to our real-time coverage and analysis when major stories break, keeping you ahead of the curve
Unlock our COMPLETE content library - Enjoy unlimited access to every newsletter, podcast episode, and exclusive archive—all seamlessly available in your favorite podcast apps.
Join the conversation that matters - Be part of our vibrant community with full commenting privileges on all content, directly supporting The Evening Post (Australia)
It only takes a minute to help us investigate fearlessly and expose lies and wrongdoing to hold power accountable. Thanks!