Facebook removes 8.7 million child nudity images

Facebook is using a machine learning tool to identify images of child nudity and another system to catch users grooming children.

[Image: The Facebook logo. Facebook was forced to reset 90 million user logins after a security breach was discovered. Source: AAP]

Facebook says company moderators removed 8.7 million user images of child nudity during the last quarter with the help of previously undisclosed software that automatically flags such photos.

The machine learning tool rolled out over the last year identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualised context.

A similar system, also disclosed on Wednesday, catches users engaged in "grooming," or befriending minors for sexual exploitation.

Facebook's global head of safety Antigone Davis told Reuters in an interview that the "machine helps us prioritise" and "more efficiently queue" problematic content for the company's trained team of reviewers.

The company is exploring applying the same technology to its Instagram app.

Under pressure from regulators and lawmakers, Facebook has vowed to speed up removal of extremist and illicit material. Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.

Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook's automated systems wrongly blocking their posts.

Davis said the child safety systems would make mistakes but users could appeal.

"We'd rather err on the side of caution with children," she said.

Facebook's rules have for years banned even family photos of lightly clothed children uploaded with "good intentions," out of concern about how others might abuse such images.

Before the new software, Facebook relied on users or its adult nudity filters to catch child images. A separate system blocks child pornography that has previously been reported to authorities.

Facebook has not previously disclosed data on child nudity removals, though some would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.

