
24th Oct 2018

Facebook removed 8.7m child nudity images from the site in three months

'We’d rather err on the side of caution with children.'

Anna O'Rourke


Facebook has unveiled a new strategy aimed at tackling the grooming of children on its site.

The social network removed over 8.7 million images featuring child nudity over a three-month period, it said.

Its new software, which has been in use since last year, automatically flags potentially sexualised images of children.

Of the 8.7 million images removed, 99 per cent were spotted first by the technology rather than being flagged by a user.

The tool sets up a queuing system for moderators, who decide whether an image should be removed.


This helps to take the responsibility for reporting such images off users.

The changes could mean that non-explicit images are also removed from the site, according to Facebook’s global head of safety Antigone Davis, who said that users can appeal when they feel something has been unfairly deleted.

“We’d rather err on the side of caution with children,” she told Reuters.

Facebook is now looking at rolling out the same nudity-detection software on Instagram, which it also owns.

It had been under international pressure to ramp up its child protection measures.

An investigation by the BBC in 2016 found that paedophiles were able to share images of nude children with one another in secret groups on Facebook.