AI Case Study
Facebook identifies and removes 8.7m child abuse images from its platform in three months with machine learning
Facebook has revealed that it uses machine learning to remove extremist and illicit content. The programs sift through billions of pieces of content to categorise and flag anything inappropriate, helping moderators remove 8.7m child abuse images from the platform between August and October 2018.
Industry
Technology
Internet Services Consumer
Project Overview
"Facebook has said its moderators have removed 8.7m child abuse images in the past three months, as the company battles pressure from regulators and lawmakers worldwide to speed up removal of illicit material.
It said on Wednesday that previously undisclosed software automatically flags images that contain both nudity and a child, helping its reviewers. It also revealed a similar machine learning tool that it said catches users engaged in “grooming” of minors for sexual exploitation.
Facebook has vowed to speed up removal of extremist and illicit material, and machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.
Facebook’s global head of safety Antigone Davis told Reuters in an interview the “machine helps us prioritise” and “more efficiently queue” problematic content for its reviewers.
The company is exploring applying the same technology to its Instagram app."
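The workflow described in the quote above pairs automated flagging with human review: a post is flagged when signals for both child presence and nudity fire, and flagged items are queued so reviewers see the riskiest content first. The sketch below illustrates that general flow only; the classifier scores, thresholds, risk formula, and queue structure are assumptions for illustration and are not Facebook's actual implementation.

```python
# Illustrative sketch: combine two hypothetical classifier scores and queue
# flagged images for human review, highest risk first. All names, thresholds,
# and scores here are assumptions, not Facebook's real system.
import heapq
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ImagePost:
    post_id: str
    child_score: float   # hypothetical probability the image contains a child
    nudity_score: float  # hypothetical probability the image contains nudity


@dataclass
class ReviewQueue:
    """Priority queue of flagged posts, ordered by combined risk score."""
    _heap: List[Tuple[float, str, ImagePost]] = field(default_factory=list)

    def push(self, post: ImagePost, risk: float) -> None:
        # heapq is a min-heap, so negate the risk to pop the highest risk first.
        heapq.heappush(self._heap, (-risk, post.post_id, post))

    def pop_next(self) -> ImagePost:
        return heapq.heappop(self._heap)[2]


CHILD_THRESHOLD = 0.8   # assumed operating points; real thresholds are not public
NUDITY_THRESHOLD = 0.8


def flag_for_review(post: ImagePost, queue: ReviewQueue) -> bool:
    """Flag an image only when both signals fire, then queue it by risk."""
    if post.child_score >= CHILD_THRESHOLD and post.nudity_score >= NUDITY_THRESHOLD:
        queue.push(post, risk=post.child_score * post.nudity_score)
        return True
    return False


if __name__ == "__main__":
    queue = ReviewQueue()
    posts = [
        ImagePost("a1", child_score=0.95, nudity_score=0.91),
        ImagePost("a2", child_score=0.40, nudity_score=0.97),  # nudity only: not flagged here
        ImagePost("a3", child_score=0.85, nudity_score=0.88),
    ]
    for p in posts:
        flag_for_review(p, queue)
    # Human reviewers would then work through the queue highest-risk first.
    print(queue.pop_next().post_id)  # -> "a1"
```

The design point this illustrates is the one Antigone Davis describes: the machine does not make the final call, it prioritises and queues content so human reviewers spend their time on the most likely violations.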
Reported Results
"Facebook has said its moderators have removed 8.7m child abuse images in the past three months" using machine learning software
Technology
Machine Learning
Function
Legal And Compliance
Legal
Background
"Before the new software, Facebook relied on users or its adult nudity filters to catch such images. A separate system blocks child abuse that has previously been reported to authorities.
Facebook has not previously disclosed data on child nudity removals, though some would have been counted among the 21m posts and comments it removed in the first quarter for sexual activity and adult nudity."
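The "separate system" mentioned above blocks material that has already been reported to authorities. Such systems typically work by matching an uploaded image's fingerprint against a blocklist of known fingerprints (production systems generally use perceptual hashes that survive re-encoding and resizing). The sketch below is a simplified illustration of that idea using a plain cryptographic hash, which only catches exact byte-for-byte copies; it is an assumption-laden example, not Facebook's actual system.

```python
# Hedged sketch of blocking previously reported images via fingerprint matching.
# A SHA-256 digest is used purely for illustration; it only matches exact copies,
# unlike the perceptual hashing real systems rely on.
import hashlib
from typing import Set


def image_fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image (exact-match only here)."""
    return hashlib.sha256(image_bytes).hexdigest()


class KnownImageBlocklist:
    """Blocklist of fingerprints for images previously reported to authorities."""

    def __init__(self) -> None:
        self._known: Set[str] = set()

    def add_reported_image(self, image_bytes: bytes) -> None:
        self._known.add(image_fingerprint(image_bytes))

    def should_block(self, image_bytes: bytes) -> bool:
        # Block the upload outright if it matches known reported material.
        return image_fingerprint(image_bytes) in self._known


if __name__ == "__main__":
    blocklist = KnownImageBlocklist()
    reported = b"previously-reported-image-bytes"
    blocklist.add_reported_image(reported)
    print(blocklist.should_block(reported))       # True: exact copy is blocked
    print(blocklist.should_block(b"new upload"))  # False: falls through to ML flagging
```

Uploads that are not on the blocklist would fall through to the newer machine learning flagging described in the Project Overview.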
Benefits
Data
Facebook posts and images