Facebook removes 8.7 million sexualised photos of children in last quarter
Facebook Inc said on Wednesday that its moderators removed 8.7 million user images of child nudity during the last quarter.
The removals were made using previously undisclosed software that automatically flags such photos. The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualised context.
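The article does not describe the tool’s internals. As an illustration only, the sketch below shows how a flagging rule that requires both reported signals, nudity and the presence of a child, might gate an image before it is queued for human review. All names, classes, and thresholds are hypothetical assumptions, not Facebook’s actual system.

```python
# Hypothetical sketch: flag an image for human review only when classifier
# scores for nudity AND the presence of a child both exceed a threshold.
from dataclasses import dataclass


@dataclass
class ImageScores:
    nudity: float         # assumed probability the image contains nudity (0-1)
    minor_present: float  # assumed probability the image contains a child (0-1)


NUDITY_THRESHOLD = 0.8   # illustrative values, not real policy thresholds
MINOR_THRESHOLD = 0.8


def should_flag_for_review(scores: ImageScores) -> bool:
    """Flag only when both signals are present, per the description above."""
    return scores.nudity >= NUDITY_THRESHOLD and scores.minor_present >= MINOR_THRESHOLD


# Nudity alone is not enough; both signals must fire before reviewers see it.
print(should_flag_for_review(ImageScores(nudity=0.95, minor_present=0.4)))  # False
print(should_flag_for_review(ImageScores(nudity=0.95, minor_present=0.9)))  # True
```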
The company also disclosed a similar system that catches users engaged in “grooming”, or befriending minors for sexual exploitation.
Facebook’s global head of safety, Antigone Davis, said the systems help flag problematic content and queue it for the company’s trained team of reviewers.
The company is exploring the feasibility of applying the same technology to its Instagram app.
Davis said the child safety systems would make mistakes but users could appeal. “We’d rather err on the side of caution with children,” she said.
The child grooming system evaluates factors such as how many people have blocked a particular user and whether that user quickly attempts to contact many children, Davis said.
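Davis names only two signals, and the article gives no weights or rules. Purely as a sketch of that kind of heuristic scoring, the example below combines the two reported factors into a review-priority score; the field names, caps, weights, and threshold are invented for illustration.

```python
# Hypothetical heuristic combining the two signals reported in the article:
# how many people have blocked an account, and how many minors it has
# quickly tried to contact. Weights and thresholds are illustrative only.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    block_count: int                # users who have blocked this account
    minors_contacted_recently: int  # distinct minors contacted in a short window


def grooming_risk_score(signals: AccountSignals) -> float:
    """Combine the two reported signals into a single priority score."""
    score = 0.0
    score += min(signals.block_count, 50) * 0.02               # capped contribution
    score += min(signals.minors_contacted_recently, 20) * 0.04
    return score


def needs_human_review(signals: AccountSignals, threshold: float = 0.5) -> bool:
    return grooming_risk_score(signals) >= threshold


# An account blocked by many users that rapidly contacts many minors is queued.
print(needs_human_review(AccountSignals(block_count=30, minors_contacted_recently=15)))  # True
```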
Michelle DeLaune, chief operating officer at the National Center for Missing and Exploited Children (NCMEC), said the organisation expects to receive about 16 million child porn tips worldwide this year from Facebook and other tech companies, up from 10 million last year. With the increase, NCMEC said it is working with Facebook to develop software to decide which tips to assess first.
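The triage software is described only as deciding which tips to assess first. One simple way to model that, assuming each tip carries some urgency score, is a priority queue; everything below, including the scoring scale, is an assumption for illustration and not NCMEC’s or Facebook’s design.

```python
# Minimal sketch of tip triage as a priority queue: tips with the most urgent
# (lowest) priority value are reviewed first. Scores here are invented.
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class Tip:
    priority: float               # lower value = reviewed sooner (heapq is a min-heap)
    tip_id: str = field(compare=False)


def triage(tips):
    """Yield tips in review order, most urgent first."""
    heap = list(tips)
    heapq.heapify(heap)
    while heap:
        yield heapq.heappop(heap)


incoming = [Tip(priority=0.2, tip_id="A"), Tip(priority=0.9, tip_id="B"), Tip(priority=0.1, tip_id="C")]
for tip in triage(incoming):
    print(tip.tip_id)  # C, A, B
```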