Facebook has removed 8.7 million images of child nudity in the last quarter, using software that automatically flags such photos.

The machine learning tool was rolled out over the past year but had not been publicly disclosed until now. It is designed to identify images containing both nudity and a child, and to remove photos that show minors in a sexualised context.

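The report describes a two-part test: an image is flagged only when the software detects both nudity and the presence of a child. The sketch below illustrates that combination rule in Python; the class, thresholds, and stand-in scores are all hypothetical illustrations, since Facebook has not published details of its models or pipeline.

```python
# Hypothetical sketch of the gating logic described above: an image is
# flagged only when BOTH signals (nudity and the presence of a child)
# cross their confidence thresholds. Names and thresholds are invented
# for illustration; they are not Facebook's actual system.

from dataclasses import dataclass


@dataclass
class ModerationScores:
    nudity: float         # classifier confidence the image contains nudity (0.0-1.0)
    minor_present: float  # classifier confidence the image contains a child (0.0-1.0)


# Illustrative thresholds; a production system would tune these against
# labelled data and route borderline cases to human review.
NUDITY_THRESHOLD = 0.9
MINOR_THRESHOLD = 0.8


def should_flag(scores: ModerationScores) -> bool:
    """Flag an image only when both signals fire: nudity AND a child present."""
    return (scores.nudity >= NUDITY_THRESHOLD
            and scores.minor_present >= MINOR_THRESHOLD)


if __name__ == "__main__":
    # Adult nudity alone, or a child alone, does not trigger a flag;
    # only the combination does.
    print(should_flag(ModerationScores(nudity=0.95, minor_present=0.10)))  # False
    print(should_flag(ModerationScores(nudity=0.20, minor_present=0.95)))  # False
    print(should_flag(ModerationScores(nudity=0.95, minor_present=0.90)))  # True
```
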
A similar system, also disclosed on Wednesday, catches users engaged in ‘grooming’, the practice of befriending minors for sexual exploitation.

Facebook is currently exploring ways to apply the same technology to Instagram.