Facebook has developed a new tool that uses artificial intelligence and machine learning to detect intimate images shared without consent on the social media site, before they are even reported. In a statement released this week, Facebook’s global head of safety said the tool will help victims of revenge porn by identifying such images before they are widely circulated or go viral.
“Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram,” said Antigone Davis, Facebook’s global head of safety. “This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution, so they are reluctant to report the content themselves or are unaware the content has been shared.”
Content flagged by the tool will be reviewed by a specially trained Community Operations team, which will remove the image and disable the account that posted it. There is also an appeals process for Facebook users who believe they have been falsely accused.
Facebook has also begun working with various victim advocacy programs to launch its non-consensual image pilot initiative, which gives users an emergency option to “securely and proactively submit a photo to Facebook.” Facebook’s team will then create a digital fingerprint of the image and use it to stop the photo from being shared or spread on the site. Davis also announced a new “safe zone” on the platform for victims of revenge porn.
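To give a rough sense of how fingerprint-based matching can work, the sketch below implements a simple perceptual “average hash” in pure Python. This is an illustrative assumption, not Facebook’s actual method: the function names, the grid size, and the match threshold are all invented for the example. The idea is that a compact bit-string fingerprint, rather than the image itself, can be stored and compared against new uploads, and that a slightly altered copy (say, brightened) still produces a nearly identical fingerprint.

```python
def average_hash(pixels):
    """Compute a bit-string fingerprint from a small grayscale grid.

    `pixels` is a list of rows of brightness values (0-255), e.g. an
    image already downscaled to a tiny fixed size such as 8x8.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether that pixel is brighter than the average.
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count the bits that differ between two fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=5):
    """Treat two images as the same if their hashes are nearly identical.

    The threshold is an arbitrary choice for this sketch.
    """
    return hamming_distance(h1, h2) <= threshold

# Toy demo: a 4x4 "image" and a uniformly brightened copy still match,
# because the bright/dark pattern relative to the mean is unchanged.
original = [[10, 200, 10, 200],
            [200, 10, 200, 10],
            [10, 200, 10, 200],
            [200, 10, 200, 10]]
brightened = [[15, 205, 15, 205],
              [205, 15, 205, 15],
              [15, 205, 15, 205],
              [205, 15, 205, 15]]

h1 = average_hash(original)
h2 = average_hash(brightened)
print(is_match(h1, h2))  # the brightened copy matches the original
```

Real systems use far more robust perceptual hashes (resistant to cropping, re-encoding, and scaling), but the comparison step is the same in spirit: match new uploads against stored fingerprints rather than stored images.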
“We also want to do more to help people who have been the targets of this cruel and destructive exploitation. To do this, we’re launching “Not Without My Consent,” a victim-support hub in our Safety Center that we developed together with experts,” said Davis. “Here victims can find organizations and resources to support them, including steps they can take to remove the content from our platform and prevent it from being shared further — and they can access our pilot program.”