Facebook employs AI technology to remove "revenge porn" content

Source: Xinhua | 2019-03-16 07:51:43 | Editor: Shi Yinglun

SAN FRANCISCO, March 15 (Xinhua) -- Facebook said Friday it is using machine learning and artificial intelligence (AI) technologies to detect and remove non-consensual intimate images, also known as "revenge porn," from its platform.

Antigone Davis, head of global safety at Facebook, said the new technologies can help the company "proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram."

With the AI-based system, such content can be found and sent to human moderators for review "before anyone reports it," Davis said in a blog post.
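Facebook did not describe the system's internals. Purely as an illustration of the general pattern Davis outlines -- a classifier scoring each upload and routing likely matches to a human-review queue before any report is filed -- a minimal sketch might look like the following, where the model, threshold, and queue are all hypothetical stand-ins:

```python
from dataclasses import dataclass
from queue import Queue

# Hypothetical values: Facebook has not published its model or thresholds.
REVIEW_THRESHOLD = 0.8  # assumed confidence cutoff for routing to human review

@dataclass
class Upload:
    upload_id: str
    image_bytes: bytes

review_queue: "Queue[Upload]" = Queue()

def score_nonconsensual_intimate_image(image_bytes: bytes) -> float:
    """Placeholder for a trained image classifier returning a 0..1 score."""
    raise NotImplementedError("stand-in for a real ML model")

def handle_upload(upload: Upload) -> None:
    # Proactive check at upload time, before any user report is filed.
    score = score_nonconsensual_intimate_image(upload.image_bytes)
    if score >= REVIEW_THRESHOLD:
        # Flagged content is held for a trained human reviewer,
        # who makes the final removal decision.
        review_queue.put(upload)
```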

"When someone's intimate images are shared without their permission it can be devastating," he said.

Davis said the new technologies were introduced because victims are often afraid of retribution and therefore reluctant to report the content themselves, or are unaware that the content has been shared.

She said the new detection mechanism complements Facebook's pilot program, run jointly with victim advocate organizations, which gives users an emergency option to securely and proactively submit a photo to the company so that Facebook can create a digital fingerprint of the image and stop it from ever being shared.
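The blog post calls this a "digital fingerprint" without specifying the technique. As a rough sketch of the idea only, such systems typically store a hash of the submitted image and compare every new upload against the stored hashes; the example below uses a plain cryptographic hash for simplicity, whereas a production system would use a perceptual hash that tolerates resizing and re-encoding:

```python
import hashlib

# Fingerprints of images that victims have proactively submitted; only the
# hash is retained in this sketch, not the image itself (an assumption).
blocked_fingerprints: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # Illustrative only: an exact SHA-256 digest. A real matching system
    # would use a perceptual hash so that cropped or re-encoded copies
    # still match the stored fingerprint.
    return hashlib.sha256(image_bytes).hexdigest()

def register_submitted_image(image_bytes: bytes) -> None:
    """Record a fingerprint when a user proactively submits an image for protection."""
    blocked_fingerprints.add(fingerprint(image_bytes))

def is_blocked(uploaded_bytes: bytes) -> bool:
    """Check a new upload against stored fingerprints before it is shared."""
    return fingerprint(uploaded_bytes) in blocked_fingerprints
```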

Facebook plans to expand the pilot program over the next few months so that more people can use the emergency option, Davis said.

To help people who have been targeted by such destructive exploitation, Facebook also launched a victim-support hub called "Not Without My Consent," where victims can find support and assistance, including measures to have the unauthorized content removed and blocked from spreading further.

In addition, Facebook offers an appeal process so that users can report posts that were deleted by mistake.
