Take A Look At How Facebook Fights Revenge Porn Using AI and Human Moderators
Wed, April 14, 2021

Revenge porn is a form of invasion of sexual privacy and online harassment, according to Olivia Solon of American broadcast television network NBC / Photo by: Sam Wordley via Shutterstock


Revenge porn is a form of invasion of sexual privacy and online harassment, according to Olivia Solon of American broadcast television network NBC. It is usually perpetrated by a victim’s former lover: the ex-partner posts, or threatens to post, intimate photos of the victim without their consent in order to shame them. Revenge porn is far more common among everyday people, particularly young people, said Fight the New Drug, a Utah-based anti-pornography non-profit organization.

One in 25 internet users in America has been the victim of posts, or threats of posting, non-consensual pornography. Those affected are mainly individuals aged 15 to 29, with men and women equally likely to have their pictures uploaded without their consent. People of color are more likely to be victims of revenge porn, and LGBTQ+ people are more at risk of experiencing it than heterosexual individuals.

Given these facts, Facebook is now trying to stop revenge porn before it spreads all over the internet. How? By using AI and humans.

Michaela Zehara’s Case 

Three years ago, the 22-year-old fitness trainer and aspiring actor was checking her Instagram when she saw an account that had her name, photo, and contact number. That account started following Zehara. She had a gut feeling that something bad was about to happen, and she was right: Zehara’s friends and family messaged her saying that the account was posting photos of her nude body, which she had shared with her boyfriend at the time.

Zehara and dozens of her friends reported the pictures, prompting Instagram to take them down in about 20 minutes. She does not know who took the screenshots, but added that her boyfriend still has the pictures.

How Facebook Attempts to Combat Revenge Porn 

Facebook launched an AI tool to detect revenge porn before it spreads, sparing victims the time and ordeal of getting the photos taken down, reported Sara Salinas of the business and financial news platform CNBC. The initiative is Facebook’s latest move to remove abusive content on its platform. Facebook had been criticized over the working conditions of the contracted content reviewers who moderate posts on the social network, so an AI detection tool “could be a step in the right direction.” “Finding these images goes beyond detecting nudity on our platforms,” Facebook’s Global Head of Safety Antigone Davis said.

By leveraging machine learning and AI, Facebook can now detect “near-nude images or videos” that are shared without the owner’s consent on Facebook and Instagram, Davis added. The AI tool is trained to recognize a “nearly nude” photo (a lingerie shot, for instance) that is accompanied by “derogatory or shaming text.”

Such a pairing suggests that the uploaded photo is being used to shame or seek revenge on someone. Facebook’s system scans each post for signs of revenge porn, including subtle ones. For example, a post that pairs a laughing-face emoji with text such as “look at this” could signal revenge porn, Facebook explained.
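To make the idea concrete, here is a minimal sketch of that kind of signal combination, assuming a nudity score from an image classifier plus a couple of illustrative text signals. The phrases, threshold, and function names are hypothetical stand-ins for explanation, not Facebook’s actual system:

```python
# Illustrative sketch only: the phrases, emoji set, threshold, and function
# names below are assumptions for explanation, not Facebook's implementation.

SHAMING_PHRASES = {"look at this", "can you believe"}   # hypothetical examples
MOCKING_EMOJI = {"\U0001F602", "\U0001F923"}             # laughing-face emoji

def flag_for_review(image_nudity_score: float, caption: str) -> bool:
    """Combine an image signal with text signals to decide whether a post
    should be escalated to a human reviewer."""
    text = caption.lower()
    has_shaming_text = any(phrase in text for phrase in SHAMING_PHRASES)
    has_mocking_emoji = any(e in caption for e in MOCKING_EMOJI)

    # A near-nude image alone is not enough; the vengeful context matters.
    return image_nudity_score > 0.7 and (has_shaming_text or has_mocking_emoji)

# Example: a near-nude photo captioned with a laughing emoji and "look at this"
print(flag_for_review(0.85, "look at this \U0001F602"))  # True -> human review
```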

Once a post has been flagged, it is sent to a human reviewer for confirmation. The content moderator, who is trained to handle safety issues, “briefly” looks at the photo to confirm that it is an intimate image. Once the nature of the content has been confirmed, the photo is converted into a digital fingerprint “to prevent any subsequent posting of the image on Facebook, Instagram, and Messenger.” This process is known as “hashing.”
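As a rough illustration of hashing, the sketch below keeps only a fingerprint of a confirmed image and blocks later uploads that match it. The store, function names, and use of SHA-256 are assumptions for clarity; platforms like Facebook rely on perceptual hashes so that resized or re-encoded copies still match, whereas a plain cryptographic hash only catches byte-identical files:

```python
# Minimal sketch of hash-based blocking, assuming a simple in-memory store.
import hashlib

blocked_fingerprints = set()   # hashes of confirmed intimate images

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint; once the hash is kept,
    the image itself does not need to be stored."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_blocked_image(image_bytes: bytes) -> None:
    """Called after a human reviewer confirms the image is non-consensual."""
    blocked_fingerprints.add(fingerprint(image_bytes))

def is_blocked(upload_bytes: bytes) -> bool:
    """Reject any later upload whose fingerprint matches a blocked image."""
    return fingerprint(upload_bytes) in blocked_fingerprints

# Example: once registered, an identical re-upload is caught immediately.
register_blocked_image(b"raw image bytes")
print(is_blocked(b"raw image bytes"))   # True
```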

Facebook then deletes the image after seven days, which means the platform “does not maintain a database of intimate photos” that might be susceptible to abuse or hacking. Mike Masland, Facebook’s product manager for fighting revenge porn, stated that the goal is to “find the vengeful context, and the nude or near-nude, and however many signals we need to look at, we’ll use that.”

AI systems require large volumes of data to learn to identify images. To obtain training samples for the tool, Facebook used nude and near-nude images that had already been flagged by human content moderators. The AI may struggle to keep up as more and more photos are reported on Facebook, but the tool should improve as it is fed more examples. Masland reassured, “It will evolve.”
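A toy sketch of that feedback loop might look like the following, where moderator-labeled examples are used to fit a simple classifier. The placeholder features, labels, and choice of logistic regression are stand-ins for illustration, not Facebook’s production models, which would train deep networks on the images themselves:

```python
# Toy sketch: fitting a classifier on moderator-labeled examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder data: pretend each flagged post has been reduced to a
# 16-dimensional feature vector (e.g. an image embedding plus text signals);
# labels come from human moderators (1 = confirmed revenge porn, 0 = benign).
rng = np.random.default_rng(0)
features = rng.random((200, 16))
labels = rng.integers(0, 2, size=200)

model = LogisticRegression(max_iter=1000).fit(features, labels)

# As moderators flag more posts, retraining on the larger labeled set is
# how a tool like this "evolves" and improves over time.
new_post = rng.random((1, 16))
print(model.predict_proba(new_post)[0, 1])   # probability the post is abusive
```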

Facebook launched an AI tool to detect revenge porn before it spreads, sparing victims the time and ordeal of getting the photos taken down / Photo by: PK Studio via Shutterstock


Praise for Facebook’s Initiative

The initiative drew criticism at first, but Facebook has also been praised for its efforts in combating revenge porn. The UK non-profit organization Revenge Porn Helpline referred over 400 people to Facebook’s takedown tool over the course of a year. Revenge Porn Helpline’s manager Sophie Mortimer said, “The relief that victims feel when they know their images can’t be shared in this way is immense.”

Mortimer noted that Facebook’s approach stands out because other platforms rely on reactive measures. Perpetrators may simply move to other platforms, but Davis said it is important for the industry to share intelligence to stop someone from moving from one platform to another.

Social media platforms now have to manage the risks that come with interconnectivity. Facebook’s initiative demonstrates how AI and human moderators can work hand in hand to get rid of revenge porn. While it is a strong example of using AI to remove abusive content, there is still a long way to go before other platforms adopt more proactive approaches.