News

2020 Election is coming up! Facebook bans deepfakes

2020 has arrived and the United States' presidential election campaign is underway. Ahead of these elections, Facebook recently announced that it will ban manipulated photos and videos, i.e. deepfakes.

The new policy was announced on Monday, 6 January, in a blog post that was highlighted two days later, on 8 January, in The Washington Post. In the post, Facebook said it is changing its policies for manipulated videos identified as deepfakes and will start removing manipulated media that might mislead people.

In the blog post, Monika Bickert sets out the criteria a video must meet to be considered a deepfake:

“It has been edited or synthesized – beyond adjustments for clarity or quality – in ways that aren’t apparent to an average person and would likely mislead someone into thinking that a subject of the video said words that they did not actually say,” and

“It is the product of artificial intelligence or machine learning that merges, replaces or superimposes content onto a video, making it appear to be authentic.”

However, the policy does not cover pictures or videos that are parody, or videos that have merely been edited to omit words or change the order in which they appear.

The policy change was announced ahead of a House Energy and Commerce hearing on “Manipulation and Deception in the Digital Age,” scheduled for 8 January 2020, at which Bickert represented Facebook before lawmakers.

Deepfakes became a serious concern after an altered video of House Speaker Nancy Pelosi went viral last summer. The video was shared on multiple social media platforms. Nevertheless, Facebook’s new deepfake policy still would not cover such videos, because that clip was not generated with AI but was edited using readily available software.

Other platforms are also revisiting their deepfake policies, but none has formally announced new guidelines yet.