Facebook announces ban on Deepfake video ahead of presidential election

As the 2020 U.S. presidential election approaches, Facebook announced Monday that it has banned manipulated videos and photos, often referred to as deepfakes, from its platform, media outlet The Verge reported. Late Monday, Facebook posted the policy change on its blog, confirming an earlier report by The Washington Post. In the post, Facebook said it would begin removing content that has been edited "in ways that aren't apparent to an average person and would likely mislead someone," or that was generated by artificial intelligence or machine learning algorithms.

But the company says the policy does not cover parody or satirical content, or videos that have been edited solely to omit words or change the order in which they appear.

The policy change comes ahead of a House Energy and Commerce Committee hearing on manipulated media content scheduled for Wednesday. Monika Bickert, Facebook’s vice president of global policy management, will testify before lawmakers at this week’s hearing.

Last summer, a doctored video of House Speaker Nancy Pelosi spread widely on social media platforms, including Facebook. At the time, Facebook said the video did not violate any of the company’s policies, and Monday’s ban does not appear to cover it either: the video was not made with AI, but was likely altered using readily available editing software, which places it outside the new policy’s scope.

Other platforms, including Twitter, also came under fire after the Pelosi video circulated. In November, Twitter began developing its own deepfake policy and asked users for feedback on the platform’s future rules, but the company has yet to issue any new guidelines on manipulated media content.