Facebook Bans ‘Deepfake’ Videos Ahead of 2020 Election


Facebook is making a move to ban deepfake videos as the 2020 election season heats up. The social media company announced its new policy against manipulated media on January 6. Twitter had previously announced its move to alert users of manipulated media, but Facebook appears poised to go a step further.

“While these videos are still rare on the internet, they present a significant challenge for our industry and society as their use increases,” Monika Bickert, Facebook’s vice president of global policy management, wrote in an announcement.

Facebook will remove misleading manipulated media if it meets the following criteria:

- It has been edited or synthesized, beyond adjustments for clarity or quality, in ways that aren’t apparent to an average person and would likely mislead someone into thinking a subject of the video said words they did not actually say.
- It is the product of artificial intelligence or machine learning that merges, replaces, or superimposes content onto a video, making it appear authentic.

Facebook has previously made efforts to cut back on the “fake news” that littered the platform during the 2016 election cycle. In an effort to avoid repeating history, the company has made multiple changes.

After a highly publicized misinformation campaign on social media during the 2016 presidential campaign, Facebook announced it would introduce a fact-checking feature in an effort to bring more accurate information to its audience.

Facebook’s fact-checking program has become a central piece of the company’s response to misinformation since its unveiling in late 2016. Fact-checking groups choose what content to review, and material deemed false or partially false carries a warning and is distributed by Facebook’s algorithms to fewer people.

We’ll have to see how Facebook continues to combat misinformation moving forward, but its fact-checking program and its decision to ban deepfake videos appear to be steps in the right direction.
