Fake images and videos are the next targets of Facebook’s ongoing effort to fight misinformation. On Thursday, Facebook announced that it will start fact-checking images and videos, expanding its review efforts to posts that are traditionally harder to monitor.
We know that people want to see accurate information on Facebook, so for the last two years, we’ve made fighting misinformation a priority. – Facebook
The company announced that it is giving all 27 of its global fact-checking partners the ability to debunk photos and videos on the platform. Once a piece of content is rated as false, its future reach in the News Feed will be reduced by up to 80 per cent and a fact check will be appended in the Related Articles section.
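The two consequences described above, demoted reach and an appended fact check, can be pictured with a minimal sketch. Everything here (the function names, the flat 80 per cent demotion, the dictionary shape) is an assumption for illustration, not Facebook's actual ranking code.

```python
# Hypothetical sketch of how a "false" rating might affect a post.
# The flat 0.2 multiplier models the "up to 80 per cent" reach reduction
# Facebook describes; the real system is far more nuanced.

DEMOTION_FACTOR = 0.2  # retain 20% of reach once rated false (assumption)

def ranked_score(base_score: float, rated_false: bool) -> float:
    """Return a feed-ranking score after any misinformation demotion."""
    return base_score * DEMOTION_FACTOR if rated_false else base_score

def feed_entry(post_id: str, base_score: float, rated_false: bool) -> dict:
    """Build a feed entry, attaching a fact check when the post is rated false."""
    entry = {"post": post_id, "score": ranked_score(base_score, rated_false)}
    if rated_false:
        # Modeled after the Related Articles unit mentioned above.
        entry["related_articles"] = ["fact-checker's debunking article"]
    return entry
```

A post rated false would thus surface with a fraction of its original score and carry the fact check alongside it, rather than being removed outright.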
Fact-checkers are able to assess the truth or falsity of a photo or video by combining these skills with other journalistic practices, like using research from experts, academics or government agencies. – Facebook Newsroom
Facebook will now filter suspect photos and videos into a dashboard, from which fact-checkers select which pieces of content to debunk. Previously, most fact-checking partners could only debunk links to false news stories; Agence France-Presse was the first organization to start debunking images for Facebook last spring, when the company began testing the capability.
Facebook has been ramping up fact-checking efforts and third-party human reviewers in recent months in an effort to protect future elections from foreign interference. According to the blog post, the company is also leveraging technology that can extract text from photos and compare them against headlines in previously debunked stories.
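The text-extraction approach mentioned in the blog post can be sketched in a few lines. The OCR step itself is assumed here (an OCR library would produce `extracted_text`); the comparison against previously debunked headlines is illustrated with Python's standard-library `difflib`, which is merely one simple similarity measure, not the technique Facebook actually uses. The sample headlines are invented.

```python
# Hedged sketch: flag image text that closely matches a known-false headline.
# The OCR stage is assumed; extracted_text stands in for its output.
from difflib import SequenceMatcher

# Invented examples standing in for a corpus of debunked headlines.
DEBUNKED_HEADLINES = [
    "NASA confirms the earth will go dark for six days",
    "Scientists prove chocolate cures the common cold",
]

def matches_debunked(extracted_text: str, threshold: float = 0.8) -> bool:
    """Return True if OCR'd text closely matches a debunked headline."""
    normalized = extracted_text.lower().strip()
    for headline in DEBUNKED_HEADLINES:
        ratio = SequenceMatcher(None, normalized, headline.lower()).ratio()
        if ratio >= threshold:
            return True
    return False
```

In practice a system like this would tolerate OCR noise and paraphrase, which is why the threshold is fuzzy rather than requiring an exact string match.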
And this is important because misinformation often travels across media. After several months of research and testing, Facebook came to know that misinformation in photos and videos usually falls into three categories: (1) Manipulated or Fabricated, (2) Out of Context, and (3) Text or Audio Claim.
Edited photos and strong visuals were common among the posts by Russian agents attempting to interfere with the 2016 U.S. presidential election. The problem is that, because such images can be shared so easily and don’t require people to leave Facebook to view them, they can achieve viral distribution through unsuspecting users who don’t realize they are becoming unwitting participants in a disinformation campaign.