Over the past few months, the world’s most popular social media platform has been under fire for publishing fake news articles. The issue was highlighted during the US Presidential Election, when thousands of false articles with strong political bias were released and shared across the social media platform, including the infamous news piece claiming that Pope Francis had publicly endorsed Donald Trump.
The stats are shocking: in North America, 170 million people use Facebook every day. Nearly half of all adults in the United States say they get their news from Facebook.
Recently, the Pew Research Center released a survey which showed that nearly one-fourth of Americans have shared a fake news story.
What are the repercussions of fake news?
Fake news is invariably created by those with an agenda. These articles are often defamatory and sow confusion around public events. Many American adults blame the outcome of the 2016 US Election on the ubiquity of fake news during the campaign, as well as on Facebook’s unwillingness to act against the sharing of fake content.
Facebook steps up
Under fire, Facebook finally stepped up. In December, Zuckerberg released a statement outlining the steps Facebook would take to prevent the spread of fake news on its platform.
“We’re a new kind of platform for public discourse,” he stated, “and that means we have a new kind of responsibility to enable people to have the most meaningful conversations, and to build a space where people can be informed.”
So how does it work?
Facebook is taking a two-pronged approach to inhibit the sharing of fake news across its platform. These measures are currently being tested in the US.
- Financial incentives for spammers have been removed, and Facebook has barred fake news sites from using its ad services.
- Users can now report potentially fake news by clicking in the upper right-hand corner of a post. Articles flagged as fake are then sent to an independent, third-party fact-checking organisation.
If the organisation deems the article to be fake, it is flagged on Facebook as ‘disputed’, with a link to an article explaining why. Disputed content can still be shared, but the ‘suspect content’ warning stays attached to it.
Facebook has also stated that disputed articles will be pushed lower down in the News Feed.
Facebook announced this week that it will begin implementing these measures in Germany. With the German Election coming up, a number of false viral stories about Chancellor Angela Merkel have already spread. The sites responsible for these stories have a history of publishing fake news and conspiracy theories.
Even more concerning is a recent report published by BuzzFeed, which showed that the stories coming out of these websites are some of the top performing content on Facebook in both English and German.
What about South Africa?
In South Africa, Facebook has 14 million active users. Inevitably, fake news has made its way into many a feed. Without Facebook’s measures in place here, it is up to us to take responsibility for the veracity of the content we choose to share.
Here is a list of questions to ask yourself before you re-post an article:
- Do I know the publisher, website or source of this article? Remember that fake publications often use names similar to those of established ones.
- Has this story been reported elsewhere, or by another reliable, trustworthy publication?
- If I read the other articles on this website, do they strike me as fake news or conspiracy theories?
- Do any of these sites appear on Wikipedia’s list of fake news sites?
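For the more technically inclined, the last check on that list can even be automated. Below is a minimal sketch, assuming you maintain your own set of known fake-news domains (the domain names and the `is_known_fake` helper here are hypothetical, purely for illustration; in practice the set could be populated from a source such as Wikipedia’s list of fake news websites):

```python
from urllib.parse import urlparse

# Hypothetical set of known fake-news domains, for illustration only.
# In practice this could be populated from a curated source.
KNOWN_FAKE_DOMAINS = {
    "example-fakenews.com",
    "realtruthnews.example",
}

def is_known_fake(article_url: str) -> bool:
    """Return True if the article's domain appears on the fake-news list."""
    domain = urlparse(article_url).netloc.lower()
    # Strip a leading "www." so "www.example-fakenews.com" still matches.
    if domain.startswith("www."):
        domain = domain[4:]
    return domain in KNOWN_FAKE_DOMAINS

print(is_known_fake("https://www.example-fakenews.com/pope-endorses-trump"))  # True
print(is_known_fake("https://www.bbc.com/news/some-story"))                   # False
```

A check like this only catches sites already on the list, of course; the other questions above still matter for anything it misses.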