“Fake news” is nothing new – gossip, rumor-mongering and the spreading of outright lies have been around almost as long as humans have been able to communicate. However, the rise of the World Wide Web and easy access to digital communication that can reach millions of people around the planet within minutes has taken the practice to a new, dangerous level.
Most of us agree that something needs to be done – but what? Should someone be monitoring, filtering and controlling the spread of information? More importantly, can such people and institutions be trusted?
Recently, social media juggernaut Facebook and tech company Google announced they would be taking a stand against “fake news.” In December, the former began encouraging its users to flag stories that may constitute fake news. Facebook also enlisted the services of fact-checking organizations, including Snopes.com, FactCheck.org, PolitiFact, ABC News and the Associated Press, to review those stories.
In the European Union, some governments are starting to take matters into their own hands. Facebook’s actions came in the wake of legislation introduced in the German parliament that would impose a fine of €500,000 (approximately US$535,000) on the company for failure to remove what the government deems to be fake news posts within 24 hours. The German government is concerned that fake news stories could influence that country’s elections as they did in the US – an influence that, some argue, resulted in the disaster we are now facing in this country.
Facebook will be partnering with a Germany-based non-profit fact-checking organization, Correctiv.org. Correctiv, which follows the principles of the International Fact-Checking Network, will review stories flagged by Facebook users. Should those stories prove to be false, they will no longer be prioritized in users’ news feeds.
On the surface, this seems to be a good thing. However, the fact that Facebook’s actions were apparently motivated by threats from the German government is disturbing. As Germany’s own history demonstrates, governmental control of the information flow (as the Trump Administration is attempting to institute) is a slippery slope.
When a government – particularly one as authoritarian and openly corrupt as Trump’s – decides to be the arbiter of what citizens should or should not hear and read, we may as well get rid of all media outlets and establish Orwell’s Ministry of Truth. This is, in essence, what the proposed German law would do. Anyone, or any media company, publishing a story that doesn’t sit well with Chancellor Merkel or her Administration could be fined, and this would have a chilling effect on free speech.
Nonetheless, the spread of disinformation, particularly among a low-information, uneducated population, is equally dangerous, as the recent election in the US has clearly demonstrated.
So, what is the solution?
The hard reality is that the principle of free speech is a two-edged sword. On one hand, as Thomas Jefferson pointed out, free speech and a free press are vitally necessary for a healthy democracy. On the other hand, a complete lack of constraint on speech means self-serving, corrupt individuals and organizations are free to spread lies, propaganda and half-truths to weak-minded populations who are easily led.
There is no simple solution to this problem. Facebook’s actions – teaming up with reputable fact-checking organizations that adhere to strict principles – are a good step forward. However, as the old adage warns, “the price of freedom is eternal vigilance.” But what happens when so many people are too ignorant or lazy to stay vigilant?
Researchers from Cambridge University have proposed an interesting solution. Looking at the spread of fake news as a “disease,” they believe that what is called for is a “vaccine.” As you may know, a vaccine exposes the body to a weakened form of a pathogen, activating its defenses in order to build resistance to the real disease. Under laboratory conditions, these researchers found that when a small amount of disinformation – presented in the form of a warning – was included alongside established, well-known facts, people were less inclined to believe the falsehoods.
Dr. Sander van der Linden, lead author of the study, said, “The idea is to provide a cognitive repertoire that helps build up resistance to misinformation, so the next time people come across it they are less susceptible.”
So far, that works well – in theory. How it will work under real world conditions remains to be seen.