19th-century American writer and philosopher Mark Twain reputedly once said, “A lie can travel around the world and back again while the truth is lacing up its boots.” Regardless of whether that attribution is accurate, it is true that “fake news” is not a new phenomenon; the history of spreading lies and disinformation can be traced back to the invention of the printing press in the 15th century.

However, with the rise of near-instantaneous electronic communication, “fake news” presents us with dangers that Johannes Gutenberg could never have imagined. Last week, researchers at the Massachusetts Institute of Technology published a study in the journal Science that gives us a clearer picture of today’s “fake news,” where it is coming from, how it spreads, who is spreading it – and why.

The researchers studied approximately 126,000 stories that were passed along on the social media platform Twitter between 2006 (when Twitter started operations) and 2017. Those stories were “tweeted” over 4.5 million times by 3 million individuals. As it turns out, Twitter is “Ground Zero” for the spread of what the researchers term “false stories.”

Over the course of Twitter’s first decade of existence, the number of fake news tweets far outnumbered those containing truth. Furthermore, those stories travel at a much faster rate, reaching far deeper into social media networks than legitimate news accounts. Data scientist Soroush Vosoughi says that based on the study, “It seems to be pretty clear that false information outperforms true information.”

Indeed, the research team discovered that accurate stories rarely reached more than 1,000 people – but the top 1 percent of fake news stories went out to as many as 100,000. Surprisingly, the culprits are not the “influencers,” “thought leaders” and others with substantial followings. According to the study, the ones who are busiest spreading lies and rumors have “significantly fewer followers…are less active on Twitter…and have been on Twitter for significantly less time.”

That covers the what, the who and the how of it. But what about the “why”? Vosoughi says, “It might have something to do with human nature.” Specifically, the researchers’ algorithms showed that human attraction to novelty and to things that elicit an emotional response plays a large role. The fake news stories that spread most quickly are those that draw the most attention – particularly when they evoke fear, disgust, and surprise.

One thing that has not changed over the centuries: the spread of fake news and disinformation can have grave, even fatal consequences. In 1475, a young child in the Italian city-state of Trent disappeared. In the wake of the child’s disappearance, a Franciscan priest named Bernardino da Feltre preached a number of sermons claiming that local Jews had kidnapped and murdered the boy, drinking his blood in a celebration of Passover. The story spread, despite efforts by the Pope to intervene and put a stop to the rumors – and the Jewish community paid a heavy price. Tragically, anti-Semitic websites continue to present that story as fact to this day.

A few centuries later, Ben Franklin spread false stories of Indians murdering settlers at the behest of King George III in order to rouse support for the revolution. Over the course of the 1800s, “yellow journalism” led to attacks on blacks and immigrants – and even pushed the U.S. into war.

Eventually, there was a backlash, at least for a while. During the 20th century, there was a move toward objective journalism, grounded in verifiable facts and reputable sources. However, with the Internet and the rise of social media, yellow journalism has been making a major comeback – with serious implications for democracy itself. The researchers pose the question, “How can we create a news ecosystem … that values and promotes truth?”

There is no simple answer, but the first step lies in recognizing the nature of the problem and analyzing it. Last week, Twitter CEO Jack Dorsey issued his own tweet in which he said, “We’re committing Twitter to help increase the collective health, openness, and civility of public conversation, and to hold ourselves publicly accountable towards progress.”

Recently, the company announced that it would be revisiting a program in which its users can apply for a “verification” credential. In November 2016, Facebook founder and CEO Mark Zuckerberg issued a statement about the issue; the social media giant has been offering users different methods for verifying the stories they see. While well-intentioned, these methods – relying on the efforts of users – are unlikely to address the problem.

If there is a silver lining here, it is the researchers’ finding that Twitter users who share accurate stories are far more engaged and have larger followings than those who spread fake news. Unfortunately, they’re up against human nature, which craves sensationalism and gossip over facts, as well as a marketplace that can sell more ads with falsehood than with truth. Currently, there are no reliable solutions to the problem. Meanwhile, the dangers to society and to a system shaped largely by public perceptions remain – and the situation is not getting any better.

K.J. McElrath is a former history and social studies teacher who has long maintained a keen interest in legal and social issues. In addition to writing for The Ring of Fire, he is the author of two published novels: Tamanous Cooley, a darkly comic environmental twist on Dante's Inferno, and The Missionary's Wife, a story of the conflict between human nature and fundamentalist religious dogma. When not engaged in journalistic or literary pursuits, K.J. works as an entertainer and film composer.