A new study shows that fake news spreads faster than facts on Twitter. The effect was especially pronounced for political content compared with other areas. This result was driven mainly by humans themselves, not by bots: bots amplified both true and false content to a similar extent. The study also showed that fake news is increasing over time.
The research also found that emotional manipulation plays a crucial role. False content tends to provoke emotions such as fear, disgust and surprise. True content, in contrast, is associated with emotions such as joy and trust.
What is fake news?
Fake news is misinformation turned into a cognitive, social or political weapon. It takes real events and distorts them, or makes up entire stories out of whole cloth. It pushes an ideological narrative and manipulates emotions. Why? The simple answer is that such content spreads farther and faster. Fake news pushers get more attention and followers. More importantly, it brings in more money: if more people view a story, they also view more ads, and more ad views bring in more money.
There are other kinds of fake news as well. This includes impersonating a legitimate news source, using satire to mislead, manipulating images and presenting content in a false context.
How do fake news pushers abuse social media?
Fake news pushers spread their content on social media in many different ways. Some methods involve technology, while others rely on psychological manipulation.
Twitter bots amplify any Twitter user or link by putting it in front of more people. Filter bubbles allow people to become ideologically isolated: they rarely see information that goes against their beliefs and mostly see information that supports their position. This happens partly because you choose who to follow on social media, and partly because search engines adapt to your search behavior over time. The more you search for a certain kind of content, the more that bias is reinforced in future results.
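To make the feedback loop concrete, here is a toy Python simulation (entirely hypothetical, not based on any real search engine) in which each topic's chance of being shown is weighted by how often the user clicked it before. A small initial preference compounds until one topic dominates the results:

```python
import random

def personalized_results(history, topics, k=5, rng=None):
    # Toy model of a personalizing search engine (hypothetical, for
    # illustration only): a topic's chance of being shown grows with
    # how often the user engaged with it in the past.
    rng = rng or random.Random(0)
    weights = [1 + history.get(t, 0) for t in topics]
    return rng.choices(topics, weights=weights, k=k)

topics = ["conspiracy", "mainstream", "science"]
history = {}

# Simulate 200 search sessions where the user only ever clicks
# the topic they already prefer.
for step in range(200):
    shown = personalized_results(history, topics, rng=random.Random(step))
    for item in shown:
        if item == "conspiracy":
            history[item] = history.get(item, 0) + 1

# After many rounds, the preferred topic crowds out everything else.
final = personalized_results(history, topics, k=100, rng=random.Random(42))
print(final.count("conspiracy"), "of 100 results match the user's bias")
```

The point of the sketch is that no one designed the bubble deliberately; the isolation emerges from an innocuous-looking relevance rule plus biased clicking.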
Writing content that fits ideological narratives means that it resonates with more people. Such articles get more likes and shares on social media. Another way to get interactions is to write content that manipulates people's emotions: people who are angry or upset are more likely to engage with and spread the content.
What is the harm with fake news? Is it not just funny that false content fools people? No. Fake news can harm responses to natural disasters and terror attacks, trigger bad financial investments and skew elections. Fake news is serious business.
There are many ways you can spot fake news. Check independent fact-checkers. Look for misleading domain names. Check the source of the images. Think about what biases the story plays on. Figure out what emotions the story is trying to make you feel.
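One of these checks, spotting misleading domain names, can even be partially automated. Here is a minimal Python sketch (the trusted-domain list is made up for illustration) that flags lookalike hosts such as "abcnews.com.co", which embed a well-known domain but do not actually end with it:

```python
from urllib.parse import urlparse

# Hypothetical whitelist for the example; a real tool would need a
# much larger, curated list of legitimate news domains.
TRUSTED = {"abcnews.com", "bbc.com", "reuters.com"}

def looks_misleading(url):
    # Flag hosts that contain a trusted domain but are not actually
    # that domain or one of its subdomains.
    host = urlparse(url).netloc.lower()
    for trusted in TRUSTED:
        if trusted in host and not (host == trusted or host.endswith("." + trusted)):
            return True
    return False

print(looks_misleading("https://abcnews.com.co/fake-story"))  # True: lookalike
print(looks_misleading("https://abcnews.com/real-story"))     # False: genuine
```

A check like this is only a first filter; it catches the crudest impersonations, while the other habits on the list (fact-checkers, image sources, emotional framing) still require human judgment.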
Twitter boosts spread of fake news
A new study was recently published in the journal Science. The researchers examined about 4.5 million tweets concerning 126,000 stories shared by 3 million users. They determined the truth of a story by checking it against established fact-checking websites. The study found that fake news spread faster and reached more people than true stories. This held for all categories, but some stood out: fake news about politics showed a stronger effect than fake news about e.g. science, finance or disasters.
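To give a feel for the kind of metrics such a study relies on, here is a rough sketch in Python using hypothetical toy data (this is not the authors' actual code, dataset or full set of measures): cascade "size" as the number of unique users reached, and speed as the time needed to reach a given audience.

```python
# Toy cascades: lists of (user, minutes_since_origin) retweet events.
# The numbers are invented to mimic the qualitative finding that false
# stories reach more people, faster.
fake_cascade = [("u%d" % i, i * 0.5) for i in range(1500)]  # fast, large
true_cascade = [("u%d" % i, i * 2.0) for i in range(1200)]  # slower, smaller

def size(cascade):
    # Number of unique users the story reached.
    return len({user for user, _ in cascade})

def minutes_to_reach(cascade, n):
    # How long it took the story to reach its first n users.
    times = sorted(t for _, t in cascade)
    return times[n - 1] if len(times) >= n else None

print("sizes:", size(fake_cascade), "vs", size(true_cascade))
print("minutes to 1,000 users:",
      minutes_to_reach(fake_cascade, 1000), "vs",
      minutes_to_reach(true_cascade, 1000))
```

The real analysis also measured cascade depth and breadth over the retweet tree, which requires the full who-retweeted-whom structure rather than just timestamps.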
The researchers looked specifically at emotional content. Fake news was more novel and provoked fear, disgust and surprise. True content, on the other hand, was associated more with sadness, joy and trust.
The study had one surprising finding: the results showed that bots spread true and false news at similar rates, whereas humans were more likely to spread fake news.
This study is very robust. It relied on multiple, state-of-the-art bot-detection programs. Each fake item spread in several cascades, providing independent data points. The researchers also examined claims not covered by fact-checkers themselves, to avoid selection bias.
What does it mean?
The research shows that fake news is increasing. This directly contradicts efforts to downplay the effects of fake news.
This research confirms many of the things that scientific skeptics, debunkers and science advocates have suspected for a long time. Fake news spreads much faster than true content. This echoes well-known quotes like "a lie can travel halfway around the world while the truth is putting on its shoes" and core skeptical principles such as the asymmetry of bullshit, which states that it takes an order of magnitude more effort to debunk nonsense than it takes to promote it.
It also supports the creeping feeling that pure science content spreads less on social media than content refuting horrible cases of quackery. Articles that describe a scientific paper on herd immunity in detail and without sensationalism get very little spread. In contrast, articles about quacks being sent to prison for a very long time for harming children get spread around a lot. The same is true for content that inspires schadenfreude, such as Natural News getting banned from YouTube.
What about Twitter bots?
It is a bit surprising that bots spread fake news at rates similar to true news. One might expect Twitter bots to amplify fake news much more than true content. However, this finding might reflect the fact that so many organizations and users rely on automation on Twitter to spread their content.
For instance, news organizations connect their websites to their social media accounts: when a new article is posted on the website, the same story is automatically posted to their social media accounts. Some people automate their accounts to tweet from a premade list of tweets in order to produce regular content. The signal from bots amplifying fake news might get lost in this widespread background of legitimate social media automation.
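The premade-list style of automation described above can be sketched in a few lines of Python. This is a hypothetical illustration only: post_tweet is a stand-in function, not a real Twitter API call.

```python
import itertools

# A premade list of tweets, cycled through at scheduled intervals,
# is enough to make an account look continuously active.
premade_tweets = [
    "Read our latest article on vaccine safety!",
    "New post: how to spot misleading domain names.",
    "Throwback: our guide to checking image sources.",
]

posted = []

def post_tweet(text):
    # Stand-in for an actual API call; a real bot would authenticate
    # and post through the platform's API here.
    posted.append(text)

tweet_cycle = itertools.cycle(premade_tweets)
for _ in range(5):  # e.g. five scheduled posting slots
    post_tweet(next(tweet_cycle))

print(posted)
```

Automation this simple is indistinguishable, at the traffic level, from a news outlet's auto-posting, which is one plausible reason the bot signal for fake news gets diluted.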
Another reason for this finding may be that humans are easier to emotionally manipulate than bots. Humans evolved in a social environment, whereas the typical Twitter bot is probably too simple to model emotions. These results highlight the importance of teaching humans to recognize false content and content that is emotionally manipulative.
Humans matter. They are easily affected by emotions and ideological biases. This research confirms many suspicions long held by science advocates and skeptics: fake news is an increasing problem, human biases play a vital role and, surprisingly, bots amplify both true and false content at similar rates.