Why Misinformation Spreads
Over the past 16 months, the COVID-19 pandemic has highlighted not only our vulnerability to disease outbreaks but also our susceptibility to misinformation and the dangers of “fake news.”
In fact, COVID-19 is not a pandemic but rather a syndemic of viral disease and misinformation. In the current digital age, an abundance of information sits at our fingertips. The result is a surplus of accurate and inaccurate information alike ― information that is subject to the various biases we humans carry.
Bias plays a significant role in how we process and interpret information. Our decision making and cognition are colored by biases from our internal and external environments, whether through our emotions, societal influences, or cues from the “machines” that are now such an omnipresent part of our lives.
Let’s break them down:
Emotional bias: We’re only human, and our emotions often overwhelm objective judgment. Emotional attachment can keep us committed to a belief even when the evidence behind it is weak, steering us away from rational thinking. This kind of bias is often rooted in personal experience.
Societal bias: The thoughts, opinions, and perspectives of our peers are powerful forces that can shape our decisions and viewpoints. Our social networks can be conceptualized as partisan circles and “echo chambers.” This bias is perhaps most evident on online social media platforms.
Machine bias: Online platforms are laced with algorithms that tailor the content we see. This curation narrows the diversity of what we encounter and, in doing so, may reinforce existing biases, such as confirmation bias.
Although bias plays a significant role in decision making, we should also consider intuition vs deliberation ― and whether the “gut” is a reliable source of information.
Intuition vs Deliberation: The Power of Reasoning
Dual process theory divides thought into two modes: system 1, which is rapid, intuitive, automatic thinking (often shaped by personal experience), and system 2, which is deliberate, controlled, reasoned thinking. The two are often summarized as fast vs slow thinking.
Let’s use the Cognitive Reflection Test to illustrate the dual process theory. This test measures the ability to reflect and deliberate on a question and to forgo an intuitive, rapid response. One of the questions asks: “A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?” A common answer is that the ball costs $0.10. However, the ball actually costs $0.05. The common response is a “gut” response, rather than an analytic or deliberate response.
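To see why, a quick bit of algebra makes the deliberate route explicit (writing x for the price of the ball):

\[
x + (x + 1.00) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05
\]

The ball therefore costs $0.05 and the bat $1.05. The intuitive $0.10 answer would force the bat to cost $1.10, bringing the total to $1.20.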
This example extends to social media behavior, as when individuals endorse beliefs and behaviors that are far from the truth (eg, conspiracy ideation). It is not uncommon for people to rely on intuition, which may be incorrect, as their source of truth. Intuition can be right, but it pays to slow down and deliberate.
But would deliberate engagement simply lead to more politically valenced conclusions? One hypothesis posits that system 2 thinking amplifies belief in false claims that fit one’s politics, worsening the discernment of truth. The more widely supported alternative, the classical reasoning account, holds that people who engage more thoughtfully are less susceptible to false news (eg, hyperpartisan news), regardless of their political beliefs.
Additionally, strong literacy (political, scientific, or general) is important for discerning the truth, especially around events in which information and claims of knowledge have been heavily manipulated.
Are Believing and Sharing the Same?
Interestingly, believing a headline and sharing it are not the same. A study investigating the difference between the two found that although individuals could discern whether headlines were accurate, that accuracy was not a determining factor in whether they shared the story on social media.
It has been suggested that the social media context distracts individuals from the deliberate thinking that would help them judge the accuracy of content. The dissociation between truthfulness and sharing may be a product of the “attention economy,” which rewards user engagement in the form of likes, comments, shares, and so forth. As such, social media behavior and content consumption may not necessarily reflect one’s beliefs and may instead be shaped by what others value.
To combat the spread of misinformation, proactive interventions ― “prebunking” or “inoculation” ― have been suggested as necessary. This idea is in accordance with inoculation theory, which holds that preexposure to a weakened form of a misleading argument can confer resistance to later challenges, much as vaccines confer resistance to medical illness. Increasing awareness of individual vulnerability to manipulation and misinformation has also been proposed as a strategy to resist persuasion.
The age-old tension between what others think of us and what we believe to be true existed long before the viral takeover of social media. The main difference today is that social media acts as a catalyst for pockets of misinformation. Although social media outlets are cracking down on “false news,” we must consider what criteria should be used to identify false information. Should external bodies regulate our content consumption? We are certainly entering a gray zone of “wrong” vs “right.” With the overabundance of information available online, it may become a case of “them” vs “us” ― those who do not believe misinformation exists vs those who do.
Leanna M. W. Lui, HBSc, completed an HBSc global health specialist degree at the University of Toronto, where she is now an MSc candidate. Her interests include mood disorders, health economics, public health, and applications of artificial intelligence. In her spare time, she is a fencer with the University of Toronto Varsity Fencing team and the Canadian Fencing Federation.