December 24

The Earth is Flat: A Psychology Perspective

Living in a world of “alternative facts” and misinformation is exhausting. A casual scroll through Facebook can spiral into a tiresome fact-checking escapade and an emotional rollercoaster: “You mean we could have been free from COVID if we had all just injected ourselves with bleach?!” Misinformation is defined as false information that is spread, regardless of intent to mislead [1]. In extreme cases, misinformation can morph into more grandiose conspiracy theories — for example, that the 9/11 attacks were an “inside job” or that Wayfair furniture sales are a cover for child trafficking. According to psychologist Sander van der Linden, a conspiracy is “an attempt to explain the ultimate cause of an important societal event as part of some sinister plot conjured up by a secret alliance of powerful individuals and organizations.” Conspiracy theories are certainly not a novel product of modern society; there is documentation of juicy conspiracy suspicions at least as far back as ancient Rome [2]. But it feels like 2020 has been a tornado of “misinformation,” “fake news,” “alternative facts,” and conspiracies. Indeed, QAnon, one of the more well-known conspiracy groups, has even seeped into mainstream American politics. So why is misinformation so pervasive, and is there any way to curb it?

Conspiracy Theories Have a Type

Researchers have sought to identify the type of person who is most readily drawn to conspiracy circles. The literature is vast and a large number of traits have been linked to an increased likelihood of believing in conspiracy theories. For instance, individuals who greatly value feeling unique may be attracted to the small and exclusive nature of conspiracy groups [2]. Did you notice your cereal floating in any Jesus-esque shapes this morning? Research suggests that individuals who are more prone to “discovering” patterns in randomness are also more likely to entertain conspiracy theories. Additionally, people with a more jaded or negative view of the world seem particularly susceptible to becoming a “believer” [2,3,8].

Other key factors are more situational. For instance, people are more likely to cling to conspiracy theories when there is a sense of uncertainty or when major events lack a clear explanation. In this case, the conspiracy theories provide a route to “cognitive closure,” or a way to make sense of loose ends [2]. Because conspiracy theories offer answers to complex and emotional events, they gain traction in moments of crisis. On a personal level, many flock to conspiracy groups as a way to cope with feelings of a loss of control or when they feel oppressed by society. In fact, conspiracy theories have been described as a way to “buffer people from threats to the social system.” Not surprisingly, then, experimental manipulations that bolster a sense of control are associated with reductions in conspiracy beliefs. For instance, a group of researchers manipulated one’s sense of control by prompting participants to remember moments when they felt either in control or completely helpless. Specifically, the “high control” group of participants was instructed to recall and describe an instance in which something happened and “they were in complete control of the situation” prior to rating their belief in several conspiracy statements related to a current event. These ratings were compared to ratings made by two other groups: a “low control” group that received the same instruction except that they were asked to recall an event when they felt they had no control over the situation, and a neutral group of participants who were simply asked to recall what they had for dinner the night before. Participants primed to reflect on a moment when they felt in control endorsed far fewer conspiracy statements compared to both the low control group and the neutral group, leading the researchers to conclude that feeling in control may make people less vulnerable to conspiracy theories [9].

The longevity of belief in misinformation and conspiracies is likely the result of what psychologists refer to as motivated reasoning, or confirmation bias. Motivated reasoning refers to our tendency to interpret new information in a way that aligns with our currently held views, even when that information appears wildly contradictory [2]. A surprising example of our stubborn adherence to beliefs comes from research on false memories. Some studies suggest that people are more prone to experience a false memory that supports their views. For example, a study conducted during Ireland’s abortion referendum revealed that “yes” voters often reported a false memory for a fabricated event regarding a scandal surrounding the “no” campaign and vice versa [4]. These findings have particularly alarming implications for how fake news can influence our views and even manipulate our memories.

We’re Suckers for a Good Page-Turner

Another reason that conspiracy theories are so rampant is that they tell a good story. Narrative theory asserts that people are drawn to stories. Indeed, storytelling has been a centerpiece of culture throughout human history, making it an effective mode for delivering a message. One researcher argues that conspiracy theories are convincing (at least in part) because they borrow key archetypes from the familiar “hero’s journey” narrative: a call to action goes out, a “hero” steps up to answer the call and seeks guidance from a mentor, and then ultimately proceeds to defeat the “shadow,” or the source of evil. Conspiracy groups cast you as the hero, and it is up to you to unveil the truth and overthrow the shadow (e.g., the government, scientists, etc.). This language is thought to resonate with us because the narrative is so familiar, whether from reading the Iliad, the Harry Potter series, or childhood comic books [3,7]. And Superman can’t actually decline the call to action, can he? The power of the hero narrative is vividly illustrated by “Pizzagate,” a striking example of how conspiracy theories can compel an individual to take radical action. In this true tale, Edgar Welch, a seemingly innocuous father of two, was spurred to action by Alex Jones’ report that Hillary Clinton was running a pedophile ring in the basement of a D.C. pizza restaurant. A few days later, Edgar barged into the restaurant armed with a gun and knife, prepared to take any actions necessary to free the children. To his utter shock, Edgar found no captive children and discovered that the pizza restaurant didn’t even have a basement [5]. Thankfully nobody was harmed when Edgar tried to fulfill his role as hero, but the event serves as a flashing beacon for how conspiracy theories and misinformation can fuel dangerous responses.

The Fall of Carrier Pigeons

Even for those who don’t believe in conspiracies, it is challenging to live in today’s world without engaging with misinformation. In this era of social media, our megaphones are only one click away. The ease of sharing messages across the internet and social media has created a fast lane for the spread of misinformation and a breeding ground for conspiracy groups unimpeded by distance and time zones. All of us are guilty from time to time of clicking “share” without performing a thorough fact check.

Importantly, however, recent research suggests that most people do not want to share misinformation and that simply “nudging” them to consider the veracity of headlines can improve the quality of the information shared. In a set of studies, researchers first tried to identify why people tend to share misinformation. Participants were assigned to one of two groups: 1) an accuracy condition in which participants had to rate the accuracy of several true or false headlines, or 2) a sharing condition in which participants reported whether they would consider sharing each headline online. The results showed that people are actually pretty good at judging the accuracy of headlines and, perhaps surprisingly, the participants did not show a political bias when rating headline veracity. Participants’ desire to share a headline, however, did not necessarily match their accuracy judgments: they showed only a slight preference for sharing true headlines over false ones. Instead, their preference to share was driven by whether or not the headlines aligned with their politics. Yet when asked how important they felt it was to share only true information on social media, participants overwhelmingly reported that it was “extremely important.” One explanation for this discrepancy is that the social media environment funnels our attention to factors other than accuracy – things like the evocative nature of a post or our desire to signal our group membership. In a follow-up study, researchers found that simply having participants judge the accuracy of a headline upon signing onto Facebook improved the quality of their posts and reduced their tendency to share false content [6].

But there are of course times when people do in fact believe the misinformation. Research suggests that we often equate consensus with correctness, making consensus messaging a valuable weapon against misinformation (for example, that “97% of climate scientists believe in global warming”), but this initial persuasiveness can be demolished if the science is viewed as politicized. So how, then, do we convince people of the truth? Psychologists have found that a technique called inoculation may be an important tool in allowing the truth to reemerge. Inoculation theory borrows a page out of medicine’s playbook. According to this theory, if you first warn a person that you are about to unveil some information that goes against their beliefs, and you then introduce a weaker version of your statement, you are more likely to have success in convincing them of your argument. This functions much like a vaccine: the nurse first warns you that they are about to jab you with a needle, and they then inject you with a small, weakened dose of the virus.

For instance, let’s return to the consensus statement that “97% of climate scientists believe in global warming.” Using the inoculation approach, researchers first warned participants that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.” This subtly suggests that some of one’s currently held beliefs may be flawed and thus implicitly forewarns that such beliefs could be called into question. To test whether the inoculation approach can withstand misinformation, the researchers then presented a false countermessage such as “there is no consensus on human-caused climate change.” They next moved to the second and final step of inoculation: “injecting” a small dose of the truth. Here, they gently reiterated the main sentiment of the original consensus statement, e.g., that scientific research has revealed essentially no disagreement among scientists that humans are causing climate change [7]. This work showed that the inoculation approach is effective even when it is presented alongside a piece of contradictory misinformation.

In conclusion, the reasons that misinformation and conspiracy theories are so rampant are complex. In some cases, they act as a safety blanket, capitalizing on our cognitive biases and our human desire to make sense of the events around us. Further, even when people denounce sharing falsities, they sometimes partake in the massive game of telephone created by social media and technology. Luckily, however, recent work suggests that most people do in fact value the truth, and that there are several strategies that can be used to nudge people in the right direction. So let’s spread the word: pause and think before you share.


Works Cited


2. Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding conspiracy theories. Political Psychology, 40, 3-35. 

3. Runnels, R. (2019). Conspiracy Theories and the Quest for Truth.

4. Murphy, G., et al. (2019). False memories for fake news during Ireland’s abortion referendum. Psychological Science, 30(10), 1449-1459.


6. Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychological Science, 31(7), 770-780.

7. Van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008.


9. van Prooijen, J. W., & Acker, M. (2015). The influence of control on belief in conspiracy theories: Conceptual and applied extensions. Applied Cognitive Psychology, 29(5), 753-761.