Facebook Limits Fact-Checking: Examining Trends of Misinformation

By Steffi Kim


Splattered across political campaign ads and flashed across the top of Instagram feeds, misinformation is rampant in today’s society. Recently, social media giants have thrust the discussion of misinformation into the spotlight. On January 7th, Meta announced that its third-party fact-checking program would be dismantled and that fewer restrictions would be placed on users’ speech. As the guardrails of fact-checking are lifted by Meta and similar companies like X and YouTube, there is growing concern about the deeper implications of such changes and how they may lead to a rise in distorted attitudes and beliefs. How will increased misinformation affect the thinking of the 68% of American adults who use Facebook, especially the 30% who regularly use it as a news source?* Misinformation is not limited to TikTok videos or social media feeds; its burgeoning presence is a broader societal trend, as unavoidable as it is insidious. Contemporary psychology research has revealed key mechanisms that make misinformation so effective, along with strategies for sidestepping common cognitive biases and combating the spread of mistruths.

* according to 2023 data from the Pew Research Center

Personal Beliefs & Emotionality

Within the past decade, the term “post-truth era” has gained traction to describe how the public has devalued the importance of facts when considering claims. Whether evaluating political campaigns or fake news, people are increasingly swayed by emotionality and personal beliefs. Some scholars suggest that people’s standards for veracity have fallen and that moral acceptance of misinformation is on the rise. Naturally, people are more susceptible to misinformation from their ingroup, the social group they belong to, than to misinformation from a different group, termed the outgroup. Because of the Bandwagon Effect, when surrounding group members accept false information, people are more inclined to believe it themselves. Unsurprisingly, information criticizing the outgroup is more readily accepted than negative information about the ingroup. In social psychology, the term Naive Realism describes people’s tendency to believe that their own perception of the world is fully accurate and that any contradicting information must be biased or flawed. As such, people are inclined to believe and justify misinformation that aligns with their personal tenets and worldviews. For instance, staunch environmentalists may condone the spread of inflated statistics, e.g., ‘The Earth is warming by 0.5 degrees every year,’ because they agree with them in principle, even if the specifics are misleading. Strong social forces, including partisanship, social consensus, and group polarization, are at the root of many exaggerated claims.

Emotional reactions also influence how misinformation is processed. People are more vulnerable to deceptive information when angry, and headlines or posts that provoke fury can induce wider acceptance and belief. Additionally, happiness increases trust and susceptibility to misinformation, while sadness begets more critical evaluation. Nefarious misinformation may target these emotions and prey on personal experience to incite a reaction. Personal experiences and emotional responses are interrelated, and both frustrate genuinely rational analysis.

Faulty Cognition 

The overarching question ‘Why do people believe false information in the first place?’ can be traced back to the heuristics and irrational thought processes employed by the human mind. For instance, due to a phenomenon known as the Illusory Truth Effect, the more frequently a false statement is repeated, the more likely people are to believe it. Consistent exposure to misleading claims can trick the mind into assuming they are true on the basis of familiarity alone. The cognitive bias known as Processing Fluency explains how information that is easier to understand and recall is deemed more favorable and believable. Viral conspiracy theories and oft-repeated claims persist in people’s memories and thus seem more likely to be true.

While information may be easy to recall, people are notoriously bad at remembering where it originated, a phenomenon known as Source Amnesia. Additionally, people tend to take information at face value and do not properly weigh the credibility of its source. People also have trouble distinguishing fact from opinion, even when news articles are categorized and labeled as such. Oftentimes, people are so focused on deciphering information and deciding how to respond that they neglect to consider whether the statements are true. In deciding what to believe, people make snap judgments based on a statement’s plausibility rather than taking the time to actually verify it. In other words, if a statement seems plausible, regardless of the context, it is presumed to be true. These surface-level judgments make people particularly susceptible to statements that straddle the line of truth. Deceptive information that could have been true, for example, a headline falsely claiming that ‘Flooding was caused by government mismanagement’ when natural erosion was the real cause, is considered less offensive because of its plausibility, despite being false.

Methods of Combating False Information

Combating false information is particularly difficult due to a phenomenon known as the Continued Influence Effect (CIE), whereby many people continue to believe misinformation even after it has been debunked. One explanation for the CIE is that people are good at recalling the false information but fail to remember that it was subsequently challenged. It is easy to assume that simply stating disconfirming evidence will change people’s minds. The Information Deficit Model (IDM) posits that false beliefs stem from a lack of knowledge and that providing accurate information to the public will remedy the issue. It is true that highly educated people tend to possess stronger critical thinking skills and are less likely to believe misinformation. Nevertheless, research has shown that even after people learn the correct facts, many still choose to believe false narratives. For instance, research by Brendan Nyhan, a professor at Dartmouth, found that presenting information discrediting fears about vaccines did little to change hesitant parents’ beliefs. This finding contradicts the Information Deficit Model and illustrates the Continued Influence Effect. The acceptance of misinformation is not driven solely by ignorance but by a host of contributing factors, such as personal beliefs, partisanship, emotions, and shallow reasoning.

Two approaches to guarding against misinformation have emerged, prebunking and debunking, which vary in relative effectiveness depending on the situation. Prebunking refers to taking preemptive measures to warn people about misinformation before the falsehoods are presented. It relies on Inoculation Theory, a social psychology framework describing how beliefs can be made resistant to future persuasion or messaging. Prebunking initiatives warn people of misinformation and urge them to be wary, thereby priming critical thinking and skeptical analysis before exposure occurs. On social media, for example, red icons can warn users that a post has been discredited before they read it. Experts anticipating misinformation may preemptively release statements disclosing the correct facts and warning about future attempts to mislead. Other prebunking measures, like improving internet literacy and informing the public about cognitive biases, can also stop false information before it takes off.

Research has also identified the optimal ways of debunking a false statement after it has spread. Simply retracting the false information and declaring it void is insufficient; rather, people need an alternative version of events to hold onto. This alternative account should be detailed and include facts from credible sources. For instance, to refute the rumor that ‘The mayor resigned due to business corruption,’ explain the correct narrative, in this case that ‘The mayor resigned after a falling-out with donors.’ It is also important to preface the misinformation with a warning that it has been debunked. Since repeating statements makes them seem more credible, avoid amplifying and spreading the falsehood further; the true information, on the other hand, should be repeated to maximize its credibility. Finally, pointing out why the false information is illogical or contradictory, undermining its source, and connecting with people emotionally can all make the falsehood less appealing.

Real-World Implications

In light of recent moves to limit fact-checking oversight, misinformation will almost certainly become more prevalent in the media. On Facebook, relying on the wisdom of the crowd to identify false information can be risky, as people’s personal experiences and emotions will influence whether information is deemed reliable. Economic, political, and ideological motives, as well as broader societal forces, also come into play. Unlike third-party verification services such as the Associated Press, everyday social media users will likely rely on quick judgments about whether a claim seems plausible rather than engaging in rigorous fact-checking. And the longer it takes for information to be flagged, and the more times it is reposted, the more likely people are to believe it is true due to repetition and familiarity. How successful prebunking and debunking will be at controlling misinformation on social media remains to be seen.

References

Adam, D. (2025, January 10). Facebook to ditch fact-checking: What do researchers think? Nature News. https://www.nature.com/articles/d41586-025-00027-0

American Psychological Association. (2023, November 29). What psychological factors make people susceptible to believe and act on misinformation? American Psychological Association. https://www.apa.org/topics/journalism-facts/misinformation-belief-action

Calvert, D., & Waytz, A. (2017, March 6). The Psychology Behind Fake News. Kellogg Insight. https://insight.kellogg.northwestern.edu/article/the-psychology-behind-fake-news 

Ecker, U. K., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., … & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13-29.

Effron, D. A., & Helgason, B. A. (2022). The moral psychology of misinformation: Why we excuse dishonesty in a post-truth world. Current Opinion in Psychology, 47, 101375.

Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), e835–e842. https://doi.org/10.1542/peds.2013-2365

Robertson, D. J., Shephard, M. P., Anderson, A., Huhe, N., Rapp, D. N., & Madsen, J. K. (2023). Editorial: The psychology of fake news on social media, who falls for it, who shares it, why, and can we help users detect it? Frontiers in Psychology, 14, 1236748. https://doi.org/10.3389/fpsyg.2023.1236748

Schaeffer, K. (2024, February 2). 5 facts about how Americans use Facebook, two decades after its launch. Pew Research Center. https://www.pewresearch.org/short-reads/2024/02/02/5-facts-about-how-americans-use-facebook-two-decades-after-its-launch/