When Corrections Fail

Lost Luxury of Shared Reality

Cognitive Orientation | November 23, 2025 | by Romina Wendell

“Shared reality, once the default assumption of civic life, now feels quaint. It’s not just that people disagree on how to interpret facts. It’s that they increasingly don’t agree that the facts occurred at all.” – Professor RJ Starr

Step from the left to the right, or from the right to the left, in today’s political or cultural media landscape, and the first thing that becomes apparent is the difference in what counts as news. The top story of the day, depending on the news channel or publication, algorithm or social network, can be worlds apart. When interest and outrage do converge on some major news event, the rival camps may as well inhabit mirror universes, taking in and regurgitating vastly different interpretations of what transpired, who said what, and what it all means.

Beyond differences of values or opinions, modern public debate now wrestles more than ever with facts themselves. No matter the topic, from seed oils to vaccines to global conflicts, each political and cultural camp has crafted its own arsenal of facts. No longer merely the evidence-based data points of yore, modern-day “facts” are the raw material for narratives shaped to fit a particular worldview.

Despite an information-rich media landscape that presents an answer for everything, those answers more often than not fall into perpetual dispute. The well-documented erosion of institutional trust across the West over recent decades has led to an inevitable loss of the authority to define truth. The result is that the search for truth—even the idea of truth—has transformed into a battleground, one in which every fact must fight for survival.

These divergent realities and the resulting polarization reflect a dramatic loss of shared cognitive orientation. The disintegration of the communal mental map has left the idea of a shared reality in freefall, and with no agreed-upon handholds for a common understanding, the cognitive safety net falls away beneath us. The outcome is a populace unable to mobilize collaboratively around concrete challenges.

When Corrections Do Not Land

There is an old saying that goes, “A lie will go round the world while truth is pulling its boots on,” and while the saying still holds true, it is increasingly likely that when the truth does catch up, it will slam into a well-polished partisan wall of resistance. Of course, people sometimes do update their beliefs after seeing corrections, but studies of political misinformation show that partisan divides remain strong barriers to truth.

Opposing political or cultural positions often arise from very different assumptions about motives, context, and risk. If people’s starting points sit too far apart, the same piece of information can enter two different mental worlds, changing shape and function inside each one. Within a dogmatic worldview, corrections will not steer the course but loop into thought-terminating clichés such as “biased media,” “rigged systems,” or “out-of-touch elites.”

In a social media–driven age defined by algorithms and echo chambers, it is easy to usher oneself into a preferred reality: no matter the belief, there is affirmation for it online. This traps the faith invested in the solve-all fact-check, meant to clean up the information chaos, in a downward spiral of diminishing returns. People in the modern era, as it turns out, do not actually lack access to accurate facts; they make choices around them, and at times in spite of them.

Fact-checks are no longer the bridge out of ignorance they once were; they now arrive loaded with cultural baggage. What presents as a neutral correction from one angle can appear as a partisan manoeuvre or a cloaked attack from another. Rather than illuminating an informational void, fact-checks that meet partisan dogma can end up stoking divisiveness and ultimately become a testing ground for belonging.

The Backfire Effect

The term “backfire effect” describes cases in which a correction strengthens belief in a false claim rather than weakening it. When people distrust the source of new information, especially when that information clashes sharply with core worldviews, they are prone to reject it. The correction comes across as a threat to identity or group loyalty, prompting a double-down rather than a reconsideration.

Corrections can, and regularly do, operate as a signal about who belongs, who can speak, and whose version of reality is allowed to stand. That is why the same fact-check can soften misperceptions in one setting while driving people deeper into defensive narratives in another.

Protest sign held above the crowd that reads, "We wish you were fake news."

Ballot Box Fallout

The recent 2024 U.S. election cycle was a high-stakes showdown between disinformation and correction within a fragmented information landscape. While distortion and outright lying have been par for the course in the political arena for a long time, the sheer rate at which untruths went unchallenged and gained technological propulsion put institutions and news agencies on the back foot. U.S. and foreign-organised campaigns pushed misleading stories about candidates, immigration, crime, and the economy across social media, talk shows, and rallies. Journalists, independent fact-checkers, and official institutions did challenge many of these claims, but these challenges were hemmed in by partisan divides. Too often, provocative narratives had already taken hold and still shaped how large groups of voters saw the race and its candidates.

Prior to the election, trust in news organizations was already low. Compound that with the knowledge that digital tools make it easy to create convincing fake or highly selective content, and a fact-check does not arrive as a neutral clarification for everyone. For some, it is enough evidence that reporters and institutions still work; however, with the proliferation of deepfakes and fraudulent studies, that faith is seen as increasingly naïve. For others, it affirms the media as master manipulator, which in turn can be viewed as feeding a stubborn paranoia. The same corrective story can very easily support two opposite beliefs.

U.S. 2024 election-night rumours followed the same pattern. Slow vote counts or routine administrative steps were framed as normal procedure by some, while others declared them signs of fraud. It all depended on the story people already lived inside. When institutions release explanations, those messages help only if listeners see those institutions as legitimate sources of truth. Without that basic trust, each new explanation turns into fresh material for new doubts and suspicions.

Rear view of young woman holding up sign above her head saying MAHA: My Body My Choice

Public Health & Loss of Trust

The COVID-19 pandemic offered another look at how fragile a shared sense of reality has become. During the early stages of the pandemic, there was a surge of misinformation. With so little known at the time, that was an inevitable outcome; people still debate how much accurate information was available, when it was known, and how effectively it was shared. More crucially, as in any human affair, political agendas worked very quickly to co-opt both governing bodies and the public at large. The cost was declining trust in “the science.”

For vaccines, whose rapid development made some wary, accessible evidence-based information was not enough to persuade sceptics. Often, “the facts” were run through a mental filter: if they clashed with strong fears about side effects or deep distrust of institutions, they were tossed aside. The issue was not access to facts, but which facts were noticed and trusted and which were tuned out.

Later on, inconsistent masking calls, ill‑considered school closures, and a bouncing ball of mixed messaging on social distancing politicised trust and distrust in public health guidelines, so much so that one’s belief—or disbelief—in “public health” became a marker of identity and division. In some communities, official reassurances from health agencies felt calming and credible; in others, they were seen as signs that authorities were hiding real risks or seizing control. The same corrective message lands very differently depending on how people see the source and what they believe that source represents. The result is not just disagreement about policy, but parallel realities about something as basic as safety and risk.

Man immersed in Fake News projections whirling around his body

Loud Voices & Perceived Consensus

Social media has made these problems sharper by changing who gets heard, how fast, and at what scale. Platforms give huge numbers of people the chance to correct others, share links, and post their own versions of events, which can look like a more democratic answer to misinformation. At the same time, the loudest and most extreme voices often dominate, so the overall picture of public opinion can become skewed.

A 2025 study on political corrections online found that people who claim they correct others’ political or COVID‑related misinformation tend to be highly engaged and strongly partisan. A small but real subgroup is also more accepting of political violence. This does not mean that everyone who offers corrections is extreme. It does mean that corrections in public threads often arrive with a strong emotional and partisan charge. For people watching from the sidelines, that noise makes it hard to see what most people actually think about vaccines, elections, or any public issue.

The very tools that could help bring understanding closer instead end up reinforcing separate mental worlds. When each camp sees its own influencers and friends constantly “proving” the other side wrong, a shared sense of reality slips even further away.

When Truth Has No Ground

Across these cases—elections, public health, social media debates—the pattern is similar. The factual content of a correction is only one layer. Beneath it lies a more basic question: does enough shared reality and institutional authority remain for corrections to matter, and to be heard?

Where a sense of shared reality exists, corrections have a chance to take hold. People might still disagree, but they are at least checking mutually agreed‑upon sources, comparing arguments, and updating views when earlier claims collapse. Where that sense has eroded or perhaps even gone, corrections can be met with suspicion and defensiveness.

The crisis of misinformation is fundamentally one of shared reality. Whatever the divided group—be it liberals, conservatives, or the countless warring online niche subcultures—the breakdown or absence of a shared cognitive orientation towards politics, culture, and civic life has turned public life into a series of clashes between insulated realities. Fact-checking, content moderation, and media literacy still matter to some degree, but without some rebuilding of common mental ground and basic agreements about what counts as evidence and which processes deserve trust, even the most careful corrections cannot help but pull us further into the very divides they are meant to repair.

Man correcting another man who is sitting on a stool and appears unimpressed
Three business women arguing

Well-Known Memes:

  • Drake “Hotline Bling” meme
    Used in anti‑vaccine disinformation to frame “not getting the vaccine” as the cool choice and “trusting health authorities” as uncool, often presented in the familiar two‑panel up/down format.
  • Before/after or split‑screen memes
    One viral meme showed Greta Thunberg and climatologist Judith Curry, side by side, claiming the media only covered one of them and falsely stating Curry called climate change a “hoax,” implying a rigged media landscape.
  • “Vaccines cause autism” text memes
    A short, punchy slogan became a global meme after a fraudulent 1998 study, appearing on signs, shareable graphics, and social posts, despite being widely debunked.
  • AI‑generated political memes
    Newer AI‑made memes mix images of cats, politicians, or scenes from small towns with invented stories (for example, about immigrants harming pets), blurring satire and disinformation in election conversations.
  • Climate change hoax memes
    Simple text‑and‑image combos such as “the ice caps are not melting… they are growing!” paired with scenic photos, used to downplay climate science while appearing casual and joke‑like.
  • NFL kneeling and political memes
    Images of players kneeling or politicians speaking, with fabricated captions about donations, health, or secret motives, circulated widely on Facebook and Twitter as “just memes” that still planted false claims.
Research & Sources:
  1. https://www.oiip.ac.at/en/publications/the-politics-of-misinformation-social-media-polarization-and-the-geopolitical-landscape-in-2025-2/
  2. https://www.cambridge.org/core/services/aop-cambridge-core/content/view/20F819E5EBC5C8E076C9DDD54B67D280/S1138741625100103a.pdf/the-fake-news-and-polarization-landscape-scoping-review-of-fake-news-research-and-its-link-with-attitude-polarization.pdf
  3. https://www.brookings.edu/articles/how-disinformation-defined-the-2024-election-narrative/
  4. https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2025/dnr-executive-summary
  5. https://www.sciencedirect.com/science/article/pii/S0176268024000806
  6. https://carnegieendowment.org/research/2024/01/countering-disinformation-effectively-an-evidence-based-policy-guide?lang=en
  7. https://www.pnas.org/doi/10.1073/pnas.1912440117
  8. https://pmc.ncbi.nlm.nih.gov/articles/PMC7934973/
  9. https://www.sciencedirect.com/science/article/pii/S2211368120300516
  10. https://www.frontiersin.org/journals/political-science/articles/10.3389/fpos.2024.1254826/full
  11. https://www.brookings.edu/articles/election-night-disinformation-risks/
  12. https://www.brookings.edu/articles/an-ordinary-election/
  13. https://www.brookings.edu/articles/are-concerns-about-digital-disinformation-and-elections-overblown/
  14. https://www.nature.com/articles/s41598-023-35974-z
  15. https://www.nature.com/articles/s41598-022-17430-6
  16. https://www.nature.com/articles/s41541-024-00951-8
  17. https://www.nature.com/articles/s41591-024-02939-2
  18. https://www.nature.com/articles/s41541-024-00962-5
  19. https://journals.sagepub.com/doi/10.1177/20563051251335086
  20. https://infolab.uottawa.ca/common/Uploaded%20files/PDI%20files/Polorized-EN-FINAL.pdf
  21. https://www.frontiersin.org/journals/political-science/articles/10.3389/fpos.2025.1625535/full
  22. https://pmc.ncbi.nlm.nih.gov/articles/PMC12351547/
  23. https://onlinelibrary.wiley.com/doi/10.1111/ssqu.70039?af=R
  24. https://www.youtube.com/watch?v=l2VvCBlRyd0
  25. https://www.nature.com/articles/s41541-024-01019-3
  26. https://www.nature.com/articles/s41599-025-05714-x