Introduction: “How Are We Doing This Again?”
Most people understand the personal version of repeating mistakes. You date the same type of emotionally unavailable partner. You overwork, burn out, then promise yourself you will not do it again, and somehow you do.
The collective version is more unsettling. Whole teams repeat the same failing strategy. Organizations keep funding projects that are clearly not working. Communities cycle through the same “tough” solutions that do not solve root causes. Countries repeat political mistakes, from scapegoating minority groups to chasing simplistic “silver bullet” policies, then act surprised when outcomes deteriorate.
From the outside, it looks irrational. From the inside, it often feels like the only reasonable option.
This article explains why groups and societies repeatedly make bad decisions while believing the result will be different. You will learn the research-backed mechanics behind collective error, how online and media ecosystems amplify it, why politics is especially vulnerable, and practical step-by-step tools you can use as an everyday citizen, team member, voter, manager, or parent to reduce repeat mistakes.
1) What Counts as “Collective Bad Decision-Making”?
A collective bad decision is not simply a decision that has a bad outcome. Bad outcomes can happen even with sound reasoning.
A collective bad decision is more like this:
- the group ignores or suppresses warning signs
- the group overweights confident voices and underweights expertise
- the group repeats a familiar strategy despite repeated failure
- the group locks into a story that protects identity, status, or comfort
- dissent is punished socially, subtly or openly
- the group believes “this time” will be different without changing the core inputs
That last bullet is the heart of it. Repeating the same inputs while expecting different results is not just a personal habit. It is a social phenomenon with predictable psychological drivers.
2) Why Groups Repeat Mistakes: The Core Psychological Engines
2.1 Social proof and informational cascades
When people are uncertain, they look to others for cues. This is rational up to a point. The problem is that visible consensus can become a shortcut for truth.
In an informational cascade, individuals stop using their private information and simply follow what others appear to believe or do (Bikhchandani et al., 1992). Once a cascade starts, it can be surprisingly hard to stop because:
- people assume earlier actors knew something
- dissent feels risky
- the group starts rewarding conformity
This is how bad ideas can become “obvious” quickly, especially in politics where most people cannot directly verify complex policy claims.
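To make the mechanics concrete, here is a minimal sketch in Python, with hypothetical parameter values, of the cascade logic Bikhchandani et al. describe: each agent holds a private signal, but once the public record of choices leans far enough one way, rationally ignoring that signal becomes the best move. For simplicity, this version follows the private signal whenever the lead is smaller than two, rather than modeling the coin-flip tie-breaking of the full model.

```python
import random

def simulate_cascade(n_agents=100, signal_accuracy=0.7, seed=None):
    """Toy informational cascade (after Bikhchandani et al., 1992).

    The true state is "adopt is correct". Each agent receives a private
    signal that points the right way with probability `signal_accuracy`,
    observes every earlier choice, and then chooses. Once one option
    leads by two, imitating the crowd outweighs any single private
    signal, so agents rationally stop using their own information.
    """
    rng = random.Random(seed)
    adopts, rejects = 0, 0
    for _ in range(n_agents):
        if abs(adopts - rejects) >= 2:
            # Cascade: public history outweighs one private signal.
            choice_is_adopt = adopts > rejects
        else:
            # No cascade yet: follow the private signal.
            choice_is_adopt = rng.random() < signal_accuracy
        if choice_is_adopt:
            adopts += 1
        else:
            rejects += 1
    return adopts > rejects  # did the crowd converge on the truth?

runs = 10_000
correct = sum(simulate_cascade(seed=i) for i in range(runs))
print(f"Crowd converged on the truth in {correct / runs:.1%} of runs")
```

Even with fairly accurate private signals, a meaningful fraction of runs lock onto the wrong answer within the first few agents and never recover. That is the formal version of a bad idea becoming “obvious.”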
2.2 Conformity pressure and the fear of isolation
Classic conformity research shows that social pressure can distort judgment even when facts are clear (Asch, 1956). In real life, political and group decisions are rarely clear, and the pressure is often stronger.
In public debates, the fear of social punishment can push people into silence or performative agreement. Spiral of silence theory proposes that people may withhold dissenting views if they think their position is unpopular, which then makes the dominant view seem even more dominant (Noelle-Neumann, 1974).
2.3 Pluralistic ignorance
Sometimes the group is not truly convinced. They only think everyone else is.
Pluralistic ignorance occurs when individuals privately reject a norm but assume others accept it, so they go along with it, which keeps the norm alive (Prentice & Miller, 1993). This helps explain repeated collective mistakes like:
- everyone privately doubts a policy, but no one says it
- people keep supporting a “tough” approach because they assume others demand it
- leaders misread silence as consent
2.4 Groupthink and the suppression of dissent
Irving Janis described groupthink as a deterioration of mental efficiency and moral judgment that can occur when cohesive groups prioritize unanimity over critical thinking (Janis, 1972). Groupthink is more likely when:
- the group is insulated
- there is a strong leader preference
- stress is high
- dissenters are marginalized
Politics often has these conditions, especially in inner circles, party leadership, and crisis decision-making.
2.5 Group polarization
When like-minded people deliberate together, their views often become more extreme. This is group polarization, observed across decades of research (Moscovici & Zavalloni, 1969; Sunstein, 2009). In politics, polarization increases:
- moral certainty
- dehumanization of opponents
- willingness to accept risky or harsh policies
- refusal to update beliefs
Polarization is particularly intensified when identity becomes central.
2.6 Identity-protective cognition and motivated reasoning
People do not process political information like neutral scientists. We often reason like lawyers defending a side. Motivated reasoning describes how people selectively accept evidence that supports their preferred conclusion and scrutinize evidence that threatens it (Kunda, 1990; Taber & Lodge, 2006).
Identity-protective cognition goes further: when beliefs are tied to group identity, rejecting the belief can feel like rejecting your community (Kahan, 2013). In politics, identity is often the main currency.
2.7 Confirmation bias
Confirmation bias is the tendency to seek, interpret, and remember information in ways that confirm preexisting beliefs (Nickerson, 1998). Online environments make this worse because:
- algorithms feed us similar content
- social networks reward agreement
- opposing facts can be framed as hostile propaganda
2.8 The illusory truth effect
Repeated claims feel truer, even when false. This effect persists even when people know the topic well (Hasher et al., 1977; Fazio et al., 2015). In politics, repetition is constant:
- slogans
- talking points
- repeated emotional narratives
A repeated narrative can become “common sense” without becoming accurate.
2.9 Escalation of commitment and sunk costs
Groups often keep investing in failing strategies because changing course feels like admitting failure.
- The sunk cost effect describes continuing an endeavor because of past investment, even when future costs outweigh benefits (Arkes & Blumer, 1985).
- Escalation of commitment shows how negative outcomes can paradoxically increase commitment, especially when ego, reputation, or political capital is involved (Staw, 1976).
Political systems amplify this because leaders fear looking weak or inconsistent.
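The underlying arithmetic is simple, which is what makes the bias so striking. Here is a minimal sketch of the forward-looking rule, in Python with made-up numbers:

```python
def should_continue(future_benefit, future_cost, sunk_cost=0.0):
    """Forward-looking decision rule: only future numbers matter.

    `sunk_cost` is a parameter purely to make the point visible:
    it never appears in the comparison below.
    """
    return future_benefit > future_cost

# Hypothetical project: 9M already spent, 2M more needed to finish,
# expected payoff 1.5M. The 9M is gone either way, so the rational
# answer is to stop, however painful that feels.
print(should_continue(future_benefit=1.5, future_cost=2.0, sunk_cost=9.0))  # False
```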
2.10 Hindsight bias and the illusion of learning
After events unfold, people often feel outcomes were predictable and obvious. Hindsight bias can reduce genuine learning because it replaces “we misjudged uncertainty” with “we always knew” (Fischhoff, 1975; Christensen-Szalanski & Willham, 1991).
In politics, hindsight bias fuels blame without insight. It encourages shallow lessons like “we just needed stronger leadership,” rather than deeper analysis of incentives and information failures.
3) Why Politics Is a Perfect Storm for Repeating Mistakes
Political decisions are uniquely vulnerable because they combine:
- high stakes
- high emotion
- limited direct feedback
- complex causal chains
- identity and tribal belonging
- competition for power
- information ecosystems that reward outrage
3.1 Complexity hides causality
In everyday life, consequences are often immediate. Touch a hot pan, you learn fast.
In politics, cause and effect can be delayed by years, intertwined with global events, economic cycles, and institutional constraints. This makes it easier to keep believing, “This time it will work,” because failure can be explained away.
3.2 Moralization makes updating harder
When political beliefs become moral identity, updating feels like betrayal. Moralized issues trigger stronger emotional processing and social punishment for dissent (Haidt, 2012). This raises the cost of admitting mistakes.
3.3 Affective polarization creates “team first” thinking
Research on affective polarization highlights how politics becomes about liking your side and despising the other side, even more than policy content (Iyengar et al., 2012; Iyengar et al., 2019). In that environment:
- admitting your side is wrong strengthens the other side
- therefore, people defend their side even when privately uncertain
This is a recipe for repeating mistakes.
3.4 Incentives reward confident simplicity
Leaders and commentators are often rewarded for certainty, not calibration. Nuance looks weak. Simplicity spreads.
This is one reason populist messaging and simplistic solutions are structurally advantaged, even when they fail repeatedly.
3.5 The reasoning function is often social, not truth-seeking
Mercier and Sperber argue that human reasoning evolved largely for argumentation, to persuade and justify in social contexts, not purely to discover truth (Mercier & Sperber, 2011). That insight fits political reality: people often reason to defend their camp.
4) Example Scenarios: How Repeating Mistakes Looks in Real Life
Scenario A: Workplace decision loops
A company keeps launching rushed products despite repeated customer backlash. Everyone complains privately, but meetings end with “we have no choice.”
What is happening:
- pluralistic ignorance (people think others support it)
- conformity pressure
- escalation of commitment
- fear of speaking up
Scenario B: Family systems and “the same fight”
A family repeatedly handles conflict by avoidance, then explodes at holidays.
What is happening:
- short-term regulation overrides long-term health
- conflict avoidance becomes the norm
- no accountability structure to change the pattern
Scenario C: Communities repeating “tough” responses
A community responds to social problems with harsher punishments without addressing root causes, then repeats the same policy when problems persist.
What is happening:
- availability heuristic: vivid incidents drive decisions (Tversky & Kahneman, 1974)
- moral outrage increases certainty
- political incentives reward “toughness”
- complex interventions feel slow and uncertain
Scenario D: Political mistakes, the cycle version
A political system repeats patterns like:
- scapegoating outgroups during economic stress
- overpromising quick fixes to structural issues
- underinvesting in long-term prevention because benefits are delayed
- doubling down on failing policies to avoid admitting error
These patterns are not tied to one country or one ideology. They are predictable outcomes of identity, incentives, and social psychology.
5) The Emotional Side: Why “Bad” Decisions Can Feel Good
Collective mistakes often persist because they serve emotional and social functions:
- they reduce uncertainty quickly
- they provide a coherent narrative
- they create belonging and shared purpose
- they offer a villain, which simplifies complex pain
Under stress, humans are more likely to perceive patterns and accept strong narratives that restore a sense of control (Whitson & Galinsky, 2008; van Prooijen & Acker, 2015).
That is why repeated collective mistakes often spike during:
- economic instability
- rapid cultural change
- crises and perceived threats
6) The “Different Result” Illusion: Why We Expect Change Without Change
People often expect different outcomes because one of these has shifted:
- the leader changed
- the slogan changed
- the enemy changed
- the context feels urgent
But the underlying mechanisms did not change:
- incentives
- institutions
- information flows
- accountability systems
- feedback loops
- the group’s ability to tolerate nuance
So the same decision architecture produces the same failures.
7) How to Break the Cycle: Practical Steps for Individuals and Groups
This is the part that matters. You cannot control entire societies alone. But you can reduce your personal contribution to collective mistakes, influence your teams and communities, and vote and participate with more clarity.
Step-by-step guide: Decision Hygiene Against Repeated Mistakes
Step 1: Identify the repeating pattern, not the current argument
Write:
- What keeps happening?
- What do we keep choosing?
- What do we keep ignoring?
Example:
Instead of “this candidate,” identify the pattern: “We keep choosing high-certainty leaders who promise simple solutions and punish dissent.”
Step 2: Name the psychological driver
Use this short checklist:
- Are we following social proof?
- Are we afraid of being isolated?
- Are we in sunk cost mode?
- Are we protecting identity?
- Are we mistaking repetition for truth?
- Are we moralizing complexity into good versus evil?
Simply naming the driver reduces its invisibility.
Step 3: Create one protected dissenter role
Dissent breaks conformity effects. Even one sincere dissenter reduces group error rates in classic conformity setups (Asch, 1956).
In teams and communities, do not appoint a performative devil’s advocate. That often fails because it is not real dissent. Instead, rotate a genuine “critical friend” whose job is:
- to ask for disconfirming evidence
- to offer alternative explanations
- to surface risks without punishment
Step 4: Run a premortem
A premortem asks: “It is one year from now, and this decision failed. What caused it?” This method helps groups surface risks they are reluctant to mention (Klein, 2007).
Use it for:
- policy support
- major purchases
- community campaigns
- workplace initiatives
Step 5: Force a “base rate” check
Base rates are the boring statistical background of how often outcomes happen. People ignore them and rely on vivid stories instead (Tversky & Kahneman, 1974).
Ask:
- “How often does this approach work historically?”
- “What happened the last time a similar approach was tried?”
- “What do comparable places show?”
This matters in politics because one-off stories and anecdotes dominate discourse.
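A quick worked example helps. The sketch below, in Python with made-up numbers, uses Bayes’ rule to show how much a single vivid success story should actually move a judgment when the background success rate is low:

```python
def posterior(base_rate, evidence_if_works, evidence_if_fails):
    """Bayes' rule: P(program works | glowing early report)."""
    p_evidence = (base_rate * evidence_if_works
                  + (1 - base_rate) * evidence_if_fails)
    return base_rate * evidence_if_works / p_evidence

# Made-up numbers: similar programs have worked 15% of the time.
# A glowing early report appears for 80% of programs that work,
# but also for 40% of programs that ultimately fail.
print(f"{posterior(0.15, 0.80, 0.40):.0%}")  # roughly 26%
```

The vivid report moves the estimate from 15% to roughly 26%, not to “it obviously works.” That gap is exactly what base rate neglect erases.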
Step 6: Separate identity from evaluation
Try this sentence:
- “I can belong to my group and still evaluate a policy honestly.”
This is psychologically hard, but it is the skill that protects democracies. Identity-protective cognition is powerful, and the antidote is building an identity around truth-seeking rather than tribe-defending (Kahan, 2013).
Step 7: Build accountability into decisions
Accountability reduces many judgment errors when it is designed well, especially when people expect to justify their reasoning process, not just the outcome (Tetlock, 1992; Lerner & Tetlock, 1999; Simonson & Staw, 1992).
Practical version:
- record your predictions and reasons
- set check-in dates
- decide in advance what evidence would trigger a change of course
Step 8: Track predictions to defeat hindsight bias
When people do not track forecasts, they rewrite history and fail to learn (Fischhoff, 1975).
Try:
- “If we do X, we expect Y within Z months.”
Then return later and ask:
- “What actually happened?”
This is one of the simplest anti-repeat tools.
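One possible shape for such a forecast log, sketched in Python with hypothetical fields and entries (this is not a standard tool, just one way to structure the habit from Steps 7 and 8):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Prediction:
    """One line in a group's forecast log."""
    claim: str                     # "If we do X, we expect Y within Z months"
    confidence: float              # 0.0-1.0, stated before the outcome
    check_by: date                 # review date, agreed in advance
    came_true: bool | None = None  # filled in at review, never rewritten

log = [
    Prediction("New review process cuts critical bugs by half within 6 months",
               confidence=0.7, check_by=date(2026, 6, 1)),
]

# At the review date, record what actually happened.
log[0].came_true = False

# Simple calibration check across resolved predictions.
resolved = [p for p in log if p.came_true is not None]
print(f"{sum(p.came_true for p in resolved)}/{len(resolved)} predictions came true")
```

The design point is the `came_true` field: it starts empty and gets filled in exactly once, so hindsight has nothing to rewrite.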
Step 9: Reduce information cascades by diversifying inputs
Information cascades thrive in narrow networks. Build an information diet that includes:
- high-quality sources with different perspectives
- long-form explanations over viral clips
- data over slogans
This reduces the chance you are simply following crowd cues.
Step 10: Choose slow thinking moments on purpose
When emotions are high, “fast thinking” dominates and errors increase (Kahneman, 2011).
Create a rule:
- do not share or decide immediately after outrage
- wait one hour, or one day for major decisions
- discuss when calmer
This single habit reduces the costly collective spread of bad ideas.
8) Political Mistake Prevention: Practical Guidance for Everyday Citizens
You cannot personally fix political incentives overnight, but you can reduce your vulnerability to repeated collective errors.
A voter’s decision hygiene checklist
Before strongly supporting a claim or policy, ask:
- What problem is it solving, specifically?
- What is the proposed mechanism, and how does it create the result?
- What trade-offs are being hidden?
- What evidence would change my mind?
- Who benefits if this fails, and who pays the cost?
- Is my certainty coming from evidence, or from identity and emotion?
These questions are not cynical. They are pro-democracy.
How to talk without making things worse
When polarization is high, humiliating people usually strengthens defensiveness. Focus on:
- shared goals
- concrete outcomes
- uncertainty tolerance
- process accountability
If you can move a conversation from “Are you good or evil?” to “What would count as evidence?”, you have already improved the decision environment.
Conclusion: We Repeat Mistakes Together Because We Are Human Together
Repeating mistakes collectively is not a sign that people are inherently foolish. It is a sign that human brains are social, emotional, and efficiency-driven. We use shortcuts. We protect belonging. We defend identity. We follow crowds under uncertainty.
Politics makes all of this more intense because the stakes are high, the feedback is delayed, and the incentives reward certainty.
The hopeful part is that collective errors are not random. They are patterned. That means they are interruptible.
When you practice decision hygiene, protect dissent, track predictions, and separate identity from evaluation, you are not just improving your own thinking. You are improving the environment that everyone is thinking inside.
References
- Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.
- Asch, S. E. (1956). Studies of independence and conformity: A minority of one against a unanimous majority. Psychological Monographs: General and Applied, 70(9), 1–70.
- Bikhchandani, S., Hirshleifer, D., & Welch, I. (1992). A theory of fads, fashion, custom, and cultural change as informational cascades. Journal of Political Economy, 100(5), 992–1026.
- Christensen-Szalanski, J. J. J., & Willham, C. F. (1991). The hindsight bias: A meta-analysis. Organizational Behavior and Human Decision Processes, 48(1), 147–168.
- Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002.
- Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288–299.
- Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. Pantheon Books.
- Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107–112.
- Iyengar, S., Sood, G., & Lelkes, Y. (2012). Affect, not ideology: A social identity perspective on polarization. Public Opinion Quarterly, 76(3), 405–431.
- Iyengar, S., Lelkes, Y., Levendusky, M., Malhotra, N., & Westwood, S. J. (2019). The origins and consequences of affective polarization in the United States. Annual Review of Political Science, 22, 129–146.
- Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Houghton Mifflin.
- Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 8(4), 407–424.
- Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
- Klein, G. (2007). Performing a project premortem. Harvard Business Review.
- Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
- Lerner, J. S., & Tetlock, P. E. (1999). Accounting for the effects of accountability. Psychological Bulletin, 125(2), 255–275.
- Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34(2), 57–74.
- Moscovici, S., & Zavalloni, M. (1969). The group as a polarizer of attitudes. Journal of Personality and Social Psychology, 12(2), 125–135.
- Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
- Noelle-Neumann, E. (1974). The spiral of silence: A theory of public opinion. Journal of Communication, 24(2), 43–51.
- Prentice, D. A., & Miller, D. T. (1993). Pluralistic ignorance and alcohol use on campus: Some consequences of misperceiving the social norm. Journal of Personality and Social Psychology, 64(2), 243–256.
- Simonson, I., & Staw, B. M. (1992). Deescalation strategies: A comparison of techniques for reducing commitment to losing courses of action. Journal of Applied Psychology, 77(4), 419–426.
- Staw, B. M. (1976). Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action. Organizational Behavior and Human Performance, 16(1), 27–44.
- Sunstein, C. R. (2009). Going to extremes: How like minds unite and divide. Oxford University Press.
- Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755–769.
- Tetlock, P. E. (1992). The impact of accountability on judgment and choice: Toward a social contingency model. Advances in Experimental Social Psychology, 25, 331–376.
- Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
- van Prooijen, J. W., & Acker, M. (2015). The influence of control on belief in conspiracy theories: Conceptual and applied extensions. Applied Cognitive Psychology, 29(5), 753–761.
- Whitson, J. A., & Galinsky, A. D. (2008). Lacking control increases illusory pattern perception. Science, 322(5898), 115–117.

