
Reasoning Cheat Sheet

Your brain is a heuristic engine optimized for survival, not correctness.

Daniel Kahneman’s framework for how humans think.

| System 1 | System 2 |
| --- | --- |
| Fast, automatic, effortless | Slow, deliberate, effortful |
| Pattern-matching, intuitive | Logical, analytical |
| Always running | Lazy — engages only when prompted |
| Prone to systematic errors | Can catch errors (if activated) |
| “This feels like a database bug” | “Let me profile before assuming” |

Most reasoning errors come from System 1 answering a question that System 2 should handle. Experience makes this worse — experts develop stronger System 1 patterns, which fail silently when context shifts.

The test: When you feel confident about a decision, ask: “Am I pattern-matching or actually reasoning through this?”

Errors in the structure of an argument. Fallacies make reasoning unreliable regardless of whether the conclusion happens to be true.

Premises that don’t connect to the conclusion.

| Fallacy | Definition | Technical Example | Antidote |
| --- | --- | --- | --- |
| Ad hominem | Attack the person, not the argument | “This junior’s review comments aren’t worth considering” | Evaluate arguments on merit, regardless of source |
| Appeal to authority | Accept claims because an authority says so | “We should use microservices because Google does” | Verify with evidence; context matters more than credentials |
| Strawman | Misrepresent a position, then attack that | “You want NoSQL? You don’t care about data consistency at all?” | Restate the actual position before responding |
| Red herring | Introduce irrelevant information | During a security review: “We should really refactor this module first” | Keep discussions scoped; track tangents separately |
| Tu quoque | Deflect by pointing out the critic’s flaws | “You can’t criticize my tech debt when your module has hardcoded secrets” | Both issues can be valid simultaneously |

Arguments that assume what they’re trying to prove.

| Fallacy | Definition | Technical Example | Antidote |
| --- | --- | --- | --- |
| False dilemma | Present only two options when more exist | “We either rewrite everything or accept permanent tech debt” | Ask “What else?” to expand the solution space |
| Sunk cost | Continue investing because of past costs | “We’ve spent 6 months on this custom auth — we can’t switch to OAuth now” | “If starting today, what would we choose?” |
| Slippery slope | Small step must lead to extreme consequences | “If we allow one day remote, nobody will ever come to the office” | Require evidence for each step in the chain |
| Begging the question | Assume the conclusion in the premise | “This architecture is scalable because it’s designed to handle growth” | Ensure premises are independently verifiable |
| Nirvana fallacy | Reject solutions that aren’t perfect | “This cache doesn’t solve 100% of performance issues, so skip it” | Compare to realistic alternatives, not ideals |

Insufficient evidence for the conclusion.

| Fallacy | Definition | Technical Example | Antidote |
| --- | --- | --- | --- |
| Hasty generalization | Broad conclusion from small sample | “Tested on Chrome and Safari — works everywhere” | Distinguish exploratory from confirmatory testing |
| Post hoc ergo propter hoc | Sequence implies causation | “We deployed yesterday, traffic dropped today — the feature caused it” | Correlation is not causation; control for confounders |
| Anecdotal evidence | Personal experience over systematic data | “I shipped without tests once and it was fine, so testing is overrated” | Prioritize data over single cases |
| Texas sharpshooter | Cherry-pick data that supports your theory | Reporting sprint velocity only from successful sprints | Pre-define success criteria before collecting data |
| Survivorship bias | Focus on winners, ignore the failures | “Startups use MongoDB, so it must be the right choice” | Actively seek and study failure cases |

Arguments that exploit unclear meaning.

| Fallacy | Definition | Technical Example | Antidote |
| --- | --- | --- | --- |
| Equivocation | Same term, different meanings within an argument | Using “testing” to mean unit tests here and QA there | Define terms explicitly at the start |
| Composition | What’s true for parts must be true for whole | “Each team member is skilled, so the team will be high-performing” | Test assumptions about wholes separately |
| Division | What’s true for whole must be true for parts | “Our company is innovative, so every employee must be innovative” | Evaluate parts independently |
Other fallacies common in technical discussions.

| Fallacy | Definition | Technical Example | Antidote |
| --- | --- | --- | --- |
| Argument from fallacy | Dismiss a conclusion because the argument is flawed | “Your caching argument has a flaw, so caching is wrong” | A bad argument doesn’t make the conclusion false |
| Moving goalposts | Change acceptance criteria after they’re met | “The feature works, but now it also needs to handle edge case X” | Define criteria in writing before starting |
| Cargo cult | Copy patterns without understanding why they work | Implementing microservices because successful companies use them | Understand the problem before copying solutions |
| Appeal to novelty | New must be better | Adopting the latest framework without evaluating fit | Assess tools against specific needs |

Systematic deviations in judgment. Unlike fallacies (errors in arguments), biases are errors in perception — your brain distorts input before reasoning even begins.

| Bias | Definition | Engineering Example | Debiasing Technique |
| --- | --- | --- | --- |
| Confirmation bias | Seek evidence that confirms; dismiss what contradicts | Debugging: “Must be the database” — check only DB logs | Seek disconfirming evidence first |
| Anchoring | First number dominates subsequent estimates | “2 weeks” becomes the baseline no matter what you learn later | Delay estimates until requirements are explored |
| Availability heuristic | Overweight recent or vivid events | Over-prioritizing edge cases from last week’s outage | Check metrics, not memory |
| Framing effect | Decision changes based on how options are presented | “Saves 20 hours/week” accepted; “Risks 4 hours downtime” rejected — same project | Present as both gains and losses |
| Loss aversion | Losses hurt twice as much as equivalent gains | Feature work over refactoring — features feel like gains, debt prevention doesn’t | Reframe: “invest $400K now vs face $2M rewrite later” |
| Bias | Definition | Engineering Example | Debiasing Technique |
| --- | --- | --- | --- |
| Planning fallacy | Underestimate time by simulating best-case scenarios | “2 days” for a task that historically takes 5 | Reference class forecasting from past actuals |
| Optimism bias | Believe bad outcomes are less likely for you | “Our migration will go smoothly” (most don’t) | Track estimation accuracy; calibrate over time |
| Dunning-Kruger | Low skill overestimates; high skill underestimates | Junior: “I know React” after one tutorial. Senior: imposter syndrome | Seek objective skill assessments |
| Normalcy bias | Dismiss warnings because things have been fine so far | Ignoring alerts because “it’s probably nothing” | Analyze near-misses; treat warnings as data |
| Bias | Definition | Engineering Example | Debiasing Technique |
| --- | --- | --- | --- |
| Authority bias | Overvalue opinions from hierarchy top | Junior spots bug in senior’s code but approves anyway | Anonymous review; explicitly value merit over rank |
| Bandwagon effect | Adopt beliefs because others hold them | “Everyone is moving to Kubernetes, so we should too” | Evaluate against specific needs, not popularity |
| Groupthink | Consensus-seeking suppresses dissent | No one challenges the tech lead’s architecture proposal | Assign devil’s advocate; require documented dissent |
| Fundamental attribution error | Blame people, not systems | “Alice caused the outage” vs “Missing guardrails caused the outage” | Ask “what system failed?” before “who failed?” |
| Bias | Definition | Engineering Example | Debiasing Technique |
| --- | --- | --- | --- |
| Hindsight bias | “I knew it all along” after the fact | Post-incident: “We always had concerns” (but no one raised them before) | Document predictions before outcomes |
| Outcome bias | Judge decision quality by result, not process | Criticizing an architecture that failed due to unforeseeable market shift | Evaluate based on what was known at decision time |
| IKEA effect | Overvalue what you built yourself | Defending your code design against valid criticism | Separate creation from evaluation |
| Choice-supportive bias | Remember past choices as better than they were | Recalling only benefits of a past tech choice, forgetting the pain | Keep decision records; review them honestly |

Biases rarely act alone. Common combinations:

| Combination | Context | Effect |
| --- | --- | --- |
| Confirmation bias + anchoring | Estimation | Initial estimate sticks; team seeks only confirming data |
| Availability heuristic + recency bias | Debugging | Over-investigate causes similar to last incident |
| Authority bias + groupthink | Architecture | Senior’s proposal rubber-stamped without critique |
| Loss aversion + sunk cost | Tech debt | Keep broken approach because change feels like loss |
| IKEA effect + choice-supportive bias | Code review | Resist refactoring code you wrote |

Knowledge of biases doesn’t prevent them — they operate below conscious awareness. You need systems, not just understanding.

Premortem: imagine the project has already failed, then work backward to identify why.

  1. Individual brainstorm: list 5-10 reasons for failure
  2. Round-robin sharing (no debate)
  3. Categorize: technical, organizational, external, assumptions
  4. Identify early warning signals for top risks

Increases risk identification accuracy by 30%. Kahneman calls it his favorite debiasing technique.
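The aggregation steps (2 and 3) can be sketched as a small script. The entries, names, and category labels below are hypothetical examples following the four buckets above:

```python
from collections import defaultdict

# Hypothetical premortem entries: (author, failure reason, category).
# Categories follow step 3: technical, organizational, external, assumptions.
entries = [
    ("ana", "Vendor API rate limits block the migration", "external"),
    ("ben", "We assumed traffic stays flat through Q3", "assumptions"),
    ("ana", "No one owns the rollback plan", "organizational"),
    ("chris", "Schema change locks the users table", "technical"),
]

def categorize(entries):
    """Group failure reasons by category (step 3), dropping authorship
    so the shared summary stays debate-free (step 2)."""
    buckets = defaultdict(list)
    for _author, reason, category in entries:
        buckets[category].append(reason)
    return dict(buckets)

buckets = categorize(entries)
for category, reasons in sorted(buckets.items()):
    print(f"{category}: {len(reasons)} risk(s)")
```

The top categories by count are the ones that need early warning signals (step 4).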

Reference class forecasting: base estimates on actual outcomes from similar past work, not the inside view of the current project.

Last 5 API integrations: 3, 4, 6, 3, 5 weeks
Average: 4.2 weeks
Your estimate for the next one: start at 4.2, adjust for specifics

Counteracts planning fallacy and optimism bias.
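The calculation is a one-liner, but wrapping it in a helper makes the adjustment step explicit rather than implicit. The durations are the hypothetical figures from the example above:

```python
# Reference class forecasting: start from the historical average for
# similar work (the outside view), then make any adjustment explicit.
past_actuals_weeks = [3, 4, 6, 3, 5]  # last 5 API integrations

def reference_class_estimate(actuals, adjustment_weeks=0.0):
    """Baseline from past actuals, plus a justified, written-down
    adjustment for specifics of the current project."""
    baseline = sum(actuals) / len(actuals)
    return baseline + adjustment_weeks

print(f"{reference_class_estimate(past_actuals_weeks):.1f} weeks")  # 4.2 weeks
```

Forcing the adjustment through a named parameter means any deviation from the historical baseline has to be argued for, not assumed.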

Steelmanning: present the strongest version of an opposing argument before responding.

Dennett’s protocol:

  1. Re-express the position so clearly the holder says “I wish I’d put it that way”
  2. List points of agreement
  3. Mention what you learned from their position
  4. Only then offer rebuttal

Prevents strawman fallacy and builds collaborative truth-seeking.

Decision journal: record decisions with reasoning, predictions (with confidence levels), and expected outcomes. Review quarterly.

Date: 2026-02-21
Decision: Use PostgreSQL over MongoDB for user service
Reasoning: Access patterns are relational joins; ACID needed
Prediction: Read latency <50ms, write throughput >1000 ops/sec (70% confident)
Revisit: 2026-05-21

Creates a feedback loop that calibrates judgment over time.
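Keeping the journal as structured data makes the quarterly review mechanical. A minimal sketch in Python — the field names and the Brier-score review are illustrative choices, not part of the template above:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class DecisionRecord:
    # Fields mirror the journal template; names are illustrative.
    decided_on: date
    decision: str
    reasoning: str
    prediction: str
    confidence: float            # 0.0-1.0, stated at decision time
    revisit_on: date
    outcome_correct: Optional[bool] = None  # filled in at the review

def brier_score(records: List[DecisionRecord]) -> Optional[float]:
    """Calibration over reviewed records: 0.0 is perfect;
    always guessing 50% scores 0.25."""
    reviewed = [r for r in records if r.outcome_correct is not None]
    if not reviewed:
        return None
    return sum((r.confidence - (1.0 if r.outcome_correct else 0.0)) ** 2
               for r in reviewed) / len(reviewed)

journal = [DecisionRecord(
    decided_on=date(2026, 2, 21),
    decision="Use PostgreSQL over MongoDB for user service",
    reasoning="Access patterns are relational joins; ACID needed",
    prediction="Read latency <50ms, write throughput >1000 ops/sec",
    confidence=0.7,
    revisit_on=date(2026, 5, 21),
)]
```

A falling Brier score over successive quarters is direct evidence that your confidence levels are becoming better calibrated.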

Checklists: brief reminders of essential steps that experts might skip. Not procedures — guardrails.

Get “dumb stuff out of the way so the brain can concentrate on the hard stuff.”

Consider the opposite: generate 2-3 reasons you might be wrong. Research shows two counterarguments are effective; ten are counterproductive (the difficulty of generating them makes you more confident).

Blameless postmortems: focus on what happened and how systems failed, not who made mistakes. Use “what” questions (“What was your understanding?”) instead of “why” questions (“Why did you deploy without testing?”). This reveals systemic factors: time pressure, unclear requirements, missing guardrails.

| Technique | Counters | How It Works |
| --- | --- | --- |
| Delphi method | Authority bias, groupthink, bandwagon | Anonymous rounds of independent evaluation; converge through iteration |
| Silent voting first | Anchoring, authority bias | Everyone commits a position before discussion |
| Devil’s advocate | Groupthink, confirmation bias | Assign someone to argue against the prevailing view |
| Cognitive diversity | Shared blind spots | Assemble diverse thinking styles, not just demographics |
| Psychological safety | Self-censorship, authority bias | People must feel safe to dissent; Google found it was the #1 factor in team performance |
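A silent-voting round can be sketched in a few lines: collect committed estimates before any discussion, then share back only the anonymized distribution so no single voice can anchor the group. Names and numbers are hypothetical:

```python
import statistics

# Everyone commits an estimate (in weeks) before discussion starts.
committed = {"reviewer_1": 5, "reviewer_2": 12, "reviewer_3": 6, "reviewer_4": 7}

def anonymized_summary(estimates):
    """Return the spread without attributing estimates to people,
    so seniority and volume cannot anchor the group."""
    values = sorted(estimates.values())
    return {"low": values[0],
            "median": statistics.median(values),
            "high": values[-1]}

print(anonymized_summary(committed))  # {'low': 5, 'median': 6.5, 'high': 12}
```

In a Delphi-style process you would run this for several rounds, letting outliers explain their reasoning anonymously between rounds until estimates converge.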
| Context | Common Fallacies | Common Biases |
| --- | --- | --- |
| Code review | Ad hominem, strawman, appeal to authority | Authority bias, IKEA effect, bikeshedding |
| Architecture | False dilemma, cargo cult, appeal to novelty | Groupthink, anchoring, survivorship bias |
| Estimation | Nirvana fallacy, false dilemma | Planning fallacy, anchoring, optimism bias |
| Debugging | Post hoc, hasty generalization | Confirmation bias, availability, recency bias |
| Incident response | Red herring, tu quoque | Fundamental attribution error, hindsight bias |
| Hiring | Ad hominem, anecdotal evidence | Confirmation bias, representativeness heuristic |

  • Before deciding: What evidence would change my mind? (confirmation bias)
  • Before estimating: What did similar past work actually take? (planning fallacy)
  • Before debating: Can I state the opposing view so its holder would agree? (strawman)
  • After an incident: What system failed, not who failed? (attribution error)
  • When confident: Am I pattern-matching or reasoning? (System 1 vs 2)

“The first principle is that you must not fool yourself — and you are the easiest person to fool.” — Richard Feynman

“It is difficult to get a man to understand something when his salary depends on his not understanding it.” — Upton Sinclair

“We don’t see things as they are, we see them as we are.” — Anaïs Nin

“The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.” — Daniel Boorstin

  • Thinking — Mental models and systems thinking
  • Problem Solving — Structured approaches; traps section covers complexity bias and tunnel vision
  • Complexity — Essential vs accidental; the paradox of fighting complexity with complexity
  • Daniel Kahneman — Thinking, Fast and Slow (2011)
  • Philip Tetlock — Superforecasting (2015)
  • Atul Gawande — The Checklist Manifesto (2009)
  • Donella Meadows — Thinking in Systems (2008)
  • Gary Klein — Sources of Power (1998)
  • Buster Benson — “Cognitive Bias Codex” (2016)