The hardest bugs to fix are the ones in your own reasoning. You can write flawless C, design elegant circuits, and still make terrible engineering decisions because of a cognitive bias you did not notice or a statistical mistake you did not catch. This course is your debugger for thinking itself. Every lesson uses real engineering scenarios, from code reviews to design meetings to test reports, so the concepts stick where they matter most.
Why Critical Thinking for Engineers?
Engineers make high-stakes decisions under uncertainty every day. Which architecture should we choose? Is this test data sufficient to ship? Should we rewrite this module or keep patching it? The quality of those decisions depends not just on technical knowledge, but on how clearly you reason about the evidence in front of you.
Better Technical Arguments
Learn to recognize when a colleague’s argument (or your own) relies on a logical fallacy rather than evidence. Spot straw man arguments in code reviews, false dichotomies in architecture debates, and appeals to popularity when someone invokes a framework’s adoption numbers instead of its fit.
Bias-Resistant Decisions
Understand the cognitive shortcuts your brain takes automatically, and learn when those shortcuts help and when they mislead. Confirmation bias, sunk cost fallacy, survivorship bias, and anchoring affect engineering decisions far more often than most engineers realize.
Statistical Literacy
Stop misinterpreting p-values, drawing conclusions from tiny sample sizes, and confusing correlation with causation. Whether you are evaluating test results, reading research papers, or analyzing sensor data, statistical reasoning is a core engineering skill.
Honest Data Presentation
Learn how charts and data visualizations can mislead, intentionally or accidentally. Then learn how to present your own data clearly and honestly so your audience can trust your conclusions.
Course Structure
Each lesson follows a consistent pattern:
The Concept
A clear explanation of the thinking error, bias, or statistical pitfall, grounded in how the human brain actually works.
Engineering Examples
Real scenarios from debugging sessions, design reviews, test reports, technical forums, and project planning where the concept shows up.
Recognition Checklist
Concrete signs that you or someone else is falling into the trap, so you can catch it in real time.
Countermeasures
Practical techniques to mitigate the error in your own thinking and in team decision-making.
Exercises
Scenarios to practice identifying and countering each thinking error.
Lessons
Lesson 1: How Your Brain Tricks You
Explore System 1 (fast, intuitive) and System 2 (slow, deliberate) thinking from Kahneman’s research. Learn when engineering intuition saves you and when it leads you astray through anchoring, the availability heuristic, and substitution.
Lesson 2: Logical Fallacies in Technical Arguments
A field guide to the 15+ fallacies most common in engineering contexts: ad hominem, false dichotomy, appeal to authority, straw man, appeal to common practice, nirvana fallacy, and more. Each comes with a real engineering example and a countermeasure.
Lesson 3: Cognitive Biases in Engineering Decisions
Confirmation bias, survivorship bias (with Wald’s airplane armor story), sunk cost fallacy, the Dunning-Kruger effect, hindsight bias, bandwagon effect, and anchoring. How these biases shaped engineering disasters and everyday project failures.
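Survivorship bias is easy to demonstrate in a few lines of code. The sketch below uses an invented fleet of components whose early failures never reach the maintenance logs; all numbers are hypothetical.

```python
import random

random.seed(42)

# Hypothetical fleet: each unit's lifetime (hours) is drawn uniformly.
# Units that fail before the 1000-hour audit are scrapped and never
# appear in the logs an analyst would later study.
lifetimes = [random.uniform(200, 3000) for _ in range(10_000)]

true_mean = sum(lifetimes) / len(lifetimes)
survivors = [t for t in lifetimes if t >= 1000]
survivor_mean = sum(survivors) / len(survivors)

print(f"True mean lifetime:   {true_mean:7.1f} h")
print(f"Mean among survivors: {survivor_mean:7.1f} h")
# The survivor-only mean overstates reliability because the early
# failures were filtered out before anyone measured them.
```

Like Wald’s bombers, the data you can see is not a random sample of the data that exists.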
Lesson 4: Statistics Done Wrong
Inspired by Alex Reinhart’s book of the same name. P-value misinterpretation, p-hacking, small sample sizes, overfitting, confounding variables, base rate neglect, and the multiple comparisons problem. “We ran the test 3 times and it passed. Ship it.” Why that reasoning fails.
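To make the “three passing runs” trap concrete, here is a back-of-the-envelope calculation. The 20% per-run failure probability is an invented figure for illustration.

```python
# Suppose a race condition makes a test fail 20% of the time
# (hypothetical figure). How likely are 3 clean runs anyway?
p_fail = 0.20
runs = 3

p_all_pass = (1 - p_fail) ** runs
print(f"P(all {runs} runs pass) = {p_all_pass:.3f}")  # 0.512

# How many consecutive passes before we are 95% sure we would
# have seen at least one failure?
runs_needed = 0
p_miss = 1.0
while p_miss > 0.05:
    p_miss *= 1 - p_fail
    runs_needed += 1
print(f"Runs needed for 95% confidence: {runs_needed}")  # 14
```

Three passes against an intermittent bug is roughly a coin flip, not grounds to ship.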
Lesson 5: How to Lie with Charts and Data
Inspired by Darrell Huff’s How to Lie with Statistics and modern data visualization research. Truncated axes, cherry-picked windows, misleading scales, 3D pie charts, dual y-axes, pictogram abuse, and Simpson’s paradox. Learn to spot deception and present your own data honestly.
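One of these tricks, the truncated y-axis, can be quantified without drawing anything. The request rates below are made-up numbers.

```python
# Two servers handle 98 and 103 requests/s: a real difference of ~5%.
a, b = 98.0, 103.0

def apparent_ratio(low, high, baseline):
    """How many times taller the higher bar *looks* when the
    y-axis is truncated to start at `baseline`."""
    return (high - baseline) / (low - baseline)

print(f"Axis from 0:  bar looks {apparent_ratio(a, b, 0):.2f}x taller")   # 1.05x
print(f"Axis from 95: bar looks {apparent_ratio(a, b, 95):.2f}x taller")  # 2.67x
```

Same data, same chart type; only the baseline changed, and a 5% gap now reads as nearly triple.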
Lesson 6: Estimation, Uncertainty, and Confidence
Fermi estimation, error bars, significant figures, false precision, error propagation, confidence intervals, and measurement uncertainty budgets. “How sure are you?” should be a question you ask about every number.
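As a taste of error propagation, here is the standard quadrature rule applied to a power measurement. The current and resistance values, and their uncertainties, are hypothetical.

```python
import math

# Hypothetical measurement: I = 2.00 +/- 0.05 A through R = 100 +/- 2 ohm.
I, dI = 2.00, 0.05
R, dR = 100.0, 2.0

P = I**2 * R  # dissipated power, watts

# For P = I^2 * R, relative uncertainties add in quadrature, and the
# exponent multiplies each relative term: (dP/P)^2 = (2*dI/I)^2 + (dR/R)^2
rel = math.sqrt((2 * dI / I) ** 2 + (dR / R) ** 2)
dP = P * rel

print(f"P = {P:.0f} +/- {dP:.0f} W")  # P = 400 +/- 22 W
```

Notice that the 2.5% current uncertainty dominates because it enters squared: knowing which term drives the budget tells you which instrument is worth upgrading.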
Lesson 7: Correlation, Causation, and Evidence
“We added a capacitor and the bug went away” does not mean the capacitor fixed the bug. Post hoc fallacy, confounding variables, controlled experiments, A/B testing, and the hierarchy of evidence.
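A confounder can be simulated in a dozen lines. In this invented example, ambient temperature drives both a building’s cooling costs and a server room’s throttling events; neither causes the other, yet they correlate strongly.

```python
import math
import random

random.seed(0)

# Hidden common cause: daily ambient temperature (degrees C).
temp = [random.gauss(25, 5) for _ in range(1000)]

# Two effects that share the cause but do not influence each other.
cooling_cost = [3.0 * t + random.gauss(0, 4) for t in temp]
throttle_events = [0.5 * t + random.gauss(0, 1) for t in temp]

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

r = pearson(cooling_cost, throttle_events)
print(f"Correlation = {r:.2f}")  # strong, despite no causal link
```

A naive analyst might conclude that air conditioning causes CPU throttling; controlling for temperature would make the relationship vanish.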
Lesson 8: Debugging as Scientific Reasoning
Debugging IS the scientific method applied to code and hardware. Hypothesis-driven debugging, binary search on hypotheses, rubber-duck debugging as the Socratic method, and common debugging fallacies.
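The “binary search on hypotheses” idea is what git bisect automates. A minimal sketch, assuming the history flips from good to bad exactly once:

```python
def first_bad(commits, is_bad):
    """Binary search for the first commit where a regression appears.
    Assumes every commit after the culprit is also bad."""
    lo, hi = 0, len(commits) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(commits[mid]):
            hi = mid          # culprit is at mid or earlier
        else:
            lo = mid + 1      # culprit is strictly after mid
    return commits[lo]

# Toy history of 100 commits; the (hypothetical) regression landed at 67.
history = list(range(100))
print(first_bad(history, lambda c: c >= 67))  # 67, found in ~7 tests
```

Each test run eliminates half the remaining hypotheses, which is exactly why a disciplined hypothesis list beats rerunning the same experiment.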
Lesson 9: Making Better Engineering Decisions
Decision matrices, trade-off analysis, premortems, checklists, reversible vs. irreversible decisions, and NASA’s lessons from the Challenger disaster. A practical framework for every engineering choice.
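A decision matrix fits in a few lines of code. Every number below, options, weights, and scores alike, is invented for illustration; the value is in committing to the weights before you score.

```python
# Criterion weights the team agreed on beforehand (must sum to 1).
weights = {"throughput": 0.40, "ops_burden": 0.35, "ecosystem": 0.25}

# Scores from 1 (worst) to 5 (best) -- illustrative numbers only.
options = {
    "Option A": {"throughput": 5, "ops_burden": 2, "ecosystem": 5},
    "Option B": {"throughput": 3, "ops_burden": 4, "ecosystem": 4},
    "Option C": {"throughput": 4, "ops_burden": 5, "ecosystem": 3},
}

def weighted_score(scores):
    """Sum of weight * score across all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in sorted(options.items(),
                           key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```

Fixing the weights before scoring makes it harder to quietly anchor the matrix on a favorite option.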
Who Is This Course For?
No prerequisites
This course does not require any specific technical background. It is useful for:
Hardware engineers who make architecture and component selection decisions
Software engineers who participate in code reviews and design discussions
Graduate students and researchers who interpret experimental data and read papers
Engineering managers who evaluate proposals and make resource allocation decisions
Anyone who wants to think more clearly about technical problems
The examples lean toward engineering and technology, but the principles apply universally. If you have ever lost an argument you should have won, or won an argument you should have lost, this course will help you understand why.
Recommended Reading
These books inspired the course and are excellent companions: