The Best Examples of Business Decisions from 'Thinking, Fast and Slow'
Kahneman’s core message is simple: your brain runs on two modes. System 1 is fast, automatic, and emotional. System 2 is slow, effortful, and logical. In business, most bad decisions happen when leaders trust System 1 in situations that really demand System 2.
Here are real examples of business decisions from Thinking, Fast and Slow in action, updated for how companies actually operate today.
1. Overconfident revenue forecasts in startup fundraising
One classic example of a business decision from Thinking, Fast and Slow is overconfidence in forecasting. Founders routinely pitch five‑year revenue projections that show perfect exponential growth. Investors know these numbers are optimistic, yet they still anchor on them.
Kahneman describes the planning fallacy: our tendency to underestimate costs, risks, and timelines even when we know better. In tech startups, this shows up when:
- A SaaS founder assumes every free trial will convert at a high rate because early adopters love the product.
- A hardware startup assumes manufacturing will scale smoothly, ignoring supply chain disruptions and regulatory delays.
- A consumer app team projects viral growth based on a small beta test in a friendly user group.
A better approach, consistent with Kahneman’s recommendations and related research in behavioral economics at places like Harvard Business School, is to use reference class forecasting. Instead of building projections from scratch, leaders compare their company to historical data from similar firms: same industry, stage, geography, and price point.
When founders reframe their pitch using reference classes (for example, conversion rates and churn from comparable public SaaS companies), revenue projections usually drop sharply—but decision quality improves. This is one of the best examples of how slowing down System 1 optimism with System 2 data changes investment decisions.
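The reference-class idea can be sketched in a few lines. This is a minimal illustration, not a real forecasting tool, and the peer growth multiples below are hypothetical: the point is that the forecast is anchored on the observed distribution of comparable companies rather than on an inside-view story.

```python
# Reference class forecasting sketch (illustrative numbers, not real data).
# Instead of building a projection from scratch, anchor next-year revenue on
# the year-over-year growth multiples actually observed in comparable firms.

def reference_class_forecast(current_revenue, peer_growth_multiples):
    """Return pessimistic/median/optimistic next-year revenue based on the
    25th, 50th, and 75th percentile growth multiples seen in peers."""
    multiples = sorted(peer_growth_multiples)
    n = len(multiples)
    return {
        "pessimistic": current_revenue * multiples[n // 4],
        "median": current_revenue * multiples[n // 2],
        "optimistic": current_revenue * multiples[(3 * n) // 4],
    }

# Hypothetical reference class: YoY growth multiples of similar-stage SaaS firms.
peers = [1.2, 1.4, 1.5, 1.7, 1.9, 2.1, 2.4, 3.0]
forecast = reference_class_forecast(1_000_000, peers)
```

Even this toy version forces the System 2 question a pitch deck rarely answers: where does your company sit in the distribution of firms that looked like you?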
2. Mispricing products because of anchoring
Anchoring is another standout example of business decisions from Thinking, Fast and Slow. The first number people see heavily influences what they think is reasonable—even if that number is arbitrary.
In pricing, anchoring happens when:
- An enterprise software company lists a very high “Premium” plan that almost no one is expected to buy, mainly to make the mid‑tier plan look more reasonable.
- A retailer shows a “Was $299, now $149” tag, knowing most shoppers never saw the product at $299, but will feel like $149 is a bargain.
- A consultant quotes an initial high day rate, then “discounts” it during negotiation, even though the final number was the target all along.
Kahneman’s research with Amos Tversky showed that even random numbers can anchor judgments. In business, leaders often anchor on last year’s budget or a competitor’s price without asking whether that number is grounded in value, cost, or customer willingness to pay.
Smart teams counter anchoring by:
- Building value‑based pricing models, not competitor‑based.
- Running A/B tests on price ranges instead of assuming the first guess is close.
- Separating “exploration” pricing experiments from final decisions so early numbers don’t become unexamined anchors.
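The second countermeasure above, testing price ranges rather than trusting the first guess, can be reduced to a simple comparison. The conversion rates here are made-up A/B test results used only to show the logic: the initial anchor is just one candidate among several, and the data decide.

```python
# A minimal sketch of letting test data, not the first anchor, set the price.
# Conversion rates are hypothetical A/B test results, not real figures.

def best_price(test_results):
    """Given {price: conversion_rate}, return the price that maximizes
    expected revenue per visitor (price * conversion rate)."""
    return max(test_results, key=lambda p: p * test_results[p])

# The "first guess" anchor of $49 is included in the test,
# but it only wins if the numbers say so.
results = {29: 0.040, 49: 0.022, 79: 0.015}
choice = best_price(results)
```

In this illustrative data, the highest price point earns the most per visitor despite its lower conversion rate, which is exactly the kind of result an anchored team never discovers.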
In 2024–2025, as more companies adopt dynamic pricing and AI‑driven price recommendations, anchoring still matters. If the first AI‑generated price becomes the default, human reviewers may simply accept it, letting a machine‑generated anchor drive millions in revenue.
3. Hiring based on “gut feeling” instead of structured judgment
One of the best examples of business decisions from Thinking, Fast and Slow is how organizations hire. Kahneman is blunt about unstructured interviews: they’re often little more than confidence theater.
System 1 loves first impressions. Within minutes, interviewers form a story about a candidate: confident, “culture fit,” a “natural leader.” After that, they unconsciously filter everything the candidate says through that early judgment. Kahneman describes this as the halo effect and confirmation bias working together.
In practice, this leads to decisions like:
- Hiring the charismatic candidate who tells great stories but has a thin track record.
- Overrating candidates who share the interviewer’s background, school, or hobbies.
- Underrating candidates who are nervous in conversation but have stellar work samples.
Kahneman advocates for structured decision processes: scorecards, standardized questions, and independent ratings. Research on hiring from universities such as the University of Michigan backs this up: structured interviews predict job performance better than informal chats.
A 2024 twist: AI‑screening tools can encode the same biases at scale if they’re trained on historical hiring data. Leaders who understand Kahneman’s work use System 2 thinking to design hiring processes that:
- Separate evaluation of skills, behaviors, and values.
- Use work samples and job auditions instead of pure talk.
- Audit AI recommendations for bias and unfair patterns.
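A structured scorecard of the kind Kahneman advocates can be as simple as independent per-criterion ratings that are averaged before any group discussion. The criteria, weights, and scores below are illustrative placeholders, not a validated rubric.

```python
# Sketch of a structured hiring scorecard. Each interviewer rates each
# criterion independently (1-5) before any discussion; scores are averaged
# per criterion and combined with pre-agreed weights. All values illustrative.

def score_candidate(ratings, weights):
    """ratings: {criterion: [independent scores]}; weights: {criterion: weight}.
    Returns the weighted average across criteria."""
    total_weight = sum(weights.values())
    weighted = 0.0
    for criterion, scores in ratings.items():
        weighted += weights[criterion] * (sum(scores) / len(scores))
    return weighted / total_weight

ratings = {
    "work_sample": [4, 5, 4],     # graded blind, before the interviews
    "role_skills": [3, 4, 4],
    "collaboration": [4, 3, 3],
}
weights = {"work_sample": 3, "role_skills": 2, "collaboration": 1}
overall = score_candidate(ratings, weights)
```

The design choice that matters most is not the arithmetic but the sequencing: ratings are locked in before discussion, so a charismatic first impression can’t retroactively inflate every criterion.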
4. Sunk cost fallacy in product development
Another powerful example of business decisions from Thinking, Fast and Slow is the sunk cost fallacy—continuing a failing project because you’ve already invested time, money, or reputation.
You see this in:
- A product team that keeps building features for a tool with flat user growth because they’ve “already spent 18 months on it.”
- A media company that keeps funding a show with declining viewership because the first season was expensive.
- A corporation that keeps renewing a legacy software contract even though better alternatives exist, simply because switching would admit past mistakes.
Kahneman’s point: sunk costs are gone. Only future costs and benefits matter. But System 1 hates admitting loss. It prefers to protect ego and avoid regret.
Teams that apply System 2 thinking:
- Run pre‑mortems before big investments—imagining the project has failed and asking why.
- Set clear kill criteria in advance (for example, “If retention is below X% after 6 months, we stop”).
- Separate the people who made the original decision from the people who decide whether to continue.
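Kill criteria are powerful precisely because they are mechanical. A minimal sketch, with hypothetical metric names and thresholds, makes the key design decision explicit: sunk cost is not an input to the function.

```python
# Sketch of pre-committed kill criteria: the continue/stop decision looks
# only at current metrics against thresholds agreed before launch.
# The months and money already invested are deliberately NOT inputs.

def should_continue(metrics, kill_criteria):
    """Continue only if every metric meets its pre-agreed threshold."""
    return all(metrics[name] >= threshold
               for name, threshold in kill_criteria.items())

# Hypothetical thresholds set before the project started.
kill_criteria = {"retention_6mo": 0.25, "weekly_active_users": 5000}

# Current reality: usage is fine, but retention missed the bar.
metrics = {"retention_6mo": 0.18, "weekly_active_users": 7200}
decision = should_continue(metrics, kill_criteria)
```

Because the rule was written down in advance, the team arguing to continue has to argue against its past self, not just against an uncomfortable spreadsheet.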
Organizations like the U.S. Government Accountability Office have documented how ignoring sunk costs leads to massive project overruns in public programs; the same logic applies in private companies. Kahneman’s framework helps leaders cut losses earlier.
5. Loss aversion in pricing, discounts, and negotiations
Loss aversion—the idea that losses hurt more than equivalent gains feel good—is one of the most famous concepts in Thinking, Fast and Slow. It shows up everywhere in business decisions.
Some real examples include:
- Subscription companies framing offers as “Don’t lose your premium features” rather than “Upgrade to get more.”
- Sales teams offering “limited‑time discounts”, triggering fear of missing out more than desire for the product itself.
- Procurement teams rejecting contracts with small, unlikely downside risks, even when the expected value is clearly positive.
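The procurement example in the last bullet is easy to make concrete with an expected-value calculation. The probabilities and payoffs below are hypothetical, chosen only to show how a small, unlikely downside can coexist with a clearly positive expectation.

```python
# Expected-value sketch for the procurement example. A contract with a
# small, unlikely downside can still be strongly positive in expectation.
# Probabilities and payoffs are hypothetical.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs; probabilities sum to 1."""
    return sum(p * payoff for p, payoff in outcomes)

# 95% chance of a $100k gain, 5% chance of a $200k loss.
contract = [(0.95, 100_000), (0.05, -200_000)]
ev = expected_value(contract)
```

Here the expected value is firmly positive, yet prospect theory predicts that the possible $200k loss will loom far larger in the decision-maker’s mind than the likely $100k gain.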
Kahneman and Tversky’s prospect theory, now widely taught in economics programs (Stanford University’s economics department is one example), formalized this behavior. In 2024–2025, as businesses rely more on behavioral pricing and A/B testing, they keep rediscovering what Kahneman showed decades ago: framing matters.
Leaders who understand loss aversion:
- Test both gain‑framed and loss‑framed messages instead of assuming one is better.
- Recognize when their own fear of loss is blocking a rational strategic pivot.
- Use guarantees, trials, and safety nets to reduce perceived loss for customers.
6. Overreacting to vivid events instead of base rates
Kahneman talks about the availability heuristic: we judge risk based on how easily examples come to mind, not on actual statistics. This leads to some of the most expensive examples of business decisions from Thinking, Fast and Slow.
You’ll see it when:
- A CEO cancels an international expansion after one highly publicized political event, ignoring long‑term market data.
- A company slashes travel budgets after a single high‑profile incident, even though overall risk remains low.
- A board overreacts to one viral customer complaint on social media and redesigns an entire product line.
In 2020–2022, pandemic shocks made availability bias even stronger. Leaders who lived through sudden lockdowns became highly sensitive to tail‑risk events. That’s understandable—but if every strategy discussion fixates on the last crisis, companies underinvest in opportunities that are statistically likely to succeed.
Researchers at institutions like the National Institutes of Health have studied how the brain makes quick decisions under uncertainty. Their work supports Kahneman’s view: fast judgments are efficient but often miscalibrated.
To balance availability bias, smart teams:
- Start strategic discussions with base rates: failure rates, market growth, and adoption curves from solid data.
- Treat extreme stories as signals to investigate, not as automatic decision drivers.
- Use scenario planning that includes both worst‑case and most‑likely outcomes.
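Starting from base rates also shows how little one vivid event should move a well-grounded estimate. The counts below are made up, but the arithmetic is the point: a single dramatic failure dominates the conversation while barely shifting the statistics.

```python
# Sketch: how much should one vivid event move a well-grounded base rate?
# A simple frequency update shows a single incident barely shifts the
# estimate, even though it dominates the discussion. Counts are made up.

def failure_rate(failures, total):
    return failures / total

base = failure_rate(40, 200)         # 20% failure rate across 200 past cases
after_event = failure_rate(41, 201)  # one more, highly publicized failure
shift = after_event - base           # under half a percentage point
```

Availability bias makes the new failure feel like it changed everything; the base rate says it moved the needle by less than half a point.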
7. Framing effects in product and policy decisions
One subtle but powerful example of a business decision from Thinking, Fast and Slow is how options are framed. Different descriptions of the same outcome can produce very different choices.
Common business cases include:
- Health insurers describing a plan as “90% of claims approved” versus “10% of claims denied.” The numbers are identical, but customer reactions differ.
- HR teams presenting a remote‑work policy as “3 days required in office” versus “2 days of flexible remote work.” Same schedule, different emotional response.
- Product teams offering a “$10 discount for early renewal” versus a “$10 late fee after the deadline.” Loss framing usually drives higher compliance.
Kahneman’s experiments on framing effects are widely cited in public policy and behavioral health research, including work hosted by the National Library of Medicine. In business, the same logic shapes how you present:
- Risk disclosures to investors.
- Change management plans to employees.
- Pricing and terms to customers.
Leaders who internalize this example of Thinking, Fast and Slow don’t just ask, “What decision are we making?” They also ask, “How are we describing the options, and how might that be nudging people unconsciously?”
8. Herd behavior in tech and AI adoption
Finally, consider how companies are adopting AI tools in 2024–2025. Many executives are racing to implement AI not because they’ve carefully analyzed ROI, but because everyone else seems to be doing it.
This is a modern example of business decisions from Thinking, Fast and Slow that combines several biases:
- Social proof: “Our competitors are rolling out AI copilots; we can’t be left behind.”
- Overconfidence: assuming AI will automatically improve productivity without redesigning workflows.
- Neglect of base rates: ignoring how many large IT projects historically fail or underperform.
Instead of asking, “What’s our clear use case and success metric?” leaders sometimes default to, “Let’s launch something AI‑branded this quarter.” System 1 loves the narrative of innovation and fear of missing out.
System 2 leaders push back by:
- Starting with a few tightly scoped pilots with measurable outcomes.
- Comparing expected returns to historical data on tech adoption projects.
- Treating AI like any other capital investment, not a magic label.
This is one of the best examples of applying Thinking, Fast and Slow to current trends: Kahneman gives you a vocabulary to resist hype and make measured, data‑driven calls.
How to use these examples of business decisions from Thinking, Fast and Slow in your company
Reading about biases is interesting; changing your organization’s habits is harder. The most effective leaders don’t try to “think bias‑free.” Instead, they design systems that force System 2 to show up when it matters.
Here are practical ways to turn these examples of business decisions from Thinking, Fast and Slow into better outcomes:
- Institutionalize pre‑mortems and post‑mortems. Before big bets, imagine failure and list reasons. After decisions, review process and outcome separately so you don’t confuse lucky wins with good judgment.
- Separate decision framing from decision approval. One team prepares options, base rates, and framing variations; another reviews and chooses. This lowers the risk of framing effects and anchoring.
- Use checklists for repeatable decisions. Hiring, pricing, vendor selection, and product launches benefit from structured criteria that reduce the sway of first impressions.
- Track decision quality, not just results. A good process can lead to a bad outcome and vice versa. Over time, measuring process quality (data used, options considered, biases checked) improves performance.
The real power of Kahneman’s work is not in memorizing every bias, but in recognizing patterns. When you see an example of a business decision that feels eerily similar to these stories—overconfident forecasts, gut‑driven hires, stubborn attachment to failing projects—that’s your cue to slow down, invite System 2 into the room, and redesign how the decision is made.
FAQ: examples of business decisions from Thinking, Fast and Slow
Q: What are the most practical examples of business decisions from Thinking, Fast and Slow for managers?
Some of the most practical examples include using reference class forecasting for budgets, structuring interviews to avoid halo effects, setting explicit kill criteria for projects to counter sunk costs, and testing different framings in pricing and communication to account for loss aversion and framing effects.
Q: Can you give an example of how a small business might use these ideas?
A small e‑commerce shop might stop guessing prices based on competitors and instead test different price anchors and frames on its website. It could also use a simple scorecard for hiring part‑time staff, so the owner isn’t swayed by first impressions or shared interests.
Q: Are there examples of large companies changing their decision processes because of Kahneman’s work?
Yes. Many large firms have adopted structured hiring, formal pre‑mortems, and more data‑driven forecasting after internal training on behavioral economics. While they don’t always advertise it as “Kahneman‑inspired,” these practices align directly with the examples of business decisions described in Thinking, Fast and Slow.
Q: How do I avoid overcomplicating decisions by overusing System 2?
Not every choice needs a full analysis. Routine, low‑risk decisions can rely on System 1 habits. The key is to flag high‑stakes, irreversible, or highly uncertain decisions and intentionally slow those down with structured analysis, diverse input, and clear criteria.
Q: Where can I learn more about the research behind these examples?
Kahneman’s original book is still the best starting point. For deeper academic work, you can explore behavioral economics and decision‑science research from universities and public institutions such as Harvard, Stanford, and the National Institutes of Health, many of which publish accessible summaries of current findings.