In statistics and decision theory, rational belief updating is often explained through Bayesian reasoning, which says that people should revise their beliefs in proportion to the strength of new evidence. Real human behavior, however, does not follow this ideal. A common cognitive bias, known as Bayesian conservatism (or conservatism bias), explains why.
Conservatism in Bayesian reasoning is the tendency to update beliefs more slowly than Bayes' theorem prescribes when new information becomes available.
Instead of adjusting fully in response to the evidence, people cling too strongly to their prior assumptions.
1. What Is Bayesian Updating?
Bayesian reasoning is based on Bayes' theorem, a mathematical rule named after the 18th-century statistician Thomas Bayes.
In simple terms, Bayesian updating means that when new evidence appears, people should adjust their beliefs by combining:
- Prior probability – what was believed before the evidence arrived
- New evidence – how likely the observed data are under each hypothesis (the likelihood)
The updated belief, called the posterior probability, weighs both factors according to Bayes' theorem.
However, real-world decision-making often deviates from this ideal model.
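The ideal update can be sketched in a few lines of Python. The hypothesis, evidence, and probabilities below are hypothetical, chosen only to illustrate the mechanics of Bayes' rule:

```python
# Minimal sketch of Bayesian updating (all numbers are hypothetical).
# Hypothesis H: "the company is financially stable."
# Evidence E: a weak earnings report.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H | E) via Bayes' rule."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

prior = 0.70            # P(H): belief before the report
p_e_given_h = 0.20      # P(E | H): weak report is unlikely if stable
p_e_given_not_h = 0.80  # P(E | not H): weak report is likely if unstable

posterior = bayes_update(prior, p_e_given_h, p_e_given_not_h)
print(round(posterior, 3))  # belief drops well below the 0.70 prior
```

A rational agent here revises the 70% prior down to roughly 37%, because the evidence is four times more likely under instability than under stability.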
2. What Is Conservatism in Bayesian Thinking?
Conservatism in Bayesian reasoning occurs when people give too much weight to their prior beliefs and too little weight to new evidence.
Even when strong data contradicts an existing belief, individuals may only make small adjustments to their expectations.
In statistical terms, their belief updates are too conservative compared to what Bayesian probability would suggest.
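One common way to formalize this, used in the experimental literature on conservatism, is to damp the evidence: in log-odds form, an ideal Bayesian adds the full log likelihood ratio, while a conservative updater adds only a fraction k < 1 of it. The parameter values below are illustrative assumptions:

```python
# Sketch: conservatism modeled as damped updating in log-odds space.
# k = 1 reproduces full Bayesian updating; k < 1 under-weights evidence.
import math

def logit(p):
    return math.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def update(prior, likelihood_ratio, k=1.0):
    """Posterior belief after damping the log likelihood ratio by k."""
    return sigmoid(logit(prior) + k * math.log(likelihood_ratio))

prior = 0.5
lr = 9.0  # evidence favors the hypothesis 9-to-1
print(round(update(prior, lr, k=1.0), 3))  # full Bayesian posterior: 0.9
print(round(update(prior, lr, k=0.4), 3))  # conservative posterior stops well short
```

With k = 0.4, the belief moves only partway toward where the evidence points, which is exactly the "too conservative" pattern described above.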
3. Example of Bayesian Conservatism
Imagine an investor who believes that a certain company is financially stable.
Later, new information appears showing:
- declining earnings
- increasing debt
- weakening market demand
A fully rational update would significantly change the investor’s expectations about the company’s future. However, due to conservatism bias, the investor may only adjust their belief slightly and continue assuming that the company will perform well.
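The investor example can be made concrete with assumed numbers. Each negative signal gets a likelihood ratio below 1 (the signal is less likely if the company is stable), and the signals are applied in sequence, either fully or conservatively:

```python
# Illustrative numbers for the investor example (all assumed).
# Likelihood ratio = P(signal | stable) / P(signal | not stable).
import math

def sequential_update(prior, likelihood_ratios, k=1.0):
    """Apply a sequence of evidence updates in log-odds space; k < 1 damps each."""
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += k * math.log(lr)
    return 1 / (1 + math.exp(-log_odds))

signals = [0.5, 0.4, 0.5]  # declining earnings, rising debt, weak demand
prior = 0.9                # investor starts confident the firm is stable

print(round(sequential_update(prior, signals, k=1.0), 3))  # rational update
print(round(sequential_update(prior, signals, k=0.3), 3))  # conservative update
```

Under these assumptions the rational posterior falls below 50%, while the conservative investor still assigns the company roughly an 80% chance of being stable.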
4. Why Bayesian Conservatism Happens
Several psychological tendencies contribute to this bias.
Commitment to Prior Beliefs
People often develop confidence in their earlier judgments and may treat revising them as admitting a mistake.
Cognitive Effort
Recalculating probabilities and reassessing assumptions requires mental effort, so sticking with existing beliefs is the easier default.
Desire for Stability
Frequent belief changes can feel uncomfortable or destabilizing.
5. Effects in Financial Markets
This bias appears frequently in financial decision-making.
Delayed Market Reactions
Investors may react slowly to new information about companies or economic conditions.
Persistent Mispricing
If many investors adjust their beliefs slowly, asset prices may take time to reflect new realities.
Slow Strategy Adaptation
Traders may continue using strategies that were once successful even when market conditions have changed.
6. How to Reduce Bayesian Conservatism
Improving decision-making requires consciously balancing prior beliefs with new information.
Review assumptions regularly
Question whether previous expectations still hold true.
Evaluate the strength of new evidence
Stronger evidence should lead to larger belief updates.
Use quantitative analysis
Statistical models can help reduce emotional resistance to changing beliefs.
Encourage flexible thinking
Being willing to revise opinions is essential for accurate reasoning.
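The quantitative-analysis advice above can be turned into a simple self-audit. The helper below is a hypothetical diagnostic, not a standard tool: given your prior, the evidence's likelihood ratio, and the posterior you actually settled on, it recovers the implied damping factor k. A value near 1 means you used the evidence fully; a value near 0 means you largely ignored it:

```python
# Hypothetical self-audit sketch: how much of the evidence did you actually use?
import math

def implied_k(prior, posterior, likelihood_ratio):
    """Solve logit(posterior) = logit(prior) + k * log(LR) for k."""
    prior_log_odds = math.log(prior / (1 - prior))
    post_log_odds = math.log(posterior / (1 - posterior))
    return (post_log_odds - prior_log_odds) / math.log(likelihood_ratio)

# You held P = 0.8, saw 4-to-1 evidence against (LR = 1/4),
# but only moved your belief down to 0.75.
print(round(implied_k(0.8, 0.75, 1 / 4), 2))  # k well below 1: a conservative update
```

A full Bayesian facing the same 4-to-1 evidence would have landed at exactly 0.5; settling at 0.75 implies that only about a fifth of the evidence was absorbed.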
Conclusion
Conservatism in Bayesian reasoning shows how people tend to adjust their beliefs too slowly when new information appears. While Bayesian theory provides a rational model for updating beliefs, human psychology often leads to more cautious adjustments.
By recognizing this bias and giving appropriate weight to new evidence, individuals can make more accurate judgments and better decisions in uncertain environments.