Normalcy bias and Base rate neglect: Bias in Evaluating AGI X-Risks


Normalcy bias

The refusal to plan for, or react to, a disaster which has never happened before.

The wise would extrapolate from a memory of small hazards to the possibility of large hazards. Instead, past experience of small hazards seems to set a perceived upper bound on risk: a society subject to regular minor hazards treats those hazards as an upper bound on the size of the risks it faces, and a society well-protected against minor hazards takes no action against major risks.

For example: building on flood plains once the regular minor floods are eliminated. Such societies guard against regular minor floods but not against occasional major floods.


- link Wikipedia: Normalcy bias
- an item on Forrest Landry’s compiled list of biases in evaluating extinction risks.


Base rate fallacy (base rate neglect)

The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case).
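A hedged illustration of the bias, using the classic medical-test scenario (the numbers here are hypothetical, chosen only for the sketch, and are not from the text): people asked for the probability of disease given a positive test tend to answer with the test's accuracy, ignoring the low base rate of the disease.

```python
# Base rate neglect, sketched with hypothetical numbers:
# a disease with a 1-in-1000 base rate, and a test with perfect
# sensitivity but a 5% false-positive rate.

def posterior(base_rate, sensitivity, false_positive_rate):
    """P(disease | positive test), via Bayes' theorem."""
    p_positive = (sensitivity * base_rate
                  + false_positive_rate * (1 - base_rate))
    return sensitivity * base_rate / p_positive

p = posterior(base_rate=0.001, sensitivity=1.0, false_positive_rate=0.05)
print(f"P(disease | positive) = {p:.3f}")  # ~0.020, not the intuitive ~0.95
```

The specific information (the test's 95% accuracy) dominates intuition, while the generic information (only 1 in 1000 people has the disease) is neglected.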

In nearly every scenario associated with a category 1 or 2 extinction risk, there is a power-law effect – some sort of catalytic reaction or cascade. No amount of rejecting specific cases of an exotic process will provide a sufficient basis for an inductive general argument that no such specific case exists.

That is why general arguments are preferred: they address a general issue in a general (and comprehensive) way.

This particular error has a lot in common with the ‘Neglect of probability’ bias.
It is a symptom of the fact that the vast majority of people naturally think additively.
A much smaller number of people can think in terms of multiplicative effects, and a rarer subset of those can think in terms of power laws. For example, have you ever tried to convince someone young of the benefits of investing for retirement?
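The retirement example above can be sketched numerically (the figures are hypothetical, chosen only for illustration): additive intuition predicts that savings simply pile up, while multiplicative reality compounds each year's balance.

```python
# Additive intuition vs multiplicative reality, with hypothetical
# numbers: $5,000 saved per year at a 7% annual return over 40 years.

def additive_total(contribution, years):
    # What additive thinking predicts: contributions just accumulate.
    return contribution * years

def compounded_total(contribution, rate, years):
    # Multiplicative reality: each year's balance grows by (1 + rate)
    # before the next contribution is added.
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + rate) + contribution
    return balance

print(additive_total(5000, 40))                  # 200000
print(round(compounded_total(5000, 0.07, 40)))   # roughly 5x the additive guess
```

The gap between the two totals is exactly what additive intuition fails to see, and a power-law cascade widens that gap far faster still.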

Given that thinking in terms of power laws is unnatural, difficult, and usually requires explicit abstract mathematical technique, most people tend to focus on concrete details in an attempt to re-establish a basis on which their intuitions, in the form of an ‘induction of understanding’, can operate.

Dealing with specifics is therefore considered easier and “more productive” than dealing with general and abstract issues in a fully general way.


- link Wikipedia: Base rate fallacy
- an item on Forrest Landry’s compiled list of biases in evaluating extinction risks.

Crossposted to LessWrong