Executive summary: Cognitive biases like availability heuristic and confirmation bias can distort judgement about risks, especially unprecedented ones like human extinction.
Key points:
Availability heuristic leads people to underestimate common risks and overestimate memorable ones. This could downplay preparations for unprecedented catastrophes.
Hindsight bias makes past events seem more predictable than they were. This may undermine appreciation for disaster prevention efforts.
Black swan events are highly impactful but unpredictable, so lack of preparation for them is dangerous.
Conjunction fallacy makes complex scenarios seem more likely than they are. This can skew assessments of risk.
Confirmation bias leads people to seek out confirming evidence rather than critically testing hypotheses.
Anchoring causes people to rely too heavily on initial irrelevant information when making judgments.
Affect heuristic boils information down to good/bad feelings, more so under time pressure. This can distort risk analysis.
Scope neglect means people do not scale their concern in proportion to the number of individuals affected, so large problems may receive inadequate responses.
Overconfidence leads to underestimating the costs, risks and timelines of plans and actions.
Bystander apathy causes inaction when people expect others to act. This can propagate collective inaction even in crises.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.