Comparative Bias
I see a lot of talk about biases in effective altruist circles. I agree that it is very important to acknowledge that the human mind is flawed and consistently makes many mistakes. But I wish there were much more discussion of how these biases compare to one another. On many topics there are biases pushing in both directions, and without some sense of their relative strength it is hard to tell where the bias actually leads. People seem to assume that most others are biased against their particular perspective, rather than towards it.
Imagine for example that there is a member of the LessWrong community who is considering x-risk:
| Biases that would make this person think that x-risk is a large concern | Biases that would make this person think that x-risk is not a large concern |
| --- | --- |
| Anchoring: the tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject) | Ambiguity effect: the tendency to avoid options for which missing information makes the probability seem "unknown" |
| Availability cascade: a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse ("repeat something long enough and it will become true") | Availability heuristic: the tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be |
| Attentional bias: the tendency of our perception to be affected by our recurring thoughts | |
This is just from the "A" section of the very long list of biases on Wikipedia, and I am sure that a full list would have dozens of biases going each way. I am less interested in this specific example and more interested in how people deal in general with conflicting biases. What do you think?
Vaguely related point:
I sometimes see proponents of cause X (for almost all X) say things like "consider all the cognitive biases that would cause you not to think that cause X is the most important! Therefore you need to pay more attention to cause X." I think this is an extremely cheesy tactic, possibly even logically rude depending on how it's employed.
For many reasonable propositions you can concoct an almost infinite list of biases pushing in both directions. Ironically, people who use this form of argument seem to be themselves suffering from confirmation bias about the proposition "cognitive bias causes people not to believe that cause X is important"! And also a bias blind spot ("I'm less prone to cognitive bias than all those people who believe in cause Y").
I think, as this illustrates, talking about biases usually isn't that helpful when working out what to do. There are often plausible biases on both sides.
This is a pretty common criticism of behavioral finance, which attempts to use cognitive biases to better understand financial markets and was one of the first major attempts to apply them. Theories based on biases are pretty weak unless backed up with a model or some relevant empirical evidence.
At 80k, we don't find understanding biases to be that big a part of making good career decisions. The main ways it comes up are that it raises my credence that people tend not to consider enough options, and that it's useful to use a checklist when comparing options (i.e., be more systematic).
Relevant: https://intelligence.org/files/CognitiveBiases.pdf
I think the biggest bias here is that most donors would like to be able to point to their clear successes and the people they helped. For most folks, this leans them against x-risk because a) you'll very likely fail to lower x-risk and b) even if you succeed, you usually won't be able to demonstrate it.
On the other hand, it's also harder to tell if you've failed.
Like Ben, I doubt this kind of analysis is going to change people's minds much one way or the other.
The biases which Peter Hurford discusses in his classic post "Why I'm skeptical about unproven causes (and you should be too)" seem to be relevant here.