I think if EAs better appreciated uncertainty when prioritising causes, people’s careers would span a wider range of cause areas.
I’ve got a strong intuition that this is wrong, so I’m trying to think it through.
To argue that EAs underestimate uncertainty, you'd need to observe their uncertainty estimates directly (and know what the correct level of uncertainty would be). For example, suppose the community were homogeneous and everyone assigned a 1% chance to Cause X being the most important issue and a 99% chance to Cause Y (I'm deliberately not dealing with how to measure this). Then every individual would choose to work on Cause Y. If the probabilities were 5% for X and 95% for Y, you'd get the same outcome. This is because individuals are each making a single choice.
Now, even if there were a central body coordinating everyone's efforts, in the first scenario it still wouldn't follow that 1% of people should be allocated to Cause X. Optimal allocation strategy aside, there isn't a clean relationship between uncertainty and decision rules.
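A toy sketch of the two points above, using the hypothetical numbers from my example (the rules here are illustrative, not claims about what EAs or a coordinator actually do):

```python
# Two candidate rules mapping credences over causes to career allocations.

def individual_choices(probs, n_people):
    """Every individual independently picks the cause with the highest
    probability of being most important (an argmax rule)."""
    best = max(probs, key=probs.get)
    return {cause: (n_people if cause == best else 0) for cause in probs}

def proportional_allocation(probs, n_people):
    """A central coordinator that simply mirrors the probabilities --
    one possible rule, not necessarily an optimal one."""
    return {cause: round(p * n_people) for cause, p in probs.items()}

for probs in ({"X": 0.01, "Y": 0.99}, {"X": 0.05, "Y": 0.95}):
    # Individuals produce the same all-Y outcome under both credences,
    # so the size of the uncertainty never shows up in the allocation.
    print(probs, individual_choices(probs, 100), proportional_allocation(probs, 100))
```

The argmax rule sends 100 out of 100 people to Cause Y whether the credence in X is 1% or 5%, while the proportional rule tracks the probabilities exactly; neither is obviously the right decision rule, which is the point about there being no clean mapping from uncertainty to allocations.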
I think 80,000 Hours could emphasise uncertainty more, but also that the EA community as a whole needs to be more conscious of uncertainty in cause prioritisation.
I think 80k is already very conscious of this (based on my general sense of 80k's materials). Global priorities research is one of their four highest-priority areas, and it's precisely about gaining more confidence about what the top priority is.
Something that would help me understand where you're coming from is hearing more about what you think the decision rules are for most individuals, how they take their uncertainty into account, and precisely how gender/culture interacts with cause-area uncertainty in shaping decisions.
A bit late here, but I was looking into it and found this: https://survivalandflourishing.fund/s-process