Flaming hot take: I wonder if some EAs suffer from Scope Oversensitivity—essentially the inverse of the identifiable victim effect. Take the animal welfare vs global health debate: are we sometimes biased by the sheer magnitude of animal suffering numbers, rather than other relevant factors? Just as the identifiable victim effect leads people to overweight individual stories, maybe we’re overweighting astronomical numbers.
EAs pride themselves on scope sensitivity to combat emotional biases, but taken to an extreme, could this create its own bias? Are we sometimes too seduced by the logic of “bigger numbers = bigger problem”? The meta-principle might be that any framework, even one designed to correct cognitive biases, needs wisdom and balance to avoid becoming its own kind of distortion.
Scope insensitivity has some empirical backing (e.g. the helping-birds study, where people’s stated willingness to pay to save 2,000, 20,000, or 200,000 birds was roughly the same) and some theorised mechanisms of action, e.g. people lacking an intuitive understanding of large numbers.
Scope oversensitivity seems possible in theory, but I can’t think of any similar empirical or theoretical reasons to think it’s actually happening.
To the extent that you disagree, it’s not clear to me whether it’s because you and I disagree on how EAs weight things like animal suffering, or whether we disagree on how it ought to be weighted. Are you intending to cast doubt on the idea that a problem that is 100x as large is (all else equal) 100x more important, or are you intending to suggest that EAs treat it as more than 100x as important?
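To make the 100x framing concrete, here is a minimal sketch (my illustration, not something from the thread): treat the weight a decision-maker gives a problem as a power function of its scale, weight = scale ** alpha. An exponent of 1 is the linear scaling described above, an exponent below 1 corresponds to scope insensitivity, and an exponent above 1 would be the hypothesised scope oversensitivity; the function name and alpha values below are purely illustrative.

```python
# Illustrative only: how a decision-maker's weighting could scale with problem size.
# alpha = 1 reproduces "100x as large => 100x as important"; alpha < 1 models
# scope insensitivity; alpha > 1 would model the hypothesised oversensitivity.

def weight(scale: float, alpha: float) -> float:
    """Weight given to a problem of the stated scale under exponent alpha."""
    return scale ** alpha

for alpha, label in [(0.3, "scope insensitivity"),
                     (1.0, "linear scope sensitivity"),
                     (1.5, "hypothetical scope oversensitivity")]:
    ratio = weight(100, alpha) / weight(1, alpha)
    print(f"{label} (alpha={alpha}): a 100x larger problem gets {ratio:.0f}x the weight")
```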
Upvoted and disagree-voted.

Do you have any particular reason to believe that EAs overweight large problems?
“Take the animal welfare vs global health debate: are we sometimes biased by the sheer magnitude of animal suffering numbers, rather than other relevant factors?”
EAs donate much more to global health than to animal welfare. Do you think the ratio should be even higher still?