Scope insensitivity has some empirical backing (e.g. the classic study in which people's willingness to pay to save migratory birds barely changed whether 2,000 or 200,000 birds were at stake) and some theorised mechanisms of action, e.g. people lacking an intuitive grasp of large numbers.
Scope oversensitivity seems possible in theory, but I can’t think of any similar empirical or theoretical reasons to think it’s actually happening.
To the extent that you disagree, it’s not clear to me whether it’s because you and I disagree on how EAs weight things like animal suffering, or whether we disagree on how it ought to be weighted. Are you intending to cast doubt on the idea that a problem that is 100x as large is (all else equal) 100x more important, or are you intending to suggest that EAs treat it as more than 100x as important?