Hm, I think that most of the people who participated in this experiment:
Three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88. This is scope insensitivity or scope neglect: the number of birds saved—the scope of the altruistic action—had little effect on willingness to pay.
would agree, once the results were shown to them, that they had been doing something irrational which they wouldn't endorse if they were aware of it. (Example taken from here: https://www.lesswrong.com/posts/2ftJ38y9SRBCBsCzy/scope-insensitivity)
There’s also an essay from 2008 about the intuitions behind utilitarianism that you might find helpful for understanding why someone could consider scope insensitivity a bias instead of just the way human values work:
https://www.lesswrong.com/posts/r5MSQ83gtbjWRBDWJ/the-intuitions-behind-utilitarianism