tl;dr: I basically agree with your first paragraph, but think that:
that’s mostly consistent with my prior comment
that doesn’t represent a strong argument against longtermism
Masrani’s claims/language go beyond the defensible claims you’re making
This can happen unconsciously, though, e.g. confirmation bias, or whenever there’s arbitrariness or “whim”, e.g. priors or how you weight different considerations with little evidence. The weaker the evidence, the more prone to bias
Agreed. But:
I think that a small to moderate degree of such bias is something I acknowledged in my prior comment
(And I intended to imply that it could occur unconsciously, though I didn’t explicitly state that)
I think unconscious bias is always a possibility, including in relation to whatever alternative to longtermism one might endorse
See also Caution on Bias Arguments and Beware Isolated Demands for Rigor
That said, I think “The weaker the evidence, the more prone to bias” is true (all other factors held constant), and that does create one reason why bias may push more strongly in favour of longtermism than in favour of other things.
I think I probably should’ve acknowledged that.
But there are also many other sources of bias, as well as factors exacerbating or mitigating bias, etc. So it’s still far from obvious which group of people (sorted by current cause priorities) is more biased overall in their cause prioritisation.
And I think that there’s some value in trying to figure that out, but that should be done and discussed very carefully, and is probably less useful than other discussions/research that could inform cause priorities.
E.g., scope neglect, identifiable victim effects, and confirmation bias among people first entering EA (since more were previously interested in global health & development than in longtermism) all push against longtermism
But a desire to conform to what’s currently probably more “trendy” in EA biases towards longtermism
And so on
Less important: It seems far from obvious to me whether there’s substantial truth in the claim that “there’s self-selection so that the people most interested in longtermism are the ones whose arbitrary priors and weights support it most, rightly or wrongly”, even assuming bias is a big part of the story.
E.g., I think things along the lines of conformity and deference are more likely culprits for “unwarranted/unjustified” shifts towards longtermism than confirmation bias is
It seems like a very large portion of longtermists were originally focused on other areas and were surprised to find themselves ending up longtermist, which makes confirmation bias seem like an unlikely explanation
Compared to what you’re suggesting, Masrani—at least in some places—seems to imply something more extreme, more conscious, and/or more explicitly permitted by longtermism itself (rather than just general biases that are exacerbated by having limited info)
E.g., “by fiddling with the numbers, the above reasoning can be used to squash funding for any charitable cause whatsoever.” [emphasis added]
E.g., “To reiterate, longtermism gives us permission to completely ignore the consequences of our actions over the next one thousand years, provided we don’t personally believe these actions will rise to the level of existential threats. In other words, the entirely subjective and non-falsifiable belief that one’s actions aren’t directly contributing to existential risks gives one carte blanche permission to treat others however one pleases. The suffering of our fellow humans alive today is inconsequential in the grand scheme of things. We can “simply ignore” it—even contribute to it if we wish—because it doesn’t matter.” [emphasis added]
This very much sounds to me like “assuming bad faith” in a way that I think is both unproductive and inaccurate for most actual longtermists
I.e., this sounds quite different to “These people are really trying to do what’s best. But they’re subject to cognitive biases and are disproportionately affected by the beliefs of the people they happen to be around or look up to—as are we all. And there are X, Y, Z specific reasons to think those effects are leading these people to be more inclined towards longtermism than they should be.”