tl;dr: I basically agree with your first paragraph, but think that:
that's mostly consistent with my prior comment
that doesn't represent a strong argument against longtermism
Masrani's claims/language go beyond the defensible claims you're making
This can happen unconsciously, though, e.g. confirmation bias, or whenever there's arbitrariness or "whim", e.g. priors or how you weight different considerations with little evidence. The weaker the evidence, the more prone to bias
Agreed. But:
I think that a small to moderate degree of such bias is something I acknowledged in my prior comment
(And I intended to imply that it could occur unconsciously, though I didn't explicitly state that)
I think unconscious bias is always a possibility, including in relation to whatever alternative to longtermism one might endorse
See also Caution on Bias Arguments and Beware Isolated Demands for Rigor
That said, I think "The weaker the evidence, the more prone to bias" is true (all other factors held constant), and I think that does create one reason why bias may push in favour of longtermism more than in favour of other things.
I think I probably should've acknowledged that.
But there are still so many other sources of bias, and factors exacerbating or mitigating bias, that it's still far from obvious which group of people (sorted by current cause priorities) is more biased overall in their cause prioritisation.
And I think there's some value in trying to figure that out, but it should be done and discussed very carefully, and is probably less useful than other discussions/research that could inform cause priorities.
E.g., scope neglect, identifiable victim effects, and the confirmation bias most people bring when they first enter EA (since more were previously interested in global health & dev than in longtermism) all bias against longtermism
But a desire to conform to what's currently probably more "trendy" in EA biases towards longtermism
And so on
Less important: It seems far from obvious to me whether there's substantial truth in the claim that "there's self-selection so that the people most interested in longtermism are the ones whose arbitrary priors and weights support it most, rightly or wrongly", even assuming bias is a big part of the story.
E.g., I think things along the lines of conformity and deference are more likely culprits for "unwarranted/unjustified" shifts towards longtermism than confirmation bias is
It seems like a very large portion of longtermists were originally focused on other areas and were surprised to find themselves ending up longtermist, which makes confirmation bias seem like an unlikely explanation
Compared to what you're suggesting, Masrani, at least in some places, seems to imply something more extreme, more conscious, and/or more explicitly permitted by longtermism itself (rather than just general biases that are exacerbated by having limited info)
E.g., "by fiddling with the numbers, the above reasoning can be used to squash funding for any charitable cause whatsoever." [emphasis added]
E.g., "To reiterate, longtermism gives us permission to completely ignore the consequences of our actions over the next one thousand years, provided we don't personally believe these actions will rise to the level of existential threats. In other words, the entirely subjective and non-falsifiable belief that one's actions aren't directly contributing to existential risks gives one carte blanche permission to treat others however one pleases. The suffering of our fellow humans alive today is inconsequential in the grand scheme of things. We can 'simply ignore' it – even contribute to it if we wish – because it doesn't matter." [emphasis added]
This very much sounds to me like "assuming bad faith" in a way that I think is both unproductive and inaccurate for most actual longtermists
I.e., this sounds quite different to "These people are really trying to do what's best. But they're subject to cognitive biases and are disproportionately affected by the beliefs of the people they happen to be around or look up to, as are we all. And there are X, Y, Z specific reasons to think those effects are leading these people to be more inclined towards longtermism than they should be."