It seems like, as people spend more time in EA, they become more longtermist
This is also my impression. And I think knowledge of that played some small role in my probably overdetermined shift towards longtermism.
But I’m also a bit concerned about the idea of using that trend as a factor in forming one’s own beliefs or decisions. I think we should be very cautious about doing so, and provide heavy caveats when discussing the idea of doing so. This is because I think it’s possible we could end up with an unhealthy combination of founder effects and information cascades, where some initial or vocal group happened to lean more a certain way; a bunch of people see that they did so, update on that, and thus lean more that way; a bunch of people see that and do the same, etc.
(To be honest, I’m personally not so worried about this in relation to longtermism as a moral principle, but more in relation to claims about the future or which specific longtermist strategies to prioritise. I’m not sure exactly why this is. I think it’s largely that the latter seems to rely much more on a range of complex long-term forecasts that might be made fairly arbitrarily at some point but then become ossified into a community’s common sense.)
Yep, I agree (I frame this as ‘beware updating on epistemic clones’ - people who hold your beliefs for the same reasons you do). My point in bringing this up was just that the common-sense view isn’t obviously near-termist.