Yes! Totally agree. I think I mentioned very briefly that one should also be wary of social dynamics pushing toward EA beliefs, but I definitely didn’t address it enough. Although I think the end result was positive and that my beliefs are true (with some uncertainty, of course), I would guess that my update toward long-termism was due in large part to lots of exposure to the EA community and the social pressure that brings.
I basically bought some virtue signaling in the EA domain at the cost of signaling in broader society. Given that I hang out with a lot of EAs and plan to do so more in the future, I’d guess that if I were to rationally evaluate this decision it would look net positive in favor of changing toward long-termism (you would also gain status within the EA community by making a similar switch, though with some short-term “I told you so” negative effects).
So yes, I think it was largely closer social ties to the EA community that finally made this switch worthwhile, and perhaps this calculation was going on at a subconscious level. It’s probably no coincidence that I finally made a full switch-over during an EA retreat, where the broader-society costs of switching beliefs were less salient and the EA benefits much more salient. The perfect decision-making situation would offer equally good opportunities in communities representing every philosophical belief, but for now that seems a bit unlikely. I suppose it’s another argument for cultivating diversity within EA.
This opens up a whole other rabbit hole: how do we want to appeal to people who have some interest in EA but aren’t yet committed to the ideas? I think the social aspect is probably a larger draw than many might think. Of course, if we emphasized this we’d be limiting people’s ability to join EA through rational choice. But then, what is “choice” really, given the social construction of our personalities and desires...