I’m pro there being a diversity of worldviews and causes in EA—I’m not certain about longtermism, and I think such diversity is a good thing even on longtermist grounds. I give reasons in the ‘steelmanning arguments against EA’s focus on longtermism’ question, and I talked a little about this in my recent EAG London talk. Other considerations are helping to avoid groupthink (which I think is very important), positive externalities (a success in one area transfers to others), and the mundane benefit of economies of scale.
I do think that the traditional poverty/animals/x-risk breakdown feels a bit path-dependent, though, and we could have more people pursuing cause areas outside of it. Your work fleshing out your worldview and figuring out what follows from it is the sort of thing I’d like to see more of.