Thanks for listing this as one of your five topics of interest and thanks to everyone for insightful comments.
I do basically think that EA could learn a lot of things from SJ in terms of being an inclusive movement.
I wholeheartedly agree.
Beyond movement building & inclusivity, I’d be curious to hear about other domains where you think that EA could learn from the social justice movement/philosophy? E.g., in terms of the methodologies and academic disciplines that the respective movements tend to rely on, epistemic norms, ethical frameworks, etc.
Beyond movement building & inclusivity, I think it makes sense for EA as a movement to keep its current approach, because it's been working pretty well IMO.
I think the thing EAs as people (with a worldview that includes things beyond EA) might want to consider—and which SJ could inform—is the demands that historical injustices (e.g., colonialism, racism) make on us. I think those demands are plausibly quite large, and failure to satisfy them could constitute an ongoing moral catastrophe. Since they're not welfarist, they're outside the scope of EA as it currently exists. But for moral uncertainty reasons I think many people should think about them.