“Unless EA changes its positioning soon, it is so obvious to me that this well-meaning platform will remain a sparring ground of ideas, of ivory towers, and not of grassroots or picket lines.”
Yes, there are a lot of abstract arguments in EA, but we’ve also achieved significant things. See here: https://www.effectivealtruism.org/impact. Given that, it doesn’t seem fair to imply that EA is currently merely a sparring ground of ideas and ivory towers. EA’s ground-level work doesn’t look like picket lines, but it’s very much there. One of the unfortunate baked-in problems of the EA Forum is that the most impactful things often aren’t talked about much, because there’s not a lot of new information to share about them. I don’t know a good solution to this, and I don’t think it’s anyone’s fault, but it does mean that a lot of discussion on the EA Forum will be of the ivory tower type: stuff that’s largely settled, like the Against Malaria Foundation, doesn’t get a lot of debate.
I’m also curious how you reconcile certain parts of this essay.
First, you’ve written this:
“The world of social justice is not so easily swayed as Silicon Valley, we do not iterate, and we certainly do not ‘fail fast’. Such concessions cost lives. This doggedness is something EA sorely lacks at present: its principles are more fluid, more congenial to the power structures that cause the existential threats it rails against. Such flexibility may make us more effective collaborators, but not necessarily more effective influencers. The direction provided by political or ideological weathervanes does not hold its ground in changing winds. Instead, by appealing to the social justice warrior, EA targets become non-negotiable and our politic more steadfast and demanding.
But EA Principles do little to endear themselves to social justice. Strict rationalism and Pascal-Wager-like calculations for doing good feel false, contrived, and a far cry from the wildfire of activism. Longtermism is abhorrent to the advocate.”
I accept this as largely true. The one suggestion I’d make is that I don’t think EA’s principles are more fluid—it is our methods that are more fluid, and more congenial to existing power structures.
That said, for the most part this is accurate. You seem to have hit upon several major differences in the way EA and social justice operate, in a way that indicates to me that you understand both pretty well. But then, you write later:
“Firstly, it is essential that we find common ground to work from, reconciling our theory and our language with the real-world experience of advocates. We are not so different, and we need to prove that. We need to demonstrate how what can come off as fixed narratives and frameworks are completely complimentary to the goals and methods of social justice.”
My question is: ARE our frameworks completely complementary to the goals and methods of social justice? This doesn’t seem obviously false to me, but it doesn’t seem obviously true either. Iteration, rationalism, remaining open to changing paths and changing our minds, and rejection of ideology are all pretty big in the EA movement, and by your own account these are exactly the things social justice resists. However, you then take it as a given that social justice and EA are compatible. I’d love to hear a more fleshed-out argument for why this is the case.