If one chooses options 2 or 3, I see no particular reason why one should focus on “Leftist Ethics” in particular. If one chooses one of those options, one would presumably also want to incorporate other ethical views; e.g. libertarianism, maybe some versions of virtue ethics, etc.
(I’m not hereby rejecting option 1, which I think should be on the table.)
Yes, that’s true. Though I have not read any EA report that includes a paragraph of the flavor “Libertarians are worried about X; we have no opinion on whether or not X is true, but it creates substantial PR risk.”
That might be because libertarians are less inclined to drum up big PR scandals, but it’s also because EAs tend to be somewhat sympathetic to libertarianism already.
My sense is that people mostly ignore virtue ethics, though maybe Open Phil thinks about it as part of its “worldview diversification” approach. In that case, I think it would be useful to have a specific person serving as a community virtue ethicist, rather than a bunch of people who just casually think “this seems reasonable under virtue ethics, so it’s robust to worldview diversification”. I have no idea if that’s what happens currently, but basically I agree with you.
I’m not sure I understand your reasoning. I thought you were saying that we should focus on whether ethical theories are true (or have some chance of being true), and not so much on the PR risk? If so, it doesn’t seem to matter that libertarians tend to have fewer complaints of the kind that might lead to bad PR.
Fwiw, libertarianism and virtue ethics were just two examples. My point is that there’s no reason to single out Leftist Ethics among the many potential alternatives to utilitarianism.
Okay, as I understand the discussion so far:
The RP authors said they were concerned about PR risk from a leftist critique
I wrote this post, explaining how I think those concerns could more productively be addressed
You asked why I’m focusing on Leftist Ethics in particular
I replied that it’s because I haven’t seen authors cite concerns about PR risk stemming from other kinds of critique
That’s all my comment was meant to illustrate; I think I pretty much agree with your initial comment.
Ah, I see. Thanks!
I suspect that the good bits of leftish ethics are less well represented in EA than libertarian thought, and maybe even than virtue ethics. So carefully onboarding some good thinkers might be good for cognitive diversity, and hence for better conversations/research around some issues.
(TBC, I also want to see more well-reasoned conservative voices in EA. Generally, I don’t see diversity as an end in itself, and I see fairly significant risks in “diluting” EA. But my take is that dilution already happens by itself via onboarding at universities and the desire to accommodate more people willing to contribute… so some gardening/fencing might be desirable.)
I guess you are one of the best people to ask about how EA correlates with (other) political philosophies/thought clusters. Any thoughts?
I think there might sometimes be the opposite problem—that EA is staying too close to recommendations that more mainstream political groups would make, for various reasons.
In the 2019 EA Survey, 40% of EAs said their political views are “centre-left”, whereas 32% said they’re “left”.
This is really interesting, and I’d be happy to see a more recent statistic (though I don’t expect it to have changed much). But even if there are more of us than I think, I find very little consideration, in EA contexts, of concrete leftist ideas, e.g. “Maybe capitalism is a problem we can and should be addressing, now that we have billions of dollars committed to EA”, or “Could charitable giving in some cases do more harm than good by shifting the perceived responsibility from the state to individuals?”.
I agree that this could apply to every other philosophical school too. I do feel, however, that because EA comes from a mostly English-speaking perspective, economically liberal ideas are already much more ingrained in its core than leftist ideas are, since they are more prominent in those political systems(?). Then there are probably perspectives that are even less present, like those of people in developing countries, or people with backgrounds of poverty, sickness, or oppression.
Ultimately, since we all share a pragmatic willingness to test different courses of action for improving the world, I think this could be less of a political debate than these same ideas provoke outside EA, and more a matter of expanding the classes of ideas we consider, and deciding where to look for them and who could look for them best.
I sort of expect the young college EAs to be more leftist, and expect them to be more prominent in the next few years. Though that could be wrong: maybe college EAs are heavily selected for not already being committed to leftist causes.
I don’t think I’m the best person to ask, haha. I basically expect EAs to be mostly Grey Tribe, pretty Democratic-leaning but with some libertarian influences, and generally just not that interested in politics. There’s probably better data on this somewhere, or at least in the EA-related SlateStarCodex reader survey.
This is fairly aligned with my take, but I think EAs are more Blue than Grey, and more left than you might be implying. (Ah, by “you” I meant Stefan; he does/did a lot of empirical psychological/political research into relevant topics.)