Well stated. This post’s heart is in the right place, and I think some of its proposals are non-accidentally correct. However, many of the post’s suggestions seem to boil down to “dilute what it means to be EA to just being part of common left-wing thought”. Here’s a sampling of the post’s recommendations that prompt this reading:
EAs should increase their awareness of their own positionality and subjectivity, and pay far more attention to e.g. postcolonial critiques of western academia
EAs should study other ways of knowing, taking inspiration from a range of academic and professional communities as well as indigenous worldviews
EAs should not assume that we must attach a number to everything, and should be curious about why most academic and professional communities do not
EA institutions should select for diversity
Previous EA involvement should not be a necessary condition to apply for specific roles, and the job postings should not assume that all applicants will identify with the label “EA”
EA institutions should hire more people who have had little to no involvement with the EA community providing that they care about doing the most good
EA institutions and community-builders should promote diversity and inclusion more, including funding projects targeted at traditionally underrepresented groups
Speaker invitations for EA events should be broadened away from (high-ranking) EA insiders and towards, for instance:
Subject-matter experts from outside EA
Researchers, practitioners, and stakeholders from outside of our elite communities
For instance, we need a far greater input from people from Indigenous communities and the Global South
EAs should consider the impact of EA’s cultural, historical, and disciplinary roots on its paradigmatic methods, assumptions, and prioritisations
Funding bodies should within 6 months publish lists of sources they will not accept money from, regardless of legality
Tobacco?
Gambling?
Mass surveillance?
Arms manufacturing?
Cryptocurrency?
Fossil fuels?
Within 5 years, EA funding decisions should be made collectively
EA institutions should be democratised within 3 years, with strategic, funding, and hiring policy decisions being made via democratic processes rather than by the institute director or CEO
EAs should make an effort to become more aware of EA’s cultural links to eugenic, reactionary and right-wing accelerationist politics, and take steps to identify areas of overlap or inheritance in order to avoid indirectly supporting such views or inadvertently accepting their framings
I don’t think the point is that all of the proposals are inherently correct or should be implemented. I don’t agree with all of the suggestions (I agree with quite a few and disagree with some others), but in the introduction to the ‘Suggested Reforms’ section they explicitly say:
Below, we have a preliminary non-exhaustive list of suggestions for structural and cultural reform that we think may be a good idea and should certainly be discussed further.
It is of course plausible that some of them would not work; if you think so for a particular reform, please explain why! We would like input from a range of people, and we certainly do not claim to have all the answers!
In fact, we believe it important to open up a conversation about plausible reforms not because we have all the answers, but precisely because we don’t.
Picking out only the parts you don’t agree with may come across almost like strawmanning in this case, especially since people might be reading the comments rather than the full post (I was very surprised by how long it was when I clicked on it; I don’t think I’ve seen an 84-minute forum post before). To be clear, I’m not claiming this was intentional on either of your parts.
If taking positions that are perceived as left-wing makes EA more correct and more effective, then EA should still take up those positions. The OP has made a great effort to justify these points from a practical position of pursuing truth, effectiveness, and altruism, and they should not be dismissed just because they happen to fall on one side of the political spectrum. Similarly, just because an action makes EA less distinct, it doesn’t mean it’s not the correct thing to do.
This is true, but to the extent that these changes would make EA look/act like already-existing actors, I think it is fair to consider (1) how effective the similar actors are, and (2) the marginal benefit of having more muscle in or adjacent to the space those actors occupy.
Also, because I think a clear leftward drift would have significant costs, I think identifying the drift and those costs is a fair critique. As you move closer to a political pole, the range of people who will want to engage with your movement is likely to dwindle. Most people don’t want to work in, or donate to, a movement that doesn’t feel respectful toward them, which I think is a strong tendency of almost all political poles.
At present, I think you can be moderately conservative or at least centrist by US standards and find a role and a place where you feel like you fit in. I think giving that range up has significant costs.
Also, because I think a clear leftward drift would have significant costs, I think identifying the drift and those costs is a fair critique. As you move closer to a political pole, the range of people who will want to engage with your movement is likely to dwindle.
I think a moderate leftward shift on certain issues would actually increase popularity. The current dominant politics of EA seems to be a kind of Steven Pinker-style techno-liberalism, with a free speech absolutist stance and a vague unease about social justice activism. Whether or not you agree with this position, I think its popularity among the general public is fairly low, and a shift to mainstream liberal (or mainstream conservative) opinions would make EA more appealing overall. For example, a policy of banning all discussion of “race science” would in the long term probably bring in many more people than it deterred, because almost everybody finds discussing that topic unpleasant.
If your response to this is “wait, there are other principles at play that we need to take into consideration here, not just chasing what is popular”, then you understand why I don’t find “these positions would make EA more left-wing” to be a very strong argument against them. If following principles pushes EA one way or the other, then so be it.
Fwiw, I think your view that a leftward shift in EA would increase popularity is probably Americocentric. I doubt it is true if you were to consider EA as a global movement rather than just a western one.
Also, fwiw, I’ve lost track of how many people I’ve seen dismiss EA as “dumb left-wing social justice”. EAs also tend to think that what matters is the consequences of saying something. So we tend to be disliked both by free speech absolutists and by people who will never concede that properly discussing some controversial topics might be more net positive than the harm caused by talking about them. Some also see EA as tech-phobic. Steven Pinker famously dismissed EA concerns about AI alignment. If you spend time outside of EA in tech-optimist-liberal circles, you see a clear divide. It isn’t culturally the same. Despite this, I think I’ve also lost count of how many people I’ve seen dismiss EA as “right-leaning libertarian tech-utopia make-billionaires-rich nonsense”.
We can’t please everyone and it is a fool’s errand to try.
One person’s “Steven Pinker-style techno-liberalism, with a free speech absolutist stance and a vague unease about social justice activism” is another person’s “Luddite, free-speech-blocking SJW”.
If following principles does not clearly push EA one way or the other, then so be it as well.
Fwiw, I think your view that a leftward shift in EA would increase popularity is probably Americocentric. I doubt it is true if you were to consider EA as a global movement rather than just a western one
My point was more that there’s a larger audience for picking one side of the political spectrum than for awkwardly positioning yourself in the middle in a way that annoys both sides. I think this holds for other countries as well, though of course the political battles are different. If you wanted to appeal more to Western Europe you’d go left, to Eastern Europe you’d go right, to China you’d go some odd combination of left and right, etc.
Really, I’m making the same point as you: chasing popularity at the expense of principles is a fool’s errand.
I think there’s a difference between “people in EA tend to have X, Y, and Z views” and those views being actively promoted by major orgs (which is the most natural reading of the proposal to me). Also, although free speech absolutism may not be popular in toto, most points on the US political spectrum at least find some common ground with that stance (they will agree on the outcome for certain forms of controversial speech).
I also think it likely that EA will need significant cooperation from the political system on certain things, particularly involving x-risk, and that becoming strongly left-identified sharply increases the risk you’ll be summarily dismissed by a house of Congress, the White House, or non-US equivalents.
I don’t think “race science” has any place in EA spaces, by the way.
Agree with this. If anything, we should de-politicize issues. Take climate change, for example: it is heavily politicized. But EA is not left-wing because 80,000 Hours acknowledges the severity and reality of climate change; that acknowledgment is simply very likely to be true. And if truth happens to be more frequent in left-wing perspectives, then so be it.
I agree with you that EA shouldn’t be prevented from adopting effective positions just because of a perception of partisanship. However, there’s a nontrivial cost to doing so: the encouragement of political sameness within EA, and the discouragement of individuals or policymakers with political differences from joining EA or supporting EA objectives.
This cost, if realized, could undermine many of this post’s objectives:
We must temper our knee-jerk reactions against deep critiques, and be curious about our emotional reactions to arguments – “Why does this person disagree with me? Why am I so instinctively dismissive about what they have to say?”
We must be willing to accept the possibility that “big” things may need to be fixed and that some of our closely-held beliefs are misguided
EAs should make a point of engaging with and listening to EAs from underrepresented disciplines and backgrounds, as well as those with heterodox/“heretical” views
EAs should consider how our shared modes of thought may subconsciously affect our views of the world – what blindspots and biases might we have created for ourselves?
EA institutions should select for diversity
Along lines of:
Philosophical and political beliefs
It also plausibly increases x-risk. If EA becomes known as an effectiveness-oriented wing of a particular political party, the perception of EA policies as partisan could embolden strong resistance from the other political party. Imagine how much progress we could have had on climate change if it wasn’t a partisan issue. Now imagine it’s 2040, the political party EA affiliates with is urgently pleading for AI safety legislation and a framework for working with China on reducing x-risk, and the other party stands firmly opposed because “these out-of-touch elitist San Francisco liberals think the world’s gonna end, and want to collaborate with the Chinese!”
I agree that EA should be accepting of a wide range of political opinions (although highly extreme and hateful views should still be shunned).
I don’t think the suggestions there are necessarily at odds with that, though. For example, increasing demographic diversity is probably going to increase political diversity as well, because people from extremely similar backgrounds have fairly similar politics. If you expand to people from rural backgrounds, you’re more likely to get a country conservative; if you encourage more women, you’re more likely to get feminists; if you encourage people from Ghana, you’ll get whole new political ideologies nobody in Silicon Valley has even heard of. The politics of nerdy white men like me represent a very tiny fraction of the overall political beliefs that exist in the world.
When it comes to extreme views, it’s worth noting that what counts as extreme depends a lot on the context.
A view like “homosexuality should be criminalized” is extreme in Silicon Valley but not in Uganda, where it’s a mainstream political opinion. In my time as a forum moderator, I had to deal with a user from Uganda voicing those views, and in cases like that you have to make a choice about how inclusive you want to be of people expressing very different political ideologies.
In many cases where the political views of people in Ghana or Uganda substantially differ from those common in the US, they are going to be perceived as highly extreme.
The idea that you can be accepting of the political ideologies of a place like Ghana, where the political discussion is “Yes, we have already forbidden homosexuality, but the punishment seems too low to discourage that behavior” versus “The current laws against homosexuality are enough”, while at the same time shunning highly extreme views, seems very unrealistic to me.
You might find people who are from Ghana and who have adopted woke values, but those people aren’t giving you deep diversity in political viewpoints.
For all the talk about decolonization, Silicon Valley liberals always seem very eager to deny people from Ghana or Uganda the ability to express mainstream political opinions from their home countries.
While on its face increasing demographic diversity seems like it would result in an increase in political diversity, I don’t think that is actually true.
This rests on several assumptions:
I am looking through the lens of U.S. domestic politics, and identifying political diversity by having representation of America’s two largest political parties.
Increases in diversity will not be evenly distributed across the American population. (White Evangelicals are not being targeted in a diversity push; we would instead expect the addition of college-grad+ women and BIPOC.)
Of all demographic groups, white college-grad+ men, “Sams,” are the most politically diverse group, at 48 D, 46 R. By contrast, the groups typically understood to be represented by increased diversity:
College-grad+ women: 65 D, 30 R
It is difficult to find a BIPOC breakdown by education level, but assuming that increased education widens the Democratic disparity, these figures serve as useful lower bounds:
Black: 83 D, 10 R
Hispanic: 63 D, 29 R
Asian American: 72 D, 17 R
While I would caution against partisanship in the evaluation of ideas and programs, I don’t think there’s anything inherently wrong in a movement having a partisan lean to its membership. A climate change activist group can work in a non-partisan manner, but the logical consequence of their membership will be primarily Democratic voters, because that party appeals to their important issue.
if you encourage people from Ghana, you’ll get whole new political ideologies nobody in Silicon Valley has even heard of.
I think this aspect of diversity would offer real value in terms of political diversity, and could potentially add value to EA. I think clarification of what it means to “increase diversity” is required to assess the utility. I am biased by my experience, in which organizations become more “diverse” in skin color while becoming more culturally and politically homogeneous.
https://www.pewresearch.org/politics/2020/06/02/democratic-edge-in-party-identification-narrows-slightly/
Reducing “political diversity” down to the binary question of “which American political party do they vote for” is a gross simplification. For example, while Black people are more likely to vote Democratic, a Black Democrat is half as likely as a white Democrat to identify as “liberal”. This is because there are multiple political axes and multiple political issues to consider, starting with the standard economic-versus-social political compass model.
This definitely becomes clearest when we escape a narrow focus on elite college graduates in the US and look at people from different nations entirely. You will have an easier time finding a Maoist in China than in Texas, for example. They might vote D in the US as a result of perceiving the party as less anti-immigrant, but they’re not the same as a white D voter from the suburbs.
As for your experiences where political and ethnic diversity were anti-correlated: did the organisation make any effort on other aspects of diversity beyond skin colour, or did they just, say, swap out a couple of MIT grads of one race for a couple of MIT grads of a different race? Given that you say the culture didn’t change either, the latter seems likely.
I agree with you that many of the broad suggestions can be read that way. However, when the post suggests which concrete groups EA should target for the sake of philosophical and political diversity, they all seem to line up on one particular side of the aisle:
EAs should increase their awareness of their own positionality and subjectivity, and pay far more attention to e.g. postcolonial critiques of western academia
What politics are postcolonial critics of Western academia likely to have?
EAs should study other ways of knowing, taking inspiration from a range of academic and professional communities as well as indigenous worldviews
EA institutions and community-builders should promote diversity and inclusion more, including funding projects targeted at traditionally underrepresented groups
When the term “traditionally underrepresented groups” is used, does it typically refer to rural conservatives, or to other groups? What politics are these other groups likely to have?
As you pointed out, this post’s suggestions could be read as encouraging universal diversity, and I agree that the authors would likely endorse your explanation of the consequences of that. But I also don’t think it’s unreasonable to say that this post is coded with a political lean, and that many of its suggestions can reasonably be read as nudging EA towards that lean.
Hmmm, a few of these don’t sound like common left-wing thought (I hope democracy isn’t a left-wing value now), but I agree with the sentiment of your point.
I guess some of the co-writers lean towards identitarian left politics and want EA to be more in line with this (edit: although this political leaning shouldn’t invalidate the criticisms in the piece). One of the footnotes seems to signal their politics clearly, by linking to pieces with what I’d call a left-wing ‘hit piece’ framing:
“We should remember that EA is sometimes worryingly close to racist, misogynistic, and even fascist ideas. For instance, Scott Alexander, a blogger that is very popular within EA, and Caroline Ellison, a close associate of Sam Bankman-Fried, speak favourably about “human biodiversity”, which is the latest euphemism for “scientific” racism.”
Believing that democracy is a good way to run a country is a different view than believing that it’s an effective way to run an NGO. The idea that NGOs whose main funding comes from donors as opposed to membership dues should be run democratically seems like a fringe political idea and one that’s found in certain left-wing circles.
Well stated. This post’s heart is in the right place, and I think some of its proposals are non-accidentally correct. However, it seems that many of the post’s suggestions boil down to “dilute what it means to be EA to just being part of common left-wing thought”. Here’s a sampling of the post’s recommendations which provoke this:
EAs should increase their awareness of their own positionality and subjectivity, and pay far more attention to e.g. postcolonial critiques of western academia
EAs should study other ways of knowing, taking inspiration from a range of academic and professional communities as well as indigenous worldviews
EAs should not assume that we must attach a number to everything, and should be curious about why most academic and professional communities do not
EA institutions should select for diversity
Previous EA involvement should not be a necessary condition to apply for specific roles, and the job postings should not assume that all applicants will identify with the label “EA”
EA institutions should hire more people who have had little to no involvement with the EA community providing that they care about doing the most good
EA institutions and community-builders should promote diversity and inclusion more, including funding projects targeted at traditionally underrepresented groups
Speaker invitations for EA events should be broadened away from (high-ranking) EA insiders and towards, for instance:
Subject-matter experts from outside EA
Researchers, practitioners, and stakeholders from outside of our elite communities
For instance, we need a far greater input from people from Indigenous communities and the Global South
EAs should consider the impact of EA’s cultural, historical, and disciplinary roots on its paradigmatic methods, assumptions, and prioritisations
Funding bodies should within 6 months publish lists of sources they will not accept money from, regardless of legality
Tobacco?
Gambling?
Mass surveillance?
Arms manufacturing?
Cryptocurrency?
Fossil fuels?
Within 5 years, EA funding decisions should be made collectively
EA institutions should be democratised within 3 years, with strategic, funding, and hiring policy decisions being made via democratic processes rather than by the institute director or CEO
EAs should make an effort to become more aware of EA’s cultural links to eugenic, reactionary and right-wing accelerationist politics, and take steps to identify areas of overlap or inheritance in order to avoid indirectly supporting such views or inadvertently accepting their framings
I don’t think the point is that all of the proposals are inherently correct or should be implemented. I don’t agree with all of the suggestions (agree with quite a few, don’t agree with some others), but in the introduction to the ‘Suggested Reforms’ section they literally say:
Picking out in particular the parts you don’t agree with may seem almost like strawmanning in this case, and people might be reading the comments not the full thing (was very surprised by how long this was when I clicked on it, I don’t think I’ve seen an 84 minute forum post before). But I’m not claiming this was intentional on either of your parts.
If taking positions that are percieved as left wing makes EA more correct and more effective, then EA should still take up those positions. The OP has made great effort to justify these points from a practical position of pursuing truth, effectiveness, and altruism, and they should not be dismissed just because they happen to fall on one side of the political spectrum. Similarly, just because an action makes EA less distinct, it doesn’t mean it’s not the correct thing to do.
This is true, but to the extent that these changes would make EA look/act like already-existing actors, I think it is fair to consider (1) how effective the similiar actors are, and (2) the marginal benefit of having more muscle in or adjacent to the space those actors occupy.
Also, because I think a clear leftward drift would have significant costs, I also think identifying the drift and those costs is a fair critique. As you move closer to a political pole, the range of people who will want to engage with your movement is likely to dwindle. Most people don’t want to work in, or donate to, a movement that doesnt feel respecting toward them—which I think is a strong tendency of almost all political poles.
At present, I think you can be moderately conservative or at least centrist by US standards and find a role and a place where you feel like you fit in. I think giving that range up has significant costs.
I think a moderate leftward shift on certain issues would actually increase popularity. The current dominant politics of EA seems to be a kind of steven pinker style techno-liberalism, with a free speech absolutist stance and a vague unease about social justice activism. Whether or not you agree with this position, I think it’s popularity among the general public is fairly low, and a shift to mainstream liberal (or mainstream conservative) opinions would make EA more appealling overall. For example, a policy of banning all discussion of “race science” would in the long term probably bring in much more people than it deterred, because almost everybody finds discussing that topic unpleasant.
If your response to this is “wait, there are other principles at play that we need to take into consideration here, not just chasing what is popular”, then you understand the reasons why I don’t find ” these positions would make EA more left wing” to be a very strong argument against them. If following principles pushes EA one way or the other, then so be it.
Fwiw, I think your view that a leftward shift in EA would increase popularity is probably Americocentric. I doubt it is true if you were to consider EA as a global movement rather than just a western one.
Also, fwiw, I’ve lost track of how many people I’ve seen dismiss EA as “dumb left-wing social justice”. EAs also tend to think the consequence of saying something is what matters. So we tend to be disliked both by free speech absolutists and by people who will never concede that properly discussing some controversial topics might be more net positive than the harm caused by talking about them. Some also see EA as tech-phobic. Steven Pinker famously dismissed EA concerns about AI Alignment. If you spend time outside of EA in tech-optimism-liberal circles you see a clear divide. It isn’t culturally the same. Despite this, I think I’ve also lost count of how many people I’ve seen dismiss EA as ” right-leaning libertarian tech-utopia make-billionaires-rich nonsense”
We can’t please everyone and it is a fool’s errand to try.
One person’s “steven pinker style techno-liberalism, with a free speech absolutist stance and a vague unease about social justice activism” is another person’s “Ludite free speech blocking SJW”
If following principles does not clearly push EA one way or the other, also then so be it.
My point was more that theres a larger audience for picking one side of the political spectrum than there is for awkwardly positioning yourself in the middle in a way that annoys both sides. I think this holds for other countries as well, but of course the political battles are different. If you wanted to appeal more to western europe you’d go left, to eastern europe you’d go right, to China you’d go some weird combination of left and right, etc.
Really, I’m making the same point as you: chasing popularity at the expense of principles is a fools errand.
I think there’s a difference between “people in EA tend to have X, Y, and Z views” and those views being actively promoted by major orgs (which is the most natural reading of the proposal to me). Also, although free speech absolutism may not be popular in toto, most points on the US political spectrum at least find some common ground with that stance (they will agree on the outcome for certain forms of controversial speech).
I also think it likely that EA will need significant cooperation from the political system on certain things, particularly involving x-risk, and that becoming strongly left-identified sharply increases the risk you’ll be summarily dismissed by a house of Congress, the White House, or non-US equivalents.
I don’t think “race science” has any place in EA spaces, by the way.
Agree with this. We should de-politicize issues, if anything. Take Climate Change for example. Heavily politicized. But EA is not left wing because 80k hours acknowledges the severity and reality of CC—it is simply very likely to be true. And if truth happens to be more frequent in left wing perspectives then so be it.
I agree with you that EA shouldn’t be prevented from adopting effective positions just because of a perception of partisanship. However, there’s a nontrivial cost to doing so: the encouragement of political sameness within EA, and the discouragement of individuals or policymakers with political differences from joining EA or supporting EA objectives.
This cost, if realized, could fall against many of this post’s objectives:
We must temper our knee-jerk reactions against deep critiques, and be curious about our emotional reactions to arguments – “Why does this person disagree with me? Why am I so instinctively dismissive about what they have to say?”
We must be willing to accept the possibility that “big” things may need to be fixed and that some of our closely-held beliefs are misguided
EAs should make a point of engaging with and listening to EAs from underrepresented disciplines and backgrounds, as well as those with heterodox/“heretical” views
EAs should consider how our shared modes of thought may subconsciously affect our views of the world – what blindspots and biases might we have created for ourselves?
EA institutions should select for diversity
Along lines of:
Philosophical and political beliefs
It also plausibly increases x-risk. If EA becomes known as an effectiveness-oriented wing of a particular political party, the perception of EA policies as partisan could embolden strong resistance from the other political party. Imagine how much progress we could have had on climate change if it wasn’t a partisan issue. Now imagine it’s 2040, the political party EA affiliates with is urgently pleading for AI safety legislation and a framework for working with China on reducing x-risk, and the other party stands firmly opposed because “these out-of-touch elitist San Francisco liberals think the world’s gonna end, and want to collaborate with the Chinese!”
I agree that EA should be accepting of a wide range of political opinions (although highly extreme and hateful views should still be shunned).
I don’t think the suggestions there are necessarily at odds with that, though. For example, increasing demographic diversity is probably going to increase political diversity as well, because people from extremely similar backgrounds have fairly similar politics. If you expand to people from rural background, you’re more likely to get a country conservative, if you encourage more women, you’re more likely to get feminists, if you encourage people from Ghana, you’ll get whole new political ideologies nobody at silicon valley has even heard of. The politics of nerdy white men like me represent a very tiny fraction of the overall political beliefs that exist in the world.
When it comes to extreme views it’s worth noting that what’s extreme depends a lot of the context.
A view like “homosexuality should be criminalized” is extreme in Silicon Valley but not in Uganda where it’s a mainstream political opinion. In my time as a forum moderator, I had to deal with a user from Uganda voicing those views and in cases, like that you have to make choice about how inclusive you want to be of people expressing very different political ideologies.
In many cases, where the political views of people in Ghana or Uganda substantially differ from those common in the US they are going to be perceived as highly extreme.
The idea, that you can be accepting of political ideologies of a place like Ghana where the political discussion is about “Yes, we already have forbidden homosexuality but the punishment seems to low to discourage that behavior” vs. “The current laws against homosexuality are enough” while at the same time shunning highly extreme views, seems to me very unrealistic.
You might find people who are from Ghana and who adopted woke values, but those aren’t giving you deep diversity in political viewpoints.
For all the talk about decolonization, Silicon Valley liberals seem always very eager when it comes to denying people from Ghana or Uganda to express mainstream political opinions from their home countries.
While on it’s face, increasing demographic diversity seems like it would result in an increase in political diversity, I don’t think that is actually true.
This rests on several assumptions:
I am looking through the lens of U.S. domestic politics, and identifying political diversity by having representation of America’s two largest political parties.
Increases in diversity will not be evenly distributed across the American population. (White Evangelicals are not being targeted in a diversity push; we would expect the addition of college-grad+ women and BIPOC.)
Of all demographic groups, white college-grad+ men, “Sams,” are the most politically diverse, at 48 D, 46 R. By contrast, the groups typically understood to be represented by increased diversity:
College-grad+ women: 65 D, 30 R
The data lack a BIPOC breakdown by education level, but assuming the trend of higher education producing a greater Democratic lean holds, these figures are useful lower bounds:
Black: 83 D, 10 R
Hispanic: 63 D, 29 R
Asian American: 72 D, 17 R
While I would caution against partisanship in the evaluation of ideas and programs, I don’t think there’s anything inherently wrong in a movement having a partisan lean to its membership. A climate change activist group can work in a non-partisan manner, but the logical consequence of their membership will be primarily Democratic voters, because that party appeals to them on their most important issue.
I think this aspect of diversity would offer real value in terms of political diversity, and could potentially add value to EA. But clarification of what it means to “increase diversity” is required to assess the utility. I am biased by my experience, in which organizations become more “diverse” in skin color while becoming more culturally and politically homogeneous.
Source: https://www.pewresearch.org/politics/2020/06/02/democratic-edge-in-party-identification-narrows-slightly/
Reducing “political diversity” down to the binary question of “which American political party do they vote for?” is a gross oversimplification. For example, while Black people are more likely to vote Democrat, a Black Democrat is half as likely as a white Democrat to identify as “liberal”. This is because there are multiple political axes and multiple political issues to consider, starting with the standard economic-vs-social political compass model.
This becomes clearest when we escape a narrow focus on elite college graduates in the US and look at people from different nations entirely. You will have an easier time finding a Maoist in China than in Texas, for example. They might vote D in the US as a result of perceiving the party as less anti-immigrant, but they’re not the same as a white D voter from the suburbs.
As for your experiences where political and ethnic diversity were anti-correlated: did the organisation make any effort on other aspects of diversity beyond skin colour, or did they just, say, swap out a couple of MIT grads of one race for a couple of MIT grads of a different race? Given that you say the culture didn’t change either, the latter seems likely.
I agree with you that many of the broad suggestions can be read that way. However, when the post suggests which concrete groups EA should target for the sake of philosophical and political diversity, they all seem to line up on one particular side of the aisle:
What politics are postcolonial critics of Western academia likely to have?
What politics are academics, professional communities, or indigenous Americans likely to have?
When the term “traditionally underrepresented groups” is used, does it typically refer to rural conservatives, or to other groups? What politics are these other groups likely to have?
As you pointed out, this post’s suggestions could be read as encouraging universal diversity, and I agree that the authors would likely endorse your explanation of the consequences of that. I also don’t think it’s unreasonable to say that this post is coded with a political lean, and that many of the post’s suggestions can be reasonably read as nudging EA towards that lean.
Hmmm, a few of these don’t sound like common left-wing thought (I hope democracy isn’t a left-wing value now), but I agree with the sentiment of your point.
I guess some of the co-writers lean towards identitarian left politics and want EA to be more in line with that (edit: although this political leaning shouldn’t invalidate the criticisms in the piece). One of the footnotes seems to signal their politics clearly, by linking to pieces with what I’d call a left-wing ‘hit piece’ framing:
“We should remember that EA is sometimes worryingly close to racist, misogynistic, and even fascist ideas. For instance, Scott Alexander, a blogger that is very popular within EA, and Caroline Ellison, a close associate of Sam Bankman-Fried, speak favourably about “human biodiversity”, which is the latest euphemism for “scientific” racism. ”
Believing that democracy is a good way to run a country is different from believing that it’s an effective way to run an NGO. The idea that NGOs whose main funding comes from donors, as opposed to membership dues, should be run democratically seems like a fringe political idea, one found mainly in certain left-wing circles.