Nice post, and I agree that we should avoid saying things that might make people feel unwelcome or uncomfortable based on their personal characteristics.
One thing I bristle at a bit: the exclusion caused by offhand comments or controversial posts is probably dwarfed, by orders of magnitude, by the exclusion caused by material constraints that prevent minorities (as well as the vast majority of whites) from contributing to the same degree in EA. If you look around at an EAG, you can pretty safely bet that attendees are not only in college or college-educated, but that their parents were as well. They probably have savings, either personally or through family they can rely on, that let them take risks for their personal ambitions, which for EAs are often choices that enable them to better the world. It kills me when I listen to podcasts and audiobooks noting that mornings are often the most productive part of the day, yet I, like the vast majority of people, must direct most of my most productive hours to a job that is not impactful rather than to the projects I think could profoundly better the world.
I realize this may be a less tractable issue than getting EAs to commit fewer microaggressions and write fewer controversial or offensive posts. But I think the EA community is grossly negligent with regard to what may be its most valuable resource: EAs themselves. Maybe another amnesty post should be about considering people as agents versus people as patients. The people I’m talking about (low- and middle-income people in rich and middle-income countries, many in lower-income countries; basically everyone not in the top global 0.5%) are mostly not good targets as moral patients. The very poorest people, farmed animals, and future people are probably much more fruitful targets for direct utility increases. But if these people are committed to using their minds and effort as EAs do, many of them may be excellent targets as agents. This point probably applies with even greater force to people in middle- and low-income countries, who are disproportionately likely to be people of color.
Anyway, apologies for the digressive response. I should probably just write the full amnesty post on the subject, with time I do not have because I work a full-time non-EA job and run a nonprofit.
I think that would be an interesting post, although the tractability part will be more difficult. The best idea I’ve come up with is some sort of salary-supplement and/or financial-backstop program for early-stage EAs from low- and middle-income backgrounds. That could mitigate the risk of losing excellent candidates who come to EA through the existing recruitment channels but lack the personal or family wealth to take the risks that higher-income people in high-wealth countries can take somewhat comfortably. This seems moderately tractable to me.
Radically expanding the universe of people who might be invested enough in EA to apply for jobs or funding may be a much costlier and more time-intensive task. To the extent EA is now funding-constrained, the “carrying capacity” of the EA ecosystem may be roughly the number of FTEs currently employed, except insofar as new earning-to-give or other donors appear. So broadening the pool might allow selection of more qualified people for existing FTEs, and might add somewhat more capacity by driving wages down, but I would expect the effects to be mostly incremental.
Although it’s uncomfortable to say, many current approaches to talent recruitment also double, to some extent, as fundraising development. Even if earning-to-give / entrepreneurship-to-give became less prominent in the years leading up to the FTX collapse, outreach at elite universities reaches more potential startup founders, Biglaw partners, and neurosurgeons who could contribute large sums than outreach in LMICs or at more solidly middle-class institutions would. So outreach to the soon-to-be-rich may, in a sense, “pay for itself.”
If your target population is graduates of elite colleges, you can potentially support ~50% of them in direct work if the other ~50% get high-paying jobs and donate a significant fraction of their income. That’s perhaps an order of magnitude better than many social movements’ recruitment strategies can manage. Even religious movements with relatively high commitment levels would have a hard time supporting more than 3-4% of supporters in direct / full-time ministry positions.
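The back-of-envelope arithmetic behind that ~50% figure can be sketched as follows. The salary and cost numbers here are purely illustrative assumptions, not figures from the discussion above:

```python
# Illustrative back-of-envelope: what fraction of a cohort can be
# supported in direct work if the remainder earn to give?
# All dollar figures below are hypothetical assumptions.

def supportable_fraction(donor_income: float, donation_rate: float,
                         direct_work_cost: float) -> float:
    """Fraction of the cohort that can do direct work, assuming each
    earner donates donation_rate of donor_income and each direct worker
    costs direct_work_cost per year (salary plus overhead)."""
    donation_per_donor = donor_income * donation_rate
    # If a fraction d do direct work, (1 - d) earn and donate. The budget
    # balances when (1 - d) * donation_per_donor = d * direct_work_cost,
    # which solves to:
    return donation_per_donor / (donation_per_donor + direct_work_cost)

# E.g., hypothetical elite-college earners on $300k donating a third,
# against a $100k/yr fully loaded cost per direct worker:
frac = supportable_fraction(300_000, 1/3, 100_000)
print(f"{frac:.0%}")  # 50%
```

The intuition is simply that the sustainable split depends on the ratio of donations per earner to cost per direct worker; a 1:1 ratio yields the 50/50 split, while lower donation rates or incomes pull the supportable fraction down quickly.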
I definitely agree that EtG is very much underrated. If you are looking to maximize your impact, you should be asking what you can bring to the table in terms of skills, knowledge, insight, etc. that money cannot buy, or can buy only with great difficulty or cost. That might be specialized research skills or knowledge, connections, influence, or idea development and cultivation. I am a bit skeptical that working for a high-impact org in a role whose skills are readily available on the general employment market is, in expectation, high impact. I may, however, be underestimating the importance of securing value alignment in such roles. If I could not see an opportunity in my career to build something money cannot buy, I would probably look at earning to give.
I agree that outreach is well directed at elite colleges. Students at these institutions are, all else being equal, more capable, better connected, and generally have more resources to deploy toward EA because they tend to come from wealthier backgrounds. But that audience might not be the best target for material support, because they may well already have the resources to make choices with their lives that can better help the world. The most promising EAs outside the elite are probably the best targets for material support, because their impact is quite likely to be severely curtailed by their economic and social circumstances. Rereading your third paragraph, I think we are largely in agreement.
CE/AIM just launched something like a founding-to-give incubation program; it will be interesting to see how that goes, who the participants end up being, etc.