I have exactly the opposite intuition (which is why I've been using the term "EA-aligned organization" throughout my writing for CEA, and probably making it more popular in the process).
"EA-aligned organization" isn't supposed to mean "high-impact organization". It's supposed to mean "organization with some connection to the EA community, whether through its staff, EA funding networks, or something similar".
This is a useful concept because it's legible in a way impact often isn't. It's easy to tell whether an org has a grant from EA Funds/Open Phil, and while this doesn't guarantee their impact, it does stand in for "some people in the community vouch for their doing interesting work related to EA goals".
I really don't like the term "high-impact organization", because it does the same sneaky work as "effective altruist" (another term I dislike). You're defining yourself as "good" without anyone getting a chance to push back, and in many cases, there's no obvious way to check whether you're telling the truth.
Consider questions like these:
Is Amazon a high-impact organization? (80K lists jobs at Amazon on their job board, so… maybe? I guess certain jobs at Amazon are "high-impact", but which ones? Only the ones 80K posts?)
Is MIRI a high-impact organization? (God knows how much digital ink has been spilled on this one)
Is SCI a high-impact organization?
Is the Sunrise Movement a high-impact organization?
It seems like there's an important difference between MIRI and SCI on the one hand, and Amazon and Sunrise on the other. The first two have a long history of getting support, funding, and interest from people in the EA movement; they've given talks at EA Global. This doesn't necessarily make them more impactful than Amazon and Sunrise, but it does mean that working at one of those orgs puts you in the category of "working at an org endorsed by a bunch of people with common EA values".
*****
> The fact that people can say "I do ops at an EA org" and be warmly greeted as high status even if they could do much more good outside EA rubs me the wrong way.
I hope this doesn't happen very often; I'd prefer that we greet everyone with equal warmth and sincere interest in their work, as long as the work is interesting. Working at an EA-aligned org really shouldn't add much signaling info to the fact that someone has chosen to come to your EA meetup or whatever.
That said, I sympathize with theoretical objections like "how am I supposed to know whether someone would do more good in some other job?" and "I'm genuinely more interested in hearing about someone's work helping to run [insert org] than I would be if they worked in finance or something, because I'm familiar with that org and I think it does cool stuff".
Terms that seem to have some of the good properties of "EA-aligned" without running into the "assuming your own virtue" problem:
"Longtermist" (obviously not synonymous with "EA-aligned", but it accurately describes a subset of orgs within the movement)
"Impact-driven" or something like that (indicating a focus on impact without insisting that the focus has led to more impact)
"High-potential" or "promising" (indicating that they're pursuing a cause area that looks good by standard EA lights, without trying to assume success; still a bit self-promotional, though)
Actually referring to the literal work being done, e.g. "Malaria prevention org", "Alternative protein company"
...but when you get to the question of what links together orgs that work on malaria, alternative proteins, and longtermist research, I think "EA-aligned" is a more accurate and helpful descriptor than "high-impact".