One thing I think is often missing from these sorts of conversations is that “alignment with EA” and “alignment with my organization’s mission” are not the same thing! It’s a mistake to assume that the only people who understand and believe in your organization’s mission are members of the effective altruism community. EA ideas don’t have to come in a complete package. People can believe that one organization’s mission is really valuable and important, for different reasons, coming from totally different values, and without also believing that a bunch of other EA organizations are similarly valuable.
For “core EA” orgs like the Centre for Effective Altruism[1], there’s probably near-total overlap between these two things. But for lots of other organizations the overlap is only incidental, and what you should really be looking for is “alignment with my organization’s mission”. Perceived EA alignment is an unreliable proxy for that, while also being correlated with a bunch of other things like culture, thinking style, network, and socioeconomic status, each of which you either don’t care about or actively don’t want to be selecting for in the first place.
[1] But also see “Does participation in Effective Altruism predict success when applying to CEA?”
I think that is a good point, and I wish that we had included this in the post!
We approached this mainly from the perspective of community building work (Tatiana’s main work), which as a meta-EA job is probably the only type of work with such high overlap between “alignment with EA” and “alignment with my organization’s mission.” But you are correct: I can see how there would be a lot less overlap for an organization focused on a specific cause.