I adjust upwards on EAs who haven’t come from excellent groups
I spend a substantial amount of my time interacting with community builders and doing things that look like community building.
It’s pretty hard to get a sense of someone’s values, epistemics, agency, etc. by looking at their CV. A lot of my impression of people who are fairly new to the community is based on a few fairly short conversations at events. I think this is true for many community builders.
I worry that some people were simply introduced to a good set of ideas first, and that others then use this as a proxy for how good their reasoning skills are. On the other hand, it’s pretty easy to be in an EA group where people haven’t thought hard about different cause areas/interventions/etc. and come away with the group’s average take, which isn’t very good, despite reasoning relatively well.
When I speak to EAs I haven’t met before, I try extra hard to get a sense of why they think X and how reasonable that take is given their environment. This sometimes means I am underwhelmed by people who come from excellent EA groups, and impressed by people who come from mediocre ones.
You end up winning more Caleb points if your previous EA environment was ‘bad’ in some sense, all else equal.
(I don’t defend here why I think the causal arrow mostly points from the quality of the EA environment to the quality of the EA; I may write something on this another time.)