Strong upvote for the topic and examples. This seems like a clear area for progress within EA, so long as we continue to evaluate different interventions using an impartial model. (More education experts within EA gives us a better chance of noticing opportunities there, but could also lead to education interventions getting promoted despite lower impact depending on the incentives and priorities of the experts.)
If our goal is to improve the world as much as possible (and it is), the world is a big place with a lot of different groups. There are a lot of levers we could be trying to pull, and while we have a pretty good idea of which levers tend to be more or less important, the world is changing all the time (and even now, if we did have perfect information, our lists of “top causes” would probably look quite a bit different). I’d love to see more people becoming experts on a particular “lever”, just in case.
The idea of having more connections to groups of people (vs. more access to causes, as I discussed above) is even more promising, though it’s important to build EA communities // EA presence within a community slowly and carefully, with respect toward community norms and ideas. I’ve occasionally seen this go badly (e.g. individuals and groups who market EA too aggressively to a new audience, accidentally burning bridges in the process).
Yeah, ideally I’d like people to invest significant time into genuinely being part of two communities, rather than just “marketing EA.” A good example of this would be EA Quakers. I’ve met a few who don’t explicitly talk about Quaker values at EA events or EA values at Quaker events, but they have a deep understanding of both perspectives.