How to fix EA “community building”

Today, I mentioned to someone that I tend to disagree with others on some aspects of EA community building, and they asked me to elaborate further. Here’s what I sent them, very quickly written and only lightly edited:
Hard to summarize quickly, but here’s some loose gesturing in the general direction:
We should stop thinking about “community building” and instead think about “talent development”. While building a community and culture is important and useful, the phrase “community building” sounds too inward-focused, as opposed to trying to get important things done in the world.
We should focus on the object level (what’s the probability of an extinction-level pandemic this century?) over social reality (what does Toby Ord think is the probability of an extinction-level pandemic this century?).
We should talk about AI alignment, but also broaden our horizons to causes that aren’t traditionally core to EA, to sharpen our reasoning skills and resist insularity. Example topics I think should be more present in talent development programs: optimal taxation, cybersecurity, global migration and open borders, 1DaySooner, etc.
Useful test: Does your talent development program make sense if EA didn’t exist? (I.e., is it helping people grow and do useful things, or is it just funnelling people according to shallow metrics?)
Based on personal experience and on observing others’ development, the same person can have a much higher or much lower impact depending on the cultural environment they’re embedded in and the incentives they perceive. Much of EA talent development should be about transmitting a particular culture that has produced impressive results in the past (and avoiding the cultural pitfalls responsible for some of the biggest fuckups of the last decade). Shaping culture is really important and hard to measure, so it will systematically be neglected by talent metrics; avoiding this pitfall requires constantly reminding yourself of that.
Much of the culture is shaped by incentives (such as funding, karma, event admissions, etc.). We should be really deliberate in how we set these incentives.
To be clear, are you saying your preference for the phrase ‘talent development’ over ‘community building’ is based on your concern that people hear ‘community building’ and think, ‘Oh, these people are more interested in investing in their community as an end in itself than they are in improving the world’?
I don’t know about Jonas, but I like this more from the self-directed perspective of “I am less likely to confuse myself about my own goals if I call it talent development.”
Thanks! So, to check I understand you, do you think when we engage in what we’ve traditionally called ‘community building’ we should basically just be doing talent development?
In other words, your theory of change for EA is talent development + direct work = arrival at our ultimate vision of a radically better world?[1]
E.g., a waypoint described by MacAskill as something like the below:
“(i) ending all obvious grievous contemporary harms, like war, violence and unnecessary suffering; (ii) reducing existential risk down to a very low level; (iii) securing a deliberative process for humanity as a whole, so that we make sufficient moral progress before embarking on potentially-irreversible actions like space settlement.”
I find it very interesting to think about the difference between what a talent development project would look like vs. a community-building project.
Personally, I think we need a far more comprehensive social change portfolio.
Yes, this.