I think the centrality of the EA Forum to the overall “EA project” has likely caused a lot of unintended consequences like this. Participating in the Forum is seen as a pretty important “badge” of belonging in EA, but participating in an internet forum is generally not the type of activity that appeals to everyone, much less an internet forum where posts are expected to be lengthy and footnoted.
Great post. I appreciate the framing around the real gaps in human capital. One additional concern I have is that aesthetics might play a counterproductive role in community building. For example, if EA aesthetics are most welcoming to people who are argumentative, perhaps even disagreeable, then the skill set of “one-on-one social skills and emotional intelligence” could be selected out (relatively speaking).
Just as a casual observation, I would much rather hire someone who had done a couple of years at McKinsey than someone coming straight out of undergrad with no work experience. So I’m not sure that diverting talented EAs from McKinsey (or similar) is necessarily best in the long run for expected impact. No EA organization can compete with the ability of McK to train up a new hire with a wide array of generally useful skills in a short amount of time.
Well, no one has the “real” answers to any of these questions, even the most EA of all EAs. The important thing is to be asking good questions in the first place. I think it’s both most truthful and most interpersonally effective to say something like “gee, I’ve never thought about that before. But here’s a question I would ask to get started. What do you think?”
I really liked CEA’s “tour of duty” framing for the recent hiring rounds! I thought that was a great signal to potential candidates of what they could expect in the job. I think employers should be more explicit with candidates about what they’re expecting from new hires in terms of tenure.
Conversely, I would encourage job applicants to be candid with employers about their tenure expectations for a role. For some roles, staying only 1-2 years is perfectly fine. For other roles, especially ones that require a lot of role-specific knowledge and training, that would be harmful to the organization. I also would ask candidates to introspect honestly—do they feel that a certain role is in any way “beneath them”? It can be damaging to morale to have a colleague who feels like the job is just a stepping stone to something else.
Strong +1 for squad goals :)
I just wanted to reinforce the point Benjamin made above about getting involved in the EA community. For example, if you apply for a job at an EA organization, they may request references from the EA community in addition to the standard references from your last job. Do you already have strong references from credible people in the EA community? If not, it would be worthwhile to do more networking. You may also need to build up an EA track record of your own—volunteer work, posts on the EA Forum, and so on.
Here’s one way to think about this. Getting a job at an EA organization can be like getting a job in the film industry. You’re trying to break into a “glamorous” industry. That is, some people consider these jobs “dream jobs”—they have an extremely compelling “X factor” that has nothing to do with how much the job pays. (In EA, the ‘glamour’ factor is the ability to have a really high-impact career, which is the central life goal of many EAs.) So you may need to network, volunteer for a while, etc. in order to break in.
Congratulations! I’m very excited about this project and I’m looking forward to following along. A couple of questions:
Do you plan to do any thinking / writing about why EAs might choose to prioritize climate giving? Climate seems to occupy a weird space in EA. It’s more “middle-termist” than short-termist or long-termist, so it doesn’t neatly fit into either of those camps. And climate affects animals, humans living today, and future humans, so it also doesn’t align neatly with those camps either. It’s a bit of an “all of the above” cause area. Any thoughts on this topic, or are you just planning to be a resource for those who already care about climate?
What is your plan for influencing donors / moving money? For context, GiveWell moved about $150M in 2019, and it took them about 12 years to reach that milestone. What are your plans for translating your research into dollars moved?
Thanks and congrats again.
Thanks for this post. In my view, one of the most important elements of the EA approach is the expanding moral circle. The persistence of systemic racism in the US (where I live) is compelling evidence that we have a long way to go in expanding the moral circle. Writ large, the US moral circle hasn’t even expanded enough to include people of different races within our own country. I think this inability to address systemic racism is an important problem that could have a negative effect on the trajectory of humanity. In the US, it’s obvious that systemic racism impedes our ability to self-govern responsibly. Racial animus motivates people toward political actions that are harmful and irrational. (For a recent example, look at the decision to further restrict H-1B visas.) Racism (which may manifest as xenophobia) also impedes our ability to coordinate internationally—which is pretty important for responding to existential risks. So I tend to think that the value of addressing systemic racism is probably undervalued by the average EA.
Editing to add another dimension in which I think systemic racism is probably undervalued. If you believe that positive qualities like intelligence are distributed evenly across different racial groups (and genders, etc.), then you would expect roles of leadership and influence to be distributed across those groups as well. To me, the fact that our leadership, etc. is unrepresentative is an indicator that we are not using our human capital most effectively. We are losing out on ideas, talents, productivity, etc. Efforts to increase representation (sometimes called diversity efforts) can help us make more efficient use of our human capital.
I also just think racial equality is a good in itself—but I’m not a strict consequentialist / utilitarian as many in EA are.
Agreed. And, it would be great to have a similar top-level post for the “new” GWWC once it launches describing what is in and out of scope. In particular, it would be helpful to know if GWWC is intended to be 1) an EA recruitment pipeline; 2) an end in itself, i.e., driving impact through donations; or both? It seems that charitable giving has fallen out of favor relative to changing careers as an impact lever since I pledged in 2015. I’m curious to know if the leaders of CEA / GWWC see its mission primarily as driving charitable giving or as recruiting new EAs.
Thanks for the question as it caused me to reflect. I think it is bad to intentionally misrepresent your views in order to appeal to a broader audience, with the express intention of changing their views once you have them listening to you and/or involved in your group. I don’t think this tactic necessarily becomes less bad based on the degree of misrepresentation involved. I would call this deceptive recruiting. It’s manipulative and violates trust. To be clear, I am not accusing anyone of actually doing this, but the idea seems to come up often when “outsiders” (for lack of a better term) are discussed.
I also think, at least in the past, the attitude towards climate work has been vaguely dismissive.
As somewhat of an outsider, this has always been my impression. For example, I expect that if I choose to work in climate, some EAs will infer that I have inferior critical thinking ability.
There’s something about the “gateway to EA” argument that is a bit off-putting. It sounds like “those folks don’t yet understand that only x-risks are important, but eventually we can show them the error of their ways.” I understand that this viewpoint makes sense if you are convinced that your own views are correct, but it strikes me as a bit patronizing. I’m not trying to pick on you in particular, but I see this viewpoint advanced fairly frequently so I wanted to comment on it.
Thanks for posting! I took an En-ROADS workshop with a trained facilitator in my local community and I thought it was extremely well done. The organization that built En-ROADS trains facilitators to then teach others about the tool (and about climate).
En-ROADS itself is an example of an intervention whose impact would be difficult to quantify. The goal is to educate as many people as possible about the fundamental dynamics of the climate problem, using well-designed interactive workshops/tools that are based on robust evidence. It seems like a good approach to me, but I don’t know if they can ever prove a positive impact on the climate problem. I sometimes wonder if a similar approach would be helpful for spreading the “EA gospel” to a wider audience.
In my view one of the most defining features of the EA community is that it makes most people who come into contact with it feel excluded and “less than,” on several dimensions. So it’s not just you!