Formerly: SoGive volunteer, RA to the Founders Pledge climate research and grant-making team, 80k recruiter.
My thesis here revolves around the overlap between tech and EA culture and how this shapes the demographics. We should expect higher rates of youth, whiteness, maleness, and willingness to move for high pay in the Bay Area because of the influx of people moving for tech jobs in the past 10 years. There could also be some kind of weird sexual competition exacerbated by scarcity.
Here are some other unusual things about the Bay Area which may contribute to the “vibes” mentioned:
- Founder effects: Bay Area EA organizations tend to be more focused on AI and therefore look to hire tech types, growing the presence of people who fit this demographic (these orgs could also have been founded in the Bay because of these demographics; it’s unclear to me which came first).
- Extremely high wealth inequality, and the correlation of wealth with other things EAs select for (e.g. educational attainment), likely mean that EA in the Bay selects much harder for wealth than it does elsewhere.
- Racism has a profound influence on US society. In my experience, people who are unfamiliar with both the history and modern-day effects of race in America (or who come from more homogeneous countries) are worse at creating welcoming spaces and seem to underappreciate the value of building diverse groups.
- There is a high prevalence and acceptance of hookup culture and casual sex.
- Broader society is highly tolerant of non-traditional relationships.
- The US is one of the most individualistic cultures in the world according to cultural psychology measures.
Overall, the Bay Area is quite unlike the rest of the world on most demographic criteria, and it’s plausible that different outreach strategies are needed there to find driven, altruistic people with a diversity of ideas and approaches to doing good.
For anyone interested, especially university students, here’s my (unsolicited) story of working at SoGive:
Two years out, my three main takeaways were probably (1) getting feedback on my writing and practice writing for EA contexts, (2) experience with charity evaluation, and (3) support in exploring topics of my own interest, plus (3.5) I really liked working with Sanjay.
I volunteered with SoGive during the last year of my bachelor’s and later went on to work as an RA for the Founders Pledge Climate Team. In the undergrad years before SoGive, I was an RA for an academic research lab at my university (sciences), had a campus job as a tour guide, held a leadership position in my student co-op, and did a data science internship.
Critically, the things I benefited from most while volunteering with SoGive were things my other roles didn’t provide. I think specializing in EA research too early probably isn’t a great long-term career move, and diversifying your extracurriculars to get a healthy mixture of community, fun, and targeted skill/career-capital building is really important for both well-being and intellectual growth. Because my university had strong research programs for undergraduates, academic labs were probably a more direct way of “testing my fit” for research, but I expect this won’t be the case for most students. This work was a good fit for me as an undergraduate, especially because it met criteria my other roles didn’t and provided mentorship from someone I respect (Sanjay).
TL;DR: I’d encourage interested students to check out this program and listen to Alex Lawsen’s 80k episode on advice for students.
I’m largely in agreement with other commenters, but see two points worth adding.
1. Project Drawdown’s list of solutions is prioritized only by emissions-avoidance potential, and this is not the only possible framework for prioritizing climate solutions. Trying to minimize climate damage rather than maximize emissions reductions may lead to a different ordering of solutions, likely one that better addresses the worst-case possibilities of climate change (e.g. the catastrophic or existential risks associated with very high warming, which EAs are especially likely to see as important).
2. Canada (and Western economies in general) accounts for a very small fraction of remaining emissions this century, so whether or not Canada meets its policy targets will have only a minor effect on the overall emissions trajectory. That’s a factor I’d be interested in seeing considered in future posts.
Thanks for taking the time to write up your thoughts.
Where can I find estimates of total “EA influenced” giving outside of foundations?
I want to get an idea of how much money (outside of foundation totals) has been influenced by EA. This would include EA-influenced individual donations (e.g. through GWWC, EA Funds, etc.) but NOT money from large donors like OpenPhil/GiveWell/FTX.
Recommended charities might have these estimates for their organizations individually, but does this data exist in aggregate?
I genuinely find this fascinating. I don’t think I’ve ever felt worried that expressing empathy would be used as a push for concessions, and I haven’t wanted to use it with that intent. I think your experience might be common, though, perhaps among men in particular, and we should talk about it more. Thanks for putting this out there.