Thanks as always for this valuable data!
Given that 80k is a large and growing source of people hearing about and getting involved in EA, some people reading this might worry that 80k will stop contributing to EA’s growth now that our strategic focus has shifted to helping people work on safely navigating the transition to a world with AGI.
tl;dr I don’t think it will stop, and it might continue as before, though it’s possible it will be reduced somewhat.
More:
I am not sure whether 80k’s contribution to building EA (in terms of sheer numbers of people getting involved) is likely to go down due to this focus, compared to what it would be if we simply continued to scale our programmes as they currently are without this change in direction.
My personal guess at this time is that it will reduce at least slightly.
Why would it?
We will be more focused on helping people work on helping AGI go well. That means that, e.g., university groups might be hesitant to recommend us to members who aren’t interested in AI safety as a cause area.
At a prosaic level, some projects that would have been particularly useful for building EA, versus helping with AGI in a more targeted way, are going to be de-prioritised. E.g. I personally dropped a project I’d begun of updating our “building EA” problem profile in order to focus more on AGI-targeted things.
Our framings will probably change. It’s possible that the framings we use more going forward will emphasise EA style thinking a little less than our current ones, though this is something we’re actively unsure of.
We might sometimes link off to the AI safety community in places where we might have linked off to EA before (though it is much less developed, so we’re not sure).
However, I do expect us to continue to significantly contribute to building EA – and we might even continue to do so at a similar level vs. before. This is for a few reasons:
We still think EA values are important, so still plan to talk about them a lot. E.g. we will talk about *why* we’re especially concerned about AGI using EA-style reasoning, emphasise the importance of impartiality and scope sensitivity, etc.
We don’t currently have any plans to reduce our links to the EA community – e.g. we don’t plan to stop linking to the EA Forum, or to stop using our newsletter to notify people about EAGs.
We still plan to list meta EA jobs on our job board, put advisees in touch with people from the EA community when it makes sense, and by default keep our library of content online.
We’re not sure whether, in terms of numbers, the changes we’re making will cause our audience to grow or shrink. On the one hand, it’s a narrower focus, so it will appeal less to people who aren’t interested in AI. On the other, we’re hoping to appeal more to AI-interested people, as well as older people, who might not have been as interested in our previous framings.
This will probably lead directly and indirectly to a big chunk of our audience continuing to get involved in EA due to engaging with us. This is valuable according to our new focus, because we think that getting involved in EA is often useful for being able to contribute positively to things going well with AGI.
To be clear, we also think EA growing is valuable for other reasons (we still think other cause areas matter, of course!). But it’s actually never been an organisational target[1] of ours to build EA (or at least it hasn’t been since I joined the org 5 years ago); growing EA has always been something we’ve caused as a side effect of helping people pursue high-impact careers (because, as above, we’ve long thought that getting involved in EA is one useful step for pursuing a high-impact career!).
Note on all the above: the implications of our new strategic focus for our programmes are still being worked out, so it’s possible that some of this will change.
Also relevant: FAQ on the relationship between 80k & EA (from 2023 but I still agree with it)
[1] Except to the extent that helping people into careers building EA constitutes helping them pursue a high-impact career – and it is one of many ways of doing that (along with all the other careers we recommend on the site, plus others). We do also sometimes use our impact on the growth of EA as one proxy for our total impact, because the data is available, because we think getting involved in EA is often a useful step to having an impactful career, and because it’s quite hard to gather data more directly on people we’ve helped pursue high-impact careers.
Anecdote: I’m one of those people. I’d say I’d barely heard of EA / basically didn’t know what it was before a friend who already knew of it suggested I come to an EA Global (I think at the time one got a free t-shirt for referring friends). We were both philosophy students and I studied ethics, so I think he thought I might be interested even though we’d never talked about EA.