I also agree that prima facie this strategic shift might seem worrying given that 80K has been the powerhouse of EA movement growth for many years.
That said, I share your view that growth via 80K might reduce less than one would naively expect. In addition to the reasons you give above, another consideration is our finding that a large percentage of people get into EA via ‘passive’ outreach (e.g. someone googles “ethical career” and finds the 80K website; for 80K specifically, about 50% of recruitment was ‘passive’), rather than active outreach, and it seems plausible that much of that could continue even after 80K’s strategic shift.
Our framings will probably change. It’s possible that the framings we use more going forward will emphasise EA style thinking a little less than our current ones, though this is something we’re actively unsure of.
As noted elsewhere, we plan to research this empirically. Fwiw, my guess is that broader EA messaging would be better (on average and when comparing the best messaging from each) at recruiting people to high levels of engagement in EA (this might differ when looking to recruit people directly into AI related roles), though with a lot of variance within both classes of message.
I’m not sure the “passive” finding should be that reassuring.
I’m imagining someone googling “ethical career” 2 years from now and finding 80k, noticing that almost every recent article, podcast, and promoted job is centred on AI, and concluding that EA is just an AI thing now. If AI-based careers aren’t a fit for them (whether by interest or skillset), they’ll just move on to somewhere else. Maybe they would have been a really good fit for an animal advocacy org, but if their first impressions don’t tell them that animal advocacy is still a large part of EA, they won’t know.
It could also be bad even for AI safety: There are plenty of people here who were initially skeptical of AI x-risk, but joined the movement because they liked the malaria nets stuff. Then over time and exposure they decided that the AI risk arguments made more sense than they initially thought, and started switching over. In hypothetical future 80k, where malaria nets are de-emphasised, that person may bounce off the movement instantly.
I’m imagining someone googling “ethical career” 2 years from now and finding 80k, noticing that almost every recent article, podcast, and promoted job is based around AI, and concluding that EA is just an AI thing now.
I definitely agree that would eventually become the case (over time, all the older non-AI articles will become out of date). I’m less sure it will be a big factor 2 years from now (though it depends on exactly how articles are arranged on the website, and so how salient it is that the non-AI articles are old).
It could also be bad even for AI safety: There are plenty of people here who were initially skeptical of AI x-risk, but joined the movement because they liked the malaria nets stuff. Then over time and exposure they decided that the AI risk arguments made more sense than they initially thought, and started switching over.
I also think this is true in general (I don’t have a strong view about the net balance in the case of 80K’s outreach specifically).
Previous analyses we conducted suggested that over half of Longtermists (~60%) previously prioritised a different cause and that this is consistent across time.
You can see the overall self-reported flows (in 2019) here.
@titotal I’m curious whether or to what extent we substantively disagree, so I’d be interested in what specific numbers you’d anticipate, if you’d be interested in sharing.
My guess is that we’ll most likely see <30% reduction in people first hearing about EA from 80K next time we run the survey (though this might be confounded if 80K don’t promote the EA Survey so much, so we’d need to control for that).
Obviously we can’t directly observe this counterfactual, but I’d guess that if a form of outreach that was 100% active shut down, we’d observe close to a 100% reduction (e.g. if everyone stopped running EA Groups or EAGs, we’d soon see ~0% people hearing about EA from these sources).[1]
I don’t say strictly 0% only because I think there’s always the possibility for a few unusual cases, e.g. someone is googling how to do good and happens across an old post about EAG or their inactive local group.
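The back-of-envelope behind these two predictions can be sketched as a weighted sum across channels. This is purely illustrative: only the ~50% passive share comes from the survey finding above, and the retention rates below are made-up assumptions chosen to show how a <30% overall reduction and a ~100% reduction for a fully active channel can both fall out of the same formula.

```python
# Hypothetical back-of-envelope: expected reduction in "first heard of EA
# via 80K" after the strategic shift. Retention rates are illustrative
# assumptions, not survey results.

def expected_reduction(passive_share, passive_retained, active_retained):
    """Overall drop, as a weighted sum over passive and active channels."""
    active_share = 1.0 - passive_share
    retained = passive_share * passive_retained + active_share * active_retained
    return 1.0 - retained

# ~50% of 80K recruitment was passive (survey finding above). If most
# passive recruitment continues (say 95%) and active recruitment halves,
# the overall reduction comes out just under 30%:
print(f"{expected_reduction(0.5, 0.95, 0.5):.1%}")  # 27.5%

# A channel that is 100% active and shuts down entirely gives a ~100%
# reduction, matching the EA Groups / EAGs counterfactual:
print(f"{expected_reduction(0.0, 0.0, 0.0):.0%}")  # 100%
```

The point of the sketch is just that a large passive share caps how far the headline number can fall, whatever happens to active outreach.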
Over half of longtermists starting on something else is kind of insane. Although, given the current landscape, I suspect many of those, if they entered now, would have entered directly into longtermism. Looking forward to seeing the data unfold!