This is a great and well-researched question; thanks for asking!
First: I think my personal opinion here might matter less than you would imagine, since I am intentionally trying not to push for any major strategic changes to CEA while I am in the interim role.
That being said:
I agree that FTX causes the drop to be overstated. (And more importantly, Angelina, who collected this information, also agrees.)
Historically, CEA has targeted top-of-funnel growth of 30%, which, if you take these metrics at face value, means we are perfectly on target.
My personal low-confidence guess is that our target growth rate should be a bit higher than it was historically, because:
CEA now has two full-time communications people, an increase of infinity percent over the zero people we had last year
EA (and particularly AI safety) has become mainstream/professionalized (more people know what the causes are, why they are important, stable funding is available, etc.), which allows us to attract a broader demographic, specifically older/more senior people
Some details that seem worth considering:
Will is planning to "distance myself from the idea that I'm 'the face of' or 'the spokesperson for' EA". I have not heard a compelling suggestion for an alternative person to replace him. (E.g. there are a few people writing books or doing interviews, but my sense is that none of them are likely to be cited as a major place people first heard about EA in the next EA survey.)
Peter Singer is doing a tour. I'm interested in seeing if we can rope him into doing some stuff; he has historically been very successful at top-of-funnel outreach.
Some cruxes that would change my mind:
More employers (or things further down in the funnel) saying stuff like "the difference between my top candidates is really small; getting more people into the hiring pool doesn't seem very useful." I hear this occasionally, but not enough to make me currently worry about it too much.
More stories of harm from people getting involved in EA and then bouncing. I tried to do some investigation into this in the past, and it's obviously by definition a hard population to interview, but my sense is that substantial harms are relatively rare.
More evidence behind the "kids these days aren't as good as the ones in my day were" complaints that I've heard every year since I've been in EA. I do worry that we might accidentally be losing the aspects of EA that are important by having too many new people come in without acculturation, but I haven't seen any persuasive arguments that this is actually happening.
Can you say more about what investigation you did?
Oops, that was supposed to link to this sequence, updated now. (That sequence isn't a complete list of everything that I and others at CEA have done, but it's the best I know of.)
People who are substantially harmed by a movement typically don't tell the community builders of that movement that they're leaving because they were substantially harmed. They give some other, less vulnerable reason, such as "lack of culture fit or interpersonal conflict" or "burnout/mental health", two of the major factors cited in the linked sequence as reasons people leave.
Agreed, it feels really hard to get clear data on this; I would be excited for other people to research and share what they can find.
Thanks Ben, I appreciate this detailed response!
Re: 2, very helpful to know CEA's top-of-funnel target. To the best of my knowledge, this hasn't been shared before. Are there also targets for middle- and bottom-of-funnel growth, and if so, would you mind sharing those?
Re: 3, I agree that both of your points suggest raising the target might make sense. But in the other direction, all else equal, we should expect growth rates to slow over time (30% annual growth obviously isn't sustainable in perpetuity).
Re: 4, I would VERY much like to see EA develop growth channels that aren't dependent on a public figure (particularly a philosopher) releasing a book, going on a tour, publicizing his multibillion-dollar crypto exchange, etc. More organic channels (e.g. campus outreach) seem more sustainable, more scalable, and less prone to the hero worship that often seems to be found in EA.