Hi Matt,
It’s cool you did that, though I wouldn’t recommend simply combining all the samples, since they’re for really different groups at very different levels of engagement (which leads to predictably very different dropout rates).
A quick improvement would be to split the samples into a highly engaged group and a broader group.
The highly engaged meta-analysis could include: Joey’s 50% donors; CEA weekend away highly engaged subset; 80k top plan changes; CEA early employees.
The broader meta-analysis could be based on: GWWC estimate; EA survey estimate; Joey’s 10% donors; CEA weekend away entire sample.
I’d be keen to see the results of this!
This is the reason for doing a random effects meta-analysis in the first place: the motivating assumption is that the populations across studies are very different and so are the underlying dropout rates (i.e. differing estimates are due not just to within-study variation but also to cross-study variation of the kind you describe).
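For concreteness, here is a minimal sketch of one standard way to do this kind of pooling (a DerSimonian-Laird random-effects model on logit-transformed dropout proportions). I won’t claim it matches the exact analysis above in every detail, and the counts in the example call are placeholders rather than the real survey numbers.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of dropout
# proportions on the logit scale. The counts in the example call are
# placeholders, not the actual survey numbers.
import numpy as np

def dl_random_effects(events, totals):
    """Pool dropout proportions with a DerSimonian-Laird random-effects model.

    events: number of people who dropped out in each study
    totals: sample size of each study
    Returns the pooled dropout proportion, its 95% CI, and tau^2.
    """
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)

    # Logit-transformed proportions and their approximate within-study variances
    p = events / totals
    y = np.log(p / (1 - p))
    v = 1 / events + 1 / (totals - events)

    # Fixed-effect weights and Cochran's Q
    w = 1 / v
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe) ** 2)
    k = len(y)

    # DerSimonian-Laird estimate of the between-study variance tau^2
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)

    # Random-effects weights and pooled estimate, back on the proportion scale
    w_re = 1 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    lo, hi = mu - 1.96 * se, mu + 1.96 * se
    expit = lambda x: 1 / (1 + np.exp(-x))
    return expit(mu), (expit(lo), expit(hi)), tau2

# Illustrative call with made-up counts, one entry per survey sample
pooled, ci, tau2 = dl_random_effects(events=[5, 12, 8, 20], totals=[40, 60, 35, 90])
print(f"pooled dropout: {pooled:.0%}, 95% CI {ci[0]:.0%} to {ci[1]:.0%}, tau^2 = {tau2:.2f}")
```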
Still, it was sloppy of me to describe 23% as the true estimate above; in a random-effects model there is no single true estimate. A better takeaway is that, within the scope of the kind of variation we see across these survey populations, we’d almost certainly expect to see dropout of less than 40%, regardless of engagement level. Perhaps straining the limits of the sample size, I ran the analysis again with an intercept for engagement: high engagement seems to be worth about a 21-percentage-point reduction in dropout likelihood on the 5-year frame.
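And a rough sketch of what that engagement term can look like as a meta-regression with a binary “highly engaged” moderator, again illustrative rather than the exact model I fit; a fuller version would re-estimate residual heterogeneity rather than reusing a single tau².

```python
# Rough sketch: meta-regression of logit dropout on a binary "highly engaged"
# moderator, on the same logit scale as above. A fuller analysis would
# re-estimate residual heterogeneity; here tau2 is taken as given.
import numpy as np

def engagement_meta_regression(y, v, engaged, tau2):
    """Weighted least squares of logit dropout on an engagement indicator.

    y: logit-transformed dropout proportions, one per study
    v: within-study variances of y
    engaged: 1 for highly engaged samples, 0 for broader samples
    tau2: between-study variance (e.g. from a DerSimonian-Laird fit)
    Returns (intercept, engagement coefficient) on the logit scale.
    """
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones_like(y), np.asarray(engaged, dtype=float)])
    W = np.diag(1.0 / (np.asarray(v, dtype=float) + tau2))

    # Standard WLS solution: beta = (X'WX)^-1 X'Wy
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0], beta[1]

# Illustrative call with made-up inputs; the implied dropout rates for the two
# groups are expit(b0) for broader samples and expit(b0 + b1) for engaged ones,
# which is roughly where a percentage-point comparison like the one above comes from.
b0, b1 = engagement_meta_regression(
    y=[-0.4, -1.9, -0.8, -2.2], v=[0.10, 0.25, 0.12, 0.30],
    engaged=[0, 1, 0, 1], tau2=0.05)
expit = lambda x: 1.0 / (1.0 + np.exp(-x))
print(f"broader: {expit(b0):.0%}, highly engaged: {expit(b0 + b1):.0%}")
```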
>60% persistence in the community at large seems pretty remarkable to me. I understand that you haven’t been able to benchmark against similar communities, but my prior for dropout from youth movements (as I think EA qualifies) would be considerably higher. Do you have a reference class for the EA community in mind? If so, what’s in it?
Thank you, that’s helpful!
Do you mean 21 percentage points, so if the overall mean is 23%, then the most engaged are only at 2%? Or do you mean 21% lower in relative terms, in which case it’s about 18%?
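(Spelling out the arithmetic behind the two readings:)

```python
# The two readings of "21 lower", starting from a 23% pooled dropout rate
baseline = 0.23
absolute = baseline - 0.21          # 21 percentage points lower -> 2%
relative = baseline * (1 - 0.21)    # 21% lower in relative terms -> ~18%
print(f"{absolute:.0%} vs. {relative:.0%}")  # 2% vs. 18%
```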
I’m not aware of a good reference class where we have data—I’d be keen to see more research into that.
It might be worth saying that doing something like taking the GWWC pledge is still a high level of engagement & commitment on the scale of things, and I would guess significantly higher than that of the typical young person who affiliates with a youth movement for a while.
(The mean and median age in EA are also both ~28 right now, so while still on the youthful side, it’s mainly not students or teenagers.)
The former! This is pretty sensitive to modeling choices: with a different specification, I get an engagement effect of 31 percentage points (38% vs. 7% dropout).
The modeling assumption made here is that engagement level shifts the whole distribution of dropout rates, which otherwise looks the same; I’m not sure that’s justifiable (it seems like it isn’t?), but the size of the data is constraining. I’d be curious to hear what someone with more meta-analysis experience has to say about this, but one way to approximate value drift via a diversity of measurements might be to pile more proxy measurements into the model (dropout rates, engagement reductions, and whatever else you can come up with) on the basis that they are all noisy measurements of value drift.
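To make that last idea concrete (and it really is only a sketch, not a worked-out model): if each proxy were standardized to a common value-drift scale, the measurements could be combined with precision weights, treating each as an independent noisy read on the same latent quantity. The numbers below are invented for illustration, and a real version would want a proper latent-variable model and some handling of correlated measurement error across proxies.

```python
# Crude sketch of the "many noisy proxies" idea: combine standardized proxy
# measurements (dropout, decline in engagement, lapsed donations, ...) with
# precision weights, treating each as an independent noisy read on the same
# latent value-drift quantity. All numbers are invented for illustration.
import numpy as np

def combine_proxies(estimates, std_errors):
    """Precision-weighted combination of standardized proxy measurements.

    estimates: each proxy's estimate of value drift on a common standardized scale
    std_errors: the standard error of each estimate on that scale
    Returns the combined estimate and its standard error.
    """
    est = np.asarray(estimates, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    w = 1.0 / se ** 2                        # precision weights
    combined = np.sum(w * est) / np.sum(w)
    combined_se = np.sqrt(1.0 / np.sum(w))
    return combined, combined_se

# e.g. z-scored dropout, z-scored drop in engagement, z-scored donation lapse
drift, drift_se = combine_proxies([0.4, 0.7, 0.2], [0.20, 0.30, 0.25])
print(f"combined drift index: {drift:.2f} +/- {drift_se:.2f}")
```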
I’d be super curious to know if the mean/median age of EA right now is a function of the people who got into it as undergrads or grad students several years ago and who have continued to be highly engaged over time. Not having been involved for that long, I have no idea whether that idea has anecdotal resonance.