As far as I know, not all of the data you were asking about has been released, but some of it has and is in the impact assessment (notably, 1.7% is roughly the average annual rate of members leaving, and 4.7% the equivalent rate for members losing touch with GWWC).
Thanks Owen.

My understanding was that GWWC said 1.7% was the average annual rate of members who admitted they were leaving, with an additional 4.7% of members not responding for two years. Unfortunately, these are not quite the numbers we want, and they are likely to be too optimistic for a few reasons:
These numbers don’t include people who claim to be donating but are not. (Does GWWC attempt to verify donations?)
They also don’t include any sense of cohort differentiation—how do the early members compare to the later members? Is this number skewed by all of Dr Ord’s friends who joined early on?
Nor did they break down students donating 1% vs real people donating 10%. The latter are much more valuable, but I would also expect a significant number of people to drop out when they switch from the relatively undemanding student pledge to the full 10% pledge. As such, if the current membership is very student-heavy, high historical retention rates may not generalize once those students graduate and become full members.
How does the dropout curve change for an individual cohort? Do most of the people who leave do so in the early years of the pledge (good), or does the rate of drop-out increase over time (bad)?
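To make these cohort questions concrete, here is a minimal sketch of the per-cohort retention curves anyone could compute if anonymised join/drop years were released. Every record below is invented for illustration (this is not GWWC data, and a real analysis would want proper survival-analysis machinery rather than this simple counting):

```python
from collections import defaultdict

# Hypothetical anonymised records: (join_year, drop_year), with drop_year None
# for members still pledging. All values invented; not GWWC data.
members = [
    (2009, None), (2009, 2012), (2009, None),
    (2010, 2011), (2010, None), (2010, None), (2010, 2013),
    (2011, None), (2011, 2012), (2011, None),
]

CURRENT_YEAR = 2015

# Group drop years by joining cohort.
cohorts = defaultdict(list)
for join_year, drop_year in members:
    cohorts[join_year].append(drop_year)

# For each cohort, the fraction still pledging k years after joining.
for join_year in sorted(cohorts):
    drops = cohorts[join_year]
    n = len(drops)
    curve = []
    for k in range(1, CURRENT_YEAR - join_year + 1):
        retained = sum(1 for d in drops if d is None or d - join_year > k)
        curve.append(f"{retained / n:.2f}")
    print(join_year, curve)
```

Curves that flatten out after the first year or two would be the reassuring case; curves that keep steepening would be the worrying one.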
Aside from this optimistic bias, there are a few other reasons to want cohort data:
It would be nice to be able to reconcile these very low drop-out rates with the much higher ones GWWC published a few years ago here, which showed only 65-70% claimed retention after 0-2 years. Right now it seems hard to understand how these are consistent.
The 2-year grace period is very long. In finance we call a loan non-performing if it is 90 days behind! Worse, GWWC is young and has seen exponential growth, so a 2-year waiting period means no data on anyone who joined since June 2013.
A few people have expressed concerns to me, both publicly and privately, that GWWC has discovered some very negative facts about its membership. Standardized disclosure, in a form chosen by a third party, can go a long way towards dispelling these fears. This is why public companies have to report in accordance with US GAAP, rather than getting to choose their own metrics.
Finally, it would be nice to have some of the mundane technical details as well: how was this average calculated, and so on. With cohort data we wouldn’t need to speculate and argue about the virtues of different ways of computing hazard rates; we could just do our own calculations.
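As a toy illustration of why the computation method matters (all numbers invented; we don’t know which method GWWC actually used): with exponential growth, a pooled person-year average and a simple mean of the annual rates can disagree substantially.

```python
# Two ways to turn per-year dropout counts into an "average annual rate".
# All numbers invented for illustration; not GWWC figures.
members_at_start = [100, 300, 900]  # members at the start of each year
dropouts         = [1, 3, 45]       # members who left during that year

# Method 1: pool all person-years together (recent, larger years dominate).
pooled = sum(dropouts) / sum(members_at_start)

# Method 2: unweighted mean of each year's rate (every year counts equally).
annual_rates = [d / n for d, n in zip(dropouts, members_at_start)]
unweighted = sum(annual_rates) / len(annual_rates)

print(f"pooled person-year rate:  {pooled:.2%}")      # 3.77%
print(f"mean of the annual rates: {unweighted:.2%}")  # 2.33%
```

With raw cohort data, anyone could compute both (and proper hazard rates besides) and see which is driving the headline figure.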
Agree that the cohort data looks helpful for the reasons you mention (presuming a careful look at any privacy issues with releasing it checked out). I’ll respond to a couple of the points:
“Does GWWC attempt to verify donations?”
I don’t think so, and I’d guess this is probably the right move despite the costs of lacking verification. Aside from the obvious costs of data gathering, I’m worried about the subtler cost of making giving under the pledge feel more like an obligation than an opportunity, turning people off and leading to fewer donations and fewer people encouraging others to join.
“It would be nice to be able to reconcile these very low drop-out rates with the much higher ones GWWC published a few years ago here, which showed only 65-70% claimed retention after 0-2 years. Right now it seems hard to understand how these are consistent.”
I think what’s going on here is that the 65-70% figures were mostly response rates on a survey of members. More recently GWWC has gone to rather more trouble to follow up with members who don’t initially answer the survey, in order to get better data (as it was unclear how many of the 30-35% non-respondents were still donating).
“The 2-year grace period is very long. In finance we call a loan non-performing if it is 90 days behind! Worse, GWWC is young and has seen exponential growth, so a 2-year waiting period means no data on anyone who joined since June 2013.”
I see the sense in which it feels long. On the other hand, as members are asked to respond annually, this amounts to “missed two cycles”. Counting someone as having dropped out after a single missed cycle seems like it might catch too many people. This one seems pretty thorny to me: more frequent data would be quite useful for analytics, but would on the other hand impose rather larger burdens on both GWWC staff and members. And to the extent that what we care about is long-term donation behaviour, we may just need to wait for the data.
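To make the definitional sensitivity concrete, here is a small sketch with invented response histories (not real member data) showing how much the measured drop-out rate can move between a one-missed-cycle and a two-missed-cycle definition:

```python
# Hypothetical annual survey response histories, oldest first, most recent last.
# True = responded, False = no response. Invented for illustration only.
histories = [
    (True, True, True),
    (True, True, False),
    (True, False, False),
    (True, True, True),
    (False, False, False),
    (True, True, False),
]

def dropped_out(history, grace_cycles):
    """Treat a member as dropped out if they missed the last `grace_cycles` surveys."""
    return not any(history[-grace_cycles:])

for grace in (1, 2):
    rate = sum(dropped_out(h, grace) for h in histories) / len(histories)
    print(f"grace of {grace} missed cycle(s): {rate:.0%} counted as dropped out")
```

On this toy data the one-cycle definition counts twice as many members as dropped out (67% vs 33%), which is exactly the “catch too many people” worry.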
Disclaimers: I work for the Global Priorities Project, part of the same umbrella organisation (CEA) as GWWC, and I was acting director of research at GWWC during 2014. I don’t know all the operational details, though.
Thanks for your interest, Dale, and thanks to Owen for responding so thoroughly.
I overall agree that there is a lot of information it would be nice to get at, although the numbers are somewhat small if we’re trying to work out the difference between, say, the 2009 cohort and the others (since there were only around 30 members in 2009). Now that I can spend less time on fundraising, I’ll try to put together a post about this.
Just to add a couple of points to what Owen has said: on the question of students moving into earning, that was definitely a transition where we worried people would drop out. The data doesn’t seem to suggest that’s the case, though; so far it seems that people do tend to actually start giving 10% when they start earning.
On verifying donations: we have in the past compared AMF’s data on donors with member self-reporting. While the self-reporting was far from perfect, people were at least as often under-reporting as over-reporting. (And most often discrepancies simply turned out to be mistakes about when the person had donated.) For the reasons Owen mentions, we did this once (to get a sense of whether reporting was approximately accurate), but we aren’t planning to do it again.
On the 2009 cohort: would it make sense to bucket it with the 2010 cohort? (That is, treating the first 14 months of GWWC as one cohort, and then yearly cohorts thereafter.)