Thanks for sharing this!

One large worry I have in evaluating GWWC's impact is that I'd expect the longer someone has been a GWWC member, the more likely they are to drift away and stop keeping their pledge, and people who aren't active anymore are hard to survey. I've skimmed through the documents trying to understand how you handled this, and found discussion of related issues in a few places:
Pledgers seem to be giving more on average each year after taking the Pledge, even when you include all the Pledgers for whom we don't have recorded donations after some point. We want to emphasise that this data surprised us and caused us to reevaluate a key assumption we had when we began our impact evaluation. Specifically, we went into this impact evaluation expecting to see some kind of decay per year of giving. In our 2015 impact evaluation, we assumed a decay of 5% (and even this was criticised for seeming optimistic compared to EA Survey data -- a criticism we agreed with at the time). Yet, what we in fact seem to be seeing is an increase in average giving per year since taking the Pledge, even when adjusting for inflation.
How does this handle members who aren't reporting any donations? How does reporting rate vary by tenure?
We ran one quick check: the average total 2021 donations on record for our full 250-person sample for the GWWC Pledge survey on reporting accuracy was $7,619, and among the 82 respondents it was $8,124. This may indicate some nonresponse bias, though a bit less than we had expected.
Was the $7,619 the average among members who recorded any donations, or counting ones who didn't record donations as having donated $0? What fraction of members in the 250-person sample recorded any donations?
We are not assuming perfect Pledge retention. Instead, we are aiming to extrapolate from what we have seen so far (which is a decline in the proportion of Pledgers who give, but an increase in the average amount given when they do).
Where does the decline in the proportion of people giving fit into the model?
Thanks for your questions Jeff!

To answer point by point:

How does [the evaluation's finding that Pledgers seem to be giving more on average each year after taking the Pledge] handle members who aren't reporting any donations?
The (tentative) finding that Pledgers' giving increases each year after taking the Pledge assumes that members who aren't reporting any donations are not donating. We include a table "Proportion of GWWC Pledgers who record any donations by Pledge year (per cohort)" on page 48. In sum: reporting declines in the years after the Pledge, but that decline seems to plateau at a reporting rate of ~30%.
Was the $7,619 the average among [the 250-person sample we used for the GWWC reporting accuracy survey] who recorded any donations, or counting ones who didn't record donations as having donated $0? What fraction of members in the 250-person sample recorded any donations?
The $7,619 figure is the average if you count those not recording a donation as having donated $0. Unfortunately, I don't have the fraction of the 250-person sample who recorded donations at all on hand. However, I can give an informed guess: the sample was a randomly selected group of people who had taken the GWWC Pledge before 2021, and eyeballing the table I linked above, ~40-50% of pre-2021 Pledgers record a donation each year.
Where does the decline in the proportion of people giving fit into the model?
The model does not directly incorporate the decrease in the proportion of people recording/giving, and neither does it directly incorporate the increase in donation sizes for people who record/give. The motivation here is that -- at least in the data so far -- we see these effects cancel out (indeed, we see that the increase in donation size slightly outweighs the decrease in recording rates -- but we're not sure that trend will persist). We go into much more depth on this in our appendix section "Why we did not assume a decay in the average amount given per year".
Thanks! I think your "Proportion of GWWC Pledgers who record any donations by Pledge year (per cohort)" link is pointing a bit too early in the doc, but I do see the table now, and it's good.
reporting declines in the years after the Pledge, but that decline seems to plateau at a reporting rate of ~30%
Here's a version of that table with lines colored by how many people there are in that cohort:
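(The chart itself isn't reproduced here. For anyone wanting to recreate it, a minimal sketch of the plotting -- not the original code. It assumes a hypothetical `rates` DataFrame with cohorts as rows, years since pledging as columns, and reporting rates as values, plus a hypothetical `cohort_sizes` Series; neither is the real GWWC data.)

```python
import matplotlib.pyplot as plt
from matplotlib import cm, colors

def plot_reporting_rates(rates, cohort_sizes):
    # Map each cohort's size onto a color scale.
    norm = colors.Normalize(cohort_sizes.min(), cohort_sizes.max())
    fig, ax = plt.subplots()
    for cohort, row in rates.iterrows():
        # One line per pledge-year cohort, colored by cohort size.
        ax.plot(row.index, row.values,
                color=cm.viridis(norm(cohort_sizes[cohort])))
    fig.colorbar(cm.ScalarMappable(norm=norm, cmap=cm.viridis),
                 ax=ax, label="cohort size (people)")
    ax.set_xlabel("years since pledging")
    ax.set_ylabel("proportion recording any donation")
    plt.show()
```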
It doesn't look like it stops at a reporting rate of 30%, and the more recent (high cohort size) lines are still decreasing at maybe 5% annually as they get close to 30%.
And here are the year/year decreases:
Looking at the chart, it's clear that decay slows down over time, and maybe it slows enough that it's fine to ignore it, but it doesn't look like it goes to zero. A cohort-size-weighted average year/year decay where we ignore the first six years (starting with 5y since pledging, and so ignoring all cohorts since 2015) is 2%.
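In code, the weighting works roughly like this (a sketch, not the original script; it assumes the same hypothetical `rates` and `cohort_sizes` as in the plotting sketch above):

```python
import numpy as np

def weighted_yoy_decay(rates, cohort_sizes, min_years=5):
    # Cohort-size-weighted average year/year decay in reporting rates,
    # counting only transitions starting at least `min_years` after pledging.
    decays, weights = [], []
    for cohort, row in rates.iterrows():
        row = row.dropna()
        for year in row.index:
            if (year - 1) in row.index and (year - 1) >= min_years:
                # Relative drop in reporting rate from one year to the next.
                decays.append(1 - row[year] / row[year - 1])
                weights.append(cohort_sizes[cohort])
    return np.average(decays, weights=weights)
```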
But that's probably too optimistic, since looking at more recent cohorts decay doesn't seem to be slowing, and the reason ignoring the first six years looks good is mostly that it drops those more recent cohorts.
Separately, I think it would be pretty reasonable to drop the pre-2011 reporting data. I think this probably represents something weird about starting up, like not collecting data thoroughly at first, and not about user behavior? I haven't done this in my analysis above, though, because with the cohort-size weighting it doesn't do very much.
Really appreciate this analysis, Jeff.

Point taken that there is no clear plateau at 30% -- it'll be interesting to see what future data shows.
Part of the reason we have less analysis of how reporting rates change over time is that we did not directly incorporate this rate of change into our model. For example, the table of reporting rates was primarily used in our evaluation to test a hypothesis for why we see an increase in average giving (even assuming that people who are not reporting are not giving at all). Our model does not assume reporting rates don't decline, nor does it assume the decline in reporting rates plateaus.
Instead, we investigated how average giving (which is the product of the reporting rate and the average amount given conditional on reporting) changes over time. We saw that the decline in reporting rates is (more than) compensated for by the increase in giving conditional on reporting. It could be that this will no longer remain true beyond a certain time horizon (though perhaps it will!), but there are other arguably conservative assumptions for these long time horizons (e.g., that giving stops at pension age and doesn't include any legacy giving). Some of these considerations come up as we discuss why we did not assume a decay in our influence, and in the limitations of our Pledge model (at the bottom of this section, right above this one).
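To make the cancellation concrete, here is a toy example with made-up numbers (not GWWC's figures): if the reporting rate falls from 50% to 45% while the average given conditional on reporting rises from $4,000 to $4,800, average giving per Pledger still rises.

```python
# Made-up illustrative numbers, not GWWC data.
years = {
    "year 5": {"reporting_rate": 0.50, "avg_if_reporting": 4000},
    "year 6": {"reporting_rate": 0.45, "avg_if_reporting": 4800},
}
for label, y in years.items():
    # Average giving per Pledger = reporting rate x average given by reporters.
    print(f"{label}: ${y['reporting_rate'] * y['avg_if_reporting']:,.0f}")
# year 5: $2,000; year 6: $2,160 -- average giving rises even though
# the reporting rate fell by five percentage points.
```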
On your final point:
Separately, I think it would be pretty reasonable to drop the pre-2011 reporting data. I think this probably represents something weird about starting up, like not collecting data thoroughly at first, and not about user behavior? I haven't done this in my analysis above, though, because with the cohort-size weighting it doesn't do very much.
Do you mean excluding it just for the purpose of analysing reporting rates over time? If so, that could well be right, and if we investigate this directly in future impact evaluations we'll need to look into what the quality/relevance of that data was and make a call here.
That makes sense, thanks! I think your text makes it sound like you disagree with the earlier attrition discussion, when actually it's that giving increasing over time makes up for the attrition?
just for the purpose of analysing reporting rates over time?
Sorry, yes. I think it's probably heavily underreported, since the very early reporting system was probably worse?
Ah, I can see what you mean regarding our text; I assume in this passage:
We want to emphasise that this data surprised us and caused us to reevaluate a key assumption we had when we began our impact evaluation. Specifically, we went into this impact evaluation expecting to see some kind of decay per year of giving. In our 2015 impact evaluation, we assumed a decay of 5% (and even this was criticised for seeming optimistic compared to EA Survey data -- a criticism we agreed with at the time). Yet, what we in fact seem to be seeing is an increase in average giving per year since taking the Pledge, even when adjusting for inflation.
What you say is right: we agree there seems to be a decay in fulfilment/reporting rates (which is what the earlier attrition discussion was mostly about), but we just add the additional observation that giving increasing over time makes up for this.
There is a sense in which we do disagree with that earlier discussion, which is that we think the kind of decay that would be relevant to modelling the value of the Pledge is the decay in average giving over time, and at least here, we do not see a decay. But we could've been clearer about this; at least on my reading, I think the paragraph I quoted above conflates different sorts of "decay".
I'd be interested in a chart similar to "Proportion of GWWC Pledgers who record any donations by Pledge year (per cohort)", but with 4 versions: (median / average donation in $) x (inclusive / exclusive of those that didn't record data, assuming no record is $0). From the data it seems that both things are true: "most people give less over time and stop giving" and "on average, pledge donations increase over time", driven entirely by ~5-10% of extremely wealthy donors who increase their pledge.
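(Computing those four series would be straightforward; a minimal sketch, assuming a hypothetical long-format `donations` table with one row per Pledger per year and NaN amounts where nothing was recorded:)

```python
def four_series(donations):
    # donations: DataFrame with columns "cohort", "years_since_pledge",
    # and "amount" (NaN where no donation was recorded).
    by = ["cohort", "years_since_pledge"]
    recorded = donations.groupby(by)["amount"]  # NaNs excluded by default
    as_zero = donations.fillna({"amount": 0}).groupby(by)["amount"]
    return {
        "mean (excluding non-recorders)": recorded.mean(),
        "median (excluding non-recorders)": recorded.median(),
        "mean (no record = $0)": as_zero.mean(),
        "median (no record = $0)": as_zero.median(),
    }
```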