I’d also submit that the relative impact of effective-giving organizations nearer the “grassroots” level will likely be underestimated by looking solely at money moved. For example, grassroots effective-giving campaigns provide people with accessible ways to take action, which itself can spur greater commitment and downstream positive actions that aren’t captured well by a money-moved analysis alone. In contrast, money moved likely does a better job capturing the bulk of the impact from UHNW outreach.
Agreed, Jason! On the other hand, I would expect effective giving organisations to be tracking such indirect impacts if they represented an important part of their theory of change and overall impact. My impression is that the 4 organisations I analysed are not assessing such indirect impacts much. They were also not covered in GWWC’s last impact analysis (see here).
They seem rather difficult to capture and evaluate at a high level of specificity. It’s unclear whether attempting to better measure and quantify that portion of ROI was the best use of these orgs’ resources a year ago, or even now in a tighter funding picture.
Per the first linked source: “In the 2020 EA Survey, 21% of respondents reported that Giving What We Can was important for them getting involved in EA.” Doubtless the percentage would be higher for all effective-givingish organizations (especially if GiveWell were included, my own entry point). Even concluding that 2.5 percent of 21 percent of EA activity should be “credited” to grassroots effective giving would be pretty significant additional impact for the fairly low spend involved.
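To spell out the arithmetic behind that last sentence, here is a minimal back-of-envelope sketch. The 21% is the survey figure quoted above; the 2.5% credit share is the illustrative number from the comment, not an empirical estimate; the variable names are mine.

```python
# Back-of-envelope attribution sketch (illustrative numbers only).
survey_share = 0.21   # share of 2020 EA Survey respondents citing GWWC as important to getting involved in EA
credit_share = 0.025  # illustrative fraction of that involvement credited to grassroots effective giving

credited_fraction = credit_share * survey_share
print(f"Share of EA activity credited to grassroots effective giving: {credited_fraction:.3%}")
# -> 0.525%
```

Even that small fraction could be meaningful relative to the comparatively low spend on grassroots outreach, which is the point being made here, though the dollar value of overall “EA activity” is left unspecified.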
They seem rather difficult to capture and evaluate at a high level of specificity. It’s unclear whether attempting to better measure and quantify that portion of ROI was the best use of these orgs’ resources a year ago, or even now in a tighter funding picture.
I think there is a tension between:
1. People getting involved with EA is a major driver of our impact.
2. We do not measure how much we are responsible for people getting involved with EA.
These imply a major driver of impact is not being measured, which seems strange (especially for the larger effective giving organisations). Note that I am not suggesting investing tons of resources into quantifying indirect impact. Just asking a few questions once a year (e.g. did you apply to any job thanks to becoming aware of EA via our effective giving organisation?) would take little time and provide useful information.
Per the first linked source: “In the 2020 EA Survey, 21% of respondents reported that Giving What We Can was important for them getting involved in EA.”
I agree GWWC’s indirect impact has been quite important:
I believe it would be important to study (by descending order of importance):
On the other hand, I would say “getting involved in EA” is a little too vague. I think effective giving can itself be considered a form of involvement in EA, so, from what you quoted alone, it is unclear whether there have been indirect impacts beyond donations.
Doubtless the percentage would be higher for all effective-givingish organizations (especially if GiveWell were included, my own entry point).
This is not obvious to me, because I think GWWC and GiveWell have much stronger ties to EA than the mean effective giving organisation. I also expect effective giving to disproportionately select for people who will end up engaged with neartermist interventions, which I think have very unclear impact.
Donor time/attention is a precious commodity to fundraisers, so I wouldn’t expect organizations to have expended much of it on this topic without a specific business justification. It’s plausible to me that the funders thought (and may still think) that each org’s easily quantifiable output was sufficient to fill its room for more funding, and that the orgs didn’t (and don’t) think more precise measurement of indirect impact would materially change org strategy (e.g., because those impacts are attainable by the org only as a byproduct of doing the org’s standard work).