They seem rather difficult to capture and evaluate at a high level of specificity. It’s unclear if attempting to better measure and quantify that portion of ROI was the best use of these orgs’ resources a year ago, or even now in a tighter funding picture.
Per the first linked source: “In the 2020 EA Survey, 21% of respondents reported that Giving What We Can was important for them getting involved in EA.” Doubtless the percentage would be higher for all effective-givingish organizations (especially if GiveWell were included, my own entry point). Even concluding that 2.5 percent of 21 percent of EA activity should be “credited” to grassroots effective giving would be pretty significant additional impact for the fairly low spend involved.
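The “2.5 percent of 21 percent” figure above can be sanity-checked with a one-line calculation. Note that the 21% comes from the 2020 EA Survey, while the 2.5% credit share is a hypothetical illustration, not a measured number:

```python
# Back-of-envelope check of the claim above. The 21% figure is from the
# 2020 EA Survey; the 2.5% "credit" share is a hypothetical assumption.
survey_share = 0.21   # respondents saying GWWC was important for getting involved in EA
credit_share = 0.025  # assumed fraction of that activity credited to grassroots effective giving

implied_share = survey_share * credit_share
print(f"Implied share of EA activity: {implied_share:.3%}")
# prints "Implied share of EA activity: 0.525%"
```

Even at roughly half a percent of all EA activity, the implied indirect impact is non-trivial relative to the low spend involved.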
I think there is a tension between:
- People getting involved with EA is a major driver of our impact.
- We do not measure how much we are responsible for people getting involved with EA.
These imply a major driver of impact is not being measured, which seems strange (especially for the larger effective giving organisations). Note that I am not suggesting investing tons of resources into quantifying indirect impact. Just asking a few questions once a year (e.g. “did you apply to any job thanks to becoming aware of EA via our effective giving organisation?”) would take little time, and would provide useful information.
Per the first linked source: “In the 2020 EA Survey, 21% of respondents reported that Giving What We Can was important for them getting involved in EA.”
I agree GWWC’s indirect impact has been quite important. I believe it would be important to study (in descending order of importance):
On the other hand, I would say “getting involved in EA” is a little too vague. I think effective giving can be considered being involved in EA, so, from what you quoted alone, it is unclear whether there have been indirect impacts besides donations.
Doubtless the percentage would be higher for all effective-givingish organizations (especially if GiveWell were included, my own entry point).
This is not obvious to me, because I think GWWC and GiveWell have much stronger ties to EA than the mean effective giving organisation. I also expect effective giving to disproportionately select for people who will end up engaged with neartermist interventions, which I think have very unclear impact.
Donor time/attention is a precious commodity to fundraisers, so I wouldn’t expect organizations to have expended much of it on this topic without a specific business justification. It’s plausible to me that the funders thought (and may still think) that each org’s easily quantifiable output was sufficient to fill its room for more funding, and that the orgs didn’t (and don’t) think more precise measurement of indirect impact would materially change org strategy (e.g., because those impacts are attainable by the org only as a byproduct of doing the org’s standard work).