I thought it would be interesting to add uncertainty. If you have:
20K 40K # Mean annual salary 2025 pledgers
* 0.1 # 10% given
* beta 1 4 # counterfactual adjustment. Differs from post
* beta 5 5 # effectiveness adjustment
* 5 20 # discounted living lifespan
* 1.1 2 # reporting adjustment
* 800 2K # expected number of pledgers
* 1.2 1.5 # adjustment for largest donors
* beta 2 8 # more adjustments (the product of rows 27:37 is 0.18)
/ 209K # cost of GWWC
The result is a giving multiplier of 0.2 to 30.
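For anyone who wants to play with the numbers, here is roughly the same calculation as a Monte Carlo sketch in Python, treating each pair of numbers as the 90% confidence interval of a lognormal and each beta row as a Beta(a, b) draw:

```python
# Monte Carlo sketch of the model above. Assumptions: each "low high" pair is
# read as a lognormal with a 90% CI from low to high, and "beta a b" as a
# Beta(a, b) draw; the multiplier is the product of the rows over the cost.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

def lognormal_90ci(low, high, size):
    """Lognormal whose 5th and 95th percentiles are (low, high)."""
    z95 = 1.6448536269514722  # 95th percentile of the standard normal
    mu = (np.log(low) + np.log(high)) / 2
    sigma = (np.log(high) - np.log(low)) / (2 * z95)
    return rng.lognormal(mu, sigma, size)

multiplier = (
    lognormal_90ci(20_000, 40_000, N)   # mean annual salary, 2025 pledgers
    * 0.1                               # 10% given
    * rng.beta(1, 4, N)                 # counterfactual adjustment
    * rng.beta(5, 5, N)                 # effectiveness adjustment
    * lognormal_90ci(5, 20, N)          # discounted living lifespan
    * lognormal_90ci(1.1, 2, N)         # reporting adjustment
    * lognormal_90ci(800, 2_000, N)     # expected number of pledgers
    * lognormal_90ci(1.2, 1.5, N)       # adjustment for largest donors
    * rng.beta(2, 8, N)                 # further adjustments
) / 209_000                             # cost of GWWC

print(np.quantile(multiplier, [0.05, 0.5, 0.95]))  # 5th, 50th, 95th percentiles
```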
To me the key parameter is the counterfactuality of these donations. Your current number is 50%, but I’m not sure whether you are accounting for people being less able to do ambitious things because they have fewer savings.
To some extent you may also want to account generally for adjustments you haven’t thought of.
Hi Nuno,

We report a crude version of uncertainty intervals at the end of the report (p. 28): taking the lower-bound estimates of all the important variables, the multiplier would be 0x, while taking the upper-bound estimates, it would be 100x.
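To show mechanically what that crude procedure does, here is a toy sketch that pushes every factor to its lower (then upper) bound and multiplies; the bounds below are just illustrative 5th/95th percentiles taken from the distributions in your model above, not the per-variable bounds actually used in the report:

```python
# Toy version of the "all lower bounds vs. all upper bounds" interval, with
# illustrative per-factor bounds (NOT the report's actual bounds). Requiring
# every factor to sit at its bound simultaneously is what makes this interval
# much wider than a Monte Carlo one over the same distributions.
import numpy as np
from scipy import stats

bounds = [
    (20_000, 40_000),                                          # salary
    (0.1, 0.1),                                                # share given (point value)
    (stats.beta(1, 4).ppf(0.05), stats.beta(1, 4).ppf(0.95)),  # counterfactual
    (stats.beta(5, 5).ppf(0.05), stats.beta(5, 5).ppf(0.95)),  # effectiveness
    (5, 20),                                                   # giving lifespan
    (1.1, 2),                                                  # reporting
    (800, 2_000),                                              # number of pledgers
    (1.2, 1.5),                                                # largest donors
    (stats.beta(2, 8).ppf(0.05), stats.beta(2, 8).ppf(0.95)),  # further adjustments
]
cost = 209_000

low = np.prod([lo for lo, _ in bounds]) / cost    # close to 0x
high = np.prod([hi for _, hi in bounds]) / cost   # hundreds of x for these bounds
print(low, high)
```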
In terms of miscellaneous adjustments, we made an attempt to be comprehensive; for example, we adjust for (a) expected prioritization of pledges over donations by GWWC in the future, (b) company pledgers, (c) post-retirement donations, (d) spillover effects on non-pledge donations, (e) indirect impact on the effective giving (EG) ecosystem (EG incubation, the EG Summit), (f) impact on the talent pipeline, (g) decline in the counterfactual due to the growth of EA (i.e. more people are likely to hear of effective giving regardless of GWWC), and (h) reduced political donations. The challenge is that many of these variables lack the data needed for quantification, and of course there may be additional important considerations we’ve not factored in.
That said, I’m not sure we would see a meaningful negative effect from people being less able to do ambitious things because of fewer savings. This is partly for effect-size reasons (10% isn’t much), and partly because you would theoretically also have people motivated by E2G to do very ambitious for-profit work when they otherwise would have done something less impactful but more subjectively fulfilling (e.g. traditional nonprofit roles). It feels like a just-so story either way, so I’m not certain that the best model would include such an adjustment in the absence of good data.
Yeah, possible. It’s just been on my mind since FTX.