Empirics
I think the key table is Table 6 (link HERE; I can't seem to put images in comments. @Luke, maybe add it to your post?)
Aside: However, I'm also curious about the overall effect of the information intervention in *itself*, without considering whether it is a downward or an upward update. I don't see that presented anywhere!
In the table above, the row I care most about is the second row, and within that:

- Column 2 (support for global redistribution), with a small and positive mean estimated effect, and
- Column 5 (actual choice of allocating one's own money to the poor Kenyan person), with an effect estimated near 0.
These are not 'statistically significant', but that doesn't mean they are zero; we just don't know.
We may be able to rule out _very large effects_, but the standard errors (in parentheses) seem moderately large. (Think +/- 1 standard error.) You cannot rule out a substantial positive effect of 'learning you are ranked higher globally …' on support for giving/redistributing globally.
There is still a substantial likelihood that updating beliefs about global ranking upwards increases support for these outcomes by maybe 10-25% of a standard deviation, which seems substantial to me.
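To illustrate this kind of back-of-the-envelope reasoning, here is a minimal sketch using a normal approximation to the sampling distribution. The point estimate and standard error below are made-up placeholders (a near-zero estimate with a moderately large SE), not the paper's actual numbers:

```python
from statistics import NormalDist

# Hypothetical values, in standard-deviation units of the outcome.
# Substitute the actual estimate and SE from Table 6.
point_estimate = 0.05
std_error = 0.10

# 95% confidence interval: estimate +/- 1.96 * SE
z = NormalDist().inv_cdf(0.975)
lo = point_estimate - z * std_error
hi = point_estimate + z * std_error
print(f"95% CI: [{lo:.2f}, {hi:.2f}]")

# Probability that the true effect exceeds 0.10 SD, under a normal
# approximation centered at the estimate. With these placeholder numbers,
# an effect that "seems substantial" is far from ruled out.
p_substantial = 1 - NormalDist(mu=point_estimate, sigma=std_error).cdf(0.10)
print(f"P(effect > 0.10 SD): {p_substantial:.2f}")
```

With these placeholder values, the 95% interval spans roughly -0.15 to +0.25 SD, and the (approximate) probability of an effect above 0.10 SD is around 0.3, which is the sense in which a 'non-significant' result leaves a substantial positive effect on the table.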
I suppose that EA giving orgs' target audiences tend to be left-of-center politically (in the German context? Not sure). For that group, the very important Column 4 estimate seems somewhat positive… and with a pretty large standard error.
Checking the EA Survey data (noting that these respondents are not the GWWC target audience, but just for reference): about 300 of the 400 Germans who respond to the politics question say they are either 'Left' or 'Center left', and under 10% identify as right, center, or center-right.