Is the 10% Giving What We Can pledge, in which participants commit to donating 10% of their annual income to an effective charity, part of EA’s brand or reputation?
These questions seem empirically tractable through surveys and related experiments. It’s relatively straightforward to assess how many people familiar with EA associate it with the 10% pledge (the main challenge is that so few people have any familiarity with EA at all).
It would also be possible to assess how the pledge, or association with effective giving more broadly, influences the reputation of EA, e.g. by conducting experiments in which people are randomly presented with depictions of EA that include reference to the 10% pledge or to effective donations in general. This would also allow assessment of how these effects differ across groups. RP could conduct this kind of experiment, though it would need funding to do so.
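To make the proposed design concrete, here is a minimal sketch of that kind of between-subjects experiment. Everything in it is an illustrative assumption (the condition names, the response model, and the made-up effect size are not from any actual survey): respondents are randomly assigned to one of two depictions of EA, with or without the pledge mentioned, and rate their impression.

```python
import random
import statistics

random.seed(0)

# Hypothetical condition labels for the two depictions of EA.
CONDITIONS = ["ea_with_pledge", "ea_without_pledge"]

def simulate_rating(condition):
    # Placeholder response model: impressions on a 1-7 scale, with an
    # assumed (entirely made-up) small positive effect of mentioning
    # the pledge. A real study would collect these ratings from people.
    base = 4.0 + (0.3 if condition == "ea_with_pledge" else 0.0)
    return min(7, max(1, round(random.gauss(base, 1.5))))

def run_experiment(n=1000):
    ratings = {c: [] for c in CONDITIONS}
    for _ in range(n):
        condition = random.choice(CONDITIONS)  # random assignment
        ratings[condition].append(simulate_rating(condition))
    return {c: statistics.mean(r) for c, r in ratings.items()}

means = run_experiment()
# Difference in mean impression between the two depictions.
effect = means["ea_with_pledge"] - means["ea_without_pledge"]
```

Random assignment is what lets the difference in means be read causally; running the same design within demographic subgroups would give the cross-group comparisons mentioned above.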
As one additional note: first, thank you for linking to the survey about people’s familiarity with EA. Although I think it is probably useful evidence, and I am extremely supportive of attempts to gather such evidence in general, one of my immediate concerns is that the data was gathered in April 2022.
This means the results predate both Will MacAskill’s high-profile publicity tour for What We Owe The Future as well as the downfall of FTX. My guess is that the number of people who have heard of Effective Altruism has increased substantially since then. The New York Times has 8.6 million digital subscribers and has covered EA a decent amount over the last year (often negatively), although I am confident that only a fraction of its subscribers read these articles.
What we can learn from it is how EA was perceived prior to these two important signal-boosting and reputation-altering events.
One specific relevant point is the figure for how many people have heard of GWWC relative to other EA orgs: it is the second-most-recognized of the institutions they asked about, at 4.1% of respondents (vs. 7.8% for GiveWell, the most recognized organization).
I am not a professional pollster, so my ability to parse the results in a sophisticated way is limited. But I give some deference to the idea of the Lizardman Constant—the observation that a small fraction of respondents (on the order of 2-5%) will endorse just about anything in a poll, including the idea that Lizardmen rule the earth. As most of the results are roughly in this range, I have to treat them with moderate skepticism.
I think the numbers initially claiming to have heard of EA (19.1%) are strongly inflated by false positives (including lizardmen), but the numbers after the ‘stringent’ checks (including giving a qualitative explanation of what EA is) were applied (1.9-3.1%) are much less so (though, as we argue, still somewhat inflated). Note that the org results didn’t have the same checks applied, so those definitely shouldn’t be taken at face value and should be expected to be inflated by lizardmen etc.
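As a rough illustration of the false-positive logic above (not an analysis of the actual survey data), the arithmetic for how a small baseline "yes to anything" rate inflates an observed awareness figure, and for backing out an implied true rate under an assumed false-positive rate, can be sketched as:

```python
# Illustrative arithmetic only; the false-positive rates used below
# are assumptions for the sketch, not estimates from the survey.

def observed_rate(true_rate, false_positive_rate):
    """Share claiming awareness = genuinely aware + unaware who say yes anyway."""
    return true_rate + (1 - true_rate) * false_positive_rate

def implied_true_rate(observed, false_positive_rate):
    """Invert: observed = t + (1 - t) * fp  =>  t = (observed - fp) / (1 - fp)."""
    return (observed - false_positive_rate) / (1 - false_positive_rate)

# If true awareness were 3% and 4% of unaware respondents said yes anyway,
# the poll would show roughly 6.9% claimed awareness:
print(observed_rate(0.03, 0.04))       # ≈ 0.0688

# Conversely, a 4% lizardman rate alone cannot explain the 19.1% headline
# figure: it would still imply ~15.7% true awareness, so other false
# positives (e.g. confusing EA with something else) must also be at work.
print(implied_true_rate(0.191, 0.04))  # ≈ 0.157
```

This is consistent with the point above: lizardman-style responding plausibly inflates all of the raw figures, but the gap between 19.1% and the post-check 1.9-3.1% range suggests additional sources of false positives beyond it.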
This means the results predate both Will MacAskill’s high-profile publicity tour for What We Owe The Future as well as the downfall of FTX. My guess is that the number of people who have heard of Effective Altruism has increased substantially since then.
We’ll be publishing results about this soon, but as we noted here, we don’t think there’s been such a substantial increase in awareness of EA due to FTX, including among elite groups.
Yes, I have tentative plans to conduct some interviews and MTurk surveys as a cheap and easy way to gather more empirical information. I don’t think these will resolve the question, but hopefully they will elevate the discussion with evidence that is less vulnerable to critiques of convenience sampling and ad hoc interpretation by a potentially motivated debater (which is how I would criticize the quality of the evidence I present here).