Thanks for this—I found it very interesting both with regards to Rethink Priorities’ impact in particular, and with regards to one way in which EA research organisations in general could evaluate their impacts.
“We also plan to more explicitly prompt respondents on the amount of money in donations which was influenced and the result of any research conducted that was inspired by our work.”
That definitely sounds interesting. As a reader, I’d have been interested in at least a vague sense of what ballpark the donation sizes were likely to be in—e.g., did most of the reported influenced donations come from individual EAs who aren’t on extremely high salaries, or from (decision-makers for) fairly substantial funders like the EA Funds or OPP? But I can also understand why even a vague sense of that might be too difficult to give when you also want to maintain anonymity and not lead people to focus too strongly on a maybe fairly made-up guess.
Also, when I saw one of the “actions” was “Inspired you or someone in your organisation to research this topic”, it occurred to me that perhaps an additional valuable action could be “Allowed you or someone in your organisation to use time and other resources for things other than researching this topic”. This could occur if:
seeing RP’s research made people realise/feel that RP had the topic covered, or
the people only really wanted to research the topic to build their own background knowledge in order to do other research or make other decisions, and RP’s research served as a sufficiently good summary that that was no longer needed.
And this could be valuable if it “frees up” those people to do things that are more valuable than researching that topic (given that RP has done the research it has done).
It’s possible that that would be one of the “Other” actions people indicated having taken.
I imagine “influencing donations” would’ve captured the equivalent for donations—e.g., people deciding to donate to something other than an intervention RP investigated, because RP’s work negatively updated their estimates of that intervention’s value. But it might be interesting to somehow check if respondents interpreted that “action” that way, rather than just as meaning “Influenced me to donate to this intervention that RP researched”.
Two last quite minor points about things I found slightly unclear:
“44 out of 47 respondents stated how useful they found our work. The options were “not at all useful”, “not so useful”, “somewhat useful”, “very useful”, and “N/A”. For 10 of the 13 options, 50% or more found our work somewhat or very useful.” That last sentence confused me, because I thought it meant 50% or more of the 44 respondents who answered that, whereas it turned out it meant 50% or more of the people who had read some of RP’s work in that particular category. There were a few other places where the same sort of phrasing was used, though I had figured it out by then from the raw numbers.
The slaughterhouse ban research was mentioned a few times before it was revealed that it was forthcoming. This made me slightly confused, as I thought I was pretty familiar with at least what RP had worked on (even if I hadn’t read everything). Plus I had been assuming that that research was somehow lower quality or less impactful than the other lines of research, based on the survey responses, but now I realise that the responses may have mainly reflected the fact that the work was not yet public.