I don’t think this is quite what you’re looking for, as I don’t technically ‘use this research’, but I personally think it’s plausible that giving to the Global Priorities Institute is one of the best giving opportunities available, provided you think longtermism is plausible and that we don’t already know everything there is to know about it. Having said that, if you look at their research agenda it does cover general issues in global prioritisation as well as longtermist questions.
Ben Todd’s blog post on this is pretty good.
Thank you Jack. You might also see my update comment below. I don’t know how to evaluate the impact of academic research where I cannot see any real-world use of that research. That is not to say that I don’t think it has value, but the feedback loops to creating value are really long and opaque, especially for the kind of philosophical work GPI appears to be focusing on. If you have a good way of evaluating that kind of research, do say; I would love to hear it. But at present I would be more excited to donate to organisations whose research is used in some tangible way, like Charity Entrepreneurship or Founders Pledge (or maybe CSER as a longtermist option).
That’s fair enough. I agree the route to impact is longer and more opaque. As such, I think it is hard to “evidence” the impact of GPI, and I certainly won’t be able to do that adequately here. I suppose one way this could be done would be to survey EA organisations that actually “get things done” to see whether they make use of GPI’s research.
Having said that, I think it is possible to make a more abstract case that GPI may have tremendous impact by looking at the impact academia has already had on the EA community. For example, as far as I’m aware, the heavy focus on longtermism in the EA community stems originally from academic papers such as this one or this one. Given that global priorities research is still so new, it seems plausible that further research could still radically change the direction of the EA community for the better. That’s why I personally would probably give to GPI over CE at this stage.
Yet the case would be even stronger if organisations produced research that was both peer-reviewed, so as to build an academic field, AND had immediate real-world outcomes. This seems possible, and the two papers you cited would pass that bar. Hence my curiosity to see what EA research is actually being used.
(The lack of responses so far implies either that not much research from these organisations is currently having any measurable real-world output in the medium term, or that I am asking the wrong question, asking in the wrong place, or asking in the wrong way.)
By the way, sorry for not being that helpful and for essentially sidestepping your actual question in my first response.
I think if you want to get an accurate view of what research people use, that’s probably not going to be possible by asking a question on the EA Forum. I’m just not sure how many people answer questions here, so you’ll inevitably get a skewed picture from those who do. Having a question like this in a wider survey would be helpful. I can’t quite remember whether this was asked in the 2020 EA survey; I think something fairly similar was. It’s a good one to have going forward.