Quick response: that is not what it means. 80k's priorities have shifted towards the long-term survival and flourishing of humanity. They are particularly concerned with risk from artificial intelligence that may not be well aligned with sentient life. In those areas, some argue it is very unclear what is worth funding, so the priority has been to get people to work in particular career and research areas.
However, the GiveWell global health and well-being charities, as well as other charities in this area (such as fistula surgery), are still extremely effective at helping people survive and reducing their suffering. Per dollar, they are far more effective in this sense than charities that operate in wealthier countries.
There is still a very strong case for supporting these charities, particularly if you are concerned with reducing human suffering and helping people who are alive today. (I won't get into the discussion of population ethics here.)
I hope this helps.
Yes, this is enlightening, thanks. The 80k article didn't make clear that it was talking about longtermism to the exclusion of more old-school EA priorities; your answer makes that distinction explicit.