Hey Sam — being a small organisation, 80,000 Hours has only ever had fairly limited staff time for cause priorities research.
But I wouldn’t say we’re doing less of it than before, and we haven’t decided to cut it. For instance, see Arden Koehler’s recent posts “Ideas for high impact careers beyond our priority paths” and “Global issues beyond 80,000 Hours’ current priorities”.
We aim to put ~10% of team time into underlying research; one strand of that is figuring out which problems and paths belong at each priority level. We also have podcast episodes on newer problems from time to time.
All that said, I am sympathetic to the idea that as a community we are underinvesting in cause priorities research.
Super great to hear that 10% of 80,000 Hours team time will go into underlying research. (Also, apologies for getting things wrong; I was generalising from what I could find online about what 80K plans to work on, and have edited the post.) If you have more info on what this research might look into, do let me know.
– –
Agreed that there is an explore-exploit tradeoff: continuing to do cause prioritisation research needs to be weighed against focusing on specific cause areas.
I imply in my post that EA organisations have jumped too quickly into exploit. (I mention 80K and FHI, but I am judging from an outside view, so I might be wrong.) I think this is a hard case to make, especially to anyone who is more certain than I am about which causes matter (which may be most EA folk). That said, there are other reasons for continuing to explore: creating a diverse community, epistemic humility, game-theoretic reasons (it is better if everyone explores a bit more), countering optimism bias, etc.
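To make the explore-exploit framing concrete, here is a minimal sketch (my own illustration, not anything from the post) using a standard epsilon-greedy multi-armed bandit. The arms hypothetically stand in for cause areas whose true value is initially unknown; a pure-exploit agent can lock onto whichever cause it tried first, while one that reserves a small fraction of effort for exploration eventually concentrates on the genuinely best arm.

```python
import random

def run_bandit(epsilon, true_means, steps=2000, seed=0):
    """Epsilon-greedy agent on a simple Bernoulli bandit.

    With probability `epsilon` the agent explores (pulls a random arm);
    otherwise it exploits the arm with the best estimated value so far.
    Returns the fraction of pulls spent on the truly best arm.
    """
    rng = random.Random(seed)
    n = len(true_means)
    counts = [0] * n          # pulls per arm
    estimates = [0.0] * n     # running average reward per arm
    best_arm = max(range(n), key=lambda i: true_means[i])
    best_pulls = 0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore: try a random cause
        else:
            arm = max(range(n), key=lambda i: estimates[i])  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental update of the running average
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        if arm == best_arm:
            best_pulls += 1
    return best_pulls / steps

# Hypothetical arm values; the agent does not know these.
means = [0.2, 0.5, 0.8]
pure_exploit = run_bandit(0.0, means)  # never re-examines priorities
mixed = run_bandit(0.1, means)         # keeps ~10% of effort exploring
```

With `epsilon=0.0` the agent never revisits its initial choice and can spend all its time on a mediocre arm, whereas a modest `epsilon` reliably finds and mostly pulls the best one. This is only a toy model, of course; the real community-level case involves diversity and coordination effects that a single-agent bandit does not capture.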
Not sure I am explaining this well. I guess I am saying that I still think the high-level point I was making stands: EA organisations seem to move towards exploit more quickly than I would like. But do let me know if you disagree.