[Question] Is there more guidance/resources on what to do if one feels like they are part of a significant minority that should focus on potentially pressing global issues beyond current 80 000h priorities?
Long lists of potentially pressing global issues beyond our current priorities: https://80000hours.org/problem-profiles/

The 80 000h website says:
“significant minority of our readers (say 10–20%) to explore new areas like those listed below rather than focusing on our current priority problem areas. This would be the 10–20% who are relatively best suited to these areas, which probably means those with some kind of pre-existing interest.”
I think I’m part of that significant minority, but I can’t really find any further help or enough material on those topics from an EA angle: for example safeguarding democracy, risks of stable totalitarianism, risks from malevolent actors, global public goods, etc.
I suspect the lack of discussion, materials, and research is due to resources being concentrated on what people consider highest on the priority list. But then the website says this:
“We’d be excited to see more discussion and exploration of many of these areas from the perspective of trying to improve the long-term future, and hope to help facilitate such exploration going forward.”
➡️ What could this mean in practice? I feel I could act on this information, but I don’t know how.