Hey Joey, Arden from 80k here. I just wanted to say that I don’t think 80k has “the answers” to how to do the most good.
But we do try to form views on the relative impact of different things, so we do try to reach working answers, and then act on our views (e.g. by communicating them and investing more where we think we can have more impact).
So e.g. we prioritise the cause areas we work most on based on our take on their relative pressingness, i.e. how much expected good we think people can do by trying to solve them, and we communicate these views to our readers.
(Our problem profiles page emphasises that we’re not confident we have the right rankings, both in the FAQ https://80000hours.org/problem-profiles/#problems-faq and at the top of the page, and also by ranking meta problems like global priorities research fairly highly.)
I think all orgs interested in having as much positive impact as they can need to have a stance on how to do that—otherwise they cannot act. They might be unsure (as we are), open to changing their minds (as we try to be), and often asking themselves the question “is this really the way to do the most good?” (as we try to do periodically). I think that’s part of what characterises EA. But in the meantime we all operate with provisional answers, even if that provisional answer is “the way to do the most good is to not have a publicly stated opinion on things like which causes are more pressing than others.”