I like “science-aligned” better than “secular”, since the former implies the latter as well as a bunch of other important concepts.
Also, it’s worth noting that “everyone’s welfare is to count equally” in Will’s account is approximately equivalent to “effective altruism values all people equally” in Ozymandias’ account, but neither of them implies the following paraphrase: “from the effective altruism perspective, saving the life of a baby in Africa is exactly as good as saving the life of a baby in America, which is exactly as good as saving the life of Ozy’s baby specifically.” I understand the intention of that phrase, but actually I’d save whichever baby would grow up to have the best life. Is there any better concrete description of what impartiality actually implies?
“Whichever will have the best life” seems very compatible with “welfare.” I agree there are a lot of considerations that aren’t obviously indicated by any of these definitions, though, even if they’re compatible.
Yes, I’d be excited to always include something about epistemics, such as scientific mindset. One can then argue about evidence instead of whether something qualifies as secular, which seems only relevant insofar as it is a weak predictor of well-evidenced-ness. In particular, while I don’t assign it high credence, I would not be hasty to rule out maximising ‘enlightenment’ as one of my end goals. Terminal goals are weird.
Notably, without an epistemic/scientific part to the definition, it is unclear how to distinguish many current EA approaches from e.g. Playpumps, a secular project that was hyped for its extraordinary outcomes in helping people far away. Looking forward, I also think that strong epistemics are how long-term-focused EA efforts can continue to be more useful than regular futurism.