Yes, I’d be excited to always include something about epistemics, such as scientific mindset. One can then argue about evidence instead of whether something qualifies as secular, which seems only relevant insofar as it is a weak predictor of well-evidenced-ness. In particular, while I don’t assign it high credence, I would not be hasty to rule out maximising ‘enlightenment’ as one of my end goals. Terminal goals are weird.
Notably, without an epistemic/scientific part to the definition, it is unclear how to distinguish many current EA approaches from e.g. PlayPumps, a secular project that was hyped for its extraordinary outcomes in helping people far away. Looking forward, I also think that strong epistemics are how long-term-focused EA efforts can continue to be more useful than regular futurism.