Probably something like striving for a Long Reflection process. (Due to complex cluelessness more generally, not just moral uncertainty.)
The real issue is that it requires unrealistic levels of coordination and rests on an assumption that moral objectivism is true. While that is an operating assumption needed to do anything in EA, that doesn't mean it's true.