Exactly, (1) has been the approach I have taken; as long as I am unsure, I err on the side of caution and believe in morally large universes, including those with free will. That said, it would be interesting if many EAs were similar and thought something like "there's only a ~10% chance that free will, and hence morality, is real, so very likely my life is useless, but I am trying anyway". I think that would be a good approach, but an odd outcome.