I’m no expert in this topic and haven’t read Sam Harris’s argument, but there are a couple of things I usually bear in mind:
1. If you’re uncertain whether hard determinism is true (that is, you assign it a probability less than 1), then it seems you should still act as though you are not determined. We can apply reasoning like Pascal’s Wager: if hard determinism is false, then sadistic torture is terrible; if it’s true, then nothing matters either way, so we lose nothing by behaving morally. Hence it seems we should still act as though morality has bearing.
2. A more compelling (although still contentious) response is compatibilism. I leave you to explore it here.
Exactly: (1) has been the approach I have taken; as long as I am unsure, I err on the side of caution and act as though I live in a morally large universe, including one with free will. That said, it would be interesting if many EAs reasoned similarly and thought something like “there’s only a ~10% chance that free will, and hence morality, is real, so very likely my life is useless, but I’m trying anyway”. I think that’s a good approach, but it would be an odd outcome.