Thanks for continuing the series, this is one of the most stimulating philosophical issues for me.
After the AI asks Bob if it should do what an ideally informed version of him would want, Bob replies:
Bob: Hm, no. [...] I don’t necessarily care about my take on what’s good. I might have biases. No, what I’d like you to do is whatever’s truly morally good; what we have a moral reason to do in an… irreducibly normative sense. I can’t put this in different terms, but please discount any personal intuitions I may have about morality—I want you to do what’s objectively moral.
I think that part paints a slightly misleading picture of (at least my idea of) moral realism, as if the AI shouldn't mostly study humans like Bob when finding out what is good in this universe, and should instead focus on "objective" things like physics or logic. My Bob would say:
Hm, kinda. I expect my idealized preferences to have many things in common with what is truly good, but I'm worried that following them won't maximize what is truly good. I might, for example, carry around random evolutionary and societal biases that would waste astronomical resources on things of no real value, like my preference for untouched swaths of rainforest. Maybe start by helping us understand what we mean by the qualitative feeling of joy; there might be something going on there that you can work with, because it just seems like something that is unquestionably good. Vice versa with pain and sorrow and suffering, which seem undeniably bad. Of course I'm open to being convinced otherwise, but I expect there's a there there.