I’m not sure I understand the first question. I don’t really know what a “non-conscious being” would be. Is it synonymous with an agent?
My impression is that feeling lost is a very common response to consciousness issues, which is why it seems to me not that unlikely that we get it wrong and either (a) fill the universe with complex but non-conscious matter, or (b) fill it with complex conscious matter that is profoundly unlike us, such that high levels of positive utility are not achieved.
The main response I can imagine at this time is something like: "don't worry, if we solve AI alignment our AIs will solve this question for us, and if we don't, things are likely to go wrong in much more obvious ways." But this response seems unsatisfactory to me, though I can't quite articulate why, and I'd like to see the argument sketched out more fully.
Yeah, I meant it to be synonymous with "agent".