My point was that the alignment goal, from the human perspective, is an enslavement goal, whether the goal succeeds or not.
Really? I think it’s about making machines that have good values, e.g., are altruistic rather than selfish. A better analogy than slavery might be raising children. All parents want their children to become good people, and no parent wants to make slaves out of them.
Hmm, you have more faith in the common sense and goodwill of people than I do.