An AI could be aligned to something other than humanity's shared values, which could prevent most of the value in the universe from being realized. Nate Soares discusses this in Don't leave your fingerprints on the future.
Most of the current focus is on being able to align an AI at all, since that is necessary for any win-state. There seems to be consensus among the relevant actors that seizing the cosmic endowment for themselves would be a Bad Thing. Hopefully this will hold.