Personally something like “There are many important things in the world, and AI risk is just one among them; it would be bad to only evaluate outsiders by their alignment with our fairly idiosyncratic priorities.”
(I upvoted your comment, thanks!) Which things do you think are more important than AI risk?