Hmm this comment seemed pretty controversial! Curious where/why people disagree, though no obligation to comment of course.
Personally, my disagreement is something like: "There are many important things in the world, and AI risk is just one among them; it would be bad to only evaluate outsiders by their alignment with our fairly idiosyncratic priorities."
(I upvoted your comment, thanks!) Which things do you think are more important than AI risk?