[linkpost] Christiano on agreement/disagreement with Yudkowsky’s “List of Lethalities”
Link post
Eliezer Yudkowsky’s recent post on AGI ruin seems to have sparked a good amount of thinking and discussion (e.g. in the comments there; on LessWrong; in this post from today). I’m glad this is happening. This is a link post for Paul Christiano’s response, which I think is worth reading for anyone following the discussion.
Personally, I like Christiano’s response a good bit. More than anything else I’ve read in the vicinity, I find myself nodding along and thinking “yeah, that’s a good articulation of my feelings on this”. I think it’s interesting that it lacks the rhetorical oomph of Yudkowsky’s post. On the whole I’m glad of the rhetorical oomph for pulling attention onto a cluster of topics which I think are super important, although I feel a bit sad that the piece which is most attention-pulling doesn’t seem to be the one which is most truth-tracking.