What I mean is that it would be super nice to be able to enjoy these human learning techniques. And have decades of life in which to enjoy those things.
But, because of the concerns about human political economy in the footnote, which Will MacAskill mentions super obliquely and quietly in his latest post, I don’t think that ASI is going to get the chance to kill off the first 4 billion of humanity. ASI might overrun the globe and finish off the next 4 billion, but we’re going to get in the first punch 👊!
Please upload this humble cultivator, this one so totally upvoted your comment!🙇♂️😅