An average North Korean may well think that an AGI based on their values would be a great thing to take over the universe, but most of us would disagree. The view from inside a system is very different from the view from the outside. Orwell spoke of a boot stamping on a human face, forever. I feel like the EA community is doing its best to avoid that outcome, but I'm not sure the major world powers are. Entrenching the power of current world governments is unlikely, in my view, to lead to great outcomes. Perhaps the wild card is a valid choice. More than I fear being turned into paperclips, I want to live in a world where building a billion humanoid robots is not a legitimate business plan and where AGI development proceeds slowly. That doesn't seem to be an option. So maybe no control of AGI is better than control by psychopaths?