My personal preference is to take my chances with unaligned ASI, as the thought of either of these circuses being the ringmaster of all eternity is terrifying. I'd much rather be a paperclip than a communist or corporate serf.
I don't want to harp too much on "lived experience", but both the stated and revealed preferences of people actually living in the US or China strongly suggest that most others would choose differently. It's possible you'd hold an unusual preference if you lived in those countries, but I currently suspect otherwise.
An average North Korean may well think that an AGI based on their values would be a great thing to overtake the universe, but most of us would disagree. The view from inside a system is very different from the view from the outside. Orwell wrote of a boot stamping on a human face, forever. I feel the EA community is doing its best to avoid that outcome, but I'm not sure the major world powers are. Entrenching the power of current world governments is unlikely, in my view, to lead to great outcomes. Perhaps the wild card is a valid choice. More than I want to be a paperclip, I want to live in a world where building a billion humanoid robots is not a legitimate business plan and where AGI development proceeds slowly. That doesn't seem to be an option. So maybe no control of AGI is better than control by psychopaths?