TBC, it’s plausible that in the future I’ll think that “marginally influencing AIs to have more sensible values” is more leveraged than “avoiding AI takeover and hoping that humans (and our chosen successors) do something sensible”. I’m partially deferring to others on the view that avoiding AI takeover is the best angle of attack; perhaps I should examine this further.
(Of course, from a longtermist perspective, other interventions could be even better than avoiding AI takeover or altering AI values. E.g. maybe conflict avoidance, better decision theory, or better human institutions for the post-singularity era would be even more valuable.)
I certainly wish the question of how much worse (or better) AI takeover is relative to human control were investigated more thoroughly. It seems notable how important this question is from a longtermist perspective and how little investigation it has received.
(I’ve spent maybe 1 person-day thinking about it, and I’d guess less than 3 FTE-years have been put into it by people I’d be interested in deferring to.)