A nice short argument that a sufficiently intelligent AGI would have the power to usurp humanity is Scott Alexander’s Superintelligence FAQ Section 3.1.