Some successor species are much “worthier” of inheriting the future than others. I expect that the type of AI that would violently take over control of civilization would be on the less worthy side (especially conditioning on sub-2035 AGI timelines), and that the type of society we’d design given lots of deliberation and care (possibly through a long reflection) would be much more worthy.