The last chapter of Global Catastrophic Risks (Bostrom and Ćirković) covers global totalitarianism. Among other things, they mention how improved lie-detection technology, anti-aging research (to mitigate the risk of regime change), and drugs that increase docility in the population could plausibly make a totalitarian system permanent and stable. Obviously an unfriendly AGI could easily do this as well.
The increased docility could be a stealth existential-risk multiplier: people would be less willing to challenge other people's ideas, which could slow or entirely halt the technological progress we need to save ourselves from supervolcanoes and other environmental threats.