Conditional on AGI being developed by 2070, what is the probability that humanity will suffer an existential catastrophe due to loss of control over an AGI system?
Requesting a few clarifications:
I think of existential catastrophes as things like near-term extinction, rather than things like “the future is substantially worse than it could have been”. Put differently, I tend to think an existential catastrophe means a future that's much worse than technological stagnation, rather than one that's merely much worse than it would have been with more aligned AI. Is that the intended reading?
Should we interpret “loss of control over an AGI system” as loss of control over a single, somewhat monolithic system with a well-defined control interface, or is losing control over an ecosystem of AGIs also of interest here?
Hi David,
Thanks for your questions. We're interested in a wide range of considerations. It's debatable whether human-originating civilization failing to make good use of its “cosmic endowment” constitutes an existential catastrophe, so if you'd prefer to focus on more recognizable catastrophes (such as extinction, unrecoverable civilizational collapse, or dystopia), that would be fine.
In a similar vein, if you think there is an important scenario in which humanity suffers an existential catastrophe by collectively losing control over an ecosystem of AGIs, that would also be an acceptable topic.
Let me know if you have any other questions!