First, estimate the total existential risk associated with developing highly capable AI systems, bearing in mind all of the work on safety that will be done.
Cool idea, thanks – one usability comment:
“Existential risk” doesn’t have units of percentages in my mind. I would phrase this as “probability that Earth-based life goes extinct” or something.