I am unsure what you mean by AGI. You say:

> For purposes of our definitions, we’ll count it as AGI being developed if there are AI systems that power a comparably profound transformation (in economic terms or otherwise) as would be achieved in such a world [where cheap AI systems are fully substitutable for human labor].
and:
> causing human extinction or drastically limiting humanity’s future potential may not show up as rapid GDP growth, but automatically counts for the purposes of this definition.
If someone uses AI capabilities to create a synthetic virus (which they wouldn’t have been able to do in the counterfactual world without that AI-generated capability) and causes the extinction or drastic curtailment of humanity, would that count as “AGI being developed”?
My instinct is that this should not be considered to be AGI, since it is the result of just narrow AI and a human. However, the caveat implies that it would count, because an AI system would have powered human extinction.
I get the impression you want to count ‘comprehensive AI systems’ as AGI if the system is able to act ~autonomously from humans[1]. Is that correct?
Putting it another way: if there is a company that employs both humans and lots of AI technologies, and it brings about a “profound transformation (in economic terms or otherwise)”, I assume the combined capability of the AI elements of the company would need to be as general as a single AGI in order to count.
If it does not add up to that level of generality, but is still used to bring about a transformation, I think it should not resolve ‘AGI developed’ positively. However, it currently looks like it would.
Thanks, I think this is subtle and I don’t think I expressed this perfectly.
> If someone uses AI capabilities to create a synthetic virus (which they wouldn’t have been able to do in the counterfactual world without that AI-generated capability) and causes the extinction or drastic curtailment of humanity, would that count as “AGI being developed”?
No, I would not count this.
I’d probably count it if the AI a) somehow formed the intention to do this and then developed the pathogen and released it without human direction, even if it b) couldn’t yet produce as much economic output as full automation of labor.
Okay great, that makes sense to me. Thank you very much for the clarification!