This. It suggests that once a powerful AI is released, even in a restricted format, others will be motivated to copy it. This is a dangerous dynamic for AGI, because existential risk is set by the least-safe actor. If this dynamic repeats, it's very possible we all die due to unsafe copies of an AGI being released.