Smart things are not dangerous because they have access to human-built legacy nukes. Smart things are dangerous because they are smarter than you.
I expect that the most efficient way to kill everyone is via the biotech->nanotech->tiny diamondoid bacteria hopping the jetstream and replicating using CHON and sunlight->everybody falling over dead 3 days after it gets smart. I don’t expect it would use nukes if they were there.
Smart AIs are not dangerous because somebody built guns for them, smart AIs are not dangerous because cars are connected to the Internet, smart AIs are not dangerous because they can steal existing legacy weapons infrastructure, smart AIs are dangerous because they are smarter than you and can think of better stuff to do.
Some back-and-forth on this between Eliezer & me in this thread.