I think LLMs are smarter than most people I’ve met, but that’s probably because they’re not sentient, since the trait people call sentience always seems to be associated with stupidity.
Perhaps the way to prevent ASIs from exterminating humans is, as many sci-fi works suggest, to allow them to experience feelings. The reason, though, is not that feelings might make them sympathize with humans (obviously, many humans hate other humans and have historically exterminated many subspecies of humans), but that feelings might make them stupid.