I agree that if you have to slow down all AI progress or none of it, you should slow it all down. But fortunately, you don’t—you can almost have the best of both worlds.
Insofar as AI x-risk looks like LLMs while awesome stuff like medicine (and robotics and autonomous vehicles and more) doesn’t look like LLMs, caution on LLMs doesn’t delay other awesome stuff.* So when you talk about slowing AI progress, make it clear that you only mean AI on the path to dangerous capabilities.
*That’s not exactly true: e.g., maybe an LLM could automate medical research, or recursively bootstrap itself to godhood and then solve medicine. But in practice, “caution with LLMs” doesn’t conflict with “progress on medicine now.”
AI biologists seem extremely dangerous to me—something “merely” as good at designing viral genomes as GPT-4 is at language would already be an existential threat to human civilization, if not necessarily to Homo sapiens.