Interesting post! I think analogies are good for public communication but not for understanding things at a deep level. They're a quick way to template something you haven't thought about at all onto something you're already familiar with. I think effective mass communication is quite important, and we shouldn't let the perfect be the enemy of the good.
I wouldn’t consider my Terminator comparison an analogy in the sense of the other items on this list. Most of the other items have the character of “why might AI go rogue?” and then they describe something other than AI that is hard to understand or goes rogue in some sense and assert that AI is like that. But Terminator is just literally about an AI going rogue. It’s not so much an analogy as a literal portrayal of the concern. My point wasn’t so much that you should proactively tell people that AI risk is like Terminator, but that people are just going to notice this on their own (because it’s incredibly obvious), and contradicting them makes no sense.