Nice! I realised that I can’t think of the last time I received low-quality criticism (though I can think of a moderate amount of fairly high-quality criticism), so I am probably quite lucky in that regard, as my work/writing thus far has either been privately shared, or public but not very provocative. (Of course, the flipside is that having more people engage with one’s writing is one way to increase impact.)
I hadn’t heard the term “idea inoculation” before—that does seem like a useful framing. I wonder if it is part of the explanation for some of the AI safety/x-risk backlash: someone hears a third-hand snippet of an argument for why AGI/TAI might be dangerous, or consumes some not-very-realistic fiction about it, and is later pretty reluctant to engage with more careful work on the subject.