You’re right, this seems like mostly semantics. I’d guess it’s clearest/most useful to use “alignment” a little more narrowly—reserving it for concepts that actually involve aligning things (i.e., roughly consistent with non-AI-specific uses of the word “alignment”). But the Critch(/Dafoe?) take you bring up seems like a good argument for why AI-influenced coordination failures fall under that.