It’s just the current most natural term we have for it. I don’t think anyone is super attached to it; you can propose a different name and people might go with that one.
It doesn’t seem natural at all to me. How about, e.g., “AI explosion”, “Runaway AI”, “AI apocalypse”, or “catastrophic AI” instead?
Forecasting the adoption of terms is quite hard and I haven’t seen people have a lot of success with it.
This is not quite about ‘forecasting the adoption of terms’ (not that you were explicitly making this case). It might be that FOOM does get adopted, but then used as a term to dismiss these concerns, whereas another term might be just as easily adopted but lend greater credibility and sympathy.
None of those obviously mean the same thing (“runaway AI” might sort of gesture at it, but it’s still pretty ambiguous). Intelligence explosion is the thing it’s pointing at, though I think there are still a bunch of conflated connotations that don’t necessarily make sense as a single package.
I think “hard takeoff” is better if you’re talking about the high-level “thing that might happen”, and “recursive self improvement” is much clearer if you’re talking about the usually-implied mechanism by which you expect hard takeoff.
I use “AI Apocalypse” when talking about this to non-EA/LW friends and family. Didn’t really explicitly think about it, it was just the most natural choice of words in the context.