I agree with this; there are definitely two definitions at play. I think a failure to distinguish between these two definitions is actually a big problem with the AI doom argument, where it ends up as an unintentional motte-and-bailey between the two.

David Thorstad explains it pretty well here. The “people want money” definition is trivial and obviously true, but it does not lead to the “doom is inevitable” conclusion. I have a goal of eating food, and money is useful for that purpose, but that doesn’t mean I automatically try to accumulate all the wealth on the planet in order to tile the universe with food.