I’d like to make clear to anyone reading that you can support the PauseAI movement right now simply because you think it is useful right now. And then in the future, when conditions change, you can choose to stop supporting the PauseAI movement.
AI is changing extremely fast (e.g. technical work was probably our best bet a year ago, I’m less sure now). Supporting a particular tactic/intervention does not commit you to an ideology or team forever!
I’d like to add an asterisk. It is true that you can and should support things that seem good while they seem good and then retract support, or express support on the margin but not absolutely. But sometimes supporting things for a period has effects you can’t easily take back. This is especially the case if (1) added marginal support summons some bigger version of the thing that, once in place, cannot be re-bottled, or (2) increased clout for that thing changes the culture significantly (I think cultural changes are very hard to reverse; culture generally doesn’t go back, only moves on).
I think there are many cases where, before throwing in their lot with a political cause for instrumental reasons, people should’ve first paused to think more about whether this is the type of thing they’d like to see more of in general. Political movements also tend to have an enormous amount of inertia, and often end up heavily shaped by path-dependence and memetic fitness gradients.
Thanks for your comment, Rudolf! I predict that my comment is going to be heavily downvoted, but I’m writing it partly because I think it is true and partly because it points to a meta issue in EA:
I think it is unrealistic to ask people to internalise the level of ambiguity you’re proposing. This is how EAs turn themselves into mental pretzels of inaction.
Yup. This is one of the main points of my post: if you support PauseAI today, you may unleash a force which you cannot control tomorrow.