Yes, I agree that believing the world may be about to end would tend to motivate more rule-breaking behavior in order to avoid that outcome. I’ll say that I’ve never heard anybody make the argument “Yes, AGI is about to paperclip the world, but we should not break any rules to prevent that from happening, because that would be morally wrong.”
Usually, the argument seems to be “Yes, AGI is about to paperclip the world, but we still have time to do something about it and breaking rules will do more harm than good in expectation,” or else “No, AGI is not about to paperclip the world, so it provides no justification for breaking rules.”
I would be interested to see somebody bite the bullet and say:
The world is about to be destroyed.
There is one viable strategy for averting that outcome, but it requires a lot of rule-breaking.
We should not take that strategy, due to the rule-breaking, and should let the world be destroyed instead.
I think there are additional factors that make classical total utilitarians in EA more likely to severely violate rules:
X-risk mitigation has close to infinite expected value, and
short AI timelines mean that violating rules is unlikely to have harmful long-term effects.