Yes, I agree that believing the world may be about to end would tend to motivate more rule-breaking behavior in order to avoid that outcome. I’ll say that I’ve never heard anybody make the argument “Yes, AGI is about to paperclip the world, but we should not break any rules to prevent that from happening, because that would be morally wrong.”
Usually, the argument seems to be “Yes, AGI is about to paperclip the world, but we still have time to do something about it and breaking rules will do more harm than good in expectation,” or else “No, AGI is not about to paperclip the world, so it provides no justification for breaking rules.”
I would be interested to see somebody bite the bullet and say:
The world is about to be destroyed.
There is one viable strategy for averting that outcome, but it requires a lot of rule-breaking.
We should not take that strategy, due to the rule-breaking, and let the world be destroyed instead.