The unilateralist’s curse is the phenomenon whereby, when each of many altruistic agents has the power to bring about some state of affairs whose net value is negative but unknown to these agents, the probability that the state will be realized grows with the number of agents who decide to act based on their own private judgment.
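To see why the probability grows with group size, consider a toy model (an illustrative sketch with assumed parameters, not taken from the cited papers): the action's true net value is negative, each agent observes that value with independent random error, and any agent whose private estimate comes out positive goes ahead unilaterally. With independent errors, the chance that at least one of N agents acts is 1 − (1 − p)^N, where p is the probability that a single agent misjudges the value as positive, so it rises toward 1 as N grows. The short Python simulation below illustrates this.

```python
import random

# Toy illustration of the unilateralist's curse (assumed parameters, not drawn
# from Bostrom, Douglas & Sandberg): the action's true net value is negative,
# each agent sees it with independent Gaussian noise, and acts unilaterally if
# their private estimate looks positive.

TRUE_VALUE = -1.0   # assumed (negative) net value of the action
NOISE_SD = 1.0      # assumed spread of each agent's estimation error
TRIALS = 100_000    # Monte Carlo trials per group size

def probability_someone_acts(n_agents: int) -> float:
    """Estimate P(at least one of n_agents acts on a positive private estimate)."""
    count = 0
    for _ in range(TRIALS):
        if any(random.gauss(TRUE_VALUE, NOISE_SD) > 0 for _ in range(n_agents)):
            count += 1
    return count / TRIALS

for n in (1, 2, 5, 10, 20):
    print(f"{n:2d} agents -> P(action taken) ≈ {probability_someone_acts(n):.2f}")
```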
Salient examples include decisions to leak information about weapons technologies, potential decisions by individual nations to use geoengineering to mitigate climate change, and the unilateral decision to introduce rabbits to Australia.
To avoid the unilateralist’s curse, members of a group might implement a group decision-making procedure, deliberate with others before taking action, or create a norm of deferring to the beliefs or actions of the other members of the group.
Further reading
Bostrom, Nick, Thomas Douglas & Anders Sandberg (2016) The unilateralist’s curse and the case for a principle of conformity, Social Epistemology, vol. 30, pp. 350-371.
Lewis, Gregory (2018) Horsepox synthesis: A case of the unilateralist’s curse?, Bulletin of the Atomic Scientists, February 19.
Usefully connects the curse to other factors.
Schubert, Stefan & Ben Garfinkel (2017) Hard-to-reverse decisions destroy option value, Centre for Effective Altruism, March 17.
Zhang, Linchuan (2020) Framing issues with the unilateralist’s curse, Effective Altruism Forum, January 17.
I added a Further reading section containing the things I’d earlier collected in a shortform: https://forum.effectivealtruism.org/posts/EMKf4Gyee7BsY2RP8/michaela-s-shortform?commentId=y3o9YFvj4iXiAqKWa
But:
- We haven't yet decided whether the EA Wiki should have Further reading sections, or just put all of that in a Bibliography, or do something else.
- I didn't take the time to format these entries properly.
- The order I used is arbitrary.
So this is intended as a useful starting point that can be adjusted later.