As someone who works in U.S. foreign policy, I strongly agree with you. I have watched this debate unfold over the past three months, only to see the fears and predictions of others who agreed with you largely come true. Above all, I feel the whole ordeal validates my decision to pursue foreign policy, with a focus on reducing the risk of great power conflict, as an EA cause area.
The disagreement in the comments may hint that this area is relatively under-researched and under-discussed within the EA community, given what a cross-cutting risk factor it appears to be across so many other x-risks, from AI to nukes to bioweapons. Out of curiosity, would you agree that EA’s strong preference for consequentialism over other moral frameworks should incline most EAs toward a relatively “realist” approach to foreign policy?
Although the EA forum does seem to be literally raising funds for war matériel, I doubt the uneven reception of David’s post is driven by emotion or ignorance (though I agree it didn’t help that the post initially seemed to favor the Russian narrative).
I think a “realist” or “consequentialist” outlook is already dominant here (whatever those labels really mean).
From that perspective, I don’t think anyone believes NATO had a realistic option to close the door on Ukraine. The post’s characterization of Russian behavior and irredentism also seems incomplete, and the value of the current situation is unclear.
In your other comments, you knocked down some bad takes. While I think you are right, you’ve only knocked down bad takes.
Great power conflict is an important and neglected area in EA, especially the question of how to bridge divides and communicate across them peacefully. I’m less sure what that has to do with Russia specifically.
There is already a vast apparatus for studying Russia; it would be good to hear clearly what EA’s contribution would be and why these events should increase it.
Thanks, Andrew. I’m glad you agree, and I share your view that consequentialism encourages a high degree of realism. That said, I expected more agreement from the EA community on this post, so it’s interesting that not everyone sees it the same way.