To me, it seems like evidence that you can genuinely believe in a cause and still become corrupt, because you use that very belief to rationalize self-serving choices as somehow advancing the cause.
If anything, that makes it even more relevant to EA: I think the risk of EAs becoming nakedly self-interested is low; the more likely failure mode is using EA to fool yourself and rationalize self-serving behavior.