I found this piece on moral catastrophe really insightful. One thought I’d add is that when we look across history, it seems every generation has been blinded not just by particular false beliefs, but by something deeper — our own egoistic nature. In Kabbalistic terms, this is sometimes described as the “will to receive for oneself.”
What strikes me is that this “operating system” quietly drives so many of our choices, even in causes we think of as altruistic. If we don’t become aware of it, we risk repeating the same pattern: creating new systems, norms, or technologies that look like progress, but still end up serving ego-driven ends.
So while I agree with the call for more moral research and flexibility, I also wonder whether a key part of avoiding future catastrophes is learning to recognize and work with this underlying egoism. Otherwise, we might just keep patching symptoms without ever debugging the core code.