Wonderful post, Michel; thanks for your thoughts.
I think there’s an understandable wariness of emotionality in EA, since it so often leads into Rousseauian romanticism, sentimentality, and virtue-signaling that are the exact opposite of the Benthamite utilitarianism, scope-sensitivity, and rational thinking valued in EA. See, e.g., Paul Bloom’s excellent 2016 book ‘Against Empathy’.
However, I think it’s important for EAs to take emotions more seriously for at least a couple of reasons. (Full disclosure: I’ve taught a seminar on ‘Human Emotions’ about ten times for upper-level undergrads, and we often end up discussing EA topics.)
First, emotional experiences are the basic building blocks of sentient utility. If we’re trying to maximize the quality, quantity, and variety of sentient well-being, it might be important to understand the origins, nature, functions, triggers, and details of animal, human, and artificial emotions, both from the outside (e.g., as students of emotion research) and from the inside (as people who have deep personal experience with the full range of human emotions, including holding them at arm’s length during mindfulness meditation or psychedelic experiences).
Second, emotional experiences, as you argue, can be great for recruiting new people, helping them understand the stakes and ethos of EA, and keeping current EAs happy, motivated, and inspired. I agree that more development of EA-oriented stories, art, music, videos, films, etc. could help. The challenge is to recruit the kinds of artsy creatives who will actually resonate with the hyper-rationality of the EA mind-set. The overlap in the Venn diagram between people who can write great screenplays and people who actually understand long-termism and scope-sensitivity might be quite small, for example. But those are the kinds of people who could really help with EA movement-building—as long as they don’t dilute, distort, or trivialize the core EA principles.
Thank you for the thoughtful comment, Geoffrey. I agree there’s a fine balance to strike: being wary of emotions without being dismissive of them.
I spent the latter half of my psychology undergrad railing against emotions – talking about how evolution has left us with unreliable guides for whom to care about and how much. (Basically regurgitating Against Empathy.)
Yet here I am writing a whole post emphasizing emotions. With enough nuance about which emotions we’re talking about and in what settings, however, I think both views are compatible. (I appreciate your comment to that effect.)
I also think your Venn diagram comment is apt. I agree the overlap is narrow, but it’s one I’d like to see a few more people with the right aptitudes lean into.