I agree with Scott Alexander that when talking with most non-EA people, an X risk framework is more attention-grabbing, emotionally vivid, and urgency-inducing, partly due to negativity bias, and partly due to the familiarity of major anthropogenic X risks as portrayed in popular science fiction movies & TV series.
However, for people who already understand the huge importance of minimizing X risk, there’s a risk of burnout, pessimism, fatalism, and paralysis, which can be alleviated by longtermism and more positive visions of desirable futures. This is especially important when current events seem all doom’n’gloom, when we might ask ourselves ‘what about humanity is really worth saving?’ or ‘why should we really care about the long-term future, if it’ll just be a bunch of self-replicating galaxy-colonizing AI drones that are no more similar to us than we are to late Permian proto-mammal cynodonts?’
In other words, we in EA need longtermism to stay cheerful, hopeful, and inspired about why we’re so keen to minimize X risks and global catastrophic risks.
But we also need longtermism to broaden our appeal to the full range of personality types, political views, and religious views out there in the public. My hunch as a psych professor is that there are lots of people who might respond better to longtermist positive visions than to X risk alarmism. It’s an empirical question how common that is, but I think it’s worth investigating.
Also, a significant % of humanity is already tacitly longtermist in the sense of believing in an infinite religious afterlife, and trying to act accordingly. Every Christian who takes their theology seriously & literally (i.e. believes in heaven and hell), and who prioritizes Christian righteousness over the ‘temptations of this transient life’, is doing longtermist thinking about the fate of their soul, and the souls of their loved ones. They take Pascal’s wager seriously; they live it every day. To such people, X risks aren’t necessarily that frightening personally, because they already believe that 99.9999+% of their sentient experience will come in the afterlife. Reaching the afterlife sooner rather than later might not matter much, given their way of thinking.
However, even the most fundamentalist Christians might be responsive to arguments that the total number of people we could create in the future—who would all have save-able souls—could vastly exceed the current number of Christians. So, more souls for heaven; the more the merrier. Anybody who takes a longtermist view of their individual soul might find it easier to take a longtermist view of the collective human future.
I understand that most EAs are atheists or agnostics, and will find such arguments bizarre. But if we don’t take the views of religious people seriously, as part of the cultural landscape we’re living in, we’re not going to succeed in our public outreach, and we’re going to alienate a lot of potential donors, politicians, and media influencers.
There’s a particular danger that overemphasizing the more exotic transhumanist visions of the future will alienate religious or political traditionalists. For many Christians, Muslims, and conservatives, a post-human, post-singularity, AI-dominated future would not sound worth saving. Without any humane connection to their human social world as it is, they might prefer a swift nuclear Armageddon followed by heavenly bliss to a godless, soulless machine world stretching ahead for billions of years.
EAs tend to score very highly on Openness to Experience. We love science fiction. We like to think about post-human futures being potentially much better than human futures. But if that becomes our dominant narrative, we will alienate the vast majority of currently living humans, who score much lower on Openness.
If we push the longtermist narrative to the general public, we’d better make the long-term future sound familiar enough to be worth fighting for.
Based on my memory of how people thought in the church I grew up in, I don’t think increasing the number of saveable souls makes sense for a Christian—or fits any sort of longtermist utilitarian framework at all.
Ultimately, God is in control of everything. Your actions are fundamentally about your own soul and your own eternal future, not about other people. Their fate is between them and God, and He who knows when each sparrow falls will not forget them.
I remember my father explicitly saying that he regretted not having more children, because he’d since learned that God wants us to create more souls for him. It didn’t make sense to me even as a Christian at the time, but the idea is out there.
There are fringe movements (e.g., Quiverfull) that focus on procreation as a way of living out God’s will, but they’re rare. What resonates with Christians is a “stewardship” mindset—using our God-given abilities and opportunities wisely. The Bible is full of stories of an otherwise-unremarkable person being in the right place at the right time to make a historically impactful decision.
Eliezer’s underrated Fun Theory sequence tackles this.
“However, even the most fundamentalist Christians might be responsive to arguments that the total number of people we could create in the future—who would all have save-able souls—could vastly exceed the current number of Christians.”
I hadn’t thought about the above before; thanks for pointing it out!