Imagine it’s 2022. You wake up and check the EA Forum to see that Scott Alexander has a post knocking the premise of longtermism, and it’s sitting at 200 karma. On top of that, Holden Karnofsky has a post saying he may be only 20% convinced that x-risk itself is overwhelmingly important. Also, Joey Savoie is hanging in there.
Obviously, I’ll write in to support longtermism.
Below is one long story about how some people might change their views. In this story, x-risk alone wouldn’t work.
TL;DR: Some people think the future is really bad and don’t value it. To engage them, you need something besides x-risk, like a competent and coordinated movement to improve the future. Without this, x-risk and other EA work might be meaningless too. The explanation below has an intuitive or experiential quality, not a numerical one. I don’t know if this is actually longtermism.
Many people don’t consider future generations valuable because they have a pessimistic view of human society. I think this pessimism is justifiable.
If you think society will remain in its current state, it’s reasonable that you might not want to preserve it. And if you only ever think one or two generations into the future, as I think most people do, it’s hard to see the possibility of change. So this “negative” mentality is self-reinforcing: these people are stuck.
To these people, the idea of x-risk doesn’t make sense, not because the dangers aren’t real but because there isn’t anything worth preserving. To them, giant numbers like 10^30 are especially unconvincing: the numbers seem silly, and, if anything, on this view we owe the future a small society.
I think the above is an incredibly mainstream view. Many people with talent, perception, and resources might hold it.
The alternative to the mindset above is to see a long future that has possibilities. That there is a substantial chance things can be a lot better. And that it is viable to actually try to influence it.
The three sentences above seem “simple”, but for this mindset to substantially enter someone’s worldview, all three ideas need to land together at the same time. Because of this, the combination is non-obvious and unconvincing.
I think one reason the idea of, or a movement for, influencing the future is valuable is that most people don’t know anyone who is seriously trying. Influencing the future takes a huge amount of coordination and resources; it’s bizarre to attempt it on your own or with a small group of people.
I think everyone, deep down, wants to be optimistic about the future and humanity. But most people don’t take any action or spend time thinking about it.
With an actual strong movement that seems competent, it is possible to convince people that there is enough focus and investment to viably improve the future. It is this assessment of viability that produces the mental shift to optimism and engagement.
So this is the value of presenting the long-term future in some way.
To be clear, in making this shift, people are being drawn in by competence. Competence involves “rational” thinking, planning and calculation, and all sorts of probabilities and numbers.
But for these people, despite what is commonly presented, I’m not sure that focusing on numbers, using Bayes, etc., plays any role in this presentation. If someone told me they changed their worldview because they ran the numbers, I would be suspicious. Even now, I am usually skeptical when I see huge numbers or intricate calculations.
Instead, this is a mindset or worldview that is intuitive. To see this, notice that this text seems convincing (“Good ideas change the world, or could possibly save it...”) yet doesn’t use any calculations. I think this sort of thinking is how most people actually change their views about complex topics.
To make this particular change in view, I think you still need further beliefs that might be weird or unusual:
You need a sense of personal agency: the belief that you can affect the future through your own actions, even though there are billions of people. This belief might be aggressive, or just wrong.
You might also need a judgment of society and institutions that is “just right”:
You need to believe society could go down a bad path, because institutions are currently dysfunctional and fragile.
Yet you also need to believe it’s possible to design robust institutions that can change the future.
I have no idea if the above is longtermism at all. It seems sort of weak, and it seems like it would only compel me to execute my particular beliefs.
It would be sort of surprising if many people held the particular viewpoint in this comment.
This viewpoint does have the benefit that you could ask questions to interrogate these beliefs (people couldn’t just say there are “10^42 people” or something).