ExponentialDragon—this is such a timely, interesting, & important question. Thanks for raising it.
Tens of millions of young people are already concerned about climate change, and often view it as an existential risk (although it is, IMHO, a global catastrophic risk rather than an existential risk). Many of them are already working hard to fight climate change (albeit sometimes with strategies & policies that might be counter-productive or over-general, such as ‘smash capitalism’).
This is a good foundation for building concern about other X risks -- a young generation full of people concerned about humanity’s future, with a global mind-set, some respect for the relevant science, and a frustration with the vested political & corporate interests that tend to downplay major global problems.
How can we nudge or lure them into caring about other X risks that might actually be more dangerous?
I also agree that asking them to abandon their climate change friends, their political tribes, and their moral in-groups is usually asking too much of them.
So how do we turn a smarter-than-average 22-year-old who thinks ‘climate change will end the world within 20 years; we must recycle more!’ into someone who thinks ‘climate change is really bad, and we should fight it, but also, here’s cause area X that is also a big deal and worth some effort’?
I’m not sure. But my hunch is that we need to piggy-back on their existing concerns, and work with the grain of their political & ideological beliefs & values. They might not care about AGI X-risk per se, but they might care about AI increasing the rate of ‘economic growth’ so quickly that carbon emissions ramp up very fast, or AI amplifying economic inequalities, or AI propaganda by Big Oil being used to convince citizens to ignore climate change, or whatever. Some of these might seem like silly concerns to those deeply involved in AI research… but we’re talking here about recruiting people from where they are now, not recruiting idealized hyper-rational decouplers who already understand machine learning.
Likewise with nuclear war as an X risk. Global thermonuclear war seems likely to cause massive climate change (eg through nuclear winter), and that’s one of its most lethal, large-scale effects, so there’s potentially strong overlap between fighting climate change due to carbon emissions, and fighting climate change due to nuclear bombs.
I think EA already pays considerable lip service to climate change as a global catastrophic risk (even though most of us know it’s not a true X risk), and we do that partly so we don’t alienate young climate change activists. But I think it’s worth diving deeper into how to recruit some of the smarter, more open-minded climate change activists into EA X risk research and activism.
Thanks! You seem to think about this topic the way I do, which gives me hope that I am not alone. I am glad the world is full of people who want a better future for us, and I think directing them toward the right causes may be easier than creating new activists. Just as charities compete with each other, activist causes compete as well, right? After all, there is only a limited number of activists.