Hi, Nir.
You make some great points.
Somehow, EA folks seem to be good at keeping an emotional distance between climate change and existential or extinction risk. They believe that their focus on direct vs. indirect risks justifies denying that climate change is an existential risk. For example, they claim it’s less important than pandemics (although it will contribute to pandemics) or the risk of nuclear war (although it will increase that risk too).
I like to see solutions to problems as either individual, group, or systemic. An individual solution to a systemic problem is one that protects just the individual. I’m fairly sure that most people in developed countries adopt an individualist attitude toward the systemic problem of climate change.
As individuals, we have to play along with the bigger system if we can’t change it. It’s very hard for me not to have a conflict of interest around climate change somewhere in how I live, and I think the same is true for EAs.
Whatever I am invested in as an individual probably contributes to global warming in a significant way when everyone does it collectively. Whether it’s elements of my lifestyle, my political stance, or my vision of the future (for example, techno-utopianism), addressing the root causes of global warming creates conflicts for me personally, politically, or professionally.
Whether I:
play along with technological determinism (by which I mean the marketing line that tech companies have a great vision of the future for the consumers who use their products)
sell myself on techno-utopianism (for example, that a nanotech future will be worth living in or that AI will solve our problems for us)
pretend that old talking points are still relevant (for example, that climate change could, rather than will, have existential and extinction consequences)
speak hopefully about actual efforts to solve climate change (for example, the recent climate change bill that made it into law in the US)
all I’m doing is trying to protect myself as an individual, or maybe some small group that I care about.
We do need research into tipping points, better hardware to run higher-resolution (smaller mesh size) atmospheric and ocean models, better modeling of systemic and cascading risks, and so on. We might, for example, develop better models of how geoengineering efforts could work or fail. So, yeah.
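(Just to make “systemic and cascading risks” a bit more concrete, here is a minimal, purely illustrative Monte Carlo sketch in Python. The tipping elements, baseline probabilities, and couplings are all invented for the example and are not real climate estimates; the point is only to show the kind of structure such risk models try to capture, where one element tipping raises the odds that others tip.)

```python
# Purely illustrative: a toy Monte Carlo sketch of "cascading" tipping risk.
# Every element, probability, and coupling below is invented for the example;
# none of it is a real climate estimate.
import random
from collections import deque

# Hypothetical tipping elements with made-up baseline probabilities of
# tipping on their own over some fixed time horizon.
BASELINE = {
    "ice_sheet": 0.10,
    "permafrost": 0.15,
    "amoc": 0.05,
    "amazon_dieback": 0.08,
}

# Made-up couplings: if the source element tips, the target gets one extra
# chance of tipping with the given probability (independent-cascade style).
COUPLING = {
    ("ice_sheet", "amoc"): 0.20,
    ("permafrost", "amazon_dieback"): 0.10,
    ("amoc", "amazon_dieback"): 0.15,
}

def simulate_once(rng: random.Random) -> int:
    """Simulate one run and return how many elements end up tipped."""
    # Step 1: independent baseline draws.
    tipped = {e for e, p in BASELINE.items() if rng.random() < p}
    # Step 2: propagate cascades breadth-first.
    queue = deque(tipped)
    while queue:
        src = queue.popleft()
        for (s, dst), bump in COUPLING.items():
            if s == src and dst not in tipped and rng.random() < bump:
                tipped.add(dst)
                queue.append(dst)
    return len(tipped)

def estimate_multi_tip(n_runs: int = 100_000, seed: int = 0) -> float:
    """Estimate the probability that two or more elements tip."""
    rng = random.Random(seed)
    hits = sum(simulate_once(rng) >= 2 for _ in range(n_runs))
    return hits / n_runs

if __name__ == "__main__":
    print(f"P(2+ elements tip) ~ {estimate_multi_tip():.3f}")
```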
But that research will also confirm something like what Carl Sagan told Congress in the 1980s, what climatologists began worrying about publicly in the 1970s, and what scientists already understood about heat-trapping gases in the 1960s. We will confirm, with ever greater certainty, that we should really do something about GHGs and anthropogenic climate change.
When EA folks say that climate change is not neglected, what they leave unsaid is that genuine climate adaptation and mitigation efforts are limited or doomed to failure. What about BECCS? CCS? Planting trees? Migration assistance for climate refugees? All infeasible at the necessary scale and in the time available.
Furthermore, the failure of climate change prevention is the only reason a paper like the Climate Endgame paper would ever get published. Passing five tipping points in the near term? That is an utter failure of prevention efforts. Tipping points were not supposed to be passed at all. The assumption that tipping points lie in the distant future is what kept the discussion of “fighting climate change” a hopeful one, and now that assumption has to be given up.
Didn’t EA start with the understanding that a lot of money and energy is wasted in charitable efforts? Well, similar waste must be happening in the climate change arena: governments are taking action based on silly risk models or outdated models of the causes, so their attention is misdirected and their money is wasted.
So I agree with you: yes, EA folks should take climate change seriously. It could help if EAs accepted that climate change poses an existential and extinction threat this century. Beyond that, I don’t know what EAs could really accomplish, unless they were willing to do something like fund migration for climate refugees, pay for cooling technologies for the poor, or rebuild infrastructure in countries without an effective government.