Similar to Ollie and Larks, I’m slightly uncomfortable with
“(i) Those who live at future times matter just as much, morally, as those who live today;”
I’m pretty longtermist (I work on existential risk) but I’m not sure whether I think that those who live at future times matter “just as much, morally”. I have some sympathy with the view that people nearer to us in space or time can matter more morally than those very distant—separately from the question of how much we can do to affect those people.
I also don’t think it’s necessary for the definition. A weaker definition would work just as well. Something like:
“(i) Those who live at future times matter morally”.
Hi John, thanks for the very detailed response. My claim was that ecosystem shift is a “contributor” to existential risk: it should be examined to assess the extent to which it is a “risk factor” that increases other risks, one of a set of causes that may overwhelm societal resilience, and a mechanism by which other risks cause damage.
As I said in the first link, “humanity relies on ecosystems to provide ecosystem services, such as food, water, and energy. Sudden catastrophic ecosystem shifts could pose equally catastrophic consequences to human societies. Indeed, environmental changes are associated with many historical cases of societal ‘collapses’, though the likelihood of occurrence of such events and the extent of their socioeconomic consequences remain uncertain.”
I can’t respond to your comment at the length it deserves, but we will be publishing papers on the potential link between ecosystem shifts and existential risk in the future, and I hope that they will address some of your points.
I’ll email you with some related stuff.
Thanks for the question. Climate change is a contributor to existential risk. Changing what business schools teach (specifically to include sustainability) might change the behaviour of the next generation of business leaders.
We also have further publications forthcoming on the link between climate change and existential risk.
Thanks for the question. Biodiversity loss and associated catastrophic ecosystem shifts are a contributor to existential risk. Partha’s review may influence UK and international policy.
We also have further publications forthcoming on the link between biodiversity and existential risk.
Does anyone have any idea when we’ll be able to embed YouTube videos on the forum?
Warren introduced the No First Use Act (“It is the policy of the United States to not use nuclear weapons first.”) and Gillibrand is a co-sponsor.
I don’t really understand the conclusion this post is arguing for (or if indeed there is one). In particular, I didn’t spot an answer to “how can we influence the long-term future?”.
If this research seems interesting to you, CSER is currently hiring!
“Cause areas shouldn’t be tribes”
“We shouldn’t entrench existing cause areas”
“Some methods of increasing representativeness have the effect of entrenching current cause areas and making intellectual shifts harder.”
Does this mean you wouldn’t be keen on e.g. “cause-specific community liaisons” who mainly talk to people with specific cause-prioritisations, maybe have some money to back projects in ‘their’ cause, etc.? (I’m thinking of something analogous to an Open Philanthropy Project Program Officer.)
The recent quality of posts has been absolutely stellar*. Keep it up everyone!
*interesting, varied, informative, written to be helpful/useful, rigorous, etc
Really glad to see you taking conflicts of interest so seriously!
This is incredibly valuable (and even groundbreaking) work. Well done for doing it, and for writing it up so clearly and informatively!
Thanks for this!
I personally agree that Democratic control of Congress, or even Congress and the Presidency, would be great. But I’m not sure how likely that is, or how certain that I should be about that likelihood.
Even if there were high certainty and high likelihood, I probably still wouldn’t take that option—the increased risk over four years is just too high. As Michael_S says, you get higher nuclear risk and higher pandemic risk. As I said in my post, I think Trump also raises the risks of increased global instability, increased international authoritarianism, climate change, and emerging technologies. Take climate change—we really don’t have long to fix it! We need to make significant progress by 2030; we can’t afford to go backwards for four years.
[Writing in a personal capacity, my views are not my employer’s]