(Sorry, this is a bit stream-of-consciousness):
I assume it's because humans rely on natural ecosystems in a variety of ways for the conditions necessary for agriculture, life, etc. So, as with climate change, the long-term cost of failing to mitigate is simply massive… really, these numbers should not be thought of as very meaningful, I think, since the kinds of disruption and destruction we are talking about are not easily measured in dollars.
TBH, I find it not at all surprising that saving coral reefs would have a huge impact, since they are basically part of the backbone of the entire global ocean ecosystem, and all of this is connected.
I think environmentalism is often portrayed as some sort of hippy-dippy sentimentalism and contrasted with humanist values and economic good sense, and I've been a bit surprised how prevalent that attitude seems to be in EA. I'm not trying to say that either of you in this thread has this attitude; it's more that these comments reminded me of it… I seem to have a much stronger prior that protecting the environment is good for people's long-term future (most people here have probably heard the idea that the biodiversity we're destroying could have massive scientific implications, e.g. leading to the development of new materials and drugs).
I think the reality is that we're completely squandering the earth's natural resources, and all of this only looks good for people in the short term, or if we expect to achieve technological independence from nature. I think it's very foolhardy to assume we will achieve that independence, and assuming it is itself a source of x-risk. (TBC, I'm not an expert on any of this; I'm just sharing my perspective.)
To be clear, I also think that AI timelines are likely to be short, and AI x-risk mostly dominates my thinking about the future. If we can build aligned, transformative AI, there is a good chance we will be able to leverage it to develop technological independence from nature. At the same time, I think our current irresponsible attitude towards managing natural resources doesn't bode well, even if we grant ourselves huge technological advances (it seems to me that many problems facing humanity now require social, not technological, solutions; the technology is often already there...).