Most animals are wild animals, so the answer to this question should focus on them. It seems to me that the answer largely depends on how we understand “goes well for humans”, and what we expect the counterfactual to be.
So what are the possible scenarios?
1. AGI empowers humans to make their own decisions, and to make better decisions. I expect this would greatly accelerate progress toward helping wild animals. This would be great.
2. AGI replaces human decision-making. It then either:
   (a) reasons further from a starting point of human values, removing biases and inconsistencies, which I think would lead it to care more about animals; or
   (b) simply locks in current human values.
And what’s the counterfactual?
1. A continuation of the world as it is today: one where humanity gradually cares more and more about animal welfare, and in which there is at least a potential for caring about wild animals to be normalized. In this case, scenarios 1 and 2(a) seem good, but 2(b) seems very bad.
2. A world in which the WAW (wild animal welfare) movement fails. In this case even 2(b) doesn't look that bad, and 1 and 2(a) seem very good.
I’m not sure this list of scenarios is complete. I’m also not sure how to assign probabilities, since I don’t think I know enough about AGI. But tentatively, I expect scenario 2 to be most likely, with (a) and (b) roughly equally likely, and counterfactual 1 to be most likely. For that reason I’m going with 20% likely to be good.
But I want to say that I would not take a bet with a 20% chance of winning everything and an 80% chance of losing everything, and this feels very close to that. I think this is a terrible gamble and we shouldn’t take it. I hope the debate results won’t be understood as EAs saying that this is a bet worth taking.
I can imagine a future where most animals are farmed animals. I’m not saying it’s particularly likely, but if humans spread to other planets, I think we’re more likely to take factory farming with us than take nature with us. Farmed animals should be part of this convo imo.
Does that mean you think it’s likely that we will spread to other planets without spreading ecosystems? If we spread ecosystems it seems likely that we would also spread at least some wild animals. And I think we have good reasons to do so—to promote good atmospheres and other ecosystem services.
I feel pretty skeptical that humans capable of going to other galaxies would not have realized the inefficiencies of meat and would still not have made competitive substitutes.
This. I do not see off-world animal farming as a real issue. It’s such an energy- and resource-inefficient way of making food. Indeed, extremely efficient food production seems to be a prerequisite for, or at least a proxy indicator of, an Earth-independent sustainable civilization. You can’t possibly be on Mars or aboard an interstellar ship and still keep a thousand cows around to make some cheese.