Hey, everyone. I don’t post here often and I’m not particularly knowledgeable about strong longtermism, but I’ve been thinking about it lately and wanted to share a thought I haven’t seen addressed anywhere. I was wondering whether it’s reasonable. I’m not sure this is the right place to ask, but here goes.
It seems to me that strong longtermism is extremely biased towards human beings.
For most catastrophic risks I can imagine (climate change, AI misalignment, and maybe even nuclear war* or pandemics**), it seems unlikely that Earth would become uninhabitable for a long period, or that all life on Earth would be seriously disrupted.
Some of these events (e.g. climate change) could have significant short- to medium-term effects on all life on Earth, but in the long run (after several million years?), I’d argue the impact on non-human animals would likely be negligible, since evolution would eventually find its way. So if this is right, and you consider the very long term and value all lives (human and non-human) equally, wouldn’t strong longtermism imply not doing anything?
Although I am definitely somewhat biased towards human beings myself and think existential risk is a very important cause, I wonder whether this critique makes sense.
*Regarding nuclear war, I guess it would depend on the duration and intensity of the radioactive fallout, which is not a subject I’m familiar with.
**From what I’ve learned over the last year and a half, it wouldn’t be easy for viruses (I’m not sure about bacteria) to infect lots of different species (COVID-19 doesn’t seem to be a problem for other species).
Thanks for the comment. I really hadn’t considered colonizing the stars and bringing animals along.