Yes, totally agree that some longtermist or AI-safety-oriented types have actually thought about these things and endorse precise probabilities, including assignments I find quite strange, like thinking it's 80% likely that the universe will be dominated by sentient machines rather than wild animals. Although I expect I'd find any precise probability assignment about outcomes like this quite surprising, so perhaps I'm just a very skeptical person.
But I think a lot of EAs I talk to have not reflected on this much and don’t realize how much the view hinges on these sorts of beliefs.
Agreed. I think we should probably have very indeterminate/imprecise beliefs about which moral patients will dominate in the far future, and this imprecision arguably breaks the Pascalian wager that many longtermists take in favor of assuming enhanced human-ish minds will outnumber wild animals.
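To make the wager-breaking point concrete, here is a minimal sketch with made-up numbers (not anyone's actual credences): suppose an intervention yields payoff $+1$ if machine minds dominate and $-1$ if wild animals do, and let $p$ be the credence that machines dominate.

$$\mathbb{E}[U] = p(+1) + (1-p)(-1) = 2p - 1$$

With a precise $p = 0.8$, $\mathbb{E}[U] = 0.6$ and the wager goes through; with an imprecise $p \in [0.2, 0.8]$, $\mathbb{E}[U]$ spans $[-0.6, 0.6]$, straddling zero, so expected value no longer determinately favors the bet.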
However, many of the longtermists who would be convinced by this might fall back on the opinion I describe in footnote 1 of my above comment in the scenario (whose likelihood they don't know) where wild animals dominate, and then the crux becomes what we can reasonably think is good/best for long-term WAW.