P[humans and animals survive a long time]: large
P[humans survive with animal life, given super AI]: small
P[humans survive without animal life, given super AI]: much smaller
P[animals survive without humans, given super AI]: nearly none?
It seems to me that by focusing on protecting humanity and its society, you're already protecting animals by implication.
Promoting animal liberation costs a lot of weirdness points.
MIRI's efforts are already hampered by weirdness points.
So using MIRI as a platform to promote animal liberation is probably not a wise move?