Does it really make sense to prioritize AI over problems like poverty, malnutrition, or lack of healthcare?
It really depends on how long you think we have left before AI threatens our extinction (i.e. causes the death of every human and animal, and all biological life, on the planet). I think it could be as little as a year, and it's quite likely (>50%) to happen within the next 5 years.