You’ve said you’re in favour of slowing/pausing, yet your post focuses on ‘making AI go well’ rather than on pausing. I think most EAs would assign a significant probability that near-term AGI goes very badly—with many literally thinking that doom is the default outcome.
If that’s even a significant possibility, then isn’t pausing/slowing down the best thing to do no matter what? Why be optimistic that we can “make AGI go well” and pessimistic that we can pause or slow AI development for long enough?
Yeah, maybe part of the solution is feeding AI some heroes or pantheons to model itself after, the way we humans do, to improve the odds of a positive outcome for human survival. That's part of how humans have stayed civilized this long. It wouldn't hurt!