To respond to your points in order:
1. Sure, but I think of, say, a 5% probability of success and a 6% probability of success as similarly dire: both low enough that I wouldn't want to pick either.
2. What we call AGI today, human-level at everything as a minimum but running on a GPU, is what Bostrom called speed and/or collective superintelligence, if chip prices keep falling and chip speeds keep increasing.
3 and 4. Sure, alignment isn't enough, but it's necessary, and it seems we're not on track to clear even that low bar.