I believe that there’s more uncertainty about the future than there was previously.
This means that
(a) it’s hard for me to commit to a doom outcome with high confidence
(b) it’s hard for me to commit to any outcome with high confidence
(c) even if I think that doom has a <10% chance of happening, that doesn’t mean I can articulate what the rest of the probability space looks like.
To be clear, I think that someone with this set of beliefs, even someone who puts the chance of doom at only 1%, should be highly concerned and should want action to be taken to keep everyone safe from the risks of AI.
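To make that concrete, here is a minimal expected-value sketch; the 1% figure comes from above, and the stakes of roughly 8 billion lives (approximately the current world population) are an illustrative assumption rather than a real estimate:

\[
\mathbb{E}[\text{loss}] = p \times L = 0.01 \times \left(8 \times 10^{9}\ \text{lives}\right) = 8 \times 10^{7}\ \text{lives}
\]

Even at a 1% probability of doom, the expected loss is on the order of tens of millions of lives, which is far above the threshold at which we normally demand precautionary action.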
This reminds me a bit of Tyler Cowen’s take (but glad for your last paragraph!). I think Scott Alexander’s response to Cowen is good.
I agree with Tyler Cowen that it’s hard to predict what will happen, although my argument has a (not mega important) nuance that his blog post doesn’t have, namely that the difficulty of making predictions is increasing.
A (more important) difference is that I don’t commit what Scott Alexander calls the Safe Uncertainty Fallacy. I’ve encountered that argument from climate sceptics for many years, and have found it infuriating that it is simultaneously a very bad argument and one that can be made to sound sensible.