Well, at a technical level the first is a conditional probability and the second is the unconditional probability of a conjunction. So the first is to be read as "the probability that alignment is achieved, conditional on humanity creating a spacefaring civilization", whilst the second is "the probability that the following happens: alignment is solved and humanity creates a spacefaring civilization". If you think of probability as a space, where the likelihood of an outcome = the proportion of the space it takes up, then:
-the first is the proportion of the region of probability space taken up by humanity creating a spacefaring civilization in which alignment occurs.
-the second is the proportion of the whole of probability space in which both alignment occurs and humanity creates a spacefaring civilization (in symbols, see below).
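In symbols, writing A and B as labels I'm introducing just for illustration (A = "alignment is achieved" and B = "humanity creates a spacefaring civilization"), the two quantities are related by the standard definition of conditional probability:

$$P(A \mid B) = \frac{P(A \wedge B)}{P(B)}$$

So if, say, P(B) = 0.5 and P(A ∧ B) = 0.2, then P(A | B) = 0.2 / 0.5 = 0.4: the conjunction is measured against the whole space, while the conditional probability is measured against just the B-region, so the conditional can be much larger than the joint probability.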
But yes, knowing that does not automatically bring real understanding of what's going on. Or at least for me it doesn't. Probably the whole idea being expressed would be better written up much more informally, focusing on a concrete story of how particular actions taken by people concerned with alignment might surprisingly be bad or suboptimal.
Thanks David, that makes sense :)