That’s right! I just think that the base rate for “civilisational collapse prevents us from ever becoming a happy intergalactic civilisation” is very low. And multiplying any probability by 0.1 does matter, because when we’re talking about AGI we’re talking about something that a lot of people put at >=10% likely to happen (I put a higher likelihood on it than that, but Toby Ord’s 10% is sufficient).
So even if you assume biorisk is equivalent to AGI risk on everything else (which is the point I argue against), you would still need biorisk to be >5% likely to lead to a civilisational collapse by the end of the century for my point not to hold, i.e. that 95% of longtermists should work on AI (19/20 of the people, plus the assumption of roughly linear returns for the first few thousand people).
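To spell out where the 5% comes from (a rough sketch, assuming people are allocated in proportion to the expected existential risk averted and taking the 10% AGI figure as given): with linear returns, biorisk warrants more than 1/20 of the people only if its effective existential risk exceeds roughly 1/20 of the AGI figure, i.e.

\[
0.1 \times P(\text{collapse by 2100}) > \frac{0.10}{20} = 0.005
\quad\Longleftrightarrow\quad
P(\text{collapse by 2100}) > 5\%.
\]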