No worries! The paper Existential Risk and Cost-Effective Biosecurity is the only quantification effort I am aware of. I like it, but I was looking for more because it still involves some guesses that arguably warrant further investigation:
> For the purposes of this model [“Model 2: Potentially Pandemic Pathogens”], we assume that for any global pandemic arising from this kind of research [gain of function research], each has only a 1 in 10,000 chance of causing an existential risk
>
> [...]
>
> [“Model 3: Naive Power Law Extrapolation”:] Extrapolating the power law out [seems pessimistic to me], we find that the probability that an attack [“using biological and chemical weapons”] kills more than 5 billion will be (5 billion)–0.5 or 0.000014. Assuming 1 attack per year (extrapolated on the current rate of bio-attacks) and assuming that only 10% [seems pessimistic to me] of such attacks that kill more than 5 billion eventually lead to extinction (due to the breakdown of society, or other knock-on effects), we get an annual existential risk of 0.0000014 (or 1.4 × 10–6).
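For what it's worth, the Model 3 arithmetic checks out; here is a quick sketch reproducing it. The −0.5 power-law exponent, the 1 attack/year rate, and the 10% extinction fraction are all the paper's assumptions, quoted above, not independent estimates:

```python
# Model 3 arithmetic from "Existential Risk and Cost-Effective Biosecurity"
fatality_threshold = 5e9             # attack kills more than 5 billion people
p_kill = fatality_threshold ** -0.5  # power-law tail probability: ~0.000014
attacks_per_year = 1                 # paper's extrapolation of current bio-attack rate
p_extinction_given_kill = 0.10       # paper's assumed knock-on / societal-collapse fraction

annual_x_risk = p_kill * attacks_per_year * p_extinction_given_kill
print(f"{p_kill:.6f}")         # 0.000014
print(f"{annual_x_risk:.2e}")  # ~1.4e-06
```

Note that both bracketed "[seems pessimistic to me]" annotations above attach to inputs of this product, so relaxing either one scales the 1.4 × 10⁻⁶ figure down linearly.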