In polls, AI experts have estimated that building AGI carries a 14-30% chance of causing human extinction!
My colleague took the median number of 14% from the latest AI Impacts survey.
FWIW, I believe the median value from the linked survey is 5%. The only relevant place where 14% shows up is as the *mean* probability researchers place on high-level machine intelligence being extremely bad for humanity. The *median* probability for that same answer is 5%, and the median answer to the more specific question "What probability do you put on future AI advances causing human extinction or similarly permanent and severe disempowerment of the human species?" is also 5%.
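The mean/median gap here is what you'd expect from a right-skewed distribution of estimates: a few respondents giving very high probabilities pull the mean well above the typical answer. A minimal sketch with made-up numbers (not the actual survey data) shows the effect:

```python
from statistics import mean, median

# Hypothetical, illustrative estimates only (NOT the AI Impacts data):
# most respondents give low probabilities, a few give very high ones.
estimates = [0.01, 0.02, 0.05, 0.05, 0.05, 0.10, 0.50, 0.75]

print(f"median: {median(estimates):.2f}")   # → 0.05, the "typical" respondent
print(f"mean:   {mean(estimates):.3f}")     # → 0.191, pulled up by the tail
```

So quoting the mean (here ~19%) versus the median (5%) of the same responses gives a very different headline number, which is why it matters which statistic a summary picks.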
I'd phrase this section a little differently. As a prior, I think you should assume that charities become less cost-effective as they scale. However, the organisations that do grow should be the ones with above-average cost-effectiveness for their size. So even if a charity is less cost-effective than it was when smaller, if funders properly account for size, an average large charity should be about as cost-effective as an average small charity.