I think in general we should consider the possibility that we could just fund most of the useful x-risk work ourselves (by expected impact), since we have so much money ($50 billion and growing faster than the market) that we’re having a hard time spending. Accelerating growth broadly seems to accelerate risks without actually counterfactually giving us much more safety work. If anything, decelerating growth seems better, so we can have more time to work on safety.
If it matters who gets a technology first, then targeted acceleration or deceleration might make sense.
(I’m saying this from a classical utilitarian perspective, which is not my view. I don’t think these conclusions follow for asymmetric views.)
My main objection to the idea that we can fund all useful x-risk work ourselves is that what we really want to achieve is existential security, which may require global coordination. Global coordination isn’t exactly something you can easily fund.
Truth be told, though, I’m not entirely clear on the best pathways to existential security, and it’s something I’d like to see more discussion of.