[Separating out this paragraph into a new comment as I’m guessing it’s what led to the downvotes, and I’d quite like the point of the parent paragraph to stand alone. Not sure if anyone will see this now though.]
I think it’s imperative to get the leaders of AGI companies to realise that they are in a suicide race (and that AGI will likely kill them too). The default outcome of AGI is doom. If extinction risk really were at the 1% level, it might seem reasonable (even though that’s still ~80M lives in expectation) to pull the trigger on AGI for a 99% chance of utopia. But this reasoning is totally wrong-headed, and is arguably contributing massively to current x-risk.