As far as I know, there are no estimates (at least not public ones). But as Stan pointed out, Tobias Baumann has raised some very relevant considerations in different posts/podcasts.
Fwiw, researchers at the Center on Long-Term Risk think AGI conflict is the most concerning s-risk (see Clifton 2019), although the full reasoning behind that view may be hard to grasp from their posts alone, without talking to them directly.