We thought about including such a scenario but decided against it. We think it might give the EA community a bad rep, even though some people have already publicly talked about it.
“Pivotal act” includes [scary sounding stuff]; if you don’t want to discuss that, fine. But I think it’s tragic how under-discussed the very different kinds of pivotal acts, pivotal processes, or simply things that would be very good are. Don’t assume it has to look like [scary sounding stuff].
Another kind of scenario that comes to mind: a pivotal act exists that is possible before AGI, and one actor performs it.
Another catalyst for success that comes to mind: reducing existential risk from misuse & conflict caused by AI.