AI Risk timelines: 10% chance (by year X) should be the headline (and deadline), not 50%. And 10% is _this year_!

Artificial General Intelligence (AGI) poses an extinction risk to all known biological life. Given the stakes involved—the whole world—we should be looking at 10% chance-of-AGI-by timelines as the deadline for catastrophe prevention (a global treaty banning superintelligent AI), rather than 50% (median) chance-of-AGI-by timelines, which seem to be the default[1].
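
To make the 10%-vs-50% distinction concrete, here is a minimal sketch (not from the original post) of how far apart those two dates can sit on a single right-skewed timeline distribution. The lognormal shape and its parameters (median 5 years out, σ = 1) are assumed purely for illustration, not taken from any of the linked forecasts:

```python
# Illustrative only: an assumed lognormal years-until-AGI distribution
# (the median and sigma are made-up parameters, not a forecast).
from scipy.stats import lognorm

median_years = 5.0   # assumed median (50% chance-of-AGI-by) date
sigma = 1.0          # assumed log-space spread

dist = lognorm(s=sigma, scale=median_years)

p10 = dist.ppf(0.10)  # date by which there is a 10% chance of AGI
p50 = dist.ppf(0.50)  # median date

print(f"10% chance-by: {p10:.1f} years; 50% chance-by: {p50:.1f} years")
# -> the 10% date lands around 1.4 years out, even though the median
#    is 5 years out: planning around the median hides the early tail.
```

The same pattern holds for most right-skewed timeline distributions: the 10th-percentile date can arrive several times sooner than the median, which is why treating the median as the deadline is so risky.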

It’s way past crunch time already: 10% chance of AGI this year![2] AGI will be able to automate further AI development, leading to rapid recursive self-improvement to ASI (Artificial Superintelligence). Given that alignment/​control is not going to be solved in 2026, and that if anyone builds it [ASI], everyone dies (or at the very least, the risk of doom is uncomfortably high by most estimates), a global Pause of AGI development is an urgent, immediate priority. This is an emergency. Thinking that we have years to prevent catastrophe is gambling with a huge number of present-day human lives (on these premises, a 10% chance of extinction is on the order of 800 million lives in expectation), let alone all future generations and animals.

To borrow from Stuart Russell’s analogy: if there were a 10% chance of aliens landing this year[3], humanity would be doing a lot more than it is currently doing[4]. AGI is akin to an alien species more intelligent than us, and one unlikely to share our values.

  1. ^

    This is an updated version of this post of mine from 2022.

  2. ^

    In the answer under “Why 80% Confidence?” on the linked page, it says “there’s roughly a 10% chance AGI arrives before [emphasis mine] the lower bound”, so before 2027, i.e. in 2026. See also: the task time horizon trends from METR. You might want to argue that the 10% date is actually next year (2027), based on other forecasts such as this one, but that only makes things slightly less urgent: we’re still in a crisis if we might only have 18 months.

  3. ^

    This is different to the original analogy, which was an email saying: “People of Earth: We will arrive on your planet in 50 years. Get ready.” Here, instead, say astronomers spotted something that looked like a spacecraft heading in our direction, and estimated there was a 10% chance that it was indeed an alien spacecraft.

  4. ^

    Although perhaps we wouldn’t. Maybe people would endlessly argue about whether the evidence is strong enough to declare a 10%(+) probability. Or flatly deny it.

Crossposted to LessWrong.