Yes. This is no comfort for me in terms of p(doom|AGI). There will be sudden changes in control requirements, judging by the big leaps of capability between GPT generations.
Being more controllable is one thing, but it doesn’t matter much for reducing x-risk when the numbers being talked about are “29%”.
That’s what I meant by “phase transition”.