How much is 1.8 million years of work?

In What will GPT-2030 look like?, Jacob Steinhardt imagines what large pretrained ML systems might look like in 2030. He predicts that a hypothetical GPT2030 would:

  1. Be superhuman at specific tasks

  2. Work and think quickly

  3. Be run in many parallel copies

  4. Learn quickly in parallel

  5. Be trained on additional modalities

Thinking through the implications of any of these predictions is, I think, pretty interesting, but here I want to focus on getting a handle on 2-4.

As I understand it, 4 is basically a function of 2 and 3: systems will learn quickly in parallel in proportion to how fast they work/think/learn individually, and to how many systems can be run in parallel. (There are probably other technical details here; I'm going to ignore them for now.)

It makes intuitive sense to me that GPT2030 will be fast at thinking and fast at learning. But how fast is fast?

Jacob predicts that:

  • GPT2030 will think/work 5x as fast as a human

  • GPT2030 will be trained on enough compute to perform 1.8 million years of work at human speed

  • It will be feasible to run at least 1 million copies of GPT2030 in parallel

On the back of this, he estimates that:

  • GPT2030 will be able to perform 1.8 million years of work in 2.4 months[1]

  • GPT2030 will be able to do 2,500 years of learning in 1 day[2]

For now, let’s just assume that these forecasts are sensible. How much learning is 2,500 years? What would 1.8 million years of work in 2.4 months look like?

(An aside on why I’m interested in this: I want to have a sense of how ‘good’ AI might be in 2030/the future. 1.0e28 training FLOPs doesn’t viscerally mean anything to me. 1.8 million years of work starts to mean something—but I still don’t really grok what it would mean if you could fit that many years of work into 2.4 months.)

Assuming that people work 8 hours a day, 5 days a week, 50 weeks a year,[3] and then taking the first estimates I found on the internet for how many people are in different groups, then in a single day:

  • PhD students study for a bit more than 8000 years[4]

  • DOD employees work for close to 2000 years

  • Amazon employees work for around 900 years

  • Mathematicians work for just over 400 years

  • AI researchers work for nearly 200 years

  • Microsoft and Alphabet each work for a bit over 100 years

  • Google DeepMind works for a year and a half

  • OAI works for nearly 10 months

  • Anthropic works for nearly 4 months
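The per-day figures above all follow from a single conversion, sketched below. The headcounts here are my own rough back-filled guesses, not figures from the post, and I'm assuming the post counts a "year of work" as a continuous 24 × 365-hour year — that seems to be the convention that makes 1 million always-on AI copies come out at roughly 2,500 years per day.

```python
# Back-of-envelope: continuous-time "years of work" a group performs per
# calendar day. Assumptions (mine, roughly matching the post): humans work
# 2000 hours/year (8 h/day, 5 days/week, 50 weeks/year), and a "year of
# work" is counted as 8760 continuous hours.

HOURS_WORKED_PER_YEAR = 8 * 5 * 50    # 2000 working hours per human-year
HOURS_PER_CONTINUOUS_YEAR = 24 * 365  # 8760 hours in a continuous year

def work_years_per_day(headcount: int) -> float:
    # Hours the group works in an average calendar day...
    hours_per_calendar_day = headcount * HOURS_WORKED_PER_YEAR / 365
    # ...expressed as continuous years of work.
    return hours_per_calendar_day / HOURS_PER_CONTINUOUS_YEAR

# Illustrative headcounts (my rough guesses, not the post's exact inputs):
groups = {
    "PhD students": 13_000_000,
    "DOD employees": 2_900_000,
    "Amazon employees": 1_500_000,
}
for name, n in groups.items():
    print(f"{name}: ~{work_years_per_day(n):,.0f} years of work per day")
```

With these guessed headcounts the outputs land near the post's figures (a bit over 8,000 years for PhD students, close to 2,000 for the DOD, around 900 for Amazon).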

So an AI system that could fit 2,500 years of learning/thinking/working into one day would be doing:

  • A third of the learning of all PhD students in the world

  • 10x the work of all AI researchers

  • Thousands of times the work of OAI, GDM and Anthropic combined

Every day.

What about doing 1.8 million years of work in 2.4 months?

Making the same assumptions about human working time, you’d need around 40 million humans to fit 1.8 million years of work into that time.[5]

For scale:

  • There are around 30 million software engineers in the world

  • The entire labour force of Germany is around 44 million

So GPT2030 would be doing more work than all software engineers combined.

__________________________________________________

So, where does all of this get me to? Mostly, that fast is pretty darn fast.

On Jacob’s predictions, it’s not the case that GPT2030 could do more work than humanity combined or anything crazy like that—but GPT2030 could be doubling the amount of software engineering, or ten-x-ing the amount of AI research, or thousand-x-ing the amount of AGI research.

I think worlds like that could look pretty strange.

Thanks to Owen Cotton-Barratt, Oscar Delaney, and Will MacAskill for comments; and to Max Dalton for unblocking me on posting.

  1. ^

    Here I think Jacob is taking a central estimate of 1.8 million copies, rather than his lower bound of at least 1 million. So 1.8 million systems working 5x as fast as humans can do 1.8 million years of work in 1 year / 5x speedup = 2.4 months.

  2. ^

    Here I think Jacob is using his lower bound of 1 million copies, and also isn’t factoring in the 5x speedup, I presume to make an even more conservative lower bound. So 1 million copies working for 1 day each is 1 million days, which is around 2,500 years. (With the 5x speedup, it would be around 13,500 years.)

  3. ^

    Which seems to be roughly the global average right now, from https://ourworldindata.org/working-more-than-ever (8*5*50 = 2000)

  4. ^

    Assuming 222 million people in tertiary education, from https://www.worldbank.org/en/topic/tertiaryeducation, and then assuming that a) the percentage of the global population with tertiary education is 17%, from https://ourworldindata.org/grapher/share-of-the-population-with-completed-tertiary-education?tab=table&time=2020..2025, b) the percentage of the global population with a PhD is 1%, and so c) the proportion of those in tertiary education who are PhD students is roughly 1:17.

  5. ^

    There are 73 days in 2.4 months (365 days * 2.4/12 months). So that’s ~25,000 years of work per day (1.8 million years / 73 days). Converting that into hours (at 24 * 365 hours per continuous year) and dividing by the hours worked by the average human in an average calendar day (2000 / 365 ≈ 5.5 hours) gets you to ~40 million humans.
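The arithmetic in footnotes 1, 2, and 5 can be collected into one minimal sketch. The inputs are the post's stated figures (1.8 million or 1 million copies, a 5x speedup, 2,000 working hours per human-year); the variable names and rounding are mine.

```python
# Back-of-envelope checks of the footnote arithmetic.

COPIES_CENTRAL = 1_800_000  # footnote 1: central estimate of parallel copies
COPIES_LOWER = 1_000_000    # footnote 2: conservative lower bound
SPEEDUP = 5                 # thinking/working speed relative to a human

# Footnote 1: 1.8M copies clear 1.8M years of work in one year of
# human-speed time; at 5x speed that takes 12 / 5 = 2.4 calendar months.
months_needed = 12 / SPEEDUP  # 2.4

# Footnote 2: 1M copies running for one day, ignoring the speedup,
# give 1M days of thinking time.
years_per_day_lower = COPIES_LOWER / 365            # ~2,740 ("around 2,500")
years_with_speedup = years_per_day_lower * SPEEDUP  # ~13,700 ("around 13,500")

# Footnote 5: humans needed to match 1.8M years of work in 2.4 months.
days = 365 * 2.4 / 12                    # 73 calendar days
work_hours = COPIES_CENTRAL * 24 * 365   # 1.8M continuous years, in hours
human_hours_per_day = 8 * 5 * 50 / 365   # ~5.48 working hours per calendar day
humans_needed = work_hours / (days * human_hours_per_day)

print(f"{months_needed} months, ~{years_per_day_lower:,.0f} years/day, "
      f"~{humans_needed / 1e6:.0f} million humans")
```

Running this reproduces the post's figures to within its rounding: 2.4 months, roughly 2,700 years of thinking per day (rounded down to ~2,500 in the post), and about 39 million humans, i.e. "around 40 million".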