Quick Polls on AI Timelines
I think it would be useful to get a feel for Forum users’ AI timelines. There are three questions, two of which are designed to align with questions on a 2023 LessWrong survey. They cover, roughly, the year of artificial general intelligence (AGI), the singularity (variously defined as the point beyond which we cannot predict, super-exponential or explosive growth of the economy, etc.), and “crazy.” Feel free to define “crazy” as you wish, but some possibilities are: greater than 20% unemployment in most countries, widespread political unrest, widespread loss of confidence in what is true, widespread economic growth exceeding 10% per year[1], or your personal plans being disrupted by something AI-related. It would be interesting to see in the comments how people define it. Please use the median year of your distribution (an even chance of the event happening before or after that year).
There are 21 locations on each poll, and they correspond to these years (if you comment, it would be helpful for you to put the year in, as the automatic description is not very helpful):
2026
2027
2028
2029
2030
2032
2035
2037
2040
2045
2050 (the middle of the poll range)
2060
2070
2080
2100
2125
2150
2200
2300
later
never
By what year do you think AI will be able to do intellectual tasks that expert humans currently do?
By what year do you think the singularity will occur?
By what year do you think the world will get crazy?
Edit: Now that the polls have closed, I thought I would offer some commentary. The median years for AGI, singularity, and crazy are 2035, 2038.5, and 2035, respectively (if the options were continuous, it looks like the AGI median would have been ~2034). The LessWrong medians were 2030 for AGI (sooner) and 2040 for the singularity (later). About 15% expect AGI, and separately crazy, by 2030 or sooner. One thing that surprised me was that, by medians, people expected crazy ~1 year after AGI (~3 years after, when I matched individual forecasts). I, by contrast, expect crazy about 6 years before AGI, partly because of Epoch modeling indicating a 10% global economic growth rate before 2029, well before full automation (and, I think, AGI).
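To illustrate what a “continuous” median means here: with discrete poll options, the reported median is just the option where cumulative votes cross 50%, but you can linearly interpolate within that option’s bin to get a finer estimate. The sketch below uses made-up vote counts (not the actual poll data) purely to show the calculation.

```python
# Hypothetical vote counts per poll option (NOT the actual poll results),
# used only to illustrate interpolating a continuous median from discrete bins.
years = [2026, 2027, 2028, 2029, 2030, 2032, 2035, 2037, 2040, 2045,
         2050, 2060, 2070, 2080, 2100, 2125, 2150, 2200, 2300]
votes = [1, 1, 2, 3, 4, 5, 6, 4, 3, 2, 2, 1, 1, 1, 1, 0, 0, 0, 1]

def interpolated_median(years, votes):
    """Treat each option as a bin and linearly interpolate the year at
    which the cumulative vote count crosses half the total."""
    total = sum(votes)
    cumulative = 0
    for i, (year, v) in enumerate(zip(years, votes)):
        if cumulative + v >= total / 2:
            # Fraction of this bin's votes needed to reach the midpoint
            frac = (total / 2 - cumulative) / v
            # Interpolate toward the next option's year (bin width varies)
            next_year = years[i + 1] if i + 1 < len(years) else year
            return year + frac * (next_year - year)
        cumulative += v
    return years[-1]

print(round(interpolated_median(years, votes), 1))  # prints 2036.0 for this data
```

With these illustrative counts, the discrete median option is 2035, but half the votes fall partway through that bin, so the interpolated median lands at 2036.0; the same idea would move the actual poll’s AGI median from 2035 to ~2034.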
I suspect that EAs working on AI were more likely to vote, and that they tend to have shorter timelines, so timelines across all EAs are probably longer than these results, while the timelines of AI safety EAs alone would be shorter.
[1] Not recovering from a recession.