[Question] Why does AGI occur almost nowhere, not even just as a remark for economic/political models?
Depending on their attitudes toward questions like take-off speed, people argue that with the development of AGI we will face world GDP doubling within days, weeks, or a few years (with the number of years shrinking with each further doubling). Many people's timelines here seem quite broad, and expectations like "AGI within the next 2–3 decades is very likely" are common.
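To make the "shrinking doubling times" claim concrete, here is a toy arithmetic sketch (my own illustration, not a forecast from any particular model): if each GDP doubling takes half as long as the previous one, the total elapsed time converges to a finite horizon, which is why fast-takeoff scenarios look so explosive.

```python
# Toy illustration (assumed numbers, not from any specific forecast):
# if each successive GDP doubling takes half as long as the previous one,
# the total time to n doublings is a geometric series that converges.
def time_to_n_doublings(first_doubling_years: float, n: int) -> float:
    """Sum of t, t/2, t/4, ... for n terms."""
    return sum(first_doubling_years / 2**k for k in range(n))

# Starting from a (hypothetical) 25-year doubling time for world GDP:
for n in (1, 5, 10, 20):
    print(n, round(time_to_n_doublings(25.0, n), 4))
# No matter how many doublings occur, the total approaches 50 years.
```

The point of the sketch is only that "doubling times shrinking with each doubling" implies essentially unbounded growth within an ordinary planning horizon, which is exactly why its absence from medium-term forecasts seems strange to me.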
How the global order will change politically and economically over the next decades is an extensively discussed topic in both public discourse and academia, with many goals and forecasts set for years like 2050 or 2070 ("climate-neutral by 2050", "China's economy in 30 years"). Yet AGI is barely mentioned in economics classes, political research papers, and the like, despite its apparent potential to make much of today's politics redundant and overturn any economic forecast. Even if AGI were significantly less powerful than we think, and there were only a 20% chance of it arriving in the next three decades, it should be the single most debated factor in every economic or political argument with a medium-term scope. Why, do you think, is AGI so comparatively rarely a topic there?
My motivated reasoning would immediately come up with explanations along the lines of:
1. People in these disciplines are simply not that aware of AI developments.
2. Any forecasts or plans made assuming short timelines and fast takeoff speeds are useless anyway, so it makes sense to just assume longer timelines.
3. Maybe I am just not noticing the omnipresence of AGI debate in long-term economic/political discourse.
@1 seems unreasonable: if these arguments were sound, then as soon as the first AI-aware economists raised them, they would become mainstream.
@2: if that assumption were made consciously, I'd expect to hear it stated more often, at least as a side note.
@3 is hard to argue against, since it assumes I don't see the discourse. But I regularly engage with media and content from the UN on their SDGs, have taken some Economics/IR/Politics electives, try to be a reasonably informed citizen, and have friends studying these fields, and I barely ever see AI suddenly speeding things up in any forecasts or discussions.
Why might this be the case?
To me it seems like either mainstream academia, global institutions, and public discourse are heavily missing something, or we tech/EA/AI people are overly biased about the actual relevance of our own field (I'm a CS student).