The World in 2029
Links are to prediction markets, subscripts/brackets are my own forecasts, done rapidly.
I open my eyes. It’s nearly midday. I drink my morning Huel. How do I feel? My life feels pretty good. AI progress is faster than ever, but I’ve gotten used to the upward slope by now. There has perhaps recently been a huge recession, but I prepared for that. If not, the West feels more stable than it did in 2024. The culture wars rage on, inflamed by AI, though personally I don’t pay much attention.
Either Trump or Biden won the 2024 election (85%). If Biden, his term probably brought steady growth and good, boring decision-making (70%). If Trump, there is more chance of global instability (70%) due to pulling back from NATO (40%), lack of support for Ukraine (60%), or mishandling of the Middle East (30%). Under either administration there is a moderate chance of a global recession (30%), slightly more under Trump. I intend to earn a bit more and prepare for that, but I can imagine the median person feeling worse off if they have got used to the gains in between.
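As a sanity check on how conditional forecasts like these combine, here is a minimal sketch using the law of total probability. The 50/50 election split and the exact "slightly more under Trump" recession number are my own assumptions for illustration; the post only gives the conditional figures.

```python
# Combining conditional forecasts into an unconditional one via the
# law of total probability: P(A) = sum over outcomes B of P(A|B) * P(B).

p_trump = 0.50                   # assumed election split (not stated above)
p_biden = 1 - p_trump

p_recession_given_trump = 0.35   # "slightly more under Trump" (assumed value)
p_recession_given_biden = 0.30   # baseline recession forecast

p_recession = (p_trump * p_recession_given_trump
               + p_biden * p_recession_given_biden)

print(f"P(global recession) = {p_recession}")  # roughly a third, under these assumptions
```

The same pattern applies to the instability forecast: the 70% figure is conditional on a Trump win, so the unconditional probability is lower once you weight by how likely that win is.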
AI progress has continued. For a couple of years it has been possible for anyone to type a prompt for a simple web app and receive an entire interactive website (60%). AI autocomplete exists in most apps (80%), and AI images and video are ubiquitous (80%). Perhaps an AI has escaped containment (45%). Some simple job roles have been fully automated (60%). The sense of velocity we have felt since 2023 hasn't abated over the last five years (80%). OpenAI has made significant progress on automating AI engineers (70%).
And yet we haven’t hit the singularity (90%); in fact, it feels only a bit closer than it did in 2024 (60%). We have blown through a number of milestones, but AIs are only capable of tasks that would have taken 1–10 hours in 2024 (60%), and humans are better at working with them (70%). AI regulation has become tighter (80%): with each new jump in capabilities the public grows more concerned and demands more regulation (60%). The top labs are still in control of their models (75%), with some oversight from government; they are red-teamed heavily (60%), with strong anti-copyright measures in place (85%). Political deepfakes probably didn’t end up being as bad as everyone feared (60%), because people are more careful with sources, but deepfake scams are a big issue (60%). People in the AI safety community are a little more optimistic (60%).
The world is just “a lot” (65%). People are becoming exhausted by the availability and pace of change (60%). Perhaps fast-growing technologies will focus on bundling the many new interactions and interpreting them for us (20%).
There is a new culture war (80%), perhaps relating to AI (33%). Peak woke happened around 2024, and peak trans panic around the same time. Perhaps eugenics (10%) is the current culture war, or polyamory (10%), child labour (5%), or artificial wombs (10%). It is plausible that, as AI advances, it will be AI safety vs e/acc vs AI ethics. If so, I am already tired (80%).
In the meantime, physical engineering is perhaps noticeably out of the great stagnation. Maybe we finally have self-driving cars in most Western cities (60%), drones are cheap and widely used, we are perhaps starting to see new nuclear power stations (60%), and house building is on the up. Climate change is seen as a somewhat less significant problem: world carbon emissions have peaked, nuclear and solar are now well and truly booming, and a fusion breakthrough looks likely within the next five years.
China has maybe attacked Taiwan (25%), but probably not. Xi is likely still in charge (75%), but there has probably been a major Chinese recession (60%). The US, which is more reliant on Mexico, is less affected (60%), but Europe struggles significantly (60%).
In the wider world, both Africa and India remain deeply unequal. Perhaps one has shrugged off its dysfunction (50%), buoyed by English speakers and remote tech jobs, but it seems unlikely that either is an economic powerhouse (80%). Malaria has perhaps been eradicated in Sub-Saharan Africa (50%).
Animal welfare is roughly static, though cultivated meat is now common on menus in London (60%). It faces battles over naming conventions (e.g. whether it can be called meat) (70%) and is growing slowly (60%).
Overall, my main feeling is that it’s gonna be okay (unless you’re a farm animal). I guess this is partly priors-based, but I’ve tried to poke holes in it with the various markets attached. It seems to me that I want to focus on the 5–15 year term, when things get really strange, rather than worry deeply about the next 5 years.
This is my first post like this. How could I make the next one better?
Crossposted from my blog here: https://nathanpmyoung.substack.com/p/the-world-in-2029