I think if it turns out that short AI timelines are wrong, those who held them should acknowledge it, and the EA community as a whole should seek to understand why we got it so wrong. I would find it odd if people who make repeatedly wrong predictions continue to be taken seriously.
I think this only applies to people who are VERY confident in short timelines. Say you have a distribution over possible timelines that puts 50% probability on <20 years, and 20% probability on >60 years. This would be a really big deal! It’s a 50% chance of the world wildly changing in 20 years. But having no AGI within 60 years is only a 5x update against this model, hardly a major sin of bad prediction.
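The "5x update" figure falls out of a quick Bayes-factor calculation. A minimal sketch, assuming (my interpretation, not the commenter's own working) that we compare the model above against a maximally vindicated skeptic who assigned ~100% to no AGI within 60 years:

```python
# Bayes-factor sketch for the "only a 5x update" claim.
# The timelines model assigns 20% probability to "no AGI within 60 years".
p_model = 0.20

# A hypothetical skeptic who was maximally confident in that outcome
# assigns it probability ~1. This is the most unfavorable comparison.
p_skeptic = 1.00

# Observing "no AGI within 60 years" gives a likelihood ratio (Bayes
# factor) against the timelines model of at most:
bayes_factor = p_skeptic / p_model
print(bayes_factor)  # 5.0
```

So even against the strongest possible rival prediction, the evidence shifts odds by only a factor of five, which is the sense in which the model is not badly falsified.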
Though if someone is, e.g., quitting their job and not getting a pension, they probably have a much more extreme distribution, so your point is pretty valid there.
I’m confused at that implication. I would make bets of that magnitude at substantially lower probabilities than 50%, and in fact have done so historically.
Though maybe “quitting their job and not getting a pension” is meant as a metaphor for “take very big life risks,” whereas to me e.g. quitting Google to join a crypto startup even though I had <20% credence in crypto booming, or explicitly not setting aside retirement money in my early twenties, both seemed like pretty comfortable risks at the time, and almost not worth writing about from a risk-taking angle.
That’s fair pushback: a lot of that really doesn’t seem that risky if you’re young and have a very employable skillset. I endorse this rephrasing of my view, thanks.
I guess you’re still exposed to SOME increased risk, e.g. that the tech industry in general becomes much smaller, harder to get into, or less well paying. But you’re exposed to risks like “the US pension system collapses” anyway, so it seems reasonable to mostly ignore this. (Unless there’s a good way of buying insurance against it?)