It might be somewhat hard to follow, but this little prediction market is interesting (wouldn’t take the numbers too seriously):
In December of last year, it seemed plausible to many people online that by now, August 2023, the world would be a very strange, near-apocalyptic place full of inscrutable alien intelligences. Obviously, that didn't happen. So it may be worth comparing others' "vibes" here against your own reasoning, to check whether you're overestimating the rate of progress.
Paying for GPT-4, if you have the budget, may also help you calibrate. It's magical, but you run into embarrassing failures pretty quickly, which most commentators rarely mention.
I think this is an extremely good post laying out why the public discussion on this topic might seem confusing:
https://www.lesswrong.com/posts/BTcEzXYoDrWzkLLrQ/the-public-debate-about-ai-is-confusing-for-the-general