Pronouns: she/her or they/them.
I got interested in EA back before it was called EA, before Giving What We Can even had a website. Later on, I got involved in my university EA group and helped run it for a few years. Now I’m trying to figure out where EA fits into my life these days and what it means to me.
Good grief. Maybe we can retrospectively say that AGI was created in the 1990s. Maybe Babbage’s Analytical Engine was AGI.
This is an absurd and discrediting statement. Sorry, Tyler Cowen: you’ve said things on blogs and podcasts that I found interesting or insightful, and Ezra Klein respects you a lot, which goes a long way for me because I respect Ezra a lot. But this is sheer madness.
As Andrew Connor points out in the text quoted in this post, there are simple tests of fairly rudimentary intelligence, like ARC-AGI-2, that, as far as we know (the full test results are still forthcoming), o3 can’t pass at a human level. And ARC-AGI-2 isn’t a test of AGI, nor will ARC-AGI-3 be: human-level performance on them is necessary, but not sufficient, for AGI. It’s a very low bar to clear.
And that’s just for starters. I recently posted a great talk by François Chollet here that points out many weaknesses in LLMs, and I can’t imagine o3 overcomes all or even most of them.
There are all kinds of tests we could come up with for AGI, such as a deliberately challenging, adversarial Turing test with no time limits and with judges who are savvy about LLMs, or the ability to fully and autonomously replace a highly skilled knowledge worker, such as a journalist, an academic research assistant, or a paralegal, with only the same level of input and supervision that a human in that role would require. Can current frontier AI systems like o3 do the long-term, autonomous, hierarchical planning that level of complexity demands? No. Absolutely not. Not even close.
I think people should interpret Tyler Cowen making such a ridiculous statement as a signal that the AGI discourse is rotten and off the rails. This sucks.