Note that this grant was made at the very peak of the period of abundant (partially FTX-driven) EA funding, when finding good funding opportunities was extremely hard.
Yeah this is the obvious dynamic going on here, thanks for pointing it out.
I think video games are a pretty promising medium to explain a bunch of safety ideas in.
I’m skeptical. My current opinion of “edutainment for GCR stuff” is the same as my opinion of edutainment more broadly: “Sounds like a good idea at first glance, basically no wins. Probably doomed.” I’d be curious to see your arguments here, or case studies.
It’s very salient to me that the very successful paperclips game managed to hit many of the hard parts of making a game about AI safety:
Popular (450k+ unique players in the first 11 days)
Managed to convey core ideas faithfully
Was aesthetically a good experience and genuinely a good game
I personally quite enjoyed it
But despite the fairly impressive output, the game, AFAIK, has ~zero traceable long-term impact. I’m not aware of anybody who was convinced to work on technical AIS as a result of the game, of people who said their world-models of AI risk improved, of significant pieces of communication that built on the game, or of tangible advocacy or policy wins.
This somewhat updates me against the genre overall. Since the paperclips game was quite successful on standard metrics (and was designed by a professor of video game design), I think most would-be grantees should be expected to develop worse games (or leave them incomplete), making them even less likely to have long-term impact.
Oh, I actually know of multiple people who told me they found a bunch of safety ideas because of the Universal Paperclips game. My guess is that it would very likely have been worth $100k+ by my lights. Of course, this kind of thing would require proper surveying to identify, but my guess is that if you included a question about it, it would show up for at least 1-2 people in the Open Phil survey, though I am definitely not confident.
While I’m also sceptical of this type of grant, I think this sort of comment fundamentally misunderstands marketing, which is what it sounds like this game essentially was. I’d be hard-pressed to name anyone who made a decision based on a single advert, yet thousands of companies pay vast sums of money to produce them.
When your reach is high enough (and 450k unique players in 11 days is a very large reach compared to, say, a two-year-old intro video by Robert Miles, which has 150k total views to date), even an imperceptibly small nudge can have a huge effect in expectation.
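To make the expected-value point concrete, here is a rough back-of-envelope sketch; the reach number comes from this thread, while the per-player probabilities are made-up placeholders rather than anything claimed in the discussion:

```python
# Rough sketch of the "small nudge at large reach" argument.
# The per-player probabilities below are illustrative placeholders only.

reach = 450_000          # unique players in the game's first 11 days (figure cited above)
p_career_shift = 1e-5    # assumed chance a player eventually moves toward AI safety work
p_model_update = 1e-3    # assumed chance a player meaningfully updates their views on AI risk

expected_career_shifts = reach * p_career_shift   # 4.5 in expectation
expected_model_updates = reach * p_model_update   # 450 in expectation

print(f"Expected career shifts: {expected_career_shifts:.1f}")
print(f"Expected model updates: {expected_model_updates:.0f}")
```

Under these (entirely assumed) numbers, the expected impact scales linearly with reach, which is the crux of the marketing analogy.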
The comparison to Robert Miles is pretty apt imo, because I’m indeed aware of people who trace their decisions to work on AI safety to Rob Miles’ videos.
I played the paperclips game 6-12 months before reading Superintelligence (which is what convinced me to prioritize AI x-risk), and I think the game made these ideas easier for me to understand and internalize.
my opinion of edutainment more broadly: “Sounds like a good idea at first glance, basically no wins. Probably doomed.”
Are you sure there are basically no wins? Kaj Sotala has an interesting anecdote about the game DragonBox in this blog post. Apparently it’s a super fun puzzle game that incidentally teaches kids basic algebra.
When I was a kid, I played some edugames of the form “pilot a submarine, dodge enemies, occasionally a submarine-themed math problem pops up”. I’m not excited about that sort of game. I’m more excited about what I’d call a “stealth edugame”—a game that would sell just fine as an ordinary game, but teaches you useful knowledge that happens to be embedded in the game mechanics. Consider the game Railroad Tycoon 2. It’s not marketed as an edugame, and it’s a lot of fun, but as you play you’ll naturally pick up some finance concepts like debt and equity financing, interest rates, the business cycle, profit and loss, dividends, buying stock on margin, short selling, M&A, bankruptcy, liquidation, etc. You’ll get an intuitive idea of what supply and demand are, how to optimize your operations for profitability, and how to prioritize investments based on their net present value.
Another example along the same lines—not primarily edutainment, but apparently law professors play clips of that movie in their classes because it is so accurate.
Nope, not sure at all. Just vague impression.

Kaj Sotala has an interesting anecdote about the game DragonBox in this blog post. Apparently it’s a super fun puzzle game that incidentally teaches kids basic algebra.
@Kaj_Sotala wrote that post 11 years ago, titled “Why I’m considering a career in educational games.” I’d be interested to see if he still stands by it and/or has more convincing arguments by now.
I think that some of the bits in that essay were too strong; in particular, one line was probably wrong, for reasons Andy Matuschak outlines:

Games are designed first and foremost to be fun—or beautiful, or engrossing, or exhilarating. Games are an aesthetic medium, and (generally speaking) they compel our participation insofar as they compel us aesthetically. It’s true that in some games, players end up developing certain skills or understandings along the way. But that doesn’t mean we can make a great game that teaches anything. You’re seeing the survivors. These games’ designers tried and discarded dozens of gameplay ideas in search of something aesthetically compelling. Then, only after they’d satisfied the primary constraint of making something fun, or beautiful, or whatever, the designers figured out how to ensure people would learn what they need as they play. Most mechanisms are not fun. Good games come from a demanding selection process which works the other way around: first, find the fun. There’s no reason at all to believe that for any arbitrary abstract topic, one can always “find the fun” which implicitly teaches it.
On the other hand, in principle it still seems to me like you should be able to make games that significantly improve on current education. Even if an edugame wasn’t as fun as a pure entertainment game, it could still be more fun than school. And people still watch documentaries because they value learning, even though documentaries can’t compete with most movies and TV shows on pure entertainment value.
But then again, for some reason DragonBox seems to have been an exception rather than the rule. Even the company that made it mostly just made games for teaching simpler concepts to younger kids afterward, rather than moving on to teaching more complicated concepts. The fact that I haven’t really heard of even reasonably-decent edugames coming out in the 11 years since that post seems like strong empirical evidence against its thesis, though I don’t really understand the reason for that.
(I also work at the Long-Term Future Fund)