Why don’t we fund movies and documentaries that explore EA topics?
It seems to me that the way society thinks about the future is largely shaped by movies and documentaries. Why don’t we create movies that shape those views in a way that’s more realistic and useful? E.g., I haven’t read the discussion on whether Terminator is a good comparison for AI risks, but it’s almost certainly not a perfect one. Why don’t we create a better one that we could point people to? Something that would explore many important points.

Now that EA has more money, that seems plausible. In 2021, OpenPhil gave grants totalling $77.6 million for work on potential risks from advanced AI. The budget of a movie with an all-star cast and special effects, like Don’t Look Up, is $75 million. But the difference is that a movie might make money, maybe even more than its budget. It’s not obvious to me that even something this extravagant would be a bad investment, because it might make it easier to make progress on AI policy and other work for years to come.

Of course, movies wouldn’t have to be so high-budget, especially at the start. And a better approach would probably be creating documentaries. Maybe a series like Vox’s Explained for various EA issues or for longtermism. I think it could become popular, because some EA ideas about how the far future might look seem more interesting than a lot of sci-fi, and also more novel to most people. And this isn’t just about AI. E.g., I can imagine a nuanced documentary about wild animal suffering that also talks about why we should think twice before spreading nature to other planets.
Anyway, this is just a shower thought. I imagine this has been discussed before, but I wanted to post it in case it hasn’t been discussed enough. And note that I’ve never worked on AI, so I don’t know what I’m talking about in that part of the post.
Maybe a consideration is that these sorts of collaborations are harder to set up than they seem.
Basically, execution and alignment seem important and hard.
Even if media talent and funding are available, getting the aesthetic (in more than one sense) and the content right seems difficult.
It’s unclear, but there may be downside risk (from looking silly or condescending).
This may not pertain to Saulius’s point, which isn’t focused on outreach, but people have cringed at, or even vehemently opposed, certain kinds of involvement, like US federal agencies getting involved in funding AI safety. So directly promoting the movement (as opposed to underlying ideas or topics) isn’t seen as robustly good, but this is highly unclear.
Check out #33 on the Future Fund list of project ideas:
https://ftxfuturefund.org/projects/
I think people have been working with Kurzgesagt and probably others.
Ah, thanks so much for pointing this out! Happy to see that funders already have this idea on their radar and I don’t need to do anything :)