Why I think the EA Community should write more fiction
What if The Precipice (Toby Ord’s book on existential risks) were a science fiction anthology?
This is just an idea that occurred to me recently, and I’d be interested to hear what people think. While I’m not going to claim that writing EA-aligned fiction is the highest-impact thing you can do, it does seem like a great way to build the movement and to explore and communicate long-termist ideas.
I’m sure there’s already a lot of work that could be described as EA fiction, but I’d rather collect examples of it in the comments.
There’s an exploration of the impact of fiction in a previous post (https://forum.effectivealtruism.org/posts/Bc8J5P938BmzBuL9Y/when-can-writing-fiction-change-the-world) that I think is worth a read.
Why write fiction?
It’ll be enjoyable! I’ll be honest: I want this to exist so I can read it, and everything beyond this point in the post is just me trying to justify that desire. The EA community doesn’t exist for my benefit, so keep my pro-fun bias in mind as you read on.
Any X-risk is a Science Fiction Premise: If you worry that the average person won’t be interested in nuclear war or the challenges of AI alignment, the success of the Terminator franchise demonstrates that you’re wrong. The same goes for asteroid impact (Armageddon, Deep Impact), biorisk (Resident Evil, 28 Days Later) and catastrophic climate change (The Day After Tomorrow). You have to try really hard to make the end of the world seem boring, and I kind of feel that most EA writing on the subject is actively trying to do just that. Pivoting into science fiction seems pretty easy to me.
Communicating EA ideas to a wider audience: I can hear my audience screaming that the examples I’ve listed are scientifically inaccurate. Granted, I too am sceptical that the highest-impact intervention to prevent asteroid impact is to train oil drillers to become astronauts, or that zombies are the main biorisk we should concern ourselves with. Unfortunately, this is apparently where the average person is right now. I doubt we’ll be able to change that, but countering bad stories with good stories seems more likely to spread beyond the EA community than thoughtful articles on 80,000 Hours. I’m not expecting an EA blockbuster any time soon, but I’m optimistic that more realistic premises could still result in engaging stories.
Communicating ideas within the community: A lot of people say that it’s hard for them to imagine a scenario in which an AI is given enough power to do serious damage without a human intervening, which is just asking for a speculative fiction treatment. I think the same is true of a lot of other long-termist ideas, and a collection of short stories could get more people thinking about long-termist causes.
Doesn’t have to be too long or high budget: I’m a big fan of the short story, because it gives the writer enough space to fully explore an interesting idea in an accessible way, without either writer or reader having to commit to a series of novels or any high-budget special effects.
Captive Audience: If you already write, incorporating EA themes seems like a great way to get lots of people here to read your work, which will be great for getting the external validation and/or scathing criticism you need.
Exploration of Morality: Everyone kind of thinks utilitarians are evil. The average person seems to think they’re all one bad day away from pushing people in front of trolleys and harvesting their organs. If you want a villain to do something terrible, utilitarian logic can give them a plausible-sounding motivation; Thanos from Infinity War and Ozymandias from Watchmen come to mind. One way to counter this would be to have more utilitarian protagonists, and I’d love to read a story in which the hero and villain resolve their dispute through a cost-benefit analysis rather than a fistfight. Fiction is a great way to explore different moral frameworks, especially when they conflict with one another.
Reasons not to write fiction
You don’t have the time: I’m sure a lot of you are very busy and have better things to do. Fair enough. I just hope that writing is a higher-impact use of free time than watching Netflix or feeling powerless dread in the face of political turmoil, which definitely occupies too much of my life.
You don’t want to: If the idea of writing EA themed fiction doesn’t seem interesting to you, it’s probably not a great personal fit.
You’re not good at it: If you’ve tried before and your writing is just really bad, maybe you can decide this isn’t your comparative advantage (that may be me). Then again, nobody’s great the first time; maybe all you need is practice and feedback.
Low-fidelity communication of information: Getting EA ideas out there isn’t very useful if they’re wildly distorted, and fiction has a tendency to distort. I’m thinking of this more as a way to get people interested and engaged with rather abstract cause areas, supplementing rather than replacing actual information and facts. Spreading inaccurate information about EA is a possible concern, although this may be less of a risk with an explicitly fictional work.
Nobody will take us seriously: We definitely shouldn’t become a weird creative writing club, and fiction should never become a core part of a movement that’s actually trying to do something productive. However, I doubt that a few bad science fiction stories will really damage the movement: our core concerns are already the stuff of science fiction, and people already think we’re weird. While sci-fi used to be trashy pulp for children, I think the genre is now taken seriously enough by the mainstream that we need not fear guilt by association.
What about information hazards? I’m not convinced by this one, but you may worry that fiction will inspire reality, and that somebody will be inspired to create a bioweapon by your story about one. While I can understand taking reasonable precautions (e.g. not providing exact details of molecular cloning), this seems implausible to me. If we want governments to do more about biorisk, not telling anyone about it probably isn’t going to work (more here: https://forum.effectivealtruism.org/posts/ixeo9swGQTbYtLhji/bioinfohazards-1). If anything, this seems like a great excuse to justify scientific inaccuracy and speculation and to focus on broad ideas rather than closely sticking to what’s possible with current technology.
Obligatory link to Harry Potter and the Methods of Rationality (http://www.hpmor.com), both a great piece of literature on its own merits and one of the leading gateways to the LW/EA community.
Publishing a novel is very competitive, and most people who publish sell fewer than 10,000 copies. You would have to work part-time on your novel for years and receive very little money in return.
There’s also a lot of fiction available online. You can donate your time to writing, but again I think it’s unlikely that more than 10,000 people would read your novel or short story, and most of those readers wouldn’t be influenced very much unless you’re a REALLY good writer.
I’d encourage people to enjoy fiction writing as a hobby and I’d encourage people who are unusually talented to develop their skills, but this is just SUCH a competitive field!
In terms of Effective Altruist Fiction, I think Unsong ( http://unsongbook.com ) is a great example. Despite the premise being rather strange (The Bible and Talmud are literally true), Peter Singer and EA get explicitly mentioned in Cantors and Singers, and the Comet King is a great example of a utilitarian protagonist who genuinely tries to do as much good as possible (by trying to literally destroy Hell).
The idea of communicating long-termism through fiction is discussed in an episode of the 80,000 Hours podcast (https://80000hours.org/podcast/episodes/aj-jacobs-on-writing-reframing-problems-as-puzzles/), although Rob suggests an Office-style sitcom, whereas I think a science fiction thriller would be more interesting and potentially more effective.
Finding and sharing existing fiction and passages that get at EA ideas seems higher-leverage than trying to produce new fiction (for the most part). People write a huge amount of fiction from many different moral perspectives, and some of it is bound to be EA-aligned in some way.
There’s also the rational fiction community, which is highly influenced by EA and shows how EA ideas could work in a variety of settings (e.g. Superman as an X-risk, improving institutional decision-making in the Pokemon universe).
I mostly agree with Khorton.
A related idea would be to try to convince established authors to write books promoting EA or longtermism (maybe by giving them a grant?). Such authors are already very good writers with a large audience.
I guess the main problem would be the likely lack of high fidelity. Another problem is that this could be seen as a sinister attempt to push one’s (weird) agenda.
There is a German author, Andreas Eschbach, who incorporates real science into his (in my opinion, great) novels to raise awareness of potentially dangerous technologies. For example, in his book “Der Herr aller Dinge” (“Lord of All Things”) he outlines the potential dangers of atomic-scale manufacturing. The book is also available in English.