A random idea on how this film could end by explicitly promoting existential risk awareness:
At the end of the film, imagine the comet impacts Earth and humanity goes extinct.
The audience is surprised that the movie actually ends in extinction and the heroes don’t win.
The otherwise-comedic film ends on this serious, sad note.
The screen goes dark, and the following text appears:
“Experts estimate the chance of extinction via asteroid or comet impact within the next 100 years at only ~1 in 1,000,000.”
“However, experts believe the chance of extinction from other causes is much higher.”
The film cuts from black to Toby Ord in his office, reading the caption of Table 6.1 from his book The Precipice: “My best estimates for the chance of an existential catastrophe from each of these sources occurring at some point in the next 100 years...”
The film cuts to the table itself as he begins reading the risks aloud:
“If you’re wondering if this is a joke, it’s not. The risks really do seem to be this high.”
Cut to credits.
Toby Ord reading a table out loud sounds like a bridge too far, but it’s not uncommon for movies to end with a link to some relevant real-world resource. If I knew the people behind this movie (I don’t) and thought there might be time to change it (no idea), I’d probably advocate for something like this (many ways to improve the wording, I’m sure) before the credits:
This film isn’t based on a true story. But it may become one.
Learn about risks to humanity, and how you can help:
theprecipice.com
(More realistically, if I did have an in, I’d ask people like Toby Ord what message they’d want millions of random viewers to see.)
I could imagine them interviewing Toby Ord for a mockumentary, like Death to 2020.
beautiful
How much would it cost to influence the film to make this happen?
I don’t know; I doubt it’s a problem where throwing money at it is the right answer. In any case, it’s unclear to me whether doing this would actually be positive value or not. I imagine it would be quite controversial, even among EAs who are into longtermism. I just shared the idea because I thought it was interesting, not because I necessarily thought it was good.
Yeah, I agree that money is not the bottleneck. I think the strongest bottleneck is decision quality on whether this is a good idea, and a secondary bottleneck is whether our Hollywood contacts are good enough to make this happen conditional upon us believing it’s actually a good idea.
Do you have a story for why this could be a bad idea?
Having popular presentations of our ideas in an unnuanced form may either a) give the impression that our ideas are bad/silly/unnuanced, or b) make them low-status, akin to how a lot of AI safety efforts are/were rounded off as "Terminator" scenarios.