One other minor mistake (I think): The video says “All the impact would derive from increasing the chance that current humans will get to experience extremely long and fulfilling lives as a result of hastening transformative technological development.” But one can also make a case for existential risk reduction based on more everyday concerns, like not wanting 8 billion people to die, to be pushed into a collapsed society, or to be pushed into a dystopia.
See also https://forum.effectivealtruism.org/tag/ethics-of-existential-risk and https://forum.effectivealtruism.org/posts/dfiKak8ZPa46N7Np6/the-person-affecting-value-of-existential-risk-reduction
I don’t think it’s worth adding a clarification to the video regarding this—I think this is a non-central point—but I think it also weakly pushes in favour of getting a little more feedback before uploading.
(But again, I do think the video is good and I’m glad it exists and is getting viewed!)
This makes me think that I could definitely use more feedback. It makes me kind of sad that I could have added this point myself (together with making the estimate by Toby Ord more precise) and didn’t because of...? I don’t know exactly.
Edit: one easy thing I could do is to post scripts in advance here or on LW and request feedback (rather than only messaging people directly and waiting for their answers, which is often slow and fatiguing).
Edit 2: Oh, and the Slack group. Surely more ideas will come to me, I’ll stop adding edits now.
Hey- Don’t be sad. It’s really brave to do work like this and make it public—thank you :) I’d definitely recommend this to newcomers (Indeed, I just did! Edited to add: The animations are so cool here!). You might consider sending future drafts through to CEA’s press team- they have a large network of people and I have always found the quality of their feedback to be really high and helpful! Thanks again for this!
Hey, thanks a lot for this comment. It did brighten my mood.
I think I’ll definitely want to send scripts to CEA’s press team, especially if they are heavily EA related like this one. Do you know how I can contact them? (I’m not sure I know what CEA’s press team is. Do you mean that I should just send an e-mail to CEA via their website?)
Absolutely :) Sky Mayhew <sky@centreforeffectivealtruism.org> and <media@centreforeffectivealtruism.org>. Sky is incredibly kind and has always given me brilliant feedback when I’ve had to do external-facing stuff before (which I find intensely nerve-wracking and have to do quite a lot of). I can’t speak more highly of their help. The people I’ve shared your video with have mainly given feedback in the form of WOW face reactions on Slack, FWIW—I’ll let you know if I get anything else. My husband also loves this type of stuff and was psyched to watch your video too. :)
Thanks a lot! This is definitely going to be helpful :)
FWIW, I think it’s almost inevitable that a script of roughly the length you had will initially contain some minor errors. I don’t think you should feel bad about that or like it’s specific to you or something. I think it just means it’s good to get feedback :)
(And I think these minor errors in this case didn’t create notable downside risk or anything, and it is still a good video.)