I really liked this!!!
Since you asked for feedback, here’s a little suggestion, take it or leave it: I found a couple things at the end slightly out-of-place, in particular “If you choose to tackle the problem of nuclear security, what angle can you attack the problem from that will give you the most fulfillment?” and “Do any problems present even bigger risks than nuclear war?”
Immediately after such an experience, I think the narrator would not be thinking about the option of not bothering to work on nuclear security because other causes are more important, nor thinking about their own fulfillment. If other causes came to mind, I imagine it would be along the lines of “if I somehow manage to stop the nuclear war, what other potential catastrophes are waiting in the wings, ready to strike anytime in the months and years after that—and this time with no reset button?”
Or if you want it to fit better as written now, then shortly after the narrator snaps back to age 18 the text could say something along the lines of “You know about chaos theory and the butterfly effect; this will be a new re-roll of history, and there might not be a nuclear war this time around. Maybe last time was a fluke?” Then that might remove some of the single-minded urgency that I would otherwise expect the narrator to feel, and thus it would become a bit more plausible that the narrator might work on pandemics or whatever.
(Maybe that “new re-roll of history” idea is what you had in mind? Whereas I was imagining the Groundhog Day / Edge of Tomorrow / Terminator trope where the narrator knows 100% for sure that there will be a nuclear war on this specific hour of this specific day, if the narrator doesn’t heroically stop it.)
(I’m not a writer, don’t trust my judgment.)
Fantastic comments, thank you! I included the bit about personal fulfillment because it’s such an important component of being able to sustain an effective career long term, but in retrospect I was so focused on including as many EA ideas as I could that I didn’t notice how out of place that sentiment is at that point in the story. I removed both that sentence and the one about more important causes, and I added a variant of your suggested replacement sentence.
I am a writer (though not a published one) and I second his judgment. I felt brief disquiet at the line he commented on, but didn’t analyze it until I read his post because the story as a whole had still worked very well for me. I think the change makes a good story better, and I thank both Steve for suggesting it and Joshua for implementing it.
I found this motivational. Thanks for posting!
I’m glad to hear it. And sure thing!
That was really, really good.
Thanks, I’m glad you liked it!
Wow, … this was powerful, … and moving!
Thank you!
That’s outstandingly impactful. I just shared it with a bunch of non-EA friends. Great work!
Oh that’s awesome. Thank you!
So insightful and thought-provoking!
Thanks!
Overall I really like the way you delivered the message in this story. I noticed a couple opportunities to make the story a bit more rational. I would expect the main character would spend some time gathering information to find out what caused the nuclear catastrophe so they can prevent it before they hit the reset. And as a smaller suggestion, I’d expect that they would also do some planning of who they’d need to coordinate with or how they’d get funds before they hit the reset. Not sure if there’s a word limit but if you could find a way to work some of those things in I think it would be beneficial.
I actually read the protagonist as ‘probably suffering from radiation poisoning, might be about to literally die from the next bomb or the building collapsing’ as of the moment before they hit the reset, so I would see such planning as irrational rather than sensible—a little information might help, but not if it risks your life (which is what you’re thinking about if you’re selfish) or the fate of the world (which is what you’re thinking about if you’re selfless).
Interesting thoughts, thanks for your input! I’ll think about how to incorporate the feedback.