I’ve been trying to work out how to sell EA in the form of a parable; let me illustrate with my current best candidate.
In a post-apocalyptic world, you’re helping distribute the medicine that cures the disease. You know that a truck carrying the medicine is on its way and will soon reach a T-junction. The driver doesn’t know who is where, and the radio is broken; you’re powerless to affect where the truck goes, watching through binoculars from far away. If it turns left, it’ll be flagged down by a family of four, and their lives will be saved. If it turns right, it’ll be flagged down by a school where dozens of families with the disease have taken refuge.
Don’t you find yourself fervently hoping the truck turns right? It’s not that the family’s lives aren’t worth saving; they are, and they all deserve to live. But it’s clear that the better outcome is for it to turn right.
So here are some things I like about this. It’s not totally unfair: it isn’t just a choice between “save A” and “save A and B”; even if the most effective choice is made, some people die whom you could have chosen to save. And, oddly, I think the reframing where you can’t choose who gets saved, only will the truck to make the right decision, might help people see more clearly: you’re not caught up in guilt about not saving the family or anger at someone making the wrong moral choice; you’re just watching a coin flip and discovering how you want it to land.
What I’d like to improve is to make it feel more like an everyday situation and less like a super contrived one.
Any improvements? Does this seem like a useful exercise?
The exercise seems useful. I agree that making it not a choice between A and A+B is fairer. Also, making the protagonist a witness who can’t actually make any decision might help with switching off the guilt that comes with a taboo tradeoff.
I agree that the problem is that the current example is too contrived, though I haven’t yet thought of a more ordinary one. Scott Siskind’s Arctic exploration analogy is the closest I know of.
I wonder if you can do something with a different kind of disaster? Maybe make it a coach that can get people out of the danger zone? Or is that cheating because people don’t want seats to be “wasted”?
Thanks for the encouragement!
Here’s a continuation of this kind of discussion: The EA Pitch guide