If we’re ignoring getting the numbers right and instead focusing on the emotional impact, we have no claim to the term “effective”. This sort of reasoning is why epistemics around do-gooding are so bad in the first place.
I hate to admit it, but I think there is a genuine utilitarian trade-off between marketability and accuracy. I’m thrilled that the EA movement prides itself on being as factually accurate as possible, and I believe the core EA movement absolutely needs to stick with that. Still, there is a case to be made that an exaggerated truth can be an important teaching tool for helping non-EAs understand why EAs do what they do.
It seems likely that Peter Singer’s example has had a net-positive impact, despite the inaccuracies. I was originally drawn to EA by this example, along with a few of his others. I’ve since been donating at least 10% of my income and staying active in EA projects. I’m sure I’m not the only one.
We just have to be careful that the integrity of the EA movement isn’t compromised by inaccurate examples like this. But I think anyone who digs far enough into EA to learn that this example is inaccurate, or who even cares to check, will most likely already have converted to an EA mindset, which is Mr. Singer’s end goal.