FWIW, Brian Tomasik does a fuzzies/utilons split thing too. One justification is that it helps avoid cognitive dissonance between near-term causes and, in his mind, more effective longtermist causes.
My position, in contrast, is to acknowledge the epistemic force of far-future arguments while maintaining some commitment to short-term helping as an intrinsic spiritual impulse. Along the lines of Occam’s imaginary razor, this lets me avoid distorting my beliefs about the far-future question under the emotional pull to stop torture-level suffering in the present. In the face of emotion-based cognitive dissonance, it’s often better to change your values than to change your beliefs.
It might be overly confusing to call it “changing [my ideal] values”. It’s more that I have preferences of both kinds: some I would ideally like to keep (minimizing suffering in expectation), and some that, as a human, I simply have for better or worse (drives to reduce suffering in front of me, sticking to certain principles...).
If paying the price of a split in donations/personal focus makes me more effective at the far-future work I consider more important for utilons, enough that those expected utilons go up overall, then that seems worth it.