It depends on the case, but there are definitely cases where I would.
Also, while you make a good point that these can sometimes converge, I think the priority of concerns is extremely different under short-termism vs. longtermism, which I see as the important part of “most important determinant of what we ought to do.” (Setting aside mugging and risk aversion/robustness,) some very small or even merely directional shift could make something hold the vast majority of your moral weight, whereas before the impact might not have been that big, or would have been outweighed by a lack of neglectedness or tractability.
P.S. If one (including myself) failed to do x, given that it would shift priorities but wouldn’t affect what one would do in light of the short-term damage, I think that would say less about one’s actual beliefs and more about an intuitive disgust toward means-end reasoning. But this is just a hunch, based partly on my own introspection. (To be fair, sometimes this comes from moral uncertainty or reputational concerns that should factor into this reasoning, which is to your point.)
Then it definitely fits with your vote. I just meant that the fact that you (and I) tend to think that making the near future better will also make the far future better shouldn’t influence the answer to this question.
We just disagree on how confident we are in our assessments of how our actions will affect the far future. And probably this is because of our age ;-)