I’m open to the possibility that what’s all-things-considered best might take into account other kinds of values beyond traditionally welfarist ones (e.g. Nietzschean perfectionism). But standard sorts of agent-relative reasons of the kind Wolf adverts to (reasons to want your life in particular to be more well-rounded) strike me as valid excuses rather than valid justifications. It isn’t really a better decision to do the more selfish thing, IMO.
Your second paragraph is hard to answer because different people have different moral beliefs, and (as I suggest in the OP) laxer moral beliefs often stem from motivated reasoning. So the two may be intertwined. But obviously my hope is that greater clarity of moral knowledge may help us to do more good even with limited moral motivation.