I second weakening the definition. As someone who cares deeply about future generations, I think it is infeasible to value them equally to people today in terms of actual actions. I sketched out an optimal mitigation path for asteroid/comet impact. Valuing only the present generation in one country, we should do alternate foods. Valuing the present world, we should add asteroid detection/deflection. Once we value hundreds of future generations, we should add food storage and comet detection/deflection, costing many trillions of dollars. And if we value generations even further into the future, we should take even more extreme measures, like building in many redundancies. This is all for a very small risk compared to things like nuclear winter and AGI. Furthermore, even if one does discount future generations, if you think we could have many computer consciousnesses in only a century or so, we should again be devoting huge amounts of resources to reducing even small risks. I suppose one way of valuing future generations equally to the present generation is to value each generation an infinitesimal amount, but that doesn’t seem right.
Is the argument here something along the lines of: I find that I don’t want to struggle to do what these values would demand, so they must not be my values?
I hope I’m not seeing an aversion to surprising conclusions in moral reasoning. Science surprises us often, but it keeps getting closer to the truth. Technology surprises us all the time, but it keeps getting more effective. If you won’t accept any sort of surprise in the domain of applied morality, your praxis is not going to end up being very good.
Thanks for your comment. I think my concern is basically addressed by Will’s comment below: it is good to value everyone equally, but our daily actions are not required to value a random person alive today, or a random person in the future, as much as ourselves. That is, it is permissible to have some special relationships and some personal prerogatives.