When an EA cares for their family, taking time away from extinction risk work, they're valuing their family as much as 10^N people.
No. I’ve said this before elsewhere, and it’s not directly relevant to most of this discussion, but I think it’s worth reinforcing: EA is not utilitarianism, and a commitment to EA does not imply any obligatory trade-off between your or your family’s welfare and your EA commitment. If, as is the generally accepted standard, a “normal” EA commitment is 10% of your income and/or resources, it seems wrong to suggest that such an EA shouldn’t spend the other 90% of their time and effort on personal things like their family.
(Note that in addition to being a digression, this is a deontological rather than decision-theoretic point.)