I would disagree with two steps in your reasoning. The first is the relative importance of different animals, but Cameron_Meyer_Shorb's comment already covers this point, and your conclusion would probably not change even if you valued animals more highly, making the combined effect of an American diet equal to one, or maybe up to ten, equivalent years of human life per year ($430 of enjoyment).
Instead, I think your argument breaks down when accounting for moral uncertainty: if you are not 100% certain in consequentialist ethics, then almost any other moral system would hold you far more accountable for pain you cause than for pain you fail to prevent. This raises the required estimate for the dollar value of the enjoyment gained, so the trade-off may not hold even if your current figures are met. That makes this a different case from other altruistic trade-offs you might make, in that you are not trading away a neutral action.
Another argument against this position is its effect on your moral attitudes, as Jeff Sebo argued in his talk at EA Global in 2019. You could dismiss this if you are certain it will not affect the relative value you place on other beings, and by not advertising your position so as not to affect others.
(FYI, this is the argument I was referring to as the “epistemic” argument in my other comment. Thanks for linking to that talk, George!)