Thanks, Pablo. I appreciate the effort to be constructive. However, I have a hard time parsing what you are suggesting. All moral actions depend on humans’ decisions to some extent, so it looks like everything would be on an equal footing. One could argue we should discount consequences more heavily the more they depend on humans’ decisions, but I do not understand what this means. In my mind, one should simply weight consequences according to their probabilities.