(I think it’s likely that I misunderstood at least some of the other arguments in this thread).
I think good arguments with uncomfortable/repugnant conclusions should be a) upvoted to the extent that they are good arguments and b) agreed with to the extent that we believe their conclusions are true, and disagreed with to the extent that we believe they're false.
(and we may believe the bottom-line conclusions to be false for reasons that are outside the scope of the presented arguments).
I think we should be very willing to accept uncomfortable/repugnant conclusions to the extent that we believe they're true. Our movement is effective altruism, not effective feel-good-about-ourselvesism. Since we probably live in the midst of multiple unknown moral catastrophes, one of the most important things we can do (other than averting imminent existential risk) is to carefully figure out which of the moral catastrophes we currently live in are avertable. This search probably means evaluating the evidence we have, seeking out new evidence, and looking at the world with deliberation, care, and good humor. In comparison, I expect moral disgust to be substantially less truth-tracking, and on the margin even net negative.
Losing access to our ability to think clearly is just really costly[1]. I'm not saying we should refuse to give this up at any price. But we should at least set that price very, very high, and not be willing to sacrifice clear thinking quite so easily.
[1] (“At first they came for our epistemology. And then they...well, we don’t know what happened next”)