I think, given the information you're asking about, there are strong consequentialist and deontological reasons not to murder anyone.
First, from a consequentialist perspective, murdering someone puts you and the Effective Altruism community at serious legal and reputational risk. This would be irrational and irresponsible given that you have many other, more effective ways to reduce the suffering of factory-farmed animals. We also face uncertainty about the moral status of non-human animals, so we need to be careful when trading off between the lives of humans and those of other species.
Second, from a deontological (and perhaps virtue-ethical) standpoint, murder is simply inexcusable. Since we face uncertainty about which moral framework is correct, I believe we ought not to do anything that would be clearly immoral under moral views other than consequentialism.
See more detailed information here (the page is about careers, but 80% of it applies to this case as well): https://80000hours.org/articles/harmful-career/
I'm glad you mustered the courage to post this! I think it's a great post.
I agree that, in practice, people advocating for effective altruism can implicitly argue for the set of popular EA causes (and they seem to do this quite often), which could repel people with useful insight. Additionally, people in the EA community can be dismissive of newcomers' cause prioritization (or of their arguments for causes that are less popular in EA). Again, this could repel people from EA.
I have a couple of hypotheses for these observations. (I don't think either is a sufficient explanation, but they're both plausibly contributing factors.)
First, people might feel compelled to make EA less "abstract" by trying to provide concrete examples of how people in the EA community are "trying to do the most good they can," possibly giving the impression that the causes, instead of the principles, are most characteristic of EA.
Second, people may subconsciously be more dismissive of new cause proposals because they've invested time and money into causes that are currently popular in the EA community. It's psychologically easier to reject a new cause-prioritization proposal than to accept it and thereby feel that your resources have not been used with optimal effectiveness.