However, atheists don’t believe in any divine laws, such as the sin of killing, and are thus not bound by any rules.
I think your gripe is with consequentialism, not atheism per se. And don’t forget that there are plenty of theists who do horrible things, often in the name of their religion.
I think that the Future of Humanity Institute should add negative utilitarian atheism to their list of existential risks.
It just means that many EAs use EA as a means to promote atheism/atheists.
It is evident that the majority of EAs are atheist/irreligious, but I am not aware of any EA organizations actively promoting atheism or opposing theism. Who uses EA as a “means to promote atheism”?
Coincidentally, the closest example I can recall is Phil Torres’s work on religious eschatological fanaticism as a possible agential x-risk.
Roman Yampolskiy’s shortlist of potential agents who could bring about an end to the world (https://arxiv.org/ftp/arxiv/papers/1605/1605.02817.pdf) also includes Military, Government, Corporations, Villains, Black Hats, Doomsday Cults, Depressed, Psychopaths, Criminals, AI Risk Deniers, and AI Safety Researchers.
The X-Risks Institute, which is run by /u/philosophytorres, specializes in agential risks, and mentions NU as one such risk. I don’t know whether FHI has ever worked on agential risks.