Eric Sampson published a paper on this in Oxford Studies in Philosophy of Religion. See here.
Abstract: Longtermist Effective Altruists (EAs) aim to mitigate the risk of existential catastrophes. In this paper, I have three goals. First, I identify a catastrophic risk that has been completely ignored by EAs. I call it religious catastrophe: the threat that (as Christians and Muslims have warned for centuries) billions of people stand in danger of going to hell for all eternity. Second, I argue that, even by secular EA lights, religious catastrophe is at least as bad and at least as probable, and therefore at least as important as many of the standard EA catastrophic risks (e.g., catastrophic climate change, nuclear winter). Third, I present the following dilemma for secular EAs: either adopt religious catastrophe as an EA cause or ignore religious catastrophe but also ignore catastrophic risks whose mitigation has a similar, or lower, expected value (i.e., most, or all, of them). Business as usual—ignoring religious catastrophe while championing the usual EA causes—is not an option consistent with longtermist EA principles.
Not a popular topic among secular EAs, in my experience.