David—I considered myself an atheist for several decades (partly in alignment with my work in evolutionary psychology), and would identify now as an agnostic (insofar as the Simulation Hypothesis has some slight chance of being true, and insofar as ‘Simulation-Coders’ aren’t functionally any different from ‘Gods’, from our point of view).
And I’m not opposed to various kinds of reproductive tech, regenerative medicine research, polygenic screening, etc.
However, IMHO, too many atheists in the EA/Rationalist/AI Safety subculture have been too hostile or dismissive of religion to be effective in sharing the AI risk message with religious people (as I alluded to in this post).
And I think way too much overlap has developed between transhumanism and the e/acc cult, which dismisses AI risk entirely and/or embraces human extinction and replacement by machine intelligences. Insofar as ‘transhumanism’ has morphed into contempt for humanity-as-it-is, and into a yearning for hypothetical-posthumanity-as-it-could-be, I think it’s very dangerous.
Modest, gradual, genetic selection or modification of humans to make them a little healthier or smarter, generation by generation? That’s fine with me.
Radical replacement of humanity by ASIs in order to colonize the galaxy and the lightcone faster? Not fine with me.