Do you actually oppose transhumanism or atheism? That would slightly surprise me for an evo psych prof, but maybe I am totally wrong. Unlike you I am not, to put it mildly, a fan of National Conservatism (though I'm happy to see anyone tell them to care about AI takeover and mass unemployment from AI*), but it seems a bit disrespectful and manipulative towards them to talk as if you share their fear of atheism and genetic engineering if you don't.
*I actually think the point about how we can't just rely on big tech's benevolence to keep paying basic income should play even better on the left than on the right, or at least it would if a lot of the US left weren't addicted to "lol, imagine thinking these moronic tech bros could build anything other than a dumb plagiarism machine. So embarrassing!" as their big take on AI.
David—I considered myself an atheist for several decades (partly in alignment with my work in evolutionary psychology), and would identify now as an agnostic (insofar as the Simulation Hypothesis has some slight chance of being true, and insofar as ‘Simulation-Coders’ aren’t functionally any different from ‘Gods’, from our point of view).
And I’m not opposed to various kinds of reproductive tech, regenerative medicine research, polygenic screening, etc.
However, IMHO, too many atheists in the EA/Rationalist/AI Safety subculture have been too hostile or dismissive of religion to be effective in sharing the AI risk message with religious people (as I alluded to in this post).
And I think way too much overlap has developed between transhumanism and the e/acc cult that dismisses AI risk entirely, and/or that embraces human extinction and replacement by machine intelligences. Insofar as 'transhumanism' has morphed into contempt for humanity-as-it-is, and into a yearning for hypothetical-posthumanity-as-it-could-be, I think it's very dangerous.
Modest, gradual, genetic selection or modification of humans to make them a little healthier or smarter, generation by generation? That’s fine with me.
Radical replacement of humanity by ASIs in order to colonize the galaxy and the lightcone faster? Not fine with me.