Also, with the possible exception of the earliest Christians, who were hoping for an imminent second coming, I'm quite sure most Christians have not predicted an imminent apocalypse, so we're talking about specific sects and pastors (admittedly, some quite influential). You do say you're talking about early Christians at the start of the article, but I think the conclusion of your post makes it sound like religious people are constantly making apocalyptic predictions that fail to come true.
Anthropic shadow certainly creates some degree of uncertainty; however, it seems to apply less in this case than it might in, say, the case of nuclear war. (I'm actually about to submit another post about anthropic shadow in that case.) It seems like AI development wasn't slowed down by a few discrete events but rather by overarching complexities in its development. My understanding is that anthropic shadow is mostly applicable in cases where there have been close calls, and less applicable in non-linear cases. However, I might be mistaken.
The conclusion doesn't read this way to me: the statement "apocalyptic claims made by Christians" doesn't imply that all Christians make apocalyptic claims. However, since it does seem to have created unnecessary confusion, I will add the word "some".
I think it's an open question as to how much we can learn from failed apocalyptic predictions: https://forum.effectivealtruism.org/posts/2MjuJumEaG27u9kFd/don-t-be-comforted-by-failed-apocalypses
Looking forward to your next post!