I’m not sure I can totally spell it out—a lot of this piece is about the raw intuition that “something is weird here.”
One Bayesian-ish interpretation is given in the post: “The odds that we could live in such a significant time seem infinitesimal; the odds that Holden is having delusions of grandeur (on behalf of all of Earth, but still) seem far higher.” In other words, there is something “suspicious” about a view that implies that we are in an unusually important position—it’s the kind of view that seems (by default) more likely to be generated by wishful thinking, ego, etc. than by dispassionate consideration of the facts.
There’s also an intuition along the lines of “If we’re really in such a special position, I’d think it would be remarked upon more; I’m suspicious of claims that something really important is going on that isn’t generally getting much attention.”
I ultimately think we should bite these bullets (that we actually are in the kind of special position that wishful thinking might falsely conclude we’re in, and that there actually is something very important going on that isn’t getting commensurate attention). I think some people imagine they can avoid biting these bullets by e.g. asserting long timelines to transformative AI; this piece aims to argue that that doesn’t work.
I agree that intuition is certainly an important piece of the puzzle.
A lot of this makes me think not only of Nietzsche but of Jean Baudrillard, as well as José Ortega y Gasset when he speaks of the fearlessness scientists and philosophers will need in the coming times, and of his idea of the “masses”.
We must be just as careful of works of media such as Anne Frank’s Frankenstein as we are of Brave New World.