Just person to person, I don’t think there’s any substitute for staying awake and alert around your beliefs. I don’t mean be tense or reflexively skeptical—I mean accept that there is always uncertainty, so you have to trust that, if you are being honest with yourself and doing your best, you will notice when discomfort with your professed beliefs arises. You can set up external standards and fact-checking, but you can’t expect some external system to do for you the job of knowing whether you really think this stuff is true. People who don’t trust themselves on the latter over-rely on the former.
+1 to this.
I partly agree with Nathan’s post, for a few reasons:
- If Alice believes X because she trusts that Bob looked into it, then it’s useful for Alice to note her reason. Otherwise, you can get bad situations like ‘Bob did not in fact look into X, but he observes Alice’s confidence and concludes that she must have looked into it, so he takes X for granted too and Alice never realizes why’. This isn’t a big problem in two-person groups, but it can lead to a lot of double-counted evidence in thousand-person groups. (See the toy calculation after this list.)
- It’s important to distinguish ‘this feels compelling’ from ‘this is Bayesian evidence about the physical world’. If an argument seems convincing, but would seem equally convincing if it were false, then you shouldn’t actually treat the convincingness as evidence. (See the odds-form sketch after this list.)
- Getting the right answer here is important enough, and blind spots and black-swan errors are common enough, that it can make a lot of sense to check your work even in cases where you’d be super surprised to learn you’d been wrong. Getting outside feedback can be a good way to do this.
- I’ve noticed that when I worry “what if everything I believe is wrong?”, sometimes it’s a real worry that I’m biased in a specific way, or that I might just be missing something. Other times, it’s more like an urge to be dutifully/performatively skeptical or to get a certain kind of emotional reassurance; see https://equilibriabook.com/toc/ for a good discussion of this.
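To put toy numbers on the first reason (a hypothetical sketch; the likelihood ratios and the posterior_odds helper are made up for illustration, not anything from this thread):

```python
# Toy sketch of double-counted evidence: one observation gets
# re-reported by many people and treated as independent each time.

def posterior_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by each likelihood ratio (odds-form Bayes)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Suppose Bob's actual evidence for X is worth a 4:1 likelihood ratio.
correct = posterior_odds(1.0, [4.0])

# If ten people each repeat X confidently, and everyone treats each
# repetition as independent evidence, the group updates as if it had
# seen the same evidence ten separate times.
double_counted = posterior_odds(1.0, [4.0] * 10)

print(correct)         # 4.0 -- the right weight for one observation
print(double_counted)  # 1048576.0 -- i.e. 4**10, wildly overconfident
```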
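And the second reason, written out in odds-form Bayes (my notation, not anything from the original comments; C stands for ‘the argument feels convincing’):

$$
\frac{P(X \mid C)}{P(\neg X \mid C)} \;=\; \frac{P(X)}{P(\neg X)} \times \frac{P(C \mid X)}{P(C \mid \neg X)}
$$

If the argument would feel just as convincing whether or not X is true, then P(C | X) = P(C | ¬X), the likelihood ratio is 1, and your odds on X shouldn’t move at all.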
Re:

> Arguably this forum kind of does this job, though A) we are all tremendously biased B) are people *really* checking the minutiae? I am not.
Some people check some minutiae. The end of https://sideways-view.com/2018/07/08/the-elephant-in-the-brain/ is a cool example that comes to mind.
I haven’t had any recent massive updates about EA sources’ credibility after seeing a randomized spot check. That’s one way of trying to guess at the expected utility of more marginal spot-checking, vs. putting the same resources into something else.
My main suggestion, though, would be to check out various examples of arguments between EAs, criticisms of EAs by other EAs, etc., and use that to start building a mental model of EA’s epistemic hygiene and likely biases or strengths. “Everyone on the EA Forum must be tremendously biased because otherwise they surely wouldn’t visit the forum” is a weak starting point by comparison; you can’t figure out which groups in the real world are biased (or how much, or in what ways) from your armchair.
I think I know very well where Nathan is coming from, and I don’t think it’s invalid, for the reasons you state among others. But after much wrangling with the same issues, my comment is the only summary statement I’ve ever really been able to make on the matter. He’s just left religion and I feel him on not knowing what to trust—I don’t think there’s any other place he could be right now.
I suppose what I really wanted to say is that you can never surrender those doubts to anyone else or some external system. You just have to accept that you will make mistakes, stay alert to new information, and stay in touch with what changes in you over time.
Yeah, strong upvote to this too.