For the record, I think I had mediocre judgement in the past and did not reliably believe true things, and I sometimes made really foolish decisions. My experience is mostly that I felt extremely alienated from society, which meant that I looked more critically at many common beliefs than most people do. This made me weird in lots of ways, many of which were bad and some of which were good. In some cases it meant that I believed weird things that now feel like easy wins, eg thinking that people were absurdly callous about causing animal suffering.
My judgement improved a lot from spending a lot of time in places with people with good judgement who I could learn from, eg Stanford EA, Triplebyte, the more general EA and rationalist community, and now MIRI.
I feel pretty unqualified to give advice on critical thinking, but here are some possible ideas, which probably aren’t actually good:
Try to learn simple models of the world and practice applying them to claims you hear, and then being confused when they don’t match. Eg learn introductory microeconomics and then whenever you hear a claim about the world that intro micro has an opinion on, try to figure out what the simple intro micro model would claim, and then inasmuch as the world doesn’t seem to look like intro micro would predict, think “hmm this is confusing” and then try to figure out what about the world might have caused this. When I developed this habit, I started noticing that lots of claims people make about the world are extremely implausible, and when I looked into the facts more I found that intro micro seemed to back me up. To learn intro economics, I enjoyed the Cowen and Tabarrok textbook.
I think Katja Grace is a master of the “make simple models and then get confused when the world doesn’t match them” technique. See her novel opinions page for many examples.
Another subject where I’ve been doing this recently is evolutionary biology—I’ve learned to feel confused whenever anyone makes any claims about group selection, and I plan to learn how group selection works, so that when people make claims about it I can assess them accurately.
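To make the "simple model" habit concrete, here is a toy sketch of the kind of check intro micro lets you run, using a linear supply-and-demand model to test a claim like "a binding price floor creates a surplus". All the numbers and functional forms here are made up purely for illustration; the point is only that a few lines of arithmetic give you a definite prediction to compare against the claim.

```python
# Toy linear supply/demand model for sanity-checking claims with intro micro.
# Demand: Qd = a - b*p; Supply: Qs = c + d*p. Parameters are hypothetical.

def equilibrium(a, b, c, d):
    """Return (price, quantity) where demand equals supply."""
    p = (a - c) / (b + d)
    return p, a - b * p

def quantities_at(price, a, b, c, d):
    """Quantity demanded and supplied at a given price."""
    return a - b * price, c + d * price

a, b, c, d = 100, 2, 10, 1           # made-up parameters
p_star, q_star = equilibrium(a, b, c, d)   # market clears at p=30, q=40

floor = 40                            # a price floor above equilibrium
qd, qs = quantities_at(floor, a, b, c, d)  # demand falls, supply rises
surplus = qs - qd                     # the model predicts an unsold surplus
```

If the real world then doesn't show the surplus the model predicts, that's exactly the "hmm, this is confusing" moment worth investigating.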
Try to find the simplest questions whose answers you don’t know, in order to practice noticing when you believe things for bad reasons.
For example, some of my favorite physics questions:
Why isn’t the Sun blurry?
What is the fundamental physical difference between blue and green objects? Like, what equations do I solve to find out that an object is blue?
If energy is conserved, why do we so often make predictions about the world by assuming that energy is minimized?
I think reading Thinking Physics might be helpful for practicing noticing your own ignorance, but I’m not sure.
Try to learn a lot about specific subjects sometimes, so that you learn what it’s like to have detailed domain knowledge.