Recently I was asked for tips on how to be less captured by motivated reasoning and related biases, a goal/quest I’ve slowly made progress on for the last 6+ years. I don’t think I’m very good at this, but I do think I’m likely above average, and it’s also something I aspire to be better at. So here is a non-exhaustive and somewhat overlapping list of things that I think are helpful:
Treat this as a serious problem that needs active effort.
Personally, I repeat the mantra “To see the world as it is, not as I wish it to be” on a near-daily basis. You may have other mantras or methods to invoke this mindset that work better for you.
In general, try to be more of a “scout” and less of a soldier/zealot.
As much as possible, try to treat ideas/words/stats as tools to aid your perception, not weapons to advance your cause or “win” arguments.
I’ve heard good things about Julia Galef’s book The Scout Mindset, but have not read it.
Relatedly, run a mental check and try to notice when you are being emotional and in danger of succumbing to motivated reasoning. Try to think as coherently as you can, and acknowledge it (preferably publicly) when you notice you’re not.
Sometimes you don’t notice this in real time, but only long afterwards.
If this happens, try to postmortem your mistakes and learn from them.
In my case, I noticed myself having a bunch of passion that was neither truth- nor utility-tracking during the whole Scott Alexander/NYT debacle.
I hope to not repeat this mistake.
Be precise with your speech, and try not to hide behind imprecise words that can mean anything.
It’s especially bad to use words/phrases that connote a lot while having empty denotation.
Relatedly, make explicit public predictions or monetary bets whenever possible.
Just the fact that you’re betting either your $s or your reputation on a prediction often reminds you to be concrete and aware of your own BS.
But it’s also helpful to learn from others and form your own opinions on how to make relatively accurate and well-calibrated forecasts (a minimal sketch of what tracking this might look like is at the end of this post).
Surround yourself with people you respect for caring a lot about the truth and actively seeking it.
For humans (well, at least me), clear thinking is often at least a bit of a social practice.
Just being around them nudges you to try to emulate their behavior.
Even more helpful if they’re at least a little bit confrontational and willing to call you out on your BS.
If surrounding yourself with truth-seekers is not practical for your offline interactions (e.g., you live in a small town, or you work in a government bureaucracy), make “people who are extremely invested in truth-seeking” a top priority in the content you consume (blog posts, social media, books, video calls, and so forth).
Conversely, minimize the number of conversations (online and otherwise) where you feel like your interlocutor can’t change your mind.
Maybe this is just a rephrasing of the soldier/scout mindset distinction?
But I find it a helpful heuristic anyway.
In general, I think conversations where I’m trying to “persuade” the other person while not being open to being persuaded myself are a) somewhat impolite/disrespectful and b) corrosive (not just costly in time) to my intellectual progress.
This doesn’t apply to everything important (e.g., intro to EA talks).
But it’s nonetheless a good heuristic to have. If your job or hobbies push you into “expert talking to non-expert” roles too much, you need to actively fight the temptation to get high on your own epistemic supply.
My personal view is that this sense of righteous certainty is more corrosive in one-to-one conversations than in one-to-many conversations, and worse online than offline, but this is just a personal intuition and I haven’t studied it rigorously.
Be excited to learn facts about the world, and about yourself.
I mostly covered psychological tips here, but learning true facts about the world a) directly limits your ability to lie about the facts and b) somewhat constrains your thinking to reasoning that can explain actual facts about the world.
Critical thinking is not just armchair philosophy, but involves empirical details and hard-fought experience.
See Zeynep on Twitter
A robust understanding of facts can at least somewhat make up for epistemic shortcomings, and not necessarily vice versa.
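As a small aside on the prediction/betting point above: below is a minimal sketch, in Python with made-up prediction data, of what explicitly tracking forecasts and checking their accuracy and calibration could look like. The Brier score and the probability bins are just one common convention rather than anything prescribed in the list above, and all names and numbers in the snippet are hypothetical.

```python
# Minimal sketch: log explicit predictions, then check accuracy (Brier score)
# and calibration (stated confidence vs. observed hit rate). Example data is made up.
from statistics import mean

# Each entry: (short description, stated probability, whether it came true)
predictions = [
    ("Paper X replicates", 0.7, True),
    ("Candidate Y wins the election", 0.4, False),
    ("Project Z ships by Q3", 0.9, False),
]

def brier_score(preds):
    """Mean squared error between stated probabilities and 0/1 outcomes (lower is better)."""
    return mean((p - float(outcome)) ** 2 for _, p, outcome in preds)

def calibration_table(preds, edges=(0.0, 0.2, 0.4, 0.6, 0.8, 1.0)):
    """For each probability bin, compare average stated confidence to the observed hit rate."""
    rows = []
    for lo, hi in zip(edges, edges[1:]):
        in_bin = [(p, o) for _, p, o in preds if lo <= p < hi or (hi == 1.0 and p == 1.0)]
        if in_bin:
            rows.append((
                f"{lo:.1f}-{hi:.1f}",
                len(in_bin),
                mean(p for p, _ in in_bin),          # average stated confidence
                mean(float(o) for _, o in in_bin),   # fraction that actually happened
            ))
    return rows

if __name__ == "__main__":
    print(f"Brier score: {brier_score(predictions):.3f}")
    for bin_label, n, confidence, hit_rate in calibration_table(predictions):
        print(f"{bin_label}: n={n}, avg confidence {confidence:.2f}, hit rate {hit_rate:.2f}")
```

On a real log of a few dozen predictions, this gives a quick (if noisy) read on whether your stated confidence levels roughly track your actual hit rates.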
I’ve previously shared this post on CEA’s social media and (I think) in an edition of the Forum Digest. I think it’s really good, and I’d love to see it be a top-level post so that more people end up seeing it, it can be tagged, etc.
Would you be interested in creating a full post for it? (I don’t think you’d have to make any changes — this still deserves to be read widely as-is.)