Outline of Galef’s “Scout Mindset”
Julia Galef’s The Scout Mindset is superb.
For effective altruists, I think (based on the topic and execution) it’s straightforwardly the #1 book you should use when you want to recruit new people to EA. It doesn’t actually talk much about EA, but I think starting people on this book will result in an EA that’s thriving more and doing more good five years from now, compared to the future EA that would exist if the top go-to resource were more obvious choices like The Precipice, Doing Good Better, the EA Handbook, etc.
For typical rationalists, I think the best intro resource is still HPMoR or R:AZ, but I think Scout Mindset is a great supplement to those, and probably a better starting point for people who prefer Julia’s writing style over Eliezer’s.
I’ve made an outline of the book below, for my own reference and for others who have read it. If you don’t mind spoilers, you can also use this to help decide whether the book’s worth reading for you, though my summary skips a lot and doesn’t do justice to Julia’s arguments.
Introduction
Scout mindset is “the motivation to see things as they are, not as you wish they were”.
We aren’t perfect scouts, but we can improve. “My approach has three prongs”:
Realize that truth isn’t in conflict with your other goals. People tend to overestimate how useful self-deception is for things like personal happiness and motivation, starting a company, being an activist, etc.
Learn tools that make it easier to see clearly. Use various kinds of thought experiments and probabilistic reasoning, and rethink how you go about listening to the “other side” of an issue.
Appreciate the emotional rewards of scout mindset. “It’s empowering to be able to resist the temptation to self-deceive, and to know that you can face reality even when it’s unpleasant. There’s an equanimity that results from understanding risk and coming to terms with the odds you’re facing. And there’s a refreshing lightness in the feeling of being free to explore ideas and follow the evidence wherever it leads”. Looking at lots of real-world examples of people who have exemplified scout mindset can make these positives more salient.
PART I: The Case for Scout Mindset
Chapter 1. Two Types of Thinking
“Can I believe it?” vs. “must I believe it?” In directionally motivated reasoning, often shortened to “motivated reasoning”, we disproportionately put our effort into finding evidence/reasons that support what we wish were true.
Reasoning as defensive combat. Motivated reasoning, a.k.a. soldier mindset, “doesn’t feel like motivated reasoning from the inside”. But it’s extremely common, as shown by how often we describe our reasoning in militaristic terms.
“Is it true?” An alternative to (directionally) motivated reasoning is accuracy motivated reasoning, i.e., scout mindset.
Your mindset can make or break your judgment. This stuff matters in real life, in almost every domain. Nobody is purely a scout or purely a soldier, but it’s possible to become more scout-like.
Chapter 2. What the Soldier is Protecting
“[I]f scout mindset is so great, why isn’t everyone already using it all the time?” Three emotional reasons:
Comfort: avoiding unpleasant emotions. This even includes comforting pessimism: “there’s no hope, so you might as well not worry about it.”
Self-esteem: feeling good about ourselves. Again, this can include ego-protecting negativity and avoiding “‘getting my hopes up’”.
Morale: motivating ourselves to do hard things.
And three social reasons:
Persuasion: convincing ourselves so we can convince others.
Image: choosing beliefs that make us look good. “Psychologists call it impression management, and evolutionary psychologists call it signaling: When considering a claim, we implicitly ask ourselves, ‘What kind of person would believe a claim like this, and is that how I want others to see me?’”
Belonging: fitting in to your social groups.
“We use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us”. So it’s no surprise that interventions like ‘training people in critical thinking’ do little to change how people actually think. But while “soldier mindset is often our default strategy for getting what we want”, it’s not generally the best strategy available.
Chapter 3. Why Truth is More Valuable Than We Realize
We make unconscious trade-offs. “[T]he whole point of self-deception is that it’s occurring beneath our conscious awareness. [...] So it’s left up to our unconscious minds to choose, on a case-by-case basis, which goals to prioritize.” Sometimes it chooses to be more soldier-like, sometimes more scout-like.
Are we rationally irrational? I.e., are we good at “unconsciously choosing just enough epistemic irrationality to achieve our [instrumental] social and emotional goals, without impairing our judgment too much”? No. “There are several major biases in our decision-making [… that] cause us to overvalue soldier mindset”:
We overvalue the immediate rewards of soldier mindset. Present bias: we prefer small rewards now over large rewards later.
We underestimate the value of building scout habits. Cognitive skills are abstract (and, again, have most of their benefits in the future), so they’re harder to notice and care about.
We underestimate the ripple effects of self-deception. These ripple effects are “delayed and unpredictable”, which is “exactly the kind of cost we tend to neglect”.
We overestimate social costs.
An accurate map is more useful now. Humans have more options now than we did tens of thousands of years ago, and more ability to improve our circumstances. “So if our instincts undervalue truth, that’s not surprising—our instincts evolved in a different world, one better suited to the soldier.”
PART II: Developing Self-Awareness
Chapter 4. Signs of a Scout
“A key factor preventing us from being in scout mindset more frequently is our conviction that we’re already in it.” Examples of “things that make us feel like scouts even when we’re not”:
Feeling objective doesn’t make you a scout.
Being smart and knowledgeable doesn’t make you a scout. On ideologically charged questions, learning more tends to make people more polarized; and even scientists studying cognitive biases have a track record of exhibiting soldier mindset.
Actually practicing scout mindset makes you a scout. “The test of scout mindset isn’t whether you see yourself as the kind of person who [changes your mind in response to evidence, is fair-minded, etc. …] It’s whether you can point to concrete cases in which you did, in fact, do those things. [...] The only real sign of a scout is whether you act like one.” Behavioral cues to look for:
Do you tell other people when you realize they were right?
How do you react to personal criticism? “Are there examples of criticism you’ve acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you?”
Do you ever prove yourself wrong?
Do you take precautions to avoid fooling yourself? E.g., “Do you avoid biasing the information you get?” and “[D]o you decide ahead of time what will count as a success and what will count as a failure, so you’re not tempted to move the goalposts later?”
Do you have any good critics? “Can you name people who are critical of your beliefs, profession, or life choices who you consider thoughtful, even if you believe they’re wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable[...]?”
“But the biggest sign of scout mindset may be this: Can you point to occasions in which you were in soldier mindset? [… M]otivated reasoning is our natural state,” so if you never notice yourself doing it, the likeliest explanation is that you’re not self-aware about it.
Chapter 5. Noticing Bias
“One of the essential tools in a magician’s tool kit is a form of manipulation called forcing.” The magician asks you to choose between two cards. “If you point to the card on the left, he says, ‘Okay, that one’s yours.’ If you point to the card on the right, he says, ‘Okay, we’ll remove that one.’ [...] If you could see both of those possible scenarios at once, the trick would be obvious. But because you end up in only one of those worlds, you never realize.”
“Forcing is what your brain is doing to get away with motivated reasoning while still making you feel like you’re being objective.” The Democratic voter doesn’t notice that they’re going easier on a Democratic politician than they would on a Republican, because the question “How would I act if this politician were a Republican?” isn’t salient to them, or they’re tricking themselves into thinking they’d apply the same standard.
A thought experiment is a peek into the counterfactual world. “You can’t detect motivated reasoning in yourself just by scrutinizing your reasoning and concluding that it makes sense. You have to compare your reasoning to the way you would have reasoned in a counterfactual world, a world in which your motivations were different—would you judge that politician’s actions differently if he was in the opposite party? [...] Would you consider that study’s methodology sound if its conclusion supported your side? [...] Try to actually imagine the counterfactual scenario. [… D]on’t simply formulate a verbal question for yourself. Conjure up the counterfactual world, place yourself in it, and observe your reaction.” Five types of thought experiment:
The double standard test. Am I judging one person/group by a standard I wouldn’t apply to another person/group?
The outsider test. “Imagine someone else stepped into your shoes—what do you expect they would do in your situation?” Or imagine that you’re an outsider who just magically teleported into your body.
The conformity test. “If other people no longer held this view, would you still hold it?”
The selective skeptic test. “Imagine this evidence supported the other side. How credible would you find it then?”
The status quo bias test. “Imagine your current situation was no longer the status quo. Would you then actively choose it?”
Thought experiments on their own “can’t tell you what’s true or fair or what decision you should make.” But they allow you to catch your brain “in the act of motivated reasoning,” and take that into account as you work to figure out what’s true.
Beyond the specific thought experiments, the core skill of this chapter is “a kind of self-awareness, a sense that your judgments are contingent—that what seems true or reasonable or fair or desirable can change when you mentally vary some features of the question that should have been irrelevant.”
Chapter 6. How Sure Are You?
We like feeling certain. “Your strength as a scout is in your ability [...] to think in shades of gray instead of black and white. To distinguish the feeling of ‘95% sure’ from ‘75% sure’ from ‘55% sure’.”
Quantifying your uncertainty. For scouts, probabilities are predictions of how likely they are to be right. The goal is to be calibrated in the probabilities you assign (see the sketch at the end of this chapter’s points).
A bet can reveal how sure you really are. “Evolutionary psychologist Robert Kurzban has an analogy[...] In a company, there’s a board of directors whose role is to make the crucial decisions for the company—how to spend its budget, which risks to take, when to change strategies, and so on. Then there’s a press secretary whose role it is to give statements[...] The press secretary makes claims; the board makes bets. [...] A bet is any decision in which you stand to gain or lose something of value, based on the outcome.”
The equivalent bet test. By comparing different bets and seeing when you prefer taking one vs. the other, or when they feel about the same, you can translate your feeling “does X sound like a good bet?” into probabilities.
The core skill of this chapter is “being able to tell the difference between the feeling of making a claim and the feeling of actually trying to guess what’s true.”
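Calibration can be made concrete with a short sketch. This is my illustration, not the book’s: the prediction data and the 10%-wide buckets are invented. The idea is to log a stated probability and an outcome for each claim, then check whether the claims you called “70% sure” actually came true about 70% of the time:

```python
from collections import defaultdict

# Hypothetical track record: (probability you assigned, whether it came true).
my_predictions = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.7, True), (0.7, False), (0.7, True),
    (0.55, True), (0.6, False), (0.6, True),
]

def calibration_report(predictions):
    """Bucket predictions by stated confidence and compare each bucket's
    average stated probability with its actual hit rate. A calibrated
    forecaster's '70%' claims come true about 70% of the time."""
    buckets = defaultdict(list)
    for prob, came_true in predictions:
        buckets[round(prob * 10) / 10].append((prob, came_true))  # nearest 10%
    for level in sorted(buckets):
        records = buckets[level]
        stated = sum(p for p, _ in records) / len(records)
        actual = sum(1 for _, t in records if t) / len(records)
        print(f"stated ~{stated:.0%} -> actually true {actual:.0%} (n={len(records)})")

calibration_report(my_predictions)
```

With a real track record (dozens of predictions per bucket), persistent gaps between stated and actual frequencies are the signal of over- or underconfidence. The equivalent bet test could be implemented as the reverse procedure: adjust the probability of a reference lottery until the two bets feel equally attractive, and read off your credence.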
PART III: Thriving Without Illusions
Chapter 7. Coping with Reality
Keeping despair at bay. Motivated reasoning is especially tempting in emergencies; but it’s also especially dangerous in emergencies. In dire situations, it’s essential to be able to keep despair at bay without distorting your map of reality. E.g., you can count your blessings, come to terms with your situation, or remind yourself that you’re doing the best you can.
Honest vs. self-deceptive ways of coping. Honest ways of coping with painful or difficult circumstances include:
Make a plan. “It’s striking how much the urge to conclude ‘That’s not true’ diminishes once you feel like you have a concrete plan for what you would do if the thing were true.”
Notice silver linings. “You’re recognizing a silver lining to the cloud, not trying to convince yourself the whole cloud is silver. But in many cases, that’s all you need”.
Focus on a different goal.
Things could be worse.
Does research show that self-deceived people are happier? No, the research quality is terrible.
Chapter 8. Motivation Without Self-Deception
Using self-deception to motivate yourself is bad, because:
An accurate picture of your odds helps you choose between goals.
An accurate picture of the odds helps you adapt your plan over time.
An accurate picture of the odds helps you decide how much to stake on success.
Bets worth taking. “[S]couts aren’t motivated by the thought, ‘This is going to succeed.’ They’re motivated by the thought, ‘This is a bet worth taking.’” Which bets are worth taking is a matter of their expected value (see the sketch below).
Accepting variance gives you equanimity. Expecting to always succeed is unrealistic, and will lead to unnecessary disappointments. “Instead of being elated when your bets pay off, and crushed when they don’t,” try to get a realistic picture of the variance in bets and focus on ensuring your bets have high expected value.
Coming to terms with the risk.
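The “bet worth taking” framing is ordinary expected value. Here is a minimal numeric sketch (the odds and payoffs are invented, not from the book): a long shot that usually fails can still be a clearly good bet, which is the sense in which accepting variance buys equanimity.

```python
import random

def expected_value(p_success, payoff, cost):
    """Probability-weighted gain of a bet, minus the stake."""
    return p_success * payoff - cost

# Hypothetical long shot: 10% chance of a 20x payoff on a stake of 1.
p, payoff, cost = 0.10, 20.0, 1.0
print(f"EV per bet: {expected_value(p, payoff, cost):+.2f}")  # +1.00

# Most individual bets lose, but a portfolio of them wins on average.
random.seed(0)
outcomes = [payoff if random.random() < p else 0.0 for _ in range(1000)]
wins = sum(1 for o in outcomes if o > 0)
print(f"{wins}/1000 bets succeeded; average net per bet: "
      f"{sum(outcomes) / 1000 - cost:+.2f}")
```

A scout who expects roughly 900 of every 1,000 such bets to fail isn’t crushed by any single failure; the relevant question is whether the portfolio’s expected value is positive.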
Chapter 9. Influence Without Overconfidence
Two types of confidence. Epistemic confidence is “how sure you are about what’s true,” while social confidence is self-assurance: “Are you at ease in social situations? Do you act like you deserve to be there, like you’re secure in yourself and your role in the group? Do you speak as if you’re worth listening to?” Influencing people requires social confidence, which people tend to conflate with epistemic confidence.
People judge you on social confidence, not epistemic confidence. Various studies show that judgments of competence are mediated by perceived social (rather than epistemic) confidence.
Two kinds of uncertainty. People trust you less if you seem uncertain due to ignorance or inexperience, but not if you seem uncertain due to reality being messy and unpredictable. Three ways to communicate uncertainty without looking inexperienced or incompetent:
Show that uncertainty is justified.
Give informed estimates. “Even if reality is messy and it’s impossible to know the right answer with confidence, you can at least be confident in your analysis.”
Have a plan.
You don’t need to promise success to be inspiring. “You can paint a picture of the world you’re trying to create, or why your mission is important, or how your product has helped people, without claiming you’re guaranteed to succeed. There are lots of ways to get people excited that don’t require you to lie to others or to yourself.”
“That’s the overarching theme of these last three chapters: whatever your goal, there’s probably a way to get it that doesn’t require you to believe false things.”
PART IV: Changing Your Mind
Chapter 10. How to Be Wrong
Change your mind a little at a time. Superforecasters constantly revise their views in small ways (see the sketch at the end of this chapter’s points).
Recognizing you were wrong makes you better at being right. Most people, when they learn they were wrong, give excuses like “I Was Almost Right”. Superforecasters instead “reevaluate their process, asking, ‘What does this teach me about how to make better forecasts?’”
Learning domain-general lessons. Even if your error is in a domain that seems unimportant to you, noticing your errors can teach domain-general lessons “about how the world works, or how your own brain works, and about the kinds of biases that tend to influence your judgment.” Or they can help you fully internalize a lesson you previously only believed in the abstract.
“Admitting a mistake” vs. “updating”. Being factually wrong about something doesn’t necessarily mean you screwed up. Learning new information should usually be thought of in matter-of-fact terms, as an opportunity to update your beliefs—not as something humbling or embarrassing.
If you’re not changing your mind, you’re doing something wrong. By default, you should be learning more over time, and changing your strategy accordingly.
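The book doesn’t spell out the math, but the “update a little at a time” habit maps onto Bayes’ rule applied to weak evidence: small likelihood ratios produce small revisions. A sketch with invented numbers:

```python
def bayes_update(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior_odds = prior_odds * likelihood_ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start 60% confident, then see three pieces of evidence, each only
# 1.5x more likely if you're right than if you're wrong.
p = 0.60
for i in range(1, 4):
    p = bayes_update(p, 1.5)
    print(f"after update {i}: {p:.0%}")  # 69%, 77%, 84%
```

Each weak datum nudges the belief by a few percentage points: many small revisions rather than one dramatic flip.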
Chapter 11. Lean in to Confusion
Usually, “we react to observations that conflict with our worldview by explaining them away. [...] We couldn’t function in the world if we were constantly questioning our perception of reality. But especially when motivated reasoning is in play, we take it too far, shoehorning conflicting evidence into a narrative” well past the point where it makes sense. “This chapter is about how to resist the urge to dismiss details that don’t fit your theories, and instead, allow yourself to be confused and intrigued by them, to see them as puzzles to be solved”.
“You don’t know in advance” what surprising and confusing observations will teach you. “All too often, we assume the only two possibilities are ‘I’m right’ or ‘The other guy is right’[...] But in many cases, there’s an unknown unknown, a hidden ‘option C,’ that enriches our picture of the world in a way we wouldn’t have been able to anticipate.”
Anomalies pile up and cause a paradigm shift. “Acknowledge anomalies, even if you don’t yet know how to explain them, and even if the old paradigm still seems correct overall. Maybe they’ll add up to nothing in particular. Maybe they just mean that reality is messy. But maybe they’re laying the groundwork for a big change of view.”
Be willing to stay confused.
Chapter 12. Escape Your Echo Chamber
How not to learn from disagreement. Listening to the “other side” usually makes people more polarized. “By default, we end up listening to people who initiate disagreements with us, as well as the public figures and media outlets who are the most popular representatives of the other side.” But people who initiate disagreements tend to be unusually disagreeable, and popular representatives of an ideology are often “ones who do things like cheering for their side and mocking or caricaturing the other side—i.e., you”. To learn from disagreement:
Listen to people you find reasonable.
Listen to people you share intellectual common ground with.
Listen to people who share your goals.
The problem with a “team of rivals.” “Dissent isn’t all that useful from people you don’t respect or from people who don’t even share enough common ground with you to agree that you’re supposed to be on the same team.”
It’s harder than you think. “We assume that if both people are basically reasonable and arguing in good faith, then getting to the bottom of a disagreement should be straightforward[...] When things don’t play out that way [...] everyone gets frustrated and concludes the others must be irrational.” But even under ideal conditions, learning from disagreements is still hard, e.g., because:
We misunderstand each other’s views.
Bad arguments inoculate us against good arguments.
Our beliefs are interdependent—changing one requires changing others.
PART V: Rethinking Identity
Chapter 13. How Beliefs Become Identities
What it means for something to be part of your identity. Criticizing part of someone’s identity tends to spark passionate, combative, and defensive reactions. Two things that tend to turn a belief into an identity:
Feeling embattled.
Feeling proud.
Signs a belief might be an identity.
Using the phrase “I believe”.
Getting annoyed when an ideology is criticized.
Defiant language.
A righteous tone.
Gatekeeping.
Schadenfreude.
Epithets.
Having to defend your view.
“Identifying with a belief makes you feel like you have to be ready to defend it, which motivates you to focus your attention on collecting evidence in its favor. Identity makes you reflexively reject arguments that feel like attacks on you or on the status of your group. [...] And when a belief is part of your identity it becomes far harder to change your mind[.]”
Chapter 14. Hold Your Identity Lightly
What it means to hold your identity lightly. Rather than trying to have no identities, you should try to “keep those identities from colonizing your thoughts and values. [...] Holding your identity lightly means thinking of it in a matter-of-fact way, rather than as a central source of pride and meaning in your life. It’s a description, not a flag to be waved proudly.”
Could you pass an ideological Turing test? Passing means explaining an ideology “as a believer would, convincingly enough that other people couldn’t tell the difference between you and a genuine believer”. The ideological Turing test tests your knowledge of the other side’s beliefs, but “it also serves as an emotional test: Do you hold your identity lightly enough to be able to avoid caricaturing your ideological opponents?”
A strongly held identity prevents you from persuading others.
Understanding the other side makes it possible to change minds.
Is holding your identity lightly compatible with activism? Activists usually “face trade-offs between identity and impact,” and holding your identity lightly can make it easier to focus on the highest-impact options.
Chapter 15. A Scout Identity
Flipping the script on identity. Identifying as a truth-seeker can make you a better scout.
Identity makes hard things rewarding. When you act like a scout, you can take pride and satisfaction in living up to your values. This short-term reward helps patch our bias for short-term rewards, which normally favors soldier mindset.
Your communities shape your identity. “[I]n the medium-to-long term, one of the biggest things you can do to change your thinking is to change the people you surround yourself with.”
You can choose what kind of people you attract. You can’t please everyone, so “you might as well aim to please the kind of people you’d most like to have around you, people who you respect and who motivate you to be a better version of yourself”.
You can choose your communities online.
You can choose your role models.
For most people and most decisions, I think adopting more of a scout mindset would benefit the person themselves, and usually society more broadly (when their goals aren’t anti-social). That said, after reading the book I did at times worry that Julia may have overstated the value, or understated the costs, of adopting more of a scout mindset. At minimum, I think there is a non-trivial number of person-and-decision combinations for which a hybrid or even soldier-heavy mindset is really important. The book puts a lot of effort into making the general case for scout mindset, whereas I had come in hoping for more discussion of when scout mindset is appropriate vs. when a more soldier-like mindset is.
Overall, I definitely liked the book, and this comment doesn’t do justice to how much I appreciated its other aspects, but I wanted to share that perspective (which some other people also voiced in a book club that covered this book). I think it’s a matter of understanding the audience and having the right expectations about what the book is (a very readable commentary on a broad concept) vs. isn’t meant to be (e.g., an academic study or a personal-policy analysis of when to choose scout vs. soldier).
I am currently very much on the fence about whether to agree with you or not. I’m very keen to hear your views on situations in which soldier mindset is better than scout mindset—can you elaborate?
I don’t have an extensive list written down, and I’ll emphasize that I’m not saying it’s often best to use a pure soldier mindset. Still, here are some initial thoughts/examples of where hybrid or soldier-leaning mindsets might be better:
People suffering from depression or similar conditions
People who aren’t used to using a scout mindset and related tools of rationality (and thus may not be particularly skilled at applying it; this isn’t to say such people should never start learning, but quitting soldier mindset cold turkey may not be a good idea either, especially in combination with the other points here)
When you’ve established a “reputation”/habit of communicating with a soldier mindset, switching to scout-mindset communication may cause people to overfilter your statements (e.g., “Normally this person is really enthusiastic about things, but now they’re being extra uncertain, so they’re probably just really skeptical”). The same applies to communicating with friends who expect you to be supportive (at least on materially unimportant matters where their being wrong isn’t significant or bad). Family too: I recently witnessed some major family drama and recognized it would have been wildly out of place to express caveats and thoughts the way I normally would in scout mindset.
Persuasion: speaking as someone who has done lots of speech and debate (and is even a major advocate for the light side of the persuasion force, i.e., truth-seeking and honesty), I can say that in some instances, if you want to be persuasive, it really helps to commit to the mindset “they are ultimately wrong.” Yes, you can’t be a blind donkey and assume literally everything you say is right; you need to understand how the audience and your competitors are thinking, and know your side’s weaknesses. But ultimately it helps to believe what you’re saying when you actually get up to deliver what you’ve planned.
The rock climbing example (I originally wrote that I’d have to come back to this after reviewing the book and my annotations, since I just remembered really disliking it). UPDATE: this example is around pages 106–108 in my copy. Judging from my minor annotations, I might have slightly overestimated how much I disliked it on first read, but I still think it’s a valid example of where you could get paralyzed by analysis. Of course you should take some time to consider your alternatives, but from what I have heard (and just Google-skimmed), climbing down a mountain can often be harder than climbing up; sometimes descending simply isn’t a choice, contrary to what Julia says. Staying put might also be a bad idea if the elements are threatening. The ultimate point: yes, consider alternatives, but if you judge that you have none (or that they’re all worse), you may just have to steel your nerves by adopting a soldier mindset so you can make the jump without flinching. Julia tries to respond to this sentiment in the next couple of vignettes/paragraphs, but I found the responses lacking for this specific case (e.g., “there’s no clear divide between the ‘decision-making’ and ‘execution’ stages of pursuing a goal” [p. 110], which seems clearly false for “deciding to make the jump” vs. “making the jump”).
That’s definitely not an exhaustive list, just the initial examples that came to mind. Some of them are addressed in the book, but I feel the emphasis was much more on “here’s why those examples aren’t always justified” than on a more balanced perspective.
Thanks! I notice that 1, 4, and 5 are examples where in some sense it’s clear what you need to do, and the difficulty is just actually doing it. IIRC Julia says somewhere in the book (perhaps in discussing the rock climbing example?) that this is where soldier mindset performs relatively well. I tentatively agree with that take, which means I agree with you that soldier mindset is probably better in some cases.
(I updated my comment with some more details about the rock climbing example.) I’ll just re-emphasize, though, that I do think people tend to overuse soldier mindset, and that there are good arguments for not using it as often. I was mainly pushing back against the OP’s sentiment, which felt so effusively positive. In the end, if you’re looking for a readable, slightly-soldier case for a scout mindset, I think her book is great. If you’re hoping for a more nuanced analysis of why and when to be less scout-ish, I still think the book is good, but you’ll want to treat it more as a foil for your own thinking.
I think I agree with this take. Thanks!
Thanks for the great summary!
I really liked the book, and I think it’s an important read for folks early in their EA journey, but I want to quickly say that I disagree with this claim. The book “doesn’t actually talk much about EA”, so it’d be surprising if it were the best introduction to the field. Statistics is a useful field for understanding and contributing to social science, but it’d be surprising if a statistics book were straightforwardly the #1 book to recommend to someone wanting to learn social science.
If someone’s specifically looking for a book about EA, I wouldn’t give them Scout Mindset and say ‘this is a great introduction to EA’—it’s not! Riffing on your analogy, it’s more like a world where:
There’s a book about statistics (or whatever) that happens to be especially useful as a prereq for social science resources—e.g., it provides the core tools for evaluating social-science claims, even if it doesn’t discuss social science on the object level.
Social science departments end up healthier when they filter on the kind of person who’s interested in the stats book and reads that book, vs. filtering on a social science book.
Compared to the content of the stats book, the basics of social science are sufficiently ‘in the water’, or sufficiently easy to pick up via conversation and scattered blog posts, that there’s less lost from soaking it up informally.
It’s more important that a critical mass of social scientists have the stats book’s concepts as cultural touchstones / common language / shared standards / etc., than that they have that for any given social science book’s concepts.
People who almost go into social science (but decide to do something else instead) end up doing much more useful work if they read the stats book than if they read a social science book (assuming they only read one). (Note that this might make the stats book better consequentially even if it means that fewer people end up doing social science work—maximizing EA’s growth isn’t identical to maximizing EA’s impact-mediated-by-people-we-court.)
I could of course just be wrong about this. But that’s the shape of my view.
Here’s some more evidence in favor of this being a particularly good book to give to new people. So far, the Rational Animations video about the “Rethinking Identity” section is the channel’s best-received video judging by comments, both on Reddit and YouTube. I’m also seeing comments suggesting that at least some people deeply understand and incorporate the message. On r/videos, which is a pretty generalist sub, I’m finding some interactions that are uplifting (for me).
I’ve seen some criticism of this book in EA/rationality spaces and in some Amazon reviews, to the effect that it uses too much internet culture in its examples and ties itself too closely to current internet discourse. But I think this is potentially a good thing. It could achieve at least three things: 1. provide real examples (in a non-aggressive way) that are likely to be somewhat associated with people’s identities, thus maybe helping readers break out of that pattern; 2. serve as a guide and an example of how to have non-inflammatory, non-mind-killing discourse on potentially sensitive topics; and 3. get read more widely, because it ties in deeply with how discourse has happened on the internet in recent years. Before getting real-world evidence I wouldn’t necessarily have bet on it achieving these positive effects, but after seeing reactions in the wild I’m more positive. The negative examples I’ve seen are fewer and generally downvoted.