The Scout Mindset: Why Some People See Things Clearly and Others Don't

When it comes to what we believe, humans see what they want to see. In other words, we have what Julia Galef calls a soldier mindset. From tribalism and wishful thinking, to rationalizing in our personal lives and everything in between, we are driven to defend the ideas we most want to believe--and shoot down those we don't.

But if we want to get things right more often, argues Galef, we should train ourselves to have a scout mindset. Unlike the soldier, a scout's goal isn't to defend one side over the other. It's to go out, survey the territory, and come back with as accurate a map as possible. Regardless of what they hope to be the case, above all, the scout wants to know what's actually true.

In The Scout Mindset, Galef shows that what makes scouts better at getting things right isn't that they're smarter or more knowledgeable than everyone else. It's a handful of emotional skills, habits, and ways of looking at the world--which anyone can learn. With fascinating examples ranging from how to survive being stranded in the middle of the ocean, to how Jeff Bezos avoids overconfidence, to how superforecasters outperform CIA operatives, to Reddit threads and modern partisan politics, Galef explores why our brains deceive us and what we can do to change the way we think.

288 pages, Hardcover

First published April 13, 2021

About the author

Julia Galef

6 books · 240 followers
Julia Galef is co-founder of the Center for Applied Rationality. She has hosted Rationally Speaking, the official podcast of New York City Skeptics, since its inception in 2010, sharing the show with co-host and philosopher Massimo Pigliucci until 2015.

Ratings & Reviews

Community Reviews

5 stars: 2,165 (38%)
4 stars: 2,231 (39%)
3 stars: 987 (17%)
2 stars: 198 (3%)
1 star: 32 (<1%)

Trevor
1,340 reviews · 22.7k followers
September 17, 2021
There is a kind of smug self-satisfaction to books like this that invariably makes me feel uncomfortable, regardless of how useful parts of them might prove to be. I used to think of political beliefs as existing on a continuum running from left to right. Then I thought of this continuum as being more like a circle, where the far left and far right end up virtually touching – something the current pandemic has made particularly clear to me as I've watched some Marxists start sounding much more like Q-Anon supporters. But I've started thinking that perhaps the geometric figure that most accurately describes ideas is the triangle. Aristotle liked to talk of the 'golden mean' – his rational position between two extremes – and this book certainly plays that idea for all it is worth – but actually, I sometimes feel that the centre can be just as extreme as any of the 'ends'.

Surely we have all met one of these people in our travels. The sort who says things like 'correlation doesn't equal causation' or 'it seems like you are projecting' or 'can we stick to the facts and leave the ad hominem attacks at the door'. And don't get me wrong – I've been that person too. But again, the problem isn't always what is said, but rather the unbearable smugness with which it is said.

At one point the author says that all she is really doing is making the tools discussed in all those books popular in the early 2000s (Predictably Irrational; Thinking, Fast and Slow; Mistakes Were Made, but not by me…) more accessible to people. I was struggling with this book anyway, but became completely put off at the end when she started spruiking her group – the 'effective altruists'. Look, I don't believe in God, and I feel uncomfortable around people who say very silly things about science – but you will definitely know something is deeply wrong if I ever join an evangelical atheist rationalist society, or if I start wearing a 'Science Rocks, but you are just too stupid to know… fart face' t-shirt. The author criticises people like this too – well, in part – although she has nice things to say about Richard Dawkins. All the same, an organisation that spends its time praising itself for how wonderfully self-critical it can be. Hmm… I don't know.

She ends the book with a list of ‘Scout Habits’ for the reader to practice. I’m going to quote them in full:

1. The next time you’re making a decision, ask yourself what kind of bias could be affecting your judgment in that situation, and then do the relevant thought experiment (e.g., outsider test, conformity test, status quo bias test).
2. When you notice yourself making a claim with certainty (“There’s no way . . .”), ask yourself how sure you really are.
3. The next time a worry pops into your head and you’re tempted to rationalize it away, instead make a concrete plan for how you would deal with it if it came true.
4. Find an author, media outlet, or other opinion source who holds different views from you, but who has a better-than-average shot at changing your mind—someone you find reasonable or with whom you share some common ground.
5. The next time you notice someone else being “irrational,” “crazy,” or “rude,” get curious about why their behavior might make sense to them.
6. Look for opportunities to update at least a little bit. Can you find a caveat or exception to one of your beliefs, or a bit of empirical evidence that should make you slightly less confident in your position?
7. Think back to a disagreement you had with someone in the past on which your perspective has since shifted and reach out to that person to let them know how you’ve updated.
8. Pick a belief you hold strongly and attempt an ideological Turing test of the other side. (Bonus points if you can actually find someone from the other side to judge your attempt.)

Again, she doesn't say this is an exhaustive list, just something people might like to practice. It is a list of what I guess could be called 'intellectual empathy exercises' – and as such they aren't outrageous. In fact, some of them are quite good; I quite like the idea of the 'ideological Turing test'. It is just that I think some of her earlier advice is better, and that this list is likely to be the take-away people leave the book with – mixed with a hyper-dose of smugness too, I fear.

Earlier, she said we should 'hold our identities lightly' – since having decided we are a particular kind of person makes it very difficult for us to ever change our minds. She says somewhere else that we should notice that we are highly influenced by the situations we find ourselves in. This is one of the reasons she gives for joining a group like her effective altruists: you will be around people who hold their identities lightly, and so they will help to make you a better person. I find the generally accepted ideas around identity something I'm rather uncomfortable with at the best of times. I don't think our identities are nearly as fixed as we imagine; I think they are much more situationally constrained. In fact, my main problem with this book is that it assumes that people's behaviour can be changed by changing how they think – whereas the reality is more that the cart of thinking needs to go behind the horse of being. That is, people's ideas are unlikely to change until their situation does.

That is, this book is a kind of primer advocating a new form of consciousness raising – but I'm not sure any form of consciousness raising really works all that well. I think, instead, that people's ideas are much more tied to their life experiences than to how they go about practicing particular habits of mind. Still, the people most likely to learn to practice these habits of mind will then be able to congratulate themselves on how open they are, how rational, how reasonable. Whereas they have likely just codified what were already their 'extreme middle' views.

To provide what I think might be a case in point: I consider myself to be a feminist – which, in turn, I do not consider to be a particularly big statement. In fact, it strikes me as odd that we are in the 2020s and the idea that men and women should be judged on their abilities and their character rather than their genitals still hasn't quite taken hold. We have had decades and decades of 'steps' towards equality between the sexes. Yet men are still paid significantly more than women, men still do significantly less housework than women, and men are still much more likely to work in jobs with higher status than women. I don't know that the decades and decades of consciousness raising have achieved anywhere near enough.

I don't want to say it has been a complete waste of time – but I do think that the changes brought about by feminism have been most impactful where facts on the ground have been changed, rather than just opinions. I don't know that the real achievements of feminism have been where it has made an old male chauvinist feel a bit uncomfortable while noticing the logical inconsistency in some of his strongly held opinions.

The world really does need to change; it's just that I'm not in the least sure that scouting is likely to get us to where we need to be. Another case in point – and perhaps a clearer example of my view – the US has just ended its longest foreign war. It spent something like $2 trillion blowing up stuff in Afghanistan. The Taliban are back in control. Presumably, they are even more convinced they are right than before. What if, rather than blowing stuff up, the US had sought to build Afghanistan up? Two trillion dollars might have made a pretty nice country. It certainly would have completely changed the situation in the country, and possibly made it very hard for the Taliban to convince people that they were a credible alternative. I just don't think people's ideas are as fixed as they are made out to be. But, sure, there really are great ways to make them as fixed as you want them to be.

JD
767 reviews · 535 followers
October 19, 2023
SCOUT MINDSET: To see things as they are, not as you wish they were.

This is an awesome read that really opens your eyes to how the human mind works on a daily basis. It is a very well-written book, with practical examples from the author of how to implement all these scout attributes. While reading it, I could see daily examples of the scout mindset and the soldier mindset working against each other in myself and in my environment. This book is a great tool for making life easier to navigate while still staying true to yourself.

I highly recommend this to anyone wanting to get into a different frame of mind, and it would also be a good book for high school classes to look at, as this mindset would do the future of our world so much good. Beyond the lessons it gives on achieving a scout mindset, a word that comes to mind, though rarely used in the book, is respect, which I personally think is also one of the foundations of this mindset.

A review by Tim Urban also sums it up nicely for me: "The Scout Mindset is a lens, and once you're looking through it, the world makes a lot more sense."

Nika
185 reviews · 220 followers
July 18, 2022
3.5 stars

The author introduces two concepts that describe our mindsets. According to Julia Galef, most people can be divided into two categories, soldiers and scouts, though people may combine the qualities of both: one can act as a soldier today and as a scout tomorrow, under different circumstances.
The scout and the soldier are archetypes. In reality, nobody is a perfect scout, just as nobody is a pure soldier. We fluctuate between mindsets from day to day, and from one context to the next.


We can think of a soldier and a scout as a metaphor for how the human brain chooses and processes incoming information.
These two types of mindset affect our ability to make decisions. The author argues that the scout mindset improves our judgment, which leads to more effective decisions.
The soldier mindset implies that our decisions are often driven by deeply ingrained reflexes. The logic behind the actions of a soldier is to defend himself and his group and to defeat the enemy.
The scout mindset means attempting to understand and explain what is happening, why it is happening, and what could be done to tackle the problem. To this end, the scout maps the terrain, trying to include as many details as possible. She studies and weighs potential obstacles. Unlike the soldier, the scout wants to draw her map as accurately as possible.
Finding out you are wrong means revising your map.

These two roles or models of behavior are both needed in society depending on circumstances.
However, the scout mindset is much more valuable in many cases.
According to the author, such an approach can be extrapolated to almost all areas that involve decision-making.

The theory is explained in detail with a number of examples.
To illustrate her argument, Galef brings to the fore the Dreyfus Affair, a major political scandal that astounded and divided France at the turn of the twentieth century. French men and women started to identify themselves as Dreyfusards and Anti-Dreyfusards.

Alfred Dreyfus - a Jewish officer in the French army - was accused of passing French military secrets to Germany. He was condemned without any serious proof, only on the basis of a memo that he had allegedly written. For those who were convinced of his guilt, this was enough. It seems that the judges and jury who decided his fate genuinely believed that Dreyfus was a German spy.
That is how the soldier mindset works. It ignores everything that contradicts the convenient theory it has adopted. It discards factors that do not fit into its habitual way of looking at things.
This is called "motivated reasoning": our desires and prejudices shape the way we interpret information. Some facts or ideas look like our allies, and we want them to win, while others feel hostile, and we want them to be defeated, figuratively speaking.

But let us return to the Dreyfus Affair. Another figure came into play: Colonel Picquart, a high-ranking French officer. He was not free from anti-Semitism, quite the contrary. However, he set out to dig up the truth. He decided to test an assumption: what if Dreyfus had been wrongly convicted? Picquart admitted the evidence was insufficient; he felt the court must have been missing something.
He discovered that the spying had continued even while Dreyfus was imprisoned. Finally, Picquart managed to prove Dreyfus's innocence. It took him around ten years. Dreyfus was rehabilitated. The truth won.
Picquart's motivation to discover the truth despite all the prejudices, including his own biases, prevailed. This is an example of the scout mindset. The author defines it as "trying to get an accurate picture of reality, even when that is unpleasant or inconvenient."

Galef mentions Darwin, who seems to have possessed a scout mindset. He was able to recognize good critics and use their remarks to polish his theories.
Another example concerns capital punishment: how people react to a study of its possible effectiveness depends on whether or not they already support capital punishment.

Which side we want to prevail affects our judgment, and sometimes the influence is strong. People think they are impartial and fair-minded when in fact they are biased.
To cure that, we should train the scout mindset. The author offers several recommendations in this regard, including calibration: putting a number on our degree of confidence.
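
To make the calibration idea concrete, here is a minimal sketch in Python (my own illustration, not code from the book; the predictions are invented): group past predictions by the confidence you stated, then compare each group's stated confidence with its actual hit rate.

```python
# Minimal calibration check (illustrative data, not from the book):
# for each confidence level you tend to state, compare how confident
# you said you were with how often you were actually right.
from collections import defaultdict

def calibration_report(predictions):
    """predictions: iterable of (stated_confidence, was_correct) pairs."""
    buckets = defaultdict(list)
    for confidence, correct in predictions:
        buckets[round(confidence, 1)].append(correct)  # bucket to nearest 10%
    for confidence in sorted(buckets):
        outcomes = buckets[confidence]
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"said {confidence:.0%}: right {hit_rate:.0%} of {len(outcomes)}")

calibration_report([
    (0.9, True), (0.9, True), (0.9, False),  # claims stated at 90% confidence
    (0.7, True), (0.7, False),               # claims stated at 70% confidence
])
```

A well-calibrated scout's 90% claims come true roughly 90% of the time; a large gap in either direction is the signal to adjust.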

The soldier mindset is rooted in emotions such as defensiveness and tribalism. The scout mindset is connected to different qualities, such as curiosity, openness, and an inquisitive mind.
No less important, a scout's self-esteem is not determined by how right or wrong they are.
'Looks like I was wrong' is a powerful phrase.

Also, the author distinguishes between two types of confidence: epistemic confidence (or certainty) and social confidence (or self-assurance). Epistemic confidence describes how sure you are about what is true, or rather about what you think is true.
Social confidence works differently. Someone endowed with social confidence speaks as if they are worth listening to. They are not afraid of admitting to being wrong.
This type of confidence increases our ability to convince others and to sound coherent.
Epistemic confidence, for its part, does not help to impress opponents; on the contrary, it can make the speaker look overconfident or arrogant.

To conclude, soldiers seek to defend their own beliefs at any cost, whereas scouts yearn to see the world as clearly as possible.
For this reason, scouts are happy to encounter something that contradicts their expectations or beliefs.

Gavin
1,113 reviews · 407 followers
April 16, 2021
Here's a way to tell scientific intelligence from legal intelligence. Both may start from the idea that something cannot be done and think up arguments to explain why. However, the scientist may discover a flaw in the argument that leads him to change his mind and to discover a way to do it...

The legal thinker will merely try to patch the flaw in the argument, because, once he has chosen a side, all his intelligence is devoted to finding arguments for that side.

― John McCarthy

I was a bit of a legalist as a young man: completely gripped by what Galef calls the "soldier mindset", the urge to win arguments and cling to your positions rather than find the truth. I was a philosophy student. Philosophy is supposed to be dispassionate and open-minded, but in fact the sheer number of degrees of freedom in it, and the absence of conclusive evidence, lead to the usual bias and inertia. (We can name positions after philosophers because so few change their minds.) A certain level of intelligence and knowledge of, say, logical fallacies can end up trapping you, since you can usually improvise a fix for the deadly new fact, or anyway say "you too!".

Or not. This is an uplifting and useful set of stories about moving from the (pretty diseased) default mode of thinking towards being, on average, less deluded and unfair. If you spend much time looking at internet arguments, or TV news debates, or other kinds of stupid war, then you'll be cheered, and – who knows – healed, by Galef's examples of people changing their minds and running the numbers, against their current narrowly construed interests.

Galef is a master of this, as you can see from basically any of her radio episodes.

This would have helped the young legalist realise what he was doing, and might have sped him on the road.

Much more like a normal business book than I expected, with three-sentence stories of [random CEO]'s [triumph | desolation], and with more references to other self-help books. I'll accept this as airport bookshop camouflage. It is a friendly first step into honest reason.

The principles are not new, but the illustrating anecdotes are, and the writing is utterly, crashingly accessible in the Bestseller Nonfiction style, and it's short and sunny, and anyway it is a vital public service to redo Plato / Laplace / Schopenhauer / Peirce / Russell / Kahneman / Hanson / Yudkowsky / Galef every, say, two years till the end of time.

News to me:

* The London Homeopathic Hospital had the best results during the Victorian cholera epidemic, for reasons unrelated to homeopathy (clean sheets and proto-rehydration therapy). Still a dismal 18% mortality.

* Spock has a Brier score above 0.5: far worse than the average forecaster on low-stakes internet platforms (0.25), and somewhat worse than flipping a coin. (A quick sketch of the Brier score follows this list.)

* An author of the Christian abstinence craze was persuaded that his book (advising that teens not even date other teens) was harmful, and stopped selling it.
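
For readers unfamiliar with the Brier score mentioned above: it is the mean squared error between stated probabilities and actual outcomes (1 if the event happened, 0 if not), so lower is better, and a constant 50% forecast scores exactly 0.25 on binary questions. Here is a minimal sketch; the forecasts are invented for illustration, not Spock's actual record.

```python
# Brier score: mean squared error between forecast probabilities and
# binary outcomes. 0.0 is perfect; always answering 50% yields 0.25.
def brier_score(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A maximally hedged forecaster (always 50%) scores 0.25 no matter what:
print(brier_score([0.5, 0.5, 0.5, 0.5], [1, 0, 1, 1]))  # 0.25

# A Spock-like forecaster: highly confident and usually wrong (invented data):
print(brier_score([0.95, 0.99, 0.90, 0.97], [0, 0, 1, 0]))  # ~0.71, worse than 0.5
```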

---

Galef type:

Data #2: surprising case studies
Theory #2: models of what makes something succeed or fail
Theory #5: a general lens you can use to analyze many different things
Values #1: an explicit argument about values
Thinking #1: teach principles of thinking directly

Ozzie Gooen
77 reviews · 84 followers
September 26, 2021
TL;DR: A solid book with mass appeal to help people care more about being accurate. Highly readable, easy to recommend.

Also, see this related LessWrong post, which provides an excellent summary. This is much better than my notes. https://www.lesswrong.com/posts/yFJ7v...

----

The Scout Mindset is the sort of book I'm both happy with and frustrated by. I'm frustrated because this is a relatively casual overview of what I wish were a thorough academic specialty. I felt similarly about The Life You Can Save when it was released. I think it's quite good on its own, and recommend it as such. Just know what you're going in for!

Another way of putting this is that I was sort of hoping for an academic work; instead, think of this more as a journalistic work. It reminds me a bit more of popular documentaries and Malcolm Gladwell (in a nice way) than of Superforecasting or The Elephant in the Brain. That said, journalistic works make their own contributions to the literature; it's just a very different sort of work.

I just read through the book on Audible and don't have notes. To write a really solid review would take more time than I have now, so instead, I'll leave scattered thoughts.

1. The main theme of the book is the dichotomy of "The Scout Mindset" vs. "The Soldier Mindset", and more specifically, why the Scout Mindset is (almost always?) better than the Soldier Mindset. Put differently, we have a bunch of books about "how to think accurately", but surprisingly few on "you should even try thinking accurately." Sadly, this latter part has to be stated, but that's how things are.

2. I was expecting a lot of references to scientific studies, but there seemed to be a lot more text devoted to stories and a few specific anecdotes. The main studies I recall were a very few seemingly small psychological studies, which at this point I'm fairly suspicious of. One small note: I found it odd that Elon Musk was described multiple times as something like an exemplar of honesty. I agree with the particular examples pointed to, but I believe Elon Musk is notorious for making explicitly overconfident statements.

3. Motivated reasoning is a substantial and profound topic. I believe it already has many books detailing not only that it exists, but why it's beneficial and harmful in different settings. The Scout Mindset didn't seem to engage with much of this literature. It argued that "The Scout Mindset is better than the Soldier Mindset", but that seems like an intense simplification of the landscape. Lies are a much more integral part of society than I think they are given credit for here, and removing them would be a very radical action. If you could go back in time and strongly convince particular people to be atheistic, that could be fatal.

4. The most novel part to me was the last few chapters, on "Rethinking Identity". This section seems particularly inspired by the blog post Keep Your Identity Small by Paul Graham, but of course, goes into more detail. I found the mentioned stories to be a solid illustration of the key points and will dwell on these more.

5. People close to Julia's work have heard much of this before, but maybe half or so seemed rather new to me.

6. As a small point, if the theme of the book is the benefit of always being honest, the marketing seemed fairly traditionally deceptive. I wasn't sure what to expect from the cover and quotes. I could easily see potential readers getting the wrong impression from the marketing materials, and there seems to have been little work to make the actual value of the book clear up front. There's nothing at the start that reads, "This book is aiming to achieve X, but doesn't do Y and Z, which you might have been expecting." I guess that Julia didn't have control over the marketing.

Matthew Jordan
101 reviews · 69 followers
May 27, 2021
I was surprised by how much I loved The Scout Mindset. I've been following Julia Galef's work for many years, and spent a long time immersed in the literature on rationality, decision-making, and belief formation, so I expected the book to be kind of boring. Instead, I found it extremely persuasive and even quite moving. Every page was jam-packed with important ideas, and the examples masterfully supported the main arguments. It also was never polemical. Julia Galef does not want you to be on her team; she truly just wants you to be intellectually honest and think clearly.

There were a few things I particularly enjoyed about The Scout Mindset. First was its length. Books that argue a simple thesis or introduce a concept really don't need to be long. Second was the lack of jargon. Most books on rationality talk about "steelmanning" and "Bayesian priors" and "the availability heuristic"—none of that here. Just solid reasoning and fun examples.

Third was the focus on immediate, practical solutions. While reading the book, I felt a palpable shift in the way I thought about my own reasoning, and a greater willingness to question why I held certain beliefs. (In particular, I've found it very helpful while navigating my complex feelings about the current iteration of Israeli/Palestinian tensions.)

Fourth, it did not rely on psychology studies to make its central points. Books like Thinking, Fast and Slow, The Righteous Mind, and Predictably Irrational are great, but rely on studies that often don't replicate. None of that here. This is a work of philosophy, and almost self-help, that makes a compelling case on its own merits.

Finally, I loved that it addressed the relationship between clear thinking and mental health. Galef acknowledges that the biggest barrier to seeking out feedback is low self-esteem, that overestimating the size of your problems is a central cause of anxiety, and that the research linking accurate self-knowledge with depression is hella spurious.

A couple other memorable ideas:
- Our language around belief-formation is very war-like: "Knockdown argument", "conceded the point", "staunch supporter", "defending your position". This is bad!! Our view of the world isn't a military stronghold; it's an ever-changing estimate.
- No one has ever changed their mind because of antagonism. If you're being snarky about a viewpoint you don't share, you're declaring "I care about flaunting for my tribe, but don't care about the dumbos who disagree."
- There is a big difference between epistemic confidence (being certain that you're correct) and social confidence (being self-assured and charismatic). The most trustworthy people are very socially confident, but often very skeptical of their own ideas. We should all aim for that!!

The book does have its flaws, of course. Here's one that stood out to me: Julia Galef is deeply immersed in the tech industry, and the examples of careers and contentious topics sometimes feel overly tech-y (Bezos, Musk, entrepreneurs, polyamory, boards of directors, Effective Altruism, etc.). I do worry that this might be a turn-off for some listeners/readers.

All in all, this is the single best introductory resource I've come across for starting the never-ending journey of seeking out the truth. The Scout Mindset is a terrific book, and a model for popular nonfiction. Concise, persuasive, and intellectually honest. Highly recommended!!

Ross Blocher
478 reviews · 1,420 followers
September 18, 2021
I've found it! It's the book everyone should read. In The Scout Mindset, Julia Galef has distilled important lessons on the weaknesses of thinking that could (and do) fill many other books. Galef lists some of these related tomes, such as Mistakes Were Made (but not by me) and Thinking, Fast and Slow, both favorites of mine. The result fills a niche somewhere between social psychology and self-help, but the advice is sound, relatable, and scrupulously backed by evidence. The hardest thing is to describe this without making it sound like the bookish equivalent of eating your vegetables.

Perhaps one way out is to provide an insightful excerpt. Galef defines the “scout mindset” as a learned habit of checking oneself for errors, in full expectation that one will find them and benefit from their discovery. This is in contrast to our more natural “soldier mindset”, which seeks to protect oneself against ever being wrong. She illustrates how this defensive posture is hard-wired into the language of disagreement:

We talk about our beliefs as if they're military positions, or even fortresses, built to resist attack. Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our beliefs or have unshakeable faith in something.

Arguments are either forms of attack or forms of defense. If we're not careful, someone might poke holes in our logic or shoot down our ideas. We might encounter a knock-down argument against something we believe. Our positions might get challenged, destroyed, undermined, or weakened. So we look for evidence to support, bolster, or buttress our position. Over time, our views become reinforced, fortified, and cemented. And we become entrenched in our beliefs, like soldiers holed up in a trench, safe from the enemy's volleys.

And if we do change our minds? That's surrender. If a fact is inescapable, we might admit, grant, or allow it, as if we're letting it inside our walls. If we realize our position is indefensible, we might abandon it, give it up, or concede a point, as if we're ceding ground in a battle.

Even words that don't seem to have any connection to the defensive combat metaphor often reveal one when you dig into their origins. To rebut a claim is to argue that it's untrue, but the word originally referred to repelling an attack. Have you heard of someone being a staunch believer? A staunch is a solidly constructed wall. Or perhaps you've heard of someone being adamant in their beliefs, a word that once referred to a mythical unbreakable stone.


After making the case for the scout mindset, Galef provides tools to develop self-awareness, recognize our biases, weigh our certainty (there's even an interactive exercise for this), transition to a life without illusion, lean into confusion, accept being wrong, escape our echo chambers, and rethink our identity. Along the way, we learn about tools like the Double Standard Test (do we hold politicians from our party to the same standards we hold members of the opposition?), the Outsider Test (how would a disinterested party judge our situation?), and the Status Quo Bias Test (would you actively choose the current situation if it wasn't already the default?). These and many other heuristics are incredibly helpful... and often sobering. No one is immune to these human weaknesses, but we can learn to combat them (heh, combat). The first step is to recognize them when they arise. Galef offers real-life and relatable examples to drive these concepts home. It's a well-argued and well-written book full of actionable insights, even for someone who's read many related works. Highly recommended, and the kind of book you may be tempted to buy for all your friends.

Patrick Peterson
486 reviews · 229 followers
August 9, 2023
2023-02-05
Finished this book two mornings ago. Very cool book.
Great observations and arguments for seeing reality as objectively as possible.
Lots of neat discussion of how to get around the natural/biological and other accumulated and popular biases that defend and promote the "Soldier" mentality: the way of operating that takes orders, is very "rah-rah," and does not worry too much about what the truth really is.

The book promotes the "Scout Mindset," which puts a premium on seeing reality the way it really is and coming back with an accurate map, not the one the boss (or your own psyche) merely "wants" to be true.

The author works hard to show the benefits of this mentality that focuses on finding out what the truth is all about, not taking things for granted. I really like that and hope it helps more people develop that skill. It will do them much good in achieving a better life, as well as contributing to a better society.

Super commendable book.
I find it really in tune with Ayn Rand's philosophy of Objectivism, which puts the focus on reality and reason. There is no mention of Rand in the book at all. In fact, the author uses a fairly wide variety of ideological examples to highlight her various points about the benefits of the scout mentality vs. the soldier mentality. I appreciated that, even though I cringed at some of them, since they did not ring as true to my experience as the author implied they should.

So - I highly recommend the book, and cultivating your own "scout mentality", BUT do not take the examples for granted as being true. Do your own due diligence.

Zainab
52 reviews · 46 followers
May 2, 2021
Julia picks interesting cases to support her claim, so in a way she satisfied my selfish desire to find stories in non-fiction. Thanks, Julia.

However (there's always that however), what I did not like was the same-old-same-old tradition among modern-day non-fiction writers who think the best strategy for making it to 200+ pages is adding as much evidence as they can to validate their precious insights. It's annoying. I don't want to get used to this stupid tradition just because Julia is good at choosing her page-adders.

Anyway. I'm sharing one of the cases to show you what Julia's precious insights are all about.

So, back in the 70s, when Susan Blackmore was a freshman at Oxford University studying psychology, she decided, just like her other college mates, to use drugs to experience her newfound freedom. She eventually found her spirit being lifted up towards the ceiling (reminds me of Jesse from Breaking Bad). But the thing was, she could see her body lying on the bed. That first experience with drugs changed her mindset about the paranormal. After that experience, she would wear stupid costumes, perform cute rituals, and read tarot cards to listen to her spirit guides and all. She also changed her academic focus to parapsychology, did her Ph.D., and eventually found that all the evidence she had for the existence of the paranormal was only chance-based.

She couldn't easily go back to being that annoying skeptic in the family especially when she had been ghost-hunting for a living to save people from their supposed demons (I still have to look it up). She became that annoying skeptic anyway, because truth, you know.

Seeking truth, by all means: that's the thesis. Julia uses billions of cases and case studies to make her point that you should always reach conclusions based on accuracy-motivated reasoning (the Scout mindset) and not directionally motivated reasoning (the Soldier mindset, a.k.a. half-truths and bias-based rationalizing). She makes her point well.

But the thing is, how do you objectively separate the truth from the crap when there is so much crap we feed on on a daily basis? Reminds me of that Netflix documentary called Surviving Death. In the first episode, people narrate their near-death/full-death experiences. Most of them witnessed their spirits being lifted from their bodies (mostly during an operation), saw the doctors struggling to keep the body alive, then suddenly felt like they were dissolving into all the colorful things in the intermediary phase where spirits go, and still made it back to this world because "it just wasn't their time."

For me, consistency of evidence doesn't always mean truth. But, but, but: another parapsychologist in the same episode showed that in most of these cases the survivors mentioned specific details, like the position of a particular doctor standing in a particular direction performing a particular procedure, which the patient could not possibly have reported without seeing it from a distance, eyes fully open, in a state of consciousness.

And yet again, see, all the patients were drugged as per the regular operating procedures. So, yeah.

The book left me with this final thought: What the hell is clarity?

Note: I have updated my review system as I get to read more these days. Now, the books I feel conflicted about (most of them) get 3 stars. It's just easy.

Alexander
68 reviews · 62 followers
October 3, 2021
This book is superb. It is like Rationality: From AI to Zombies but targeting a wider audience.

People tend to grossly overestimate the value of self-deception. When I talk about rationality, some will immediately jump to "but self-deception can be useful." I agree. Self-deception can indeed be helpful, but to a limited extent. Self-deception is not going to, for example, magically make all existential risks vanish.

When talking about rationality, the most important thing is defining what we mean by this massively overloaded word. I subscribe to Eliezer Yudkowsky's definition of rationality. He defines rationality as follows:

Epistemic rationality: to systematically improve the probabilities of beliefs.
Instrumental rationality: to systematically improve at achieving goals.


There is a complicated interplay between Epistemic and Instrumental rationality. There are edge cases in which having more accurate beliefs could work against achieving one's goals. For example, if you are studying for an exam and your goal is to maximize your grade, knowing that your favourite person died the night before the exam is epistemically valuable but not instrumentally. However, in most situations, having more accurate beliefs is instrumental.

Overall, this definition seems sensible to me. It makes no unreasonable claims about the nature of knowledge or values. It acknowledges the intrinsically probabilistic nature of knowledge and respects the is-ought problem, not claiming objective moral goals.

The central idea in The Scout Mindset is that we humans have a plethora of cognitive biases. We can learn the art of rationality to see things more clearly and achieve our goals more effectively. The tools of rationality include probabilistic reasoning, noticing bias, and learning to listen. One of our core biases stems from "motivated reasoning": our tendency to put disproportionate effort into cherry-picking evidence and reasons that support what we wish were true. R. W. Hamming put it as follows: "We see what we look for."

Galef argues that we are better off treating our beliefs as provisionally true and stress testing them through a process of truth-seeking (hence the scout metaphor). This truth-seeking process involves getting out there and surveying the environment to improve our map of the world. It consists in updating beliefs in light of new relevant and informative evidence or reasons. Galef says that no one is a perfect scout, but it is possible to improve this skill, just like any other skill. Scouts go out there and look at the world to construct the best map of the world they can build. Scouts don't start from conclusions. Instead, they reflect and ask themselves, "Do I really have to believe this? What evidence and reasons might cause me to change my belief about this?"

Imagine a scout going on joyful explorations, working towards constructing a map of a section of the territory as she works collaboratively with her squad. Her goal is to create a map of the territory that is as accurate as possible. She draws some contour lines representing hills, some trees. She spots a bear, marks the bear area on the map, and keeps going. She then encounters the river. She sees no bridge crossing the river, though all along the journey she was hoping for there to be one. How would you feel if our scout decided to ignore reality and draw a bridge across the river on her map in the name of positive thinking? Her squad will be somewhat disappointed and underprepared when they get to the river and find no bridge.
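
As a toy illustration of that kind of map-updating (my numbers, not Galef's): suppose the scout starts out 80% sure there is a bridge, and then a local who is right about such things 90% of the time says there isn't one. Bayes' rule tells her how much of the map to redraw.

```python
# Belief updating via Bayes' rule; all numbers are invented for illustration.
def update(prior, p_evidence_given_true, p_evidence_given_false):
    """Return P(hypothesis | evidence)."""
    joint_true = prior * p_evidence_given_true
    joint_false = (1 - prior) * p_evidence_given_false
    return joint_true / (joint_true + joint_false)

# Prior: 80% sure there's a bridge. Evidence: a local who is right 90% of
# the time says there is no bridge.
posterior = update(prior=0.80,
                   p_evidence_given_true=0.10,   # local denies a bridge that exists
                   p_evidence_given_false=0.90)  # local denies a bridge that is absent
print(f"{posterior:.0%}")  # ~31%: she redraws the map instead of defending it
```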

To contrast this, Galef uses the metaphor of the soldier. The soldier is on a mission. The soldier has already found the ultimate truth and already decided what side they are on. They already decided on the conclusion. The soldier then goes out into the world to cherry-pick evidence and reasons to support their conclusion and shoot down any criticism. It is completely understandable why someone would adopt the soldier mindset. Soldiers are defending something very dear to them—their self-esteem. Additionally, admitting that one is wrong can cause negative emotions, which one can avoid by deluding themselves into believing they are right.

A. C. Grayling captures the above in the following:

Grasping [reality/history/whatever] is to cleave to the evidence, be scrupulous in reasoning, dispassionate in judgment, and never tempted to start from conclusions to bend facts to fit them. In that direction lies the possibility of convergence on a best-supported understanding of [reality/history/whatever].


Rationality is a subtle art. Rationality is a tremendously practical skill and matters a great deal in real life. Improving at rationality takes a lot of deliberate practice. As Daniel Kahneman and Amos Tversky showed, we are intelligently illogical. Our irrationality has been forged through the processes of evolution. The cognitive biases we are endowed with are not always beneficial, and we are aware enough to know when they are detrimental to us. We have the capacity to spot our biases and learn to correct them.

What conclusions one reaches is not the important part overall. The part that matters is how one gets to those conclusions and whether that process is truth-conducive and whether it has any chance of serving one well in the long run.

My favourite part about this book is that it creates an identity for scouts. People sometimes associate rationalists with sceptics, cynics, or emotionally cold, socially impotent outcasts, etc. However, truth-seeking should be an identity worn proudly.

Truth-seekers need social connections just like all other human beings. We ought to work towards fostering communities that value truth-seeking and hold it in high esteem. Deliberately practising the art of rationality means encountering powerful new ideas and colliding more with other rationalists. There are critical pitfalls that can afflict groups united around shared ideals. Rationalists will need to be aware of and overcome these pitfalls if they’re to create effective communities.

JJ Khodadadi
435 reviews · 108 followers
September 1, 2022
What is the difference between the two mental models, the soldier and the scout?
In the soldier mindset, reasoning is like a defensive battle, while in the scout mindset, reasoning is like drawing a map and discovering the lay of the land based on reality.

Someone who has a soldier mindset in a given domain, when deciding whether or not to believe a claim, asks themselves, depending on their motivation: "Can I believe this claim?" or "Must I believe this?" A person with a scout mindset, facing the same decision, asks: "Is this claim actually true?"

For a person with a soldier mindset, admitting any mistake means admitting defeat; for a person with a scout mindset, discovering and accepting a mistake amounts to revising and correcting the map in their hands.

People with a soldier mindset are constantly searching for evidence to shore up the foundations of their existing beliefs and arguments. People with a scout mindset, by contrast, put all their effort into making their map more precise and more accurate.

Of course, in practice no one has a purely soldier or purely scout mindset in every domain. Rather, everyone oscillates between these two mindsets in different situations and contexts. What matters is to watch our own reasoning and our approach when we encounter new information and opposing arguments, and to steer ourselves from the soldier mindset towards the scout mindset as much as we can.

The biggest obstacle on the path to correcting one's mindset is that recognizing the soldier mindset in others is much easier than recognizing it in ourselves. Still, we should not forget that on the way from one end of the spectrum to the other there are many other obstacles, including various cognitive biases, inner motivations, psychological and emotional needs, the people around us, personal identity, and the fear of being wrong; Julia Galef explains each of these, and the ways to overcome them, well.

120 reviews · 6 followers
March 26, 2021
Julia’s a friend so I’ll avoid being too effusive. But this is a rare book that actually makes you want to be a better, or at least a better-reasoning, person.

Fredrik deBoer
Author of 3 books · 673 followers
January 11, 2022
This review was originally published in my newsletter.

Julia Galef's The Scout Mindset is not for me, in ways both big and small. To start with, it should be called just Scout Mindset, not The Scout Mindset. No, I will not be justifying that statement with an argument. Beyond that injustice, it's an engaging precis on some important topics by a thoughtful author, and a book that was clearly a labor of love. And at times I couldn't stand reading it.

Galef's book, her first, is part of a burgeoning genre of books on how to think more rationally. The text, squarely designed for a popular audience, is a primer on how to make better choices and think more clearly despite the fact that we as a species have a remarkable number of ways to fail at both. As a string of bestsellers on neuroscience and psychology have argued in the past 20 years or so, we are a self-deluding species, and the ways that we lie to ourselves cause us unnecessary hardship. The trick is whether learning about these cognitive biases can really help free us from them. Galef is convinced that we can think better, if we want to, and presents a set of thought experiments and tools to help the reader in this regard. Embracing such tools helps one to think like a scout - not all the time, but perhaps when it counts.

The titular "Scout Mindset" exists in contrast to "Soldier Mindset." Someone who thinks like a scout is an explorer, willing to truly scout out the terrain and see the world for what it is. (Mostly, it turns out, through applying concepts from elementary probability.) The soldier, in contrast, is stuck in defensive thinking, determined never to cede territory, intent on defending what they believe to be true in the face of threats, even when it would be more to their advantage to let old beliefs go. If those sound like somewhat awkward analogs, I'm with you, but they're also a useful enough shorthand for different ways of thinking. Those fundamental terms do disappear from the text for a strangely long stretch, though, given the title. There are times when I felt that the central metaphor was perhaps pushed onto Galef by her publishing company; publishers love digestible metaphors, creating a good guy/bad guy dichotomy never hurts sales, and, as I said, Galef develops her metaphor and then swiftly sets it aside. But that's speculation, and not very responsible speculation.

Once that table setting is dispensed with, the meat of the book is a variety of thought experiments and mind games, many of them genuinely fun. Much of the text is devoted to laying out those basic elements of probability I mentioned and some proto-game theory, utilizing real-world scenarios that sketch out how bad thinking can lead to bad outcomes. These dips into probabilistic thinking and optimizing decisions are all well-drawn and refreshingly clear. Thanks to her patience and talent for aphoristic thinking, Galef's writing is a model of measured clarity, and the text functions in many ways as a book-length invitation to learn more about thinking. I’m very happy to say upfront that the book is resolutely competent, never falling flat on its face or risking embarrassment by getting out in front of its skis. Whether this is an entirely salutary condition for a book is a question I will leave to others.

One thing that struck me was what the book is not, at least not explicitly: a part of the “rationalist” movement, the loose constellation of bloggers and thinkers that coalesced around the blogs LessWrong and Overcoming Bias, and which is now probably exemplified by Scott Alexander’s Astral Codex Ten. Several versions of “rational/rationality” appear in the index, but unless I missed something these do not refer to the rationalist movement. Alexander appears twice in the text, but is only referred to as “psychiatrist” and “blogger” rather than as an emissary from the rationalist worldview. (Amusingly Galef approvingly cites Alexander for changing his mind to a pro opinion on the efficacy of pre-K; in other words, for changing his mind from right to wrong.) Tellingly, Galef nominates as a healthy intellectual community not the broader rationalism movement but ChangeAView (now CeaseFire), which is merely rationalism-adjacent and which she again does not locate within that context.

Also conspicuous in its absence from the index is “Yudkowsky, Eliezer,” the man considered the originator of the modern rationalist movement by broad affirmation and, in my opinion, not an ideal representative for a movement looking to popularize its ideas. I say that as someone who thinks that the rationalist movement gets many things right and is an overall positive development for our intellectual culture. The trouble is that Yudkowsky is frequently emblematic of a kind of insularity and intellectual arrogance that I associate with that culture as a whole, and this cuts deeply against Galef’s project, which is so clearly designed to welcome newcomers into the fold (of more rational thinking generally, that is). Perhaps I’m reading too much into what’s not there, but it certainly seems that Galef is taking steps not to be associated with that crew. Whether the distancing from the rationalist movement is intended or not, The Scout Mindset seems like a great delivery vehicle for those ideas, presenting the best elements of the tradition without any of the smarter-than-thou baggage.

So what's my complaint? I find the lessons clear and the advice well-taken, and as someone who was already fond of Galef’s work, the book’s content only increased my admiration. She’s the right messenger for a good message at an opportune time. It's the execution of all this that I find imperfect - sometimes it’s just a little odd, sometimes exasperating.

Consider the brief section (little more than a page) titled "Reasoning as Defensive Combat." It's in this passage that Galef establishes the Soldier Mindset concept. Galef thinks that most people have a martial orientation towards reasoning, hence Soldier Mindset. To illustrate this, she sets up an associative construct that will probably be familiar to readers of nonfiction, seeking to demonstrate that we use martial terms (that is, terms of war and combat) when talking about reasoning, which ties in nicely with her Soldier Mindset bit. She then proceeds to… not do that. Really, not at all. Observe.

We talk about our beliefs as if they’re military positions, or even fortresses, built to resist attack. Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our beliefs or have an unshakeable faith in something.

I would hope this would be obvious - none of these are military metaphors, and there are no martial terms here. I confess I find the contrast between the first sentence and the examples quite baffling. These are indeed terms that are often used in a military context, but they're used as metaphors in the military space, rather than being military terms that are used metaphorically in other contexts. The metaphorical arrow points in the opposite direction from the one Galef seems to intend, so to speak. If, indeed, they're metaphors at all. "Solid foundations" is metaphorical language (but has nothing to do with combat or soldiers); so is "deep-rooted" (ditto). "Backed up" is sort of metaphorical in a vestigial way. "Built on fact" is a bridge too far for me; yes, you build houses or bridges, but you also just build stuff intellectually. And, again, not a shred of martial language involved. "Firm," "unshakeable," "strong," and "secure" are just words, and none of them are military terms. So what is the relationship between the first quoted sentence and the rest?

She continues, however!

Arguments are either forms of attack or forms of defense. If we’re not careful, someone might poke holes in our logic or shoot down our ideas. We might encounter a knock-down argument against something we believe. Our positions might get challenged, destroyed, undermined, or weakened.

Here we’re on better footing. Not great footing, but better. “Shoot down” qualifies as a martial metaphor. “Knock-down” I'll grant. “Poke holes” is a mighty stretch but one I'm willing to make in the spirit of charity. But challenged, destroyed, undermined, and weakened are all terms that are so general and context-dependent that it's just hard to see what we're accomplishing here. I'm afraid it gets worse.

And if we do change our minds? That’s surrender. If a fact is inescapable, we might admit, grant, or allow it, as if we’re letting it inside our walls. If we realize our position is indefensible, we might abandon it, give it up, or concede a point, as if we’re ceding ground in a battle.

Here I just have to say… what on earth? Galef is clearly intelligent and a strong writer, and I have to imagine that Penguin employs excellent editors. And yet here I have terms like “admit,” “grant,” and “allow” proffered as combat metaphors. “Abandon” is a martial term? Really? And if you think I’m being uncharitable, I will refer you again to the name of the section, “Reasoning as Defensive Combat,” and ask you to consider that in a footnote, she explicitly says these are words with “connection to the defensive combat metaphor.” To which I say, what metaphor? You have utterly failed to establish such a metaphor.

There's a temptation for all writers to get too attached to a metaphor. Resisting it is hard in the best of times - it's certainly hard for me - but it becomes more complex within an editing process, as you have to fight to keep what you want even as the context in which it initially made sense gets altered. Tricky thing. But this is a professionally published book and I bought my copy, so I ask for a certain level of coherence of analysis. It's a small issue, obviously, this weird failed written construct. But I harp on it because there's a strange sense throughout that the text itself is unfinished, as opposed to its argument. As I said, the propositional content here is always credible and well-presented. But the book as a book - the form rather than the substance, the vehicle through which the arguments are delivered - is not, and it feels like a shame.

Let's take another petty example before we get to the big issue.

[Venn diagram reproduced from the book, not shown here.]
Here I must apologize, as Scout Mindset's failing in this regard is a fairly common one. I believe it should be added to the penal code: Unnecessary Use of a Venn Diagram. Please, tell me: what on earth is gained by using such a diagram here? Is the reader really going to be confused by the concept of a set that includes a subset which has a characteristic that those outside the subset don't share? I don't want to pick on Galef here, as I feel like I see Venn diagrams used for no reason frequently now. Venn diagrams are most useful when there are overlaps between multiple circles, which create more spaces than circles (and thus sets). They allow quick visualizations of inclusion and exclusion when such understanding might otherwise be challenging. Here, we might as well just have two circles that never overlap, labeled "Coping Strategies That Require Self-Deception" and "Coping Strategies That Don't Require Self-Deception." That's also a valid Venn diagram, and one which adds about as much interpretive content as this one. Even better, couldn't we just have a two-column list of strategies that do and don't require self-deception? What's that? I'm spending way too much time on this? OK, sorry.

The point is… I think this book was in need of an editing team with a firmer hand that was more interested in engaging in editing as an adversarial process. And the irony is that such a process would have been perfectly in keeping with the kind of rigor that The Scout Mindset seeks to inspire.

Which gets to the big problem, for me, and why I say it's not for me rather than saying it's not a good book: tone. I use this term with considerable misgivings. For ten years I scolded college freshmen for using it in their papers, as it can so often function as a vague substitute for the precise feelings that I was trying to get them to articulate clearly. But here it's the best word I can think of. The Scout Mindset's tone is that of a patient sixth-grade geography teacher, trying to guide her young charges calmly and gently and landing on an attitude that's perhaps 5% too chipper and 10% too condescending. It happens that I am not a sixth grader, and I like being condescended to even less than I like it when people are chipper. All argumentative non-fiction is to some extent didactic; it's a question of degree. For me, the calibration was off.

And so I can't think of a better overarching judgment here than "not for me." Not for me because I'm not someone who wants to be talked to that way, not for me because I am not the perfect beginner this book seems to imagine as its audience, not for me because I like to be challenged by authors much more than I like to be gently shepherded by them. But - and this is a big but - many people are not like me. Many people on the internet are looking exactly for someone to kindly and patiently guide them from ignorance, and I don't mistake that for an unworthy goal. Quite the opposite, as a matter of fact. There are likely more of those kinds of people than there are of people like me. But all I can do is review from my own perspective, and for me, though the book is fast-reading and frequently entertaining, I found the nearly 300 pages difficult to get through. Galef is just too resolutely enthusiastic for this cynical soul.

But the fun moments are fun. My personal favorite is when she watches Star Trek movies and episodes and tracks Spock's certainty relative to his actual predictive outcomes. (He sucks at predictions, for the record.) She does this in service of explaining the concept of calibrating one's own ability to make predictions at a given confidence level, and it's a lovely illustration of useful concepts. I also quite liked a brief but sharp consideration of the ever-bubbling frequentist vs. Bayesian divide, a genuinely complicated topic that she handles with poise. Galef has the goods here, in general. The trouble for me is, for one, that I already know much of this stuff. (But I'm an obsessive weirdo who took a dozen stats and methods courses in grad school.) For another, I don't like feeling like I'm being led gently to knowledge by a benevolent teacher; I prefer to be thrown into the deep end. (But many people are not like me.) So it's tough to judge. There are some stylistic elements I'm happy to straightforwardly say should have been fixed in editing. ("A thought experiment is a peek into the counterfactual world." Uh, yeah.) But I can't remember a recent book about which I was happier to simply say: it's not for me. The Scout Mindset is a good book. It's just not for me.

The larger question with The Scout Mindset, though, is the one that haunts the entire rationalism movement: is it really the case that we can think our way out of bad thinking? Or are the Hindu sages correct in believing that the rational mind itself is the trap, itself is maya, from which one can only be liberated by letting go? I appreciate the set of constructive choices for becoming more rational that Galef offers at the end of the book. But we live in a world where many millions of people genuinely believe that (inaccurate) estimates of where various heavenly bodies were in relation to each other at the time of their birth influence the events of their lives. Against such irrationalism, thought experiments seem like profoundly impotent tools. Like I said before, books on irrationalism and how to avoid it have become a cottage industry, and yet I can't say I've observed any corresponding growth in ambient rationality. And even the best of us fall into irrationalism. Just the other day I threw a coin in a wishing well myself. But then, Galef admits upfront that we'll frequently fail to be rational even with all of the tools at our disposal, which is a mature and welcome qualification. The only trouble is that while such admissions may be a bit of sober wisdom, they can also make the whole genre seem like a bait and switch.

The best thing I can say for The Scout Mindset is that, at its most confident and charming points, I almost believe that we really can slay our irrational demons and engage with the world from the standpoint of greater objectivity, that we can achieve genuine reason, if only for a while. I almost believe that. But not quite.

After all. Even Spock was half-human.
Profile Image for Stefan Schubert.
19 reviews, 92 followers
April 20, 2021
The Scout Mindset is a spirited defence of truth-seeking and intellectual honesty. Julia Galef argues that by default we are in the "soldier mindset", where we try to defend our views come what may. We make our beliefs part of our identity, so we feel personally threatened if someone challenges them. Instead, she argues, we should adopt the scout mindset - we should be genuinely curious, genuinely open to changing our minds.

Galef argues that adopting this mindset or attitude is key to becoming a better reasoner. She also describes a number of helpful techniques, such as the superforecasters' habit of constantly making small revisions to their beliefs, and how to choose interlocutors that are more likely to change your mind. These techniques serve to show how to live the scout mindset on a daily basis.

I think that Galef is right that we intuitively underestimate the value of figuring out the truth. It's not just valuable for individuals, but is also supremely important for humanity as a whole. Civilisation is ever becoming more complex. So to address the great problems of tomorrow, we need a spirit of impartial truth-seeking. This is the best book on that spirit to date.
Profile Image for Cristina Balan.
73 reviews, 31 followers
March 25, 2021
I started reading this book with a strange feeling: I could not understand the link between a scout and how people think. Slowly into the book, I realised that I was wrong: there is a strong connection between how people in general think and react, and the inquirer's mindset.

Critical thinking is a muscle we fail to train. We pay the costs, and the burden only grows because we are not true to ourselves. We keep lying to ourselves instead of scrutinising arguments, researching, and checking whether we are indeed right or not.

I don't know who said that changing our minds is a sign of intelligence (please attribute it as appropriate), but admitting we are or might be wrong is definitely a sign of health. Instead of taking things personally and fighting others for no solid reason (and no real satisfaction), we should take a step back, chill a bit, look again into things, then admit that we can't always, ALWAYS, be right.

Bonus: there's a sweet hint in the book on how the author met the love of her life. Happy marriage and good luck! It looks like both of you have the appropriate mindset.
Profile Image for Mona.
196 reviews, 31 followers
July 10, 2021
This is a book about self bias.


The first half was very interesting; the author provides very good examples of our judgment flaws and proves her point that we tend to jump to conclusions without investigating all the available data.
Now, her main motivator for the reader is "finding the truth". As noble and idealistic as that sounds, more "down to earth" benefits would probably be more convincing to the average Joe.

The book is very well edited, and readers can follow the thought process smoothly. It is written in lay language, but good background research was certainly done.

As for weaknesses, the second half of the book felt a bit forced, as if the most important things had already been said. I'm not sure this is a topic that requires a whole book to discuss - let alone the whole foundation the author is involved in - but as long as someone sponsors it, who cares.

I have already paused a few times before jumping to conclusions thanks to this book, so as far as I'm concerned... job well done.
Profile Image for Rob Barry.
286 reviews, 3 followers
September 5, 2021
In considering how I liked this book, one of the author's phrases comes to mind: "Just as there are fashions in clothing, so, too, are there fashions in ideas." Understandably, people are trying to make sense of a world that makes less and less sense every day, and I believe the current rationality movement is the latest fad to capitalize upon the need for more certainty. In turn, this book will seemingly appeal to those who are already convinced that they will be able to see things clearly when others don't.

And this, in my opinion, is issue #1: the subtitle implies an “us” and “them” worldview: one hopes that “rational people” will be better than “irrational people.” Is the categorization of a person as either “rational” or “irrational” a moral statement?

Issue #2: After reading this book, I continue to be at a loss as to what "rationality" actually is. In the author's opinion, what is behind this idea? It seems to me that "rational" carries a range of definitions that are largely a function of the academic discipline providing the definition (e.g., economics, history, mathematics, philosophy, sociology, neuroscience, physics, etc.).

Issue #3: The author presumes a common understanding of "reality" and "truth." Rest assured, I am not the "everything is relative" type, but the book left me with the impression that I am "rational" when I stumble into "reality" or the "truth" - that is, we'll realize it in retrospect IF we are rational. But how do I know I'm not making the "best" decision already?

Issue #4: The author convinced me that I am really alone with my circumstances. It is very easy for me to imagine making an irrational decision (haha - I think it’s my default setting). In this decision, as with every decision, I’m either rational or irrational - I have to be clever enough to figure it out. If anything goes wrong, I’ve only got myself to blame, right? What a terrible way to live. What if I have to make a very time-sensitive medical decision for a person I love? Do I have time to run through a menu of decision-making tools, or can I turn to someone I trust to help me with the decision? How do I know that person is rational? Is it enough to know that they care, and that I trust them under the circumstances? Is rational thinking the litmus test for every real-life situation?

Ultimately, I believe the author falls short in her effort to suggest that rationality is not only necessary, but sufficient, in order to be "better than" those who can't see clearly.
Profile Image for Julia.
311 reviews, 15 followers
January 5, 2022
Yet another strong rec from the esteemed Henry! I was probably the target audience for this book — interested in rationality and convinced of its merits but uninformed about specific tools or strategies — and I found it entertaining and practical. Parts 2 and 4, 'Developing Self-Awareness' and 'Changing Your Mind', were the most applicable to me; I especially liked the sections on using thought experiments to detect bias, applying the equivalent bet test to approximate quantitative estimates, and 'leaning into confusion' when faced with 'irrational' behaviour. Most of the anecdotes helped add colour and personality to the writing, but my one petty gripe is that I'm so over reading the same stories about the same billionaires again and again. Otherwise, this was a worthwhile read, and I'm excited to put some of what I've learned into practice!

Our judgment isn't limited by knowledge as much as it’s limited by attitude.
Profile Image for Ryan.
1,048 reviews
August 17, 2021
Can people be rational? Yes, sometimes, but it's hard.

Therefore, in The Scout Mindset, Julia Galef urges readers to think like scouts rather than soldiers. Scouts seek the truth, whereas soldiers indulge in motivated reasoning to defend their intuited or identity-bound positions.

I found that the value of The Scout Mindset was Galef’s constant warnings to readers—often but not always imagined as rationality bros—that they are more irrational than they think. In fact, I wonder if, in seeking rational thinkers, we would be further ahead screening for people who say “I worry about my irrationality” rather than “I am rational.” At one point, Galef amusingly studies Spock’s conclusions, which are nearly always wrong.

Galef offers a lot of great advice, and here are a few pieces that I especially liked. The first comes from Paul Graham: have a light or small identity. Second, think of beliefs as a continuum. Third, it's often useful to establish what will change your mind about a belief. Fourth, seek out voices you disagree with but respect. I note that all of these are not so much ways of thinking rationally as ways of guarding against a soldier mindset.

I find thinking rationally tough, and I often find it difficult to understand how others reach many of their conclusions. In fact, I came to The Scout Mindset after more than a year of the coronavirus pandemic, wondering what explains the behaviours that confound me. Is it a lack of intelligence, motivated reasoning, distrust of authority, scientific ignorance, or a failure of public health policy that explains anti-vaxxers? (Probably some mix, as suggested in the Maclean's article "Typical 'vaccine hesitant' person is a 42-year-old Ontario woman who votes Liberal.") More and more, I wonder if the pandemic has shed light on two major issues: 1) it's hard to say "I don't know" and 2) risk management is tough to do, let alone to do well. (Galef tackles both subjects here, btw.)

Although Galef does not find Spock very logical, he is nevertheless an amazing officer. And he models a final good piece of advice: rational people often guard against their biases by constructing a crew willing to disagree with each other. Live long and prosper.

*Galef has a podcast, Rationally Speaking, that I started listening to while reading the book and which I am happy to recommend.
Profile Image for Yigitalp Ertem.
29 reviews, 11 followers
April 28, 2021
I listened to The Scout Mindset after watching some Big Think and Bayesian-thinking-related videos from Julia Galef, but I didn't find the book very helpful. Rarely interesting, mostly garnish on a simple thesis. The parts about tech-billionaire appraisals, and the inductive examples that start with Facebook/Reddit posts and end with overarching generalizations about 'how we'd better think', made me even more uninterested. Bored towards the end, I still tried to preserve my scout mindset and finished. The author smiled at me at the end:

"Find an author, media outlet, or other opinion source who holds different views from you, but who has a better-than-average shot at changing your mind—someone you find reasonable or with whom you share some common ground."

One good moment was a confessional memoir from the author, in which she asked workshop participants for feedback on a lecture. She mentions that she noticed that while she was asking "was it good for you", she was nodding her head and smiling, pushing people to say positive things even though her actual aim was to get critical responses. I like and remember the mistakes, revelations, and detailed authentic moments more than the "that's how some random successful entrepreneur thinks" kind of orations.
Profile Image for Maris.
92 reviews, 2 followers
August 29, 2021
Going into this book, I carried the bias of knowing I'd most likely love reading it. Julia Galef, at least as I've encountered her through her previous videos and essays, has the skill to teach better ways of thinking in a kind and productive way. It's hard to find words that would do this work justice. I came out of it knowing more about myself, how to think, and what to notice in my own viewpoints, and at the same time the book has a rightfully positive outlook on life. She doesn't rely on the biases and heuristics that have been discussed almost everywhere you can read about how to think better. Instead, she offers a different frame of thinking - the scout mindset. And it's not that people never think in the scout mindset already and would simply be better people for learning it; she brings out the parts of our thinking that are already there, and the parts that are not. Awareness of these thinking patterns goes a long way. I could not recommend this book enough, especially if you're curious about how you yourself think and how to improve that to have better conversations and judgements about the world.
Profile Image for Maher Razouk.
718 reviews, 210 followers
April 16, 2021
Feeling pleasure when you hear news that insults an ideological group you disagree with is a sign of an "oppositional identity" - an identity defined by what it opposes.

These identities are easy to overlook because they often come with no labels of their own, but they can thoroughly distort your judgment. If you love to hate hippies, techies, liberals, fundamentalists, or any other ideological group, that gives you a motive to believe anything that seems to tarnish their image. Do you find vegans annoying? You'll welcome any news suggesting that a vegan diet is unhealthy.
.
Julia Galef
The Scout Mindset
Translated by #Maher_Razouk
Profile Image for Milan.
292 reviews, 2 followers
April 19, 2021
The 'scout mindset' seeks to discover what is correct through fact-checking, reasoning toward conclusions guided by "the motivation to see things as they are, not as you wish they were." The 'soldier mindset' leads us to defend our beliefs against outside threats. "We change our minds less often than we should but more often than we could." "There's no such thing as a 'threat' to your beliefs. If you find out you were wrong about something, great—you've improved your map, and that can only help you." I did not find anything new in Julia Galef's book; it is just old wine in a new bottle.
Profile Image for Martin Smrek.
107 reviews, 26 followers
October 14, 2021
Books quoting Youtube comments, Reddit discussions, and bloggers' opinions should have a warning printed on their front page. Oh, and books based on a TED talk should too. However, the main idea of the book is good, although rather trivial, and overshadowed by the deluge of anecdotes and bizarre quotes from social media.
Profile Image for Tobias Leenaert.
Author, 2 books, 149 followers
November 15, 2021
Quite a useful book, also for activists/people who want to make the world a better place.

(A note: I'm afraid it's mostly people who already have a big part of the "scout mindset" who will read the book. Those are people who probably aren't in need of more reasons to doubt; they doubt *too much*.)
Profile Image for Ali Amiri.
2 reviews
December 25, 2021

This book is informative and mind-opening - a "how to" rather than a "why" book, trying to rescue your "scout mindset" (curious and researching) from your "soldier mindset" (defensive and biased).


I have three main issues with the book: bias, shallowness, and ignoring variance. The irony is that these three are among the main traps the book's guidelines are supposed to help you overcome.

1- Bias: Counting the exact number of real-life soldier- and scout-mindset examples in the book is a hard task, because many of the instances involve unknown people. However, among well-known characters, most of the scouts are liberals (or conservatives who, thanks to the scout mindset, behave like liberals), and, as you can guess, the soldiers are mostly conservatives. The most iconic scouts introduced in this book are: Benjamin Franklin, Abraham Lincoln, Charles Darwin, Jeff Bezos, Elon Musk (who is difficult to categorize, but it is hard to label him a conservative), Barry Goldwater (once a conservative but inclined to some liberal causes in his later life), Susan Blackmore, Jerry Taylor (an advocate for action on climate change), Joshua Harris, author of I Kissed Dating Goodbye: A New Attitude Toward Relationships and Romance (at first a soldier, then, after repenting of his conservative book, a scout), and the Humane League (an animal rights group). Most of them were liberal (by the standards of the time) or conservatives who became liberal. What about liberal soldiers? I could find only Lyndon Johnson and the physicist Lawrence Krauss.


By reading Julia Galef's book, I can conclude (implicitly, of course) that there is a strong association between liberalism (mainly atheism) and the scout mindset.


2- Shallowness: By this I mean hasty claims that do not seem well-researched. For example, in chapter 12 she tries to debunk the "team of rivals" myth. I quote the exact paragraphs:


“When Abraham Lincoln won the presidency in 1860, he reached out to the men who had been his main opponents for the Republican nomination—Simon Cameron, Edward Bates, Salmon Chase, and William Seward—and offered them all positions in his cabinet. That story was immortalized by historian Doris Kearns Goodwin in her bestselling 2005 book Team of Rivals: The Political Genius of Abraham Lincoln.
Lincoln’s “team of rivals” is now a standard example cited in books and articles urging people to expose themselves to diverse opinions. “Lincoln self-consciously chose diverse people who could challenge his inclinations and test one another’s arguments in the interest of producing the most sensible judgments,” wrote Harvard Law professor Cass Sunstein in his book Going to Extremes. Barack Obama cited Team of Rivals as inspiration for his own presidency, praising Lincoln for being “confident enough to be willing to have these dissenting voices” in his cabinet.
This is the account I had heard as well, before I began researching this book. But it turns out that the full story yields a more complicated moral. Out of the four “rivals” Lincoln invited into his cabinet—Cameron, Bates, Chase, and Seward—three left early, after an unsuccessful tenure.”

The first problem with her analysis is that Simon Cameron was not actually one of Lincoln's rivals for the 1860 Republican nomination. Here is a quote from the introduction of Team of Rivals: The Political Genius of Abraham Lincoln:
“In my own effort to illuminate the character and career of Abraham Lincoln, I have coupled the account of his life with the stories of the remarkable men who were his rivals for the 1860 Republican presidential nomination—New York senator William H. Seward, Ohio governor Salmon P. Chase, and Missouri’s distinguished elder statesman Edward Bates.”

The second problem is the author's assertion that three of Lincoln's cabinet members left early (we omitted Cameron previously, so there are two now). But both Bates and Chase left the cabinet in 1864, the last year of Lincoln's first term. Another quote from Doris Kearns Goodwin's book:
“With Seward, Stanton, and Welles secure in their cabinet places, the resignation of Edward Bates provided the only opening for change in the immediate aftermath of the election. The seventy-one-year-old Bates had contemplated resigning the previous spring, after suffering through a winter of chronic illness. In May, his son Barton had pleaded with him to return to St. Louis. “The situation of affairs is such that you are not required to sacrifice your health and comfort for any good which you may possibly do,” urged Barton. “As to pecuniary matters, I know well that you have but little to fall back on…for the present at least make your home at my house & Julian’s, going from one to the other as suits your convenience…. You’ve done your share of work anyhow, & it is time the youngsters were working for you. If you had nothing at all, Julian and I could continue to take good care of you and Ma and the girls; & you know that we would do it as cheerfully as you ever worked for us, and we would greatly prefer to do it rather than you should be wearing yourself out as now with labor and cares unsuited to your age.”
The prospect of going home to children and grandchildren was attractive, especially to Julia Bates, whose wishes remained paramount with her husband after forty-one years of marriage. On their anniversary in late May, Bates happily noted that “our mutual affection is as warm, and our mutual confidence far stronger, than in the first week of marriage. This is god’s blessing.”
However, during the dark period that preceded the fall of Atlanta, when Bates believed “the fate of the nation hung, in doubt & gloom,” he did not feel he could leave his post. Nor did he wish to depart until Lincoln’s reelection was assured. “Now, on the contrary,” he wrote to Lincoln on November 24, 1864, ‘the affairs of the Government display a brighter aspect; and to you, as head & leader of the Government all the honor & good fortune that we hoped for, has come. And it seems to me, under these altered circumstances, that the time has come, when I may, without dereliction of duty, ask leave to retire to private life.’”

Compare that to how Bates's resignation is depicted in Galef's book:
“Bates resigned after becoming increasingly detached from his work. He had little influence in the administration; Lincoln didn’t request his counsel very often, and Bates didn’t offer it.”

She provided a reference for her claims, but I could not access the cited source.

I am not an expert in American history, but I think the author analyzed Lincoln's cabinet with her own soldier mindset. In addition, you can find some minor mistakes in the Dreyfus affair part of the book.


3- Variance: In chapter 6 there is a section titled "Quantifying Your Uncertainty". To test how well you measure your uncertainty (or certainty) numerically, 40 short questions are provided. You answer each with a stated degree of certainty (55%, 65%, 75%, 85%, or 95%). After checking the answers, you calculate the percentage of correct answers at each certainty level and draw a calibration graph. Perfect calibration means that if, for example, you answered 12 questions with 75% certainty, exactly 9 of them should be correct (9/12 = 75%). The smaller the difference between your stated certainty and your actual hit rate, the better your calibration.
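
To make the scoring concrete, here is a minimal Python sketch of that bookkeeping (my own illustration with made-up sample data, not code from the book):

from collections import defaultdict

# (stated confidence, whether the answer was correct) - hypothetical data,
# chosen to match my own results described below: 3/3 at 75% and 6/8 at 85%.
answers = [
    (0.75, True), (0.75, True), (0.75, True),
    (0.85, True), (0.85, True), (0.85, True), (0.85, True),
    (0.85, True), (0.85, True), (0.85, False), (0.85, False),
]

# Group answers by stated confidence, then compare hit rate to confidence.
buckets = defaultdict(list)
for confidence, correct in answers:
    buckets[confidence].append(correct)

for confidence in sorted(buckets):
    hit_rate = sum(buckets[confidence]) / len(buckets[confidence])
    print(f"{confidence:.0%} bucket: {hit_rate:.0%} correct "
          f"over {len(buckets[confidence])} questions")

# Perfect calibration: the hit rate in each bucket equals its confidence.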


Aside from containing just four categories of questions instead of questions from random domains, the problem with this approach is that it simply ignores the effect of variance. For instance, in my case, I answered 3 questions with 75% certainty, all of them correctly. Therefore, my error is 100 - 75 = 25 points. At the 85% level, I answered 8 questions, 6 of them correctly (6/8 = 75%), so my error would be 85 - 75 = 10 points. According to the book, I performed better at the 85% level than at the 75% level. But this is not exactly true. You can think of answering questions as flipping an unfair coin whose one side is more probable (assuming the answers are independent of each other). In the 75% case, I flipped the coin 3 times, and the probability of answering all of them correctly is 0.75^3 ≈ 42%. In the 85% case, the probability of getting exactly 6 of 8 correct is C(8,6) × 0.85^6 × 0.15^2 ≈ 24%. So, to wrap it up, answering 3 out of 3 questions correctly with 75% certainty is more probable than answering 6 out of 8 questions correctly with 85% certainty. Due to variance, my more erroneous result is actually the more probable one.
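
Those two probabilities are easy to verify with a few lines of Python (a quick check under the same independence assumption; my own sketch, not anything from the book):

from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 3 of 3 correct when each answer is right with probability 0.75
print(f"{binom_pmf(3, 3, 0.75):.0%}")  # -> 42%

# exactly 6 of 8 correct when each answer is right with probability 0.85
print(f"{binom_pmf(6, 8, 0.85):.0%}")  # -> 24%

So under perfect calibration, the outcome the book scores as worse (a 25-point miss) is nearly twice as likely as the outcome it scores as better (a 10-point miss).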


To make the calibration more robust, there should be more questions drawn from random domains, and the scoring should either account for variance or discard confidence levels with fewer answers than some predetermined threshold.


Despite the aforementioned issues, I recommend this book to anyone interested in psychology or decision-making strategies. Other, more useful candidates in the same field are Superforecasting: The Art and Science of Prediction and Thinking, Fast and Slow.

Profile Image for Trey Hunner.
135 reviews, 44 followers
June 26, 2021
This was a great book and I'd like everyone to read it. The discussion of soldier vs. scout mindset is a framing of the world that will stick with me. I need to re-read the various sections on strategies for adopting a more truth-oriented life, as those are ideas I'll need to work to internalize.

The discussion near the end of the book on identity really resonated with me.

Holding an identity (athlete, Democrat, feminist, etc.) is neither bad nor good. Identity can make hard things rewarding (e.g. "I'm the kind of person who keeps promises"). But when making decisions related to our identity, we often find ourselves at a crossroads, choosing between an identity-affirming action and a more impactful (but non-affirming) one.

A quote that stuck out to me: "You can make the effort to think honestly no matter what community you're embedded in, but your friends, coworkers, and audience can either provide a headwind or a tailwind for your efforts." Julia then goes on to note that this tailwind is why she joined the effective altruism movement.

This type of thinking is very important to me and I suspect I'll be re-reading this book (which is something I do rarely) within the next few years.

If you'd prefer not to read the whole book, I'd recommend any of the interviews about it (I finally read the book because I kept hearing interviews with Julia on various podcasts).

If you want a sample before you read the whole thing, Julia gave a TED talk on this topic years ago, and she's done some interviews recently about her book (Clearer Thinking, Vox Conversations, EconTalk, The Weeds).
35 reviews, 3 followers
December 24, 2021
Solid book! If you've poked around the rationality community before, I don't think anything in here will be groundbreaking, but Galef's prose is good and accessible. She has a clear, well-explained framework for how to implement the scout mindset, and focuses on actionable changes you can make, which I appreciated.
Profile Image for Kay.
549 reviews, 63 followers
January 6, 2022
This is an interesting and provocative book, in the vein of Thinking, Fast and Slow (which I admittedly have not read but which Julia Galef references a lot!), that aims to help you improve your critical thinking. It's a well-reported and well-executed book, and it gave me some good tips for how to think about things moving forward. She delivers concrete mental tests we can use to evaluate information and ideas without falling into the trap of polarized thinking or confirmation bias.

The basic premise of the book is that we have a somewhat innate tendency to exercise what she calls the "soldier" mindset: fit in with the group, agree with the dominant view, and go along with the crowd. This can have its benefits! Humans are social creatures, and adhering to social norms is a big part of the glue holding us together. But it can also lead to a lot of groupthink and doing things simply because they've always been done that way. Galef advocates for more of what she calls the "scout" mindset - consistently reevaluating beliefs and ideas as we gather new information, and making better decisions as a result. She has a set of mental tests to try:

-"Are you judging one person (or group) by a different standard than you would use for another person (or group)?"
- "How would you evaluate this situation if it wasn't your situation?
- "If other people no longer held this view, would you still hold it?"
- "If this evidence supported the other side, how credible would you judge it to be?"
- "If your current situation was not the status quo, would you actively choose it?"

These are all really helpful mental exercises - and ones that I sometimes unknowingly employ in my own work. (The first, particularly, is helpful for cutting through the BS and talking points that often fly around on Twitter.)

The downside with a book like this is that it would perhaps be better executed as a well-edited magazine article than as a whole book. That's not a critique of Julia! She should get her book advance, and there's certainly a market for middle-manager-targeted "how to think" books. (Hello, I am a middle manager.) But if you don't have time to read the whole book, I'd recommend just the intro and a few key chapters. She also has a podcast that's probably worth checking out.
Profile Image for Teo Ekstrom.
145 reviews
July 11, 2022
I loved this book to the extent that I'm almost starting to doubt that it's a good book. Like, it's TOO close to perfect not to feel suspicious as I'm reading it. Basically, her idea is that you can sort your mind as operating in one of two modes (mindsets) at any given time - mostly soldier mindset, or mostly scout. Soldier mindset involves defending your beliefs, whereas scout mindset isn't really interested in defending anything so much as in making an accurate "map" of the world (improving your model of how the world works).

This isn't really something that a lot of people would disagree with--everyone I interact with kind of gets that confirmation bias exists, other biases exist, but the trick is how to stay out of soldier mindset. That's where the book gets really good, with Galef coming up with tons of interesting strategies to help you make decisions when you're in a tight spot. Some of my favorites:

Imagine that this thing you're considering (having kids, studying at university, etc.) was something that only 5% of the world did. Is it still something you're interested in, or were you subject to strong peer influence?

If you're trying to predict something, attach a money value to it in a kind of bet. If I'm wondering if Sunny will ever walk properly on a leash, would I accept a bet for $100 that in a year he won't be walking properly? No, I probably wouldn't, which helps me get perspective that it's a temporary frustration.

If I'm agreeing with someone in a discussion, imagine if they suddenly said "Actually, I've changed my mind". How strongly do I feel about defending the original position? Would I change as well?

If I'm considering a change, think about whether it's just status quo preference. If I'm torn about taking a new job that pays more but has a longer commute, how would I feel if I had that job already and was offered my current job? Which one would I take in that event?

Anyway, the book is full of little stories or hypotheticals like this. It's very much going to appeal to a specific kind of person (me), but I would honestly recommend this to anyone and have already bought a copy for a friend.
