
Not Born Yesterday: The Science of Who We Trust and What We Believe

Why people are not as gullible as we think

Not Born Yesterday explains how we decide who we can trust and what we should believe--and argues that we're pretty good at making these decisions. In this lively and provocative book, Hugo Mercier demonstrates how virtually all attempts at mass persuasion--whether by religious leaders, politicians, or advertisers--fail miserably. Drawing on recent findings from political science and other fields ranging from history to anthropology, Mercier shows that the narrative of widespread gullibility, in which a credulous public is easily misled by demagogues and charlatans, is simply wrong.

Why is mass persuasion so difficult? Mercier uses the latest findings from experimental psychology to show how each of us is endowed with sophisticated cognitive mechanisms of open vigilance. Computing a variety of cues, these mechanisms enable us to be on guard against harmful beliefs, while being open enough to change our minds when presented with the right evidence. Even failures--when we accept false confessions, spread wild rumors, or fall for quack medicine--are better explained as bugs in otherwise well-functioning cognitive mechanisms than as symptoms of general gullibility.

Not Born Yesterday shows how we filter the flow of information that surrounds us, argues that we do it well, and explains how we can do it better still.

384 pages, Hardcover

First published January 28, 2020


About the author

Hugo Mercier

12 books · 63 followers

Ratings & Reviews



Community Reviews

5 stars: 102 (28%)
4 stars: 156 (43%)
3 stars: 79 (21%)
2 stars: 20 (5%)
1 star: 4 (1%)
Sophia
227 reviews · 88 followers
August 7, 2020
This has become my book of the year! Despite my being a psychologist, and a cognitive neuroscientist in particular, I learned a lot from this book, both minutiae and a broader paradigm shift in how I view the "gullibility" of individuals and how humans process information. What you get out of this book is a deeper, more useful understanding of how mass persuasion works (or rather, doesn't), why fake news is so viral yet inherently superficial, how individuals decide who and what to trust, how these mechanisms fail, and how to avoid those failures. Reading this will teach you more about yourself and the people around you.

The basic premise of the book is that people are not inherently gullible, even if this is a popular explanation for why we're faced with pizzagate, obamagate, antivaxxers, Alex Jones, etc. Instead, Mercier argues that things are never so simple: some things are more likely to be believed than others, and understanding human information processing is key. Often, what looks like an insurmountable failure in human cognition is actually a rare edge case in an otherwise well-functioning system. Other times, it's a question of misunderstanding a situation, or overlooking ulterior motives. At some point, though, the author tries moving the goal posts on what it means to be "gullible", and this is my main point of disagreement with the book (see concept 12 below). Sometimes he demolishes interpretations based on gullibility by providing more information. Other times, he just walks you through the irony of the situation. For example, it's common to argue that people are gullible because they believe something crazy like vaccines cause autism...but if they were so gullible, why don't they believe YOU that vaccines are safe?

The idea of this book is not a key that opens all locks, but more a magnifying glass, an encouragement to look for ulterior motives and deeper explanations for what might on the surface appear to be blind gullibility.

***

For those who can't be bothered to read the book, I've identified the key concepts it illustrates, and then tried to chew them over in light of 2020.

Concept 1: We process information with "open vigilance", that is, we're open to new information, but we judge it for plausibility and reliability. This is comparable to an omnivorous diet: we try new things, but keep track of those that make us sick so we can avoid them in the future.

Concept 2: Gullibility is biologically implausible: believing "anything" only pays off if sources of information are honest. If people believed anything, it would not take opportunists long to start lying about everything.

Concept 3: Traditionally, gullibility is associated with being "dumb" and not thinking. Instead, the opposite is true: people who stop thinking don't absorb more information.

Concept 4: The form in which information arrives determines how seriously it's taken. Just saying "8 out of 10 scientists agree..." is not going to have the same impact as being in a room with 10 scientists, 8 of whom agree. This explains the discrepancy between "herd effects", in which people will trust a majority, and the ineffectiveness of vaccine research communication.

Concept 5: Lying is hard (and detecting lies is hard); negligence is easy. So we are fine-tuned to detect not lies but diligence. We trust people who make more of an effort to communicate information relevant to us. In this context, commitment becomes especially important; we don't accuse people who are very confident about something of lying to us, but rather of being overconfident and unreliable.

Concept 6: We trust those we can hold accountable with reputation; when someone is not affected by how we view them, we have less reason to trust them. On the flip side, we trust people we already have information on (a reputation). This is where name recognition can play a role.

Concept 7: We trust based on interests; if someone has a common interest with us, or a self-interest in preserving reputation, we have reason to trust them.

Concept 8: Demagogues don't guide the people's will, they reflect it. Easily seen in Trump: he probably doesn't care much about immigration (his wife is an immigrant), but his base does, so he makes it a key issue.

Concept 9: Crusades led to thousands of peasants heading off to their deaths; witches were widely believed to cause harm, and many innocent people were burned alive; the Xhosa, in the middle of famine and conflict with invaders, were convinced to sacrifice all their cattle to create a ghost army. While on the face of it all these seem like gullible people falling for stupid ideas, closer scrutiny reveals ulterior motives and more complex situations. The peasants who went on crusade were on the verge of starving at home. People accused of witchcraft were already disliked by the community for other reasons, and the accusation was more of an excuse than the main cause; in fact, a "witch" who confessed to cursing someone was more likely to be spared. Only communities whose cattle were already suffering from lung disease sacrificed them; furthermore, most of those cattle were owned by a privileged few who had recently chosen to export them rather than keep them for the community, and the community didn't like that. People are only mass persuaded to do things they wanted to do anyway.

Concept 10: Mass persuasion doesn't work. In dictatorships, propaganda is only accepted by people who benefit from the regime (and from the excuse the propaganda provides). The Nazis tried to enforce anti-semitism, but it only took hold in already anti-semitic communities. The current Chinese regime knows this and has a different strategy: friction and flooding. Make true information difficult to find (don't collect government statistics, ban websites and keywords), and flood the airwaves with meaningless propaganda and pop-culture distractions. That amount of propaganda is mainly for people already on board with the regime, to keep them convinced.

Concept 11: The best you can hope for in influencing people is to establish: a) the criteria to use for evaluating people and ideas, b) how to frame an issue, c) which issues are worth thinking about. Essentially, it comes down to having the power of selecting information, but whether people buy it depends on the individual. Advertising works only insofar as it's an information source.

Concept 12: There is a distinction between information held "intuitively" and "abstractly". Intuitive information is what we draw inferences from and use to make everyday decisions: the ground is solid, that person doesn't like me, walking into a road is dangerous. Abstract information is most anything we learn in school but don't regularly act on, like the planets in the solar system, the laws of thermodynamics, or where Obama was born. Most fake news and false rumors are based entirely on abstract information and don't genuinely affect individuals. Pizzagate is a case in point: the idea was that a pizzeria was running a sex-trafficking ring. With one notable exception, the people who believed this limited themselves to spreading the rumor and giving negative ratings online. The only person who actually stepped up was a guy who went in guns blazing, demanding the children be set free; that's what you do when you really believe something bad is happening.

The author here is essentially saying that people don't *really* believe these crazy ideas, because the belief stays at an abstract level. I think this is a bit of goal-post shifting, because I would argue that no matter how superficially you believe something, the moment you buy into it even a bit, and it is very obviously stupid, that makes you gullible. How else to distinguish the slice of the population who bought into pizzagate from the rest of us who didn't?
Still, it's operationally useful to recognize the difference between a deeply held belief and a superficial one.

Concept 13: Cost asymmetry explains why we care about crazy conspiracies like 9/11 trutherism and pizzagate: it's better to know all possible threats, even implausible ones, than to risk ignoring real ones. In general, information about threats is considered "useful", so people who provide it get reputation points, and the ideas spread more easily. Furthermore, since most people avoid danger, and most rumors involve faraway situations anyway, there's little risk of being proven wrong, so people are even less likely to fact-check Alex Jones-type rumors before spreading them. "False rumors spread so well not because people take them too seriously but because they don't take them seriously enough."

Concept 14: We rather suck with sources. While we recognize that information from a good source, even second hand, is better than from a bad source, we are not so attentive when sources are not properly defined. When reporting our own sources, we also tend to distort the information, both for simplicity ("my friend's friend's cousin" becomes "my friend's cousin") and to improve our own standing. When sources are not kept track of, you can learn the same bit of information from different people and think you're getting converging evidence, when really everyone got it from the same source. This is how "social" sources become so powerful, like religion (God is more believable because it seems like everyone you know independently started believing) and even science (wash your hands, they're covered in germs).

Concept 15: Individuals use bad information for their own gain. Someone trying to be accepted by a group can make themselves "unclubbable" to all rival groups, thus getting fast-tracked into acceptance. If you say something so absurd that most other groups would reject you, even if it's a little crazy by your target group's standards, the fact that you burned bridges with everyone else makes you more trusted by the group you're after. This is clearly what Trump is up to when he says blatantly false things, like that injecting bleach cures coronavirus, or that immigrants and protesters are criminals. Likewise, flat-earthers and creationists, especially the more "intellectual" ones, don't believe these theories; they're just burning bridges with "mainstream" academia to be included in a new club of anti-intellectuals.

Concept 16: Blaming fake news and mistaken beliefs for bad decisions is backwards; people justify bad things they wanted to do anyway with bad "facts", taking any excuse they can find, however flimsy, because an excuse is better than no excuse. When Trump supporters buy into the idea of Mexican caravans infiltrated by drug dealers, it's because it serves their goal of stopping immigrants from entering the country. They are not "gullible"; they are opportunists.

Concept 17: Scientific information is crazy; it's actually more surprising that people accept it than that they don't. Very few people actually understand any given portion of science, and not many more know a scientist they could trust. So they rely on "coarse cues" like degrees and knowledge of math.
The author doesn't seem to have a good explanation for why laymen trust science, but it's actually pretty obvious by his own criteria: science has a really good track record. We've gone to the moon, gotten rid of polio, and everyone has smartphones.

Ultimately, I think the book makes pretty clear that most situations are more complicated than just "people are gullible". Instead, individuals are pretty selective about what and whom to trust (it has to match what the person already knows, and it has to come from someone who can be held accountable), and it is in fact quite difficult to get a lot of people to believe anything you want. Your best chance of success is to tell them something they want to hear that doesn't clash too hard with reality.
This book is optimistic; it stands by the idea that people are not as stupid as they're portrayed, and that individuals can be trusted to do what's in their best interest. The book was published just before the COVID-19 pandemic, so the author did not have the opportunity to address the latest crazy theory: that the virus itself is an inflated hoax, nothing worse than the flu, that Democrats are using to destroy Trump before the election. It makes sense that the people who believe this are not blindly gullible (they would go extinct pretty quickly) but rather have ulterior motives. What those motives are, though, strong enough to go against self-preservation, I'm not sure.
In the UK, there was a rumor that 5G caused COVID. People didn't passively give negative ratings to Verizon; they actively burned 5G towers (witches!). The easy thing is to label these people gullible, but it seems more plausible that they had some other, prior reason for distrusting 5G. What that could be, I don't know.

Steve
1,043 reviews · 58 followers
February 22, 2020
Nice book, well-reasoned, denying that people are generally gullible - especially about things that are important to their daily life and decision-making. For every person that falls for a scam, thousands of others ignore it or laugh at it. When mobs of people seem to be inspired by a demagogue to do awful things, it may be that these people have their own longstanding motives for their behavior and the leaders have simply jumped in front of an already existing proto-mob.

Mercier’s justification is that gullibility in humans would have been selected out by evolution - people that are generally gullible would have been victimized to the point where they’d be unlikely to pass on their genes. Sociality in general is a good thing and may imply a bit of gullibility — but just a bit.
Frank Theising
367 reviews · 30 followers
March 8, 2022
In Not Born Yesterday, Hugo Mercier argues against the view that human beings are inherently gullible and easily misled (for example through propaganda, advertising, or a foreign or covert influence campaign). I previously read excerpts from this book as part of a Master's Degree course on Influence Campaigns and Cyber Operations at National Defense University. A good read for the millions of people freaking out over covert or foreign influence campaigns in recent US elections. The book covers several other areas (including fringe beliefs like Flat-Earthers or the Obama-is-a-Muslim nonsense), but most of my notes focus on its main subject: the difficulty of mass influence campaigns. 4 stars.

What follows are my notes on the book:

Mercier starts by examining the evolution of communication in the animal kingdom. Signaling that proves advantageous is kept, and disadvantageous signaling is snuffed out. Communication in various animals, among both friends and predators, evolved in such a way as to penalize unreliable signals (i.e., it increased the costs of communication). For example, antelope that fake signals of speed to predators get discovered as frauds and eaten; habitual (human) liars get penalized and ostracized from the tribe. However, in the modern world human beings interact in infinitely more complex ways than animals, and it is not practical to make communication costly in everyday life. The author argues humans evolved a number of cognitive "open vigilance mechanisms" that allow communication without that cost every time. Furthermore, because gullibility is too easy to take advantage of and is therefore not adaptive, according to evolutionary theory it should not persist at large scale in the population.

Cognitive mechanisms help us decide how much weight to put on what we hear or read. Are good arguments being offered? Is the source competent? Does the source have my interest at heart? As stated, these methods don’t scale well. So we deal with this through open vigilance mechanisms including plausibility checking and reasoning. Plausibility checking is always on, which makes it tremendously difficult to change people’s minds (especially through mass persuasion techniques like propaganda and advertising).

The author examines the supposedly successful propaganda of demagogues like Cleon (Peloponnesian War) or Hitler (WWII). He lays out a compelling case that these individuals were not remarkably persuasive but caught "the feeling of the people." In other words, they reflected rather than guided the people's will. They did not gain power by manipulating crowds but by championing opinions that were already popular. [Not mentioned by name, but it applies to President Trump as well. He didn't brainwash just under half the country but put forth messaging that tapped into existing cultural, economic, and class beliefs and grievances.]

He dives deeper into Nazi propaganda and shows that sheer exposure had no effect at all on sentiment. The messaging was successful only in regions where pre-existing anti-Semitic beliefs were strong. Because most of the German population was not highly anti-Semitic, Hitler ultimately achieved his goals only through "terror and legal discrimination" tactics, not popular will.

The author also takes a look at the ineffective propaganda efforts of the Soviets and of Communism under Mao. When these methods proved inadequate, China moved away from propaganda and toward methods labeled "friction and flooding." Friction is making information harder to access (censorship, blocking keywords, etc.). Flooding is distracting people from sensitive issues by bombarding them with issues of less importance to the state (like celebrity gossip). This continues today with China's 50-cent army of online trolls. The US media does a pretty good job of this as well (intentional or not).

The author argues that the billions of dollars spent on US campaigns are largely wasted, as people rarely change their minds. Even the feared Russian interference in the 2016 election likely made no impact. Any Russian influence campaigns (and they did exist) were highly unlikely to turn Clinton voters into Trump voters; rather, they preached to a choir that already held the opinions being pushed.

People today choose what they watch. People who watch political shows and news have been shown to have high levels of political knowledge, and they are the same people who are least likely to change their minds in response to what they see on the news. On the vast majority of political issues, average people have no strong opinion (or any opinion whatsoever), which is why so many people use other cues (like party affiliation) to decide what they think on an issue. For example, you could say Barack Obama supported (insert Republican position) and many Democrats would agree with it, and vice versa.

The author also examines advertising and likewise concludes that ads have small effects at best. The most reliable indicator on whether an ad will be effective or not is whether the audience has preconceived opinions. Ads relying on celebrities are only effective if the audience views the celebrity as a trustworthy expert in the relevant domain.

So what about all the people who believed Barack Obama was a foreigner or a Muslim? Flat-Earthers? The nut who attacked a DC pizzeria because it was supposedly a front for child trafficking? Or the elderly person who falls for an extremely obvious Nigerian Prince scam? The author argues that if these beliefs are held, it is only reflectively, as a sort of "mind candy." According to polls, millions of Americans supposedly believed that children were being trafficked in the basement of a DC pizzeria. Yet only one (Edgar Welch) actually stormed the store with a gun to save the supposedly captive children. Millions of others did nothing (besides maybe a 1-star review or trolling comments). Such behavior can only be explained if the belief in Pizzagate is held reflectively, not intuitively. The same goes for people who believed Obama wasn't a US citizen: nobody actually acted as if that were the case.

As for the Nigerian Prince email scams, the author makes an interesting observation: sending out spam is cheap and easy, but actually reeling in a victim is time-consuming and costly. The scam is intentionally designed to be absurd, because that guarantees the only people who respond are truly gullible outliers.

For flat-earthers, the author argues that this belief is often held (and exaggerated) more as a means of maintaining prominence or fellowship with an in-group than as a genuine belief in a preposterous theory (better to be a wealthy flat-earth YouTuber with a following than a poor nobody).
★‿★
13 reviews
March 30, 2022
I really wanted to give this book 5 stars, but I can't go beyond 3. Sometimes I find that essays such as this one are too long for no real reason and should be cut by 30%. That is not the case here. In fact, I think Not Born Yesterday would have gained a lot had it been 30% longer, although that extension would probably have weakened the author's thesis. Let me explain.

Why the wish for a 5-star review? Because the approach taken by Mercier is inspiring, original, thought-provoking, and goes beyond the standard take on the topic of trust and belief acquisition. Mercier isn't satisfied with the opinion that people are just gullible and need to be educated. It doesn't take much cognitive effort to see that education can't solve everything and that people are not simply gullible (if they were, why would it be so hard to convince them that they are wrong?).

In the first few chapters, Mercier uses evolutionary biology and psychology to discriminate between different views on trust and communication. Can any species really afford total gullibility? Probably not. The way Mercier develops his concept of open vigilance is fantastic! And how he uses the framework built in these chapters throughout the rest of the book is definitely fascinating. For instance, I found his analysis of how different types of rumours spread, and to what degree they are accurate and/or believed, as well as his analysis of whom we trust, particularly insightful.

BUT.

While I was reading Not Born Yesterday, my reactions were constantly cycling between "huh, good point" and "oh come on, that's a little dishonest". At one point, the amount of "oh come on, that's a little dishonest" became simply unacceptable. Like many thinkers, Mercier seems unable to simply point out a flaw in a theory or make a nice addition to it; he must take the whole thing down and make his contribution appear disproportionately big. That tendency really annoyed me. For readers with even a modest background in cognitive science, it is hard not to think that every two chapters Mercier avoids confronting important counterarguments, makes relatively solid and nuanced results on a topic appear all wrong, and cherry-picks the studies he presents. Often, I would read a passage and think "no way, what about x, y, z?" Sometimes, only a few paragraphs later, I would read a quick sentence nuancing his take without affecting his argument, like: by the way, this applies only in that situation, or only if we interpret x to mean y. Other times, this nuancing but purposefully hidden remark would be found in a footnote, like: many studies show [the opposite of what I think], but see my article on this. He also depicts some accomplished researchers in a way that can mistakenly lead one to believe they are complete charlatans or flatly incompetent, when Mercier only touches on one aspect of their work. As I progressed through his book, I, ironically, lost trust in his ability to present a charitable literature review of the topics he wishes to discuss.

So, why did I wish Not Born Yesterday had been 30% longer? Because Mercier often fails to argue conclusively for many of the key points his thesis is built on and side-steps important criticisms. (i) There are specific places where, when he argues against gullibility, I wondered why he did not discuss the sunk-cost fallacy, which is relevant in this context (could we be gullible when we first accept a belief in a debate, but not gullible when this belief is later challenged?). (ii) When he argues that propaganda does not work very well, I wondered why he didn't address the effect of repetition bias or availability cascades. (iii) When he argues that people rationally evaluate arguments, I wondered why he did not discuss belief bias and confirmation/my-side bias. (iv) And so on…

Overall, this book was surprisingly fun and annoying to read all at once. I like how Mercier challenges my views and often introduces nice ideas in the quest to better understand how reasoning works in general; I just wish he had cherry-picked less, argued more rigorously, and that I could trust him more. I recommend it mostly to people with a background in cognitive science, and one should probably take his words with many grains of salt.
Chris Boutté
Author · 7 books · 209 followers
February 19, 2024
4th read:
This is another one of those books that I read once a year because I love it so much. I’m a highly skeptical person who also has trust issues, and I just can’t figure out why so many people are gullible and fall for ridiculous scams and misinformation. Well, Mercier argues that humans aren’t actually gullible. Each time I read this book, I think I’m going to be able to refute his arguments, but they’re solid every time.

Mercier has solid arguments and research to back up the fact that humans are actually pretty skeptical and untrusting. He explains how we typically have some pretty good reasons for trusting certain people, but it’s just bad people who take advantage of that once they gain our trust.

I read this book again because I was thinking about reading it with my son because he wants to learn about why people believe weird things. Although I love this book, and my son is an extremely bright 15-year-old, I don’t think I’ll have him read this until he knocks out some other books in the realm. Mercier writes in a less academic way that makes it so anyone can understand this book without being a psychologist, but there are some higher-level ideas in here that I don’t think my son would understand just yet.


3rd read:
This was my third time reading this book, and it’s still one of the best I’ve read about why we listen to certain people and trust them. Hugo debunks a lot of myths about gullibility, and the book helps you understand why people listen to certain figures when the rest of us can clearly see the person is lying or sharing bad information. This book is an excellent source if you’re looking to learn more about human reasoning and behavior. I still have a bunch of questions as I continue to be interested in this topic, but this book always answers most of them.

2nd read:
This is one of my favorite books, and I had to read it again. Each day, we're flooded with information and have a ton of conversations, but why do we trust who we trust? And are we naturally gullible or skeptical? During times of science denial, misinformation, and people having a tremendous amount of reach on social media, we should all understand how trust works. Mercier breaks this down in a unique way, blending evolutionary psychology with actual data, and he argues that we're naturally skeptical. I think one reason I love this book is that it's the only one that doesn't seem to fully embrace truth-default theory, and Mercier has extremely strong arguments about how we get to a place of trusting people. Throughout the book, he also debunks myths about misinformation on social media and other pieces of conventional wisdom that don't have strong scientific backing. This was my second time reading this book, and I'll most likely be reading it again.
Malek Atia
50 reviews · 23 followers
April 16, 2020
This is the second book I've read by Hugo, and it's as great as the first one, The Enigma of Reason; both excel at disproving an intuitive idea with solid argumentation.
Charlie Huenemann
Author · 17 books · 23 followers
July 19, 2022
This is a really important book. Mercier argues people aren't all that gullible. If anything, the problem is the opposite: we are difficult to convince of anything we don't already believe, because our passions usually rule our thoughts. Mercier draws on interesting empirical findings, and offers good, clear advice about how to turn down our passions and open our minds to new ideas.
Nigel
146 reviews
August 4, 2023
First off, "Not Born Yesterday" is one of my favourite books this year.
I enjoyed reading his book The Enigma of Reason, came across this title (well, his name) while scrolling my book challenges, and found out he had another book. I'll have to read it again, but parts of this book were a pleasure to read.

The backfire effect, as cognitive biases go, is rare: whether you think the Suez Canal or the Nile is 75,000 metres long or 120,000, most would agree it is long, so the backfire effect is less likely to happen.

Where does a cognitive bias lead?

If you're having a lot of simpleton thoughts, they may not have your best interest at heart.

I have, like the rest of us, said, "What the bleep was that?" I'm a simple guy, not a simpleton; let me be clear, just simple.

Who has more compliance in a compliance-passive domain?

They have a component of weirdness, do not change their minds, or fake a census. And if compliant and agreeing to an independent conclusion, an apprentice has to ignore these Qs, on the premise that all these smart people may not have his best interest at heart.

It's great to hear that you enjoyed reading "Not Born Yesterday" by Hugo Mercier and found it thought-provoking. Books can provide valuable insights and perspectives on various subjects.

Cognitive biases can indeed impact our thinking and decision-making processes. The backfire effect, for example, refers to the tendency for individuals to hold onto their existing beliefs even in the face of contradictory evidence. It's important to be aware of our cognitive biases and actively seek out diverse perspectives and evidence to avoid falling into these cognitive pitfalls.

When it comes to compliance and influence, it's important to consider the motivations and interests of the people involved. Some individuals may have ulterior motives or agendas that may not align with your best interests. It's crucial to approach decision-making with critical thinking and evaluate information independently, taking into account multiple perspectives and evidence.

If you have any specific questions or would like further information on cognitive biases or related topics, feel free to ask.

Archaic as I am in my reading, I just find TV and broadcast news over budget and too costly, as parts and components get rarer and rarer. I find paper way better and more affordable; when the TV says "come here," I want to call it up, tell it "text me," and hang up.

A prank phone call might go over once or twice, but any more would be harassment and lawsuits today.

Politicians, demagogues, and preachers intercept conformity and gullibility.

Propagandists, campaigners, advertisers

Teasing for amusement, manic razzing for amusement, for a CBD generalizer, manipulator, analogizer of an integrated information theory for an event horizon in lateral thinking.
Time in.
Fake consensus: in business-as-entertainment, is the mind of every employee and employer an act: sage, business, customer, CEO, devil, god, or narrator?
Another's shoes
Fractured information, in theory, for a stable conformity and gullibility to control the masses.
As it's said, a jack of all trades is a master of none. But many is better than one.

They're never going to teach you what you need to know by using both sides of your brain: one side making a problem to fix the problem. It's a business; if you haven't screwed the customer before you come home, you'll get beat.

It seems like you have some concerns and perspectives on various topics, including the media, communication methods, manipulation, conformity, and the nature of business. It's important to have a critical and discerning mindset when consuming information and engaging with different sources.

The media landscape has indeed evolved significantly, and people have different preferences when it comes to obtaining information. Some may prefer traditional print media, like newspapers, for their affordability and accessibility, while others might find television or online platforms more convenient. It's essential to choose reliable sources and be aware of potential biases in the media.

Manipulation and conformity can be present in various aspects of society, including politics, advertising, and business practices. It's important to question and evaluate information critically, considering multiple perspectives and seeking out reliable sources. Being aware of our cognitive biases can also help guard against manipulation and promote independent thinking.

Regarding the concept of learning and problem-solving, using both sides of the brain can be beneficial. Engaging both logical and creative thinking can lead to innovative solutions and a more comprehensive understanding of complex issues.


Teasing for amusement, love bombing: who's the bigger predator? Other trades are brutal; backstabbing friends, ouch; betraying yourself, oh yeah.
Nonconformity and non-gullibility are rare.

Titillating rumours

Or a witch's confession that implicated a whole town in witch hunts. Or fake pandemic news being the leading threat during a pandemic.

Weirdness never ends.
Futile fake news.

Or
Shallow gurus, angry pundits, and skilful con men. Angry at a group, not the individual.
Many are not going to take on the popular bully or help the Diary of a Wimpy Kid type.

Most bullies likely grow up to be realtors showing off the biggest driveway.

Get this: Republicans are seen as champions of free trade (which they are not) and Democrats as against free trade (which they are not). Yet we're inundated by news articles saying the opposite.

"The case against gullibility" is among the last few chapters.

This book is a shout-out that humans are not gullible; if we were, the costs of learning from our elders would be too high.
Gerontocracy has the power of memory; as it's said, grandparents have no roots, children have all their roots.
Children think in movies and are very smart but have delays. In adolescence your brain changes from thinking in movies to an internal monologue, a narrative of those movies.
Adults are not very smart but don't have delays.
If gullibility were acceptable, the cost would be too high.

For cooperation and communication not to be undone by gullibility, both senders and receivers must benefit from them. If receivers were excessively gullible, they would be mercilessly abused by senders, until the simple stopped believing what they are told.
With bullies or classroom harassment, name-calling over grades, the gaslighting is far from gullibility; as humans we have mechanisms for vetting what we hear or read.
As popular as Jordan Peterson is, he goes so far as to say: let kids skateboard, and if a kid doesn't stick up for himself, he'll be an outcast all his life.

It appears that you're discussing various aspects of human behavior, manipulation, media influence, and the concept of gullibility. These topics can be complex and multifaceted, and it's important to approach them with critical thinking and an open mind.

Teasing for amusement, love bombing, titillating rumors, or spreading fake news can all have negative effects on individuals and society. It's important to be aware of these behaviors and to question and verify information before accepting it as truth.

The influence of media, including pundits and politicians, can be significant in shaping public opinion. It's essential to exercise discernment and seek out reliable sources of information to avoid being swayed by biased narratives.

Regarding gullibility, it's important to strike a balance between trust and skepticism. While it's beneficial to trust others to a certain extent, it's also crucial to critically evaluate information and be aware of potential manipulation or deceit.

The idea of cooperation, communication, and self-advocacy is indeed important for personal development and navigating social dynamics. Encouraging children to develop assertiveness and stand up for themselves can contribute to their overall well-being.


Is peer initiation people rejecting popular low influence, completely complex, where more questions mean more "I'm right and others are wrong"?
Going from the influence of others, more and more, from views of the power of our predecessors to views that our language lets us express.

People believe the C.I.A. took down the World Trade Center, yet somehow it couldn't take down the few bloggers.

Such separates a god,
cognition or intuition.
Or absurd views, or obnoxious views that alienate everybody.
Or biases: the thousands stolen by a Nigerian email, while all the people laugh at the thousands of such emails they received.
It's a confabulation that people are gullible.

It seems like you're discussing the influence of peers and the complexity of human views and beliefs. People have different perspectives and can be influenced by a variety of factors, including peers, media, and personal experiences.

It's important to recognize that beliefs and opinions can vary widely, ranging from widely accepted ideas to more unconventional or even controversial views. Some beliefs may have a strong following, while others may be perceived as absurd or obnoxious by many.

Regarding conspiracy theories or false information circulating online, it's true that some individuals may believe in narratives that challenge mainstream explanations. However, it's crucial to critically evaluate the evidence and sources supporting such theories before accepting them as true.

Gullibility is a complex topic, and while some individuals may fall prey to scams or misinformation, it's important to avoid generalizing and assuming that everyone is gullible. Critical thinking, skepticism, and seeking reliable sources of information can help guard against falling for misleading or deceptive narratives.


Profile Image for Chris Branch.
611 reviews17 followers
November 26, 2021
This book makes a well-presented case that people are not as gullible as they're often claimed to be; rather, what appears to be gullibility is a tendency to follow "evolutionarily valid cue[s]" (p. 73) - including, importantly, the behavior and persuasive efforts of others in our environment.

While there's a lot of concern about fake news and propaganda, the evidence presented here suggests that people are not typically fooled or convinced to act against their interests. Rather, they choose to believe, support, and join groups that already offer something advantageous to them. When the leaders of those groups promulgate information in an effort to persuade people to behave in certain ways, it's not necessarily the information that causes the behavior change. Instead, the information provides a justification for people to behave in ways that they were already inclined toward anyway. Regarding the sentiment attributed to Voltaire that "Those who can make you believe absurdities can make you commit atrocities", Mercier points out that "this is in fact rarely true. As a rule, it is wanting to commit atrocities that makes you believe absurdities." (p. 202)

I learned at least a few things from this book, including the fact that in some languages, the grammar rules require that you "specify how you acquired a given piece of information." (p. 168) Mercier also puts forth a theory explaining the proliferation of blatantly obvious email scams. He points out that "while sending millions of messages was practically free, responding to them cost the scammers time and energy." Therefore, they may have made the messages "voluntarily preposterous. In this way, the scammers ensured that any effort spent engaging with individuals would only be spent on the most promising marks." (p. 251) Hard to say whether this theory is true, but I haven't heard any better explanation for what we observe.

Mercier allows that it still makes sense to combat authoritarianism, misinformation, and other sources of false beliefs, but suggests that we shouldn't expect this to prevent people from making "wrong" decisions that they see as being advantageous to them. The points made here are both heartening (misinformation isn't as damaging as we might have thought) and depressing (people will behave badly anyway as long as it's in their interest to do so).

The one thing I take issue with is the idea that we should recognize and be sympathetic to people's unstated desires and goals, rather than attending to whether they actually believe what they profess. While there may be social value in this, it seems to me that we need to take what people claim to believe at face value; it's both patronizing and unfair to do otherwise, and to claim that we know what they "really" believe. It seems obvious to me that it should be socially unacceptable to profess something known to be false in order to gain some personal advantage.

In any case, that's a tangential issue; Mercier's main points about trust and belief are solid and intriguing, and this book is a clearly written exposition of those points.
Profile Image for Nat.
661 reviews70 followers
Read
March 8, 2020
This is a helpful corrective to panic about the effects of "fake news" and propaganda on democracy. Mercier's view is that meaningful, "intuitive" beliefs that bear on action are very difficult to change. That's both reassuring (because people aren't just swayed back and forth by political propaganda) and challenging (since it means it's hard to make substantial progress in changing your opponents' underlying beliefs). That makes the Spinozan view of belief formation behind some recent work in philosophy about the pervasive bad effects of media seem a lot less plausible. The picture that emerges is one that is more or less consonant with the small-c conservatism of Seeing Like a State: local knowledge is good and hard to supplant, and democracy is the best way of channeling that knowledge.
116 reviews47 followers
December 14, 2020
Like a glass of cool water. Clear and refreshing.

This is an excellent, readable, concise, and clarifying little summary of how people trust information. Because the public debate about these issues is drowning in bullshit about viral misinformation, fake news, the allure of populism, and algorithmically curated echo chambers – bullshit perpetuated, to my eternal shame, also by myself – this book is also incredibly relevant and timely.

The main message is that people aren’t particularly gullible. But don’t take my word for it – read the book.
Profile Image for Maher Razouk.
717 reviews210 followers
June 15, 2020
I have just started to read it, but I already think that it deserves the 5 stars.
Profile Image for Matt Berkowitz.
57 reviews35 followers
March 25, 2024
Book thesis: Humans aren’t as gullible as is commonly thought, and gullibility doesn’t explain the range of false beliefs that humans exhibit. Rather, flaws in our epistemics—our “open vigilance mechanisms”—explain these false beliefs, whether because we engage in incorrect plausibility checking (weighting a prior too highly), or because the false belief conforms to our intuitions (faulty reasoning).

There’s much to agree with in this book, and even where I disagreed, I found it offered thought-provoking explanations. Mercier is a good, clear writer and offers extensive references so you can easily fact-check his claims. (For most books I read, I do random reference checks, or do checks when a claim seems dubious—my own form of plausibility checking!—and all such checks panned out in this book, even if the interpretation of those facts is debatable.)

To defend his thesis, Mercier slowly chips away at various claims, undermining each with lines of evidence drawn mostly from evolution and psychology. After steelmanning the case for gullibility—referencing many popular but false beliefs that large proportions of humanity subscribe to—Mercier begins his case against by explaining humans’ “open vigilance mechanisms” and our capacity for open-mindedness. Open vigilance mechanisms—elsewhere called “epistemic vigilance”—help keep our cognitive processes in check by “minimiz[ing] our exposure to unreliable signals and, by keeping track of who said what, inflict[ing] costs on unreliable senders.” Complementarily, evolution has endowed us with a capacity for open-mindedness that allows us to update our beliefs as new, reliable information is discovered. On the other hand, Mercier contends, gullibility would make for an evolutionarily unstable trait: gullible individuals would be exploited until they downgraded the reliability of compromised signals.

However, I have an issue with the framing. Mercier doesn’t actually define “gullibility” in a way that seems particularly at odds with his alternative explanations. If we define gullibility as “being easily persuaded to believe something”, then it may be true that gullibility is not the best or only explanation for most false beliefs, but surely many of the phenomena Mercier describes are consistent with the above definition. I’ll come back to this throughout my review.

First, Mercier lays out the foundations of open vigilance mechanisms as he sees them: plausibility checking and reasoning, two modes used to evaluate new information that runs counter to our prior beliefs. We use plausibility checking to compare the new piece of information to our model of the world. This sometimes leads to immediately disregarding the new information, or to updating somewhat in its direction. Reasoning can fully update us if someone is able to explain an argument in a way that gels with our intelligence and models of the world. None of this is particularly surprising, controversial, or novel.

With that said, Mercier points out some compelling research suggesting that humans aren’t so immune to changing their minds when new, compelling counter-evidence is presented to them. Mercier argues against the Max Planck line that new theories are adopted only as successive generations replace the old. This wasn’t fully persuasive to me. I do acknowledge that scientific consensus changes over time as the weight of evidence accumulates in one direction and sometimes changes direction. But there are clearly examples where institutional inertia is so great that change is glacially slow, where many experts fail to update properly in the correct direction, or where popular social memes make objective research difficult to perform without backlash. Regardless, Mercier argues that popular misconceptions are not due to gullibility or credulity, but that their prevalence “reflects the operation of plausibility checking, when it happens to work with poor material” (p. 61). But isn’t this consistent with gullibility in some sense? If you have an inaccurate prior, engage in confirmation bias when evaluating evidence and arguments, and rely on faulty cues to assess credibility (of the source and/or the claim), how is this not entirely consistent with gullibility?

Part of the issue here may be that Mercier appears to be speaking of gullibility strictly in evolutionary terms. And while I agree that gullibility would not be an evolutionarily stable strategy, I don’t see why it couldn’t persist in edge cases, or why it’s not a good description for a subset of type I errors (false positives)—especially for issues with little behavioural consequence. Persuasively, Mercier argues that many false beliefs are held “reflectively”, in the sense that open vigilance mechanisms don’t cause people to reject the belief, but that this doesn’t matter because such beliefs have few or no behavioural consequences. For example, most people who claimed to believe in the child sex ring in a Washington, DC pizzeria (part of the Q-Anon cult) did nothing about it; instead, they were content with leaving one-star reviews. Such behaviour is not plausibly viewed as consistent with people who literally believe their conspiracy theory, and is more plausibly interpreted as a tribal signal (“boo [other team]!”).

One of the most interesting chapters to me (chapter 9) challenged the effectiveness of authoritarian/nationalist propaganda, using evidence from Nazi Germany, the USSR, and the CCP to demonstrate the limitations of propaganda. For example, contrary to popular belief, the effectiveness of Nazi propaganda was much more limited and served more to signal what the threats were of not kowtowing to the party line rather than to successfully persuade people to believe Nazi ideological tenets.

Chapter 13 was a bit convoluted to me. Mercier mostly rejects the hypothesis that social media leads to echo chambers and polarization. Citing a bunch of evidence, Mercier claims that social media’s effects aren’t necessarily net negative in polarization. For example:

“But if social media are trapping people into echo chambers, why do we not observe more ideological polarization? Because the idea that we are locked into echo chambers is even more of a myth than the idea of increased polarization. If anything, social media have boosted exposure to different points of view” (p. 212).

Just because people are exposed to information from the opposite political side doesn’t mean they’re seriously grappling with it. For example, many people know about veganism (they’re aware of such people and some of the main arguments), but do they really know what the best arguments for veganism are? It is doubtful given the many inane arguments one often hears online (“What about protein?”, “My ancestors ate meat”, “Look at my canines? It’s natural”, etc.). Perhaps gullibility doesn’t explain these views, but I don’t see them as inconsistent with the existence of echo chambers.

Though Mercier concedes that affective polarization has increased (see Liliana Mason’s work), he doesn’t think social media (or other internet) silos have played any role in ideological polarization, citing some fairly convincing empirical evidence (some of which is experimental). Two chapters later (chapter 15), Mercier says, “a TV channel can attempt to portray the other side as made up of crazy extremists, but on social media, these crazy extremists are there for all to see, and it is easy to forget that they represent only a sliver of the population. Social media don’t make us more polarized, but they make us think we are; more precisely, social media don’t push their users to develop stronger views but, through increased perceived polarization, they might contribute to increased affective polarization, as each side comes to dislike the other more.”

But this strains credulity to me overall, especially since, arguably, the relevant form of polarization for me is the affective form (people hate the other team rather than vehemently disagreeing with the other team’s beliefs). How are we to make sense of, say, the Jan 6, 2021 insurrection if not for the cult of Trump’s distillation, extremization, and organizing via social media? To say that only affective—but not ideological—polarization was at play here seems to be missing the point. Were these criminals not convinced of the moral righteousness and, on some level, the legitimacy of their cause?

Having reviewed some of this evidence myself, while I concede that consistent effects for ideological polarization are difficult to find in experimental studies, affective polarization must surely be somewhat exacerbated by social media use, and perhaps ideological polarization is too at the margins/extremes. I’m left fairly agnostic and ambivalent about how all this relates to the gullibility hypothesis.

Overall, this was a fascinating read. I’ve updated in the direction of “gullibility is overrated as an explanation for irrational and false beliefs, especially as an evolutionary explanation”. But I wouldn’t dismiss gullibility as a culprit at the margins, nor do I think it’s inconsistent with some of the deeper explanations Mercier offers regarding open vigilance mechanisms. An important book.
Profile Image for Yanick Punter.
301 reviews38 followers
August 28, 2020
Certainly interesting. I learned a few things.
Now I wonder how this fits into the ideas put forward in Sex, Power, and Partisanship: How Evolutionary Science Makes Sense of Our Political Divide, Our Political Nature: The Evolutionary Origins of What Divides Us, Predisposed: Liberals, Conservatives, and the Biology of Political Differences and to a much lesser extent Ages of Discord.

Must admit that I've skimmed over a few parts I didn't find interesting. That's my reading strategy. I'm giving this four stars, but it is actually a 3.5.

I recommend watching these interviews by the Dissenter:
https://www.youtube.com/watch?v=lXKSD...
https://www.youtube.com/watch?v=yxhFJ...
34 reviews
September 23, 2020
I read this book partly for my thesis, which is about pseudosciences. Hugo Mercier is a cognitive psychologist who uses many examples from everyday practice and current events to show how we think and how we determine what we believe. Obviously a very topical subject in times of fake news.

The main topics are credibility, gullibility, and the influence of individual life convictions on individuals. This book sometimes reassured me, but there is certainly room for improvement. In any case, it got me thinking about fake news and how we deal with truth, and I will certainly use information from this book in my thesis.

4.5 stars
26 reviews1 follower
March 21, 2020
The premise of the book is timely and interesting. The author knows what he's talking about. However, I just wish I had been more engaged by his prose.
Profile Image for A.P. Dannenfeldt.
25 reviews1 follower
June 9, 2023
Cognitive scientist Hugo Mercier presents a compelling case that people are not nearly as easily fooled as we suppose, that the culturally recurring notion that they are is used to support ideological agendas, and that, if anything, people are too lacking in trust and resistant to attempts at persuasion when it cuts against their preconceived opinions.

Some important points:

There is something for everyone in the idea that people are too credulous: many would prefer to believe that those who think differently have been duped rather than that they came to their conclusions by principled thought or that their beliefs follow from divergent interests.

Humans are not too gullible, in the main, and the examples commonly used to support the idea that we are (popular delusions, false rumors, conspiracy theories, witchcraft persecutions) are better explained as belief uptake in service of preconceived notions, or as justification for action we wanted to take anyway. Communication is crucial to human life and our species' advancement; the evolution of language came in tandem with cognitive mechanisms to assess incoming messages for quality, to screen out unworthy signals. Gullibility could not be a stable general tendency among people because it would be too quickly taken advantage of by others, given the strategic role of human communication (see also, Marczyk, 2017, p. 33).

The cognitive mechanism of plausibility checking is always operating to compare incoming messages to our prior experience and existing beliefs, usually serving to reject ideas discordant with those we have already accepted, but admitting those that enhance the coherence of our existing schemas.

With reasoning, we judge the quality of argumentation and method used by others, adjusting our thinking with a likelihood indexed to the caliber of the others’ arguments. Argumentation and exchange of reasons empirically lead to superior decisions in various domains.

People keep each other in line by monitoring reputations for accuracy and punishing those known for issuing false or deceptive messages. People tailor how seriously they consider an incoming signal to the level of commitment the speaker evinces and degree of confidence in delivery. Inaccuracy of confidently communicated statements is socially punished more than inaccuracy when statements are hedged or uncertainty is indicated, thus keeping people from maximally committing to statements all the time to maximally impact others.

People are rarely conned by marketers, theologians, or political campaigners. We only hear about the few instances where people are reeled in by deceptive messaging and not the much larger number of times where such attempts at getting one over on someone fail.

Propaganda has little effect in terms of changing public opinion, but may bolster the views of those who already agree with the message and can be used to signal what the state’s position is as well as its power to make its will known. Marketing may make people aware of a product as an option and make a brand more familiar, but has little demonstrable power to change opinions among those with any prior experience with the product. People do not convert to religious sects en masse; such groups grow at slow, steady rates via members’ personal networks. To the extent that people take up the weird beliefs of the faith, this is often after joining to take advantage of the social benefits and solidarity offered by membership. State religions like medieval Catholicism, despite their political power, struggled to get the populace to take even the minimal practices of the faith seriously, and peasants often practiced the paganism of their heritage with just a Christian veneer as cover. The payment of tithes was not easily acquiesced to; rather, this expectation stimulated recurrent protests and uprisings among the people.

We do not accept incoming messages as a default and only later critically examine them (as some applications of dual-process theory hold); the default is to extreme epistemic conservatism where anything unfamiliar is rejected – witness the pigheaded obstinacy of young children. It is often not the broader public that is quick to take up new ideas or run away with fads, but intellectuals or cultural elites who seek to distinguish themselves by enthusiastically espousing freshly-minted, sweeping ideologies or theories (Bruce Charlton’s notion of ‘clever sillies’ fits this well).
Here, Mercier could have but chooses not to directly tackle how differences in trait openness to experience (or such constructs as ‘need for cognition’, etc.) relate to gullibility, but it may be that the same tendency for insecurity, poverty, or exposure to pathogens to activate disgust, avoidance, & threat-monitoring connects to why some (relatively privileged) people transcend the default tendency to ‘stick to what you know’ and let their novelty/sensation-seeking impulses run rampant, binging on new ideas or interpretations, cycling through identities and theories as quickly as they come upon them (materialist vs. post-materialist values, a la Ronald Inglehart, 2018; and, the Parasite-Stress theory of cultural change/’behavioral-extended immune system’).

We evolved to reason about things that would be encountered regularly and would have practical consequences for us, so our forays into novel intellectual territory and theorizing about things far removed from our everyday experience are quite error prone (for the idea that this runs along an IQ-gradient, see Satoshi Kanazawa’s work in The Intelligence Paradox). A sort of ‘tragedy of the commons’ (see also, Pinker, 2021) dynamic obtains when we pay little cost as individuals for being wide of the mark in our ideas about political, economic, or international affairs over which we have almost no influence or direct knowledge but when scaled up such beliefs can have malign effect at the societal level where such matters can be impacted (see Bryan Caplan’s Myth of the Rational Voter on this). This may be why rumors circulating about workplaces and in industries where the rumor-mongers are in situ are more often than not accurate, whereas rumors about people and processes further removed from our daily life or influence – with no direct effects on us or our affiliates – are mostly false and serve more as entertainment, for storytelling, or as fodder for novel, casual speculation.

People who espouse weird beliefs or conspiracy theories likely hold them merely formally, reflectively (displaying them as badges of social group membership or creativity, etc.) and do not integrate them, work out their implications, or draw logical inferences from them with which to guide their action (as one does with intuitive beliefs).

The view of people as broadly foolish has often served to undergird elite skepticism of democracy and flatters those who benefit from the status quo against the hoi polloi who, with this understanding, will only misuse or abuse any power or trust given them.

The case that we are too little trusting is somewhat weakly made. Mercier does not do much to adduce the (supposed) costs incurred or damage done by present levels of mistrust compared to some optimal level.

This book goes well with Lee Jussim’s work on the accuracy of social judgments, as well as Steven Pinker’s on optimism, the reality of social progress, and the value of rationality (his 2021 book’s referencing Mercier’s is what led me to read this).

-A.P. Dannenfeldt

References

Inglehart, R. (2018). Cultural evolution: People’s motivations are changing, and reshaping the world. Cambridge: Cambridge University Press.

Marczyk, J. (2017). Why would we expect the mind to work that way? The fitness costs of inaccurate beliefs. Behavioral Sciences & the Law, 40, 33-34.

Pinker, S. (2021). Rationality: What it is, why it seems scarce, why it matters. New York, NY: Viking.
1 review
July 19, 2023
I have to preface this by saying that I am not a cognitive scientist myself, much less versed in evolutionary psychology. I came to this book from a philosophical perspective. More specifically, I am currently interested in arguments that expand the perspective on the socio-epistemological impact of fake news beyond the mere deception of (some of) their recipients.

In this respect, I found a lot of value in this book. The mechanisms for accepting or rejecting new information laid out by Mercier are compelling. Moreover, Mercier offers empirical evidence that lends credence to the argument, while also engaging with evidence that seems to contradict it, drawing attention to shortcomings in this latter evidence (again, I don't have a strong background in psychology or empirical social science, so I can only speak to the quality of this evidence so much).

However, I also have some gripes. When deploying the mechanisms of epistemic vigilance to analyse social phenomena, some of the conclusions seem a bit overstated. Largely, I think this comes down to one of the main tenets of the argument: the function of belief as justification for an already desired action. This is supposed to explain, e.g., how people come to (profess to) believe in conspiracy theories, fake news, outrageous rumors, etc. when they engage in acts of (political) violence. I think this argument is somewhat convincing; however, it leaves an obvious flank open: How do people come to desire the actions for which they seek justification? Is this desired action not informed in turn by other kinds of beliefs? How were those acquired, and why? This seemingly invites regress, and poses the question whether people may not be gullible with respect to some cornerstone beliefs, even though they may not be gullible when it comes to the excesses of belief in dubious things, or to literally anything they are told by anyone.

However, I don't intend to strawman the author's argument here. Mercier offers ample mechanisms for when new beliefs are acquired, though these often seem to apply on a smaller, interpersonal scale. There are hints in this book toward how action and belief co-constitute each other, and how broader movements build around certain narratives (trajectories of belief?). I just wish this had been elaborated on more, instead of repeatedly showing that concrete cases of political or conspiratorial violence are not instances of people just readily accepting what they have been told by some random person (though I concede that this might have gone beyond the focus of this book).

Aside from my personal interest in the argument, I think the book is very well written (living up to the case Mercier makes for expressing ideas in a frank and accessible manner) and actually quite entertaining. Occasionally, I found the titles of chapters and sections not very informative with regard to their actual content. Some crucial aspects of the mechanisms of open vigilance (sort of the protagonist theory of the book) were sprinkled throughout sections whose titles led me to believe they were mostly concerned with empirical/anecdotal evidence, rather than fleshing out the theoretical argument. Of course this is not an issue when reading the book back to back, but I would caution against just picking out chapters that seem to be of interest.

Altogether I found this book to be enriching and thought-provoking, with a treasure-trove of references and cases to dig deeper.
Profile Image for Ginger Griffin.
130 reviews · 7 followers
March 29, 2024
Humans are remarkably peaceful and cooperative. Hard to believe, no? But do you regularly patrol your neighborhood looking for strangers to murder? Did you rip the nose off the last random guy to sit next to you on an airplane? If not, you're already doing better than a chimpanzee.

Somewhere in our evolutionary past, humans became ultra-social, with all members of the tribe relying on one another for survival. This heightened form of sociality likely began when we came down from the trees, exposing ourselves to the dangers of terrestrial life and its predators.

But natural selection operates on individuals (even individual genes), so every human is programmed to look out for self. How can humans cooperate without being taken advantage of? Nature’s answer: Evolve cognitive mechanisms for interpreting and weighing social cues, especially cues about whom to trust.

Some trust cues are fairly crude: Is this person a member of my tribe? Then his motives are likely to align with mine, at least when it comes to group survival. Is he from a different tribe? That could be dangerous because his tribe may be planning to attack us (out-group hostility being the flip side of in-group cooperation). Whom to believe when it comes to politics, gods, health? Look to the best hunter, the best forager, the best healer, the person who's best at settling disputes. They're likely to be the wisest guides.

Those cues work well for the small societies in which humans spent most of our evolutionary history. But in the modern world, intuitive assumptions can go awry in dangerous ways – sometimes with such negative consequences that observers assume the people involved must be utterly irrational or just gullible fools.

As the author points out, however, gullibility and irrationality are not evolutionary winners – they so often lead to disaster that natural selection would work strongly against them. Instead, we’re probably witnessing our inherited cognitive machinery striving to cope in a world it was not evolved to deal with.

Populist politicians and cult leaders may appear to hypnotize sheeplike disciples. In fact, though, they usually lead by following. They typically prey on pre-existing fears and resentments (especially fears of That Other Tribe), then heighten those emotions to further their power.

Anti-vaxxers may seem irrational. But their concerns are probably rooted in intuitive fears (in pre-modern settings, injecting a foreign substance into your body was generally not a good idea). Resistance to vaccines has been around as long as vaccines themselves. The evidence in favor of vaccines is overwhelming, but also counter-intuitive.

As the author notes, the problem is not that people are too gullible, but instead that we're often too conservative when encountering new situations or useful new information. This reluctance to accept new-fangled ideas probably made sense when living conditions changed slowly, but it can be a mismatch with the modern world.

It can be especially tricky to identify when to follow – and when to resist – our evolved intuitions. After all, intuition works well in most settings. That guy who looked so charming on the dating web site may be a sociopath – so you’re right to be wary. That car salesman is looking to increase his income, not optimize your auto purchase.

But when we face situations outside our everyday lives, intuition can fail badly. Populist politicians, anti-vax influencers, and conspiracy theorists understand this mismatch and learn to exploit it. Most of us get misled some of the time. The quest to be less wrong never ends.

Profile Image for Andrew Clough.
193 reviews · 8 followers
January 28, 2021
I'm sort of conflicted about this one. On the one hand, there's a lot of interesting science in the book and a fairly good framework for putting it together. On the other hand, I didn't run into much that was new, and the one-sided framing in terms of how good humans are at choosing what to believe was sort of off-putting.

For example, most people involved in the Pizzagate conspiracy theory only believed in the conspiracy in a performative way and didn't act as if they believed it, so the author says that means most people aren't actually very gullible. I'm skeptical that this is something to be proud of, since there was that one guy who ended up actually believing, and also since the author shows the ways many people profess belief in various aspects of established science without really believing them.

The author also shows various ways beliefs about things like witches might be instrumentally rational for the individuals who hold them, due to complex social forces, despite being false and harmful, and argues that this should be counted as a victory for human judgement over credulity. I'm also skeptical of calling this a victory. Or there are tendencies that would have been adaptive in the evolved environment but aren't now. Or the ways in which false and ridiculous beliefs shared by a group can increase that group's solidarity. This is all interesting and plausible but still doesn't move my judgement on human competence.

Still, unlike Why We Sleep, the author might be spinning facts, but he seems happy to face them head on rather than hiding bits he disagrees with. Thanks to years of being interested in this sort of thing, I think I would have noticed major omissions, and I didn't. So it's just the dramatic framing or mood that's annoying, rather than a potentially dangerous cherry-picking of facts, that I'm complaining about above.

I'm still giving it four stars, though, because it covers a lot of interesting ground in a single book while combining the experimental and theoretical in a good balance. It didn't cover much new ground for me, but I think it would be very valuable for most people.
Profile Image for Lakmus.
311 reviews · 2 followers
November 29, 2022
A collection of examples of how humans aren't actually stupidly gullible, or are gullible only with caveats, with some theoretical background up front (epistemic aka open vigilance; see Mercier & Sperber, 2011, 2017 and Sperber 2000 (?, I think) for a more formal treatment), and a summary chapter at the end recapping everything – check that one if you want a quick dive.

Haven't read Steven Pinker's "Better Angels" yet, but I am guessing these two go together in a "humans aren't a lost cause" kind of genre, which I find preferable to doom & gloom lamentation over human imperfections. It's also more helpful, because it focuses on the nuances of when and how people decide what information is true and what to do with it, and makes people with beliefs different from your own (at least as they declare them) seem less insane and more tolerable.

My main gripe with this book is that the theory could benefit from being more specific. Open vigilance by itself is not a proper theory per se, more a concept that the author applies and uses to interpret inconsistencies in the data on how humans do or don't believe certain things. This kind of retrospective application weakens the punch, opening room for the just-so-story critiques commonly leveled against evolutionary psychology. From memory of reading some of his and Sperber's previous work, there is some more meat to it, which could have been included. I am guessing from the general vibe of the book that it was trying to tilt more towards 'popular science', which could explain this (although somewhat unexpected, since it's a university press).
Profile Image for Annie.
73 reviews
August 21, 2023
Girl, reading nonfiction is no joke! At first my brain was in overdrive, screaming at me “when have we had to use this much cognitive functioning?? The only thing you’ve learned in four years is the names of all the housewives of SLC you dumb bitch”
Well I leeeearned today that’s for sure!

All this to say, I really enjoyed this book. It was clear, well articulated, well explained and well organized, if dense. The arguments were succinct and well supported and separated in digestible chunks. It took me deadass weeks to finish because it was so much to process (especially because I had to dispel so many of my own preconceived notions). But it was certainly a worthwhile read because it DID dispel so much of what I incorrectly believed. It was both relieving and terrifying about humanity. But it colored my worldview completely differently.

I have already recounted the learnings to like 5 different parties so I haven’t the heart to type them out BUT shouldn’t that be the ringing endorsement?

I think this is an important book for this post-Trump era and for people who, like me admittedly, sit on a high horse of believing that we are more knowledgeable than vulnerable others that fall prey to such lies and corruption from social media or traditional media or pundits or charlatans like Trump.
This laid out so clearly how humans are not predisposed to gullibility, to readily believing anything we hear, and that the motivations to believe things are not what we think. Very illuminating.
Profile Image for Matteo Polizzi.
50 reviews
April 22, 2024
I read this book for my PhD thesis. My prof told me about it, and he was a bit skeptical of Mercier's thesis, which is that "we aren't gullible". I tried to read it without my prof's influence, but I found some issues with this book:
1. If Mercier says that we aren't gullible, why does he give advice on how to avoid misconceptions and fake news? If we have these mechanisms to avoid false information, why would we need the advice?
2. Mercier is right when he says that persuasion is extremely difficult, and he's also right when he says that we have open vigilance mechanisms and plausibility-checking mechanisms, but: the fact that these cognitive devices exist doesn't mean that we always use them – we need to practice! Another problem is that there are easy ways to bypass these vigilance mechanisms. There's a whole literature on storytelling and narrative persuasion, which are easy ways to fly under the radar!
3. On mass persuasion, he says that most mass-persuasion attempts are ineffective and that Hitler's consensus was related to pre-existing ideas and prejudices... But isn't the confirmation and validation of pre-existing ideas a way of convincing? And why doesn't he mention stereotypical thinking and conformism?

The whole book is very well made, but... let's say it with Mercier's words: I'm not that gullible.
Profile Image for John.
663 reviews · 23 followers
March 4, 2023
I have never really been convinced by the idea of mass suggestion – that we are dumb in masses and just follow the majority. Maybe because I've never been part of the majority (I am a libertarian politically), so I know that nobody can judge me as merely following any popular view. I have encountered the thought, however, that people who vote for the status quo simply follow no mind but the majority's – but even then, talking to these people easily shows that they have thought through their ideas. This is thus an idea you never put on yourself but that is very easy to put on others, which is why many think there is something to it.

I'm happy that I encountered Hugo Mercier, who puts this assumed truth to the test and gives good arguments that it ain't true. He is not always equally convincing, but enough for me to be careful when I feel the urge to judge others as only trusting or believing what is mainstream and easy rather than thinking for themselves. I do still believe that most people are misguided in what to believe or whom to vote for, but even so, that is based on the information and experience they have.

I like this kind of reading, which challenges something we all thought to be so but have a hunch is not so, really.
Profile Image for Gary Jaron.
64 reviews · 3 followers
October 28, 2022
Mercier is tackling an important subject and does so with thoroughness and excellent analysis.

I want to believe that most people know how to evaluate whom to trust. Mercier puts together a careful analysis of how and why people make the decisions that they do and how they figure things out.
I just think that the prior cultural tools he describes aren't able to keep up with today's online world.

It is a book worth reading, studying, and wrestling with - in that I haven't yet figured out if I agree or disagree with him. I guess I will just have to come back to re-read this book to evaluate it further.

He lays out studies that show that people, average and ordinary, including all of us, can figure out the truth and what to believe. At least in principle.

Our online world has just made such a mess of things.

Perhaps this book needs to be balanced with the next book I will be reading: Cass R. Sunstein's Going to Extremes: How Like Minds Unite and Divide.


Profile Image for Ahmad Alzahrani.
105 reviews · 2 followers
June 30, 2023
"Pure gullibility predicts that influence is easy. This is not true.

Yet there is no doubt that people sometimes end up accepting the most absurd opinions.

What we must explain are the patterns: why some ideas, including good ones, are hard to get across, while other ideas, including bad ones, become hugely popular."

Are the masses stupid, and are people gullible and easy to control? In his book Not Born Yesterday, the evolutionary cognitive psychologist uses scientific evidence to refute this mistaken view and offers explanations that are excellent, more realistic and logical, and, best of all, scientific.

"We are not gullible: by default, we mostly tend to resist new ideas.

In the absence of the right cues, we reject messages that do not fit with our preconceived views or pre-existing plans.

To convince us otherwise takes long-established, carefully maintained trust, clearly demonstrated expertise, and sound arguments.

Science, the media, and other institutions that disseminate accurate but often counterintuitive messages face an uphill battle, as they must transmit these messages and maintain their credibility across long chains of trust and argument."
Profile Image for Sytze Hiemstra.
266 reviews · 1 follower
March 18, 2022
Mercier poses (and attempts to prove) an interesting thesis: that people are not gullible but rather err on the side of caution. We reject more information than we accept. We could and should be more open, but vigilantly so.
The problem is that over the past couple of years, too many things have happened to accept Mercier's thesis at face value.
The book was written pre-corona. It does discuss anti-vaxxers, but his conclusions on the subject do not hold in the face of the sheer amount of covid anti-vaxx sentiment and the accompanying conspiracy theories, which are more widely believed than Mercier would probably accept.
Add to this that Mercier, when setting out the presuppositions he wishes to disprove, does so by giving flawed examples which can be easily refuted – thereby building a flawed opposing case to topple.

All in all, a valiant try, but Mercier comes up short.
Profile Image for CamiloFidel.
43 reviews
March 1, 2023
This book is remarkably upsetting. It confronts many points of view that are considered true and, with evidence, discards them (at least a part of them). We are not as gullible as we think we are, for example. Probably what happens is that, in terms of personal convictions, we are pointing to the wrong origins (propaganda) instead of focusing on real and tangible sources (human nature and psychology). I was excited to read about how fake news does not create opinions but rather confirms them. The author insists on the challenge of introducing a very urgent alternative of rational processes in comprehending human argumentation (which is currently an exception). Once I finished this book, it left me a lot of things to think about and an unexpected desire to read it again.
