
The Misinformation Age: How False Beliefs Spread

The social dynamics of “alternative facts”: why what you believe depends on who you know

Why should we care about having true beliefs? And why do demonstrably false beliefs persist and spread despite bad, even fatal, consequences for the people who hold them?

Philosophers of science Cailin O’Connor and James Weatherall argue that social factors, rather than individual psychology, are what’s essential to understanding the spread and persistence of false beliefs. It might seem that there’s an obvious reason that true beliefs matter: false beliefs will hurt you. But if that’s right, then why is it (apparently) irrelevant to many people whether they believe true things or not?

The Misinformation Age, written for a political era riven by “fake news,” “alternative facts,” and disputes over the validity of everything from climate change to the size of inauguration crowds, shows convincingly that what you believe depends on who you know. If social forces explain the persistence of false belief, we must understand how those forces work in order to fight misinformation effectively.

280 pages, Hardcover

Published December 11, 2018


About the author

Cailin O'Connor

5 books · 23 followers
Cailin O’Connor is assistant professor of logic and philosophy of science at the University of California, Irvine.


Community Reviews

5 stars: 242 (21%)
4 stars: 485 (43%)
3 stars: 290 (26%)
2 stars: 74 (6%)
1 star: 21 (1%)
Diane S ☔ · 4,901 reviews · 14.6k followers
April 26, 2020
I'm not going to discuss politics here, except to say that we have reached the pinnacle of false news, or so-called alternative facts. This book takes examples from the past as well as the present to illustrate how people, and even scientists, come to believe one thing against all others, and how a whole community of professionals can come to believe a fact that may be faulty.

A comprehensive study, with examples including the ozone layer; those who believe in climate change and those who don't; the vaccine debate and how people formed their opinions on that subject; acid rain; the question of whether stomach ulcers are caused by bacteria, and how that finding got waylaid for many years; and Russian interference in our elections, who believes what and why. It is interesting to see how false information is taken for truth by those who wish to embrace a certain side. Social media, the authors note, is of course where many of us now get our information, and that can go a long way toward explaining bias toward or against a certain issue.

Quite enlightening but also a little scary.

Atila Iamarino · 411 reviews · 4,512 followers
December 31, 2020
Cailin O’Connor and James Weatherall study information cascades: how information propagates through groups, and which group dynamics lead to the circulation of which kinds of information. In this book they discuss how misinformation arises and why it is practically impossible to eliminate. They show how any heterogeneous group, such as a group of specialists (doctors, scientists, etc.) containing people more or less averse to new ideas, or more or less influential, can produce polarized thinking and generate ideological camps that cling to wrong ideas almost inevitably. Those who doubt an idea often also doubt whoever proposes and tests that idea, creating an impasse with no solution.

They demonstrate, through group dynamics and real examples such as global warming and environmental disinformation, how this process happens. And one very uncomfortable conclusion, for which they offer no easy or simple way out, is that unless dangerous ideas are suppressed, they will arise, circulate, and pollute:

"One general lesson of this book is that we should stop thinking that the 'marketplace of ideas' can effectively separate fact from fiction. In 1919, Justice Oliver Wendell Holmes dissented from the Supreme Court's decision in Abrams v. United States to uphold the Sedition Act of 1918. The defendants had distributed leaflets denouncing U.S. attempts to interfere in the Russian Revolution. While the court upheld their sentences, Holmes responded that 'the ultimate good desired is better reached by free trade in ideas. . . . The best test of truth is the power of the thought to get itself accepted in the competition of the market.' Holmes's admirable goal was to protect freedom of speech, but the metaphor of a marketplace of ideas, as an analogue of the free market in economics, has been widely adopted.

Through discussion, one might imagine, the wheat will be separated from the chaff, and the public will end up adopting the best ideas and beliefs and discarding the rest. Unfortunately, this marketplace is a fiction, and a dangerous one. We do not want to limit freedom of speech, but we do want to argue emphatically that those in positions of power or influence should see their speech for what it is: an exercise of power, capable of doing real harm. It is irresponsible to advocate unsupported views, and doing so needs to be considered a moral wrong, not merely a harmless addition to some kind of ideal 'marketplace.' This is as true for scientists as for political and social leaders. It is worth recalling the propaganda models presented in Chapter 3. They showed that studies that mistakenly support false beliefs are essential tools for propagandists. That is not the scientists' fault, but on the (correct) assumption that industry interests are here to stay, it still falls to scientists to take every possible measure to prevent their work from being used to cause social harm."
Woman Reading · 471 reviews · 376 followers
June 1, 2021
We live in an age of misinformation - an age of spin, marketing, and downright lies.

The authors of The Misinformation Age are concerned about how false beliefs spread, persist, and are difficult to overcome. Fake news and propaganda have circulated for centuries. Modern methods of propaganda were developed by the U.S. Committee on Public Information (CPI) to wage psychological warfare during WWI. Technological advances such as the creation of social media platforms enable even faster and more efficient distribution of false information. When public opinion is influenced this way, a true democracy does not exist.

O'Connor and Weatherall are philosophers of science - interested in questions such as what qualifies as science, whether scientific theories are reliable, and what the ultimate purpose of science is. As such, they included many scientific issues to illustrate how social factors can disseminate knowledge, both true and false, throughout a scientific community, and then the ramifications of that information as it becomes known to policy makers and the public. These social factors include the structure of social networks and exclude things like individuals' intelligence levels or political affiliations.

We now accept that tobacco is a carcinogen; using it means playing Russian roulette with cancers of the lungs, mouth, and throat. The medical and scientific communities knew this long before the public because lung cancer deaths had increased by a factor of 10+ from 1920 to 1948. Indeed, a CPI veteran, Edward Bernays, was hired by a tobacco company to increase sales. Bernays rebranded cigarettes as "torches of freedom" and then paid women to smoke during NYC's Easter Sunday parade in 1929. By breaking the then social taboo against female smokers, he doubled the potential pool of demand.

In 1952, the widely read Reader's Digest published an article connecting cigarettes to lung cancer. Reader's Digest had bridged the knowledge gap, for only scientific journals had covered this issue in the preceding two years. Public reaction was immediate and strong, as tobacco companies saw both their stock prices and cigarette sales plummet.

At the time, there was a strong consensus within the scientific community that smoking was bad. In the tobacco industry's fight for survival, the "Tobacco Strategy" was born, and sowing doubt would be the savior. The industry funded its own research. When the few scientists reached inconclusive or ambiguous results, those results would be touted in hundreds of thousands of pamphlets delivered to medical and dental offices. Any research that was contrary to the industry's interests would never see the light of day, and those scientists stopped receiving money. The industry funded research in such a way that, more often than not, ambiguous conclusions would result. This was accomplished by distributing money to many small research projects (with their small sample sizes) instead of to a few very large projects with larger samples and thus more robust conclusions. The "Tobacco Strategy" had succeeded in breaking the scientific community's earlier consensus.
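The funding tactic described above - many small studies rather than a few large ones - can be illustrated with a short simulation. This is a sketch with made-up cancer rates, not data from the book: even when the effect is entirely real, underpowered studies frequently come back "inconclusive," and those are the results a propagandist can selectively reprint.

```python
import math
import random

def one_study(n_per_arm, rng, p_control=0.05, p_smoker=0.15):
    """Simulate one two-arm study and report whether it detects the
    (real) excess cancer rate via a one-sided pooled z-test at p < 0.05.
    The rates are illustrative assumptions, not figures from the book."""
    cases_c = sum(rng.random() < p_control for _ in range(n_per_arm))
    cases_s = sum(rng.random() < p_smoker for _ in range(n_per_arm))
    pooled = (cases_c + cases_s) / (2 * n_per_arm)
    se = math.sqrt(2 * pooled * (1 - pooled) / n_per_arm)
    if se == 0:
        return False  # no cases observed at all: necessarily "inconclusive"
    z = (cases_s - cases_c) / (n_per_arm * se)
    return z > 1.645  # one-sided significance threshold

def detection_rate(n_per_arm, n_studies=1000, seed=42):
    """Fraction of simulated studies of a given size that detect the effect."""
    rng = random.Random(seed)
    return sum(one_study(n_per_arm, rng) for _ in range(n_studies)) / n_studies

small = detection_rate(20)    # many small, underpowered studies
large = detection_rate(2000)  # the same effort spent on well-powered studies
# The effect is real in every simulated world, yet most of the small
# studies come back "inconclusive" -- exactly the results that get
# mailed, by the hundreds of thousands, to doctors' offices.
```

The design choice being simulated is the one the review describes: the industry did not need fraudulent data, only a funding structure that made honest null results common.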

Lobbyists were deployed to relay the industry-favourable studies, and they successfully delayed political reform for decades. It took 12 years after the Reader's Digest article before the U.S. Surgeon General issued an anti-smoking statement. Congress required a health warning on cigarette packaging in 1965 and passed a law in 1970 banning TV and radio advertising for tobacco. Another 22 years would elapse before the prohibition of sales to minors.

The authors created a computer model to explain how false or misleading information is disseminated. Although outright fake information can be and has been used, it isn't necessary: legitimate but lower-caliber research can be very effective in altering scientific beliefs. Although the authors reject the central economic assumption that everybody behaves rationally, they note that even rational behavior at the individual level can lead to a false belief at the collective level.

In their final chapter, they applied their findings from scientific examples to current politics to explain how Americans have become so polarized in their opinions. From the propagandist Bernays:
Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country. We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we've never heard of.

The authors included recommendations for how to combat the proliferation of misinformation as they acknowledged that it will be with us forever.

This field of philosophy of science is new to me and so Misinformation Age wasn't what I had expected from the title. It doesn't make it bad or wrong, just novel. I did learn quite a bit about propaganda. I also found the authors' discussion of their model to be a plausible explanation of the creation and persistence of false beliefs. This book did read at times like it was an expansion of a doctoral thesis. Does it fully explain the cause of the current political polarization? I'm not completely convinced but Misinformation Age provided another piece of the puzzle.

3.5 Stars
Jan Rice · 585 reviews · 517 followers
May 23, 2021
After hearing the author interviewed on the NPR show and podcast "The Hidden Brain," I obtained the audio version and listened. The book was, for the most part, accessible via audio; although a few parts (for example, the description of mathematical models) would have required study to understand, I did not study but just listened.

Although the book is about how beliefs are influenced by other than rational factors, it isn't psychology but philosophy: philosophers studying how we know what we know and using math in that endeavor. It was written and published in 2019 (during the Trump era) and is clearly fighting back against the notion of post-truth.

Nor is she accepting the typical notions of emotionality and poor intelligence/education as responsible for erroneous beliefs. Instead she's studying how our social context impacts us, even the scientists among us.

The book Destiny of the Republic: A Tale of Madness, Medicine and the Murder of a President provides an example. According to that biography about the assassination of James Garfield in 1881, the president more nearly died from erroneous medical beliefs and practices than from his bullet wound. At that time, European physicians already had adopted antiseptic and aseptic procedures, while American physicians turned up their noses at what they viewed as overly fastidious and dandified ways. Exploring the wound and feeling around for the bullet with unwashed hands, President Garfield's physicians fatally infected him, leading to his death from a wound that may otherwise have healed. Why? Because they trusted their colleagues and did not know what they did not know.

That example was not in The Misinformation Age, but another example was: that of the discoverer of the cause and prevention of childbed fever in the mid-nineteenth century, Ignaz Semmelweis. He noticed that patients treated by physicians in his Vienna clinic were much more likely to contract and die from childbed fever (puerperal fever), to the extent that women would beg not to be delivered in the clinic or would pretend they couldn't get there in time, preferring to give birth in the street. So, he instituted hygienic reforms such that physicians performing autopsies wouldn't also attend births. His fellow physicians, though, incensed that they could be considered dirty, rejected his reforms and subjected him to ridicule and disparagement until he was transferred. He eventually had a breakdown and was confined in an asylum where he was beaten to death -- a case of no good deed going unpunished that also shows that standing up to the crowd is stressful.

Lest we think that social influence and propaganda go only one way, take the case of the aristocratic woman who introduced variolation (an early form of smallpox vaccination) into England from India in the eighteenth century. She too faced ridicule, this time for supporting a health custom from a "primitive" society. But she herself was a smallpox survivor who immunized her own children and got the royals involved. That, coupled with the success of her measures, served to sway public opinion.

Subsequently the author does focus on the difficulty of changing beliefs and the negative uses of propaganda for thwarting scientific findings -- for example, the relationship between smoking and lung cancer, and between refrigerants and climate change. In this book, industry does not get a good name. The fact that acceptable scientific findings are at a specified level of confidence has been exploited to assert that the findings are in doubt. Reporters are cautioned about the hazards of giving a "both sides of the picture" report providing interested parties a platform from which to sow doubt.

Funding by industry has in itself been found to affect research outcomes and should be avoided.

Lengthy and persistent campaigns have been able to change deeply ingrained societal beliefs despite industry efforts, for example, smoking's image, now versus 50 years ago. It wasn't easy! Yet in other areas the horse has left the barn amidst rampant politicization. Truth isn't infinitely manipulable, though. People die from false beliefs.

Since people don't know what they don't know, sincerity is no guarantee of truth.

Toward the end of the book it began to seem that the author could conceive of negative propaganda as originating from one side of the political spectrum only -- but we are talking about 2019 and prior.

The author does not narrate her own book and the narrator, while okay, has a few pronunciation issues, unfortunately including "in-'fur'-ence," a word that appeared many times in the book's first half.

I enjoy "The Hidden Brain;" recently heard the interview of Iain McGilchrist regarding his book The Master and His Emissary: The Divided Brain and the Making of the Western World, which sounds like another good one.
Todd · 160 reviews · 9 followers
January 1, 2019
By modeling how misinformation can intentionally and innocently weave its way through the networks of scientists, O'Connor and Weatherall offer a model for the spread of "fake news" among the general population. There's plenty of solid philosophy and sociology of science here, supported by a variety of current real-world examples attempting to illustrate the underlying theoretical models. Readers familiar with Philip Kitcher's more recent work and the fantastic reporting of Oreskes and Conway in "Merchants of Doubt" will quickly recognize the niche that "The Misinformation Age" successfully fills. A highly readable consideration of "fake news," but not without its moments of philosophical jargon.
Kressel Housman · 992 reviews · 263 followers
August 16, 2019
The authors of this book are philosophers of science, which means they specialize in questions of how we know what we know. Their bias, if you can call it that, is for the scientific method. The thesis of the book, however, is that since people don’t form their opinions based on the scientific method, they’re easily manipulated.

The main point of the book was about how misinformation worked in Trump's election, but there were more examples of misinformation from the history of science to illustrate how it gets spread. The lesson is that people believe what they believe because of who they surround themselves with. Even scientists aren't immune; they get influenced by one another and form their own information bubbles. Manipulating them can be especially dangerous when someone has a financial motive to suppress the findings of a given scientific study. That is how an incorrect consensus can get sustained over a long period. One powerful example is Dr. Semmelweis, the doctor who proposed germ theory. He was censured in his lifetime and ended his life in an insane asylum. The manipulations of the cigarette industry are also a compellingly clear example, and the authors then apply these patterns to explain why people believed Hillary Clinton was a dangerous criminal.

Some of the science in this book got way too technical for me, but there was one study whose lessons affected me so much, I repeated it to each of my sons. It illustrates conformity bias. The subjects were shown a pair of different-sized lines and were asked to pick out the biggest. It's a simple question that any child could answer, but each subject was shown the diagram with four other people, who were actually "plants" in the study posing as participants. Each of them gave the wrong answer, and finally the real subject was asked the same question. The vast majority of subjects gave the wrong answer, and even those who gave the right one prefaced their responses with something self-deprecating, like, "I'm always the contrarian, but . . ." My whole family are a bunch of contrarians, but we've paid a high social price for it.

So, overall, it’s an excellent book, but if you’re not scientifically inclined, parts of it are a bit tough to get through. Still, it’s worth the effort. Critical thinking is highly necessary these days. We all need to bolster our skills.
Wendi Lau · 436 reviews · 39 followers
October 8, 2022
Want to see a finely crafted piece of propaganda? Pick this up. Many apparently logical chapters explain how bad information spreads, but if you are used to examining primary sources, thinking critically, and considering omissions and causations, this book may frustrate you.

First, note the authors are logic and philosophy professors at the University of California, Irvine. Since an overwhelming majority of humanities professors are oppressively liberal (how ‘bout that for an oxymoron, and yes, “oppressive” is the correct word when conservative faculty hide their views for fear of professional repercussions) and UC schools characteristically so, the book’s contents are as expected.

The authors categorize views outside of the mainstream as conspiracy theories, outliers, and wrong:
• MMR vaccinations can be harmful
• Concerns about water fluoridation
• Unknown long-term effects of GMO food
• Hillary Clinton involved in child prostitution ring (This might be my favorite because the involuntarily deceased Jeffrey Epstein did traffic underage girls to powerful individuals, Bill Clinton among them. With Hillary known for intimidating and silencing her husband's conquests, doesn't that correctly involve her in a child prostitution ring?)

Yet, the examples of misinformation smothering good information were also once mainstream:
• Surgical instruments and gowns don’t need to be sterile
• Doctors don't need to wash their hands between autopsying diseased corpses and delivering babies
• Cigarette smoking is harmless

The ideas continue to contradict throughout this interminable 186-page book. The following 63 pages of notes and bibliography are worth exploring. But I would have missed the disguised conclusion if I’d quit the book earlier.

On the very last page:
“…proposing our own form of government is, of course, beyond the scope of this book. But we want to emphasize that that is the logical conclusion of the ideas we have discussed. And the first step in that process is to abandon the notion of a popular vote as the proper way to adjudicate issues that require expert knowledge.

The challenge is to find new mechanisms for aggregating values that capture the ideals of democracy without holding us all hostage to ignorance and manipulation.”

Here is the real message:
• Letting people decide for themselves is bad.
• People are too stupid.
• Let others decide what information you see, and what you should think.

This book advocates AGAINST freedom of information, speech, and thought. You might think the wrong thing. You might influence others with your wrong thinking.

This is communist ideology! Prettied up with study references, diagrams, and reputable faculty credentials, this “informative” little book is an argument for communism. No wonder I have over 20 pages of comments, arguments, and notes (library book, or I would have scribbled all over this sucker)! Finishing the book was exhausting but worth it for the hidden agenda.

The book’s purpose is to convince readers the government should be overhauled to completely control the flow of information, science, education, and ideas. Because we’re not capable of muddling through. We could make mistakes; therefore, a government collective is better.

Admirers of this work have not studied history.
Warren Wulff · 177 reviews · 3 followers
July 14, 2021
Written from a mathematical-modelling view of how groups of individuals change their minds when exposed variously to new ideas (right and wrong) and propaganda, this book starts strong. However, the underlying Bayesian probability mathematics is never explained, so you don't really understand how their systems work; it becomes a bit of a black box. Further, since it's modelling, I never got any evidence that their models are actually predictive of real-world outcomes. Yes, a computer simulation shows that people act this way when exposed to real fake news (not the Trump variety), but do they? I feel this is left unanswered. Finally, the thrust of the book was to model scientist behaviour, which is fine insofar as scientists are a more cohesive group to model. But it's not scientists who are attacking the US Capitol, so it would have been nice to model Joe Six-pack and why he has come to believe that Trump deserves credit for the vaccine but he would never get it.
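The Bayesian machinery the review says goes unexplained can at least be sketched. The book draws on network-epistemology models in the style of Bala and Goyal; below is a heavily simplified two-hypothesis version, where the agent count, success rates, and complete-network sharing are my illustrative assumptions, not the authors' actual parameters.

```python
import math
import random

def simulate_community(n_agents=10, rounds=50, trials=10,
                       p_true=0.6, p_alt=0.4, seed=1):
    """Each agent holds a credence that a new treatment is the better option.
    Each round, agents who lean toward it (credence > 0.5) run `trials`
    trials; everyone sees the shared results and updates by Bayes' rule
    between the two rival hypotheses (success rate p_true vs. p_alt)."""
    rng = random.Random(seed)
    credences = [rng.random() for _ in range(n_agents)]
    for _ in range(rounds):
        # Successes observed by each agent willing to experiment this round.
        results = [sum(rng.random() < p_true for _ in range(trials))
                   for c in credences if c > 0.5]
        if not results:
            break  # nobody experiments any more: the belief is locked in
        for k in results:
            lik_true = math.comb(trials, k) * p_true**k * (1 - p_true)**(trials - k)
            lik_alt = math.comb(trials, k) * p_alt**k * (1 - p_alt)**(trials - k)
            # Bayesian update applied to every agent (a complete network).
            credences = [c * lik_true / (c * lik_true + (1 - c) * lik_alt)
                         for c in credences]
    return credences

final = simulate_community()
```

The interesting behaviour, which this sketch reproduces, is the early-exit branch: if unlucky early evidence drags every credence below 0.5, no one experiments again, so no further evidence is ever generated and a false consensus becomes permanent even though every individual update was perfectly rational.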
26 reviews
December 31, 2018
A really excellent and readable book! Professors O'Connor and Weatherall write in a clear and straightforward manner that should be generally accessible. The book is divided into four chapters, each of which employs two to three case studies that address one aspect of the spread of false beliefs. They draw upon formal social epistemology but ensure that complex models are explained in clear terms. The text effectively argues that there are several ways for interest groups to propagate false beliefs that are significantly more powerful than the current checks in place to prevent such an occurrence. The authors conclude that we need both governmental and independent safeguards to help stem the proliferation of misinformation.
Jamie Showrank · 123 reviews · 17 followers
February 5, 2019
Kicks off with the vegetable lamb! Compelling examples describing how beliefs are shaped through trusted and propaganda systems. Incredibly informative and helpful for #behavioral #design. Will read again!
Lee · 170 reviews · 7 followers
August 23, 2020
Not the most riveting book I've read, but it brought into focus many problems regarding how we obtain our information, which is not necessarily the correct information, and why some of us stick to the "fake news" as opposed to moving toward the truth.

At the end of the book, the authors mention a philosopher of science, Philip Kitcher, who has written about having a "democratic society that is responsive to fact. When it comes to decisions about and informed by science -- which we may think of, broadly, as everything from clearly scientific subjects such as climate change to calculations of the actual flows of immigrants across the US-Mexico border -- what he calls 'vulgar democracy' is simply unacceptable. Vulgar democracy is the majority-rules picture of democracy, where we make decisions about what science to support, what constraints to place on it, and ultimately what policies to adopt in light of that science by putting them to a vote. The problem, he argues, is simple: most of the people voting have no idea what they are talking about. Vulgar democracy is a 'tyranny of ignorance' -- or, given what we have argued here, a tyranny of propaganda. Public beliefs are often worse than ignorant: they are actively misinformed and manipulated."

Now, if we could just find a solution!
Heather · 486 reviews · 21 followers
December 2, 2023
If you've ever thought, "Why do those idiots fall for such obviously fake news?!", this book is for you.

No book/piece of research is completely unbiased (which, ironically, is a point the authors explain very clearly in this book), but The Misinformation Age does a very good job of presenting logical arguments about why people believe obvious falsehoods, why seemingly respectable people perpetuate those falsehoods, and who has the responsibility of sussing out and eliminating outright lies that do real harm. The authors create simple models to explain how opinions form and change; the models quickly become convoluted (and rather unnecessary), but I appreciate how the authors also use real case studies, such as global warming and tobacco, to illustrate the power and danger of scientific misinformation. One review I read said this book feels like a dumbed-down version of someone's Ph.D. dissertation -- I agree with that, and am 100% okay with it.

I have assigned this to my 300-level Health Misinformation students for two semesters and think it is a lovely mid-point between textbooks and pop-science mass market books.
Brian · 1,162 reviews · 14 followers
February 8, 2019
VERY "scholarly" and scientific - which is good, but didn't make for an exactly riveting read. Some very important points, but not much I didn't already realize. Ms. O'Connor gave a very good interview on the "Hidden Brain" podcast - search that out instead.
Lucas Mattos · 35 reviews
February 1, 2024
Read this for PHIL 4499. Didn't track my progress because I didn't know we'd be reading the entire thing until I was over halfway through. Provides digestible presentations of several mathematical models. Illustrates otherwise complex and intangible problems of information distribution with interesting case studies, both recent and from centuries ago. Comprehensively tracks the way information “trickles down” from academics to policymakers and, finally, to laypeople. Vivifies the harm of misinformation on all relevant social levels (academic, governmental, general) and prescribes solutions in all cases. Overall a very pleasant and accessible read 👍
Riley Haas · 516 reviews · 14 followers
March 9, 2019
This is a compelling examination of mathematical models about the way beliefs spread through human social networks.
The authors, two logicians, give us an overview of mathematical modelling of the methods by which beliefs spread within scientific social networks. I read a lot of psychology - specifically cognitive biases - and one of the things most psychology studies do not focus on is the spread of beliefs generated by our biases. This modelling is very idealistic and imperfect, but it's a necessary addition to understanding human beliefs, given our nature as social animals.
The book then covers the methods by which industry can distort both the public perception of scientific consensus and even sometimes the scientific consensus itself. This part is particularly fascinating given the subtleties with which money can distort our perceptions of science through the good intentions of scientists, journalists, and policy makers.
But perhaps the most fascinating part - and the part that is most relevant to many people in 2019 - is their analysis of Russian influence in the 2016 US Presidential Election (and Brexit, though it barely gets a mention), because it sure seems like Russia's cyber-warfare operatives used these mathematical modeling theories to influence the outcome of the election. It's uncanny the degree to which their tactics seem to follow these theories. So this section feels basically essential to anyone hoping to understand the methods governments are now using to interfere politically in other countries without sanctions, direct funding of resistance groups, or other traditional acts of interference.
The book then leaves us with a very difficult question: is there some method by which democracy can be changed so that majority rule is no longer the arbiter of policies that must rely on science? Fortunately, they do not try to answer that question (though they hint at one), leaving it open-ended.
Anyway, well worth your time.
Matheus Silva · 6 reviews
January 20, 2021
The authors embark on a journey through the ins and outs of a mathematical model of how consensus is formed. Grounded in recent scientific debates, they go step by step from a simplistic analysis (all scientists are unbiased and in search of the 'truth') to a full-on description of the current state of information flows: a complete mess where social networks, news channels, industries, and politics interact.

Even though most of it will seem obvious - yes, we know that the tobacco companies were successful in delaying regulation, - the book has two main contributions. The first is organizing what we know about beliefs and their transmission, making these complex mathematical models sensible and intuitive to anyone.

The second is to point out that even small changes in social structures can lead to completely different beliefs. Consequently, the book formally adds one new checkbox in the ‘How to Manipulate Public Opinion Checklist’: reshape social interactions so to create smaller, like-minded, almost indivisible communities.
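The reviewer's point - that reshaping who listens to whom can matter more than the evidence itself - can be illustrated with a toy simulation. All parameters here are invented for illustration, and this is not the authors' actual model: when agents only trust the like-minded, part of the community can freeze permanently in a false belief even though the evidence keeps pointing the other way.

```python
import random

def polarization_sketch(n_agents=20, rounds=100, trust_radius=0.3, seed=3):
    """Toy dynamics: only agents who already lean favorable (belief > 0.5)
    run trials, which succeed 70% of the time; an agent nudges its belief
    toward a reported outcome only if the reporter's belief is within
    trust_radius of its own. Skeptics far from every believer therefore
    never update, no matter how much evidence accumulates."""
    rng = random.Random(seed)
    beliefs = [rng.random() for _ in range(n_agents)]
    for _ in range(rounds):
        believers = [b for b in beliefs if b > 0.5]
        if not believers:
            break  # no one left willing to generate evidence
        source = rng.choice(believers)            # one believer reports a trial
        outcome = 1.0 if rng.random() < 0.7 else 0.0
        beliefs = [b + 0.1 * (outcome - b)        # small step toward evidence
                   if abs(b - source) <= trust_radius else b
                   for b in beliefs]
    return beliefs

final = polarization_sketch()
```

Shrinking `trust_radius` is the "smaller, like-minded communities" move from the review: the tighter the bubbles, the more agents end up permanently out of reach of any evidence they would accept.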

This has been put into practice in the two recent elections in the USA, when Facebook and other social media companies made obscene amounts of money by literally reshaping social networks. Splitting society into smaller bubbles was a conscious move to radicalize both right and left-wingers. A modernized divide-and-conquer strategy marketed as "We just want to provide you the best content on your feed."

“The Misinformation Age” is one of those books that reads the past to understand the present, but in doing so it is also able to forecast what will happen if we don’t act as a society. The future it sketches is so grim that I don’t think I’ve been this pessimistic since I read Huxley’s “Brave New World.”
Profile Image for Benji.
349 reviews75 followers
May 9, 2021
A group in which almost every member individually would be inclined to make the right judgment might end up agreeing collectively on the wrong one. A group of rational beings can be very irrational.

Conformity nips the spread of good new ideas in the bud. Of course, conformism can also nip the spread of bad ideas in the bud, but on average, the greater their tendencies to conform, the more often a group of scientists will take the worse action.

Let's abandon the notion of a popular vote as the proper way to adjudicate issues that require expert knowledge. The challenge is to find new mechanisms for aggregating values that capture the ideals of democracy, without holding us all hostage to ignorance and manipulation.
Profile Image for Mandelbrot.
27 reviews8 followers
July 20, 2020
Lots of stuff about social networks and how they influence opinions.
Few case studies about scientific discoveries but mostly about how politicians and journalists respond to scientific findings.
Not about science or scientific methods.
Everything is political ... blabla.
No! Politics is the mind-killer!
Profile Image for Steven Cunningham.
Author 4 books5 followers
August 18, 2019
Just finished this book, The Misinformation Age: How False Beliefs Spread, by Cailin O'Connor and James Owen Weatherall.

The Vegetable Lamb of Tartary, a gourd-like fruit with an actual, tiny flesh-and-blood lamb inside, was believed by leading scholars in the 14th century to actually exist in India. It was a false belief, obviously.

Starting with this remarkable example, the authors ask what are the mechanisms by which such false beliefs are formed and spread, even among supposed experts? In other words, what are today's Vegetable Lambs?

There are MANY, of course, and there always have been, but now with the internet they spread MUCH FASTER than ever before, making them a much bigger problem than ever before.

Some key false beliefs:

- That the Pope endorsed Trump in the weeks prior to the 2016 election (this was the single most frequently shared item on Facebook at that time!)

- That hacked DNC e-mails were discovered by the FBI on Rich’s computer (reported and later retracted by Fox).

- That Spain sank the USS Maine (an example of fake news causing a war).

- That Hillary Clinton was the kingpin of an international child sex-slavery ring, which Edgar Maddison Welch believed was based at the DC pizzeria known as Comet Ping Pong, upon which he opened fire with an AR-15 to take matters into his own hands, since nobody was doing anything about this terrible (but fake) story.

- That there is any substantial disagreement among scientists about several scientific issues:
+++ climate change
+++ the roundness of the earth
+++ evolution
+++ the cancer-causing effects of smoking
+++ the effect of CFCs on ozone
+++ the germ theory of disease (that washing your hands prevents spread of disease)
+++ the safety of vaccines
There is no substantial disagreement about these scientific issues among scientists, yet false beliefs abounded in history and amazingly abound today. Just like the tobacco industry did for decades, many continue today to try to give the impression that substantial (and therefore meaningful and important) disagreement exists among the world’s leading scientists when in fact it doesn’t.

There are countless other false beliefs, in science and not in science.

They spend a lot of the book, however, talking about how false beliefs spread *among scientists*, since scientists are generally “doing their best to learn about the world, using the best methods available and paying careful attention to the available evidence. They are trained to gather and analyze evidence and they are generally well-informed about the issues they study. In other words, scientists are the closest we have to ideal inquirers. For these reasons, the fact that even communities of scientists can persist in false beliefs is striking—and if even scientists are influenced by social factors, surely the rest of us are as well” (12).

What affects the spread of false beliefs? Mostly social factors do: whom we trust, reputations, our biases and expectations, our embedded norms and assumptions, what feels right, our social connections, etc. all profoundly affect how beliefs, true and false, spread. The authors do a great job looking at how so much more than logic and evidence affects what we believe and what we share.

They offer several suggestions to improve on this rather dismal situation of false beliefs spreading rapidly today:

1) Find spokespeople whose shared values can ground trust with groups that are highly dubious of well-established facts (180).

Ideally, they say, politicians might play this role. Think of John McCain saying that climate change is real. Such a statement has much more impact on the right than the same statement made by Al Gore, for example, because people would expect McCain to conform (the “maverick effect”). This same mechanism of course worked in the opposite direction in the case of Roger Revelle, Al Gore’s mentor, whose life’s work suggested that human-caused climate change was real. Despite this, his name appeared on a paper with a conclusion he didn’t agree with: that greenhouse warming was too uncertain to justify action. He died before he could refute it, and it was later used to ridicule Gore and to cast inappropriate doubt on climate change, giving the impression that there was more uncertainty than there really was.

2) Stop thinking that there is a “marketplace of ideas” that will sort out fact from fiction. That marketplace doesn’t work like that. They make the case, rightly I think, that it is irresponsible and immoral to advocate for unsupported views, to spread false beliefs. It is not just a harmless addition to some ideal “marketplace” (180)

3) Scientists should continue to raise standards, try to publish better studies, etc.

4) Abandon industry funding of research.

5) Journalists should minimize the social spread of false beliefs by holding themselves to different standards when writing about science and expert opinion. “If there are 99 studies indicating that smoking is dangerous for every one study indicating the opposite, journalists should talk to 99 scientists who think smoking is harmful for every one who does not” (182). Reporting on “the other side” just to appear even-handed can be misleading.

6) And a more controversial suggestion: “Legislative frameworks should be extended to cover more general efforts to spread misinformation” (183). They note that some may consider this a form of censorship, counter to free speech, but the goal is not to limit free speech; it is to prevent speech from illegitimately posing as something it is not, and to prevent damaging propaganda from getting amplified on social media. They draw an analogy to current legislation limiting the ability of certain industries, like tobacco and pharmaceuticals, to advertise their products and spread misinformation, because doing so poses a clear public health risk. They also note that we have defamation and libel laws that prohibit inaccurate claims about individuals. They suggest that those legislative frameworks be extended more generally to curb the spread of misinformation.

7) And really stepping back for a big-picture look, they offer “what we expect will be the most controversial proposal of all,” viz., to “reimagine democracy … to extract from Kitcherism an idea about what it means to have a democratic society that is responsive to fact” (184). But what that would look like is beyond the scope of their book (and certainly this already-too-long post!).

See more book comments at https://stevenclarkcunningham.net/other/ then click "General Reading List"
Profile Image for karoline steinfatt.
36 reviews7 followers
September 16, 2024
This book focuses on consensus formation in science, the subject of the authors' own research. Those findings are only carried over to opinion formation in the media and the wider public in the last part of the book. Nevertheless, the book is very engaging to read, thanks in part to the many vivid examples of the causes and spread of false assumptions in science.

Unfortunately, I must admit that many of the findings from this study of opinion formation in scientific communities have badly shaken my basic assumption of an absolutely "objective" search for truth in science (especially in the natural sciences). At the same time, the book shows how easily research results can be manipulated simply through the allocation of research funding and PR-driven studies.
Profile Image for Dave Cass.
10 reviews1 follower
April 15, 2023
This is a dry but enjoyable read for those with basic scientific and technical literacy. The authors follow a relatively consistent and familiar format that allows the reader to move through it relatively easily: anecdote, explanation of concept, visual model, further considerations. This format covers how information spreads and the impact of scientists, propagandists, policy makers, journalists, and citizens on this process.

In this day and age, it is important for individuals to become more conscious of the factors at play in the "idea marketplace", and The Misinformation Age does well at providing this information. For its educated target audience, there are no brilliant revelations, but instead a series of concepts that add a bit of clarity for an individual attempting to seek truth.

In general, I think this would be a wonderful book for a group to read and then discuss together.
30 reviews
March 1, 2020
Well-cited book presenting interesting models of how misinformation can spread through a population. The conclusion that we may need to re-envision how democracy functions is important. Democracy, the authors argue, should be governed by evidence. There are some issues/questions backed by so much well-vetted evidence that, in the eyes of government policy, they should no longer be debatable. For example, the case for anthropogenic climate change is now backed by a wealth of studies and should no longer be up for debate among policy makers. To be sure, the scientific process will grind on, continually testing and searching for new evidence--always with the prospect of toppling previously held answers.
Profile Image for Oliver Bogler.
152 reviews8 followers
March 26, 2019
With a strong emphasis on science, it explores and explains how information travels through social networks, and what influences its movement. It shows how such networks can fail to spread the best information spontaneously, or worse under the influence of individuals trying to spread bad information. Using familiar examples it points to the key influence of trust and reputation. Some recommendations are made on how to avoid bad outcomes.

The book is well written with informative diagrams of the models - easy to understand for someone not steeped in the logic and philosophy of science which is the field of the authors.
Profile Image for Brian.
152 reviews
February 28, 2019
Extremely interesting. O'Connor and Weatherall examine the social effects contributing to the spread of false beliefs. If you're like me, you'll go in thinking you already know how fake news spreads. But by the time you're done, you'll realize things are way, way worse than you even imagined. True beliefs can still flourish, but it's clear they often get slowed down by other forces. Highly recommend.
749 reviews
August 5, 2019
What I particularly appreciated was how they illustrated the different ways that media, scientists, and bad actors influence public policy. This is very readable and, if you're new to this topic, definitely worth reading; for me, there wasn't a lot of new information. It is a useful and well-presented discussion that serves as a reminder of how the spread of misinformation can have consequences before I'm even aware that I've fallen for it.
2 reviews54 followers
August 22, 2020
Extremely relevant and well researched. Anyone not familiar or equipped with philosophy of science/science studies might find it a bit challenging, although the attempt will be very rewarding.

There are also some points that I do not necessarily agree with. Regardless, the book does a brilliant job of introducing not only how scientific consensus is reached within the academic sphere but also the process (and agents) through which those conclusions are absorbed by the general public.
Profile Image for Fiona.
1,232 reviews14 followers
February 18, 2020
More academic than I would have liked.
Profile Image for Lance Polin.
45 reviews3 followers
August 20, 2024
This is a clever book; smart and mostly even-tempered. It has an intelligent approach to an urgent topic, and it spends the bulk of its pages on scientific controversies and disputes (published in 2018, before COVID-19 existed as we came to know it, it avoids much of the increased hysteria that ensued). The scientific case studies, with a history going back to the late 19th century, explore how individual scientists, whether through divergent conclusions or mistaken interpretations of results, came to be manipulated by corporate interests and the unscrupulous politicians supporting them. The authors make outright allegations about how history proved such agents not just wrong, but guilty of social disruption--even murder--through their selfishness and greed. Of course, trying to be fair, the authors also point out the enraged disputes that emerged from the far too utopian beliefs of other scientists preaching absolute health, even immortality, in the future.

Discussed in detail are lies about smoking, about climate change, and about vaccine debates between unconnected scientists and rabid anti-vaxxers. The authors do their best to take a judicious approach, and their subtle bias in fact strengthens their argument rather than reducing it to a political agenda. The two of them seem sincere.

The most significant part uses the models articulated in the science that takes up three quarters of the book to take on "fake news" and propaganda as expressed on television, in newspapers, and especially online. They discuss just how easy it is to spread such manipulative lies to targeted groups of people through the algorithms we've all heard about and condemned in one way or another. They cite sources and point fingers at disruption sites--mostly Russian--such as RT, a sneering, cynical, let's be honest, terroristic Russian propaganda site (I have quite a bit of experience exploring this one in particular) that has fed lies so profoundly into Western culture that mainstream news media, chasing "if it bleeds, it leads" promotion and the tabloid style they have devolved into, turn divisive social and even religious issues, many of which are enhanced by such cynicism, into open partisan warfare. Even petty sports debates are inspired into rage by promoting widely hated athletes as the best ever, or by condemning others people believe to be so (the Michael Jordan/LeBron James debates are particularly intense, as fans' passions, both pro and con, are easily manipulated; those passions subtly sink into politics, turning political debate into nothing more than team sports).

Propaganda techniques are explored, names are named, and the genuine threat--regardless of which side of the political spectrum you find yourself--is warned about. Looking around and seeing the crumbling nature of social discourse, it is difficult to discount the arguments articulated here.

Very much worth your while. Only four stars because, to this reader anyway, one dedicated to the study of media influence and social chaos, the important science studies get a little tedious.
Profile Image for Tom Schulte.
3,424 reviews78 followers
May 18, 2024
While reading this book, I watched the Athena Annual Lecture, "Why evidence matters," from Imperial College London, delivered by Dame Anne Glover, President of the Royal Scottish Geographical Society. We seem to live in an age where the message to employ critical thinking and the scientific method (think falsifiable and repeatable results) needs to be spread. The message here feels crafted for those drawn into political conspiracies, but it mostly avoids confronting deeply held subjective beliefs. Instead, the approach is to discuss, review, and analyze scientific or real-world objective fallacies, from the Vegetable Lamb of Tartary to the back story on acid rain and the resistance to accepting that stomach ulcers are caused by bacteria. The thrust is to illustrate how even scientists come to believe one thing by following their community and non-factual motivators.
