This is the kind of primer that is becoming more common: a book that seeks to unpack why an age of information is also, most decisively, an age of misinformation. There is a grab bag of approaches here: a bit of discussion of confirmation bias and how political polarisation affects judgements of accuracy; some teaching of basic Bayesian techniques; guidance on identifying evidence through a social-science lens; and plenty of tales of how people, corporations, and governments have relied upon bad information.
It is a useful book to share with anyone prone to reading science articles, or who is simply trying to work out how to engage with information sources. It is an easy, relatively quick read, full of easily digestible concepts and many handy tips.
My frustrations were not so much with what it did do as with what it didn't do. Edmans offers little broader contextual analysis, and as a result misinformation can seem to arise simply because people are flawed in their judgements. In most, though not all, of his big examples, however, there were clear motivations for distorting evidence, often the need to protect profit-making enterprises. We don't have climate sceptics just because people apply bias to their decision-making, but because industries that didn't want to bear the costs of the scale of change we needed spent hundreds of millions of dollars discrediting the science. Pharma companies don't bury bad reports because they don't understand they are bad; they do it to protect their reputations and their bottom lines. Edmans is an economist who regularly measures "performance" by profit and share price. He does raise that other outcomes and objectives should drive decisions, but profit seems to loom large in his worldview.
Edmans advocates techniques to reduce polarisation around key issues like climate change, migration and Brexit, but it left me wondering whether the aim is to reduce polarisation or to develop our capacity to make better decisions as a society. In other sections, he discusses management techniques for countering the human desire to conform. Conformity and polarisation are both aspects of humanity that can affect judgement (and they can come together as conformity to my side and innate opposition to yours). But it seems to me there needs to be some distinction between understanding how these forces shape trends in public opinion and how flawed science reporting leads to bad outcomes. I don't think they are the same thing, and collapsing them can be unhelpful.
Similarly, it can come across as though Bayesian reasoning and peer review will simply solve all our problems. Don't get me wrong, I am a fan, but these are not the only approaches we need. Advocating that people check all the references, including looking up the underlying published studies, seems to misunderstand both how inaccessible scholarly publications are to the general public and the time and resources people actually have. Edmans does not critique more systemic issues in modern academia, such as declining basic research, the barriers to multidisciplinary work, or the incentives to overstate results. In fact, a couple of the things he advocates (trusting prestigious universities more, and looking for a strong publication record in those you trust) could cut the other way. One of his main takedowns is of Matthew Walker's book on sleep, which I had read. On checking my own review, I realised I had noted that the claims were so poorly referenced and explained that the only reason I believed them was that Walker headed the sleep research unit at UC Berkeley; apparently, I should have trusted that a whole lot less.
In the end, this is a useful primer on why we should be less trusting, with some steps towards identifying what to trust. It does exactly what it says it will, and I hope it moves us towards a more thoughtful society.