Jake Browning

Critical Comment on "The Psychology of Misinformation"

The Psychology of Misinformation, by Roozenbeek and van der Linden, is an excellent book. Short, balanced, readable. The authors are also remarkably honest with the faults of their views and, while somewhat flippant towards critics, accept how their interventions are limited. While I found the descriptive sections more useful than the prescriptive, the text is unquestionably a useful overview of an important topic.


But, at a few points, my jaw dropped in incredulity. The simplest example is the authors' take on the buildup to the Iraq War:

A hugely influential example of harmful misinformation in the internet era came from one of the world’s foremost media outlets, the New York Times. In the run-up to the 2003 US invasion of Iraq, journalist Judith Miller published a series of articles alleging the existence of an Iraqi site said to be producing biological weapons [. . .] The New York Times later issued an apology, stating that Miller’s reporting had been inaccurate and that its editors had failed to weigh the available evidence against their desire to have Saddam Hussein removed from power (New York Times, 2004). Some argue that Miller’s reporting directly influenced the US government’s decision to invade,

Reading this, each time, feels surreal. Judith Miller was citing military intelligence provided to her by the government. She wasn't the issue; she was just disseminating the government's misinformation. Singling Miller out is absurd.


But this also misses the issue. Their point is that misinformation is a threat, but they don't spend time talking about the psychology of this kind of misinformation. Their "interventions" seem specifically targeted at populist misinformation--the kind coming from Breitbart or the Joe Rogan show. The closest they come to non-populist misinformation, Russia's RT, is largely treated as a reason to "check your source." But they don't provide interventions along the lines of Herman and Chomsky's Manufacturing Consent, explaining why you shouldn't trust "unnamed intelligence officials." They don't seem to want people to stop trusting their government or the NY Times.


Which makes it kind of a weird book. The U.S. could have used a "psychology of misinformation" book in 2008, but it wouldn't look like this. It might have had a chapter on why we had been overly credulous of our government. And the interventions would look different, telling us how to avoid believing the banks and Moody's. And it couldn't assure us that the scientific consensus is reliable unless it also conceded that economics isn't a science. The misinformation people had been getting came from "trustworthy" institutions, often from smart people with lots of data, and it rarely trafficked in populist schticks (like fallacies, emotional appeals, etc.). And this misinformation ended up killing a lot of people and ruining a lot of lives.


I'm not opposed to efforts to stop populist misinformation. A lot of "health" information strikes me as dangerous garbage. But focusing on the "pop" stuff seems myopic. The highly educated, liberal folks reading a book on the psychology of misinformation should be reminded that they are especially prone to believing horseshit, and the book really misses an opportunity to highlight that. Smart, savvy, highly educated people often believe the scientific consensus, even when it is absurdly counterintuitive--only to realize later that the commoners are right and infants don't fantasize about killing their fathers and sleeping with their mothers. That'd be a great question the book doesn't answer: why are highly educated people so prone to believing sexy but absurd theories?


I worry that the reason the book doesn't devote a chapter to why smart people are so stupid is that the authors don't really believe it. The book isn't written as, "why are you such an easy mark?" Rather, it is written as, "why are THEY such an easy mark?" The authors seem to think it is always respectable to believe the scientific consensus and agree with the NY Times editorial board. And they also think that, even when the consensus is wrong, it is only because of immoral actors--people like Miller or Andrew Wakefield--and the consensus will quickly self-correct. The upshot is that it is seen as acceptable to just trust the government and institutions--that we can avoid the harms of misinformation by just believing the good guys.


I think that is a mistake--and a real missed opportunity. This is a good book that does what it does well, explaining clearly how populist misinformation works. But there needs to be a larger book on the psychology of belief itself, with a special part devoted to why some people are so easily duped by experts with theory, stats, and a slick PowerPoint deck, and others are so easily duped by a smooth talker with a YouTube channel. The book really seems stuck on the latter, despite all the harm caused when people with lots of letters after their name believe the former.

