Jake Browning

The Pre-History of Misinformation Studies (or "Why Misinformation Studies is Problematic")

Every once in a while, someone reinvents a wheel. This is especially common in academia, since there are just so many papers in so many domains that no person could possibly know about all of them. Moreover, what is happening in one corner of academia is often really important for what is happening in another corner, but there is no intermediary--nobody who is hep to both and able to bring the two groups together. The result is that someone reinvents a wheel. Backpropagation of errors, for example, appeared a few different times in different places before sticking in the 80s.


Even so, the reinvention of "misinformation studies" in 2016 was surprising, since most colleges and universities already had people teaching this stuff. Critical thinking courses inevitably had sections on dubious statistics, cognitive biases, media literacy, rhetorical techniques, and questionable journalistic practices. During my teaching fellowship in Spring 2016, my students and I read excerpts from Manufacturing Consent, How to Lie with Statistics, The Righteous Mind, and plenty of articles by George Lakoff and Chris Mooney. We spent eight weeks--over half the semester--on topics related to evaluating information and media, all before Trump or Brexit introduced the terms "fake news" and "alternative facts." The challenges of dealing with disinformation, misinformation, selective reporting, and a host of other issues were critical thinking classes' bread and butter. And they had been for a decade: the syllabus I was using was based on one used by every teaching fellow for as far back as I could find.


For anyone working in the wake of the Sokal Hoax, the Iraq War, the housing crisis, the contentious 2008/12 elections, and the era of austerity, there were plenty of examples of bad information to talk about. We had Dan Rather and Brian Williams; the problems of "the blob," worries about academic laziness, fraud, malfeasance, and failures to replicate; institutional and corporate deception, especially concerning financial data; dishonest (or, at least, "selective") political advertisements; and the homegrown problems of fact-free conspiracy theories floating around on AM radio stations. And, as far as I could remember, we always had these problems. We had the Pentagon Papers, a peer-reviewed "vaccines cause autism" article in The Lancet, wild conspiracy theories like the CIA selling crack, oil and cigarette companies funding deceptive studies, and Adbusters. Misinformation seemed pretty common.


This shapes critical thinking courses: we don't teach people to be skeptical. No teacher of millennials or Gen Z needs to teach them to be skeptical. The point is to teach them how to evaluate information and find the signal in the noise. That's why philosophers tended to teach the course: it's a mix of logic, epistemology, philosophy of science, psychology (individual and social), and technology studies. Philosophers are also pretty historically savvy, so we don't treat things like "evidence" and "facts" as ahistorical truths that mean the same thing everywhere at all times. There's no use--or honesty--in telling students, "here's the truth!" We instead say, "here are tactics and strategies to avoid some, but not all, errors. But you'll still be wrong a lot over your life, so always be willing to evaluate new evidence and change your mind."


The new misinformation studies wants to narrow this kind of philosophical, critical thinking approach down to a single concern: fighting bad information--information that is in some clear way distinct from the other kind. To that end, its practitioners often point to "fingerprints" of misinformation--many of which are familiar and already discussed in critical thinking classes, such as appeals to emotion or appeals to popularity. This is no problem on its own. But it is often combined with an appeal to things like "scientific consensus," "trusted institutions," and "experts." To say this is invalid is an understatement; this move is a rejection of critical thinking itself--and one doomed to both failure and embarrassment.


Take the build-up to the Iraq War. The experts and trusted institutions, like the Times, provided a near united front, with few of the "fingerprints" of misinformation. It was mostly clinical detachment, dry PowerPoints, confident appeals to "intelligence communities," and lots of high-quality evidence. By contrast, the protestors opposing the war often appealed to emotion--often with witty slogans, signs, and songs. News outlets mostly disparaged them as a radical fringe, so they went to street corners and protested directly to the people. The professional "consensus" said war; the scattered nuts warned that the state was lying.


This isn't even weird. This is normal. Scientific racism was pretty common from the 18th through the 20th centuries (and still is in certain corners), whereas many of the more emotional appeals came from the opposite side. But the problem wasn't emotion or its lack; the problem was motivated reasoning on the part of the racists. Similarly, the lack of evidence--or, for that matter, falsifiability--for Freudian theory or historical materialism did little to slow the spread of either view--especially, one might add, among intellectuals and academics. The opponents of these theories were often mocked as reactionaries, defending outdated "common sense" against the new scientific truth that infants have sexual fantasies or that dialectics was the underlying principle of not just history but also physics and biology. A look back in history provides little comfort to those hoping to treat experts and consensus as the kinds of things we can outsource our thinking to.


Why is misinformation studies leaning on "consensus," if there are so many reasons not to? In part, it is necessary for creating a discipline: there needs to be some principled way to distinguish misinformation from truth, and appealing to experts and consensus seems like a plausible route. But blaming misinformation--defined narrowly--also works well for technocrats wanting to continue the status quo, since it denies their own role in generating misinformation. Worse, it provides cover for it: the "good" media just happens to sound like academic papers and Pentagon briefings. Misinformation studies assures us that there are experts, that the scientific consensus is not motivated reasoning, that we should trust the fact-checkers.


Philosophers don't critique this way of thinking because we don't believe in truth. It is because finding truth is essential, and philosophers don't want anyone deceived into thinking it is easy. The new misinformation studies would do well to spend time with the old misinformation studies. A good place to start would be asking themselves if it is a Good Thing that the Davos set, the U.S. intelligence community, and billionaires are praising their research, claiming misinformation is the biggest threat to the world, and showering them with money to stop the rubes from electing the baddies. I think Herman and Chomsky might have a word to say about that.
