Misinformed: An interview with Associate Professor of Philosophy Yuval Avnur

Illustration by Brooke Irish

In our digital age, information is more accessible to more people than ever before. Yet one of the central concerns of public life is our susceptibility to bad information, whether in the form of fake news articles, doctored images, or manipulated video. Associate Professor of Philosophy Yuval Avnur comes to this dilemma as an epistemologist, interested in how we arrive at knowledge in the first place. But he’s also interested in how other fields might contribute to the conversation. That’s why, as this year’s director of the Humanities Institute, he’s made “Ignorance in the Age of Information” the Institute’s focus, inviting a cadre of academics, researchers, and technologists to campus to discuss fake news, biases, echo chambers, conspiracy theories, and virality—and how they all work in concert to disrupt the foundations of a healthy democracy.

Scripps Magazine: Most of us have heard by now of the “fake news” phenomenon. Tell us more about it and how it affects our understanding of truth.

Yuval Avnur: Fake news is a form of manipulation through fabricated facts, explanations, and narratives. Many people think of fake news as a new phenomenon, but actually, it’s quite an old problem—one as old as the press itself. One of the things that differentiates fake news now is the speed of our news cycles and how quickly it proliferates. There’s an incredible MIT study, published just last year, in which researchers compared the dissemination rates of false and true statements on Twitter. They found that false statements spread much faster and wider, presumably because they’re more entertaining and better targeted to users’ interests. So, because of the speed at which misinformation spreads, and because we tend to engage more with and believe stories that reflect our biases or interests, we now have an information environment that makes our susceptibility to those biases much worse. We’ve always been biased in how we process information, but now it’s on steroids.

What is new about our information environment is the “echo chamber” problem: we often get our information from like-minded sources, and that information often excludes serious consideration of opposing views. For better or worse, all Americans used to watch the same couple of news programs every night and get the same information from the same people. Now there’s an echo-chamber effect, because we can pick and choose where we get our information based on whether we generally agree with the source. We filter our news so that it reinforces what we believe, designating as arbiters of truth those outlets—whether Fox News, CNBC, or Breitbart—likely to share our outlook.

SM: So, right or wrong, at least we all used to receive the same information; there was less suspicion and doubt among the citizenry. How is our modern echo chamber, then, more pernicious?

YA: It’s not always about what information you receive. People reading politically opposite news feeds often get the same basic information. Rather, it’s what’s said about the information, and what conclusions are drawn from it, that changes according to the echo chamber. Which echo chamber you’re in decides which information gets emphasized and which sources get trusted.

So, this echo-chamber phenomenon is a product of how we now receive information. We get it from people with interests, fears, and views similar to our own, and our biases are thus reflected back to us as credible. What makes it especially insidious is that, along with the information, comes encouragement to downplay unwelcome facts and to make more of whatever helps us arrive at the conclusions we like.

SM: So, we inoculate ourselves against opposing arguments, opposing facts—such as they are—and opposing interpretations of those facts.

YA: Exactly. One of the most interesting versions of this involves the Mueller probe, the Special Counsel investigation into Russian interference in the 2016 election. For a long time, no one on the Trump campaign was specifically charged with any crime related to helping Russia interfere with that election. [Editor’s note: Professor Avnur and I went back and forth about how to most accurately phrase this statement, during which various news sources and political blogs were cited and/or dismissed—an irony that was not lost on us.] Some journalists represent this as proof either that Mueller has no evidence or that he’s being secretive and inappropriate. Others, however, represent it as proof that Mueller is strategic and has something big coming down the pike. I’m not saying that both sides are equally credible, but what is obvious is that our opinions are reinforced by the sources of our information, which we choose based on their conformity with our biases. If you’re in a pro-Trump echo chamber, you’re likely to hear a lot more from those who think Mueller’s inaction is a sign that he’s found nothing supporting collusion.

SM: And the more we hear our own biases reflected back to us, the more entrenched in our beliefs we become.

YA: Definitely. And this is one way that conspiracy theories are born. There is a difference between a conspiracy and a conspiracy theory. Conspiracies do exist—Watergate, for example. They either plant misleading evidence or work to cover something up. They happen. A conspiracy theory, on the other hand, is just that: a theory. It’s a way of trying to make sense of evidence, and conspiracy theories often provide ways of accounting for why the evidence might appear to suggest one thing but really suggests another. They mess with how we interpret evidence. It’s a fascinating topic—I couldn’t possibly do it justice here.

SM: So how do we become media literate in a way that enables us to distinguish between, say, a well-researched article and the blog post of a nonexpert? How can we overcome our desire to have our biases confirmed and instead seek out the best information?

YA: We need to learn a new kind of critical thinking about how we get our information and how to weigh the trustworthiness of different sources. But perhaps the best thing we can do to avoid the trap of the echo chamber is to practice intellectual humility, which involves being critical of our own views, biases, and tendencies. We are all vulnerable to misinformation, but nobody thinks they are—that’s the nature of the disease! We are so good at spotting cognitive bias in others but not so good at spotting it in ourselves. David Hume, the Scottish Enlightenment philosopher, said that we should always be suspicious of the hypotheses favored by our passions, because the more strongly we feel about something, the more likely our feelings are to influence and bias our reasoning.

Practicing that humility must involve more than just consulting a diversity of sources—we’re very good at finding reasons to dismiss evidence and arguments from the “other” side. If you really want to know what it feels like to navigate our information environment well, think of how you form an opinion on a topic in which you have no personal investment. When your passions and preferences aren’t at play, you can really look at different sides of an issue and draw conclusions based on what’s important: facts and credibility. It’s only at that point that traditional critical thinking can operate the way it’s supposed to.

SM: What are some of the larger ramifications of the fake news and echo-chamber phenomena?

YA: They tend to produce an increasingly polarized population, which is bad because we aren’t just disagreeing; we hate each other. We’re turning away from the ideal of a public sphere of dialogue and toward sectarianism. I think everyone knows this, and some think it’s a good thing. I don’t. I think it undermines the whole structure of democracy. If half the population thinks that the other half is misinformed or lying, if we don’t trust each other to some degree, then we can’t really function as a democracy. Think of the lie that Trump lost the popular vote because of mass illegal voting: How can we feel we are in a democracy if we are led to believe that the system itself, or official statements about it, can’t be trusted? If we can’t trust the democratic process, we can’t believe that we are in a democracy. And if we can’t reliably get at the facts, or rationally arrive at beliefs about them, then we cannot rationally vote. If we can’t do that, we don’t have a government that works for the people; instead, the government serves whatever interests the misinformation promotes. In the current milieu of fake news and the echo chamber, the “wisdom of the crowd” is playing a diminishing role.

SM: So, what do we do? By which I mean, what action can we take, or who should be charged with dealing with this problem?

YA: It’s a truly multifaceted topic. Is this a question of public policy? A legal question? A question about journalistic ethics, or about a definition of “the press” that should include purveyors of echo chambers, like Facebook? Is it a question for sociologists and psychologists? For computer scientists? Is the question of how information travels through groups, and how new technology affects that, one for epistemology and philosophy, perhaps along the lines of how to form rational beliefs in such an environment? Or is it a question for those who study power dynamics and oppression? I think the answer is “all of the above.” We need to look at the problem through all of these lenses to develop a more sophisticated, updated way of dealing with disagreement in society in order for our democracy to flourish.
