Disinformation and Media Literacy: An Interview with Dr. Tom Johnson

Dr. Thomas J. Johnson joined the faculty at the University of Texas at Austin in 2010 and currently serves as the Amon G. Carter Jr. Centennial Professor in the School of Journalism as well as the director of the Digital Media Research Program. Much of Johnson’s work focuses on the credibility, use, and exposure of the internet, and on the relationship between the internet and traditional media sources. His most recent edited book, Agenda Setting in a 2.0 World: New Agendas in Communication: A Tribute to Maxwell McCombs (Routledge, 2014), explores agenda-setting theory in light of changes to the media environment in the 21st century. I met with Dr. Johnson to discuss the rampant spread of social media misinformation and its effect on political communication.

CH: A large part of your research focuses on, and I’m quoting from your university bio page here, “how people use the media to build social capital.” Can you elaborate on that? What does it mean in a political sense?

TJ: Social capital is the idea that you use something to build things for yourself. It could be reputation; it could be more knowledge; it could be to make decisions. It’s seeking out media to serve a specific purpose that will benefit you, usually in a social but often political sense too. To pick up a topic and become more knowledgeable means you’ll become interested and perhaps participate in it. Often, when building knowledge, you’re assuming that you’re reading sources that are presenting factual information versus partisan sources or social media disinformation. Misinformation is just wrong information. Disinformation is when you are purposefully spreading wrong information. We often talk about that with the government. Look at President Trump: we don’t know to what degree he believes what he’s saying. But we do know that when Russia is making all these claims about Ukraine, these are absolutely untrue.

CH: In 2009, you co-authored an article titled “The Facebook Election: New Media and the 2008 Election Campaign,” part of which discusses the Obama campaign’s effective use of Online Social Interactive Media (OSIMs). You were at the very forefront of the research being done on this new campaign medium; at the time of the article, did you have any notion of its potential for disinformation and foreign interference?

TJ: Not nearly to the degree that we’ve seen. At that time, I really didn’t get a sense of how much misinformation—or worse, disinformation—would be present. I was really surprised to see it starting even back in 2016, when there was a lot of disinformation, and certainly also in 2020. I was also one of the first to study blogs, and blogs are partisan media that come from a certain perspective. But still, 2016 really threw me for a loop with so much disinformation. The disappointing thing is that it was not unknown. President Obama wanted to do a press conference talking about it. Hillary Clinton brought it up in the third debate. But the media didn’t really pay attention to it until after the election campaign. In this [most recent] election, I am surprised there wasn’t a lot of focus on it again. In some cases, it was more expansive, with sources in Russia, China, and Iran. I will say, you saw some stories about it, but you just didn’t often see pushback against the disinformation we had.

CH: Do you have thoughts regarding how the line might be drawn between freedom of speech and the ease with which mis- and disinformation can be spread with these social media platforms?

TJ: Well, I will say [the media] have done a much better job on climate change. In the past, climate change in itself could be considered a pretty dull topic, but recently there has been more of an emphasis on accurate coverage than on balanced coverage of both sides. Though they still need to work in support of facts and push back against things that aren’t true. When the media consider something important, they’ll ask it again, or the next reporter will. Fox News is often criticized for misinformation, but Chris Wallace is often really good about trying to pin people down on things like that.

CH: I’ve been seeing a lot of buzz about the portrayal of news and reporting on TikTok from Ukraine. How familiar are you with this relatively new form of OSIM, and are there any significant differences from older platforms (Facebook, Instagram, etc.)?

TJ: TikTok has become one of the new media sources for spreading disinformation because it is so visual and because it is heavily focused on the young. In the recent Republican campaign, they created an official group whose sole purpose was to promote pro-Trump information. I’ll say in general that I think Republicans are much better at this visual propaganda than Democrats. Republicans are aware that, to really convince people, you convince them through emotions. Anger and fear in particular can really mobilize people’s actions. So they’ll focus on issues such as Critical Race Theory that get people upset and get people acting. I’ll talk to some of my Democratic friends, and they’ll tell me that—like the Michelle Obama quote, “When they go low, we go high.” That may be admirable, but if you’re going high and you’re not connecting with people in the same way, then going high just isn’t working. Visual media goes with the low. I’ve seen some of the commercials from this election, and they’re still connecting on these emotional issues. At a school board meeting, if you have one side yelling loudly about a position and another side quietly trying to argue persuasively, the media are going to focus on the yelling. Democrats need to realize that there are good, decent ways of appealing to emotions that are very effective as well.

CH: Looking back at your previous work, I see some significant discussion surrounding internet conspiracy groups. I think people might be most familiar with QAnon. Is this a purely modern phenomenon?

TJ: Actually, I’ve just finished working on a paper, and originally, I had used a conspiracy example from 2006. My co-author suggested an earlier example, and I found a story about conspiracy theories in the 1796 election, really the first true election in this country. Even back then, there were a lot of conspiracy theories going around, a lot of disinformation. Certainly there are very famous ones in our country, such as the Kennedy assassination and 9/11. Often they come up when you have a period of stress. In a time like this, during elections and the coronavirus, suddenly people are searching for answers. Conspiracy answers are very simple and often offer partisan, political solutions. QAnon was a huge story, with its basic idea of a secret cabal of Democrats and celebrities. On the face of it, it seems ridiculous. But for some people who are partisan and looking for an enemy, these things make sense. Conspiracy theories really whip up hatred against an enemy. The idea is less about promoting your side than about hitting back against the other side.

CH: Candace Rondeaux, a researcher for the School of Politics and Global Studies at ASU, was recently asked about the decrease of FCC (Federal Communications Commission) regulation in spaces like Amazon, Apple, Microsoft, and Google. She responded that “Really, content moderation is like the last resort, or like, not even the tip of the spear. I’d say it’s like the way back end of the spear… there’s lots of different moving parts in the ecosystem.” Any thoughts on her statement?

TJ: I think it is certainly a good place to start. Facebook and Twitter did a better job in this election, flagging things that are misstatements and banning people. Banning President Trump was big. And in some cases, it’s not just flagging negative things. One of the things The New York Times is pretty good at doing is a kind of modeling behavior. They’ll put arguments at the top that they think are well reasoned and treat them as examples. But it takes work. You have to pay people to moderate your discussion, and most organizations can’t afford to do that. It takes truly moderating discussion boards and modeling good behavior, as well as kicking off those who don’t abide by the rules. But the idea that conservatives are talking about this as a freedom of speech issue is just incorrect. The First Amendment applies to freedom of speech, but it doesn’t cover commercial speech. You don’t have the right in a commercial space to express any idea you want. Facebook and Twitter have a right to moderate what you say. I express my views but am careful not to personally derogate anybody. People should look to their everyday life and draw guidance from that in regard to self-regulation.

CH: Do you have any further advice for students navigating OSIMs in a modern political climate?

TJ: Firstly, when you look for information, go to sources that you trust. Typically, you can place more trust in brand-name sources like The New York Times or the CDC than in a random website. While you can’t verify everything, if you have questions, see what other people are saying about it. It does take some media literacy to learn what is reliable and what is not. Especially if you’re seeing it on TikTok or some other form of social media, check it out against a source you trust. Also, be aware of what your biases are; know where you stand and be willing to seek out the other side. Seek out sites that you might not agree with and decide for yourself. Take the time to comprehend what you’re seeing online and, at the end of the day, ask yourself, “Is this true?”
