APU Exploring STEM Podcast

Podcast: The Dangerous Difference between Misinformation and Disinformation

Podcast featuring Dr. Bjorn Mercer, Program Director, Communication, Philosophy, Religion, World Languages and the Arts, and
Dr. Karen Hand, Faculty Member, School of STEM

Why is it so critical to keep people from falling for disinformation? In this episode, Dr. Bjorn Mercer talks to APU STEM professor Dr. Karen Hand about the steps people can take to identify disinformation campaigns that are aimed at manipulating them. Learn ways to increase your information literacy, how to check the reliability and credibility of online sources, and why it’s so important to read widely to learn other points of view. Also learn about confirmation bias, how social media companies use their algorithms to keep people addicted and in an echo chamber, and why governments need to regulate tech giants to stop the rapid and intentional spread of false information.

Listen to the Episode:

Subscribe to Exploring STEM
Apple Podcasts | Google Podcasts | Spotify

Read the Transcript:

Dr. Bjorn Mercer: Hello. My name is Dr. Bjorn Mercer. And today we’re talking to Dr. Karen Hand, assistant professor in the School of STEM. Our conversation today is about information, misinformation, and disinformation. Welcome, Karen.

Dr. Karen Hand: Great to be here today.

Dr. Bjorn Mercer: Great. This is a really important topic. It’s a timely topic. And it’s a topic honestly that has been around as long as humans have been around, but has become more intense and more people are aware of it because of the internet, because of social media, and just because of how information can travel “so quickly” today. So I’m going to jump into the first question. What is the difference between information, misinformation, and disinformation?

Dr. Karen Hand: That is a very good question. And I agree about its timeliness and importance in today’s internet society. I’m going to begin with a definition from the Merriam-Webster dictionary for information: “Knowledge obtained from investigation, study, or instruction; intelligence, news; facts, data.”

So in summary, information is knowledge, facts, data. Misinformation, on the other hand, is false information that is spread regardless of intent to mislead. Misinformation spreads extremely quickly and easily in today’s internet age through social media. However, the intent is not always nefarious. A person could be well intentioned and think that they have accurate information when it’s actually false, and that would be considered misinformation.

Disinformation on the other hand is deliberately misleading or biased information, a manipulated narrative, manipulated facts, propaganda. So disinformation is knowingly spreading misinformation, even intentionally creating that misinformation for nefarious purposes.

Disinformation covers a wide range of types. There’s propaganda, often used by governments and designed to appeal to emotions. There’s pseudoscience, which can range from anti-vaccination to climate change denial, even to miracle cures that don’t have an actual scientific basis. And other types of disinformation include conspiracy theories and that nebulous term, fake news.

So, over the ages, governments have been accused of launching disinformation campaigns where they intentionally disseminate false information for political purposes with the intention of deceiving the public. One example of that, going back to the Cold War, would be a disinformation campaign conducted by the Soviet Union called Operation Infektion, which was aimed at influencing people to believe and propagate the myth that the US had invented HIV and AIDS.

More recently, the Russian government has been accused of running disinformation campaigns to influence both the 2016 and 2020 elections. And, even more recently, I’ve read a bit about a disinformation campaign that the Russians have been conducting aimed at undermining confidence in COVID-19 vaccines. And in this disinformation campaign, through websites and social media posts, they’re playing up the risk of vaccine side effects, questioning the efficacy of the vaccines, and even emphasizing the false claim that the US rushed the vaccines through the approval process. Or even saying that these are not approved vaccines because they’re only under emergency-use authorization and other false and misleading claims.

And then, there was the big lie of the 2020 election propagated by our own president at the time. And on the very night of the election, he began tweeting saying that he had won the election, that the election had been stolen through fraud. And this type of disinformation campaign, the power in it is in the repetition, the continuous and frequent repetition of the lie. Tens of thousands of people believed those lies. And that culminated in the insurrection at the US Capitol and several deaths.

So you can see the extreme importance of knowing the difference between information, misinformation, and disinformation. And in doing whatever we can to avoid people falling for the disinformation because the stakes are very high.

Dr. Bjorn Mercer: And that was absolutely wonderful. I love that you went through each definition and provided some examples. The one thing that I think about when I think of misinformation is very good people sharing misinformation. They don’t mean to.

And back in the old days, not that long ago, before the internet, before social media, we would inadvertently share misinformation amongst each other, just in our household, amongst family. But now, one person could potentially share misinformation that then spreads on and on and on, from one group of people to other groups of people.

And then, obviously with disinformation, that’s information that is, on purpose, put out there to confuse people. I’m glad you brought up the election because obviously it’s still raw and a lot of people are still upset by it. But I understand why people are upset because they were essentially the target of disinformation.

One of the weirdest parts of that is the whole Dominion voting machine thing and blaming Venezuela. Each state has its own election process. And so, it’s not as simple as an exciting movie where hackers hack in, take over the machines, and throw the election. Hackers would literally have to hack into every single county, every single state. And so, when you think about the scale, you’re like, “Well, that’s actually not possible,” but like you said, it’s the repetition of it.

And so, so many really good-natured people were thrown off balance because they were obviously concerned and still are concerned about the country. But when they’re fed disinformation and fed lies by some people, it’s understandable why they are upset.

How do you counter that kind of misinformation from people who care, and disinformation from people who are just trying to manipulate in such a fractured political environment?

Dr. Karen Hand: Well, you’ve definitely hit on a key thing, which is the very polarized and fractured political environment that we’re currently in. And one thing that I think about is that the “big lie” didn’t just come out of thin air.

Over a period of years, the president told his followers, “Don’t trust the mainstream media. It’s all lies. It’s fake news.” So here’s an example where true journalism is being labeled as fake news in the minds of some people. And the power in that is to then say, “Don’t trust the mainstream media. Only trust what I say or only trust these one, two or three sites that I approve.” And so, naturally those sites will be slanted in one direction and will be disseminating and repeating in that repetitive fashion the disinformation.

So if you have convinced people that they should not read widely and should not even listen to anything being said in the mainstream media but should only listen to certain people or certain sites, then you can gain control of all the information they hear.

So I would say to counter it, the biggest thing is to tell people that they should always read widely. If you read one point of view, you should go seek out the other points of view, or you should search on that theory and add the word “debunked” and see how people are debunking it. Because guess what? If you’re right in your crazy theories, then you should be able to read a rebuttal of that theory and not be convinced by it. There should be no fear in reading widely.

But on the other hand, if you’re being misled and then you read an actual rebuttal that explains what’s wrong with that belief, you may learn something. You may change your views.

So in my case, as painful as it is, I will sometimes seek out some videos or articles about conspiracy theories and force myself to watch them just so that I understand what kind of disinformation is being perpetrated, and then I can better answer it.

But the thing is, I’m not afraid to do that. I’m not afraid to read information from all sides and all leanings and slants because I trust my own judgment. I trust that I can weigh the facts and come to a conclusion and determine what the truth is from that wide reading.

Dr. Bjorn Mercer: And that is exactly what I recommend to people: when you get your news and when you consume information, read the gamut. And just like you said, don’t rely on one source of information. So if you get information from CNN or MSNBC, you need to go to Fox. I would even throw in that you need to go to Breitbart. You need to go to HuffPost. You need to go to Jacobin. I would even throw in that you need to go to CBN, the Christian Broadcasting Network, to get such a variety of different perspectives, to see what people are saying.

Because one of the things that really surprised me was the messaging that was going out before and after the election that I was not privy to, because I am inside a certain algorithm; my phone knows what I like. Even though I click on a bunch of things, there are certain things that I just didn’t see.

And so, that actually leads me to the next question. In a sense, social media dominates human attention. Can you explain what confirmation bias is and how social media companies use algorithms?

Dr. Karen Hand: Yes. So confirmation bias refers to the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories. It’s the natural human tendency to seek out and interpret and remember new information in accordance with pre-existing beliefs.

And an example I can think of recently is a friend of mine who’s a very strong anti-vaxxer posted an article saying that someone who received the vaccine still developed a mild case of COVID.

Well, she was posting it with the intention of trying to say, “See, the vaccine doesn’t work.” And I reposted it and said, “This is great news. We know it’s 95% effective, and in the 5% who might still get COVID, it will be a mild case just like this person.” So there’s confirmation bias: reading the same article, each person interprets it differently.

And then, social media companies use adaptive algorithms intended to feed users more of the same information according to what they have already viewed and liked and retweeted and commented on.

They use the information from our views, our likes, our tweets, retweets, our comments to assess what our interests are and what our views are so that they can feed us posts and advertisements intended to keep us engaged and continually scrolling through our feed.

Their purpose in doing this is that the more they can keep us engaged, the more advertisements they can feed to us, and the more lucrative it becomes. The problem with this is that it creates very one-sided echo chambers and a very politically polarized environment online.

So what are some of the things we can do to maybe foil this algorithm a little? One example I can think of is I’ve recently had my first vaccine. And I was just curious, I looked for Facebook groups discussing side effects of the vaccine just to see what people were experiencing. And I found a couple of very legitimate ones where people were just sharing information.

But I accidentally stumbled into a couple that were just full of anti-vaxxers. And I’ve heard other people say that they did that and they immediately left the group. Well, I didn’t. I stayed there and I’m doing a lot of reporting to Facebook of the posts that provide false information about health.

But also, I’m probably confusing the algorithm because I am reading and viewing things that are in opposition to what I actually believe. So again, the more widely you read, and that could even include what posts and tweets you look at, the more it might confuse the algorithm. Or some people advise, just give a like to everything, and then the algorithm will be very confused.
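
To make the feed mechanics Dr. Hand describes a little more concrete, here is a minimal, hypothetical sketch in Python of an engagement-based ranker. The data structures, topic labels, and scoring rule are illustrative assumptions only, not any platform’s actual algorithm; the sketch simply shows the feedback loop she points to, where the more you engage with a topic, the more of that topic you are shown.

```python
# Hypothetical engagement-based feed ranking (illustrative only; the topic
# labels and scoring rule are assumptions, not a real platform's algorithm).
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

@dataclass
class UserHistory:
    # Counts of past likes, comments, and shares per topic for one user.
    engagement_by_topic: dict

def score(post: Post, history: UserHistory) -> float:
    """Score a candidate post by how much the user already engages with its topic."""
    return float(history.engagement_by_topic.get(post.topic, 0))

def rank_feed(candidates: list, history: UserHistory) -> list:
    """Order candidate posts from most to least 'engaging' for this user."""
    return sorted(candidates, key=lambda p: score(p, history), reverse=True)

if __name__ == "__main__":
    history = UserHistory(engagement_by_topic={"vaccine-skeptic": 40, "gardening": 3})
    candidates = [Post("a", "gardening"), Post("b", "vaccine-skeptic"), Post("c", "local-news")]
    # The topic the user already engages with most ranks first, so the feed
    # keeps narrowing toward it; this is the echo-chamber effect discussed above.
    print([p.post_id for p in rank_feed(candidates, history)])  # ['b', 'a', 'c']
```

Because the score depends only on past engagement, the ranking naturally drifts toward whatever the user already interacts with, which is why reading, viewing, or liking a wider range of content, as Dr. Hand suggests, muddies the signal this kind of ranker relies on.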

Dr. Bjorn Mercer: And that totally makes sense. And the algorithms, they’re all created originally with good intent, of course. Rarely is anything created with the nefarious intention to completely confuse people, but all the social media companies have tried to create a customizable experience for people.

And so, unfortunately, what that does is that if you’re a centrist, you get a lot of centrist news. If you’re a leftist, you get leftist news; if you’re on the right, you get news from the right; and everything else in between.

And with anti-vaxxers, again, those are people trying to make informed decisions, and so often they’ve been misinformed, just like the great example you had with the article. Vaccinations don’t save everybody 100% of the time from ever even getting the sniffles. Even when you’re vaccinated, you can still get sick. The intention of a vaccine is for you not to be hospitalized or potentially die, and to maybe get a milder form of the sickness.

And so anti-vaxxers have unfortunately gone down that anti-vaxxer rabbit hole in which, like you said, they’re getting confirmation bias and they’re getting into almost a mindset of not trusting authority. And unfortunately, there’s a lot of examples out there of why they cannot trust authority or the government or stuff like that.

Now, do you also find that people have poor scientific literacy? I always like to talk about information literacy and being able to parse through information, evaluate it, judge it, and then use it or not use it. Scientific literacy seems to be something that people greatly struggle with. Do you agree or not agree?

Dr. Karen Hand: Oh yes, absolutely. I agree with that. Typically, when children are going through school or adolescents are going through high school, a lot of times students will ask, “What will I ever need this for in the real world?” Well, I think in this past year especially, we’ve found out exactly why you need science, why you need to be taught science. It’s just crucial to understanding the world that you live in. And in the recent case, it’s been crucial to trying to understand the virus and the pandemic and the vaccines that are being developed.

And I do find that science literacy is a problem. There are a lot of people who don’t understand the science, maybe they didn’t pay attention in their science classes. It wasn’t their favorite subject. And they lack the curiosity or the initiative to go out and read more and seek to understand and to update their skills.

So I would definitely say that anything that we can do to encourage people to learn about science and to understand science better and to understand how that scientific knowledge applies to our world and to important daily decisions that we have both as a society and individuals, I think that it’s crucial that we put an emphasis on that.

Dr. Bjorn Mercer: That is absolutely wonderful advice. We should always work on our information literacy. And science literacy is something that somebody like me, who is not a scientist, always has to work on. It takes a little extra research, but that’s good. And just like with the COVID vaccine, read an article from CNN, read an article from Fox. Go and read an article on Newsmax or on HuffPost and see how each of them reports on a similar story. And then do some research and make your own judgment, or talk to people. Talk to people who you know will be level-headed, if that makes sense. No, that’s excellent.

And so, recently, there was a documentary on Netflix that was called “The Social Dilemma.” Very popular, a lot of people have talked about it. And what was it about this documentary that surprised and shocked you?

Dr. Karen Hand: Okay. First I will say that is an excellent documentary, and I very much recommend that everybody watch it. Especially if you have kids, watch “The Social Dilemma”; you need to understand these things. This documentary focuses on how the big social media companies manipulate users by using algorithms that encourage addiction to their platforms.

And it also shows how these social media platforms harvest a lot of personal data and use that data to target users with specific ads and specific posts. The final emphasis is on the fact that all of this is occurring, sometimes with very dire consequences, in a regulatory void; it has largely gone unregulated.

So, I guess one of the most shocking things about it is just how deliberately algorithms are programmed to try to get each user to return back to the social media platform again and again. So it’s like intentionally programmed addiction. And even the computer programmers working at these companies who were aware of what was going on, they reported that they couldn’t break their own addiction to their own feed.

And then the other important point that is made through it is that social media and fake news have created the widest polarization of political ideologies that we’ve ever seen. And in this environment, it’s easy to see how this has encouraged the growth of conspiracy theories like QAnon. Because if you’re in an echo chamber and this is all you see, and you’re in a network of people who also believe the same things, it can be very easy to just get sucked into that.

And so, I think the bottom line of this documentary would be to emphasize the need for regulation, that basically the companies who are designing algorithms to intentionally addict people and to encourage these echo chambers, they won’t change without regulation because there’s the profit motive. And if some boundaries and regulations aren’t put into place, it’s very unlikely that the situation would change.

Dr. Bjorn Mercer: I completely agree. And “The Social Dilemma” was a great documentary. There’s a certain aspect that was a little dramatic, kind of the scripted portion of it, but that had a very specific purpose, which was good. And what really stood out to me was, just like you said, that the people who created the algorithms all had, number one, good intentions and just wanted to make a great product, and they created something that really is addictive.

One of the more difficult things is all the big tech companies and especially social media companies really do need regulation. But, governments are very bad at regulating technology that is new or on the cutting edge because they don’t get it.

Unfortunately, a lot of our elected officials just don’t understand what the 25-year-old programmers are doing, or what teenagers are consuming, because they’re just not in that demographic. How can the government regulate these tech companies while still encouraging innovation, but really trying to help with mental health and addiction?

Dr. Karen Hand: That is a really difficult question. And I agree with you very much that the government always lags behind in regulation with new technologies because of the fact that they are new. For example, another topic that I’ve been reading about is autonomous cars; that’s another area where regulation is lagging. Before the technology can actually be implemented and put on the road, laws need to be put in place.

[Podcast: Autonomous Cars Will Change Travel Forever]

With the internet unfortunately, it went the other way around. The internet emerged first without laws to protect people in these situations. So I don’t really have a specific idea on what that regulation should look like, but I could just say that I agree with you that it needs to have balance. It needs to balance the need for protections and the need to allow innovation.

And for example, there has been some internet-related legislation passed in the past, particularly to try to protect children on the internet. And so, I think that would be an example where they applied some regulation, but it was within reason. So whatever changes need to be made, they’re going to have to find a balanced compromise that protects the companies and their ability to innovate but also protects the consumers.

Dr. Bjorn Mercer: And that’s perfect. I just think about when Congress was interviewing the CEOs of Facebook and Google and Twitter and the like. When you watch those hearings, which are on C-SPAN and which most people typically find boring, they’re very telling, because politicians will usually ask questions that allow them to get their little sound bite in: “Okay, I’m going to ask this question. It’s really grandstanding, but it makes me look good to my supporters.” And so when you watch those, so many of the questions are just bad questions. And it really makes me think that the politicians have very poor information literacy, scientific literacy, computer literacy, or even emotional health literacy. You have this opportunity to talk to these billionaires, but as a government official, you should not care how much somebody is worth.

And the one I always think about, and I apologize I can’t remember who it was, but one representative, talking to one of these powerful people, asked, “Why is my campaign email going into the spam folder?” Are you trying to make a point because you think the tech companies are filtering out Republican emails, or are you just saying, “I need my people to get my email so I can get more donations and so I can get reelected”? In that line of questioning, it didn’t seem like he was saying, “Oh, well, we need more balance in everything.” It was just more of, “What about me?”, which is very disappointing coming from an elected official who is supposed to serve the people. And that brings me to the last question: What strategies can people use to verify the credibility of information that they find online?

Dr. Karen Hand: Well, like we’ve mentioned before about the importance of better science literacy, information literacy is also of the utmost importance. Learning to assess the relevance, the accuracy, the bias, the reliability, and the credibility of the content you read is really one of the most important skills of the 21st century. And this is something that should be taught throughout school and in all courses at all levels of education.

So, when evaluating information online, first, don’t take your information from a meme you see on social media. If you see a claim made in a meme, that requires further research. You need to go to primary sources to verify what you see. But if you’re reading an article, for example, and you want to evaluate its credibility, look at things such as, who is the author? What is their background? What is their level of expertise? What website is that article posted on? And then investigate that website. What is the purpose of that website?

If you see an alarming article about vaccine side effects, and then you look at the website hosting it, go to the “About” page, and read and discover that it’s an anti-vaccine website, well, there you go. You know what the bias and the purpose are.

So another question would be: Does this website have a particular bias in one direction or another? There are bias-checking websites you can go to that can help you determine that. AllSides.com is one. They have a media bias chart.

And then also you can look up any website that’s in their database and they will classify it as left or leans left, center, leans right, right, or maybe mixed. So you can see what others have determined the bias to be.

And you can also check fact-checking sites. Many people are familiar with Snopes. That was one of the first fact-checking sites to come along. There’s also PolitiFact, and there’s ProPublica. There are numerous fact-checking sites. And the good thing about those is that, in addition to determining whether a certain assertion is true, false, or mixed (maybe mostly true, mostly false), they will give the explanation as to why. They will give all the information on that.

Another question you can ask: Does the article that you’re reading present only one side of the issue? Or are multiple perspectives provided in that article? And again, if it only presents one side, then it’s up to you to go to Google and search for information on the other side and compare the two. Actively seek out articles from both sides and both points of view on any topic.

And then, with the science articles in particular, if you read a news article about a scientific discovery that was made or the results of a trial, and it gives you the name of the actual journal article that it’s reporting on, go to that journal. Many of these science journal articles are openly available. You can go and pull up the actual journal and read the peer-reviewed scientific journal article. And if you’re not used to reading those kinds of articles, it might take a little while to get used to the format they’re presented in. But anybody can do it. You will get more familiar over time, and then you’ll learn how to scan the article and verify what it says.

And another example of verifying things goes back to the meme. If you see a meme that claims certain statistics, and these are statistics that you could go find in a public database, then you need to go to the actual government website where the data lives, like the CDC website or whatever site fits the topic, pull up the actual figures, examine them for yourself, and then decide whether what you saw presented was accurate or not.

There’s so much data that’s publicly available. It’s just very important that people get used to going and looking firsthand at the data for themselves. So, I guess my bottom line is don’t be afraid to read widely, investigate deeply. And don’t ever let anyone tell you not to read certain points of view or not to read from certain sites because that is censorship.
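
As a small illustration of the advice above about checking a meme’s statistics against the primary source, here is a hypothetical Python sketch. The file name, column names, claimed figure, and tolerance are all made-up assumptions for the example; the point is simply to compare a number you saw in a meme with the number you pulled from the published data yourself.

```python
# Illustrative only: compare a figure claimed in a meme with a figure from
# data you downloaded yourself from a public source (all values invented).
import csv

def load_weekly_counts(path: str) -> dict:
    """Read a downloaded CSV of weekly counts into a {week: count} dictionary."""
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["week"]] = int(row["count"])
    return counts

def check_claim(claimed: int, actual: int, tolerance: float = 0.05) -> str:
    """Label a claimed figure against the published figure, within a tolerance."""
    if actual == 0:
        return "cannot evaluate (no published figure found)"
    error = abs(claimed - actual) / actual
    return "consistent with the data" if error <= tolerance else "not supported by the data"

if __name__ == "__main__":
    # Stand-in for a file you downloaded from a public source (invented numbers).
    with open("downloaded_public_data.csv", "w", newline="") as f:
        f.write("week,count\n2021-W11,9500\n2021-W12,12400\n")

    counts = load_weekly_counts("downloaded_public_data.csv")
    # Suppose a meme claims "10,000 cases in week 2021-W12".
    print(check_claim(claimed=10_000, actual=counts.get("2021-W12", 0)))
```

The labels here are deliberately blunt; the real work, as Dr. Hand says, is going to the primary source and looking at the published figures for yourself.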

Dr. Bjorn Mercer: Exactly. And what you said right there at the end is absolutely wonderful, because people will try to control what you believe and think by telling you not to experience or not to be informed about other issues. The best thing you can ever do is be informed about all issues, be informed about all sides, and then make your own decision.

And I love that you said read peer-reviewed journals. Read journals; if you’re not used to it, try it out and talk to people who can help you out. And the more you do it, the easier it gets. And so, that’s really the most important thing: just really expose yourself to as much as you can.

And then, I would also add: if your expertise isn’t in this or that, find a friend or a family member whom you trust and who is level-headed, and have them help you get through some information and some data, versus, like you said, just relying on memes or Twitter posts to guide what you think. I would recommend never relying on Twitter to guide what you think, because Twitter isn’t out there to really help you. Twitter doesn’t have your best interests at heart.

So, absolutely wonderful and really great things to say. Karen, it’s been really great talking today. And today we’re speaking with Dr. Karen Hand about information, misinformation, and disinformation. And of course, my name is Dr. Bjorn Mercer. And thank you for listening.

Dr. Bjorn Mercer is a Program Director at American Public University. He holds a bachelor’s degree in music from Missouri State University, a master’s and doctorate in music from the University of Arizona, and an M.B.A. from the University of Phoenix. Dr. Mercer also writes children’s music in his spare time.
