What is fake news? How can you spot it? And what can we do about it?
Welcome to TechFirst with John Koetsier. Two of the biggest topics in the news over the last few years have been Coronavirus and fake news, and they’ve been tightly linked over the past few months.
The problem is: we can’t really talk if we can’t agree on facts … or if we can’t agree there’s any such thing as “fact” or “truth” at all. And it’s hard to have national or international conversations about what’s going on when nation-states and interest groups are fueling and funding fake news all over the globe.
In this episode, I chat with Mitch Chaiet, the Media Innovation and Entrepreneurship Coordinator at the University of Texas at Austin.
Some of the topics we cover:
- Why Hollywood puts conspiracy theory fodder in movies
- Why the History Channel has “Ancient Aliens” on its list of shows
- Companies that create fake news
- Middle East oil deals
- Nations that create fake news
- Types of fake news
- Elon Musk
- Information Disorder
- Bill Gates, vaccines, 5G, and Coronavirus
- And much more …
Listen: fake news, the truth about lies
Better yet, subscribe to TechFirst with John Koetsier wherever podcasts are published:
- Apple podcasts
- Google podcasts
- Spotify podcasts
- And multiple other places … see them all on Anchor
Watch: fake news, the truth about lies
And … full transcript: Fake news, the truth about lies
John Koetsier: What is fake news? How can you spot it? And what can we do about it?
Welcome to TechFirst with John Koetsier. There have been two massive stories over the last couple of years. I’m not talking about Trump, although we may talk about him a little bit, but I’m talking about Coronavirus, obviously, in the last three months, and I’m talking about fake news.
The problem is it’s hard for us to have conversations and to solve problems if we can’t agree on basic facts, or if there’s no such thing as “fact” or “truth” at all. To chat about this, I’m going to bring in Mitch Chaiet.
Mitch is the Media Innovation and Entrepreneurship Coordinator at the University of Texas at Austin. Welcome, Mitch!
Mitch Chaiet: Hi. Nice to be here.
John Koetsier: Wonderful to have you. I’m excited about this conversation. I’m excited about what we’re going to chat about. I’m excited about what we’re going to learn about. And why don’t we start here … is fake news anything I don’t agree with?
Mitch Chaiet: Fake news is definitely not anything you don’t agree with. However, the term itself can be pretty harmful. A better term for everything that you were referring to is “computational propaganda.”
John Koetsier: Wow. Computational propaganda, that’s a $10 word there.
Mitch Chaiet: That is a huge new academic term. I wouldn’t say it’s new, actually, but the term ‘fake news’ is colloquial and it implies that there’s something that’s real, in addition to fake. So computational propaganda refers to the fact that people are using your own personal data to target you with propaganda.
John Koetsier: Interesting.
Mitch Chaiet: In the form of something that might be false, in the form of an article which is what most people classify as fake news.
John Koetsier: Yeah, and you’ve mentioned that there’s a difference between True/False type stories and Intent to Harm, correct?
Mitch Chaiet: Sure, sure, sure. So classifying this sort of information can get pretty nitty-gritty, and it’s also very hard to tell sometimes. A lot of it is based on reality, and these narratives are expertly crafted in a way where they seem plausible, which I’ll delve into with some examples in a minute. However, the most cut-and-dried way to distinguish a piece of content is, one, by its truth. So is it factually true? Would a fact-checker journalistically agree with this statement? In addition, the intent to harm comes into play. So is somebody promoting this out of fear, or are they trying to bring somebody down? And again, this could be classified between people, it could be classified between governments. There are a lot of different people spreading different things online, and these sorts of classifications go into everything.
So I have an interesting and easy-to-read infographic right here which delves into the three types of “fake news.” The first is “misinformation,” which seems to be a very hot buzzword nowadays. But what misinformation generally amounts to is false information being propagated unknowingly. Or content that’s designed to associate two different happenings that sort of correlate with each other but have no actual scientifically demonstrated connection.
So misinformation could be your friend of a friend knows a doctor who knows a guy who knows somebody else who sent you this gigantic long text thread saying, ‘This is how you protect yourself from the coronavirus,’ and if you’ve ever received a group chat, the first thing to look for in that is a source. Usually these long blocks of text are called “copypastas,” which is an internet term, but they’re very common on WhatsApp for instance. And they’ll usually at the very end tell you to like it, or share it, or spread it to 15 different people. And it’s just an asset that’s designed to spread through the internet; that’s explicitly the function that block of text serves.
Whether the health information is true, or pretty innocuous, certainly depends on the example, but WhatsApp has actually taken action against that, and they’re now limiting within their platform the number of times a single message can be forwarded. However, copy and paste gets around that.
The second example, on the other side, is “mal-information.” So that is truthful or factual information spread to harm a government or an entity. So one example of that is with Middle East oil deals, for instance. I’m being a little vague here, but when American corporations deal with state-owned organizations, for instance, and those state governments have a branch of government designed for information operations, they will actually output propaganda or narratives that help the deal go a certain way. And this is sort of at the national scale.
And I bring up the negotiations between the US, Russia, and OPEC as sort of an interesting example of that, where foreign or nationalized companies intentionally spread and harass companies that they’re doing business with in order to get a better deal in the public eye. It could also be personal: if an ex leaks a nude photo of somebody, that could very much be classified as mal-information as well.
And where those two sort of come into play is what we call “disinformation.” And that is specifically fabricated content that is false and designed to harm, and that is quite broadly what everyone refers to as “fake news.”
John Koetsier: Let me give you an example of something that came up recently in my Facebook feed and maybe you can tell me where it fits.
And so, somebody who was let’s say more right-leaning among my friends circle, shared something that was from CNN and said, ‘See fake news.’ And it was a story about a reporter who had followed up on Elon Musk’s promised contribution of masks or ventilators to a hospital and had asked the governor’s office in California, ‘Hey, what’s up with that, where are those?’ And the governor’s office had replied, ‘Actually, we haven’t seen them yet, there’s no sign of them.’ And so the story was reported as ‘Elon Musk breaks promise’ or something, I’m totally paraphrasing but ‘Elon Musk’s promised medical supplies not seen,’ or something like that, ‘haven’t arrived.’
And so Elon Musk saw that and he tweeted, and he got a doctor at a hospital who received one of those things and said, ‘Hey, we’ve got it right here.’ And so he called out fake news. And so the person who put it in my Facebook feed said, ‘See, that’s fake news, CNN is horrible.’ And so I tried to follow up and I said, ‘Well, you know, actually they got the story right in terms of getting a quote from the governor’s office, the governor’s office didn’t have it.
It turned out to be false information but I’m not sure that’s fake news.’ What’s your opinion? Would you classify that story as fake news?
Mitch Chaiet: I would not, simply because it sounds like there was information traveling at a faster rate. So obviously Elon Musk having a Twitter account is going to move a lot faster than a newsroom talking to a government PR … rep.
John Koetsier: Yes, exactly, their PR person, right? And so this reporter actually in this case, actually checked with the Tesla press representative and that rep didn’t get back to them, clearly didn’t have the information on hand or something like that, had to be checked out or something like that.
So that is a scenario where it’s a little bit challenging. Yeah, and actually a comment from a viewer here: “John, it was not corrected when I checked it a couple days after.” So you’re right about that, if you discover that it’s wrong you should issue a correction to that piece of news. That’s something that I would do on Forbes, for instance, but I did not see that at the point that I checked that article on CNN.
Excellent. Mitch, to get back to you then. Let’s talk a little bit about our reaction to fake news. How good are we at detecting it? And how can we get better?
Mitch Chaiet: Well, it’s a media literacy problem. It’s not a technology problem. We have a billion different avenues for connecting different devices. Some people might get their news from an Alexa feed, others on their mobile phone. Others might see a screenshot of a Donald Trump tweet on their local news, like I saw yesterday. And the way this information sort of spreads is very hyper-personalized.
So one of the coordinated ways that people who do spread propaganda, one of their goals is to sort of own a group or an environment. So Facebook groups targeting certain kinds of people where they can then seed sketchy WordPress links, for instance, was a very common tactic used by the Russians in 2016. They would build these communities of certain demographics, African Americans, or Jews, or pretty much anyone that they had an interest in swaying, collect them in what looked like an authentic community and then seed them with certain ideas designed to sway how they vote. So identifying this stuff on the ground is very tough. Even academics who have been working at it for years, fall for some things.
And specifically on how to react to it, it can be tough when you see somebody in your news feed fall for this information. It’s not usually a malicious thing. Somebody got duped, you know? So immediately calling somebody out when they do share something that’s demonstrably false and saying, ‘This is bad, you should delete it’ is usually not the best way to approach things. In a news feed, for instance, you might say, ‘I noticed that Snopes had an article that says this is true.’ And then asking them where they found it is a really interesting question, because that doesn’t trigger the defensiveness, you know, they shared it for a reason, and saying ‘I don’t like this’ means you’re questioning somebody’s ability to delineate what’s true.
But asking them where they found it and then describing how that source might not be the best place to receive that information when there are others that are much more, you know, healthy, we’ll call it that. I’m not going out and saying that hydroxychloroquine doesn’t cure coronavirus. I can’t say that, I have no medical training. However, I know getting that information from a YouTube channel that is not the CDC, is probably not the best place to determine that. And that’s really what I focus on when I’ve done a lot of calling out lately.
John Koetsier: That makes a ton of sense. Absolutely, absolutely. I have a comment here that I want to address as well. This is from Ann Greenberg, a viewer, she says, “The decision to make an inquiry is also biased” and that is 100% true. I’m just answering that from a media perspective.
What you pay attention to and what you draw attention to has a bias, and that’s why we speak about a ‘left-leaning’ paper or publication or news agency, or a ‘right-leaning,’ or ‘centrist,’ or other things like that, and that can hide, or can become, systemic bias over time. That’s not equivalent to fake news, I’m assuming, but it does color how you report on and how you characterize certain things.
Am I correct on that?
Mitch Chaiet: Certainly, certainly. One of my favorite visualizations to show people is from Indiana University’s Observatory on Social Media, and what they’ve done is they’ve simulated a network environment, something like 50,000 users on a social network, and what the nodes represent are people who lean either right or left. And this simulates people sharing certain things and then people unfriending each other or blocking each other, sort of in an abstract geometric way. It’s very different with actual people, but it really demonstrates how content can be designed to spread through folks and really segment them out.
And what’s concerning in regard to these communities, the term “Information Disorder” is broadly classified as an addiction to sharing information that is deemed false by fact-checkers. And that alone is quite broad, but what I’m consistently seeing is that communities that are susceptible to one form of conspiracy theory might be susceptible to false health news, and frankly it’s kind of scary. There are these massive communities that just pick up conspiracy after conspiracy after conspiracy, and now that those groups have been delineated and carved out, if you will, they’re the riverbeds and people are now channeling rivers down them.
John Koetsier: Yeah, and that’s interesting, right? I mean, and as I shared with you before we actually got on today, Facebook actually had a targetable segment of people, about 78 million people in this targetable segment. So if you’re an advertiser, you choose a targetable segment to send your ad to, and the targetable segment was ‘people who respond to fake news.’ I forget the name of the segment that just came up in the news … what’s that?
Mitch Chaiet: “Pseudoscience” was the category that you could target.
John Koetsier: Pseudoscience. Exactly, right, and so the example of an ad was, I think, a beanie cap lined with tinfoil that blocked EM radiation, right? And so pretty interesting stuff that that was targetable.
Mitch Chaiet: Yeah. So John, I’d like to talk about the History Channel when it comes down to examples like that. The History Channel up until about eight or nine years ago shared really interesting things like civil war documentaries and an entire 24-hour reel on World War II, very much in line with the History Channel. Now from a business and content standpoint, if you spend $1 million, for instance, on a single hour of World War II documentary, let’s call it, that has a significant return on investment and it already happened in the past.
So, you know, World War II is not something that’s typically disputed over the fact that it happened, how it happened, when it happened. However, if you were to take that same million dollars, which would normally get you one hour of content, and invest it in an entire season of hour-long episodes of Ancient Aliens where every single episode you come to the conclusion that aliens might exist, it’s simply more profitable. It’s the reason those six Sharknado movies were made: for the budget of one movie that has somewhat of a return, you can make six movies that are so ridiculous that people buy into that.
John Koetsier: I absolutely love that example and I have responded to History Channel on a couple of different occasions, and just, what is going on here? You know, I don’t mind the reality stuff that is like Forged in Fire or something like that, or whatever, but these Ancient Aliens and other shows like that that are just absolutely pseudoscience, what purpose do they have on History Channel? And I think they absolutely cheapen themselves by putting that there. I love that you brought that up.
Mitch Chaiet: Sure. I mean, the purpose they serve there is that they provide absolutely no context. If you’re eating popcorn and you see the History Channel, you can watch Ancient Aliens for 5 minutes, or 10, or 24 hours, and it’s the exact same thing and it’ll keep you hooked. Where this plays into fake news is actually quite interesting, because all of that content that you see in a show like Ancient Aliens is based off of reality. However, it twists reality into something that may or may not have happened, but is presented plausibly enough that it seems it actually did, and so that is actually a new sort of genre that’s been heavily used, called the “unreal.”
John Koetsier: Yeah, yeah. The interesting thing there for me, and because I’ve had a number of debates about some topics that I consider to be fake news or pseudoscience, or other things like that. It’s very challenging to have those conversations because any sources that I might provide and give as evidence get dismissed by people who do not believe in sort of mainstream reality, let’s put it that way. And any sort of sources that they might have, which might be a random YouTube video, as you mentioned, or something anecdotal that one person noticed a dead bird beside a 5G transmitter or something like that, have the force of the law of God, right? You cannot budge them from that … that seems to be more common. Do you see that as well?
Mitch Chaiet: I do, I do. So one of the things I talk about is Poe’s Law, which is actually a term from memes about whether or not something is serious. And at its most basic form, that’s when somebody takes an Onion article seriously because they don’t see the domain.
John Koetsier: Yes, especially a foreign leader.
Mitch Chaiet: Yes, exactly, which is fun. What Poe’s Law refers to is that fine line in between satire and reality that says, without a big indicator delineating whether or not it’s one or the other, it’s very hard to tell, especially in extremist environments. In between satire and reality, we now have this burgeoning form of propaganda I mentioned called “beyond real” and that’s designed to make you believe nothing is true.
John Koetsier: Yes. That’s really, really challenging because we enter a sort of a post-truth society, a post-fact society, perhaps an alternative-fact society, and having a conversation across different party lines, different ideology lines, becomes very, very challenging because I live in my reality bubble, you live in your reality bubble, and never the twain shall meet. And so having a conversation that leads somewhere is challenging.
Mitch Chaiet: Certainly. You can’t force people to come out of their cage. And a lot of the time, when somebody’s in their cage, mentally, for the analogy, and you come in and start poking through the cage at them, that’s not really the best way to do it. The better approach is very basic: train people very slowly, and very patiently, and very understandingly to look at different sources. And where I go, just to shout them out, I like AllSides. Have you ever heard of them?
John Koetsier: I have not heard of AllSides.
Mitch Chaiet: So they are a news aggregator; they organize articles into certain topics. So Donald Trump tweeted something, or somebody bombed somebody, or this thing and that thing happened, and they have a database of media bias. So does a particular outlet lean left or right? What’s their record on reporting factual information? Do they spout pseudoscience? And none of those are disqualifiers; all they do is, for a certain topic, show you five different articles from five different outlets leaning different ways.
John Koetsier: Very, very interesting. Is that allsides.com?
Mitch Chaiet: I believe it’s .com yeah.
John Koetsier: Okay.
Mitch Chaiet: It should be pretty easy.
John Koetsier: Okay. That is super interesting. And you know, one thing I want to address is because, we’ve probably seen these things on social and online as well, that there are some people who say that those who believe in what some of us may think is fake news, or some of us might think is conspiracy theories, are dumb, or are stupid or something like that.
And the reality that I’ve found is that a lot of people who I find somehow in some conversation espouse some sort of conspiracy theory, or something that I might think is fake news, are actually really intelligent people in a lot of cases. Not all cases, but in a lot of cases they’re really, really smart and they’re trying to look beneath the surface. They’re trying to find the story behind the story. And let’s be honest, there often is a story behind the story. There often are powerful interests at play in a lot of things that happen in our society.
And then somehow in some ways, my perspective is, I feel they go down the rabbit hole and every piece of information that supports their hypothesis, they accept, and every piece of information that does not support their hypothesis, they reject. So, but I find a lot of people who do believe in a conspiracy theory to be quite intelligent.
Mitch Chaiet: Sure, and that’s because it’s specifically crafted for intelligent people a lot of the time. So one of my favorite go-to examples: I studied production technology in college, for movies, for music, pretty much everything in between that goes into creating something digitally. And I’ve interviewed people who claim that Hollywood is putting Satanic symbols in movies to represent something.
And on the flip side of that, I know producers who are like, ‘Yeah, if I throw a pyramid with an eye on it somewhere in my film, all the crazies on YouTube come out and I get three times as many views,’ which is pretty interesting…
John Koetsier: Weaponizing fake news for marketing.
Mitch Chaiet: Yeah, exactly, because the storytelling of these narratives is really very interesting, and a lot of them are based on reality or things that look to be true. So just maybe in the last 24 hours, a database of emails and passwords from the WHO, the CDC, and the Bill and Melinda Gates Foundation was leaked online. It takes a very, very, very large amount of technical know-how, compared to your average grandpa, let’s just start there, staring at that stuff and hearing that that happened, to determine whether or not those passwords are even real. You know, they could have just been random emails put into a Pastebin and made to look like a leak. And who’s to say whether or not it’s true. But what’s interesting is that after that happened, there were more narratives claiming to be from that leak. And those are what’s interesting, because you’ve primed a whole group of people with the idea that, oh, I don’t know whether or not it’s true, but the WHO was hacked or something? And then, here’s a meme that says, ‘Here’s something that was found in that hack.’ You know, it’s the old ‘don’t think of an elephant’ trick. If you tell somebody ‘there’s no elephant in the room’ three times, anyone is going to start thinking, ‘oh, there might be one, somewhere behind me.’ And people are really, really vulnerable to that sort of seeding.
John Koetsier: Yeah, yeah. I can totally see that because there’s an element of truth and an element that somebody added to it, and together they form something very, very dangerous. This has been a super interesting conversation. We can’t go on forever, but I did want to ask you, because you’ve been studying fake news for quite some time right now. Who creates fake news? And you’ve answered that a little bit. You’ve answered that and sometimes corporations that are engaged in business battles with other interests, sometimes it’s nation states, other things like that, sometimes interest groups. Can you talk a little bit about what you’ve found, and what you see, and some of the motivations that you’ve seen behind creating fake news?
Mitch Chaiet: Sure, and it boils down to profitability. Quite simply, it’s very easy to say ‘Bill Gates wants to put a microchip into every single person through a 5G coronavirus vaccine,’ and then people click on that and then you get ad [revenue]. So there’s a whole slew of people, especially in Europe, that are just focused on that. They just make money by putting up fake WordPress sites and getting ad revenue. And then there are the folks that actually want to influence or promote a certain ideology, sway voters; that’s really common in white supremacist circles, for instance. There are examples of them promoting content through Roblox trying to target 12-year-olds…
John Koetsier: Wow.
Mitch Chaiet: … and get them [into] certain ideologies. So then there’s the coordinated attempt to sway and actually condone a certain action from people who are viewing the content and not just get people into a profitable rabbit hole. And that typically comes from fringe coordinated groups, QAnon, everywhere from the BJP in India, which is their major political party, but what’s most interesting is they’re both using the exact same tactics. And so for 2020, what we’re going to see is the rise of alternative social networks. I’ve seen that across the world pretty much, and I mean that in various different ways, but people bringing people off of Facebook into a telegram chat because you can’t track anything there. And they, you know, the more and more you delineate, and gab, for instance, gab.ai is sort of the golden puppy of alternative social networks where you can say anything you want with no sensor check. So those are all great platforms on their own, what they host is entirely up to them, and I’m all for that to some extent. When that content comes from one alternative social network onto the mainstream social media networks in a sort of coordinating capacity, that’s what computational propaganda refers to.
John Koetsier: MmmHmm.
Mitch Chaiet: Basically it’s anyone with an opinion.
John Koetsier: Yeah, yeah, yeah. That makes a ton of sense. Excellent. I just want to thank you for taking this time with us. It’s been fascinating, it’s been interesting. You’ve immersed yourself in this stuff. I’ve heard that that’s dangerous, that when you start looking too deeply at propaganda or conspiracy theories, you can become one yourself. Have you noticed that?
Mitch Chaiet: No. That’s really my superpower. You know, I grew up very artistic, and I learned how to critique an image or critique a piece of work without investing any of my identity in it, and that really allows me to scroll through this content with no harm.
John Koetsier: Well, thank you so much Mitch, and thank you everybody else for joining. I really appreciate it. My name is John Koetsier of course, I appreciate you being along for the ride. Thanks for those who added questions and please like, subscribe, share, comment, or all the above. If you’re on the podcast later on, would love it if you’d rate it and review it. That’d be a massive help. Thanks so much. Until next time, this is John Koetsier with TechFirst.