What if the very things we want big social platforms to do to stop the spread of extremism, racism, propaganda, and lies are actually causing more problems than they solve?
Many want YouTube, Facebook, or Twitter to manage and moderate what people say. Others, who value free speech more highly than the social harms its exercise might cause, say that muzzling people is unethical and, in some countries, illegal. The big social platforms are in the crosshairs either way, and that’s one of the reasons Elon Musk is buying Twitter.
Support TechFirst: Become a $SMRT stakeholder
But where do people go when they get kicked off the big platforms for spreading covid misinformation or saying unacceptable things? According to some, they go places that just make things worse.
In this episode of TechFirst, we chat with Daryl Davis, a black musician who has convinced 200 KKK members to stop being racist, and Bill Ottman, the CEO of alternative social network Minds. Check out my post on Forbes, or keep scrolling for full video, audio, and a transcript.
Our questions:
- why are we so divided?
- aren’t we seeing more racism lately? why?
- should any speech be censored?
- should social platforms open source their algorithms?
- what’s the solution to the division and anger we’re experiencing?
- how do we bridge our reality bubbles?
(Subscribe to my YouTube channel)
Subscribe to the podcast: does social media censorship cause extremism?
Transcript: Daryl Davis & Bill Ottman on extremism, racism, reality bubbles, and social media platforms like YouTube, Facebook, and Twitter
(This transcript has been lightly edited for length and clarity.)
John Koetsier: Does social media censorship cause extremism? According to some of the leaders of alternative social networks, yes, it does. It’s a very interesting time right now, to put it lightly. Elon wants to buy Twitter to protect free speech. Many are up in arms about that. And there’s a huge debate about what social networks should allow or not allow, and what role they should play in what we say, what we amplify, and how we do public discourse.
To chat about that, we have Bill Ottman, who’s the CEO of Minds.com, and a special guest, Daryl Davis, a blues musician and black man who has convinced more than 200 KKK members to change their views on racism.
Welcome to TechFirst, gentlemen.
Daryl Davis: Thank you.
Bill Ottman: Thank you.
John Koetsier: We seem to increasingly live in different worlds with different ideas, different facts, different perceptions of everything. Why is that? Maybe, Daryl, we’ll start with you.
Daryl Davis: Well, you know, things are always changing, always evolving. That’s how the world operates. Just like it spins around, it goes around the sun, you know, things are always changing, always revolving. And human beings are creatures of habit … not creatures of change. So, it tends to mess with their psyche, and they get involved in wanting to keep things the same, not wanting anything radical to change, etc. And when you mess with that, they tend to act out. And that’s what we’re seeing.
John Koetsier: Bill, your thoughts?
Bill Ottman: Yeah. I echo that. I think it’s very easy for people to get stuck in their comfort zones. And, you know, change can be painful, but I think that what we learn is that a little bit of pain is actually healthy, you know… or discomfort. I mean, you even…
Daryl Davis: Discomfort.
Bill Ottman: I think discomfort more so than pain. It’s, you know, whether you’re talking about dialogue or you’re even talking about like physical health. I mean, sometimes you gotta work out if you wanna, you know, get in shape, you gotta… So, I think the pattern sort of transcends in a couple areas.
John Koetsier: Bill, let’s turn to you. You wrote a paper about social networks, about free speech, and you are essentially saying, with some co-authors, that when you censor ideas in social media platforms — let’s say, Facebook, let’s say, Twitter — you actually cause extremism. Talk about that.
Bill Ottman: Yeah. I mean, that’s what the data shows. And you know, to be fair, it’s not that that’s 100% true all of the time. You know, there are cases I think where certain unlawful ideas do need to get censored, and that’s just how it is. I mean, that’s the world we live in, so…
But what do you expect? If you throw someone off a website, where do they go? Well, you just have to follow them, and you see that they go to other smaller forums with less diversity of ideas, and their ideas get reinforced and they compound.
Now, on Minds, we do have pretty strong diversity of thought. And so we are an alternative forum where people do go sometimes when they get banned. But I wouldn’t say their views are necessarily amplified when they come, because we do have diversity of opinion, though that can happen. The point being that both options are on the table. There are alternative forums they can go to where their ideas will become amplified because they don’t have exposure to alternative ideas, or they could go somewhere like Minds, where there is diversity of thought and they can get exposure to ideas contrary to what they think, which can fuel the evolution of their ideas.
John Koetsier: So, I want to hear Daryl’s story in a moment, but just a quick follow-up question for you, Bill, you said that’s what the data shows. What data did you dig up, look up, survey, study?
Bill Ottman: Many. I mean, probably over a hundred different studies. So, you know, really, we’re looking at isolation and the correlation between isolation and radicalization. Dr. Nafees Hamid put out a very fascinating study that we included, really talking about the difference between online and offline radicalization. And he found that actually only 20% of online radicals — people who have sort of adopted some extreme ideology online — actually escalate into violent extremism, whereas the majority of radicalization that does escalate into actual violent extremism is happening offline.
So, I think that the fascinating part of this data is that, you know, yes, we don’t like to see really ugly ideas online. However, most of the time, those people and ideas are not escalating into violent extremism.
So, you can actually invert the conversation and say, “Oh, it’s actually good that that person is expressing themself, online, as opposed to going underground and going offline where things can actually have a higher likelihood of escalating into violence.” So, it’s really about inverting the lens with which we’re seeing things that we disagree with and kind of questioning ourselves like, “How am I reacting to seeing something that hurts me or offends me? And is there something more productive I can do? Is there some sort of positive intervention or engagement that I can take part in?”
John Koetsier: This is a fascinating conversation, just the way you framed that because, you know, I probably have a lot of views that you might disagree with, right? Whether that be around COVID, whether that might be around other things, who knows? I don’t even know what you think.
But I just think of my interactions with friends, sometimes relatives on Facebook, on Twitter, and the fact that somebody thinks X is true, or Y is caused by… I’ll say ‘Zee’ for our Americans, but it’s ‘Zed’ for Canadians. You know, that’s just so offensive. How could somebody think that? It offends my entire worldview. You know, there’s no factual foundation for that. And how we react there is quite interesting. And how you frame that, I’ll have to do some thinking about that. We’ll get deeper into that.
But, Daryl, I want to bring you in here, and I wish that you would tell your story briefly because it is a fascinating story of a man who has convinced KKK racists to stop hating black people. How did that happen?
Daryl Davis: Sure. Well, first, let me just address a couple things that you said and that Bill said.
I don’t think kicking people off of Twitter or Facebook, whatever, causes extremism. I think what it does is it causes them to perhaps follow a path that may lead to extremism. The extremism already exists, and they’re on different platforms and different areas. And, you know, when you get kicked off of something, you go somewhere else.
And it’s quite possible that you might go in that direction, to somewhere where it already exists, and it embraces you and welcomes you and amplifies you. So, in that regard, and you were saying how you hear all these things and you are offended: I’m of the mindset that I cannot offend you. You can only allow yourself to be offended.
John Koetsier: I agree [laughing]. But it’s hard.
Daryl Davis: Yeah, I know. I know. I’ve been there, man. Trust me, I’ve been there. But, you know, people say a lot of offensive things. And whether I want to be offended by it or not is up to me. You know, I can’t be offended by somebody else’s ignorance. That’s their issue, not mine. But, yes, they are offensive. Okay.
So, my story is I’m a 64-year-old, obviously, a black guy. But I’ve done a lot of traveling around the world and seen a lot of different races, a lot — well, you know, there’s only one race, human race, but a lot of different skin colors, I’ll say — religions, cultures, ideologies, belief systems, etc., and all of that has shaped me.
And the most racism that I have experienced, and I’ve been to 61 countries on 6 continents and all 50 states, is right here in my own country. And so it bothers me.
And as a child, I had rocks and bottles and soda pop cans thrown at me at the age of 10 while marching in a Cub Scout parade, in which I was the only black scout in an otherwise all-white parade. I didn’t understand it. It had never happened before. I’d been around the world, and nobody had ever treated me like that before. I had no idea what was going on. I was so naive. There were lots of white people there, but I thought the very few white people who were doing this perhaps didn’t like the scouts, until I realized I was the only scout getting hit. That’s how naive I was.
It’s like a former Neo-Nazi, who’s one of my best friends today. He had met somebody from Cameroon recently, who had come here to this country, and he was talking with him, and the guy from Cameroon said, “I didn’t realize I was black til I got here.” [Laughter]. Because that racism did not exist there.
People are just people. But here, you know, you are defined by what you look like.
So, it was the same thing with me. I was that naive. And I formed a question in my mind, which was, how can you hate me when you don’t even know me? And for the next 54 years from the age of 10, I’ve been looking for the answer to that question.
So, I’ve sought out white supremacists, white nationalists, white separatists, whatever you want to call them — you know, a rose by any other name is still a rose, as far as I’m concerned — and sat them down and just tried to figure out: where are they coming from? Why do they believe this way?
And that’s why I said, “I have heard everything. And it’s up to me whether I want to be offended by their offense or not.”
So, that’s how I came to do that. Many people have sat down and talked with me, some refuse to do so, some want to fight me, and we’ve unfortunately had to engage in violence. But, fortunately, those instances have been few and far between. So, I’ve learned a lot. But what I’ve learned mostly is this: no matter how far I go from this country, whether it’s next door to Canada, or to Mexico, or to the other side of the globe, and no matter how different the people I encounter may be, they may not look like me, speak my language, or worship as I do, but we all are human beings. That’s what I figured out.
And as human beings, we all want these same five principles, five core values in our lives.
We all want to be loved. We want to be respected. We want to be heard. We want to be treated fairly. And we all want the same things for our family as anybody else wants for theirs.
And if we can learn to apply those five core values when we find ourselves in an adversarial situation or in a culture or society in which we’re unfamiliar — it doesn’t have to be about race, it can be about any other hot topic, abortion, nuclear weapons, global warming, the war between Russia and Ukraine, the last presidential election, whatever it is, you’re on one side, I’m on the other side — if we can apply those five core values when we’re having a conversation with an adversary, our navigation will be much more positive and much more smooth.
And that’s how I have become the impetus for people in the KKK and other white supremacists to rethink that ideology and ultimately leave that ideology behind, even renounce it. And some of them even come out with me and speak out against it and try to de-radicalize people who are in their former group and try to prevent others from joining.
I don’t like to say that I converted anybody. Yes, over 200 have left. I’ve been the impetus for them to convert themselves.
John Koetsier: Fascinating, and huge respect. Can you say those five items one more time?
Daryl Davis: Absolutely. Everyone wants to be loved, we all want to be respected, we all want to be heard, we want to be treated fairly, and we all basically want the same things for our family as anybody else wants for theirs. And when we have that conversation, we find that out. A missed opportunity for dialogue is a missed opportunity for conflict resolution. Okay? You know, we start at opposite ends of the spectrum.
If you spend five minutes with your worst enemy, you’ll find something in common. And that chasm, that gap begins to narrow. Spend another five minutes, you find more in common and it closes in more.
When you get here, you are in a relationship with that person. You know, you may not be best friends, but you’re having a relationship with your enemy. You spent even more time, it gets even closer, and now you’re in a friendship. You’ve found so many things in common that the trivial things you have in contrast, such as skin color, or whether you go to a mosque, a church, a temple, or a synagogue, begin to matter less and less. So, you know, I always said when two enemies are talking, they’re not fighting. They’re talking. It’s when the conversation ceases that the ground becomes fertile for violence.
John Koetsier: There’s a lot to take in there. And I’m sure we’re gonna get into a number of those things as we continue the conversation.
Just before I go back to Bill here, there’s a perception that we are seeing more racism lately. I should speak clearly. There’s a perception that there is more racism lately, whether it’s Asians and COVID, whether it’s BLM and people who are against BLM, or whatever. What is your perception here?
Daryl Davis: Okay. Racism has increased incrementally. It’s always been here. I think you said it right the first time. Not so much that there is so much more racism, but we are seeing it manifest more. It’s always been here.
But in the last five years, we’ve seen it come out from under the rock, out from under the carpet, out of the closet kind of thing. It’s always been here, but now it’s beginning to manifest itself some more. And there are reasons for that, you know, that we can get into if you want. But, you know, in terms of BLM, in terms of other things like that, yes, a lot of that has contributed to it, and it’s the perception.
You know, perception is one’s reality, whether it’s real or not, one’s perception is one’s reality. And you cannot change somebody’s reality. They only know what they know. And if you want someone’s reality to change, what you have to do is offer them a better perception. And if they resonate with your perception, then they will change their own reality because their perception becomes their reality.
And that’s what I’ve been able to do in terms of when I sit down with these people who hate me, at opposite ends. I offer them a better perception rather than attack their reality and tell them how wrong they are.
John Koetsier: Amazing. Amazing. Bill, let’s turn back to you. We’re basically talking about Facebook and Twitter here. I think they’re the dominant social networks in, let’s say, North America, but we could also say Europe, we could say significant portions of the rest of the world as well. What are the core problems there, in your opinion?
Bill Ottman: Yeah, I think that we’re seeing some similar ideas that we’ve been discussing come to the surface with this recent episode between Elon and Twitter.
And, you know, he has been talking about how he thinks there needs to be less bias, more transparency. And where I really agree with him is on open-source algorithms, because when we’re scrolling through our feeds right now, the way it’s structured on most big tech apps is that we don’t know why we’re seeing what we’re seeing. We don’t know why our posts may or may not be surfaced to our followers.
And so, you know, there’s the transparency question when a post comes across my feed: why am I seeing this? What is the algorithm doing? We’re not saying algorithms are bad. That’s kind of like saying math is bad. But we do want to know, “Okay, is my post being demoted for some particular reason?” Or, “Is this post that I’m reading being fed to me for some particular reason, whether it’s because, you know, they tracked my location or they’re targeting me for some reason?”
I’ll bring up one really interesting piece that came out in The Washington Post after we published the paper, but this is research that I’m very fascinated by. It’s about the idea of what’s called Algospeak. So, essentially, what’s starting to happen is that because certain words are censored and you get kicked off if you use certain words on social media, groups all across the political spectrum, LGBTQ groups, political groups, you know, all different kinds of groups, are encoding new language to get around the algorithm.
So, if you can’t say, for instance, the word sex, then they now have some kind of coded word, which would be like hex, and they’ll just use the word hex to talk about that. And this is happening in extreme groups and in normal groups. The fascinating thing about the Post article — and this was covered by Taylor Lorenz, I highly recommend you check it out — was that she was showing how it was affecting many progressive groups, and they were actually getting censored on TikTok and Instagram.
You know, because typically, in the last five years, free speech has largely been politicized, unfortunately, and it’s seemed to be more of a conservative issue. But really, it’s not. It is a cross-spectrum phenomenon that people on all sides of the aisle have a problem with… not everybody, but, you know, there are realms of the left and right which care strongly about free speech. This is a fundamental freedom that traditionally has not been a political issue. So, the evolution of language in response to these restrictions is really fascinating to me. And it really just proves that you cannot ban words. And if you do, all that you’re doing is causing those communities to change their strategy.
So, it’s just another example of the whack-a-mole. So maybe they’re not getting banned, but they’re being banned from using certain words and so that causes them to morph and just do the same thing in another way.
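To make that whack-a-mole point concrete, here’s a minimal sketch of why a naive keyword filter invites Algospeak. The blocklist and function are purely hypothetical, not any platform’s actual moderation code; the substitute word “hex” is the one Bill mentions.

```python
# A toy blocklist filter and the trivial substitution that defeats it.
# Hypothetical example only -- not any platform's real moderation logic.

BANNED = {"sex"}  # naive keyword blocklist

def is_blocked(post: str) -> bool:
    """Flag a post if any banned word appears as a standalone token."""
    return any(word in BANNED for word in post.lower().split())

original = "let's talk about sex education"
encoded = original.replace("sex", "hex")  # the coded substitute Bill describes

print(is_blocked(original))  # True  -- the post gets suppressed
print(is_blocked(encoded))   # False -- same meaning, filter defeated
```

Word-level bans really are that easy to route around, which is why, as Bill says, communities simply shift their vocabulary rather than their behavior.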
John Koetsier: So, let… Go ahead, Daryl.
Daryl Davis: I was going to say, like Bill gave the example of making another word for sex if you can’t say the word sex, you know, there are a lot of dog whistles out there that cause people to do certain things. And we’ve seen that in the last four or five years being politicized.
For example, the phrase “Take Our Country Back” is one of those key phrases that means something totally different than what you might expect. It rings very clear with some people what it means. “Make America Great Again” means something totally different than “Make America Great.” If you and I, or most people who run for president would say, “I’m gonna make America great,” or “I’m gonna make America greater than it’s ever been.” Very few would ever say, “I’m gonna make America great again.” That rings something totally different. And a lot of people know what he’s talking about when he says that.
“Take Our Country Back” was originally a Ku Klux Klan slogan from 1954, when Brown v. Board of Education desegregated schools.
You can find footage of Klan leaders standing with a burning cross behind them in front of a microphone or megaphone saying, “We’re gonna take our country back. I’m not letting my little white boys and girls go to school with little niggers. We’re gonna take our country back.” What did he mean? Back to segregation. Okay? And now, that was your slogan. Alright? Very well known.
Then 2009, a new political party was born, and it was called the Tea Party. And that was the Tea Party slogan, “Take Our Country Back,” “Take Back Our Country.” And they had a rally on the Capitol steps. You know, I only live 30 minutes from the Capitol. I’m right across the D.C. line in Maryland.
And I went down there, and I interviewed some of those people, and I said, “Why are you using a Ku Klux Klan slogan?” “Oh no, no, no, no, no, we don’t mean it like that.” “Well, what do you mean you? You know, it’s a Klan slogan, right?” “Yeah, yeah, yeah, but that was way back. You know, we mean something different.”
I said, “What do you mean? You’re leaving it open-ended. You’re not saying take our country back from what, or take our country back to what, you know? You’re leaving it open-ended.” “Well, what we mean, sir, is we’re gonna take our country back from the Democrats. We’re gonna take our country back to Republican rule.” “Okay. That’s fine. Say that. Close it off. That way, there’s no room for misinterpretation.” “Well, sir, that’s what we mean.” I said, “Okay. Well, I have one problem with that.
The one problem is Jimmy Carter was a Democrat, Bill Clinton was a Democrat, where was the Tea Party then? Where was ‘Take Our Country Back’ at that time? Now, there’s a black man in the White House, and you all are screaming, ‘We’re gonna take our country back.’ What are somebody like me supposed to think?” “Well, sir, that’s what we mean. We don’t like Obama’s tax reform.” Well, you know, the original Tea Party from the Revolutionary War, the Boston Tea Party was all about taxes. That’s why they dumped the tea in the ocean, I mean, into the harbor.
So, they didn’t like Obama’s tax reform stuff. Alright, fine. But why use a Ku Klux Klan slogan? Because it rallied a lot of people for another alternative reason.
When Donald Trump came out in front of the White House and gave his little one-and-a-half-minute speech before everybody marched down to the Capitol, you can go find it. What did he say? He said, “You will never take back our country with weakness. You must show strength.” There was that phrase, again, “Take Back Our Country.”
And what do we see? We saw strength in the Capitol rotunda. We saw people marching around with a Confederate battle flag, a Camp Auschwitz T-shirt. You don’t have to ask them what they want; you know what they want. They want to take our country back. And so that’s the dog whistle people use to get around saying it outright. I’ve been doing this for 40 years.
I have seen Neo-Nazis and the Klan march down the street with a megaphone saying, “6 million Jews was not enough. Gas the Jews. Send the niggers back to Africa.”
Okay. I have heard that, I’ve seen it, I’ve even got footage of it. I’ve even got boat tickets from the Ku Klux Klan that they printed up saying, “Here’s your free ticket back to Africa,” you know, that kind of thing on a boat. So, I can show you this stuff.
Now, granted, that’s freedom of speech.
But you can’t say things like that and get away with it, so, you know, especially online now. Back then, there was no online. We’re talking 1980s and stuff, alright, and 1990s. But now, online you can’t say that stuff without getting kicked off. Now, should some words be banned? Well, some words are banned. Okay? And for legitimate reasons. For example, the Supreme Court ruled you cannot shout “fire!” in a crowded theater. And there’s a good reason for it, unless there is a fire, because you will create a panic and cause people to get trampled for no reason if there’s no fire. So, they banned use of that word. Then later on, they banned the use of the word “hijack” or “gun” in the airport. You start playing around with those words, you will get arrested unless somebody has a gun or somebody’s doing a hijack. You know, you just say it to cause a panic, you will go to jail because you’re gonna cause a riot.
Well, when the Klan marches through a black neighborhood shouting, “Ship the niggers back to Africa,” that also causes a riot, you know? So, where do you draw the line? Or when the Nazis were gonna march through Skokie, Illinois, back in the day, and talking about, “You know, 6 million Jews is not enough,” that can cause a riot.
John Koetsier: Mm-hmm. Mm-hmm.
Bill Ottman: Yeah. And I think that what we find is that context matters, and that’s what the Supreme Court rulings on “fire!” in a theater and different threats in the airport are about. You know, understanding the context of the use of the word is really essential.
And I think that that’s where the big social networks are getting it wrong. They’re not paying attention to context nearly enough. It’s much more of just a big blanket ban, and what that does is cause people to be confused, and it empowers the words too much, because there are all different types of contexts. I mean, look at Daryl right now: he was using the words in a contextual way to explain what was happening. I mean, right now, Daryl could potentially be banned from Facebook or Twitter if he said that.
So, clearly, something is wrong when we’re not allowing ourselves to understand the intent behind it.
Daryl Davis: Right. And can the algorithm detect the intent, or just detect the word?
John Koetsier: Yeah, that’s a question I want to dig into. And, Bill, I want to get your opinion on that because you’re a tech guy. You’re a CEO of an alternative social network. And we tend to have this perception, and I say the greater we, the world, people, and we think these big tech companies — the Facebooks, the Twitters, the Reddits — they have AI, it’s so sophisticated, machine learning, it’s incredible. It’s amazing what they can do, it’s almost scary, facial recognition, other stuff like that.
And then we see what they are doing in terms of content moderation. And I have multiple friends who I know are not racist or violent, or whatever else, and sometimes they’re just talking about dinner or something like that, and they got put in Facebook jail.
I don’t know about you guys, but I’ve seen that comment more in the past six months than I ever saw it before, you know, “I got put in 30-day Facebook jail and here’s the screenshot why.” They come back and they tell you, and you’re going like, “Hmm.” And so you’re wondering what happened there.
Bill, you said that the algorithm should be open-sourced, and there was actually a report today or a story today that that is impossible because the amount of data that goes into whatever gets seen is huge, the convolutions are complex, all that stuff. Is it actually possible to quote/unquote “open-source the algorithm,” which Elon has talked about for Twitter, but also for Facebook, for LinkedIn, for other networks like that, and let people know this is why you’re seeing what you’re seeing. And also, on their own things, this is how many people saw it or why it wasn’t shown, or other things like that.
Is that even possible?
Bill Ottman: Oh, yes, absolutely. It’s possible. I would love to see the source of who was saying that it’s not. I think that maybe they’re conflating the data with the code? You know, the code can certainly be completely open. Now, the data that is sort of fueling the machine learning, I don’t think that anyone is necessarily suggesting that all of the data be open, because there’s a lot of personal data in there. But, yeah, let’s follow up. I would love to see who was saying that that’s not possible.
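For readers wondering what “open-sourcing the algorithm” without open-sourcing the data could look like, here’s a minimal, purely illustrative sketch. It is not Minds’, Twitter’s, or anyone’s actual ranking code; the fields and weights are invented for the example.

```python
# Illustrative only: the ranking *code* below could be published and audited,
# while the *inputs* (each user's private posts and engagement history) stay
# on the platform's servers. Fields and weights are invented for this sketch.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author_followed: bool   # does the viewer follow the author?
    age_hours: float        # how old the post is
    engagement: float       # normalized likes/replies, 0..1

def rank_feed(posts: list[Post]) -> list[Post]:
    """Transparent ranking rule: newer, more engaged, followed-author posts first."""
    def score(p: Post) -> float:
        recency = 1.0 / (1.0 + p.age_hours)
        follow_bonus = 0.5 if p.author_followed else 0.0
        return 0.4 * recency + 0.4 * p.engagement + follow_bonus
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("a", author_followed=True,  age_hours=2.0, engagement=0.3),
        Post("b", author_followed=False, age_hours=0.5, engagement=0.9),
    ])
    print([p.post_id for p in feed])
```

Publishing logic like this would let users see why a post was promoted or demoted without exposing anyone’s personal data, which is the code-versus-data distinction Bill is drawing.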
John Koetsier: Okay.
Bill Ottman: And then, otherwise, in terms of the context, like, right now it just sort of seems like they’re being lazy and they’re overwhelmed. And I think a lot of this is their own, you know, you sleep in the bed that you make. They created these terms of service which are so restrictive, and one day to the next, you don’t know what eggshell you have to watch out for.
And so, contrast that with what free speech advocates are pushing for: aligning content policies with more of a common carrier approach, more in line with the First Amendment. Supreme Court Justice Clarence Thomas is advocating this, as are many others, and it’s totally reasonable. Other major communication companies are subject to common carrier provisions where they can’t ban people based on their beliefs. You can still use the phone, you know, no matter what your beliefs are.
And, yeah, so, I think that they have to sleep in the bed that they make. But that being said, I think we all know from experience scrolling through YouTube or whatever, sometimes the recommendations are amazing and incredibly valuable. And you’re like, “Oh, you know, look at that Jimi Hendrix video that I’m getting fed. I mean, that’s exactly what I wanted to watch right now.”
So, there’s really smart people working on this stuff over there. This isn’t a jab at any of them. But I think that what the algorithms are doing is enforcing content policy that is not well thought through. And our challenge to these big networks is please show us the research that you have conducted which is justifying your content policy.
So, where is the peer-reviewed evidence that your de-platforming is actually resulting in a healthier internet and in de-radicalization?
Well, they’re not going to be able to find anything because their rate of de-radicalization is essentially non-existent, because they just throw the people off the platform and don’t engage at all. So how are they going to de-radicalize anybody?
John Koetsier: That is a good question. And I want to turn to Daryl in a moment and just ask what the solution is going to be. But what’s interesting about what you’re just saying there, Bill, is that you get the scenario when stuff is being banned — whether it’s for what some would consider good reason, what others would consider to be bad reason — well, you get the scenario that people say, “This will be banned soon, watch it now,” or you get the scenario, “This was banned, so it must be true because they’re in a certain subgroup that believes a certain thing, that believes big tech doesn’t want people to know or doesn’t want people to see.”
Bill Ottman: Exactly. It reinforces that conspiracy.
John Koetsier: It reinforces the conspiracy theory. So, I want to turn to you, Daryl. You’ve had hard conversations with people who hate you at first sight for no reason. We have this world of massive platforms that have done what is good for them: “Oh, people are engaging with this content, they’re coming back, they’re liking, they’re replying, they’re having conversations. They’re on our platform more, so we’re gonna promote that more to others.” And that’s led to scenarios where very deep divisions have been driven even deeper. What is the solution for social platforms to help bring us together, or at least allow us to have conversations and maybe not prioritize the conflict so much?
Daryl Davis: Well, you know, you’re talking about social media, but we also find that in mainstream media…
John Koetsier: Yes, we do.
Daryl Davis: Fox News, CNN, MSNBC. You know, a lot of people dedicate themselves to one of those channels, and they’re being fed whatever that one channel is giving them, and that becomes the truth to them. It becomes their reality. So, it’s not just social media, you know, Facebook, Twitter, Instagram, whatever, it’s also mainstream media.
I believe the answer is providing more and better information to combat disinformation and misinformation. That’s how we resolve that because, as Bill pointed out, when you drive people away, they seek a place where they can be heard.
That’s one of those values. Everybody wants to be heard. They want to be able to express themselves. And so if they can’t do it here, they will do it there. And that may be a nefarious place, a dark place where it’s amplified and it festers and it grows, and then it explodes like Charlottesville, Virginia … like the Capitol insurrection, things like that.
So, we need to be able to allow these people onto our platform, give them a platform where they can air these views, no matter how contrary they may be, as long as they’re not advocating for harming people.
In this country, the Supreme Court also ruled that we have the right to hate. We don’t have the right to hurt. And when you cross that line, that’s where it becomes a legal issue. So, give these people a platform, give them guidelines; they’re welcome to express their views, regardless of how contrary or abhorrent those views may be. And the best way to combat that is to provide better views. But what people are afraid of is: how many people will they radicalize by airing and amplifying those views?
Yeah, there’ll be a few who will fall through the cracks and get on. But basically, what you’re saying when you censor is that America is too stupid to realize this is BS. You know, it’s like this: I remember on TV back in the ’70s and the ’80s, they did not show a lot of interracial stuff going on, even though it was going on out here in the real world. But the stations did not want to show it because they said, “You know, people are not ready for that. They’ll go off.”
Well, who are you to tell me what I’m ready for and what I’m not ready for? I had to grow up with Leave It to Beaver and The Andy Griffith Show. And these are great shows. These are great shows. But little Beaver Cleaver and his brother Wally did not have any friends that looked like me. Little Opie Taylor in Mayberry didn’t have any friends that looked like me. I had to watch Lost in Space, you know, with the robot and Space Family Robinson and Dr. Smith. They went all over the universe trying to get back to earth. They saw people with three arms, four eyes…
John Koetsier: Not a single black person…
Daryl Davis: Not a single black person in the universe, man, you know, until Star Trek came along. Star Trek came along, and you had not only a black person on there, but she was a lieutenant.
John Koetsier: Yes, Lieutenant Uhura.
Daryl Davis: Lieutenant Uhura. The first interracial kiss, not in this country, but the first interracial kiss on TV was between Captain Kirk, William Shatner, and Lieutenant Uhura, Nichelle Nichols on Star Trek. And that was in 1969. It’s been going on long before 1969. I used to watch American Bandstand.
I remember Dick Clark…it was either the 25th anniversary or the 20th anniversary, Dick Clark put it on. Alright? And this is in the 1970s. And he would bring back all the rock and roll stars from the ’50s, you know, because it’s 25 or 20-year reunion. So he would show a clip of them appearing on “American Bandstand” in black and white, singing their big hit. And then halfway through the song, that person would walk out live from behind a curtain in color and finish out the song. And Dick Clark would walk across the other side of the stage and meet them at the end of the song. If it was Brenda Lee or Connie Francis, he’d kiss them on the lips, and you’d see it. If it was Gladys Knight, or Martha Reeves, or Ronnie Spector, you know, one of the black singers, he’d still walk and kiss them on the lips. But right when he got there, the camera would pan to the audience so you would not see him kiss them on the lips. Yeah.
John Koetsier: Wow, wow.
Daryl Davis: And so it’s things like that, you know, that we are behind the times.
Technology-wise, we are the greatest nation on the face of this earth. But ideology-wise, we’re behind. You know, there are third-world countries out there that have female presidents, female prime ministers. All we do is argue over, “Can a woman be president? Can a black man be president? Can a Mormon be president?” Back in the 1960s, it was, “Can a Catholic be president?” We are so caught up in all this ideological BS that we forget we need somebody who can run the country, regardless of gender, color, religion, etc.
John Koetsier: Love it.
Bill Ottman: Yeah.
Daryl Davis: We are living in space-age times, but thinking with stone-age minds.
John Koetsier: Wow. Money quote right there. Bill, I want to ask you a question about what Daryl just said. When I asked about social media networks amplifying things, maybe things that are factually incorrect according to some people, he said, “Hey, provide information that counters the misinformation.” How do you do that? Do you provide sort of like an AllSides thing? That’s allsides.com; you see different angles on the same news stories. Do you show, you know, “other people think this” when somebody shares something that says that? How do you do that?
Bill Ottman: A lot of it has to do with, you know, control over your own experience. I mean, I think that you should not be seeing things that you’re not interested in seeing, and obviously, you know, young people…
John Koetsier: Isn’t that part of the problem, Bill, then we stay in our own reality world?
Bill Ottman: It is. But I think that, ultimately, your control over your technology is foundational.
That being said, we just rolled out a whole build-your-algorithm feature so that you can actually, on a little slider, adjust your echo chamber. So, you can say, you know, “I want to see more opposing ideas to what I’m used to.” And, you know…
John Koetsier: I hope that doesn’t go all the way down to zero, because in the real world — let’s talk real world, you know, pre-technology — you’re gonna run across the crazy speaker on the corner, you’re gonna run across somebody else who talks to you in the subway or something like that. In our virtual worlds that we’ve created, it seems like we have these pure halls of ideology that are only aligned with what we believe because of the choices that we made and the algorithm just reinforces, reinforces, reinforces that. And I hope we can’t dial that down to zero.
Bill Ottman: No, I mean, it’s pretty much impossible on any social network to completely stop contrary ideas from getting into your feed. We also have the boost feature, which anybody can use. The boost is really kind of an algorithm buster or an echo chamber buster as well. Basically, you can earn tokens and then boost your content to everybody’s feed.
And no, I mean, we’re not trying to enable echo chambers, but I think that just the concept of presenting someone with an option to see ideas that they disagree with, to see more people on the left, to see more people on the right. These are options that are educating someone while they’re looking at the options. So, on Facebook, or Twitter, or YouTube, there’s no setting that you can go to to adjust your algorithm. So, just the idea of putting it into someone’s brain, “Oh, I might want to see ideas that are different from my own,” like, that is foreign to the vast majority of people’s social media experience right now.
Right now, people go in and they’re trying to reinforce their ideas, they’re trying to subscribe to everyone in their tribe. Just the prompt of sending someone a message that says, “Would you like to increase your tolerance for opposing ideas?” That in itself is sort of a unique design that we’re bringing to the table, and it does educate people. You know, you can’t force these things down people’s throats because they’ll leave. So you need to really be conscious of easing people into it.
And, yeah … I think that Daryl’s strategy is very much not about pushing too hard, because then you can get the backfire effect, where they think that this is some kind of conversion therapy type situation. [Laughter]. And you really do have to be careful of that, so… But in terms of better information, you know, right now on Twitter, you’ll get a fact-checking message, which was fact-checked by five think tanks. And we haven’t built this yet, but it’s something that we’re planning: what if, when a post was deemed to be controversial, you could kind of click a button and see a visualization of the debate on both sides? Like you were saying with AllSides, you know, they’re doing great work.
There’s another site called Ground News, which does a very similar thing. They visualize the debate. They don’t tell you what is true and false. And those types of things, I think are very healthy.
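As a rough illustration of the “adjust your echo chamber” slider Bill describes above, here’s a minimal sketch. It is not Minds’ actual implementation; the function, parameters, and item names are invented for the example.

```python
# Toy version of a user-controlled diversity slider: blend familiar content
# with opposing-viewpoint content according to a 0..1 setting. Invented
# example only -- not Minds' real feature.

import random
from typing import List

def build_feed(familiar: List[str], opposing: List[str],
               diversity: float, size: int = 10) -> List[str]:
    """diversity = 0.0 shows only familiar items; 1.0 shows only opposing ones."""
    n_opposing = round(diversity * size)
    picks = random.sample(opposing, min(n_opposing, len(opposing)))
    picks += random.sample(familiar, min(size - n_opposing, len(familiar)))
    random.shuffle(picks)
    return picks

# Example: a user nudges the slider to 30% opposing viewpoints.
print(build_feed(
    familiar=[f"familiar-{i}" for i in range(20)],
    opposing=[f"opposing-{i}" for i in range(20)],
    diversity=0.3,
))
```

The design choice here mirrors what Bill argues: the user, not the platform, decides how much of the feed should challenge their existing views.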
John Koetsier: I want to give … we’ve got to wrap this. I want to give the last word to Daryl, and I’ll kind of … I’ll ask a question, but go where you want to go with this and say what you want to say. And we’ve talked a little bit about opposing viewpoints. The words ‘conservative’ and ‘progressive’ have been used here, ‘left’ and ‘right’ have been used here.
Personally, and right or wrong, I don’t know, I hate labels. I hate labels — that’s progressive, that’s conservative, that’s regressive, that’s left, that’s right — because I find that a lot of people start identifying with a label and they meet a new idea, and they say, “Does that label fit in this chunk of things that I agree with? Is that leftist or is that rightist?”
And then I’m against it or for it, depending on where it fits or where it lands. And they don’t think through an idea based on the idea, and the evidence for it, and the thoughts about it, and what people are saying about it, and other things like that. And so we close our minds and we don’t think. Daryl…
Daryl Davis: Exactly. And I agree with you 100%. I don’t like labels either because, you know, in this country, we try to pigeonhole somebody into a certain box. Otherwise, we are very confused. We can’t navigate it unless we know exactly what category they fit into.
And the problem is, people are multidimensional, they’re multifaceted. You know, I might like this from the conservative side, I might like that from the liberal side, I might like that from the libertarian side, or whatever. And all of that is what makes me.
You know, but if I take one thing conservative, then I’m a conservative. If I take one thing liberal, I’m a liberal. If I take one thing, I’m a Marxist, you know, whatever.
John Koetsier: And then you’re an enemy [laughing].
Daryl Davis: Listen, as you know, you mentioned, I’m a musician, I’m a professional musician. That’s how I make a living. You know what? But my degree is in jazz. I have a degree in jazz, but I’m out here playing rock and roll. I play country. I play boogie-woogie, rockabilly, swing, R&B. So, when I play a couple blues songs, all of a sudden I’m a blues artist. You know, because I have a degree in jazz, I’m a jazz artist. You know what? I am a musician. If you are payin’, I’m playin’. You just tell me what you want to hear [laughing].
John Koetsier: Love it, love it, love it. I probably put you in a box because I got a pre-email, it said blues musician. So, apologies for the box.
Daryl Davis: No, no, no, no problem. Other places call me a rock and roll musician. You know, other places call me a jazz musician, or I play swing dances, where people do the Lindy Hop and Jitterbug. I’m a swing musician. I played for Chuck Berry. I played piano for him for 32 years. So, obviously, I can play rock and roll. You know, so I’m definitely a rock and roll musician.
But I play what I have to play, when it’s called for. If it’s a variety gig, that’s what I do. If I play a blues festival, yes, I’m gonna play the blues. You know?
John Koetsier: I love it, I love it, I love it. I have a quote, which is on my Twitter as the pinned tweet, and I can’t recall exactly what it is [laughing], I invented it … but reality is multidimensional. People are multidimensional. And everything has so many different sides, not just one, not just two, not just three, not just a couple of labels. And if we could accept that about each other, that’d be a wonderful thing. I want to thank you, Daryl. I want to thank you, Bill, for a conversation on some topics that are hard, that are challenging, that are difficult. They’re not just hard in terms of technology; they’re hard in terms of our emotions, in terms of who we are, in terms of what we think is right or wrong, in terms of what we think should happen with the world and the direction of our countries and other things like that. And you’ve done a magnificent job. Thank you so much for this time.
Daryl Davis: Thank you very much.
Bill Ottman: Thank you for having us.
TechFirst is about smart matter … drones, AI, robots, and other cutting-edge tech
Made it all the way down here? Wow!
The TechFirst with John Koetsier podcast is about tech that is changing the world, including wearable tech, and innovators who are shaping the future. Guests include former Apple CEO John Sculley, the head of Facebook gaming, Amazon’s head of robotics, GitHub’s CTO, and Twitter’s chief information security officer, as well as scientists inventing smart contact lenses, startup entrepreneurs, Google executives, former Microsoft CTO Nathan Myhrvold, and much, much more.
Consider supporting TechFirst by becoming a $SMRT stakeholder, connect to my YouTube channel, and subscribe on your podcast platform of choice: