Should social media censor free speech?
Something fairly unprecedented happened this past week: Facebook and Twitter both blocked a NY Post story.
Now … whatever you think about that story (and I think it’s pretty flimsy), blocking it almost immediately is pretty shocking. How should social media deal with controversial subjects … or false information?
In this episode of TechFirst with John Koetsier we chat with Bill Ottman, founder and CEO of the open source social network Minds.com, about what big tech and big social should do, about algorithms, shadow banning, free speech, virality, and what Ottman is doing with Minds.com to fix it.
Scroll down for full audio, video, and a transcript of our conversation …
Subscribe to TechFirst: social media censorship, & politics
Watch: the NY Post, Biden’s laptop, and the right path forward
Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.
Read: Social media, censorship, politics: the NY Post, Biden’s laptop, and the right path forward
(This transcript has been edited for length and clarity.)
John Koetsier: Should social media censor free speech? Welcome to TechFirst with John Koetsier. So, something fairly unprecedented happened this week.
You probably know about it, everybody’s heard about it. Facebook and Twitter both blocked a New York Post story. Now, whatever you think about that particular story — and we’re obviously talking about the Hunter Biden laptop story from Rudy Giuliani — I think it’s pretty flimsy, but whatever you think about it, blocking it almost immediately … that’s pretty shocking, right? I mean, that’s pretty hardcore. It’s not like independent third-party fact checkers looked at it and said, ‘Hey, that’s false and you should block it.’ But there are potentially some reasons why they did that.
How should social media deal with controversial subjects or false information? Today we’re discussing this with Bill Ottman, who’s the founder and CEO of the open source social network Minds.com. Welcome, Bill!
Bill Ottman: Hey, thanks for having me, John.
John Koetsier: Bill, let’s just dive into the most topical thing right here, and we’re going to get into all these things, right? Like what should social networks do? Should there be censorship? Should there be no censorship? What does free speech look like? All that other stuff, right? But, with the New York Post story, in your opinion, what should have happened?

Bill Ottman, founder of Minds.com
Bill Ottman: I do not think that it should have been taken down. And I think that even Dorsey doesn’t think it should be taken down. I mean, he made statements that it was unacceptable.
John Koetsier: Yes.
Bill Ottman: So, you know, certainly not—
John Koetsier: And Jack Dorsey, of course, is the CEO of Twitter.
Bill Ottman: Right. And that policy they had about hacked information — which they have already changed — would essentially apply to a significant percentage of journalism, if they were to not allow that type of sourced information to be published through a news website on the platform.
So it’s really just an impossible standard to uphold and … yeah.
John Koetsier: It is really interesting, because obviously there’s been a huge struggle — you could say over the past few years, but particularly in the last 6 to 12 months, and getting more and more to the forefront as the U.S. election comes closer and closer — this huge struggle over fake news. What is fake news? Is fake news everything I disagree with? Is fake news something verifiably false? What is it? And how should the social media platforms deal with it? If you look at controversial political subjects, how, in your opinion, should social media treat them?
Bill Ottman: I think that we need to look at decentralized reputation scores and trust scores. And we can build trust around certain users and content with certain methodologies. But, I mean, misinformation could be, you know, you could make a mistake.
John Koetsier: Yes
Bill Ottman: You could be wrong. You could make a harmless mistake, or it could be a very harmful mistake, but it could be something that you don’t intend to do. And so should you be banned from the platform? Or should your content be banned for the [inaudible].
You know, misinformation … once you go down that rabbit hole, you are opening yourself up to a whole world of impossible standards. So I think that where we draw the line, at least, and where we encourage other platforms to draw the line, is around the First Amendment … with certain other edge cases.
John Koetsier: Can you go into that a little more detail? What do you mean there?
Bill Ottman: Sure. So, no one really knows what the policy is on Facebook and Twitter and YouTube. I mean, for instance, on Twitter, there’s certain explicit content that you can find.
John Koetsier: Yep.
Bill Ottman: For sure, like, very much so. But certain other stuff is not okay. And there’s not a consistent, clear policy that people can cling to. And that’s the same on all the platforms, and—
John Koetsier: Even if there is a consistent and clear policy, though, they can’t really seem to enforce it. Because if you look at — I think it was about three weeks ago, maybe when Donald Trump got COVID — people were posting things like, ‘Hey, I hope he dies,’ right?
And Twitter banned that, because that was essentially violent speech: you can’t hope that somebody’s going to die. And then lots of people came out of the woodwork and said, ‘Well, interesting. I didn’t know you had this policy, because people have told me that they wished I had died,’ you know?
And they might be activists. They might be women. They might be other minorities or something, whatever the case might be. And they came out and said, ‘What happened here? Why did you not apply it earlier?’
Bill Ottman: Yep. And then there are cases where people say, ‘Oh, I want to kill you!’ and it’s sarcastic or it’s not serious. And, you know, when the AI is detecting all the language and processing everything, it’s not smart enough to understand context fully yet.
John Koetsier: Mm-hmm.
Bill Ottman: And a lot of reliance is on this AI and machine learning. And in some cases they’re just allowing those tools to run the show and then doing damage control afterwards. And that is dangerous. There’s going to be too much collateral damage.
John Koetsier: Yeah, I agree with you on that. Facebook had an issue with that. I wrote a story for Forbes about it four or five months ago, when COVID was still pretty new — how long have we been in COVID? It feels like it’s been forever. And they were banning anything that mentioned COVID or coronavirus because they were trying to ban some fake news, and the AI just got out of control, right?
So, there’s an argument to be made that Facebook, Twitter, Minds, other social networks, are the new public square. They are the new public square … they’re where we have our conversations, they’re where we talk to friends/family/strangers.
And if you ban speech there, you’re actually infringing on free speech generally. Obviously they’re private corporations — public corporations in this case — and by law they can do what they wish and have their own regulations.
Where do you fall on that?
Bill Ottman: The thing that bothers me the most about those [companies] in particular, and the other FAANG companies, is that it feels like they brought everyone in under this idea of free expression, and then over the last number of years they’ve drastically changed the policies. So people spent many years building up a following thinking that one thing was okay, and then suddenly, you know, their million followers … the platforms don’t really care. It’s similar to how Facebook brought in the restrictive algorithms in the newsfeed and cranked down the organic reach. To me, that’s more of a soft form of censorship.
John Koetsier: Yep.
Bill Ottman: Because you have got 100,000 followers, but you’re only reaching 4% of them.
John Koetsier: Yes.
Bill Ottman: If you’re lucky.
John Koetsier: 4%? That’s high.
Bill Ottman: That’s high? Yeah, I know, it’s probably like 1% now. And we have no idea what’s going on. And everyone put so much energy into building up these pages, and then they really just changed the game.
John Koetsier: Yep.
Bill Ottman: And, you know, Twitter’s slogan was ‘the free speech wing of the free speech party.’ I mean, these are important words, and I think companies obviously should be able to have whatever content policy they want. If you’re some sort of religious social network, or a fishing social network, and you don’t want this content …
Like, you should [have the] ability to impose [restrictions]. But when you market in that way and become a multibillion-user network, I think that’s a little bit of a different scenario. Now we take sort of a different perspective, which we can get into, but I’ll stop there.
John Koetsier: It is interesting, right? You have a cats-only social network, ‘No dogs allowed!’, you don’t post about dogs or whatever … it’s possible. We will get into what you do on Minds.com and how that works as well.
There is, of course, also the yelling-fire-in-a-theater argument, right? There’s the argument that you can’t just say everything, or anything, at any time. There are limits to free speech. Does that apply to social media?
Because one of the things that we see with social media is, with virality, what I said to you in the public square 20 years ago stayed there. Those words fell to the ground and who knew, right? Now, some random thing that I tweet could be horrifically offensive and get retweeted 5 million times … and it’s a big deal, it’s viral. And the same can happen with false information.
How do we bridge that gap?
Bill Ottman: Yeah. I mean, sometimes certain things that are wrong are worth sharing, especially if it’s done with context, like you share it with why you think this is so stupid.
John Koetsier: Yes.
Bill Ottman: To say that stupid ideas don’t deserve to exist is just … nothing in history shows that getting rid of and censoring stupid ideas is actually a productive solution.
John Koetsier: Well, my stupid idea is your genius plan, right? So who knows.
Bill Ottman: Right, right. And look, I think education for users on how to research, how to understand context, how to — like, that’s what we really need to be focusing on.
Because it’s a losing battle to expect that every single piece of content uploaded to social networks with hundreds of millions or billions of users is going to be able to get fully vetted. And that’s not even including comedy, which is sometimes intentionally wrong, and which in some cases is used almost as a healing mechanism for people to deal with really problematic issues.
So there’s the realm of serious content, and then there’s the realm of — I mean, Trump retweeted a Babylon Bee article, unironically.
John Koetsier: Yes.
Bill Ottman: Like two days ago.
John Koetsier: Yes. Yes. I know.
Bill Ottman: That’s unbelievable.
John Koetsier: That has happened before. I’m pretty sure in Pakistan or India, there have been government officials who have retweeted The Onion’s stories, not realizing that they were satire and—
Bill Ottman: That’s like the golden test if you’re a good satire site.
John Koetsier: Exactly. Exactly. Well, what you were saying there is that there’s no way at-scale social media networks can know what’s appropriate, what’s inappropriate, what works, what doesn’t work, when hundreds of millions — in fact, billions, as you said — are sharing, are tweeting, whatever. You’ll get a kick out of this one. I don’t know if you saw this, but I think earlier this week, some farmers in East Coast Canada got their pictures of onions flagged by Facebook and removed, because they were ‘sexual’ in nature.
Bill Ottman: Hmmm, yep.
John Koetsier: So some algorithm saw something, and banned them.
Bill Ottman: It’s really interesting you bring that up. I was talking to the creator of this AI framework, NSFW.js — it’s like a TensorFlow tool which can detect explicit imagery — and it’s open source, so we’re actually considering leveraging some of those tools. And he was saying that onions are problematic.
John Koetsier: Really?
Bill Ottman: Yeah, because there’s sort of a spatial relationship between them and, you know, maybe a certain body part.
John Koetsier: Yes.
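(For reference: NSFW.js is a real open source library built on TensorFlow.js. Here’s a minimal sketch of the kind of client-side check being described, assuming a browser context and an arbitrary confidence threshold; illustrative only, not Minds’ actual integration.)

```typescript
import * as nsfwjs from "nsfwjs";

// Classify an image already in the DOM; classify() returns probabilities
// for classes like "Neutral", "Drawing", "Sexy", "Porn", "Hentai".
async function isExplicit(img: HTMLImageElement): Promise<boolean> {
  const model = await nsfwjs.load();
  const predictions = await model.classify(img);
  // The 0.7 threshold is arbitrary here; this is exactly the kind of
  // confidence heuristic that can misfire on shapes like, say, onions.
  return predictions.some(
    (p) => (p.className === "Porn" || p.className === "Hentai") && p.probability > 0.7
  );
}
```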
Bill Ottman: But, yeah, I mean, I think that there should be tools for users to opt in to a spectrum of different content policies. So, certain people may just have no interest in seeing satire, or maybe they have no interest in seeing X, Y, and Z other types of content.
And I do think it’s the job of the social networks to make it very clear to you as a user how to control your experience, and to give you as many tools to do that as they can. And additionally, there’s something that we’re working on, sort of a content prediction and quality tool, where you can actually vet/bet on the future quality and relevancy of content within a certain category.
John Koetsier: Yes.
Bill Ottman: And people can actually build up reputation within certain categories, and content can build up reputation within certain categories. Not just likes, you know … the problem with just going off likes is that it’s so ripe for manipulation.
John Koetsier: Mm-hmm.
Bill Ottman: And when the trending feeds are based on that sort of shallow metric, it makes them subject to just being owned and gamed.
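(To make that concrete: a hypothetical sketch of a category-scoped, reputation-weighted score, with all names invented for illustration. Raw like counts treat every account equally, which is what makes trending feeds gameable; weighting votes by earned reputation raises the cost of manipulation.)

```typescript
interface Vote {
  voterId: string;
  direction: 1 | -1; // up or down
}

// Hypothetical: each user carries a per-category reputation earned from past
// accurate predictions, so not every vote is worth the same.
function contentScore(votes: Vote[], reputation: Map<string, number>): number {
  return votes.reduce(
    (sum, v) => sum + v.direction * (reputation.get(v.voterId) ?? 0),
    0
  );
}
```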
John Koetsier: That’s a good point. And let’s — that brings up the topic of algorithms. And algorithms are something that, I mean, nobody outside of computer science or mathematics even thought of a decade ago, two decades, three decades ago.
But algorithms govern a lot of what we see, as increasingly our reality and our understanding of reality is formed by the digital platforms that we use, where we consume information, where we create information. The algorithms that govern what’s visible, what we see, what the platform thinks we’ll like, what the platform thinks we won’t like, what will keep us on the platform — maybe clicking more, maybe watching another ad — control us to a large extent.
And that becomes dangerous, right?
So you mentioned controlling your experience and, absolutely, you kind of want everybody to be able to control their experience. But if you map that to a real-world scenario where I meet you in that theoretical public square, you know, the guy on the pulpit yelling that ‘hell is coming tomorrow’ or something like that, that happens! It impinges on my reality.
And there’s some serendipity sometimes to that, and there’s some good information that comes in that punctures my reality bubble and comes inside. So there’s some value there.
Bill Ottman: Absolutely.
John Koetsier: I’d love for you to comment on that, and then we’ll go a little farther on algorithms.
Bill Ottman: If that guy was on Twitter, you know, he might get banned for misinformation.
John Koetsier: Yes he might.
Bill Ottman: So, yeah, absolutely.
I mean, that is an echo chamber breaker. It pierces your filter bubble because you’re walking down the street, you know, you got your headphones in, you’re in your experience, and then the guy is like throwing pamphlets in your face and talking about 2012, or … well, now it’s gonna be 2045.
John Koetsier: Yes.
Bill Ottman: So, the algorithms, to me, should by default be reverse chronological. Because that is what you decided as a user that you want to see. Now, giving people the ability to opt-in to different types of algorithms … fine.
You know, Twitter basically puts you back on the algorithmic feed. You can’t stay on the chronological one. There’s that little icon on the top, and every time you log in you have to click it to put it back if you want reverse chronological. They won’t let you just keep it. And it’s not to say that — I mean, algorithms are math, so it’s not like algorithms are bad, but they a hundred percent need to be open source, because we have no idea what they’re doing, who they’re favoring, who they’re punishing, who’s getting throttled. That type of information is really important for the community to understand the software that they’re interfacing with.
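(A minimal sketch of the default being described: newest-first unless the user explicitly opts into a ranking function, which in turn should be open source and inspectable. Types are invented for illustration.)

```typescript
interface Post {
  id: string;
  createdAt: number; // epoch milliseconds
}

// Default feed: strictly reverse chronological, i.e. exactly what the user
// subscribed to, newest first, with no hidden reweighting.
function chronologicalFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => b.createdAt - a.createdAt);
}

// Opt-in ranked feed: the ranking function is passed in explicitly so it can
// be published, audited, and swapped, rather than acting as a black box.
function rankedFeed(posts: Post[], rank: (p: Post) => number): Post[] {
  return [...posts].sort((a, b) => rank(b) - rank(a));
}
```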
John Koetsier: I like what you said there, that algorithms need to be open source. I’m currently writing a science fiction book, which is future news. So in my day job, I write today’s news. For fun, I’m writing future news and insights from the future.
And one of the things that I wrote in there is — so it’s a news story from five, seven years in the future, where there’s a movement that all algorithms must be open source, must be visible so that you know why Facebook showed you this post. And that’s one thing that I was going to ask you as well, like, should everything that you see, that’s governed by an algorithm, have metadata with it?
Such that you can see, why did I see this ad? Why did I see this post?
Bill Ottman: That would be good.
John Koetsier: And then Facebook can say, well, you saw that post because you’ve shown a predilection in the past to click on information from that source, or you’re interested in that topic, or whatever. But we should know that.
And we should know, as you mentioned, those algorithms. Like, I called it ‘TweetRank’ probably about seven years ago, when I was fed up with the chronological stream and getting too much garbage. I was saying Twitter should do something called ‘TweetRank’ — sort of like PageRank, which Google did, of course, ranking which pages are the best and surfacing those in search — because I want to see the best tweets.
Well, that is good, but what don’t we see when that happens, right? That’s the challenge.
Bill Ottman: Yeah. That’s why you want both options. You want to be able to toggle back and forth, and sit on the setting that you’re feeling at a particular moment in time. It shouldn’t be an either/or [scenario]. And I really like the idea of allowing sort of a hover-over where you can see the metadata. In Facebook’s and Google’s case it would be like, ‘Well, you know, you weren’t on our site, but you were in another browser tab, and [we know that] because [we’re following] you around everywhere you’re going in your browser’ … and so it would be pretty extensive.
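(The hover-over metadata being described could be as simple as a structured explanation attached to every feed item. A hypothetical shape, with all field names invented:)

```typescript
// Hypothetical "why am I seeing this?" payload attached to each feed item.
interface FeedExplanation {
  itemId: string;
  reason: "followed_account" | "boosted_post" | "inferred_interest" | "cross_site_tracking";
  signals: string[];        // e.g. ["clicked 3 links from this source last week"]
  algorithmVersion: string; // which (ideally open source) ranking code produced this
}
```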
John Koetsier: ‘Last week you searched for that’ … and there you go. That is pretty interesting. We also see that platforms depress stuff, whether that’s shadow banning of individuals, or whether that’s simply making it harder for things to go viral when they take people off the platform — certainly LinkedIn and Facebook do this, and I believe Twitter as well. If there’s a link in there, say you put a YouTube video—
Bill Ottman: Right.
John Koetsier: In Facebook, or whatever the case. Anything that takes people away from the platform … they don’t like it, and they don’t show it to as many people. And I would just like to know that. I would just like them to admit that, and to show, in pseudocode or plain English as much as possible, what the algorithm is doing, and why it’s doing what it’s doing.
Bill Ottman: Yeah. They make these vague blog posts where they say, ‘Oh, you know, we’re going to be rewarding native video a little bit more.’ But they don’t really tell you what’s going on, and it’s sort of, you know, their argument would be, ‘Oh, well, you know, we have to [unclear] against games and we just want to improve the quality of the content in your feed.’ It’s just the same thing over and over again. But again, to be—
John Koetsier: Our economic imperative has no bearing on our decisions of what we do in our algorithm, at all.
Bill Ottman: Nope. No, money is not a factor. Again, Jack Dorsey made a statement the other day, like, oh, how great would it be if we opened up all of our algorithms. And it’s just like … it’s the same thing when Zuckerberg talks about how he’s doing his year-long project about decentralization and crypto. And it’s like, okay, so … great, you’re sort of half-heartedly talking about these issues because you know it’s where things are going, but you’re totally unwilling to make the changes where it counts.
On your [contact] tracing app right now … it’s like, we’re going to do decentralization and blockchain, but it’s going to be mostly proprietary, and we’re going to try to control the whole thing, and … it’s a mystery. Because I think that they’re well aware — I mean, look at all the stuff going on with bitcoin, and Square just bought $50 million of bitcoin — like those people know where things are moving, but it feels like they’re clinging to control while they can, because they know that where things are going is in much more of an open source, decentralized direction.
John Koetsier: Interesting. Well, that’s a good segue, because I want to talk about Minds.com. I first heard about it maybe three weeks ago, created an account just to play around … didn’t have a chance to chat with you back then. It came up yesterday, so we set this up.
And you’ve created a different kind of social network. It’s open source. Talk a little bit about what you’ve created there, and what you’re trying to do.
Bill Ottman: Yeah. So Minds is [an] open source, community-owned social network as well. We actually have over 1,500 community shareholders, ’cause we did an equity crowdfunding round a few years back, and that actually converted into stock when we did our Series A.
And we’re actually gearing up for a Reg A+ to bring even more of the community into the early-stage ownership structure. That’s a key element. And then, obviously, flipping everything about mainstream social on its head. So, being [super privacy] focused with encryption, and decentralizing the infrastructure as much as possible where it makes sense. You know, fully decentralized everything isn’t fully possible with a good UX, but certainly moving in that [direction]. And like, we just launched a prototype of a tool where you can optionally post to what’s called the Permaweb—
John Koetsier: Mm-hmm.
Bill Ottman: Which is through the Arweave network. And that is a fully decentralized data and content storage solution, and so that’s an optional feature for people. And so it’s much more of a resilient system where, okay, yeah we’ll publish to S3, and, you know, we use AWS … but having this backup is totally essential. And there’s really cool projects like IPFS and other distributed systems that are just increasingly going to give users more control and be really valuable. So—
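(For the curious: the public arweave-js SDK is how this kind of optional permaweb publishing typically works. A minimal sketch, not Minds’ actual integration:)

```typescript
import Arweave from "arweave";

const arweave = Arweave.init({ host: "arweave.net", port: 443, protocol: "https" });

// Publish a piece of content to the permaweb: create a transaction, tag it,
// sign it with the user's wallet key (a JWK object), and submit it.
async function postToPermaweb(content: string, walletKey: any): Promise<string> {
  const tx = await arweave.createTransaction({ data: content }, walletKey);
  tx.addTag("Content-Type", "text/plain");
  await arweave.transactions.sign(tx, walletKey);
  await arweave.transactions.post(tx);
  return tx.id; // permanent ID for retrieving the content later
}
```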
John Koetsier: So that’s the technology side to a large extent of what you’re doing. What are you trying to do in terms of how somebody feels when they’re using the service? What are you trying to do in terms of how information travels on the service, and what it just feels like to be a member there?
Bill Ottman: Definitely. Yeah. I mean, that’s the most important thing.
And when we originally launched our mobile apps back in 2015, we were so disillusioned by the strangled organic reach on Facebook — ’cause we actually had like millions of followers over there and drove a bunch of traffic for a long time, but then it almost became a waste of time — and we were like, we cannot rely on this even a little bit for the future, it’s only [getting] worse.
And so we built this virtual currency reward system where you earn tokens for your contributions to the network every day. And one token is worth a thousand impressions. So you can earn these tokens and then boost your posts for more exposure, because, you know, they’re strangling everyone. We want to help creators be seen and get exposure.
And it’s really been sort of a magical thing. Even despite being a few million users, a tiny fraction of the size, many small and medium-sized creators get better reach on Minds — using the boost tool and earning the tokens — than they do on Twitter. I mean, they’ll be on Twitter for 10 years and still be getting like 10 [likes]. So the mass over there is only relevant to you if you can reach those people. Now, I think that we’ve become much more focused on quality of content — like I mentioned earlier with some of the prediction tools and vetting tools — and moving the algorithms towards more of a quality-centric world, as opposed to just raw engagement.
But I still think raw engagement feeds are worthwhile to be able to have access to. It’s, again, the optionality that’s really important.
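(The token-to-reach math here is simple, taking the interview’s ‘one token is worth a thousand impressions’ figure at face value; a sketch only, since Minds’ actual reward formula is more involved:)

```typescript
const IMPRESSIONS_PER_TOKEN = 1_000; // per the interview: 1 token ≈ 1,000 boosted impressions

// Convert tokens earned from daily contributions into boosted reach.
function boostReach(tokensSpent: number): number {
  return tokensSpent * IMPRESSIONS_PER_TOKEN;
}

// e.g. 2.5 tokens earned in a day buys roughly 2,500 extra impressions
```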
But, I mean, we take the First Amendment approach, based on the research that we have compiled — studies from Nature and many different universities showing that censorship actually increases violence, and increases polarization. I’ll just read you one quote from this Nature publication: “Our mathematical model predicts that policing within a single platform, such as Facebook, can make matters worse and will eventually generate global dark pools in which online hate will flourish.”
And this is what study after study shows: even if it’s with good intention — I think we all want less hate speech, we all want to have safe communities — we really have to ask ourselves what’s happening with the networks on the internet when these censorship policies are brought in at full steam on the largest communications [platforms] in the world. And there’s a growing body of evidence that what is happening is that the content policies on the big networks are fueling the cultural divide and a lot of the polarization and civil unrest.
So, it’s counterintuitive, but that’s why we brought on our new advisor, Daryl Davis, who’s a deradicalization and race relations expert. He’s personally deradicalized members of the KKK, as a Black [man] … I mean, alright, let me ask you a question: what do you think would happen if the 20,000 moderators on Facebook were all mental health workers and counselors and people who are actually engaging — as long as it’s not illegal, like true harassment, that stuff has to go — but for the edge cases, these disturbed people … what would happen if we had 20,000 people who were productively engaging them?
John Koetsier: Interesting. I don’t know. I do know that some people who are trolls seem to lack empathy, seem to lack the ability to understand how others feel about what they say and how they treat them, and that is probably due to psychological issues. And perhaps there are people who understand those issues better and can help treat them. I don’t know.
Bill Ottman: Yeah. I mean, it’s a long term project. You know, in Daryl’s case, many of the people that he got to turn over their robes, you know, it was [a] multi-year engagement of communication, and that’s what it’s going to take.
And people like Deeyah Khan have done TED Talks on this also, directly engaging hate head-on. And the evidence actually shows that that’s really the only way to change minds. You’re almost guaranteed not to change their mind if you ban them. In fact, the opposite — I mean, you can’t communicate with them at all if you ban them. So—
John Koetsier: Well, what I found compelling is what you said, the quote that you gave from the Nature study, which was that when you ban, and when you block a lot of things, then you create these dark pools. They go find someplace else where they are only communicating with themselves. And I can totally see how that ensures that they become more deeply rooted in whatever worldview they have. It’s reinforced by the reality bubble that they now go into. So I get that. That makes some sense.
Bill Ottman: And you know, it does make it difficult, because I think there are platforms that are abusing the phrase “free speech” right now. They’re not approaching it in a really productive way, which is why you sort of see a divide, politically, with free speech enthusiasts. You know, you have digital rights organizations like the EFF and the ACLU, these types of organizations who are obviously pro free speech — I mean, they’ve been defending some of the worst people in history, but they knew that that mattered — because they understand what Daryl understands.
But then there are other platforms that just try to use it as like a marketing tool and don’t really want to be productive with it. I don’t need to name names, but I think that people really just aren’t familiar with the research around censorship, and so the gut reaction is, ‘Oh, my gosh, that’s horrible content. It just needs to go.’ And I do actually believe that platforms should be able to have their own policy.
John Koetsier: Mm-hmm.
Bill Ottman: And so it’s okay for some platforms to get rid of that, but, yeah, they have to know that when they do that, it will go to other sites, and they’re not necessarily contributing to the solution. And so, for platforms that are willing to tackle the problem head-on, I think that it’s gonna take time. It’s not an easy job. It’s actually a really hard job to deal with this, build all the filtering tools, and engage with the volunteer organizations who actually engage. But, you know, that’s where we stand, and we want to help solve that problem.
And certainly for the big networks, I think that they need to adjust their thinking around censorship. And let’s compare data. I mean, ten years from now, let’s see who has actually deradicalized more people. I’m happy to do a podcast with you and Dorsey and Zuckerberg and—
John Koetsier: Ten years from today, it’s going on the calendar. One other thing I wanted to get to about Minds.com: honestly, I’m not a hundred percent sure I totally understand what you’re doing there, but from what I saw in one of your releases, it sounded almost a little bit like creating a networked collection of blogs.
That’s kind of what I saw: being able to put up a pro site that has social features and is integrated into Minds.com, but also has an independent existence. Can you talk about that a little bit?
Bill Ottman: Yeah. I mean, to be honest, this is the stuff that I’m most excited about in terms of the creator economy. For the last year we’ve been non-stop building out both fiat payment tools and crypto payment tools for creators, so that we can have rev-share programs with the creators who are helping drive the most traffic to Minds.
We have two premium products: Minds Plus and Minds Pro. With Minds Plus, everyone gets access to the Minds Plus feed of exclusive content, but can also post into it. And we’re taking 25% of our revenue and proportionally sharing it with the creators who drive the most engagement to that premium feed. And with [Minds Pro] you can set up your own custom domain and sort of design your channel to be more like your own website, but it’s still powered by Minds.
And I think it’s just about giving people more control, more monetization tools. You know, this is what creators need. I saw the piece you did about the creator economy. It’s just amazing how fast it’s growing.
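(The 25% proportional split described above reduces to straightforward arithmetic. A hypothetical sketch:)

```typescript
// Proportional revenue share: 25% of premium revenue, divided by each
// creator's share of total engagement on the premium feed (per the interview).
function revSharePayouts(
  premiumRevenue: number,
  engagement: Map<string, number> // creatorId -> engagement driven
): Map<string, number> {
  const pool = premiumRevenue * 0.25;
  const total = [...engagement.values()].reduce((a, b) => a + b, 0);
  const payouts = new Map<string, number>();
  for (const [creatorId, driven] of engagement) {
    payouts.set(creatorId, total > 0 ? pool * (driven / total) : 0);
  }
  return payouts;
}
```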
John Koetsier: Mm-hmm. Well, that is interesting. And, honestly, in some ways a vision like that, of connected websites with this connective social tissue between them, sounds a little bit more like maybe some of the dreams we had of the web way back when … somewhere in the nineties, when blogging started and it was the thing, not social networking. Very interesting.
Bill Ottman: Well, what’s happened is that these nasty third-party ads have become the norm. And you know, a lot of bloggers and creators — and it’s not their fault, to be honest — have been bullied into it by Google and, you know, Taboola and all these sorts of nasty ad networks that are all proprietary.
There’s actually a really cool project that we’ve been talking to [called] EthicalAds: no spying, it’s more tech focused. You would actually be a good intro for them. But we have an internal ad network with the boost system. So when you boost with the tokens, we distribute that throughout the network. There’s no surveillance. We’re gonna be bringing in the ability to target hashtags, but certainly not like [unclear] people around in order to advertise to them.
And that does make it harder, and a little bit less accurate, but we really have to ask ourselves what world we want to live in and if we want a privacy-by-default world where maybe users can opt in to receiving more relevant content, but it really has to be more consensual—
John Koetsier: Mm-hmm.
Bill Ottman: Than right now. And introducing serious rev-share rules, I think that’s really the future, because we have this huge subscription revenue stream and we can share it with the creators who are helping us drive it. You know, we also allow creators to set up their own membership tiers, similar to some of the other creator sites.
But for those who don’t, building up a whole membership base is not an easy thing to do. Some people can do that, and it’s worth doing for those who can, and everyone should really try, but if they can tap into our revenue stream, it’s much more sort of immediate satisfaction. So we’re really excited to launch that.
John Koetsier: Very cool. Very cool. Well, Bill, I want to thank you for taking some time out of your Saturday afternoon to chat about these topics. They’re pretty key. They’re pretty critical. They’re pretty topical … and appreciate your perspective.
Bill Ottman: Likewise. Thanks for having me.
John Koetsier: Hey, for everybody else, thank you for joining us on TechFirst. My name is John Koetsier. I appreciate you being along for the show. You’ll be able to get a full transcript of this podcast in a couple of days, maybe a week, at JohnKoetsier.com. The story will come out at Forbes after that. And the full video is always available on my YouTube channel — maybe Minds.com too, who knows. Thanks for joining.
Maybe share with a friend. Until next time … this is John Koetsier with TechFirst.