Biden’s Peloton vs national security: what’s risky about the president of the US using a Peloton or a Fitbit? And … what does that mean for the rest of us and our safety?
In this episode of TechFirst with John Koetsier, we chat with ex-Googler Ben Barokas, founder and CEO of Sourcepoint, who now runs a privacy-focused company to atone for, as he puts it, “the sins” of his prior jobs in adtech.
Scroll down for full audio (and subscribe to the podcast!), video, and a transcript …
Check out the Forbes story here …
Subscribe to TechFirst: Biden’s Peloton
Watch the video: the privacy danger of smart products
(Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.)
Read: what’s risky about Biden’s Peloton … or the smart products you use?
(This transcript has been lightly edited for length and clarity.)
John Koetsier: What is the cost of making the world smart and connected? Welcome to TechFirst with John Koetsier.
Most smart things are smart because they have brains in the cloud, right? Brains in the cloud means data transfer — going up, going down, going across a lot of different services — and that means vulnerability. Like perhaps, the president of the United States using a Peloton or a Fitbit, or for that matter, any kind of different apps.
To explore the future of smart and safety, we’re chatting with ex-Googler Ben Barokas, who’s the founder and CEO of Sourcepoint. Welcome, Ben!
Ben Barokas: Thank you, John.
John Koetsier: Hey, happy to have you here. I know you’ve got some connection issues and some challenges — you’re in a busy place, so we’re just going to cross our fingers and hope for good connectivity and hope for good audio. We’ll see how it goes. Let’s start right off at the top. What’s risky about President Biden using a Peloton or a Fitbit?
Ben Barokas: You know, I mean at the end of the day, the White House is a relatively secure place and they have the best of the best. I think they can probably work out the issues with the Wi-Fi and the chipset within the Peloton.
That being said, we’re always concerned when the leader of the free world has security vectors by which he can be compromised. And so, we need to look at those very, very closely. That being said, I think him having a Peloton is a lot less dangerous than our former president, Trump, having the ability to tweet to his Proud Boys. So, I think we all have to put these things into perspective and kind of go from there.
John Koetsier: Yeah, that could be the case. So we’ve seen in the past that U.S. Army bases and some personnel were ID’d by Strava, right? — the running app. And I heard that Michelle Obama used, I think it was a Peloton as well, but ripped the camera out. Have you seen other people do that as well?
Ben Barokas: I don’t think you necessarily have to be so violent as ripping the things out.
A little bit of tape goes a long way when you want to maintain your privacy and you’ve gotten to a level of sketchiness where, you know, if the blue light’s not on, are they still watching you?
That being said, if you were to look at the new book Spy Catchers that came out around the new head of the DNI, it’s really interesting to understand what can be done. Yeah, certainly I think that policies and procedures need to be put in place, especially when there are surveillance objects in everyone’s pocket.
The ability to have agency over your own data is really crucial. And the ability for the Armed Forces to keep their personnel safe and keep our country safe, there should be the ability to turn off on an app-by-app or device-by-device basis things like precise location. Now, telling people that they can’t use Uber on base … okay, that’s probably not the end of the world. But telling them that they can’t use it when they’re off base … all right, well, people should be free to make their own selections and choices.
But sometimes signing up and — not enrolling, but enlisting or being commissioned, you know, that has additional ramifications. So, certainly if the U.S. Military made sure that you can’t use precise location while being in service, so be it. It’s probably not a bad thing.
John Koetsier: Let’s talk about risks to average people, and it’s actually a great day to have this conversation, because it’s Apple Data Privacy Day.
Ben Barokas: It’s everybody’s data privacy day.
John Koetsier: Exactly. [Laughter]
Ben Barokas: [Indistinguishable] … introducing on iOS 14 the next level of consent in order to pass along the IDFA. But this is a day to be celebrated around the world for users to rise up and take agency over their own data and know what that actually means.
John Koetsier: What’s the actual risk for the average person? They might have some smart products in their home, maybe some smart speakers, whatever they’ve got … whatever router that their internet service provider gave to them, and they’re running maybe a hundred, 150 apps on their phone.
What are the risks that they run in terms of data privacy?
Ben Barokas: The risks are that you get tracked, right? And you don’t know where that data is going. That data is often unencrypted. I think there are certainly things that you can do if you’re outside of the home; you can utilize a VPN that actually encrypts a good amount of your data.
That being said, even if you’re using a virtual private network in order to encapsulate your data on the way to the servers, the question is: once somebody has your identifier for advertising or your unique ID from your mobile device, and you’ve given them permission to access not only your lat/long location coordinates but any number of browsing behaviors … once that gets to someone else’s servers, it’s very easy to copy that data and send it along.
The question is: who do you trust? Do you trust anyone? Now we can go into some level of—
John Koetsier: Let’s ask that question, actually. Who do you trust? You were happy about Apple’s Privacy Day. What kind of phone do you use?
Ben Barokas: I am. I am on an Apple. I do use a VPN. I don’t have 150 apps on my phone, but you know, I use Uber and Lyft, and they know my location when I want a car being pulled up. I use Zwift, I use Strava … anybody can go see my runs and my cycles. I walk a few blocks before I turn it on from my house and I turn it off a few blocks before I get home. That being said, I mean, if someone was really tracking me, it wouldn’t be so hard to try and—
John Koetsier: It wouldn’t be too hard to triangulate based on start points, end points, and everything like that. In fact, there was a story — I think it was about six months ago, that the New York Times published — that they could basically track where President Trump was at the time by seeing apps that were clearly from Secret Service personnel. They were being tracked by IDFA. So, lots of challenges.
Being tracked is one of them; your private data being out there is another one. People hacking into that, perhaps, is yet another one that we haven’t talked about. What’s the solution?
Ben Barokas: The solution is really about education; it’s about taking agency over your own data, and it’s about making good decisions about where and whom you trust. And I think that’s the crucial part of the equation: trust.
Now, we’re going to get better in terms of zero trust systems. We’re going to get better with cryptography. But in order for there to be personalization, in order [for] there to be customization of experience, we need to have some level of trust.
And so, you know, you’re careful; you don’t have to read privacy policies, but more and more they are CliffsNotes that help you understand who, upstream and downstream, has access to your data. And if they’ve been hacked, that probably makes a good case for keeping your data out of their hands.
John Koetsier: I really wonder about that, honestly. You mentioned education, you mentioned taking agency — and those are wonderful things, those are great things — but let’s talk real right here for a second.
You’re talking teenagers … and they want what they want, and they’re going to just click through. You’re talking people who … they see lots of pop-ups, they see lots of warnings. They don’t really care, they’re not super technical, ‘Okay. Okay. Yes, yes, yes. Get out of my way. Let me get to the content I want, the games I want, everything else I want. I don’t want to pay for it. I’ll sell my attention. I’ll sell my eyeballs. I’ll sell my time. But I’m, you know, just give me the things that I want.’
And yet we’ve seen there aren’t just personal costs, right? The personal costs of being tracked or the personal costs of potentially being hacked. There are also societal costs. We saw that with Cambridge Analytica. We saw that with fake news and foreign actors potentially hacking into our political systems in other ways. Is there an industry-wide solution? Is there a solution — and you don’t want to be draconian and impose something necessarily — when we know that 30 to 50 to 60% of people just don’t care and won’t get educated?
What’s the solution there?
Ben Barokas: I mean, I believe that not caring is anyone’s right, right?
John Koetsier: Yes.
Ben Barokas: I think though, to your point, the value exchange has to be very clear, and everyone should have the choice to either pay with data and attention — which we normally understand to be advertising, but it doesn’t always have to be advertising; that can be bifurcated — or to pay with fiat currency.
But it must be clear, and understood globally, that there is a transaction occurring every time digital utility is utilized and every time content is consumed. And once that’s understood, then everyone has made a free choice.
I call it “compensation consent,” right? So we think about what GDPR is doing. We think about where CCPA and CPRA are going. We understand what Washington State and Virginia are proposing. I think what it comes down to is plain talk around compensation. And this is the issue that we’ve dealt with. The problem there is everyone is afraid of friction, and everyone is afraid of understanding what the user values.
And we understand that, hey, Google and Facebook, they provide such incredible amounts of value to the user: free email, incredible maps, the ability to communicate with anyone in the world with one click. This is an amazing value exchange that gives many of these walled gardens the ability to suck in so much of your data without you being bothered. And up till now, they haven’t been perfect stewards, but they haven’t been terrible either. They haven’t turned into Big Brother.
Having been at Google, you know, the people that I worked with there were really stellar. Now, there are many folks that view them as Big Brother and think that Big Tech is dangerous. I don’t blame them. It has gotten to a point where Big Tech is almost an oligopoly. And so what do we do to curb that? Do we put power back into the people? Is there a framework by which this can occur?
And I think it really comes back to compensation choice. What are you willing to trade for the utility? What are you willing to trade for access to content? Hopefully access to the truth, which is something that we need more of today. But I—
John Koetsier: So you started Sourcepoint as, perhaps even a bit of a counterpoint to where you spent the rest of your career in adtech and everything like that. What did you start it to do? And how are you helping the situation?
Ben Barokas: Penance for my sins.
John Koetsier: [Laughter] How many years of penance do you have to pay?
Ben Barokas: I don’t know. I’m going on five, but we’ll see how long it takes. I’m down for the get down. It’s not that I look at what I did previously as exploiting people’s data. That being said, I’ve seen vendors, I’ve seen actors, I’ve understood processes that were not as above board as they should be. And I feel that users need to have a better understanding of the transactions that occur as they travel around the web and they go from app to app.
And I felt so strongly about it that I founded Sourcepoint. I raised $47 million, and I deployed a team across Europe and across the United States. We will be in Asia-Pac and South America before the end of next year. Building this understanding is so important to the future of the web and the future of our digital lives. And, you know, I feel like doing well by doing good is something that we should all strive for—
John Koetsier: Absolutely.
Ben Barokas: And we should try to align incentives so that the people that are doing good get paid more than the folks that are being shady.
John Koetsier: That would be wonderful.
Ben Barokas: And so I’m doing my best to align those incentives in creating technology that can be deployed that will do exactly that.
John Koetsier: Let’s talk about privacy … the future of privacy then. You’re building something that does enhance privacy to pay, as you mentioned somewhat facetiously, for your sins of the past. What does privacy look like in 10 years?
Ben Barokas: I believe that people will be smarter because they will have to overcome the friction that educates them around their choices. I look at the world sort of 90-9-1, right?
90% of users are really not going to care, and they’re going to happily give their data and their attention to places that they trust. And that’s something that they can determine. I think 9% of people will evaluate and be intelligent around choosing who they do business with and under what auspices.
Who do they pay with their data and trust with their data? Who do they pay fiat currency? And what is their relationship? How do they manage it? And then I believe there’ll be a percentage of individuals that will be very close kept and they will be able to pay to not have their data be spread out.
John Koetsier: Is that the 1%?
Ben Barokas: That is the 1%. And unfortunately, I think we’re going towards that path where you need to be able to afford that, right? And the 1% may be the one[s] that are most close kept with their data; and they can pay for personalization, they can pay for customization if they want it. And the others, I think they’ll need to provide data that is collected in maybe not so direct ways.
But I do believe they will need to be given the opportunity to consent. I think that it won’t happen blindly. The question is: what is the granularity of consent going forward? Is it happening with every web page? Is it happening on the device level? Is it happening on the application level? What are the bundles of consent that you’re given? How detailed are those choices? And that’s what we’re hoping to influence and create frameworks around.
John Koetsier: Wonderful. Well, Ben, I want to thank you for taking some time here, and you know, the people in the background weren’t awful, actually. They were actually pretty good; you can give them a little pat on the back when we’re done here. Thank you for your time.
Ben Barokas: Thank you, John. Have a great evening.
John Koetsier: Everybody else, thank you for joining us on TechFirst. My name is John Koetsier. I appreciate you being along for the show. You will be able to get a full transcript of this podcast in about a week at JohnKoetsier.com. The story comes out at Forbes shortly thereafter, and the full video is always available on my YouTube channel. Thank you for joining. Until next time … this is John Koetsier with TechFirst.
Well, now you have to subscribe. It’s mandatory.
Made it all the way down here? Who are you?!? 🙂
The TechFirst with John Koetsier podcast is about tech that is changing the world, and innovators who are shaping the future. Guests include former Apple CEO John Sculley. The head of Facebook gaming. Amazon’s head of robotics. GitHub’s CTO. Twitter’s chief information security officer. Scientists inventing smart contact lenses. Startup entrepreneurs. Google executives. Former Microsoft CTO Nathan Myhrvold. And much, much more.
Subscribe on your podcast platform of choice: