Did you know your Mac transmits a log of every single app you open? Apple has made privacy a core part of the brand — including entire TV commercials dedicated to it — but as a self-described hacker and security researcher recently found, every Mac sends a stream of data about every app you open (and more) to Apple.
And … sends it unencrypted. And … bypasses any local VPN software you’ve installed.
In this edition of TechFirst with John Koetsier we’re chatting with Jeffrey Paul, the hacker who found and wrote about the problem. We chat with him about why Apple did this, who else could see the data, what Apple’s changing, and what this means for the future of computers.
(Hint: it’s not great.)
Scroll down for full audio, video, and a transcript of our conversation …
Listen: Apple GateKeeper: security win or fail?
Watch: Macs phone home every time they open an app
Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.
Read: How one hacker forced Apple to change its GateKeeper security policies
(This transcript has been lightly edited for length and clarity).
John Koetsier: Did you know that your computer transmits a log of every single app you open? Welcome to TechFirst with John Koetsier.
Apple is a company that has made privacy a core part of the brand, including entire TV commercials dedicated to it. But as a self-described hacker and security researcher recently found, every Mac sends a stream of data about every app that you open — and more data — to Apple, and sends it unencrypted. It also bypasses any local VPN software that you might’ve installed.
To get all the details, we’re chatting with Jeffrey Paul. Welcome, Jeffrey!
Jeffrey Paul: Hi, thank you.
John Koetsier: Hey, it’s a real pleasure to have you here. You had a blockbuster article. You investigated what was going on. What data is my Mac sending to Apple?
Jeffrey Paul: So Macs, by default, send a tremendous amount of data to Apple. Not just about app launches, but you have all sorts of system services that people generally opt into, things like iCloud, iMessage, FaceTime, Siri. All of this uses the network, so you’re transmitting a tremendous amount of data.
The issue that I specifically wrote about is one called Gatekeeper, and it uses a system called OCSP which checks on each app launch, whether or not the app that you’re launching is malware or is known to be malware to Apple. And it does that using the network. And so, even though this malware check data is cached for part of a day, every time you open an app that you haven’t opened in 24 hours, it’s sending this information to Apple.
John Koetsier: Yeah. And it’s not just some data on an app that you open, which is potentially bad enough. I mean, there could be some app that Apple doesn’t want you to open for some reason, or some app that maybe the government of your local country doesn’t want you to open. It’s not just the name of an app, right?
It’s additional data as well. It’s location data, or something that could be construed as location data via your IP address and a few other things, correct?
Jeffrey Paul: Absolutely.
So, here’s the thing that a lot of people don’t realize: when your computer makes a connection to another computer on the internet, it’s sending your IP address. It has to, in the normal way that computers connect to other computers on the internet. And if that contains any identifying information whatsoever, even if it’s just — not like a computer ID or an Apple ID, but the set of apps that you open that might be unique in your city or your village — that IP address information and timestamp of when that request comes in can be correlated with that unique information at other times.
And here’s what that allows people to do: an IP address identifies your coarse location, approximately city level and ISP level. So if you’re the only person in a region who opens, say, Tor Browser, or Tor Browser plus one other obscure app that nobody else in your city uses, and you open them every day from one IP, and then another day you travel to another city and open those same two apps, Apple can see that the person who had those two apps has moved. Even though it’s not sending a unique identifier, not your Apple ID or anything like that, they can see that it’s the same person moving around.
And the more uncommon apps you use, the more specifically that fingerprint identifies your computer. Does that make sense?
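The linkage Jeffrey describes can be sketched in a few lines of Python. The developer IDs, cities, and log format here are entirely made up; the sketch only shows how a rare combination of apps, observed from different locations, lets a server-side observer infer that one person has moved, even without any explicit identifier.

```python
from collections import defaultdict

# Hypothetical server-side observations: (coarse location from IP, set of
# developer cert IDs seen in launch checks from that address).
observations = [
    ("Berlin",  frozenset({"TOR_DEV", "RARE_DEV"})),
    ("Berlin",  frozenset({"CHROME_DEV"})),
    ("Hamburg", frozenset({"TOR_DEV", "RARE_DEV"})),  # same rare pair, new city
]

# Index locations by the app combination that appeared there.
seen_at = defaultdict(set)
for location, apps in observations:
    seen_at[apps].add(location)

# If a combination is rare enough to be unique, its appearances across cities
# trace one user's movements.
rare_pair = frozenset({"TOR_DEV", "RARE_DEV"})
print(sorted(seen_at[rare_pair]))  # ['Berlin', 'Hamburg']
```

A common app set (the `CHROME_DEV` entry) is shared by many users and reveals little; the rare pair is effectively a fingerprint.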
John Koetsier: It absolutely makes sense. We see a lot of fingerprinting going on in mobile as people try and understand what mobile device is accessing various apps and where it goes, and tracking advertising and all that stuff. And we notice that fingerprinting is also active on the desktop now.
In fact, IP addresses can be startlingly accurate. Sometimes not, sometimes they’re aggregated and you can see that apparently my location is 200 kilometers away from where I am, because they’re going through a VPN or something like that. But sometimes, like Google, you know, finding me on a map on my desktop, it shows pretty close to where my house is. It’s pretty close to my neighborhood.
Jeffrey Paul: The other thing to consider there, though, is that your IP address and a timestamp, to your carrier, uniquely identify you as a subscriber, with a defined subscriber address and a payment card and a bank account and all that other kind of stuff. So even if it’s not totally clear to Apple or, you know, other service providers that John Smith is accessing it, to a person who has access to carrier records, such as the government, this information becomes laser-focused.
This IP at this timestamp means this house, this street address, or this subscriber identity.
John Koetsier: Yeah, yeah, absolutely. What’s Apple’s stated rationale here?
Jeffrey Paul: Gatekeeper and the OCSP checks are an anti-malware system. Malware is one of the biggest problems in computer security today. Most people do not have the expertise or knowledge to keep their computers free of malware if they are given unfettered ability to run programs on it, ’cause you double-click on a program, you don’t know what it does, and game over.
And so Apple has put in a tremendous amount of time and effort and resources into keeping their devices — and when I say their devices, your devices that you purchase from Apple that have Apple’s logo on it — keeping their devices free of malware.
Because they consider that part of the value that they deliver when you buy a Mac or you buy an iPhone, that it’s harder to get software that’s going to steal your data on that thing. And that’s true.
Apple is the leading platform for prevention of malware. It’s also the leading platform for Apple-related censorship.
And you can’t really have one without the other, right?
John Koetsier: Yes, indeed. There’s another core challenge here. So, you know, there may be a good stated purpose for all this, and we’re going to get to Apple’s statement about what it says about this, and also what it’s changing in the future as a result of your post, of your article on this.
But there’s another core part of this that is perhaps even more shocking. All that data was being sent unencrypted, correct?
Jeffrey Paul: That’s correct. And when you say ‘all that data,’ specifically, these OCSP checks. They were not intended as telemetry. These are checks to see if the developer’s certificate for an app has been revoked by Apple. So if Apple finds malware being released by a developer they can yank their certificate and your computer will check, it’ll see that and then it won’t launch their malware anymore.
Which is a good thing, normally.
But these checks were being transmitted unencrypted, and because the vast majority of developers on the Mac platform only release a single app, when it checks to see if that developer’s certificate ID is revoked, it’s actually saying, ‘Hey, guess what? I just launched this app, the one app released by this developer.’
John Koetsier: Yes.
Jeffrey Paul: And so, even if you don’t know anything about developer certificate IDs, if you just see this identifier when this app launches, that’s a unique identifier for the app, even if in Apple’s internal record keeping it’s a unique identifier for the developer of the app.
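In other words, for a single-app developer, the certificate ID is effectively an app identifier. A toy illustration in Python (the developer IDs, app names, and mapping are all invented; the point is only that a public catalog of signed apps lets an observer resolve most cert IDs to a specific app):

```python
# Hypothetical catalog assembled from publicly visible signed-app metadata.
# When a developer ships exactly one app, their cert ID names that app.
apps_by_developer = {
    "DEV_A": ["SoloNotes"],                     # one app: cert ID == app identity
    "DEV_B": ["BigSuiteMail", "BigSuiteCal"],   # multiple apps: ambiguous
}

def infer_app(developer_cert_id):
    """Return the app name if the cert ID unambiguously identifies one app."""
    apps = apps_by_developer.get(developer_cert_id, [])
    return apps[0] if len(apps) == 1 else None

print(infer_app("DEV_A"))  # SoloNotes
print(infer_app("DEV_B"))  # None (could be either app)
```

Since, as Jeffrey notes, the vast majority of Mac developers release a single app, the ambiguous case is the exception rather than the rule.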
John Koetsier: Yeah. The other interesting part is that this data that was flowing to Apple servers, well-intentioned or not, was bypassing VPNs. And that is a particular no-no.
Maybe not for the average person who doesn’t really care, but perhaps if you’re maybe a security researcher that some people don’t like, perhaps you’re a political figure of some sort, perhaps you’re a dissident of some sort, that’s a real challenge isn’t it?
Jeffrey Paul: Edward Snowden famously said, ‘Not caring about privacy because you have nothing to hide is like not caring about free speech because you have nothing to say.’
We have to preserve privacy throughout society, not because most people need or care about privacy, because, as has been demonstrated, most people don’t. But there’s a small percentage of people in our society who absolutely need free speech and absolutely need privacy, because they change the world. They’re labor organizers, or political organizers, or they’re speaking truth to power, or they’re investigative journalists investigating a corrupt government or a corrupt military, and things like that require privacy.
And if privacy just isn’t available in general, then we sideline all of those people, we sideline all of their work. So even if you personally don’t care about it, we need to preserve these rights.
John Koetsier: Talk to me about what this means for your computer, your technology, the things that you paid money for that you have in your home that you consider to be yours.
Does this make them, in a sense, not yours? Does this make them something that you’re kind of renting or something like that? How does this change the experience of owning a product?
Jeffrey Paul: So, Apple is sort of leading the way on this. I was tickled to find that a lot of people cared about my article, because I thought this was going to be brushed under the rug. This OCSP data leak, this inadvertent telemetry, has been going on for years.
This started with Catalina over two years ago, and I knew about it then. Other people knew about it then. It wasn’t secret, but nobody seemed to care. And so I opened this article with a link to a story by Richard Stallman, who founded the Free Software Foundation. It’s a story called The Right to Read, and it’s a far future science fiction story about a future where you no longer have the ability to share data with other people, that it’s been made illegal, and there’ve been locks put on your computer to prevent you from doing it, and breaking those locks is illegal.
And Cory Doctorow also — I linked to another one of his stories — has been writing about sounding the alarm on this and the entire industry has been moving in this direction for decades. This is not a new thing.
On iPhones, you cannot wipe and reinstall the device without it talking to Apple and transmitting a serial number to Apple. And even software that you have produced yourself from scratch cannot run on an iPhone without connecting to Apple over the internet and obtaining permission to do so first.
Now, this makes this platform virtually free of malware. It also makes it virtually free of dissent against Apple.
So to answer your question directly: this is a growing trend in consumer electronics, and eventually in computers in general, because you can’t make a computer that will only run certain programs. What that means is that unless large vendor corporations lock these things down, these computers can be used to do all sorts of things that the government and those corporations don’t want done with them.
So I hope we don’t, but it’s probable that we will see a day when most computing devices available for purchase by the general public will override the wishes and desires of the owner of that device in order to serve a remote entity, whether it’s a corporation or a government. We’ve already lost that battle on mobile platforms.
John Koetsier: And essentially what you’re doing then is you’re taking what was a computer, a multipurpose general use tool, and creating an appliance which is—
Jeffrey Paul: Absolutely.
John Koetsier: Only good for its design purposes.
Jeffrey Paul: But the thing you have to remember is that there’s no such thing as an appliance. They’re just phones that have computers in them, or washing machines that have computers in them.
John Koetsier: Yes.
Jeffrey Paul: And because of that, it’s not like, you know, a microwave where you just plug it in, it has a blinking clock, and you push some buttons on it. Because each of these things is actually a computer, and no amount of technology or [unclear] will make it not a computer these days, and because increasingly they’re connected to the network, they end up being surveillance devices, whether intended as such or not.
John Koetsier: And especially the ones that have recently had computer chips and wifi capabilities added to them, and are being used as appliances but have that computer capability in them, are often insecure, very noisy on the network, and vectors for somebody to attack and get inside your security envelope.
So Apple responded to what you said — which has to be somewhat gratifying. It’s not every blog post that somebody writes that Apple changes direction on and releases information on. They said that they were going to encrypt. That’s a good thing. What else did they say, and what did they not say?
Jeffrey Paul: So I was as surprised as anyone that one random guy with a blog can point a finger and yell at the largest corporation in the world and get them to encrypt their data and delete their logs. That was shocking to me.
That said, what Apple did here really is as close to an admission of guilt as you’re ever going to get from Apple PR. They’re encrypting the data. They’re deleting the logs. But ultimately, this is part of a greater trend as Apple tries to transition into a services company where Apple is going to be collecting a lot more data. And even if you opt out of all of the Apple services and all of the Apple online stuff, and just want to use an iPhone or a Mac as a device that you connect to internet services that are not Apple, that’s harder and harder to do.
To answer your question specifically about what Apple didn’t touch here … as Apple has grown and expanded into other services, Apple has tried to differentiate themselves very starkly from other services companies as being sort of the privacy option. Google is the ad company. Apple doesn’t want your personal data, it’s on device, etc., etc., etc. And that, for the most part, is true. There are many people inside of Apple who care very deeply about privacy. And I don’t think Tim Cook is lying when he says that privacy is a human right. I think he believes that.
But we can’t think about these companies — we can’t personify Apple. Apple is thousands of people with vastly different attitudes and hopes and dreams and politics, every single one of them. And the things that Apple does, being a company that large, sometimes the left hand doesn’t know what the right hand is doing.
They do produce cohesive products, but the way that each one of these little individual features works, you know, they have big product launches. They launched 20 big new features that use internet, they have millions and millions of users. We’re not talking about like a couple of servers in a closet here. We’re talking about huge teams that are launching giant infrastructure to support these things.
And sometimes they launch a half dozen of them with a new software release. So, Apple using OCSP to check certificates is pretty industry standard, and the industry standard is to not encrypt those checks. There’s a ton of stuff that is totally industry standard that is crazy from a privacy and security standpoint. It’s not necessarily an indictment of their focus on privacy, but I’ll tell you what it is: when you want to use a computer without talking to Apple, you can’t in a lot of ways.
Like … you can’t release software on the Mac without being part of Apple’s developer program. I mean, you can, but no one’s going to figure out how to run it.
So a lot of people that know the Mac might think, oh, to run unsigned software you just right-click it and select Open, right? Well, guess what? You actually have to do that twice, because the first time you do that, it won’t give you the Open button. The button says ‘Move to Trash’ and it mentions malware. You have to do it a second time to get the Open button, and it still mentions malware.
Nobody’s going to figure this out. If you’re selling software and you want to sell millions and millions of copies of your software, you can’t do that. That’s just not going to work. You have to participate in Apple’s developer program.
Back to what you were saying about this bypassing VPNs: Apple exempting all of their internal bundled OS apps from user-level firewalls like Little Snitch, these are indicators that Apple is trying to exert more control over the platform. And DHH, David Heinemeier Hansson, tweeted about this recently. These do have legitimate security benefits. It’s not bullshit when Apple says that this is something that benefits users, ’cause it does benefit lots and lots of users in a big way.
But it also benefits Apple’s revenue model.
And we can’t conflate these, we have to be very clear about where the line is between things that benefit the user and things that benefit Apple. And Apple has increasingly made it harder to find the escape hatch to diverge from the path that benefits Apple to maybe benefits the person who owns the computer or develops the software. On mobile, it’s gone. You can’t do this. You have to play ball with Apple to run mobile apps, even on your own device.
You can’t even download the compiler, the Xcode suite from Apple to build Mac apps without identifying yourself to Apple with an Apple ID, which requires a telephone number. In many countries, you can’t get a telephone number without a government ID, so this all ties back to the idea of Apple wants your privacy, but not from Apple.
Apple says, ‘Trust us, it’ll be okay. We’ll protect your data.’ But the U.S. military makes it so that Apple can’t keep your data private. It’s illegal in the United States for Apple to keep your data private from the state if the state asks for it.
John Koetsier: Yeah. And this is the core challenge, perhaps, not so much that we distrust Apple — because there’s some well-intentioned people there and a well-intentioned company overall — but things change. And when you establish norms of how computers work, and how data security happens and other things like that, and the environment changes, the political environment could change, who knows what will happen in the future? Who knows what will be illegal in the future? Who knows what will be unethical or immoral in the future as society evolves and changes?
Jeffrey Paul: Logs don’t go away.
John Koetsier: Exactly. Exactly. So a real challenge.
Jeffrey Paul: And even if Apple deletes all of them, because they were transmitted unencrypted, the military surveillance organizations that monitor all this traffic that runs across the internet backbones and ISPs, they’re going to save this forever. So Apple might delete all the data, Apple might stop logging all the data, but the last two years of your pattern of life data, what you open when and where and from which IPs, that’s going to be saved by the NSA forever.
John Koetsier: Yes. Wow, good thing I didn’t have Tor or anything like that. That would make me a special target. So —
Jeffrey Paul: Well, the other option is we just promote Tor and get as many people as possible to open and run Tor and then running Tor and preserving your privacy isn’t anomalous anymore.
John Koetsier: Exactly.
Jeffrey Paul: That’s really the goal here I think.
John Koetsier: It’s kind of a novel twist on security via obscurity, I guess, hey? Security or privacy by hiding in the crowd, I don’t know, being a part of the herd. Hmm. So this podcast is about tech that’s changing the world and innovators who are shaping the future. You’re a hacker, you’re a security researcher. Why do you do that? And what difference do you hope to make?
Jeffrey Paul: So, when I was a little kid, I saw this movie about these guys that would be paid by banks to hack into the banks and steal money. And then they go to the bank the next day and be like, ‘Hey, guess what? Here’s your money back, we’ll collect our fee now.’ It’s a rather famous movie, and if you know my username you’ll understand the deep impact that this has had on my life.
And I, like many hackers, have always been fascinated by puzzles and challenges, and if you give me a problem I’m going to try to solve it. And this is, just like writers or artists or anyone else, this is very much a compulsion. This is not something that we choose. We wake up and we have to do this. If you dangle the string in front of the cat, the cat is going to chase it. If you — Google used to famously recruit by putting just math problems on billboards, and when you would solve the math problem you’d put a .com on the end, and it was a Google recruiting page.
There are a number of us who see things in the world and are forced, it’s not optional for us, to figure out why they work the way they do and how they do what they do. And as a result of that, I’ve always been deep into computers, deep into any sort of electronics or other technology, cryptography. You know, what little kid doesn’t like having a secret decoder ring, right? And I was lucky to have been born at a time when cryptography, especially network cryptography, was really taking off, as the internet just started to be encrypted.
You know, I was using PGP which was then new software when I was a little kid, and I grew up with it. And I grew up with network cryptography and it’s always been just like my blood to me. It’s just something I do every single day.
And so I keep an eye on these communications out of computers, and on the growing lockdown of desktop computing. And I want to be clear: these cryptographic protections that totally violate your privacy and inhibit your ability to control the device that you own also bring a tremendous amount of security against stupid, mundane threats: malware, viruses, things that will steal your data. It centralizes that control with an organization that is presumed trustworthy by millions, but it does rule out all those other millions of people breaking into your devices, which is great for most people.
That said, all of these devices I keep a very close eye on, because I use them. I’m talking to you on an iMac right now. I have 10 computers in this room, less than half of them are Macs, but that means there’s like four Macs in this room. I have a bunch of different phones. I have a bunch of different radios, all sorts of stuff. And I like to inspect these things and see what they’re doing, because it’s totally opaque to most people.
You buy a new Mac, take it home, you turn it on, you hit ‘No to analytics,’ hit ‘No to iCloud,’ and it’s sitting there. There’s no programs running, you think maybe it’s not talking to the network. Guess what, it’s sending tons of data to the network.
When you open Apple News, it’s sending the news articles you look at, from your IP. When you open Weather, it’s sending all the cities you’re in, right? When you open Stocks, it’s sending the news items you click on and which stocks you add. When you open any of the media apps that ship with the Apple OS, it’s sending all of that data about what you click on. All of this information that’s leaving the machine is opaque to most users; they have no idea it’s happening.
One of the really, really big things in an upcoming article on my blog is going to be how tremendously much spyware is embedded in almost every single app on the App Store.
And it’s spyware from the manufacturer of the app, and Apple expressly permits it in the apps in the App Store. When you open the app, even if you’re not doing anything, it sends your activity to the manufacturer of the app. And of course that comes with your IP, and it comes with unique identifiers if you’re logged in. So basically every app on your phone can track you: if you have GPS location turned on, they can track you very specifically, and if you don’t, they can still track you city to city as you move around, from your IP.

And Apple expressly allows this sort of tracking in apps in the App Store, and most people have no idea. This isn’t visible to people on their computers or on their phones, and surfacing it, making it visible, is I think really, really important. That’s why I’m such an advocate for the program Little Snitch on the desktop, because it pops up a box and says, ‘Hey, this is about to happen. Do you want this to happen?’
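That prompt-before-connect behavior can be modeled as a tiny decision gate in Python. This is an illustrative toy under invented names (the app names, hosts, and classes are made up), not how Little Snitch is actually implemented; it only shows the idea of surfacing each new outbound connection to the user and remembering the decision.

```python
# Toy model of a prompt-before-connect firewall (illustrative only).
class ConnectionGate:
    def __init__(self, prompt):
        self.rules = {}       # (app, host) -> bool, remembered user decisions
        self.prompt = prompt  # callback asking the user: allow this connection?

    def allow(self, app, host):
        key = (app, host)
        if key not in self.rules:
            # First time this pair is seen: surface it to the user.
            self.rules[key] = self.prompt(app, host)
        return self.rules[key]

# Simulated user: allow everything except connections to a tracking host.
decisions = []
def user_prompt(app, host):
    decisions.append((app, host))
    return host != "tracker.example.com"

gate = ConnectionGate(user_prompt)
gate.allow("Weather", "weather-api.example.com")  # prompts; user allows
gate.allow("Weather", "tracker.example.com")      # prompts; user denies
gate.allow("Weather", "weather-api.example.com")  # remembered; no new prompt
print(len(decisions))  # 2
```

The value, as Jeffrey says, is visibility: even the denied connection tells you something was trying to phone home.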
John Koetsier: Well, I’m grateful that there are people like you who have to be like the cat and hit the string, and find the puzzles, and decrypt them, understand them, and solve them, and then share the results with the world. Thank you for spending some time with us.
Jeffrey Paul: Thank you for having me on. I really appreciate it.
John Koetsier: Excellent. Well for everybody else, thank you for joining us on TechFirst. My name is John Koetsier. I appreciate you being along for the show. You’ll be able to get a full transcript of this podcast in about a week at JohnKoetsier.com. The story at Forbes usually comes out right after that. And the full video, if possible — this one will have to be reviewed.
I agreed with Sneak already, with Jeffrey already, that if he agrees with the video, if he likes the video, if he didn’t say ‘um’ too many times, then this will be published on YouTube. So we’ll see if that happens. Thank you for joining. Until next time … this is John Koetsier with TechFirst.
Made it all the way down here? Wow. You’re dedicated 🙂
The TechFirst with John Koetsier podcast is about tech that is changing the world, and innovators who are shaping the future. Guests include former Apple CEO John Sculley. The head of Facebook gaming. Amazon’s head of robotics. GitHub’s CTO. Scientists inventing smart contact lenses. Startup entrepreneurs. Google executives. Former Microsoft CTO Nathan Myhrvold. And much, much more.
I’d appreciate it if you’d subscribe on your podcast platform of choice: