iOS is safer than Android, right? Usually … because getting on the iOS app store is harder than getting on Google Play. There’s more scrutiny of apps, their code, and functionality.
But now, for the first time ever, security researchers have found an ad fraud network on Apple iPhones that uses click injection to steal potentially hundreds of millions of dollars. It has been embedded since mid-2019 in over 1,200 apps with billions of downloads, including Talking Tom, Asphalt 9, PicsArt, Gardenscapes, and Helix Jump.
It allegedly works by spying on your activity on the phone and sending fake clicks on ads it sees you engage with. To learn more, in this episode of TechFirst with John Koetsier we’re going to chat with the man who found it: Danny Grander, Co-Founder & Chief Security Officer at Snyk, a digital security company.
Get the full audio, video, and transcript of our conversation below …
Also: please note that the ad network Snyk is accusing of ad fraud, Mintegral, denies the allegations. Mintegral’s full statement is in my updated article on this at Forbes …
Subscribe to TechFirst: ad fraud via Chinese ad network
Watch: ad fraud via Chinese ad network
Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.
Read: ad fraud via Chinese ad network
John Koetsier: iOS is safer than Android, right? Well, maybe …
Welcome to TechFirst with John Koetsier. So we often think of iPhones as safer than Android, right? Getting on the iOS store is harder than getting on Google Play. There’s more scrutiny of apps and what they do, their code, those sorts of things, their functionality.
But now, for the first time ever, security researchers have found an ad fraud network on Apple iPhones that uses click injection to steal potentially hundreds of millions of dollars. It’s in over 1,200 apps with billions of downloads, has been since mid 2019. We’re talking apps like Talking Tom, and Asphalt 9 is in there, PicsArt is in there, Gardenscapes, Helix Jump … many others as well. It works by spying on your activity, on the phone.
And to learn more, we’re going to chat with the man who found it, who is Danny Grander, Co-Founder and Chief Security Officer at Snyk. Snyk is a digital security company. Welcome, Danny!
Danny Grander: Thank you, John. It’s good to be here.
John Koetsier: Excellent. Super happy to have you here. Now let’s just dive right into it, big picture: what did you find? And what ad network is doing this?
Danny Grander: Absolutely. So we identified a malicious SDK component that is getting integrated into different iOS applications and getting into the App Store. That's how it gets downloaded to end users, the consumers. The SDK is distributed as a regular ad network SDK, something that developers can use to monetize their apps through ads. And along with the advertising functionality, we identified that it has some malicious functionality which leaks data back to the company.
But also, with the help of partners in the industry, a leading mobile marketing platform among them, we identified that they perform actual ad fraud, click attribution fraud.
And this is something we needed partners with actual data to confirm, because what we did on our end was just reverse engineer the library, which is available to anyone for download through the package manager, through CocoaPods. So we reverse engineered it and found the functionality, but we also confirmed that actual attribution fraud is being performed by the company.
John Koetsier: So let’s back up for half a second, because some people won’t know what an SDK is, software development kit. What is an SDK, and how does that get into an app?
Danny Grander: So, an SDK is just another software component; it's a general name for a software component that you integrate into your app. An SDK can be for different purposes. When we talk about an advertising network SDK, we typically mean a component that a developer integrates, and that component is responsible for showing ads within the real estate that the ad network gets inside the application the developer has built.
John Koetsier: Right. So, I mean, you could get an SDK for like integrating single sign-on with Facebook. You can get an SDK for, I dunno, some other functionality, maybe video functionality in your app…
Danny Grander: Absolutely.
John Koetsier: …you can get an SDK for just about anything. And this one happens to be one for showing ads, monetization of your app from an ad network. What ad network is doing this?
Danny Grander: So we’re talking about a company named Mintegral, and we believe it’s a Chinese company. It’s a spinoff out of a bigger company called Mobvista, and so that’s also the name of their advertisement network. And so this is practically all we know about this company.
John Koetsier: Interesting. And so, talk about the scope here. How many apps was this in? How many installs, downloads do they have?
Danny Grander: Right. So as you mentioned, this SDK got integrated into more than 1,200 applications. This has been happening for more than a year now, and obviously not every application has the same popularity and the same number of downloads per month. So we definitely observe a small portion of the applications having a huge number of downloads, and then a long tail of lower download counts.
But in general, like in total for all these apps, we estimate there are more than 300 million downloads per month.
John Koetsier: Wow.
Danny Grander: And just to give another example: if we take some of the most popular apps on that list, you mentioned Helix Jump is one, just that app has been downloaded more than 500 million times, more than half a billion times over its whole history.
Now, those are not necessarily all affected versions; we don't have that data. The 500 million downloads is something we looked at on the vendor side; they themselves share that data. But it gives you some kind of ballpark to start estimating the number of affected applications, sorry, users and devices…
John Koetsier: Yes, yes. So this is an ad network, and I should say malware, basically, an ad network SDK that is using a type of ad fraud called click injection. How does that work?
Danny Grander: Right. So what typically happens is that, as a developer, a game developer, you want to monetize through advertising. You could go and start working with one ad network, but what would typically happen is you would work with multiple networks, because you want to maximize advertising across the various geographies and types of campaigns and all that. So you would integrate several SDKs into your application. And there would actually be another SDK that mediates which network, at any given point, gets the opportunity to present an advertisement to the user.
John Koetsier: Yeah.
Danny Grander: So, in case the user was exposed to an advertisement, or clicked it, an actual engagement, and it led to an installation, say of another application, that advertising network would be attributed the activity, the click, and they would get the revenue, the dollars for that campaign, right? And they would also share some part of that with the developer. So that's what would typically happen.
And what happens in this case is that the SDK basically instruments itself into all the networking activity, all the openURL activity within the app, and the App Store, StoreKit activity. So they track it, and when they see a click happening, you as the user or player of the game clicking on some ad, they identify it, and then what they do is send another, fake click from their servers to claim that they were responsible for that click, that it came from their network.
So, you know, there is some degree of sophistication to doing it. Basically, you need to identify that the given campaign is in your network and in a competitor's network, and only then fake the click, right? But basically the way this industry works is that they attribute the success, the click, the event, to the last engagement point, to the last touch point.
John Koetsier: Yes.
Danny Grander: So in this case, suppose your legitimate network presents an advertisement, Mintegral sees that, and they immediately fake another click and so they get the attribution. And what we did to confirm that is basically we asked several partners to look at their data and see, and check for two events coming from two different networks with less than three minutes difference between them. And so this is a very unlikely scenario to happen in, you know, in real life.
John Koetsier: Yes.
Danny Grander: This happens less than 1% of time, right. And so with this particular network, they basically identified it happens more than 20% of the time. So …
John Koetsier: So to sort of summarize there: as a developer, you've got a game you send out into the world, you want to monetize it, make some money, so you put some ads in there, some SDKs with ads. When people click on an ad and perform an action, in this case installing an app, in other cases it could be different things, then there's an attribution step that happens, you get credited for the ad that worked, and you get some money from that.
Danny Grander: Absolutely.
John Koetsier: And this SDK is basically saying, oh, wow, somebody did something with regard to an ad, they clicked on it, they installed something, they did, they converted and we’re going to send another click really, really quickly, grab credit for that because it’s last-click attribution, and then we’re going to make some money from that.
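To make the last-touch rule concrete, here is a toy Swift sketch of how last-click attribution decides who gets paid. The Click type and helper are hypothetical, not anything from the SDK or the attribution platforms, but they show why a fake click fired a moment after the real one steals the credit.

```swift
import Foundation

// Hypothetical, minimal model of last-touch attribution.
struct Click {
    let network: String
    let time: Date
}

// The install is credited to whichever network logged the most recent click
// before it. A fraudulent click injected just after the legitimate one
// therefore becomes the "last touch" and takes the credit.
func attributedNetwork(clicks: [Click], installTime: Date) -> String? {
    clicks
        .filter { $0.time <= installTime }
        .max(by: { $0.time < $1.time })?
        .network
}
```

If the legitimate network's click arrives a second before the injected one, the injecting network wins under this rule, which is exactly the behavior described above.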
Now is this the first time that we’ve seen this kind of attribution fraud, this kind of ad fraud on iOS?
Danny Grander: So generally, we've seen some cases like that, but not on iOS. There was such a case in 2017, and then a year later in 2018, on Android. So we've seen attribution fraud happen, and apparently it's a major type of fraud.
But this is the first time we see such a case on iOS, and also with this kind of technical sophistication around getting into the App Store: obfuscating all their secret, malicious functionality, getting it through App Store review, and then actually, successfully, performing this type of fraud for more than a year.
So this is absolutely the first time we’ve seen something like that.
John Koetsier: Interesting, because we’ve seen this before in Android, right? Because Android used to have a thing where a new app gets installed and it sends a broadcast on the device and saying, hey, there’s a new app available. And people used to be able to grab that signal and then basically send out that click.
Danny Grander: Exactly.
John Koetsier: But we haven’t seen that on iOS.
Danny Grander: Yeah, just to add to that, they have been using a method called method swizzling. It's a technique, basically a type of function hook they perform on the central functions that are responsible for communication and for click events, like openURL events.
And this is something that is, you know, this is very uncommon to do for an app basically, right?
Like they are an SDK, they were supposed to just do some very specific things like showing an advertisement, right, but they instrument themselves in all the network communication of the app and basically not only take this data and send the data back to their servers, but also perform the actual fraud.
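For readers unfamiliar with method swizzling, here is a minimal Swift sketch of the general technique, hooking UIApplication's open(_:options:completionHandler:). It is not Mintegral's actual code, just an illustration of how any in-app SDK can observe every openURL call once its hook is installed.

```swift
import UIKit
import ObjectiveC

extension UIApplication {
    // Swap the system implementation of open(_:options:completionHandler:)
    // with our logging variant.
    static func installOpenURLHook() {
        let original = #selector(UIApplication.open(_:options:completionHandler:))
        let swizzled = #selector(UIApplication.hooked_open(_:options:completionHandler:))
        guard
            let originalMethod = class_getInstanceMethod(UIApplication.self, original),
            let swizzledMethod = class_getInstanceMethod(UIApplication.self, swizzled)
        else { return }
        method_exchangeImplementations(originalMethod, swizzledMethod)
    }

    @objc func hooked_open(_ url: URL,
                           options: [UIApplication.OpenExternalURLOptionsKey: Any],
                           completionHandler: ((Bool) -> Void)?) {
        // A hooked-in SDK sees every URL the app opens here; a malicious one
        // could report it to its own servers before passing the call along.
        print("openURL intercepted:", url.absoluteString)
        // Implementations were exchanged, so this call runs the original open().
        hooked_open(url, options: options, completionHandler: completionHandler)
    }
}
```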
And one important thing to say is that this functionality is running even if the SDK was disabled. So basically, if a mediation SDK, a mediation network, didn't enable this SDK, because in a particular geography you would want to use a different network, say, the malicious functionality would still be running. So that's an…
John Koetsier: So as a developer, you didn’t actually maybe even choose that particular SDK, you just happened to have a mediation platform SDK that aggregates multiple other SDKs in.
Danny Grander: So yeah, that's one case. I'm not sure you actually don't get to choose; I think you do. But what I meant is that as soon as you integrate it, even if you want to use it for just 1% of your audience and do not enable it for the other 99%, it will still be working and doing its thing.
So that’s the thing, I’m not sure you don’t get to actually choose. I think you do need to choose to integrate…
John Koetsier: Okay.
Danny Grander: But once it has done all the hooking, the method swizzling, all of this is happening even though that particular SDK wasn't enabled.
John Koetsier: So what are the privacy implications here? What is the app watching? What can it see? Obviously it’s doing ad fraud, so it’s watching some of the things that you’re tapping on or clicking on, so it can monetize those. What else is it doing? Can it see anything outside of the apps that it’s been integrated into?
Danny Grander: So first off, this is within the scope of the app. I want to be very clear on that. You know, the SDK is installed within like, is deployed within a specific app, and so they are doing their interception within that particular app.
So all the traffic that goes out of that app, they can actually instrument and intercept and leak back. It does not go outside of that, but within the app you have quite a lot of data, right?
It will depend, it would absolutely depend on the app, right? Some apps would have, you know, secret whatever like chat text messages, right? Like that’s, other apps might have just the number of coins you have on some game, right?
John Koetsier: Yes.
Danny Grander: But it’s important to say that we basically have seen three types of kind of hooks, interception logics they have.
One would be intercepting openURL calls, that is, when you click on a URL, or when the app opens, say, a QR code scanner. So you're scanning a QR code and then openURL happens, basically you're sent to a different URL. So that's one thing they are tracking, and they're sending back the URL along with a bunch of other parameters.
The second thing they're tracking is StoreKit events. Basically, this is when you open the App Store from within the app, right?
But I think the most dramatic part, to your question about the privacy violation here, is that they actually intercept all HTTP communication, all HTTP requests and responses, POST or GET requests, and they can leak back the URL and the headers.
They do not leak back the body. That is also important to say, because the functionality for leaking the body is there, but it seems like they have some sort of bug, so it wasn't actually doing that. But the headers and the URL are quite a lot. First of all, in the HTTP request headers you have all the cookies and the authentication tokens, and that in itself is very significant, right? Along with potential private data, which again depends on the app; it would vary from one app to another. But it would absolutely have some sensitive information there.
So it’s very hard to say for, you know, to generalize it, but very clear that we’re talking about all HTTP request headers and the URLs. So that’s, that can be…
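Danny doesn't spell out which mechanism the SDK uses to see the app's HTTP traffic; it could be the same swizzling approach applied to the networking classes. One generic, well-known way for in-process code to observe request URLs and headers is a custom URLProtocol, sketched below purely for illustration.

```swift
import Foundation

// Observes every request routed through sessions that consult this protocol.
// It only logs the URL and headers and then declines to handle the request,
// so loading proceeds normally.
final class ObservingURLProtocol: URLProtocol {
    override class func canInit(with request: URLRequest) -> Bool {
        if let url = request.url {
            print("URL:", url.absoluteString)
            print("Headers:", request.allHTTPHeaderFields ?? [:])
        }
        return false
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        request
    }
}

// Registering covers URLSession.shared; custom sessions would need the class
// added to their configuration's protocolClasses.
// URLProtocol.registerClass(ObservingURLProtocol.self)
```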
John Koetsier: So they know where you've clicked, what you've clicked on, where you went, and some of the details that are included in the actual functionality of that click, like you said, the POST and the GET parameters for those clicks. But even the bad guys have bugs, so they didn't leak the request body.
Danny Grander: Yep.
John Koetsier: I guess that’s small mercies. How is something like this not seen by Apple? How has this, I mean, apps are scrutinized before they get approved and of course SDKs are integrated into an app. Is it dynamically loaded after that?
Is it something that Apple wouldn’t see even if they were looking pretty hard? How’s it get past App Store review?
Danny Grander: Yeah. So that’s a great question.
First of all, you know, historically Apple’s been doing a great job and I think they’re still doing a great job around, you know, scrutinizing the applications that get into the App Store. And their security is top notch, that’s for sure.
Now, how come this happens? So, you know, this is inevitable, really. Like it’s impossible to scan every little piece of code within every component there is for each and every app, right? And so, there will be misses, right? And I think what is important in such cases is, you know, this is a cat and mouse game in the end. And, you know, what’s important is how quickly you react, how well you react for different challenges from the mouse, right?
And so, and this is again, Apple’s been doing great on this front for a long while, but it’s very hard to kind of come at this point and say, you know, how come you missed it? It’s really impossible to kind of catch all the evil little things that happen.
They’re, again, doing absolutely great compared to say Android, but it’s hard to kind of find everything.
And one other aspect around that is that the data was obfuscated. So you look at the SDK and it looks benign. You don't see anything fishy there; you look at the strings and it all looks fine. But then you start digging in and you identify that the Base64-encoded strings are actually not standard Base64, it's some variation of Base64 encoding. When you decode them through the actual decoding they are using, you get some interesting strings immediately.
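The exact encoding variant isn't described, so the following Swift sketch only shows the generic shape of decoding a "Base64 with a non-standard alphabet" scheme: translate the custom alphabet back to the standard one, then run an ordinary Base64 decode. The alphabet here is a placeholder supplied by the caller, not the SDK's real one.

```swift
import Foundation

// Generic decoder for a Base64 variant that uses a shuffled 64-symbol alphabet.
func decodeCustomBase64(_ text: String, customAlphabet: String) -> Data? {
    let standardAlphabet =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
    precondition(customAlphabet.count == 64, "alphabet must contain 64 symbols")
    // Map each custom symbol back to its standard counterpart.
    let mapping = Dictionary(uniqueKeysWithValues: zip(customAlphabet, standardAlphabet))
    // Characters outside the alphabet (such as '=' padding) pass through unchanged.
    let normalized = String(text.map { mapping[$0] ?? $0 })
    return Data(base64Encoded: normalized)
}
```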
And so, you know, that is something I would suppose Apple would automatically flag, but they made attempts to evade proxy interception and simulators, basically what are called anti-debugging tricks. So they try to identify when you, as a researcher, or an automated analysis framework, are trying to analyze the application. So it's like, the mouse is very good, behaves very well, but when it gets onto your device and is given the command to fully function, then it does its thing.
John Koetsier: The command to fully function, how does that happen? Does it phone home? How’s that work?
Danny Grander: So it's very simple: in the initialization, in the first phase when the SDK is initialized, it gets its configuration from the server.
So that configuration gives the command of, first of all, whether to enable or disable the anti-debugging tricks, the anti-tampering, anti-reverse-engineering protections. And it gives a flag for each and every type of hook I mentioned: the openURL, the StoreKit, and the general HTTP traffic interception.
So each and every one of these is a flag that can be set, and this can also be done selectively, right? That makes it harder to detect. I can choose to enable a flag for just a portion of devices, or just one specific geography, or whatever else I wish to do.
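A sketch of what that server-driven switchboard could look like in Swift; the field names, and the idea of gating by a device bucket, are invented for illustration and are not the SDK's actual configuration keys.

```swift
import Foundation

// Hypothetical shape of the remote configuration: one flag per hook, plus
// anti-debugging and a partial rollout.
struct RemoteConfig: Codable {
    let antiDebugging: Bool    // enable anti-tampering / anti-debugging checks
    let hookOpenURL: Bool      // intercept openURL calls
    let hookStoreKit: Bool     // intercept StoreKit / App Store events
    let hookHTTP: Bool         // intercept HTTP request URLs and headers
    let rolloutPercent: Int    // activate on only a fraction of devices
}

// Activating the behavior for only a slice of devices (or one geography) is
// what keeps the anomaly small enough to stay under the statistical radar.
func shouldActivate(_ config: RemoteConfig, deviceBucket: Int) -> Bool {
    deviceBucket % 100 < config.rolloutPercent
}
```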
John Koetsier: And in fact, you mentioned that as well, because you said you checked with a number of attribution partners and it was, I think 20% is the number that you referenced, you know, 20 to 30% or so of the cases where they were actually committing fraud. So they weren’t trying to do it 100% of the time, which would be really, really obvious.
Danny Grander: That’s right.
John Koetsier: They were trying to do it at a low level, and that’s probably part of the reason why they were so successful and managed to escape discovery for a year.
Danny Grander: Absolutely. Yeah. I think on that level, the data level I'd say, the mobile marketing platforms actually have the data, but still, they were flying beneath the radar.
So, you know, when we inquired and posed a concrete question, like 'Given a click, are any other clicks coming from a different network within less than three minutes?', you can simply query your data, and you should see that in less than 1% of cases. And, you know, when you see 25, I think we've seen some 24% to be precise…
So that's a significant number, right? And you're absolutely right, they could go bigger, they could go up to whatever, 50% or all of the clicks, but obviously it's a balance they had to keep, you know, to not be unearthed.
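The check the partners ran is easy to express in code. Here is a rough Swift sketch, with a hypothetical ClickEvent type, of measuring how often clicks from two different networks land within three minutes of each other for the same install; under 1% is the normal baseline Danny cites, and roughly 24% is what was observed.

```swift
import Foundation

// Hypothetical click record as an attribution platform might store it.
struct ClickEvent {
    let installID: String   // identifies the install being attributed
    let network: String
    let time: Date
}

// Fraction of installs where clicks from two different networks arrive within
// `window` seconds of each other.
func suspiciousClickRate(events: [ClickEvent], window: TimeInterval = 180) -> Double {
    let byInstall = Dictionary(grouping: events, by: { $0.installID })
    guard !byInstall.isEmpty else { return 0 }
    let suspicious = byInstall.values.filter { clicks in
        let sorted = clicks.sorted { $0.time < $1.time }
        for (earlier, later) in zip(sorted, sorted.dropFirst()) {
            if earlier.network != later.network,
               later.time.timeIntervalSince(earlier.time) < window {
                return true
            }
        }
        return false
    }
    return Double(suspicious.count) / Double(byInstall.count)
}
```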
John Koetsier: Yes. I have a question about Apple’s new methodology around ad attribution. So Apple, of course, in iOS 14 which is coming out very, very soon, is introducing SKAdNetwork and deprecating to some extent the IDFA, which is an identifier on the device for advertisers.
And so with SKAdNetwork, which is Apple’s new way of doing app install attribution, would this type of spam, sorry, this type of adware even work?
Danny Grander: So I think what will be changing in this case is that, at that point, only Apple will have the data. So, you know, for good or for bad, right? They might be able to identify these kinds of anomalies in the data, like the one I mentioned, whereas now we have multiple MMPs, mobile marketing platforms, that have the data.
And I’m not an expert on advertisement, you know, industry and the architecture there, but from what I gather, that would be the major change. Basically they will be owning the data and they will be able to identify, hopefully right, such discrepancies, such significant anomalies within the data. And it would be harder for others to do it basically.
John Koetsier: Yes, yes. So how did you find it? Talk a little bit about, you know, maybe a little bit about what you do and why were you looking for this, and how did you find it? It seems like a really challenging thing to find, and yet you found it.
Danny Grander: Absolutely. So, first of all, at Snyk we're building security products for developers, for DevOps professionals and application security professionals, and that's what we do. We started with a software composition analysis product that identifies security risks and malicious components in your open source dependencies, in the components that you pull into your application.
We expanded to containers and, you know, now infrastructure as code, so this is our space. We analyze components that developers use in their applications. Now, we have a research team, a big research team, and previously we've been working mainly in the open source world. At this point we have unearthed, found, and fixed thousands of vulnerabilities in open source components, basically working with the maintainers.
And then we expanded to CocoaPods, which is basically the package manager for iOS. That is where we started looking into other types of components as well, because CocoaPods doesn't host only open source components; it can be any type of component. This is a good example of a closed source component: the Mintegral SDK is closed source, and so we started looking at it as well. Especially with the recent upcoming changes from Apple around advertising, we found it interesting to look into those. So we started looking at several, but this one in particular caught our eye, and the more we dug, the bigger it became.
So this is just, you know, what we've been doing this year; the research team, that's what we are focusing on. We look at open source components, looking for vulnerabilities, very common vulnerabilities in components, but we also look at closed source components and identify anything that, you know, is not good. So that's really where we started, and absolutely, this is also where we'll be looking more. The more we dug into this space, the more we understood that there is a lot happening there. It's a big world and we have a lot to contribute with our research, with our capabilities, looking into those components and analyzing them in masses, doing mass analysis; it's not just taking the SDK apart one line at a time. We can do that at scale.
So we’ve been, we are planning to look for more such cases and not only in the advertisement industry. You probably know, like supply chain attacks in the last few years are on the rise.
This is absolutely a problem. The types of attack vary. Sometimes a developer's account is taken over by some malicious actor and they insert malicious code. Sometimes an open source library appears on the package manager and at some point it turns malicious.
So, you know, there is also a type of squatting attack, basically when you name your package with a name similar to a legitimate package, so the developer unknowingly installs the wrong package because of a minor typo.
So there are a lot of techniques that bring malicious code into a developer's application. And this is absolutely one of them, when basically from the start there is malicious intent in delivering an SDK or a component to the world.
So that’s, I think where also our advantage is. We know and do well kind of analyzing binary code, not only open source. So that’s also where we can, you know, it took us I think, less than a week to fully disassemble this thing and understand how it works and proceed to disclosing to Apple…
John Koetsier: You have disclosed it to Apple already?
Danny Grander: Yes, absolutely. We disclosed to Apple on the 17th and had a call with them planned. We'll have another call with them, yeah, so this has already, you know, been…
John Koetsier: Do you know what action Apple is taking as a result of that?
Danny Grander: So we do not get updates on all their progress there, but they are, you know, they’re working on it. They’re absolutely taking this seriously and I’m sure they will be able to comment on this.
John Koetsier: Okay, excellent. What will they do in that scenario? I assume they’ll be notifying developers. I assume they’ll be trying to get rid of that SDK code. I’m assuming apps won’t be taken down for this because it doesn’t seem like it … what do you think will happen?
Danny Grander: So it's hard to say, you know, they have several options. I don't see everything that they see, but I'd guess that they definitely would want to stop the malicious functionality, right? You know, while we speak, there is data being leaked; the functionality is there. And so that is something, you know, I would want to stop immediately, and this can be achieved by removing the apps, but that's, I guess, very aggressive.
John Koetsier: That’s kind of the nuclear option, yes.
Danny Grander: Yes.
But I guess you can, for example, reroute the DNS for the domains where the data goes, right? So basically, at the device level, you can point the address the data is being sent to at whatever, your own server, or just nothing, right? So that it won't go out.
And I think, you know, it does require updates, and this is just one idea that came to mind, but I'm absolutely not in a position where I can see all the levers they can pull to get rid of this. But longer term, absolutely, they would want to get rid of those components in these apps, so basically the developers would need to deploy new versions. We'll be waiting to see what the action is. We did give them one week, heads uh…
John Koetsier: Heads up?
Danny Grander: Heads up, yep. And it's absolutely not a huge time window to act on this incident, but this is how we look at it: this is an incident, not a security vulnerability that you would disclose to a vendor where they can take some time to triage and understand the impact. This is an ongoing attack, right? And so that's why we chose to follow a very short timeline in this case.
John Koetsier: Yes.
Danny Grander: And so, yeah, so that’s basically…
John Koetsier: That makes sense. Yeah. So I’m trying to understand the scope of what Mintegral or Mobvista, which you said is the parent company, might be, what type of dollars they’d be pulling out of this. I mean, we know, as you said, it’s been running since August 2019. We know that the apps it’s currently in, 1200 of them, are getting 300 million downloads a month, in and around there.
So that’s definitely in the billions.
We know that mobile user acquisition, mobile marketing to get somebody to install your app, I mean, that can easily be $3 per install, $5 for some high value areas. For instance, I noticed in the list of apps, there’s a dating app in here. We didn’t see a FinTech app in the list that I checked that had this SDK installed, but there’s going to be some that are higher leverage and you can spend $20, $30 to acquire a user in some of those really high value verticals.
Do you have any sense of, you know, is it hundreds of millions? Is it potentially billions of dollars that they may have just sucked out of the ad ecosystem here with this fraud?
Danny Grander: So frankly, it’s very hard to estimate from the data we do have at this point. You know, it’s very, very hard to even guesstimate this number. I would be also very interested to know what are the amounts we’re talking about here. I’m sure the mobile marketing providers, other ad networks, can actually look at their data when this goes public, and they actually can analyze and see exactly what those numbers are, because A, you know, they can do the math.
They know what campaigns they have, where they had the click, and so the MMPs, the mobile marketing platforms, with their help we could get to that number. But this is not something we did at this point. We really focused on the technical aspects: we were looking at software components, we analyzed this code, we found all the functionality there. We're posting our technical blog post about it and sharing all the factual things that we identified.
John Koetsier: I understand, that makes a ton of sense. So you're still looking for more. Obviously, given the length of time this one existed, and the fact that it remained undetected and didn't get greedy and kick in on 100% of app installs and clicks, we really don't know how much more is out there, presumably.
Danny Grander: Right. And you know, that's always the case. If we had not found this, we would assume there is nothing bad there, right?
John Koetsier: Yeah.
Danny Grander: So I, maybe as a security person, I’m on the other side of it. I’m kind of, let’s assume there are a lot of those, right. Same, you know, if you look at even the Covid situation. We’ve been living our lives and then this thing happened, and it’s like, oh wow, apparently this thing can happen. And so what I’m assuming is, well, we’ll have more Covids coming in and it’s not necessarily a pessimistic view, it’s just, you know, I think definitely as a security person, it gets you more ready for this case.
And, you know, I have no doubt this is how Apple’s security team is looking at this. They assume there are more such cases and I’m sure they will be doing, they will be looking for such.
And you know, they actually have a lot of data and capabilities to do that. So I’m sure they’ll also be having their own wins in identifying more cases. Some things that we might even not hear about, but it’s, you know, that’s also where they are the strongest to kind of, you know, to look and identify this.
John Koetsier: It kind of seems you have to be both an optimist and a pessimist to be a security researcher. A pessimist that there’s more bad stuff out there, and an optimist that there’s more work for you down the road.
Danny Grander: Yeah. Yeah, I guess. And so, if you look at the history of security or privacy, things are always becoming more complex, and we're improving on a lot of fronts, absolutely, yes. But obviously there is also progress on the other end, and fraudsters, attackers, will be getting better as well. So, yeah, this is the world we're living in.
John Koetsier: Excellent. Well, Danny, thank you so much for spending so much time with me. I really do appreciate it.
Danny Grander: Absolutely, thank you, John.
John Koetsier: Excellent. Well, and everybody else, thank you for joining us on TechFirst. My name is John Koetsier. I appreciate you being along for this show. You’ll be able to get a full transcript of this podcast in about a week at JohnKoetsier.com. And the story on Forbes will actually in this case appear very, very quickly on, frankly, Monday. This is Sunday that we’re recording this. So the full video will be available after that on my YouTube channel as well.
Thank you for joining, maybe share with a friend. Until next time … this is John Koetsier with TechFirst.