Mobile measurement in the age of privacy with Brian Krebs of MetricWorks


As every mobile marketer on the planet knows … iOS 14 will in effect deprecate the IDFA by making it opt-in. The IDFA is likely going the way of the dodo bird and the third-party cookie. The Google Ad ID (GAID) might follow in a year or so.

Does that mean the end of mobile marketing measurement?

Not according to Brian Krebs of MetricWorks. In this episode of Retention Masterclass, Peggy Anne Salz and John Koetsier chat with Brian about measurement and retention in the age of privacy.

Scroll down for full audio, video, and a transcript of our conversation …

Subscribe to Retention Masterclass: mobile measurement after IDFA


Watch: Mobile measurement in the age of privacy

Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.

Read: Mobile measurement without granularity

(This transcript has been lightly edited for clarity.)

John Koetsier: The iOS 14 sky is falling. Or is it? Hello and welcome to Retention Masterclass. My name is John Koetsier. 

Peggy Anne Salz: And my name is Peggy Anne Salz, and we are, as always, co-hosts on your show. 

John Koetsier: Absolutely. As always — wow, we’ve been doing this show forever!

Peggy Anne Salz: As always … we’re getting up to episode 20, John. And this was initially just because we were thrilled about the topic. And now it’s like … 

John Koetsier: That is true.

Peggy Anne Salz:  … it’s the real deal.

John Koetsier: That is true.

Peggy Anne Salz: Yes. Great to be a pioneer. 

John Koetsier: Well, as every mobile marketer on the planet knows, iOS 14 will in effect deprecate the IDFA by making it opt-in. So the IDFA is going the way of the dodo bird, that mobile identifier that marketers use. With of course, the other dodo bird … the third-party cookie. The Google Ad ID (GAID) might follow in a year or so.

Peggy Anne Salz: Yeah, it’s looking pretty possible. And that of course means measurement. Yeah, marketing measurement is fundamentally changing … granular to aggregate, user level to device-specific to general, to who knows? That’s why we’re here on this show. Literally, who knows? 

John Koetsier: Well, you know what? I think we have some ideas and we might have a guest who has some more, or at least we hope so. That’s why we have him on the show.

Question is: what does this all mean for optimization? For cohorts? For LTV calculations? For all the data and metrics that marketers have been using for the past few years? 

Peggy Anne Salz: And that’s why we have him, right? We have Brian Krebs. He has spent almost a decade — well, actually most of the decade — building solutions for measurement for mobile marketers. Here’s what I like, John, he’s on a mission to measure, right? So with that in mind, we have high expectations. No pressure, Brian. Welcome to Retention Masterclass! 

Brian Krebs: Thanks so much John and Peggy. It’s really great to be here. 

John Koetsier: Awesome, it’s great to have you. Brian, we’re going to jump right into it.  IDFA deprecation is having, and will have, massive consequences for marketers. What’s the biggest change that you see? 

Brian Krebs, CEO at MetricWorks

Brian Krebs: Yeah, so when the IDFA disappears, devices go dark … essentially on day one.

So, this early-next-year timeframe when this privacy update rolls out for iOS 14, devices go dark almost on day one in terms of tracking. So, you still have your IDFVs, as long as, as an app developer, you have taken the time — this time right now — to refactor a lot of your code, a lot of your internal app analytics and business intelligence to make sure you’re focused on IDFVs.

You’re still okay from an app analytics standpoint, but in terms of marketing, we’re really focused on tracking. And that means understanding which devices are in which apps across app developers, across developer accounts in terms of the iOS App Store.

When that happens, a lot of things kind of fall. Definitely retargeting. You can’t understand users in your app unless you know the IDs and those are consistent in other apps. Right? So retargeting falls immediately. Measurement in its current state falls immediately, kind of replaced by this new thing: SKAdNetwork.

That is new, it’s in a new version, but it was actually released a little while ago to almost no fanfare, or even notice, outside of just a couple of really, really in-depth industry analysts. But it’s now received its upgrade. That will be the de facto standard for measurement, and measurement in its current form completely goes away. So, everything goes dark basically, in terms of marketing.

Peggy Anne Salz: That’s so optimistic. 

John Koetsier: It’s more optimistic than Eric Seufert, who announced this topic with a big atom bomb explosion on his stories.

Brian Krebs: Exactly. It gets pretty amped up as far as the rhetoric. And look, it is … it’s a major, major change in the industry and the industry has moved through these changes in the past.

It’s not like everything’s going to completely fall apart. People will innovate. People will build new solutions that are more privacy-focused, is the key.

Right now we’re in a world where we’re not worried about privacy as an ad tech industry, and as … just a mobile industry in general. And GDPR and kind of legal barriers have cropped up. And personally, I think those are just absolutely necessary and long, long overdue. But as an industry, it’s still taken us a while to catch up. This is kind of the nail in the coffin, if you will. Maybe not apocalyptic, but definitely significant.

John Koetsier: Yeah, and interesting, we had a question as well about fingerprinting. And according to the letter of the law for Apple, fingerprinting is tracking. And therefore, if you haven’t asked for permission to track, you know, you’re going to have issues there as well. 

Brian Krebs: That’s right.

John Koetsier: Peggy, moving on to you. 

Peggy Anne Salz: Yes, absolutely. I wanted to just cover the stuff at the top. So I want to step back, because you are focused on measurement. But hey, you also have a company doing that as well, MetricWorks, right? And you’ve been doing this longer than we’ve been worried about iOS 14. So there’s something there, some sort of history. Just give me a brief background about your company. You founded it for a reason. Now you have a mission. What is it? 

Brian Krebs: That’s right. MetricWorks has been around for a while, but in various forms. So I was a co-founder of TapHeaven. We were originally a DSP, building solutions to make marketing more effective, which is what I’ve been doing now for the better part of a decade. TapHeaven was eventually sold to a Norwegian company called Target Circle, and then a couple of years later, it was actually broken back out into a separate subsidiary called MetricWorks.

Kind of a lot of my same TapHeaven team is with me within MetricWorks, and certainly we’ve picked up some great resources from the Target Circle umbrella along the way. But yeah, so over that time, it’s just about making mobile marketers more effective. And that in the early days was kind of DSP and RTB, which just took way longer than I personally expected to pick up on mobile. So it was a play that was way too early.

Learned my lesson there, and we’ve since shifted into automation at a greater scope. So, rather than bidding on those impressions with really strong prediction algorithms understanding conversion rates, we kind of went a level up to essentially automate all of mobile user acquisition across all channels. So that’s where we’ve been for a while now. And as of about 9 months ago, we started a research project trying to understand the limitations of the last touch attribution model and kind of figuring out how to get around those limitations. And that took us into a variety of paths, including econometrics, including incrementality testing.

But, you know, several months later we found out that the IDFA indeed was disappearing. So it just happened to be really solid timing, basically.

John Koetsier: Very, very good timing. I mean, they often say about startups ‘Timing is everything,’ right? It may be even more important than the actual idea in some cases.

Brian Krebs: So true.

John Koetsier:  Let’s dive into it then. So, in iOS 14, you already said stuff goes dark, right? There’s a lot of things that you won’t be able to do.

But as a marketer, especially as a retention marketer, you need to understand retention. You need to understand your daily active users. You have app analytics for that, of course, but you need to understand LTV, and ROAS, and stuff like that. You can get that at certain levels. What’s that look like in iOS 14, in your view?

Brian Krebs: Yeah, it’s a great question. So app analytics, like you said, it kind of covers part of the retention picture, but it’s an incomplete piece of the picture, right? Because if you’re looking at retention, if you’re looking at retaining users, you’re looking at making sure that your app or your product is optimized to engage users and retain them, of course.

And app analytics, no problem there, IDFV will be your go-to proxy to understand users or devices.

The thing is, there’s another part of retention, which is understanding where those users are being sourced from. If you don’t have a great idea of where users are being sourced from, and you’re not acquiring quality users, your retention is going to be severely hampered anyways. So that is the key here.

When the IDFA disappears, it’s understanding where your users are being sourced from, from each media source, and hopefully at fairly deep levels of granularity from a media source perspective. Where are the high-retention users being sourced from? Where are the high-value users being sourced from? Those are questions that can no longer be answered, at least in the way we’re used to today, as soon as the IDFA disappears.

Peggy Anne Salz: So very much, it’s about a different approach — and you’re going to talk about that approach in a moment — but it’s also about a mindset. I mean, what you’re basically saying is that, that search for those highly valuable users … their probability or propensity that they will retain at a high level is gone. Right?

Brian Krebs: Yeah.

Peggy Anne Salz: So, elaborate on that, because what does that mean? Does that mean that I have to do it completely differently? Or I need to tweak it slightly? 

Brian Krebs: So, in this new world, we’re going to all as marketers end up with SKAdNetwork replacing the glue that MMPs used to provide, from the standpoint of attribution, right? Where are my users being sourced from? And when that happens, that change carries with it a bunch of limitations that are really limitations inherent in SKAdNetwork.

So number one, with SKAdNetwork you get a hundred campaign IDs. Some people are running thousands of campaign IDs per media source. So that’s already a massive limitation right there in granularity, if you will.

SKAdNetwork also has a problem of data completeness, if you will. So number one, the install date of users is obfuscated … for privacy reasons. So you never really know when users install, even though you get an install postback that you can count. You don’t really know when that user installed. Cohorting KPIs by install date, that also evaporates.

Peggy Anne Salz: Yeah.

Brian Krebs: And then finally, from a data completeness standpoint, you’re only measuring a single conversion value. And that happens to be this six-bit number, basically, that most people are already thinking of as an ID, like some sort of event ID, you know, level completions in a game, something like that.

Others are thinking more of buckets like LTV buckets, or revenue buckets, maybe even retention buckets, but you only get 64 of them. And those are even limited in scope just because you get this rolling 24-hour timer …

Peggy Anne Salz: Yeah.

Brian Krebs: … that first 24-hour period is so critical now, because if that first 24-hour period elapses and the user doesn’t come back into your app, giving your app code a chance to reset that timer with an increase in the conversion value ID … it automatically stops and fires off that postback to the attributed network.

And that’s all the data you get.

So there’s going to be a little bit of a tail-wagging-the-dog scenario, where companies I’m talking to are already looking at this from a product perspective. How do we manage our product in order to make sure that within those first 24 hours — which is the only period we’re guaranteed to know about — we get as much information about the value of the user as possible?
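To make that six-bit, 24-hour constraint concrete, here is a minimal illustrative sketch (in Python for readability; in a real app this logic would run client-side around SKAdNetwork’s conversion-value API) of how a developer might pack early signals into a single 0–63 value. The signal choices and bucket thresholds here are hypothetical, not an Apple or MetricWorks scheme:

```python
# Hypothetical sketch: packing early-user signals into SKAdNetwork's
# single 6-bit conversion value (0-63). Signals and thresholds are illustrative.

def encode_conversion_value(levels_completed: int, d1_revenue_usd: float) -> int:
    """Pack two early signals into 6 bits: 3 bits of progress, 3 bits of revenue."""
    progress = min(levels_completed, 7)  # 3 bits: cap level progress at 7
    # 3 bits: revenue bucketed on rough price-point thresholds (made up here)
    thresholds = [0.0, 0.99, 2.99, 4.99, 9.99, 19.99, 49.99]
    bucket = sum(1 for t in thresholds if d1_revenue_usd > t)  # 0-7
    return (progress << 3) | bucket  # combined 6-bit value, 0-63
```

The point of the exercise is that every signal a marketer wants later has to be squeezed into this one number before the rolling 24-hour timer expires and the postback fires.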

But in general, that SKAdNetwork solution is going to be the only solution replacing that normal last touch glue. But even that has problems. As I alluded to before, we’ve been kind of cozy, in this sort of … this blanket that is more of like an emperor’s new blanket, if you will. It’s nothing, basically. We’re just kind of wrapping ourselves up in nothing, and it’s comforting. Last touch has been around forever, and it’s kind of really well-aligned with how we buy users as well in mobile. Usually user acquisition is billed on a CPI or cost per install basis.

So, last touch has kind of made sense from that perspective as well, but, in general, last touch has so many problems with it. Even SKAdNetwork, an implementation of last touch, has severe limitations from what we’re used to today. Even today’s solutions that are all based on last touch are severely crippled, simply due to the issues with last touch.

Mostly the fact that last touch, you’re looking at the device level. So number one, you can’t understand true incrementality. You’re only understanding a single touch, the last one in the conversion path, which could be made up of dozens of touches.

John Koetsier: Yeah.

Brian Krebs: That last touch might not actually have very much incremental value at all. 

John Koetsier: Absolutely. That’s a huge issue. I mean, because anybody who knows how they actually buy something, or install an app, or anything like that, knows that there’s multiple factors that come into play most times. And sometimes the last one is by far the least important — it’s just a little trigger, it’s just a little reminder or something like that. So there’s definite …

Brian Krebs: That’s it.

John Koetsier: … issues for that. There’s lots of issues as well — it’s interesting, I’m just going to riff for a little bit, because you talked about companies re-architecting their apps and their app experience in order to ensure they get as many touches in the first 24 hours after install, so they get as good data as they can during that only period when they’re actually guaranteed to get data. 

And that’s really interesting because will you sacrifice long term retention for immediate engagement? And will that have long term issues? Or will immediate engagement have good things to say, or to influence your long term retention? Who knows?

We don’t know those things right now. It’s going to depend on implementations of how people do that, but it’s going to be super interesting to study that. I want to ask you about kind of a workaround. You’re proposing a workaround that’s sort of a top down approach, sort of your focus on delivering a new kind of future-proof mobile measurement solution.

Talk about how you plan on approaching it. 

Brian Krebs: Yeah, absolutely. So, as I mentioned, three quarters ago basically, we started this research project just trying to solve the problems inherent in last touch attribution. The fact that honestly, in measurement, we’re focused on the causal effect between media spend and incremental value, right?

And incrementality, incremental value, or uplift — whatever you want to call it — is so critical.

Even more now and as time has gone on, because across all these media sources we’re used to as mobile marketers, there is such a high overlap in audience. Right? And that’s just increased over the years. Back in the day, these ad networks would actually have the notion of exclusive inventory. I mean, that’s just not a thing anymore, right? They’re more or less like glorified DSPs, you could essentially say, all targeting the same exact audience. Right?

So, over the years, as the audience overlap has increased across all these media sources, incrementality has become more and more relevant. Because, you’re just adding touches to this kind of stream. The analogy I hear often is the fishing poles in the stream, right? It’s the same group of fish, each media source you’re adding is just another fishing pole.

And the critical thing here is not really to optimize your marketing based on what the last touch happens to be, the ads that happened to get the last touch. It’s really optimizing the media mix, which is optimizing the perfect number of fishing poles and the perfect mix of fishing poles in that stream. And that’s been really the key for a long time. It’s just not been really recognized by the industry as what your job should be as a marketer. So, what we’re doing is saying, okay, let’s fix that problem.

The nice thing about it is, it just so happens to be privacy-centric as well, if you will — or privacy compatible, let’s say — simply because it only works with aggregated data. So, early on, we identified two main branches or approaches, if you will. Number one, is a top down regression-based measurement solution where we’re using econometrics. We’re using techniques similar to media mix modeling, but it’s actually not quite that. You could think of it as taking media mix modeling and applying it, adapting it, to measurement … something that it’s just not meant to do right now.

The other thing, though, is when you’re looking at regression, you can’t just have a stand-alone regression-based solution to measurement. Specifically because, while it’s able to do a lot of cool things (it’s top down and incorporates everything, so clearly it can understand incrementality), that’s something that a bottom-up or device-level solution, like these last touch attribution models we’ve been so used to, just can never do.

When you’re operating at the device level, you can’t understand all the other touches that happened on this device, based on the entire media mix, overall.

So, number one, you can understand incrementality. But number two, you’re a little hamstrung from the sense of you can only understand — based on regression — correlations. And when we’re talking about measurement, you’re worried about the causal effect between ad spend and incremental value. So you need something else.

And what we looked at originally was incrementality testing. And I truly believe that is more or less the gold standard in understanding the true ground truth of incrementality. But it was also hamstrung due to the IDFA deprecation. So when you’re talking about incrementality testing, what you’re doing is you’re really running a randomized controlled trial, like a pharmaceutical company would. Just taking a population, dividing it up randomly — that’s key here — into a control group and an experiment group, or a treatment group, or a test group, whatever you want to call it. And that treatment group is the one that sees ads. The control group does not. People have been looking into that type of thing for a long time.
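The randomized split behind such a test can be sketched in a few lines. This illustrative Python assigns devices deterministically via a hash of a stable device ID, which is precisely the ingredient that disappears with the IDFA:

```python
import hashlib

def assign_group(device_id: str, treatment_share: float = 0.5) -> str:
    """Deterministically assign a device to control or treatment.

    Hashing a stable ID gives a repeatable, effectively random split;
    without a persistent identifier like the IDFA, this breaks down.
    """
    h = int(hashlib.sha256(device_id.encode()).hexdigest(), 16)
    return "treatment" if (h % 10_000) / 10_000.0 < treatment_share else "control"

# Uplift is then the mean KPI of the treatment group minus that of control.
```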

Peggy Anne Salz: Right.

Brian Krebs: In fact, Facebook already offers that type of incrementality testing as these uplift tests, and it uses ghost ads. It’s a really good way of doing it. There’s a ton we can talk about just as far as incrementality testing alone, honestly … 

Peggy Anne Salz: It’s kind of a model actually, Brian, you know, to be fair. I mean, that’s a topic of its own. 

Brian Krebs: It is.

Peggy Anne Salz: And you’re talking about that modeling. I want to just understand when you say, you know, this is sort of like backwards engineering. We’re looking for a causal effect. Can you just give me a little bit of practical understanding here of what that means, what I’m looking for in behavior, in measurement? Because I can’t measure very much anymore. Right? 

Brian Krebs: Yeah, no, that’s right. So, you have these two sides: a regression side, or an econometric side if you will, and an incrementality or experimentation side. Right? On the regression side, it’s actually pretty straightforward.

What econometrics is, is simply looking at historical trends, looking at historical data, usually in a time series, and understanding the historical correlations between the independent variables: whatever could be affecting what you’re measuring; and the dependent variable: whatever you are trying to measure. Right?

Peggy Anne Salz: Mm-hmm.

Brian Krebs: So when you apply econometrics to marketing, or marketing measurement in particular — which is obviously what we’re looking at, specifically — you’re taking all these various things that could be affecting whatever you’re trying to measure. In this case, it’s uplift.

Let’s say it’s uplift in terms of retention, right? Based on the theme of this show, I think that’s a reasonable way to go.

So, we’re looking at retention as our incremental uplift. What are the things that could affect that in terms of marketing? It’s ad spend, across all your various media sources, at hopefully quite granular levels … campaigns, ad sets, down to even publisher apps when you’re talking about SDK networks … countries, all these dimensions, right? Where’s my ad spend going? Where are my impressions coming from? Even, where are my clicks coming from?

You can incorporate so many other things trying to proxy like latent demand. Trying to understand what is my underlying demand for my product that really isn’t being affected at all by my media spend, right? You can incorporate so many different things.

If you’re a weather app, you can incorporate weather, right, into the model. All these things are what we call “independent variables.” And that all interacts in certain ways over time and correlates, or doesn’t, with the actual dependent variable, which in this example is retention. So you might see certain days where you spend less on certain media sources and there is a consistent correlation with a decrease in retention. Whereas on days where you happen to increase impressions on those specific media sources, there’s a correlated increase or improvement in retention.

And you find those correlations and you can measure the correlations. You can figure out the uncertainty in the correlations, and you can look at those correlations very specifically at each individual media source level. And you can figure out kind of where they are and really allocate retention, installs, uplift to the various media sources in that fashion.
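As a toy illustration of that idea (not MetricWorks’ actual model), an ordinary least-squares fit of a daily retention KPI against per-source daily spend looks like this. The numbers are fabricated so the fit recovers the coefficients exactly; real econometric models add lag/adstock terms, seasonality, and demand proxies:

```python
import numpy as np

# Toy daily data: spend on two media sources (columns) and a retention KPI.
spend = np.array([[100.0, 50.0], [120, 60], [80, 70], [150, 40],
                  [110, 55], [90, 65], [130, 45], [105, 58]])
retained_users = np.array([260.0, 310, 240, 350, 285, 255, 315, 278])

X = np.column_stack([np.ones(len(spend)), spend])      # intercept + spend terms
coef, *_ = np.linalg.lstsq(X, retained_users, rcond=None)
intercept, beta_a, beta_b = coef
# beta_a and beta_b estimate retained users gained per extra dollar on
# each source; the correlations are what get allocated back to media sources.
```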

John Koetsier: It sounds like something that requires a very high level of data science. And what I’m remembering right now is the Arthur C. Clarke quote, which is that “Any sufficiently advanced technology is indistinguishable from magic.”

Brian Krebs: Exactly. And honestly that …

John Koetsier: So … help us understand how this is not magic. 

Brian Krebs: That was actually one of the keys going into this from a product perspective or a strategy perspective, really. It starts there.

How do you take econometrics — which large consumer packaged goods firms have, honestly, been using successfully for a few decades now, not for measurement, but for other purposes including what-if analysis and forecasting — how do you take this concept that has been used successfully in other industries and apply it to measurement? Something that traditionally has been looked at as very, very deterministic.

As I mentioned, it’s been deterministic, but a false truth, if you will, or a false reality. But at least it’s understandable. How do you take something that everyone has been able to understand — last touch gets all the credit — and find a way to apply something that is just not super understandable, like regression?

Even a regression at its core is sort of understandable. You’re building this pretty complex model and trying to divvy up all the various installs or retention across all the media sources like that. It’s a difficult problem. The main way we try to solve that is, number one, trying to depict it in a pretty understandable way. Regression is just a bunch of variables affecting something you’re trying to measure: how do we figure out what the correlations are between those two things over time? And you can actually translate that into a pretty reasonable measurement methodology. If there’s a high correlation, you’re probably truly getting a lot of installs, or a lot of value, out of those media sources where it’s correlated.

And likewise, the opposite on the other side.

John Koetsier: Well, we all know that correlation means causation. So …  I’m joking there. I understand what you’re saying, but …

Brian Krebs: Yeah. Well, that’s the critical point though. Obviously it doesn’t. 

John Koetsier: Yes.

Brian Krebs: So that’s the second aspect here where you can’t just rely on correlation, or regression, or econometrics.

You have to have some sort of experimentation process where you’re validating and enriching those models. And the way you do that is — I wish it was incrementality testing, and I’m hoping that somebody eventually figures that out, even with the lack of IDFAs. The big problem there is, as I was mentioning, you need this big population that you can split randomly into a control group and a treatment group. Without IDFAs, and a pretty sizable chunk of them, there’s no real way to do that. 

John Koetsier: Yeah.

Brian Krebs: So IDFA deprecation really crushed incrementality testing, but there are still other ways. In just the most naive approach, you can just do a time series sort of experiment. In other words, you can just turn off some traffic. 

Peggy Anne Salz: Yeah. 

Brian Krebs: And based on that date, you can kind of understand, okay, what is the decrease in uplift? And certainly, as long as all the variables are controlled, that should give you the ground truth of incrementality as well. But there’s a lot of variables that you have to account for there.

John Koetsier: Yes. 

Brian Krebs: And we’ve used something algorithmically to deal with that. But, yeah, that’s going on another level.

John Koetsier: Which is very challenging as well, right?

I mean, if you want to do incrementality testing, if you want to do some ghost ads or something like that, there’s extra spend you’ve got to do. Or if you’re going to do it by turning off sources, then you’ve got to have balls of steel, because maybe all of our installs are coming from this source and I’m turning it off. And my growth rate tanks, and my boss comes down and yells at me, right?

Brian Krebs: Yup, yup. And also, there’s a …

John Koetsier: It’s really challenging.

Brian Krebs: … there’s a couple key things there. You’re exactly right, you nailed the issues. And incrementality testing can be costly. Ghost ads is a way to take the price down, but it still has its own issues. You need a lot of help from the channel and you kind of have to trust the channel. In more of a time series experiment, like I explained, you’re just kind of turning something off. It reverses from a dollar cost to an opportunity cost.

John Koetsier: Yes.

Brian Krebs: And the key there is running tests that have as little opportunity cost as possible, while maximizing information gain. So you kind of need a methodology …

John Koetsier: Having your cake and eating it too. I understand. Very simple, not a problem. 

Brian Krebs: Pretty much. And really, the way you can do it … so you’re hitting on some truths, honestly, you’re trying to as best as possible [to] have your cake and eat it too. And not every case is that possible.

But what you can do is use data science to rank experiments by their ratio of information gain to opportunity cost. So the advertiser, the marketer, or the game publisher is the one that can prioritize those proposed experiments in some sort of ranked order … understand, analyze each experiment, and then execute them when it’s not so prohibitive to do so.
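Sketched in Python, that prioritization step might look like the following; the experiment names, information-gain scores, and costs are all made-up placeholders:

```python
# Illustrative ranking of proposed holdout experiments by the ratio of
# expected information gain to opportunity cost. All values are hypothetical.

experiments = [
    {"name": "Facebook / DE geo pause", "info_gain": 0.8, "opportunity_cost": 1200.0},
    {"name": "Unity / IT geo pause",    "info_gain": 0.5, "opportunity_cost": 300.0},
    {"name": "AppLovin full pause",     "info_gain": 0.9, "opportunity_cost": 5000.0},
]

ranked = sorted(experiments,
                key=lambda e: e["info_gain"] / e["opportunity_cost"],
                reverse=True)
# Highest ratio first: the cheapest learning per dollar of foregone installs.
```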

So, yeah, in general, we kind of use a methodology like that. Try to limit the experiments as much as possible. Make them granular. Don’t turn off an entire ad set on Facebook. Turn off just the German traffic, or the Italian traffic, and only for a predefined amount of time to hit statistical significance.

And then, even then, it’s not perfect, right? The time that you turn that off could have coincided with COVID-19, the next big spike, or something like that, that threw it off. There’s no way to control for all variables. So then you need something like “causal inference,” which is the name of the group of algorithms that takes a look at, okay, let’s try to forecast what should have continued happening based on previous trends. And when we turned it off, we can compare that to what actually did happen, and make sure that some weird variable didn’t screw up the experiment.
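A naive version of that forecast-versus-actual comparison can be sketched like this, using a simple linear trend as the counterfactual (real causal-inference tooling uses far richer models); all numbers are fabricated:

```python
import numpy as np

# Daily installs before the channel pause (pre-period) and after it (holdout).
pre = np.array([200.0, 203, 206, 209, 212, 215, 218, 221])
post_actual = np.array([210.0, 212, 214, 216])

days_pre = np.arange(len(pre))
slope, intercept = np.polyfit(days_pre, pre, 1)    # fit a simple linear trend

days_post = np.arange(len(pre), len(pre) + len(post_actual))
counterfactual = intercept + slope * days_post     # what "should" have happened

# Installs attributed to the paused channel: counterfactual minus actual.
estimated_incremental = (counterfactual - post_actual).sum()
```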

John Koetsier: Here’s where we get into the real magic, Peggy. 

Peggy Anne Salz: This is it, Brian. Yeah, ’cause I’m listening to you and I’m saying, ‘Yes, I fully get this. Really appreciate it.’ Most of my friends and colleagues are in very small UA teams and we want to leave them with something rather than all the heavy lifting here, right, Brian? ‘Cause that sounds like, oh, just go out and hire another army of data scientists or whatever. It sounds really tough.

So I’d love to have some examples of one, you know, why bother, right? So there’s got to be like a business benefit to do all this. And then you’ve been working with clients, you know, you’ve been doing this. I’d like to get back to the practical here. We’ve been up here a little bit in the big picture, and I’d like to get down to the idea of like a UA team. I was talking to a guy at a company, great guy, and I’m like, ‘How big is your UA team?’ He says, ‘I’m it,’ right.

Really smart guy, really good company, very niche app. So you’re able to do that. You’re able to be very focused with, you know, do more with less. What are you able to do with your clients? 

Brian Krebs: Yeah, great question. So that’s kind of like how it works at a deeper level and all the various pieces required to do this.

But I think the overall message is once IDFAs go dark, gone are the days when you can really do marketing, you can optimize retention, you can really optimize much of anything outside of your own product without strong data science. Because you’re left with some very, very weak, incomplete, and just really crappy signals which are mostly incorporated into SKAdNetwork.

So, this IDFA deprecation has essentially increased the burden, especially on the smaller app developers, because now you do need this data science — and to be perfectly honest, again, you needed it all along, simply because what you were used to measuring in a really simple way was not actually truly a great picture of incrementality.

But in general, at least you could, you had something and it was easily understandable. Those days are gone.

So, the way that we’re looking at this is we have a data science team and we’ve been doing these types of algorithms for a while in other forms, right … LTV prediction and then automation and bid optimization. So we’ve essentially put this together in order to essentially democratize it. Make it available for everyone, even those, especially those maybe, without the resources for a data science team. So, it’s making these concepts approachable in a kind of a prepackaged form.

But certainly, as far as practical examples go, we have been working hard for over a month now on our betas. We have this BetaPro program that we set up, certainly with some of our existing customers using some of our other products. But over the last couple of weeks we’ve also started pulling external companies that we’re interested in working with into that beta program. And we’re getting some really interesting results.

But the bottom line is the overall business benefit to every single beta participant — and certainly our future customers of this product when it’s released in full — is the ability to keep the lights on when IDFAs go dark. When all the devices kind of disappear from your purview as a marketer, right, specifically in terms of tracking. You get to keep the lights on.

And you don’t just get to keep the lights on in some limited way, like SKAdNetwork is trying to provide. Your lights stay on at basically the same brightness they are today. Right? You get to keep all the cohorted KPIs, cohorted by install date, that you’re used to today. Retention, of course, but that also includes revenue, LTV, ROAS … and you still get to measure them at the granularity that you’re used to today. So not just those hundred campaign IDs that SKAdNetwork allows, but all the various granularities that you’re used to today: country, campaign, ad set or ad group depending on the channel … so on and so forth.

So the real benefit here is basically just being able to measure as you’re doing today, with the massive boost of measuring true incrementality rather than just last touch.

John Koetsier: So let’s talk about these next few months. We know Apple is finalizing, or fully implementing, all the privacy parts of iOS 14 sometime in 2021. We don’t know when. It could be January, it could be December. Most likely, my guess is sometime in spring. And frankly, my guess is the ecosystem wasn’t ready. But from what I’m hearing in a couple of different places, Apple wasn’t totally ready either, and SKAdNetwork had some significant issues, even with the limited feature set that it offers.

But we have this chunk of time … it might be three months, it might be six months. How should marketers use this chunk of time? 

Brian Krebs: Honestly, by not just relaxing. And there is a significant number …

John Koetsier: Come on 🙂

Brian Krebs: … and it feels great. We all just got like this death sentence …

John Koetsier: Day of execution.

Brian Krebs: … and then all of a sudden it was lifted. Yeah, exactly. Exactly. So you want to relax, you know. You were headed for the block and all of a sudden it was … everything was gone, and it feels really, really good.

And I know a lot of marketers were jumping for joy, obviously. You have to stop jumping for joy at some point, because the winners, I personally believe, are those that are using this time to maximize their preparedness. Because this isn’t an indefinite stay of execution. That execution is still coming.

So, as for the winners out of all this: some players are just damaged beyond repair, honestly. Retargeting companies are going to have it really, really tough. Apple just clarified their stance even on sharing email addresses and things like that: you still have to have that opt-in. So there’s a real big problem for some of these players. But the marketers that win are going to be those that really make the most of this time by exploring, especially within measurement.

Measurement powers everything else.

So, especially within measurement, taking this time to understand their measurement apparatus of the future. How is it going to look? It can’t just be SKAdNetwork. It’s just not going to happen. I know a lot of people were hoping that Apple would miraculously, you know, install all these things in SKAdNetwork at the last second, which would totally destroy its original purpose of … 

John Koetsier: Because Apple really cares about marketing measurement. It’s very, very important for …

Brian Krebs: Oh very much.

John Koetsier: … Apple to have marketing measurement. That really …

Brian Krebs: Yeah. Oh, I’ve heard people who believe that the stay of execution was really due to the great work, the cries of the industry. I’m pretty sure Apple barely even heard those cries. Being a $2 trillion giant, they are so far into the clouds we can barely even see them. So, this is coming, and as an industry we’re not going to be able to do anything about it.

SKAdNetwork is not going to solve the issues, simply because increasing granularity, increasing the amount of post-install data you get beyond the single conversion value we have now, makes it easier and easier to identify a single user again as you add data points. 

John Koetsier: Mm-hmm.

Brian Krebs: They specifically don’t want that. That’s the whole reason why it was designed this way. So, hoping that Apple just installs all these huge feature updates into SKAdNetwork is just an exercise in futility.

What you really need to do is understand SKAdNetwork is going to be one signal, a very incomplete signal, but at least it’s going to be a really reliable signal for that last touch in the conversion path. None of the other touches, but the last touch. It’s going to be very reliable for that.

You need something else though, and it could be a variety of things, something you can build in-house. A company like ours obviously is trying to democratize solutions that put a bunch of things together in order to get the job done. But, you need to be utilizing this time to make sure you can measure your data when the lights go out. 
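For context, the single post-install signal Brian mentions is SKAdNetwork’s conversion value, an integer from 0 to 63 (six bits) set by the advertised app. A purely illustrative sketch follows; the helper name and event flags are hypothetical, not MetricWorks code, and simply show how an advertiser might pack a few funnel events into those bits before passing the value to Apple’s `SKAdNetwork.updateConversionValue(_:)`.

```python
# Illustrative sketch: SKAdNetwork reports back one conversion value
# in 0..63 per install. The helper and event flags are hypothetical,
# chosen only to show how little fits in that six-bit signal.

def pack_conversion_value(registered: bool, reached_level_2: bool, purchased: bool) -> int:
    """Pack three post-install funnel events into a 6-bit conversion value."""
    value = 0
    if registered:
        value |= 1 << 0
    if reached_level_2:
        value |= 1 << 1
    if purchased:
        value |= 1 << 2
    # In the app, this value would be handed to
    # SKAdNetwork.updateConversionValue(_:) for the install postback.
    return value

print(pack_conversion_value(True, False, True))  # 5: registered + purchased
```

Even with all six bits used, that is at most 64 distinct post-install states per install, which is why the signal is so incomplete compared with user-level IDFA data.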

John Koetsier: Totally agree with that. On the reliability, I have a post coming out soon about three areas of potential fraud in SKAdNetwork. So stay tuned on that, but I’ll turn it over to Peggy for the close.

Peggy Anne Salz: That’s so optimistic here, the fraud we’re going to have … you know, all you can hope to do is keep the lights on. Whatever happened to setting the bar high? This is great, Brian. I love a dose of common sense and hard truth, even if it’s hard to swallow sometimes. I’m also trying to riff a little bit in my mind here, John, thinking about those products. I mean, how needy is my app going to be now, right? It’s like, ‘Ohhh, you downloaded me and now I need to get all of this data.’ I mean, it’s a little bit, you know, women don’t like needy things anyway, and now this app is going to be needy in the first … 

John Koetsier: They don’t?

Peggy Anne Salz: Well … we won’t go there. But in the first day? It’s like the first date. Anyway … 

Brian Krebs: I think you’re right. We’re going to all be inundated with stuff on that first day we download an app now. It’s going to change the onboarding experience of apps for sure. 

John Koetsier: Peggy, I might go away from all this measurement crap and just try and build a frickin’ good product. Who cares about measurement? I’m just going to build something everybody wants, and then … 

Peggy Anne Salz: Yeah, and if they want it the first day, all the touch points, everything lines up, you know, it’s gravy, right? But Brian, we do have to close. It’s been a great show. I learned a lot, and I’m sure our audience did as well. Thanks for joining us on Retention Masterclass — although when retention is no longer possible, John, we’ll have to think of a different name …

Brian Krebs: Exactly. 

John Koetsier: It’s always possible. It may live in the product, and it will impact how you measure your marketing and other things like that, but it will change. 

Peggy Anne Salz: Yeah.

Brian Krebs: That’s right. That’s a good point. You’ll always at least be able to continue your product strategy. That’s the one thing under your control. 

John Koetsier: Brian, thank you so much for being with us.

Brian Krebs: Thank you. It’s been a blast. 

John Koetsier: It has been a lot of fun. For everybody else, Peggy already said it, but thank you for joining us on Retention Masterclass. It’s been our pleasure to host you. Whatever platform you’re on, hey, like, subscribe, share, maybe comment, whatever. If you love the podcast, rate it, review it. That’d be a massive help. 

Peggy Anne Salz: And of course, that’s a wrap. You’ll find show notes on John’s website and my website, where you can find out more. And until next time, hey, we’re here … keep well, keep safe. This is Peggy Anne Salz signing off for Retention Masterclass. 

John Koetsier: Thank you so much.

Made it all the way down here? Subscribe to Retention Masterclass