The average app loses 77% of its daily active users (DAUs) within the first 3 days after install.

If you're in any way involved with mobile apps, you've probably heard this before. This stat from BusinessofApps is all but infamous, and there's plenty more where that came from. Here's another one:

Research into worldwide retention rates shows that the average retention rate across 31 mobile app categories was 25.3% on Day 1, falling to 5.7% by Day 30.

Do you hear that? You lose almost 75% of your users after a single day, and almost 95% of your users by Day 30!

It's easy to see why so many alarm bells go off, but is it possible that we're looking at it all wrong?

We think so.

What is mobile retention rate?

According to Adjust, a retention rate "gives a number to the percentage of users who still use an app a certain number of days after install. It is calculated by counting unique users that trigger at least one session in one day, then dividing this by total installs within a given cohort."
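Adjust's definition is straightforward to compute from raw install and session data. Here's a minimal sketch of that calculation; the data shapes (`installs` as a user-to-date mapping, `sessions` as user/date pairs) are assumptions for illustration, not a real CDP schema:

```python
from datetime import date

def day_n_retention(installs, sessions, n):
    """Adjust-style Day-N retention: the share of an install cohort
    that triggers at least one session exactly N days after install.

    installs: {user_id: install_date}
    sessions: iterable of (user_id, session_date) pairs
    """
    # Unique users with a session exactly N days after their install date
    active = {
        uid for uid, d in sessions
        if uid in installs and (d - installs[uid]).days == n
    }
    return len(active) / len(installs)

# A tiny September 1 cohort of four installs
installs = {
    "a": date(2023, 9, 1), "b": date(2023, 9, 1),
    "c": date(2023, 9, 1), "d": date(2023, 9, 1),
}
sessions = [
    ("a", date(2023, 9, 2)),  # back on Day 1
    ("b", date(2023, 9, 2)),  # back on Day 1
    ("c", date(2023, 9, 5)),  # back on Day 4
]
print(day_n_retention(installs, sessions, 1))  # 0.5
```

Note how rigid this is: user "c" returns on Day 4 and contributes nothing to Day 1 retention, which is exactly the problem the rest of this post explores.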

...but does this really make sense?

The real question lies in how to define this particular phrase: "users who still use an app a certain number of days after install."

Do definitions of user retention and user churn account for real life?

Here's an example of an actual human experience with an app:

  • September 1: I download the app, open it, swipe around and realize I need more time for the onboarding.
  • September 3 (2 days later): I open the app and restart the onboarding, but I get distracted again and don't finish it.
  • September 4 (3 days later): I completely forget about the app but I play a game for 20-30 minutes that I could have spent in the app.
  • September 6 (5 days later): I think about opening the app as I unlock my front door, but then I start a conversation as soon as I walk in the house and I don't get back to the app.
  • September 8 (7 days later): I think about the context and that reminds me of the app, but I decide to wait until the weekend to dive into it.
  • ...and it goes on like this.

So when did the app lose me?

According to Adjust, it lost me on the 3rd (the last time I actually opened the app and 'triggered an app session'), but is this correct?

Certainly not!

How about the 4th? Possibly.

Maybe it lost me on the 8th...?

We're vastly oversimplifying the concepts of churn and retention, and while that certainly makes our reporting easier, it hurts our ability to effectively engage and activate our customers.

Here's an actual plot of user retention from a fitness subscription app:

Note that, for the fitness app, 16.9% of the users who come back for a second visit come back AFTER a 10 day gap. 5% come back after a 30 day gap! Were these users churned? Were they churned and reactivated? Have they now been retained? Would you have said that they were retained on day 29?

Virtually every user on this plot would have shown up in BusinessofApps' statistic for Day 1 churn, but is that right? Were these users actually churned?


Here's another example. This one is an education subscription app:

Of those users who come back, almost 40% come back after 5 days, but 22% take a full 10 days to come back. 5% take 50 days, and there's still a cohort (only ~1.5% of users, but still) who take over 100 days to come back!
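Return-gap distributions like the ones in these plots are simple to derive from session data: for each user, take the number of days between their first and second sessions. A minimal sketch, with hypothetical data shapes:

```python
from collections import Counter
from datetime import date

def return_gaps(sessions):
    """Distribution of days between each user's first and second session.

    sessions: iterable of (user_id, session_date) pairs, in any order.
    Users with only one session (never returned) are excluded.
    """
    by_user = {}
    for uid, d in sessions:
        by_user.setdefault(uid, []).append(d)
    gaps = []
    for dates in by_user.values():
        dates = sorted(set(dates))
        if len(dates) >= 2:
            gaps.append((dates[1] - dates[0]).days)
    return Counter(gaps)

sessions = [
    ("a", date(2023, 9, 1)), ("a", date(2023, 9, 3)),   # 2-day gap
    ("b", date(2023, 9, 1)), ("b", date(2023, 9, 13)),  # 12-day gap
    ("c", date(2023, 9, 1)),                            # never returned
]
print(return_gaps(sessions))  # Counter({2: 1, 12: 1})
```

User "b" here is the interesting case: a fixed Day-7 churn definition would have written them off five days before they came back on their own.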

The problem with traditional definitions of user retention and churn

The issue is, by tying fixed time periods to the concepts of retention and churn, we're introducing rigidity that doesn't reflect real life, and that causes us to make a bunch of bad decisions.

For example, we may be desperately trying to retain customers who aren't actually churning. This means sending discounts we never needed to send and potentially annoying users with needy messages:

So what's the alternative to the traditional approach to retention and churn?

The answer, like the answer to most problems in marketing, is to get close to your customer. The more you learn about each individual customer, the more you can understand what their individual life looks like. And the way you do this is by gathering as much data as you possibly can.

Here's how we do it:

Mobile apps have a huge amount of data in the form of events. Any time a user does anything in an app, fields like the user's ID, the event name, and a timestamp are recorded in the event stream.
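A single event record is typically a small structured payload. Here's a hypothetical example of what one might look like; real tools like Segment or Amplitude use similar but not identical schemas:

```python
# A hypothetical event record. Field names here are illustrative,
# not an actual Segment/Amplitude/Mixpanel schema.
event = {
    "user_id": "u_48213",
    "event_name": "add_to_cart",
    "timestamp": "2023-09-03T14:22:08Z",
    "properties": {"item_id": "sku_991", "price": 19.99},
}
```

Every interaction a user has with the app produces one of these, so even a modestly active user generates a rich behavioral trail.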

Most of the time, though, businesses aren't using that data effectively. These events are typically only used to measure against, set triggers, or filter users. Most of them just live in a CDP or analytics tool like Amplitude, Segment, or Mixpanel, waiting to be queried.

1. Aampe ingests all of your events

No, we didn't stutter. We ingest ALL of your events.

Every one of them.

Because every event is useful in its own way for understanding each customer's individual journey.

2. You drag each of your main events into a funnel

That doesn't mean we don't use the rest of your events. This step just tells our AI which events are most important to you: which ones indicate a conversion (like a checkout or subscription renewal) and which ones indicate a step in the right direction (like an add-to-cart or a completed workout).

3. Our model tracks each user's actions as they traverse the hundreds of events in your app

We understand:

  • Which events are performed most often (indicated by the size of the dots in the diagram)
  • The general order and structure of the events (in other words, once you've completed an event, the probability of each of the next several events; the darker the line, the more common the "bridge" between those two events), and
  • Which events are most likely to lead to an eventual conversion (indicated by the proximity of the "event dot" to the "complete checkout" dot in the diagram below):

Now we can find hundreds or even thousands of complex patterns in user activity (or inactivity) that more closely represent real life, not just checking whether a user opens the app on a given day.
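The "order and structure" idea above can be approximated with simple event-to-event transition probabilities estimated from the event stream. This is a minimal sketch of that estimation (the event names and per-user sequences are hypothetical, and Aampe's actual model is not public):

```python
from collections import Counter, defaultdict

def transition_probs(event_streams):
    """Estimate P(next_event | current_event) from per-user event sequences.

    event_streams: {user_id: [event_name, ...]} in time order.
    Returns {event: {next_event: probability}}.
    """
    counts = defaultdict(Counter)
    for events in event_streams.values():
        # Count each consecutive pair of events (the "bridges")
        for a, b in zip(events, events[1:]):
            counts[a][b] += 1
    return {
        a: {b: n / sum(c.values()) for b, n in c.items()}
        for a, c in counts.items()
    }

streams = {
    "u1": ["open", "browse", "add_to_cart", "checkout"],
    "u2": ["open", "browse", "open"],
}
probs = transition_probs(streams)
print(probs["browse"])  # {'add_to_cart': 0.5, 'open': 0.5}
```

Even this toy version captures the intuition in the diagram: a dark line from "browse" to "add_to_cart" just means that transition has a high estimated probability.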

How we define churn and retention

Here's a diagram of a user's interaction with an app.

  • The grey humps indicate user activity (the bigger the hump, the more events the user performed at that time)
  • The orange, red, and brown dots at the top are the events the app has in its funnel.
  • The blue dots at the bottom are messages our AI model sent.

Note that the user performs some initial activity and then disappears:

At Day 1, BusinessofApps and Adjust would have considered this user churned. They'd say the same thing at Day 30.

Instead of admitting defeat, our AI starts sending this user messages.

  • It sends a couple of messages a week later. No response.
  • It tries again the next week, changing up the value proposition of the message to try to get interest. No response.
  • It tries again the following week and then the week after, slowing its cadence to prevent the user from getting annoyed and uninstalling the app.
  • It tries again almost 6 weeks later, and gets a response!

As user activity picks up, so does our messaging cadence until we've turned this user into a power user (4 conversions in 10 days).

Just a note: In the diagram above, our model was executing a multi-channel approach with a combination of SMS messages and push notifications.

This wasn't "predictive churn" or a "reactivation campaign." This is a conversation.

It started at the user's first interaction and is still continuing to this day. It's not a rigid campaign. It's a dance.

Our current "customer engagement" tools force us to think about retention and churn in concrete terms, because they can only handle simple commands ("If App open = 0 for > 7 days, then send message"), but this isn't representative of real life.

In life, there is no churn. There's only "until next time."