Most companies get it by now—churn is bad.
Retention hacking is the new growth hacking—it doesn't matter how many customers your startup acquires if none of them stick around.
There are quite a few tactics PMs use to boost retention—improving user onboarding is a common one. But to attack churn directly, you first need to diagnose your product's specific problems and make targeted adjustments.
The good news is that if you're willing to take a dive into the numbers, you can find out exactly why users stop using your app.
What is cohort analysis?
A cohort is simply a group of people with shared characteristics.
Cohort analysis is a type of behavioral analytics in which you group your users based on their shared traits to better track and understand their actions. Cohort analysis allows you to ask more specific, targeted questions and make informed product decisions that will reduce churn and drastically increase revenue. You could also call it customer churn analysis.
To find out why your users stop using your app, you have to answer the three W's of user retention:
- Who is engaging with your app—and who isn’t?
- When do they churn?
- Why do they lose interest?
You can only do this by segmenting your users into groups—or cohorts—based on a particular trait. The two most common cohort types are:
- Acquisition cohorts: Groups divided based on when they signed up for your product
- Behavioral cohorts: Groups divided based on their behaviors and actions in your product
Acquisition cohorts help you determine the who and the when, while behavioral cohorts enable you to dive into the why.
Let’s take a look at cohort analysis techniques in more depth.
1. Look at when users churn
Using acquisition cohorts, you can find out when in the customer lifecycle your users tend to drop off. The most common way is to use a chart like the one below.
Each row represents an acquisition period and the number of customers you acquired in that interval (the who). Each column represents the amount of time that has elapsed since those users signed up (the when). Each cell shows the percentage of the original cohort still retained at that point in time.
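The table above can be built from raw event data. Here's a minimal sketch using pandas, assuming a hypothetical event log with `user_id`, `signup_date`, and `active_date` columns (the data below is invented for illustration):

```python
# Build an acquisition cohort retention table from raw activity events.
import pandas as pd

# Hypothetical event log: one row per user per active day.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3, 3],
    "signup_date": pd.to_datetime(
        ["2024-01-01", "2024-01-01", "2024-01-01",
         "2024-01-01", "2024-01-08", "2024-01-08", "2024-01-08"]),
    "active_date": pd.to_datetime(
        ["2024-01-01", "2024-01-05", "2024-01-01",
         "2024-01-02", "2024-01-08", "2024-01-09", "2024-01-20"]),
})

# Rows: weekly acquisition cohorts (the who).
events["cohort"] = events["signup_date"].dt.to_period("W")
# Columns: days elapsed since signup (the when).
events["day"] = (events["active_date"] - events["signup_date"]).dt.days

# Cells: share of each cohort still active N days after signup.
cohort_size = events.groupby("cohort")["user_id"].nunique()
active = (events.groupby(["cohort", "day"])["user_id"]
                .nunique()
                .unstack(fill_value=0))
retention = active.div(cohort_size, axis=0)
print(retention.round(2))
```

Swap the `"W"` period for `"D"` or `"M"` to match the interval your company's age calls for.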
Some things to consider when performing an acquisition cohort analysis:
- Period (by day, week, or month). Use shorter periods for younger companies, longer periods for older companies.
- Scope. The larger the scope of the retention period, the more difficult it is to come up with an accurate hypothesis for what's going wrong in the process. Do analyses for each retention period: early (up to 8 days), middle (8-90 days), and late (90+ days).
- Expectations. According to Hiten Shah, the retention rate depends on your segment. For a high-velocity, low-priced app, a relatively high churn rate—10-15%—can be normal. For an app that has a higher barrier to entry, you'll be looking for a much lower churn number—2-3%.
Once you've put together your chart, look at where your users drop off at a concerning rate. Is it in day 3, once they've been prompted to sync their data? Or is it in week 4, right after they’ve reached the end of the onboarding material you were trickling out? Asking questions like this should give you an indication of where users are getting tripped up.
To give you a clearer picture, let's take a fictional data set from a productivity app.
If you look closely, you'll see that the biggest percentage dropoff is right around the 2-week mark—the average dropoff from day 14 to day 15 is a full 3 percentage points. That information can help us start making hypotheses about why users are leaving.
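Spotting that two-week cliff doesn't require eyeballing the chart—you can compute the worst day-over-day drop directly. A quick sketch, using made-up retention percentages shaped to match the fictional example above:

```python
# Hypothetical average retention (% of cohort still active) by day;
# the numbers are illustrative only.
retention = {13: 62.0, 14: 60.0, 15: 57.0, 16: 56.5}

# Percentage-point drop between each pair of consecutive days.
days = sorted(retention)
drops = {d: retention[prev] - retention[d] for prev, d in zip(days, days[1:])}

# The day with the steepest dropoff is where to focus hypotheses.
worst_day = max(drops, key=drops.get)
print(worst_day, drops[worst_day])  # -> 15 3.0
```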
2. Find the sticky features
Once you have a timeframe for when your users consistently drop off, you can form hypotheses about which behaviors are linked to that churn (the why).
In our productivity app example, we know that we would have to make some adjustments in the early stages of the customer retention period. Behavioral cohorts could help us figure out what's happening around day 15:
- Users who regularly engage with the checklist feature in the first two weeks
- Users who regularly use the social features (chat, in-app mail, collaborative workflows)
- Users who enable push notifications upon first customizing their settings
The correlation between behavior and churn will be more apparent for more specific behaviors. General behaviors, such as “app engagement in the first 30 days,” don't give you much insight into what is keeping users engaged.
Let's put this into practice. Here's the average churn rate for the productivity app based on our acquisition cohort analysis.
Here's that same overall churn compared to the churn of users who use one of the core features—the checklist feature.
We see that churn is much lower for users who engaged with the core feature (the red line), and that most of the users who churned never used it.
That could be because the checklist isn't part of the in-app onboarding, or because the feature sits too many clicks away from the home screen. To retain more customers, we have to make adjustments that increase engagement with the checklist feature.
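The checklist comparison boils down to splitting users on one behavioral flag and comparing churn rates. A minimal sketch, assuming hypothetical per-user flags (`used_checklist` = engaged with the checklist in the first two weeks, `churned` = inactive by day 30; the data is made up):

```python
# Compare churn between a behavioral cohort and everyone else.
users = [
    {"id": 1, "used_checklist": True,  "churned": False},
    {"id": 2, "used_checklist": True,  "churned": False},
    {"id": 3, "used_checklist": True,  "churned": True},
    {"id": 4, "used_checklist": False, "churned": True},
    {"id": 5, "used_checklist": False, "churned": True},
    {"id": 6, "used_checklist": False, "churned": False},
]

def churn_rate(group):
    """Fraction of the group that churned."""
    return sum(u["churned"] for u in group) / len(group)

checklist = [u for u in users if u["used_checklist"]]
others = [u for u in users if not u["used_checklist"]]
print(f"checklist users: {churn_rate(checklist):.0%} churn")
print(f"other users:     {churn_rate(others):.0%} churn")
```

A wide gap between the two rates is your signal that the feature is sticky—though remember it shows correlation, not proof of causation.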
As a real-life example, mindfulness meditation app Calm was able to 3x their customer retention by pinpointing their stickiest features. They realized that the majority of users who stuck around had engaged with the reminder feature, so they used in-app cues to improve engagement with that core feature.
3. Compare behavioral cohorts
Unfortunately, it's not always as easy as just finding one clear link between behavior and retention. It’s often a combination of behaviors that keep users engaged with your app.
Your goal is to pinpoint the common behaviors of your most engaged users. Invert that, and you'll see which behaviors are missing among the users who aren't sticking around.
All of this can be done in a spreadsheet with some conditional formatting, but that often proves to be extremely time-consuming. Luckily there are tons of tools out there that streamline the process. Tools like Amplitude help you create behavioral cohorts painlessly. You can combine and compare cohorts, quickly testing your hypotheses.
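Under the hood, combining cohorts is just set arithmetic: each cohort is a set of user IDs, so intersections and differences answer the combined questions. A sketch with hypothetical IDs:

```python
# Each behavioral cohort is a set of user IDs (hypothetical here).
checklist_users = {1, 2, 3, 5, 8}
social_users = {2, 3, 5, 7}
notification_users = {1, 3, 5, 9}

# Users exhibiting all three sticky behaviors: your most engaged cohort.
power_users = checklist_users & social_users & notification_users

# Users exhibiting none of them: the at-risk group to watch.
all_users = set(range(1, 11))
at_risk = all_users - (checklist_users | social_users | notification_users)

print(power_users)  # -> {3, 5}
print(at_risk)      # -> {4, 6, 10}
```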
Iterate, rinse, and repeat
Figuring out how to fix the issue can often be just as difficult as diagnosing it. If you know that user engagement depends heavily on using a core feature, you can't pester your customers with emails and push notifications to force them into engagement. In fact, harassing your customers might increase your churn.
Instead of jumping the gun on big product changes, A/B test modifications on your problem cohorts to get an idea of what works and what doesn't. This way you can make data-backed changes that are far more likely to reduce churn. Once you've successfully improved your retention based on one behavioral cohort, rinse and repeat.
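Before declaring an A/B test a winner, it's worth checking that the retention difference isn't noise. Here's a minimal sketch of a two-proportion z-test on retention, using only the standard library; the counts are hypothetical:

```python
# Two-proportion z-test: did variant B retain significantly more users?
from math import sqrt, erf

def two_proportion_z(retained_a, n_a, retained_b, n_b):
    """Return the z statistic and two-sided p-value for the
    difference in retention rates between variants A and B."""
    p_a, p_b = retained_a / n_a, retained_b / n_b
    p_pool = (retained_a + retained_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 400 of 1,000 retained; variant with a checklist
# prompt in onboarding: 460 of 1,000 retained (invented numbers).
z, p = two_proportion_z(400, 1000, 460, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If `p` comes in under your significance threshold (commonly 0.05), the lift is unlikely to be chance and the change is worth rolling out more broadly.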