Quantitative product metrics help us understand whether our product is getting better, how people are using our tools, and where they might be encountering problems. In this blog post, I explain how we (Climate Policy Radar) defined our user lifecycle funnel and tracked it in Google Analytics.
What is a user lifecycle funnel?
I first learnt about this concept in Laura Klein’s book ‘Build Better Products’ (from which the image below is borrowed). If you work in product management or design I strongly recommend you give it a read.
A user lifecycle funnel helps us understand:
- the stages a user goes through when going from a first-time visitor to becoming a happy recurring user
- what proportion of users are dropping out at each stage
As a product manager, this helps me prioritise where we need to focus our efforts to make our tool more valuable to our users. It’s also one of the best tools I’ve found for onboarding new team members and explaining how the product works.
Below are our 5 stages and how we track them in analytics. Note that I decided against tracking revenue, as our product is not a commercial one.
Step 1: Awareness
What it is: how we get people to hear about our product.
How users become aware: search engines, referring sites, social media and PR/marketing.
The metric that indicates if a user is aware: unique visitors.
How we track this in analytics: unique visitors is a default metric in Google Analytics.
Step 2: Education
What it is: how we help people learn enough about our product to know they want to use it.
How users become educated: The intro text on our homepage, FAQs, explainer videos and good UI / UX are the main ways.
The metric that indicates a user knows enough about our product to want to use it: usage of search.
How we track this in analytics: visits to the search results page/site search usage event. The only thing we had to do here was enable advanced tracking of search so that we can see how different search terms/parameters are performing.
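As an illustration, a search event can be pushed into Google Tag Manager's dataLayer from the search results page. This is a hypothetical sketch, not our exact tag configuration; the event and field names (`site_search`, `search_term`, `search_results`) are assumptions:

```javascript
// Hypothetical sketch: pushing a site-search event to Google Tag Manager.
// In the browser this would be window.dataLayer; a plain array stands in here.
const dataLayer = [];

function trackSearch(query, resultCount) {
  const event = {
    event: 'site_search',   // custom event name a GTM trigger listens for (assumed)
    search_term: query,     // lets us see how different search terms perform
    search_results: resultCount,
  };
  dataLayer.push(event);
  return event;
}

trackSearch('net zero targets', 42);
```

A GTM trigger listening for the `site_search` event would then fire a Google Analytics event tag, carrying the search term along with it.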
Step 3: Engagement
What it is: how users will achieve something valuable using our tool.
How users become engaged: this one is harder for a content site like ours. Some users might find all the information they need by browsing document titles on the search results page, reading summaries of each document or downloading the document as a PDF. However, right now our key differentiator from similar products is that we enable users to search the full text of documents for exact matches, related phrases and even translations of non-English documents. We return a list of passages that match their search, highlighted in the original document. So my engagement metric focuses on these differentiators.
The metric that indicates a user is engaged: interaction with passages that match their search, the document viewer or the download link for the original document.
How we track this in analytics: this one required us to add some custom tracking to our tool so that we could measure how many users were interacting with our physical document page. This includes interaction with the:
- “matches in document” element (left-hand side)
- document viewer (right-hand side)
- original document link
We also set up some custom variables on document pages so that we could understand how different types of documents perform. These include category (eg. policy), type (eg. strategy), country, date, language, translation and format.
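The custom tracking above can be sketched as a single dataLayer push that carries both the interaction type and the document metadata. Again, all event and field names here are illustrative assumptions, not our actual tag setup:

```javascript
// Hypothetical sketch of the dataLayer push for document-page interactions.
// In the browser this would be window.dataLayer; a plain array stands in here.
const dataLayer = [];

function trackDocumentInteraction(action, doc) {
  // action: 'passage_match_click' | 'document_viewer' | 'original_link' (assumed names)
  const event = {
    event: 'document_interaction',
    interaction: action,
    // document metadata sent as custom variables, so reports can be
    // segmented by document type
    doc_category: doc.category,   // e.g. 'policy'
    doc_type: doc.type,           // e.g. 'strategy'
    doc_country: doc.country,
    doc_language: doc.language,
    doc_translated: doc.translated,
    doc_format: doc.format,
  };
  dataLayer.push(event);
  return event;
}

trackDocumentInteraction('passage_match_click', {
  category: 'policy', type: 'strategy', country: 'FR',
  language: 'fr', translated: true, format: 'pdf',
});
```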
Step 4: Recurrence
What it is: how users will become regular users.
Users will become long-term recurring users if they find enough value in our product to come back and use it again. As we build up data on which users are coming back and how often, we will try to discover what happens early on in their use of our product that turns them into repeat users. Facebook famously found that users who connected with seven friends in their first ten days were more likely to become active, retained users.
Success at this stage: a user returns to the site.
How we track this in analytics: my favourite way of doing this is to look at cohort retention. This shows what percentage of new users from a given week are still active in future weeks. We also use the default returning users metric and keep track of how many users return to the site and perform specific actions on their original and repeat visits, such as performing a search or viewing document passage matches. This is being done via event tags that we have set up in Google Tag Manager.
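Google Analytics computes cohort retention for us, but the underlying calculation is simple. Here is a minimal sketch (assuming visit records of the form `{ userId, week }`, where `week` is a week index), not our actual pipeline:

```javascript
// Weekly cohort retention: % of each week's new users still active N weeks later.
function cohortRetention(visits) {
  // The first week a user is seen defines their cohort.
  const cohortOf = new Map();
  for (const { userId, week } of visits) {
    if (!cohortOf.has(userId) || week < cohortOf.get(userId)) {
      cohortOf.set(userId, week);
    }
  }
  // active.get(cohort).get(offset) = set of that cohort's users active
  // `offset` weeks after their first visit.
  const active = new Map();
  for (const { userId, week } of visits) {
    const cohort = cohortOf.get(userId);
    const offset = week - cohort;
    if (!active.has(cohort)) active.set(cohort, new Map());
    const byOffset = active.get(cohort);
    if (!byOffset.has(offset)) byOffset.set(offset, new Set());
    byOffset.get(offset).add(userId);
  }
  // Convert to percentages of the cohort's original size.
  const table = {};
  for (const [cohort, byOffset] of active) {
    const size = byOffset.get(0).size;
    table[cohort] = {};
    for (const [offset, users] of byOffset) {
      table[cohort][offset] = (100 * users.size) / size;
    }
  }
  return table;
}

const visits = [
  { userId: 'a', week: 0 }, { userId: 'b', week: 0 },
  { userId: 'a', week: 1 },   // only 'a' returns the following week
  { userId: 'c', week: 1 },
];
const retention = cohortRetention(visits);
// week-0 cohort: 100% at offset 0, 50% at offset 1; week-1 cohort: 100% at offset 0
```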
Step 5: Conversation
What it is: how we will get permission to communicate with users. I renamed this from ‘conversion’. In e-commerce, a conversion is a purchase. For us, the main aim of this stage of the funnel is to see how many users value our tools enough to start a conversation with us, and the knowledge we’ll gain about our users by being able to do user research with them. This gives us rich qualitative insight into how users are using our tool and what impact it is having, provides an evidence base for discussions with partners and donors about our impact, and helps us plan future improvements.
How we attract users to start a conversation: contacting support, filling in a feedback widget or micro survey on our site, requesting to download a CSV of our data, signing up for our newsletter, signing up for an alert for when new features become available and submitting missing data are all ways in which we get the conversation started with users. Longer term, my-account functionality like search alerts or custom collections will help us start conversations with more users.
Success at this stage: the total number of users contacting us for any of the above reasons.
How we track this: this isn’t something we track in analytics tools. It has to be measured manually by collating the results from all our CRM and all our different Google Forms.
Tracking this data has helped me spot where in our product lifecycle users are dropping out. These are the areas where we need to focus our attention so that we can maximise the number of users who get value from our tools. In a future blog post, I will explain a bit more about what we learnt, how we conducted further research to better understand the problem space, and some of the improvements that we prioritised in response.
Last but not least, I want to say a big thank you to Anne Cremin. She has kindly been volunteering her time with Climate Policy Radar to help us out with our Google Analytics and Google Tag Manager set up. Thanks Anne!