
Superior Analytics Starts with Proven Questions

Welcome to Part 4 of the series, Transforming Your Product Team’s Analytics Prowess. Our transformation journey has come a long way. We know how to assess our data cultures and find the data we need. After clearing that organizational overhead, we get to the fun part: the analysis. Unfortunately, too many PMs ask, “What metrics should I track?” Others confidently boast metrics but deflate when you ask why those numbers matter.

 

The most overlooked aspect of analytics is also the most critical — knowing which questions matter.

 

A few years ago, the business press talked about the “data deluge”: scenarios where so much data is generated that it overwhelms the organization. “Self-serve” and “open data” initiatives make this problem worse.

 

It’s a risk to give PMs without basic analytical acumen unfettered access to data. It’s like giving a child nunchucks and then asking them to show you something cool. At best, nothing cool actually happens. At worst, someone’s getting hurt.

 

Alas, you’re here because you want to transform your product analytics. In this post, we’re going to train the kid (PM) on how to properly use the nunchucks (the data). Specifically, we’re going to focus on asking the right questions. We’ll break it down as follows:

  1. Guiding Principles for Doing Product Analytics
  2. An Analytics Framework
  3. Questions You Should Expect to Answer

Some Guiding Principles

The following are principles that should help you navigate product analytics. They are not comprehensive. More importantly, they’re not rules.

  • Instrument in anticipation of the questions. You might be pleased to know that the questions don’t change much. The nature of the answers might be different, though. If you think of the questions ahead of time, you can surmise what data might be helpful to answer them.
  • You won’t have everything at first. After instrumentation, you might still be missing a few key data points or attributes. That’s fine. At least you didn’t forget everything else.
  • Know why the questions matter. Analytics is not a cheap process. Like all things, you need to prioritize which questions to answer first. If the question doesn’t advance you towards your product goal, answer it later.
  • Tell a story. Often you’ll see reports and dashboards that are oozing with metrics. If you don’t communicate what the data means and why it matters, you’re probably going to lose your audience.
  • Provide insight instead of data. This one is similar to the last principle. Teach your audience something. Bring attention to a blindspot.

An Analytics Framework

Here’s a framework that I’ve used over the past decade or so. Click the upcoming image to download a PDF resource.

 

This framework is dual-purpose. First, it works as a maturity model. The critical dependency is how competent your organization is at doing three things:

  1. Understanding the product and business objectives
  2. Anticipating questions that will show progress OR clarify the path to progress
  3. Taking the measurements to answer those questions

 

As your competency increases in defining and measuring, your team is in a better position to answer questions and make decisions. I’ll get into the actual questions for each type of analysis in the next section.


You can also use the framework as a step-by-step guide when trying to do an analysis.

Expect to Answer These Questions

After 15 years of analytics work, I’ve learned that there are four atomic questions that organizations ask.

  1. What is happening?
  2. Why?
  3. What might happen?
  4. What should we do?

Sure, there may be different permutations of these questions. There might be some nuance and difficulty in answering them. But for the most part, this is a standard line of questioning that people have.


Sidebar: These questions are agnostic of context. They are not limited to Product Analytics. For example, I’ve:

  • Run experiments on Bayesian context identification and recommendation engines during my graduate studies.
  • Participated in two-week-long negotiations between Wharton MBAs and PennLaw students.
  • Defined competitive strategy against PlayStation in Europe.
  • Led the acquisition of a billion-dollar game development platform.
  • Developed Go-to-Market plans for wearable augmented reality.
  • Investigated why people don’t like to take training in corporate HR systems.

Each section below goes into a type of analysis and what questions to expect.

Monitoring — What is Happening?

The point of this analysis is to measure progress against a goal that you’ve set. You want to compare the behavior you’re seeing with your expectations. These are the standard reports and dashboards that everyone obsesses over.

 

(Picking KPIs and metrics is a topic for another day. But, if you’re so inclined, read this post to sharpen your use of metrics: What is a North Star Metric? by Sean Ellis.)

 

The answers to monitoring questions are descriptive. If used well, they can start a dialog. The organization can discuss whether objectives are being met and whether something should be done about it.

 

You can decompose “What is happening?” into a few specific questions like:

  • How many times did X occur?
  • How did it happen? In what context?
  • Who did what?
  • Where and when did it occur?
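The monitoring questions above map naturally onto simple aggregations over an event log. Here’s a minimal sketch in Python, assuming a hypothetical event stream of (user, event, timestamp, context) tuples; the event names and data are invented for illustration:

```python
from collections import Counter

# Hypothetical event log: (user, event, timestamp, context).
events = [
    ("u1", "export_report", "2024-03-01T09:15", "web"),
    ("u2", "export_report", "2024-03-01T11:02", "mobile"),
    ("u1", "export_report", "2024-03-02T10:40", "web"),
    ("u3", "share_dashboard", "2024-03-02T14:05", "web"),
]

# How many times did X occur?
event_counts = Counter(e[1] for e in events)

# Who did what?
by_user = Counter((e[0], e[1]) for e in events)

# Where (context) and when (day) did it occur?
by_context = Counter(e[3] for e in events)
by_day = Counter(e[2][:10] for e in events)
```

In practice these counts come from your analytics store or tooling rather than an in-memory list, but the shape of the questions stays the same.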

Understand — Why is it happening?

Remember the principles about insights and storytelling? This is where strong analyses begin to shine. If you can explain or tell a story about why interesting things are happening, the organization is in a better position to make decisions.

 

Take the example of a new product launch. A team that knows why some markets are underperforming can now look into remedies. It saves them a lot of money if they respond faster.

 

Or take the example of a UX design team. Imagine if they were able to generate personas based on real behavior. They’d be able to design more effectively and increase product satisfaction.

 

Expect explanatory questions to look like this:

  • Why did this happen?
  • What contributed to this event?
  • How are these situations different?
  • What happens if the data is cut this way?
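To make “how are these situations different?” concrete, here’s a toy sketch of cutting launch data by market and flagging underperformers. The markets, numbers, and the half-of-baseline threshold are all hypothetical:

```python
# Hypothetical launch data per market: (market, visitors, purchases).
rows = [
    ("DE", 1000, 80),
    ("FR", 1200, 30),
    ("UK", 900, 72),
]

# Cut the data by market: conversion rate per segment.
conversion = {market: purchases / visitors
              for market, visitors, purchases in rows}

# How are these situations different? Flag markets converting at less
# than half the overall rate (the threshold is arbitrary).
overall = sum(p for _, _, p in rows) / sum(v for _, v, _ in rows)
underperforming = [m for m, rate in conversion.items()
                   if rate < 0.5 * overall]
```

A cut like this doesn’t explain the “why” by itself, but it tells you exactly where to dig next.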

Projecting — What might happen?

Everyone is interested in the future — short-term and long-term. Projections allow teams to explore future states and optimize for given objectives. These objectives are usually quantifiable, like revenue, user growth, or headcount. In other cases, the objectives are qualitative like changing market perception.

 

These questions are central to planning, strategy, or roadmap sessions. You can also use them in response to an important change in business context.

 

Typically, you’ll see these questions phrased like:

  • What might happen?
  • When might it happen?
  • How will it occur?
  • How much will it cost? How much will it bring in?
  • What are other scenarios?
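A projection doesn’t have to be sophisticated to start the conversation. Here’s a deliberately simple sketch that compounds new bookings and churn month over month; every input name and number is hypothetical:

```python
def project_revenue(pipeline, win_rate, avg_deal, churn_rate,
                    recurring, months):
    """Project cumulative revenue over `months` (toy model).

    Each month, new bookings join the recurring base while a fraction
    of the base churns away.
    """
    total = 0.0
    base = recurring
    for _ in range(months):
        new_bookings = pipeline * win_rate * avg_deal
        base = base * (1 - churn_rate) + new_bookings
        total += base
    return round(total, 2)

# e.g. 20 qualified deals/month, 25% win rate, $10k average deal,
# 2% monthly churn, starting from $100k of recurring revenue:
year_end = project_revenue(20, 0.25, 10_000, 0.02, 100_000, 12)
```

Even a model this crude lets you answer “what are other scenarios?” by varying one input at a time.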

Sidebar: Two examples of how my team has used projections.

 

The first example is about product/business planning. When we were setting 2019 OKRs, one business had a very aggressive revenue target. It seemed almost impossible to achieve.

 

One of my team members, Jon Nguyen, built a model that predicted that business's revenue at the end of the year. The inputs accounted for client churn, win-rate distributions, funnel conversion rates, and deal sizes. With this model, he was able to start a discussion among senior executives about target setting and a change of strategy to achieve the targets.

 

The second example is about operational improvement. My other team member, Dan Lopez, was involved with the transformation of our client support operations. We had a growing backlog of support tickets and a dropping client satisfaction score.

 

Several non-support teams were enlisted to close cases and lower the backlog. This included a few hundred employees from Product, Client Success, Solutions Consulting, and Engineering. We hypothesized that the effects would be temporary and that the backlog would steadily climb back to a problematic level in a few short months. Dan built a seasonality-adjusted projection that took into account the hiring, firing, and productivity rates of our support reps. Seven months later, the actual backlog levels were within 3% of Dan’s projections.
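A backlog projection like Dan’s can be sketched, in drastically simplified form, as a month-by-month flow of seasonal arrivals against team capacity. This is an illustrative toy, not his actual model; the seasonality factors, rep counts, and productivity rate are all invented:

```python
def project_backlog(backlog, base_inflow, seasonality,
                    reps_by_month, tickets_per_rep):
    """Projected ticket backlog at each month end (toy model)."""
    levels = []
    for month, factor in enumerate(seasonality):
        inflow = base_inflow * factor                      # seasonal arrivals
        capacity = reps_by_month[month] * tickets_per_rep  # cases closed
        backlog = max(0, backlog + inflow - capacity)
        levels.append(round(backlog))
    return levels

# Hypothetical inputs: 1,200 open tickets, ~500 new per month with
# seasonal swings, and a rep team growing from 10 to 12.
trajectory = project_backlog(1200, 500, [1.0, 1.1, 0.9], [10, 10, 12], 45)
```

The real version also modeled hiring, attrition, and ramp-up, but the core mechanic is the same: backlog only falls when capacity outpaces inflow.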


Prescribing — What should we do?

Prescriptive analytics brings all the analyses together and recommends the right course of action. Given a situation and a clear goal, analytics here should direct what needs to happen.

 

This is the last stage of product or business analytics. The recommendations should drive towards a specific outcome and simplify the decision-making process.

 

The questions will usually look like the following:

  • What should we do?
  • How should we execute?
  • Who is involved?
  • How much do we invest?
  • How much will we make?
  • What are the risks and considerations?
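One simple way to make prescriptive recommendations comparable across options is an expected-value calculation. The options, costs, returns, and probabilities below are entirely hypothetical, and real prescriptive work also weighs risks and strategic fit that a single number can’t capture:

```python
# Hypothetical options with projected cost, gross return, and
# probability of success; all numbers invented.
options = {
    "expand_support_team": {"cost": 400_000, "ret": 900_000, "p": 0.8},
    "build_self_serve":    {"cost": 250_000, "ret": 700_000, "p": 0.5},
    "status_quo":          {"cost": 0,       "ret": 0,       "p": 1.0},
}

def expected_value(option):
    # "How much do we invest?" vs. "How much will we make?"
    return option["p"] * option["ret"] - option["cost"]

recommendation = max(options, key=lambda name: expected_value(options[name]))
```

Laying the options out this way simplifies the decision-making conversation, even when the final call goes another direction.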

 

In the previous examples, my team offered a list of strategic options along with their analyses. In some cases, organizations won’t make a rational decision. Either way, you should be proud of the good work that you did.

Questions Bring Clarity. Start There.

In your day-to-day, you probably hear a lot of metrics being thrown around. Planning sessions are chock-full of them. Launch reviews and business reviews are also plastered with numbers. I challenge you to ask the presenter what questions those numbers answer. Then, follow up by asking why those questions are important.

 

The point of this is not to be obnoxious. The point is to encourage clarity. Watch the room have a more substantive discussion.

 

As you assess your team’s analytical maturity or conduct analyses, use the following resource. Click the image to download the asset.

 

 

Now you know the questions to expect. In the next part of the series, I’ll talk about the other side of analytics acumen — data literacy.
