The purpose of using product analytics is to learn. Learn what your customers experience, their frustrations, and their needs. While it works great for bigger companies, it doesn’t work that well for smaller ones. That happens because of one word that’s often missing: scale.
Product Analytics isn’t just a tool for learning.
It’s a tool for learning at scale. When you don’t have the scale, you don’t get the learnings.
Don’t get me wrong, having scale doesn’t guarantee anything either. There’s just a better chance of learning from data when you have thousands of active users. No matter the tool, it’s still only a proxy for people’s experience. The more proxies you have, the more insights you lose along the way.
Bigger teams don’t have any other choice. With an enormous user base, this is the only way to measure experience. And it doesn’t tell the whole story anyway.
Take the Net Promoter Score, for example. It’s a common proxy for customer loyalty. But taken at face value, this metric has proven mostly useless for organizations. There is no correlation between the NPS number and the quality of the customer experience. Nevertheless, everyone is using it.
As a result, many small teams start adopting metrics like NPS in their products because bigger teams are using them. It’s a “best practice.” They probably know what they’re doing. Right?
I’ve made this mistake too. Early on, when only 40 people were using Flawless App, we decided to add NPS to the product. It took us a week to design the screen and set everything up. Eventually, after a month, we only got ~15 responses.
The data we received wasn’t actionable. It didn’t show us customer loyalty, nor did we learn anything about people’s frustrations. We spent a week for nothing. And early on, a single week is a lot to pay.
I’m not going to repeat the same mistake with CrossPatch, though. CrossPatch is currently in closed beta, and I’m slowly onboarding a few people to the product. I didn’t spend a second on NPS. Nor did I build in-app onboarding.
If you create an account right now, all by yourself, there’s a 90% chance you won’t understand anything about it. I’ve done this on purpose. Instead of guessing people’s experiences from data points, I’d rather talk with them directly, so I can understand the full depth of their circumstances.
Last week I had five calls with different teams. We created CrossPatch accounts together. We talked about their needs and how the product might fit into existing processes. I learned more in a week than I would ever have learned with analytics in a month.
Having under 100 users early on isn’t a disadvantage. It’s a blessing. At this scale, you can talk to people directly and get the most comprehensive context. Before you get to quantitative analysis, you need a qualitative foundation first.
But how exactly is talking to people better than heavy analytics at the early stage? Let’s make a quick comparison.
How long does it take to get started with either of those learning approaches?
To be able to talk with customers, you need a direct communication channel. In most cases, that means sending an email to schedule a call. The whole process is mostly out of your control. People are busy. Not everyone wants to spend an afternoon talking about your product. So it takes a while to set up.
Setting up analytics doesn't take much time, though. It's either copy-pasting a pre-defined script or sprinkling events across the product. Sure, you still need to plan out the metrics and events you want to collect. But it doesn't take as much time as scheduling calls, because you're mostly in control.
How long does it take you to query new information and learn?
Here everything depends on your interview skills. Because you’re in a direct conversation, it takes seconds to ask the right questions. You may also discover something new from people’s responses. Often those unexpected situations lead to problems worth exploring.
Querying data from analytics is much faster on paper. But the real speed of learning is determined by the feedback delay. What you see now in the analytics is merely a reflection of actions you took in the past. The data says nothing about customers’ current frustrations.
How much do you know about the root cause of the customer’s circumstances? Can you construct the chain of cause and effect that led the customer to the struggle?
When talking to people, you can go as deep into any topic as you want. That's what I call a "respectful interrogation": asking a sequence of "Why?" questions to get to the root. So it's much easier to evaluate the true nature of the problem. Imagine how much more effective your solution would be if you understood why a problem occurs in the first place.
With analytics, you're bound by the events you defined. And those events are based on your current assumptions. Even if you keep expanding the context infinitely, you'll never get the whole picture. Analytics is only a proxy for an experience. And like any other proxy, it misses the details.
How valuable is the information you get? Can you act on it now? What impact is it going to have on customers?
The value of talking to people is proportional to the depth of the learning. When you understand the full picture, it's easier to act on the problem. You see many angles of the same case. And you can identify a cheaper solution that would still solve the user's problem.
With analytics, it takes seconds to extract the data but days to understand it. And even after all this time, you still won't get the full story. So the value of the learning diminishes because of the lack of depth and the significant feedback delay.
There are things you can’t learn without analytics. That’s for sure. Behavior patterns of a large group of people would be impossible to spot with just a few interviews. But that’s not the point.
The point is, you don’t need this information early on. Those patterns aren’t going to shape the product; they don’t matter yet. What matters is the screaming struggle users have. The unique circumstances they are frustrated about. Or an unexpected experience they found delightful.
When you’re starting out, talking with people directly happens to be the fastest way to learn about this stuff. So instead of following the “best practices,” follow what works for you.