In this ProductTank San Francisco talk, Scott Castle (VP & GM at Sisense) shows us how we can use data other than A/B tests and clickstreams to build products with longer, more complex user flows. Given that so many of us work in Enterprise and B2B products, where the user journey is rarely simple or clear, Scott provides some more advanced methods for business analysis to help us make sense of product performance in a more complex environment.
As an opener to his talk, Scott shows off the UI for one of the products he’s responsible for, highlighting the huge number of active UI elements, and the complexity and sophistication of the product. Clearly, analysing a simple clickstream of events isn’t going to help him make sense of what his users are doing, or exactly what they’re finding useful! Thankfully, there is a plethora of tools available to help manage the various kinds of data a B2B product generates, and many of them come with some kind of built-in analytics. Scott talks about blending those different data sources to help him build up an understanding of what his customers are doing, and an understanding of whether his product teams are building the right things to create customer success.
In the world of B2C, A/B testing provides a simple method of determining which offers are likely to convert more users. Clickstream tools such as Amplitude are great for collecting data from short user flows, but they don't provide much insight into more complicated products. For companies that sell multiple products or bundle multiple additional features, it's often harder to work out which features or which bundled products are driving the most user value.
Lucky for us, Scott explains a method he uses to understand the relative impact of each aspect of a B2B product on his customer base. Even luckier, it's relatively straightforward – he tracks which product capabilities his customers use within 60 days of a deal closing, and attributes the value of the deal to those features (assuming that his customers will initially be focusing on the key value they see in the product).
As he builds up that picture, he can start to work out which features and capabilities are most valuable to which segments of his market, and start planning his marketing and sales plans.
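As a rough sketch of how this attribution could work (the data shapes, account names, and the even split of deal value across features are all illustrative assumptions – the talk doesn't specify these details):

```python
from datetime import date, timedelta
from collections import defaultdict

# Hypothetical data: closed deals and feature-usage events per account.
deals = [
    {"account": "acme", "closed": date(2024, 1, 10), "value": 60_000},
    {"account": "globex", "closed": date(2024, 2, 1), "value": 30_000},
]
usage = [
    {"account": "acme", "feature": "dashboards", "used": date(2024, 1, 20)},
    {"account": "acme", "feature": "alerts", "used": date(2024, 2, 15)},
    {"account": "globex", "feature": "dashboards", "used": date(2024, 2, 10)},
]

WINDOW = timedelta(days=60)

def attribute_deal_value(deals, usage):
    """Split each deal's value evenly across the features the account
    used within 60 days of the deal closing."""
    attributed = defaultdict(float)
    for deal in deals:
        features = {
            u["feature"] for u in usage
            if u["account"] == deal["account"]
            and deal["closed"] <= u["used"] <= deal["closed"] + WINDOW
        }
        if not features:
            continue  # no usage in the window; nothing to attribute
        for f in features:
            attributed[f] += deal["value"] / len(features)
    return dict(attributed)
```

Summing the attributed values per feature across all deals gives a rough ranking of which capabilities are associated with the most revenue.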
Sometimes new features are added which solve a specific customer problem, or old features are replaced with new-&-improved versions, but the replacement doesn’t get adopted – at least not the way you’re expecting. Scott explains how something similar occurred at Periscope Data – a feature was rolled out that supported a frequent customer request, which he knew several users had already found a work-around for. What baffled him was that he could see in his data that a lot of customers were continuing to use the workaround, even though the feature was now fully supported!
Because he had feature usage data for his product, he was able to see that the number of people using the “old / hacky” method went up alongside the number of people using the newly released support. And with that knowledge, he was able to get on the phone to these customers, interview them, and understand what was happening.
As it turns out, in this case, the long-term users had simply developed a habit of using the hack, and kept on using the method they knew – and teaching it to their colleagues.
Of course, tracking the new feature was relatively straightforward – it was a clear widget that Scott's team could instrument – but tracking the old hacked solution was harder. It involved writing a SQL query that specifically looked for the markers of the work-around (which Scott's team was aware of), and then tracking that behaviour as they rolled out the new feature.
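A toy version of that kind of query might look like the following (the event names and the "marker" that identifies the workaround are invented for illustration – Scott doesn't share his actual query):

```python
import sqlite3

# In-memory events table standing in for a product-analytics warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, action TEXT, week INT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("u1", "csv_upload_hack", 1),    # marker of the old workaround
        ("u2", "csv_upload_hack", 2),
        ("u2", "native_csv_import", 2),  # the newly shipped feature
        ("u3", "native_csv_import", 3),
    ],
)

# Count weekly usage of the workaround vs. the supported feature,
# so both trends can be plotted side by side after the release.
rows = conn.execute("""
    SELECT week,
           SUM(action = 'csv_upload_hack')   AS workaround_events,
           SUM(action = 'native_csv_import') AS feature_events
    FROM events
    GROUP BY week
    ORDER BY week
""").fetchall()
```

Charting both columns over time is what lets you spot the counter-intuitive pattern Scott describes: workaround usage rising even after the feature ships.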
In the enterprise world, many features and products aren’t adopted by users on Day 1 of their release or availability. In many cases this is down to things like procurement processes, infrastructure update backlogs, and user inertia, but the result is the same – a long delay before your B2B technology is rolled out across your customers’ organisation (occasionally measured in months!)
But how do you know if a usage delay is “normal” or a cause for alarm? When you’re releasing new features, you’ll have adoption KPIs and goals, and you need to make sure they’re balanced to give a realistic expectation of your customers’ behaviour. Scott’s suggestion is to track time-to-adoption for various kinds of features and, over time, you’ll start to build up a good idea of how long it generally takes your customers to adopt different kinds of features.
Scott shares some examples from his own experiences, including a few graphs that show remarkably clear trends between customer spend and adoption delays. Of course, your customers’ behaviour will likely be specific and unique, and some correlations will be stronger than others. His point is that gathering data on representative features will help you to estimate a projection of how long your adoption cycle is likely to be, and then balance your KPIs to reflect that in a meaningful, constructive way.
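One simple way to build that baseline, sketched with invented release dates and first-use records (the talk doesn't prescribe a specific calculation), is to measure the days from each feature's release to each account's first use and summarise per feature:

```python
from datetime import date
from statistics import median
from collections import defaultdict

# Hypothetical feature release dates and per-account first-use records.
releases = {"alerts": date(2024, 1, 1), "sso": date(2024, 1, 1)}
first_use = [
    ("acme", "alerts", date(2024, 1, 15)),
    ("globex", "alerts", date(2024, 3, 1)),
    ("initech", "sso", date(2024, 5, 1)),
]

def adoption_delays(releases, first_use):
    """Median days from a feature's release to each account's first use."""
    delays = defaultdict(list)
    for account, feature, used in first_use:
        delays[feature].append((used - releases[feature]).days)
    return {f: median(d) for f, d in delays.items()}
```

With enough features tracked this way, the medians (or full distributions) become the baseline against which you can set realistic adoption KPIs for the next release.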
When getting feedback from users, their tenure with your product will determine how you can make sense of their behaviour and their feedback, as different things will be relevant to new and long-term users.
To help you find these different cohorts of users in your data, Scott recommends looking at the difference in time between when your users first started using a given feature, and when they most recently used that feature. You’ll generally see some clear patterns and plateaus that will indicate users who have reached different phases of familiarity and buy-in with your product.
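A minimal sketch of that first-use/most-recent-use gap, using invented users and dates (the bucketing thresholds you'd use to separate cohorts would come from the plateaus in your own data):

```python
from datetime import date

# Illustrative usage events: (user, date the user used a given feature).
events = [
    ("u1", date(2024, 1, 1)), ("u1", date(2024, 6, 1)),
    ("u2", date(2024, 5, 20)), ("u2", date(2024, 5, 25)),
]

def usage_span_days(events):
    """Days between each user's first and most recent use of a feature.
    Long spans suggest established users; short spans suggest newcomers."""
    first, last = {}, {}
    for user, d in events:
        first[user] = min(first.get(user, d), d)
        last[user] = max(last.get(user, d), d)
    return {u: (last[u] - first[u]).days for u in first}
```

Plotting a histogram of these spans is where the plateaus Scott mentions tend to show up, letting you draw cohort boundaries between new, establishing, and long-term users.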
A/B testing is a powerful tool, but it’s not the only tool available to product people who want to build up a data-driven understanding of their customers’ usage of their product. And it’s certainly not even the most appropriate tool in a complex B2B environment. With the aid of relatively straightforward SQL queries and good data visualisation, it’s possible to start building up a much more sophisticated understanding of your customers, and of your product’s performance in both long and short timeframes.