You Don’t Have To Be As Clairvoyant As Steve Jobs To Build Great Products
3 November 2015 · 5 minute read


Musician KT Tunstall, Apple CEO Steve Jobs and Starbucks founder Howard Schultz look at new iPod Nanos at an Apple special event, September 5, 2007 in San Francisco. (Photo by Justin Sullivan/Getty Images)

It’s fashionable to characterize software product leaders as magically prescient. Steve Jobs famously brushed aside customer feedback; he “just knew” how to build software products people would love. Twitter’s Jack Dorsey is similarly clairvoyant according to popular myth, as is Facebook’s Mark Zuckerberg. And so on.

But even the greatest product managers know the reality of great product design is different – and more intriguingly complex. It’s a mix of science and art, balancing user feedback with leaps of vision. And the view isn’t always pretty. Today, many companies rely on antiquated methods like Excel spreadsheets to try to build the best, most intuitive software – hardly the best option in a data-driven world.

Which data points should product managers track, and how should that data inform product decision-making?

Product Data You Shouldn’t Live Without

When it comes to product design, nearly all software products use “internal objects” to help track vital stats. For instance, in a customer-relationship management system, popular objects for data collection would be a specific company or a contact. Think of whatever super-basic TPS report you get about your company’s stats. That’s probably tracking data using internal objects.

Internal-object data does have limits when used to think about design. It typically requires work from your engineering team to collect and report. Moreover, this data only represents the current state of the application itself – not how individuals or teams are actually using it. Without visibility into user behavior, you cannot understand a product’s usability or efficiency, which is critically important. Maddeningly, this data also requires care and feeding: instrumentation sometimes goes missing in new software releases, leaving the overall picture occasionally stale.

That’s why great PMs have become ninjas of cross-referencing data across systems. Google Analytics and other web analytics solutions like Mixpanel expand PMs’ purview dramatically. These solutions provide site traffic data like page views and visitor-level data, helping PMs answer questions like: How popular is that page? What’s the path visitors take to get task X accomplished? What color should that button be to encourage users to click? These are fantastic data points for people who make SaaS software accessible over the web. But this data can’t tell you everything, either. Your engineers may still need to massage the data to turn it into usable info. Similarly, web analytics are page-view-level only; they reveal nothing about how people use items within a page.
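To make the “path to task X” question concrete, here is a minimal sketch of the kind of funnel analysis a web-analytics tool performs under the hood. The event data, visitor IDs, and page paths are all made up for illustration; real tools like Google Analytics or Mixpanel compute this from their own event streams.

```python
from collections import defaultdict

# Hypothetical raw page-view events: (visitor_id, page), in time order.
events = [
    ("v1", "/pricing"), ("v1", "/signup"), ("v1", "/welcome"),
    ("v2", "/pricing"), ("v2", "/signup"),
    ("v3", "/pricing"),
]

# Funnel for "task X": visit pricing, then sign up, then land on welcome.
funnel = ["/pricing", "/signup", "/welcome"]

# Track how far each visitor has progressed through the funnel.
progress = defaultdict(int)
for visitor, page in events:
    step = progress[visitor]
    if step < len(funnel) and page == funnel[step]:
        progress[visitor] = step + 1

# Count how many visitors reached each funnel step.
reached = [sum(1 for p in progress.values() if p > i) for i in range(len(funnel))]
print(reached)  # -> [3, 2, 1]: everyone saw pricing, two signed up, one finished
```

The drop-off between steps is exactly the page-view-level insight the article describes – and exactly where it stops: nothing here says what users did *within* a page.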

What do users really want? Salesforce and customer service tools like Zendesk or Desk.com are other treasure-troves of product data. Complaints or inquiries can be mapped to features, so PMs can understand which features see lots of action or trip users up (and make them easier to navigate). These systems can capture features that customers want, too. Feature-request data tends to come to PMs either raw or semi-cooked, but it can be worth slogging through.

Increasingly, PMs are integrating Salesforce and customer-service data into application lifecycle management (ALM) software like Atlassian’s Jira and products from my former company Rally. PMs use these to coordinate schedules and estimates for the specific features they’re working on – the core of their jobs. Customer-service data gives them the customer’s POV on how a feature should work and how important it might be to the overall experience. Newer tools like UserVoice solicit and rank requests directly from customers, providing a datastream ready-made for PMs’ use. Some engineering and ops teams use performance monitoring solutions like AppDynamics to understand how a product is used. Those tools can reveal surprises, like great features nobody has discovered or workarounds users have figured out on their own.

Finally, user surveys and focus groups enable PMs to ask focused questions of actual users (and, often, settle internal prioritization debates). It’s easy to become enamored with the qualitative data you glean this way – actual users are telling you how they want your product to work! – but these tools aren’t perfect either. Assembling a pool of “testers” can be time-consuming, expensive and occasionally biased. Often user surveys and focus groups happen too late in the product lifecycle process to provide future-oriented guidance.

How do you bring these data-streams together and make sense of them? The most sophisticated organizations pump all the info into a data warehouse and get a data-science team to parse it. However, that process is expensive and often time-consuming. New alternatives are emerging (like my own company, Pendo) that bring all this data together in one place without the need for a data warehouse.
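As a toy illustration of what “bringing these data-streams together” can mean in practice, the sketch below cross-references per-feature usage counts (as you might pull from a web-analytics export) with support tickets tagged by feature (as you might pull from a Zendesk-style export). All feature names and numbers are invented; the point is the join, not the data.

```python
from collections import Counter

# Hypothetical per-feature event counts from a web-analytics export.
usage = Counter({"export": 1200, "dashboards": 4500, "api_keys": 90})

# Hypothetical support tickets tagged by feature, from a helpdesk export.
tickets = Counter({"export": 40, "dashboards": 15, "api_keys": 12})

# Complaint rate: tickets per 1,000 uses. High values flag features that
# see action but also trip users up; low values suggest smooth sailing.
report = {
    feature: round(1000 * tickets[feature] / uses, 1)
    for feature, uses in usage.items()
}

for feature, rate in sorted(report.items(), key=lambda kv: -kv[1]):
    print(f"{feature}: {rate} tickets per 1k uses")
# api_keys tops the list despite tiny usage - a usability red flag
# that neither data source would surface on its own.
```

Even this trivial join surfaces a question no single system answers: the heavily used feature generates few complaints, while the rarely used one generates many.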

Balancing Data With Vision

How to use all this data for smarter product decision-making – without inducing committee-think or stifling flashes of intuition? Here are a few good pointers.

Aim for fast, focused experiments. I’m a fan of The Lean Startup by Eric Ries, who advocates tight loops of experimentation and data to support product learning. Another advocate of quick-and-dirty experimentation with data is Marty Cagan of Silicon Valley Product Group.

Don’t hog the data-set! Data must be broadly accessible to matter in product decision-making. It might make your data scientists feel important to control a ginormous queue of requests, but that exclusivity won’t help the whole team make a better product.

Formalize product reviews. In his blog post The Product CEO Paradox, Ben Horowitz discusses how product-oriented CEOs can step away from the grind of product development without wrecking the product vision in the process. Great CEOs provide the filter, uniting analytics with instincts. Formalized product reviews keep everyone pulling in the same direction.

Stay open to surprises. You’ll never attain perfect product data, nor will the data interpret itself for you. The fun part of product design is occasionally abandoning your preset notions about what the product “really” needs and just observing instead.

My point is simple: don’t worship data blindly. Gaining access to better product data is empowering, of course. But data’s purpose is not merely to answer a bunch of narrowly defined questions. Think of data also as field research: a window into how people use your product in the wild. You might be pleasantly surprised at what you observe – and maybe discover a fresh opportunity for your product in the process.

Comments

I feel like this massively underestimates the power of qualitative user/UX research. Surely knowing the people intended to use your product inside out is the easiest way to create something they will love in the future? Nowhere does the article mention speaking to people (apart from focus groups, which are probably the worst form of research).

I feel knowing people’s goals, needs, motivations etc. is super valuable and arguably something that Apple does brilliantly. Talking to BlackBerry users in 2006 wouldn’t directly reveal the need for an iPhone, but talking to BB users would reveal the things they needed (email, internet) and the things they hated (crappy unresponsive software, physical keys because touch typing on screens at the time was rubbish). Plus they carried a phone AND an iPod, because that was the only way to roll in 2006, right? Put that all together – listen to people, what’s important, what they do, needs, problems, etc. – iPhone. No idea if Apple DID speak to BB users, but hopefully the point comes across.

The entire industry has been based on talking to customers. That’s the only thing that has ever happened to date. The point here is there are other ways of doing it.

Not how I read it – I read it as a data-led company talking about making product decisions and then having a very minor mention of more qual user research. With qual stuff mentioning focus groups and surveys. If I read it wrong then cool, but that’s certainly how it came across to me.

I think the minor mention of qual research and data was more driven by the thinking that everyone has always been (or should have been) doing that. The usage data collection and analysis, in my opinion, has often been lacking or non-existent.

I specifically mention user surveys and focus groups which is essentially speaking to users, so I’m not following the “massively underestimate.” Sure, you need to talk with users, but for many customers this is ALL they do. I’m asserting that this is not enough and that there are real challenges with simply doing this.

The best product managers need to triangulate data from a variety of sources — this is how insights are really derived.
