Everybody is trying to create products that make people’s lives better. However, by focusing solely on metrics rather than the humans behind them, we risk making bad decisions with unintended consequences for people and society. To do better, Kim explained, we need to focus on creating features that meet people’s needs as defined by Maslow. Once we do this, our teams can become goal-focused, values-guided, and data-informed, rather than trying to steer the ship just by looking at the speed dials.
Product managers are excited about making people’s lives better. So far we haven’t done a terrible job – we’ve put the world’s maps in our products, can hail a cab in a moment, and can have anything we like delivered to our door.
But there is evidence that the increased amount of time we spend online is making us less healthy. From mental health to the health of our democracies, we are causing problems with well-intentioned software.
There aren’t many, if any, doctors of evil sitting around working out how to bring down society. The individual design decisions are made by people like us – along with lawyers, leaders, and customer-service staff. By making it up as we go, though, we’re creating unintended consequences that are dangerous for people.
Data collection is one area we have yet to design properly. We have more information about one another than ever before, yet we have still to decide collectively how it is acceptable to use it and how to store it. From behaviour-based advertising to decisions about home loans, these are awkward calls that we need to face head-on rather than have made without conversation.
In many organisations, metrics-centred design reigns supreme. By exclusively designing features that push these metrics, we forget about the wider impact of our choices. It’s the equivalent of trying to steer a ship by just looking at the speed dials rather than out of the window.
We start to confuse the goal with the metric. Infinite scroll increases engagement, but it does so by “hooking” people on our product. When we’re sharing vocabulary with drug dealers, we probably have a problem.
How Can We Make Product Decisions Differently?
First, let’s be explicit about what human-centred design means. Using Maslow’s hierarchy of needs, we can create a set of factors to guide our decisions.
We should try to build something that supports at least one level of Maslow’s hierarchy for a group of humans – and not at the cost of another group’s needs.
Once we have a clear approach to human-centred design, we can define what’s acceptable in our profession. Currently, the digital products we’re building constitute the largest human-subject experiment in history, yet it is neither regulated nor consented to by its subjects.
The medical profession has institutional review boards which ensure that any research is conducted in line with the principles of the Nuremberg Code, a set of ethics which arose from the Nuremberg Trials. These principles explore the benefits to the subject or society, whether the approach is the only way, what kinds of harm are possible, and what you’re doing to mitigate the risk of harm.
This is also why diverse teams are crucial – the way that we perceive harm varies according to our experience.
In this kind of research, consent must be both informed and truly voluntary: it can be withdrawn at any time and cannot be given under any kind of coercion.
Let’s be goal-focused, values-guided, and data-informed, rather than data-driven.
Goals, then, are what you’re trying to achieve. Values tell us which solutions are unacceptable, along with the boundaries of those that are acceptable. For example, if you want to buy a house, you can save money or rob a bank. Both will get you a new home, but one is unacceptable because of the values most of us hold.
Your teams need to create these values for themselves, and these values can then guide the decisions being made in complex circumstances.
Measuring What Really Matters
Never put up a naked metric – always consider the goal that it relates to. It’s also up to you to understand what trade-off you’re making in order to move the metric. If you don’t know the other side of your decisions, then you’re more likely to build something that runs the risk of doing harm.
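The pairing of a metric with its goal and its trade-off can be made concrete in a team's dashboards or reports. A minimal sketch of this idea follows; the class, field names, and example values are illustrative assumptions, not anything prescribed by the talk.

```python
# Hypothetical sketch: never report a "naked" metric. Each metric is
# stored alongside the goal it is a proxy for and a counter-metric that
# surfaces the trade-off being made to move it.
from dataclasses import dataclass


@dataclass
class MetricCard:
    metric: str          # what we measure
    goal: str            # the outcome the metric stands in for
    counter_metric: str  # what might degrade if we push the metric

    def render(self) -> str:
        # Force the goal and trade-off to appear wherever the metric does.
        return (f"{self.metric} (goal: {self.goal}; "
                f"watch: {self.counter_metric})")


engagement = MetricCard(
    metric="daily time in app",
    goal="people find what they need quickly",
    counter_metric="self-reported regret after a session",
)

print(engagement.render())
```

The point of the structure is that a metric cannot be rendered without its context, so anyone reading the number also sees what it is meant to serve and what it might cost.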
Until we learn to measure what we value, we will continue to make bad decisions. Take responsibility for generating insight into your users and how your product makes their lives better.