It was a Friday afternoon and I was feeling nervous. I was about to meet our biggest customer, an established enterprise owning some of the world’s most valuable brands. As our product helped companies, well, build their brand, it was a big deal.
The meeting was planned to take only 25 minutes, so we got straight to the point. There had been a serious legal incident with celebrity images published outside of the license agreement, and the customer was now facing catastrophic claims. As our product processed most, if not all, of the organization’s digital assets, we were asked to come up with a solution that would prevent similar incidents in the future.
Taking the challenge back to the team, we decided to use artificial intelligence (AI) to tackle the problem. Pretty soon a new problem appeared: while the AI could flag a decent number of cases, we weren’t able to deliver the reliability required for real-world use.
On Track vs On Target
Euphoric about the AI solution, the team started to investigate alternative problems the AI could solve. One achievement was the ability to recognize brands. During a follow-up meeting with the client, the team presented their results. The key users, unaware of the cutting-edge technology behind it, were not impressed. It simply didn’t contribute value to their use case.
This is a very common pattern. You start with a goal, ideate a solution, try the solution, and end up iterating to make the solution work. Along the way, the goal gets lost.
As a product manager, it’s important to keep things on target instead of on track.
Ideally, the cross-functional product team has learning built in. They navigate on vision and iterate based on the lessons they experience, almost subconsciously. Unfortunately, the truth for many product teams is somewhat different. Decisions, knowledge, and learning are not concentrated within one team.
I worked with a team at a local government organization on an app for reporting illegal dumping sites. One feature request, however, didn’t make sense: a government official asked for user authentication. The team had been focusing on the number of app installs and incidents reported, so adding a barrier seemed counterintuitive.
During a follow-up session the government official explained: “There has been a 10% increase in dumping ever since the app launched. We believe citizens are abusing it to get rid of their trash.” The team was stunned; they just couldn’t believe what they were hearing. “Offenders are clearly not engaged with the community, so why would they bother using the app?”
Initially the team started compiling a list of arguments against the official’s reasoning. Unfortunately, there was no evidence to decide one way or the other. But then they realized that, at this point, it didn’t matter: there was a far more important lesson. The team had been measuring the solution, app usage, but had never been told the actual goal.
With short cycles, MVPs, and POCs, there is plenty of opportunity to learn. Many teams hold retrospectives, and the “lessons learned” session is a ritual that originates from ancient project management best practices, dating back well before agile was conceived. Unfortunately, as my example above shows, it’s often the case that nobody asks “who needs to learn what?”
Looking at a typical design sprint, an entire day goes into framing the “what to learn”, and that’s after the team has already decided to schedule a sprint to tackle a specified problem. Defining what to learn is at least as important as executing and extracting the lessons.
All too often a lessons learned session is limited to “what did I learn” or “what could have gone better”. Possible answers mostly cover ideas on how current activities could better deliver the solution.
“The solution” is often only one option out of many. It is tempting to use lessons learned to try to improve that one solution, while it is actually more valuable to reflect on lessons learned in the light of the targeted goal. Hence it is important to realize the solution is only a tool, a means to an end. If you don’t have the right tool for the job, you should switch tools. Surprisingly, we often look to improve the tool and forget about the job to be done. The main question should be “what did we learn about the goal we’re chasing and the strategy we’re using to approach it?”
For the illegal dumping site app, there was an implicit key assumption between the solution and the intended goal. Initially it was assumed that faster clean-up would discourage future offenders and reduce the amount of illegal dumping. This assumption proved to be wrong.
Eventually it all comes down to differentiating between facts and assumptions. A strategy is built on a set of relationships between cause and effect – causalities – that will hopefully get you to your goal: if we do this, then that will happen. Some of those causalities, however, are assumptions.
The biggest challenge in identifying true causalities is that it’s not clear which are facts and which are assumptions. Or worse, which “facts” are actually assumptions. And questioning everything is simply not an option if you want to move forward.
A lessons learned session is a great opportunity to get back to basics. For a brief moment, leave the what and direct the team to the why, where, and how. Where do we want to get to, why does it have value, and how do we plan to get there? A simple goal and strategy, in practical wording.
Time and again I meet great teams working on ambitious solutions. But goal and strategy get replaced by commitments and deadlines. Especially in large enterprises, dependability is valued above anything else. A regular lessons learned session is invaluable for staying on target.
When guiding a team through lessons learned sessions it’s easy to get overwhelmed. The first time it took me several sessions and some headaches to finally compile a list of lessons learned that had true insights on goal and strategy. Questioning how and why is not an easy thing to do. Here are some guiding principles that have helped the teams I’ve worked with to deliver breakthrough solutions:
- Identify doubts and disagreements. Screen the plan and get to the underlying causalities. Does the team agree on facts and assumptions? Where did we take the biggest leap of faith?
- Look beyond the team. Other initiatives, customers, and events often provide valuable information for examining whether the team’s strategy is still valid. For me, the coffee machine is a PM hotspot for picking up interesting facts that might be valuable to the team’s journey.
- Focus. A lessons learned session provides input on what to do next, but it shouldn’t cover next steps. Deciding on plan and strategy changes is another, completely separate, discussion. It’s easy to mix the two up, but it’s important to get the facts right and to have a common understanding of the situation before discussing what to do about it.
I’ve also found that, during a lessons learned session, many questionable assumptions surface that don’t have a clear answer. I strongly believe that in most cases it’s better to ship and continue working as planned. Most answers arrive, by themselves, along the way.
Eventually the AI initiative was shut down. A little later the team discovered a point in the asset production workflow where license information was readily available. All we had to do was connect the systems and let the data flow through. The solution was not groundbreaking, but it was exactly what was needed to reach the goal.
In general there is an ingrained fear of building the wrong thing, of wasting money and effort. This sentiment can cripple an organization. We grow up learning to think before we do, but when the thinking takes months or years without anything being done (and learned), nobody gets anywhere. That is exactly the value of a lessons learned session. By all means ship, and learn while doing so.