The data says your website sucks – Rachel Obstler on The Product Experience

Mind the Product Podcast | 10 October 2021

Why do good people make bad products? You know the feeling: you’re trying to complete a transaction on someone else’s site or app, and it just. Does. Not. Work. You wonder who the poor product manager is for it — or if there even is one. And then you remember the last time you sat in on customer support calls, and how hard your own product can be to use.

How does this happen? We chatted with Rachel Obstler, Heap’s VP of Product, to understand how we can use metrics to suck less.

Featured Links: Follow Rachel on LinkedIn and Twitter | Heap | ‘Why Every UX Designer Should Be a Data Analyst’ (Modus article)

Support our sponsors

Give Sprig (formerly Userleap) a try for free by visiting Sprig.com to build better products.

Episode transcript

Lily Smith: 

Did you know that 22% of consumers think the majority of websites are harder to use than they should be? And 80% of people think fewer than 75% of websites have clear explanations of what they’re for and how to use them properly.

Randy Silver: 

I mean, really, that’s a lot of numbers. And it sounds like you’re trying to blame me for all of those. I mean, I know I’ve worked on a lot of things, but not nearly that many sites. But doesn’t

Lily Smith: 

it just make you feel a little bit responsible for some people having a bad day, though?

Randy Silver: 

You can’t pin this on me. I mean, I’ve always tried to make it better. But I can’t deny that in some companies, it did feel like a losing battle.

Lily Smith: 

Okay, well, it’s a good thing we talked to Rachel Obstler, VP of Product at Heap, about how we can use data in a better way to understand how our customers use our sites, and tackle the challenges that they’re facing.

Randy Silver: 

Ooh, that sounds like it could be really useful. Let’s, let’s get right to it.

Lily Smith: 

The Product Experience is brought to you by Mind the Product.

Randy Silver: 

Every week, we talk to the best product people from around the globe about how we can improve our practice, and build products

Lily Smith: 

that people love. Head to mindtheproduct.com to catch up on past episodes, and to discover an extensive library of great content and videos.

Randy Silver: 

Browse for free, or become a Mind the Product member to unlock premium articles, unseen videos, AMAs, roundtables, discounts to conferences around the world, and training opportunities.

Lily Smith: 

Mind the Product also offers free ProductTank meetups in more than 200 cities, and there’s probably one near you. Hi, Rachel, thank you so much for joining us on The Product Experience. It’s really lovely to be talking to you today.

Rachel Obstler: 

Oh, yeah, it’s great to be here. Very excited.

Lily Smith: 

And our topic today is all around kind of usability and analytics. But before we get started, it’d be great to have a quick intro into your background and what you’re doing today.

Rachel Obstler: 

Sure. So first, yeah, thanks for having me. I’m Rachel Obstler. I am currently the VP of Product at Heap Analytics. And I have been in various product roles through most of my career. So I’ve had a very small stint in consulting, then a little bit of product marketing, a little bit of general management, but largely, I’ve worked in product management for over 20 years, because I love it.

Lily Smith: 

So presumably, when you started at Heap, you kind of knew lots about the struggles that product managers have with analytics, or the joy of some of the data that you can get from analytics in order to improve your product. But how did you feel about analytics before you started at Heap?

Rachel Obstler: 

I’ve always been very interested in data. In fact, some of the organisations I’ve worked with in the past have been products that were data products, I guess you could say. And yes, I was very intrigued by the idea of being able to have access to all the data you need, when you need it, in an easier way. I’ve also always been very interested in combining data with the softer skills that you often need as a product manager. So yes, I was really interested in Heap and the idea of making analytics a lot easier to get and accessible for a broader set of people, and not just product management: also product marketing, customer success, product analytics professionals, just everyone.

Lily Smith: 

It’s such a fundamental piece of the puzzle that is product management, isn’t it?

Rachel Obstler: 

It is, and I think there’s a real role that analytics can play around aligning teams. I think of what happens when you’re a product manager and you find a piece of insight, right? What do you do? Well, you don’t just stop there, you share it with your team; you want them to know that answer, that piece of information. So I just think there’s a big role for analytics to play in helping align teams, as well as in making the right decisions.

Lily Smith: 

And your business Heap undertook a study in which it was found that more than a third of consumers would rather get stuck in rush hour traffic than use a bad website, which I thought was kind of hilarious and also kind of awful at the same time. So why do you think we get so agitated with, you know, badly usable websites?

Rachel Obstler: 

Yeah, so that was my favourite question in the survey. I think it’s primarily because, well, number one, people are busy. But number two, they know what good looks like now. Everyone’s gone on to the Amazon website, regardless of what you think of Amazon, and they really push the boundaries of speed and ease of use; the one-click purchase is just a great innovation. And when you get used to that type of performance, not having it in the other applications you use is just really noticeable. So I think it’s just that expectations are being set and constantly exceeded, and everyone then raises their expectations.

Randy Silver: 

So if we know what good looks like, why are there still so many crap sites out there?

Rachel Obstler: 

That is such a good question. There are many reasons for that. We were talking a bit about data before; it may just be that certain groups don’t have access to the data that they need. We talk to a lot of companies, you know, prospects, customers. And a high percentage of them are pretty mature when it comes to using data to drive decisions, but there are just as many that are very early on, largely making decisions based on gut instinct. Maybe it’s a combination of gut instinct and talking to customers. But the fact is, for some of these user journeys, these user experiences that you have to perfect, you’re not going to get the type of insight you need just from talking to people; it’s hard for people sometimes to articulate why they didn’t buy something, or why they got stuck somewhere, right? You really need a combination of the anecdotal and the data to guide you through. So part of it is maybe not having the data, or sometimes it’s just cultural, that the organisation is used to making gut decisions. And in order to change a culture like that, you really need to do it both bottom-up and top-down: you need to have people who are doing things in a different way, and you need the leadership to be encouraging and rewarding people for using data to make decisions, as opposed to just gut decisions.

Randy Silver: 

So Rachel, that sounds really interesting. I may or may not have worked on projects in the past where the incentive structure for the company and the incentives for the customer were not aligned; we might have done things like force people to sign up for an account before we let them actually do anything. How do you deal with things like that? How do you change the culture in a company when you’ve got a gut feeling that you’re not serving the customer’s best interests?

Rachel Obstler: 

That is a really good question. And of course, we’ve been talking about data; data plays a huge, if not the predominant, role in doing that. The best way to show that you should be doing something different is to try it and see what happens. And there are a lot of ways to do that so that you’re not completely shutting off the way things work today: you can do it with a small sample set, or you can run an experiment for a small period of time, so it has a minimal impact on the business. But it gets you the data you need to make the business case that, for instance, in your case, allowing people to try out some stuff before making them create an account is going to lead to better results over time. I think of an analogous example, which is that marketing teams have really been trained to gather as much intel and information upfront as they can before letting people try something, because that’s what tells you which of those people you should forward to sales, which of those people are going to be good leads. In a world of digital and product-led growth, the most important piece of information is what people are doing in your product, right? That’s the thing that really tells you what offers you should give them, whether or not they’re going to make a good customer, whether or not there’s a good fit there. And that’s much better information than anything you know about where they live or what demographics they’re in. So it’s really just a mindset shift. And the best way to make that shift is through, I think, experimentation and proving it.
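The kind of small-sample experiment Rachel describes could be sketched as a simple comparison of two conversion rates with a two-proportion z-test. All numbers below are hypothetical, and the helper function is our own illustration, not anything from Heap:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: control forces account creation before use,
# the variant lets a small sample of users try the product first.
z = two_proportion_z(conv_a=120, n_a=1000, conv_b=156, n_b=1000)
print(round(z, 2))  # 2.33 – above 1.96, so significant at the 5% level
```

If |z| exceeds 1.96, the difference is significant at the 5% level, which is exactly the kind of business case Rachel suggests bringing to leadership.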

Lily Smith: 

And a lot of analytics tools these days, or maybe not all of them but some of them, can capture the actual end-to-end journey, like a recording of the full session. And we’re kind of trained as product managers to really think about the ethics behind capturing data on our users. This, to me, feels almost like another step down that route of, you know, should we actually be capturing all of this information? What’s your view on that?

Rachel Obstler: 

Yeah, I mean, I think it’s important to make sure that you are following whatever rules and guidelines your company has around identity management. So as long as you’re protecting the identity of the user, whether you’re doing it with a key or some other way, you have a concept of who that user is: you can follow the whole journey, you know where they started and where they stopped and that this was the same person, but you don’t need to know who they are. And the other thing is, a lot of times you do want to look at an individual user journey, because again, you want to see exactly what is happening and where people get stuck. But it’s not about that one user, right? It’s that there’s a whole set of users getting stuck at this particular point, and therefore you want to understand what’s happening there. So I think it just comes down to making sure that you’re being responsible with identity, and then looking at things in aggregate; and if you do look at things on an individual user basis, you’re doing it as an example of something that is happening in aggregate, and you’re protecting the identity.

Lily Smith: 

And I think one of the other really fascinating stats from the study that Heap did was that, I think it was, 43% of consumers think that the user experience wasn’t designed around their needs. But 95% of product teams say that it’s very easy to navigate their site. So there’s a huge disconnect there. How does that disconnect happen? And as product managers listening to this, where can we find the 43% of people who are using our site, and help them better?

Rachel Obstler: 

Yeah. You know, we all intend as product people to build amazing user journeys, right? We all intend to put out the perfect thing that works from the beginning. And the reason why there’s that disconnect is because it’s hard. It’s hard to do that. What digital transformation is about is creating net new customer journeys or user journeys: you’re either digitising something that, in the past, could only have been done in person, or maybe you’re creating new experiences that could never have happened before. And when you’re creating things that are new, you are by definition doing something that no one’s done before, and it’s hard to make it perfect to begin with. So what is really key is having the information you need, so that you know where people are getting stuck, you know where the friction is. And that is actually not the easiest thing to do in the world either. That’s why it’s so important not to be caught up with what you think the happy path is, right? ‘I’ve designed this, here’s the journey, these are the steps that users must take, and I will measure whether they take them.’ Because the reality is that users don’t do what you expect them to do. They find every which way to accomplish something that you have not expected. So you need to know what they’re doing: maybe they found a better way to do something, maybe they’re getting stuck in a place that you just didn’t anticipate. This is really what product management is about today, especially with digital products: how do you make that user journey as seamless and as understandable as possible? And, back to that gut thing, you don’t assume you know why people are completing something or not, but you look at the data to figure out where things are going wrong.

Randy Silver: 

Okay, so that’s the tricky part, right? Because there’s potentially an awful lot of data available, and you start harvesting it and making it available to people, and they all want to look at it from their own perspective. So how do we get set up? How do we make sure that we’re actually instrumenting the right things, and looking at the data in a way that’s useful, rather than just from the perspective of, say, the measure my bonus depends on, or my KPI?

Rachel Obstler: 

So I would say that you don’t, right? If you’re starting from the point of view of a bunch of assumptions that you need to make about the data you need to have, then you might have already missed something; you probably did, right? So start from the point of view, if at all possible, of collecting everything. Just collect it all, and let the data tell you what’s happening. And there are a lot of places where you do have to make some assumptions, like, okay, I want to understand the user journey from this point to this point, right? So you have to make some assumptions, but don’t assume you know what the important points in there are, or that you know every step that is important. One thing that we’ve done with the Heap product is we released a feature called effort analysis, and our customers have just been loving it. The reason why is, what it does is you start out with a user journey, so you’ve got a path and you create these milestones that you think are the milestones, and it tells you how much effort and time the users are actually spending in each of those sections, each of those milestones. And it’s amazing how insightful that is, just to see: there are actually, on average, 50 interactions happening in this milestone to get to the next one, and there’s a huge drop-off, and people are spending three to five minutes on this, and they’re often leaving the site and coming back to complete it. There’s a lot you can learn just from capturing all the data and seeing what level of effort is actually happening on a site. And then you can start diving in, and you can start coming up with hypotheses, like why are people doing all these things? You can look at a replay of the users that are going through this and getting stuck, and see exactly where they’re getting stuck. But you’ve got to start with all the data to then tell you where you should be diving in.
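Heap’s effort analysis feature is proprietary, but the basic idea, average interactions and elapsed time per milestone, can be sketched in a few lines of Python over a toy event log. All names and numbers below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical event stream: (user_id, milestone, timestamp_in_seconds)
events = [
    ("u1", "search", 0), ("u1", "search", 20), ("u1", "checkout", 95),
    ("u2", "search", 0), ("u2", "search", 30), ("u2", "search", 60),
    ("u2", "checkout", 200),
]

def effort_by_milestone(events):
    """Average interactions and elapsed seconds users spend in each milestone."""
    per_user = defaultdict(lambda: defaultdict(list))
    for user, milestone, ts in events:
        per_user[user][milestone].append(ts)
    totals = defaultdict(lambda: [0, 0, 0])  # interactions, seconds, users
    for milestones in per_user.values():
        for m, stamps in milestones.items():
            totals[m][0] += len(stamps)
            totals[m][1] += max(stamps) - min(stamps)
            totals[m][2] += 1
    return {m: (i / u, s / u) for m, (i, s, u) in totals.items()}

print(effort_by_milestone(events))
# {'search': (2.5, 40.0), 'checkout': (1.0, 0.0)}
```

A milestone showing many interactions and a long dwell time, like “search” here, is exactly the kind of friction signal Rachel describes surfacing before diving into session replays.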

Lily Smith: 

Sprig, formerly UserLeap, is an all-in-one product research platform that lets you ask bite-sized questions to your customers within your product or existing user journeys.

Randy Silver: 

Companies like Dropbox, Square, Opendoor, Loom, and Shift all use Sprig’s video interviews, concept tests, and micro surveys to understand their customers at the pace of modern software development, so they can build customer-centric products that deliver a sustainable competitive advantage.

Lily Smith: 

So if you’re part of an agile product team that wants to obtain rich insights from users at the pace you ship products, then give Sprig a try for free by visiting Sprig.com. Again, that’s Sprig, S-P-R-I-G, dot com. So how does that fit in? It feels like it goes very much against a common trend now of just having your North Star metric and, you know, hanging everything off of that and focusing on your one thing, the thing that matters the most to the business or to the user or whatever. It feels like a very different approach to that. So what’s your view of North Star metrics?

Rachel Obstler: 

Yeah, so that’s not necessarily inconsistent, but let me explain. So let’s say you have a North Star metric. And your North Star metric is, I’m trying to remember what Facebook’s was, but it was something like 10 friends set up in the first week or two weeks. So that’s a North Star metric, and with it you can measure, you know, how many of our users are getting there, right? How many of them are completing this? How many are taking longer or shorter to complete it? But do you know why the ones that are not getting there are not getting there? Are they getting stuck somewhere in the site? Is there some friction that we don’t know about? That’s where you have to really dive in. So I think they’re two very different things. One is: what is the metric that we want to look at to figure out whether we think our business is moving in the right direction? The other is: what sort of user journeys do we have, is there friction inherent in them, and how do we reduce that friction? So once you find out your North Star metric maybe isn’t going in the right direction, what’s actionable, right? How do you figure out what to do to fix it? I think you also kind of asked about North Star metrics in general, and I think they’re really interesting. They’re tough if you’re not really sure what yours should be. I think that’s the hardest thing about them: there are plenty of great examples of companies that picked really good North Star metrics, and then there are plenty where they didn’t necessarily. And I’ll tell you, from Heap’s point of view, we kind of have a North Star metric; it has to do with monthly querying users. But I’m not convinced that we’ve gotten to the point where I would elevate it to the North Star metric. I think we’re still early, there are a number of different metrics that we’re looking at, and we’re still kind of figuring out what we want that North Star metric to be. So I think they can be really great, but the trick is picking one that really is the right thing to drive your business.

Randy Silver: 

For the companies that you see getting value out of the way they do analytics, is there something in common? Is there a North Star for getting your analytics up and running well?

Rachel Obstler: 

That’s a good question. There are a number of things that are important, I would say, and I always get back to, I forget who coined this idea: people, process, technology. As in, if you try to solve the problem just with technology, you might fail. If you try to solve it just with org structure, culture, people, that type of thing, you might also fail, because you don’t have the tooling you need. And if you try to solve it just with process, and you don’t have either of the other two, that’s also a problem, right? So I think the companies that do this well approach the problem in a holistic way. They’re thinking about what’s the right tool to get the data, the analysis, and the capabilities I need. And I should say, it’s not just one tool, it’s often a tool chain. So you want to make sure that your analytics is connected to, say, your product, but also to what you’re doing around in-app messaging or email sending. I think, Lily, you mentioned this earlier, but you want this view across the entire customer journey; the customer journey doesn’t just include when you’re on someone’s website, it also includes what emails you’re sending them, whether you’re popping up messages in-app. So it’s thinking about the technology, and it’s also thinking about the process. How often do you want people to review data? Are you expecting teams to show data when they’re talking about a new development or project they want to do? Are you making sure that there’s a good process around evaluating, after you’ve launched something, whether or not it worked? So there’s a whole practice around the data. And then lastly, there’s the culture. If someone is making decisions based on gut, is someone checking that, like, where is the data to back this up? Are people being rewarded for trying experiments, even if they fail? Because at least they’re trying an experiment, as opposed to just doing something without any basis for whether they should do it or not. So it gets back to: the companies that do it well, I think, take a more holistic approach and make sure that they have the right culture, the right staffing, the right tooling, and the right processes.

Lily Smith: 

And you mentioned there the companies that do this well, so is it just those things that you’ve described that set them apart and make them do it well? Or is there some other kind of secret sauce that they have? Do they have, like, a data guru on their team, or just a particular way of approaching it that really makes them good at this type of analysis?

Rachel Obstler: 

Yeah, you know, one thing I’ve noticed, and we do the same thing at Heap, and I think Randy alluded to this before: data is not easy, right? I’ve worked with a lot of product managers that are very good at it, but it’s still not their sole job to really understand data, to do correlations, to really figure out what’s driving things, to figure out what’s the best metric to correlate to our business outputs. It can be really hard stuff to do, to get predictive about things. So the companies that do this well get in the right expertise to do the more complex things. That could include a data science team; that could include people who are focused on product analytics, making sure that the tooling is in place, that you have the right tech stack for it. And the thing that comes to mind, and I think a lot about this, is product-led growth: the idea that nowadays the customer, or the prospect, is really in the driver’s seat. They want to be able to come to your website, figure out what you do, try it for themselves, and see if they get value. Not everyone does, some still want to talk to sales, but many people want to operate that way. A company that has really capitalised on that and done it well is Atlassian. They spend a lot of time hiring data scientists, and those data scientists are really focused on growth: on making sure that they understand what users are doing, what new capabilities those users should be offered, when to offer those things, and how to make the process very smooth. There are many companies doing this, but if you look at companies that have been really successful in the last couple of years, they’ve really capitalised on this idea of self-service and making it easy for people to try what they want to try. 
And in that case, as with other product capabilities as well, having data scientists and product analysts that can really focus on that and help the product team is really important.

Randy Silver: 

Is this one of those places where we can make the mistake of trying to make it smooth or easy for everyone, instead of trying to optimise for the best segments? Because, you know, that might actually be making it harder for some people, but they may not be good customers. Is that the type of thing that working with data scientists can help us with?

Rachel Obstler: 

Potentially. I mean, data scientists can help with a lot. They could help with, again, correlation to different business metrics that you care about. They can help with figuring out, hey, what is the optimal path through what’s happening here. But I think that what you’re mentioning there also comes down to, in general, a business strategy and prioritisation call. So you can look at the data there, but you also have to think about where you’re going as a company: what segments are you going to focus on? Sometimes it’s a focus issue, right? Sometimes you can’t prioritise every single segment that’s out there, and you have to make some choices. So it is a very good question: should you focus, or should you be more broad? And I would say it’s almost an impossible one to answer in general; it depends on your specific situation. And I don’t know if you have a better answer for that one. I don’t think I have a better answer for that

Lily Smith: 

one. And just thinking about those poor people who find that they’re experiencing the equivalent pain of being stuck in rush hour traffic when they’re using a bad website, how do we go about using quant data to understand where people are experiencing usability issues?

Rachel Obstler: 

Yeah. So one place to start is to just see where users are getting stuck or falling off. Let’s say you have a user journey in mind; you have a start point and an end point. Start by knowing, first of all, what percent of users are dropping off, then find exactly where they’re dropping off. That’s where having all the data is helpful, because it’s possible that there is one field, in one form, in one part of your process that is the thing causing a lot of friction. Being able to have a data set that identifies that for you, and will even surface it automatically, is hugely helpful: it saves a lot of time, may get you to answers you would not have gotten to otherwise, and at the very least will get you to the answer way faster than you would have otherwise. So that gets back to that idea of having all the data, but also this idea that if it’s obvious where users are falling off, the system could even just say, here’s where it is, right here. There are other ways to get at that; you can always just blindly try a bunch of experiments. But if you know exactly where it is, it’s much easier to make a change that is going to have a really big impact. And that’s, for instance, what a lot of our customers say Heap helps them with: the idea that they can just have an impact a lot faster, because it’s much more precise about where the issue is.
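The “find exactly where they’re dropping off” step can be sketched as a simple scan over funnel counts. The step names and numbers here are made up for illustration, not from any real Heap data set:

```python
def biggest_drop(funnel):
    """Find the funnel step with the largest relative drop-off.

    funnel: ordered list of (step_name, users_reaching_step) tuples.
    Returns the worst step and the fraction of users lost entering it.
    """
    worst_step, worst_rate = None, 0.0
    for (_, n_prev), (step, n_step) in zip(funnel, funnel[1:]):
        drop = 1 - n_step / n_prev  # fraction lost between consecutive steps
        if drop > worst_rate:
            worst_step, worst_rate = step, drop
    return worst_step, worst_rate

# Hypothetical checkout funnel
funnel = [("landing", 10000), ("cart", 4200), ("address form", 3900),
          ("payment", 1100), ("confirmation", 990)]
step, rate = biggest_drop(funnel)
print(step, round(rate, 2))  # payment 0.72
```

Here the payment step loses roughly 72% of the users who reach the address form, which is where you would point your session replays and experiments first.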

Lily Smith: 

Okay, so for those listening who are dying to get back to work and get stuck into their data (or maybe they’re at work, but they’re just having a coffee break and listening), what are your top tips for product people working on analytics? What’s the advice that you find yourself always giving to product managers?

Rachel Obstler: 

On analytics? Let’s see. So first of all, let’s say you’re a product manager that is new in a company or working on a new product. I would definitely say start by just doing a baseline of the performance of the product, and figuring out what those metrics are that tell you what that baseline is. There are a lot of standard ones; they don’t always work 100% across every single product, so you have to think about what makes sense for your product. But standard ones are things like daily active users or monthly active users. As I mentioned, MQUs earlier for Heap: we do something slightly different. It’s not a monthly active user, we call it a monthly querying user, someone that got some insight out of Heap. So we ourselves have adjusted a bit based on what the product does and what makes sense. But looking at feature usage of the area that you are in, there’s a set of baseline metrics that you can put together, and it starts to give you a picture of just where you are starting, right? And then beyond that, we talked a bit earlier about North Star metrics; I don’t think you have to decide on the very one North Star metric, if that’s a lot of pressure. But start figuring out what your main KPIs are, what are the main things that you want to drive. And it’s okay for there to be a couple of them; try to limit it to five, and five is a lot, three is better. They should align very closely to the goals you have for the product. So if you’re trying to drive more usage of a certain feature, that could be a KPI. If you’re trying to drive more monthly or daily active users in the product, that can be a KPI. But it’s something that, as a product manager, number one, is important to the business, and number two, you can drive: you can see a line of sight to the changes you can make. 
I’ll give a really good example. We have a team that focuses on integrations in Heap, integrations with other tools. It could be tools sending other types of data into Heap that we want to have in the system; it could be tools that we’re sending data out to, because we want them to take some sort of action, like send someone an email. One of the things that we really wanted to do was make sure that more of our users who signed up for an integration deployed it successfully, because we saw some drop-off there. So the main metric, the team KPI that this team was focused on, was: within a week, are customers who sign up for an integration actually deploying it and using it successfully? And just having that focus, and making a couple of changes in the ease of use of setup, moved that metric about 10% in a quarter. So it’s just a great example of having something that is measurable, that you can have an impact on, and that is important to the business. And once you have those things set up, then you can start thinking about something like: is there a North Star metric that we care about? But I think just starting by baselining performance, and then coming up with the key KPIs that you think are important, even just for that quarter, is a really good start.
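The baselining Rachel recommends, computing standard metrics like DAU and MAU from whatever event log you have, might look like this toy sketch. The user IDs and dates are hypothetical:

```python
from datetime import date

# Hypothetical usage log: (user_id, day_active) pairs
log = [
    ("u1", date(2021, 10, 1)), ("u1", date(2021, 10, 2)),
    ("u2", date(2021, 10, 1)), ("u3", date(2021, 10, 15)),
    ("u1", date(2021, 10, 15)),
]

def dau(log, day):
    """Distinct users active on a given day."""
    return len({u for u, d in log if d == day})

def mau(log, year, month):
    """Distinct users active at any point in a given month."""
    return len({u for u, d in log if (d.year, d.month) == (year, month)})

print(dau(log, date(2021, 10, 1)))  # 2
print(mau(log, 2021, 10))           # 3
# DAU/MAU "stickiness" is a common derived baseline metric
print(round(dau(log, date(2021, 10, 1)) / mau(log, 2021, 10), 2))  # 0.67
```

The same counting pattern adapts to a product-specific definition like Heap’s monthly querying users: just change the predicate that decides which events count as “active”.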

Randy Silver: 

Rachel, is there a way of balancing leading and lagging measures that you need to worry about when you’re doing this? Because I’ve seen people focus way too much on lagging in the past. And I’m just curious how you deal with that.

Rachel Obstler: 

Yeah, that is such a good question. Lagging metrics are the bane of my existence, because, you know, they lag, and they’re really hard to move in a timely fashion. Sometimes you end up with lagging metrics, and the reason why you end up with them as a product team is because they’re important business metrics, right? My favourite pet peeve is NPS. NPS is a very lagging metric, but it’s an accepted metric across the industry for understanding how well a product is liked and how usable it is, so it’s kind of hard to not use it and not report on it. So when you have a lagging metric, it is really important to then figure out what your leading metrics are, the ones that are going to tell you whether or not you’re heading in the right direction, so that you should see that lagging metric move in a period of time. So I think what you said, Randy, about balancing it is really important. Any time you have a lagging metric, you’ve got to figure out what your leading metric or metrics are going to be, and focus on moving those.
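One rough way to sanity-check whether a candidate leading metric actually leads a lagging one like NPS is to correlate the two series at different lags: if the candidate really leads, the correlation should hold up at positive lags. A minimal sketch; the metric names and weekly numbers below are invented for illustration, not from the episode:

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative weekly series (made up): activation rate as a candidate
# leading metric, NPS as the lagging metric reported to the business.
activation = [0.40, 0.42, 0.45, 0.44, 0.50, 0.53, 0.55, 0.58]
nps        = [30,   30,   31,   33,   32,   36,   38,   40]

def lagged_corr(leading, lagging, lag):
    """Correlate leading[t] with lagging[t + lag] (lag in weeks)."""
    if lag == 0:
        return pearson(leading, lagging)
    return pearson(leading[:-lag], lagging[lag:])

# If this week's activation predicts NPS weeks later, the correlation
# stays strong at positive lags.
for lag in range(4):
    print(lag, round(lagged_corr(activation, nps, lag), 2))
```

Correlation is not causation, of course; this only tells you a candidate is worth watching, not that moving it will move the lagging metric.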

Lily Smith: 

Rachel, we are way out of time. But it’s been so awesome talking to you about this topic. Thank you so much for joining us.

Rachel Obstler: 

Oh, thank you, Lily, Randy. I really enjoyed this. Happy to do it again anytime.

Lily Smith: 

So, Randy, now you know, and there’s no excuse for being the cause of someone’s frustration.

Randy Silver: 

If only it were that simple. I’m not sure all of life’s challenges can be solved by product analytics. But you know, that’s why we all go to our jobs, right? We got into this because we love a challenge.

Lily Smith: 

Exactly. And that’s why you and I are here to help where we can. So if you like this and you want to hear more, subscribe and like.

Randy Silver: 

Yay.

Lily Smith: 

The Product Experience is hosted by me, Lily Smith, and me, Randy Silver. Emily Tate is our producer, and Luke Smith is our editor.

Randy Silver: 

Our theme music is from Hamburg-based band PAU. That’s P-A-U. Thanks to Arne Kittler, who runs ProductTank and MTP Engage in Hamburg and plays bass in the band, for letting us use their music. Connect with your local product community via ProductTank, our regular free meetups in over 200 cities worldwide.

Lily Smith: 

If there’s not one near you, you can consider starting one yourself. To find out more, go to mindtheproduct.com/producttank.

Randy Silver: 

ProductTank is a global community of meetups driven by and for product people. We offer expert talks, group discussion, and a safe environment for product people to come together and share learnings and tips.

[buzzsprout episode='9273691' player='true'] Why do good people make bad products? You know the feeling: you're trying to complete a transaction on someone else's site or app, and it just. Does. Not. Work. You wonder who the poor product manager is for it — or if there even is one. And then you remember the last time you sat in on customer support calls, and how hard your own product can be to use. How does this happen? We chatted with Rachel Obstler, Heap's VP of Product, to understand how we can use metrics to suck less. Featured Links: Follow Rachel on LinkedIn and Twitter | Heap | 'Why Every UX Designer Should Be a Data Analyst'  Modus article

Support our sponsors

Give Sprig (formerly Userleap) a try for free by visiting Sprig.com to build better products.

Episode transcript

Lily Smith:  Did you know that 22% of consumers think the majority of websites are harder to use than they should be? And 80% of people think fewer than 75% of websites have a clear explanations of what therefore and how to use them properly. Randy Silver:  I mean, really, that's a lot of numbers. And it sounds like you're trying to blame me for all of those. I mean, I know I've worked on a lot of things, but not nearly that many sites. But doesn't Lily Smith:  it just make you feel a little bit responsible for some people having a bad day, though? Randy Silver:  You can't pin this on me. I mean, I've always tried to make it better. But I can't deny that in some companies, it did feel like a losing battle. Lily Smith:  Okay, well, it's a good thing. We talked to Rachel obsolesce, EDP at heep, about how we can use data in a better way to understand how our customers use our sites, and tackle the challenges that they're facing. Randy Silver:  Ooh, that sounds like it could be really useful. Let's, let's get right to it. Lily Smith:  The product experience is brought to you by mind the product. Randy Silver:  Every week, we talk to the best product people from around the globe about how we can improve our practice, and build products Lily Smith:  that people love. Because it mind the product calm to catch up on past episodes, and to discover an extensive library of great content and videos, Randy Silver:  browse for free, or become a mind the product member to unlock premium articles, unseen videos, ama's roundtables, discounts, discount store conferences around the world training opportunities. Lily Smith:  Mind product also offers free product tank meetups in more than 200 cities. And there's probably one way you Hi, Rachel, thank you so much for joining us on the product experience. It's really lovely to be talking to you today. Rachel Obstler:  Oh, yeah, it's great to be here. Very excited. 
Lily Smith:  And our topic today is all around kind of usability and analytics. But before we get started, it'd be great to have a quick intro into your background and what you're doing today. Unknown:  Sure. So first, yeah, thanks for having me. I'm Rachel oppler. I am currently the VP of product at heap analytics. And I have been in various product roles through most of my career. So I've had a very small stint in consulting, then a little bit of product marketing, a little bit of general management, but largely, I've worked in product management for over 20 years, because I love it. Lily Smith:  And did you know, so presumably, when you started at heep, you kind of knew lots about the struggles that product managers have with analytics, or the joy of some of the data that you can get from analytics in order to improve your product. But how did you how did you feel about analytics before you started it? Here? Rachel Obstler:  I am, I've always been very interested in data. In fact, like some of the organisations I've worked with in the past have been products that were data products, I guess you could say. And and yes, I was very intrigued by the idea of being able to have access to all the data you need when you need it in an easier way. And, and I I've also been always very interested in both combining like data, and also the more soft skills that you often need as a product manager. So yes, I was really interested in heap and the idea of making analytics a lot more easy to get and accessible for a broader set of product and and not just product management, also, Product Marketing, customer success, product analytics professionals, just everyone Lily Smith:  is such a fundamental piece to the puzzle that is product management, isn't it? Rachel Obstler:  It is and i and i think there's a real role that analytics can play around aligning teams. So I think I think of what happens when you're a product manager and you find a piece of insight, right? 
Like what do you do? Well, you don't just stop there, you share it, right? Like you share it with your team, you want them to know that answer that that piece of information. And so so I just think there's a big role for analytics to play in helping align teams as well as making the right decisions. Lily Smith:  And your business heep undertook a study in which it was said that more than a third of consumers would rather get stuck in rush hour traffic then use a bad website which I thought was kind of hilarious and also kind of awful at the same time. So why do you think we get so agitated with, you know, bad usability websites? Rachel Obstler:  Yeah, so that was my favourite question in the survey. I think it's primarily because, well, number one, people are busy. But number two, they know right now they know what good looks like. And so if you've seen it working, and I everyone's gone on to the Amazon website, regardless of what you think of Amazon, and, and they really just push the boundaries of like speed and ease of use, I mean, the one click purchase is just like a great innovation. And so when you start getting used to that type of performance, when you don't have it in the other applications that you use is just really noticeable. So I think it's just that expectations are being set and constantly exceeded, and everyone then raises their expectations. Randy Silver:  So if we know what good looks like, why is there still so many crap sites out there? Rachel Obstler:  That is such a good question. Um, there are many reasons for that, you know, so, you know, we were talking a bit about data before, it may just be that certain groups don't have the access to the data that they need. We talked to a lot of companies, you know, prospects, customers. And, you know, a high percentage of them are pretty mature when it comes to using data to drive decisions, but there's just as many of them that are very early on. 
And they're really largely making decisions based on gut instinct. Maybe it's a combination of gut instinct and talking to customers. But the fact is, like some of these user journeys, these these user experiences that you have to perfect, you're not going to get the type of insight just from talking to people like it's hard for people sometimes to articulate why they didn't buy something, or why they got stuck somewhere, right? You really need a combination of anecdotal and the data to to guide you through. So I think that's part of it is maybe not having the data or, or sometimes it's just cultural, that the organisation is used to making gut decisions. And in order to change a culture like that you really need to do it both bottoms up and top down, like you need to have people who are doing things in a different way. And you need the leadership to be encouraging people and rewarding people for using data to make decisions as opposed to just got decisions. Randy Silver:  So Rachel, that sounds really interesting. You know, I may or may not have worked on projects in the past, where the incentive structure for the company versus the incentives for the customer may not have been aligned, we might have done things like force people to sign up for an account before we let them actually do anything. How do you deal with things like that? Have you changed the culture in a company where you've got a good feeling that you know, you're not serving the customers best interests? Rachel Obstler:  That is a really good question. And, of course, we've been talking about data, data plays a huge if not the predominant role. And doing that, the best way to show that you should be doing something different, is to try it and see what happens. And there's a lot of ways to do that. So that you're not completely shutting off the way things work today, you can do with a small sample set, that you can run an experiment for a small period of time. 
So it has a minimal impact on the business. But it gets you the data, you need to make the business case that that, you know, for instance, in your case, allowing people to try out some stuff before making them create an account is going to lead to better results over time. I think of a an analogous or similar, maybe a similar example, which is I think that marketing teams have really been trained to gather as much Intel and information upfront as they can, before letting people try something because that's what tells you which of those people you should forward to sales, which of those people are going to be good leads. You know, in a world of digital and product lead growth, the most important piece of information is what people are doing in your product, right? That's the thing that really tells you what offers you should give them whether or not they're going to make a good customer, whether or not there's a good fit there. And that's much better information than anything you know about where they live or what demographics they are in general. And so it's it's really just the A mindset shift. And the best way to make that shift is through, I think experimentation and proving it. Lily Smith:  And a lot of analytics tools these days, or maybe not all, you know a lot of them, but some of them and capture the the actual end to end journey, like the recording of the full session. And we're kind of trained as product managers to really think about the ethics behind capturing data of our users. And this, to me feels like almost like another sort of step down that route of lakes, you know, should we be actually capturing all of this information? And and what's your, your view on that? Rachel Obstler:  Yeah, I mean, I think that, it's, it's important to make sure that you are following whatever rules and guidelines that your company has around identity management. 
And so as long as you're protecting the identity of the user, and whether you're doing it with a key or some other way, but like you have a concept of who that user is just because you can follow the whole journey, like you know, where they started, and where they stopped, this was the same person, you don't need to know who they are. And, and you and the other thing is, a lot of times like you do want to look at individual user journey, because again, you want to just see, like, what exactly is happening where people get stuck. But it's not about that one user, right? It's about there's a whole set of users that are getting stuck at this particular point. And therefore you want to understand what's happening there. Right? So I think it just comes down to making sure that you're being responsible with identity. But then looking at things that aggregate and if you do look at things on an individual user basis, you're doing it as an example of something that is happening at an aggregate and you're protecting the identity. Lily Smith:  And I think one of the other really fascinating stats from the study that that he did was that, I think it was 43% of consumers think that the user experience wasn't designed around their needs. But 95% of product teams say that it's very easy to navigate their site. So there's like a huge disconnect there. And how, like, how does that happen? How does that disconnect happen? And and as product managers listening to this, like, Where can we understand? You know, where can we find the 43% of people who were using our site and, and help them better? Rachel Obstler:  Yeah. I, you know, we all intend his product people to build amazing user journeys, right, we all intend to put out the perfect thing that works from the beginning. And and the reason why there's that disconnect is because it's hard. It's hard to do that. 
What what digital transformation is about is creating net new customer journeys or user journeys, right, you're either digitising something that could in the past, only have been done in person, maybe you're creating new experiences that could never have happened before. And so when you're creating things that are new, you are you are by definition, doing something that no one's done before. And it's hard to make it perfect to begin with. And so what is really key is then having the information that you need, so that you know where people are getting stuck, you know where the friction is. And that is also actually not the easiest thing to do in the world. So that's why it's so important to not be caught up with what you think what you think the happy path is, right? I've designed this, here's, here's the journey, these are the steps that users must take. And I will measure whether they take them, because the reality is that users don't do what you expect them to do. They find every which way to accomplish something that you have not expected. And so knowing what they're doing, maybe they found a better way to do something, maybe they're getting stuck in a place that you just didn't anticipate. This is all really what what product management is about today, especially with digital products, is how do you make sure that you make that user journey as seamless as possible, as understandable as possible? And that you don't assume again, back to that gut thing you don't assume you know why people are completing something or not, but you look at the data to figure out where things are going wrong. Randy Silver:  Okay, so that's the tricky part, right? Because there's potentially an awful lot of data available, and you start harvesting it and making it available to people and they all want to look at it from there. Oh, perspective. So how do we get set up? 
How do we make sure that we're actually instrumenting the right things, and looking at the data in a way that he saw rather than just from the perspective of, say, my bonus double measure or my KPI. Rachel Obstler:  So I would say that you don't write that if you're starting from the point of view of a bunch of assumptions that you need to make about the data that you need to have, then you might have already missed something you probably did, right? So start from the point of view, if at all possible of collecting everything, just collect it all, and let the data tell you what's happening. And, you know, there's a lot of places where you, you do have to make some assumptions like, okay, I want to understand the user journey from this point to this point, right? So you have to make some assumptions, but don't make assumptions of what are the important points in there, or that you know, every step that you think is important. You know, one thing that that we've done with the heat product is, is we released a feature called effort analysis, and our customers have just been loving it. And the reason why is, what it does is you start out with a with a user journey. So you've got a path and you create these milestones that you think are the milestones. And what it does is it tells you how much effort and time the users are actually spending an each of those sections, each of those milestones. And it's amazing how insightful that is just to see, there's actually 50 on average 50 interactions happening in this milestone to get to the next one. And there's a huge drop off, and people are spending, you know, three to five minutes on this. And they're often leaving the site and coming back to to complete it. Right? There's a lot you can learn about just capturing all the data and seeing what level of effort is actually happening on a site. 
And then you can start diving in and you can start coming up with hypotheses, like why are people doing all these things, you can look at a replay of, of the users that are going through this and getting stuck here and see exactly where they're getting stuck. But you got to start with all the data to then tell you where you should be diving in. Lily Smith:  Sprake formally usually is an all in one product research platform that lets you ask bite sized questions to your customers within your product or existing user journeys. Randy Silver:  Companies like Dropbox, square, open door loom, and shift all use springs, video interviews, concept tests, and micro surveys to understand their customers at the pace of modern software development. So they can build customer centric products that deliver a sustainable competitive advantage. Lily Smith:  So if you're part of an agile product team that wants to obtain rich insights from users at the pace you ship products, then give sprig a try for free by visiting sprig comm again, that's sprig SP r I g.com. So how does that fit in with it feels like it goes very much against a kind of common trend now of just have your Northstar metric and, you know, hang everything off of that and focus on your one thing. That's the thing that matters to the most to the business or to the user or whatever. And it feels very, it feels like a very different approach to that. So what's your view of Northstar metrics? Rachel Obstler:  Yeah, so that's not necessarily inconsistent. But let me explain like so. So let's so let's say you have a Northstar metric. And your Northstar metric is I'm trying to remember what Facebook's was, but it was like 10 friends set up in the first week or two weeks or something like that. Right. So that's a Northstar metric. And you can measure with the Northstar metric, you know, how many of our users are getting there? Right? How many of them are completing this? How many are completing it longer or shorter? 
But do you know why, like the ones that are not getting there? Why are they not getting there? Are they getting stuck somewhere in the site? Like, is there some friction that we don't know about? And so that's where you have to really dive in. And so, so so i think it's it's two very different things. One is like, what is the metric that we want to look at to figure out like, do we think our business is moving in the right direction? The other one is what sort of user journeys Do we have and is there friction inherent in them and how do we reduce that friction So it's like once you find out maybe your Northstar metric isn't going in the right direction. What's actionable? Right? And how do you how do you figure out what to do to fix it? I think you also kind of ask the question about Northstar metrics in general. And and I think they're really interesting. They're tough if you're not really sure what it should be. Right? So I think that's the hardest thing about them is that there's plenty of great examples of companies that picked really good Northstar metrics. And then there's plenty where they didn't necessarily, and I'll tell you like, from heaps point of view, we kind of have a Northstar metric like it has to do with monthly Korean users. But I'm not convinced that we've we've gotten to the point where I would elevate it to the Northstar metric, right? I think that we're still early and there's a number of different metrics that we're looking at. And and we're still kind of figuring out what what we want that Northstar metric to be. So I think they can be really great. But the trick is picking one that really is the right thing to drive your business. Randy Silver:  For the companies that you see getting value out of the way they do analytics. Is there something in common? Is there, a Northstar for getting your analytics up and running? Well? Rachel Obstler:  That's a good question. 
There are a number of things that are important, I would say, and I always get back to I forget who coined this, this idea, but like people process technology, as in, if you try to solve the problem, just with technology, you might fail. If you try to solve it just with org structure, culture, people like that type of thing, you might also fail because you don't have the tooling you need. And if you try to just solve it with process, and you don't have either the other two, it's it's also a problem, right? So I think the companies that do this, well approach the problem in a holistic way. Right? So they're thinking about what's the right tool to get the data and, and the analysis and capabilities I need. And I should say, it's not just one tool, it's often a tool chain, right? So you know, you want to make sure that your analytics is connected to say, your product, but also what you're doing around maybe in app messaging or email sending, right you want to have, I think, Lily, you mentioned this earlier, but this view across the entire customer journey, the customer journey doesn't just include when you're on someone's website, it also includes what emails you're sending them, if you're popping up messages in app, so so it's thinking about the technology, it's also thinking about the process, how often do you want people to review data? Are you expecting teams to show data when they're talking about a new development that they want to do or a project? Or are you making sure that there's a good process around evaluating after you've launched something, whether or not it worked or didn't work? So there's a whole practice around the data? And then lastly, there's a culture, right? So if someone is making decisions based on gut is someone checking that, like, Where is the data to back this up? Are people being rewarded for trying experiments, even if they fail? 
Because at least they're trying an experiment as opposed to just doing something, you know, without any basis for whether they should they should do it or not? So it gets back to the companies that do it? Well, I think, take a more holistic approach and make sure that they have the right culture, the right staffing, the right tooling and the right processes. Lily Smith:  And you kind of mentioned there are about companies that do this well, so is it just to say is it just those things that you've discovered, that sets them apart? That makes them do it well? Or is there some other kind of secret sauce that they have? Do they have, like, a data guru on their team, or just like a particular way of approaching it that really, you know, makes them good at this type of analysis? Rachel Obstler:  Yeah, you know, one thing I've noticed, and we do the same thing at heap is that um, so and I think Randy alluded to this before, like, data is not easy, right? It's and, and I've worked with a lot of product managers that are very good at it, but it's still it. That's not their job, their sole job, right to really understand data to do correlations like to really figure out what's driving things to figure out what's, what's the best metric to correlate to our business outputs. It can be really hard stuff to do to get predictive about things. So The companies that do this well, they get in the right expertise to do the more complex things. And so that could include a data science team, that could include just people who are focused on product analytics, making sure that the tooling is in place that you have the right tech stack for it. 
And when it comes to mind, and it's, I think a lot about an AI, product lead growth, you know, the idea that that nowadays, really the customer is in the driver's seat or the prospect, they want to be able to come to your website, they want to be able to figure out what you do, they want to be able to try it for themselves and see if they get value. Not everyone does, some still want to talk to sales, but many people want to operate that way. And a company that is has really capitalise on that and done it well is Atlassian, right. So they spend a lot of time hiring data scientists, as data scientists are really focused on growth, on how to make sure that they understand what users are doing. They understand what what new capabilities they should be offered, they know when to offer those things. They know how to make the process very smooth. There are many companies doing this. But if you look at you know, companies that have been really successful, the last couple years, they've really capitalise on this idea of self service and making it easy for people to try what they want to try. And in that case, as with other product capabilities, as well, having data scientists and product analysts that can really focus on that, and help the product team is really important. Randy Silver:  This is one of those places where we can make the mistake of trying to make it smooth or easy for everyone, instead of trying to optimise for the best segments. Because you know, that might actually be making it harder for some people, but they may not be good customers is that is that the type of thing these working with data scientists can help us with? Rachel Obstler:  Potentially, so I mean, data scientists can help with a lot. You know, they they they could help with, again, like correlation to different business metrics that you care about. They can help with figuring out like, hey, what is the optimal path through through what's happening here. 
But uh, but I think that, that what you're mentioning there also comes down to just in general, like a business strategy and a prioritisation call. So you can look at the data there. But you also have to think about where you're going as a company, what segments are you going to focus on? Sometimes it's a focus issue, right? Like, sometimes you can't prioritise every single segment that's out there, and you have to make some choices. So it is a it is a very good question, which is, should you focus? Or should you be more broad? And I would say, it's almost an impossible one to answer in general, it depends on your specific situation. And I don't know, if you have a better answer for that one. I don't think I have a better answer for that Lily Smith:  one. And just thinking about those poor people who find that they're experiencing the equivalent pain of being stuck in rush hour traffic, as you know, when they're using it a bad website? How do we go about using quants data to understand where people are experiencing usability issues? Rachel Obstler:  Yeah. So one place to start is to just see where users are getting stuck or falling off, right? So let's say you have a user journey in mind, you have a start point and an end point. And just knowing First of all, what percent of users are dropping off, then finding exactly where they're dropping off. Right. So that's where having all the data is helpful, because it's possible that there is one field in one form that is one part of your process. That is the thing that is causing a lot of friction. And being able to have a data set that identifies that for you, and will even surface it automatically, is hugely helpful saves a lot of time may get you to answers you would not have gotten to otherwise, but at the very least will get you to the answer way faster than you would have otherwise. So that gets back to that idea of having all the data. 
But also this idea of you know, if it's obvious where users are falling off, the system could even just say, here's where it is like right here. You know, there's other ways to get at that data. You can always just blindly try a bunch of experiments, but if you know exactly where it is, it's much easier to make a change that is going to have a real big impact. And that's, that's, for instance, that's what a lot of our customers say that the heap helps them with is the idea that they can just have an impact a lot faster, because it's much more precise about where where the issue is. Lily Smith:  Okay, so for those listening, who are dying to get back to work, or maybe they're at work, but they're just having a coffee break and listening, and get stuck into their data, what are your kind of top tips for product people working on analytics? What are the kind of the advice that you find yourself always giving to product managers? Rachel Obstler:  On analytics? Let's see. So first of all, I would say, like, let's say you're a product manager that is new, in a company or working on a new product, I would definitely say first start by just understanding like doing a baseline of performance of the product. Right? So and, and figuring out what those metrics are that tell you what that baseline is, right. And there's a lot of standard ones, they don't always work 100% across every single product. So you have to think about what makes sense for your product. But standard ones are things like you know, daily active users or monthly active users. As you know, I mentioned mq us earlier for heap. So we do something slightly different. It's not a monthly active user, we call it a monthly querying user, like someone that got some insight out of heat. 
So we ourselves have adjusted it a bit based on what the product does and what makes sense. Looking at feature usage of the area that you are in, there's a set of baseline metrics that you can put together, and it starts to give you a picture of where you're starting. Then beyond that, it's thinking about, as we talked about a bit earlier, North Star metrics. I don't think you have to decide on the one North Star metric, if that's a lot of pressure, but start figuring out what your main KPIs are: what are the main things that you want to drive? It's okay for there to be a couple of them; try to limit it to five, and even five is a lot, three is better. They should align very closely to the goals you have for the product. So if you're trying to drive more usage of a certain feature, that could be a KPI. If you're trying to drive more monthly or daily active users in the product, that can be a KPI. But it should be something that, number one, is important to the business, and number two, you as a product manager can drive: you can see a line of sight to the changes you can make. I'll give a really good example. We have a team at Heap that focuses on integrations with other tools. It could be tools sending other types of data into Heap that we want to have in the system, or tools that we're sending data out to because we want them to take some sort of action, like send someone an email. One of the things that we really wanted to do was make sure that more of our users who signed up for an integration deployed it successfully, and we saw some drop-off there.
So one of the main KPIs this team focused on was whether customers who sign up for an integration are actually deploying it and using it successfully within a week. Just having that focus, and making a couple of changes to the ease of setup, moved that metric about 10% in a quarter. So it's a great example of having something that is measurable, that you can have an impact on, and that is important to the business. Once you have those things set up, then you can start thinking about something like, is there a North Star metric that we care about? But I think just starting by baselining performance, and then coming up with the key KPIs, the ones you think are important even just for that quarter, is a really good start.

Randy Silver:

Rachel, is there a way of balancing leading and lagging measures that you need to worry about when you're doing this? Because I've seen people focus way too much on lagging in the past, and I'm just curious how you deal with that.

Rachel Obstler:

Yeah, that is such a good question. Lagging metrics are the bane of my existence, because they lag, and they're really hard to move in a timely fashion. Sometimes you end up with lagging metrics, and the reason you end up with them as a product team is that they're important business metrics. My favourite pet peeve is NPS. NPS is a very lagging metric, but it's an accepted metric across the industry for understanding how well a product is liked and how usable it is, so it's kind of hard to not use it and not report on it.
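The integration-team KPI Rachel describes, the share of customers who successfully deploy within a week of signing up, is simple to compute once you have signup and deploy timestamps per customer. A hypothetical sketch with made-up customer names:

```python
from datetime import datetime, timedelta

# Customer id -> timestamp; illustrative data only.
signups = {
    "acme": datetime(2021, 10, 1),
    "globex": datetime(2021, 10, 2),
    "initech": datetime(2021, 10, 3),
}
deploys = {
    "acme": datetime(2021, 10, 4),      # deployed 3 days after signup
    "initech": datetime(2021, 10, 20),  # deployed, but outside the window
}

def deployed_within(signups, deploys, window=timedelta(days=7)):
    """Fraction of signed-up customers whose successful deploy
    happened within `window` of their signup."""
    hits = sum(
        1
        for cust, signed in signups.items()
        if cust in deploys and deploys[cust] - signed <= window
    )
    return hits / len(signups) if signups else 0.0
```

A metric like this is measurable, movable by ease-of-use changes, and tied to business value, which is what makes it a good team KPI in the sense discussed above.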
So when you have a lagging metric, it is really important to then figure out what your leading metrics are, the ones that are going to tell you whether or not you're heading in the right direction, so that you should see that lagging metric moving in a period of time. I think what you said, Randy, about balancing it is really important: any time you have a lagging metric, you've got to figure out what your leading metric or metrics are going to be, and focus on moving those.

Lily Smith:

Rachel, we are way out of time, but it's been so awesome talking to you about this topic. Thank you so much for joining us.

Rachel Obstler:

Oh, thank you, Lily, Randy. I really enjoyed this. Happy to do it again any time.

Lily Smith:

Say, Randy, now you know, and there is no excuse for being the cause of someone's frustration.

Randy Silver:

If only it were that simple. I'm not sure all of life's challenges can be solved by product analytics. But you know, that's why we all do our jobs, right? We got into this because we love a challenge.

Lily Smith:

Exactly. And that's why you and I are here to help where we can. So if you liked this and you want to hear more, subscribe and like.

Randy Silver:

Yay.

Lily Smith:

The Product Experience is hosted by me, Lily Smith, and me, Randy Silver. Emily Tate is our producer, and Luke Smith is our editor.

Randy Silver:

Our theme music is from Hamburg-based band Pau. That's P-A-U. Thanks to Arne Kittler, who runs ProductTank and MTP Engage in Hamburg and plays bass in the band, for letting us use their music. Connect with your local product community via ProductTank, regular free meetups in over 200 cities worldwide.

Lily Smith:

If there's not one near you, you can consider starting one yourself. To find out more, go to mindtheproduct.com/producttank.

Randy Silver:

ProductTank is a global community of meetups driven by and for product people. We offer expert talks, group discussions, and a safe environment for product people to come together and share learnings and tips.