Spotify, crypto, and ethics – Cennydd Bowles on The Product Experience | Mind the Product | Podcasts, Product ethics, The Product Experience | 02 February 2022

Spotify, crypto, and ethics – Cennydd Bowles on The Product Experience


Product people are charged with representing the needs of the customer as well as the needs of the business—but what happens when these are in conflict? We asked ethics consultant Cennydd Bowles to chat with us about how to handle some tricky situations, including Spotify and Joe Rogan, crypto/blockchain, and handling customers that might contradict your (or your company’s) values.

Listen to more episodes…


Featured Links: Follow Cennydd on LinkedIn and Twitter | Cennydd’s website | NowNext Studio | ‘What is the Gini Coefficient?’ piece at CFI | Cennydd’s book ‘Future Ethics: the must-read guide to the ethics of emerging technology’

Episode transcript

Lily Smith: 

So last week I promised that Randy would be back for this week’s episode, but he’s not fully mended. So I brought in a special guest co-host: Batman. Does that make me Robin?

Randy Silver: 

If you want. Really, I mean, it’s really me, Randy, but I’ve got to milk it while my voice recovers. In the last couple of weeks I’ve started feeling a whole lot better, tonight especially. But our guest today could really be counted as kind of a superhero. He’s a former designer. He’s now a professional ethics consultant. And he’s here to talk with us about some of the great moral dilemmas that product people might face today. You know, things like Spotify and Joe Rogan, and crypto and more.

Lily Smith: 

Cennydd Bowles has also written a book on the topic called Future Ethics. So if you enjoy this chat, you can read all about it too. Now, to the pod cave! The Product Experience is brought to you by Mind the Product.

Randy Silver: 

Every week, we talk to the best product people from around the globe about how we can improve our practice, and build products that people love.

Lily Smith: 

Visit mindtheproduct.com to catch up on past episodes and to discover an extensive library of great content and videos. Browse for

Randy Silver: 

free, or become a Mind the Product member to unlock premium articles, unseen videos, AMAs, roundtables, discounts to our conferences around the world, and training opportunities.

Lily Smith: 

Mind the Product also offers free ProductTank meetups in more than 200 cities. And there’s probably one near you.

Randy Silver: 

Cennydd, thank you so much for joining us. For anyone who, A, didn’t know how to pronounce your name, and B, doesn’t recognise it: can you just give us a quick background on yourself? How did you get into the design and product space? And what are you up to these days?

Cennydd Bowles: 

Thank you, Randy. So yes, I’m Cennydd Bowles. Hi, everyone. I’ve been a designer for about two decades, working in UK tech and a little bit of Silicon Valley. I actually have a master’s in IT and thought I was going to be a software engineer coming out of that, but it was the design, the human factors stuff, that really took hold in me. So that’s where I’ve ended up. About five years ago, I started to realise, I don’t know if disillusioned is the right word, but I was starting to feel that things weren’t quite going in the direction that the industry needed. And I had a real interest in ethics. I hadn’t studied it previously or anything like that at that stage, but I was able to take a bit of time out to try and understand that space better. And the more I got into it, the more I realised there’s a hell of a lot there that we as an industry are not on top of. And so I decided this is maybe something that I could do to add some value here: to try and translate work that’s happening from moral philosophers, ethicists, writers, artists, critics and so on, and contextualise it for working designers, PMs, engineers, and so on. So that’s what I do now: I consult on ethics for tech teams, and particularly design and product teams.

Randy Silver: 

And more specifically, you’ve got a book of the same title, Future Ethics, isn’t it? So what exactly is future ethics?

Cennydd Bowles: 

Yeah, so there’s kind of a double meaning there, right? It’s, you know, what is the future of ethics, but also what is the ethics of the future? And it’s possibly the latter meaning that’s more interesting to me. I think a lot of the emerging technologies that are obviously going to really hit the mainstream in the next few years have significantly deeper moral importance than some of the ones we have today. AI is the obvious one; there’s a very strongly growing field of AI ethics. But that’s not the only space. There are plenty of other emerging technologies that pose quite significant ethical, social and political challenges. And of course, the time to have conversations about those is now, not when they come to market in 5, 10, 15, 20 years’ time. We need to try and understand what that’s going to mean for our world and how we design those things right in the first place, before potential damage is unleashed upon our markets and even on the world.

Lily Smith: 

And one of the things you mentioned there was you had a moment of considering the ethics of what you were designing. What was it specifically that you were looking into at that time, or working on, if you’re allowed to talk about it?

Cennydd Bowles: 

So yeah, I didn’t have a particular moment of conversion. It wasn’t that I was filled with horror at what I was doing or anything like that. Before I moved into that space, I was at Twitter for three years. I was design manager in the UK, so effectively heading up UK design, but I also filled in as a PM for about nine months when we were struggling to hire, so I did a bit of PMing as well. And, you know, I thought generally the company made relatively good decisions at the time, but I think it made some mistakes, and I was quite disappointed with the way it was failing to progress, particularly on things like user safety. And as I said, I had this growing interest. I’ve been lucky enough to be giving talks on design for a good decade or so, and in one of them, quite a prominent talk in 2011, I did mention I think we’re missing a trick on ethics. And that feeling, I think, stayed with me. And mostly, just diving into that space a little bit made me realise just how deep it was, and how little attention we’d been paying to the people who were on those shores before us. And so I found it doubly rewarding: diving into that, and realising maybe this is the role that I can play, to put that into context for people, as I mentioned. But it wasn’t one particular moment where I was saying I have to quit and I have to suddenly completely change my life. It was more of a gradual process of getting a little bit more and more concerned. And this was around about 2015 that that happened.
And obviously then in 2016, we had a significant growth of interest in technology ethics. A lot of it was predicated on the Brexit and Trump elections, which I think surprised people and made people think, well, the world isn’t quite rotating along the axis that we thought it was; maybe we were part of that surprise and that shock, and maybe we need to do better.

Lily Smith: 

Yeah, it’s really interesting how much emphasis there is now on ethics in all of the product talks, you know. And I think as product managers and product people, we are much more aware of it as a topic than we were even just five years ago. So before we get into other specifics, flipping back again, because Randy mentioned your book, Future Ethics: there just seems to be no end of ethical dilemmas online at the moment, but also, like you said, with future technology, self-driving cars and things like that. What’s your personal long-term vision for an ethical and safe future for us?

Cennydd Bowles: 

Goodness, I mean, that’s quite a deep question. If we’re talking future as in the next 30 years or so, then obviously the first thing we have to talk about is the climate crisis. We have to deal with that; we have to find ways to live with the enormous disruption that’s going to create. The tech industry is responsible for around about 2% of global emissions. Well, rather, data centres alone are responsible for about 2%, which is the same as the aviation industry. So we have to do better on that. But really, the answer to that is the same as my broader vision for ethical and responsible technology, which is technology that isn’t just built for the benefit of shareholders, and isn’t just built even for the benefit of users, which I think designers sometimes get a little bit too fixated upon. We also have to consider technology’s potential impact on non-users: on communities, minorities, groups of employees, for example. And do our very best according to all of those interests, and even the interests of non-human animals, and then the planet itself. So it’s about balancing and living up to our responsibility to this much broader set of stakeholders for our work. And recognising, yes, of course we want to benefit our businesses, and of course we want to benefit the user, but that’s not enough.

Randy Silver: 

Okay, I’m gonna make it easier on you and give you a couple of those. Let’s do a couple of specific role-playing exercises. Years ago, I was actually at Sirius Satellite Radio when they signed Howard Stern, gave him his own channel, gave him a few hundred million dollars to do that. And it was a really interesting thing, because there was no question he was moving away from terrestrial radio, there was a different case of how it was being regulated, and there was no question that we were the publisher of his content in some way. Now we can look at what’s going on with Spotify and with Joe Rogan. It’s an interesting thing, and they’ve paid him a lot of money to have him exclusively on their platform. If I’m a product person at Spotify, one of my goals is going to be to increase subscriptions and revenue. A leading metric for that is, I’m guessing, time spent listening; presumably, the more people listen, the more likely they are to renew their subscription, etc. And then with music, I have to pay royalties, so podcasts are a potentially more revenue-positive area, and I want to grow that. On the other hand, I’ve got this potential issue. Where would I even start? I mean, I’m not necessarily the person sitting there making the business decisions. As a product person, how do you take an ethical approach to all this? Any advice?

Cennydd Bowles: 

Hmm. I generally think product managers have more power than they realise with this kind of thing. That said, this is probably mostly a policy-driven solution. I think what Spotify really have to do is make it clear what values they’re going to expect content producers on their platform to adhere to, and then they have to enforce those. And we’ve seen across other social media properties that if you do that too late, it gets very tricky to retrospectively put something in place. So most of that action is really within policy, but product managers obviously have tactical things they can do within the product, right? You can have sort of in-app interventions, things like caution stickers, the equivalent of those parental advisory labels you used to get on foul-mouthed CDs, that kind of thing. So, you know, in-app warnings to say, hey, there’s some controversial stuff in here; bear in mind this may not meet our editorial standards, or it may be factually questionable. And of course, then PMs could say, well, here’s a resource base. This is very much the kind of thing that Facebook have done to combat myths and misinformation: pointing people who are consuming this content to alternative perspectives, verified perspectives, and saying, look, we don’t necessarily condone Mr. Rogan’s perspectives, and you might want to read more about the truth and the facts here. You can obviously do things like deprioritising material in the user interface if the user has shown that they’re not going to be sympathetic to that message. You know, if they’re subscribing to a bunch of left-wing podcasts, they’re probably not going to be wildly interested in what Joe Rogan has to say. And then, of course, you can do things like highlighting both sides. You know, say, well, here’s a podcast full of potential misinformation, thanks to Joe Rogan.
Here’s another perspective from someone who’s taking an opposing view. The problem with that, of course, is you get into the both-sidesism that plagues a lot of news properties, like the BBC and the New York Times. So assuming you have the capacity to go deeper, this is always about incentives. As you mentioned, you’re absolutely right: time spent listening is going to be your key metric. So I think you’ve got three approaches; you can try and tackle that metric in three ways. One of the things you can do is look to replace that metric, try to find something else that has responsible and ethical importance kind of baked in. And you might say then, if you’re Spotify, well, what matters is not really time spent listening; maybe it’s the number of podcasts that people sample. Maybe we believe that it’s better for people to be exposed to different points of view, and actually the way we measure that is by proxy, saying, well, if people are listening to 20 podcast hosts rather than one on repeat, then they’re probably getting a broader worldview. So you could replace that metric. The second thing you can do is modify the metric. And this is what YouTube have done. They had exactly the same thing, a time-spent or time-viewed metric, and they have, I believe, modified that to quality time viewed. The challenge, of course, comes in defining what quality means, so you have to have some internal kind of grading and say, essentially, this is junk-food-quality content, and this is the high-nutritional stuff. And there’s usually a sort of secret internal metric, and you use that as some kind of multiplier. So it might be that Spotify wants to move in that direction.
But what I really like doing, and what I try to train companies to do, is to be realistic and say, you’re not going to get rid of that primary metric; it’s so critical to the business. What you might need instead is a counterweight. I call it a mutually destructive metric: a metric that you compare with that primary metric, to say, if we do what we are probably tempted to do, which is to game that primary metric, then the other one will suffer. So in the example of, you know, consumer games, you could have average revenue per user, but your destructive metric for that could be App Store rating, right? You could really juice people to try and get revenue out of them with all sorts of dodgy dark patterns, but they’re going to hit you on the App Store rating. Or you could have app uninstall rates or something like that. But if both of those move in a positive direction, then you’re probably doing something right. So probably my advice to Spotify would be, if you can’t modify that metric that strongly, then look for a counterbalance, look for a mutually destructive metric that will help keep you a little bit in check.
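As an illustrative aside, the mutually destructive metric idea described here can be sketched in a few lines of code. This is only a sketch under stated assumptions: the metric names and figures are hypothetical, not any company’s real telemetry. The point is simply to flag reporting periods where the primary metric improves while its counterweight declines.

```python
# Illustrative sketch only: metric names and data below are hypothetical,
# not any company's real instrumentation.

def destructive_metric_alert(primary, counter):
    """Return indices of periods where the primary metric improved while
    the counterweight ("mutually destructive") metric got worse - a hint
    the primary may be being gamed at users' expense."""
    alerts = []
    for i in range(1, min(len(primary), len(counter))):
        if primary[i] > primary[i - 1] and counter[i] < counter[i - 1]:
            alerts.append(i)
    return alerts

# e.g. primary = average revenue per user, counterweight = app-store rating
arpu = [2.0, 2.3, 2.9, 3.4]    # revenue climbing steadily...
rating = [4.6, 4.5, 4.1, 3.8]  # ...while ratings slide: a red flag each period
print(destructive_metric_alert(arpu, rating))  # -> [1, 2, 3]
```

If both metrics move in a positive direction together, as suggested above, the alert list stays empty.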

Randy Silver: 

Let’s take this out a little bit wider, as this could apply to Substack and Medium and any number of other places as well. Spotify carries a lot of content from artists and podcasts like ours that they don’t pay us for; we willingly use them as a distribution channel, and they benefit from that. But with Joe Rogan specifically, and a number of other artists, they have paid to make them exclusive on their platform. Does that have an effect on the guidance that you gave? Is there an additional responsibility when you are not just a platform, but when you are actively sponsoring the content?

Cennydd Bowles: 

Yeah, I think there is quite a big difference from a moral point of view. I’m not a legal expert, but I think from a moral point of view, you are condoning, you’re endorsing and supporting that content. And that turns you from a neutral platform into a publisher. And that means, obviously, there are responsibilities that you take on. You know, a lot of the social media platforms, as you know, try to play this get-out-of-jail-free card, saying, well, we are neutral; we are not hosting the content but merely allowing users to contribute. And this has been at the forefront of a lot of debate. I think there are really two things that change that dynamic. One is where there’s algorithmic curation of timelines. I think if you are Twitter or Facebook and you have a purely user-curated chronological timeline, then you probably are a platform. But the minute you apply some kind of sorting algorithm to it, then there is, I think, a case that you are prioritising certain content over others, against user wishes, or without user consent, or at least without user input into that. And I think that makes you closer to being morally responsible for what you’re promoting. And then the second factor is, as you say with Spotify, exclusive paid content. For me, that completely changes the game. I don’t think Spotify can get away with the same excuse that Twitter and Facebook and YouTube and so on have tried to get away with, saying we’re just here for whatever users put on the platform. And I’m pretty sure that regulators are going to start to look at it that way as well. So yeah, I think there is a very significant ethical difference between those stances.

Lily Smith: 

So you can imagine, like, in the future... obviously, at the moment, product managers, like you say, have some influence, quite a good amount of influence, on the experience that customers have in these different environments. But there’s a limit to the amount that product people can do, and it does feel like we’re getting to the point where these businesses really, really need to be held accountable. And like you said, you know, at Twitter they made some mistakes. Facebook, we see them making loads of mistakes, and we’re not even really under the hood of the business. So is there going to come a point where we end up with, like, a global ethical committee for technology or something like that? Do you see it going in that direction?

Cennydd Bowles: 

It’s possible that there will be national... in fact, I would say probably there are going to be national, government-sponsored bodies that will be looking closer at technology, and particularly wanting to audit algorithms, asking tech companies to prove that their algorithms are free of bias, or as close to free of bias as possible, and that the decisions they take can be explained in a way that the public can understand. I think that’s kind of an important part of a democracy, right? If a decision is taken about you, you have a right to demand the justification for it. Globally, I can’t see it happening, because there’s just too much divergence in what various territories want from the internet. I don’t think there’s as much divergence in the moral perspectives. You know, you sometimes hear people say, well, ethics is subjective and relative. And actually, if you talk to ethicists, almost all of them disagree with that. You know, it’s context-dependent, but we actually have a lot that we do agree on. We all have some central ethical truths that really don’t vary by country. But globally, it’s not realistic. I think the other group of people, other than regulators, that are really in a position to influence this is tech employees at large. And so the big trend we’re seeing right now in tech ethics, I would say, is employees realising they have a bunch of collective power. This is why you’re seeing, for instance, the walkouts that have happened at Google: first over Project Maven, which is this DoD project that could be used to analyse drone footage, and then a second walkout, essentially a wildcat strike, over allegations of sexual harassment from an executive at the company. Tech workers have a huge amount of collective power. So I think there’s going to be pressure from regulators, for sure,
and possibly some proper structures on a national level, or state level in the US. But the pressure is coming from underneath too, and it’s also coming from the press, and it’s also coming from the public. But let’s focus on those two forces maybe for now. You know,

Randy Silver: 

we can say we aim to be good and work for good. But, you know, Google famously got rid of its “don’t be evil” motto. And if you look at the underlying amount of money that’s spent on lobbying, and things like Department of Defense contracts, the JEDI contract and things like that, these things are absolutely massive financial drivers. How much of an effect do these walkouts and these internal efforts actually have?

Cennydd Bowles: 

I think a good amount, to be fair. Tech workers are in enormous demand, and that has only increased in 2021, with this whole Great Resignation and so on. It’s very much a candidate’s market; tech workers are expensive, they’re hard to hire, and, you know, if they lift their hands off the keyboard, nothing gets built. But they also have the power to inspire press coverage. And everyone has noticed that the press cycle has turned very much against the tech sector, certainly in the last five years. I’d say that’s really the tech sector’s fault. But, you know, smart journalists now have very good contact lists of critical voices: insiders at companies who may be willing to leak, recent employees and so on. And they know how to build a narrative that sells stories. And even if a walkout, you know, a one-hour protest, doesn’t immediately cause an executive vice president to start sweating, the resulting press coverage, and then, as a result of that, the public sentiment that turns against the company, really do have, I think, a building effect. So yeah, that’s going to be a lagging indicator, right? It’s going to take a little while for that momentum to really snowball into something big. But it’s very clear, there’s lots of evidence, lots of data, that shows the public has now really become much more sceptical about the ethics of the tech sector. And, you know, that is going to be seen in terms of brand toxicity, and people essentially leaving behind products and services that they don’t feel comfortable with.

Lily Smith: 

And I suppose, you know, earlier you were talking about, I guess, the sustainability of technology companies, or companies in general, and that we have to consider not just growth and profit and revenue, but our impact on the wider world. So in that vein, what’s your opinion on crypto and cryptocurrency, and who’s going to stop that? Or not necessarily stop it, but make it more sustainable, find a way to make it sustainable, if that’s possible?

Cennydd Bowles: 

Yeah, I think cryptocurrencies really serve one cohort, and that’s people who have holdings in certain cryptocurrencies, and that’s really the only thing that sector has been interested in. Personally, I see almost no ethical case for cryptocurrencies at all. First, I would say that any cryptocurrency that uses proof of work to mine coins or tokens or whatever it might be should just be plain illegal. To be honest, I think the ecological harm that it does is unjustifiable from a moral perspective. And there are a couple of people who dispute that, but the evidence is pretty strong that it’s an enormous ecological waste. Proof of stake might be a more sustainable model, but I still feel that cryptocurrencies are predicated on inequality. You know, the Gini coefficient, this is a measure of economic inequality; the Gini coefficient for cryptocurrencies is worse than the most corrupt national regime in the world. There’s a lot of motivation for cryptocurrencies that, frankly, is illicit, you know, that is antithetical to paying tax on assets and things like that. Now, maybe I’m being too one-sided. I’ll recognise there are some use cases, particularly for Web3, that I think are a little bit more positive. You know, the idea of rewarding artists, for example. Artists do not get a fair shot under modern-day capitalism, so if we can find new ways to compensate people for creative work, and to do that collectively within groups without having to go through centralised systems, I can see the appeal, I can see the validity of that. But anything that’s built on infrastructure that’s predicated on proof of work, so anything based on Ethereum, for example, until they switch to proof of stake, which has been promised forever...
And, you know, call me when it happens. I still really deeply worry about what’s happening in that space. And I guess my final thing to say on crypto is you’ve had certain companies, particularly notably Coinbase, who said, you know, we’re not going to talk about social impact anymore. You may have come across this; it was some months ago. They said, we’re sort of sick of this social justice debate; the only thing we’re going to talk about at our company now is our mission, and we don’t want the politics to get in the way of that. But that’s a laughable statement, because Coinbase is a heavily politically ideological organisation, as is the entire cryptocurrency movement. It’s foundationally libertarian. And, you know, unfortunately, it sometimes even drifts toward far-right ideologies. There’s a professor called David Golumbia who’s written a book called The Politics of Bitcoin, talking about the intersection between cryptocurrency ideologies and far-right ideologies. And so the claim that there’s no ideological and no political and no social implications of crypto is just ridiculous. It’s a heavily skewed sector, if you like.
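As an illustrative aside, the Gini coefficient mentioned here is a standard inequality measure: 0 means perfect equality, and values approach 1 as a single holder owns everything. A minimal sketch using the common mean-absolute-difference formulation (the wealth figures below are made up for illustration):

```python
def gini(values):
    """Gini coefficient via mean absolute difference:
    G = (sum over all ordered pairs of |x_i - x_j|) / (2 * n^2 * mean)."""
    n = len(values)
    mean = sum(values) / n
    diff_sum = sum(abs(a - b) for a in values for b in values)
    return diff_sum / (2 * n * n * mean)

print(gini([25, 25, 25, 25]))  # equal holdings -> 0.0
print(gini([0, 0, 0, 100]))    # one holder owns everything -> 0.75
```

Note that for a finite sample of n holders the maximum is (n - 1) / n, which is why the four-holder extreme case returns 0.75 rather than 1.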

Randy Silver: 

Let’s go back to the one potential positive you mentioned. Listen, I’m curious to dig into that for a moment, where you said it could potentially benefit artists and give them more direct interest in the value of their work. I saw something last weekend, and I didn’t interrogate it really deeply, so I’m not going to pretend I’m an expert on it, but there was something saying that on OpenSea, something like 80% of the NFTs minted there were fraudulent in nature. Is there any evidence that you’re seeing that it actually benefits artists? Or is this still a garbage-in, garbage-out problem?

Cennydd Bowles: 

Right now, honestly, I think most of it’s junk. But there are Web3 people whom I respect, who are forward-thinking, who recognise that most of the NFTs going around are either ripped off without the artist’s consent, or they’re essentially being used as, you know, inflationary assets, sort of pump-and-dump schemes, and all these kinds of things tend to permeate that space. But they recognise that there is something better around the corner, and what they’re working toward is a more utopian future for the web, and for that kind of collective ownership and collective membership and support. And I think that kind of mutuality, that really is appealing to me. I want to see what they can come up with. You know, they keep saying they’re seeing this far future; I need to see more evidence of what that’s actually going to become, because as I say, right now the use cases are polluted and, frankly, ethically abhorrent.

Lily Smith: 

Cennydd, it’s been really interesting having some of these discussions and debates with you this evening. Or, let me just say, today. We’ve got time for one more question, and I think a lot of people will find all of this discussion really fascinating. I mean, I love having these conversations and trying to tackle some of these big problems when chatting to my product folk. But if they’re listening and they’re thinking, actually, no, I want to do more than just talk about it with people in my network; I actually want to get into this and help companies with their moral dilemmas and their questions on ethics. What’s your advice for how to learn more, how to really get into this type of work?

Cennydd Bowles: 

I think you have to do a bit of legwork, I’m afraid. The tech industry has a slightly infuriating habit of, whenever we discover a new field, liking to think that we’re the first people upon those shores. But of course, there have been people looking at the social and ethical consequences of technology for decades. So read them, listen to them, invite them into your companies, pay them for their time, you know, hold book clubs to read their work, because they’ve been watching us from the sidelines, watching us running into the same walls again and again, sort of tearing their hair out saying, why aren’t they listening? So listen to them. You know, my book is an attempt to point to some of that work, so of course you might want to start there. But I’m not saying you necessarily have to go away and read Aristotle and Kant. You can if you want to, but I wouldn’t necessarily recommend it for the average technologist. The other thing is, find allies, right? Find other people within your company or within your communities who are interested in talking about this, because it still feels like a bit of a niche pursuit at times, and it’s hard work. It’s unforgiving work, and it can actually be risky work as well. And we know what happens to the person who stands up again and again and says, I’m not comfortable with this, this doesn’t feel ethical. Eventually that person gets routed around in some kind of way. So what you need to do, if you think you might be that person, is find other people, start conversation groups within your own teams or in offices, and see what you can share and see what kind of change you can effect together. But as part of that, yes, listen to the experts, bring them in, pay them, you know, really embrace what they have been studying for decades now.

Lily Smith: 

Amazing. Thank you so much, Cennydd. It’s been great talking to you. My pleasure. Thank you.

Randy Silver: 

Okay, Lily, let’s get back to superheroes. If you could have the power of flight or invisibility, which one would you pick? Which one do you think is the more ethical power to have?

Lily Smith: 

Oh, man, I don’t know. You need to give me, like, more notice before you ask me questions like that, so I can dwell on it for a bit longer.

Randy Silver: 

I was thinking invisibility was just creepy. I mean, you’re only going to use it for things that don’t seem very ethical. So…

Lily Smith: 

Yeah, I was gonna say flight, actually. Yeah, I was definitely gonna say flight.

Randy Silver: 

Okay, listeners, if you have a contrary opinion, please do let us know. But in the meantime, tune in next week, same bat podcast, same bat time. We’ll catch up with you, and we’ll have another wonderful guest. See you then.

Lily Smith: 

The Product Experience is hosted by me, Lily Smith, and me, Randy Silver. Emily Tate is our producer, and Luke Smith is our editor.

Randy Silver: 

Our theme music is from Hamburg-based band Pau. That’s P-A-U. Thanks to Arne Kittler, who runs ProductTank and MTP Engage in Hamburg, and thanks to the band for letting us use their music. Connect with your local product community via ProductTank, our regular free meetups in over 200 cities worldwide.

Lily Smith: 

If there’s no one near you, you can consider starting one yourself. To find out more, go to mindtheproduct.com/producttank.

Randy Silver: 

ProductTank is a global community of meetups driven by and for product people. We offer expert talks, group discussion, and a safe environment for product people to come together and share learnings and tips.
