Privacy by design: flipping the script on privacy in product

Mind the Product · August 8, 2021


For tech giants like Apple and Google, user privacy has traditionally been seen as something to avoid rather than a selling point for their products. However, driven by changing consumer sentiment and mounting political pressure, the same companies that previously appeared to approach privacy as an obstacle are now pushing it to the forefront of product marketing.

At its I/O developer conference in May 2021, Google showcased a range of features intended to give users more control over their privacy. In a small but essential contribution to user privacy, the company intends to let people rapidly delete their recent search history in Chrome, create locked photo folders on Pixel devices, and see clearer reminders of location tracking when using Maps. Android will also get an enhanced suite of privacy features, making it easier for users to restrict an app's access to device functions such as location tracking, sound recording, and camera usage.

Meanwhile, Apple's recent iOS 14.5 update included a powerful new privacy tool called "App Tracking Transparency," which gives iOS users greater visibility and control over what data the apps installed on their phones share with third parties. Critically, the update prompts users to opt in rather than opt out of tracking. Apple has also made privacy a focal point on the new websites for its Apple Wallet and Apple Pay products.

Similarly, while Microsoft recently experienced backlash over the Productivity Score feature within its Microsoft 365 product, the Redmond-based tech giant has been using privacy as a selling point at least since it ran its privacy-focused ad campaign in 2013. The company has also repeatedly called for tighter regulation of personal consumer data and has taken a pro-data-protection political stance.

While big tech’s move to increase user privacy is welcome, it is also vital from a business point of view. Behind today’s movement to pivot products towards greater user privacy are unavoidable trends in consumer preferences and legislative tolerance, which product teams can’t afford to ignore.

Consumers want privacy

In 2019, over 60% of adult respondents to a Pew Research Center poll agreed with the statement that “it’s not possible to go through daily life without having data collected by companies or the government.” Since then, consumer awareness of privacy has continued to evolve.

Critically for businesses, a heightened awareness of privacy now informs consumer choices. As a recent report from the analytics company Flurry shows, only around 13% of users globally have chosen to opt into tracking when prompted under Apple's recent iOS update.

User reactions to Apple's new privacy update demonstrate that consumers prioritize privacy whenever they are given the option. Research backs this up: many consumers will prioritize privacy when choosing products. In a recent large-scale global consumer sentiment survey by Cisco, 32% of privacy-aware respondents said they would switch, or had already switched, service providers over a company's data-sharing policy.

Even among loyal brand acolytes, poor privacy can become a turn-off. A 2021 survey of Australian tech users who owned several devices from a single company showed that long-time brand fans were more concerned than ever about their privacy when using that brand's services or products.

Politicians and legislators want to promote privacy

Over the last several years, tech companies have faced increasingly strict scrutiny from legislators worldwide. As a result, big tech is no stranger to privacy-related lawsuits. However, as privacy becomes a focal point for voters, what in the past amounted to endless legal battles or empty political discourse is increasingly turning into real legislative change.

The recent passing into law of the California Privacy Rights Act (CPRA) is one example. Alongside the US presidential election last November, voters in California, which on its own would rank among the world's largest economies, passed a law that will give them GDPR-like protections by 2023. Across the US, similar consumer-privacy bills are now in train in other major states such as Texas and New York.

Within the EU, a recent ruling by the Court of Justice of the European Union also opens tech companies to a vastly increased volume of GDPR penalties. Under the ruling, GDPR complaints will no longer necessarily have to come through the country where an organization is based but will instead be able to originate in the complainant's jurisdiction. This new status quo should help clear the backlogs that have held countless complaints up in court.

Change is guaranteed, proactive response is vital

Whether driven by tighter interpretations of existing legislation, new laws (such as the increasingly likely prospect of a US federal privacy law), or by consumer preferences, the privacy requirements placed on product-focused businesses are only expected to get stricter. Ultimately, all trends point to a future where the end-user will expect and demand to control how their personal data is processed and used.

Accordingly, and as is already happening in some big tech firms, product teams will need to take the initiative and make privacy features inherent to product design. The concept of privacy by design (PbD) provides a framework for how proactive privacy can work in practice.

Privacy by design needs to become standard practice

PbD is not a new concept: it was developed in the mid-1990s by a team led by Ontario Information and Privacy Commissioner Ann Cavoukian. Though over 25 years old, its fundamental principles still make for a refreshing read for product design teams struggling to reconcile privacy and functionality.

While the concept of PbD outlined by Cavoukian comprises seven foundational principles, three in particular are relevant to product design teams today:

Privacy should be proactive

Just as the best products often anticipate user desires and pain points before they arise, product privacy ideally needs to be designed in as a core feature from the get-go.

One way of doing this is to involve a privacy-focused professional, such as your organization’s data protection officer, in the design process from the beginning. With product requirements growing increasingly crowded, at least one person on the team needs to champion user privacy to ensure it’s not seen as an afterthought or expendable.

At the beginning of the project and later through regular check-ins, the “privacy champion” needs to constantly question the product from a privacy point of view. In practice, this means bringing the team’s attention to any requirements, both internal and external, that may be applicable when designing a specific product.

Depending on the product being designed, teams may also have to adhere to best practices or codes of conduct for a specific technology. Here it's good to be aware not only of regulations and acts already in place but also of pending legislation, such as the proposed ePrivacy Regulation.

All privacy practices relating to the project should be shared with both the user community and other stakeholders so as to create a community of continuous improvement. Ideally, the privacy professional should also carry out a risk assessment pinpointing the potential privacy risks to users.

Privacy should be user-centric

To empower users to take control of their data, products should be designed around their interests and needs. As outlined by Cavoukian, doing this means working critical principles such as consent, accuracy, access, and compliance into all aspects of a product. Ultimately, product teams need to put the user at the center of privacy controls and see privacy from their perspective.

To do that, teams need to provide users with a clear path to understanding how data is used, give appropriate notice in the event that something is about to change, and support individuals with access to strong privacy controls.

For example, the team behind the Metadistretti e-monitor, a device that lets medical professionals and family members monitor cardiac patients remotely, made sure that patients wearing the e-monitor were able to see, via a browser and an app, exactly what type of data was collected about them and to whom it was sent. Users can even send different types and quantities of data to different groups and individuals.
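A per-recipient data-sharing model like the e-monitor's can be sketched as a simple policy filter. This is a hypothetical illustration, not the Metadistretti implementation; the class and category names are invented:

```python
from dataclasses import dataclass, field


@dataclass
class SharingPolicy:
    """Maps each recipient group to the data categories the user has approved for it."""
    allowed: dict[str, set[str]] = field(default_factory=dict)

    def allow(self, recipient: str, category: str) -> None:
        # The user explicitly grants one data category to one recipient.
        self.allowed.setdefault(recipient, set()).add(category)

    def outbound(self, recipient: str, reading: dict[str, object]) -> dict[str, object]:
        # Filter a full sensor reading down to only what this recipient may see.
        permitted = self.allowed.get(recipient, set())
        return {k: v for k, v in reading.items() if k in permitted}


policy = SharingPolicy()
policy.allow("cardiologist", "ecg")
policy.allow("cardiologist", "heart_rate")
policy.allow("family", "heart_rate")

reading = {"ecg": [0.1, 0.4], "heart_rate": 72, "location": "home"}
cardiologist_view = policy.outbound("cardiologist", reading)  # ecg and heart_rate
family_view = policy.outbound("family", reading)              # heart_rate only
```

Note that an unknown recipient receives nothing at all, which keeps the default in the user's favor.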

In general, the entire product should be designed in such a way that the user understands the potential consequences of their choices and is not manipulated to make a specific one. Data collection that is not necessary for the core function of the product should be “opt-in” only (i.e., the highest privacy setting).
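The "opt-in only" default can be expressed directly in a settings model: every non-essential collection flag starts disabled, and only an explicit user action turns it on. This is a minimal hypothetical sketch; the setting names are invented:

```python
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    # Non-essential data collection is off by default (the highest privacy setting);
    # silence or omission never enables anything.
    crash_reports: bool = False
    usage_analytics: bool = False
    personalized_ads: bool = False

    def opt_in(self, setting: str) -> None:
        # Only an explicit, named user action can enable a collection flag.
        if not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)


settings = PrivacySettings()        # a brand-new user shares nothing extra
settings.opt_in("crash_reports")    # ...until they explicitly say so
```

Making the default a property of the type, rather than of UI state, means a forgotten consent screen fails safe: the data simply isn't collected.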

Additionally, the contact information for those responsible for user data protection for the product (or the organization behind the product) should be available and easy to access for users.

Data use should be transparent

Vital for gaining trust, data use needs to be as transparent as possible. Anyone using your product should be able to find out what happens to any information they give you.

At the most basic level, this could take the form of a straightforward, plain-language privacy policy that anyone who wants to use your products has to read. For example, one information exchange for biomedical researchers, which collates participants' genomic data, asks its volunteers to watch an educational video about the implications of having their genomes sequenced. After watching the video, participants must also give their explicit preliminary consent to the process. Only after volunteers complete these two steps, and understand the potential consequences of having their highly sensitive personal data collected, does the company send them a saliva sample collection kit.
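A gated flow like the one described above can be modeled as a small state machine: consent is invalid unless the educational step came first, and nothing ships until both gates have been passed. This is a hypothetical sketch of the pattern, not the exchange's actual system:

```python
from enum import Enum, auto


class Step(Enum):
    START = auto()
    EDUCATED = auto()
    CONSENTED = auto()


class ConsentFlow:
    """Enforces: education, then explicit consent, then (and only then) dispatch."""

    def __init__(self) -> None:
        self.step = Step.START

    def complete_education(self) -> None:
        # e.g. the volunteer finishes the educational video
        self.step = Step.EDUCATED

    def give_consent(self) -> None:
        # Consent given before the education step is not informed, so reject it.
        if self.step is not Step.EDUCATED:
            raise RuntimeError("consent requires completing the education step first")
        self.step = Step.CONSENTED

    def kit_dispatchable(self) -> bool:
        # The sample kit ships only once both gates have been passed in order.
        return self.step is Step.CONSENTED


flow = ConsentFlow()
flow.complete_education()
flow.give_consent()
ready = flow.kit_dispatchable()  # both gates passed
```

Encoding the ordering in code, rather than trusting the UI to present screens in sequence, makes "understand first, consent second" a property the system can verify.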

"The UX Guide to Getting Consent" by the International Association of Privacy Professionals (IAPP) provides advice on how to obtain user consent in line with the GDPR.

Remember that your claimed use of the information you collect also needs to be verifiable by an independent third party. Discussing this concept in terms of privacy by design, Cavoukian paraphrases Ronald Reagan in saying the user should be able to "trust, but verify."

Privacy is not a net loss

With legislation and consumer attention homing in on privacy, no company can afford to treat privacy as a bolt-on to its products. However, giving users control of their personal information does not have to be a drain on a product's business case either. On the contrary, products that use the minimum amount of consumer data are ultimately safer from a cybersecurity perspective and more attractive to consumers, a growing percentage of whom place a premium on privacy.

This changing status quo points to a future where privacy moves from design obstacle to product feature. Getting ahead of this trend is vital for today's product teams. Rather than viewing privacy as a trade-off between business needs and user requirements, product professionals who treat privacy as a win-win are likely to understand their users better and create more attractive products. Done right, user privacy is never a negative.

