Privacy in the Age of the Customer

On 25 May 2018, the General Data Protection Regulation (GDPR) comes into effect in Europe. Accordingly, organizations are turning their attention to improving their privacy and data protection practices. Those familiar with the new measures will know that they call for a consumer-centric and proactive approach towards the collection and processing of personal information.

However, this is exactly why a majority of organizations see the GDPR as a burden: most lack clarity on what their consumers really think about the collection and processing of personal data.

In response, government and private institutions have conducted a number of studies that provide a snapshot of consumer expectations. The European Commission, for instance, published the results of a Eurobarometer survey in which more than 26,000 citizens were asked about their perspectives on privacy and data protection[i]. The results indicate a clear expectation among the majority that businesses should provide the means to protect customers' online privacy. And while more than half of respondents take some sort of action to protect themselves, such as changing their browser settings to enable more private online surfing, 70% of respondents agreed that the default settings of their browsers should prevent their information from being shared without their consent.

This indicates that while online consumers are willing to take certain preventive actions to protect their online privacy, they also expect to be protected by the technology providers and other parties they interact with online.

At first sight, this notion seems to contradict what the industry demands: an essentially free market in which organizations can take advantage of new opportunities enabled by data-driven innovation. The best example of this potential conflict comes from the same survey's results on online direct marketing: about 90% of citizens, civil society organizations and public authorities favour an opt-in approach, meaning that consumers should be asked for their explicit consent before they are sent direct marketing offers.

On the other hand, 73% of the industry favours an opt-out approach, where companies send direct marketing to whomever they can reach without collecting explicit consent, on the understanding that customers can reply to stop these communications if they wish.
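In implementation terms, the difference between the two approaches comes down to the default value of a consent flag. The sketch below is purely illustrative; the class and field names are invented, not taken from any regulation or standard:

```python
from dataclasses import dataclass

@dataclass
class MarketingConsent:
    """Hypothetical consent record for direct marketing."""
    opted_in: bool

# Opt-in (favoured by ~90% of citizens): no marketing is sent until the
# customer explicitly grants consent, so the flag defaults to False.
opt_in_default = MarketingConsent(opted_in=False)

# Opt-out (favoured by 73% of industry): marketing is sent by default,
# and the customer must act to stop it, so the flag defaults to True.
opt_out_default = MarketingConsent(opted_in=True)

def may_send_marketing(consent: MarketingConsent) -> bool:
    """A company may send direct marketing only while the flag is set."""
    return consent.opted_in
```

The business logic is identical in both regimes; only the default changes, which is precisely why the choice of default carries so much weight.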

Even though the industry's demands seem completely opposed to consumers' demands, this contradiction is only an illusion. We believe in it only because situation-specific factors are never considered alongside the general factors that shape our privacy calculus.

First coined by Culnan and Armstrong[ii] in 1999, privacy calculus refers to a trade-off analysis in which we, as consumers, compare the potential losses and risks we associate with disclosing information in a specific context to the benefits we anticipate from disclosing it. Most of the time we do this subconsciously.
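The trade-off can be caricatured as a simple comparison of subjective scores. This is a toy model only; the scores, scale and decision rule below are illustrative and do not come from Culnan and Armstrong or any other study:

```python
def privacy_calculus(perceived_benefits, perceived_risks):
    """Toy model: disclose only if anticipated benefits outweigh risks.

    Both arguments are lists of subjective scores (e.g. 0-10).
    """
    return sum(perceived_benefits) > sum(perceived_risks)

# A shopper weighing a loyalty discount against geo-location tracking:
benefits = [6]      # product discount
risks = [4, 3]      # tracking, potential data breach
print(privacy_calculus(benefits, risks))  # → False: no disclosure
```

The point of the caricature is that the same request for data can tip either way depending on which risks and benefits are salient in that moment, which is exactly what the situational factors discussed below change.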

We already know, from observation as well as academic research, that several general factors influence everyone's privacy calculus. For instance, people whose privacy has been invaded in the past tend to be more sceptical towards online information sharing. We also know that people who are 'technology-optimistic', almost always positively disposed towards technological developments and their impact on society, tend to be more positive towards online information sharing. Needless to say, our trust in an organization, its reputation, and the rewards it offers in exchange for our data also influence our decision to share our personal information with it.

In addition to these general factors, several situational factors influence our data sharing behaviour. For instance, people develop different threat perceptions regarding the same personal information when it is accessed by different organizations: while an attempt by an unfamiliar organization to access personal information may be perceived as a threat, organizations with assurance mechanisms, such as a privacy assurance statement or privacy personalization features, may create a sense of protection and lead people to share more easily. Our disclosure decisions also depend on the type of data being requested: people tend to share their demographic details more readily than their geo-location data.

With an ever-increasing number of security breaches and privacy incidents under the spotlight, consumers' awareness of privacy is rising significantly. Consumers are starting to act on the basis of how well they believe their privacy is protected by the organizations collecting and processing their personal information. According to the aforementioned Eurobarometer, the majority of people (71%) think that providing personal information is increasingly part of modern life and accept that they need to share it if they want to obtain products or services. Yet they expect their explicit approval to be required before their data is collected and processed, and they are concerned about not having complete control over the information they provide online.

Given these challenges, how can organizations get beyond these concerns and instil a sense of protection and trust in their customers?

Companies can turn the privacy calculus in their favour by practising contextual privacy. Contextual privacy, as defined by F. Khatibloo (2013)[iii] of Forrester Research, is a business practice in which the collection and use of personal data is consensual, within a mutually agreed-upon context, for a mutually agreed-upon purpose. When companies embrace this approach, they will experience a paradigm shift: privacy will stop being a burden and become a potential source of competitive advantage. This is a conclusion we can reach with confidence, because we know that, today, customers value transparency and entrust their business to organizations that treat their customers with respect and make an effort to earn their trust.

Here are some best practices to help senior management start embedding transparency, informed consent and trust-building in their everyday business practices:

Do your homework

It is critical to get to know your customer base and their attitudes towards privacy. While typical market research methodologies, such as focus groups, interviews or online surveys, can do the job perfectly well, you may want to invest in more creative ways to gather deeper insights. For instance, did you know that Craig Newmark, the founder of Craigslist, spends the first hour of every day doing customer service, simply to get to know Craigslist's customers better? Not every manager has time for that, but organizing events that give you face-to-face time with your customers, while also informing them about your data collection and processing practices, can serve as the perfect rapport-building opportunity; two birds with one stone!

Take the responsibility of creating awareness

This is something often overlooked. Recent research shows[iv] that the majority of consumers are not aware of the amount and nature of data companies collect about them. It is important to take responsibility for creating awareness, and to clarify what you collect and why. With transparency comes trust. With trust comes consent. How, then, can you build trust through transparency without scaring your customers off or boring them with explanations? Britain's Channel 4 does a great job of informing its customers using humour and visual content. It prepared a short and entertaining video with the comedian Alan Carr that explains the type of data collected by Channel 4's website and how it is used, as well as how viewers have complete control over their data.

Make critical information accessible

The purpose of a privacy policy is to inform users about the personal data collected by the organization, how this data is used, and who else is involved in its collection, storage and/or analysis. Privacy policies are a critical medium for communicating your intentions, but most are long, boring texts. And let's face it: nobody reads lengthy, boring privacy policies. There is a need for a second generation of privacy policies, in which companies experiment with creative methods for getting the necessary information across, using multiple outlets and formats. As in the Channel 4 case mentioned above, visual templates that can be understood at a glance are good examples.

Another interesting approach could be pure simplification, just as Creative Commons did (see figure). They created a visual language through a set of combinable icons and reduced the complexity of licensing terms to a set of indicators scannable in seconds[v]. As suggested by Aza Raskin, this is an example organizations can follow. A set of icons, such as the one Mozilla proposed[vi] as early as 2010, could be a simple yet powerful way to make privacy policies more accessible and consumable. Food for thought!
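An icon set of this kind implies a small, structured summary behind the icons. The sketch below imagines such a machine-readable summary; the field names and label vocabulary are invented for illustration and are not part of the Creative Commons or Mozilla proposals:

```python
# A hypothetical machine-readable privacy-policy summary, in the spirit
# of the Creative Commons / Mozilla icon proposals.
policy_summary = {
    "data_collected": ["email", "browsing_history"],
    "purpose": ["personalization", "direct_marketing"],
    "shared_with_third_parties": False,
    "retention_days": 365,
    "consent_model": "opt-in",
}

def icon_labels(summary):
    """Map the summary to short, scannable labels, one per icon."""
    labels = ["no-sharing" if not summary["shared_with_third_parties"]
              else "shares-data"]
    labels.append(summary["consent_model"])
    labels.append(f"kept-{summary['retention_days']}-days")
    return labels

print(icon_labels(policy_summary))
# → ['no-sharing', 'opt-in', 'kept-365-days']
```

Because the summary is structured data rather than prose, the same record could drive icons on a website, a browser indicator, or an automated comparison of policies, which is what makes the simplification approach so powerful.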

What’s in it for me?

Do not forget the privacy calculus: as much as your customers are worried about the risks that disclosing personal information may bring, they are also motivated by the benefits disclosure may offer. It is important to consider which benefits you are offering your consumers in exchange for their personal information: tangible benefits such as reward points or product discounts? Or intangible benefits such as easy access to certain services or the convenience of better personalization? According to a recent Salesforce survey, the State of the Connected Customer, over sixty percent of millennial consumers and over fifty percent of Gen X customers agree they are willing to share data with companies that send personalized offers and discounts, provide a personalized in-store or online shopping experience, or give product recommendations that match their needs[vii]. Whatever the incentive, your organization needs to make it compelling and make customers feel they are getting something special in return.

Data-driven innovation and business modelling are booming worldwide, but not without challenges. Privacy and ethics, two questions with roots in philosophy, are now intertwined with data analytics and require solutions in computer science[viii], along with responsible steering. We hope our suggestions help you steer responsibly.

[i] Special Eurobarometer 431 on Data Protection (June 2015) Available online at http://ec.europa.eu/public_opinion/index_en.htm

[ii] Culnan, M. and Armstrong, P. (1999) “Information Privacy Concerns, Procedural Fairness, and Impersonal Trust: An Empirical Investigation,” Organization Science, Vol.10 Iss.1, pp.104-115.

[iii] Khatibloo, F. (2013) “The New Privacy: It’s All About Context” Forrester Research Vision Report in the Customer Trust And Privacy Playbook For 2017. Available online at https://www.forrester.com/report/The+New+Privacy+Its+All+About+Context/-/E-RES108161

[iv] Morey, T., Forbath, T. and Schoop, A. (2015) “Customer Data: Designing for Transparency and Trust,” Harvard Business Review, May 2015.

[v] For more, please visit http://www.azarask.in/blog/post/making-privacy-policies-not-suck/

[vi] Please see https://www.w3.org/2010/api-privacy-ws/report.html for more details about and slides of the proposal.

[vii] Available online at https://www.salesforce.com/form/pdf/state-of-the-connected-customer.jsp

[viii] Hartnett, K. “How Humans Can Force the Machines to Play Fair,” Wired Magazine, November 26th 2016


Copyright © 2015 The Barrister. All rights reserved.