March 30, 2016
For interaction designers, it’s becoming common to encounter privacy concerns as part of the design process. Rich online experiences often require the personalization of services, involving the use of people’s information.
Because gathering information to personalize a customer experience can interfere with the overall experience — with negative consequences for the business — how do we navigate this increasingly difficult territory?
What are the guidelines for using data to personalize digital experiences, and how can organizations help people feel comfortable with personalization services that research clearly shows people want?
Exchanging personal information for things we want, whether physical or virtual, is nothing new. Long before the Internet, we freely published our home phone numbers and addresses in public directories. Remember the white pages? While they connected people and businesses in the same region, they opened the door to unwanted telephone marketing and junk mail.
People exchange privacy and personal information for social goods all the time — the trade just needs to be perceived as being worthwhile. In order to function properly, our civic and judicial systems, for example, require the use of our personal information, which routinely becomes public record.
In May 2014, Accenture Interactive published the results of an online survey conducted among participants aged 20 to 40.
These findings reveal an interesting story: While digital consumers broadly accept the loss of privacy in the digital space to receive personalized content, they are wary of how businesses store their information and don’t fully trust how that information is used.
In considering what might discourage customers from sharing personal information, we should bear in mind the difference between privacy and security. While both may give us reason to be cautious, security has to do with a business's ability to protect data from hackers, whereas privacy involves how customers are protected from businesses misusing their data. One represents an external threat, the other internal.
What we trade: Social networks are where we trade most. We disclose demographic data such as gender, age, address and birthplace, and share our preferences for music, food, brands and even friends. Everything can be tracked — the amount of time we spend online, our activities and the types of content we consume and create, revealing valuable behavioral information and insight into our psychology.
What we receive: We gain access to a wide range of services offered by social networks including keeping in touch with loved ones, finding things we like, playing games, establishing business networks and seeking employment.
Example: Pinterest is one of the best examples of content personalization. As Kevin Jing, Engineering Manager on the Visual Discovery team at Pinterest explains on their engineering blog, “Discovery on Pinterest is all about finding things you love, even if you don’t know at first what you’re looking for.”
Requiring a high degree of personalization, Pinterest recommends content based on deep learning in the form of object recognition. Called visual search, the technology finds and displays visually similar images to increase user engagement, opening the door for intelligently targeted ads and greater monetization of the platform.
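At its core, visual search means representing each image as a feature vector and retrieving the catalog images whose vectors lie closest to the query's. The sketch below is illustrative only: Pinterest's production system uses deep convolutional networks to produce its features, whereas here the tiny hand-made vectors and catalog names are invented stand-ins.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def visually_similar(query_vec, catalog, top_k=3):
    """Rank catalog images by visual similarity to the query image."""
    ranked = sorted(catalog.items(),
                    key=lambda item: cosine_similarity(query_vec, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# Toy "feature vectors"; in practice these come from a trained network.
catalog = {
    "red_chair":  [0.9, 0.1, 0.0],
    "blue_chair": [0.7, 0.1, 0.6],
    "red_lamp":   [0.8, 0.6, 0.1],
}
print(visually_similar([0.9, 0.2, 0.0], catalog, top_k=2))
```

The same nearest-neighbor idea scales to billions of pins with approximate-search indexes, but the privacy trade is identical: the richer the behavioral signal fed into the features, the better the matches.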
What we trade: By interacting with custom filters and allowing our activities and purchases to be recorded, we share preferences for product features and our buying habits. When we connect our social networks, information about friends and colleagues provides an additional layer of useful data. After all, we often buy what those in our social groups buy.
What we receive: For consumers, the purpose of e-commerce is to find and acquire products we want. The deeper the understanding a seller has of our psychology, the better the product placement. E-commerce personalization reduces friction around finding the things we like and makes those things easier to purchase.
Example: Bombfell, a clothing service for men. After entering your height, weight, body shape, skin tone, clothing size, favorite brands and other preferences, Bombfell becomes your personal fashion advisor and personal shopper.
As Bombfell explains in its FAQ, “We use technology on the back-end that surfaces recommendations for each user based on fit and style. But a human stylist (the sort with years of experience working in production, design and merchandising in men’s fashion) has the final say.”
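That "algorithm recommends, stylist decides" workflow can be sketched as a simple scoring pass over inventory, with the output handed to a human. Everything here is hypothetical — the field names, weights and items are invented for illustration and are not Bombfell's actual model.

```python
def shortlist(profile, inventory, limit=3):
    """Score garments against a customer's fit and style preferences,
    returning a shortlist for a human stylist to review."""
    def score(item):
        s = 0
        if item["size"] == profile["size"]:
            s += 2                                  # correct fit matters most
        if item["brand"] in profile["favorite_brands"]:
            s += 1
        if item["style"] in profile["styles"]:
            s += 1
        return s
    return sorted(inventory, key=score, reverse=True)[:limit]

profile = {"size": "M", "favorite_brands": {"Acme"}, "styles": {"casual"}}
inventory = [
    {"name": "Acme tee",    "size": "M", "brand": "Acme", "style": "casual"},
    {"name": "Vox blazer",  "size": "L", "brand": "Vox",  "style": "formal"},
    {"name": "Acme oxford", "size": "M", "brand": "Acme", "style": "formal"},
]
picks = shortlist(profile, inventory, limit=2)
print([p["name"] for p in picks])
```

Note how much personal data even this toy version consumes — size, brand affinity, style taste — which is exactly the trade the section describes.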
What we trade: Beyond normal census information, we provide public services with more preferences and behavioral data than ever before. And it’s not just consumption rates and patterns, but our preferences for the services themselves.
What we receive: From municipal services like utilities, taxation and legal support to tourism and travel tips, sharing our personal information enables providers to offer efficient access to these and many other essential services. Because the needs of individual citizens are unique, there is much to gain from personalization.
Example: Government portals like the one used by the state of Indiana are becoming more commonplace. Called my.IN.gov, the service is “a personalized dashboard that empowers the user to create a unique content experience.”
By setting up an account — and connecting social networks — citizens gain access to a customizable dashboard including personalized services, maps, local information and travel advisories. They can show and hide tiles of interest, indicate favorite services, manage newsletter subscriptions and even view interactive maps with rich state-wide statistical data.
The US Federal Government is not far behind, with its own portal launching soon.
If we look to the future, there is a growing trend of exchanging privacy for personalization in the emerging space known as the Internet of Things (IoT). Designing for user experiences in this space goes beyond purely digital interactions to ones that include ecosystems of physical and digital services.
Disney’s MyMagic+ is the platform that supports its Magic Band, a device using RFID technology to enable guests to authenticate and unlock a wide range of benefits including gaining admission to the park and making purchases. Similar to mobile devices, the Magic Band tracks and communicates a customer’s location as they move within the park.
By aggregating personal preferences, past activity and location data, the MyMagic+ platform provides guests with frictionless and highly personalized experiences when staying at the resort. As Disney states, “Aggregate information can be used to better understand guest behavior and make improvements to the guest experience [including] managing wait times and improving traffic flow.”
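The kind of aggregate use Disney describes — managing wait times from band location data — can be illustrated with a small sketch. The record format and attraction names are invented; a real system would work over anonymized streams at far greater scale.

```python
from collections import defaultdict

def average_waits(pings):
    """Estimate average queue time per attraction from
    (band_id, attraction, minutes_queued) records."""
    totals = defaultdict(lambda: [0, 0])    # attraction -> [sum, count]
    for _band_id, attraction, minutes in pings:
        totals[attraction][0] += minutes
        totals[attraction][1] += 1
    return {a: s / c for a, (s, c) in totals.items()}

pings = [
    ("b1", "Space Mountain", 40),
    ("b2", "Space Mountain", 50),
    ("b3", "Teacups", 10),
]
print(average_waits(pings))
```

The privacy question is visible even here: the aggregate answer doesn't need the band IDs, yet the raw records contain them — which is why how long identifiable data is retained matters as much as what is computed from it.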
With connected devices capable of collecting vast amounts of consumer data, opportunities for the personalization of services are almost endless. A number of key players in the IoT space are competing to provide customers with intelligent virtual assistants for connected home services.
Amazon's Echo promises integration between a number of connected services and uses audio sensors backed by ever-learning artificial intelligence. Always on, it listens for its wake word, then begins capturing your commands and home environment. The result is an intelligent system that learns to anticipate needs and delivers a great degree of control over connected devices.
It’s easy to see the significant risk to privacy represented by this type of always-on service. Given that audible activities in your home are captured and sent to the cloud for processing, third parties with access to the cloud (legally or not) could potentially access your data.
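The privacy boundary in such devices is the wake-word gate: audio before the wake word is supposed to stay on-device, and only what follows is sent to the cloud. This is a simplified sketch of that gating logic, not Amazon's implementation — the frame format and wake word are stand-ins.

```python
def gate_audio(frames, wake_word="alexa"):
    """Return only the audio frames captured after the wake word;
    everything before it never leaves the device."""
    uploaded = []
    listening = False
    for frame in frames:
        if listening:
            uploaded.append(frame)
        elif frame == wake_word:
            listening = True    # gate opens here; prior frames stay local
    return uploaded

frames = ["dinner chatter", "tv noise", "alexa", "play some jazz"]
print(gate_audio(frames))       # only "play some jazz" leaves the device
```

The concern in the paragraph above is precisely what happens when this boundary fails or is bypassed: once frames cross the gate, the user no longer controls who can access them.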
As Disney’s own MyMagic+ privacy statement puts it: “MyMagic+ provides an even more immersive, personalized and seamless Walt Disney World Resort experience than ever before. If you choose to participate in MyMagic+, we collect your information from you online and when you visit the Walt Disney World Resort. We value our guests and are dedicated to treating the information you share with us with care and respect.”
The following guidelines capture the key considerations we should keep in mind.
Taking something of value from someone else without his or her consent is… yup, stealing.
And agreement is not consent if the person is not adequately informed. Many people do not understand the full implications of providing their personal information.
The legal guidelines for collecting and using personal data are well established; however, they don’t require an educational component, so the onus is currently on users to understand the legalese.
People should always understand the nature of the trade. If they don’t, they will figure it out eventually, and will either love or resent you for how the exchange was managed.
Make it easy to understand the facts about how data is used and stored. Long type-heavy pages don’t accomplish this. And neither does hiding information away. This is where UX Design plays an important role. The interface should promote understanding and make information accessible.
The Government of Canada’s Guidelines for Online Consent is a good model to follow. It states that consent must be meaningful, and that individuals should “understand what they are consenting to.”
Give people good value. This is just good customer service. Making fair trades builds trust with users, strengthens your brand and is a successful long-term strategy.
There are significant risks with being perceived as unfair, even if that perception is wrong. According to research published in Nature in 2002, most people will punish perceived unfairness when given the chance, even when there is no immediate payoff to them. The negative emotions that arise toward unfairness provide ample motivation for customers to take action.
Netflix suffered such altruistic punishment from customers in 2011, when it lost a million customers in a single month as a result of perceived unfairness after unbundling its DVD and streaming services.
Crucially, customers tend to infer bad motives if you ask for more from them without increasing the value you provide. To gain trust, clearly show users the value you offer as a result of using their personal information. This will help to avoid the risk of being perceived as unfair and any resulting altruistic punishment that may ensue.
Remember that personal information is not yours to keep. It’s still the user’s data, so give them control over it.
This has two implications: users should be able to see and adjust how their data is used, and they should be able to take it back entirely.
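A minimal sketch of what ongoing control could look like in code — adjust and inspect on one hand, export and delete on the other. This is a hypothetical interface, not any vendor's real API; the class and method names are invented.

```python
class UserDataStore:
    """Toy store illustrating user control over their own data."""

    def __init__(self):
        self._data = {}

    def record(self, user, key, value):
        """Collect a piece of personal data for a user."""
        self._data.setdefault(user, {})[key] = value

    def export(self, user):
        """Users can always see everything held about them."""
        return dict(self._data.get(user, {}))

    def delete(self, user):
        """Users can take their data back entirely; returns True if
        anything was actually removed."""
        return self._data.pop(user, None) is not None

store = UserDataStore()
store.record("ana", "ad_topics", ["travel", "jazz"])
print(store.export("ana"))
store.delete("ana")
print(store.export("ana"))   # {} -- nothing retained
```

The hard part in practice is not this interface but the guarantee behind it: that `delete` truly purges backups and derived profiles, which is exactly what the Facebook example later in this piece leaves unclear.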
Google Ads allows users to adjust their display ad preferences. Two buttons in the top right corner of display ads give users different ways to take control.
Clicking the “X” hides the ad immediately, and prompts the user to specify why the ad is not relevant and lists the choices: inappropriate, irrelevant or repetitive. Presumably, the feature enables Google to serve more relevant content in the future by allowing users to indicate which ads they don’t like, and why.
Clicking the “AdChoices” button directs users to an “About Google Ads” page. The page is currently text-heavy, offering little practical help on how to change your profile, and could be greatly improved.
However, when users drill down past the landing page, Google provides a nicely designed and intuitive interface that allows users to control ad preferences:
Interestingly, this page was recently redesigned, pointing to the value Google places on customer privacy. For comparison, the previous version of the same page is below. It offered little direction for how to improve privacy preferences and only minimal control.
While there is still room for improvement, when it comes to making it easy for customers to manage personalization preferences, Google provides a good model to follow.
It is genuinely difficult to find a good example of an organization allowing users to easily take back their data. Perhaps this lack of examples is telling.
One of the largest collectors of customer data, Facebook, allows users to download an archive of their data. From there, accounts can be deleted. However, there is no clear language to indicate whether Facebook actually deletes all information from its servers or continues to maintain profiles on users.
Were the process clearer, Facebook would almost satisfy the principle of giving ongoing control over data back to customers. But it would still have to provide an easy way to do so.
We are still far from standardized best practices for giving customers full control over their personal information, and it may be a while before such control is commonly expected. Perhaps as digital consumers we have grown accustomed to providing personal information without expecting businesses to grant us a means to easily maintain control. We can influence change by raising our expectations in this area – and asking for control.
CX and UX professionals are faced with the difficult task of helping clients navigate design decisions around privacy and personalization of digital services. It’s up to us to help clients understand the implications to business outcomes and ethics. To do this well, it’s important to introduce privacy considerations into the design process as early as possible.
Customers expect that we make it easier for them to access services in ways that require personalization. However, many customers are not yet comfortable with this, raising concerns about the trade – even though they want the benefits.
And even customers who are comfortable may only be so because they don’t fully understand how their personal information is being collected and used. This places the onus on businesses and UX professionals to adequately educate customers.
Savvy businesses will see this as an opportunity rather than a threat. It creates an opportunity for organizations to differentiate and earn loyalty from customers. Privacy is a competitive advantage with a legitimate upside.
By getting informed consent, making fair trades and giving customers ongoing control of their data, businesses can build trust with their customers, provide excellent customer experiences and build ongoing brand loyalty.
Main image credit: flic.kr/p/4tBwKZ
This article was originally published in Smashing Magazine: ow.ly/103xjA