As Spider-Man could tell you, “with great power comes great responsibility”. Uncle Ben was essentially quoting Voltaire when he said it, and it’s something we should all keep in mind.
New technology and new understanding can bring powerful tools for us to use. Einstein’s unlocking of the secrets of the atom via mass–energy equivalence led to our modern nuclear age – a tool that has driven great medical advances, just as it wiped out cities at the end of World War II. Wernher von Braun’s understanding of rocket technology eventually helped put man on the moon – but not before it let Germany build the V-2 rockets that killed thousands and pounded England to dust.
Reading Susan Weinschenk’s book “How to Get People to Do Stuff”, it’s easy to see how much influence we in the UX field really have at our fingertips, and how much care we should take. UX shapes the integration between people and technology, between consumers and the business world. In many ways it shapes the filter through which we experience life. Powerful tools need to be carefully operated and applied, and UX is clearly a powerful tool. Ever wondered if we’re doing the right thing with it?
Let’s take a look at some of the processes we use.
Study them: see what makes them tick
We study people and learn their motivations: what scares them, what excites them. Here’s just a brief sample of the books currently helping us to nudge and convince people to do what we want:
- How to Get People to Do Stuff: Master the Art and Science of Persuasion and Motivation by Susan Weinschenk
- Predictably Irrational, Revised and Expanded Edition: The Hidden Forces That Shape Our Decisions by Dan Ariely
- The Upside of Irrationality: The Unexpected Benefits of Defying Logic by Dan Ariely
- The Invisible Gorilla: How Our Intuitions Deceive Us by Christopher Chabris & Daniel Simons
- Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein
Understand them: uncover their thoughts and inner responses
Behavioural and psychological research has provided deep insight into how people make decisions and why they behave as they do. Eye tracking shows us what people actually look at, regardless of what they tell us about their focus. Now we can even measure responses in the brain: we can see when people are reacting, regardless of what they say. Soon we may even be able to dive down to the granular level of individual thoughts.
Thanks to the concept of big data (see Big data: The next frontier for innovation, competition, and productivity), we are now able to connect the dots of behaviour across wider and wider spaces. It’s now possible to connect how people search with how they shop, and how they use their mobile phones with where they buy socks. We can begin to predict what content and functionality people will need and serve it up before they even think to ask.
Influence them: lead them where we may want them to go
All design attempts to influence the user in some way. Good design influences users to follow the correct path, or at least assists them in choosing the path to walk. At times, though, the right path can be difficult to see, and it can be tempting for us to choose a path for them that happens to fit our needs. In some cases we may even be tempted to use ‘dark patterns’ (see darkpatterns.org) to deliberately misdirect; to trick people into actions they wouldn’t necessarily undertake.
Amend habits: make or break them
We understand the science of habits (trigger > routine > reward) and design to change the habits that don’t fit our goals. We design triggers to invoke useful habits, we disrupt the routine/reward cycles of undesirable habits – undesirable in terms of our own goals, that is. We create new habits by associating them with existing ones, and by strengthening the reward offered.
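The trigger > routine > reward loop described above can be sketched in code. This is purely an illustrative model – the class and names below are invented for this article, not taken from any real product – but it shows the mechanic designers work with: keep the trigger and reward, swap the routine.

```python
# A minimal, hypothetical sketch of the habit loop:
# trigger -> routine -> reward. All names are illustrative.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Habit:
    trigger: str                 # the cue that starts the loop
    routine: Callable[[], str]   # the behaviour the cue invokes
    reward: str                  # the payoff that reinforces the routine

    def fire(self, event: str) -> Optional[str]:
        # Only a matching trigger runs the routine and delivers the reward.
        if event == self.trigger:
            self.routine()
            return self.reward
        return None


# To "amend" a habit, a designer keeps the existing trigger and reward
# but substitutes a new routine - the classic habit-change recipe.
check_phone = Habit(
    trigger="notification",
    routine=lambda: "open the app",
    reward="novelty",
)
```

The point of the sketch is that the loop only completes when the trigger fires, which is why so much design effort goes into placing triggers where users will meet them.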
When it goes wrong
We have some pretty powerful tools to play with. By studying, understanding, influencing and amending behaviour, we can have a powerful effect, and that power is sometimes abused. Every day we encounter designs that attempt to nudge us in a direction favourable to someone else. Designs that promote Okay over Cancel. Designs that default to Yes, hoping you won’t notice. Designs that mix metaphors (“Yes, I don’t want to receive emails”). Designs that attempt to make us feel guilty, excluded, inferior, stupid or ugly unless we make the right decision and purchase.
We probably all have our own favourite examples of naughty (and dark) patterns out there. Using our understanding of behaviours, drivers, visual flow, cognitive overload and more, dark patterns attempt to mislead or influence for the benefit of a business.
One clear example of this is where an e-commerce site places an item in the cart without the customer selecting it. In 2010 the Comet electronics store (www.comet.co.uk) was caught sneaking iPad cases into customers’ carts whenever an iPad was selected. Unwary customers who failed to notice purchased an item they never actually chose. Some airlines use the same trick, automatically selecting the travel insurance option whether the customer needs it or not.
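The mechanics of this “sneak into basket” pattern are trivially simple, which is part of what makes it so tempting. Here is a hypothetical sketch – invented names, not Comet’s or any airline’s actual code – of a cart that quietly bundles an accessory with the item the customer chose:

```python
# Hypothetical illustration of the "sneak into basket" dark pattern.
# Nothing here is real e-commerce code; all names are invented.

# Accessories silently bundled when the main item is added.
CROSS_SELL = {"iPad": "iPad case"}


class Cart:
    def __init__(self) -> None:
        self.items: list[str] = []

    def add(self, item: str) -> None:
        self.items.append(item)
        # The dark pattern: the accessory goes in without the customer
        # ever selecting it. An ethical design would *suggest* the
        # accessory and leave the choice unticked.
        extra = CROSS_SELL.get(item)
        if extra:
            self.items.append(extra)


cart = Cart()
cart.add("iPad")
# cart.items now also contains "iPad case" - an item the user never chose
```

Note that the honest alternative costs almost nothing to build: present the accessory as an unselected option and let the customer opt in. The pattern persists because the default, not the code, does the selling.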
SeeTickets.com automatically includes a £3.00 cancellation protection fee, but fails to show this anywhere near the total price table. In fact it’s about as far from it as you could get (see http://darkpatterns.org/library/sneak_into_basket/).
There are plenty of examples to be found across the web. Look for websites that default the opt-in on marketing questions. Or websites that default to the more expensive option, hoping you’ll miss (or be too lazy to select) the cheaper one. But it’s not always deliberate misdirection. Shelfari influenced by mistake, and paid a price (see their blog entry here). Their social sharing invitation accidentally led users into inviting entire address books, rather than a few select friends – and the complaints exploded.
Big data can also lead us down a dark path. Google has a patent that will allow it to dynamically price content, depending on your behaviour. Buy the expensive top today, and the price offered to you at other stores tomorrow may be higher; you may never see the discounted price offered to other customers with cheaper buying patterns.
In another example, Target (see this New York Times article) has been tracking so much data on customers that apparently, under the right circumstances, it can tell with 87% confidence whether a woman is pregnant, simply based on shopping patterns. It can even predict the due date, to within a certain window. Recently the father of a teenage daughter found out he was going to be a grandfather not from his daughter, but from the pregnancy product flyers sent to his home by Target.
These are all examples of UX being used to gain some advantage, but we also need to consider the ethical nature of our research itself. As with any research, looking for answers can in itself be damaging to the audience being tested. Consider the Stanford Prison and Milgram experiments: psychological studies in which participants were treated as prisoners and guards (in one) and were asked to administer what they believed were dangerous electric shocks (in the other). Both experiments drew heavy criticism and, in at least one case, caused ongoing psychological trauma to participants. UX research has never been quite so drastic, but it’s important to note that research isn’t always harmless.
And the damage can sometimes fall back on ourselves. A colleague of mine recently explained how one of her previous roles involved working with photos of starving children in Africa for a charity organisation. She was told to ignore all photos of starving boys, since images of starving girls had a higher rate of conversion for charity giving. Putting aside for a moment the ethical questions that raises, at a personal level the work had affected her deeply, and negatively.
Where is the line?
It’s pretty clear that our field has a set of powerful tools to play with. Used in the wrong way this can have a negative impact on ourselves, our research subjects and on the wider audience in general. Without some form of guidance and with the best of intentions, any practitioner could easily slip across the line without realising it.
The UX Code
UX practitioners are generally pretty honourable people in my experience. After all, we get into this game through an innate desire to see technology work for people, rather than against them. But that doesn’t mean we always know what’s right, so a code of conduct makes a lot of sense. It also makes sense that this code should be built on three core principles:
1. Do no harm
Just as in the medical code of ethics, our core directive should be one of non-maleficence – “first, do no harm”. It’s a simple rule, and one that should be particularly directed at the research and analysis stages.
2. Abide by industry codes of conduct
Secondly we should abide by the unique codes of conduct relating to the industry we are working within. Many industries, particularly those with the potential to cause harm, have either a regulatory or voluntary code of conduct specific to the field. UX specialists working within that industry should understand and abide by the same standards.
3. Abide by a UX code of practice
There is no such thing just yet, but perhaps the time has come to create one. It should speak to how we respect both the business and the user, during research, analysis, design and testing phases. Privacy, honesty and respect should be core elements.
We all follow our own moral compass – but perhaps it’s time we found a stronger, clearer and more consistent line to avoid tripping over.