Will we love our phones?

UX is in for some interesting times.

Nobody knows the future, but it’s a relatively safe bet that we’re heading into some big changes when it comes to design. Google wants to know you better than your spouse does, and it’s quite likely it’ll get there. Siri and Google Now are offering increasingly interesting ways to engage with your devices through speech, by simply asking questions and having the answer spoken back to you.

How far will that go? The movie ‘Her’ gave us one possible insight into that recently, with Joaquin Phoenix playing a man who falls in love with his operating system, and it’s not hard to see how such a thing could happen. But more interesting was a recent article on the BBC website that discussed the ethics of robotics in our near future.

We all know they’re coming. Many of us have a toy robotic device somewhere, and some of us already have a Roomba cleaning the home. It’s not quite the Jetsons, but we’re moving into sight of that kind of future. In fact, South Korea is currently drawing up a code of ethics for how robots and people interact – not, as you may think, to control the robots and keep us safe – but to keep them safe from us.

As it turns out, that may not be required. The BBC article brought out some interesting findings. At a workshop earlier this year a researcher asked attendees to torture a robot, a small, cute device called a Pleo that needed some emotional assistance and training from its human ‘parent’. After only an hour the attendees were asked to ‘murder’ the robot with a knife – and all refused. In another experiment in 2011, Radiolab asked five children to hold a doll, a Furby and a hamster upside down for ‘as long as they felt comfortable doing it’. Perhaps unsurprisingly, they could torture the doll for as long as their muscles held out, but both the Furby and the live hamster got let off much sooner – even children associate a robot with being alive and feel uncomfortable hurting it.

And soldiers, too. In 2007 the Washington Post reported on a trial of a landmine-clearing robot. The trial was called off by an angry colonel while the robot was still crawling about – because he felt it was ‘inhumane’ to see the device, most of its legs blown off, still struggling to reach its next target.

And exactly why am I talking about robots when this post is about phones? Blame Mr Phoenix. The point is clear, or should be. Humans feel empathy for things that appear to be sentient. The more alive it appears, the more we can feel empathy for it. The more we engage with it, the more we’ll bond with it. Electric Dreams, anyone?

Now we come to UX, and our immediate future.

We are moving into a space where the user interface is shrinking. I doubt that voice will ever completely remove the need for an interface, but it’ll certainly reduce and change it. Over time we’ll have a shallower interface to engage with but an increasingly deeper experience with the intelligence behind it. Big data and increasingly smart interpretations of it (and of us) will mean our devices and our systems will begin to interpret us and to serve us exactly as if they were alive, sentient and as friendly as can be.

And that means we’ll bond with them. In turn, that means we have to consider the emotional impact we have on our users – not just in designing those systems and interfaces, but in changing them.

Imagine you are in love with the perfect partner. You are madly, deeply in love. One day you come home – and your partner has a leg missing. Or is lying dead and unresponsive next to the pool. There’s a small card next to them, telling you to call support for a free upgrade. So you’re fine, right?

We’re not quite there yet. We might upset our customers if we change a design and make it worse, or take away a favoured feature, but few will burst into tears and mourn.

One day soon, however, that’ll change. Today we might regret the loss of a phone because we are losing a valuable tool. Tomorrow we might mourn the loss of ‘Eric’, our life manager within that phone.

As UX people, that thought should both thrill and scare us.

Gary Bunker

the Fore