I was busily reading NewScientist this week, catching up on a few issues I hadn't had time to read, when two articles in particular caught my eye.
The first was on teams experimenting with touch-screens that provide tactile feedback - basically, making surfaces feel 'real' (Touchscreens touch back, 24th April).
One team was using vibrations to simulate hard surfaces, tricking the brain into feeling the finger press onto and release off a button. Another was using different vibration settings to convey a sense of different surface textures, such as smooth, soft and rough.
You can see another story relating to similar research on the BBC website.
As I read this, it struck me that this is a dimension of interface design that has barely been explored. In games and some well-designed touch-screen apps the vibrate control is used to good effect, but generally at quite a macro level. For example, in many shooting games vibration is used to denote a hit - when a bullet lands on your player, or something nearby blows up. This is quite an engaging approach that emotively connects us to the screen and the action.
But that's working at a macro level, connecting your entire character to an exterior event. Imagine being able to use your finger to 'feel' your way around a darkened room, feeling for where the door is - and touching the wet, slippery walls as you go. Or on a more practical level, imagine using a touch-screen virtual keyboard that actually feels like it's projecting out of the glass.
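To make the idea concrete, the "different vibration settings for different surfaces" approach could be modelled as a simple lookup from texture to actuator parameters. This is purely an illustrative sketch - the texture names, parameter values and function are invented for this example, not taken from the research - but it shows how an interface might choose a fine-grained vibration pattern as the finger moves between regions.

```python
from dataclasses import dataclass

@dataclass
class VibrationPattern:
    """Hypothetical actuator settings for one surface texture."""
    frequency_hz: float  # how fast the actuator oscillates
    amplitude: float     # 0.0 (off) to 1.0 (strongest pulse)

# Invented texture table for illustration: rough surfaces get slow, strong
# pulses; smooth surfaces get a faint, high-frequency buzz.
TEXTURES = {
    "smooth": VibrationPattern(frequency_hz=250.0, amplitude=0.1),
    "soft":   VibrationPattern(frequency_hz=120.0, amplitude=0.3),
    "rough":  VibrationPattern(frequency_hz=60.0,  amplitude=0.8),
}

def feedback_for(texture: str) -> VibrationPattern:
    """Return the vibration pattern for a named texture; unknown textures
    fall back to no vibration at all."""
    return TEXTURES.get(texture, VibrationPattern(frequency_hz=0.0,
                                                  amplitude=0.0))
```

In a real app the returned pattern would drive the device's vibration hardware each time the touch point crosses into a differently textured region - the darkened-room game above would just be a map of screen regions to textures.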
The second article that fired my imagination was on the senses: a review of the book See What I'm Saying: The extraordinary powers of our five senses, by Lawrence D. Rosenblum (W. W. Norton).
What I got from this brief review of the book was that there is a vastly more complex interrelationship between our senses than we'd normally think. For example:
- Sight and sound are closely interlinked; we effectively lip-read to understand people talking in a noisy room.
- A group of blind mountain bikers navigate by clicking their tongues and listening to the reflected sounds, just like bats (and Johnny English, though his never seemed to work quite right).
- Smells affect our perceptions of what we see and hear; even scents we cannot consciously detect shade our attitudes, judgements and behaviours.
- Eating in the dark makes food taste bland.
This made me think: how often do we design to use these interrelationships effectively? If sounds, scents and touch all shape our perception of what we see, then how many interfaces make the most of this?
In the book review, NewScientist explained that the relationship between colour, light and intensity is well understood by chefs, and used in top restaurants around the world. Excellent restaurants make great use of light, texture and sound to bring out the flavour of their food - so shouldn't we do the same in design?