The wonderful Suzanne Ginsburg recently shared her pointers on evolving techniques for discoverability in touchscreen interactions.
Sure, things could be better, but many apps actually improve upon the status quo. In particular, there have been some innovative approaches to discoverability—the ability for users to locate something they need to complete a certain task. While discoverability can still be improved in many applications, the strategies in this article may be a helpful starting point for iPad and other tablet designers.
Touchscreens—and especially big touchscreens—offer us the opportunity to explore more natural, more direct interactions. Over the last 30 years, we’ve developed a host of abstractions to manage the desktop GUI: a bevy of menus, buttons, and tabs. But here’s the thing: our brains developed over millions of years to navigate the physical world, to explore objects physically and directly. Yet our daily computing lives are trapped in a temporary alternate universe of arbitrary and abstract metaphors, working at a cognitive distance from the information and tasks. Just watch a toddler use an iPad to see how readily she grasps touchscreen interactions; but your two-year-old isn’t quite so capable with multilevel menu navigation, is she? As touch designers, it’s useful to conceive of our designs as infant-ready interfaces. Let your toddler lead the way.
Friends, buttons are a hack. As in the real world, they’re often necessary, but they work at a distance—secondary tools to work on primary objects. A light switch here turns on a lightbulb there. These indirect interactions must be learned; they’re not contextually obvious. The revolution that touchscreen devices are working is that they allow us, more and more, to use primary content as a control, to create the illusion of direct interaction.
<update date="March 28, 2011"> I don’t mean to suggest that we throw out all of our familiar buttons entirely. Light switches shall remain necessary, after all, and so shall buttons, especially where it’s necessary to trigger abstract actions (“share via Twitter,” for example). But it’s important to recognize those devices for what they are: necessary hacks for moments when direct interaction isn’t possible. Touchscreen interfaces allow that direct interaction in many more contexts. As new solutions arise, we should be open to putting our time-tested workarounds aside. When designing an interaction for touch, always ask: do I really need another button or control for this? </update>
We’re just starting to explore these possibilities, and as we do, new interface metaphors are emerging. There’s some retraining that must be done, both for designers and our audiences. Suzanne’s examples offer some useful pointers on how to make touch controls easy to find and understand. Her recommendations include:
- Leverage mental models and exploit physical characteristics. Use real-world visual metaphors to hint how your app works.
- Spring into action. Use subtle animation to draw the eye and hint how the interface can be manipulated.
- Provide sneak peeks. Sweep secondary tools offscreen, but use animation to give a glimpse at where they may be found.
- Offer just-in-time features. Reveal controls when they make sense contextually.