Big touchscreens demand big gestures. The iPad in particular begs for swipes and multitouch combos that let you slap at the whole screen to control apps instead of tapping delicately at tiny buttons.

iPad apps like Facebook and Twitter, for example, demonstrate that it’s much easier to move through your history by nudging the canvas back and forth than it is to hit Safari’s tiny Back button. Small touchscreen buttons require extra thought and motor control—brain and strain—compared to coarse gestures that let you fling screens aside.

With iOS 5, the iPad finally gets exactly those kinds of coarse gestures to move between apps. Yay, right? Swipe left or right with four or five fingers to switch among recent apps, or pinch with four or five fingers to close an app and zip out to the home screen. No need to find the iPad’s elusive Home button; you can just paw at the whole screen to navigate apps.

I’m a huge fan of the spirit of these gestures, but I’m not crazy about the execution. I wish Apple had followed the interaction already adopted by other platforms, including the BlackBerry PlayBook, the Nokia N9, and the next version of Microsoft Windows. All of these platforms use edge gestures, a technique that is at once more internally consistent and more deferential to individual apps.

The operating system is the frame

With iOS 5, iPad gets welcome app-switching gestures. But touch actions that should be limited to the edges are encroaching on app real estate.

Edge gestures let you move back and forth among apps by swiping from the edge of the screen. You start on the frame, or bezel, of the device and swipe into the canvas, creating the illusion of knocking screens aside by pushing them at the edge. If a swipe starts at pixel zero, in other words, it’s interpreted as an operating-system gesture for moving among apps. In Windows, Microsoft refers to this as “Edge UI.”
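The pixel-zero rule amounts to a one-line test on where a swipe begins. Here’s a minimal sketch of that idea; the names (`Swipe`, `ownerOf`, `edgeSlop`) are invented for illustration and don’t correspond to any real platform API:

```typescript
// A sketch of the "pixel zero" rule: a swipe whose first touch lands on
// the screen edge is claimed by the operating system; one that begins
// inside the canvas belongs to the current app. All names here are
// hypothetical, not a real UIKit or Windows API.

type GestureOwner = "operating-system" | "app";

interface Swipe {
  startX: number; // x-coordinate of the first touch, in points
}

// `edgeSlop` widens the edge zone slightly, since fingers rarely land
// on exactly pixel zero.
function ownerOf(swipe: Swipe, screenWidth: number, edgeSlop = 0): GestureOwner {
  if (swipe.startX <= edgeSlop || swipe.startX >= screenWidth - edgeSlop) {
    return "operating-system"; // edge swipe: switch apps, leave the app
  }
  return "app"; // in-canvas swipe: the current app decides
}
```

Because the decision is made at the first touch, the app never even sees an edge swipe; everything that starts inside the canvas remains the app’s to interpret.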

This approach is elegant for more than its simplicity. Edge gestures match physical action with the conceptual metaphor of the operating system. If you consider apps as the front-and-center canvas of the device, then the operating system is the frame, the infrastructure that supports and presents the canvas. When OS-level gestures start on the bezel frame of the device, action matches expectation: this gesture works outside the current app. You’re working on the frame—the operating system—both physically and metaphorically.

iPad’s new app-switching actions are not edge gestures. Instead, these four- and five-finger swipes work within the canvas itself, territory that is supposed to be dedicated entirely to the current app. This creates some confusing competition with app interaction: will this gesture apply at the app level or at the operating-system level? Apple could have avoided this ambiguity by using edge gestures to switch apps. Likewise, to close an app and return to the home screen, a four- or five-finger swipe down from the top edge would fit the bill.

This isn’t only about metaphor, though. I’m jealous that Apple appropriated these gestures for the operating system.

Apple bogarted some sweet moves

We are in the earliest stages of developing a touchscreen gesture vocabulary and, in particular, of exploring the possibilities of multitouch gestures. Multifinger swipes, taps, and pinches promise to help us create interfaces that we play like an instrument more than we use like a tool.

My favorite example of this is Uzu, an addictive visual toy for iPad. The app has ten modes triggered by holding one to ten fingers to the screen. In demos, Uzu designer Jason Smith gradually directs the app like a full visual orchestra.

This sort of full-hand use of touchscreen apps is promising. Just as expert typists fly through words, and power users fly through tasks with keyboard shortcuts, multitouch gestures can help us move just as effortlessly through touch interfaces. Abstract multitouch gestures are the keyboard shortcuts of touch. Done right, they will be expressions for fluidly transforming intent into action.

In that context, full-hand swipes and pinches would be mighty handy gestures for designers to deploy at the app level. Alas, Apple instead hijacked them for the operating system. By putting these gestures inside the canvas instead of at the edge, Apple has swiped some great gestures from designers’ arsenals.
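The squeeze this puts on designers can be sketched as a simple dispatch on finger count, in the spirit of Uzu’s ten modes. The action names here are invented for illustration; the point is the hole the OS punches in the table:

```typescript
// Hypothetical app-level dispatch on finger count, in the spirit of
// Uzu's one-to-ten-finger modes. Because iOS 5 fires its app-switching
// gestures inside the canvas, an app can no longer reliably claim
// four- and five-finger swipes and pinches for itself.

function appAction(fingerCount: number): string {
  switch (fingerCount) {
    case 1: return "draw";
    case 2: return "pan";
    case 3: return "rotate";
    case 4:
    case 5: return "conflicts with OS app switching"; // gestures Apple claimed
    default: return "ignored";
  }
}
```

Had Apple put its gestures at the edge instead, every row of a table like this would have stayed available to the app.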
