Plenty has been written about the new iPhone 4S, with its voice-controlled virtual assistant Siri, and about iOS 5, its software. But in writing a book about both, I stumbled across an amazingly thoughtful feature that I haven’t seen a word about: something called AssistiveTouch.
The iPhone already does a lot for people who can’t see. If you’re blind, you can literally turn the screen off and operate everything — do your e-mail, surf the Web, adjust settings, run apps — by tapping and letting the phone speak what you’re touching. You can also magnify the screen or reverse black for white (for better-contrast reading).
In short, iPhone was already pretty good at helping out if you’re blind or deaf. But until iOS 5 came along, it was tough rocks if you had motor-control problems. How are you supposed to shake the phone (a shortcut for “Undo”) if you can’t even hold the thing? How are you supposed to pinch-to-zoom a map or a photo if you can’t even move your fingers?
That feature, AssistiveTouch, is Apple’s accessibility team at its most creative. When you turn it on in Settings->General->Accessibility, a new white circle appears at the bottom of the screen. It stays there all the time.
When you tap it, you get a floating on-screen palette. Its buttons trigger motions and gestures on the iPhone screen without requiring hand or multiple-finger movement. All you have to be able to do is tap with a single finger — or even with a stylus held in your teeth or fist.
For example, you can tap the on-screen Home button instead of pressing the physical Home button.
If you tap Device, you get a sub-palette of six functions that would otherwise require you to grasp the phone or push its tiny physical buttons. There’s Rotate Screen (tap this instead of turning the phone 90 degrees), Lock Screen (tap instead of pressing the Sleep switch), Volume Up and Volume Down (tap instead of pressing the volume keys), Shake (does the same as shaking the phone to undo typing), and Mute/Unmute (tap instead of flipping the small Mute switch on the side).
If you tap Gestures, you get a peculiar palette that depicts a hand holding up two, three, four, or five fingers. When you tap the three-finger icon, for example, you get three blue circles on the screen. They move together. Drag one of them, and the phone thinks you’re dragging three fingers on its surface. Using this technique, you can operate apps that expect you to drag multiple fingers on the screen.
To me, the most impressive part is that you can define your own gestures. In Settings->General->Accessibility, you can tap Create New Gesture to draw your own gesture right on the screen, using up to five fingers.
For example, suppose you’re frustrated in Google Maps because you can’t do the two-finger double-tap that means “zoom out.” On the Create New Gesture screen, get somebody to do the two-finger double-tap for you. Tap Save and give the gesture a name—say, “2 double tap.”
From now on, “2 double tap” shows up on the final AssistiveTouch panel, called Favorites, ready to trigger with a single tap by a single finger or stylus. (Apple starts you off with one predefined gesture already in Favorites: Pinch. That’s the two-finger pinch or spread gesture you use to zoom in and out of photos, maps, Web pages, PDF documents, and so on. Now you can trigger it with only one finger.)
I doubt that people with severe motor-control challenges represent a financially significant fraction of the iPhone’s millions of customers. But somebody at Apple took them seriously enough to write a complete, elegant, and thoughtful feature that takes down most of the barriers to using an app phone.
I, for one, am impressed.
And I’d also like to hear, in the Comments, from people who actually use AssistiveTouch. How well does it work?
pogue.blogs.nytimes.com, 10/11/2011