
You can look, and now you can touch, too

Facing a large video display showing an aerial view of a city, a man places the fingertips of both hands on the screen and moves his hands apart. The display zooms in on the area between his hands. Later in the same demonstration, he sorts photographs by pushing them around the screen with his fingertips.

The man is Jeff Han, a part-time researcher at New York University’s Courant Institute of Mathematical Sciences and founder of Perceptive Pixel Inc., a company he has set up to commercialize his innovative touch-screen technology.

The video of Han’s demonstration at TED 2006, a technology, entertainment and design conference held in California, is all over the Internet. The idea is usually called multi-touch, though researchers doing similar work inside Microsoft Corp. reportedly call it surface computing. It’s the next step beyond familiar touch screens that let people perform simple actions by pressing a fingertip to one spot.

With multi-touch, you can touch two or more points at once and use finger movements to control an electronic device. For instance, you might enlarge an object by pulling two opposite corners, or scroll the display with one finger while moving an object with another.
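The arithmetic behind a gesture like that is simple. The short Python sketch below is purely illustrative and not drawn from any of the products mentioned here: it derives a zoom factor from how far apart two touch points have moved since the gesture began.

import math

def pinch_zoom_factor(p1_start, p2_start, p1_now, p2_now):
    # Each argument is an (x, y) touch point reported by the screen.
    # The zoom factor is the ratio of the current finger spread to the
    # spread when the two fingers first touched down.
    def spread(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    start = spread(p1_start, p2_start)
    if start == 0:
        return 1.0  # fingers began on the same spot; no zoom yet
    return spread(p1_now, p2_now) / start

# Fingers that start 100 pixels apart and end 250 pixels apart zoom in 2.5x
print(pinch_zoom_factor((100, 300), (200, 300), (50, 300), (300, 300)))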

While large multi-touch displays like Han’s are still in the demo stage, multi-touch is appearing in smaller devices. Apple Computer Inc. drew attention to the concept with the recent introduction of its iPhone, which substitutes a touch screen – with multi-touch capabilities – for a keypad.

Apple’s isn’t the only such phone, notes Stuart Robinson, director of handset component technology services at research firm Strategy Analytics Inc., in London, U.K. Samsung’s F700 and LG Electronics’ KE850 “Prada phone” incorporate similar screens.

So does the Onyx, a concept phone developed last summer by Santa Clara, Calif.-based Synaptics Inc. and Pilotfish, a design studio based in Munich and Taipei. The Onyx uses Synaptics’ ClearPad multi-touch display.

Screens too small
John Feland, human interface architect at Synaptics, says multi-touch “gives people a more natural way of interacting with the device,” and that recognizing multiple touch points helps reduce common problems such as closing an object by mistake. On the Onyx screen, for example, you close something by placing one finger on the object while gesturing with another finger.

But Feland also admits that the screens of today’s mobile devices are too small to realize all the potential of multi-touch. He says Synaptics is interested in larger displays, although its particular technology is difficult to scale to large displays like Han’s. Some multi-touch prototypes use projection technology and algorithms that derive touch points from the shadows of the user’s hands.
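The details of those prototypes aren’t public, but the general idea can be sketched. The hypothetical Python example below treats dark blobs in a grayscale camera frame as finger shadows and returns their centroids as candidate touch points; a real system would add calibration, filtering and tracking across frames.

import numpy as np
from scipy import ndimage

def touch_points_from_shadows(frame, threshold=0.5, min_area=30):
    # frame: 2-D array of grayscale pixel values in the range 0..1,
    # where finger shadows show up as dark regions.
    shadow_mask = frame < threshold             # mark dark pixels as shadow
    labels, count = ndimage.label(shadow_mask)  # group them into blobs
    points = []
    for blob_id in range(1, count + 1):
        ys, xs = np.nonzero(labels == blob_id)
        if xs.size >= min_area:                 # skip tiny noise blobs
            points.append((xs.mean(), ys.mean()))
    return points

# A synthetic 100x100 frame with one dark "fingertip" near (65, 45)
frame = np.ones((100, 100))
frame[40:50, 60:70] = 0.1
print(touch_points_from_shadows(frame))         # roughly [(64.5, 44.5)]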

Robinson expects to see multi-touch technology in various sizes, from mobile devices to large displays. But before it catches on, he says, there must be applications that take advantage of its capabilities.

It’s a chicken-and-egg situation, according to an official of another multi-touch developer who asked not to be identified. When startups in the field seek venture capital, potential investors ask whether popular PC software can use their multi-touch capabilities. At present the answer is no, so investors hesitate to jump in. That could help explain why early entrants to the market such as FingerWorks, maker of the iGesture Pad, ceased operations after only a short time.

A company like Apple, which controls both its hardware and software through a closed architecture, might be better able to break that logjam than most. So might Microsoft, thanks to its software muscle and some experience in selling input devices like mice and keyboards.

Both Apple and Microsoft declined interview requests, and Bill Buxton, a Canadian pioneer in multi-touch research who now works for Microsoft Research, referred an interview request to a Microsoft media relations official in the U.S., who turned it down. Han of Perceptive Pixel also did not respond to inquiries.

Buxton has, however, published some thoughts on multi-touch on his personal Web site. He points out that multi-touch screens will be very good for some purposes and poorly suited to others, so they shouldn’t be expected to replace the mouse or the keyboard.

Buxton also notes that multi-touch technology isn’t new. In fact, he began working on the idea in 1984. But, he adds, the mouse was invented in 1965, was first used commercially in 1982 and became ubiquitous when Windows 95 appeared in 1995, so there’s still some time. “Multi-touch technologies have five years to go before they fall behind,” he concludes.

Coincidentally, asked how long multi-touch will take to gain mass acceptance, Feland replies: “You’re probably looking at three to five years.”
