2011 becoming the year of the multitouch screen

Touch-screen panels have been around for more than a decade, but it was the 2007 introduction of a multitouch screen in Apple’s iPhone that galvanized the market. Now the business is going gangbusters — as are the innovations that touch-screen manufacturers hope will build on Apple’s success.
Multitouch technology has exploded into a $6 billion business for display manufacturers this year, with more than 200 vendors vying for a piece of the action — and it’s expected to grow to more than $13 billion by 2016, according to market research firm DisplaySearch. “It’s already a huge market, and growing fast,” says analyst Jennifer Colegrove.

Multitouch now dominates in smartphones, and with the introduction of the iPad in 2010, multitouch helped launch the market for tablet computers. The technology is now moving into everything from larger desktop PC displays to the in-flight entertainment systems found in the seatbacks of commercial airliners — and beyond.

A touch, not a press

Before the iPhone, most touch screens used pressure-sensitive, resistive touch panels, which required that the user physically press down on the screen. Resistive screens could track the position of just one finger at a time.

Apple chose a competing technology, projected capacitance, which responds to a light touch and can also sense a finger as it enters the electronic field above the touch surface — a technique called proximity sensing. The touch panel sits on top of the display media (most commonly a liquid crystal display). Capacitive touch-sensing technology requires a person’s finger (or a specially designed capacitive stylus) to disturb the electrical field; unlike resistive designs, it doesn’t work with an ordinary stylus or other inanimate objects.
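A projected-capacitive controller scans a grid of electrodes and reports how much each cell's capacitance changed when a finger disturbed the field. The sketch below shows, under simplified assumptions, how touch positions might be extracted from such a grid: threshold out the noise, then keep local maxima. The threshold value and grid layout are illustrative, not any vendor's actual firmware.

```python
# Hedged sketch: locating touches on a projected-capacitive sensor grid.
# We assume the controller exposes a 2D array of capacitance deltas
# (arbitrary units); real panels tune thresholds per device.

TOUCH_THRESHOLD = 30  # assumed noise floor

def find_touches(deltas):
    """Return (row, col) cells whose capacitance change exceeds the
    threshold and is a local maximum -- roughly one cell per finger."""
    touches = []
    rows, cols = len(deltas), len(deltas[0])
    for r in range(rows):
        for c in range(cols):
            v = deltas[r][c]
            if v < TOUCH_THRESHOLD:
                continue
            neighbors = [deltas[nr][nc]
                         for nr in (r - 1, r, r + 1)
                         for nc in (c - 1, c, c + 1)
                         if 0 <= nr < rows and 0 <= nc < cols
                         and (nr, nc) != (r, c)]
            if all(v >= n for n in neighbors):
                touches.append((r, c))
    return touches

# Two fingers disturb the field at two separate grid cells:
grid = [[0,  0,  0,  0,  0],
        [0, 80,  5,  0,  0],
        [0,  5,  0, 60,  0],
        [0,  0,  0,  5,  0]]
print(find_touches(grid))  # -> [(1, 1), (2, 3)]
```

Because each finger produces its own peak in the grid, this style of sensing reports multiple touches naturally, which is what a resistive panel's single pressure reading cannot do.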

Projected capacitive screens use a glass touch surface that offers a higher level of transparency than the plastic layer used in resistive technology, resulting in brighter colors. The glass touch surface is also more durable, and capacitive technology is more forgiving of surface scratches.

Apple’s major innovation with the original iPhone was figuring out how to track the actions of two simultaneous touches, which enabled the development of the iPhone’s now-familiar gestures: swipe, rotate and pinch/expand. “It’s really how the software is used that makes touch screens usable,” says Bruce Gaunt, a mechanical engineer at Product Development Technologies, a contract engineering firm that designs and integrates touch-screen technologies for manufacturers of cell phones and laptops. “That’s what Apple does really, really well.”
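Once two touches can be tracked across frames, the pinch and rotate gestures fall out of simple geometry: the distance between the fingers gives a scale factor, and the angle of the line between them gives a rotation. The sketch below illustrates the idea; the function name and coordinates are illustrative, not Apple's actual implementation.

```python
import math

# Hedged sketch: deriving pinch/expand and rotate from two tracked
# touch points, comparing one frame's positions against the previous.

def two_finger_gesture(p1_old, p2_old, p1_new, p2_new):
    """Return (scale, rotation_degrees) between two frames of a
    two-finger touch."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = math.degrees(angle(p1_new, p2_new) - angle(p1_old, p2_old))
    return scale, rotation

# Fingers move apart (pinch-out/expand) without rotating:
scale, rot = two_finger_gesture((100, 100), (200, 100), (50, 100), (250, 100))
print(scale, rot)  # -> 2.0 0.0
```

A scale above 1.0 maps to zooming in, below 1.0 to zooming out, and a nonzero rotation to the rotate gesture; a swipe is the degenerate case where both fingers (or one) translate together.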

More recently, Samsung has had success integrating multitouch technology into active-matrix organic light-emitting diode (AMOLED) screens in devices such as its Galaxy S smartphone. Branded Super AMOLED, the technology places touch sensors directly on the screen itself rather than requiring a separate layer, which makes for a thinner display.

“Samsung is a pioneer in implementing touch in active-matrix OLED displays, and more are going to follow,” says Vinita Jakhanwal, an analyst at market research firm iSuppli.

Proliferation — and limitations

This year the smartphone market will reach a crossover point, with more than 50% of units including multitouch displays on projected capacitive or OLED touch screens, says Jakhanwal. Tablet computers are another fast-growing market for multitouch, as demonstrated by the iPad’s success, not to mention the slew of new tablets announced at this year’s Consumer Electronics Show.

Before long, laptops with dual multitouch screens will become available, perhaps led by Acer’s Iconia notebook, which is expected to ship in the first half of this year. Such designs replace the physical keyboard with a second display surface that can be used as a virtual keyboard or as an extended screen.

The device can be turned on its side, book style, to display two pages of a manuscript side by side, and users can turn the pages using a swipe gesture, as they do when reading on the iPad.

But building larger multitouch displays is more challenging, says Ken Bosley, software product manager for HP’s Consumer Desktop Global Business Unit. Because the manufacture of projected capacitive technology gets expensive when scaled much beyond tablet-size screens, HP uses an optical multitouch technology, in which two cameras mounted at the screen edges determine touch coordinates, for its TouchSmart line of desktop and notebook PCs.

“There’s a lot of durability issues with touch screens, and for it to work well in an upright form factor, [the unit] can’t wobble or move,” Bosley says. And while Apple tightly integrated multitouch technology with its operating system, Windows support is still evolving. “Windows 7 is not all that suited to touch, [so] we have worked super hard on [our] touch software,” Bosley says.
Some argue that multitouch is a waste of money on the desktop: it adds about $150 to the street price of a PC, Bosley says, and it's still seen as a complement to, not a replacement for, a keyboard and mouse. A resistive touch screen would allow for basic on-screen pointing in order to, say, start playing a DVD. The problem, Bosley says, is that people now expect every touch-screen device to support multitouch, whether it benefits from the technology or not. “If you violate the expectation, it turns people off.”

A prime example: The TouchSmart offers an on-screen keyboard, even though most people never use it. People prefer to use the regular keyboard, Bosley says, “but we include it because people expect it.”

Michael Woolstrom, CEO of touch-screen manufacturer Touch International, agrees. His firm is working with business partners to deliver seatback-mounted multitouch screens that will be available to replace the current generation of resistive, pressure-sensitive screens in Airbus airliners by the end of this year. “[Users] want to have the sweep, pinch and expand gestures,” he says. “[Multitouch] is driving the user experience.”

One big problem with vertically mounted touch-screen displays is the “gorilla arm” effect, says Andrew Hsu, technology strategist at touch-screen maker Synaptics. People simply can’t work in a sustained manner with their arms extended outward — it’s too awkward and tiring.

On the TouchSmart, HP compensates for this to some degree by allowing the user to tilt the display back 30 degrees from vertical. That helps somewhat, but Bosley admits that using the touch screen at that angle is still awkward. HP’s ergonomic studies show that users tend to tilt the display depending on the application, he says. Put up Solitaire, for example, and users will tilt the display back, pull it closer and use the touch screen, but they tend not to use the on-screen keyboard in this manner; with nowhere to rest their arms, it’s too uncomfortable.

But on smartphones and tablets, which can be held at any angle, users are more likely to take full advantage of multitouch and to use the virtual keyboard. At this end of the market, the focus is on enhancing and expanding the multitouch experience.

More gestures on the way

Multitouch system designers are eager to expand the repertoire of gestures pioneered by Apple. Synaptics, for instance, offers its Scrybe gesture suite, which lets system designers choose from a common library of gestures and create a few customized ones of their own.

“You could, for example, [designate] a gesture to take you right into Amazon.com to buy a product,” Hsu says. Scrybe is currently marketed only for laptop touchpads, but Synaptics says the technology could be extended to touch screens.


Swype offers a gesture suite for Android and other device platforms that lets users type by sliding their fingers over a virtual keyboard rather than tapping it. And GestureWorks' open-source gesture library offers more than 200 multitouch gestures to Flash and Flex developers.
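The core idea behind path typing of the kind Swype popularized can be sketched simply: a candidate word matches a traced path if its letters appear, in order, among the keys the finger slid across, anchored at the path's first and last key. This is a deliberately simplified stand-in for Swype's actual (proprietary) recognizer, which also weighs path shape, timing and word frequency.

```python
# Hedged sketch of Swype-style path matching. `key_path` is the
# sequence of keys the finger crossed while sliding; a word is a
# candidate if its letters occur in order within that sequence.

def matches_path(word, key_path):
    """True if the word's letters occur in order within the traced
    keys, anchored at the first and last key of the path."""
    if not word or word[0] != key_path[0] or word[-1] != key_path[-1]:
        return False
    it = iter(key_path)
    # `ch in it` consumes the iterator, enforcing in-order matching.
    return all(ch in it for ch in word)

# A finger sliding w -> e -> s -> t crosses intermediate keys ("wesdrt"):
print(matches_path("west", "wesdrt"))  # -> True
print(matches_path("wet", "wesdrt"))   # -> True (also a candidate)
print(matches_path("wear", "wesdrt"))  # -> False (path ends on 't')
```

Note that "west" and "wet" both match the same trace; a real engine ranks such ambiguous candidates by dictionary frequency and how closely the path hugs each word's ideal trajectory.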

Apple, too, is working to expand its gesture suite in its upcoming iOS 4.3, adding new swipe functions and support for up to five fingers — although the company has made it clear that it doesn’t intend to bring multitouch to vertical displays on desktops or laptops.

But these are not cross-platform standards, and HP’s Bosley thinks the new gestures are less universal — and less intuitive — than the three fundamental gestures popularized by Apple. “I saw a [recent] Apple patent, and it looked like American Sign Language. Why would anyone want to learn a new language?” he asks. “People will use standard, intuitive gestures that make sense on the screen. I don’t think they’ll be willing to learn a whole vocabulary of them.”

Mike McSherry, CEO of Swype, concurs. “It’s not realistic to expect the average user to learn more than a couple-dozen gestures,” he says. He thinks gestures will continue to be used mainly for launching apps, navigating and the like.

Moving into three dimensions
Capacitive touch-screen technology’s proximity-sensing abilities mean that it can detect movements not just in the X and Y planes across the screen’s surface but along the Z axis as well — it can sense a finger as it approaches the touch surface. In the future, proximity sensing could carry touch-sensing screens into three dimensions — if touch-screen makers can figure out how to apply it.

The technology has the potential to interpret not just the proximity of fingers but the gestures they make. For example, when a user flings their fingers outward, the touch screen might interpret that as a command to zoom in on an image on-screen.

“We can sense very, very small changes in capacitance, and because of this, we could sense swipes of a hand or opening and closure of a fist,” says Trevor Davis, director of marketing for touch screens at Cypress Semiconductor. Interpreting such sophisticated gestures, however, is a challenge.

Simple “hover sensing” applications are already in use, such as a smartphone display that switches off when you hold it close to your ear, or the LED buttons that light up on the touch-screen display of Dell’s SP2009W monitor only when the user’s hand approaches the screen. But detecting the presence of a finger above the display is the easy part.

“The biggest problem is trying to decipher user intent,” says Hsu. Projected capacitive sensors don’t know if the finger they detect hovering over the surface was placed there intentionally — and, if so, what the user wants to do.

Even within an application context, the inability to clearly ascertain intent leads to usability challenges. “Once you dig through all of the interaction scenarios, proximity is really challenging,” Hsu says. The only viable use of proximity sensing today, he says, is for a simple device wake-up function.
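The device wake-up function Hsu describes can be sketched as a small state machine over the proximity readings. Using two thresholds (hysteresis) keeps a finger hovering near the boundary from triggering repeated wake events; the threshold values below are illustrative, not taken from any real sensor.

```python
# Hedged sketch: proximity-based device wake-up with hysteresis.
# Readings are capacitance deltas from the proximity sensor
# (arbitrary units; assumed values).

WAKE_THRESHOLD = 50    # reading that counts as "finger approaching"
SLEEP_THRESHOLD = 20   # must drop below this before re-arming

def wake_events(readings):
    """Map a stream of proximity readings to wake events."""
    events = []
    armed = True
    for r in readings:
        if armed and r >= WAKE_THRESHOLD:
            events.append("wake")
            armed = False          # don't fire again while finger lingers
        elif not armed and r < SLEEP_THRESHOLD:
            armed = True           # finger withdrew; re-arm
    return events

# Finger approaches, lingers, withdraws, then approaches again:
print(wake_events([5, 30, 60, 55, 10, 5, 70]))  # -> ['wake', 'wake']
```

This is exactly the "simple" case: no intent needs to be inferred beyond presence, which is why it works where richer proximity gestures still struggle.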

Nonetheless, there’s quite a bit of R&D going on around the idea of using proximity sensors as a user interface, especially in automotive in-dash control systems, says Jérémie Bouchaud, an analyst at iSuppli. Developers are working on 3D gestures that will allow users to zoom in and out or move a map, and to “flick” content from one side of the display to the other by waving. The systems may also be able to discern whether the hand doing the waving belongs to the driver or the passenger, preventing the system from responding to driver gestures for safety reasons.

And for bigger touch screens that use optical rather than capacitive input, researchers are working on 3D optical touch technology that will detect motion within 50 cm of the screen. “These kinds of systems are expected within two years,” Bouchaud says.

Proximity sensing has other benefits that go beyond gestures. For example, says Davis, smartphone manufacturers could use proximity sensing to determine whether a device is sitting on a table, in your hand or on your lap, and adjust radio emissions and fan activity accordingly.

Beyond traditional computing devices

However multitouch gestures develop, one thing is clear: Multitouch displays will continue to proliferate beyond smartphones and tablets and into a wide range of computing and consumer electronics products that people interact with every day.

Multitouch controls have already been integrated into high-end cameras, cars and even home appliances, says iSuppli analyst Rhoda Alexander, and will gradually expand into lower-end models as well. She says multitouch probably won’t be embedded into large-screen televisions, but remote controls with electromechanical buttons will start to give way to multitouch controllers and multitouch apps running on smartphones, such as the L5 Remote for the iPhone.

“We’re really just seeing the tip of it right now,” Alexander says. “As we move forward, you’re going to see all sorts of devices that were traditionally electromechanical using touch screens, and in particular multitouch, to drive the devices.”

Check back next week for the next red-hot display technology to watch in 2011.
