
Multi-touch

Texas A&M University students presented ZeroTouch, an inexpensive multitouch system built with infrared sensors like the ones in television remote controls, at the Computer Human Interaction (CHI) conference in Vancouver.

“I like to consider it an optical force field; it’s like a picture frame where we shoot thousands of light beams across and we can detect anything that intersects that frame,” said Jonathan Moeller, a research assistant in the Interface Ecology Lab at Texas A&M University.


The frame is lined with 256 IR sensors connected to a computer. When ZeroTouch is mounted over a traditional computer screen, it turns the display into a multitouch surface. In a demonstration at CHI, Moeller played a computer game with the multitouch controls, and they worked without issue.
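
The article doesn’t describe ZeroTouch’s software, but the basic idea of a sensor-lined frame can be sketched: objects inside the frame interrupt some of the light beams crossing it, and intersecting the blocked horizontal and vertical beams gives a rough position and size. The snippet below is a hypothetical, simplified illustration (a plain grid of beams rather than ZeroTouch’s 256-sensor arrangement); the function and variable names are invented for this example.

```python
# Hypothetical sketch: localize an object from interrupted IR beams in a frame.
# Assumes a simple grid of horizontal (row) and vertical (column) beams; the
# real ZeroTouch crosses far more beams between its 256 sensors.

def estimate_touch(blocked_rows, blocked_cols):
    """Return ((row, col) center, size) of the occluding object, or None."""
    if not blocked_rows or not blocked_cols:
        return None  # nothing is inside the frame
    row_center = sum(blocked_rows) / len(blocked_rows)
    col_center = sum(blocked_cols) / len(blocked_cols)
    # The count of interrupted beams gives a rough object size:
    # a fingertip blocks a few beams, a whole hand blocks many.
    size = max(len(blocked_rows), len(blocked_cols))
    return (row_center, col_center), size

# Example: a fingertip interrupting rows 40-42 and columns 97-99
print(estimate_touch({40, 41, 42}, {97, 98, 99}))
```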

Taken one step further, if the frame is suspended in the air rather than mounted on a screen, a user can paint on a virtual canvas. With an iPhone used to select colors, the drawing appears on a connected display as soon as a finger or hand enters the frame. If just a finger enters the frame, the brush is narrow; if an entire hand or arm enters, the stroke is wide.
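
That brush behavior maps naturally onto the size estimate above: the more beams an object interrupts, the wider the stroke. The mapping below is only an illustrative sketch; the constants and function name are assumptions, not details from the ZeroTouch painting demo.

```python
def brush_width(blocked_beam_count, min_width=2, max_width=120):
    """Map the number of interrupted beams to a stroke width in pixels.

    A single finger interrupts only a few beams and yields a narrow brush;
    an arm interrupts many and yields a wide one. The constants here are
    illustrative, not taken from ZeroTouch.
    """
    return min(max_width, min_width + 2 * blocked_beam_count)

print(brush_width(3))   # fingertip -> narrow stroke
print(brush_width(60))  # forearm   -> wide stroke
```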

Moeller started working with multitouch about two years ago; one of his first projects, he said, involved a projection screen and a camera. “And I thought to myself, this is really bulky. We can do this better.”

He has been working on ZeroTouch since then and thinks that two-dimensional interaction with the system is just the beginning.

“You can stack layers [of ZeroTouch] together to get depth sensing,” he said. That’s important in multitouch, he said, because users can then hover over objects. Hovering typically isn’t available on touch systems because a finger occludes whatever it is hovering over.
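
Stacking frames gives a coarse depth signal: an object breaking an upper layer but not the layer flush with the screen can be treated as hovering. The sketch below shows that distinction under an assumed two-layer setup; the function and parameter names are hypothetical, not part of ZeroTouch’s actual implementation.

```python
def classify(touch_layer_hit, hover_layer_hit):
    """Classify input from two stacked sensing layers.

    touch_layer_hit: beams broken in the layer closest to the display
    hover_layer_hit: beams broken in a layer mounted slightly above it
    """
    if touch_layer_hit:
        return "touch"  # finger has reached the screen plane
    if hover_layer_hit:
        return "hover"  # finger is above the screen but inside the outer frame
    return "none"

print(classify(False, True))  # -> "hover"
print(classify(True, True))   # -> "touch"
```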

CHI runs until Thursday.
