Apple looks to take Multi-Touch™ to next level

“When it was first introduced in January 2007, iPhone’s Multi-Touch interface was a real breakthrough in operation of small portable devices,” Unwired View reports.

“Now, if the ideas sketched out in a new Apple patent application “Multitouch data fusion” are implemented, we may soon see another qualitative leap in the usability of user interfaces in various computing devices,” Unwired View reports.

“The Multi-Touch interface is perfectly good for many device control and operation functions. But on-screen 2D object manipulation has some inherent limitations, and there are quite a few actions that can be done better by other input means. Electronic devices that use MT usually have quite a few of these other input means, including cameras, microphones, accelerometers, biometric sensors, temperature sensors, etc.,” Unwired View reports.

“What Apple is proposing in its patent app is to fuse these secondary input means with Multi-Touch to improve the overall user interface,” Unwired View reports. “And it gives quite a few examples of how to do that.”

• Fusing voice input and Multi-Touch™
• Combining Multi-Touch™ and motion sensor data
• Marrying force sensors and Multi-Touch™
• Fusing Multi-Touch™ with visual input from device camera
• Combining gaze vector data with Multi-Touch™ gestures

Unwired View reports, “These are just a few of the possibilities described in the patent app. Some of them, like the facial expression/MT combination, may be pretty far off. But many others, like voice input/MT, motion sensors/Multi-Touch, and visual data/MT fusion, are technically feasible already.”
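To make the idea concrete, here is a minimal sketch of motion sensor/Multi-Touch fusion, one of the feasible combinations listed above. All type names, thresholds, and the gesture mappings are hypothetical illustrations, not anything taken from Apple's patent application:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    # Displacement of a single-finger drag, in normalized screen units
    dx: float
    dy: float

@dataclass
class MotionSample:
    # Accelerometer reading in g units (illustrative, not patent terminology)
    x: float
    y: float
    z: float

def fuse(touch: TouchEvent, motion: MotionSample) -> str:
    """Combine a touch gesture with motion-sensor context.

    A horizontal drag alone scrolls; the same drag while the device is
    being shaken (high acceleration magnitude) is interpreted as 'undo'
    instead -- one way a secondary input can refine touch intent.
    """
    magnitude = (motion.x**2 + motion.y**2 + motion.z**2) ** 0.5
    shaking = magnitude > 2.0          # threshold chosen for illustration
    if abs(touch.dx) > abs(touch.dy):  # predominantly horizontal drag
        return "undo" if shaking else "scroll-horizontal"
    return "shake-to-erase" if shaking else "scroll-vertical"
```

The point of the sketch is simply that the touch data alone is ambiguous; the accelerometer reading supplies the extra context that selects between interpretations.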

Much more, including patent app illustrations, here.

[Attribution: Gizmodo. Thanks to MacDailyNews Reader “RadDoc” for the heads up.]
