Today at the Wall Street Journal’s D: All Things Digital conference, Microsoft will unveil Microsoft Surface, the first in a new category of surface computing products from Microsoft. Surface turns an ordinary tabletop into a surface that provides interaction with all forms of digital content through gestures, touch and physical objects. Beginning at the end of this year, consumers will be able to interact with Surface in hotels, retail establishments, restaurants and public entertainment venues.
The user interface works without a traditional mouse or keyboard, allowing people to interact with content and information on their own or collaboratively. Surface is a 30-inch display in a table-like form factor that small groups can use at the same time.
“With Surface, we are creating more intuitive ways for people to interact with technology,” Microsoft CEO Steve Ballmer said. “We see this as a multibillion dollar category, and we envision a time when surface computing technologies will be pervasive, from tabletops and counters to the hallway mirror. Surface is the first step in realizing that vision.”
Surface also features the ability to recognize physical objects that carry identification tags similar to bar codes. This means that when a customer simply sets a wine glass on the table's surface, a restaurant could display information about the wine being ordered, pictures of the vineyard it came from and suggested food pairings tailored to that evening's menu. The experience could become immersive, letting users access information on the wine-growing region and even look at recommended hotels and plan a trip without leaving the table.
Surface computing at Microsoft is an outgrowth of a collaborative effort between the Microsoft Hardware and Microsoft Research teams. Surface computing, which Microsoft has been working on for a number of years, features four key attributes:
• Direct interaction. Users can actually “grab” digital information with their hands, interacting with content by touch and gesture, without the use of a mouse or keyboard.
• Multi-touch. Surface computing recognizes many points of contact simultaneously — not just one finger, as on a typical touch screen, but dozens of items at once.
• Multi-user. The horizontal form factor makes it easy for several people to gather around surface computers together, providing a collaborative, face-to-face computing experience.
• Object recognition. Users can place physical objects on the surface to trigger different types of digital responses, including the transfer of digital content.
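The object-recognition attribute described above essentially amounts to a lookup: the table reads an identification tag from a physical object and dispatches to a digital response. A minimal sketch in Python of how such a dispatch could work — the tag IDs and handler functions here are hypothetical, not anything from Microsoft's actual system:

```python
# Hypothetical sketch of tag-based object recognition dispatch.
# A real surface computer would read an identification tag (similar to
# a bar code) from the object's base; here the tag ID is just a string.

def wine_info(table_state):
    """Hypothetical response: show details for the recognized wine."""
    table_state["display"] = "wine details, vineyard photos, pairings"
    return table_state

def sync_photos(table_state):
    """Hypothetical response: transfer content from a tagged device."""
    table_state["display"] = "transferring photos from tagged camera"
    return table_state

# Registry mapping tag IDs to digital responses.
TAG_HANDLERS = {
    "tag:wine-glass": wine_info,
    "tag:camera": sync_photos,
}

def on_object_placed(tag_id, table_state):
    """Dispatch when the table recognizes a tagged physical object."""
    handler = TAG_HANDLERS.get(tag_id)
    if handler is None:
        return table_state  # untagged object: no digital response
    return handler(dict(table_state))

print(on_object_placed("tag:wine-glass", {}))
```

The same registry pattern would cover the content-transfer case: placing a tagged camera on the table triggers its own handler rather than a wine lookup.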
Microsoft plans to ship Surface with a portfolio of basic applications, including photos, music and virtual concierge applications that can be customized.
Surface will be made available through a distribution and development agreement with IGT, a global company specializing in the design, development, manufacturing, distribution and sales of computerized gaming machines and systems products.
More information can be found at http://www.surface.com
“Microsoft Surface: Behind-the-Scenes First Look (with Video)” from Popular Mechanics, which includes interview footage with NYU multi-touch interaction researcher Jeff Han, is here.
MacDailyNews Take: First of all, this is a press release released today by Microsoft PR flacks in an attempt to take attention away from Apple CEO Steve Jobs at the Wall Street Journal’s “D5” conference today. Now, this type of technology is the future, but we’ll be steering clear of Microsoft’s implementation simply because we want our stuff to work.* Also, carting around a coffee table would be a back-breaker.
* If Microsoft doesn’t like our statement about wanting our stuff to work and therefore avoiding Microsoft products, too bad; they’ve earned it.
This type of tech has been floating around for years. Who owns which patent for what, or even if there are any meaningful patents, is anybody’s guess at this point. The best-known researcher in this field is Jeff Han. Han is a research scientist for New York University’s (NYU) Courant Institute of Mathematical Sciences. We first covered Jeff Han’s multi-touch interface work last February with a direct link to video of Han’s UI and a link to Wired’s “Cult of Mac” coverage. At the time, we wrote, “This could change everything. Again.”
After Steve Jobs’ Macworld Expo keynote unveiled iPhone, with its multi-touch interface, Han updated his website (http://cs.nyu.edu/~jhan/ftirtouch/) with the cryptic blurb, “Yes, we saw the keynote too! We have some very, very exciting updates coming soon- stay tuned!”
Jeff Han presents his “Multi-Touch Interaction Research” work at the TED Conference 2006:
Perceptive Pixel, Inc. was founded by Jeff Han in 2006 as a spinoff of the NYU Courant Institute of Mathematical Sciences to develop and market the most advanced multi-touch system in the world. More info: http://www.perceptivepixel.com/
iPhone debuts third-generation PC user interface: Apple’s Steve Jobs changes the world – again – February 20, 2007
Researchers have bigger plans for ‘multi-touch’ beyond Apple’s iPhone – January 19, 2007
Video of how Apple’s rumored touch-screen Tablet Mac could work – February 13, 2006