Minority Report? Apple’s ARKit gets real-time 3D gesture support

“While only an early proof of concept, I can’t help but see today’s ARKit revelation as yet another step toward Minority Report-style future computer user interfaces, a new ARKit technology that makes it possible for us to use our real hands within virtual spaces in order to interact with virtual objects in AR,” Jonny Evans writes for Apple Must.

“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in depth with augmented objects in 3D space,” said Daniel Carlman, co-founder and CEO of ManoMotion.

“Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality.”

Evans writes, “The integration will be available to developers in the next build of ManoMotion’s SDK, first for Unity iOS, followed by native iOS in subsequent updates.”
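ManoMotion hasn’t published its API details in this article, but the core job of any such gesture SDK is turning tracked 3D fingertip positions into discrete gestures an app can respond to. As a rough illustration only — every name and threshold below is an assumption, not ManoMotion’s real SDK — a minimal pinch classifier might look like:

```python
from dataclasses import dataclass
import math

# Hypothetical sketch: ManoMotion's actual API is not shown in the article,
# so all names and the 2 cm threshold here are assumptions for illustration.

@dataclass
class HandFrame:
    """One frame of tracked fingertip positions, in metres, as a
    hand-tracking SDK might report them per camera frame."""
    thumb_tip: tuple  # (x, y, z)
    index_tip: tuple  # (x, y, z)

def classify(frame: HandFrame, pinch_threshold: float = 0.02) -> str:
    """Classify a 'pinch' when thumb and index tips come within ~2 cm."""
    d = math.dist(frame.thumb_tip, frame.index_tip)
    return "pinch" if d < pinch_threshold else "open"
```

A real SDK would run this sort of check on every camera frame and smooth the result across several frames to avoid jitter before driving any AR interaction.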

Read more in the full article here.

MacDailyNews Take: Very cool. Apple will usher in the augmented reality boom with the release of iOS 11 in mid-September!

Augmented Reality is going to change everything. – MacDailyNews, July 21, 2017

The impact of augmented reality cannot be overstated. It will be a paradigm shift larger than the iPhone and the half-assed clones it begat. – MacDailyNews, August 4, 2017

Apple ARKit to usher in an intriguing, amazing new world – August 31, 2017
Major developers reveal Apple ARKit apps ahead of iOS 11 release – August 29, 2017
Apple working on several prototypes of AR glasses – August 4, 2017
Apple’s next big move: Augmented reality – August 3, 2017
Apple’s rumored new glasses will be an even bigger deal than the iPhone – July 28, 2017
Apple smart glasses are inevitable – July 28, 2017
New app using Apple’s ARKit lets iPhone recreate 16 weeks of painstaking rotoscoping on the fly – July 27, 2017
Gene Munster: Apple Glasses will soon outshine the iPhone – June 28, 2017
Gene Munster: Expect Apple smart glasses in mid-2020 – June 27, 2017
Augmented Reality: Apple’s revolutionary offering leaves Google’s Android woefully behind – June 26, 2017
Apple’s AR is much closer to reality than Google’s – June 26, 2017
UBS: Apple may eventually launch ‘iGlass’ smart glasses – June 20, 2017
IKEA’s forthcoming Augmented Reality furniture app powered by Apple’s ARKit – June 19, 2017
Apple’s single most important WWDC 2017 announcement: ARKit – June 11, 2017
Overnight, Apple will own the world’s largest augmented reality platform – June 7, 2017
Apple CEO Cook discusses philosophy behind HomePod, ARKit’s potential market impact – June 6, 2017
Analysts: Apple’s Corning investment hints at AR glasses and wireless charging tech – May 14, 2017
Apple awards Corning $200 million in first Advanced Manufacturing Fund investment – May 12, 2017
Leaked document details Apple employee eye injuries, hints at Apple AR glasses – April 20, 2017
Apple began working on augmented reality glasses more than a year ago, sources say – March 27, 2017


    1. Gestures don’t have to be big, arm-waving motions. The system could also detect fingers hovering just above the screen while another finger is touching it.

      I’m intrigued that Apple’s 3D imaging system is reported to be so amazingly fast. That wouldn’t be needed if you’re just going to occasionally make a huge gesture. My feeling is that the interaction will be much more subtle and nuanced than what we’ve seen before.

      1. Touchless gesturing above a smartphone display has been introduced before by both Sony and Samsung in some of their smartphones. I think Sony called theirs ‘Air Touch’ or something similar. That system supposedly worked by extending the capacitive detection range to a few millimeters above the display surface.

      2. Google has been developing a chip that detects hand motions near devices that have it installed. I think that one was called Project Soli. No camera required: the chip emits millimeter-wave radar and reads the reflections, resolving motion down to individual fingers, and with the proper software that could allow some interesting UI.

