When a display is right in front of your eye, as it will be when wearing Apple Glasses, touchscreens become impractical. Therefore, Apple is investigating how users wearing AR smartglasses could manipulate virtual controls in the real world.
A recently revealed patent application showed Apple proposing a way to use Apple AR to display information on what would appear, to anyone but the owner, as a blank screen. Now, in a separate application, it's looking at making any surface appear to "Apple Glass" wearers as a control panel, complete with buttons and a display.
Your “Apple Glass” might show you information, then, but you can’t do a lot with it unless you take the glasses off and poke the lens with your finger. Since AR already maps virtual objects onto the real world around you, though, it could be extended to make it appear as if there were buttons and controls conveniently located where you could touch or tap them.
In that case, your eyes would see the virtual object because "Apple Glass" is showing it to you, but your fingers would touch whatever is really in that spot in the real world. If the AR system can determine that you've touched the spot where you see a button, it can react as if you had actually tapped a control.
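The idea described above can be sketched in a few lines: anchor a virtual button at a real-world 3D point, track the wearer's fingertip, and register a "tap" when the fingertip comes within a small distance of that anchor. This is a hypothetical illustration, not anything from Apple's patent filing; the class names, coordinates, and 2 cm tolerance are all assumptions.

```python
# Hypothetical tap-detection sketch: a virtual button is anchored to a point
# on a real surface, and a tap fires when the tracked fingertip gets close
# enough to that anchor. Names and thresholds are illustrative only.
from dataclasses import dataclass
import math

@dataclass
class VirtualButton:
    label: str
    anchor: tuple          # (x, y, z) position on a real surface, in meters
    radius: float = 0.02   # touch tolerance, roughly 2 cm

def distance(a, b):
    # Euclidean distance between two 3D points
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def detect_tap(fingertip, buttons):
    """Return the button the tracked fingertip is touching, if any."""
    for button in buttons:
        if distance(fingertip, button.anchor) <= button.radius:
            return button
    return None

buttons = [VirtualButton("play", (0.10, 0.00, 0.50)),
           VirtualButton("stop", (0.20, 0.00, 0.50))]
hit = detect_tap((0.11, 0.005, 0.495), buttons)
print(hit.label if hit else "no tap")  # fingertip lands within 2 cm of "play"
```

A real system would do far more — tracking hand pose from cameras, projecting the button onto the mapped surface, and debouncing repeated contacts — but the core test is this kind of proximity check between a tracked fingertip and an anchored virtual control.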
MacDailyNews Take: Of course, this is precisely how the user would expect AR glasses to work. Such basic intuitiveness, well-executed, is from whence all Apple “magic” springs.