“The iPhone X contains a ‘TrueDepth’ 3D sensor that, when combined with Apple’s ARKit APIs, can interpret facial expressions,” Brad Dwyer writes for Prototypr. “After you grant them permission (and they undergo an extensive Apple review process to ensure they respect rules regarding data usage and user privacy), apps can use this data to access things like your ‘left eye blink percentage’ or ‘mouth close percentage.’”
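The values Dwyer describes correspond to ARKit’s blend-shape coefficients, which an app can read from an `ARFaceAnchor` on TrueDepth devices. The sketch below shows roughly how that looks in Swift; the `FaceExpressionReader` class name and the threshold value are illustrative, not from the article.

```swift
import ARKit

/// A minimal sketch of reading ARKit blend-shape data on an iPhone X.
/// Class name and thresholds are hypothetical; the ARKit calls are real API.
class FaceExpressionReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X at the time).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Each coefficient ranges from 0.0 (neutral) to 1.0 (fully expressed) —
            // e.g. the "left eye blink percentage" mentioned in the article.
            let leftBlink = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            let mouthClose = faceAnchor.blendShapes[.mouthClose]?.floatValue ?? 0

            if leftBlink > 0.8 {
                // Treat a strong blink as an input event, e.g. a button press.
            }
            _ = mouthClose
        }
    }
}
```

Because the data arrives every frame via the session delegate, an app can treat these coefficients as a continuous input stream rather than discrete taps.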

“This data was used by Apple to create Animoji puppets that mimic your facial expressions. But the potential uses go far beyond animated poop emoji and, with some creativity, app developers can use facial motions and gestures as rich input devices that allow apps to be controlled in novel ways,” Dwyer writes. “This week, two new games were released that do just that.”

“Rainbrow by Nathan Gitter is a Frogger-like game completely controlled by raising and furrowing your eyebrows,” Dwyer writes. “Taking things one step further is an app released by my company called Nose Zone (free download in the App Store, iPhone X only). Nose Zone is another face-controlled app but, instead of using facial expressions for control, it uses the direction you point your face to control a laser projected from your nose.”
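Head-direction control of the kind Nose Zone uses can be derived from the face anchor’s pose rather than its blend shapes. A minimal sketch, assuming ARKit’s face coordinate convention in which the positive z-axis points outward from the front of the face (the helper function name is illustrative):

```swift
import ARKit
import simd

/// Hypothetical helper: estimates the direction the user's face is pointing,
/// in world coordinates, from an ARKit face anchor's pose matrix.
func faceDirection(of faceAnchor: ARFaceAnchor) -> simd_float3 {
    // The third column of the 4x4 transform is the face's local z-axis
    // expressed in world space; in ARKit's face coordinate system the
    // positive z-axis points outward from the face.
    let z = faceAnchor.transform.columns.2
    return simd_normalize(simd_float3(z.x, z.y, z.z))
}
```

Casting a ray along this vector from the head position is one plausible way to implement a “laser projected from your nose,” though the article does not describe Nose Zone’s actual implementation.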

Read more in the full article here.

MacDailyNews Take: Those of a certain age and proclivity might want to lay off the Botox.

Apple wins kudos for accessibility and smart home tech empowering people with disabilities – May 18, 2017
Stevie Wonder thanks Steve Jobs, praises Apple for iOS accessibility – September 15, 2011

[Attribution: 9to5Mac. Thanks to MacDailyNews Reader “Fred Mertz” for the heads up.]