Privacy worry? Apple is sharing your face data with third-party app developers

“Apple just started sharing your face with lots of apps,” Geoffrey A. Fowler reports for The Washington Post. “Beyond a photo, the iPhone X’s front sensors scan 30,000 points to make a 3D model of your face. That’s how the iPhone X unlocks and makes animations that might have once required a Hollywood studio.”

“Now that a phone can scan your mug, what else might apps want to do with it? They could track your expressions to judge if you’re depressed. They could guess your gender, race and even sexuality. They might combine your face with other data to observe you in stores—or walking down the street,” Fowler reports. “Apps aren’t doing most of these things, yet. But is Apple doing enough to stop it?”

“After I pressed executives this week, Apple made at least one change—retroactively requiring an app tapping into face data to publish a privacy policy,” Fowler reports. “‘We take privacy and security very seriously,’ Apple spokesman Tom Neumayr said. ‘This commitment is reflected in the strong protections we have built around Face ID data—protecting it with the Secure Enclave in iPhone X—as well as many other technical safeguards we have built into iOS.’”

“Indeed, Apple — which makes most of its money from selling us hardware, not selling our data — may be our best defense against a coming explosion in facial recognition,” Fowler reports. “But I also think Apple rushed into sharing face maps with app makers that may not share its commitment, and it isn’t being paranoid enough about the minefield it just entered.”

Read more in the full article here.

MacDailyNews Take: As we wrote at the beginning of November:

On something like this, it’s better to be paranoid than not. Of course, Apple and iOS are very good about making this sort of thing opt-in – the user must approve apps’ access to facial data – but, there should still be some mechanism to identify and punish anyone who uses the data inappropriately.

That said, according to iMore‘s Rene Ritchie, “Once the app asks for authentication, it hands off to the system, and all it ever gets back is that authentication or rejection. Apple has a separate system, built into ARKit, the company’s augmented reality framework, that provides basic face tracking for Animoji or any apps that want to provide similar functionality, but it only gets rudimentary mesh and depth data, and never gets anywhere near Face ID data or the Face ID process.”

SEE ALSO:
App developer access to iPhone X face data spooks some privacy experts – November 2, 2017
Apple: Privacy is a fundamental right – September 27, 2017
Apple begins mining browsing data in Safari via differential privacy – September 26, 2017
Apple explains how it’s making Siri smarter without endangering user privacy – September 11, 2017
Apple’s cutting-edge ‘differential privacy’ is opt-in – June 24, 2016
Apple’s cutting-edge ‘differential privacy’ offers unique option for technology users – June 20, 2016
Apple’s use of cutting-edge tech will peek at user habits without violating privacy – June 16, 2016
Apple unveils iOS 10, the mother of all iOS releases – June 13, 2016
Apple previews major update with macOS Sierra – June 13, 2016
Apple seeks to use AI to keep Google off your iPhones, iPads, and Macs – June 15, 2016
Edward Snowden: Apple is a privacy pioneer – June 5, 2015
Tim Cook gets privacy and encryption: We shouldn’t surrender them to Google – June 4, 2015
Tim Cook attacks Google, U.S. federal government over right to privacy abuses – June 3, 2015
The price you’ll pay for Google’s ‘free’ photo storage – June 3, 2015
Apple CEO Tim Cook champions privacy, blasts ‘so-called free services’ – June 3, 2015
Passing on Google Photos for iOS: Read the fine print before you sign up for Google’s new Photos service – June 1, 2015
Why Apple’s Photos beats Google Photos, despite price and shortcomings – May 30, 2015
Is Apple losing the photo wars? – May 29, 2015
How Google aims to delve deeper into users’ lives – May 29, 2015
Apple CEO Cook: Unlike some other companies, Apple won’t invade your right to privacy – March 2, 2015
Survey: People trust U.S. NSA more than Google – October 29, 2014
Edward Snowden’s privacy tips: ‘Get rid of Dropbox,’ avoid Facebook and Google – October 13, 2014
Apple CEO Tim Cook ups privacy to new level, takes direct swipe at Google – September 18, 2014
Apple will no longer unlock most iPhones, iPads for government, police – even with search warrants – September 18, 2014
U.S. NSA watching, tracking phone users with Google Maps – January 28, 2014
U.S. NSA secretly infiltrated Yahoo, Google data centers worldwide, Snowden documents say – October 30, 2013
Google has already inserted some U.S. NSA code into Android – July 10, 2013
Court rules NSA doesn’t have to reveal its semi-secret relationship with Google – May 22, 2013

24 Comments

  1. Apple did not share your face with anyone. Apple has given access to the facility to scan your face to 3rd party developers so they can enhance ID capabilities or whatever on their specific apps. They cannot share the data with anyone, and YOU explicitly have to say it is ok.

    I have a question for anyone who is up on this science. Most facial ID systems I’ve seen demonstrated are based on flat scans of photographs, often lots and lots of photographs of an individual.

    Wouldn’t Apple’s True Depth Camera system, which is a direct model of your face, be superior to that of typical law enforcement or intelligence gathering systems?

    1. “Apple has given access to the facility to scan your face to 3rd party developers so they can enhance ID capabilities or whatever on their specific apps.”

I believe this is incorrect: the full depth info used to establish ID is never shared; only the pass/fail result is.

As MDN/iMore indicate, the additional info shared with developers is NOT for the purpose of BETTER face identification; it is a much LOWER level of info, enough to characterize emotion etc. for entertainment (such as Animoji) and other potential purposes, none of which amounts to better identification.

    2. “I have a question for anyone who is up on this science. Most facial ID systems I’ve seen demonstrated are based on flat scans of photographs, often lots and lots of photographs of an individual.

      Wouldn’t Apple’s True Depth Camera system, which is a direct model of your face, be superior to that of typical law enforcement or intelligence gathering systems?”

It’s a matter of the level of confidence in your determination of whether the individual currently being viewed is indeed the living individual involved, or something/someone else. Many dozens of photographs of an individual can be used to create an accurate 3D map of a person’s face and head. It’s not just the face that can be used; from photographs, color information can also be used, e.g., a colorful tattoo on the scalp or neck.

      It’s also a tradeoff of the accuracy versus the number of scanned points. The most accurate is a 3D scan of the person with multiple expressions held during a full 360 rotation of the full color, high resolution sensor around the person’s head. Further, it’s a tradeoff between data storage and speed of processing and ease of setting it up.

      Apple’s system is better than the TYPICAL system used by government agencies but is nowhere near as good as some systems out there. As mentioned above, the best systems use the full head and neck of the individual. The identifying algorithms then have a larger data set from which to draw and can discount areas such as ears if a hat or hair is obscuring the person’s ears.

The drawback to most systems’ input data is the availability of source information. As you mention, most databases are based upon multiple 2D photographs (digital or film) of an individual. Most often the resolution of those photographs is significantly poorer than the scan by Apple’s Face ID system. Apple’s system depends, critically, upon the individual willingly submitting to scans of their face. That level of cooperation is not available for the vast majority of cases with government systems.

  2. This is a MASSIVE link-bait article. Apple does NOT share your Face ID data with anyone. Period. That’s tucked away in the Secure Enclave and turned into encrypted mathematical information. All an app ever gets back is “yes” or “no.”

    Something like 50 points on your face, or the mesh used to make Animoji, is trivial by comparison.

  3. https://www.theverge.com/2017/11/15/16658074/iphone-x-face-id-app-developers-3d-the-future

    Developers are already using the technology.

    “…According to the developer agreement, third-party app makers only have access to the visual facial mapping data, and not the same mathematical representation of it that is used to unlock the iPhone X using Face ID. Apple claims the latter is encrypted on the device itself, so not even its own employees have access to it. Yet developers do still have access to a map of a user’s face as part of the True Depth camera, along with data on as many as 50 facial expressions that could tell a developer how exactly you raise your eyebrows or move your mouth, to name a few telling instances. This is how Snapchat’s iPhone X-specific filters, demoed onstage during the phone’s reveal, appear more sophisticated than standard ones…”

    1. That flaw was fixed within one day of becoming public, and made available to every user. Not a single person appears to have been affected.

      When other companies have serious security flaws with no fix available to hundreds of millions of users, there is legitimate cause for concern.

      This particular flaw is completely irrelevant to the issue at hand.

        1. Hi King

          I can see your heart’s in the right place, but I still need to differ with you. Sorry.

          I think you believe this root security flaw was based on a conscious decision. I don’t believe it was. I believe it was simply a bug that was never discovered.

          And I believe this experience really provided no lesson to be learned.

          When your code base is composed of many millions of lines of code, it’s not possible to produce an absolutely bug-free product. And it’s also not possible, in any way, to produce test plans that test for every possible bug.

          With regard to this security flaw, Apple has done everything we could hope for: produce the code, extensively test it, extensively beta test it, and, after release, fix security flaws as soon as they show up.

          There really isn’t any more we could ask of them.

          Pete

  4. So all the people on here that constantly harp about how Google steals your privacy, compared to Apple (never does) now have egg on their face. Touch ID and Face ID should be sacrosanct. Period. No compromise, at all…..

    1. I don’t know why anyone would up vote a statement like, “So all the people on here that constantly harp about how Google steals your privacy, compared to Apple (never does) now have egg on their face.”

      As reported by several sources, Apple does NOT give access to the FaceID or TouchID data sets themselves, ever. Apple itself does not have access to that data.

      What Apple is doing is giving app developers access to a limited subset of the Face ID data. The level of detail given to app developers CANNOT be used to reliably identify who you are. Fifty data points cannot identify you. If you think it can, you’ve been watching way too many bad TV shows or bad movies.

  5. 3rd-party app developers (and even Apple’s own Animoji/Messages team) are exposed to a very limited subset of facial data when accessing the TrueDepth camera. They are granted just enough facial data to create a low-poly mesh of the face: a tiny fraction of the data that is used to authenticate unlocking your phone or making payments.

    I am sick to death of these click-bait articles and of the fact that MDN regurgitates them for a desperate tiny fraction of a cent per click.

    After reading MDN’s take why did they post it at all? That’s me – I am done with MDN. Good bye everyone else, have a Merry Christmas and a Happy New Year – and may you all live long and prosper.

    1. I suppose that, given those low-poly mesh scans, tech similar to the pixel smoothing used for video might plausibly be used to approximate a ‘true’ image of your face by extrapolating the ‘in-between’ points. The question is how close.

      1. Pixel smoothing reduces detail.

        A typical video frame has approximately 2,000,000 discernible pixel values in it and is constrained to a simple 2D texture, not triangles. Did you drink your coffee yet? 😉

        It’s click bait son. It’s click bait.

        1. Like I said, ‘similar to’. Extrapolation is the key point here. Forensic science has done research into how muscle structure is organized under the skin and how features develop over time due to growth and aging. I wouldn’t put it past those same researchers to develop tech using formulae to extrapolate curves in facial features (and, as a result, the ‘in-between’ points) given the more basic points available in the low-poly mesh scan. In addition, unless those low-poly mesh scans are identical each time they are generated, over time a more refined aggregate poly-mesh scan might be achieved.

        2. The reason we all have individual faces is that they are irregular. The FaceID scan of 30,000 points picks up those irregularities (as does an image with millions of pixels). If you take a few hundred data points and use even an extremely sophisticated algorithm to extrapolate all the intermediate points, the result will be significantly different than the full-resolution scan. All the irregularities will be smoothed away.

          That’s why you can’t take a blurred partial fingerprint and extrapolate the missing recognition points, and why all the television crime shows that extrapolate a license plate number from 12 pixels are fictional.

        3. You make a good point. I suppose then that there is some algorithm to still correctly identify persons wearing makeup. I understand that foundation is usually used to smooth over those skin irregularities. On the flip side if people take their FaceID scans with makeup on it may not work with changes in their makeup or wearing none at all.
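The smoothing argument in the thread above can be illustrated with a toy numerical sketch: a 1D curve stands in for a face profile, with small random bumps playing the role of the identifying irregularities. All the numbers here are made up for illustration; this is not Apple's actual data or resolution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face profile: a smooth curve plus small "irregularities"
x_full = np.linspace(0.0, 1.0, 30_000)               # ~Face ID's 30,000 points
profile = np.sin(2 * np.pi * x_full)                 # smooth underlying shape
irregular = 0.05 * rng.standard_normal(x_full.size)  # individual fine detail
full_scan = profile + irregular

# "Low-poly" version: keep only every 100th point (300 points total),
# roughly analogous to the coarse mesh handed to apps
x_coarse = x_full[::100]
coarse_scan = full_scan[::100]

# Linearly interpolate the coarse scan back up to full resolution
reconstructed = np.interp(x_full, x_coarse, coarse_scan)

# The smooth shape survives the round trip, but the fine irregularities
# between the sampled points do not come back
residual = full_scan - reconstructed
print(f"irregularity scale:   {irregular.std():.3f}")
print(f"reconstruction error: {residual.std():.3f}")
```

The reconstruction error ends up on the same scale as the irregularities themselves, which is the commenter's point: interpolating between a few hundred points recovers the smooth shape but not the individual detail that makes a face identifiable.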
