An exclusive inside look at how artificial intelligence and machine learning work at Apple

“On July 30, 2014, Siri had a brain transplant,” Steven Levy writes for Backchannel. “Three years earlier, Apple had been the first major tech company to integrate a smart assistant into its operating system. Siri was the company’s adaptation of a standalone app it had purchased, along with the team that created it, in 2010. Initial reviews were ecstatic, but over the next few months and years, users became impatient with its shortcomings. All too often, it erroneously interpreted commands. Tweaks wouldn’t fix it.”

“So Apple moved Siri voice recognition to a neural-net based system for US users on that late July day (it went worldwide on August 15, 2014),” Levy writes. “Some of the previous techniques remained operational — if you’re keeping score at home, this includes ‘hidden Markov models’ — but now the system leverages machine learning techniques, including deep neural networks (DNN), convolutional neural networks, long short-term memory units, gated recurrent units, and n-grams. (Glad you asked.) When users made the upgrade, Siri still looked the same, but now it was supercharged with deep learning.”

“As is typical with under-the-hood advances that may reveal its thinking to competitors, Apple did not publicize the development. If users noticed, it was only because there were fewer errors. In fact, Apple now says the results in improving accuracy were stunning,” Levy writes. “This story of Siri’s transformation, revealed for the first time here, might raise an eyebrow in much of the artificial intelligence world. Not that neural nets improved the system — of course they would do that — but that Apple was so quietly adept at doing it.”

“I spent the better part of a day in the boardroom of One Infinite Loop at the Cupertino headquarters, getting a core dump of the company’s activities in AI and machine learning from top Apple executives (Cue, senior worldwide marketing vice president Phil Schiller, and senior vice president of software engineering Craig Federighi), as well as two key Siri scientists. As we sat down, they handed me a dense, two-page agenda listing machine-learning-imbued Apple products and services — ones already shipping or about to — that they would discuss,” Levy writes. “The message: ‘We’re already here. A player. Second to none.’”

Tons more in the full article – very highly recommended – here.

MacDailyNews Take: Which may also help explain why Siri on Apple Watch got and continues to get such raves – people who had sort of given up on Siri on their iPhones gave it another try upon getting their new Apple Watches starting in April 2015 and, lo and behold, found Siri was “so much better on Apple Watch” when, in fact, Siri is so much better everywhere!

Why Apple will become a leader in artificial intelligence – August 8, 2016
Apple buys machine-learning startup Turi for $200 million – August 6, 2016
Apple touts Artificial Intelligence in iOS and opens ‘crown jewels’ to developers – June 14, 2016
Google CEO: We feel we’re ahead of Apple, others in artificial intelligence – June 1, 2016
Smartphones to die out within five years, replaced by artificial intelligence – survey – December 9, 2015
Apple’s extreme secrecy retarding its artificial intelligence work – October 30, 2015
Apple hires NVIDIA’s artificial intelligence director – October 24, 2015
Apple acquires advanced artificial intelligence startup Perceptio – October 5, 2015
Apple buys artificial intelligence natural language start-up VocalIQ – October 2, 2015


  1. In offering Steven Levy this scoop, Apple is showing they feel stung by criticisms that their A.I. is inferior to those of Google, Amazon, et al. and is hobbled by their stance on personal security (privacy).

Apple going candid on this issue reminds me of the way they responded to antenna-gate: by walking tech writers through their laboratories to show the depth of their testing protocols.

Apple’s own secrecy created the (large) cottage industry of speculation and FUD that bedevils them. They still believe in secrecy as a competitive advantage, but they pay a price for that in the form of poor P.R., so they open their doors from time to time.

Here, it looks like they are reaching out to the academic community, saying “We are doing significant work; we are trying to establish a model of machine learning from crowd-sourced data that does not compromise individuals. We are showing the industry that it can be done.” Once their scientists publish that promised paper on their implementation of Differential Privacy, the “laggard Apple” narrative should wither away.

  2. Me: “Call Jane at home on speaker.”
    Siri: “You have three numbers for Jane. Which would you like to use?”

    Why can’t Siri get this right?!?

    Also, why is the system so unresponsive and unpredictable? Sometimes if I hold the home button I get Siri, sometimes I get nothing, and sometimes I get Siri telling me it didn’t catch what I said. It’s not as if there’s significant variability in how long I’m holding the button, and I’m running on the very latest hardware. This system or its integration needs some work to be consistently useful.

    I want to like Siri, and it’s sometimes even helpful, but it sure has a lot of irritants.

    1. So apparently this comment system takes text in angle brackets as some kind of markup or command…

      Anyway, above I was trying to indicate action: after I tell Siri to call Jane at home, Siri asks which number to use, then I tap the number labeled “home.”

  3. Very often, but not always, the following happens to me:
    1. Hold the home button to activate Siri.
    2. Wait a fraction of a second for Siri to indicate activation by a faint vibration/tone.
    3. Start speaking, but Siri has immediately stopped listening after activating.
    4. Receive a “Sorry, I didn’t catch that” statement from Siri.
    5. Start cursing, wishing I had just done the thing manually instead of calling out for Siri.
    6. Try again and it usually _listens_ the second time.

    Infuriating. What the hell is that? Stopping listening immediately after starting?
    When that doesn’t happen, I also love the following process:
    1. Activate Siri.
    2. Say what I’d like.
    3. Silence.
    4. Siri is still listening but shows a flat line, indicating no incoming audio, even though the screen shows it already heard me accurately.
    5. Wait another 5-10 seconds.
    6. Siri finally decides that the _long_ silence means I am done, and performs the action it already heard me say.

    The fact that BOTH of the processes above happen randomly (the former much more often than the latter) is kind of amazing. They happen intermittently, though, so I haven’t bothered to go to Apple for tech support.
    Someone at Apple needs to understand that fixing crashing bugs is good, but flakiness and other nonsense like this is also very important. When something is unreliable, people give up using that feature. Random punishment is a great way to extinguish a behavior.
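On the Differential Privacy work mentioned in the first comment: as a rough illustration of the core idea (and not Apple’s published method, which is considerably more elaborate), classic randomized response lets a server learn an aggregate rate from crowd-sourced data without learning any individual’s true answer. The function names and the probability `p` below are illustrative choices, not anything from Apple’s implementation:

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """With probability p, report the true bit; otherwise report a fair coin flip.

    Each individual report is plausibly deniable: a True report may be
    the truth or may be noise.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Debias the noisy aggregate.

    Since E[observed] = p * true_rate + (1 - p) * 0.5, solve for true_rate.
    """
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p
```

The point is that privacy holds per user while accuracy holds in aggregate: over many reports the noise averages out and the population rate is recoverable to within sampling error.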
