How Apple’s machine learning beats Google Android’s

“While Google, Facebook and Amazon are each making significant progress as it relates to AI, it’s worth noting that Apple was the first company of the four to embrace it,” Gene Munster writes for Loup Ventures. “Apple’s AI roots date back to the mid 1990s with handwriting recognition on the Newton. In June Apple announced Core ML, a platform that allows app developers to easily integrate machine learning (ML) into an app. Of the estimated 2.4m apps available on the App Store, we believe less than 1% leverage ML today – but not for long. We believe Core ML will be a driving force in bringing machine learning to the masses in the form of more useful and insightful apps that run faster and respect user privacy.”

“What’s different when it comes to ML between Apple vs. Android? Google provides developers with TensorFlow compiling tools that make it easy for Android developers to integrate ML into their apps. Developer blogs suggest that Core ML makes it easier to add ML models into iOS apps, but we can’t assess the comparative ease of adoption,” Munster writes. “However, we can say they are different when it comes to speed, availability, and privacy.”
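To make the integration point concrete, here is a minimal sketch of what adding a Core ML model to an iOS app can look like. It assumes a hypothetical image classifier named FlowerClassifier.mlmodel has been dropped into an Xcode project (Xcode generates the FlowerClassifier Swift class from it); the model name and function are invented for illustration and are not from the article.

```swift
import CoreML
import CoreGraphics
import Vision

// Minimal sketch of Core ML integration (hypothetical model name).
// "FlowerClassifier.mlmodel" is an illustrative image classifier added to the
// Xcode project; Xcode generates the `FlowerClassifier` Swift class from it.
func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
    // Wrap the compiled Core ML model in a Vision model object.
    guard let model = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        completion(nil)
        return
    }
    // Vision handles scaling and cropping the image to the model's input size.
    let request = VNCoreMLRequest(model: model) { request, _ in
        // The first classification observation is the top-ranked label.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    // The prediction runs on the device itself; no network round trip.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

The design point to notice is that the model ships inside the app bundle and the prediction runs on the device, which is what the speed, availability, and privacy bullets below hinge on.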

• Speed. ML on Apple is processed locally, which speeds up the app. Typically, Android apps process ML in the cloud. Apple can process ML locally because app developers target a small, known set of hardware (iOS devices) and can easily test against it. In the Android world, hardware fragmentation makes it harder for app developers to run ML locally.

• Availability. Core ML powered apps are always available, even without network connectivity. Android ML powered apps can require network connectivity, which limits their usability.

• Privacy. Apple’s privacy values are woven into Core ML; the terms and conditions do not allow Apple to see any user data captured by an app. For example, if you take a picture using an app powered by Core ML’s Vision framework, Apple won’t see the photo. If a message is read using an app powered by Core ML’s natural language processing, the contents won’t be sent to Apple. This differs from Android apps, which typically share their data with Google as part of their terms and conditions. (A sketch illustrating this on-device behavior follows this list.)
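As a small illustration of the availability and privacy points, here is a sketch using NSLinguisticTagger, the on-device natural language tagging API Apple positioned alongside Core ML in iOS 11. The analysis below runs locally, so it works with no network connection and the text is never transmitted; the sample sentence is invented for illustration.

```swift
import Foundation

// Minimal sketch: on-device named-entity tagging with NSLinguisticTagger (iOS 11+).
// The analysis runs entirely on the device, so it works without connectivity
// and the text is never sent anywhere. The sample sentence is illustrative.
let text = "Tim Cook introduced Core ML at Apple Park in Cupertino."
let tagger = NSLinguisticTagger(tagSchemes: [.nameType], options: 0)
tagger.string = text

let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation, .joinNames]
let nameTags: [NSLinguisticTag] = [.personalName, .placeName, .organizationName]

tagger.enumerateTags(in: range, unit: .word, scheme: .nameType, options: options) { tag, tokenRange, _ in
    // Keep only people, places, and organizations.
    if let tag = tag, nameTags.contains(tag), let wordRange = Range(tokenRange, in: text) {
        print("\(text[wordRange]): \(tag.rawValue)")   // e.g. "Cupertino: PlaceName"
    }
}
```

Vision works the same way for images: the photo is handed to a request handler on the device, which is why, per the article, Apple never sees it.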

Read more in the full article here.

MacDailyNews Take: Yeah, but Google’s OSes are named after desserts. (smirk)

We’ve been seeing over the last five years a growth of this inside Apple. Our devices are getting so much smarter at a quicker rate, especially with our Apple design A series chips. The back ends are getting so much smarter, faster, and everything we do finds some reason to be connected. This enables more and more machine learning techniques, because there is so much stuff to learn, and it’s available to [us]… We use these techniques to do the things we have always wanted to do, better than we’ve been able to do. And on new things we haven’t been able to do. It’s a technique that will ultimately be a very Apple way of doing things as it evolves inside Apple and in the ways we make products… Machine learning is enabling us to say yes to some things that in past years we would have said no to. It’s becoming embedded in the process of deciding the products we’re going to do next. – Phil Schiller, August 2016

SEE ALSO:
Apple launches new Machine Learning website – July 19, 2017
Apple’s Artificial Intelligence Director discusses computers that can remember – March 29, 2017
New hire could be critical step toward attracting high-profile AI research talent to Apple – October 18, 2016
Apple hires a big brain in AI to smarten up Siri – October 17, 2016
Apple transforms Turi into dedicated machine learning division to build future product features – August 31, 2016
An exclusive inside look at how artificial intelligence and machine learning work at Apple – August 24, 2016
Apple rumored to be taking big piece of Seattle-area office market in expansion – August 12, 2016
Why Apple will become a leader in artificial intelligence – August 8, 2016
Apple buys machine-learning startup Turi for $200 million – August 6, 2016
Apple touts Artificial Intelligence in iOS and opens ‘crown jewels’ to developers – June 14, 2016
Smartphones to die out within five years, replaced by artificial intelligence – survey – December 9, 2015
Apple’s extreme secrecy retarding its artificial intelligence work – October 30, 2015
Apple hires NVIDIA’s artificial intelligence director – October 24, 2015
Apple acquires advanced artificial intelligence startup Perceptio – October 5, 2015
Apple buys artificial intelligence natural language start-up VocalIQ – October 2, 2015

11 Comments

  1. “Core ML powered apps are always available, even without network connectivity.”

    Here’s hoping Apple will finally make Siri able to do AT LEAST basic tasks without needing a network connection.
The pre-Siri Voice Command accurately did a whole bunch of things without a network connection. But, in order to use it, you have to disable Siri. It should just fail over automatically when there’s no network.

1. It depends on the task and the size of the database required to fulfill it. Small tasks may only require traversing a relatively small, local database. Apple is already making this distinction in some cases, such as strictly locking up fingerprint data on iOS devices, never uploading it to the net.

  2. I’m glad Apple is kicking butt, at long last, in the AI realm.
    I’m glad Apple has competition from Google and others.

    But there’s quite a lot of work to do in order to get to real AI. It’s still a term too loosely thrown around by marketing without much actual ‘intelligence’ involved. It’s still about scouring through databases for answers to tasks. What we’re seeing now is improved ability to self-fill those databases using improved sensing/collecting methods.

BTW: What’s described below is probably the most significant technology action/movement of 2017. I point out to the future that we’re not all dummies back here in the twenty-teens. Some of us want to protect and improve the future of mankind:

Elon Musk leads 116 experts calling for outright ban of killer robots

    Open letter signed by Tesla chief and Alphabet’s Mustafa Suleyman urges UN to block use of lethal autonomous weapons to prevent third age of war

    … While the suggestion of killer robots conjures images from science fiction such as the Terminator’s T-800 or Robocop’s ED-209, lethal autonomous weapons are already in use. Samsung’s SGR-A1 sentry gun, which is reportedly technically capable of firing autonomously but is disputed whether it is deployed as such, is in use along the South Korean border of the 2.5m-wide Korean Demilitarized Zone.
    The fixed-place sentry gun, developed on behalf of the South Korean government, was the first of its kind with an autonomous system capable of performing surveillance, voice-recognition, tracking and firing with mounted machine gun or grenade launcher. But it is not the only autonomous weapon system in development, with prototypes available for land, air and sea combat….

    My hashtag for these abominations:
    #CowardRemoteMurderMachines
    Specific humans ARE responsible for the actions of their devices. Personal responsibility in all things remains the goal of human evolution. Otherwise, we’re still on the level of apes, albeit notably smarter apes. Just not smart enough to survive ourselves.

    Keep humanity humane.
    #KeepHumanityHumane

    1. Evolution doesn’t have any goals. Goals are something set by intelligent designers, not something that comes equipped with natural processes like evolution. Even when we humans try to take charge of evolution through eugenics, gene splicing, and the like, we face the unknown terror of unintended consequences, the butterfly effect of chaos theory, and Murphy’s Law.

      If human evolution did have a goal, it doesn’t seem to be personal responsibility. On average, humans are really lousy at that. After a thousand generations, we’re good at making babies but just as good at killing them or letting them starve, then blaming someone else.

1. You’re right. Any process can evolve toward an outcome favoured by special constraints. Here’s a question—can we predict that outcome? In a sense, we can, according to some evolutionary psychologists, who cite “convergent adaptation” as compelling evidence. Scientists aren’t done with the idea of God. They want to replace the irascible old man with the long white beard with a powerful natural principle that results in intelligent creatures like us, regardless of the starting point. If they’re right, then somewhere out there on another world, feisty warlike bipeds are jumping and barking at their telescopes, itching to take us on, just like in science fiction. I hope they’re right, because we badly need something to fight. But if these other creatures are rational, they’ll wait to invade until after we slaughter one another, then just walk (or slither) in and finish off (or snack on) the survivors.

  3. I clearly misused the term ‘evolution’. I suppose it’s yet another word we’ve/I’ve mashed into unintended usage. I think ‘development’, specific to both human culture and social structure, would be a better way to describe my point. It certainly would be useful if we ‘evolved’ into responsible creatures. But at this point, responsibility is a discipline, not a genetic trait. I believe there are more responsible natural human personalities. But we’re all slobs if we don’t have the perspective to observe and understand the consequences of our choices/actions.

    ♪♪ “What if God was one of us?
    Just a slob like one of us?” ♬
