“What’s different when it comes to ML between Apple vs. Android? Google provides developers with TensorFlow compiling tools that make it easy for Android developers to integrate ML into their apps. Developer blogs suggest that Core ML makes it easier to add ML models into iOS apps, but we can’t judge the comparative ease of adoption,” Munster writes. “However, we can say they are different when it comes to speed, availability, and privacy.”
• Speed. ML on Apple is processed locally on the device, which makes apps faster. Typically, Android apps process ML in the cloud. Apple can process ML locally because app developers target a known, limited set of hardware (iOS devices) that they can easily test against. In an Android world, hardware fragmentation makes it harder for app developers to run ML locally.
• Availability. Core ML-powered apps are always available, even without network connectivity. Android ML-powered apps can require network connectivity, which limits their usability.
• Privacy. Apple’s privacy values are woven into Core ML; terms and conditions do not allow Apple to see any user data captured by an app. For example, if you take a picture using an app powered by Core ML’s Vision framework, Apple won’t see the photo. If a message is read using an app powered by Core ML’s natural language processing, the contents won’t be sent to Apple. This differs from Android apps, which typically share their data with Google as part of their terms and conditions.
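The on-device model the bullets describe is what Core ML exposes to developers: a compiled model ships inside the app bundle and inference runs entirely on the iPhone's hardware, so it works offline and the input data never leaves the device. A minimal Swift sketch of that flow, assuming a hypothetical bundled image classifier named "MobileNet" (any compiled .mlmodelc would do):

```swift
import CoreML
import Vision

// Classify an image entirely on-device via Core ML and the Vision framework.
// No network call is made; the photo never leaves the phone.
func classify(_ image: CGImage) {
    // "MobileNet" is an assumed model name for illustration —
    // the .mlmodelc is compiled into the app bundle at build time.
    guard let modelURL = Bundle.main.url(forResource: "MobileNet",
                                         withExtension: "mlmodelc"),
          let mlModel = try? MLModel(contentsOf: modelURL),
          let vnModel = try? VNCoreMLModel(for: mlModel) else {
        return
    }

    // Vision wraps the Core ML model and handles image scaling/cropping.
    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    // Run the request locally against the supplied image.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Because the model file is part of the app itself, the same call succeeds with airplane mode on, which is the availability point Munster is making.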
Read more in the full article here.
MacDailyNews Take: Yeah, but Google’s OSes are named after desserts. (smirk)
We’ve been seeing over the last five years a growth of this inside Apple. Our devices are getting so much smarter at a quicker rate, especially with our Apple-designed A-series chips. The back ends are getting so much smarter, faster, and everything we do finds some reason to be connected. This enables more and more machine learning techniques, because there is so much stuff to learn, and it’s available to [us]… We use these techniques to do the things we have always wanted to do, better than we’ve been able to do. And on new things we haven’t been able to do. It’s a technique that will ultimately be a very Apple way of doing things as it evolves inside Apple and in the ways we make products… Machine learning is enabling us to say yes to some things that in past years we would have said no to. It’s becoming embedded in the process of deciding the products we’re going to do next. — Phil Schiller, August 2016
Apple launches new Machine Learning website – July 19, 2017
Apple’s Artificial Intelligence Director discusses computers that can remember – March 29, 2017
New hire could be critical step toward attracting high-profile AI research talent to Apple – October 18, 2016
Apple hires a big brain in AI to smarten up Siri – October 17, 2016
Apple transforms Turi into dedicated machine learning division to build future product features – August 31, 2016
An exclusive inside look at how artificial intelligence and machine learning work at Apple – August 24, 2016
Apple rumored to be taking big piece of Seattle-area office market in expansion – August 12, 2016
Why Apple will become a leader in artificial intelligence – August 8, 2016
Apple buys machine-learning startup Turi for $200 million – August 6, 2016
Apple touts Artificial Intelligence in iOS and opens ‘crown jewels’ to developers – June 14, 2016
Smartphones to die out within five years, replaced by artificial intelligence – survey – December 9, 2015
Apple’s extreme secrecy retarding its artificial intelligence work – October 30, 2015
Apple hires NVIDIA’s artificial intelligence director – October 24, 2015
Apple acquires advanced artificial intelligence startup Perceptio – October 5, 2015
Apple buys artificial intelligence natural language start-up VocalIQ – October 2, 2015