Apple may leapfrog to massive AI advantage


According to Bloomberg News’ Mark Gurman, Apple’s iOS 18 AI capabilities will run on-device – no need for the privacy-infringing cloud processing that rivals rely upon, and potentially profit from, since for Google, Microsoft, and other competitors, their users’ personal data is the product.

Kate O’Flaherty for Forbes:

The iOS 18 AI move is a huge win if you care about iPhone privacy, but it isn’t surprising, given that Apple is known for its strong focus in the area. It also sends a strong message to Apple’s biggest rival Google’s Android that the iPhone maker will do everything it can to win in the AI battlefield as competition ramps up.

“As the world awaits Apple’s big AI unveiling on June 10, it looks like the initial wave of features will work entirely on-device,” Gurman wrote. “That means there’s no cloud processing component to the company’s large language model (LLM), the software that powers the new capabilities.”

Apple’s AI capabilities in iOS 18 and beyond will require a huge amount of data processing power. The iPhone maker has been investing in more hardware able to host AI, and the iPhone 16 will apparently come with an enhanced neural engine… Yet running LLMs similar to ChatGPT without additional cloud support will be a challenge, says Android Authority, pointing out that “some of Samsung’s and Google’s most sophisticated AI features still require the power of cloud servers.”

Support MacDailyNews at no extra cost to you by using this link to shop at Amazon.

MacDailyNews Take: We’ll see just how much AI Apple can do on-device this June at WWDC – and how well they can market it. It’s quite possible that chatbot features will require users to opt-in to submit data to whichever cloud-based GenAI firms (Google, Baidu, etc.) with which Apple decides to ink deals.

Please help support MacDailyNews. Click or tap here to support our independent tech blog. Thank you!


10 Comments

  1. Android does do some of the more advanced AI in the cloud (with less reliance each hardware generation), but unless Apple is able to do everything on-device, its offering really will not differ much from what the current crop of Android flagships will be offering by the time Apple releases its version of AI tech.

  2. If Apple is running an LLM on device, that’s huge. Apple gets to do AI at scale for customers without spending hundreds of billions on server capacity like everyone else. Nobody has figured out how to monetize AI yet.

    If Apple pulls this off, its customers buy the compute cycles. Device sales are Apple’s monetization strategy, and moat. Keeps users investing in high-end phones. Also eliminates the #1 pain point of AI today: responsiveness/latency.

    Genius.

    1. GPT-4 is estimated to comprise up to 1 trillion parameters; GPT-3.5 has 175 billion. At 32-bit precision, GPT-3.5’s weights alone would require around 700 GB. For inference (generating output), the RAM needed is the size of the weights plus additional space for activations and processing, so the total for GPT-3.5 is well beyond that – and then there is the power consumption needed to generate results.

      GPT-4 is roughly five times larger than GPT-3.5, and GPT-5 will most likely comprise multiple trillions of parameters. There are strategies to reduce the size of an LLM – model distillation, model pruning, quantization, etc. – but the reduction in model size comes at the cost of accuracy, fidelity, and the ability to generalize and handle edge cases, and can introduce generation errors.

      And this is just for text-only LLMs; once you add modalities like audio, images, and video, the parameter count goes through the roof. Video models have to be trained to understand and generate video content, which involves processing spatial and temporal information over time, physics, classification, object detection, and a language layer to interface with the user, just to name a few. Audio, images, and video are exponentially more complicated than text.

      Apple’s on-device AI will be a scaled-down mobile model with basic capabilities and limited generation. Look to Siri to see how Apple’s on-device performance compares to the competition. Apple will spin it as revolutionary, and much of what they say about security and privacy will be true. But in terms of AI leadership, Apple will be years behind.

      Maybe I’m wrong, time will tell.
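The back-of-envelope arithmetic in the comment above can be sketched in a few lines of Python. This is only a weights-size estimate (it ignores activations, KV cache, and other inference overhead), and the parameter counts are public estimates, not confirmed figures:

```python
# Rough estimate of the memory needed just to hold an LLM's weights,
# at various numeric precisions (fp32, fp16, int8, int4).
# Parameter counts below are widely reported estimates, not official numbers.

def weight_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Memory for the weights alone, in gigabytes (10^9 bytes)."""
    return num_params * bits_per_param / 8 / 1e9

models = {
    "GPT-3.5 (est. 175B params)": 175e9,
    "GPT-4  (est. ~1T params)":  1e12,
}

for name, params in models.items():
    for bits in (32, 16, 8, 4):
        print(f"{name} @ {bits}-bit: {weight_memory_gb(params, bits):,.0f} GB")
```

At 32-bit precision, 175 billion parameters work out to about 700 GB for the weights alone, which is why quantization (16-, 8-, or 4-bit weights) is essential for fitting any serious model on a phone – at the accuracy cost the comment describes.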

      1. Will Apple significantly increase onboard memory, so users still have a good amount of storage space on their devices, if it keeps the AI tech exclusively on the iOS device? Future devices starting at 1 TB of storage?

  3. Yep, if Siri is ANY reference to on-device capability, I’m depressed. AND why isn’t she a reasonable reference? I spelled Afghanistan wrong; AAPL’s spell check offered no suggestion. Siri NEVER relays my email address correctly – it always transcribes it as ____@mack.com. It’s in my address book and I’ve spoken it dozens and dozens of times. An idiot would know it’s “mac,” especially when you “work” for/were made by the company. Very, very slow learner.

    1. That’s the problem: there is no learning at all. I’d be happy to be able to train Siri to understand and carry out a few dozen very specific actions. I don’t need an infinite number of AI possibilities; I just want my own unique and oft-repeated requests to be understood and carried out 99% of the time. “Hey Siri, read back X note to me… Hey Siri, how much does X Apple product cost (it doesn’t even know this!!)… Hey Siri, how much battery is left on my X Apple device (it can’t do it!!!)”.

