As Apple integrates more AI features across its ecosystem (like Apple Intelligence), the company appears to be betting that foundational AI models will increasingly become commoditized, much like other tech components, shifting value toward integration, user experience, privacy, and on-device capabilities rather than toward owning the most advanced raw model. This piece explores Apple’s strategic wager in the rapidly evolving AI landscape.
Instead of obsessing over owning the absolute best raw model — a title that changes by the minute depending on the criteria being tested — Apple is betting on ecosystem control as the real differentiator, while keeping the backend interchangeable for cost, performance, or strategic reasons.
Alistair Barr for Business Insider:
Apple’s decision to rebuild Siri around Google’s Gemini AI models looks, at first glance, like an admission of failure. After years of promising breakthroughs, Apple is reportedly paying Google roughly $1 billion a year to keep its digital assistant relevant.
Look closer, though, and the move represents something more radical: a giant bet that AI models will become commodities.
Top tech reporter Mark Gurman wrote this week that Apple’s revamped Siri, codenamed Campos, will launch later this year as a full-fledged chatbot embedded across iPhones, iPads, and Macs. The underlying intelligence will come from Google’s Gemini.
The more important detail is architectural. Apple is designing Campos so that the underlying AI models can be swapped out over time, according to Gurman’s report for Bloomberg.
So, Google’s Gemini might power Siri today, but tomorrow it could be OpenAI’s latest GPT offering, or Claude from Anthropic, or whatever Meta is cooking up next, or xAI’s Grok, or Mistral’s offerings, or even region-specific models such as DeepSeek or Alibaba’s Qwen in China…
Of course, it might turn out that underinvesting in your own AI capabilities is a strategic blunder of epic proportions. We’ll see in the next few years.
MacDailyNews Take: We’re perfectly fine with a hot-swappable Siri. We just do not want to be limited only to Google Garbage. As soon as possible, Apple, give users the ability to choose LLMs and even to assign individual LLMs to specific jobs: one for math, another for language, others for coding, image generation, and so on. Then Apple users can have the best of all AI worlds!
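To make the "hot-swappable, per-job" idea concrete, here is a minimal sketch of what such an architecture could look like: a common interface over interchangeable model backends, plus a router that assigns each task category to a chosen model. All of the names here (`LLMBackend`, `Router`, the toy backends) are illustrative assumptions, not Apple's or any vendor's actual API.

```python
# Hypothetical sketch of a hot-swappable LLM architecture with
# per-task routing. Backend and router names are invented for
# illustration; real providers would sit behind the same interface.

from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class LLMBackend:
    """A stand-in for any provider (Gemini, GPT, Claude, ...)."""
    name: str
    generate: Callable[[str], str]  # prompt -> response


@dataclass
class Router:
    """Routes each task category to its assigned backend."""
    default: LLMBackend
    assignments: Dict[str, LLMBackend] = field(default_factory=dict)

    def assign(self, task: str, backend: LLMBackend) -> None:
        # The user (or the OS) picks a model per job; swapping a
        # backend is just a reassignment, with no caller changes.
        self.assignments[task] = backend

    def ask(self, task: str, prompt: str) -> str:
        backend = self.assignments.get(task, self.default)
        return f"[{backend.name}] {backend.generate(prompt)}"


# Toy backends standing in for real providers.
gemini = LLMBackend("Gemini", lambda p: f"answer to {p!r}")
claude = LLMBackend("Claude", lambda p: f"answer to {p!r}")

router = Router(default=gemini)
router.assign("coding", claude)  # e.g. a different model just for code

print(router.ask("math", "2+2"))        # falls back to the default
print(router.ask("coding", "fizzbuzz"))  # uses the assigned backend
```

The point of the sketch is that callers only ever talk to the router, so swapping Gemini for GPT, Claude, Grok, or a region-specific model is an internal change — which is essentially the architectural bet the Bloomberg report describes.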
[Thanks to MacDailyNews Reader “Fred Mertz” for the heads up.]

Hot-swappable LLMs are the least Apple can do for the long-suffering “Macs aren’t user-upgradeable anymore!” crowd 😉