In an era when tech giants are burning hundreds of billions of dollars on massive AI data centers and infrastructure bets, Apple is conspicuously spending far less, yet quietly positioning itself to claim the real prize: ownership of consumer AI at the device level. With 2.5 billion active devices already in users’ hands, Apple’s on-device focus, powered by efficient chips and model commoditization, lets it intercept queries locally, preserve privacy, and sidestep the hyperscalers’ costly cloud dependency. Yes, Apple missed the start of the generative AI revolution, as the churn in its AI management ranks over the past few years attests. But this catch-up strategy, rooted in the conviction that the future belongs to whoever owns the end user, could turn Apple’s biggest apparent miss since Steve Jobs initially overlooked the digital music revolution into one of the company’s most decisive victories.
Daniel J. Arbess for The Wall Street Journal:
The four hyperscalers — Amazon, Alphabet, Meta, and Microsoft — are investing $700 billion this year in artificial-intelligence infrastructure. Apple? A mere $14 billion. A chart of these figures circulates on social media. It looks like four skyscrapers and a mailbox…
[Apple] is making a cold-eyed bet that the most frenzied build-out in the history of American capitalism will produce inadequate returns.
Last October Apple announced the M5 chip, embedding a neural accelerator in every GPU core. This wasn’t marketing theater. Apple’s own benchmarks show the M5 running a 30-billion-parameter model—capable of drafting legal briefs, debugging code, synthesizing research—in under three seconds on a standard MacBook Pro. Eighteen gigabytes of memory. No internet. No subscription. No API key…
Apple recently announced a milestone of 2.5 billion active devices. It has already distributed its existing data center — one device at a time — into the pockets and onto the desks of a quarter of the world’s connected population. When Apple licenses Google’s Gemini for roughly $1 billion a year to handle what its on-device models can’t, it is renting the penthouse while its competitors mortgage the building. If a better model shows up next year — from Anthropic, DeepSeek, the open-source community — Apple switches landlords…
New devices with “Apple Intelligence” will locally handle the tasks (email summaries, writing assistance, photo editing, translation, and search) that would otherwise generate cloud inference calls. Billions of daily queries will never touch a server. Hundreds of millions of knowledge workers on iPads and MacBooks currently paying $20 a month for ChatGPT or Copilot can get equivalent capability baked into hardware they already own.
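The quoted claim that a 30-billion-parameter model fits in 18 GB of memory is plausible if the weights are heavily quantized. A back-of-the-envelope sketch, assuming 4-bit weights plus a rough 15% overhead for KV cache and activations (these figures are our assumptions, not from the article):

```python
def model_memory_gb(params_billions: float, bits_per_weight: int,
                    overhead: float = 0.15) -> float:
    """Estimate resident memory (GB) for a quantized model.

    Assumes weights dominate memory use, with a flat fractional
    overhead for KV cache and activations.
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# Compare common precisions for a 30B-parameter model.
for bits in (16, 8, 4):
    print(f"30B model at {bits}-bit: ~{model_memory_gb(30, bits):.1f} GB")
```

At 16-bit the weights alone need roughly 60 GB and at 8-bit roughly 30 GB; only around 4-bit does a 30B model fit inside the 18 GB footprint cited above, which is consistent with how on-device inference is typically done.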
MacDailyNews Take: Once again the best model for many tasks, Grok, goes unmentioned. But, we digress.
Apple still has time to catch up, and even leap over rivals, especially when it comes to AI-powered Siri. – MacDailyNews, August 11, 2025
Please help support MacDailyNews — and enjoy subscriber-only articles, comments, chat, and more — by subscribing to our Substack: macdailynews.substack.com. Thank you!
Support MacDailyNews at no extra cost to you by using this link to shop at Amazon.
