Why Apple might dump Intel’s x86 for their own A-series chips in future Macs

“Though many will scoff at the notion of an iPad and Mac that draw from the same family of application processors, it’s not as farfetched as it seems,” Sam Oliver writes for AppleInsider. “[Here’s] why today’s chipmaking giants could find themselves on the outside looking in during an Apple product launch in the near future.”

“Such a move is certainly in Apple’s DNA. Since Steve Jobs returned in 1997, the company has made its bones by ignoring the vox populi and doing what it felt needed to be done, sometimes dragging consumers kicking and screaming along with them,” Oliver writes. “Much has changed in the last 9 years. Computers have become “good enough” for the vast majority of consumers who use them for email, web browsing, and an occasional household budget spreadsheet — the gangbuster sales of Apple’s relatively underpowered MacBook Air line can attest to that.”

“To gain a real advantage, companies need a competitive edge that their rivals can’t just buy on the street,” Oliver writes. “This is what led Apple to custom chips for its iOS devices, and it’s why an ARM-powered MacBook isn’t an absurd idea.”

Much more in the full article here.

Related articles:
KGI: Apple is designing its own processors for Mac – January 14, 2015
Apple A9-powered MacBook Air? – December 16, 2014
Why Apple will switch to ARM-based Apple A-series-powered Macs – August 27, 2014
Intel-powered Macs: The end is nigh – August 4, 2014
Intel’s Broadwell chips further delayed; not shipping for most Macs until early-mid 2015 – July 9, 2014
Apple will inevitably drop Intel for their own A-series processors in the Mac – June 26, 2014
How long before Apple dumps Intel from MacBook Air? – June 26, 2013


  1. I don’t know. Intel spends a LOT of money developing chips. Apple went with the A series because there were no chips that met its needs. But for desktops and laptops, Intel does a decent job.

    But it might work for Apple. Hard to say.

  2. While using terms like “vox populi” (had to look that one up), the author does not really address (in detail) WHY Apple needs to create “A-based” Macs, although the article’s title is “Why Apple might…” WHY is the important question, not HOW.

    The answer to WHY is to create Macs that are not possible using Intel processors. If Apple can do a desired future Mac design using Intel, it would be a waste of time and resources to create an A-based version. Whether an “A10” (or whatever revision) is powerful enough is a secondary consideration. Obviously, it needs to be powerful enough to run a Mac (and not just “good enough”), but being powerful enough to run a Mac does not mean Apple should use it in a future Mac. For revolutionary future Mac designs, there must be compelling reasons why Intel won’t cut it.

    If there are, I’m all for it. Otherwise, Apple should keep it simple and continue making progressively better Macs with progressively better Intel processors.

    1. Think about it from the other direction. If you are producing processors that CAN run your laptop/desktop machines that will also enable you to better control your product release cycle, why wouldn’t you?

      As far as things Intel doesn’t allow: the A chips make certain payment options available that wouldn’t exist otherwise, AND the graphics chip can be as powerful as you feel you need without being forced to make room for both Intel’s integrated solution AND a more powerful secondary GPU.

      Right now, the top-of-the-line A processor is nipping at the heels (by benchmarks) of what Intel is currently selling as a low-end solution. In five years, what Apple is using as a phone CPU could be hitting numbers right in the middle of that year’s Intel mid-range.

      1. Those may be valid and compelling reasons why using A-chips in Macs allows Apple to create Macs that are not possible using Intel processors. Didn’t you read and comprehend what I actually wrote…? 😉

      2. Great machines with great processors sounds good. However, people don’t buy machines for such reasons. They buy them for the apps. If you kill Windows compatibility, you lose half of your sales. Forget it.

        1. I seriously doubt that HALF of current new Mac customers buy a copy of VMware Fusion or Parallels Desktop (or set up Boot Camp). MOST Mac users these days do not care about Windows compatibility. In the “PC” world, platform-specific apps became less and less important as the Internet became more and more important. That’s the key reason Windows lost its “death grip,” because even BEFORE the PowerPC to Intel transition for Macs, users were already doing more and more on the Internet using web browsers (where the platform does not matter), and not using platform-specific applications.

          Ironically, in the mobile space, apps DO matter, and Apple has an overwhelming advantage with its iOS platform; Microsoft (once king of the “app advantage”) is less than pathetic.

          1. You are correct, Ken. The real number of Macs that have Windows app compatibility in one way or another is closer to 75%. Just look at the number of annual license renewals for VMware, Parallels, and CrossOver.

            Like it or not, the present OS X success is mostly based on the fact that it uses x86 Intel chips. Lose that compatibility, and kiss future Mac sales goodbye.

            If you want an ARM chip to replace your Mac today, then go right ahead and buy an iPad with an accessory keyboard. Go right ahead….

  3. Apple, don’t dump Intel x86 chips for A-powered ARM CPUs! x86 chips are more powerful and allow for Boot Camp, so users can install Windows, which helps a lot of people justify a Mac purchase. x86 chips are the way to go; Macs aren’t mobile iOS devices.

    1. Don’t worry. Unless Intel totally borks up its x86 chips on the level of IBM borking up the PPC mobile chips, this entire A-series-in-a-Mac mythology will remain merely mythology. The work required to go from a CISC to a RISC CPU is outrageously difficult for every developer and offers zero incentive (I SAID ZERO!) at this time. This subject shouts out intense willful ignorance of computer technology. If only people bothered to do their homework before they spread these idiot rumors blahblahblah IASSOTS…

      1. DC, while Intel is currently in the driver’s seat with respect to laptop CPUs, A-series based Macs could happen without a total “bork up” on the part of Intel.

        I believe that your supporting rationale is highly flawed. If this transition to A-series CPUs were to occur, Apple would handle the “CISC vs. RISC” transition, not the developers. Apple would handle it in the OS, and Apple would handle it via its software development tools. Apple made the transition from the Motorola 68000 series to PPC processors developed under the AIM alliance. Apple made the transition from its Classic OS to OS X. Apple then made another major CPU transition from PPC to Intel CPUs. Apple now designs and fabricates (through foundry contracts) its own mobile processors. Why is it such a leap to believe that Apple might seek to leverage those processors in product lines other than iOS devices?

        I believe that you went overboard with your “intense willful ignorance” and “idiot” statements. What is the justification for your derision?

        1. Please, go through what it would take to turn an application dependent upon Intel APIs built into their x86 CPUs into an application that can run on a RISC CPU. No, there’s nothing flawed about my assertion that making that transformation is a total PITA and would NOT please any developers.

          And PLEASE don’t bring up the ’emulator’ word as that is also a no-go in this day and age. We’re now used to the speed of virtualization, another feature BUILT into modern Intel CPUs, whereby alternative operating systems run at native speeds.

          This whole issue is dead in the water. It is remarkable that it is undead in so many people’s ignorant minds. As I’ve repeated over and over: Prove it to yourselves kids and stop avoiding your homework.

          1. Developers don’t write assembler code directly for the CPU. Apple provides them with an abstraction layer in the form of a programming language and an SDK with the relevant APIs for the hardware. Apple provided a simple migration from PPC to Intel for developers who used Apple’s own SDK. Those who developed for Mac OS 9 using commercial tools were stuck re-writing their code.

            Apple has shown many times that it knows how to execute hardware platform migration without significant loss of customers. If they choose to do this again, there is no reason why they would fail this time around.
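            The abstraction-layer point can be sketched in plain C. The source below contains no architecture-specific code; the compiler, not the app developer, decides which instruction set to emit, and (as Apple did with universal binaries in the PPC-to-Intel transition) multiple slices could hypothetically be bundled into one fat binary. This is an illustrative sketch, not a description of any actual Apple toolchain workflow.

            ```c
            #include <assert.h>
            #include <stdio.h>

            /* Portable source: nothing here depends on the CPU architecture.
               The same file compiles unchanged for x86_64 or arm64; only the
               compiler's target selection changes the emitted instructions. */
            static const char *target_arch(void) {
            #if defined(__x86_64__) || defined(_M_X64)
                return "x86_64";
            #elif defined(__aarch64__) || defined(__arm64__)
                return "arm64";
            #else
                return "other";
            #endif
            }

            int main(void) {
                const char *arch = target_arch();
                /* Whatever the build target, the program behaves identically. */
                assert(arch != NULL && arch[0] != '\0');
                printf("compiled for: %s\n", arch);
                return 0;
            }
            ```

            On a toolchain that supports it, the same source could be built for both targets and merged (e.g. with `clang -arch x86_64 -arch arm64`, hypothetically), which is the sense in which the porting burden falls on the platform vendor rather than on each developer.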

            1. Intel provides APIs that are written into their CISC chips. None of that code is in anyone’s RISC chip, by definition.

              We keep going over the same ground over and over. Read about CISC CPUs. Read about RISC CPUs. It’s all right there.

    1. That was important at the time of the PowerPC to Intel transition, almost a decade ago. Not so much anymore… It will certainly not be a “deal-breaker,” if and when Apple decides to create a Mac powered by an “A-chip.”

      If anything, new (first time) Mac customers these days are trying to “get away from” Windows… 🙂

    1. GRID 2 Reloaded Edition:

      • Minimum System Requirements: Processor: 2.0GHz, RAM: 4GB, Graphics: 256MB, Free Space: 11GB.

      • The following graphics cards are not supported: ATI X1xxx series, ATI HD2xxx series, Intel GMA Series, Intel HD3000, NVIDIA 7xxx series, NVIDIA 8xxx series, NVIDIA 9xxx series, NVIDIA 1xx series and NVIDIA 3xx series.

      Sheesh! I have to wonder why they bothered to develop the game if they’re intent on neglecting this vast pile of GPUs! Is this laziness on their part?

      1. Even some turn-based strategy games like the recent “Civilization: Beyond Earth” have silly hardware requirements. It must be because of all the animations all over the screen. I’d say work more on the “strategy” and gameplay, and less on the eye-candy. It’s a strategy game (where I move my “pieces” one by one like chess), not a 3D action game.

        I was hoping this game would be a modern update of one of my all-time favorite games, “Alpha Centauri” (also a “Sid Meier’s” game). It’s more like Civ 4 with sci-fi (instead of historical) themes, which isn’t bad. But I like the AC game play and storyline much better, and the opposing AI players act more “human.” Nice graphics are fun for a while, but then become irrelevant (and even detrimental).

  4. NeXT solved the problem of making a CPU architecture-independent OS many years ago, and there’s no reason why Apple couldn’t offer Macs based on x86, ARM, or some future architecture to emerge from Apple’s labs in the next couple of years.

    If they make a MacBook Air this year that’s ARM-based, we could see any number of benefits, not the least of which could be a laptop with 30 hours of battery life.


  5. A USB-powered 12″ super-thin A-series MBA … sounds like it’s coming soon! Whoever the fuck needs to torture themselves with Windows can buy another model with old-style silicon.

    Bravo Apple if true!

  6. This ignores the basic requirement that Apple would then have 3 OSes to support — iOS, OS X ARM, and OS X Intel. Not to mention that software someone has purchased to run on OS X Intel would not run on OS X ARM, certainly not without some sort of emulation, which would negate any performance/battery life gained by switching to ARM chips.

    Finally, convincing developers to re-write their apps for ARM processors will not be an easy task. Apple finally had some major app developers come out with OS X versions, like AutoCAD, and now it’s going to switch up on them? And not really for any marked gain in performance or price either.

