Intel’s Broadwell chips further delayed; not shipping for most Macs until early-mid 2015

“Intel’s line of 14-nanometer Broadwell chips, which are expected to be included in future versions of the MacBook Air, Retina MacBook Pro, and iMac, have been further delayed, reports Chinese site VR Zone,” Juli Clover reports for MacRumors.

“According to the site, while Intel will begin production on its extremely low power Core M processors in July and August for a 2014 launch, production on the U and H Broadwell chip series will not begin until much later in the year,” Clover reports. “As a result, the Broadwell U 2+3 dual-core chips with GT3 (HD 5000 or Iris) graphics, likely slated for use in the MacBook Air and the 13-inch Retina MacBook Pro, won’t be ready to ship until February of 2015. The Broadwell H 4+3e quad-core chips with Iris Pro graphics designed for the larger Retina MacBook Pro and iMac won’t be shipping until July 2015 at the earliest.”

“Intel’s continual Broadwell delays are likely impacting Apple’s own release plans,” Clover reports. “Rumors have suggested Apple is planning a fall launch of a refreshed Retina MacBook Pro and a new 12-inch MacBook Air, but it is unclear which chips the company will use. Apple may only be able to offer a minor Haswell processor boost for the Retina MacBook Pro, which will be the only update the line sees until Broadwell is ready.”

Read more in the full article here.

[Thanks to MacDailyNews Reader “Dan K.” for the heads up.]


      1. Why do you need 14nm chips? Your statement sounds like “Apple’s going to make a capacitive tablet? Good luck using your run-of-the-mill stylus with that! Yonk yonk yonk!”

        By that I mean, there are a LOT (and I do mean lot) of devices out there in use right now, some high performance, some low power, and none of them using a 14nm process. Is there something that only that number provides that, say, just producing a better processor (that doesn’t have to carry any of Intel’s backwards-compatible baggage) wouldn’t provide? At, say, 28nm?

        1. Well, my comment was in response to your post. I don’t want to put words in your mouth, but I’m not sure how one is supposed to take, “…Apple has to make their own CPUs” other than to say Intel is failing and Apple needs to take it (because they can do better).

          Apple can’t. Nobody can. Not even close.

          Intel is experiencing this setback due to the insanely ridiculous nature of what it takes to get to 14nm. They’re there, they just need to improve the process to optimize yields.

          Again, good luck to anyone thinking they’re going to jump in there and take this on.

          It would be as if, when Apple couldn’t meet demand during the iPhone 5s launch, someone had said, “Yet another reason why AT&T needs to make their own phones.” Only multiply that several fold.

          It’s not that 14nm is a magic measurement of power and efficiency, but by all measurements, nobody comes close in this class of processors.

          The only, tiredly overplayed, exception to where Intel doesn’t lead, for now, is in the mobile space, where Intel went in the wrong direction and is working to change course.

          Apple’s A series chips are impressive as hell, but the design is based on work by others. The actual fabrication of the chips is done by yet others.

          When it comes to building a processor that would compete with those in this class, where does the original reference design come from upon which to build???

          Then, who’s going to fabricate them???

          For this class of processors, Intel owns and dominates in all areas from design to fabrication.

          You might have just as well said, “Yet another reason why Apple has to buy Intel”.

          1. “due to the insanely ridiculous nature of what it takes to get to 14nm.”
            Again, who needs 14nm? IF you want to say “Intel is the only company doing 14nm” no argument there. If you want to say “Intel is the only company that CAN do 14nm” no argument there either. I’m just saying that there’s WAY more that goes into a processor than just that. Apple has shown that it can make processors (that only run iOS, but that’s all that’s important to them) better than anyone out there in the mobile space. One of the reasons why they can is because they own the compiler that produces the code. They could come out with an A9 processor, release a new version of Xcode and now every Apple developer has access to whatever features that chip provides.

            Intel makes VERY nice general purpose processors that can fit a wide variety of use cases. However, they have never attempted (and will never attempt) to make a processor that’s only intended to work JUST with OSX. If they ever did, they’d be hands down better than Apple at it. But they won’t.

            Once Apple creates a processor with the performance and thermal envelope (at whatever process size works for them) they need, they could move laptop and eventually desktop systems to the new processor with just a tweak of Xcode. I would be willing to bet that on that day, Intel also announces that their first customer for their new “We are the world’s best fab” model will be Apple. 🙂

            “When it comes to building a processor that would compete with those in this class”
            If by “class” you mean x86-64 instruction set compatible chips, Apple can’t do that better than Intel BUT has shown that it’s not needed (in the mobile space). In the beginning, OSX was not x86-64 compatible once before, it can be x86-64 instruction set incompatible once again.

            1. “Again, who needs 14nm?”

              You took that sentence out of context. That’s the explanation as to why the processor is being delayed. As far as who needs 14nm… anyone who can match Intel on everything else that makes the processor and then needs to match them on getting to 14nm. Again, good luck on any part of that.

              “Apple has shown that it can make processors (that only run iOS, but that’s all that’s important to them) better than anyone out there in the mobile space.”

              No they haven’t. They’ve shown that they can take ARM reference designs and design around them and others to design the A series which is then built (fabbed) by others.

              For desktop/laptop processors, whose reference designs are they going to start with?

              “However, they have never attempted (and will never attempt) to make a processor that’s only intended to work JUST with OSX. If they ever did, they’d be hands down better than Apple at it. But they won’t.”

              That’s not entirely true. Intel made chips for the MacBook Air that were based on design requirements by Apple. Granted, they weren’t just available for OS X, but they did take design spec requests from Apple. Further, Intel fabs ARM chips for other customers.

              I’ve been saying this for a while now, but don’t be surprised if Intel finds itself more involved in iOS devices rather than ARM being in Macs.

            2. “anyone who can match Intel on everything else that makes the processor and then needs to match them on getting to 14nm”
              What I’m saying is a potential OSX only processor wouldn’t NEED to be 14nm. It would NEED to execute code generated by Xcode quickly and efficiently. If you can get to 14nm, gravy! But, making a desktop experience that feels the same to the end user whether you’re on Intel or A(n) is all that’s needed. If larger than 14nm is doing that now, it could continue to do that whether it’s Intel, ARM, Apple, AMD or whomever.

              “Granted, they weren’t just available for OS X”
              And there’s the rub. Any x86 chip Intel produces by default has to maintain a broad degree of backwards compatibility. Since Apple controls Xcode, they have a lot of control over what they’re sending to a processor. If Intel were to design a chip with an execution profile geared only towards OS X, that would mean fewer transistors, and THAT alone might deliver a bigger power/heat and performance improvement than even going to 14nm.

              It would be as if Intel were to give Itanium another shot, only instead of trying to get Windows and Linux distributions running on it, it just runs OS X. Actually, I like that idea, but I’m not sure Intel is willing to go down that road.

              “I’ve been saying this for a while now, but don’t be surprised if Intel finds itself more involved in iOS devices”
              Intel being more involved in iOS devices will be Intel as a foundry for Apple’s ARM ambitions. Apple’s performance is better than ANY other mobile processor out there PLUS it gives a competitive advantage when no other company can offer precisely what the A-series chips can be. I don’t see any positives for moving to Intel, but a definite set of negatives… the greatest of which is having to delay the release of some future iOS device because Intel has decided that 10nm is the way to go, but missed their estimates.

              “They’ve shown that they can take ARM reference designs and design around them and others to design the A series..”
              Not sure I understand this sentence. Apple’s earlier chips were simply ARM reference designs. The later processors take advantage of Apple’s ARM instruction set license. This is Apple’s own ARM based CPU core. This is similar to what AMD does with their x86 instruction set license.

  1. Whenever there is an article about Intel someone always brings up putting Ax chips in Macbooks…

    No that is not going to happen!

    Intel is working at the edge of process node technologies. There will be hiccups, but their engineers are the best in the world and I have no doubt they will be able to resolve whatever issues are holding up Broadwell.

    1. There are no “issues” with Broadwell other than Intel’s inability to compete with ARM chips in smartphones and tablets. The Electronista report on this notes that the low-power Broadwell chips will be available in time for holiday season 2014. This is a repeat of the PowerPC debacle, where IBM/Motorola’s desire to win over the game player market tubed Apple.

  2. I would love a MacBook Air with A series chip. Just the ability to work on basic tasks like email, Pages, Keynote with crazy battery life and have the choice to fire up to Intel for the heavy with Final Cut.

    1. Dream on.

      If someday Apple does ship a MacBook Air with an Ax series chip, it won’t have two processors. Apple will never ship a MacBook Air with BOTH an Ax chip and an Intel CPU.

      1. I love it when people say NEVER. In addition, I love how people think they know, for certain, what Apple will do. Only Apple knows what it will or will not do, and for that matter, those plans are buried deep and securely inside Apple.

      2. Much in the same way Apple was maintaining both PowerPC and Intel code for the day it would never switch to Intel processors.

        Um, vaguely remember that same line of thought on the Intel alliance.

  3. I’ve been saying for three years that 14nm was going to be very difficult, and would be delayed. It’s now been delayed several times, though I’m surprised at how long this new one will be. This will put their main production at a two year delay. Everyone else will be having similar problems, or worse!

    What has surprised me is that Intel’s chip roadmap goes to 10nm, 7nm and even 5nm! Many experts in the chip industry still believe that 10nm is close to the end, and that 7nm is possible, but not assured. As for 5nm, I was in shock that Intel had that on their roadmap as though it were a sure thing, which it definitely is not.

    As process sizes move towards 14nm, an almost exponential rise in difficulty begins. It’s not just the manufacturing portion that’s the problem, either. At 14nm, lines are just 28 atoms wide; at 10nm, 20 atoms; at 7nm, 14 atoms; and at 5nm, an absurd 10 atoms wide. Quantum effects, like electron tunneling, are so severe that it may be impossible to work around them.
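    The atoms-wide figures above are easy to sanity-check. They implicitly assume roughly 0.5 nm per silicon atom (about the silicon lattice spacing); a minimal back-of-envelope sketch under that assumption:

    ```python
    # Back-of-envelope check of the "lines N atoms wide" figures above.
    # ATOM_SPACING_NM is an assumed effective width per silicon atom
    # (~0.5 nm, roughly the silicon lattice spacing), not an exact value.

    ATOM_SPACING_NM = 0.5

    def atoms_wide(feature_nm: float) -> int:
        """Approximate number of atoms spanning a feature of the given width."""
        return round(feature_nm / ATOM_SPACING_NM)

    for node in (14, 10, 7, 5):
        print(f"{node} nm -> ~{atoms_wide(node)} atoms wide")
    # 14 nm -> ~28, 10 nm -> ~20, 7 nm -> ~14, 5 nm -> ~10
    ```

    The same arithmetic makes clear why each shrink gets harder: halving the feature size halves the atom count, so tolerances shrink toward single-atom precision.
    
    
    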

    1. With all the money “supply chain genius” Cook is sitting on, it is inexplicable that Apple doesn’t have in-house fab for ARM chips.

      As for Intel — the only way Apple will have a chip advantage over any other computer maker is if Apple buys Intel. Costs and incompatibilities resulting from adopting a non-Intel chipset would be devastating to Apple, and AMD doesn’t seem to have superior hardware to offer. Assuming that Cook doesn’t have the balls to buy Intel outright, the smartest thing Apple could do is to license Intel technology and bring some Mac chipset production in house. Again, Apple can’t seem to find the cash to do it, so it will just wait in line for new chips like everyone else.

        1. “Mike” (one of many around here) isn’t realistic about what difference it would make if Apple bought Intel. In fact, if that happened it would only create further delays. Intel is, from what I’m reading, working on breaking new boundaries in chip fabrication. What can Apple add to speed that up? Nothing.

          As for moving from RISC PowerPC chips to CISC x86 Intel chips, that was a relatively easy jump. Going from CISC back to RISC is INCREDIBLY hard when you have an established software base. Not Gonna Happen for the Mac platform. All the CISC Intel technology is now deeply embedded in all active Mac software.

          Thankfully, gratefully, Apple kept iOS OUT of the CISC CPU world, which gives it a great future with minimal reliance on CPU APIs.

          And no, I’m not going to further debate this soooo old and boring issue. Please simply go read about the difference between CISC and RISC CPUs and you’ll comprehend.

            1. Before I rip you to shreds for asking a meaningless question, could you please state it again in a more descriptive way? Oh and, would you please DARE to post as yourself, as opposed to an anonymous coward out to troll others? You’ll gain some modicum of credibility if you do. As stated, I have no idea what you’re attempting to ask or insult in your question.

            2. No insult. You said, “All the CISC Intel technology is now deeply embedded in all active Mac software.” That implies that you feel it’s virtually impossible if not impossible for Apple to move away from x86. If that’s what you’re saying, then we can agree on that point. If it’s impossible, it won’t be done.

            3. It is NOT impossible. We both know that. It is, however, a pointless endeavor. It would literally mean recoding EVERYTHING.

              Then add in the fact that Apple’s ARM chips are NOT as fast as Intel’s current chips.

              Then add in Apple having to dump the Xeon chips as well as the Core Intel chips, which is another hobbling step.

              Plus, this past week Intel FINALLY brought out the Broadwell chips. Sheesh, are they late! They were supposed to be out last fall. The next generation is going to be late as well. *sigh* But it’s not like the situation with IBM, whereby there wasn’t a chance in hell that they’d fulfill their promises to Apple. Intel is running into a new bleeding-edge set of problems at the atomic level, which is not a surprise to anyone. Slowly, new approaches to making new CPUs are being discovered and developed. It’s certainly not as fast as I would like. But it’s progress. It’s promising.

