Intel unveils X-series platform: Up to 18 cores and 36 threads

“AMD announced its new high-end desktop (HEDT) platform, the 16-core 32-thread ThreadRipper a couple of weeks ago,” Peter Bright reports for Ars Technica. “At Computex in Taipei, it is Intel’s turn to update its HEDT platform, and it is one-upping AMD in the process. The Intel platform, consisting of the new X299 chipset and new X-series processors, will go all the way up to 18 cores and 36 threads.”

“The new platform has wider range and greater complexity than the X99 platform it replaces. At the low end are ‘Kaby Lake-X’ processors: the i5-7640X and the i7-7740X,” Bright reports. “These chips are very similar to the existing mainstream Kaby Lake processors that came to market earlier this year; four cores with either four (for the i5) or eight (for the i7) threads, two memory channels, and 16 PCIe 3 lanes from the CPU itself. The chips have a higher power envelope—up to 112W, instead of the 91W of non-X parts—and use X299’s new Socket 2066. Their clock speeds are a little higher, too; the $339 i7-7740X has a base clock of 4.3GHz and a turbo of 4.5GHz, compared to the 4.2/4.5GHz of the $339 i7-7700K.”

“Step up the price a little and things become more interesting. Out goes Kaby Lake-X, in comes Skylake-X,” Bright reports. “Skylake-X is not simply an existing Skylake processor in a new socket; instead, it’s a desktop version of the Skylake-SP core that’s going to be used in the next generation of Xeon chips… With its 18-core chip, Intel is scaling a bit higher for those who truly want the best that money can buy. Intel’s single-threaded performance, which is still critical in many workloads, is also going to be superior…”

Much more in the full article here.

MacDailyNews Take: Right back at ya, AMD! Competition is a wonderful thing.

Now, who wants a new, extensible Apple Mac Pro with an 18-core Intel Core i9 Extreme inside?

SEE ALSO:
Will Apple put AMD processors in Macs? – February 27, 2017
AMD’s Ryzen processor overclocked, notches new Cinebench world record – February 24, 2017

26 Comments

    1. Enough with the brainless brand cheerleading. Apple’s Macs require the best chipsets available. Those come from Intel.

      All the blubbering about how Apple’s 2- or 4-core RISC chips are good enough to compete with CISC chips from a decade ago is nauseating. If you cannot or will not understand the advantages of CISC, then please stop posting your dislike for Intel until you do.

      Jobs switched the Mac to Intel for many good reasons. Those reasons are still valid, and will be for years to come. The question is whether the fashion mavens at Apple have enough computer science brains to keep up with the many computer companies that do. Right now, every Mac model is either pathetically underpowered with aging chipsets, or tremendously overpriced. Using proprietary chipsets that place undue burden on either Apple or Mac software developers is a stupid way to attempt to regain the traction Apple lost while Timmy was focusing all his time on hipster store makeovers, emoji and Snapchat feature copying, rental computing, me-too services, and clouds.

      1. To be honest, all these cores and threads are great, but the real shake-up will come once quantum computing really arrives and stabilizes in the market. I believe it will be the biggest thing since the dawn of computing; it will change the way we truly use a computer, not only the interface but the process itself. To be real, that is the upgrade I’m looking at. These chips are all running into the fight between heat and wattage, always working against each other to stay cool enough to run processes better — basically to offer a more robust platform that gets more work done, but without killing the upgrade path and the extra money from customers. I recall hearing years ago that Intel had 25 years of roadmap already in its laboratories, in preliminary development on the back end where people don’t see it — no shocker. It goes back to what I said to somebody else: just because it’s not on the market doesn’t mean it doesn’t exist.

        1. Well, no. In all likelihood you will not use a quantum computer for day-to-day problems. They are not suited to the things that conventional computers do well, and vice versa, not to mention that true quantum computers are a long, long way off. “…Large-scale quantum computers would theoretically be able to solve certain problems much more quickly than any classical computers that use even the best currently known algorithms, like integer factorization using Shor’s algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon’s algorithm, that run faster than any possible probabilistic classical algorithm… On the other hand, quantum computers may be able to efficiently solve problems which are not practically feasible on classical computers…”

          Factorization of really big numbers, for instance. Quantum computers are good at guessing shit, not for use with databases (see the sketch after the list below).

          “Consider a problem that has these four properties:

          The only way to solve it is to guess answers repeatedly and check them,
          The number of possible answers to check is the same as the number of inputs,
          Every possible answer takes the same amount of time to check, and
          There are no clues about which answers might be better: generating possibilities randomly is just as good as checking them in some special order.”
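
          The four properties quoted above describe unstructured search, the case where Grover’s quantum algorithm gives a quadratic speedup. As a rough, hypothetical sketch (the secret value and problem size below are made up for illustration), here is the classical guess-and-check approach in C: it needs on the order of N checks in the worst case, where a quantum computer running Grover’s algorithm would need roughly √N oracle queries.

          ```c
          #include <stdio.h>

          /* Classical brute-force search: guess candidates one by one and check each.
           * Worst case: N checks.  Grover's algorithm on a quantum computer solves the
           * same unstructured search in roughly sqrt(N) oracle queries -- a quadratic,
           * not exponential, speedup. */

          /* The "oracle": the only way to recognize the answer is to test a guess.
           * (7291 is an arbitrary made-up secret for this example.) */
          static int is_answer(unsigned int candidate) {
              return candidate == 7291u;
          }

          int main(void) {
              const unsigned int N = 1u << 20;   /* about a million possible answers */
              unsigned int checks = 0;

              for (unsigned int i = 0; i < N; i++) {
                  checks++;
                  if (is_answer(i)) {
                      printf("found %u after %u checks (N = %u)\n", i, checks, N);
                      break;
                  }
              }
              return 0;
          }
          ```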

      2. Intel’s chips are actually RISC cores with a CISC instruction front-end for backwards compatibility, which was absolutely crucial to gain acceptance. CISC instructions are broken down into simpler RISC-style instructions and then fed to the CPU cores.

        So there are really no “advantages” to CISC, other than if you’re an assembly programmer and take advantage of multi-step instructions instead of several simpler ones.
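
        As a loose illustration of that last point (compiler output varies by target and flags, so this is a sketch, not a guarantee): the one-line C function below typically compiles on x86-64 to a single read-modify-write instruction such as addl $1, (%rdi), which the CPU front end then cracks into load, add, and store micro-ops, while a load/store RISC ISA such as AArch64 expresses the same operation as three separate instructions.

        ```c
        /* One C statement; the instruction set decides how many instructions it takes.
         *
         * x86-64 (CISC):  typically one instruction, e.g.  addl $1, (%rdi)
         *                 -- decoded internally into load + add + store micro-ops.
         * AArch64 (RISC): typically three, e.g.  ldr w8, [x0]
         *                                        add w8, w8, #1
         *                                        str w8, [x0]
         */
        void increment_in_memory(int *counter) {
            *counter += 1;
        }
        ```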

        1. Your point is moot, because Intel does not allow direct RISC instructions. Intel offers a very powerful decoder on the front end so programmers can use the full x86 instruction set; the cores then receive discrete, fixed-size instructions to process at maximum efficiency.

          RISC is a good core architecture for processing simple tasks, but it is less efficient in storage, requires many more processing steps for complex tasks, and there is a huge volume of code that takes advantage of CISC programming.

          Like it or not, CISC programming has a dominant position in the market, which Apple now uses for the Mac. To abandon that would be more disruptive than moving from the old PowerPC to Intel — which was not seamless for many software vendors and customers, I might remind you.

          I am not saying Intel’s architecture is theoretically perfect; it has plenty of oddities. But Apple has nothing to gain by dumbing down the Mac to be even more like the iPad, which has proven to be a limited-capability consumer machine and a relative sales dud.

        1. You misspelled “PowerPC chips”.

          Nothing is stopping you from using RISC chips today if you think you can compete with the rest of the world using ARM or PowerPC or whatever. Good luck with that.

          To touch on the endlessly repeated Jobs analogy: all the best trucks — as well as SUVs, cars, trains, agricultural machines, and construction equipment — run on Intel or AMD chips. They remain well in the lead of competing RISC architectures in terms of total system performance by any measure that matters to professionals. An iPhone-focused chip architecture isn’t going to be found in the Mac for a long, long, long time to come, especially at the pace Timid Tim moves.

      3. The reason that Jobs made the switch to Intel was *not* RISC versus CISC. The decision, as I recall the process, was based on the following:

        1) The AIM (Apple – IBM – Motorola) alliance began declining in the late 1990s. IBM was more concerned about its Power servers, and Motorola was floundering.

        2) After a strong start in the mid-1990s with the PPC processors, they evolved very slowly through the G3 and G4 series, with clock speeds stagnating relative to Intel’s newer processors. By the early 2000s, the G4 was still at 500 MHz while Intel processors had advanced to the GHz range.

        3) Apple did not provide enough volume to drive the AIM alliance forward. The economy of scale enjoyed by Intel enabled it to advance its processor designs more quickly and offer them in volume at a reasonable price.

        4) Intel CPUs provide native Windows compatibility, making the Mac the most flexible consumer desktop option available. In my opinion, this was far more important in the mid-2000s (when the transition to Intel occurred) than it is now, over a decade later.

        What is different with Apple’s A-series processors? First, Apple licenses the ARM IP, but *owns* the processor designs and employs third-party foundries to manufacture its processors. Second, Apple consumes hundreds of millions of A-series processors each year – an economy of scale that I believe even surpasses Intel. Third, Apple fully controls the APIs and tools used to develop software on its A-series processors / SoCs, including deep programming options like Metal. Fourth, Apple has succeeded in winning the performance/watt war to date. The Qualcomm Snapdragon is second, and Intel is a very distant third. Fifth, Apple has rapidly evolved the performance of its A-series processors and created highly efficient SoCs with graphics processors capable of handling lots of pixels.

        The A-series processors are nearing performance parity with mainstream Intel desktop processors, and the next generation or two may vault Apple past Intel in that regard. If Apple’s A-series processors are architected to function in parallel, then multi-processor cards could provide desktops and laptops with a large number of processing cores and a great deal of graphics power in a package designed for hand-held mobile devices with low power consumption and low heat dissipation.

        I don’t know why you believe that CISC/RISC is an important issue in this context, Mike. Regardless, Apple has a real opportunity to split off on its own and transition more of its products to A-series processors in the future. I am willing to bet that this transition will begin in the next two years.

        1. Good points, but that’s not the whole picture, is it?

          Fundamentally, CISC versus RISC is still the issue, and Intel won that war for personal computers. Period, the end. Intel is not folding up shop, and ultra-mobile devices have not eclipsed the capability of Intel chips, even if Intel was relatively slow in adapting its offerings to the mobile phone market.

          1-3) Yes, and the weakest player of the triad wasn’t necessarily Motorola or IBM. Apple had very poorly managed its product portfolio and pricing, leading to poor economies of scale. But the fact that Apple has tapped out all available chip foundries — even relying on Samsung for many chips — tells us that Apple doesn’t have another source for moving the Mac to ARM chips. They would have to buy from one of the last best chip foundries with capacity to deliver — Intel. I guarantee the price to change over has been assessed and it isn’t worth it. Apple is sticking with Intel’s CISC chips for the Mac.

          4) Native Windows compatibility is huge and remains a huge driver for the Mac, even if you personally think you don’t need that capability. Windows has by far the biggest and most prestigious software catalog of any computing platform, with most businesses still relying on non-Mac software for their lifeblood.

          Apple’s A-series processor designs for the iPhone are not transferable to the Mac without massive redesign in hardware and, of course, software, especially for desktop machines, which have entirely different design goals. There is no economy of scale to be gained there, and the switchover costs to suppliers, Apple, and third-party developers add up quickly.

          Don’t underestimate Intel’s manufacturing prowess. They do have superior economies of scale for the chipsets with the full feature set that Apple needs to create Macs.

          Also, despite great strides by AMD, Intel remains in the lead with overall chipset performance, with cutting-edge memory controllers and other stuff that is overlooked by the casual computer buyer. Apple must know this; they practically bet the Mac farm on Thunderbolt, which is an Intel technology.

          A-series processors ARE NOT nearing performance parity with Intel processors. Any computer scientist will show you that if you architected an ARM system to compete head to head with an Intel chipset, costs would be higher due to the need for more cores to crunch the shorter instruction sets, destroying any cost or efficiency gains that people think ARM has. Then you would have to kludge together I/O capabilities that current ARM computers (like the iPhone and iPad) do not offer but which Mac users demand, as well as significantly greater graphics performance and options.

          Bottom line, there is no advantage of moving the Mac to mobile chips.

  1. Yah. This will be announced next Monday in a Pro-Mac (which is fully customizable and expandable) running in parallel with 18 AX “neural” AI chips. Alongside VR-capable graphics chips, of course.

    It’s in the pipeline, y’all!

  2. Hmmm, so worth waiting for in a new PC/Linux workstation? Because you know cutting-edge stuff never makes it into Apple’s pro line anytime soon, and it’s doubtful in the 201? redesigned Mac “Can’t re-innovate, eh heh, my ample ass” Pro.

    1. Different MDN article, same post. You have expressed your dissatisfaction with Apple and drooled over PC/Linux boxes ad nauseam. Are you going to wait for the next Mac Pro and MacBook Pro, or buy PC/Linux? Not that I really care, but the sooner you decide, the sooner you will move on to a different rant.

      1. As you know, we sometimes post things repeatedly, hoping to catch Apple’s eye, because you never know when they might be looking. Really, it’s no sweat off your brow; you don’t have to read it, and I will never stop saying it.

        I am waiting to see what happens at WWDC since I have a moment to do that. I am giving Apple every last gasp of my patience (as are many) to entice us, but I suspect no matter what they do the solution will be overpriced and not as elegant as current and older tower solutions on other platforms. I just don’t think the paradigm they will eventually come up with will work any better.

        Just because an issue doesn’t affect you doesn’t mean you or anyone else needs to be coldly indifferent. I respect the issues anyone else has and would encourage quick resolution for all.

        All sectors of Apple’s device markets should be firing on all cylinders and delighting customers, and this time Apple even admitted – they weren’t. Sorry, but a company with billions in the bank and the ability to foresee things much further in advance than it did, or to have staff paying attention, gets little of my sympathy. I hope WWDC delivers some news on the new Mac Pro.

        1. I hope that WWDC delivers a **LOT** more than merely “new Mac Pro news.” I hope that there are specs, a price, and a firm delivery schedule that starts in 2017.

          If Apple wants “courage” to “Innovate”, they need to get their head out of their new corporate doughnut (the mega-trashcan) and DELIVER a Mac Pro PRODUCT. On Time. On Schedule. On Budget.

        2. Amen to that Scotty! Over-designed pro computers that fly in the face of upgradeability and versatility for a wide variety of pro uses need not apply. No more closed innovate-to-a-dead-end trash cans!

  3. Many people would be happy if Cook were able to spare enough pocket change to be first in line for Coffee Lake chips, instead of being the industry laggard in adopting each generation of chips.

    Coffee Lake is reputed to be 30% faster than Kaby Lake (in low-power operation) and will be available starting in late 2017. So should we get our hopes up that Apple could muster enough enthusiasm for the Mac to deliver Kaby Lake across the board in the next year, then Coffee Lake by mid-2018? The next year will be the defining moment for Apple — either it’s still a computer company or Timmy is making it a lifestyle image brand first and foremost.

    http://www.cultofmac.com/tag/coffee-lake/

  4. Skylake-X, Kaby Lake-X (and to a lesser extent, Coffee Lake-X) have been “open secrets” for months now. This announcement by Intel is just making it official.

    Unfortunately, they didn’t announce some of the — supposed — high end: the 24 core, 48 thread chips. It makes me wonder if those will actually materialize. Intel may have tried to produce them and found that the yields were just too low resulting in astronomical prices that no one would ever pay. We may never know.

    Another disappointment is that TB3 is not built into either the CPU (once rumored) or the X299 (also rumored, and even more strongly than for the CPU). This means that for full-blown, non-throttled USB Type-C with TB3/USB 3.1b/DLP/etc. support, you need 6 PCIe 3.1 lanes. (I’m strongly hoping the statement that they support PCIe 3.0 is either a typo or an oversight in what the spec really is.)

    But the real bottom line is the last line of the referenced article…
    “X299 motherboards and X-series processors should hit the market ‘in coming weeks.'”
    For most computer companies that means 2-6 weeks.
    For Apple that could mean 40+ weeks.
    Why?

  5. Great news, but because of Apple’s obsession with slimness and small form factors, Apple must design a custom motherboard.

    This takes about 6 months to a year apparently.

    If Apple wants to stay on the curve with the Mac Pro, then Jony needs to be sidestepped and the Mac Pro must use common off-the-shelf parts.

  6. What event in Apple’s history leads anyone to believe that they would announce specs for the next Mac Pro a year in advance?

    I don’t think so. Expect speed bumped notebooks and new iPads next week. Mac Minis and iMacs in September along with new iPhones.
