Moore’s law is dead

“Moore’s law has died at the age of 51 after an extended illness,” Peter Bright writes for Ars Technica. “In 1965, Intel co-founder Gordon Moore made an observation that the number of components in integrated circuits was doubling every 12 months or so.”

“With a little more data and some simplification, this observation became ‘Moore’s law’: the number of transistors per chip would double every 12 months,” Bright writes. “Gordon Moore’s observation was not driven by any particular scientific or engineering necessity. It was a reflection on just how things happened to turn out. The silicon chip industry took note and started using it not merely as a descriptive, predictive observation, but as a prescriptive, positive law: a target that the entire industry should hit.”

“Problems with the original formulation of Moore’s law became apparent at an early date. In 1975, with more empirical data available, Gordon Moore himself updated the law to have a doubling time of 24 months rather than the initial 12. Still, for three decades, simple geometric scaling — just making everything on a chip smaller — enabled steady shrinks and conformed with Moore’s prediction.”
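
For a sense of what those two doubling periods imply, here is a quick back-of-the-envelope sketch in Python. The 2,300-transistor starting count is the Intel 4004 (1971); using it as a common baseline for both curves is an illustrative simplification, not a claim about actual chip history.

    # Moore's law as arithmetic: N(t) = N0 * 2**(t / T),
    # where T is the doubling period in years.
    def transistors(n0, years, doubling_years):
        return n0 * 2 ** (years / doubling_years)

    n0 = 2300  # Intel 4004 (1971), an illustrative baseline
    for years in (10, 20, 30):
        fast = transistors(n0, years, 1.0)  # original 1965 form: 12 months
        slow = transistors(n0, years, 2.0)  # revised 1975 form: 24 months
        print(f"after {years} years: {fast:.2e} vs {slow:.2e}")

After 30 years the two formulations differ by a factor of 2^15, roughly 32,000x, which is why Moore’s 1975 revision was no small correction.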

“In the 2000s, it was clear that this geometric scaling was at an end, but various technical measures were devised to keep pace with the Moore’s law curves. At 90nm, strained silicon was introduced; at 45nm, new materials to increase the capacitance of each transistor layered on the silicon were introduced. At 22nm, tri-gate transistors maintained the scaling. But even these new techniques were up against a wall,” Bright writes. “These difficulties mean that the Moore’s law-driven roadmap is now at an end. ITRS decided in 2014 that its next roadmap would no longer be beholden to Moore’s ‘law,’ and Nature writes that the next ITRS roadmap, published next month, will instead take a different approach.”
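
The “simple geometric scaling” Bright refers to is visible in the node names themselves. A minimal sketch, assuming the textbook shrink of roughly 1/sqrt(2) (~0.7x) in linear dimensions per generation, which halves the area of each transistor and therefore doubles density; real process nodes only approximately followed this:

    import math

    # Each full node shrink scales linear dimensions by ~1/sqrt(2),
    # halving transistor area and doubling transistor density.
    scale = 1 / math.sqrt(2)
    node = 90.0  # nm
    for _ in range(4):
        node *= scale
        print(f"{node:.0f} nm")  # prints ~64, 45, 32, 22

The computed values land close to the named nodes in Bright’s list (90nm, 45nm, 22nm), which is what “just making everything on a chip smaller” meant in practice.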

Read more in the full article – recommended – here.

MacDailyNews Take: R.I.P. Moore’s law. 51 years was an amazing run!

29 Comments

    1. I remember saying that back when I was used to a 16 MHz Mac LC and upgraded to an unbelievably fast Mac IIfx running at a blazing 40 MHz. It seemed beyond comprehension that anybody could ever need a faster computer than that.

      Recalling that Mac also reminded me of the hassles of external SCSI drives, and in particular the need to use a special black-coloured terminator with the IIfx instead of the usual terminators.

        1. You don’t still believe Gates made that comment, do you? It was debunked years ago. Even IF he said it, it would have been more like “…with today’s computers, no one needs more RAM.” I agree with you that we don’t have to follow Moore’s law and double the components; there are other ways to get more speed, IF someone really needs it.

        2. It hasn’t been proved or disproved, and naturally Gates would deny such an idiotic statement even if he HAD made it. But then, on the vision thing, Gates was never the sharpest pencil in the box, which is what makes it believable (despite the legend in his own mind).

          It would be nice if those who can never get enough speed could get more than Moore. For everyday tasks, though, we seem to be at a nice point, except perhaps for games.

          http://www.computerworld.com/article/2534312/operating-systems/the–640k–quote-won-t-go-away—-but-did-gates-really-say-it-.html

    2. This is also one of the reasons computers don’t sell as quickly as they used to. It used to be that a computer was considered obsolete after about 3 years. Now people keep them for 6 to 7 years, because computers are much faster than most people generally require.

      1. Yeah, I look at it as good (money-saving) news, actually. I still have an operating 11-year-old 2005 Power Mac G5 and an 8-9-year-old 2007 Intel Mac Pro. I need to replace my main machine from 2007 stat, but I’m waiting to see if a new Mac Pro is coming out soon (so hurry, Apple!). Otherwise I’ll have to consider (gasp!) a PC workstation for my demanding needs.

        All my Macs have lasted a good long time. Unless they come up with some gigantic leap in processor technology, we can all breathe a sigh of investment relief.

        1. Just had another lovely episode with Apple last night. While I was working remotely on an old MBP running Mavericks, a coworker sent me a Numbers file for my review.

          Click on the file – won’t open. I see Numbers 3.2.2 is installed, the latest version that Mavericks will run. Apple says I must update to open the file. No can do without migrating to EL CRAPitain.

          Apple wants to force users into updating the OS and Numbers, both of which would impact my time and workflow. To enforce that ever-ratcheting software ladder, Apple intentionally makes the latest Numbers file format unreadable by any prior version of Numbers. There is no way to preview, open, or convert the file. Does this sound like a user-friendly company to you?

          This is just one more example of how Apple is screwing its users. I don’t care if the new software is free, and I don’t give a shit if Apple’s newest file format is better in every way. Apple is now jumping in with both feet on the forced-obsolescence model of software. They fail to realize that Numbers is, and at this rate probably always will be, a poor imitation of a proper spreadsheet application for pros. With such horrid compatibility, there is absolutely no reason to recommend that anyone use Numbers. What a shame Apple has completely lost touch with obvious user needs.

          One last thing: Apple, your El Capitan interface is fugly. I’m not updating the laptop. I installed Office 2011 for the Mac instead. Not that it can read your proprietary Numbers file format, but it is better in every way, and we’ll never use Numbers again.

      2. Partially true. Consumers don’t have fast-escalating power needs, but pros still update their machines every 3 years or less.

        It’s also clear that some manufacturers (most of Apple’s Mac range) have simply done nothing substantial to improve their products in the last 3-5 years. A 2010 Mac Pro with upgradeable GPUs can perform video tasks faster and better than a 2016 trashcan MP. A used/refurb MacBook Pro from 3 years ago is still head and shoulders above all the Airs and the neutered 2015 MacBook netbook. So to get an incremental improvement in speed, Apple’s newest MBPs cost more and offer less user flexibility. There are exceptions, of course: the hottest 27″ iMac is more desktop than any consumer will outgrow in a long time. But again, you can’t improve it either; there is no convenient way to upgrade it when RAM, GPU, or storage needs evolve. You can’t even easily add a matching monitor to any iMac, because Apple doesn’t make one.

        More sealed boxes, less user upgradeability, worse GUIs, buggier OSes from both Apple and MS. The reviews are all very clear on that. It’s all very sad.

  1. The irony of the “faster chip” phase of PC marketing fully hit me when I put an SSD in my MacBook Pro 5 years ago and thought, “Wow, now that’s fast” (i.e., the bottleneck in PC speeds was not the chips, but the hard drives).

  2. Reductions in circuit and transistor size, and increases in speed, may be limited by physics. But this may be only a temporary slowdown.

    IBM has shown that you can stack traditional transistors on a single chip. And quantum tunneling transistors may be close at hand, which could make order-of-magnitude increases in transistors per chip possible for decades to come.

  3. We knew Moore’s law was effectively dead when Intel started slipping its CPU release dates a few years ago.

    No doubt, Moore’s Law will continue to live on in the mythology as a goal, an annoyance, and baloney that was never more than a temporal observation.

    Meanwhile: Nanotechnology and quantum computing research continues apace, both of which could potentially leave ‘Moore’s Law’ in the dust.

  4. Moore’s Law is far from over. Did anyone really think it would be an exact overlay template? It’s going to have a hiccup, but to think that we’re simply done improving technology exponentially is ridiculously ignorant.

      1. That’s Wired magazine, though. They’ve never made an accurate prediction. They’ve been delivering a brand of gimmicky, optimistic, “gee-whiz” future-technology predictions for years, and have never once come close to reality. The fact that Wired thinks carbon nanotube transistors will be a thing makes me doubt that it’s anything more than hot air.

  5. It’s been a fun ride, but all good things (that push limits of physical possibility) must come to an end.

    I don’t think it will be a big deal for consumers, though. Like many others, I stopped craving a faster computer around the same time I got an SSD. I used to buy a new Mac every 2 years, thinking the purchase was totally justified by the productivity that new hardware unlocked. But after swapping in an SSD, I’ve been rocking the same Mac for 6 years now, and I see no reason to go shopping for a new one anytime soon. I don’t really “wait” for my computer to do anything now. The bottleneck in the system is always me or the network connection.

  6. I’ll be able to tell you all how fast my 27 inch iMac (with the Skylake chip) is once they replace the RAM slots and logic board that fried an 8GB RAM module. GRRR! And yes it’s all under warranty but I’m not all that happy considering my 2007 iMac is still chugging along and the new one just (doesn’t) work.

  7. Didn’t it die a number of years ago, with roughly the Pentium 4? That’s when the whole Core Duo multi-core effort began, because they couldn’t just keep punching up the MHz/GHz to ever higher numbers due to heat restrictions with the manufacturing process.
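
    That heat wall follows from the standard CMOS dynamic-power relation, P ≈ C·V²·f. Here is a rough sketch of the trade-off in Python; the capacitance and voltage figures below are made-up illustrative values, not measurements of any real chip:

        def dynamic_power(c_farads, volts, hertz):
            # Standard CMOS dynamic power approximation: P = C * V^2 * f
            return c_farads * volts ** 2 * hertz

        C = 1e-9  # illustrative switched capacitance, not a real chip's value
        # Raising the clock historically also required raising the voltage,
        # so power grows much faster than linearly with frequency:
        one_fast_core = dynamic_power(C, volts=1.4, hertz=4e9)       # ~7.8 W
        two_slow_cores = 2 * dynamic_power(C, volts=1.1, hertz=2e9)  # ~4.8 W
        print(one_fast_core, two_slow_cores)

    Under these assumed numbers, two slower cores can match one fast core’s aggregate throughput on parallel work while drawing noticeably less power, which is the logic behind the industry’s pivot to multi-core.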
