How the end of Moore’s Law will usher in a new era in computing

In 1965, Gordon Moore, a co-founder of Intel, predicted that the number of components that could fit on a microchip would double every year for the next decade.

Tom Hoggins for The Telegraph:

Moore revised his prediction in 1975 to a doubling of components every two years – a prophecy that remained true for another four decades.

The doubling of transistors – semiconductor devices that switch electronic signals and electrical power – meant that technology would become exponentially more powerful, smaller and cheaper. That the smartphone in your pocket is now many thousands of times more powerful than the building-sized mainframe computers of the 1970s is a testament to this.
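To get a feel for how quickly a two-year doubling cadence compounds, note that four decades of doublings is 2^20, roughly a million-fold increase. The minimal Python sketch below makes that concrete; the 1971 baseline (the Intel 4004's roughly 2,300 transistors) is a widely cited figure used here purely for scale, and the strict doubling schedule is an idealization, not data from the article.

    # Idealized Moore's Law projection: one doubling every two years.
    # Baseline: Intel 4004 (1971), ~2,300 transistors, used for scale only.
    START_YEAR = 1971
    START_TRANSISTORS = 2_300
    DOUBLING_PERIOD_YEARS = 2

    def projected_transistors(year: int) -> int:
        """Transistor count under a strict two-year doubling schedule."""
        doublings = (year - START_YEAR) // DOUBLING_PERIOD_YEARS
        return START_TRANSISTORS * 2 ** doublings

    for year in (1971, 1991, 2011):
        print(year, f"{projected_transistors(year):,}")
    # 1971: 2,300   1991: 2,355,200   2011: 2,411,724,800 (about 1,000,000x)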

But over the past few years, Moore’s Law has begun to reach its natural end as we squeeze every last nanometer of advancement out of silicon chips. By the mid-2020s, the Law is expected to have plateaued completely as production costs rise and transistors reach their physical limits.

With Moore’s Law effectively becoming economically unsound, the technology industry will need to become more creative without an established blueprint to follow. As a result, a new era in computing could emerge.

MacDailyNews Take: Sounds not like an end, but an opportunity to us. After all, necessity is the mother of invention.

Comments

    1. Yes, but while they are very good at certain highly specialized problems, they are lousy at your average everyday computer operations. It doesn’t take a quantum algorithm to place a transaction in a database or update a spreadsheet.

    2. Where have I heard this before? Oh yes…
      Vector computing is going to speed things up so much that we’ll be hard-pressed to find problems it won’t compute quickly.
      Very long instruction word computing will revolutionize computational physics and image processing so much that programming will become trivial and the VLIW compiler will do 99% of the work for you.
      GPUs are going to revolutionize computational physics to such a degree that even your desktop will be faster than the fastest supercomputer of just a couple of years ago.
      Multi-core and multi-threaded computing will revolutionize systems to such an extent that a single server will be able to do the work of several hundred servers from just a few years before.
      The list goes on and on and on.
      And the latest one (besides the quantum computing one):
      5G is going to revolutionize how we do wireless communications. Everyone will think of it as “before 5G and after 5G”.

      If they get quantum computers stable for a long enough period of time with enough qubits, they will be fantastic for a specific set of problems, e.g., cryptanalysis. But I don’t see a quantum computer showing up even in the corporate IT center for at least another decade, and likely not for some time longer than that; they won’t show up in your desktop or laptop for at least a decade after that. Further, you’re not going to see quantum computing in your cell phone or tablet until they come up with a way to make qubits stable at 35 degrees C or higher (not just the advent of room-temperature superconductors).

      Many of us have been hearing that practical nuclear fusion power sources are just 25 years away. Well, those practical nuclear fusion power sources have been just 25 years away for the last 60+ years. They’re still 25 years away.
