Intel processor breakthrough biggest chip advance in 40 years; coming before year end

“Intel and IBM each separately announced competing developments Friday described as the biggest advances in semiconductor chip making in over 40 years,” Therese Poletti reports for The San Jose Mercury News.

“Using new materials and a new manufacturing process, the two companies announced advances that would increase the speed and power of chips for another decade,” Poletti reports. “But Intel of Santa Clara is apparently much farther along, saying it will launch new chips for computers, laptops and servers before the end of the year based on the advances.”

Poletti reports, “One of the most important features is that the faster chips will also consume much less power, an epidemic problem for some companies in the industry. ‘It’s a real breakthrough … for both of them,’ said Rick Doherty, research director of The Envisioneering Group in Seaford, N.Y. ‘I wouldn’t be surprised if members of these teams were up for the Nobel prize.'”

“Intel said that the development will ensure that Moore’s Law will thrive well into the next decade. Moore’s Law is the name given to a prediction by Intel co-founder Gordon Moore, who said in the 1960s that the number of transistors on a chip would double every two years. That prediction has proved to be an industry benchmark that has paved the way for faster, cheaper and more reliable computers, cell phones and other consumer electronics,” Poletti reports.

Poletti reports, “Moore, 78, came out of retirement, where he spends some of his time in Hawaii, to issue a statement Friday about the Intel team’s innovation. He said Intel’s use of high-k and metal materials ‘marks the biggest change in transistor technology’ since Intel’s pioneering use of polysilicon in 1969.”

“Intel said its new family of chips, code-named Penryn, will have 410 million transistors, using the new materials combined with the 45-nanometer manufacturing process. This compares with about 280 million in current chips. Intel also said electrical leakage will be reduced by about 30 percent,” Poletti reports.
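A quick back-of-the-envelope check on the figures quoted above, using only the numbers from the article (the ten-year projection simply assumes Moore’s Law doubling continues from the Penryn count):

```python
# Figures quoted in the article
current_transistors = 280e6   # current-generation chips
penryn_transistors = 410e6    # Penryn family, per Intel

# Relative increase in transistor count
increase = penryn_transistors / current_transistors - 1
print(f"Transistor count increase: {increase:.0%}")  # ~46%

# Moore's Law projection: doubling every two years from Penryn's 410M
years = 10
projected = penryn_transistors * 2 ** (years / 2)
print(f"Projected count in {years} years: {projected / 1e9:.1f} billion")
```

That works out to roughly a 46 percent jump in one generation, and on the order of 13 billion transistors a decade out if the two-year doubling holds.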

“Yoshio Nishi, a professor of electrical engineering at Stanford University, said IBM’s advance is still in the research phase,” Poletti reports.

Full article here.

Related articles:
Intel announces breakthrough ‘Penryn’ processor family; slated for production in second half 2007 – January 27, 2007
Intel to announce processor breakthrough; new chips will run faster, consume less power – January 27, 2007
Intel pledges 80-core processor within five years – September 26, 2006
Apple chose well: Anandtech – Intel Core 2 Duo ‘the fastest desktop processor we’ve ever tested’ – July 14, 2006
Apple chose well: Intel poised to take massive lead across the board over AMD – June 07, 2006
Intel first to demonstrate working 45nm chips – January 26, 2006
Intel-based Macs running both Mac OS X and Windows will be good for Apple – June 10, 2005
Apple to use Intel microprocessors beginning in 2006, all Macs to be Intel-based by end of 2007 – June 06, 2005

22 Comments

  1. Hg– I think you’re right. To me, this is setting the stage for a quantum change in computing at some point. If there is excess power to burn, at some point, someone’s going to see how to use it up. It’ll be interesting to see how things go, as I doubt the biggest breakthrough will be faster video game frame rates.

  2. “Don’t even have applications yet to take advantage of current technology.
    I hope developers take a fresh look at optimizing their applications for more powerful processors.”

    It’s bloody obvious that you don’t work in audio or video! =:-O

  3. Oh my GOSH, Spark, you’re RIGHT. By God, we didn’t even THINK of that. Your idea, it’s… it’s EXTRAORDINARY. This idea alone will send our entire industry into NEW and PROFOUND directions! THANK you Spark! Thank YOU!

  4. “Don’t even have applications yet to take advantage of current technology.
    I hope developers take a fresh look at optimizing their applications for more powerful processors.”

    Wrong again.

    MSFT can be credited for this advancement. Vista can easily take up all this new processing power just from playing solitaire.

  5. > MSFT can be credited for this advancement. Vista can easily take up all this new processing power just from playing solitaire.

    You’re right. Mac OS X Leopard will probably run fine on an old G4. It’s Windows Vista that will see the most benefit (of increased processing power), to overcome its bloat, inefficiency, and “legacy” coding.

  6. Leodavinci:

    Maybe you should review your EC101

    The law of diminishing marginal returns is about how many chips INTEL could make and sell profitably at any given time. It’s about the idea, that given existing conditions, such as plant and technology, there is a finite number of chips they can produce and sell profitably. After they produce ‘n’ chip, if they produce any more, they would be selling them for a loss.

    I hardly see the relevance to the discussion

  7. BRAD T wrote: “Maybe you should review your EC101

    The law of diminishing marginal returns is about how many chips INTEL could make and sell profitably at any given time. It’s about the idea, that given existing conditions, such as plant and technology, there is a finite number of chips they can produce and sell profitably. After they produce ‘n’ chip, if they produce any more, they would be selling them for a loss.

    I hardly see the relevance to the discussion “

    Brad, the Law of Diminishing Returns can be applied to things other than economics. Case in point: you can stack Yagi antennas to achieve a gain of signal strength coming from a given direction. However, with each additional antenna there is some loss imposed on the system due to the uncontrollable losses in coupling them via coaxial cable. At some point, adding more will not give you any additional gain (typically 16 or so). Broaden your horizons a bit!

  8. “if nothing else, this will be great for battery life…you typically get 3 hrs of battery life with a 15″ MBP (normal use)…this would increase that to 4 hrs at the least.”

    How much of battery life depends on spinning the hard drive, lighting the monitor, powering speakers, etc.? Given that the 30-percent savings in electrical leakage doesn’t reduce the power consumption for those other functions, and not knowing how much of the power the CPU uses is lost to electrical leakage in the first place (in other words, what percentage of the TOTAL CPU power use does that 30-percent electrical leakage improvement figure apply to), I’m not sure how much this would improve battery life…
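The commenter's skepticism can be made concrete with a rough estimate. Every share figure below is a hypothetical assumption, since the article gives neither the CPU's fraction of total system power nor leakage's fraction of CPU power; only the 30 percent leakage reduction comes from the article.

```python
# Hypothetical battery-life estimate; shares are assumptions for illustration.
cpu_share = 0.35          # fraction of total system power drawn by the CPU (assumed)
leakage_share = 0.40      # fraction of CPU power lost to leakage (assumed)
leakage_reduction = 0.30  # the 30 percent figure quoted in the article

# Fraction of TOTAL system power actually saved
savings = cpu_share * leakage_share * leakage_reduction
# Runtime improvement from a 3-hour baseline
new_runtime = 3.0 / (1 - savings)

print(f"Total system power saved: {savings:.1%}")
print(f"Estimated runtime: {new_runtime:.2f} hours")
```

Under these assumptions the saving is only about 4 percent of total system power, stretching 3 hours to roughly 3.1 hours rather than the 4 hours suggested earlier in the thread.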

  9. Bryan,

    “I wonder if that would have been MDN’s headline if Apple hadn’t switched. Somehow I doubt it.”

    Or look at it differently: I wonder if Apple would have switched if that headline wasn’t coming in the near future?

    Steve did say that they switched because of what they saw Intel had in the pipeline. Yesterday’s Intel is not today’s Intel. Apple made a smart move for which it had started to prepare 5 years earlier (which is even smarter). So I believe respect is shown where respect is due.
