Intel’s Haswell to deliver massive graphics performance boost

“Intel is expected to announce its fourth-generation Core processors code-named Haswell for laptops and desktops in June, but the company is already releasing teasers that talk about their performance,” Agam Shah reports for IDG News Service.

“The Haswell laptop chip will deliver up to twice the graphics performance compared to third-generation Core processors code-named Ivy Bridge, according to a slide deck released by Intel on Wednesday,” Shah reports. “The company is claiming graphics performance improvement of close to three times for Haswell desktop chips. The graphics capabilities in Haswell will enable high-definition gaming and video playback, which will reduce the need for separate graphics cards.”

Shah reports, “Intel is expected to announce the Haswell chips at the Computex trade show in early June. PC makers are expected to show laptops, desktops and tablets running on Haswell chips at the trade show, which will be held in Taipei between June 4 and 8. Intel has said that Haswell-based laptops may be released around the middle of this year… Haswell will also deliver double the performance on the same power consumption compared to Ivy Bridge chips… The Haswell chip will enable laptops to play 4K video… ”

Read more in the full article here.

Related articles:
Apple to ramp up production of Haswell-based MacBooks in May, sources say – April 27, 2013
Is Intel’s Haswell a has-been? – March 12, 2013
Intel demos 4th-gen ‘Haswell’ processors bound for Apple’s MacBook lineups – January 7, 2013
Apple MacBook Pro, MacBook Air to get new ‘Haswell’ processors in June 2013, go all-Retina? – December 28, 2012
Intel just announced new chips that will let future MacBooks have practically infinite battery life – September 13, 2011

19 Comments

      1. but IMO 10% is not massive…
        and also, do I need graphics boosting?
        yeah – I want raw power!!!!!!

        As mentioned, 4–6 times the power of present-day processors is required to regain desktop and laptop interest. Then again, tablets may not be around in 5 years – LOL.

    1. Rest assured the high end of these new chips will ship in several speeds between 2.5 GHz and 3.8 GHz. MHz means almost nothing these days for comparisons. A modern 3 GHz chip can outperform a 3-year-old 3.5 GHz chip of the same family. You need actual benchmarks to compare raw power, and of course those aren’t available until after the chips are released.

  1. My reaction? Meh.

    This is Intel desperately clinging to the notion that desktops and laptops can be a growth industry again: trying to find something new to drive future sales, like 4K video, and hoping to move the graphics card industry back under its tent.

    Pretty much a desperate and hopeless effort. Lowering power consumption for the same output is really the only important factor in my opinion; the rest is a big meh.

    1. Improving battery life will make notebooks more useful and reliable; this should lead to more consumer interest and spending. Improving integrated graphics will not likely impress the “professional” gamer, but should be adequate for most users and eliminate the need for a dedicated graphics processor.

      As an aside, I suspect most “professional” gamers don’t make a living competing and live a vicarious existence through their digital characters.

    2. If Apple were to really bring manufacturing and assembly back from Asia, keeping the engineers in the US alongside US chip fabrication would be very important.

      One point I have in question: Intel is an American company, right? And Apple can really side with the grass roots of Intel compared to AMD.

  2. Intel & graphics chip performance? Don’t make me laugh. Intel has been promising that nonsense for 10 years now and I don’t expect that this will be anything but an empty promise.

    I’ll take NVidia’s discrete GPU any day over Intel’s integrated graphics chip. Look at the performance of the 13″ Retina MacBook Pro for a laugh at Intel’s graphics chip performance.

    What a load of codswallop.

  3. This sets up a buying season. There’s a LOT of pressure on Jony Ive right now to get this right. If he nails it, people will be switching in droves; if he screws it up, Microsoft buys more time to fix its mistakes.

    1. The pressure on Ive to do this is already gone.

      Everything Ive could do is long past, as these components and sub-components have already been ordered and are in production to enable late-May assembly.

    2. Microsoft has had a *long* time to fix its mistakes, but has not made all that much progress. Still, it is always better when Apple hits a home run the first time.

      Integrated graphics have a pretty sorry history. A few years ago, Apple actually chose to keep using a previous generation of Intel CPUs in its MBPs rather than using the latest and greatest Intel CPU release because the older chips allowed the use of a discrete GPU. More recently, however, integrated Intel graphics solutions have picked up, and are actually decent for regular users. Gamers and professional users, however, still need the power of discrete (and power-hungry) graphics cards.

      One negative aspect of integrated graphics is that vendors are departing the graphics card market. Historically, there have been relatively few good graphics cards for the Mac, and those have generally been obsolete relative to the latest Windows PC offerings. If the majority of people become satisfied with integrated graphics, then you may be hard-pressed to find a Mac graphics card in a few years.

  4. I wonder if the Haswell chips might be used in a retina capable iMac, which could also display 4k graphics? I’m waiting patiently for the next bump in iMac graphics resolution!

  5. The Haswell chip will enable laptops to play 4K video…

    That’s kewl! But…

    The graphics capabilities in Haswell will enable high-definition gaming and video playback, which will reduce the need for separate graphics cards.

    Note the careful wording there. “…Will REDUCE the need for….” Yeah.

    IOW: Don’t count on it replacing your high-powered graphics card. Haswell is only providing something of a speed boost. That is all. I highly suspect this will be used as an excuse for certain crap PC makers to put crap graphics cards in their boxes, with the excuse that Haswell will take up the slack. Don’t buy that. 😛

  6. Is Intel finally, finally, finally getting its graphics house in order? Looking back, weren’t graphics performance and power consumption the two nails in Intel’s coffin for mobile devices? Oh wait… they were WinTel at the time and could never fail.

    Even with new chips they still have the problem of Windoze 8’s adoption rate –
    http://www.zdnet.com/windows-8-microsofts-new-coke-moment-7000014779/

    Intel is dead in its current state and needs to pivot. From Seeking Alpha: http://seekingalpha.com/article/1370411-amd-hopes-arm-can-save-it?source=yahoo

    Here’s one of the nuggets from this article – “What all this tells me, as a stock buyer, is that Intel is increasingly going-it alone in the processor world, investing heavily in a very few designs and insisting that buyers take off-the-shelf components. But in the device and embedded markets, which are increasingly where the growth is, that won’t play. The chip gets designed around the device, rather than the device around the chip. Advantage, ARM.”

    I just don’t see the growth for Intel in its current form, and I don’t believe that graphics performance will really stimulate the market other than for a select segment of gamers. Mobile is where the money is flowing, and if they can’t figure out how to design processors for specific devices, then it’s not going to be too much fun being an Intel investor or employee.

  7. Intel graphics hardware performance is actually good; the big handicap is the drivers. nVidia makes the most solid, reliable, optimized video drivers in the industry, while Intel’s drive people crazy with graphical glitches and piss-poor performance.
