“Rather than focusing on whizzy graphics and a fancy new interface for the Mac’s new operating system, OS X Snow Leopard, Apple Chief Executive Steve Jobs has said his company will focus on improving the speed and stability of Apple’s computers,” Brian Caulfield reports for Forbes.
“That, however, doesn’t mean the new software won’t be a radical shift, with a trio of patents published this week making it increasingly clear that Apple will rely heavily on graphics processing units (GPUs), along with the central processing unit (CPU), to speed through everyday computing tasks,” Caulfield reports. “Apple’s patents… suggest that the Cupertino, Calif.-based computer maker sees the trend as a way to set itself apart from Microsoft.
“The three patents describe techniques for the tricky job of chopping up tasks and assigning them to different types of chips, such as graphics processors and traditional central processors,” Caulfield reports.
“The patents also point up another truth about Jobs’ approach to computing: He prefers to chart his own path instead of using the same ingredients as every other manufacturer. That maverick tendency was underscored earlier this year when Apple paid $278 million to purchase high-end chip design house PA Semi,” Caulfield reports.
Full article here.
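For the curious, the “chopping up tasks” idea the patents describe is, at heart, data-parallel decomposition: split one big job into independent chunks, hand each chunk to a different compute unit, then combine the partial results. Here is a purely illustrative Python sketch of that pattern, using threads as stand-ins for heterogeneous processors — nothing here reflects Apple’s actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    # Stand-in for a data-parallel "kernel" running on one compute unit.
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, n_units=4):
    # Chop the job into one chunk per "processor", fan the chunks out,
    # then combine the partial results.
    size = max(1, (len(data) + n_units - 1) // n_units)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_units) as pool:
        return sum(pool.map(sum_chunk, chunks))

print(parallel_sum_squares(list(range(1000))))  # → 332833500
```

The hard part the patents address — and what this toy skips entirely — is doing that partitioning automatically across chips with very different programming models.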
Thomas Claburn reports for InformationWeek, “One patent, Data Parallel Computing On Multiple Processors, describes a technique to allocate processing tasks across CPUs and GPUs.
“‘Traditionally, GPUs and CPUs are configured through separate programming environments not compatible with each other,’ the patent application states. ‘Most GPUs require dedicated programs, which are vendor specific. As a result, it is very difficult for an application to leverage both CPUs and GPUs for processing resources, such as GPUs with data parallel computing capabilities together with multicore CPUs,'” Claburn reports.
“One of the inventors listed on this patent application is Aaftab Munshi, a software architect at Apple who worked previously as a senior architect at graphics chipmaker ATI. He is also listed as one of the inventors on the two related patents, Parallel Runtime Execution On Multiple Processors and Application Interface On Multiple Processors,” Claburn reports. “Jeremy Sandmel, listed co-inventor on the three patents, also appears to have worked at ATI.”
Claburn reports, “These patents almost certainly relate to Open Computing Language (OpenCL), a new technology in Snow Leopard designed to make it easier to make use of GPU processing power.”
Full article here.
Nerds!
When apps can be written to utilize an OS-native data distribution system, we’ll finally see the real potential of all those processor cores. It will also bring back the power of the pro “desktop,” which has taken a back seat to laptops at Apple for several years.
So Apple patented Nvidia’s CUDA which has been available on Windows for years? Good work Apple!
“That maverick tendency was underscored earlier this year when Apple paid $278 million to purchase high-end chip design house PA Semi,”
McCain should dump Palin for Jobs. A real pair of Mavericks…
Ugh, did the author have to use the word “maverick?” Couldn’t he have used “nonconformist” or “unorthodox?” LOL
Sorry, since it’s the season, I had to somehow slip some politics in the post.
Seriously though, Apple is definitely an innovator, and I am interested in seeing how much faster they can make the OS. I would be very impressed if they can achieve that “snappy” feel of the Finder in OS 9. Even with gigs of RAM, OS X’s Finder just doesn’t pop to me. But don’t get me wrong, I love it – just being nit-picky.
For all the leaps in processor technology, software performance seems to have improved only a fraction as much. I bet Apple is working its magic refining and tuning the code to make the OS scream.
It’s not CUDA – CUDA will only work with Nvidia’s chips, and then only the newest ones. It won’t work with ATI graphics and won’t work on systems with integrated graphics, such as the Mac mini. OpenCL will work on anything – with or without a GPU, with either an ATI or an Nvidia GPU. That’s the big difference.
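That portability point is the crux: CUDA kernels run only on Nvidia hardware, while OpenCL’s promise is one task description that the runtime maps onto whatever devices are actually present, falling back to the CPU if need be. A toy Python sketch of that device-agnostic dispatch pattern (all device names here are hypothetical, not OpenCL’s real API):

```python
def run_task(task, available_devices):
    # Prefer the most capable device present, but degrade gracefully all
    # the way down to the CPU -- unlike a vendor-locked API that simply
    # fails on unsupported hardware.
    preference = ["discrete_gpu", "integrated_gpu", "cpu"]
    for kind in preference:
        if kind in available_devices:
            return kind, task()
    raise RuntimeError("no compute device available")

# Even a machine with only integrated graphics runs the very same task.
device, result = run_task(lambda: sum(range(10)), {"integrated_gpu", "cpu"})
print(device, result)  # → integrated_gpu 45
```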
I’ll agree that it’s still a bit of catching up. Windows has had what – regular and Hybrid SLI, GPU-accelerated physics and AI, CUDA, and all the other similar technologies out there – while we were floored when we finally got multithreaded OpenGL and WoW ran faster. There are always pros and cons to being the less ubiquitous operating system, and this is one of the potential cons – especially when Jobs has never appreciated the game industry and has basically ignored the enterprise market.
@JadisOne
Perhaps he should have used the word “Change” *grin*
“Rather than focusing on whizzy graphics and a fancy new interface for the Mac’s new operating system, OS X Snow Leopard, Apple Chief Executive Steve Jobs has said his company will focus on improving the speed and stability of Apple’s computers,”
Hmm, a not-so-subtle jab at MS?
The folks who are hurriedly re-skinning Vista into Windows 7 should love this comparison.
@Done Before,
Yeah, and Mafia$ucks has really done a lot with that technology, haven’t they?
Fista just screams – in Solitaire, that is.
“Yeah, and Mafia$ucks has really done a lot with that technology, haven’t they?”
Umm, what does it have to do with Microsoft? It’s an Nvidia technology. ATI has their own too. But you can’t patent something everyone else is doing.
Windows users: If you’re not afraid, you should be.
Apple users: Same as above but for different and much *cool smile* reasons.
“Umm, what does it have to do with Microsoft? It’s an Nvidia technology. ATI has their own too. But you can’t patent something everyone else is doing.”
@Done Before
Then you don’t really understand what OpenCL is…
as the first poster said: Nerds! /end thread/
Obviously ‘Done Before’ considers himself (or herself) more informed about this technology than those who actually created it. How stupid of them to try to patent something that already exists – and which, as DB admits, exists at a company they themselves worked for; perhaps they were on a toilet break when it was invented there. Such poor deluded fools – as must be all the companies and their own experts buying into OpenCL too, none of whom can have a brain the size of the planet DB lives on.
ZuneTang… Where are you?…
Didn’t the Amiga do this? It had a strange set of processors that handled different tasks: “Fat Agnus,” “Gary,” and the Motorola 68000 each processed different jobs. I used to have one and thought it was pretty cool to do this sort of stuff. I am delighted that Apple have found a way to do this with Snow Leopard.
“How stupid of them to try to patent something that already exists”
OpenCL is just another GPGPU computing initiative. For ATI GPUs at least, it will be implemented as a layer on top of ATI’s existing GPGPU technology. Perhaps they’re just going to layer it on top of Nvidia’s CUDA and Intel’s Larrabee too. That part is as yet unclear.
But go ahead, claim that Apple just invented GPGPU computing. I was particularly happy when they helped Intel invent the x86 processor a couple of years back because that proved to be a big benefit to the industry. They should get the credit they deserve for this advance too.
Interesting, right-angle-turn stuff. This is good, as it shows thought is continuing to happen at Apple. Poor microsh*t will take years to copy this. Guess that’s what happens when you dilute your engineering resources, constantly switching and diverting them, trying to win wars against competitors who are much better in their respective niches (Apple, Google, etc.).
@ Renee Cousins
Nice feed, well explained.
@ British Mac Head
Yes, the Amiga did this from the beginning. Welcome to 1984, Apple. *wink*
Would that be big brass mavericks?
We always hear about how the next chip will do this or that, how performance will make some quantum leap, or how this new technology will blow away the competition. So why, then, is Photoshop or Aperture, for example, performing not so much better now than before? And why is my MacBook Pro with 10.5.5 the worst machine I have had (restarts all the time, beach balls, etc.)? I used to run the old OS X on my G4 PowerBook for months without restarting, and if it was slower, well, so what? Having to restart all the time is the biggest waste of time and, to boot (sic), pisses me off heaps.
I would think about a new MacBook Pro, but rev. A? And those shiny screens and the ugly black frame and keyboard… hmm, I am not so in love with Apple as I used to be…
Magic word: “wife,” as in I may be sleeping on the couch for a while…
Oh and PS
Bloody batteries!!! I got a replacement over a year ago because the other one expanded to double its size, and now this one is doing the same!!!
I’m going back to pencil and paper.
No, Amiga didn’t do this. Cool as it was, with all its individual chips doing their individual tasks, this isn’t what it was doing. And nVidia wasn’t doing this, either. They had found a way to make their GPU take on some non-GPU tasks while not otherwise engaged – if I understand the Wiki correctly. And then, only if the program requests the help and is coded in a particular version of C?
This is not based on a particular hardware dongle (the GPU) but is part of the OS. It doesn’t care what you are using for CPU or GPU, only what each is doing vs what each is CAPABLE of doing. That said, it does sound like it is an extension of the earlier technologies. A significant leap, perhaps, but merely an extension rather than totally new territory.
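That “what each is doing vs what each is CAPABLE of doing” distinction can be sketched as a tiny scheduler: filter devices by capability, then hand the work to the least-loaded eligible one. This is purely illustrative – the patents don’t disclose an actual algorithm, and every name below is made up.

```python
def pick_device(devices, required_cap):
    # First filter by what each unit CAN do, then choose by what each
    # is currently DOING: among capable devices, take the least loaded.
    eligible = [d for d in devices if required_cap in d["caps"]]
    if not eligible:
        raise LookupError("no device supports " + required_cap)
    return min(eligible, key=lambda d: d["load"])

devices = [
    {"name": "cpu0", "caps": {"scalar", "data_parallel"}, "load": 0.9},
    {"name": "gpu0", "caps": {"data_parallel"}, "load": 0.2},
]
print(pick_device(devices, "data_parallel")["name"])  # → gpu0
```

A data-parallel task lands on the idle GPU rather than the busy CPU, while a scalar task still goes to the CPU because it is the only unit capable of it.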