“Since Nvidia has clearly demonstrated the superiority of its Pascal architecture in performance and energy efficiency to AMD’s Polaris, I have wondered if Apple would switch to Nvidia GPUs for its Macs,” Mark Hibben writes for Seeking Alpha. “It now appears that such a transition is underway, although it may not be until sometime in 2017 that consumers will be able to buy a Mac with Nvidia inside.”
“Bloomberg has reported that Nvidia posted several jobs for software engineers to develop specifically for the Apple Mac. Apple’s proprietary Metal graphics APIs, as well as OS X (now referred to by Apple simply as macOS) are specifically mentioned in the job postings,” Hibben writes. “There can be no doubt that Nvidia is hiring software engineers to develop graphics drivers for the Mac.”
“Most likely, the new postings indicate that the effort is fairly new and any products featuring Nvidia graphics processors would probably not reach consumers before next year,” Hibben writes. “It’s been widely assumed that AMD would be in expected refreshes of desktop and notebook Macs. A Mac Pro refresh has been long overdue, as has a refresh of the MacBook Pros. Many have wondered why it’s taking so long for Apple to update the hardware in these still very important product lines and have pointed to this delay as a sign of stagnation. I think it is perplexing that a company with Apple’s resources would not be able to upgrade the Mac product lines with the latest CPU and GPU offerings as soon as they become available. I certainly don’t buy the Apple stagnation hypothesis. One possible explanation is that Apple had been expecting Polaris to be available for its iMac and MacBook Pros… Chances are that when Polaris notebook GPUs become available, that’s when we’ll see the new Macs.”
Much more in the full article here.
MacDailyNews Take: An Apple-NVIDIA partnership has a bright, and mutually-beneficial, future indeed!
Job listing suggests NVIDIA is working with Apple on future Macs – September 23, 2016
Apple seems to be heavily into green computers so I wouldn’t expect Apple to be coming out with computers sporting the latest powerful graphics processors. I’ve seen the size of those graphics cards and how much heat they give off. I can understand they’re definitely not something Apple wants in its computers. Apple seems to have chosen its path for its consumer computers and it’s a low-power path. I think that’s fine for most home computer users. The Mac Pro is (or should be) the only exception.
A monitor with a built-in graphics card seems like an interesting idea, although I’m not sure how practical that is. Apple has got some weird ways of doing things. It’s OK. If I don’t like the way they’re building computers, I can always buy some other company’s products. However, I’m sticking with my Macs.
AMD video cards don’t interest me in a new Mac Pro. Nvidia does, with its CUDA acceleration capability. Having talked with the guy who actually designs the darn things at the last Siggraph, I’d say Nvidia is the video card company of the moment. In any case, Apple needs to give all pros options, not just one video-card-type-fits-all.
Yes, CUDA is a MUST and it HAS TO be working as standard on MBP!
CUDA is Nvidia hardware-specific. OpenCL is not. Nobody is going to write code specifically for CUDA on any scale.
You mean like Adobe, most of whose pro applications use CUDA? Not a very good technical conclusion on your part now, is it? Back to tech school for you.
My post said “on any scale,” meaning a large percentage of the applications that accelerate via GPU compute. For that matter, your comment is misleading, as Adobe supports OpenCL as well. Example:
Your argument is pointless if the need is accelerating Adobe apps. Hopefully all can be accommodated.
Your argument is pointless as we’re talking about general technologies, not the specifics of one particular developer. Developers don’t want to write and optimize on a GPU-vendor-specific basis if they don’t have to. CUDA predates OpenCL and had early initial support. I’m not knocking CUDA as a technology. Rather, I’m knocking CUDA as the future path for GPU compute. Sure, in an ideal world, accommodate both. However, if resources are limited (and they usually are), then writing for OpenCL will go a lot further, as it will be supported across hardware vendors. That’s reality… don’t be so defensive.
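For readers unfamiliar with what “writing for CUDA” means in practice, here is a minimal sketch (not from the article or the commenters) of a vector-add written against Nvidia’s CUDA runtime API. The point of the thread’s argument is that this code only compiles and runs on Nvidia hardware; expressing the same computation portably would mean rewriting the kernel and all of the host-side setup against the OpenCL API instead.

```cuda
#include <cassert>
#include <cuda_runtime.h>

// CUDA kernel: each thread adds one element pair. The __global__
// qualifier and the blockIdx/threadIdx built-ins are CUDA-specific.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 256;
    float ha[n], hb[n], hc[n];
    for (int i = 0; i < n; ++i) { ha[i] = float(i); hb[i] = 2.0f * i; }

    // Host-side setup also uses CUDA-specific calls (cudaMalloc,
    // cudaMemcpy, the <<<grid, block>>> launch syntax).
    float *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(float));
    cudaMalloc(&db, n * sizeof(float));
    cudaMalloc(&dc, n * sizeof(float));
    cudaMemcpy(da, ha, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, n * sizeof(float), cudaMemcpyHostToDevice);

    vecAdd<<<(n + 127) / 128, 128>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, n * sizeof(float), cudaMemcpyDeviceToHost);

    // Verify on the host: c[i] should equal i + 2i = 3i.
    for (int i = 0; i < n; ++i) assert(hc[i] == 3.0f * i);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```

An OpenCL version of the same program would compile the kernel from source at runtime through a vendor-neutral API, which is what makes it portable across AMD, Nvidia, and Intel GPUs — at the cost of more host-side boilerplate.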
Your argument is pointless too, since Adobe hasn’t yet made those changes. Please give them the benefit of your expertise and show them the way. I don’t give a darn which technology accelerates graphics apps; pick one we can all agree on and get ‘r done. I hate this “one card is good for one thing and another card is good for something else, so you’d better have the right card” business.
I think you are not well informed. Sometimes open/general technology wins, but often it does not.
Nvidia’s CUDA has huge widespread support throughout engineering, scientific, financial, and other heavy-compute applications. Nvidia pioneered GPUs as general-purpose parallel computing devices, creates fantastic CUDA libraries for specific types of calculations, and lots of vendors have standardized on CUDA.
I would know; I develop only for CUDA because all the products in the same ecosystem use only CUDA.
It’s been a rough time not having MacBook Pros, and especially a Mac Pro, that support CUDA.