An Apple-NVIDIA partnership has a bright future

“Since Nvidia has clearly demonstrated the superiority of its Pascal architecture in performance and energy efficiency to AMD’s Polaris, I have wondered if Apple would switch to Nvidia GPUs for its Macs,” Mark Hibben writes for Seeking Alpha. “It now appears that such a transition is underway, although it may not be until sometime in 2017 that consumers will be able to buy a Mac with Nvidia inside.”

“Bloomberg has reported that Nvidia posted several jobs for software engineers to develop specifically for the Apple Mac. Apple’s proprietary Metal graphics APIs, as well as OS X (now referred to by Apple simply as macOS), are specifically mentioned in the job postings,” Hibben writes. “There can be no doubt that Nvidia is hiring software engineers to develop graphics drivers for the Mac.”

“Most likely, the new postings indicate that the effort is fairly new and any products featuring Nvidia graphics processors would probably not reach consumers before next year,” Hibben writes. “It’s been widely assumed that AMD GPUs would be in the expected refreshes of desktop and notebook Macs. A Mac Pro refresh has been long overdue, as has a refresh of the MacBook Pros. Many have wondered why it’s taking so long for Apple to update the hardware in these still very important product lines and have pointed to this delay as a sign of stagnation. I think it is perplexing that a company with Apple’s resources would not be able to upgrade the Mac product lines with the latest CPU and GPU offerings as soon as they become available. I certainly don’t buy the Apple stagnation hypothesis. One possible explanation is that Apple had been expecting Polaris to be available for its iMac and MacBook Pros… Chances are that when Polaris notebook GPUs become available, that’s when we’ll see the new Macs.”

Much more in the full article here.

MacDailyNews Take: An Apple-NVIDIA partnership has a bright, and mutually beneficial, future indeed!

SEE ALSO:
Job listing suggests NVIDIA is working with Apple on future Macs – September 23, 2016

10 Comments

  1. Apple seems to be heavily into green computers so I wouldn’t expect Apple to be coming out with computers sporting the latest powerful graphics processors. I’ve seen the size of those graphics cards and how much heat they give off. I can understand they’re definitely not something Apple wants in its computers. Apple seems to have chosen its path for its consumer computers and it’s a low-power path. I think that’s fine for most home computer users. The Mac Pro is (or should be) the only exception.

    A monitor with a built-in graphics card seems like an interesting idea, although I’m not sure how practical that is. Apple has got some weird ways of doing things. It’s OK. If I don’t like the way they’re building computers, I can always buy some other company’s products. However, I’m sticking with my Macs.

  2. AMD video cards don’t interest me in a new Mac Pro. Nvidia does, with its CUDA acceleration capability. Having talked with the guy who actually designs the darn things at the last SIGGRAPH, I’d say Nvidia is the video card company of the moment. In any case, Apple needs to give all pros options, not just one video card type that fits all.

        1. Your argument is pointless, as we’re talking about general technologies, not the specifics of one particular developer. Developers don’t want to write and optimize on a GPU-vendor-specific basis if they don’t have to. CUDA predates OpenCL and had early support as a result. I’m not knocking CUDA as a technology. Rather, I’m knocking CUDA as the future path for GPU compute. Sure, in an ideal world, accommodate both. However, if resources are limited (and they usually are), then writing for OpenCL will go a lot further, as it will be supported across hardware vendors. That’s reality… don’t be so defensive.

        2. Your argument is pointless too, since Adobe hasn’t yet made those changes. Please give them the benefit of your expertise and show them the way. I don’t give a darn what technology will accelerate graphics apps; pick one we can all agree on and get ‘r done. I hate this “one card is good for one thing and another card is good for something else, so you’d better have the right card” business.

        3. I think you are not well informed. Sometimes open/general technology wins, but often it does not.

          Nvidia’s CUDA has huge, widespread support throughout engineering, scientific, financial, and other heavy-compute applications. Nvidia pioneered GPUs as general-purpose parallel computing devices, it creates fantastic CUDA libraries for specific types of calculations, and lots of vendors have standardized on CUDA.

          I would know: I develop only for CUDA, because all the products in the same ecosystem use only CUDA.

          It’s been a rough time not having MacBook Pros, and especially a Mac Pro, that support CUDA.
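
          To make the lock-in concrete, here is a minimal, illustrative sketch of a CUDA SAXPY kernel (not taken from Adobe or anyone else mentioned above). Everything in it, from the __global__ qualifier and the <<<blocks, threads>>> launch syntax to the cudaMallocManaged and cudaDeviceSynchronize runtime calls, compiles only with Nvidia’s nvcc and runs only on Nvidia GPUs, which is exactly why “just port it to OpenCL” is more work than it sounds:

            #include <cstdio>
            #include <cuda_runtime.h>

            // Illustrative only: a trivial SAXPY (y = a*x + y) kernel.
            // One GPU thread handles one array element.
            __global__ void saxpy(int n, float a, const float *x, float *y) {
                int i = blockIdx.x * blockDim.x + threadIdx.x;
                if (i < n) y[i] = a * x[i] + y[i];
            }

            int main() {
                const int n = 1 << 20;
                float *x, *y;
                // Unified (managed) memory keeps the host code short.
                cudaMallocManaged(&x, n * sizeof(float));
                cudaMallocManaged(&y, n * sizeof(float));
                for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

                // Launch enough 256-thread blocks to cover all n elements.
                saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
                cudaDeviceSynchronize();

                printf("y[0] = %.1f\n", y[0]);  // expect 5.0
                cudaFree(x);
                cudaFree(y);
                return 0;
            }

          The OpenCL version of the same thing is vendor-neutral, but it needs noticeably more host-side boilerplate (platform, device, context, and queue setup) and has nothing like the breadth of Nvidia’s CUDA libraries behind it; that trade-off is what the earlier replies are really arguing about.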
