Facebook has no plans to bring Oculus Rift VR to the Mac anytime soon

“While so many virtual reality hardware companies have been tasked only with selling their own product, Oculus has had the intense challenge of building the entire industry’s ecosystem for the last several years,” Lucas Matney reports for TechCrunch. “That growth has come with some hiccups by way of lawsuits, product launch delays and executive shakeups, but Oculus is still very much alive and well under its parent company, Facebook, which has placed the company at the forefront of its 10-year project to take VR mainstream.”

“I had a chance to sit down with Oculus’ co-founder and head of Rift Nate Mitchell to hear about the company’s latest price cuts, some of the internal reorganizations within the company and what the likelihood was of getting Rift support on the Mac,” Matney reports. “During our interview, I noticed Mitchell’s work laptop, a last-gen MacBook Pro, sitting in the corner of the room, at which point I asked about the Rift’s lack of support for Mac.”

“Earlier Oculus development kits initially supported some of Apple’s more full-featured computers, but the more stringent spec requirements of the Rift ultimately left Mac users out of the loop for the product’s consumer launch,” Matney reports. “Mitchell tells me that Mac support is still something the company is looking at, but it may not be coming for a bit. ‘It’s something near and dear to my heart, we’re just not there quite yet,’ he says. ‘We do want to do OS X (macOS) support for Rift, it’s not something that’s currently on the roadmap for — I can even say — the next six months,’ Mitchell tells me.”

Read more in the full article here.

MacDailyNews Take: Whatever.

SEE ALSO:
Beware: Oculus Rift VR headset leaves embarrassing red marks and can cause wearers to feel ‘seasick’ – April 1, 2016
Palmer Luckey: Oculus Rift will support Mac if Apple ‘ever releases a good computer’ – March 3, 2016
Apple is working on VR user interfaces and gaming; looking for Oculus and Leap experts – February 10, 2015
Apple patent application reveals personal display headset invention – May 8, 2014
Apple continues to tweak Apple TV video headset accessory – April 10, 2014
Oculus co-founder says he didn’t want a deal with Apple, chose Facebook instead – March 27, 2014
Apple granted knockout patent for head-mounted personal display – December 10, 2013
iGlasses: Apple granted patent for head-mounted augmented reality displays – July 5, 2012

19 Comments

  1. MDN is right on. The parsing he uses in that answer is telling: when he says “more stringent hardware requirements,” what he actually means is that their system is inefficient and can’t run well on anything but the most powerful FP32 hardware their software can leverage. In other words, it’s a bloated mess.

      1. With graphics, that’s actually true. Graphics APIs such as Apple’s Metal, Vulkan, and DX12 have become increasingly efficient over the years, but a lot of software built on older APIs is inefficient. Even higher-end Intel integrated graphics chips are technically capable of VR if a modern OpenGL version or Vulkan is used; with earlier OpenGL versions or DX11, they can’t do it. It’s all about optimization. That’s exactly why a PS4 Pro can run VR apps with a third of the graphics horsepower of the minimum spec for an Oculus Rift system.

        1. Hmm, your level of graphics system knowledge seems to have scared away the Macglobin while making him look like an idiot in the process. Well played. (Who’s the true unpleasant peasant now?)

        2. I always like it when that happens. Oftentimes when I make a comment contradicting something someone has said that is definitely wrong, they either call me names or insist that I’m some kind of fraud who doesn’t know what he’s talking about. I’ve stopped commenting on political threads for this reason; I just can’t do it anymore. I’m going to stick to technology-related articles on topics I understand thoroughly (graphic design/visual development, film editing, video compression/screen technology, networking, computer hardware, and application development).

          Misunderstanding of GPU technology and functionality is incredibly widespread: someone watches a four-minute YouTube video from Linus and all of a sudden they’re an expert in the field? That would be fine if Linus (although entertaining and smart) weren’t frequently incorrect about basic concepts like thermal throttling, resource allocation, how parallel computing works, and other things related to what I do for a living. If he’s talking about video games? Then yeah, he’s right a lot. But that’s not the end-all, be-all of computing, and my business (architecture, video production, CAD work, and graphic design) is a significantly larger segment of the global economy than PC gaming.

        3. As someone who has spent years writing directly to GPUs, I don’t understand your “bloated mess” interpretation.

          Currently shipping Macs do not support higher-end cards, which has been a source of frustration for many of us ever since the 2013 Mac Pro took the “Pro” out of “Pro”.

          Secondly, while APIs evolve to be more efficient, those efficiencies take time to adapt to and are often not as complete as non-developers might think.

          Finally, the big hurdle for VR relative to other graphics processing is that latency must be VERY, VERY low. Slower cards that might seemingly generate stereo graphics at a decent frame rate for normal games will give someone in a VR headset nearly terminal nausea.
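
          To put rough numbers on that (my own back-of-envelope figures, not anything from the article): the Rift refreshes at 90 Hz, and the usual comfort rule of thumb is keeping total motion-to-photon latency under roughly 20 ms, with both eye views rendered inside every frame budget.

            # Rough VR timing budget in Python (illustrative numbers, not Oculus specs).
            def frame_budget_ms(refresh_hz):
                """Milliseconds available to render one frame at a given refresh rate."""
                return 1000.0 / refresh_hz

            for hz in (60, 90, 120):
                print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
            # 60 Hz -> 16.7 ms, 90 Hz -> 11.1 ms, 120 Hz -> 8.3 ms

            # Commonly cited comfort ceiling for total motion-to-photon latency.
            MOTION_TO_PHOTON_MS = 20.0
            # At 90 Hz the renderer gets ~11 ms to draw BOTH eye views, and the whole
            # sensor-to-display pipeline still has to land under ~20 ms.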

          It is not the least bit surprising that the highest-end GPUs are necessary for first-generation VR hardware, or that Apple’s neglect of the high-end GPU market has handicapped it here.

          Hopefully, in the near future, Apple provides Mac options that allow the highest-end cards to be used (and regularly upgraded, for those of us who need that).

        4. Here’s why I call their software a “bloated mess” (this is from the Wikipedia article on GPGPU computing, to which I am a contributor):

          Any language that allows the code running on the CPU to poll a GPU shader for return values can create a GPGPU framework.

          As of 2016, OpenCL is the dominant open general-purpose GPU computing language, and is an open standard defined by the Khronos Group.[9] OpenCL provides a cross-platform GPGPU platform that additionally supports data parallel compute on CPUs. OpenCL is actively supported on Intel, AMD, Nvidia, and ARM platforms.
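
          (An aside from me, not part of the Wikipedia text: here is a minimal sketch of that CPU-polls-GPU model, using Python and PyOpenCL, assuming PyOpenCL and a working OpenCL driver are available. The kernel name and array size are just placeholders for illustration.)

            import numpy as np
            import pyopencl as cl

            # Host (CPU) side: set up a context and command queue for the GPU.
            ctx = cl.create_some_context()
            queue = cl.CommandQueue(ctx)

            a = np.arange(1024, dtype=np.float32)
            mf = cl.mem_flags
            a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
            out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

            # Device (GPU) side: a tiny kernel that squares each element.
            prg = cl.Program(ctx, """
            __kernel void square(__global const float *a, __global float *out) {
                int gid = get_global_id(0);
                out[gid] = a[gid] * a[gid];
            }
            """).build()

            prg.square(queue, a.shape, None, a_buf, out_buf)

            # Poll the GPU for the return values by copying the buffer back to the host.
            result = np.empty_like(a)
            cl.enqueue_copy(queue, result, out_buf)
            print(result[:4])   # [0. 1. 4. 9.]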

          The dominant proprietary framework is Nvidia CUDA.[10] Nvidia launched CUDA in 2006, a software development kit (SDK) and application programming interface (API) that allows using the programming language C to code algorithms for execution on GeForce 8 series GPUs.

          Programming standards for parallel computing include OpenCL (vendor-independent), OpenACC, and OpenHMPP. Mark Harris, the founder of GPGPU.org, coined the term GPGPU.

          OpenVIDIA was developed at University of Toronto during 2003-2005,[11] in collaboration with Nvidia.

          Microsoft introduced the DirectCompute GPU computing API, released with the DirectX 11 API.

          Alea GPU[12] created by QuantAlea[13] introduces native GPU computing capabilities for the Microsoft .NET language F#[14] and C#. Alea GPU also provides a simplified GPU programming model based on GPU parallel-for and parallel aggregate using delegates and automatic memory management.[15]

          MATLAB supports GPGPU acceleration using the Parallel Computing Toolbox and MATLAB Distributed Computing Server,[16] and third-party packages like Jacket.

          GPGPU processing is also used to simulate Newtonian physics in physics engines; commercial implementations include Havok Physics/FX and PhysX, both of which are typically used for computer and video games.

          Close to Metal, now called Stream, is AMD’s GPGPU technology for ATI Radeon-based GPUs.

          C++ Accelerated Massive Parallelism (C++ AMP) is a library that accelerates execution of C++ code by exploiting the data-parallel hardware on GPUs.

          Pre-DirectX 9 video cards only supported paletted or integer color types. Various formats are available, each containing a red element, a green element, and a blue element. Sometimes an alpha value is added, to be used for transparency. Common formats are:

          8 bits per pixel – Sometimes palette mode, where each value is an index in a table with the real color value specified in one of the other formats. Sometimes three bits for red, three bits for green, and two bits for blue.
          16 bits per pixel – Usually the bits are allocated as five bits for red, six bits for green, and five bits for blue.
          24 bits per pixel – There are eight bits for each of red, green, and blue.
          32 bits per pixel – There are eight bits for each of red, green, blue, and alpha.
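
          (Aside from me, not the Wikipedia text: a quick sketch in Python of how one of those packed layouts works, using the common 16-bit 5-6-5 split.)

            # Pack and unpack a 16-bit pixel: 5 bits red, 6 bits green, 5 bits blue.
            def pack_rgb565(r, g, b):
                """r and b in 0..31, g in 0..63 (already quantized from 8-bit values)."""
                return (r << 11) | (g << 5) | b

            def unpack_rgb565(pixel):
                return (pixel >> 11) & 0x1F, (pixel >> 5) & 0x3F, pixel & 0x1F

            p = pack_rgb565(31, 63, 0)          # saturated red + green = yellow
            print(hex(p), unpack_rgb565(p))     # 0xffe0 (31, 63, 0)
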
          Floating-point numbers
          For early fixed-function or limited programmability graphics (i.e., up to and including DirectX 8.1-compliant GPUs) this was sufficient because this is also the representation used in displays. This representation does have certain limitations, however. Given sufficient graphics processing power even graphics programmers would like to use better formats, such as floating point data formats, to obtain effects such as high dynamic range imaging. Many GPGPU applications require floating point accuracy, which came with video cards conforming to the DirectX 9 specification.

          DirectX 9 Shader Model 2.x suggested the support of two precision types: full and partial precision. Full precision support could either be FP32 or FP24 (floating point 32- or 24-bit per component) or greater, while partial precision was FP16. ATI’s Radeon R300 series of GPUs supported FP24 precision only in the programmable fragment pipeline (although FP32 was supported in the vertex processors) while Nvidia’s NV30 series supported both FP16 and FP32; other vendors such as S3 Graphics and XGI supported a mixture of formats up to FP24.

          Shader Model 3.0 altered the specification, increasing full precision requirements to a minimum of FP32 support in the fragment pipeline. ATI’s Shader Model 3.0 compliant R5xx generation (Radeon X1000 series) supports just FP32 throughout the pipeline while Nvidia’s NV4x and G7x series continued to support both FP32 full precision and FP16 partial precisions. Although not stipulated by Shader Model 3.0, both ATI and Nvidia’s Shader Model 3.0 GPUs introduced support for blendable FP16 render targets, more easily facilitating the support for High Dynamic Range Rendering.

          The implementations of floating point on Nvidia GPUs are mostly IEEE compliant; however, this is not true across all vendors.[18] This has implications for correctness which are considered important to some scientific applications. While 64-bit floating point values (double precision float) are commonly available on CPUs, these are not universally supported on GPUs. Some GPU architectures sacrifice IEEE compliance, while others lack double-precision. Efforts have occurred to emulate double-precision floating point values on GPUs; however, the speed tradeoff negates any benefit to offloading the computing onto the GPU in the first place.[19]
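
          (Another aside from me rather than the quoted article: the FP16 / FP32 / FP64 precision gap is easy to see directly with NumPy.)

            import numpy as np

            # The same value stored at half, single, and double precision.
            x = 1.0 + 1e-4
            print(np.float16(x))   # 1.0     (the 1e-4 is below FP16's resolution near 1.0)
            print(np.float32(x))   # 1.0001
            print(np.float64(x))   # 1.0001

            # Machine epsilon (smallest representable step above 1.0) for each type.
            for t in (np.float16, np.float32, np.float64):
                print(t.__name__, np.finfo(t).eps)
            # float16 ~9.8e-04, float32 ~1.2e-07, float64 ~2.2e-16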

          Vectorization
          Most operations on the GPU operate in a vectorized fashion: one operation can be performed on up to four values at once. For example, if one color is to be modulated by another color, the GPU can produce the resulting color in one operation. This functionality is useful in graphics because almost every basic data type is a vector (either 2-, 3-, or 4-dimensional). Examples include vertices, colors, normal vectors, and texture coordinates. Many other applications can put this to good use, and because of their higher performance, vector instructions, termed single instruction, multiple data (SIMD), have long been available on CPUs.
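
          (Last aside from me: that “modulate one color by another in a single operation” idea maps directly onto array code, shown here with NumPy standing in for the GPU’s SIMD hardware.)

            import numpy as np

            # Two RGBA colors as 4-component vectors, the GPU's native data shape.
            base = np.array([0.8, 0.6, 0.4, 1.0], dtype=np.float32)
            tint = np.array([0.5, 1.0, 1.0, 1.0], dtype=np.float32)

            # Component-wise modulation: one vector multiply instead of four scalar ones.
            print(base * tint)                  # [0.4 0.6 0.4 1. ]

            # The same idea scales to a whole image: every pixel tinted in one expression.
            image = np.random.rand(1080, 1920, 4).astype(np.float32)
            print((image * tint).shape)         # (1080, 1920, 4)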

          Essentially, FP32 has become the default metric for high-end graphics chips, primarily due to shader performance. But raw floating-point throughput is not the biggest factor in GPU performance for general-purpose systems and tasks. If you’re specifically pointing to Nvidia chips as the “high end,” then you don’t work much in deep learning or pro CAD applications that rely on OpenCL/OpenGL to produce their results. AMD’s FirePro cards, although not great at gaming compared to the GeForce line, are extremely competitive with Quadros when it comes to professional applications.

          And my example of the PS4 holds true: even a base PS4, with a GPU from 2013 that has approximately 1.8 teraflops of FP32 performance, can run VR apps because of optimization for that hardware. Oculus could easily bring the minimum requirement down to a similar level, but they can’t because of their code base. It is easier to throw higher-performance hardware at the problem than to optimize the software, since that reduces everyone’s workload. I wouldn’t be surprised if they could shrink their codebase by 50% if they tried, and if they adopted Vulkan support, any GCN card from 2012 onward would be able to use the product, albeit at a lower resolution of around 960p; it would still work, and work well. Similarly, if application developers would stop leaning on single-core performance and take advantage of distributed workloads, we could see a massive efficiency improvement.
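
          For what it’s worth, the headline teraflop figure I’m quoting comes from a simple peak-rate formula (shader ALUs times clock times two FLOPs per fused multiply-add); a quick sketch with the base PS4’s published numbers:

            # Peak FP32 throughput = ALU count x clock (GHz) x 2 FLOPs per fused multiply-add.
            def peak_fp32_tflops(alu_count, clock_ghz):
                return alu_count * clock_ghz * 2 / 1000.0

            # Base PS4 GPU: 1152 shader ALUs at 0.8 GHz.
            print(round(peak_fp32_tflops(1152, 0.8), 2))   # ~1.84 TFLOPS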

  2. Mitchell is pretty specific about the possibility that something will change in about six months. If I were Apple, this is a company that I would be recruiting to develop software that I could demonstrate with new hardware in… September, maybe.

  3. Totally whatever. I wish they’d take the rest of their spyware sh*t with them. What is the ratio of Apple users on Facebook (and by extension Instagram, Oculus, etc.) compared with other companies? I’d be curious to know. Who can say? If VR proves to be a viable business (currently it most definitely isn’t), Apple may just ‘reinvent’ it, too.

  4. AFAIK the technology hasn’t progressed beyond the 2014 neural differential benchmark. Many people like me, predominantly female, can’t use it. That may not qualify as a tragedy, but it makes its widespread acceptance in scientific research and in military and aviation simulators problematic. Not to worry, they can still make money on immersive videogames tailored to the emotional needs of 18-35 male subjects.

    https://qz.com/192874/is-the-oculus-rift-designed-to-be-sexist/

    1. You’re right, and that’s why AR is a much better solution with current technology. Until we get to the point of having a fully holographic 3D virtual space that all of our brains can process effectively, à la the holodeck, VR will remain a curiosity, as you said, for video gamers, at least in the consumer space.

  5. I’m still puzzled as to why Facebook has to update every two weeks. I’m still on Version 26. I don’t post anymore and just use it to see the news headlines, then hop over to the original site without clicking the FB link. Makes me think that because Rift can’t get anyone’s data on OS X, they don’t want to lose money.

  6. Facebook? That other rogue company. Apple should ban any Facecrook products from infesting their ecosystem.

    As for VR, the way Facecrook sees this new sector is the wrong way to approach it. A spoiled-kid toy for rich kids. Ya! I’ll buy 2 to go!

    Gesh!

  7. Since Apple apparently does not give a shit about the Mac, why should anyone developing HW/SW care?
    Apple these days is about Vampire Video for everybody; the cost of a discrete GPU/CPU setup with upgradeable memory is measured in the thousands of dollars.
    There is software being sold on the Mac App Store today that will not run on most of the hardware they sell, and it is not even leading-edge in its requirements. This simply should not be.

    1. What applications on the App Store don’t run on the majority of Mac hardware? The general requirements are a 64-bit processor and macOS 10.11.6, so anything that can run that OS can run the vast majority of applications. Can you list some so I can become familiar with them and inform clients, employees, and myself?

        1. That doesn’t really answer my question, and it also seems to be false as I look through the store… from what I’ve been reading in the games section, almost every Mac made after 2011 can play every game, since the minimum requirement seems to be Intel HD Graphics 4000, which is the Ivy Bridge graphics chipset. That chipset is on par with mid-range dedicated graphics from around 2010 in terms of raw performance. I’m still failing to see how that isn’t a majority of what they sell hardware-wise; the new Civ game looks like it will even run on one of my old iMacs from 2010… as for other applications, the requirements seem to be a 64-bit processor and macOS 10.11.6 or 10.12.2, which covers a pretty solid group of systems.
