Apple’s inevitable operating system convergence will be about input, not interface

“There will be a day when our iPads and MacBooks run the same operating system,” Michael Simon writes for Macworld. “Despite what Tim Cook wants us to believe, the crossover potential is simply too tantalizing for it not to happen; in fact, I’m willing to bet that somewhere deep inside Jony Ive’s laboratory there already exists a machine that merges the best of the mobile and desktop operating systems into a singular first-class experience.”

“But if the Surface and Pixel C have taught us anything, it’s that the two worlds aren’t quite ready to meet,” Simon writes. “Even if Apple has dreamed up a technical solution to bring iOS and OS X together under one roof, there are a slew of little wrinkles that need to be ironed out before such a dream machine can even be considered for a keynote slot.”

“It doesn’t take a computer scientist to see that iOS isn’t built for a mouse and OS X isn’t made to be operated with our fingers. So, while a junction may be inevitable, it’s more likely to come after a continued, gradual shift rather than a sudden one,” Simon writes. “Any semblance of convergence needs to begin with a way to meld the disparate ways we interact with our devices. And I think that’s precisely what Apple is trying to do.”

Read more in the full article here.

MacDailyNews Take: Using new input devices and paradigms like the Apple Pencil and the iOS 9 trackpad enabled by 3D Touch* is a sublime and eye-opening experience. Anything is possible and the possibilities are virtually limitless!

*With iPhone 6s and iPhone 6s Plus, just lightly press on the keyboard with one finger and it becomes a trackpad. It’s beyond useful, makes things like text selection a snap, and points beckoningly (pun intended) to the future!

10 Comments

    1. You always have to be careful in your interpretation of statements made by Apple management. That was true for Steve Jobs and it is true for Tim Cook and others.

      There is what has been done – what came before. There is what is being done (generally led by Apple), such as multitouch and the new Apple Pencil. And there is what can be done, some of which is currently being investigated in laboratories at Apple and other companies, and some of which is still far in the future. Apple management is usually talking about the present and the very near future – perhaps a few years, at most. They do not reveal the R&D that is leading to the next great thing.

      As with many things that Steve Jobs said, there will be no “XX” right up until the time that Apple releases “XX”. Then it will become the new paradigm, the new possible. And all too soon, people begin to take it for granted and look for the next revolution.

      When the methods of interacting with computers and other devices reach a point at which convergence is possible and useful, then you can bet that Apple will have already been working on it for years. In my opinion, the key is input and output. How do you provide guidance and data to the device (particularly miniature devices), and how do you extract useful data from the device in a natural and efficient way? Using a finger on my iMac is not the primary way that I would prefer to interact with it – a mouse is far better for most tasks. Pairing a mouse with an iPhone, however, is probably not a great option for most people. Eventually, new methods of interaction with computing devices will likely enable new revolutions in personal technology. I hope that Apple continues to lead the way.

  1. Had Steve lived for five more years, the OSes and devices would already have converged into a unified platform. He was merciless with respect to his vision and had no patience for legacy systems and legacy thinking.

    Ever since humans started making and using hand tools (in the Bronze Age), we have been working directly with our objects of work: cutting, slicing, grinding, scraping, writing, painting, bending… Whether with our hands or with various implements, we physically connected with these objects.

    Then, sometime in the last century, someone invented this novel concept of interacting with a mechanical device, where you would provide mechanical control input on one device (a keyboard or mouse), and the object of your work would respond on another, several feet away (the display). The system was (and continues to be) rather unintuitive, and requires us to learn how to use it. Those of us who learn it at an early age adapt to it easily; those who start in adulthood take a lot of time, but eventually figure it out. Still, unlike all other work and tools we interact with directly (pencil, saw, chisel, ruler, brush, screwdriver…), the keyboard-mouse-display concept requires training and effort before one can comfortably use it.

    Nine years ago, we got the iPhone. The interaction with our object of work again became direct; we no longer have to manipulate some mouse in order to interact with an application on display. The interaction is direct and intuitive. No training is required anymore; one-year-old babies and 90-year-old grannies intuitively figure out how to use these devices, because they interact and respond directly, much like the pencil, scissors, glue stick, drill…

    There is no valid reason why Mac OS can’t lose the keyboard/mouse interface and move over to multi-touch. The iPad Pro is a great indication of how this might work. Obviously, you would no longer have to put the display upright; instead it can be lying flat (or tilted) on the desk, the way we always had large work surfaces (drafting table, spreadsheets, notepads, etc). The only minor issue would be to develop an accurate interface that would reliably differentiate between a deliberate finger touch and a resting palm / arm. A mouse as a pointing device is imprecise and difficult to articulate precisely (which is why we need to zoom in so far in Illustrator in order to hit the correct pixel). This explains the popularity of Wacom devices among illustrators. A Mac with the Apple Pencil is no different than an iPad Pro, except for greater muscle and display size.

    Apple has made many radical transitions in its history (68k to PPC, System 9 to OS X, PPC to Intel, 32-bit to 64-bit, etc). In each of those transitions, it dragged its developers (some kicking and screaming) along, and together with its core user base, it pushed everyone into the future. Nothing prevents Apple from doing this again and getting everyone to finally abandon the abomination that is the mouse. It has served us very well (John Dvorak’s opinion on the “newfangled device” notwithstanding), and it is time to retire it in favor of a truly intuitive multi-touch Mac.

    1. “instead it can be lying flat (or tilted) on the desk, the way we always had large work surfaces (drafting table, spreadsheets, notepads, etc).”

      WONDERFUL! /s
      Don’t forget, we could have had horizontal or nearly horizontal monitors all along. Some desks even did that with CRT displays. We put screens vertically in front of us because it is INSANELY more comfortable than spending all day with the neck bent way over to look down at the desk surface.

      “Mouse as a pointing device is an imprecise and difficult to precisely articulate”
      It’s incredibly more precise and easy to articulate than the end of my big fat finger. And if using a pen, you’re still going to have to zoom in, in Illustrator.

      Adapting Pages to be compatible with the touch environment of iOS has resulted in a castrated cripple. If crippling functionality is part of the price of this convergence, no f’ing thanks.

  2. “The interaction with our object of work again became direct; we no longer have to manipulate some mouse in order to interact with an application on display.”

    Wow, you are a master of oversimplification. Touch on iOS can do very simple things. Be honest and you will realize that is true. In the overall scheme of how computers are used, touch can only handle a small percentage. Just because Apple makes most of its profit from iOS devices, we should not overestimate how much actual computing and content production could actually be done on iOS alone.
    “The only minor issue would be to develop an accurate interface that would reliably differentiate between a deliberate finger touch and a resting palm / arm”. A finger is much larger than a cursor, so much for precision. Yes I know, I know, there will be a” breakthrough”, as if you could go down to Walgreens and buy a “breakthrough” off a shelf.

    That IS NOT a minor issue.

    Look honestly at what you can or can’t actually do on iOS. I am an HTML5 motion graphics and web producer, and the developers of the various apps I use to produce elements (over a dozen separate apps) ALL state that there are no plans for a touch interface: there just is not enough “horsepower,” and the needed functions are too complex to be done in a touch interface, requiring far more than what it takes to produce a text document or a simple drag-and-drop do-it-yourself website (VASTLY over-rated, by the way). I have some motion graphics files that use over 100 separate timelines to produce a motion graphic that runs 23 seconds, each timeline with as many as 50 or more time, color, and size events. That 23-second animation can take up to a week to produce, with hundreds of elements making up the events.

    If you can tell me that there will be a pie in the sky setup like that, I will buy the pie. Until then……

  3. OS X just being a touch interface? No thanks. As a VoiceOver user, I rather like the keyboard, a lot. It’s better, for example, going to the Dock and pressing S for Safari, then Return, than going to the home screen and moving my finger around to find Safari, then double-tapping on that icon once I’ve already found it.

    Sent from my iPhone

