Apple’s Jony Ive explains why Apple ‘many, many years ago’ decided against adding touchscreens to the Mac

“Jony Ive says thinking different is actually pretty easy,” Connie Guglielmo reports for CNET. “‘Doing something that’s different is actually relatively easy and relatively fast, and that’s tempting,’ says the man who’s had a hand in every major Apple product design — from the colorful iMac and iBook to the iPod, iPad, iPhone and Apple Watch. ‘We don’t limit ourselves in how we will push — if it’s to a better place. What we won’t do is just do something different that’s no better,’ Ive said in an interview earlier this week to explain the design of the MacBook Pro, a major reboot of Apple’s most powerful laptop line.”

“That thinking explains why Apple ‘many, many years ago’ decided against adding touchscreens to the Mac, even as rivals dressed up Windows tablets and PCs with multitouch displays,” Guglielmo reports. “Instead, after two years of tinkering with larger touchpads and other approaches he won’t reveal, Ive and his team came up with a slim, multitouch strip that replaces the function keys at the top of your keyboard.”

Ive, who holds over 5,000 patents, spoke with CNET News Editor in Chief Connie Guglielmo about why the Touch Bar is just “the beginning of a very interesting direction” for Apple.

Jony Ive: When we were exploring multitouch many, many years ago, we were trying to understand the appropriate application and opportunities for [it]. We just didn’t feel that [the Mac] was the right place for that…. It wasn’t particularly useful or an appropriate application of multitouch… For a bunch of practical reasons. It’s difficult to talk [laughs] without going into a lot of details that puts me starting to talk about things that we are working on. I don’t really want to talk much more about it.

Read more in the full article here.

MacDailyNews Take: As we’ve been explaining for many years, Apple does touch right and, as usual, Microsoft and the Windows PC assemblers do it wrong.

Again, as we wrote of the new Touch Bar during our live coverage yesterday, “This is the smart way to have Multi-Touch on your personal computer, as opposed to the stupidity of smearing fingers all over your Retina display.”

To us longtime Apple watchers, Cupertino seems to be saying, “Multi-Touch on the screen only when trackpads are not part of the device.” – MacDailyNews, November 19, 2008

Apple long ago tested and rejected what Microsoft is trying to sell to fools today.

Does it make more sense to be smearing your fingers around on your notebook’s screen or on a spacious trackpad that’s designed specifically and solely to be touched? Apple thinks things through more than other companies… The iPhone’s screen has to be touched; that’s all it has available. A MacBook’s screen does not have to be touched in order to offer Multi-Touch™. There is a better way: Apple’s way. – MacDailyNews, March 26, 2009

SEE ALSO:
Apple’s new MacBook Pro’s Touch Bar runs iOS on Apple’s T1 processor – October 28, 2016
Apple does touch right and, as usual, Microsoft does it wrong – October 28, 2016
Amazon offers big savings on Apple’s all-new Touch Bar-equipped MacBook Pro models – October 28, 2016
IBT: Apple’s MacBook Pro Touch Bar is the coolest thing ever; will change the way we use laptops – October 28, 2016
Wired hands on with Apple’s New MacBook Pro: It’s a whole new kind of laptop – October 27, 2016
CNET on the new MacBook Pro: Apple’s amazing strip show reinvents the notebook – October 27, 2016
Hands on with Apple’s new MacBook Pro: Looks and feels so good it’s unreal – October 27, 2016
Apple debuts three new TV ads for all-new MacBook Pro with Touch Bar – October 27, 2016
Apple unveils groundbreaking new MacBook Pro with revolutionary Touch Bar and huge Force Touch trackpad – October 27, 2016

37 Comments

  1. I can see a few use cases involving stylus input – architecture, art, etc. – that benefit from the availability of a touch-sensitive display. The natural approach for an artist would be to apply the Apple Pencil to the display, not a trackpad or Touch Bar. In those cases, the display also needs to be reclined substantially relative to the typical upright orientation, and the body/keyboard of a typical (non-convertible) laptop tends to get in the way a bit.

    Apple offers the iPad Pro for these types of situations, but it runs iOS, not macOS, and that appears to be a problem for some. Should Apple address this niche market by enabling Apple Pencil functionality on MacBooks and iMacs, at least as an option? Not sure…

    1. In theory I agree with Apple & MDN. However, we’re living in an increasingly touchscreen world, from the elevator to the supermarket checkout to cars… it’s becoming ingrained in mindshare: see screen, touch screen. It’s mostly Apple’s fault for doing such a superior job with the first smart multitouch phone. Now people expect their screens to be touchable. Young people are growing up in a multitouch world. So while Apple might be technically right on the usefulness, or gorilla-arm syndrome and fingerprints… mindshare is going another way.

      Just like it did for a larger smartphone. Apple might resist developing a touchscreen Mac for a while, but I hope not too long. Apple is pretty smart. If they want to solve the fingerprint problem, they will. As for the gorilla arm, that will require an actual ergonomic redesign of people’s workstations. Apple is smart enough to do that too.

      If I were in Apple’s position, though, I’d try to find a way to connect the iPad Pro to the iMac or Mac Pro in a touchscreen capacity for now. That would be cool.

      1. You can have this now with the Duet app. Your iPad is connected to the host Mac via a Lightning cable, so there is no latency, and the tablet charges as a side benefit. There is a pro version that supports tilt and pressure data from an Apple Pencil, but it carries a yearly use fee.

        I own the basic version and can say it works as advertised. You will want to turn down the screen resolution to get bigger touch targets. The Mac sees Duet as another monitor, so you change it in System Preferences > Displays, where you can set the resolution and adjust the relative position of the iPad to other screens. You can set the iPad to mirror the main display as well.

        1. I agree with most of what’s said about using touch as an input method for PCs in general. Ergonomically it doesn’t work… but like almost everything, there are exceptions.

          The MS Surface Studio answers one of those exceptions, probably the major one: an ideal input method for people who use computers but are graphic arts people who traditionally use pencils, pens, brushes and similar tools. Is it for everyone… no. But I’ll be surprised if the Studio isn’t a success. It would have to be junk to fail.

          As a graphics person who sees the Surface Studio as the answer to a dream (except for the Windows UI), I have a major problem with the duet solution…beyond the kludgey iPad / Mac issue that MS has solved with one device that does it all.

          The yearly use fee. I don’t do SAS (software as subscription)… I have nothing to do with Adobe anymore and, as of now, don’t see that ever changing. I sure as heck am not going to get into HAS (hardware as subscription).

        2. See, I’m a graphics person too, and every once in a while I’ll still use an actual drafting board to do mock-ups. But those instances are diminishing as design becomes more digital for a mostly digital environment. People are now consuming their information on digital devices, which means design itself is changing. So, while cool, the MS Surface Studio would have been a bigger hit 5-10 years ago when traditional printing was stronger. It might still be a hit within the advertising industry, but those people are generally Apple loyalists and will likely wait for Apple to build a bigger iPad. LOL. Mobility.

          I actually don’t know what’s better… whether Apple should continue to maintain two distinct but brilliant operating systems or slowly merge them over time as touch devices surround us in all aspects of life, from refrigerators to flying cars.

          I would imagine Apple wrestling with the application of all the opportunities and ideas.

          Still, I hope that Apple designs a brilliant iMac Touch, if only to shut up the yammering morons who think that Apple is not innovating anymore.

    1. And MDN pointed out it only pertains to devices without other means of input. Also, iPhones and iPads are not held vertically (like a computer monitor) but more horizontally, as a true input surface should be, like a Wacom stylus tablet or the Touch Bar.

      Touchscreen computer displays are an awful solution, especially if you use your screen for judging images, and the fact is your arm will get tired hanging in space. An augmenting device, like a Touch Bar or Wacom tablet, is the ideal way to interface with a desktop computer. Just because you can, Microsoft, doesn’t mean you should, as Apple has learned.

      1. Bullshit. I’m not a Microsoft user, but I have used a touch screen Windows machine, and where appropriate, touch is a better way to go.

        All you fanbois say this until your beloved Apple comes up with it. Then, it’s the greatest thing ever.

        1. Calling us fanbois only weakens your argument. I don’t think those who want to use desktop touch screens should be denied, just as those who don’t should be able to ignore it.

          The difference is that when and if Apple does something, it usually shows a lot more thought behind it than Microsoft or others are capable of.

    2. It’s not the Retina display. iPhones and iPads have touch-optimized user interfaces. Macs do not. Trying to shoehorn a touch UI onto a desktop OS results in undesired consequences, as Windows shows.

  2. Have you seen the computer screens on these Windows touchscreen machines? They’re a mess full of dirt, fingerprints and food stains, just disgusting… lol. I’m happy with the big multitouch pad and soon the strip. Bring it to the iMac keyboard, pleeeeeassseee!

  3. Michael, I concur. I’m not necessarily for a Mac with a touchscreen, but MDN’s argument of smearing your fingers on the screen is not convincing—we do it all the time with iPads and iPhones.
    In the article, Jony Ive did not explain why Apple decided against touchscreens, he simply stated that they decided against them “for a bunch of practical reasons.” He declined to give any of them.

    1. It’s been noted elsewhere: the iPad and iPhone have no other input method. The screen is designed for input, which is typically done in a near-flat, horizontal position. A laptop screen is vertical, where one’s hand and arm are fighting gravity and awkward angles. And the laptop already has a big, flat, horizontal area dedicated to input; adding the Touch Bar to this surface is brilliant. As for fingerprints on screens, it is obviously a lot easier to wipe off an iPhone or iPad because of their flat surfaces. A laptop screen cannot simply be rubbed up and down on a shirt or pant leg to clean it the way a phone can.

  4. You hold an iPhone or iPad in your hands and interact with it exclusively via touch; a laptop is a totally different device.

    On the laptop, you have keyboard and trackpad inputs. Now you have a Touch Bar input as part of the keyboard. Your hands remain in that plane to interact with the device.

    I just don’t see myself typing along or interacting with an app by raising my hand to the screen to tap something I can do from the keyboard or trackpad. Now I can possibly do many more things from the Touch Bar without ever having to touch my screen and gum it up with fingerprints.

  5. I must admit that an iMac screen that could be rotated near horizontal and operated on using the Apple Pencil would be attractive for certain tasks. Fingerprints are not a real issue, as iPads have proven. Perhaps Apple has an even better approach in the works, but I would not be so quick to trash this creative idea just because it came from the competition.

  6. One thing that gets missed too often in this discussion is that the new, larger, ultra-sensitive trackpad is very similar to a Wacom-like device, allowing fine control and precision without the burden of stretching out to a vertical screen.

    I ordered a 15″ MBP yesterday, and I’m looking forward to exploring the Touch Bar, but I am equally excited about the elegance and precision of the new trackpad.

    Apple does indeed do touch right.

    1. @Markar12
      This will show my age, but I DO remember light pens and using them when I was with an engineering company that used special IBM cathode-ray screens that were light-pen interactive.
      These screens and pens were for a mainframe-based CAD system, developed by Lockheed, called CADAM.
      I worked on this system for the first 14 years of my design career!
      The whole thing was a CAD (Computer Aided Draughting) system in which you used a light pen to manipulate the pixels onscreen to create complex draughting geometry.
      These workstations and the accompanying mainframe computer were HUGE by personal computer standards today!
      I remember this CAD system because I started to bend the software to my will, and I was able to use it for some REAL decent-looking graphic design work for the company and for myself, just around the time Apple first introduced the Macintosh!

  7. An iMac with a screen that could be rotated near horizontal and operated on with an Apple Pencil would be quite useful for certain applications. The fingerprint issue is not really a problem, as iPads have shown. Perhaps Apple has an even better idea in the works. This is a good idea that should not be trashed just because it came from the competition.

    1. About nine years ago there was an Apple-approved third party touch screen version of the Mac, the ModBook. It got a lot of attention at the time, and it worked. But sales were not enough to keep the company afloat. Several years later the original engineer of the ModBook created a new company and tried again with the same ultimate result.

      At the time I was writing chemical and math formulas a lot, plus rough sketches of lab setups, so I got a ModBook. It allowed me to make such entries using a stylus. For entering searchable quick notes, I found a handwriting app that did a good job of recognizing my cursive handwriting (better than anything that works offline that I’ve found for my iPad Pro).

      But for most of the work I did on that computer, the work was done much more efficiently with an Apple Bluetooth keyboard than with a stylus. That’s still the case with my iPad Pro, when I’m doing heavy text entry or working with a spreadsheet or database.

      Steve Jobs and Ive were right. For most tasks one does with a computer, touchscreen and stylus input isn’t the best ergonomic approach.

  8. I wonder if this new, larger trackpad has capabilities that weren’t revealed yesterday due to not being ready for primetime. Like those of the aforementioned Wacom devices?

    Just a thought…

  9. Judging by the smoking-fast latest processor in the iPhone 7, if you need a touch screen, then buy the next iPad Pro and attach the keyboard you desire. With this option, why should Apple bother making a touchscreen laptop when it’s already happening the other way around?

    1. I use the large iPad Pro with a keyboard that, frankly, looks pretty much the same as an MBP keyboard. I can use Mac commands like spacebar to scroll in Safari or Command-Tab to see the app switcher, like in macOS. Sometimes I forget I’m working on an iPad, and it feels more like a touchscreen MBP. It would be bad ass if someone marketed an iPad mount that could go from vertical to a horizontal drafting-table position smoothly, like the new Surface Studio.

  10. If you use your laptop as a docked workstation with a big screen for graphic design and video editing, how are you supposed to ergonomically use this control strip with a dual monitor setup? Are they going to release this strip on a bluetooth keyboard so it can be used with the laptop closed and two big monitors?

  11. If Apple were really smart, they would release a wired numeric keyboard with a T1 chip and a Touch Bar immediately. Throw in two USB 3.1 ports and an SD slot and that would be awesome. This way, nearly every Mac owner could use the Touch Bar, and developers would be happy.

    In addition, they should make it possible to fully integrate an iPad Pro into the Mac to use it as a Wacom Cintiq for the artist/designer/architectural market. Possibly integrate it with Final Cut Pro and Logic Pro too. Imagine stepping back from the recording console with your iPad Pro in hand to listen to a recording while still having full control. To me, this would sell hundreds of thousands of iPad Pros and five million keyboards. But they won’t do this because it would be less than a billion in total revenue. 🙂

    Obviously, Apple has the technology to do this today.

  12. Jony Ive has become quite rigid in his thinking. There’s a contradiction here: we’re led to believe that the large-screen iPad Pro is all the computer most of us will ever need and is also ideal as a creative device; Jony even built a stylus for it. Tim Cook says he uses his for everything. So if the iPad Pro can be used for all things, then why can’t the iMac or MacBook be used in multiple ways? Why not let the consumer decide what they want to use their device for? Let’s face it, we’re all different and we all work in different ways, so why can’t we have more flexible products? Serious questions.

    1. The answer is simple. Ergonomically, a vertical touchscreen is a nightmare and goes against the most basic principles of user interface design. MDN often refers to this as “gorilla arm,” and in many ways, it is true.

      However, a flat-lying (slightly inclined) touchscreen would ultimately make for the most intuitive, most ergonomic and most efficient user interface out there.

      I have often argued that iOS and macOS would eventually converge into one system, based on the iOS paradigm, and that the traditional computing concept of keyboard-mouse-display would be relegated to history, replaced by a multi-touch display with an auxiliary QWERTY keyboard for when long-form text needs to be entered.

      For the entire duration of human existence, we have directly interacted with the object of our work. Cutting, grinding, slicing, drawing, squeezing, pressing, screwing… whatever the action, we did it either directly with our fingers or using tools in direct contact with the object. Then, some 60 years ago, someone invented an awkward and unintuitive concept in which a display showed our work, a keyboard with buttons was used to interact with it, and the two were completely disconnected, several feet apart. You couldn’t simply walk up and use it; you had to be trained to interact with it. Even the GUI that came with the Lisa and the Mac was troubled by the same problem: the work was there on the display, but you interacted with it through devices several feet away. We trained ourselves and learned how to use them. We still do, even though ten years ago Jobs showed the world that there is a better way.

      While Jobs was alive, the progression line was quite clear. The iPhone was there to introduce the “paradigm shift” (the by-now worn-out phrase for a major change in the way we do things). We embraced multi-touch because you didn’t have to learn anything in order to use it (8-month-old children could). The iPad was a logical extension of the concept into a device that does things more resembling desktop computing. Had Jobs lived a few more years, he would no doubt have killed the Mac as we know it and extended iOS into the Mac arena by building a touchscreen iMac-sized device that lies flat (or inclined) on the desk and is operated with a (multi)touch-only interface. It may or may not have come with a bundled keyboard and/or pencil, but with Jobs alive, Adobe, Microsoft and other major developers would likely have been dragged into porting their flagship products over.

      Unfortunately, he passed away before he could finish his goal, and the current Apple management simply doesn’t have that powerful RDF in order to convince all others to follow them.

  13. Definition: Gorilla Arm

    The side-effect that destroyed touch-screens as a mainstream input technology despite a promising start in the early 1980s. It seems the designers of all those spiffy touch-menu systems failed to notice that humans aren’t designed to hold their arms in front of their faces making small motions. After more than a very few selections, the arm begins to feel sore, cramped, and oversized — the operator looks like a gorilla while using the touch screen and feels like one afterwards. This is now considered a classic cautionary tale to human-factors designers; “Remember the gorilla arm!” is shorthand for “How is this going to fly in real use?”.

    IOW: Apple thought different.

  14. I tried a bit of pair programming the other day on a W10 laptop with touch. Every time I pointed at something on the screen just to remark or comment, it interrupted the workflow. I was not even pressing the screen (no fingerprints were left); it just saw everything as user input.

    I got fed up with the experience and left my co-worker with the job and did an old-style review on my own system…
