Apple CEO Steve Jobs considered saying goodbye to the pro market

“In recent years, many pros have started feeling like Apple’s jilted girlfriend. Through no fault of their own, the love just seemed to fade,” Ken Segall writes for Observatory. “Apple might claim otherwise when confronted, but the telltale signs have been hard to ignore: Mac Pro; 17-inch MacBook Pro; Final Cut Pro X; Aperture.”

“Could it possibly be? Would Apple ever even think about saying goodbye to the pro market?” Segall writes. “I hope you’re sitting down for this, but Steve Jobs did in fact once consider that very option.”

Segall writes, “This was back in the days when iMac had established itself as a global bestseller. During one of the agency’s regular meetings with Steve, he shared that he was considering killing the pro products. His rationale was as you might expect: consumer products have an unlimited upside, while pro products are aimed at a niche market that eats up major resources.”

“Obviously, the pro market has value for Apple, even if its numbers are relatively small. Pros are opinion leaders, influencers and evangelists. Their love of Apple shows up in the purchase decisions of friends, family and colleagues,” Segall writes. “So Steve ultimately renewed his commitment to the pros—but he never said that this commitment wouldn’t evolve. Clearly Apple has changed its thinking about the pro market, and how it can best serve its pro users.”

Much more in the full article – recommended – here.

29 Comments

    1. He considered this because the products were so good that the consumer-level items were more like prosumer-level… hence no need to call the pro stuff “pro” – just call them Apples

  1. This isn’t really even a story. A smart CEO *considers* everything. It’s all on the table, all the time, and he’s constantly trying to “separate the wheat from the chaff”. It’s an iterative process. One year’s wheat may be another’s chaff.

    1. Yes and no. Obviously Steve considered many options, and I’m sure some of them were pretty far out. However, Steve did not believe in wasting valuable time. When he brought an idea to a meeting like this, it was because he thought it was a viable option and wanted to hear opinions.

      1. Would have been a disaster. SJ was a great lateral thinker, but he did have his blind spots, and in this sort of thing he seemed to think like an accountant, where the bottom line is everything, rather than the evangelist who understands that its value is far more far-reaching than that. It fits with his thinking in restricting the iPhone and iPad range because its immediate effect could dilute earnings, yet as we see now, in the longer run it does that to an even greater degree while leaving the company weaker in the market itself and susceptible to negative perceptions, which further dilutes profitability. In the end the company has had to borrow and/or spend billions to preserve its share price as a result.

        1. In other words, Steve’s focus was too laser-like for his own good. Given the success of the iPad mini, I think the 7.9″ tablet form factor is vindication of Tim Cook’s strategy of broadening Apple’s product range, if that is what it is.

          Realistically, Apple will never be able to compete in the server market, which is dominated by Microsoft and Linux. The only hope for Apple with regard to the server market is to provide specific software tools to manage iPads, iPhones and Macs in a networked environment and to make owning a Mac server as cheap as possible. I believe this is the strategy they are pursuing with the Mac mini server range – unlimited user licenses at a very low cost, but with extremely limited server management tools.

          1. Not at all. We use an Xserve running OS X 10.6 Server for our mail and web hosting. All said and done, it has been running flawlessly for years, and we didn’t have to pay Microslut’s outrageous Exchange server CAL fees, etc. OS X Server: unlimited licenses for $999. So there is more to it than just managing iPads, Macs, etc. Unfortunately we are mostly a Microslut winblows house, but the Apple server is our shining jewel.

            1. Same here. I have one Xserve running mail services via Kerio and a second one as a file server for prepress and graphic design services. One has been running 24/7 since 2004 with an Xserve RAID that gets pounded pretty hard all day. We bought a spare parts kit when we got it so we could have an extra RAID controller, power supply, drive, etc. Haven’t even opened the box. Good stuff at affordable prices with low total cost of ownership. Was really sad to see them discontinue the Xserve.

  2. Anyone know how long a 2-hour movie in AVCHD takes to output to Blu-ray? An F of a long time, right? I’m considering a Mac Pro, and I’m not a pro. It just takes too long to do some stuff!
    Some people just want the BEST speed. I guess I could transcode from AVCHD to Apple ProRes and then output to Blu-ray. Then the question would become how much memory a 2-hour movie would take up in RAM. That would translate the AVCHD to Apple ProRes and make it intra-frame only. Also… if I buy enough RAM, can I do the whole editing process in RAM, without using a hard disk or SSD? (except for backup)
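As a rough answer to the RAM question above, here is a back-of-the-envelope estimate in Python. The ~150 Mbit/s figure is an assumption based on Apple’s nominal target data rate for ProRes 422 at 1080p/30; actual rates vary with frame size and content.

```python
# Back-of-the-envelope: could a 2-hour ProRes 422 movie be held entirely in RAM?
# Assumption: ~150 Mbit/s average data rate (roughly Apple's nominal target for
# ProRes 422 at 1920x1080, ~30 fps). Real-world rates vary with content.

PRORES_422_MBITS_PER_SEC = 150   # assumed average bitrate
MOVIE_HOURS = 2

seconds = MOVIE_HOURS * 3600
size_bytes = PRORES_422_MBITS_PER_SEC * 1_000_000 / 8 * seconds
print(f"Estimated file size: {size_bytes / 1e9:.0f} GB")   # ~135 GB

# So the footage alone is on the order of 135 GB before render files and the
# OS are counted; editing entirely in RAM would need well over 128 GB.
```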

    1. You’d be better off transcoding the AVCHD movie to QuickTime format, which is H.264/MPEG-4, rather than to native Blu-ray format; Blu-ray is actually MPEG-2 format. QT results in lower-density discs thanks to better compression technology.
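As a concrete illustration of the kind of transcode suggested above, here is a minimal sketch driving ffmpeg from Python. It assumes an ffmpeg build with libx264 on the PATH; the file names, CRF value, and preset are placeholders, not settings taken from the comment.

```python
import subprocess

# Hypothetical example: transcode an AVCHD clip (typically H.264 in an .mts
# container) into an H.264 QuickTime .mov. Assumes ffmpeg with libx264 is
# installed; file names and quality settings are placeholders.
cmd = [
    "ffmpeg",
    "-i", "clip.mts",       # AVCHD source clip
    "-c:v", "libx264",      # H.264 video
    "-crf", "18",           # quality-targeted rate control (lower = better)
    "-preset", "slow",      # slower preset = better compression efficiency
    "-c:a", "aac",          # AAC audio
    "clip.mov",
]
subprocess.run(cmd, check=True)
```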

    2. The Blu-ray specification includes 3 encoding options:

      – H.262/MPEG-2 Part 2
      – H.264/MPEG-4 AVC
      – VC-1

      Most commercial releases these days use H.264/MPEG-4 AVC, for example Sony’s new “Mastered in 4K” releases of “Spider-Man” / “Spider-Man 2” / “Men In Black” / “Total Recall”; “Oblivion”; the latest “Star Trek: The Next Generation”, etc.

      1. Isn’t H.265 just around the corner? It will blow Blu-ray out of the water, plus be beautifully compact – small enough to send the file over the net in its entirety, no need to stream it

        1. The next-generation High Efficiency Video codec (HEVC), H.265, has hit a major public milestone thanks to the work of the developer MultiCoreWare. MCW is launching a new commercial open-source venture around x265, and the source code for its x265 encoder is now available. Right now, the project is very much in early days — pre-alpha level code — but the x265 encoder is already impressively parallelized and supports all of the major instruction sets including AVX/AVX2 and FMA3/FMA4.

          The problem with H.264, however, is that while it can handle these types of encodes, it can’t do so while simultaneously keeping file sizes low. A new standard is necessary to push file/stream sizes back down while driving next-generation adoption, and that’s where H.265 comes in. It’s designed to utilize substantially less bandwidth thanks to advanced encoding techniques and a more sophisticated encode/decode model.

          Unlike H.264, which can extend to cover 4K television but wasn’t designed with the feature in mind, H.265 was built to match the capabilities of future screens and includes support for 10-bit color and high frame rates. This is early days — support and capability of the current alpha are limited to 8-bit color and YUV output, but we still wanted to take the alpha technology out for a spin. Armed with a freshly compiled version and some test clips, we set out to see what we could build.

          The parallelization performance looks good — Sandy Bridge-E, with six cores, is somewhat ahead of Ivy Bridge with four. Similarly, Ivy Bridge is beaten out by Haswell, thanks to the new core’s AVX2 support and better performance characteristics. Compared to x264, even on the veryslow preset, x265 encodes take noticeably longer — our Ivy Bridge 3770K encoded the same file in H.264 in 129 seconds as compared to 247 seconds for H.265. Keep in mind, however, that this is very, very early software.

          At q=30 (file sizes of 6.39MB and 10.87MB), the H.265 video stream is arguably better than the H.264 encode stream. We’re not trying to claim this is an absolute — as always, encode settings matter a great deal and are sensitive to tweaking. But after waiting more than a year for H.265 to break cover, it’s clear that the new standard is going to offer what its proponents have claimed.

          Encode/decode support, meanwhile, is already going to be possible on a vast range of products. Modern CPUs are more than capable of decoding H.265 in software, OpenCL support is coming in future iterations, and hardware GPU support, while not formally guaranteed by AMD, Intel, or Nvidia for next-generation products, is a mid-term certainty. All three companies have previously leapt to include advanced video pipelines in their products — as the H.265 presentation notes, video is something that’s become ubiquitous across every type of device.

          Long-term, H.265 will likely succeed H.264’s position as the premier solution for advanced video, though that may depend on whether or not battery consumption while decoding can match H.264’s levels in the long term. That’s something we’ll only be able to evaluate once hardware is available, but for now we’re optimistic. H.265’s explicitly parallel model should map well against multi-core devices of the future.
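For anyone who wants to run a similar x264-vs-x265 comparison themselves, here is a rough sketch using ffmpeg’s libx264 and libx265 encoders from Python. The input file name and CRF value are placeholders, it assumes an ffmpeg build with both encoders installed, and it is not the methodology or settings used in the article quoted above. Note also that the same CRF value does not mean the same visual quality across the two codecs, so the size comparison is only indicative.

```python
import os
import subprocess
import time

# Rough benchmark sketch: encode the same clip with x264 and x265 via ffmpeg
# and compare wall-clock time and output size. Assumes an ffmpeg build with
# libx264 and libx265; "test_clip.mp4" and CRF 28 are placeholder choices.

SOURCE = "test_clip.mp4"

def encode(codec: str, output: str) -> tuple[float, int]:
    """Encode SOURCE with the given video codec; return (seconds, output bytes)."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", codec, "-crf", "28",
         "-an",                        # drop audio so only video is compared
         output],
        check=True,
    )
    return time.perf_counter() - start, os.path.getsize(output)

for codec, out_file in [("libx264", "out_x264.mp4"), ("libx265", "out_x265.mp4")]:
    seconds, size = encode(codec, out_file)
    print(f"{codec}: {seconds:.1f} s, {size / 1e6:.2f} MB")
```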

          1. 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves? By Sebastian Anthony on June 4, 2012 at 1:02 pm

            Moving forward, the main takeaway is that it’s actually possible — right now — to transmit 8K television shows (and movies!) over the air. The main problem, though, is that there isn’t a single commercial display on the market that’s capable of displaying that resolution. Even 4K (2160p) televisions — which have a quarter of the resolution of UHDTV — are incredibly rare. Earlier this year at CES, Sharp showed off an 84-inch 8K TV (the world’s first, apparently), and LG showed off a beautiful 4K OLED unit (which is due to launch in 2013).

            For the time being, then, 1080p HDTV and 3D are here to stay — but in a few years, when (if?) 3D loses its vogue, 8K will be ready to strike.
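As a sanity check on where a figure like 48 Gbps comes from, here is a quick uncompressed-bandwidth calculation. The 60 fps frame rate and 24 bits per pixel are illustrative assumptions chosen to show the order of magnitude; the actual Super Hi-Vision broadcast parameters (bit depth, chroma subsampling, frame rate) differ.

```python
# Order-of-magnitude check: raw bandwidth of an uncompressed 8K video signal.
# Illustrative assumptions (not the broadcast spec): 7680x4320 resolution,
# 60 frames per second, 24 bits per pixel.

width, height = 7680, 4320
fps = 60
bits_per_pixel = 24

bits_per_second = width * height * fps * bits_per_pixel
print(f"Uncompressed 8K at {fps} fps, {bits_per_pixel} bpp: "
      f"{bits_per_second / 1e9:.1f} Gbps")
# ~47.8 Gbps, roughly the 48 Gbps in the headline above, which is why heavy
# compression (e.g. HEVC) is needed before such a signal fits a broadcast channel.
```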

            1. 1080 and 720 were an easy sell over standard def, which, in broadcast form, really only contained about 240 lines in most programming due to interlace. 4K is impressive, but the perceived quality difference is low compared to HD at normal viewing distance. 4K will have a much slower penetration, and I find 8K very unlikely for public use.
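The viewing-distance point can be made concrete with a quick visual-acuity estimate. Assuming roughly one arcminute of angular resolution for 20/20 vision (a common rule of thumb) and a 16:9 screen, the sketch below estimates how far away individual pixels stop being resolvable; the 55-inch diagonal is an arbitrary example, not a figure from the comment.

```python
import math

# Rule-of-thumb estimate: beyond what distance can the eye no longer resolve
# individual pixels? Assumes ~1 arcminute of visual acuity (20/20 vision) and
# a 16:9 panel; the 55-inch diagonal is just an illustrative example.

def max_useful_distance_m(diagonal_inches: float, horizontal_pixels: int) -> float:
    width_inches = diagonal_inches * 16 / math.hypot(16, 9)   # panel width
    pixel_pitch_m = (width_inches / horizontal_pixels) * 0.0254
    one_arcminute = math.radians(1 / 60)
    # Distance at which a single pixel subtends one arcminute at the eye.
    return pixel_pitch_m / math.tan(one_arcminute)

for label, pixels in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"55-inch {label}: pixels resolvable out to about "
          f"{max_useful_distance_m(55, pixels):.1f} m")
# Roughly 2.2 m for 1080p, 1.1 m for 4K, 0.5 m for 8K; at a typical couch
# distance of 2.5-3 m, the extra resolution of 4K/8K on a screen this size is
# hard to perceive, which is the commenter's point.
```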

  3. I would hope that the fact that the pro market (and the education market) kept Apple alive during the mid-90’s until Steve came back was enough motivation to keep supporting them.

  4. Oh, and Final Cut X was hamstrung when it first launched but has since come around and lacks very little of what video pros desire. I agree that Apple released early but they’ve worked hard to improve it.

    1. Yes, Final Cut X shines – an example of change that required proof and backing. Apple didn’t abandon the pro users; there are just fewer of them than the consumer folks

  5. There are many differences between Apple Computer of the late 1990s and Apple Inc. of the 21st century…size and assets being two important factors. Apple can afford to dedicate a reasonable amount of resources to the care and feeding of Apple-using pros. The pro devices and software bridge the gap between technology R&D and the application of those technologies to consumer mass-market products. In addition, the pro products are the visible examples of a company’s skills and knowledge. Pro products are often the harbingers of consumer products to come.

    There are valid reasons that the big car and motorcycle companies maintain racing teams. The pro products division is Apple’s racing team. And, while the new Mac Pro may not please everyone, I see it as a very positive sign that Apple is still committed to the pro community, even if the product is a couple of years late in arriving.

  6. ‘“Obviously, the pro market has value for Apple, even if its numbers are relatively small. Pros are opinion leaders, influencers and evangelists. Their love of Apple shows up in the purchase decisions of friends, family and colleagues,” Segall writes.’

    Yes. And this article unwittingly articulates the very reason why it just never made any sense for Apple to abandon its enterprise server products.

  7. Yes, the initial pro market is smaller, but it offers resellers an opportunity to upsell those that have the bucks and those that need the additional power. Two things that could change the PC game forever for the new Mac Pro: 1. allow two users to use the same machine at the same time (two keyboards, two monitors, two users), and 2. allow processor sharing via Thunderbolt (supercomputer).
