Apple in 2015: Get ready for the biggest product lineup in 26 years

“This was a big year for Apple,” Binyamin Goldman writes for Medium. “We saw the biggest OS X update in quite a while, iOS opened up more than ever, two new iPhones, two new iPads, new Macs, and, for the first time since 2010, a new product category: the Apple Watch.”

“It truly was the company’s biggest product lineup in 25 years… but not for long,” Goldman writes. “Every year, I try and predict what Apple has in store for the next 365 days. This year, Tim Cook and company will launch an extraordinary line of new products and updates, including new operating systems, new iPhones, new iPads, Apple Watch, and even a new line of MacBook. The biggest product lineup in 26 years.”

“In order to reinvent the iPad, Apple is reportedly working on a series of major hardware and software improvements for the iPad line. Imagine a world where, to edit photos, photographers don’t go to a Mac; they go to their iPads. There is no reason for anyone to do home video editing on a Mac for any reason. The best version of iMovie should be on iPad. Imagine if Apple figured out how to make a full-sized, on-screen keyboard with feedback that made it feel as if you were typing on a real keyboard (keep an eye on that Taptic Engine). Editing documents is a much better experience on a tablet than on any other device if you’ve got the keyboard,” Goldman writes. “This needs to be Apple’s new philosophy. And Apple’s device to show off this new philosophy is called the iPad Pro.”

Tons more about “Apple’s biggest product lineup in 26 years” in the full article – recommended – here.

[Thanks to MacDailyNews Reader “JES42” for the heads up.]

41 Comments

  1. This year, Tim Cook and company will launch an extraordinary line of new products and updates, including new operating systems, new iPhones, new iPads, Apple Watch, and even a new line of MacBook. The biggest product lineup in 26 years.”

    How about getting OS X Yosemite working first, before putting out another Mac OS this year in 2015? A new OS every year is ridiculous, in my opinion. How about one every two years? Yosemite still has a bunch of bugs that need to be worked out.

  2. Whine, whine, whine.

    Fixing Yosemite and the other software issues won’t help the stock price or generate much consumer excitement. It needs to be done, but so do these other things.

    Apple SHOULD be able to do it all. Let’s hope they deliver.

    1. worrying about the stock price is a short term concern, or at least should be.

      getting things right, no matter how long it takes, is the correct long term strategy for success.

      if the stock price sinks in the meanwhile, view it as a buying opportunity. this is a great company, one to buy and hold and to benefit from the dividends as well.

      but you won’t continue to have a great company if you don’t make things “just work” right in the first place, no matter the marketing and flash of new products.

  3. It is all because of Grand Central Dispatch and memory leaks throughout the entire OS, starting of course with Xcode, the development environment that is used for most of it!
    Still a quintillion times better than Windows!

    1. I can think of 3 reasons why the quote “There is no reason for anyone to do home video editing on a Mac for any reason.” is not true:

      1. Heat
      2. Speed
      3. Ease of use w/advanced options

    2. I thought the same thing, too. God forbid that people use computers for editing movies. I mean, it’s not like computers haven’t been used successfully for doing that or anything. It’s not like there’s established workflows. More breathless future hype.

  4. Personally I’d like Apple to make Mac graphics 10-bit. Currently (from what I gather) it’s still 8-bit (millions of colours) and not 10-bit (per channel, 30-bit total for over a billion colours). Windows has had it since Win 7. Many third-party pro monitors are 10-bit capable and have been for years.

    Apple’s Mac Pro and Retina iMac, while awesome in other respects, aren’t (from what I’ve been reading) 10-bit capable (due to the OS). So Apple, if you’re building pro machines for graphics pros, make them fully so.

    Looking at the new pro graphics products announced at CES, like the new 27-inch Cintiq, which is 10-bit capable, I’m lusting after them, but the forums show that Mac users can’t get full use of them. It’s sad that Mac users can’t get the full colour gamut that Windows users can.

    ——-
    I’m using an older Mac and monitors now (all at 8/24-bit), so I’m not sure whether what I’ve said above is correct, but that’s what I’ve been reading.
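
    (A quick sanity check on those colour counts, in Python: this is just the bit arithmetic, not anything I’ve measured myself.)

    # colours per pixel = 2^(3 * bits per channel), assuming 3 colour channels
    for bits in (8, 10):
        print(f"{bits}-bit per channel: {2 ** (3 * bits):,} colours")
    # prints 16,777,216 ("millions") for 8-bit
    # and 1,073,741,824 ("over a billion") for 10-bit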

    1. Unless you’ve inherited eyes from an off-Earth alien ancestor, I don’t know what you’re talking about.

      There’s 24-bit color (24 bits per pixel), which encompasses every color visible to the human eye plus some. 8 bits × 3 = 24. That’s 16,777,216 colors. The human eye can detect about 10 million colors. So 24-bit color is already overkill.

      The next step up in computer image bit depth is 32-bit. The extra 8 bits are used to represent transparency. There are some effects that can be fiddled into extra bits, depending on what a manufacturer chooses to use. Then there’s the marketing OOO! AHH! baloney of ‘more bits of color must mean better color!’ when in fact it doesn’t.
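
      A minimal sketch of where that 4th byte goes, assuming the common ARGB packing (other systems use RGBA or BGRA): the extra byte is alpha, i.e. transparency, not more color precision.

      # pack four 8-bit channels into one 32-bit pixel (ARGB order)
      def pack_argb(a, r, g, b):
          return (a << 24) | (r << 16) | (g << 8) | b

      print(hex(pack_argb(255, 255, 128, 0)))  # 0xffff8000 -> fully opaque orange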

      ∑ = Get the specs on what you’re buying. What are the extra bits of ‘color’ actually for? You may be buying something entirely pointless for your purposes just so you can be charged more money.

      Here’s one decent article on the subject:
      https://en.wikipedia.org/wiki/Color_depth#Deep_color_.2830.2F36.2F48-bit.29

      1. DC, professionals can benefit from greater bit depth (> 24-bit color) when processing high-quality media. The raw images from professional cameras capture additional data that can be useful when manipulating the image.

        Apple Macs have been preferred by most media professionals for a long time. It is very possible that Macs only display 24-bit color. But I suspect that the professional software can work with greater bit depths before rendering to a 24-bit color space.

      2. There are such things as 10-bit monitors, like the HP DreamColor.

        From their release:

        “With 10-bits per color, the DreamColor panel produces over one billion colors, 64 times more than mainstream LCD displays. Work in brilliant, trusted color and bring your ideas to life with the HP DreamColor Z27x Professional Display, featuring HP’s unrivaled integrated calibration engine, 4K input support, and 10-bit color that drives up to 1.07 billion onscreen colors.”

        The new Cintiq (I’m using an older Cintiq) is also 10-bit. Obviously pros are asking for it (the Cintiq costs over $2,000), and I’m sure those dudes who work in graphics shops know something.

        Definition of 10-bit colour:
        “That’s where deep color comes in. With 10-bit color, you get 1,024 shades of each primary color, and over a billion possible colors.”

        The human eye’s spectrum is much larger than the monitor’s spectrum. Apparently the benefit of the extra bits in 10- and 12-bit pipes is better shading. Google the issue; there are all kinds of tech charts.

        For example, from an AMD release:
        “This is where AMD’s most recent series of workstation graphics cards come into play by natively supporting the emerging standard for 10-bits per color channel (or 30-bits per pixel) video output.
        Conventional display devices use 8-bits per color channel (or 24-bits per pixel) to display images and video. Although this amounts to more than 16 million colors, it still corresponds to a fraction of the colors we perceive in the real-world. This is illustrated in Figure 1, where the green triangle shows the boundaries of the sRGB color space on the CIE-xy chromaticity diagram. Conventional sRGB-compliant 8-bit monitors can only represent the colors that lie in this triangle, while the human eye is capable of perceiving all the colors in the entire chromaticity diagram. This discrepancy is further emphasized by the fact that today’s most professional cameras and printers have a color gamut larger than that of sRGB (such as Adobe RGB, shown by the red triangle in Figure 1), creating a bottleneck on the display side.”

        (AMD’s whitepaper: 10-Bit.pdf)

        Many pro graphics cards are now 10-bit capable (like the above), so once again pros seem to want them.

        Also, apparently more bits allow greater flexibility when manipulating the image; for example, using Levels in Photoshop distorts a lower-bit-depth image more than a higher-bit-depth image (see the sketch below).
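
        A rough way to see why, as a NumPy sketch (illustrative numbers, not an actual Photoshop test): quantizing to 8 bits before a Levels-style stretch throws away tonal steps that the stretch then spreads into visible bands.

        import numpy as np

        ramp = np.linspace(0.25, 0.75, 1920)   # a smooth mid-grey ramp

        def levels(x, black=0.25, white=0.75):
            # map [black, white] to [0, 1], like a Levels adjustment
            return np.clip((x - black) / (white - black), 0.0, 1.0)

        # 8-bit workflow: quantize to 256 levels BEFORE the stretch
        low = np.round(ramp * 255) / 255
        print(np.unique(levels(low)).size)   # ~128 steps left: visible banding

        # high-bit workflow: stretch first, quantize only at the very end
        print(np.unique(np.round(levels(ramp) * 255)).size)   # ~256 steps: smooth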

        I believe RAW files are already 12-bit capable; when seen on an 8-bit monitor, info is stripped out.

        There are also issues of making transparency and effects layers with the extra bits.

        Tests with grey-scale bands show that an 8-bit monitor can have visible ‘banding’ while 10-bit monitors are much smoother.
        https://photographylife.com/what-is-30-bit-photography-workflow

        Adobe Photoshop is 10-bit-per-channel capable.

        (For normal users, 24-bit depth is probably more than enough, but the evidence seems to be mounting that higher bit depths give advantages to pros.)

        —–
        Eventually I’m sure Apple will go 10- or 12-bit.

        Once again, I’m willing to be corrected.
        I’m using older equipment and have not tested 10-bit monitors myself.
        I’ve used Macs for years, not PCs, but like I said, Windows 7 is already 10-bit capable.

        1. I’ll point out that sRGB is a terrible color space designed by Microsoft as a dead minimum standard. At Kodak we called it “Stupid RGB” because it is profoundly limiting.

          As for the human eye being able to see beyond ~10 million colors: Rubbish. I strongly suspect marketing garbage being shoved into the conversation in order to oversell whatever:

          “Conventional display devices use 8-bits per color channel (or 24-bits per pixel) to display images and video. Although this amounts to more than 16 million colors, it still corresponds to a fraction of the colors we perceive in the real-world.” <– Utter nonsense.

          As with audio (I've been in loads of arguments here about this topic), this subject is clearly crammed full of ignorance. I wish I had the room, time, and incentive to go on a diatribe here. But I already pointed to two very good documents over at Wikipedia, which themselves reference loads of resources about the subject.

          Meanwhile: There have constantly been theories about the usefulness of manipulating color data beyond the sensitivity of the human eye. We're in an era when we can do that without much impact on processing speed. If you've got the CPU cycles, go for it! But it's typically overkill and pointless from my POV. Keep in mind that I am talking strictly about color and no other added data, such as transparency. 32 bit 'color' is now THE standard specifically because it takes into account transparency, which obviously is NOT 'color'. It's an image quality apart from color.

          Suggestion: Don't go by what I say or anyone else says about color. Dive into the subject yourself. Don't be daunted. All human senses are amazing. That we can play with and manipulate color and sound is amazing and highly recommended for fun and creativity.

          [Please discuss further, but not with me, thanks]

        2. This is complicated by the fact that we can see colours in nature that cannot be displayed on a computer monitor; in practice, the monitor only displays a subset of the real world, so it could probably get away with an even smaller bit depth.

        3. An aside: One reason it is desired to work with as much data as possible for both imaging and audio is the goal of working with the best possible digital representation of ‘the thing’. The more data, the more is known about the source. “Identity” is the ultimate goal whereby there is perfect representation, removing errors and misinterpretations.

          Whether ‘identity’ is worth one’s efforts depends upon the project and its resources, including time. My personal POV is to work within the range of the human senses. Therefore, I end up jabbering about improving audio recording quality to the point of perfect human perceptible representation of the entire spectrum of sound, particularly up to 20,000 Hz. Meanwhile, I’m obviously pointing out the pointlessness of creating images that include colors we humans cannot perceive. I’m about our senses, not about ‘identity’.

          However, I do have a fun story series in the works that is all about ‘identity’ and what one can do with it. 😀

        4. I don’t have a problem with a higher bit depth (and higher sampling rate in audio) for manipulating the data; this allows manipulation and interpolation with minimal damage once rendered to 24-bit (or 16-bit in audio).

        5. Derek,

          I don’t really want to get into a fight because, as I’ve said, I’ve not done a side-by-side comparison.

          But your saying ‘it’s utter nonsense’ and then comparing it to the audio world seems limiting.

          To recap:
          1) Tests show that the human eye sees much more in the real world than computer screens are able to project. The gamut is much wider.

          Higher bit depth also helps with dynamic range. Human eyes can see dynamic ranges much wider than practically any camera or monitor. Because cameras’ limited bit depths and sensors can’t match the human eye’s abilities, information has to be stored in multiple images for HDR processing. Apparently (I say apparently because I’m just reading this, with no personal tests) 10-bit monitors can display and process (?) higher dynamic ranges.

          2) Tests like gradation strips (which I pointed out and linked to) show that there is, for example, less BANDING with 10-bit monitors. You can see discrete segments in 8-bit while 10-bit is smooth.

          Maybe it is all marketing B.S., but when I google “10 bit less banding” I see numerous examples of images on 10-bit monitors compared to 8-bit which show smoother gradations.

          3) When you do destructive actions like Levels in Photoshop, more bit depth (as shown in tests) results in a better end image.

          4) From a Wikipedia article on higher-than-8-bit depths:
          “As bit depths climb above 8 bits per channel, some systems use the extra bits to store more intensity range than can be displayed all at once, as in high dynamic range imaging (HDRI). Floating point numbers are numbers in excess of ‘full’ white and black. This allows an image to accurately depict the intensity of the sun and deep shadows in the same color space for less distortion after intensive editing. Various models describe these ranges, many employing 32-bit accuracy per channel. In 1999 Industrial Light & Magic released the OpenEXR image file format as an open standard that supports 16-bit-per-channel half-precision floating-point numbers.”
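
          To make the “in excess of ‘full’ white” idea concrete, a tiny sketch using NumPy’s float16 as a stand-in for OpenEXR’s half-precision channels (values are illustrative):

          import numpy as np

          sun = 1000.0                        # an HDR intensity far above "white" (1.0)
          print(np.float16(sun))              # 1000.0 -> a half-float channel keeps it
          print(min(round(sun * 255), 255))   # 255 -> an 8-bit channel clips it to white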

        6. Thank you, Davewrite. Keep in mind that, as I said, I was only addressing color. Adding bits to represent more than color, such as transparency, is excellent, and I’m hoping this will be standard. Perhaps I’m misinterpreting the entire conversation if in fact we’re talking about image details beyond simply color.

          “Human eyes can see dynamic ranges much wider than practically any camera or monitor.”

          Yes. Thanks to dyrck for pointing this out. Going from one medium to another for representation of an image is an incredible pain in the neck. Apple and Kodak were the two collaborating companies who jumped on that stallion to come up with what became ColorSync profiles to help with this transition without losing data. There’s another huge topic for discussion.

          CRT – fluorescing phosphor colors.
          LCD – backlit liquid crystals.
          LED – can be the backlight OR the color source.
          Pigments and dyes – reflecting an image according to the color temperature and brightness of some light source.
          Light sources – the sun, metal-halide lamp, incandescent lightbulb, fluorescent light, LED…

          Then there’s biofluorescence…

          My apologies if I’ve been misinterpreting the full subject you’re discussing. Again, I focused only on color representation, versus other aspects of image representation. Dynamic range is a great additional aspect of image representation. Thanks.

        7. Thanks for your polite post.

          All the additional info you’ve added is appreciated.

          I just want info to make an informed decision, as I want to buy that $2,000+ Cintiq eventually (to replace the Cintiq I’m looking at now). Paying for a 10-bit monitor and perhaps not getting its full ability is, I must admit, a bit sad, but no way am I going to Windows. I have confidence that if it’s needed, Apple will get it right in time.

  5. Color me skeptical. As Apple has moved apps from OS X to iOS, it has tended to remove important features, not improve them. Look how iWork was gutted. I’m all for it if they actually make the software good enough.

  6. Great, more software to go batty.

    Fix iTunes, please, Apple. Your redesign is confusing, and the podcasts don’t sync properly.

    And the Other problem with the iPad mini.

    But my 12-inch MBP is a thing of beauty.

  7. What’s with the “in 26 years” stuff? 26 years ago was 1989. What was happening in 1989 that made it a “biggest product lineup” year, one that will finally be exceeded in 2015? Macs were going strong, but PowerBooks were not even in the lineup yet. Apple II models were still in the lineup, but winding down.

    Also, this guy writes (about 2014)…

    > It truly was the company’s biggest product lineup in 25 years… but not for long.

    again implying that 1989 was some kind of milestone year for Apple’s product lineup. So, if last year was the “biggest product lineup in 25 years,” 2015 would be the biggest product lineup “since last year,” NOT in 26 years.

    He (the author) needs some common sense. 😉 He is probably referencing a comment made by Apple SVP Eddy Cue (not mentioned in the article), who said in an interview last year:

    “We’ve got the best product pipeline that I’ve seen in my 25 years at Apple.”

    That is completely different. For one thing, he’s talking about HIS span of “25 years at Apple,” which is NOT about something happening exactly 25 (or 26) years ago. Apple actually did have a deep “pipeline” of upcoming products around the time Eddy Cue came to Apple, including the introduction of PowerBooks, new (diversified) desktop Macs, and even the Newton handhelds. But a pipeline is NOT an actual product lineup.

  8. It’s a challenge not to judge the quality of the message by the blatancy of Goldman’s Redneck English:

    “I try and predict”

    “There is no reason for anyone to do home video editing on a Mac for any reason”

  9. Hello MacDailyNews Community!

    I don’t really know what this is, but from what I can tell, it’s a website that copies and pastes the work of people who actually know what they are talking about, like Mark Gurman and Rene Ritchie (by the way, I am flattered that the curators over at MDN have decided I am worthy of being part of such a group).

    While I don’t necessarily mind the extra attention, despite being uncredited (by the way, some people really don’t like that you guys are blatantly plagiarizing; see this Twitter conversation: https://twitter.com/markgurman/status/560613649363238912), I do mind a community based half on assholes without constructive complaints and half on people who are really hoping Apple makes some better displays.

    In the future, I’d appreciate my work not being plagiarized without so much as my name being added, and if you do choose to plagiarize my content, don’t plagiarize it to a group of angry 60-year-olds who have their heads stuck in their rears and think the company this site’s about is still in 2002.

    1. Firstly, I’m 57, and that doesn’t render me unable to absorb new information. Secondly, as far as I can see, MDN mentions your name in the first line and then supplies a link to your original blog, labeled “recommended.” If it were me, I’d be quite pleased. There are, without doubt, some trolls who comment on this site, and then there are the rest of us. I would probably never have found ‘The Tech Medium’ without MDN; why aren’t you thanking them?
