Expect Apple to release Ultra HD iMacs, not ‘true’ 4K

“According to a new report from NPD DisplaySearch, global shipments of 8 megapixel (MP) UHD (ultra high definition) displays with 3840 × 2160 resolution are forecast to grow at a compound annual growth rate (CAGR) of 69% through 2017, while 9 MP 4K displays with 4096 × 2160 resolution will grow at an 8% CAGR,” Dennis Sellers writes for Apple Daily Report. “So maybe we’ll see that UHD iMac and Cinema Display sooner rather than later.”

“Higher prices for 4K resolution screens are the reason for the disparity in growth rates, notes the research group,” Sellers writes. “For example, in 2013 the average selling price (ASP) for a 31.5-inch UHD display was $11,000, compared to $25,000 for a 31.1-inch 4K display.”

Sellers writes, “This is why I’d expect a UHD (3840 x 2160) iMac and Cinema Display from Apple rather than ones with ‘true’ 4K (4096 x 2160) screens. Plus, there are hooks for 3840 x 2160 Retina scaling in Mac OS X.”

Read more in the full article here.

MacDailyNews Note: 4K UHDTV (2160p) is 3840 pixels wide by 2160 pixels tall (8.3 megapixels). “4K” has become the common marketing nomenclature for UHDTV but technically it is not 4K. Consumer 4K resolution of 3840 x 2160 (at a 16:9, or 1.78:1 aspect ratio) differs from the industry standard of 4096 x 2160 (at a 1.9:1 aspect ratio).
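The megapixel and aspect-ratio figures in the note above are easy to verify with quick arithmetic; a minimal sketch:

```python
# Verify the resolution figures quoted in the note above.
uhd_w, uhd_h = 3840, 2160  # consumer "4K" UHDTV
dci_w, dci_h = 4096, 2160  # industry-standard ("true") 4K

print(f"UHD: {uhd_w * uhd_h / 1e6:.1f} MP, aspect {uhd_w / uhd_h:.2f}:1")
print(f"4K:  {dci_w * dci_h / 1e6:.1f} MP, aspect {dci_w / dci_h:.2f}:1")
# UHD: 8.3 MP, aspect 1.78:1
# 4K:  8.8 MP, aspect 1.90:1
```

(DisplaySearch's "9 MP" figure for true 4K is the 8.8 MP result rounded up.)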


    1. Actually, it has very, very little to do with those extra pixels. It has almost everything to do with the larger market.

      TVs are moving to UHDTV sizes and not the Digital Cinema 4K standard. It is MUCH easier for manufacturers to scale to computer monitor sizes from TV sizes than it is to scale from virtually non-existent, true 4K systems. (The sad state is that when viewing a true 4K movie on a UHDTV screen we’ll still be doing letterboxing or pan and scan!)

      Personally, I wish someone would ship a decent 2560 x 4096 monitor (10:16). It would give space at top and bottom for things like menus and such and still allow a full 4K window in the middle. And, at a diagonal of about 32 inches it would be about 150 ppi. But, alas, no one seems interested in such a beast — at least not many besides me.

      1. I use a 30″ and a 24″ computer monitor, but higher resolution and more real estate to accommodate true 4K (I’m a producer/VFX Supe/DP) would be the bee’s knees. In my book you can never have enough computer speed or picture area for editing, 3D, etc.
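The ~150 ppi figure for the hypothetical 2560 x 4096 panel mentioned above checks out; a quick sketch of the calculation:

```python
import math

# Pixel density of a hypothetical 2560 x 4096 (10:16) panel
# with a 32-inch diagonal, as proposed in the comment above.
w_px, h_px = 2560, 4096
diag_in = 32.0

diag_px = math.hypot(w_px, h_px)  # diagonal length in pixels
ppi = diag_px / diag_in
print(f"{ppi:.0f} ppi")
# 151 ppi
```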

  1. Marketing Morons at work. (Bless you Marketing Mavens!)

    Thank you MDN for pointing out that:
    “4K” has become the common marketing nomenclature for UHDTV but technically it is not 4K.

    Sound familiar? 4G cell technology that’s NOT actually 4G? It’s just faster 3G? Conned you, haha?

    Gotta hate marketing morons. They don’t give a damn how they scam you, hurt you, screw you.… Just fork over the dough, SUCKER. 😛

    1. Yep, and drive capacities are decimal instead of binary (i.e. a Terabyte is 1000 GB instead of the correct 1024 GB) thanks to marketing a-holes. So that terabyte drive isn’t really a terabyte. Small difference, but annoying, and all just to screw you out of a lousy 24 gigs.

      1. I’ve run into some happy situations where the size rating on an HD or flash drive was accurate. There’s that little tingle of ‘yeah!’ when you realize the manufacturer cared about being factual. I of course want to buy more of their products as a result. Positive reinforcement consistently wins in the market. Thus Apple, etc.

        1. Apple was one of the first to start using 1000 instead of 1024 as the base of drive sizes and they still do. One of the few cases I think they made the wrong choice. It might have seemed simpler for regular people, but now regular people can’t accurately compare drive sizes due to the inconsistency.

          The “used” portion of my MacBook SSD according to OS X:

          277,729,909,241 bytes (277.73 GB on disk)

          1. Actually it was IBM way back in about 1980 that started the trend to count hard drives in decimal rather than 2^10 numbering like RAM. Seagate and others followed. Apple just followed in the late 80s. (And before anyone jumps in about the scaling of the 3.5 inch floppies, the sizes and names were driven by SONY. Those 3.5 inch drives showed up in SONY’s dedicated word processing systems LONG before they showed up in any Macs. Apple used SONY’s original sizes and nomenclature [400 kB, 800 kB] while IBM & Microsoft used a different formatting and ended up with less [360 kB, 720 kB, and 1.44 MB].)

            And it is NOT confusing for the average person. They have always thought of kilo as 1,000x, mega as 1,000,000x, etc. It’s just those who deal with RAM and such who consider a base-10 system, rather than a base-2 one, an odd way to do it.

            Besides how many “average consumers” (not the geek set) can even tell you how many bytes are in a gibibyte of RAM? I’d be surprised if more than 1 in 1,000 persons walking into Best Buy could give you the correct number.

      2. Actually, the international standards organizations settled on a Terabyte being exactly 1,000 Gigabytes a long time ago. It has very little to do with “marketing a-holes”.

        If you want to do your math in base two rather than base 10 then you need to start using the correct terms. One Tebibyte is 1,024 Gibibytes.

        It has absolutely NOTHING to do with “screwing you out of a lousy 24 gigs”.
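The decimal/binary gap discussed in the thread above is easy to see with the MacBook SSD figure quoted earlier; a minimal sketch:

```python
# The same byte count reads differently in decimal (GB) vs binary (GiB)
# units, using the MacBook SSD figure quoted in the thread above.
used_bytes = 277_729_909_241

gb = used_bytes / 10**9   # decimal gigabytes (what OS X reports)
gib = used_bytes / 2**30  # binary gibibytes

print(f"{gb:.2f} GB, {gib:.2f} GiB")
# 277.73 GB, 258.66 GiB
```

The ~7% spread between the two readings is exactly the discrepancy the commenters are arguing about; neither number is wrong, they just use different units.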

  2. For computer displays, I want a return to an aspect ratio that is NOT wide. Human eyes can more easily view vertical space, compared to an equal amount of horizontal space. That’s why book pages are generally tall, not wide. That’s why newspapers (and MDN’s page layout) put the text in tall narrow “columns.”

    If I could get a reasonably priced, up-to-date 1600×1200 (or higher-res, old-school 4:3 ratio) LCD display, that’s what I would use with my Mac mini. For a long time, I used an old (but high-quality) flat-screen CRT that could do 1600×1200 easily, and I really liked it. As a bonus, a “multi-sync” CRT without fixed-size pixels could do MANY resolutions at equal sharpness (no single “native” resolution), and CRTs have no issues with “refresh rate.” But that display finally failed (and it was no doubt a power hog)…

    The next display I get for my Mac mini is going to be 1920×1200 (16:10 ratio), not the typical “1080p” (16:9 ratio). You can still get those at a reasonable price, and they are usually high-quality because they are targeted at “pro” customers who need more image “precision.”
