Next-gen HDMI will support 8K video with HDR

“The industry body responsible for HDMI has announced a first draft of the standard’s next set of specifications, and oh boy: we are getting a lot of extra pixels,” James Vincent reports for The Verge. “HDMI version 2.1 — which will be backward compatible with current HDMI devices — will support 8K video with HDR at a 60Hz refresh rate, or 4K video at a 120Hz refresh.”

“For those interested in the raw numbers, 4K resolution is 3,840 x 2,160 or 8.3 million pixels, while 8K is 7680 x 4320 or 33 million pixels,” Vincent reports. “Like all new A/V tech specs, what we’re seeing now is very much designed for the future. 4K content has only recently become relatively common, and 8K content is years and years away from hitting your screens. We are seeing the first 8K TVs and monitors enter the market, but they’re either prototypes or specialist products designed for photo and design professionals.”

“And, as well as the displays themselves, to get HDMI 2.1 running at full whack you’ll also need new cables that can support up to 48Gbps bandwidth,” Vincent reports. “Reading about a new tech spec is easy; you have to wait for the compatible hardware and content.”
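
For anyone who wants to sanity-check the quoted figures, here is a rough back-of-the-envelope sketch in Python. The 10-bit RGB assumption is ours for illustration, and the math ignores blanking intervals, chroma subsampling, compression and link overhead, so the raw totals only loosely relate to the 48Gbps cable figure:

  # Quick sanity check of the figures quoted above (order-of-magnitude only).
  def pixels_millions(width, height):
      return width * height / 1e6

  def raw_gbps(width, height, fps, bits_per_pixel=30):   # 30 = 10-bit RGB, assumed
      return width * height * fps * bits_per_pixel / 1e9

  print(pixels_millions(3840, 2160))    # ~8.3 million pixels (4K)
  print(pixels_millions(7680, 4320))    # ~33.2 million pixels (8K)
  print(raw_gbps(7680, 4320, 60))       # ~59.7 Gbps uncompressed, 8K at 60 Hz
  print(raw_gbps(3840, 2160, 120))      # ~29.9 Gbps uncompressed, 4K at 120 Hz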

Read more in the full article here.

MacDailyNews Take: Bring it on! Ooh, we can’t wait for the day when we look at our 4K TVs and say, “Those have to go, they look like shit!”

43 Comments

  1. WTF do we need 8K for? Even in standard HD most people look like s**t in movies and on TV. I don’t need to be able to count an actor’s nose hairs.

    Can someone that works in video production please give me a justification for this level of resolution, where the “improvement” is beyond the ability of human perception on a regular sized screen.

    The resolution race is clearly the result of an out-of-ideas tech industry and content creators needing to create a reason for people to buy a houseful of new TVs and re-buy all their content once again. I’m sure that after you all buy your new 8K TVs, two years later the device industry will decide to move on to yet another “standard” interface connector so you will have to replace things yet again. This is insanity.

    1. I agree. Even HD at normal viewing distances still looks great, or even projected in a normal theater (as I have done). 4K is definitely at the more-than-reasonable resolution limit for a home-based system (and to fully enjoy that you need a 75-85″ monitor & up), and the 4K format itself has barely even gotten a toehold on the market in home distribution via disc and Internet. It’s about telling stories, not an unfettered and ridiculous resolution race.

    2. “… this level of resolution, where the “improvement” is beyond the ability of human perception on a regular sized screen.”

      I wish people would do some real research before they make comments on things like this!

      For computer monitors set approximately two feet from the user’s eyes, getting below the real limit of average human perception requires a pixel density of approximately 2,800 pixels per inch or denser. That’s about ten times the pixel density of even Dell’s recently announced “8K” 32-inch monitor.

      Now take the case of a TV in your home that is 10 feet from your eyes. At that distance the required pixel density drops to about 573 pixels per inch to be below what the average person can perceive. An “8K” TV with an 80-inch diagonal comes in at approximately 110 pixels per inch, about a factor of five short of the point where the display outresolves what can be perceived. (And, just for the record, an 8K 60″ screen at 10 feet is still only 147 pixels per inch, far below 573.)

      Thus if users want to get to the real limits of what can and cannot be seen there is still a LONG way to go in resolution.
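
      For anyone who wants to run that geometry themselves, here is a small Python sketch. The panel sizes are examples, the one-arcminute figure is the common rule of thumb, and the 0.05-arcminute figure is simply back-solved from the 573 ppi number above; which acuity value is the right one is exactly what the reply below disputes.

        import math

        def ppi_needed(viewing_distance_in, acuity_arcmin):
            # Pixel density at which one pixel subtends the given visual acuity
            # at the given viewing distance (finer acuity -> more PPI needed).
            pixel_pitch_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
            return 1.0 / pixel_pitch_in

        def panel_ppi(width_px, height_px, diagonal_in):
            # Actual pixel density of a panel with the given resolution and size.
            return math.hypot(width_px, height_px) / diagonal_in

        # An 8K (7680x4320) panel at two example living-room sizes.
        for size_in in (60, 80):
            print(f'8K at {size_in}": {panel_ppi(7680, 4320, size_in):.0f} ppi')    # ~147 and ~110 ppi

        # PPI "needed" at 10 feet (120 in) under two very different acuity assumptions.
        print(f'1 arcmin rule of thumb: {ppi_needed(120, 1.0):.0f} ppi')            # ~29 ppi
        print(f'acuity implied by the 573 figure: {ppi_needed(120, 0.05):.0f} ppi') # ~573 ppi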

      That is not to say that it is worth it. I’m not making any statement as to whether it’s worth having a display that is so pixel dense that the user CANNOT see the difference between that display and the real world. Worth is up to the individual.

      Hell, back around the turn of the century Evans & Sutherland was selling (and shipping) projectors that were natively 8192 x 4096. They were extremely expensive and they didn’t sell many, but they existed. Some may still be operational today. The projected imagery was truly impressive. But, other than for specialized uses, no one could justify the outrageous price (we’re talking more than 10s or even 100s of thousands of dollars per system).

      Whether there is value in “4K”, “8K” or even significantly higher resolutions depends upon the user and the use. If it’s not worth it to YOU, then don’t go there.

      1. Not sure where you are getting your numbers, but I’ll dig up the article where video pros, eye specialists and other experts lay it all out in detail: what screen size you need at a given distance to be able to tell 4K from HD. You are way wrong.

  2. The bottleneck is the data pipe. Whilst cable does do HD, you can clearly see compression artifacts frequently.
    So 8K is great, but when are you really going to see it?
    Then the other issue is content. That takes years to catch up.

    1. If ISP bandwidth can’t keep up, any move to higher resolutions will most likely be accompanied by ‘local’ storage in the form of downloads or optical discs of some kind. The player will be directly connected to the display by the new HDMI and will be fine. The new spec has nothing to do with Internet bandwidth. Besides, it’s always good to keep the ‘transmission’ lines as large as you can so they don’t become the bottleneck.

    2. Japan’s NHK is already making 8K video available via satellite.

      But who’s watching? Those with money to burn and people to impress.

      From August, 2016:
      Japan starts 8K TV broadcasts in time for Rio Olympics
      The country will have the world’s highest resolution Olympics coverage

      The engineering challenges associated with 8K are significant. Cameras, mixers, recording equipment, monitors and other studio gear must be built to handle uncompressed 8K video in real time, at data rates that can easily reach 100Gbps.

      Compression equipment then needs to take that signal and encode it into more efficient streams for broadcast, again in real time. The resulting satellite signal is several tens of megabits per second, and consumer receivers and televisions are required to decode it.

      For now, consumer reception equipment isn’t available, so NHK has set up several public viewing areas at broadcasting stations across the country.
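
      The “can easily reach 100Gbps” figure quoted above is easy to reproduce with a back-of-the-envelope sketch; the bit depths and frame rates below are illustrative assumptions, not NHK’s actual studio settings:

        def uncompressed_gbps(width, height, fps, bits_per_sample, samples_per_pixel=3):
            # Raw picture data rate, ignoring blanking intervals, audio and metadata.
            return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

        # Two illustrative 8K studio settings (assumed, not NHK's actual parameters).
        print(uncompressed_gbps(7680, 4320, 60, 10))    # ~59.7 Gbps: 10-bit 4:4:4 at 60 fps
        print(uncompressed_gbps(7680, 4320, 120, 12))   # ~143.3 Gbps: 12-bit 4:4:4 at 120 fps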

      1. The interesting part to me (as a broadcast engineer / architect) is how NHK has developed all the tech themselves (8K cameras, codecs, etc.), and they actually tested their first 8K transmission in 2005! I thought it was a typo, but it wasn’t; it was just over a fiber link. But still, 2005.

        1. I didn’t know NHK were building it! I have to hand it to them.

          Side story: I’m sick-to-death of USA TV news. It’s like being stuck in a tube with the whole rest of the world outside and inaccessible. SO! I now predominantly watch news from outside the USA. NHK is my absolute favorite. You have to endure the varying levels of accents, but in general it’s wonderful. The best part is the arts and cultural programming. You learn something and don’t end up regretting that you watched it.

          http://www3.nhk.or.jp/nhkworld/en/live/

          And VERY little Middle East ‘BE SCARED’ bullshit. Sorry Israel. How about you take care of your own problems and leave my country out of it!

      2. BTW, I am sure you know, but nobody needs to capture the uncompressed 8K signal except “local” broadcasters. That article is a bit misleading. It’s likely that 8K will come in under 100 Mbps using HEVC, though the codec (or maybe the encoder) allows up to something like 340 Mbps. The infrastructure isn’t far from being there for most of us. My cable provider already offers 250 Mbps high-speed internet. They could also easily deliver 8K over cable (fiber) if the set top box could handle it. The parts that are mostly missing are the consumer pieces, such as decoders and TVs.
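
        To spell out the ratios implied by those numbers, a rough sketch using the figures mentioned in this thread (not measured values):

          raw_gbps = 7680 * 4320 * 60 * 10 * 3 / 1e9     # ~60 Gbps uncompressed (10-bit, 60 fps, assumed)
          hevc_mbps = 100                                 # ~100 Mbps HEVC stream, per the comment above
          isp_mbps = 250                                  # the cable tier mentioned above

          print(f"compression ratio: roughly {raw_gbps * 1000 / hevc_mbps:.0f}:1")     # ~597:1
          print(f"share of a {isp_mbps} Mbps connection: {hevc_mbps / isp_mbps:.0%}")  # 40%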

        1. I expect we’ll see 4K become accessible in the USA only very slowly, thanks to the almost universally AWFUL cable/optical companies we have here, aka The Parasites.

          8K I simply don’t see making a dent in the USA except for those few who bother with something like 300 Mb/sec bandwidth. As an individual, I don’t see the point of paying for it. A business, of course they want at least that bandwidth! So after work, they sit around and watch what little 8K they can scrounge off the Internet.

          IOW: Marketing hype suckers, beware.

  3. Movie theaters use 4K and it is unlikely that I will ever have a bigger screen in my house than they have in a movie theater. 8K means a huge increase in data transmission, so data costs would have to go up significantly. 8K is a solution without a problem.

    1. Movie theaters are 4K? Not exactly. They are 4K-equipped, yes, but nearly EVERY movie that has been delivered to date was finished at 2K resolution, regardless of the resolution it was shot at. Visual FX are still done at 2K, the film is DI’d down to 2K, and the final version is 1920 x 1080. The public has not been seeing 4K movies in theaters. Look it up. Even 4K Blu-ray discs (The Martian, Mad Max, etc.) all had to be upscaled. Yes, upscaled.

      So referencing theater screens to argue against 8K is meaningless, for now at least. And don’t worry about data transmission. 8K won’t be a standard for another 10 years or so, by which point bandwidth will be well beyond where it currently is.

      1. Just go to
        http://forum.blu-ray.com/showthread.php?t=270798
        for a list of UHD Blu-ray disks that belie some of your statements.

        Digital Cinema 4K (which is NOT the same thing as UHD, improperly called “4K”) is becoming a standard.

        And there really is no equivalent today of the old Negative-Interpositive-Internegative-Positive chain. Things natively shot in true 4K are more and more being worked on in 4K, and the final product is 4K.

        Are all productions being done this way today? No, but productions that go from 4K to 2K and then back to 4K are becoming rarer and rarer.

        Hell, there are even some things (extremely few today) that are shot in 8K. The 2020 Olympic Games will be shot in Super Hi-vision (what some people are calling “8K”).

        True 4K (4096×2160) and even true 8K (8192×4320) as well as Super Hi-Vision (7680×4320) are coming. I expect that by the end of 2018 all but indie-house productions will be done in true 4K end-to-end, and some of the higher-end productions will be done in 8K up until the final output to 4K.
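
        For reference, a quick sketch of the pixel counts of the formats being tossed around here (the labels just follow this thread’s usage):

          # Pixel counts of the formats discussed above.
          formats = {
              "HD":                          (1920, 1080),
              "UHD ('4K' on TVs)":           (3840, 2160),
              "True / DCI 4K":               (4096, 2160),
              "Super Hi-Vision ('8K' TVs)":  (7680, 4320),
              "True 8K":                     (8192, 4320),
          }
          for name, (w, h) in formats.items():
              print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} Mpixels")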

        1. My above rebuttal was not meant to dismiss 4K or 8K. Trust me, I’m a huge advocate for these advances. I was merely pointing out the flaw in what AI G was using as a visual reference. I use Maya, Nuke, etc. and know all about DCI 4K. And yes, thank God, finishing in 2K is coming to an end, but not much has been released yet at this moment.

          Rogue One most likely was. The Force Awakens was, but the FX were rendered at 2K. I suspect 2016 was the final transition year for big budget movies. AI G’s argument was really based on movies completed before this transitional moment and essentially illogical.

          No offense to AI G, but very few really know much about 4K. Not you, obviously. All your points are spot on. Especially the Olympics. NHK has already been real-world testing 8K for quite a while now, like the Mariners/Yankees game in the summer of 2015 and a few others.

  4. Actually, the draft HDMI 2.1 spec that has been shown to the public has many, many unexplained parts (likely because those details have not yet been cast in stone). Just one example is the statement that it will support “10K” resolution. (And, yes, it’s supposed to support up through “10K”, not just “8K”.)

    “10K” could mean anything from 9600×5400 (following the home TV definition of “K”, under which “4K” is really 3840×2160) to 10240×5400 (based on the digital cinema format) or even 10240×6400 (based upon a wide 16×10 format). What EXACTLY HDMI 2.1 will support has yet to be fully defined.

    There are a lot more details to come out. Some, like the overall bandwidth (Gbps over the wire), are virtually done and immutable. Others, like the possible refresh rates at lower resolutions, are still in a state of flux. (Why is the current, leading proposal to have HDMI support the same top refresh rate, 120 frames per second in progressive mode, for all resolutions? If it really will support 120 fps at “10K”, why can’t it support 240 fps at “4K”?)
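
    As a rough check on that bandwidth question, here are the raw pixel payloads for a few combinations (a sketch only: it assumes 10-bit RGB and ignores blanking, link encoding, and any chroma subsampling or stream compression the final spec may allow, all of which change the real numbers):

      def raw_gbps(width, height, fps, bits_per_pixel=30):   # 30 = 10-bit RGB, an assumption
          # Raw pixel payload only: ignores blanking and any link-level encoding or compression.
          return width * height * fps * bits_per_pixel / 1e9

      for label, w, h, fps in [
          ("4K @ 120 Hz",              3840, 2160, 120),
          ("4K @ 240 Hz",              3840, 2160, 240),
          ("8K @ 60 Hz",               7680, 4320, 60),
          ("10K @ 120 Hz (9600x5400)", 9600, 5400, 120),
      ]:
          print(f"{label}: ~{raw_gbps(w, h, fps):.0f} Gbps raw vs. 48 Gbps on the wire")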

    However, this spec won’t be final and issued for at least another six months. Then it will take six to 12 months for systems to start incorporating it. Then it will take one to two years before it is mainstream.

    The bottom line is this… This is a huge step forward in the HDMI spec. It may stabilize the spec for as much as five years. (But, we’ll see minor tweaks for sure every year or so.)

    1. It’s called:
      Selfish,
      Gouging,
      Stupid,
      Lazy,
      Abusive
      ISP
      Biznizziz

      Screw Thy Customer <-The US Biznizz Motto.

      List of countries by Internet connection speeds

      2015 South Korea average connection speed:
      26.7 Megabits/second

      2015 USA average connection speed:
      12.6 Megabits/second

      The Top 10, Avg. Mb/s
      1) South Korea, 26.7
      2) Sweden, 19.1
      3) Norway, 18.8
      4) Japan, 17.4
      5) Netherlands, 17.0
      6) Hong Kong, 16.8
      7) Latvia, 16.8
      8) Switzerland, 16.7
      9) Finland, 16.6
      10) Denmark, 16.1

      What other countries have a higher Avg. Mb/s than the USA?
      – Czech Republic, 14.5
      – Romania, 13.1
      – UK, 13
      – Belgium, 12.8

      1. Don’t you think average speeds mean nothing when comparing nations the size of the US with countries smaller than Pennsylvania?

        I can assure you that the metropolitan areas in the US are very competitive. Mississippi and Wyoming, not so much.

        1. Agreed! I’ll add that the average population density of the USA is low compared to most of the rest of the world. That’s NOT an incentive to roll out 300 Mb/s bandwidth across the land. Point taken.

          I personally enjoy 30 Mb/s bandwidth, without having a server on the Internet. That beats the average in South Korea.

          Meanwhile, I have a couple of techno friends who live out in ‘the country’ who shockingly still can’t get anything but landline modem-based Internet, unless of course they invest in satellite Internet, which has negligible upstream bandwidth.

          Then there’s the consistent fact that even when provided with piles of free cash designated specifically for laying new cable for either rural or urban environments, the selfish, gouging, stupid, lazy, abusive companies have EATEN the money and NOT laid any new cable. This has happened across the USA with accompanying lawsuits if not extreme handwringing.

          Individual communities have, in some cases, created their own ISP companies in order to provide broadband to their residents.

  5. Manufacturers may wish for another new standard with which to sell new sets, but the slower pace of infrastructure in production and delivery will impede them every time. Plus, they could slow down 4K sales if people stick with their “good enough” HD sets thinking 4K will be short-lived, and so ultimately shoot themselves in the sales foot. (And yeah, 4K looks mighty fine in your local D-Cinema.)

      1. It’s not just video.

        Think of professional photography, to name just one market segment. Professional cameras run at 50 Mp (some above 80 Mp). That’s well beyond the 33 Mp of the “8K” display panels people are discussing today. The ideal/dream panel for those users might be as high as “12K” or “14K”, at physical sizes in the range of 40-50 inches.
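
        To put rough numbers on that, a small sketch (the 3:2 aspect ratio and the mapping to “12K” are assumptions for illustration):

          import math

          def sensor_pixels(megapixels, aspect=3/2):
              # Approximate width x height for a sensor of the given megapixel
              # count, assuming a 3:2 aspect ratio (medium format differs a bit).
              height = math.sqrt(megapixels * 1e6 / aspect)
              return round(aspect * height), round(height)

          print(sensor_pixels(50))   # ~(8660, 5774): wider and taller than a 7680x4320 panel
          print(sensor_pixels(80))   # ~(10954, 7303): roughly what a "12K"-class panel could show 1:1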

  6. Post Production: A 35″ or 40″ 8K monitor would be fantastic for my work. Digital cinema cameras such as the Red or Alexa are shooting at 6K now. That is about the point resolution of the best Kodak 35mm motion picture film stocks. To be able to see all of those pixels at a 1:1 ratio would be fantastic. To have a large 8K monitor for the client would also be very beneficial.

    Archiving: Being able to archive motion pictures and TV shows shot on film as 8K HDR images would be a true lossless archival method, especially if you could run PFClean or Diamond in 8K.

    Goin’ to the movies: Yes, I would LOVE to see 8K projection at my local cinema. 4K is good but not quite there.

    At home: Yes! With a big enough screen, one that fills the same angle of view as sitting fairly close to a cinema screen, 8K will make a visible, noticeable difference. Of course, HDR and high bit rates make just as much difference in perceived image quality.
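
    A quick check of the 1:1 claim above (the 6K frame size below is a nominal example, not any particular camera’s exact output):

      # Does a nominal 6K frame fit 1:1 on an 8K panel? (Frame size assumed for illustration.)
      frame_6k = (6144, 3240)
      panel_8k = (7680, 4320)

      fits = frame_6k[0] <= panel_8k[0] and frame_6k[1] <= panel_8k[1]
      spare = (panel_8k[0] - frame_6k[0], panel_8k[1] - frame_6k[1])
      print(fits, spare)   # True (1536, 1080): pixels left over for scopes and UI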

  7. Concept:
    How about we tell HDMI and its DRM (Digital Rights Manglement) crap to die and go to hell?

    How about a simple, streams EVERYTHING, no restrictions, video cabling standard? One that takes all the DRM infected video and streams it to our displays ALWAYS, no matter what, despite the DRM crap?

    THAT is what humans require. NOT this marketing moron, parasitic, We Never Trust Our Customers! bad attitude abuse that the likes of the MPAA foist on us like we’re their peon peasants.

    Kill HDMI and replace it with user-friendly tech.

    1. Agreed that HDMI should die. But in order to kill it, some company has to step up and lead the industry forward with a robust cost effective solution.

      Do you think that the price gouging dongle company that brought you the ADC, the Dock Connector, the Lightning Connector, and zips through iterations of Thunderbolt and DisplayPort every 2 years such that no two products can be plugged together without an adapter is the one who is going to set their suffering users free?

      I doubt it. Jony and Timmy’s butlers have a million dollar adapter budget so they have no clue what a PIA Apple’s adapter hell is.

      1. Agreed. Something akin to such lunacy appears to be going on.

        Maybe, as some tech journalists have written, USB-C is the great and glorious future and Apple is one of the leaders of THE WAY. We’ll see. For now, it’s a costly PITA.

    2. Seriously. I haven’t started shopping for a new TV yet, but my colleague was telling me of the issues with his “higher-end” 4K Samsung TV. Even though all the HDMI ports are v2.0, they don’t all support HDCP 2.2 for his 4K set-top box. I think he said only one port supports HDCP 2.2.
      Can you imagine? It’s just maddening. Not only do you have to know how many crappy HDMI 2.0 ports a TV has (I need more than three, Samsung!!), you need to know what version of customer-punishing DRM each port supports!
      You got me all riled up, I’m gonna go pirate some movies now just to spite them 🙂

      1. Now, now. Artists deserve their due compensation!

        But I ENTIRELY agree with you otherwise.

        I suspect this is a transition period where the market is waiting to see how well this 4K push goes over among the peasants. ShamScrum is dipping a toe into it, unwilling to dive in entirely. It reminds me of those ‘HD Ready’ TVs of the recent past that were cheesy, kind-of-sort-of workable when HDTV became real for all.
