HDMI 2.1 spec supports a range of high resolutions and fast refresh rates

Shawn Knight


Technology waits for no one. Even though HDMI 2.0 outputs are far from ubiquitous (even on new PCs), the group behind the popular standard is forging ahead with a new specification that delivers some eye-catching features.

The HDMI Forum on Tuesday announced the release of HDMI 2.1 with support for a range of high resolutions and fast refresh rates including 8K at 60Hz and 4K at 120Hz. Resolutions up to 10K are also supported for commercial / industrial / specialty usages, along with the following enhanced refresh rate features:

  • Variable Refresh Rate (VRR) reduces or eliminates lag, stutter and frame tearing for more fluid and better detailed gameplay.
  • Quick Media Switching (QMS) for movies and video eliminates the delay that can result in blank screens before content is displayed.
  • Quick Frame Transport (QFT) reduces latency for smoother, no-lag gaming and real-time interactive virtual reality.

The new tech also supports an automatic low-latency mode, eARC (Enhanced Audio Return Channel) and Dynamic HDR, the latter of which ensures that every moment of a video is displayed at its ideal values for detail, depth, contrast, brightness and wider color gamuts.

There’s also a new cable, the Ultra High Speed HDMI Cable, which supports up to 48 Gbps of bandwidth and features a very low level of EMI emission. It’s backwards compatible with earlier versions of the HDMI specification and can thus be used with existing HDMI devices.
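To put that 48 Gbps figure in perspective, here is a rough back-of-the-envelope sketch (our own arithmetic, not the HDMI Forum's figures; the bit depths chosen are illustrative assumptions). Raw pixel data alone for the headline formats already approaches the cable's ceiling, even before blanking intervals, link-encoding overhead or the DSC compression the spec allows are taken into account.

```python
# Rough uncompressed video bandwidth estimate (raw pixel data only).
# Assumptions: RGB / 4:4:4, no blanking intervals, no encoding overhead, no DSC.
def raw_gbps(width, height, fps, bits_per_component, components=3):
    return width * height * fps * bits_per_component * components / 1e9

print(f"4K @ 120Hz, 10-bit: {raw_gbps(3840, 2160, 120, 10):.1f} Gbps")  # ~29.9
print(f"8K @ 60Hz,   8-bit: {raw_gbps(7680, 4320, 60, 8):.1f} Gbps")    # ~47.8
# Real HDMI signaling adds blanking and encoding overhead on top of these raw
# numbers, which is why the new 48 Gbps Ultra High Speed cable exists.
```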

The specification is available as of today to all HDMI 2.0 adopters. Compliance Test Specification (CTS) will be published in stages during Q1-Q3 2018. More information will be available at CES 2018 in January, we’re told.


 
Hopefully the variable refresh rate will kill G-Sync and FreeSync, and will be as good as or better than G-Sync, since G-Sync is better than FreeSync but costs a ton more to include.
 
Hopefully the variable refresh rate will kill G-Sync and FreeSync, and will be as good as or better than G-Sync, since G-Sync is better than FreeSync but costs a ton more to include.
AMD has offered FreeSync over HDMI for some time now. It could be FreeSync again, as in the case of DisplayPort and Adaptive Sync. The thing is that Adaptive Sync, whether on DisplayPort or HDMI, will never kill G-Sync, because G-Sync means money for Nvidia and Nvidia will NEVER support a free standard. How do you think they end up making $2.5 billion every quarter? By giving free stuff to their customers?
As for G-Sync being superior to FreeSync, probably yes, but not by as much when LFC is supported.
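For anyone unfamiliar with LFC, here is a minimal sketch of the idea as I understand it (not AMD's actual implementation, and the panel ranges are made-up examples): when the frame rate drops below the display's minimum VRR refresh, the driver repeats each frame enough times to bring the effective refresh back into the supported range.

```python
# Minimal sketch of Low Framerate Compensation (LFC) frame multiplication.
# Illustrative only -- not AMD's real driver logic.
def lfc_refresh(frame_rate_hz, vrr_min_hz, vrr_max_hz):
    if frame_rate_hz >= vrr_min_hz:
        return frame_rate_hz, 1               # in range: show each frame once
    multiplier = 2
    while frame_rate_hz * multiplier < vrr_min_hz:
        multiplier += 1                       # repeat frames until in range
    return min(frame_rate_hz * multiplier, vrr_max_hz), multiplier

# e.g. 25 fps on a hypothetical 48-144 Hz panel -> each frame shown twice at ~50 Hz
print(lfc_refresh(25, 48, 144))   # (50, 2)
```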
 
The difference is kind of great. I don't know how much of it the human eye can see, but a computer would have to pump out 4x more pixels than a 4K monitor. 4K is 8,294,400 pixels, and 8K is 33,177,600.
Here I will help you. I took the image above and re-sampled the left side to half the quality of the right. If you look closely you can see the difference, but it is nowhere near as pixelated as the OP would lead you to believe. If TechSpot wants to keep its credibility, they need to stop with the over-exaggeration.

[Attached image: 4K-8K-Comparison.jpg]
 
The difference is kind of great. I don't know how much of it the human eye can see, but a computer would have to pump out 4x more pixels than a 4K monitor. 4K is 8,294,400 pixels, and 8K is 33,177,600.

Diminishing returns is the point. 4K vs. 8K on a typical, let's say 55-inch HDTV (slightly larger than the current worldwide average screen size, to account for the trend of increasing sizes every year), viewed from an assumed 9 feet (per studies of average viewing distance), will yield pretty marginal gains for the human eye; there's a rough sketch of the numbers below. Your pet eagle would love it, though.

The picture is obviously marketing guff and just an illustration of the difference; treat it as such.
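For anyone who wants to sanity-check those figures, here's a quick Python sketch of both points: the raw pixel counts quoted above, and an approximate pixels-per-degree figure for a 55-inch panel at 9 feet. The ~60 px/deg acuity benchmark, the 16:9 aspect ratio and the screen-centre small-angle approximation are my own assumptions, not anything from the thread.

```python
import math

def pixels_per_degree(h_res, diag_in, dist_in, aspect=(16, 9)):
    """Approximate pixels per degree at the centre of a flat 16:9 panel."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)         # physical screen width
    ppi = h_res / width_in                            # horizontal pixels per inch
    inches_per_degree = dist_in * math.tan(math.radians(1))
    return ppi * inches_per_degree

# Pixel counts quoted in the thread: 8K really is exactly 4x the pixels of 4K.
print(3840 * 2160)   # 8,294,400  (4K UHD)
print(7680 * 4320)   # 33,177,600 (8K UHD)

# 55" TV viewed from 9 feet (108 inches), as in the comment above.
for name, h_res in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: ~{pixels_per_degree(h_res, 55, 108):.0f} px/deg")
# 1080p: ~76, 4K: ~151, 8K: ~302. With 20/20 acuity often approximated at
# ~60 px/deg, the 4K-to-8K jump is hard to resolve at this size and distance.
```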
 
OTA is still 1080i, so why in the world do I need 4K or 8K, or for that matter 16K? It's going to be some years before OTA gets to 8K; I'm sure they'll skip over 4K, or maybe just go straight to 16K. HDMI improvements are fine if needed, but 1080i / 1080p is still the standard now. So much hype for a sharp, detailed picture today. I am looking forward to a 75-inch Sony and I am not buying any other brand. Plug the Sony into the rest of my Sony 8.1 gear and I am off. The next expense would be 12.3 Dolby sound; trying to see how that would work here. Any good reviews to read on HDTV/UHDTV?
 
I just bought a GTX 1070 Ti. Every card I looked at from the 1070 and above has HDMI 2.0 ports as well as HDCP 2.2 (both necessary for HDR and UHD Blu-ray). I'm already running an HDMI 2.0 cable to my 3440 x 1440 display (which sadly doesn't come anywhere near the HDR spec). Just don't get ripped off buying the cable; they are not expensive, but I notice a lot of companies don't explicitly state the HDMI version of the cable and way, way, way overcharge for them.
 
Here I will help you. I took the image above and re-sampled the left side to half the quality of the right. If you look closely you can see the difference, but it is nowhere near as pixelated as the OP would lead you to believe. If TechSpot wants to keep its credibility, they need to stop with the over-exaggeration.

[Attached image: 4K-8K-Comparison.jpg]

Not sure why you're showing me the picture, since my point was about how a computer would have to pump out 4 times as many pixels as 4K to output an 8K image. I also took the original picture and reduced its resolution by 4 times, then zoomed in to compare the two. Not only did the right side look better, but so did the left side. I guess that's anti-aliasing.
 
Not sure why you're showing me the picture, since my point was about how a computer would have to pump out 4 times as many pixels as 4K to output an 8K image.
That means you were the one who took my meaning away from what I was talking about. I wasn't talking about the processing difference, but since you really want to talk about it: the visual difference in the image above would be a 16x processing difference, and it should be a 4x processing difference between 4K and 8K. So yes, there is not as much difference in 4x processing as there is in the image portraying 16x processing. Therefore the point you were trying to make is puzzling.
 
HDMI 2.1, with its adaptive sync standard (VRR + QFT), is a direct competitor to G-Sync.

The only way Nvidia can avoid supporting HDMI 2.1 is to eliminate all HDMI ports on their future cards and forbid their partners from adding them later.

I know it's audacious and far-fetched to you, but there are consumers out there who get on their knees and dedicate an annual percentage of their income to their favourite companies no matter how they are treated, be it $1,000 buggy phones or monitors that only perform optimally with a certain brand of video card, etc., so don't act shocked if it happens.
 
As for the article, I've been waiting for this since January when it was first announced. I even skipped a killer deal on a 65" LG C7, hoping for the ultimate HTPC experience with perhaps a C8 or B8 or any other sub-20 ms lag OLED TV, then I read this:

"Compliance Test Specification (CTS) will be published in stages during Q1-Q3 2018"

Can anyone explain what this means? Is it that anything released before then won't be properly tested, or that nothing will be released before that time?
 
So they will be testing devices next year and likely release HDMI 2.1 in 2019, or are they saying the spec will be released sometime in 2018?
Also, won't the new devices have to support some of the new features too? So it won't just be the cable you need, but possibly a new device that utilizes the new features.
 
As for the article, I've been waiting for this since January when it was first announced. I even skipped a killer deal on a 65" LG C7, hoping for the ultimate HTPC experience with perhaps a C8 or B8 or any other sub-20 ms lag OLED TV, then I read this:

"Compliance Test Specification (CTS) will be published in stages during Q1-Q3 2018"

Can anyone explain what this means? Is it that anything released before then won't be properly tested, or that nothing will be released before that time?

I almost held off on getting my LG B6, just for the VRR. In the end I decided against it. But HDMI 2.1 looks to be a major game changer for gamers, that's for sure.
 
By the time HDMI 2.1 becomes available in commercial products, DisplayPort will also be on another version, and DisplayPort over USB-C is a better solution than HDMI. Its connector is smaller, easier to connect, and it can be used for other things... I hope HDMI becomes a thing of the past; USB-C DisplayPort is a better solution.
 