HDMI 2.2 standard finalized: doubles bandwidth to 96 Gbps, 16K resolution support

midian182

What just happened? Following the standard's debut at CES 2025, HDMI 2.2 has now been finalized by the HDMI Forum. The full spec confirms much of what we heard six months ago, including doubling the maximum bandwidth of HDMI 2.1 to 96 Gbps – more than DisplayPort 2.1 offers.

While using HDMI 2.2 does not require a new connector, taking advantage of all its best features requires a new Ultra96 cable – a reference to its maximum bandwidth. 96 Gbps is double the 48 Gbps maximum of HDMI 2.1 and more than the 80 Gbps supported by DisplayPort 2.1.

That amount of bandwidth will allow for some monstrous resolution and refresh rate combinations, including 4K at up to 240Hz and 8K at up to 60Hz in 4:4:4 format at up to 12-bit color depth. It also supports 12K@120Hz and 16K@60Hz with DSC (Display Stream Compression).
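To see why those combinations fit within 96 Gbps while the DSC modes don't, here's a rough back-of-the-envelope sketch. The arithmetic is mine, not from the HDMI 2.2 spec: it ignores blanking intervals and link-layer overhead (so real link requirements run somewhat higher), and the panel dimensions and bit depths for the 12K/16K modes are assumptions, since the article doesn't give exact timings.

```python
# Rough uncompressed data-rate estimate for the modes listed above.
# Ignores blanking and link overhead; DSC-mode dimensions/bit depths
# are assumed, as the article doesn't specify them.

def data_rate_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed 4:4:4 video payload in Gbit/s."""
    return width * height * hz * bits_per_channel * channels / 1e9

modes = {
    "4K@240 12-bit":  (3840, 2160, 240, 12),
    "8K@60 12-bit":   (7680, 4320, 60, 12),
    "12K@120 10-bit": (11520, 6480, 120, 10),  # assumed 16:9 12K panel
    "16K@60 10-bit":  (15360, 8640, 60, 10),   # assumed 16:9 16K panel
}

for name, args in modes.items():
    rate = data_rate_gbps(*args)
    verdict = "fits in 96 Gbps" if rate <= 96 else "needs DSC"
    print(f"{name}: ~{rate:.0f} Gbit/s ({verdict})")
```

Both 4K@240 and 8K@60 at 12-bit land around 72 Gbit/s uncompressed – comfortably under 96 – while the 12K and 16K modes blow well past it, which is why the spec pairs them with Display Stream Compression.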

After the problem of HDMI 2.1-branded cables that failed to meet the standard's spec, Ultra96 cables will be clearly labeled with the name, and manufacturers will be required to test each one individually. The cables will also be certified by the HDMI Forum.

Buyers of an Ultra96 Certified Cable can use the HDMI Forum's labeling program to confirm it's the real thing by scanning the QR code on the box.

HDMI 2.2 also introduces Latency Indication Protocol (LIP). This improves audio and video synchronization, especially for multi-device systems such as those with AV receivers or a soundbar. LIP could be especially useful to those whose systems can't seem to precisely synchronize the dialogue being heard with the actors' mouth movements.
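The article doesn't detail LIP's wire format, but the idea behind latency reporting can be sketched generically: each device in the chain reports how long its processing takes, and the source delays the faster path (usually audio) to match. A minimal illustration, with entirely hypothetical latency numbers:

```python
# Illustrative only -- not the actual LIP message format, which isn't
# public in the article. The concept: devices in the chain (e.g. an AV
# receiver feeding a TV) report processing latencies, and the source
# delays audio so it stays in sync with the video.

def audio_delay_ms(video_latencies_ms, audio_latencies_ms):
    """Extra delay to apply to the audio path so both paths line up (>= 0)."""
    return max(0, sum(video_latencies_ms) - sum(audio_latencies_ms))

# Hypothetical chain: receiver adds 10 ms of video passthrough, the TV
# adds 45 ms of video processing, and the audio path takes 15 ms.
print(audio_delay_ms([10, 45], [15]))  # 40
```

Without that reporting, users are left nudging a manual lip-sync slider until mouths and dialogue roughly line up.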

HDMI 2.2 is backward compatible, so the cables will work with anything featuring an older HDMI port.

The first HDMI 2.2 devices are expected to arrive during the final quarter of 2025. AMD is rumored to support the standard in the next Radeon UDNA GPUs, though they might be limited to a maximum of 80 Gbps, according to prolific leaker Kepler.

The gap between HDMI 2.1 and the first supported TVs was about two years, and it took around four years before the standard gained widespread adoption.

 
Would you even be able to notice DSC on a 16K display? A 16K 65" TV would be the equivalent of shrinking 4K down to a ~16" display. Basically a 4K laptop screen. Maybe we'll see 200" 16K displays next year. I still want my 100" 8K 120Hz display.
 
Would you even be able to notice DSC on a 16K display? A 16K 65" TV would be the equivalent of shrinking 4K down to a ~16" display. Basically a 4K laptop screen. Maybe we'll see 200" 16K displays next year. I still want my 100" 8K 120Hz display.
This is more for multi-display arrays, where you can combine multiple 4K screens into one huge image.
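The pixel-density comparison in that comment checks out: 16K has four times the pixels of 4K along each axis, so a 4K panel matches a 65" 16K panel's density at a quarter of the diagonal. A quick sketch (my own arithmetic, assuming the same aspect ratio for both panels):

```python
# Quick check of the 16K-vs-4K density comparison. 16K (15360 px wide)
# has 4x the horizontal pixels of 4K (3840 px wide), so equal pixel
# density means a quarter of the diagonal.

def equivalent_diagonal(diag_in, from_width_px, to_width_px):
    """Diagonal (same aspect ratio) at which the target panel has equal PPI."""
    return diag_in * to_width_px / from_width_px

print(equivalent_diagonal(65, 15360, 3840))  # 16.25 -- roughly the ~16" claimed
```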
 
Why isn't there more mention of refresh rates? That feels like the limitation more consumers (gamers) will brush up against in the near term.

Also, when will marketers understand that this year's "ultrafast" cable is next decade's slow-as-molasses relic? Calling your standard "Ultra96" is eventually going to sound silly. But putting a "96" or "2.2@96" somewhere useful and permanent – like on the connectors at both ends – would at least give people a chance of using the right cable for the right job.
 
HDMI 2.2 finally brings HDMI up to speed with and even beyond DisplayPort, at least in raw bandwidth. But the real win here might be the mandatory certification and labeling. After the HDMI 2.1 branding chaos, it’s good to see the Forum tightening things up.
 
Each increase in speed comes with a reduction in maximum cable length – will these Ultra96 cables be limited to 15 cm?
 
As long as this standard remains closed, I'll stay with DP. It's a joke that the only port used in TVs is locked out of full use even in Linux, when there is absolutely no reason for it.
 
This is a fairly pointless update as far as TVs are concerned. 4K hasn't even hit the majority of live TV channels yet, and I doubt it'll happen in the next 5 years. I see this benefiting PC more than anything.
 
Why isn't there more mention of refresh rates? That feels like the limitation more consumers (gamers) will brush up against in the near term.
Have a closer look at the chart in the article.
Also, when will marketers understand that this year's "ultrafast" cable is next decade's slow-as-molasses relic? Calling your standard "Ultra96" is eventually going to sound silly. But putting a "96" or "2.2@96" somewhere useful and permanent – like on the connectors at both ends – would at least give people a chance of using the right cable for the right job.
Maybe for you, but the branding will sell cables to the dummies who can afford them.
 
This is a fairly pointless update as far as TVs are concerned. 4K hasn't even hit the majority of live TV channels yet, and I doubt it'll happen in the next 5 years. I see this benefiting PC more than anything.
Each standard "improvement" is always useless until it's not.
 
Would you even be able to notice DSC on a 16K display? A 16K 65" TV would be the equivalent of shrinking 4K down to a ~16" display. Basically a 4K laptop screen. Maybe we'll see 200" 16K displays next year. I still want my 100" 8K 120Hz display.
Even 8K is pretty pointless in a traditional 10ft living room setup for a screen 85" and below, and largely unnoticeable at 100". Unless full-on TV walls become an actual consumer product, 16K is going to be a marketing gimmick at best.

We're deep in diminishing returns territory in terms of resolution already – 8K TVs are barely a blip on the market for good reason. Manufacturers are focusing on color range/gamut, max brightness, and newer screen tech like MicroLED instead. OLEDs remain expensive to produce, with manufacturing difficulty increasing dramatically as size goes up, and will always have the burn-in issue, no matter how good they've gotten at slowing its progression. They really need a new technology with the potential to become increasingly affordable to produce over time that doesn't have the scaling issues OLED does.

I've been waiting for 100"-120" TVs with OLED equivalent or better visual quality without burn-in potential for years now, and, sadly, it looks like I'll be stuck waiting for a while yet.
 
Each standard "improvement" is always useless until it's not.
We've pretty much hit the limit in terms of resolution increase being visibly noticeable in common screen sizes for the majority of use cases. There are certainly other ways to make use of that additional bandwidth, but, unless the industry manages to make 100"+ screens somewhat affordable, even 8K is going to remain pretty pointless - let alone 16K.

For me, the biggest implication of the HDMI 2.2 standard is that I'm going to be holding off on getting a new AV receiver until I can get one that supports it. I fell into that trap last time, getting a 2.0 receiver right before 2.1 TVs hit the market. I'm really glad I ended up holding off on picking up the Marantz Cinema 30 - those things aren't cheap, and while it will likely be at least a couple years before they come out with a 2.2 equivalent, my current receiver is honestly still holding up quite well.
 